CN106441275A - Method and device for updating planned path of robot - Google Patents
Method and device for updating planned path of robot
- Publication number
- CN106441275A CN106441275A CN201610843839.0A CN201610843839A CN106441275A CN 106441275 A CN106441275 A CN 106441275A CN 201610843839 A CN201610843839 A CN 201610843839A CN 106441275 A CN106441275 A CN 106441275A
- Authority
- CN
- China
- Prior art keywords
- robot
- dimensional coordinate
- point
- coordinates
- current
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
Description
Technical Field
The present invention belongs to the field of robotics, and in particular relates to a method and device for updating the planned path of a robot.
Background Art
Existing robots usually rely on the Global Positioning System (GPS) for positioning and use lasers to probe the surrounding environment in order to plan paths autonomously. GPS is a satellite navigation system operated by the United States that provides all-round, all-weather, all-day, high-precision positioning. Laser sensors, based on the principle of laser ranging, record the three-dimensional coordinates, reflectivity, texture, and other information of a large number of densely sampled points on the surface of the measured object, from which a three-dimensional model of the target and various map data such as lines, surfaces, and volumes are reconstructed.
However, several problems arise when GPS is used to locate objects within a small area. On the one hand, GPS signals are easily blocked by buildings, mountains, and other objects; on the other hand, GPS can hardly capture important geographic information such as the robot's heading, and probing the surroundings with laser sensors also requires considerable computation time. As a result, it is difficult to achieve real-time, precise positioning when the robot moves within a small range.
Summary of the Invention
The purpose of the present invention is to provide a method and device for updating a robot's planned path, aiming to solve the problem that, because the prior art cannot provide an effective path-updating method, it is difficult to achieve real-time, precise positioning of the robot when it moves within a small range.
In one aspect, the present invention provides a method for updating a robot's planned path, the method comprising the following steps:
acquiring, in real time, the current carrier coordinates of the robot as it moves, and transforming the current carrier coordinates into the robot's current global coordinates;
calculating, from the current global coordinates and the current moving speed, a moving-speed estimate and a position estimate for the robot's movement to the next location;
acquiring three-dimensional coordinate points representing the pixels of a depth image of obstacles in the robot's surroundings, and converting the three-dimensional coordinate points into two-dimensional coordinate points to obtain the projected outlines of the obstacles;
acquiring the corner points of the obstacles in the surroundings from the two-dimensional coordinate points; and
updating the robot's planned path according to the position estimate and the corner points of the obstacles.
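The method steps above can be sketched as one iteration of an update loop. This is a minimal, hypothetical skeleton (the names and structure are illustrative, not from the patent); each argument is a callable standing in for the corresponding step:

```python
def plan_update_cycle(get_carrier_pose, to_global, estimate_next,
                      sense_corners, replan, path):
    """One iteration of the claimed update loop. Every argument is a
    callable standing in for the corresponding step (hypothetical names)."""
    carrier_pose = get_carrier_pose()                # current carrier coordinates
    global_pose = to_global(carrier_pose)            # carrier -> global transform
    position_estimate = estimate_next(global_pose)   # next-location estimate
    corners = sense_corners()                        # obstacle corners from depth image
    return replan(path, position_estimate, corners)  # updated planned path
```

The skeleton only fixes the data flow between the five steps; each concrete step is detailed later in the description.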
In another aspect, the present invention provides a device for updating a robot's planned path, the device comprising:
a coordinate transformation unit, configured to acquire, in real time, the current carrier coordinates of the robot as it moves, and to transform the current carrier coordinates into the robot's current global coordinates;
a parameter acquisition unit, configured to calculate, from the current global coordinates and the current moving speed, a moving-speed estimate and a position estimate for the robot's movement to the next location;
a coordinate conversion unit, configured to acquire three-dimensional coordinate points representing the pixels of a depth image of obstacles in the robot's surroundings, and to convert the three-dimensional coordinate points into two-dimensional coordinate points to obtain the projected outlines of the obstacles;
a corner acquisition unit, configured to acquire the corner points of the obstacles in the surroundings from the two-dimensional coordinate points; and
a path updating unit, configured to update the robot's planned path according to the position estimate and the corner points of the obstacles.
The present invention acquires, in real time, the current carrier coordinates of the robot as it moves and transforms them into the robot's current global coordinates; calculates, from the current global coordinates and the current moving speed, a moving-speed estimate and a position estimate for the robot's movement to the next location; acquires three-dimensional coordinate points representing the pixels of a depth image of obstacles in the robot's surroundings and converts them into two-dimensional coordinate points to obtain the projected outlines of the obstacles; then obtains the corner points of the obstacles from the two-dimensional coordinate points; and finally updates the robot's planned path by combining the position estimate with the obstacle corner points. Precise localization and avoidance of obstacles are thus achieved using the speed and position information obtained by the robot's own sensors, and converting three-dimensional coordinates into two-dimensional coordinates reduces the computational complexity of the path update and improves its response speed.
Brief Description of the Drawings
Fig. 1 is a flowchart of the implementation of the method for updating a robot's planned path provided by an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of the device for updating a robot's planned path provided by an embodiment of the present invention; and
Fig. 3 is a schematic diagram of a preferred structure of the device for updating a robot's planned path provided by an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
The specific implementation of the present invention is described in detail below with reference to specific embodiments:
Fig. 1 shows the implementation flow of the method for updating a robot's planned path provided by an embodiment of the present invention. For ease of description, only the parts relevant to this embodiment are shown, detailed as follows:
In step S101, the current carrier coordinates of the moving robot are acquired in real time, and the current carrier coordinates are transformed into the robot's current global coordinates.
This embodiment applies to a robot and serves autonomous path planning while the robot moves. The robot is equipped with corresponding sensors, for example an acceleration sensor and a gyroscope, to capture data related to its movement, such as acceleration and the deflection of its motion. The current carrier coordinates are the robot's coordinates at its current position in the carrier coordinate system, a coordinate system fixed to the robot. As an example, the origin O_b of this system may lie at the robot's center of mass; x_b points in the robot's direction of travel, y_b points to the right-hand side of the direction of travel, and z_b is perpendicular to the O_b x_b y_b plane so as to satisfy the right-hand rule. The current global coordinates are the robot's coordinates at its current position in the global coordinate system, the global frame defined in the specific application (for example, a navigation application).
Therefore, preferably, the robot's carrier coordinate system and the global coordinate system should be constructed in advance, so that the robot can be localized using coordinates in its own carrier frame, without a dedicated positioning system or global system, which improves positioning accuracy when the robot moves within a small range. When transforming the current carrier coordinates into the robot's current global coordinates, the conversion relation between the carrier frame and the global frame can be used. Specifically, the robot's current global coordinates can be computed as p_W = λ·C(ψ, θ, γ)·p_b in order to obtain precise global coordinates, where p_b denotes the current carrier coordinates, C(ψ, θ, γ) is the rotation matrix from the carrier frame to the global frame determined by the heading angle ψ (the robot's rotation about the z axis), the pitch angle θ (rotation about the y axis), and the roll angle γ (rotation about the x axis), and λ is a preset constant.
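A sketch of this carrier-to-global transformation under common assumptions — a ZYX Euler rotation built from the heading, pitch, and roll angles, scaled by the preset constant λ. The exact matrix entries of the patent's formula are not reproduced in the source, so the standard direction-cosine matrix is used here as an assumption:

```python
import numpy as np

def rotation_matrix(psi, theta, gamma):
    """Rotation from the carrier frame to the global frame as a ZYX Euler
    sequence: heading psi about z, pitch theta about y, roll gamma about x.
    The standard direction-cosine matrix, used here as an assumption."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def carrier_to_global(p_b, psi, theta, gamma, lam=1.0):
    """p_W = lam * C(psi, theta, gamma) * p_b, with lam a preset constant."""
    return lam * rotation_matrix(psi, theta, gamma) @ np.asarray(p_b)
```

For example, a point one unit ahead of the robot (along x_b) with a 90° heading maps to one unit along the global y axis.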
In step S102, a moving-speed estimate and a position estimate for the robot's movement to the next location are calculated from the current global coordinates and the current moving speed.
In this embodiment, the robot's current moving speed must be obtained so that the moving-speed estimate and the position estimate for the next location can be calculated from the current global coordinates and the current moving speed. That is, the robot is taken to move from the current position or coordinates at this speed along the path determined at its previous position, in order to determine its next possible moving speed and position or coordinates.
Preferably, the moving-speed estimate and the position estimate for the robot's movement to the next location are calculated, for each axis of the global coordinate system, by accumulating the sensed acceleration increments over one navigation period, e.g. in the form

V_xW(k+1) = V_xW(k) + Σ_{j=1..n} Δω_xw(j)·T_1
x_w(k+1) = x_w(k) + V_xW(k)·T_2

and analogously for the y and z axes. Here V_xW, V_yW, and V_zW denote the robot's speeds along the x, y, and z axes of the global coordinate system; Δω_xw(j), Δω_yw(j), and Δω_zw(j) denote the change in acceleration along each axis of the global frame between time points j-1 and j; k and k+1 denote the current and next time points; x_w, y_w, and z_w denote the positions along the x, y, and z axes of the global frame; T_2 is the navigation-positioning computation period; T_1 is the sensor sampling period; and n is a constant such that T_2 = n·T_1. Thus V_xW(k), V_yW(k), V_zW(k) and V_xW(k+1), V_yW(k+1), V_zW(k+1) are the robot's speeds along each direction of the global frame at the current time point k and the next time point k+1, and x_w(k), y_w(k), z_w(k) and x_w(k+1), y_w(k+1), z_w(k+1) are its coordinates along each direction at those time points.
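A minimal sketch of this dead-reckoning estimate, assuming the velocity is advanced by the accumulated per-sample acceleration increments and the position by the current velocity over one navigation period (the patent's exact formulas are not reproduced in the source, so this first-order form is an assumption):

```python
import numpy as np

def predict_velocity(v_k, accel_increments, T1):
    """Velocity estimate at time k+1: the current velocity plus the
    accumulated per-sample acceleration increments (one per sensor
    sampling period T1) within a single navigation period."""
    return np.asarray(v_k) + np.sum(np.asarray(accel_increments), axis=0) * T1

def predict_position(p_k, v_k, T2):
    """Position estimate at time k+1: first-order dead reckoning over the
    navigation-positioning computation period T2 = n * T1."""
    return np.asarray(p_k) + np.asarray(v_k) * T2
```

With n = 5 samples per period, five x-axis increments of 0.1 at T1 = 0.01 s raise a 1.0 m/s velocity to 1.005 m/s, and the position advances by V·T2.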
In step S103, three-dimensional coordinate points representing the pixels of a depth image of obstacles in the robot's surroundings are acquired, and the three-dimensional coordinate points are converted into two-dimensional coordinate points to obtain the projected outlines of the obstacles.
In this embodiment, a map or obstacle image along the robot's moving path can be captured by a Kinect depth sensor. The depth, horizontal-axis, and vertical-axis information of the pixels in the depth image is converted into point-cloud data; the three-dimensional coordinate points representing the obstacles in the surroundings are then computed from a preset conversion relation and the point-cloud data; and, according to a preset projection relation between three-dimensional and two-dimensional coordinates, the three-dimensional coordinate points are converted into two-dimensional coordinate points to obtain the two-dimensional projected outlines of the obstacles. In this way, the complexity of subsequent computations is effectively reduced without degrading the accuracy of the two-dimensional map or obstacle image derived from the depth image, which improves the efficiency of updating the robot's path.
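A sketch of this depth-image-to-outline conversion, assuming a standard pinhole back-projection for the "preset conversion relation" and a simple drop-the-height-axis projection for the "preset projection relation" (both are assumptions; the patent does not specify either relation, and the intrinsics fx, fy, cx, cy are illustrative):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project every depth pixel (u, v, d) to a 3-D point with the
    pinhole camera model: X = (u - cx) * d / fx, Y = (v - cy) * d / fy, Z = d."""
    v, u = np.indices(depth.shape)           # pixel row (v) and column (u) grids
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def project_to_ground(points):
    """Drop the height axis (y here) to obtain the 2-D projected outline
    used for planning, discarding invalid zero-depth points."""
    pts = points[points[:, 2] > 0]           # keep valid (nonzero-depth) points
    return pts[:, [0, 2]]                    # (x, z) ground-plane coordinates
```

The 2-D result is what the Harris corner step below operates on; invalid Kinect pixels are commonly reported as depth 0 and are filtered out.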
In step S104, the corner points of obstacles in the surroundings are obtained from the two-dimensional coordinate points.
In this embodiment, to obtain the corner points of obstacles in the surroundings, a preset window function is used to compute the correlation matrix of each two-dimensional coordinate point, and the Harris corner response of each point is computed from its correlation matrix. Non-maximum suppression is then applied to the Harris corner responses to find the local maxima among the two-dimensional coordinate points within a preset window. When the Harris corner response of a two-dimensional coordinate point is greater than a preset threshold and is a local maximum within the preset window, that point is determined to be a corner point of an obstacle in the surroundings, thereby identifying the obstacles or obstacle boundaries in the obstacle image.
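The corner-extraction step can be sketched as follows, using the standard Harris response R = det(M) − k·tr(M)² with thresholding and 3×3 non-maximum suppression on a 2-D occupancy image. The window size, k, and threshold values are illustrative assumptions, not the patent's presets:

```python
import numpy as np

def harris_corners(img, k=0.04, win=1, thresh=1e-4):
    """Per-pixel Harris corner response followed by thresholding and 3x3
    non-maximum suppression; returns a list of (row, col) corner points."""
    Iy, Ix = np.gradient(img.astype(float))   # image gradients
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    H, W = img.shape
    R = np.zeros((H, W))
    for r in range(win, H - win):
        for c in range(win, W - win):
            # correlation (structure) matrix M summed over the window
            sxx = Ixx[r - win:r + win + 1, c - win:c + win + 1].sum()
            syy = Iyy[r - win:r + win + 1, c - win:c + win + 1].sum()
            sxy = Ixy[r - win:r + win + 1, c - win:c + win + 1].sum()
            det = sxx * syy - sxy * sxy
            R[r, c] = det - k * (sxx + syy) ** 2   # Harris response
    corners = []
    for r in range(1, H - 1):
        for c in range(1, W - 1):
            patch = R[r - 1:r + 2, c - 1:c + 2]
            # keep only responses above threshold that are local maxima
            if R[r, c] > thresh and R[r, c] == patch.max():
                corners.append((r, c))
    return corners
```

On a synthetic 8×8 occupancy image containing a filled 4×4 block, the four detected points fall at the block's corners.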
In step S105, the robot's planned path is updated according to the position estimate and the corner points of the obstacles.
In this embodiment, based on the previously obtained next location the robot may move to, combined with the identified obstacles, the path determined at the robot's previous position can be further corrected and updated to prevent the robot from colliding with obstacles while moving.
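A toy version of this correction step, assuming a simple rule that inserts a detour waypoint whenever a planned path segment passes within a clearance distance of a detected obstacle corner (the patent does not specify the re-planning rule, so this rule and its parameters are illustrative):

```python
import numpy as np

def update_path(path, corners, clearance=0.5):
    """Re-plan a 2-D waypoint path around detected obstacle corners: if a
    segment passes within `clearance` of any corner, insert a detour
    waypoint offset away from that corner."""
    def seg_dist(a, b, p):
        # distance from point p to segment ab, plus the closest point (foot)
        a, b, p = map(np.asarray, (a, b, p))
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        foot = a + t * ab
        return np.linalg.norm(foot - p), foot
    new_path = [path[0]]
    for a, b in zip(path, path[1:]):
        for c in corners:
            d, foot = seg_dist(a, b, c)
            if d < clearance:
                # step sideways, away from the corner, by twice the clearance
                away = foot - np.asarray(c)
                n = np.linalg.norm(away)
                offset = away / n if n > 1e-9 else np.array([0.0, 1.0])
                new_path.append(tuple(np.asarray(c) + offset * clearance * 2))
                break
        new_path.append(tuple(b))
    return new_path
```

A path whose segments stay clear of every corner is returned unchanged; a segment that grazes a corner gains one detour waypoint on the far side of the corner.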
This embodiment achieves precise localization and avoidance of obstacles using the speed and position information obtained by the robot's own sensors, and the conversion from three-dimensional to two-dimensional coordinates reduces the computational complexity of the path update and improves its response speed.
Fig. 2 shows the structure of the device for updating a robot's planned path provided by an embodiment of the present invention. For ease of description, only the parts relevant to this embodiment are shown, including:
a coordinate transformation unit 21, configured to acquire, in real time, the current carrier coordinates of the robot as it moves, and to transform the current carrier coordinates into the robot's current global coordinates;
a parameter acquisition unit 22, configured to calculate, from the current global coordinates and the current moving speed, a moving-speed estimate and a position estimate for the robot's movement to the next location;
a coordinate conversion unit 23, configured to acquire three-dimensional coordinate points representing the pixels of a depth image of obstacles in the robot's surroundings, and to convert the three-dimensional coordinate points into two-dimensional coordinate points to obtain the projected outlines of the obstacles;
a corner acquisition unit 24, configured to acquire the corner points of the obstacles in the surroundings from the two-dimensional coordinate points; and
a path updating unit 25, configured to update the robot's planned path according to the position estimate and the corner points of the obstacles.
As shown in Fig. 3, the updating device preferably further comprises:
a coordinate system construction unit 20, configured to construct the robot's carrier coordinate system and global coordinate system in advance.
The coordinate transformation unit 21 comprises:
a coordinate calculation subunit 211, configured to compute the robot's current global coordinates as p_W = λ·C(ψ, θ, γ)·p_b, where p_b denotes the current carrier coordinates, C(ψ, θ, γ) is the rotation matrix from the carrier frame to the global frame determined by the heading angle ψ (the robot's rotation about the z axis), the pitch angle θ (rotation about the y axis), and the roll angle γ (rotation about the x axis), and λ is a preset constant.
The parameter acquisition unit 22 comprises:
a parameter calculation subunit 221, configured to calculate the moving-speed estimate and the position estimate for the robot's movement to the next location, for each axis of the global coordinate system, by accumulating the sensed acceleration increments over one navigation period, e.g. in the form
V_xW(k+1) = V_xW(k) + Σ_{j=1..n} Δω_xw(j)·T_1,  x_w(k+1) = x_w(k) + V_xW(k)·T_2,
and analogously for the y and z axes, where V_xW, Δω_xw(j), T_1, T_2, and n are as defined for step S102.
In this embodiment, each unit of the updating device may be implemented by corresponding hardware or software; the units may be independent software or hardware units or may be integrated into a single software or hardware unit of the robot, and this is not intended to limit the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.
Claims (10)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610843839.0A CN106441275A (en) | 2016-09-23 | 2016-09-23 | Method and device for updating planned path of robot |
| PCT/CN2017/085352 WO2018054080A1 (en) | 2016-09-23 | 2017-05-22 | Method and device for updating planned path of robot |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610843839.0A CN106441275A (en) | 2016-09-23 | 2016-09-23 | Method and device for updating planned path of robot |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN106441275A true CN106441275A (en) | 2017-02-22 |
Family
ID=58166445
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201610843839.0A Pending CN106441275A (en) | 2016-09-23 | 2016-09-23 | Method and device for updating planned path of robot |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106441275A (en) |
| WO (1) | WO2018054080A1 (en) |
Cited By (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106292673A (en) * | 2016-09-29 | 2017-01-04 | 深圳大学 | A kind of method for optimizing route and system |
| CN106909149A (en) * | 2017-03-14 | 2017-06-30 | 深圳蓝因机器人科技有限公司 | A kind of method and device of depth camera avoidance |
| CN106919260A (en) * | 2017-03-07 | 2017-07-04 | 百度在线网络技术(北京)有限公司 | Web page operation method and apparatus |
| CN107229903A (en) * | 2017-04-17 | 2017-10-03 | 深圳奥比中光科技有限公司 | Method, device and the storage device of robot obstacle-avoiding |
| CN107608392A (en) * | 2017-09-19 | 2018-01-19 | 浙江大华技术股份有限公司 | The method and apparatus that a kind of target follows |
| CN107678405A (en) * | 2017-08-22 | 2018-02-09 | 广东美的智能机器人有限公司 | Robot ride method and its device |
| WO2018054080A1 (en) * | 2016-09-23 | 2018-03-29 | 深圳大学 | Method and device for updating planned path of robot |
| CN108151742A (en) * | 2017-11-20 | 2018-06-12 | 北京理工华汇智能科技有限公司 | The data processing method and its intelligent apparatus of robot navigation |
| CN108256430A (en) * | 2017-12-20 | 2018-07-06 | 北京理工大学 | Obstacle information acquisition methods, device and robot |
| CN108445503A (en) * | 2018-03-12 | 2018-08-24 | 吉林大学 | The unmanned path planning algorithm merged with high-precision map based on laser radar |
| CN108733011A (en) * | 2017-04-18 | 2018-11-02 | 富士通株式会社 | Non-transitory computer-readable storage medium, robot moving time estimation method and device |
| CN110220524A (en) * | 2019-04-23 | 2019-09-10 | 炬星科技(深圳)有限公司 | Paths planning method, electronic equipment, robot and computer readable storage medium |
| CN110362098A (en) * | 2018-03-26 | 2019-10-22 | 北京京东尚科信息技术有限公司 | UAV visual servo control method, device and UAV |
| CN110587622A (en) * | 2019-09-09 | 2019-12-20 | 深圳市三宝创新智能有限公司 | Old-people-and-disabled-helping robot with wheelchair |
| CN111652113A (en) * | 2020-05-29 | 2020-09-11 | 北京百度网讯科技有限公司 | Obstacle detection method, device, device and storage medium |
| CN111897361A (en) * | 2020-08-05 | 2020-11-06 | 广州市赛皓达智能科技有限公司 | Unmanned aerial vehicle autonomous route planning method and system |
| CN112070782A (en) * | 2020-08-31 | 2020-12-11 | 腾讯科技(深圳)有限公司 | Method and device for identifying scene contour, computer readable medium and electronic equipment |
| CN112074383A (en) * | 2018-05-01 | 2020-12-11 | X开发有限责任公司 | Robot navigation using 2D and 3D path planning |
| CN112506189A (en) * | 2020-11-19 | 2021-03-16 | 深圳优地科技有限公司 | Method for controlling robot to move |
| CN112631266A (en) * | 2019-09-20 | 2021-04-09 | 杭州海康机器人技术有限公司 | Method and device for mobile robot to sense obstacle information |
| CN113446971A (en) * | 2020-03-25 | 2021-09-28 | 扬智科技股份有限公司 | Space recognition method, electronic device, and non-transitory computer-readable storage medium |
| CN114167871A (en) * | 2021-12-06 | 2022-03-11 | 北京云迹科技有限公司 | An obstacle detection method, device, electronic device and storage medium |
| CN114527755A (en) * | 2022-02-21 | 2022-05-24 | 山东新一代信息产业技术研究院有限公司 | Method, equipment and storage medium for automatic pile returning and charging of robot |
| CN114721386A (en) * | 2022-04-08 | 2022-07-08 | 深圳市普渡科技有限公司 | Robot path planning method and device, computer equipment and storage medium |
| CN114879677A (en) * | 2022-05-17 | 2022-08-09 | 珠海格力电器股份有限公司 | Object placement method, device, robot, electronic device and storage medium |
| CN115525053A (en) * | 2022-09-16 | 2022-12-27 | 深圳市优必选科技股份有限公司 | A kind of operating method of robot and robot |
| CN119704163A (en) * | 2025-02-10 | 2025-03-28 | 广州海同工业技术有限公司 | Robot self-adaptive material grabbing method and device based on machine vision |
Families Citing this family (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111476879B (en) * | 2019-01-24 | 2025-01-14 | 北京京东乾石科技有限公司 | Point cloud data processing method, terminal and storage medium |
| CN111061270B (en) * | 2019-12-18 | 2023-12-29 | 深圳拓邦股份有限公司 | Full coverage method, system and operation robot |
| CN112711255B (en) * | 2020-12-24 | 2024-01-19 | 南方科技大学 | Mobile robot obstacle avoidance method, control equipment and storage medium |
| CN112833898B (en) * | 2020-12-30 | 2023-03-21 | 清华大学 | ROS-oriented unmanned vehicle backing method |
| CN112987734B (en) * | 2021-02-23 | 2023-05-02 | 京东科技信息技术有限公司 | Robot travel method, robot travel device, electronic device, storage medium, and program product |
| CN113034579B (en) * | 2021-03-08 | 2023-11-24 | 江苏集萃微纳自动化系统与装备技术研究所有限公司 | A dynamic obstacle trajectory prediction method for mobile robots based on laser data |
| CN113177980B (en) * | 2021-04-29 | 2023-12-26 | 北京百度网讯科技有限公司 | Target object speed determining method and device for automatic driving and electronic equipment |
| CN114326710B (en) * | 2021-12-04 | 2024-05-24 | 深圳市普渡科技有限公司 | Robot, robot travel strategy determination method, apparatus and storage medium |
| CN114266829A (en) * | 2021-12-24 | 2022-04-01 | 珠海格力电器股份有限公司 | Object processing method and device, electronic equipment and computer readable storage medium |
| CN114211173B (en) * | 2022-01-27 | 2024-05-31 | 上海电气集团股份有限公司 | A method, device and system for determining welding position |
| CN115574822B (en) * | 2022-09-23 | 2025-11-07 | 国家电网有限公司 | Robot vision positioning method, device and system |
| CN115657674B (en) * | 2022-10-26 | 2023-05-05 | 宝开(上海)智能物流科技有限公司 | A distributed path planning method and device based on graph neural network |
| CN115855042B (en) * | 2022-12-12 | 2024-09-06 | 北京自动化控制设备研究所 | Pedestrian visual navigation method based on laser radar cooperative assistance |
| CN116380110B (en) * | 2023-06-07 | 2023-08-04 | 上海伯镭智能科技有限公司 | A real-time path planning method for unmanned vehicles based on big data |
| CN117315175B (en) * | 2023-09-28 | 2024-05-14 | 广东拓普视科技有限公司 | Composition positioning device and method based on robot |
| CN119356834B (en) * | 2024-12-26 | 2025-04-04 | 北京通用人工智能研究院 | Asynchronous planning method, device, equipment, medium and product of intelligent agent |
| CN119987418A (en) * | 2025-04-15 | 2025-05-13 | 台州学院 | A fusion obstacle avoidance system and obstacle avoidance method for safety inspection robot |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10027952B2 (en) * | 2011-08-04 | 2018-07-17 | Trx Systems, Inc. | Mapping and tracking system with features in three-dimensional space |
| CN108594851A (en) * | 2015-10-22 | 2018-09-28 | 飞智控(天津)科技有限公司 | A kind of autonomous obstacle detection system of unmanned plane based on binocular vision, method and unmanned plane |
| CN106441275A (en) * | 2016-09-23 | 2017-02-22 | 深圳大学 | Method and device for updating planned path of robot |
- 2016-09-23: CN application CN201610843839.0A filed, published as CN106441275A (status: Pending)
- 2017-05-22: PCT application PCT/CN2017/085352 filed, published as WO2018054080A1 (status: Ceased)
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101769754A (en) * | 2010-01-19 | 2010-07-07 | 湖南大学 | Quasi three-dimensional map-based mobile robot global path planning method |
| CN102866706A (en) * | 2012-09-13 | 2013-01-09 | 深圳市银星智能科技股份有限公司 | Cleaning robot adopting smart phone navigation and navigation cleaning method thereof |
| CN103076619A (en) * | 2012-12-27 | 2013-05-01 | 山东大学 | System and method for performing indoor and outdoor 3D (Three-Dimensional) seamless positioning and gesture measuring on fire man |
| CN104346608A (en) * | 2013-07-26 | 2015-02-11 | 株式会社理光 | Sparse depth map densification method and device |
| CN103413306A (en) * | 2013-08-01 | 2013-11-27 | 西北工业大学 | Self-adaptation threshold value Harris corner detection method |
| CN104268138A (en) * | 2014-05-15 | 2015-01-07 | 西安工业大学 | Method for capturing human motion by aid of fused depth images and three-dimensional models |
| CN104697526A (en) * | 2015-03-26 | 2015-06-10 | 上海华测导航技术股份有限公司 | Strapdown inertial navigation system and control method for agricultural machines |
Non-Patent Citations (1)
| Title |
|---|
| DENG GENQIANG ET AL.: "SLAM: Depth Image Information for Mapping and Inertial Navigation System Localization", 2016 Asia-Pacific Conference on Intelligent Robot Systems * |
Cited By (44)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018054080A1 (en) * | 2016-09-23 | 2018-03-29 | 深圳大学 | Method and device for updating planned path of robot |
| CN106292673A (en) * | 2016-09-29 | 2017-01-04 | 深圳大学 | A kind of method for optimizing route and system |
| CN106292673B (en) * | 2016-09-29 | 2019-02-12 | 深圳大学 | A path optimization method and system |
| CN106919260A (en) * | 2017-03-07 | 2017-07-04 | 百度在线网络技术(北京)有限公司 | Web page operation method and apparatus |
| CN106909149B (en) * | 2017-03-14 | 2020-09-22 | 深圳蓝因机器人科技有限公司 | Method and device for avoiding obstacles by depth camera |
| CN106909149A (en) * | 2017-03-14 | 2017-06-30 | 深圳蓝因机器人科技有限公司 | A kind of method and device of depth camera avoidance |
| CN107229903A (en) * | 2017-04-17 | 2017-10-03 | 深圳奥比中光科技有限公司 | Method, device and the storage device of robot obstacle-avoiding |
| CN108733011B (en) * | 2017-04-18 | 2020-12-15 | 富士通株式会社 | Non-transitory computer readable storage medium, robot movement time estimation method and apparatus |
| CN108733011A (en) * | 2017-04-18 | 2018-11-02 | 富士通株式会社 | Non-transitory computer-readable storage medium, robot moving time estimation method and device |
| CN107678405A (en) * | 2017-08-22 | 2018-02-09 | 广东美的智能机器人有限公司 | Robot ride method and its device |
| CN107608392A (en) * | 2017-09-19 | 2018-01-19 | 浙江大华技术股份有限公司 | The method and apparatus that a kind of target follows |
| CN108151742A (en) * | 2017-11-20 | 2018-06-12 | 北京理工华汇智能科技有限公司 | The data processing method and its intelligent apparatus of robot navigation |
| CN108256430B (en) * | 2017-12-20 | 2021-01-29 | 北京理工大学 | Obstacle information acquisition method and device and robot |
| CN108256430A (en) * | 2017-12-20 | 2018-07-06 | 北京理工大学 | Obstacle information acquisition methods, device and robot |
| CN108445503A (en) * | 2018-03-12 | 2018-08-24 | 吉林大学 | The unmanned path planning algorithm merged with high-precision map based on laser radar |
| CN110362098B (en) * | 2018-03-26 | 2022-07-05 | 北京京东尚科信息技术有限公司 | Unmanned aerial vehicle visual servo control method and device and unmanned aerial vehicle |
| CN110362098A (en) * | 2018-03-26 | 2019-10-22 | 北京京东尚科信息技术有限公司 | UAV visual servo control method, device and UAV |
| CN112074383A (en) * | 2018-05-01 | 2020-12-11 | X开发有限责任公司 | Robot navigation using 2D and 3D path planning |
| CN110220524A (en) * | 2019-04-23 | 2019-09-10 | 炬星科技(深圳)有限公司 | Paths planning method, electronic equipment, robot and computer readable storage medium |
| WO2020215901A1 (en) * | 2019-04-23 | 2020-10-29 | 炬星科技(深圳)有限公司 | Path planning method, electronic device, robot and computer-readable storage medium |
| CN110587622A (en) * | 2019-09-09 | 2019-12-20 | 深圳市三宝创新智能有限公司 | Old-people-and-disabled-helping robot with wheelchair |
| JP7314411B2 | 2019-09-20 | 2023-07-25 | 杭州海康机器人股份有限公司 | Obstacle information sensing method and device for mobile robot |
| KR20220066325A (en) * | 2019-09-20 | 2022-05-24 | 항저우 히크로봇 테크놀로지 씨오., 엘티디. | Obstacle information detection method and device for mobile robot |
| KR102728080B1 (en) * | 2019-09-20 | 2024-11-11 | 항저우 히크로봇 씨오., 엘티디. | Method and device for detecting obstacle information for mobile robot |
| CN112631266A (en) * | 2019-09-20 | 2021-04-09 | 杭州海康机器人技术有限公司 | Method and device for mobile robot to sense obstacle information |
| JP2022548743A (en) | 2019-09-20 | 2022-11-21 | 杭州海康机器人股份有限公司 | Obstacle information sensing method and device for mobile robot |
| CN113446971B (en) * | 2020-03-25 | 2023-08-08 | 扬智科技股份有限公司 | Spatial recognition method, electronic device and non-transitory computer readable storage medium |
| CN113446971A (en) * | 2020-03-25 | 2021-09-28 | 扬智科技股份有限公司 | Space recognition method, electronic device, and non-transitory computer-readable storage medium |
| US11688177B2 (en) | 2020-05-29 | 2023-06-27 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Obstacle detection method and device, apparatus, and storage medium |
| CN111652113A (en) * | 2020-05-29 | 2020-09-11 | 北京百度网讯科技有限公司 | Obstacle detection method, device, device and storage medium |
| CN111897361A (en) * | 2020-08-05 | 2020-11-06 | 广州市赛皓达智能科技有限公司 | Unmanned aerial vehicle autonomous route planning method and system |
| CN111897361B (en) * | 2020-08-05 | 2023-08-22 | 广州市赛皓达智能科技有限公司 | Unmanned aerial vehicle autonomous route planning method and system |
| CN112070782B (en) * | 2020-08-31 | 2024-01-09 | 腾讯科技(深圳)有限公司 | Methods, devices, computer-readable media and electronic equipment for identifying scene contours |
| CN112070782A (en) * | 2020-08-31 | 2020-12-11 | 腾讯科技(深圳)有限公司 | Method and device for identifying scene contour, computer readable medium and electronic equipment |
| US12456210B2 (en) | 2020-08-31 | 2025-10-28 | Tencent Technology (Shenzhen) Company Limited | Scene contour recognition in video based on depth information |
| CN112506189A (en) * | 2020-11-19 | 2021-03-16 | 深圳优地科技有限公司 | Method for controlling robot to move |
| CN114167871A (en) * | 2021-12-06 | 2022-03-11 | 北京云迹科技有限公司 | An obstacle detection method, device, electronic device and storage medium |
| CN114527755A (en) * | 2022-02-21 | 2022-05-24 | 山东新一代信息产业技术研究院有限公司 | Method, equipment and storage medium for automatic pile returning and charging of robot |
| CN114721386A (en) * | 2022-04-08 | 2022-07-08 | 深圳市普渡科技有限公司 | Robot path planning method and device, computer equipment and storage medium |
| CN114879677A (en) * | 2022-05-17 | 2022-08-09 | 珠海格力电器股份有限公司 | Object placement method, device, robot, electronic device and storage medium |
| CN115525053A (en) * | 2022-09-16 | 2022-12-27 | 深圳市优必选科技股份有限公司 | A kind of operating method of robot and robot |
| CN115525053B (en) * | 2022-09-16 | 2024-10-25 | 深圳市优必选科技股份有限公司 | Robot operation method and robot |
| CN119704163A (en) * | 2025-02-10 | 2025-03-28 | 广州海同工业技术有限公司 | Robot self-adaptive material grabbing method and device based on machine vision |
| CN119704163B (en) * | 2025-02-10 | 2025-10-21 | 广州海同工业技术有限公司 | Robot adaptive material grasping method and device based on machine vision |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2018054080A1 (en) | 2018-03-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN106441275A (en) | Method and device for updating planned path of robot | |
| EP3715785B1 (en) | SLAM assisted INS | |
| CN108051002B (en) | Transport vehicle space positioning method and system based on inertial measurement auxiliary vision | |
| CN108226938B (en) | AGV trolley positioning system and method | |
| Thrun et al. | Scan alignment and 3-D surface modeling with a helicopter platform | |
| CN109885080B (en) | Autonomous control system and autonomous control method | |
| US9386209B2 (en) | Method and apparatus for estimating position | |
| CN105354875B (en) | A kind of indoor environment is two-dimentional with the construction method and system of three-dimensional conjunctive model | |
| Wang et al. | A simple and parallel algorithm for real-time robot localization by fusing monocular vision and odometry/AHRS sensors | |
| CN112254729B (en) | A mobile robot positioning method based on multi-sensor fusion | |
| CN110345937A (en) | Appearance localization method and system are determined in a kind of navigation based on two dimensional code | |
| CN111207774A (en) | Method and system for laser-IMU external reference calibration | |
| CN103412565B (en) | A kind of robot localization method with the quick estimated capacity of global position | |
| CN110609570A (en) | A UAV-based Autonomous Obstacle Avoidance Inspection Method | |
| KR20140049361A (en) | Multiple sensor system, and apparatus and method for three dimensional world modeling using the same | |
| CN113587930A (en) | Indoor and outdoor navigation method and device of autonomous mobile robot based on multi-sensor fusion | |
| JP2013187862A (en) | Image data processing device, image data processing method, and program for image data processing | |
| JPWO2014076844A1 (en) | Autonomous mobile system and control device | |
| Tomažič et al. | Fusion of visual odometry and inertial navigation system on a smartphone | |
| CN115930948A (en) | A Fusion Positioning Method for Orchard Robot | |
| CN112578363B (en) | Laser radar motion track obtaining method and device and medium | |
| Karam et al. | Integrating a low-cost MEMS IMU into a laser-based SLAM for indoor mobile mapping | |
| Bikmaev et al. | Improving the accuracy of supporting mobile objects with the use of the algorithm of complex processing of signals with a monocular camera and LiDAR | |
| Yang et al. | Simultaneous estimation of ego-motion and vehicle distance by using a monocular camera | |
| Wang et al. | Micro aerial vehicle navigation with visual-inertial integration aided by structured light |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | C06 | Publication | |
| | PB01 | Publication | |
| | C10 | Entry into substantive examination | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20170222 |