CN100449444C - A Method for Simultaneous Localization and Map Construction of Mobile Robots in Unknown Environments - Google Patents
- Publication number
- CN100449444C CN100449444C CNB2006100536902A CN200610053690A CN100449444C CN 100449444 C CN100449444 C CN 100449444C CN B2006100536902 A CNB2006100536902 A CN B2006100536902A CN 200610053690 A CN200610053690 A CN 200610053690A CN 100449444 C CN100449444 C CN 100449444C
- Authority
- CN
- China
- Prior art keywords
- line segment
- map
- mobile robot
- global
- local
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a method for simultaneous localization and map building for a mobile robot in an unknown environment, comprising the following steps: 1) from the data acquired by the ranging sensor mounted on the mobile robot, construct the currently observed local occupancy grid map and local line segment feature map; 2) from the ranging sensor data, the dead reckoning sensor data, the previous cycle's estimate of the mobile robot's pose, and the previously built global line segment feature map, estimate the mobile robot's pose for the current cycle; 3) from the current cycle's pose estimate and the newly built local line segment feature map and local occupancy grid map, update the global occupancy grid map and the global line segment feature map. The line segment feature map and occupancy grid map built by the invention can be used by various application robots for path planning and navigation, eliminating the need to manually supply an environment map to the mobile robot and effectively improving robots' ability to perceive unknown environments.
Description
Technical Field
The present invention relates to a method and system for simultaneous localization and map building for a mobile robot in an unknown environment, by which a robot starting from an arbitrary point in an unknown environment can estimate its pose and build a map of the environment from the measurements of a ranging sensor and a dead reckoning sensor.
Background
With the expanding range of mobile robot applications, enabling robots to autonomously perceive unknown environments has become a research hotspot in robotics and artificial intelligence in recent years. One way for a robot to do so is to build a model or map of the unknown environment, i.e., the robot autonomously constructs a spatial model or map of its surroundings from sensor measurements (see "S. Thrun. Robotic mapping: A survey. In G. Lakemeyer and B. Nebel, editors, Exploring Artificial Intelligence in the New Millennium. Morgan Kaufmann, 2002." and "Chen Weidong, Zhang Fei. Research progress on simultaneous localization and mapping of mobile robots. Control Theory and Applications, 22(3): 455-460, 2005."). With the constructed map, the robot can perform task planning and path planning, and execute various jobs.
Early map building in unknown environments assumed that all poses during the robot's motion could be obtained accurately from dead reckoning sensors, and on this basis methods were proposed for building various environment maps from ranging sensor data, including occupancy grid mapping based on binary Bayesian filtering (see "A. Elfes. Sonar-based real-world mapping and navigation. IEEE Journal of Robotics and Automation, RA-3(3): 249-265, 1987.") and feature mapping based on Kalman filtering (see "K. S. Chong, L. Kleeman. Mobile-robot map building from an advanced sonar array and accurate odometry. International Journal of Robotics Research, 18(1): 20-36, 1999.").
However, the pose information obtained from dead reckoning sensors is not deterministic; it carries statistically correlated measurement error, i.e., error that accumulates over time. Yet traditional robot pose estimation methods, such as Markov localization and Monte Carlo localization, always assume the map is known (see "D. Fox, W. Burgard, and S. Thrun. Markov localization for mobile robots in dynamic environments. Journal of Artificial Intelligence Research, 11: 391-427, 1999." and "S. Thrun, D. Fox, W. Burgard and F. Dellaert. Robust Monte Carlo localization for mobile robots. Artificial Intelligence, 101: 99-141, 2000.").
Thus, in unknown-environment mapping, the unavoidable measurement noise in the sensors the robot uses to perceive both the external environment and its own motion introduces uncertainty into robot localization and map building alike, and these uncertainties are mutually coupled. When current observations are added to the map according to the robot's pose, the localization uncertainty is carried into the map and biases it; when the robot's pose is estimated by matching current observations against the existing map, the uncertainty in the observations and in the map in turn biases the localization. Mapping an unknown environment must therefore solve localization and map building simultaneously.
Existing simultaneous localization and mapping methods include the extended Kalman filter, expectation maximization, and particle filtering. In the extended-Kalman-filter approach (see "J. A. Castellanos, et al. The SPmap: A probabilistic framework for simultaneous localization and map building. IEEE Transactions on Robotics and Automation, 15(5): 948-953, 1999."), the map consists of a set of landmarks, and the algorithm maintains the posterior probability of the robot pose and all landmark positions. This class of methods supports online map building, but its Gaussian-noise assumption requires that landmarks be sufficiently distinguishable from one another; missing landmarks, landmark recognition errors, and matching errors all cause the algorithm to fail.
The expectation-maximization approach (see "W. Burgard, et al. Sonar-based mapping of large-scale mobile robot environments using EM. In Proc. of the International Conference on Machine Learning, Slovenia, 1999, pp. 67-76.") gradually improves map accuracy by alternating two steps: computing the posterior probability of the robot poses given the map, and computing the most likely map given the expected poses. It avoids map-building failures caused by landmark recognition and matching errors, but because it performs hill climbing over the space of all maps, it cannot run online. The particle-filter approach (see "M. Montemerlo and S. Thrun. Simultaneous localization and mapping with unknown data association using FastSLAM. In Proceedings of the 2003 IEEE International Conference on Robotics & Automation, Taipei, Taiwan, 2003, pp. 1985-1991.") combines particle filtering with the extended Kalman filter: a particle filter estimates the robot pose, each particle representing a possible robot path and maintaining the map determined by that path, with the landmark positions in each map estimated by extended Kalman filtering. Through multiple hypotheses this resolves the mutual coupling of localization and mapping uncertainty, but maintaining a separate map for every particle consumes considerable storage and computation.
Chinese Invention Patent No. 00816243.3 proposes a multi-robot collaboration system in which one or more navigator robots perform environment mapping, self-localization, functional-robot localization, and functional-robot task assignment; the navigator robots localize themselves using landmark recognition and dead reckoning. Owing to wheel slippage and misalignment, dead-reckoning errors accumulate over time, and the method compensates by incorporating landmark recognition into localization. But because it uses the system's functional robots as landmarks, the method is unsuitable for a single robot working alone in an unknown environment; in particular, when no landmark appears in the current observation, localization becomes highly uncertain.
The environment recognition device and method of Chinese Invention Patent No. 200410100518.9 target multi-plane environments (i.e., stairs) and do not consider the uncertainty of robot localization.
It is therefore necessary to develop a simultaneous localization and mapping method that achieves accurate robot pose estimation and map building, runs online in real time, demands little storage and computation, and produces maps detailed enough for robot navigation and other tasks.
Summary of the Invention
The object of the present invention is to provide a method for simultaneous localization and map building for a mobile robot in an unknown environment.
The method for simultaneous localization and map building of a mobile robot in an unknown environment comprises the following steps:
1) From the data acquired by the ranging sensor mounted on the mobile robot, construct the currently observed local occupancy grid map and local line segment feature map;
2) From the ranging sensor data, the dead reckoning sensor data, the previous cycle's estimate of the mobile robot's pose, and the previously built global line segment feature map, estimate the mobile robot's pose for the current cycle;
3) From the current cycle's pose estimate and the newly built local line segment feature map and local occupancy grid map, update the global occupancy grid map and the global line segment feature map.
The data acquired by the ranging sensor are the distances and angles, relative to the mobile robot, of points on obstacles in the environment scanned by the sensor in its ranging plane.
The local occupancy grid map and local line segment feature map use a coordinate system whose origin is the mobile robot's position in the current cycle and whose X axis is the mobile robot's heading in the current cycle.
The global occupancy grid map and global line segment feature map use a coordinate system whose origin is the mobile robot's position in the first cycle and whose X axis is the mobile robot's heading in the first cycle.
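Because the local and global maps use different frames, merging local observations into the global map requires transforming local coordinates by the robot's estimated pose in the global frame. A minimal sketch of this rigid transformation (function and variable names are illustrative, not from the patent):

```python
import math

def local_to_global(x_l, y_l, robot_pose):
    """Transform a point from the robot-centric local frame into the
    global frame, given the robot pose (x, y, theta) in the global
    frame: rotate by theta, then translate by (x, y)."""
    x_r, y_r, theta = robot_pose
    x_g = x_r + x_l * math.cos(theta) - y_l * math.sin(theta)
    y_g = y_r + x_l * math.sin(theta) + y_l * math.cos(theta)
    return x_g, y_g
```

For example, a point one unit ahead of a robot at the global origin facing along the global Y axis maps to (0, 1).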
Building the currently observed local occupancy grid map and local line segment feature map from the data acquired by the ranging sensor mounted on the mobile robot comprises constructing both maps, where the local occupancy grid map is built by binary Bayesian filtering and the local line segment feature map is built as follows:
1) Fit preliminary line segments to the ranging sensor data using Hough-transform line fitting;
2) Test the preliminary segments for common direction, collinearity, and overlap, and merge the fitting point sets of segments that are roughly codirectional, collinear, and overlapping;
3) Fit an accurate line segment to each merged fitting point set using least-squares fitting.
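The least-squares fit of step 3) can be sketched as a total-least-squares fit in the normal form x·cosθ + y·sinθ + c = 0 used later in equation (4). The closed form below is a standard derivation, not taken verbatim from the patent:

```python
import math

def fit_line_normal_form(points):
    """Total-least-squares fit of a line x*cos(theta) + y*sin(theta) + c = 0
    to a merged fitting-point set. theta is the direction of the line's
    normal; c is minus the projection of the centroid onto that normal."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    syy = sum((y - my) ** 2 for _, y in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    # normal angle minimising the sum of squared orthogonal residuals
    theta = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)
    c = -(mx * math.cos(theta) + my * math.sin(theta))
    return c, theta
```

Fitting the collinear points (0,0), (1,1), (2,2) yields c = 0 and a line on which all three residuals vanish.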
Estimating the mobile robot's pose for the current cycle from the ranging sensor data, the dead reckoning sensor data, the previous cycle's pose estimate, and the previously built global line segment feature map comprises the following steps:
1) Predict the mobile robot's pose for the current cycle from the dead reckoning sensor data and the previous cycle's pose estimate, using the mobile robot's motion model;
2) From the predicted pose, extract from the global line segment feature map the features falling within the ranging sensor's scanning range, obtaining the line segment features expected to be currently visible;
3) Transform the expected currently visible line segment features from the global map coordinate system into the local map coordinate system;
4) Compute the pose correction offset by finding the best fit between the ranging sensor data and the expected currently visible line segment features;
5) Compute the current cycle's pose estimate from the predicted pose and the pose correction offset.
Predicting the mobile robot's pose for the current cycle from the dead reckoning sensor data and the previous cycle's pose estimate, using the motion model, comprises the following steps:
1) Compute the mobile robot's approximate velocity over the interval from the dead reckoning sensor data of the current cycle and of the previous cycle;
2) Compute the predicted pose for the current cycle from that approximate velocity and the previous cycle's pose estimate.
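Steps 1)-2) above can be sketched as follows. The patent does not spell out the motion model, so this assumes the common simplification of composing the odometry increment rigidly onto the previous pose estimate; all names are illustrative:

```python
import math

def predict_pose(prev_est, odo_prev, odo_cur):
    """Predict the current pose (x, y, theta) from the previous pose
    estimate and the increment between two dead-reckoning readings."""
    # odometry increment, expressed in the odometry frame
    dx = odo_cur[0] - odo_prev[0]
    dy = odo_cur[1] - odo_prev[1]
    dth = odo_cur[2] - odo_prev[2]
    # rotate the translational increment into the estimate's frame
    c = math.cos(prev_est[2] - odo_prev[2])
    s = math.sin(prev_est[2] - odo_prev[2])
    return (prev_est[0] + c * dx - s * dy,
            prev_est[1] + s * dx + c * dy,
            prev_est[2] + dth)
```

If the previous estimate faces along Y while odometry reports a one-unit forward move along its own X axis, the predicted position moves one unit along Y.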
Computing the pose correction offset by finding the best fit between the ranging sensor data and the expected currently visible line segment features comprises the following steps:
1) Match each ranging sensor data point to the line segment feature at the shortest point-to-segment distance;
2) Starting from the relationships among the data points matched to the same line segment feature, examine the matches and delete improper ones;
3) Define proportionally quantified match weights from five aspects: the matched segment feature's generation time, its confidence, whether it is a corner-forming segment, its length, and whether the data point is a fitting point of some local line segment feature;
4) Build the parametric model of the best fit for the matched portion from the condition that, at the best match, the distance from each data point to its matched segment feature is zero, and estimate the pose correction offset by weighted least-squares parameter estimation;
5) Translate and rotate the ranging sensor data by the pose correction offset;
6) Repeat steps 1)-5) until the residual of the weighted least-squares estimation falls below a threshold or a stable convergence condition is met.
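The nearest-feature association of step 1) can be sketched as below. The point-to-segment distance and the association loop are standard geometry; the match deletion and weighting of steps 2)-3) are omitted, and all names are illustrative:

```python
import math

def point_to_segment_distance(p, seg):
    """Euclidean distance from point p = (x, y) to the segment
    seg = ((x1, y1), (x2, y2)), clamping the projection to the segment."""
    (x, y), ((x1, y1), (x2, y2)) = p, seg
    vx, vy = x2 - x1, y2 - y1
    wx, wy = x - x1, y - y1
    t = max(0.0, min(1.0, (wx * vx + wy * vy) / (vx * vx + vy * vy)))
    px, py = x1 + t * vx, y1 + t * vy
    return math.hypot(x - px, y - py)

def match_points_to_segments(points, segments):
    """Associate each scan point with the index of its nearest segment
    (shortest point-to-segment distance), as in step 1)."""
    return [min(range(len(segments)),
                key=lambda j: point_to_segment_distance(p, segments[j]))
            for p in points]
```

With two vertical segments at x = 0 and x = 5, a point near each wall is matched to the nearer one.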
Updating the global occupancy grid map and global line segment feature map from the current cycle's pose estimate and the newly built local line segment feature map and local occupancy grid map comprises updating both maps, where the global occupancy grid map is updated by binary Bayesian filtering and the global line segment feature map is updated as follows:
1) For each line segment feature in the local line segment feature map, search the global line segment feature map for a global feature satisfying the common-direction, collinearity, and overlap requirements with it;
2) If one exists, merge the segments' fitting points, refit a segment by least-squares line fitting, and replace the original global feature;
3) If none exists, add the local feature to the global line segment feature map;
4) Compute the confidence of the line segment features in the global line segment feature map using the global occupancy grid map.
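The common-direction and collinearity tests of step 1) can be sketched using the normal-form line parameters (c, θ) of equation (4): two segments agree in direction when their normal angles are close (modulo 2π) and are collinear when their normal distances are close. The thresholds below are illustrative, not from the patent, and the overlap test is omitted:

```python
import math

def mergeable(seg_a, seg_b, ang_tol=0.1, dist_tol=0.05):
    """Check the common-direction and collinearity conditions for two
    segments given as (c, theta) in the normal form
    x*cos(theta) + y*sin(theta) + c = 0."""
    c_a, th_a = seg_a
    c_b, th_b = seg_b
    # smallest angular difference, wrapped into [-pi, pi)
    dth = abs((th_a - th_b + math.pi) % (2.0 * math.pi) - math.pi)
    return dth < ang_tol and abs(c_a - c_b) < dist_tol
```

Two nearly identical lines pass the test; two parallel lines one unit apart fail it.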
The method proposed by the present invention resolves the mutual coupling between the uncertainty of the robot's pose and the uncertainty of map building in unknown-environment mapping. It occupies little storage and computation and enables online map building of unknown environments, and the resulting global line segment feature map and global occupancy grid map are suitable for path planning and navigation by various application robots.
Brief Description of the Drawings
Fig. 1 is a software flowchart for simultaneous localization and map building of a mobile robot in an unknown environment;
Fig. 2 is the cone-shaped region constructed for each obstacle point datum during local occupancy grid map building;
Fig. 3 is a flowchart of estimating the mobile robot's pose for the current cycle in the method of the present invention;
Fig. 4 is a flowchart of computing the pose correction offset in the method of the present invention;
Fig. 5 is an example of a local occupancy grid map and a local line segment feature map drawn with the local occupancy grid map and local line segment feature map building methods of the present invention;
Fig. 5(a) is an example of obstacle point data acquired by the ranging sensor;
Fig. 5(b) is a local occupancy grid map built from the data of Fig. 5(a) with the local occupancy grid map building method of the present invention, where black denotes occupied area, white free area, and gray uncertain area;
Fig. 5(c) is a local line segment feature map built from the data of Fig. 5(a) with the local line segment feature map building method of the present invention;
Fig. 6 is another example of a local occupancy grid map and a local line segment feature map drawn with the local occupancy grid map and local line segment feature map building methods of the present invention;
Fig. 6(a) is another example of obstacle point data acquired by the ranging sensor;
Fig. 6(b) is a local occupancy grid map built from the data of Fig. 6(a) with the local occupancy grid map building method of the present invention, where black denotes occupied area, white free area, and gray uncertain area;
Fig. 6(c) is a local line segment feature map built from the data of Fig. 6(a) with the local line segment feature map building method of the present invention;
Fig. 7 is an example of computing the mobile robot's pose for the current cycle with the method of the present invention;
Fig. 7(a) shows the match between the obstacle point data acquired by the ranging sensor in the current cycle and the expected currently visible line segment features obtained from the current cycle's dead reckoning sensor data, where the black points are the obstacle data points observed by the ranging sensor and the black lines are the expected currently visible line segment features;
Fig. 7(b) shows the match between the same obstacle point data and the expected currently visible line segment features obtained from the robot's predicted pose for the current cycle computed by the method of the present invention;
Fig. 7(c) shows the match between the same obstacle point data and the expected currently visible line segment features obtained from the robot's pose for the current cycle computed by the method of the present invention;
Fig. 8 is an example of a global occupancy grid map and a global line segment feature map drawn with the method of the present invention;
Fig. 8(a) is a global occupancy grid map of a room 16.4 m long and 7.9 m wide drawn with the method of the present invention, where the gray line connects the robot positions given by the dead reckoning sensor and the black line connects the robot positions computed by the method of the present invention;
Fig. 8(b) is a global line segment feature map of the room of Fig. 8(a) drawn with the method of the present invention.
Detailed Description
An embodiment of simultaneous localization and map building of a robot in an unknown environment according to the present invention is described in detail below with reference to the drawings. In this embodiment, the robot moves independently; its ranging sensor is a laser range finder, a sonar range finder, a stereo vision system, or a combination thereof, and its dead reckoning sensor is an odometer.
Fig. 1 is the software flowchart of the method. The mobile robot starts from an arbitrary position in the unknown environment, facing an arbitrary direction. It first collects sensor data (step S1), builds the local occupancy grid map and local line segment feature map (step S2), and converts the local occupancy grid map data and local line segment feature map data into the global map coordinate system, yielding the initial global occupancy grid map and global line segment feature map (step S3). The robot then starts moving (step S4), stopping at fixed intervals to collect sensor data (step S5) and perform simultaneous localization and map building (step S6), then continues moving, and so on in a loop that exits when the robot stops moving. Simultaneous localization and map building comprises: building the local occupancy grid map and local line segment feature map; estimating and correcting the robot pose; and updating the global occupancy grid map and global line segment feature map.
In this embodiment, the ranging sensor acquires a set of distances and angles, relative to the robot, of points on obstacles in the environment in the ranging plane. With the local map coordinate system constructed with the current mobile robot position as origin and the current heading as X axis, the detected obstacle point data are pairs (ri, αi), i = 1, ..., n, in the local map coordinate system, where n is the number of points, ri the distance from point i to the origin, and αi the angle between the line from the origin to the point and the X axis. The Cartesian coordinates of each obstacle point are (xi, yi), with xi = ri cos αi, yi = ri sin αi.
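The polar-to-Cartesian conversion above is direct; a minimal sketch:

```python
import math

def scan_to_cartesian(scan):
    """Convert ranging-sensor readings (r_i, alpha_i) in the local frame
    into Cartesian points (x_i, y_i) = (r*cos(alpha), r*sin(alpha))."""
    return [(r * math.cos(a), r * math.sin(a)) for r, a in scan]
```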
The local occupancy grid map is built by binary Bayesian filtering. In this embodiment, for each obstacle point datum (ri, αi) the cone-shaped region shown in Fig. 2 is first constructed, where d1 = ri − d, d2 = ri + d, and θ is half the angle between the lines from the coordinate origin to adjacent obstacle points. For this obstacle point, the occupancy probability of a grid cell in region 1 is computed as
f = 1 − E_r · E_α, (1)
where E_r = 1 − k_r·(l/d_1)², E_α = 1 − k_α·(β/θ)², k_r and k_α are constant coefficients, l is the distance from the cell to the cone apex, and β is the angle between the line joining the cell to the cone apex and the cone's center line. The occupancy probability of a cell in region 2 is computed as
f = O_r · O_α, (2)
where O_r = 1 − k_r·((l − r_i)/d)² and O_α = 1 − k_α·(β/θ)². A coordinate transformation determines the position in the local occupancy grid map corresponding to each cell of the cone-shaped region, and the formula
p = p′·f / (p′·f + (1 − p′)·(1 − f)) (3)

is used to compute the occupancy probability of the corresponding cell in the local occupancy grid map, where p′ is the cell's previous occupancy probability and p is the updated occupancy probability.
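A sketch of the per-cell computation, assuming illustrative values for the constants k_r, k_α and the standard binary Bayes form for the update in formula (3) (the patent's own image of the formula is not reproduced in the text):

```python
def region1_prob(l, beta, d1, theta, kr=0.5, ka=0.5):
    """Formula (1): occupancy probability of a cell in region 1, where l is the
    cell's distance to the cone apex and beta its angular offset from the cone
    center line. The coefficient values kr, ka are illustrative assumptions."""
    Er = 1.0 - kr * (l / d1) ** 2
    Ea = 1.0 - ka * (beta / theta) ** 2
    return 1.0 - Er * Ea

def bayes_update(p_prev, f):
    """Binary Bayes update of a cell's occupancy probability from its previous
    value p_prev and the new evidence f (the standard form assumed for (3))."""
    num = p_prev * f
    return num / (num + (1.0 - p_prev) * (1.0 - f))

# At the cone apex (l = 0, beta = 0) the cell is observed free (f = 0).
# Repeated 'occupied' evidence (f > 0.5) drives the probability up:
p = 0.5
for f in (0.8, 0.8, 0.8):
    p = bayes_update(p, f)
```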
In this embodiment, each line-segment feature is described by the parameter tuple [c, θ, l, P, gt, conf, FLine], denoted L. Here c and θ are the parameters of the line containing the segment, whose equation is
x cos θ + y sin θ + c = 0. (4)
c is the distance from the origin to the line, and θ is the angle between the line's normal and the x axis of the coordinate system. l is the segment length, and P = (x_c, y_c)^T is the position of the segment's midpoint (T denotes transpose). gt is a timestamp attribute recording the time at which the segment was generated. conf is the segment's confidence attribute. FLine is a Boolean indicating whether the segment forms part of a corner: true if it does, false otherwise.
In constructing the local line-segment feature map in this embodiment, the Hough-transform line-fitting method is first applied to the set of obstacle points obtained from the range sensor, to make a preliminary assignment of points to line segments. Specifically, a two-dimensional voting grid is constructed at a chosen discretization, one dimension holding discrete values of the line parameter c and the other discrete values of the line parameter θ. For each obstacle point (x_i, y_i), a family of lines with parameters (c_j, θ_j) passes through the point, and each such line casts a vote in the corresponding grid cell. By counting the votes, the line segment each obstacle point belongs to, and that segment's parameters, can be determined.
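The voting scheme can be sketched as follows; the grid resolutions and the parameter range are illustrative assumptions, not values from the patent:

```python
import math

def hough_vote(points, n_theta=180, c_max=10.0, n_c=200):
    """Vote for lines x*cos(theta) + y*sin(theta) + c = 0 over a discretized
    (theta, c) grid and return the winning (theta, c). The grid resolutions
    and the range c_max are illustrative assumptions."""
    votes = {}
    for x, y in points:
        for it in range(n_theta):
            t = math.pi * it / n_theta
            # For each discrete theta, the line through (x, y) fixes c:
            c = -(x * math.cos(t) + y * math.sin(t))
            ic = int(round((c + c_max) / (2.0 * c_max) * n_c))
            votes[(it, ic)] = votes.get((it, ic), 0) + 1
    (it, ic), _ = max(votes.items(), key=lambda kv: kv[1])
    return math.pi * it / n_theta, ic * 2.0 * c_max / n_c - c_max

# Ten points on the horizontal line y = 3 (theta = pi/2, c = -3):
theta, c = hough_vote([(0.5 * i, 3.0) for i in range(10)])
```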
Because of measurement error, however, the obstacle points measured by the range sensor are not exactly collinear, and it is difficult to choose a suitable discrete grid size for the Hough transform; a group of points that should be collinear often produces more than one fitted segment. Therefore, in the local line-segment feature map construction of this embodiment, after Hough-transform line fitting, every pair of segments in the resulting segment set is tested for collinearity. When segments L_1 and L_2 satisfy the following collinearity rules, the two segments are considered collinear and their point sets are merged.
1) Same orientation: |θ_1 − θ_2| ≤ Δθ, where Δθ is a threshold;
2) Collinearity: ||c_1| − |c_2|| ≤ Δc, where Δc is a threshold;
3) Overlap: the two segments must overlap along their common line.
After the point sets of roughly coincident segments have been merged via the collinearity test, least-squares line fitting is used to obtain more precise segment parameters c, θ, l, and P. The parameter gt is set to the current time, and conf = 1. FLine is set to true when the angle between the segment and another segment is approximately 90 degrees and their intersection point lies on the segment.
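A compact sketch of the least-squares (orthogonal) fit that recovers the segment parameters c, θ, l, and P from a merged point set, using the classical closed-form solution for the normal angle (all names are illustrative):

```python
import math

def fit_segment(points):
    """Orthogonal least-squares fit of a point set, returning the segment
    parameters (c, theta, l, P) for the line x*cos(theta)+y*sin(theta)+c=0.
    Uses the classical closed form for the normal angle."""
    n = len(points)
    xm = sum(x for x, _ in points) / n
    ym = sum(y for _, y in points) / n
    sxx = sum((x - xm) ** 2 for x, _ in points)
    syy = sum((y - ym) ** 2 for _, y in points)
    sxy = sum((x - xm) * (y - ym) for x, y in points)
    theta = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)   # normal angle
    c = -(xm * math.cos(theta) + ym * math.sin(theta))
    # Project the points onto the line direction to get length and midpoint.
    dx, dy = -math.sin(theta), math.cos(theta)
    t = [(x - xm) * dx + (y - ym) * dy for x, y in points]
    l = max(t) - min(t)
    mid = (max(t) + min(t)) / 2.0
    P = (xm + dx * mid, ym + dy * mid)
    return c, theta, l, P

# Five points on the vertical segment x = 2, y in [0, 4]:
c, theta, l, P = fit_segment([(2.0, float(i)) for i in range(5)])
```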
In this embodiment, the pose of the mobile robot in the current period is estimated from the range-sensor data, the dead-reckoning sensor data, the pose estimate of the previous period, and the previously built global line-segment feature map and global occupancy grid map; the steps are shown in Figure 3. The global map coordinate system is constructed with the robot's initial pose as the origin and its initial heading as the X axis. Both the dead-reckoning sensor data and the robot pose estimates are expressed in the global map coordinate system. The robot's initial pose in the global occupancy grid map and the global line-segment feature map is therefore (0, 0, 0).
In step S61, the pose of the mobile robot in the current period is predicted from the dead-reckoning sensor data and the previous period's pose estimate, using the mobile robot's motion model. Let the current period be t; from the dead-reckoning sensor readings at periods t−1 and t, the robot's velocity over the interval is obtained.
From this velocity and the pose estimate of the mobile robot at period t−1, the predicted pose of the mobile robot at period t is obtained.
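A minimal sketch of the prediction step, assuming a simple drive-then-turn odometry model (the patent's own motion-model equations are not reproduced in this excerpt, so the model below is an assumption):

```python
import math

def predict_pose(pose_prev, d_trans, d_rot):
    """Predict the period-t pose from the period t-1 estimate and the
    translation d_trans / heading change d_rot obtained from dead reckoning.
    A simple drive-then-turn odometry model is assumed here."""
    x, y, th = pose_prev
    return (x + d_trans * math.cos(th),
            y + d_trans * math.sin(th),
            th + d_rot)

# From the global origin, drive 1 m along the heading, then turn 90 degrees:
pose = predict_pose((0.0, 0.0, 0.0), 1.0, math.pi / 2)
```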
In step S62, based on the predicted pose, the line-segment features of the global map expected to be visible from the current position are selected; these form the set Γ.
In this embodiment, the local map coordinate system is chosen as the matching coordinate system, so in step S63 the line-segment features expected to be currently visible are transformed from the global map coordinate system into the local map coordinate system; that is, the segments of the set Γ are transformed into the local map coordinate system. Denote a segment of Γ by L_W: {c_W, θ_W, l_W, P_W, gt_W, conf_W}, and the segment transformed into the local map coordinate system by L_R: {c_R, θ_R, l_R, P_R, gt_R, conf_R}; the transformation is the rigid-body change of coordinates defined by the predicted pose.
The transformed line-segment features form the set Γ′.
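The world-to-local transformation of a line can be derived directly from the line equation (4) and the rigid-body change of coordinates; the sketch below is an illustration of that derivation, not the patent's own transformation equations (which are not reproduced in the text):

```python
import math

def line_world_to_robot(c_w, th_w, pose):
    """Transform a line x*cos(th)+y*sin(th)+c = 0 from the global frame into
    the robot-centred local frame, for a robot pose (x, y, phi). Derived from
    the rigid-body change of coordinates."""
    x, y, phi = pose
    th_r = th_w - phi
    c_r = c_w + x * math.cos(th_w) + y * math.sin(th_w)
    return c_r, th_r

# A wall x = 5 in the global frame (th_w = 0, c_w = -5), robot at (2, 0):
c_r, th_r = line_world_to_robot(-5.0, 0.0, (2.0, 0.0, 0.0))
# The wall is 3 m ahead in the local frame: x = 3, i.e. c_r = -3, th_r = 0.
```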
In step S64, the pose-correction offset is computed by finding the best fit between the range-sensor data and the line-segment features expected to be currently visible. Figure 4 shows the implementation steps.
In step S641, for each obstacle point (x_i, y_i) measured by the range sensor, the nearest line-segment feature in Γ′ is found. In this embodiment, the distance from the obstacle data point to the nearest segment feature is required to be less than a threshold; if the condition is met, that segment feature is called the matching segment of the data point, and the data point is called a matching point. All matching points form the set V.
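Step S641 can be sketched as a nearest-segment search with a distance gate; the segment endpoints and the threshold value below are hypothetical:

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to the segment with endpoints a and b."""
    px, py = p
    ax, ay = a
    bx, by = b
    vx, vy = bx - ax, by - ay
    t = ((px - ax) * vx + (py - ay) * vy) / (vx * vx + vy * vy)
    t = max(0.0, min(1.0, t))          # clamp the projection onto the segment
    return math.hypot(px - (ax + t * vx), py - (ay + t * vy))

def match_points(points, segments, dist_thresh):
    """For each scan point, find the nearest segment; keep the pair only if the
    distance is below dist_thresh. Returns the matching set V as (point, index)."""
    V = []
    for p in points:
        d, j = min((point_segment_dist(p, a, b), j)
                   for j, (a, b) in enumerate(segments))
        if d < dist_thresh:
            V.append((p, j))
    return V

# Two segments (a hypothetical map) and three scan points, gate of 0.5 m;
# the far point (10, 10) fails the gate and is dropped:
segs = [((0.0, 0.0), (4.0, 0.0)), ((0.0, 0.0), (0.0, 4.0))]
V = match_points([(1.0, 0.1), (0.1, 3.0), (10.0, 10.0)], segs, 0.5)
```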
The matching of step S641 can easily produce spurious matches, which degrade the accuracy of the computed pose-correction offset and must therefore be removed. In step S642, matches are tested and points judged spurious are deleted from the set V. One embodiment of the spurious-match test is as follows: let L_p be a matching segment, connect the matching points of V whose matching segment is L_p in clockwise or counterclockwise order, and denote them by the set Z = {v_i, i = 1, …, k}, v_i = (x_i, y_i)^T, where k is the number of matching points. Fit a line segment to the points of Z by least squares and denote the fitted segment L_q. Compute the sum d of the distances from the points to the fitted segment.
If |θ_p − θ_q| > threshold, i.e. the angle between the segment fitted to the matching points and the matching segment is large, the match is considered wrong, and all points of Z are deleted from V. If |θ_p − θ_q| ≤ threshold but d > threshold, some of the points of Z are considered mismatched. To remove these mismatches, each point of Z is tested as follows:
1) If v_i is a fitting point of some segment L_o in the current local line-segment feature map, go to (3); otherwise go to (2);
2) Take the previous matching point v_{i−1} and the next matching point v_{i+1} of v_i in Z, and compute the lengths d_1, d_2 and angles θ_1 = ∠v_{i−1}v_i, θ_2 = ∠v_i v_{i+1} of the lines joining v_i to these two points, with Δθ_1 = |θ_p − θ_1| and Δθ_2 = |θ_p − θ_2|. If d_1 or d_2 exceeds a set value and both Δθ_1 and Δθ_2 are below a set value, the match is correct; otherwise it is a mismatch.
3) If the local segment L_o and the matching segment L_p satisfy the collinearity requirements, the match is correct; otherwise it is a mismatch.
Points judged mismatched are deleted from V. Step S642 yields the matching-point set V′.
In step S643, a matching weight w_i is defined for each matching point v_i = (x_i, y_i)^T in the matching-point set V′; w_i is computed as a combination of four terms, described next.
In the formula, L_p is the matching segment feature corresponding to v_i, and LineDot is a Boolean indicating whether v_i is a fitting point of some segment feature in the current local line-segment feature map; if so it is true, and that local segment is denoted L_o. The first term of the formula expresses that the earlier the matching segment was generated, the higher the matching weight, because segment features are continually refined as the map is updated. The second term gives a higher matching weight when the matching segment forms part of a corner. The third term expresses that the more credible the matching segment, the higher the matching weight. The fourth term expresses that when the matching point is a fitting point of some local segment, its weight is higher than that of a matching point not fitted to any local segment; and when the local segment's length is no greater than the matching segment's, the weight is higher than when the local segment is longer than the matching segment, because the latter case indicates that the matching segment will be revised.
Step S644 computes the pose-correction offset Δ = (Δ_x, Δ_y, Δ_θ)^T. In this embodiment, a parametric model is first constructed from the best-match condition. The best match is achieved when, after every matching point v_i in V′ is rotated and translated by the pose-correction offset Δ, the distance from each matching point to its corresponding matching line is 0. Let v_i′ denote the point obtained by rotating and translating v_i by Δ.
Here v_i′ is obtained from v_i by a rotation through Δ_θ about the centroid v̄ of all points of the matching set V′, followed by the translation (Δ_x, Δ_y). The distance from v_i′ to the matching line L_p of v_i is d_i = r_i − u_i·v_i′, where u_i is the unit normal vector of L_p and r_i is the dot product of u_i with any point on the matching line. From the line equation, u_i = [u_i1, u_i2] = [cos θ_p, sin θ_p] and r_i = −c_p. The best-match condition d_i = 0 then gives
where y_i = r_i − u_i·v_i is the residual of matching point v_i before correction.
Since the set V′ contains m points in total, we obtain
Y = ΦΔ. (13)
where Y is the m×1 column vector formed by the y_i and Φ is the corresponding m×3 matrix. The pose-correction offset Δ can then be computed by weighted least-squares estimation,
Δ = (Φ^T W Φ)^−1 Φ^T W Y. (14)
where W is the m×m diagonal positive-definite weighting matrix with W[i, i] = w_i, the matching weight of matching point v_i. The matched parts then achieve the best fit, i.e. the weighted matching error J_W = (Y − ΦΔ)^T W (Y − ΦΔ) attains its minimum.
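The intermediate formulas leading to equation (13) are not reproduced above; the sketch below therefore assumes the standard small-angle linearization of the rotation about the centroid v̄ of the matching points (as in Cox-style point-to-line matching) when forming the rows of Φ, and solves equation (14) by forming the 3×3 normal equations directly. All names are hypothetical.

```python
import math

def solve_offset(matches, weights):
    """Solve Delta = (Phi^T W Phi)^-1 Phi^T W Y, equation (14). Each match is
    (x, y, theta_p, c_p): a matching point and the normal-form parameters of
    its matching line. The rows of Phi use a small-angle linearization of the
    rotation about the centroid of the matching points (an assumption)."""
    m = len(matches)
    vx = sum(t[0] for t in matches) / m            # centroid of matching points
    vy = sum(t[1] for t in matches) / m
    rows, ys = [], []
    for x, y, th, c in matches:
        u1, u2 = math.cos(th), math.sin(th)        # unit normal u_i
        r = -c                                     # r_i = -c_p
        # columns: d(distance)/d(dx), d/d(dy), d/d(dtheta) about the centroid
        rows.append((u1, u2, -u1 * (y - vy) + u2 * (x - vx)))
        ys.append(r - (u1 * x + u2 * y))           # residual y_i
    # normal equations A * Delta = b, with A = Phi^T W Phi and b = Phi^T W Y
    A = [[sum(w * ri[i] * ri[j] for w, ri in zip(weights, rows))
          for j in range(3)] for i in range(3)]
    b = [sum(w * ri[i] * yi for w, ri, yi in zip(weights, rows, ys))
         for i in range(3)]
    for k in range(3):                             # Gauss-Jordan elimination
        piv = A[k][k]
        A[k] = [a / piv for a in A[k]]
        b[k] /= piv
        for i in range(3):
            if i != k:
                f = A[i][k]
                A[i] = [a - f * ak for a, ak in zip(A[i], A[k])]
                b[i] -= f * b[k]
    return tuple(b)                                # (dx, dy, dtheta)

# Points displaced by (0.5, -0.3) from a vertical line x = 3 and a horizontal
# line y = 2 (both lines in normal form); the solver recovers the offset:
matches = [(2.5, 0.0, 0.0, -3.0), (2.5, 4.0, 0.0, -3.0),
           (0.0, 2.3, math.pi / 2, -2.0), (4.0, 2.3, math.pi / 2, -2.0)]
dx, dy, dth = solve_offset(matches, [1.0, 1.0, 1.0, 1.0])
```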
Step S645 translates and rotates each obstacle data point by the pose-correction offset. Step S641 is then executed again, and after it completes a convergence test is performed. In this embodiment, the convergence condition is that the residual of the weighted least-squares estimate falls below a set value or stabilizes at a fixed value. When the convergence condition is satisfied, the obtained pose-correction offset is the desired one; when it is not, the procedure returns to step S642, and the loop repeats until the convergence condition is satisfied.
Step S65 is then executed: the predicted pose of the mobile robot for the current period is corrected by the resulting pose-correction offset, yielding the pose estimate of the mobile robot for the current period.
From the resulting robot pose estimate, the local line-segment feature map, and the local occupancy grid map, the global line-segment feature map and the global occupancy grid map can be updated. Using the estimated robot pose, the position in the global occupancy grid map corresponding to each cell of the local occupancy grid map is computed, and formula (3) is used to update that cell's occupancy probability. Likewise, using the estimated robot pose, each segment feature of the local line-segment feature map is transformed into the global line-segment feature map, and the global map is searched for a global segment feature satisfying the same-orientation, collinearity, and overlap requirements with the local segment feature. If one exists, the segments' fitting points are merged, the segment feature attributes are recomputed by least-squares line fitting, and the original global segment feature is replaced; if none exists, the segment feature is added to the global line-segment feature map. Finally, the confidence of each segment feature is computed from the global occupancy grid map as the length of the segment feature lying within occupied cells divided by the segment feature's total length. When the confidence falls below a set value, the segment feature is deleted from the global line-segment feature map.
Claims (9)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CNB2006100536902A CN100449444C (en) | 2006-09-29 | 2006-09-29 | A Method for Simultaneous Localization and Map Construction of Mobile Robots in Unknown Environments |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN101000507A CN101000507A (en) | 2007-07-18 |
| CN100449444C true CN100449444C (en) | 2009-01-07 |
Family
ID=38692502
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CNB2006100536902A Expired - Fee Related CN100449444C (en) | 2006-09-29 | 2006-09-29 | A Method for Simultaneous Localization and Map Construction of Mobile Robots in Unknown Environments |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN100449444C (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102597897A (en) * | 2009-11-06 | 2012-07-18 | 株式会社日立制作所 | Mobile robot system |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH10260724A (en) * | 1997-03-19 | 1998-09-29 | Yaskawa Electric Corp | Passage environment map generation method |
| JPH11305833A (en) * | 1998-04-20 | 1999-11-05 | Yaskawa Electric Corp | Landmark information generator |
| CN1399734A (en) * | 1999-11-24 | 2003-02-26 | 个人机器人技术公司 | Autonomous multi-platform robot system |
| JP2004276168A (en) * | 2003-03-14 | 2004-10-07 | Japan Science & Technology Agency | Map creation system for mobile robots |
| CN1617170A (en) * | 2003-09-19 | 2005-05-18 | 索尼株式会社 | Environment identification device and method, route design device and method and robot |
Also Published As
| Publication number | Publication date |
|---|---|
| CN101000507A (en) | 2007-07-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN100449444C (en) | A Method for Simultaneous Localization and Map Construction of Mobile Robots in Unknown Environments | |
| CN112347840B (en) | Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method | |
| Boniardi et al. | Robust LiDAR-based localization in architectural floor plans | |
| CN111260751B (en) | Mapping method based on multi-sensor mobile robot | |
| CN102402225B (en) | Method for simultaneous localization and map building of a mobile robot | |
| CN109186606B (en) | Robot composition and navigation method based on SLAM and image information | |
| Bonnabel et al. | On the covariance of ICP-based scan-matching techniques | |
| CN113674412B (en) | Pose fusion optimization-based indoor map construction method, system and storage medium | |
| CN113776519A (en) | AGV mapping and autonomous navigation with obstacle avoidance in unlit, dynamic open environments | |
| CN106595659A (en) | Map merging method for unmanned aerial vehicle visual SLAM in complex urban environments | |
| CN114429432A (en) | Multi-source information layered fusion method, device and storage medium | |
| CN114111818A (en) | A general visual SLAM method | |
| Zhao et al. | Review of SLAM techniques for autonomous underwater vehicles | |
| CN116380039A (en) | A mobile robot navigation system based on solid-state lidar and point cloud map | |
| CN118707542A (en) | Self-positioning method, system, electronic device and storage medium for mobile robot | |
| Aghamohamadi | Feature-based laser scan matching for accurate and high speed mobile robot localization | |
| Fu et al. | Semantic map-based visual localization with consistency guarantee | |
| CN115060268A (en) | A computer room fusion positioning method, system, equipment and storage medium | |
| KR101502071B1 (en) | Camera Data Generator for Landmark-based Vision Navigation System and Computer-readable Media Recording Program for Executing the Same | |
| Zeghmi et al. | A Kalman-particle hybrid filter for improved localization of AGV in indoor environment | |
| Emter et al. | Stochastic cloning for robust fusion of multiple relative and absolute measurements | |
| Zhao et al. | L-VIWO: Visual-inertial-wheel odometry based on lane lines | |
| Zeng et al. | An indoor global localization technique for mobile robots in long straight environments | |
| Wang et al. | Monocular visual-inertial localization in a point cloud map using feature-to-distribution registration | |
| CN117571002A (en) | Visual navigation method and system based on semi-uniform topology metric map |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| C14 | Grant of patent or utility model | ||
| GR01 | Patent grant | ||
| EE01 | Entry into force of recordation of patent licensing contract | Application publication date: 2007-07-18; Assignee: HANGZHOU NANJIANG ROBOTICS Co.,Ltd.; Assignor: Zhejiang University; Contract record no.: 2015330000100; Denomination of invention: Method for simultaneous localization and map construction of a mobile robot in an unknown environment; Granted publication date: 2009-01-07; License type: Exclusive License; Record date: 2015-05-08 | |
| LICC | Enforcement, change and cancellation of record of contracts on the licence for exploitation of a patent or utility model | | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2009-01-07; Termination date: 2021-09-29 | |



