
CN101008571A - Three-dimensional environment perception method for mobile robot - Google Patents

Three-dimensional environment perception method for mobile robot

Info

Publication number
CN101008571A
Authority
CN
China
Prior art keywords: environment, map, robot, dimensional, information
Prior art date
Legal status
Pending
Application number
CN 200710034343
Other languages
Chinese (zh)
Inventor
蔡自兴
邹小兵
王璐
段琢华
于金霞
文志强
陈白帆
郑敏捷
Current Assignee
Central South University
Original Assignee
Central South University
Priority date
Filing date
Publication date
Application filed by Central South University filed Critical Central South University
Priority to CN 200710034343 priority Critical patent/CN101008571A/en
Publication of CN101008571A publication Critical patent/CN101008571A/en
Pending legal-status Critical Current

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A three-dimensional environment perception method for a mobile robot, consisting of four parts: acquisition of environment information, adaptive filtering of the environment information, coordinate transformation of the environment information, and three-dimensional environment perception. The invention acquires information about the robot's surroundings through a purpose-built perception platform composed of a two-dimensional lidar, a rotating pan/tilt and stepper motors; by controlling the stepper motors, the platform is rotated in the pitch and horizontal directions to acquire environment information. To deal with noise in the environment information, a dynamic adaptive filter is proposed for real-time denoising. For the perception platform, coordinate transformation formulas are derived to convert the environment information into a height map. For the resulting height map, a three-dimensional environment perception method is proposed that analyzes terrain flatness and segments the environment map into traversable areas and obstacle areas. The method provides a local environment map for the robot's obstacle avoidance.

Description

A three-dimensional environment perception method for a mobile robot

Technical Field

The invention relates to external information perception and navigation methods for mobile robots, and in particular to an environment perception method for unstructured environments.

Background Art

For a long time there has been a lack of adequate means of acquiring three-dimensional environment information. Multi-camera computer vision is easily affected during 3D modeling by factors such as illumination, resolution, focus adjustment and the choice of points of interest. Lidar-based 3D imaging, developed in recent years, can effectively overcome the difficulty that vision-based techniques have in obtaining depth information. Laser ranging systems include single-point ranging sensors, two-dimensional lidars that perform line scans in a plane, and three-dimensional lidars that can perform area scans of a region. In structured operating environments, a lidar is usually mounted horizontally at a certain height on the mobile robot to detect obstacles in a two-dimensional plane. In unstructured environments, obstacles of different heights may be present, so the lidar must be able to perform area scans. However, lidar systems currently capable of three-dimensional environment measurement are still very expensive, and their size and weight are not yet suitable for use on ordinary mobile robots.

Summary of the Invention

To overcome the limitations of two-dimensional lidar, the invention designs and implements a three-dimensional environment perception method for mobile robots based on a two-dimensional lidar. The method enables the robot not only to acquire information about its surroundings but also to perceive the three-dimensional environment and to distinguish traversable areas from obstacle areas in the environment map.

The technical solution adopted by the invention to solve this problem is as follows: the invention consists of four parts, namely acquisition of environment information, adaptive filtering of the environment information, coordinate transformation of the environment information, and three-dimensional environment perception. Each part is described below.

1. Acquisition of Environment Information

The information about the robot's surroundings is acquired through a purpose-built perception platform. The platform consists of three parts: a two-dimensional lidar, a rotating pan/tilt and stepper motors. Precise rotation in the horizontal and pitch directions is achieved with two high-precision electronically controlled turntables; the sensor pan/tilt serves as the "head" of the robot. The pan/tilt combines a horizontal turntable and a pitch turntable. Information processing for the lidar and control of the sensor pan/tilt are carried out by an industrial personal computer (IPC). The IPC is fitted with a PCI high-speed serial communication interface card for communication with the lidar, and an ISA stepper-motor control card drives the horizontal and pitch rotation of the sensor pan/tilt. A turntable control strategy is set on the IPC; based on the acquired external information, the control program selects an appropriate strategy to control the pitch of the platform.

2. Adaptive Filtering of the Environment Information

The range measurements of the lidar often contain a certain amount of noise. The interference encountered during measurement falls mainly into two categories, active interference and mixed-pixel interference. In addition, occlusion by obstacles creates blind zones in the lidar scan, and there are gaps between scan points; when the robot moves forward or turns at a certain speed while the lidar scans the environment at 25 Hz, further scanning gaps may appear. The lidar range measurements must therefore be filtered.

3. Coordinate Transformation of the Environment Information

The three-dimensional coordinate transformation of the lidar measurements is carried out in two steps. In the first step, the measurements are mapped into a three-dimensional coordinate system whose reference center is the robot and whose reference plane is the robot's body platform. Let {O2} be the coordinate system at the center of the lidar scanning circle in the sensor pan/tilt; the pan/tilt surface rotates about the y1 axis of the coordinate system {O1} with pitch angle γp. The horizontal rotation of the pan/tilt is equivalent to a rotation about the zr axis of the robot reference coordinate system {Or}, with rotation angle γh. O1 is translated by d0 along zr in {Or}; O2 is translated by d1 along z1 and by d2 along x1 in {O1}.

The scanning plane of the lidar is a radial sector on the x2-y2 coordinate plane, centered at O2 and covering −90° to +90°. The measurements are expressed in polar coordinates (ρi,j, λi,j), where ρi,j is the measured range and λi,j is the polar angle (with x2 as the polar axis). The subscript i denotes the time instant measured in main-program cycles (40 ms); at instant i, the lidar data packet contains 361 measurements. The subscript j is the index of a measurement point within a packet. Each measurement (ρi,j, λi,j) is converted into the vector u|O2 = [ρi,j cos λi,j, ρi,j sin λi,j, 0, 1]^T.

In the second step, the change of the mobile robot's coordinates on the three-dimensional terrain surface is taken into account and the measurements are mapped into the global coordinate system {O}. The pose of the mobile robot is represented by the state (xi, yi, zi, θi, αi, φi), where θi is the heading angle, αi is the pitch angle and φi is the roll angle of the robot.

After the coordinate transformation, a two-dimensional array A[m][n] records the height of the environmental terrain over the plane; the value of A[m][n] represents the height of the terrain surface relative to the reference plane.

4. Three-Dimensional Environment Perception

While the mobile robot is moving, the perception platform scans the environment at a fixed pitch angle (for example −45°). The environment in front of the robot is detected by the platform and, after adaptive filtering and coordinate transformation, converted into a height map; the three-dimensional environment map is created by accumulating such height maps. Before the three-dimensional environment can be perceived, the height map obtained from the platform must be filtered for noise and its grid defects interpolated, so as to reduce measurement noise and the surface gaps caused by scanning gaps. The main purpose of three-dimensional environment perception is to accumulate the height maps, analyze the terrain flatness of the resulting height map and segment the environment map into traversable areas and obstacle areas.

The invention offers good flexibility, enables obstacle detection and terrain analysis in unstructured environments, and provides a local environment map for the robot's obstacle avoidance.

Brief Description of the Drawings

Fig. 1 is a flowchart of the three-dimensional environment perception method for a mobile robot;

Fig. 2 is a schematic diagram of the lidar and the rotating pan/tilt;

Fig. 3 is a schematic diagram of the coordinate transformation of the lidar measurement system.

Detailed Description of the Embodiments

The invention is further described below with reference to the accompanying drawings and embodiments.

Fig. 2 is a schematic diagram of the lidar and the rotating pan/tilt: 1 is the pitch rotation pan/tilt, 2 is the lidar, 3 is the scanning optical center, 4 is the horizontal rotation center, 5 is the pitch rotation center, 6 is a stepper motor, 7 is the horizontal rotation platform and 8 is a stepper motor.

The two-dimensional lidar 2 is mounted on the pitch rotation pan/tilt 1; the industrial personal computer (IPC) drives stepper motor 6 to rotate lidar 2 vertically about the pitch rotation center 5. Lidar 2, pitch rotation center 5 and stepper motor 6 are all mounted above the horizontal rotation platform 7; the IPC drives stepper motor 8 to rotate the lidar horizontally about the horizontal rotation center 4.

Fig. 3 shows the coordinate system {O2} at the center of the lidar scanning circle, the pan/tilt coordinate system {O1} and the global coordinate system {O}.

1. Acquisition of Environment Information

The information about the robot's surroundings is acquired through a purpose-built perception platform consisting of a two-dimensional lidar, a rotating pan/tilt and stepper motors. Precise rotation in the horizontal and pitch directions is achieved with two high-precision electronically controlled turntables; the sensor pan/tilt serves as the "head" of the robot. The pan/tilt combines a horizontal turntable and a pitch turntable and provides a scanning range of ±150° in the horizontal direction and −50° to +15° in pitch. The turntables are driven by 24 VDC stepper motors with a step angle of 1.8°. The driver uses 10× microstepping, so one drive pulse produces a motor rotation of 0.18°. With the turntable's mechanical reduction ratio of 180:1 (the turntable completes one revolution for every 180 motor revolutions), one step pulse rotates the turntable by 0.001°, which guarantees the positioning accuracy of the turntable at the electrical-control level. The lidar and its mounting parts weigh about 6 kg; the maximum horizontal rotation speed is 16°/s, the maximum pitch rotation speed is 8°/s, and the repeat positioning accuracy of the turntable is 0.01°.
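A minimal sketch of the drive-train arithmetic described above (the constants are the figures quoted in the text; the function name and the example angle are illustrative):

```python
def pulses_for_turntable_angle(delta_deg,
                               step_angle_deg=1.8,   # full-step angle of the 24 VDC stepper motor
                               microsteps=10,        # 10x driver subdivision: 0.18 deg per pulse at the motor
                               gear_ratio=180):      # 180:1 mechanical reduction of the turntable
    """Return the number of drive pulses for a turntable rotation of delta_deg degrees."""
    deg_per_pulse = step_angle_deg / microsteps / gear_ratio   # 1.8 / 10 / 180 = 0.001 deg per pulse
    return round(delta_deg / deg_per_pulse)

print(pulses_for_turntable_angle(45))    # 45000 pulses to turn the turntable by 45 degrees
```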

Information processing for the lidar and control of the sensor pan/tilt are carried out by an industrial personal computer (IPC). The IPC is fitted with a PCI high-speed serial communication interface card that communicates with the lidar over an RS422 interface at 500 kbaud. An ISA stepper-motor control card (PCL839) controls the horizontal and pitch rotation of the sensor pan/tilt. A turntable control strategy is set on the IPC; based on the acquired external information, the control program selects an appropriate strategy to control the pitch of the platform.

The lidar operates in 180°/0.5° mode, yielding 361 measurements per scan. Each measurement occupies 2 bytes, and a data packet including the start code and checksum is 732 bytes long. At a communication rate of 500 kbaud the transmission delay is about 13 ms and the lidar scan time is 26.67 ms, giving a scan frequency of about 25 Hz, i.e. 25 × 361 = 9025 measurement points per second.
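A minimal sketch of how such a packet could be unpacked into the polar measurements (ρi,j, λi,j) used in the following sections; the split of the 10 framing bytes into header and checksum, and the little-endian 16-bit encoding of each range, are assumptions made only for illustration:

```python
import struct

POINTS_PER_SCAN = 361        # 180 deg scan at 0.5 deg angular resolution
ANGLE_MIN_DEG = -90.0        # polar angle of the first point, measured from the x2 axis
ANGLE_STEP_DEG = 0.5
HEADER_BYTES = 8             # assumed split of the 10 framing bytes into a start code ...
CHECKSUM_BYTES = 2           # ... and a trailing checksum

def unpack_scan(packet: bytes):
    """Turn one 732-byte packet into a list of (range, polar_angle_deg) pairs."""
    assert len(packet) == HEADER_BYTES + 2 * POINTS_PER_SCAN + CHECKSUM_BYTES   # 732 bytes
    payload = packet[HEADER_BYTES:HEADER_BYTES + 2 * POINTS_PER_SCAN]
    ranges = struct.unpack("<%dH" % POINTS_PER_SCAN, payload)   # little-endian uint16 (assumed)
    return [(rho, ANGLE_MIN_DEG + j * ANGLE_STEP_DEG) for j, rho in enumerate(ranges)]
```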

2. Adaptive Filtering of the Environment Information

The range measurements of the lidar often contain a certain amount of noise. The interference encountered during measurement falls mainly into two categories, active interference and mixed-pixel interference. In addition, occlusion by obstacles creates blind zones in the lidar scan, and there are gaps between scan points; when the robot moves forward or turns at a certain speed while the lidar scans the environment at 25 Hz, further scanning gaps may appear. The lidar range measurements must therefore be filtered. The invention proposes an online, rolling dynamic adaptive filter (DAF). For the measurements (ρi,j, λi,j), where ρi,j is the measured range and λi,j the polar angle, the following data analysis window is built:

$$\begin{matrix}
\rho_{i-1,j-1} & \rho_{i-1,j} & \rho_{i-1,j+1}\\
\rho_{i,j-1} & \rho_{i,j} & \rho_{i,j+1}\\
\rho_{i+1,j-1} & \rho_{i+1,j} & \rho_{i+1,j+1}
\end{matrix}\qquad(1)$$

i is the index of a data packet (each packet contains 361 measurements and the sampling interval is 40 ms); j is the sequence number within a packet. These nine measurements have the greatest temporal and spatial correlation. Within the data window, the difference Δρmin between ρi,j and its neighboring measurements is computed:

$$\Delta\rho_{\min}=\min\{\,|\rho_{i+t,\,j+s}-\rho_{i,j}|\;:\;t,s=-1,0,1,\ (t,s)\neq(0,0)\,\}$$

Δρmin is the minimum difference between temporally and spatially adjacent measurements, referred to as the neighborhood difference for short. If Δρmin > δ(ρ, v), the measurement ρi,j is treated as measurement noise and is not entered into the dynamic environment knowledge base for the height-map computation; δ(ρ, v) is determined by Eq. (2).

$$\delta(\rho,v)=\sigma(\rho)+\frac{1}{25}\left(|v_{goal}|+|v_{robot}|\right)\qquad(2)$$

σ(ρ) is the standard deviation of the lidar range measurement, obtained by statistical analysis of ranging data collected in different environments; vgoal is the velocity of dynamic targets that may be present in the environment, and vrobot is the velocity of the robot itself.
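A minimal sketch of the dynamic adaptive filter defined by Eqs. (1) and (2), assuming that consecutive 361-point scans are stacked as the rows of an array; the value chosen for σ(ρ) and all function and variable names are illustrative:

```python
import numpy as np

SIGMA_RHO = 0.015   # sigma(rho): lidar ranging standard deviation (assumed value, in meters)

def daf_mask(scans, v_goal, v_robot, sigma_rho=SIGMA_RHO):
    """Dynamic adaptive filter over consecutive scans (Eqs. (1) and (2)).

    scans   -- 2-D array of ranges rho[i, j]: one row per 361-point packet i,
               one column per measurement index j within the packet
    returns -- boolean mask, True where the measurement is kept
    """
    rho = np.asarray(scans, dtype=float)
    delta = sigma_rho + (abs(v_goal) + abs(v_robot)) / 25.0     # Eq. (2)
    keep = np.ones_like(rho, dtype=bool)

    # Border points lack a full 3x3 neighborhood and are left untouched here.
    for i in range(1, rho.shape[0] - 1):
        for j in range(1, rho.shape[1] - 1):
            window = rho[i - 1:i + 2, j - 1:j + 2].copy()       # data analysis window, Eq. (1)
            window[1, 1] = np.inf                               # exclude the center measurement
            d_min = np.min(np.abs(window - rho[i, j]))          # neighborhood difference
            if d_min > delta:                                   # Delta-rho_min > delta(rho, v)
                keep[i, j] = False                              # treated as measurement noise
    return keep
```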

3. Coordinate Transformation of the Environment Information

The three-dimensional coordinate transformation of the lidar measurements is carried out in two steps. In the first step, the measurements are mapped into a three-dimensional coordinate system whose reference center is the robot and whose reference plane is the robot's body platform. Let {O2} be the coordinate system at the center of the lidar scanning circle in the sensor pan/tilt; the pan/tilt surface rotates about the y1 axis of the coordinate system {O1} with pitch angle γp. The horizontal rotation of the pan/tilt is equivalent to a rotation about the zr axis of the robot reference coordinate system {Or}, with rotation angle γh. O1 is translated by d0 along zr in {Or}; O2 is translated by d1 along z1 and by d2 along x1 in {O1}, as shown in Fig. 3.

The scanning plane of the lidar is a radial sector on the x2-y2 coordinate plane, centered at O2 and covering −90° to +90°. The measurements are expressed in polar coordinates (ρi,j, λi,j), where ρi,j is the measured range and λi,j is the polar angle (with x2 as the polar axis). The subscript i denotes the time instant measured in main-program cycles (40 ms); at instant i, the lidar data packet contains 361 measurements. The subscript j is the index of a measurement point within a packet. Each measurement (ρi,j, λi,j) is converted into the vector

$$u|_{O_2}=[\rho_{i,j}\cos\lambda_{i,j},\;\rho_{i,j}\sin\lambda_{i,j},\;0,\;1]^T$$

Let the state of the sensor turntable at instant j be {γh, γp}. The vector u|O2 is transformed into the vector u|Or in the robot reference frame {Or}. In the following formulas, Trans() denotes a coordinate translation and Rot() denotes a rotation about a coordinate axis; in the transformation matrices, s denotes the sine function sin() and c the cosine function cos():

$$u|_{O_r}={}^{O_r}_{O_2}T\cdot u|_{O_2}\qquad(3)$$

where:

$${}^{O_r}_{O_2}T=\mathrm{Trans}(0,0,d_0)\,\mathrm{Rot}(z_r,\gamma_h)\,\mathrm{Rot}(y_1,\gamma_p)\,\mathrm{Trans}(d_2,0,d_1)\qquad(4)$$

Expanding the product gives:

$${}^{O_r}_{O_2}T=\begin{bmatrix}
c\gamma_h c\gamma_p & -s\gamma_h & c\gamma_h s\gamma_p & d_2 c\gamma_p c\gamma_h + d_1 s\gamma_p c\gamma_h\\
s\gamma_h c\gamma_p & c\gamma_h & s\gamma_h s\gamma_p & d_2 c\gamma_p s\gamma_h + d_1 s\gamma_p s\gamma_h\\
-s\gamma_p & 0 & c\gamma_p & -d_2 s\gamma_p + d_1 c\gamma_p + d_0\\
0 & 0 & 0 & 1
\end{bmatrix}$$

In the second step, the change of the mobile robot's coordinates on the three-dimensional terrain surface is taken into account and the measurements are mapped into the global coordinate system {O}. The pose of the mobile robot is represented by the state (xi, yi, zi, θi, αi, φi), where θi is the heading angle, αi is the pitch angle and φi is the roll angle of the robot, as shown in Fig. 3.

The heading angle of the robot is output by a fiber-optic gyroscope, and the pitch and roll angles are output by an inclinometer on the robot platform. The coordinates are first translated to (xi, yi, zi), then rotated by θi (heading) about the translated z axis, then by αi (pitch) about the rotated y axis, and finally by φi′ (roll) about the xr axis of the new frame:

$$\varphi_i'=\arcsin\frac{\sin\varphi_i}{\cos\alpha_i}\qquad(5)$$

φi is the angle between the yr axis of the robot coordinate system and the reference horizontal plane, measured directly by the inclinometer; φi′ is the roll rotation about the xr axis after the pitch rotation has been taken into account. After the above coordinate transformations, the position vector in the world coordinate system {O} is obtained:

$$u|_{O}=[x_{O|i,j},\;y_{O|i,j},\;z_{O|i,j},\;1]^T={}^{O}_{O_r}T\cdot u|_{O_r}\qquad(6)$$

where:

$${}^{O}_{O_r}T=\mathrm{Trans}(x_i,y_i,z_i)\,\mathrm{Rot}(z,\theta_i)\,\mathrm{Rot}(y,\alpha_i)\,\mathrm{Rot}(x,\varphi_i')\qquad(7)$$

Expanding the product gives:

$${}^{O}_{O_r}T=\begin{bmatrix}
c\theta_i c\alpha_i & c\theta_i s\alpha_i s\varphi_i' - s\theta_i c\varphi_i' & c\theta_i s\alpha_i c\varphi_i' + s\theta_i s\varphi_i' & x_i\\
s\theta_i c\alpha_i & s\theta_i s\alpha_i s\varphi_i' + c\theta_i c\varphi_i' & s\theta_i s\alpha_i c\varphi_i' - c\theta_i s\varphi_i' & y_i\\
-s\alpha_i & c\alpha_i s\varphi_i' & c\alpha_i c\varphi_i' & z_i\\
0 & 0 & 0 & 1
\end{bmatrix}$$

After the coordinate transformation, a two-dimensional array A[m][n] records the height of the environmental terrain over the plane; the value of A[m][n] represents the height of the terrain surface relative to the reference plane. m and n are the grid coordinates of the projection onto the reference plane. With a grid resolution of 3 cm on the reference plane and a resolution of 2 cm in the height direction:

$$\begin{cases}
m=(\mathrm{int})\,\frac{x_{O|i,j}}{3}\\[4pt]
n=(\mathrm{int})\,\frac{y_{O|i,j}}{3}\\[4pt]
A[m][n]=(\mathrm{int})\,\frac{z_{O|i,j}}{2}
\end{cases}\qquad(8)$$
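A minimal sketch of the two-step transformation of Eqs. (3)–(8), composing the homogeneous transforms and binning the world-frame point into the height grid A[m][n]; coordinates are assumed to be in meters, angles in radians, and the helper names are illustrative:

```python
import numpy as np

def trans(dx, dy, dz):
    """Homogeneous translation Trans(dx, dy, dz)."""
    T = np.eye(4)
    T[:3, 3] = (dx, dy, dz)
    return T

def rot(axis, angle):
    """Homogeneous rotation Rot(axis, angle), angle in radians."""
    c, s = np.cos(angle), np.sin(angle)
    T = np.eye(4)
    if axis == 'x':
        T[1:3, 1:3] = [[c, -s], [s, c]]
    elif axis == 'y':
        T[0, 0], T[0, 2], T[2, 0], T[2, 2] = c, s, -s, c
    else:  # 'z'
        T[0:2, 0:2] = [[c, -s], [s, c]]
    return T

def point_to_height_grid(rho, lam, gamma_h, gamma_p, pose, d0, d1, d2, A):
    """Map one polar measurement (rho, lam) into the height grid A[m][n]."""
    u_O2 = np.array([rho * np.cos(lam), rho * np.sin(lam), 0.0, 1.0])   # vector in {O2}

    # {O2} -> robot frame {Or}: Eqs. (3) and (4)
    T_r_2 = trans(0, 0, d0) @ rot('z', gamma_h) @ rot('y', gamma_p) @ trans(d2, 0, d1)
    u_Or = T_r_2 @ u_O2

    # {Or} -> world frame {O}: Eqs. (6) and (7); phi_c is the corrected roll of Eq. (5)
    x, y, z, theta, alpha, phi_c = pose
    T_O_r = trans(x, y, z) @ rot('z', theta) @ rot('y', alpha) @ rot('x', phi_c)
    u_O = T_O_r @ u_Or

    # Eq. (8): 3 cm grid on the reference plane, 2 cm resolution in height
    m, n = int(u_O[0] / 0.03), int(u_O[1] / 0.03)
    if 0 <= m < A.shape[0] and 0 <= n < A.shape[1]:
        A[m][n] = int(u_O[2] / 0.02)
    return m, n
```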

4. Three-Dimensional Environment Perception

While the mobile robot is moving, the perception platform scans the environment at a fixed pitch angle (for example −45°). The environment in front of the robot is detected by the platform and, after adaptive filtering and coordinate transformation, converted into a height map; the three-dimensional environment map is created by accumulating such height maps. Before the three-dimensional environment can be perceived, the height map obtained from the platform must be filtered for noise and its grid defects interpolated, so as to reduce measurement noise and the surface gaps caused by scanning gaps. The main purpose of three-dimensional environment perception is to accumulate the height maps, analyze the terrain flatness of the resulting height map and segment the environment map into traversable areas and obstacle areas. The three-dimensional environment perception method is as follows:

1) Compute the modulus of the terrain height gradient as in Eq. (9), where g[m][n] denotes the modulus of the terrain height gradient at plane coordinates (m, n).

$$g[m][n]=\max\{\,|A[m][n]-A[m+i][n+j]|\;:\;i,j=-1,0,1\,\}\qquad(9)$$

2) Environment map creation. The environment map is represented by map[m][n]. An initial value of −1 denotes an unknown grid cell, 0 denotes a free, traversable grid cell, and a value of 1 or greater denotes an obstacle grid cell. Eq. (10) is used to create the environment map and to distinguish traversable areas from obstacle areas.

$$map[m][n]=\begin{cases}
map[m][n]+2, & \text{if } g[m][n]>2\\[2pt]
0, & \text{if } g[m][n]\le 2
\end{cases}\qquad(10)$$
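A minimal sketch of the terrain-flatness analysis of Eqs. (9) and (10) applied to the accumulated height grid; with the 2 cm height resolution, the threshold g[m][n] > 2 corresponds to a height jump of roughly more than 4 cm between neighboring cells. Function and variable names are illustrative:

```python
import numpy as np

def create_environment_map(A, env_map):
    """Terrain-flatness analysis of the height grid A, Eqs. (9) and (10).

    A       -- 2-D integer height grid (values in units of the 2 cm height resolution)
    env_map -- 2-D map grid initialised to -1 (unknown); updated in place:
               0 marks a free, traversable cell, values >= 1 mark obstacle cells
    """
    rows, cols = A.shape
    for m in range(1, rows - 1):          # border cells keep their 'unknown' value
        for n in range(1, cols - 1):
            # Eq. (9): modulus of the terrain height gradient over the 3x3 neighborhood
            g = max(abs(int(A[m, n]) - int(A[m + i, n + j]))
                    for i in (-1, 0, 1) for j in (-1, 0, 1))
            # Eq. (10)
            if g > 2:
                env_map[m, n] += 2        # -1 -> 1 on first detection, grows with repeated hits
            else:
                env_map[m, n] = 0         # flat enough: traversable free cell
    return env_map
```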

Claims (1)

1. A three-dimensional environment perception method for a mobile robot, characterized in that: environment information is acquired through a perception platform; dynamic adaptive filtering is applied to reduce noise interference; an environment height map is obtained after coordinate transformation; and a local three-dimensional environment map is obtained by accumulating and mapping the environment height map.

The perception platform consists of three parts, a two-dimensional lidar, a rotating pan/tilt and stepper motors; by controlling the stepper motors, the perception platform is rotated in the pitch and horizontal directions to acquire environment information. The neighborhood difference in the dynamic adaptive filtering method is

$$\Delta\rho_{\min}=\min\{\,|\rho_{i+t,\,j+s}-\rho_{i,j}|\;:\;t,s=-1,0,1,\ (t,s)\neq(0,0)\,\}$$

If Δρmin > δ(ρ, v), the measurement ρi,j is treated as measurement noise and is not entered into the dynamic environment knowledge base for the height-map computation, where δ(ρ, v) is determined by

$$\delta(\rho,v)=\sigma(\rho)+\frac{1}{25}\left(|v_{goal}|+|v_{robot}|\right)$$

in which σ(ρ) is the standard deviation of the lidar range measurement, vgoal is the velocity of dynamic targets in the environment and vrobot is the velocity of the robot itself. The coordinate transformation formula for the environment information is:

[coordinate transformation formula reproduced as an image in the original document]

In the three-dimensional perception method, the formula

$$g[m][n]=\max\{\,|A[m][n]-A[m+i][n+j]|\;:\;i,j=-1,0,1\,\}$$

is applied to compute the modulus of the terrain height gradient, where g[m][n] denotes the modulus of the terrain height gradient at plane coordinates (m, n), and the formula

$$map[m][n]=\begin{cases}map[m][n]+2, & \text{if } g[m][n]>2\\[2pt]0, & \text{if } g[m][n]\le 2\end{cases}$$

is used to distinguish the traversable areas of the environment from the obstacle areas. The initial value of the environment map map[m][n] is −1, denoting an unknown grid cell; 0 denotes a free, traversable grid cell; and a value of 1 or greater denotes an obstacle grid cell.
CN 200710034343 2007-01-29 2007-01-29 Three-dimensional environment perception method for mobile robot Pending CN101008571A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200710034343 CN101008571A (en) 2007-01-29 2007-01-29 Three-dimensional environment perception method for mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200710034343 CN101008571A (en) 2007-01-29 2007-01-29 Three-dimensional environment perception method for mobile robot

Publications (1)

Publication Number Publication Date
CN101008571A true CN101008571A (en) 2007-08-01

Family

ID=38697110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200710034343 Pending CN101008571A (en) 2007-01-29 2007-01-29 Three-dimensional environment perception method for mobile robot

Country Status (1)

Country Link
CN (1) CN101008571A (en)



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication