
CN106940208A - System for robot target calibration and self-state monitoring - Google Patents

System for robot target calibration and self-state monitoring

Info

Publication number
CN106940208A
CN106940208A (application CN201710211381.1A)
Authority
CN
China
Prior art keywords
robot
target
data
module
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710211381.1A
Other languages
Chinese (zh)
Inventor
杨桐
齐洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Donghua University
Original Assignee
Donghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Donghua University filed Critical Donghua University
Priority to CN201710211381.1A
Publication of CN106940208A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01D — MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 21/00 — Measuring or testing not otherwise provided for
    • G01D 21/02 — Measuring two or more variables by means not covered by a single other subclass

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot target calibration and self-state monitoring system, characterized by comprising a target distance measurement and image capture module, a robot state data acquisition module, a data transmission module, and a data processing and display module. By adopting this technical scheme, the invention has the following advantages over the prior art: it integrates distance measurement, image acquisition, data processing and display, and performance monitoring in a single system, and can locate targets and map the surrounding environment quickly and at low cost. The invention is realized on the principle of multi-sensor data fusion. Experimental tests show that the system works stably, extends well, costs little, and offers an intuitive and friendly interactive interface; in a given environment it can perform target-finding tasks accurately and quickly.

Description

A System for Robot Target Calibration and Self-State Monitoring

Technical Field

The invention relates to a robot-based target recognition system built on multi-sensor fusion.

Background

With the development of information technology, mobile robots with autonomous navigation capability are widely used across industries. While walking and exploring, a robot must be able to recognize objects in its surroundings in order to move through an unknown environment, find targets specified by its operator, and avoid damage to its own body and to surrounding equipment.

Many methods have been applied successfully to object recognition, including laser ranging, ultrasound, millimeter-wave radar, multi-source fusion, and computer-vision-based recognition. Ultrasonic target recognition is mostly used in the field of robot obstacle avoidance. Classic ultrasonic ranging is widely applicable and has unique strengths, but in real application environments a design that relies on a single ultrasonic ranging sensor adapts poorly: it can measure only the distance to an obstacle accurately, while other information such as the obstacle's position can mostly be obtained only by fuzzy inference, so neither real-time performance nor accuracy can be guaranteed. Ultrasound also has many limitations in ranging accuracy, blind zones, and measurement range. Vision-based target recognition can collect precise data over a large area, but the richer sampled information raises system cost, and extending its functional modules requires more powerful microelectronics and power-supply technology; as a result, the robot's power consumption rises, its continuous operating time shortens, and its real-time performance degrades.

In addition, embedded robots lack a visual interface for monitoring their working state. Existing remote monitoring systems usually depend on the monitoring platform supplied by the robot's manufacturer, which is neither general-purpose nor cheap, deterring many small companies in the robotics industry. At present, assessing an embedded robot's working state depends entirely on an experienced robotics engineer analyzing the actual working situation; the analysis does not go down to the system level, and nothing is known about the embedded robot's CPU workload or detailed real-time running state. This is detrimental to the long-term, efficient, and safe use of robots.

To overcome the conflict between function and power consumption in traditional mobile robots, so that a robot can work stably in more complex and harsh environments, the common approach today is to separate the robot's locomotion system from its sensor and processor systems, power it from high-capacity batteries or from the grid over long cables, and pair it with a microcomputer to realize the corresponding functions. This approach suits laboratories or open field environments and supports precise distance measurement and complex image processing, but the resulting robot is bulky and expensive and has a low degree of automation and intelligence.

Summary of the Invention

The purpose of the present invention is to provide a robot-based system with target distance measurement, image processing, and self-state monitoring functions, offering strong real-time performance, good interactivity, low cost, wide applicability, high precision, convenient functional extension, and good stability.

To achieve this purpose, the technical scheme of the present invention provides a robot target calibration and self-state monitoring system, characterized by comprising a target distance measurement and image capture module, a robot state data acquisition module, a data transmission module, and a data processing and display module, wherein:

the distance measurement and image capture module collects data through various sensors and stores the collected data on the robot;

the robot state data acquisition module monitors robot information in real time, where robot information comprises the robot's various parameters and the data collected by each sensor in the distance measurement and image capture module;

the data transmission module wirelessly transmits the raw data obtained by the distance measurement and image capture module to a computer;

the data processing and display module receives the data obtained at the computer, processes it into the intuitive target information the user wants, and displays it on an interactive interface together with the robot information monitored by the robot state data acquisition module.

Preferably, the distance measurement and image capture module comprises ultrasonic sensors, infrared sensors, and a visual sensor, wherein:

the ultrasonic sensors measure target distance, and the infrared sensors provide supplementary measurements in the ultrasonic sensors' near-field blind zone; both are distributed evenly along the robot's sides, and differences between their readings are used to estimate the number, distance, and approximate bearing of targets;

the visual sensor, located on the front of the robot, captures environmental information directly ahead.

Preferably, the information displayed by the data processing and display module includes robot speed, acceleration, target distance, battery usage, and CPU load; it also displays the target's shape and bearing obtained by image processing, the target's size obtained by calibrating the visual sensor, and detailed target and nearby-environment information obtained by fusing the data of multiple sensors.

Preferably, the robot's various parameters include speed, position, distance to the target, battery usage, and CPU load.

Preferably, the data transmission module comprises a Wi-Fi communication module, through which the network connection protocol and data transmission are established.

By adopting the above technical scheme, the present invention has the following advantages over the prior art: it integrates distance measurement, image acquisition, data processing and display, and performance monitoring in a single system, and can locate targets and map the surrounding environment quickly and at low cost. The invention is realized on the principle of multi-sensor data fusion; experimental tests show that the system works stably, extends well, costs little, and offers an intuitive and friendly interactive interface, and in a given environment it can perform target-finding tasks accurately and quickly.

Brief Description of the Drawings

Figure 1 is a structural block diagram of the invention;

Figure 2 is a schematic diagram of the structure of the distance measurement and image capture module;

Figure 3 is a schematic diagram of the interaction between the robot, the local Windows machine, and the Linux virtual machine;

Figure 4 is a schematic diagram of how the robot's ultrasonic and infrared sensors jointly yield the distances between surrounding unknown objects and the robot;

Figure 5 is the robot motion model.

Detailed Description

The present invention is further described below in conjunction with specific embodiments. It should be understood that these embodiments only illustrate the invention and do not limit its scope. Furthermore, after reading what the invention teaches, those skilled in the art can make various changes or modifications to it, and such equivalent forms likewise fall within the scope defined by the claims appended to this application.

Embodiments of the present invention relate to a robot target-finding and state-monitoring system comprising a target measurement module, a robot state data acquisition module, a data transmission module, and a data processing and display module. The target measurement and robot state data acquisition modules acquire raw data such as target distance and shape and record robot parameters, from which the robot's running state is obtained and its motion state is derived. The data transmission module sends each sensor's data to the computer over Wi-Fi or Bluetooth using the TCP/IP or SSH protocol. The data processing and display module infers the robot's trajectory from the speed and gyroscope readings obtained in real time, processes the captured raw images with mathematical tools such as MATLAB to recover properties such as the shape and color of objects photographed while the robot moves, estimates object sizes with the help of the visual sensor's calibration, and aggregates all of this information into the sought target and an approximate picture of the environment traversed during the search.

Figure 1 is a structural block diagram of the multi-sensor-fusion robot target recognition system of the present invention. As Figure 1 shows, the system comprises the target distance measurement and image capture module, the robot state data acquisition module, the data transmission module, and the data processing and display module.

As shown in Figure 1, several embedded robots move in formation, cooperatively probing unknown objects from different directions and storing the resulting information. The zabbix monitoring software is installed on the robot, on the local Windows machine, and in a Linux virtual machine; it monitors the robot's own motion information and sends the sensor data to the Windows machine, where the data is processed locally before the results are passed to the Linux virtual machine and displayed on the front end.

As shown in Figure 2, the distance measurement and image capture module mainly uses the ultrasonic sensors to measure the distance between the robot and unknown objects; because the ultrasonic sensors have a blind zone, the infrared sensors measure the robot's immediate vicinity as compensation, while the visual sensor intermittently photographs the scene directly ahead of the moving robot. The distances, images, and other sensor data are stored in the robot's memory.
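The blind-zone compensation and multi-sensor target estimation described above can be sketched in a few lines of Python. This is an illustrative sketch only, not code from the patent: the 0.25 m blind-zone threshold, the 0.5 m jump threshold, and the function names are all assumptions.

```python
ULTRASONIC_BLIND_ZONE_M = 0.25  # assumed near-field blind-zone limit

def fuse_range(ultrasonic_m, infrared_m):
    """Prefer the ultrasonic reading unless it is missing or falls
    inside the sensor's near-field blind zone, in which case the
    infrared reading serves as the compensating measurement."""
    if ultrasonic_m is None or ultrasonic_m < ULTRASONIC_BLIND_ZONE_M:
        return infrared_m
    return ultrasonic_m

def estimate_targets(readings, jump_threshold_m=0.5):
    """Group consecutive side-sensor readings into targets: a jump
    larger than the threshold between adjacent sensors is taken as
    the boundary of a new object.  Returns (sensor count, nearest
    range) per estimated target."""
    targets = []
    current = [readings[0]]
    for prev, cur in zip(readings, readings[1:]):
        if abs(cur - prev) > jump_threshold_m:
            targets.append(current)
            current = []
        current.append(cur)
    targets.append(current)
    return [(len(group), min(group)) for group in targets]
```

A run over four side sensors reading `[1.0, 1.05, 2.0, 2.1]` metres would split at the 0.95 m jump and report two targets, illustrating how differing readings yield a count and per-target distance.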

As shown in Figure 3, zabbix is installed on the robot, on the local Windows machine, and in the Linux virtual machine. Over the local router's Wi-Fi, SecureCRT logs in to the robot's operating system via SSH, and code running on the robot completes the transfer of the data held in the robot's memory. Monitoring items are configured on the robot, on the local Windows machine, and in the Linux virtual machine; the information gathered by the robot's sensors is stored in the Linux virtual machine's database and displayed on the front end.

As shown in Figure 4, the ultrasonic and infrared sensors together yield the distances between the robot and the surrounding unknown objects. The robot uses a CMOS image sensor, whose high sensitivity, low noise, wide spectral response, fast image output, and good dynamic performance allow it to capture relatively clear images even while the robot moves irregularly. Image processing is divided into four stages: grayscale conversion and binarization, filtering, edge detection, and color matching. First the image is converted to grayscale, and a single-scale Retinex algorithm removes the shadows that natural light casts from the background and obstacles, reducing the contrast between shadowed and unshadowed ground and enhancing the image. A scan-line seed-fill algorithm then fills regions, and Canny edge detection extracts the edges, i.e. the object contours, which are recorded. Finally, color-feature matching is applied at each detected object's location to test whether it matches the target color.
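A deliberately simplified stand-in for this four-stage pipeline can be written with NumPy alone. It is illustrative only, not the patent's implementation: the luma weights, the binarization threshold, the neighbour-difference edge detector (standing in for Canny), and the Euclidean colour tolerance are all assumptions, and the Retinex and seed-fill stages are omitted.

```python
import numpy as np

def to_gray(rgb):
    """Luma-weighted grayscale conversion, (H, W, 3) -> (H, W)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(gray, threshold=128):
    """Threshold the grayscale image into a 0/1 map."""
    return (gray >= threshold).astype(np.uint8)

def edge_map(binary):
    """Crude edge detector: mark pixels where the binary image
    changes value horizontally or vertically (a stand-in for the
    Canny stage in the pipeline above)."""
    dx = np.abs(np.diff(binary.astype(np.int16), axis=1))
    dy = np.abs(np.diff(binary.astype(np.int16), axis=0))
    edges = np.zeros_like(binary)
    edges[:, 1:] |= dx.astype(np.uint8)
    edges[1:, :] |= dy.astype(np.uint8)
    return edges

def color_match(rgb_pixel, target_rgb, tolerance=40.0):
    """Euclidean colour-distance test for the final matching step."""
    diff = np.asarray(rgb_pixel, float) - np.asarray(target_rgb, float)
    return float(np.linalg.norm(diff)) <= tolerance
```

On a black image with a white central block, the pipeline recovers the block's outline as the edge map; a real implementation would substitute OpenCV's Canny and a Retinex pre-pass.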

The robot motion model is shown in Figure 5. Only planar motion is considered; motion on inclined surfaces, i.e. movement of the unmanned vehicle along the Z axis, is ignored. The vehicle's pose q therefore consists of its position and heading in the two-dimensional plane:

q = [x y θ]^T

where x and y are the robot's position in the two-dimensional coordinate system and θ is the angle between the robot's heading and the positive x axis, taking the robot's initial heading as the positive x direction.

If the robot's linear speed is v, then the vehicle's linear velocity components along the x and y axes of the two-dimensional coordinate system are:

v_x = v cos θ,  v_y = v sin θ

The robot's gyroscope measures three values X, Y, and Z, representing the robot's angular accelerations in the YZ, XZ, and XY planes respectively; integrating the angular acceleration yields the angular velocity ω, and integrating again yields the angle θ. Finally, the data obtained above are used to derive the robot's trajectory, which is combined with the object information from image processing to mark each object's position relative to the robot.
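The dead-reckoning step — integrating a yaw rate into a heading and the heading plus speed into a trajectory — can be sketched as a simple Euler integration in Python. The function name and the fixed time step are assumptions for illustration, not part of the patent.

```python
import math

def dead_reckon(v, omega, dt, pose=(0.0, 0.0, 0.0)):
    """Integrate per-sample forward speed v (m/s) and yaw rate
    omega (rad/s) into a planar pose (x, y, theta), following the
    motion model above: x' = v cos(theta), y' = v sin(theta),
    theta' = omega."""
    x, y, theta = pose
    for vi, wi in zip(v, omega):
        x += vi * math.cos(theta) * dt
        y += vi * math.sin(theta) * dt
        theta += wi * dt
    return x, y, theta
```

Driving straight at 1 m/s for ten 0.1 s steps advances the pose one metre along x; holding the speed at zero while applying a constant yaw rate rotates the heading in place.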

The data acquisition and processing in this example are realized through corresponding program code. First, motion-control code is written so that the robot moves smoothly at constant speed while avoiding obstacles and minimizing their influence on its motion. Then zabbix is installed on the robot's embedded system, the local Windows system, and the Linux virtual machine; monitoring items are configured, monitoring and file-transfer scripts are written, and the data are displayed on the front end. Next, MATLAB scripts on the local system process the raw images and feed the results into the monitored items. Finally, the information from all sensors is fused to yield the target object.

Claims (5)

1. A robot target calibration and self-state monitoring system, characterized by comprising a target distance measurement and image capture module, a robot state data acquisition module, a data transmission module, and a data processing and display module, wherein: the distance measurement and image capture module collects data through various sensors and stores the collected data on the robot; the robot state data acquisition module monitors robot information in real time, the robot information comprising the robot's various parameters and the data collected by each sensor in the distance measurement and image capture module; the data transmission module wirelessly transmits the raw data obtained by the distance measurement and image capture module to a computer; and the data processing and display module receives the data obtained at the computer, processes it into the intuitive target information the user wants, and displays it on an interactive interface together with the robot information monitored by the robot state data acquisition module.

2. The system of claim 1, wherein the distance measurement and image capture module comprises ultrasonic sensors, infrared sensors, and a visual sensor, wherein: the ultrasonic sensors measure target distance, and the infrared sensors provide supplementary measurements in the ultrasonic sensors' near-field blind zone; both are distributed evenly along the robot's sides, and differences between their readings are used to estimate the number, distance, and approximate bearing of targets; and the visual sensor, located on the front of the robot, captures environmental information directly ahead.

3. The system of claim 2, wherein the information displayed by the data processing and display module includes robot speed, acceleration, target distance, battery usage, and CPU load, and further includes the target's shape and bearing obtained by image processing, the target's size obtained by calibrating the visual sensor, and detailed target and nearby-environment information obtained by fusing the data of multiple sensors.

4. The system of claim 1, wherein the robot's various parameters include speed, position, distance to the target, battery usage, and CPU load.

5. The system of claim 1, wherein the data transmission module comprises a Wi-Fi communication module, through which the network connection protocol and data transmission are established.
CN201710211381.1A 2017-03-31 2017-03-31 System for robot target calibration and self-state monitoring Pending CN106940208A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710211381.1A CN106940208A (en) 2017-03-31 2017-03-31 System for robot target calibration and self-state monitoring


Publications (1)

Publication Number Publication Date
CN106940208A (en) 2017-07-11

Family

ID=59463628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710211381.1A Pending System for robot target calibration and self-state monitoring

Country Status (1)

Country Link
CN (1) CN106940208A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7386163B2 (en) * 2002-03-15 2008-06-10 Sony Corporation Obstacle recognition apparatus and method, obstacle recognition program, and mobile robot apparatus
CN101561683A (en) * 2009-04-01 2009-10-21 东南大学 Motion control device of robot for detecting environmental pollution
US20110184558A1 (en) * 2008-08-27 2011-07-28 Kuka Laboratories Gmbh Robot And Method For Controlling A Robot
CN203012510U (en) * 2013-01-07 2013-06-19 西北农林科技大学 Mountainous region agricultural robot obstacle-avoiding system based on multi-sensor information fusion
CN104723350A (en) * 2015-03-16 2015-06-24 珠海格力电器股份有限公司 Intelligent control method and system for industrial robot safety protection


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李梁: "Design and Implementation of a Video Monitoring System for Substation Inspection Robots", 《工程科技辑》 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109167902A (en) * 2018-10-31 2019-01-08 中国矿业大学(北京) A kind of video camera with the angle detection function
CN109828580A (en) * 2019-02-27 2019-05-31 华南理工大学 A kind of Mobile Robot Formation's tracking and controlling method based on separate type ultrasonic wave
CN109828580B (en) * 2019-02-27 2022-05-24 华南理工大学 Mobile robot formation tracking control method based on separated ultrasonic waves

Similar Documents

Publication Publication Date Title
CN109358340B (en) A method and system for constructing AGV indoor map based on lidar
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
CN206709853U (en) Drawing system is synchronously positioned and builds in a kind of multi-rotor unmanned aerial vehicle room
CN108181636A (en) Petrochemical factory's crusing robot environmental modeling and map structuring device and method
CN116352722A (en) Multi-sensor fusion mine inspection and rescue robot and its control method
CN205898143U (en) Robot navigation system based on machine vision and laser sensor fuse
CN207373179U (en) A kind of robot control system for being used for SLAM and navigation
WO2019001237A1 (en) Mobile electronic device, and method in mobile electronic device
CN111487964A (en) A robot car and its autonomous obstacle avoidance method and equipment
CN115307646B (en) Multi-sensor fusion robot positioning method, system and device
WO2018228258A1 (en) Mobile electronic device and method therein
CN114290313A (en) Inspection robot, automatic navigation inspection robot system and control method
Jian et al. Lvcp: Lidar-vision tightly coupled collaborative real-time relative positioning
Chen et al. Aerial robots on the way to underground: An experimental evaluation of VINS-mono on visual-inertial odometry camera
Huang et al. Research progress and application of multi-sensor data fusion technology in agvs
CN106940208A System for robot target calibration and self-state monitoring
Han et al. Grasping control method of manipulator based on binocular vision combining target detection and trajectory planning
Thepsit et al. Localization for Outdoor Mobile Robot Using LiDAR and RTK-GNSS/INS.
Gao et al. A patrol mobile robot for power transformer substations based on ROS
Zhang et al. A low-cost environment-interactive patrol inspection system with navigation based on sensor-fusion and robotic arm contact pose feedback
CN107942748B (en) Mechanical arm space dynamic obstacle avoidance induction bracelet and control system
CN108646760A (en) Based on the target following of monocular vision mobile robot and platform control system and method
Shaw et al. Development of an AI-enabled AGV with robot manipulator
CN118654665A (en) A positioning system based on laser radar sensor and its use method
CN110421563A (en) A kind of industrial robot builds figure positioning system and robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170711)