
CN106005086A - Leg-wheel composite robot based on Xtion equipment and gesture control method thereof - Google Patents


Info

Publication number: CN106005086A
Application number: CN201610389736.1A
Authority: CN (China)
Legal status: Pending
Prior art keywords: robot, leg, steering gear, module, xtion
Other languages: Chinese (zh)
Inventors: 丁希仑, 齐静, 徐坤, 彭赛金, 杨帆, 尹业成, 郑羿
Assignee (current and original): Beihang University
Application filed by Beihang University; priority to CN201610389736.1A; publication of CN106005086A


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B62 — LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D — MOTOR VEHICLES; TRAILERS
    • B62D 57/00 — Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D 57/02 — Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members
    • B62D 57/032 — Vehicles with ground-engaging propulsion means, e.g. walking members, with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures


Abstract

The invention discloses a leg-wheel composite robot based on an Xtion device and a gesture control method thereof. The robot comprises a robot body and six robot single-leg structures; the interior of the robot body is divided into a sensing layer, a driving layer, a control layer and a power layer, which house the control system. Gesture control is realized by the Xtion PRO LIVE camera mounted on the sensing layer together with a gesture recognition module and a motion control module installed on the control layer; the gesture recognition module is further divided into a dynamic gesture control sub-module and a static gesture control sub-module, which implement dynamic and static gesture control respectively. The invention provides more information for subsequent image processing, enables map building and human-robot interaction, and makes interaction between the user and the robot more natural: the user can control the hexapod wheel-leg composite robot by gestures, which favours the practical application of such robots. Moreover, the gesture recognition and control part is designed on ROS, is highly portable, and can be reused in other robot control systems.

Description

A Leg-Wheel Composite Robot Based on an Xtion Device and a Gesture Control Method Thereof

Technical Field

The invention belongs to the field of robotics, and in particular relates to a leg-wheel composite robot based on an Xtion device, and a gesture control method and apparatus thereof.

Background

Robots have broad application prospects in both social life and industrial production. Industrial applications include planetary exploration and post-disaster rescue; social applications include assisting the elderly and the disabled. As China enters an aging society, the elderly need care, while the working-age population must work to make ends meet and has little time to provide it. Robots can take over part of this labour, making up for the shortage of caregivers, and therefore have wide application prospects in assisting the elderly, children, the sick, the disabled and pregnant women.

Legged robots can adapt to complex terrain but walk slowly, whereas wheeled robots move quickly but can only travel on relatively flat ground and cross obstacles poorly. Combining legged and wheeled locomotion offers new possibilities for robot mobility, and how to design a mechanism that combines wheels and legs has become a problem to be solved.

Leg-wheel robots usually require a trained operator to input specific control commands; ordinary users without specialist knowledge cannot operate them directly. This mode of interaction restricts the wide application of such robots to a certain extent.

Summary of the Invention

In view of the above problems, the present invention provides a leg-wheel composite robot based on an Xtion device, which enables better interaction between the robot and humans.

The leg-wheel composite robot based on an Xtion device of the present invention comprises a robot body and six robot single-leg structures mounted uniformly around the circumference of the robot body. Each single-leg structure is a wheel-leg mechanism capable of both walking and wheeled locomotion; it has three leg segments and four driving servos. Let the three leg segments be the first, second and third leg segments, and the four servos be the first, second, third and fourth servos. One end of the first leg segment is fixed to the output shaft of the first servo, forming the hip joint; the axis of the first servo's output shaft is perpendicular to the horizontal plane, so the first servo drives the first leg segment to swing laterally. The second servo is fixed at the other end of the first leg segment, and its output shaft is fixed to one end of the second leg segment, forming the knee joint; the axis of the second servo's output shaft is perpendicular to that of the first servo, so the second servo drives the second leg segment to swing longitudinally. The other end of the second leg segment is fixed to the output shaft of the third servo, forming the ankle joint; the axis of the third servo's output shaft is parallel to that of the second servo, so the third servo drives the third leg segment to swing longitudinally. The fourth servo is located in the middle of the third leg segment, with its output-shaft axis parallel to that of the third servo; a wheel is coaxially fixed on the fourth servo's output shaft by a spline, so the fourth servo drives the wheel to rotate. A foot-ground detection mechanism is mounted at the far end of the third leg segment; it provides the contact between the single-leg structure and the ground and at the same time detects the foot-ground contact state.
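The hip-knee-ankle arrangement above (one vertical hip axis, then two parallel horizontal axes) admits a simple closed-form forward kinematics. The sketch below computes the foot position from the three joint angles; the link lengths and frame conventions are illustrative assumptions, not values from the patent:

```python
import math

def leg_forward_kinematics(q_hip, q_knee, q_ankle, l1=0.05, l2=0.10, l3=0.12):
    """Foot position (x, y, z) in the hip frame for the 3-DOF leg.

    q_hip rotates about the vertical axis (lateral swing); q_knee and
    q_ankle rotate about parallel horizontal axes (longitudinal swing).
    Link lengths l1..l3 are illustrative placeholders, not patent values.
    """
    # Radial reach and height within the leg's vertical plane.
    r = l1 + l2 * math.cos(q_knee) + l3 * math.cos(q_knee + q_ankle)
    z = -(l2 * math.sin(q_knee) + l3 * math.sin(q_knee + q_ankle))
    # The hip yaw then rotates that vertical plane about the vertical axis.
    return (r * math.cos(q_hip), r * math.sin(q_hip), z)
```

With all joints at zero the leg is fully stretched in the plane (x = l1 + l2 + l3); bending knee and ankle lowers the foot, which is how the stance height is adjusted.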

The interior of the robot body is divided into four layers by aluminium-alloy plates with weight-reducing holes; from top to bottom these are the sensing layer, the driving layer, the control layer and the power layer, which house the control system. An Xtion PRO LIVE camera and an IMU are mounted on the sensing layer: the camera collects environmental information for simultaneous localization and mapping, and acquires information about people for human-robot interaction; the IMU provides the robot's own attitude and assists localization and map building. Six servo driver boards are mounted on the driving layer, one for each single-leg structure, controlling the motion of its servos. A main control board on the control layer performs communication management, sensor data acquisition, data processing and drive management. A gesture recognition module and a motion control module are also provided on the control layer to realize gesture control of the leg-wheel composite robot. A battery box holding the robot's supply batteries is mounted on the power layer.

The robot body communicates wirelessly with a remote control terminal, which controls it. The remote control terminal has a security authentication module, an audio/video playback module, a control module and an information display module. Each module is divided into a front end, the human-machine interface, and a back end responsible for network communication and data processing. The security authentication module obtains the user information entered by the user; the back end packs it into a data packet and sends it to the robot's main control board, which returns the login result to the security authentication module for display. The audio/video playback module receives, through the back end, the sound returned by the Xtion PRO LIVE microphones and the images captured by the camera, and displays them after the back end decodes them. The information display module receives the six groups of robot sensing information returned by the IMU, such as attitude, joint angles and joint torques, and displays the hexapod robot's sensing information. The control module consists of motion controls that command the robot's movement and can also set the robot's locomotion mode.
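The login exchange between the security authentication module and the main control board could be framed as in the sketch below. The patent does not specify the packet format, so the length-prefixed JSON layout and the field names are purely hypothetical:

```python
import json
import struct

def pack_login(username, password):
    """Frame a login request as [4-byte big-endian length][JSON payload].

    The wire format and field names ("user", "pass") are illustrative
    stand-ins; the patent only says credentials are packed and sent.
    """
    payload = json.dumps({"user": username, "pass": password}).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def unpack_login(packet):
    """Inverse of pack_login: recover the credential dictionary."""
    (length,) = struct.unpack(">I", packet[:4])
    return json.loads(packet[4:4 + length].decode("utf-8"))
```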

The gesture recognition method for the above leg-wheel composite robot based on an Xtion device installs a gesture recognition module and a motion control module on the control layer. The gesture recognition module comprises a dynamic gesture recognition sub-module and a static gesture recognition sub-module, wherein:

The dynamic gesture recognition sub-module realizes dynamic gesture recognition and contains a steady detector, a circle detector, a push detector and a movement detector. The steady detector identifies whether the currently active hand is at rest, the circle detector identifies circular motion of the active hand, the push detector identifies a forward push of the active hand, and the movement detector identifies a striking (swipe) motion of the active hand.

The dynamic gesture recognition sub-module is set up as follows:

A. Initialize the environment with the Xtion SDK of the Xtion PRO LIVE camera;

B. Create an image generator in the established context, and set the generator's resolution and frame rate;

C. Create and register the scene manager, and initialize it;

D. Create and register the steady detector, circle detector, push detector and movement detector, add each detector's listener to the scene manager, and set the callback function of each detector.

With the dynamic gesture recognition sub-module designed in this way, gesture recognition proceeds as follows: the Xtion PRO LIVE camera first captures RGB image information; the captured images are then processed with the OpenNI software to extract the motion trajectory of the hand, and the steady, circle, push and movement detectors recognize the hand-at-rest, circle-drawing, push and striking gestures respectively. Each detector then outputs its recognition result in its callback function and sends it to the motion control module; according to the result and the mapping between dynamic gestures and robot motions predefined in the sub-module, the motion control module commands the robot to perform the corresponding motion.
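The division of labour among the detectors can be illustrated with a toy trajectory classifier. The thresholds, the camera-frame convention (z shrinking as the hand approaches), and the gesture-to-command table are illustrative stand-ins for the OpenNI-style detectors and the patent's predefined mapping:

```python
import statistics

# Hypothetical mapping from recognized gesture to a robot command.
GESTURE_TO_COMMAND = {"steady": "halt", "push": "stop", "move": "turn", "none": "idle"}

def classify_dynamic_gesture(traj, steady_var=1e-4, push_dist=0.20, move_dist=0.25):
    """Classify a hand trajectory [(x, y, z), ...] in metres (camera frame).

    A hand is 'steady' when every coordinate barely varies, a 'push' when
    it travels toward the camera (z decreasing), and a 'move' (strike)
    when it travels laterally.  All thresholds are illustrative.
    """
    # Per-axis population variance; all small => the hand is at rest.
    if max(statistics.pvariance([p[i] for p in traj]) for i in range(3)) < steady_var:
        return "steady"
    if traj[0][2] - traj[-1][2] > push_dist:       # net motion toward the camera
        return "push"
    if abs(traj[-1][0] - traj[0][0]) > move_dist:  # net lateral motion
        return "move"
    return "none"
```

In the real system each detector fires its own callback; here the same decisions are folded into one function purely for readability.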

The static gesture recognition sub-module recognizes static gestures from colour images. During recognition, the Xtion PRO LIVE camera first captures RGB image information, which is delivered as messages on a topic of the Robot Operating System (ROS). The face is then detected using the skin-colour model and HOG features pre-stored in the sub-module; from the structural features of the human body, the body region is detected and the hand region is segmented out of it. Finally, image features of the segmented hand region are extracted for gesture training and recognition, and the recognition result is sent to the main control board, which, according to the result and the pre-stored mapping between static gestures and robot motions, commands the robot to perform the corresponding motion.
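As a sketch of the first stage of this pipeline, the snippet below marks skin-coloured pixels with the widely used Cr/Cb box rule (Cr in [133, 173], Cb in [77, 127]); the patent does not disclose its actual skin-colour model, so this particular rule is an assumption:

```python
def rgb_to_crcb(r, g, b):
    """JPEG-style RGB -> (Cr, Cb) chrominance conversion."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return 0.713 * (r - y) + 128, 0.564 * (b - y) + 128

def skin_mask(image):
    """Boolean mask of skin-coloured pixels for a row-major RGB image.

    Uses the classic CrCb box rule as a stand-in for the sub-module's
    pre-stored skin-colour model, which the patent does not specify.
    """
    mask = []
    for row in image:
        mask.append([])
        for (r, g, b) in row:
            cr, cb = rgb_to_crcb(r, g, b)
            mask[-1].append(133 <= cr <= 173 and 77 <= cb <= 127)
    return mask
```

The chrominance box is deliberately independent of luminance, which is what makes such rules tolerant of lighting changes, one reason skin models are a common first filter before HOG-based detection.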

The advantages of the present invention are:

1. In the leg-wheel composite robot based on an Xtion device, the Xtion PRO LIVE camera acquires not only RGB and depth information but also the user's skeleton-point information, providing more data for subsequent image processing and enabling map building and human-robot interaction;

2. The robot and its gesture control method are highly portable. The method is based on ROS: the dynamic gesture recognition is designed and implemented on ROS and the OpenNI framework, and the static gesture recognition is based on ROS, so both can be reused in other robot control systems;

3. The gesture control method offers both static and dynamic gesture control, and the user can choose whichever mode suits the task;

4. The method works on images of a person's upper body and segments the gesture region based on the structure of the human body, which makes it highly general: this segmentation can extract both bare and gloved hands;

5. Controlling the robot by gestures makes human-robot interaction more natural, in line with the development trend of human-computer interaction;

6. Applying static gesture recognition and OpenNI-based dynamic gesture recognition to the hexapod wheel-leg composite robot lets users control the robot by gestures, which favours the practical application of hexapod wheel-leg composite robots.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the overall structure of the leg-wheel composite robot based on an Xtion device according to the present invention;

Fig. 2 is a schematic diagram of the single-leg structure (structure one) of the robot;

Fig. 3 is a schematic diagram of the mounting between a servo and a leg segment in the single-leg structure;

Fig. 4 is a schematic diagram of a single-leg structure when the robot stands;

Fig. 5 is a schematic diagram of a single-leg structure during wheeled locomotion;

Fig. 6 is a schematic diagram of the main body of the robot;

Fig. 7 is a schematic diagram of the layered structure inside the shell of the robot body;

Fig. 8 is a schematic diagram of the positions of the charging interface and the power switch;

Fig. 9 is a flow chart of the set-up method of the dynamic gesture recognition sub-module;

Fig. 10 illustrates the gesture recognition method of the dynamic gesture recognition sub-module;

Fig. 11 is a flow chart of the gesture recognition method of the static gesture recognition sub-module.

In the figures:

1 - robot body; 2 - single-leg structure; 3 - servo horn;
4 - protruding shaft; 5 - bearing; 6 - wheel;
7 - actuator connector; 8 - end effector; 9 - Xtion PRO LIVE camera;
10 - IMU; 11 - servo driver board; 12 - main control board;
13 - battery box; 14 - charging interface; 15 - power switch;
101 - shell; 102 - peripheral cover; 201 - first leg segment;
202 - second leg segment; 203 - third leg segment; 204 - first servo;
205 - second servo; 206 - third servo; 207 - fourth servo;
208 - foot-ground detection mechanism; 901 - base; 902 - inverted U-shaped bracket.

Detailed Description

The present invention is described in detail below with reference to the accompanying drawings.

As shown in Fig. 1, the leg-wheel composite robot based on an Xtion device of the present invention comprises a robot body 1 and six robot single-leg structures 2, mounted uniformly around the circumference of the robot body 1.

In general, for a walking robot to move its foot tip freely in space, each single-leg structure 2 must have at least three degrees of freedom. The more degrees of freedom a single-leg structure 2 has, the more agile the leg, but its design and control difficulty and its mass grow accordingly. In the present invention each of the six single-leg structures 2 uses three servos to provide three degrees of freedom, realizing the basic walking function.

Each single-leg structure of the present invention is a wheel-leg mechanism capable of both walking and wheeled locomotion; as shown in Fig. 2, it has three leg segments and four driving servos. Let the three leg segments be the first leg segment 201, the second leg segment 202 and the third leg segment 203, and the four servos be the first servo 204, the second servo 205, the third servo 206 and the fourth servo 207. One end of the first leg segment 201 is fixed on the output shaft of the first servo 204, forming the hip joint; the axis of the output shaft of the first servo 204 is perpendicular to the horizontal plane, so the first servo 204 drives the first leg segment 201 to swing laterally. The second servo 205 is fixed at the other end of the first leg segment 201, and its output shaft is fixed to one end of the second leg segment 202, forming the knee joint; the axis of the output shaft of the second servo 205 is perpendicular to that of the first servo 204, so the second servo 205 drives the second leg segment 202 to swing longitudinally. The other end of the second leg segment 202 is fixed to the output shaft of the third servo 206, forming the ankle joint; the axis of the output shaft of the third servo 206 is parallel to that of the second servo 205, so the third servo 206 drives the third leg segment 203 to swing longitudinally.

The servos at the hip, knee and ankle joints connect to the first leg segment 201, the second leg segment 202 and the third leg segment 203 in the same way, as shown in Fig. 3: a servo horn 3 is fixed on the servo's output shaft with screws and seats in a groove machined in one connection lug at the end of the leg segment, while a protruding shaft 4, coaxially fixed to the servo's output shaft, is coupled through a bearing 5 to the other connection lug at the same end. Through this connection the servo transmits power to the leg segment with fewer bearings, achieving the goal of weight reduction.

The third leg segment 203 is bent towards the robot body 1 at an included angle of 140 degrees, and the fourth servo 207 is located at the bend. The axis of the output shaft of the fourth servo 207 is parallel to that of the third servo 206; a wheel 6 is coaxially fixed on the output shaft of the fourth servo 207 by a spline, so the fourth servo 207 drives the wheel 6 to rotate.

A foot-ground detection mechanism 208 is mounted at the other end of the third leg segment 203; it provides the contact between the single-leg structure 2 and the ground and at the same time detects the foot-ground contact state. The mechanism 208 may adopt the foot-tip ground-contact detection mechanism for legged robots disclosed in the invention patent with publication number CN104816766A, which in essence mounts a contact switch at the foot tip.

With the single-leg structure 2 described above, mounting on the robot body is completed simply by fixing the first servo 204 to the robot body 1. When the robot walks, the support configurations of the six single-leg structures 2 vary with the joint poses. Following bionic principles, the single-leg support configuration of a walking cockroach is imitated, as shown in Fig. 4: the first servo 204, the second servo 205 and the third servo 206 drive the hip, knee and ankle joints respectively, adjusting the poses of the three joints so that the single-leg structure 2 swings back and forth or moves longitudinally, advancing in different gaits. When the robot moves on wheels, the knee and ankle joints of at least three of the single-leg structures 2 are driven to change their configuration so that their wheels 6 touch the ground, as shown in Fig. 5, realizing wheeled locomotion.
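The gait scheduling and mode switching described above can be sketched as follows; the leg numbering, the alternating-tripod schedule and the fold angles for wheel mode are illustrative placeholders, not the patent's parameters:

```python
def tripod_gait_phase(t, period=1.0):
    """Which legs are in swing at time t under an alternating tripod gait.

    Legs 0, 2, 4 form tripod A and legs 1, 3, 5 tripod B (numbering is
    illustrative); each tripod swings for half of the gait period while
    the other tripod supports the body.
    """
    phase = (t % period) / period
    return (0, 2, 4) if phase < 0.5 else (1, 3, 5)

def wheel_mode_pose(n_legs=6):
    """Illustrative knee/ankle angles (degrees) that fold each leg so the
    wheel on the third segment contacts the ground.  The values are
    placeholders, not the patent's actual joint angles."""
    return [{"knee": 90.0, "ankle": -90.0} for _ in range(n_legs)]
```

Because at least three legs must carry the body at all times, the tripod schedule keeps one full tripod on the ground while the other swings, which is the standard statically stable hexapod gait the cockroach-inspired configuration supports.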

As shown in Fig. 6, the robot body 1 consists of a shell 101 and a peripheral cover 102, both 3D-printed in plastic, which is light, adequately hard and quick to machine, making the shaping of the robot body 1 easy to control. The peripheral cover 102 is fitted around the outside of the shell 101, with mounting openings for the single-leg structures 2 reserved around the circumference; it protects the shell 101 and the control system inside it. The shell 101 is approximately cylindrical, and its interior is divided into four layers by aluminium-alloy plates with weight-reducing holes; as shown in Fig. 7, from top to bottom these are the sensing layer, the driving layer, the control layer and the power layer, which house the control system.

The sensing layer carries an Xtion PRO LIVE camera 9 and an IMU (inertial measurement unit) 10. The Xtion PRO LIVE camera 9 sits outside the shell 101; its base 901 is inside the sensing layer, fixed to an inverted-U bracket 902 that is mounted on the upper surface of the sensing layer. The IMU 10 is fixed on the sensing layer inside the inverted-U bracket 902, giving a compact layout that reduces the robot's volume. The Xtion PRO LIVE camera 9 serves two purposes: collecting environment information for Simultaneous Localization and Mapping (SLAM), and capturing information about people for human-robot interaction. It contains two cameras, an RGB camera and a depth camera; the depth camera's effective range is 0.8 m to 3.5 m. The microphones on either side of the camera 9 form a microphone array that captures sound from the environment and can support natural interaction modes such as voice control. The IMU 10 provides the robot's own attitude, assisting localization and environment mapping. In the present invention the IMU 10 is an MTi-300 AHRS, which outputs three-axis acceleration, three-axis angular velocity, and three-axis orientation angles; internally it uses the Xsens Estimation Engine (XEE) sensor-fusion algorithm, which overcomes limitations of Kalman filtering to some extent and approaches the performance of an optical gyroscope.
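The XEE fusion algorithm inside the MTi-300 is proprietary, but the relationship between raw inertial measurements and attitude can be illustrated with the standard accelerometer tilt formulas. This is a generic sketch, not the sensor's actual algorithm, and it is only valid while the body is not accelerating:

```python
import math

def roll_pitch_from_accel(ax: float, ay: float, az: float):
    """Static tilt (roll, pitch) in degrees from a 3-axis accelerometer
    reading in any consistent unit; valid only when the body is at rest,
    so gravity is the only measured acceleration."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# A level body reads gravity entirely on the z axis: both angles near zero.
roll, pitch = roll_pitch_from_accel(0.0, 0.0, 9.81)
```

A fusion filter such as XEE combines this gravity-referenced tilt with the integrated gyroscope rates to stay accurate during motion; the formula above only shows the accelerometer half of that picture.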

The driving layer carries six servo driver boards 11, one per single-leg structure 2, each controlling the servos of its leg. The six boards 11 are arranged side by side on the upper surface of the driving layer, enabling parallel computation with good real-time performance.

The control layer carries the main control board 12, the robot's control center, which handles communication management, sensor data acquisition, data processing, and drive management. The main control board 12 is a MIO-2263 series embedded single-board computer, an industrial-grade board with an embedded Celeron J1900 quad-core processor. Its x86 architecture is well compatible with a wide range of hardware and software, its performance is sufficient for computation-heavy workloads, and it is programmed using the ROS framework.

The power layer carries a battery box 13 holding one 12 V lithium battery and one 7.4 V lithium battery, so the robot needs no external power supply and operates autonomously, widening its range of applications.

The control layer also provides a charging interface 14 and a power switch 15, as shown in FIG. 9. A charging lead connects the charging interface 14 at one end and the two lithium batteries in the power layer at the other, so an external supply can charge the batteries through the interface 14. The power switch 15 controls power-up and power-down of the servos in each single-leg structure 2 and of the main control board 12.

In a WLAN environment, the present invention communicates wirelessly with a remote control terminal (a mobile phone or a tablet) and is controlled from that terminal. The control terminal runs Android and provides a control application comprising a security authentication module, an audio/video playback module, an operation module, and an information display module, as shown in FIG. 8. Each module is split into a front end and a back end: the front end is the human-machine interface, while the back end handles network communication and data processing. The security authentication module collects the user's login information, which the back end packs into a data packet and sends to the robot's main control board 12; the board returns the login result to the module for display. The audio/video playback module receives, via the back end, the sound captured by the Xtion PRO LIVE camera's microphones and the images captured by its cameras, decodes them, and displays them. The information display module receives and displays the hexapod's sensing information, such as attitude returned by the IMU, joint angles, and joint torques. The operation module consists of several controls that command the hexapod to move forward or backward, turn left or right, pause, or stop, and that set the travel mode, legged or wheeled.

On the tablet client, the larger screen accommodates more functional modules; compared with the phone client, the present invention adds a three-dimensional simulation module on the tablet. Based on the robot motion and surrounding-environment signals returned by the Xtion PRO LIVE camera and the IMU module, the 3D simulation module displays the robot's motion and the environment terrain synchronously. While the robot runs, the module drives the 3D model to move in step with the real robot (with low latency) according to the hexapod's joint-angle information.

For the leg-wheel composite robot described above, the present invention further proposes a gesture control method: a gesture recognition module and a motion control module are added in the control layer. The gesture recognition module is built on the hydro release of the Robot Operating System (ROS) installed on Linux Ubuntu 12.04. The gesture recognition module and the motion control module are abstracted as separate nodes of the robot operating system, so communication between them is inter-node communication: the modules exchange data through a ROS Topic, and the content exchanged is the Topic's messages.
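The node-and-Topic pattern just described can be sketched with a tiny in-process stand-in. `TopicBus`, the topic name `/gesture`, and the message string below are hypothetical illustrations; a real ROS hydro node would use rospy or roscpp publishers and subscribers instead:

```python
from collections import defaultdict
from typing import Callable, DefaultDict, List

class TopicBus:
    """In-process stand-in for ROS Topic publish/subscribe (illustrative only)."""
    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, cb: Callable[[str], None]) -> None:
        self._subs[topic].append(cb)

    def publish(self, topic: str, msg: str) -> None:
        # Deliver the message to every subscriber's callback.
        for cb in self._subs[topic]:
            cb(msg)

bus = TopicBus()
received: List[str] = []

# Motion-control "node": subscribes to the gesture topic with a callback.
bus.subscribe("/gesture", received.append)

# Gesture-recognition "node": publishes its recognition result as the message.
bus.publish("/gesture", "circle_clockwise")
```

The design point is decoupling: the gesture node never calls the motion node directly, so either side can be restarted or replaced as long as the Topic and message format stay fixed.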

The gesture recognition module comprises a dynamic gesture recognition sub-module and a static gesture recognition sub-module, and the user selects the appropriate method as needed. Dynamic gesture recognition is built on OpenNI (Open Natural Interaction). OpenNI offers developers a multi-language, cross-platform framework and APIs for natural interaction; the OpenNI API (Application Programming Interface) is a set of interfaces for writing natural-interaction applications. OpenNI's main purpose is to define a standard API that bridges vision and audio sensors with vision and audio perception middleware. NITE is middleware on top of OpenNI, used here for gesture recognition. NITE works through gesture detectors, which appear as classes in a NITE program and each recognize a specific gesture. The dynamic gesture recognition sub-module is implemented on NITE and uses four detectors: a steady detector (SteadyDetector), a circle detector (CircleDetector), a push detector (PushDetector), and a swipe detector (SwipeDetector). The steady detector recognizes whether the currently active hand is at rest; the circle detector recognizes a circling motion of the active hand; the push detector recognizes a forward push of the active hand (the hand moving forward, perpendicular to the body); and the swipe detector recognizes a striking motion of the active hand (the hand moving in a plane parallel to the front of the body).

As shown in FIG. 9, the dynamic gesture recognition sub-module is designed as follows:

A. Initialize the environment using an XML document (e.g., Sample-tracking.xml) from the Xtion SDK (the motion-sensing development kit) of the Xtion PRO LIVE camera 9;

B. Create the context, create an image generator (ImageGenerator), and set the generator's resolution, frame rate, and other parameters;

C. Create and register the session manager (SessionManager) and initialize it;

D. Create and register the steady detector, circle detector, push detector, and swipe detector, add each detector's listener to the session manager, and set the callback functions of the four detectors.

When the dynamic gesture recognition module designed in this way performs recognition, as shown in FIG. 10, the Xtion PRO LIVE camera 9 first captures RGB images of the dynamic gesture. The captured images are then processed with OpenNI to extract the hand's motion trajectory, and the steady, circle, push, and swipe detectors recognize the hand holding still, circling, pushing, and swiping. Each detector outputs its recognition result in its callback function and sends it to the motion control module as a message in the robot operating system. According to the recognition result and the mapping between dynamic gestures and robot motions predefined in the dynamic gesture recognition module, the motion control module makes the robot perform the corresponding movement. The dynamic recognition sub-module has a wide operating range and can be used even in poor indoor lighting.
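Steps C and D and the callback flow above can be mirrored structurally in a short sketch: detectors are listeners registered with a session manager, and each fires a user-supplied callback when its gesture is recognized. The class names echo NITE's detectors, but the thresholds and detection logic here are simplified placeholders, not NITE's implementation, and only two of the four detectors are shown:

```python
from typing import Callable, List, Tuple

Point = Tuple[float, float, float]  # hand position (x, y, z) in millimetres

class Detector:
    """Base listener: fires registered callbacks, as in step D."""
    def __init__(self) -> None:
        self._callbacks: List[Callable[[str], None]] = []

    def register(self, cb: Callable[[str], None]) -> None:
        self._callbacks.append(cb)

    def _fire(self, event: str) -> None:
        for cb in self._callbacks:
            cb(event)

    def update(self, trail: List[Point]) -> None:
        raise NotImplementedError

class SteadyDetector(Detector):
    """Fires 'steady' when the hand barely moves over the last 5 samples."""
    def update(self, trail: List[Point]) -> None:
        if len(trail) < 5:
            return
        xs, ys, zs = zip(*trail[-5:])
        if all(max(c) - min(c) < 5.0 for c in (xs, ys, zs)):
            self._fire("steady")

class PushDetector(Detector):
    """Fires 'push' when depth (z) drops sharply: the hand moves toward the camera."""
    def update(self, trail: List[Point]) -> None:
        if len(trail) >= 2 and trail[-2][2] - trail[-1][2] > 100.0:
            self._fire("push")

class SessionManager:
    """Holds the listeners and feeds each one the current hand trajectory."""
    def __init__(self) -> None:
        self._listeners: List[Detector] = []

    def add_listener(self, det: Detector) -> None:
        self._listeners.append(det)

    def update(self, trail: List[Point]) -> None:
        for det in self._listeners:
            det.update(trail)

events: List[str] = []
manager = SessionManager()
steady, push = SteadyDetector(), PushDetector()
steady.register(events.append)  # the "callback function" of step D
push.register(events.append)
manager.add_listener(steady)
manager.add_listener(push)

manager.update([(0.0, 0.0, 1000.0)] * 5)                 # hand at rest
manager.update([(0.0, 0.0, 1000.0), (0.0, 0.0, 850.0)])  # fast push forward
```

Running the two updates leaves `events` holding `"steady"` then `"push"`, mirroring how each NITE detector reports through its own callback rather than through a shared return value.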

The static gesture recognition sub-module works on color images and recognizes static gestures. When it performs recognition, as shown in FIG. 11, the Xtion PRO LIVE camera 9 first captures an RGB image of the person, and the image is sent to the sub-module as a message on a Topic in the robot operating system. The sub-module then detects the face using its pre-stored skin-color model and HOG features, detects the body from the structural features of the human figure, and segments the hand region out of the body region. Finally it extracts image features from the segmented hand region, performs trained gesture recognition, and sends the result to the motion control module, which uses the pre-stored mapping between static gestures and robot motions to make the robot perform the corresponding movement. The static gesture recognition sub-module can be used indoors or outdoors under good lighting.
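The patent does not specify its pre-stored skin-color model. As one common stand-in, the classic uniform-daylight RGB skin rule (Kovac et al.) classifies pixels with simple channel thresholds; a real system would combine such a mask with the HOG-based face and body detection described above:

```python
from typing import Iterable, List, Tuple

def is_skin_rgb(r: int, g: int, b: int) -> bool:
    """Classic RGB skin heuristic for uniform daylight (Kovac et al.);
    an illustrative stand-in, not the patent's actual model."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(pixels: Iterable[Tuple[int, int, int]]) -> List[bool]:
    """Per-pixel skin mask; face and hand regions appear as connected True areas."""
    return [is_skin_rgb(*p) for p in pixels]
```

For example, a typical skin tone such as (220, 170, 140) passes the rule, while a saturated green (0, 255, 0) fails it; the resulting mask is what the pipeline would segment into face, body, and hand regions.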

Embodiment 1: Using dynamic gestures to control robot motion

The user waves at the Xtion PRO LIVE camera 9. Once the robot recognizes the wave, it marches in place. When the user's hand moves vertically upward, the robot walks straight forward; when the hand moves vertically downward, the robot walks straight backward; when the hand draws a clockwise circle, the robot turns one full clockwise circle and then marches in place; when the hand pushes forward, the robot stops moving.
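Embodiment 1's gesture-to-motion mapping can be written as a small lookup table. The gesture and motion identifiers below are illustrative names, not message types defined by the patent:

```python
# Each key is a recognized dynamic gesture; each value is the commanded motion.
DYNAMIC_GESTURE_TO_MOTION = {
    "wave": "march_in_place",
    "hand_up": "walk_forward",
    "hand_down": "walk_backward",
    "circle_clockwise": "turn_clockwise_then_march",
    "push": "stop",
}

def motion_for(gesture: str) -> str:
    # An unrecognized gesture changes nothing: the robot keeps its current motion.
    return DYNAMIC_GESTURE_TO_MOTION.get(gesture, "no_change")
```

Keeping the mapping in one table, as the motion control module does with its predefined mapping, means new gestures can be bound to motions without touching the recognition code.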

Embodiment 2: Using static gestures to control robot motion

The static gesture recognition module predefines the mapping between gesture types and robot actions; for example, the user's single-hand digit gestures 1-9 are predefined as shown in Table 1:

Table 1. Predefined static gestures

By default, the hexapod walks with a legged "3+3" gait. When the user gestures the digit 1, the hexapod moves forward; digit 3, it turns left; digit 4, it turns right; digit 5, it stops; digit 6, it switches from legged to wheeled locomotion; digit 2, it moves backward on its wheels; digit 7, it switches from wheeled back to legged locomotion; digit 4, it turns right while walking on its legs; digit 5, it stops; digit 8, it lifts one leg; digit 5, it stops; digit 9, it lifts two legs; digit 5, it stops.
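Because digits 6 and 7 switch the locomotion mode, Embodiment 2's digit-to-action mapping is stateful: the controller must remember whether the robot is currently in legged or wheeled mode. A minimal sketch, with illustrative identifiers:

```python
class HexapodController:
    """Maps Embodiment 2's digit gestures to actions while tracking the
    legged/wheeled mode that digits 6 and 7 switch."""
    def __init__(self) -> None:
        self.mode = "legged"  # default: legged "3+3" gait

    def on_digit(self, d: int) -> str:
        # Digit 2's backward motion is described for wheeled locomotion;
        # the other motions execute in whatever mode is current.
        actions = {1: "forward", 2: "backward", 3: "turn_left",
                   4: "turn_right", 5: "stop", 8: "lift_one_leg",
                   9: "lift_two_legs"}
        if d == 6:
            self.mode = "wheeled"
            return "switch_to_wheeled"
        if d == 7:
            self.mode = "legged"
            return "switch_to_legged"
        return actions.get(d, "ignore")
```

A controller like this makes the gesture sequence in the text reproducible: gesturing 6 then 2 yields a mode switch followed by wheeled backward motion, and any digit outside 1-9 is ignored.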

Claims (8)

1. A leg-wheel composite robot based on an Xtion device, comprising a robot body and six robot single-leg structures mounted evenly around the circumference of the robot body; each single-leg structure is a wheel-leg structure capable of both walking and wheeled travel and has three leg segments and four driving servos; the three leg segments are a first leg segment, a second leg segment, and a third leg segment, and the four driving servos are a first servo, a second servo, a third servo, and a fourth servo; one end of the first leg segment is fixedly mounted on the output shaft of the first servo, forming a hip joint; the axis of the first servo's output shaft is perpendicular to the horizontal plane, and the first servo drives the first leg segment to swing laterally; the second servo is fixedly mounted at the other end of the first leg segment, and its output shaft is fixed to one end of the second leg segment, forming a knee joint; the axis of the second servo's output shaft is perpendicular to that of the first servo, and the second servo drives the second leg segment to swing longitudinally; the other end of the second leg segment is fixed to the output shaft of the third servo, forming an ankle joint; the axis of the third servo's output shaft is parallel to that of the second servo, and the third servo drives the third leg segment to swing longitudinally; the fourth servo is located in the middle of the third leg segment, with its output-shaft axis parallel to that of the third servo; a wheel is coaxially fixed to the fourth servo's output shaft by a spline, and the fourth servo drives the wheel to rotate; the other end of the third leg segment carries a foot-ground detection mechanism that provides the contact between the single-leg structure 2 and the ground and detects the foot-ground contact state;

characterized in that: the interior of the robot body is divided into four layers by aluminum-alloy plates with weight-reducing holes, from top to bottom a sensing layer, a driving layer, a control layer, and a power layer, which house the control system; the sensing layer carries an Xtion PRO LIVE camera and an IMU; the Xtion PRO LIVE camera collects environment information for simultaneous localization and mapping and captures information about people for human-robot interaction; the IMU provides the robot's own attitude, assisting localization and environment mapping; the driving layer carries six servo driver boards, each controlling the servos of one of the six single-leg structures; the control layer carries a main control board that handles communication management, sensor data acquisition, data processing, drive management, and other functions; the power layer carries a battery box in which the robot's supply batteries are installed;

the robot body communicates wirelessly with a remote control terminal and is controlled by the remote control terminal; the remote control terminal has a security authentication module, an audio/video playback module, an operation module, and an information display module; each module is split into a front end and a back end, the front end being the human-machine interface and the back end being responsible for network communication and data processing; the security authentication module collects the user information entered by the user, which the back end packs into a data packet and sends to the robot's main control board, and the main control board returns the login result to the module for display; the audio/video playback module receives, via the back end, the sound returned by the Xtion PRO LIVE camera and the images captured by the camera, decodes them, and displays them; the information display module receives and displays the hexapod's sensing information, such as attitude, joint angles, and joint torques; the operation module consists of motion controls that command the robot's movements and also set the robot's travel mode.

2. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the servos at the hip, knee, and ankle joints are connected to the first, second, and third leg segments in the same way; a servo horn is fixed to the servo's output shaft by screws and seats in a groove formed in the connection seat on one side of the leg-segment end; an extension shaft is coaxially fixed to the servo's output shaft, and the end of the extension shaft is connected through a bearing to the connection seat on the other side of the leg-segment end.

3. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: a peripheral cover is fitted outside the robot body, with mounting openings for the single-leg structures reserved around its circumference; the peripheral cover protects the robot body.

4. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the Xtion PRO LIVE camera is located outside the robot body, with its base inside the sensing layer, fixedly mounted on an inverted-U bracket; the inverted-U bracket is fixed on the upper surface of the sensing layer; the IMU is located inside the inverted-U bracket.

5. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the six servo driver boards are arranged side by side on the upper surface of the driving layer, enabling parallel computation with good real-time performance.

6. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the control layer further provides a charging interface and a power switch; the two ends of a charging lead are connected to the charging interface and to the two lithium batteries in the power layer respectively, so that an external supply charges the lithium batteries through the charging interface; the power switch controls power-up and power-down of the servos in each single-leg structure and of the main control board.

7. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the remote control terminal further has a three-dimensional simulation module; based on the robot motion and surrounding-environment signals returned by the Xtion PRO LIVE camera and the IMU module, the three-dimensional simulation module displays the robot's motion and the environment terrain synchronously; while the robot runs, the module drives the three-dimensional model to move synchronously with the real robot according to the hexapod's joint-angle information.

8. A gesture control method for the leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: a gesture recognition module and a motion control module are installed on the control layer; the gesture recognition module comprises a dynamic gesture recognition sub-module and a static gesture recognition sub-module, wherein:

the dynamic gesture recognition sub-module performs dynamic gesture recognition and has a steady detector, a circle detector, a push detector, and a swipe detector; the steady detector recognizes whether the currently active hand is at rest, the circle detector recognizes a circling motion of the active hand, the push detector recognizes a forward push of the active hand, and the swipe detector recognizes a striking motion of the active hand;

the dynamic gesture recognition sub-module is designed as follows:

A. initialize the environment using the Xtion SDK of the Xtion PRO LIVE camera;

B. create the context, create an image generator, and set the image generator's resolution and frame rate;

C. create, register, and initialize the session manager;

D. create and register the steady detector, circle detector, push detector, and swipe detector, add each detector's listener to the session manager, and set the callback functions of the four detectors;

when the dynamic gesture recognition module designed in this way performs recognition, the Xtion PRO LIVE camera first captures RGB images; the captured images are processed with OpenNI to extract the gesture's motion trajectory, and the steady, circle, push, and swipe detectors recognize the hand holding still, circling, pushing, and swiping; each detector then outputs its recognition result in its callback function and sends it to the motion control module; according to the recognition result and the mapping between dynamic gestures and robot motions predefined in the dynamic gesture recognition module, the motion control module makes the robot perform the corresponding movement;

the static gesture recognition sub-module works on color images and recognizes static gestures; when it performs recognition, the Xtion PRO LIVE camera first captures RGB images, which are delivered as messages on a Topic in the robot operating system; the sub-module then detects the face using its pre-stored skin-color model and HOG features, detects the body from the structural features of the human figure, and segments the hand region out of the body region; finally it extracts image features from the segmented hand region, performs trained gesture recognition, and sends the result to the main control board; according to the recognition result and the pre-stored mapping between static gestures and robot motions, the robot is made to perform the corresponding movement.
CN201610389736.1A 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof Pending CN106005086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610389736.1A CN106005086A (en) 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610389736.1A CN106005086A (en) 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof

Publications (1)

Publication Number Publication Date
CN106005086A true CN106005086A (en) 2016-10-12

Family

ID=57090510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610389736.1A Pending CN106005086A (en) 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof

Country Status (1)

Country Link
CN (1) CN106005086A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4527650A (en) * 1983-03-18 1985-07-09 Odetics, Inc. Walking machine
CN101125564A * 2007-09-28 2008-02-20 北京航空航天大学 Six-wheel/leg hemispherical shell exploration robot
CN101948011A * 2010-09-09 2011-01-19 北京航空航天大学 Hexapod universal walking multifunctional lunar exploration robot
CN102063111A (en) * 2010-12-14 2011-05-18 广东雅达电子股份有限公司 Mobile terminal-based remote robot control system
CN104443105A * 2014-10-29 2015-03-25 西南大学 Low-energy-loss hexapod robot

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JAN H. COCATRE-ZILGIEN: "Walking machine with backhoe legs", 《HTTP://COCATREZ.NET/EARTH/AUTONOMOUSWALKINGROBOTS/SIXHOEPROJECT/SIXHOEPATAPP.HTML》 *
PRIMESENSE INC.: "《Prime Sensor™ NITE 1.3 Controls Programmer's Guide》", 31 December 2010 *
ASUSTEK COMPUTER INC.: "Kinect Chapter 7. NITE Gestures", 《HTTP://FIVEDOTS.COE.PSU.AC.TH/~AD/NUI163/GESTURES.PDF》 *
SHI ZHE ET AL.: "Controlling robots with gestures and motion sensing", 《Baidu Wenku》 *
YANG QIXI: "Insights from research on motion-sensing robot technology", 《Baidu Wenku》 *
CHEN JINGDE ET AL.: "A Kinect-based robot control system", 《Electronic Design Engineering》 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106363625B (en) * 2016-10-13 2019-03-05 杭州宇树科技有限公司 A quadruped robot teleoperation method based on operator foot pose sensors
CN106363625A (en) * 2016-10-13 2017-02-01 杭州宇树科技有限公司 Quadruped robot teleoperation method based on operator foot pose sensors
CN107085422A (en) * 2017-01-04 2017-08-22 北京航空航天大学 A remote control system for a multifunctional hexapod robot based on Xtion equipment
CN107688779A (en) * 2017-08-18 2018-02-13 北京航空航天大学 A robot gesture interaction method and apparatus based on RGBD camera depth images
CN107765855A (en) * 2017-10-25 2018-03-06 电子科技大学 A method and system for controlling robot motion based on gesture recognition
CN107933733B (en) * 2018-01-03 2023-09-01 河南科技大学 A tortoise-like flipping robot with neck swing and shin coupling
CN107933733A (en) * 2018-01-03 2018-04-20 河南科技大学 A tortoise-like flipping robot with neck swing and shin coupling
CN110254554A (en) * 2019-06-24 2019-09-20 重庆大学 A care robot for the elderly
CN110254554B (en) * 2019-06-24 2024-07-23 重庆大学 A care robot for the elderly
CN111142523A (en) * 2019-12-26 2020-05-12 西北工业大学 Wheel-leg type mobile robot motion control system
CN111142523B (en) * 2019-12-26 2022-03-15 西北工业大学 A motion control system for a wheel-legged mobile robot
CN111071366A (en) * 2020-01-06 2020-04-28 广东省智行机器人科技有限公司 Bionic parallel photographing robot and photographing control method thereof
CN112224304A (en) * 2020-10-28 2021-01-15 北京理工大学 Wheel-step composite mobile platform and gesture and voice control method thereof
CN115439876A (en) * 2022-07-19 2022-12-06 河北大学 Gesture Interaction Method for Mobile Manipulation Robot

Similar Documents

Publication Publication Date Title
CN106005086A (en) Leg-wheel composite robot based on Xtion equipment and gesture control method thereof
US11389686B2 (en) Robotically assisted ankle rehabilitation systems, apparatuses, and methods thereof
CN102699914B (en) Robot
CN102348068B (en) A following remote vision system based on head gesture control
CN105966488A (en) A six wheel-legged mobile manipulation robot test platform
JP3714268B2 (en) Robot device
CN105563493A (en) Height and direction adaptive service robot and adaptive method
US9159152B1 (en) Mapping between a capture volume and a virtual world in a motion capture simulation environment
CN205075054U (en) A robot for high-risk operations
CN103862457A (en) Service robot with visual system
CN107127760A (en) A foot-track composite humanoid robot
CN106393049A (en) A robot for high-risk operations
CN106217393B (en) A remote-presence mobile interaction platform
CN107203192A (en) A mobile robot automatic driving control system based on an electronic map
CN106125909A (en) A motion capture system for training
Park et al. Control hardware integration of a biped humanoid robot with an android head
CN204725497U (en) A passive robot system
CN108297109A (en) An intelligent robot system
CN205521430U (en) A height-adaptive service robot
CN206484561U (en) An intelligent home companion and care robot
CN203899136U (en) Exterior structure of embedded vision-based competitive entertainment-type humanoid robot
Spada et al. Locomotion and telepresence in virtual and real worlds
CN217687537U (en) Autonomous mobile human body temperature measuring equipment
CN116991153A (en) Motion control method of mobile robot and mobile robot
Zhang et al. An interactive control system for mobile robot based on cloud services

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20161012

RJ01 Rejection of invention patent application after publication