
CN206105862U - Electrical control system for realizing grasping behavior of a humanoid robot, and humanoid robot - Google Patents


Info

Publication number
CN206105862U
CN206105862U CN201620359286.7U
Authority
CN
China
Prior art keywords
action
instruction
grasp
robot
action command
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201620359286.7U
Other languages
Chinese (zh)
Inventor
俞志晨
贾梓筠
董增增
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Guangnian Wuxian Technology Co Ltd
Original Assignee
Beijing Guangnian Wuxian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Guangnian Wuxian Technology Co Ltd filed Critical Beijing Guangnian Wuxian Technology Co Ltd
Priority to CN201620359286.7U priority Critical patent/CN206105862U/en
Application granted granted Critical
Publication of CN206105862U publication Critical patent/CN206105862U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Manipulator (AREA)

Abstract

The utility model discloses an electrical control system for realizing the grasping behavior of a humanoid robot, comprising a host, a slave, and an action actuating mechanism. The host is configured to acquire and parse multimodal input data to generate an interactive instruction; when the interactive instruction is a grasp instruction and/or a place instruction, it generates a corresponding action command according to the grasp or place instruction and outputs it to the slave. The slave is configured to parse the action command and the operating state fed back by the action actuating mechanism, and to generate and send drive instructions to the action actuating mechanism until the mechanism's operating state matches the action command. The action actuating mechanism is configured to perform the corresponding action according to the drive instructions, and to acquire its operating state within a set time and feed it back to the slave. Compared with the prior art, the disclosed system not only achieves grasp-and-place behavior with a high success rate and high accuracy, but is also simple in structure and low in hardware cost.

Description

Electrical control system for realizing grasping behavior of an anthropomorphic robot, and anthropomorphic robot
Technical field
The utility model relates to the field of robotics, and in particular to an electrical control system for realizing the grasping behavior of an anthropomorphic robot, and to an anthropomorphic robot.
Background technology
With the continuous development of computer technology and the continuous progress of artificial intelligence, small humanoid robots are being applied more and more widely in domestic environments. Household robots, especially anthropomorphic robots, are developing rapidly, and their degree of human likeness keeps improving.
In the prior art, household robots cannot grasp a specified article and place it at a specified location. This not only limits the range of application of such robots and reduces the human likeness of anthropomorphic robots, but also greatly degrades the user experience.
Therefore, in order to improve the user experience of anthropomorphic robots and expand their range of application, an electrical control system for realizing the grasp-and-place behavior of an anthropomorphic robot is urgently needed.
Utility model content
In order to improve the user experience of robots and expand their range of application, the utility model provides an electrical control system for realizing the grasping behavior of an anthropomorphic robot. The system comprises a host, a slave, and an action actuating mechanism, wherein:
The host is configured to: acquire and parse multimodal input data to generate an interactive instruction; when the interactive instruction is a grasp instruction and/or a place instruction, generate a corresponding action command according to the grasp or place instruction and output it to the slave.
The slave is configured to: parse the action command and the operating state fed back by the action actuating mechanism, and generate and send drive instructions to the action actuating mechanism until the operating state of the action actuating mechanism matches the action command.
The action actuating mechanism is configured to: perform the corresponding action according to the drive instructions, and acquire its operating state within a set time and feed it back to the slave.
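The host/slave/actuator split above can be sketched as a small simulation. All class names, the message format, and the one-servo actuator model are illustrative assumptions, not structures taken from the patent:

```python
# Minimal sketch of the three-tier control split: host interprets an
# instruction, slave drives the actuator until state matches the command.
from dataclasses import dataclass

@dataclass
class ActionCommand:
    kind: str          # "grasp" or "place"
    target_angle: int  # desired servo angle (assumed single-servo actuator)

class Actuator:
    """Action actuating mechanism: executes drive steps, reports its state."""
    def __init__(self):
        self.angle = 0
    def execute(self, step: int) -> int:
        self.angle += step          # perform the driven motion
        return self.angle           # operating state fed back to the slave

class Slave:
    """Parses the action command and drives the actuator until states match."""
    def drive(self, cmd: ActionCommand, actuator: Actuator) -> int:
        while actuator.angle != cmd.target_angle:
            step = 1 if actuator.angle < cmd.target_angle else -1
            actuator.execute(step)  # send drive instruction, read feedback
        return actuator.angle

class Host:
    """Turns a (here, textual) interactive instruction into an action command."""
    def interpret(self, instruction: str) -> ActionCommand:
        kind = "grasp" if "grasp" in instruction else "place"
        return ActionCommand(kind=kind, target_angle=30)

host, slave, actuator = Host(), Slave(), Actuator()
cmd = host.interpret("grasp the cup")
final_angle = slave.drive(cmd, actuator)
print(cmd.kind, final_angle)  # grasp 30
```

The key property mirrored here is that the slave, not the host, closes the loop: it keeps issuing drive instructions until the fed-back state matches the action command.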
In one embodiment, generating a corresponding action command according to the grasp instruction or place instruction includes:
collecting external image information and identifying the grasp object or placement location by analyzing the external image information;
generating the action command according to the analysis result for the grasp object or the placement location.
In one embodiment, the host is configured to generate the action command according to the analysis result for the grasp object or the placement location, wherein:
path information is planned according to the analysis result for the grasp object or the placement location;
the robot action is determined, and the action command matching that action is generated, according to the current path information.
In one embodiment, the host is configured to plan path information according to the position information, the path information including a displacement path and a grasp/place path, wherein:
the displacement path is the motion track of the robot from its current position to a first position point near the grasp object or the placement location;
the host is configured to generate and output action commands matching the displacement path while the robot is on the displacement path;
the grasp/place path is the motion track of the robot's gripper component onto the grasp object or the placement location while the robot is at the first position point;
the host is configured to generate and output action commands matching the grasp/place path when the robot is at the first position point.
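The two-stage path split above (body displacement to a first position point, then gripper motion onto the target) can be sketched geometrically. The straight-line planner and the `approach` stand-off distance are illustrative assumptions; the patent does not specify the planning algorithm:

```python
# Sketch of the displacement path / grasp-place path split: the body stops
# at a "first position point" short of the target, then only the gripper moves.

def first_position_point(robot, target, approach=2.0):
    """Point on the robot->target line, `approach` units short of the target."""
    dx, dy = target[0] - robot[0], target[1] - robot[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= approach:                  # already near enough: no displacement
        return robot
    s = (dist - approach) / dist
    return (robot[0] + s * dx, robot[1] + s * dy)

def plan(robot, target):
    stop = first_position_point(robot, target)
    displacement_path = [robot, stop]     # whole-body motion track
    grasp_path = [stop, target]           # gripper-only motion track
    return displacement_path, grasp_path

disp, grasp = plan((0.0, 0.0), (10.0, 0.0))
print(disp[1], grasp[1])  # (8.0, 0.0) (10.0, 0.0)
```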
In one embodiment, the host is further configured to:
record previously generated action commands in order to train a preset path planning model.
In one embodiment, the host is further configured to:
collect the changing external image information while the robot executes the action commands, and identify the grasp object or placement location from that external image information;
update the action commands according to the new analysis result for the grasp object or the placement location.
In one embodiment, the host is further configured to:
output voice information feeding back information related to the current action command.
In one embodiment, the action actuating mechanism includes a plurality of self-feedback servos, the plurality of self-feedback servos including leg servos and hand servos.
An embodiment of the utility model also provides an anthropomorphic robot, including the electrical control system for realizing anthropomorphic robot grasping behavior provided by an embodiment of the utility model.
With the system of the utility model, an anthropomorphic robot can grasp a specified article and place it at a specified location. Compared with the prior art, the system not only achieves grasp-and-place behavior with a high success rate and high accuracy, but is also simple in structure and low in hardware cost.
Further features or advantages of the utility model will be set forth in the following description. Some of the features or advantages will be apparent from the description, or may be learned by practicing the utility model. The objects and certain advantages of the utility model can be realized or obtained through the structures particularly pointed out in the description, the claims, and the accompanying drawings.
Description of the drawings
The accompanying drawings are provided for further understanding of the utility model and constitute a part of the description. Together with the embodiments of the utility model, they serve to explain the utility model and do not limit it. In the drawings:
Fig. 1 is a schematic diagram of the system structure according to an embodiment of the utility model;
Fig. 2 is a schematic diagram of part of the system structure according to an embodiment of the utility model;
Fig. 3 is a diagram of the system hardware interfaces according to an embodiment of the utility model.
Specific embodiment
Embodiments of the utility model are described in detail below with reference to the drawings and examples, so that practitioners can fully understand how the utility model applies technical means to solve technical problems and achieves its technical effects, and can implement the utility model accordingly. It should be noted that, as long as no conflict arises, the embodiments of the utility model and the features in the embodiments may be combined with one another, and the resulting technical solutions all fall within the scope of protection of the utility model.
In order to improve the user experience of robots and expand their range of application, the utility model proposes an electrical control system for realizing the grasping behavior of an anthropomorphic robot. To realize grasping behavior, human grasping behavior is first briefly analyzed.
Grasping behavior can be simply divided into two parts: a grasp process (getting hold of a specified object) and a place process (putting the grasped object at a specified location). The process by which a human performs grasping behavior can be summarized as: planning the process details according to the specific behavioral goal (the grasp target or the placement target); then controlling the body to perform the specific behavioral details so as to finally accomplish the goal.
Based on this analysis of the logical process, in an embodiment of the utility model the basic structure of the electrical control system includes a host, a slave, and an action actuating mechanism. The host plans the process details of the grasp or place process; the slave and the action actuating mechanism carry out the process details planned by the host.
As shown in Fig. 1, the host 100 is configured to: acquire and parse multimodal input data to generate an interactive instruction; when the interactive instruction is a grasp instruction and/or a place instruction, generate a corresponding action command according to the grasp or place instruction and output it to the slave. The slave 110 is configured to: parse the action command and the operating state fed back by the action actuating mechanism, and generate and send drive instructions to the action actuating mechanism until its operating state matches the action command. The action actuating mechanism is configured to: perform the corresponding action according to the drive instructions, acquire its operating state within a set time, and feed it back to the slave 110.
Specifically, in this embodiment the robot is an anthropomorphic robot, and the action actuating mechanism includes leg servos 121 (for walking and moving the robot as a whole), hand servos 122 (for moving the arms and precisely reaching the pickup/placement location), and an electromagnet 123 (located at the fingers of the arm, for attracting an article to realize the grasping action).
It should be noted here that the structure of the anthropomorphic robot of the system of the utility model is not limited to the structures shown in Fig. 1 and the other specific embodiments described herein (the anthropomorphic robot may imitate the human form in only some of its structures). The number of servos and the sensor types of the system can be chosen according to the actual functional requirements of the robot. For example, in one embodiment of the utility model the robot is not fully humanoid and its legs are a wheeled locomotion structure; the system can then omit the leg servos and instead add motors that provide the driving torque. Furthermore, the embodiment shown in Fig. 1 uses an electromagnet as the grasping mechanism, which can only grasp ferrous articles. In another embodiment of the utility model, a clamping component can serve as the grasping mechanism: for example, the robot can be given finger structures capable of independent movement (adding new finger servos), and articles are grasped by clamping with the fingers.
It should also be noted that, since one of the main technical effects of the utility model is to realize the grasping behavior of a robot, this description focuses mainly on the realization of grasping behavior; other robot functions that merely assist the grasping function are not described in detail. Moreover, in practical applications the functions realized by the modules of the system of the utility model are not limited to grasping behavior (for example, the host of the system is not limited to generating action commands according to grasp or place instructions; it can also generate corresponding action commands according to other instructions to realize other functions).
Furthermore, the system of the utility model does not strictly limit the hardware structure of the robot (the structure of its moving components). Depending on the specific application environment, the robot involved in the utility model may have a variety of hardware configurations (number of servos and specific servo joint arrangements). In this embodiment, the robot is an anthropomorphic robot whose hardware structure comprises two legs, arms, and fingers; the corresponding servos include leg servos, arm servos, and finger servos.
Based on the host, slave, and servo structure, the electrical control system of the utility model can realize the grasping behavior of the robot. One key step in realizing the system's function is the host 100 generating the corresponding action command (planning the process details) according to the grasp or place instruction. To realize this function simply and accurately, the detailed process by which humans perform grasping behavior is analyzed further.
In this embodiment, image analysis is used to locate the grasp target/placement location. As shown in Fig. 1, the host 100 is configured to: collect external image information and identify the grasp object 101 (or the placement location) by analyzing the external image information; and generate the action commands according to the analysis result for the grasp object 101 (or the placement location).
Correspondingly, the structure of the host 100 is shown in Fig. 2. The host 100 includes an interactive instruction acquisition device 211, an interactive instruction analysis device 212, an image collection device 210, and an image analysis device 220. The interactive instruction acquisition device is configured to acquire and parse multimodal input data and generate an interactive instruction. When the interactive instruction is a grasp instruction and/or a place instruction, the interactive instruction analysis device 212 extracts a feature description of the grasp object or placement location from the grasp or place instruction. The image collection device 210 is configured to collect external image information. The image analysis device 220 is configured to parse the external image information according to the feature description and obtain the current position and volume information of the grasp object or placement location.
It should be noted that Fig. 2 is a schematic diagram of the functional modules of the host in one embodiment of the utility model. The structure of the system of the utility model is not limited to the structure shown in Fig. 2; in other embodiments of the utility model the structure shown in Fig. 2 may be modified as circumstances require.
Once the position of the grasp object or placement location is determined, the detailed process of grasping/placing can be planned further. The grasp/place process includes a displacement process that moves the hand onto the target (or moves the article held in the hand onto the placement location). If the robot is to move smoothly from point A to point B, the specific path of each step must be specified for it, and then the actions realizing each step of that path must be carried out. Therefore, in this embodiment the host 100 is configured first to plan path information according to the position information of the grasp object or placement location, and then to determine the robot action according to the path information and generate the action commands matching that action.
As shown in Fig. 2 main frame 100 includes path planning apparatus 230 and action command generating means 240.Path planning Device 230 is configured to according to crawl object or placement location positional information path planning information;Action command generating means 240 are matched somebody with somebody It is set to and determine robot motion according to routing information and generate the action command matched with action.
Refining the analysis of how humans perform grasping behavior, the movement in the grasp or place process can be divided into two parts. First comes whole-body displacement: the body moves to the vicinity of the grasp object or placement location (the hand moves together with the body). Then comes hand movement: the hand that is to grasp the article, or the hand holding the grasped article, moves onto the grasp object or placement location (while the body position remains essentially unchanged).
Based on this analysis, in this embodiment the path information planned by the host 100 (path planning device 230) first includes a displacement path, wherein: the displacement path is the motion track of the robot from its current position to a first position point near the grasp object or placement location; the host 100 (action command generation device 240) is configured to generate and output action commands matching the displacement path while the robot is on the displacement path.
That is, the path planning device 230 first plans the displacement path, and the action command generation device 240 generates the action commands matching the displacement path so that the robot is displaced to the first position point near the grasp object or placement location.
Then, the path information planned by the host 100 (path planning device 230) also includes a grasp/place path, wherein: the grasp/place path is the motion track of the robot's gripper component (the equivalent of a human hand) onto the grasp object or placement location while the robot is at the first position point; the host 100 (action command generation device 240) is configured to generate and output the action commands matching the grasp/place path when the robot is at the first position point.
That is, the path planning device 230 plans the grasp/place path, and the action command generation device 240 generates the action commands matching the grasp/place path so that the gripper component of the robot is displaced onto the grasp object or placement location.
Furthermore, in this embodiment the path planning device 230 plans paths based on the analysis results of the image analysis device. Because errors may arise in image collection, image analysis, path planning, action command generation, and subsequent action command execution, the robot cannot be guaranteed to move perfectly to the predetermined position based on the initially planned path information.
To address this problem and improve the execution success rate of the grasping behavior, in this embodiment the host 100 is further configured to: collect the changing external image information while the robot executes the action commands and identify the grasp object or placement location from that image information; and update the action commands according to the new analysis result for the grasp object or placement location. That is, as the robot's body/gripper component is displaced, the changing external image information is collected, new path information that better matches the actual situation is generated from it, and new action commands are adjusted and output according to the new path information.
In this embodiment, each servo is a self-feedback servo. While the robot is moving, the commanded angle of each self-feedback servo and the real-time angle value it reads back form one closed automatic control loop. In addition, the host's collection of changing external image information (generating new path information and new action commands) forms a second closed control loop with the robot's motion under the action commands. Under this double closed-loop control, the execution success rate of the robot's actions is greatly improved.
To let the user conveniently verify the correctness of the robot's grasping behavior, in this embodiment the host 100 is further configured to output voice information feeding back information related to the current action command. In this way, before the robot performs a specific action or while it performs it, the user can judge from the robot's voice feedback whether its grasping behavior is correct.
As shown in Fig. 2 main frame 100 also includes voice feedback device 270.Voice feedback device 270 is configured as output to currently The related information of action command, specifically comprising the combination of one or more in following information:The feature description of crawl object, crawl Object location information, placement location feature description, placement location positional information and routing information.Further, in this enforcement In example, the feedback information only when image acquisition, graphical analyses and path planning is initially carried out of voice feedback device 270, shortly When closed loop adjustment action command during do not export related information always, only when the adjustment degree of action command is more than pre- If threshold value when just carry out feedback prompting.
Furthermore, in this embodiment the path planning device 230 is configured to plan the path information (displacement path and grasp/place path) using a preset path planning model. In actual operation, however, because the real environment is highly variable, the preset path planning model cannot match reality perfectly, so the planned path information may not match reality well (for example, path deviations, detours, or path planning failures may occur). To improve the robot's adaptability, in this embodiment the host 100 is further configured to record previously generated action commands to train the preset path planning model.
As shown in Fig. 2 main frame 100 also includes path planning model optimization device 260, it is configured to record what is be previously generated Action command simultaneously optimizes default path planning model according to the implementation effect of action command, to be given birth to according to path planning model Into more reasonably path.
When the robot's gripper component is displaced onto the grasp object or placement location, the article can be grasped or put down. In a real environment, however, different grasp objects call for different grasp modes: some articles are suited to being clamped from the sides, some are suited to being lifted from below, and some have a dedicated grasping structure (such as a handle). Based on this analysis, in this embodiment the host 100 is further configured to: determine the state information of the grasp object or placement location from the image analysis result; plan the grasp/place mode according to the state information; and generate the matching action commands according to the grasp/place mode.
As shown in Fig. 2 main frame 100 also includes crawl/modes of emplacement determining device 250, it is configured to according to crawl object Or the image analysis of placement location capture the status information of object or placement location and plan crawl/placement side according to status information Formula.Action command generating means 240 generate the action command of matching crawl/modes of emplacement.
Furthermore, considering that during the place process the robot is displaced (body displacement and grasping mechanism displacement) while holding the article in the grasped state, in one embodiment of the utility model the current state of the grasped article (whether it is tilting or sliding) is monitored throughout the displacement process, and the path information and placement mode are planned and adjusted according to the article's state, so as to prevent the article from slipping during displacement.
In summary, the system of the utility model not only enables the robot to perform grasping behavior, but also effectively guarantees the smoothness and accuracy with which that behavior is carried out. Furthermore, the system of the utility model locates the grasp object and placement location based on image collection; its hardware structure is simple and its hardware cost is low, which gives the system of the utility model high practical and promotional value.
Next, the hardware circuit interface structure of the system of an embodiment of the utility model is described in detail. As shown in Fig. 3, in one embodiment of the utility model the host 310 is a main control board based on an MTK8163 processor. The MTK8163 is a mobile application processor promoted by MediaTek and is mainly used in embedded development. The host 310 provides a serial communication interface 321, speaker interfaces 313 and 314, microphone interfaces 311 and 312, a camera interface 315, and a power interface 319.
The slave is a main control board based on an STM32 processor. STM32 processors are microcontroller products released by STMicroelectronics (ST), built on the 32-bit Cortex-M series cores released by ARM. The slave 330 integrates a power management module 345, a lithium battery 349, a physical switch 348, and a soft switch 347. The slave 330 also provides a charging port 346, a serial communication interface 331, a ranging interface 341, an attitude sensor interface 342, servo interfaces 361-377 (18 servo interfaces), and a grasping mechanism interface 350.
Specifically, the serial communication interface 321 is a three-wire interface whose pin order is ground (GND), receive (RX), transmit (TX); it is connected to the serial communication interface 331 of the slave 330. The serial communication interface 331 is likewise a three-wire interface with pin order GND RX TX. Data transfer between the host 310 and the slave 330 is realized over this serial link.
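The patent specifies only the physical three-wire UART, not a frame format. As an illustration, a minimal framed protocol of the kind commonly layered over such a host-slave serial link (header byte, length, payload, additive checksum) might look like this; the header value and layout are pure assumptions:

```python
# Hypothetical host<->slave serial frame: [header][len][payload...][checksum].

HEADER = 0xAA  # assumed frame-start marker

def encode(payload: bytes) -> bytes:
    checksum = (HEADER + len(payload) + sum(payload)) & 0xFF
    return bytes([HEADER, len(payload)]) + payload + bytes([checksum])

def decode(frame: bytes) -> bytes:
    header, length = frame[0], frame[1]
    payload, checksum = frame[2:2 + length], frame[2 + length]
    assert header == HEADER, "bad frame start"
    assert checksum == (header + length + sum(payload)) & 0xFF, "bad checksum"
    return payload

# e.g. a two-byte drive instruction: servo id 1, target angle 45 degrees
msg = encode(b"\x01\x2d")
print(decode(msg) == b"\x01\x2d")  # True
```

A checksum matters here because the UART link itself provides no integrity check; a corrupted drive instruction would otherwise move the wrong servo.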
The speaker interfaces 313 and 314 are each two-wire interfaces with pin order audio signal positive (Speaker+), audio signal negative (Speaker-); they are connected to speakers 303 and 304 respectively. (This embodiment has 2 speaker interfaces (left and right channels); other embodiments of the utility model may provide a different number of speaker interfaces as specifically needed.) The system outputs speech through speakers 303 and 304 to realize voice feedback.
The microphone interfaces 311 and 312 are each two-wire interfaces with pin order microphone signal positive (Mic+), microphone signal negative (Mic-); they are connected to microphones 301 and 302 respectively. (This embodiment has 2 microphone interfaces, used to determine the sound source direction and distance; other embodiments of the utility model may provide a different number of microphone interfaces as specifically needed.)
The camera interface 315 is connected to the camera 305. Through the camera 305 and the microphones 301 and 302, the system collects external multimodal input information (user interaction input and external image information). This embodiment acquires user interaction input by combining voice collection with image collection; other embodiments of the utility model may employ other sensors according to actual needs.
The ranging interface 341 is connected to a lidar device 343, which assists in locating the grasp object/placement location.
The charging port 346 is a two-wire interface with pin order power (VCC), ground (GND); it is connected to the power management module 345, which in turn connects to the lithium battery 349. The power interface 319 is connected to the power management module 345 on the slave 330. The power management module 345 and the lithium battery 349 power the slave 330 and the host 310; the physical switch 348 controls whether the system is powered, and the soft switch 347 controls whether the system starts running.
The servo interfaces 361~377 (18 servo interfaces) are connected to servos 381~397 respectively. In this embodiment the 18 servos are: 5 servos for each of the left and right legs, 3 servos for each of the left and right hands, and 2 servos for the head. With these 18 servos, the system shown in Fig. 2 can support joint motion with 18 degrees of freedom. The robot of this embodiment adopts a humanoid structure: the left and right leg servos drive the robot to walk (with the left and right hand servos coordinating during walking), realizing the movement of the robot body (the realization of the displacement path); the left/right hand servos drive the displacement of the robot's left/right hand, realizing the movement of the robot's hands (the realization of the grasp/place path).
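The servo allocation above (5 per leg, 3 per hand, 2 for the head) can be laid out as a channel map; the joint names and the channel numbering below are illustrative assumptions, only the per-limb counts come from the text:

```python
# Illustrative mapping of the 18 servo channels to joints, matching the
# 18-degree-of-freedom count stated above.

SERVO_MAP = {}
channel = 1
for limb, count in [("left_leg", 5), ("right_leg", 5),
                    ("left_hand", 3), ("right_hand", 3), ("head", 2)]:
    for joint in range(count):
        SERVO_MAP[f"{limb}_{joint}"] = channel
        channel += 1

print(len(SERVO_MAP))  # 18 channels -> 18 degrees of freedom
```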
Further, grasping mechanism interface 350 connects to electromagnet 351. When the robot's left/right hand moves onto the grasp object, electromagnet 351 is energized and attracts the object; when the hand moves over the placement location, electromagnet 351 is de-energized and releases the object currently held.
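The electromagnet logic amounts to a single on/off control line. A minimal sketch, assuming a hypothetical pin-driver callable (on real hardware this would toggle a GPIO or driver output):

```python
# Minimal sketch of the electromagnet gripper logic described above:
# energize the coil to attract the object, cut power to release it.
class ElectromagnetGripper:
    def __init__(self, set_pin):
        # set_pin: any callable driving the magnet's control line (hypothetical).
        self._set_pin = set_pin
        self.holding = False

    def grasp(self):
        self._set_pin(True)    # energize coil -> magnet attracts object
        self.holding = True

    def release(self):
        self._set_pin(False)   # de-energize -> magnet drops the object
        self.holding = False

# Usage with a stand-in pin driver that just records the commands:
states = []
gripper = ElectromagnetGripper(states.append)
gripper.grasp()
gripper.release()
assert states == [True, False] and not gripper.holding
```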
Next, the concrete operation of the host part of a system according to one embodiment of the utility model is illustrated with a specific application example. Taking the grasp process first, and the host structure shown in Fig. 2 as an example, in a concrete application environment the process runs as follows:
Interactive instruction acquisition device 211 obtains and forwards the grasp instruction;
Interactive instruction parsing device 212 parses the grasp instruction to obtain a description of the grasp object's features;
Image acquisition device 210 collects and sends external image information;
Image parsing device 220 parses the external image information to isolate the grasp object image, and further determines the grasp object's state information and position information (the state information determined here includes the grasp object's volume);
Further, in this embodiment, to ensure a high success rate of the robot's grasping action, the robot can also predict the feasibility of the grasp before executing the action, judging whether the object can be grasped (in this embodiment, this is completed by image parsing device 220). When the robot judges that the object cannot be grasped (for example, its volume is too large, or its position is too far away or too high), voice feedback device 270 outputs a voice reminder.
When the robot judges that the object can be grasped, or cannot judge from the available data whether it can be grasped, the robot performs position judgment to determine whether the grasping action can be executed from its current location, i.e., whether displacement is needed (in some scenarios the robot's grasping mechanism is already positioned over the grasp object and no displacement is required; this judgment is completed by image parsing device 220). When the grasping action can be executed from the current position, grasp/placement mode determining device 250 determines the grasp mode, and action instruction generating device 240 generates and outputs the action instruction matching the grasp mode.
When the grasping action cannot be executed from the current position (movement is needed), the robot first determines whether whole-body displacement is required (completed by image parsing device 220). When whole-body displacement is needed, path planning device 230 plans the displacement path, and action instruction generating device 240 generates and outputs the action instruction matching the displacement path. When whole-body displacement is not needed (the grasp object is close by and the robot only needs to move its grasping mechanism), path planning device 230 plans the grasp path, and action instruction generating device 240 generates and outputs the action instruction matching the grasp path.
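The branching just described (feasibility check, then grasp-in-place vs. whole-body displacement vs. arm-only grasp path) can be sketched as a small decision function. All names below are hypothetical and only mirror the text; the image parsing device's judgments are modeled as three flags:

```python
# Illustrative decision flow for the grasp branch described above.
# graspable: True / False / None (None = cannot judge from available data).
def next_grasp_step(graspable, reachable_from_here, needs_body_move):
    if graspable is False:
        return "voice_reminder"          # object too big / too far / too high
    if reachable_from_here:
        return "grasp_mode_instruction"  # device 250 picks mode, 240 emits instruction
    if needs_body_move:
        return "displacement_path_instruction"  # whole-body move, planned by device 230
    return "grasp_path_instruction"      # only the grasping mechanism moves

assert next_grasp_step(False, False, True) == "voice_reminder"
assert next_grasp_step(None, True, False) == "grasp_mode_instruction"
assert next_grasp_step(True, False, True) == "displacement_path_instruction"
assert next_grasp_step(True, False, False) == "grasp_path_instruction"
```

Note that an uncertain feasibility judgment (`None`) falls through to position judgment rather than aborting, matching the text.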
After the action instruction is output, voice feedback device 270 gives voice feedback, outputting the related information of the action instruction. The robot acts under the control of the action instruction; meanwhile the system loops back: image acquisition device 210 continues its image acquisition work, collecting the changed external image information (as the robot moves, the external image information changes accordingly), and the system performs the next round of action instruction generation and output.
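This re-acquire/re-plan cycle is a closed perception-action loop. A minimal sketch, with all helper callables hypothetical stand-ins for the devices named in the text:

```python
# Sketch of the closed perception-action loop described above: after each
# action instruction, the host re-acquires the changed image and generates
# the next round of instructions, until the goal state is reached.
def control_loop(capture_image, analyze, plan_action, execute, max_rounds=50):
    for _ in range(max_rounds):
        state = analyze(capture_image())   # devices 210 + 220
        if state["done"]:
            return True                    # grasp/placement goal reached
        execute(plan_action(state))        # devices 230/240/250, then the slave
    return False                           # safety cap on rounds

# Toy run: the "world" is a distance that each executed action reduces by one.
world = {"distance": 3}
ok = control_loop(
    capture_image=lambda: world["distance"],
    analyze=lambda d: {"done": d == 0, "distance": d},
    plan_action=lambda s: "step",
    execute=lambda a: world.__setitem__("distance", world["distance"] - 1),
)
assert ok and world["distance"] == 0
```

The `max_rounds` cap is an added safeguard, not something the text specifies.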
Next, taking the placement process as an example, and the host structure shown in Fig. 2 as an example, in a concrete application environment the process runs as follows:
Interactive instruction acquisition device 211 obtains and forwards the placement instruction;
Interactive instruction parsing device 212 parses the placement instruction to obtain a description of the placement location's features;
Image acquisition device 210 collects and sends external image information;
Image parsing device 220 parses the external image information to isolate the placement location image, and further determines the placement location's state information and position information (the state information determined here includes whether the placement location is covered);
Further, in this embodiment, to ensure a high success rate of the robot's placement action, the robot can also predict the feasibility of the placement before executing the action, judging whether an article can be placed at the placement location (completed by image parsing device 220). When the robot judges that an article cannot be placed (for example, the placement location is too far away or too high, or is covered), voice feedback device 270 outputs a voice reminder.
When the robot judges that an article can be placed at the placement location, or cannot judge from the available data whether it can be placed there, the robot performs position judgment to determine whether the placement action can be executed from its current location, i.e., whether displacement is needed (in some scenarios the robot's grasping mechanism is already positioned over the placement location and no displacement is required; this judgment is completed by image parsing device 220). When the placement action can be executed from the current position, grasp/placement mode determining device 250 determines the placement mode, and action instruction generating device 240 generates and outputs the action instruction matching the placement mode.
When the placement action cannot be executed from the current position (movement is needed), the robot first determines whether whole-body displacement is required (completed by image parsing device 220). When whole-body displacement is needed, path planning device 230 plans the displacement path, and action instruction generating device 240 generates and outputs the action instruction matching the displacement path. When whole-body displacement is not needed (the placement location is close by and the robot only needs to move its grasping mechanism), path planning device 230 plans the placement path, and action instruction generating device 240 generates and outputs the action instruction matching the placement path.
Further, after determining that movement is needed, the robot also determines the current state of the object to be placed, so that this state can be referenced when planning the path information and generating the action instruction, preventing the article from slipping.
After the action instruction is output, voice feedback device 270 gives voice feedback, outputting the related information of the action instruction. The robot acts under the control of the action instruction; meanwhile the host loops back: image acquisition device 210 continues its image acquisition work, collecting the changed external image information (as the robot moves, the external image information changes accordingly), and the system performs the next round of action instruction generation and output.
In summary, with the system of the utility model, the robot can grasp a specified article and place it at a specified location. Compared with the prior art, the system of the utility model not only achieves grasping and placing behavior with a high success rate and high accuracy, but also has a simple structure and low hardware cost.
Further, an embodiment of the utility model also provides a humanoid robot, comprising the electronic control system for realizing humanoid robot grasping behavior according to an embodiment of the utility model.
Although the embodiments disclosed in the utility model are as above, the content described consists only of embodiments adopted to facilitate understanding of the utility model, and is not intended to limit it. The system described in the utility model may also have many other embodiments. Without departing from the essence of the utility model, those of ordinary skill in the art may make various corresponding changes or variations according to the utility model, but all such changes or variations shall fall within the protection scope of the claims of the utility model.

Claims (9)

1. An electronic control system for realizing humanoid robot grasping behavior, the system comprising a host, a slave, and an action actuating mechanism, the host being connected to the action actuating mechanism through the slave, wherein:
the host is configured to: obtain and parse multi-modal input data to generate an interactive instruction, and, when the interactive instruction is a grasp instruction and/or a placement instruction, generate a corresponding action instruction according to the grasp instruction or placement instruction and output it to the slave;
the slave is configured to: parse the action instruction and the operating state fed back by the action actuating mechanism, and generate and send a driving instruction to the action actuating mechanism until the operating state of the action actuating mechanism matches the action instruction;
the action actuating mechanism is configured to: execute the corresponding action according to the driving instruction, and acquire its operating state within a set time and feed it back to the slave.
2. The system according to claim 1, characterized in that generating the corresponding action instruction according to the grasp instruction or placement instruction comprises:
collecting external image information and analyzing the grasp object or placement location from the external image information;
generating the action instruction according to the analysis result for the grasp object or the placement location.
3. The system according to claim 2, characterized in that generating the action instruction according to the analysis result for the grasp object or the placement location comprises:
planning path information according to the analysis result for the grasp object or the placement location;
determining the robot's motion according to the current path information and generating the action instruction matching the motion.
4. The system according to claim 3, characterized in that the host is configured to plan the path information based on a preset path planning model when planning path information according to the analysis result for the grasp object or placement location, the path information comprising a displacement path and a grasp/placement path, wherein:
the displacement path is the robot's motion trajectory from its current location to a first location point near the grasp object or the placement location;
the host is configured to generate and output the action instruction matching the displacement path when the robot is on the displacement path;
the grasp/placement path is the motion trajectory along which the robot's gripper component moves onto the grasp object or the placement location when the robot is at the first location point;
the host is configured to generate and output the action instruction matching the grasp/placement path when the robot is at the first location point.
5. The system according to claim 3, characterized in that the host is further configured to:
record previously generated action instructions to train the preset path planning model.
6. The system according to claim 1, characterized in that the host is further configured to:
collect the changed external image information while the robot executes the action instruction, and analyze the grasp object or placement location from the external image information;
update the action instruction according to the new analysis result for the grasp object or the placement location.
7. The system according to claim 1, characterized in that the host is further configured to:
output voice information feeding back the related information of the current action instruction.
8. The system according to claim 1, characterized in that the action actuating mechanism comprises a plurality of self-feedback servos, the plurality of self-feedback servos comprising leg servos and hand servos.
9. A humanoid robot, characterized by comprising the electronic control system for realizing humanoid robot grasping behavior according to any one of claims 1-8.
CN201620359286.7U 2016-04-26 2016-04-26 A electrical system and humanoid robot for realizing humanoid robot snatchs action Active CN206105862U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201620359286.7U CN206105862U (en) 2016-04-26 2016-04-26 A electrical system and humanoid robot for realizing humanoid robot snatchs action


Publications (1)

Publication Number Publication Date
CN206105862U true CN206105862U (en) 2017-04-19

Family

ID=58508963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201620359286.7U Active CN206105862U (en) 2016-04-26 2016-04-26 A electrical system and humanoid robot for realizing humanoid robot snatchs action

Country Status (1)

Country Link
CN (1) CN206105862U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105773619A (en) * 2016-04-26 2016-07-20 北京光年无限科技有限公司 Electronic control system used for realizing grabbing behavior of humanoid robot and humanoid robot


Similar Documents

Publication Publication Date Title
CN105773619A (en) Electronic control system used for realizing grabbing behavior of humanoid robot and humanoid robot
US10913151B1 (en) Object hand-over between robot and actor
US12131529B2 (en) Virtual teach and repeat mobile manipulation system
US11474510B2 (en) Programming a robot by demonstration
JP2018024082A (en) Multiaxial motion control device, robot arm system, method of controlling movement of robot arm system, and method of controlling movement of multiaxial motion driving device
CN107127760A (en) A kind of track combined anthropomorphic robot of foot
US20240149458A1 (en) Robot remote operation control device, robot remote operation control system, robot remote operation control method, and program
CN107891425A (en) The control method of the intelligent man-machine co-melting humanoid robot system of both arms security cooperation
CN102848388A (en) Multi-sensor based positioning and grasping method for service robot
CN103324197A (en) Voice-control multi-functional intelligent service robot
JP2003266345A (en) Route planning device, route planning method, route planning program, and mobile robot device
CN111085996B (en) Control method, device and system of live working robot
CN111823277A (en) An object grasping platform and method based on machine vision
US11458632B2 (en) Robot having reduced vibration generation in in arm portion
CN206105862U (en) A electrical system and humanoid robot for realizing humanoid robot snatchs action
Sathyamoorthy et al. Automatic robotic arm based on bluetooth regulated for progressed surgical task
CN111309152A (en) A human-computer flexible interaction system and method based on intent recognition and impedance matching
CN211742054U (en) A Human-Computer Flexible Interaction System Based on Intent Recognition and Impedance Matching
Guan et al. On semi-autonomous robotic telemanipulation employing electromyography based motion decoding and potential fields
CN207578422U (en) The intelligent man-machine co-melting robot system of both arms security cooperation
Wang et al. A visual servoing system for interactive human-robot object transfer
CN216748540U (en) Quadruped robot
CN205721358U (en) Robot and control system thereof
CN114128461A (en) The control method of the plug-in seedling raising and transplanting robot and the plug-in seedling raising and transplanting robot
CN106863302A (en) A kind of robot charging device and its implementation

Legal Events

Date Code Title Description
GR01 Patent grant