
CN106346475A - Robot and robot control method - Google Patents

Robot and robot control method

Info

Publication number
CN106346475A
CN106346475A
Authority
CN
China
Prior art keywords
robot
eye assembly
interactive object
positional information
adjusting parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610936946.8A
Other languages
Chinese (zh)
Inventor
蒋化冰
孙斌
吴礼银
康力方
李小山
张干
赵亮
邹武林
徐浩明
廖凯
齐鹏举
方园
李兰
米万珠
舒剑
吴琨
管伟
罗璇
罗承雄
张海建
马晨星
张俊杰
谭舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Muye Robot Technology Co Ltd
Original Assignee
Shanghai Muye Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Muye Robot Technology Co Ltd
Priority to CN201610936946.8A
Publication of CN106346475A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1684 Tracking a line or surface by means of sensors
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Toys (AREA)

Abstract

The invention discloses a robot and a robot control method. The robot comprises an eye assembly installed on the face of the robot, a position detector, and a controller. The position detector detects position information of an interaction object; the controller determines adjusting parameters for the eye assembly from the position information and controls the eye assembly to make the corresponding adjustment. The eye assembly on the robot's face can thus move up, down, left, and right as the interaction object's position changes, which extends the human-machine interaction capability of the robot.

Description

Robot and robot control method
Technical field
The present application relates to the field of mobile robots, and more particularly to a robot and a robot control method.
Background technology
In recent years, robotics and artificial-intelligence research have advanced steadily, and intelligent robots play an increasingly important role in human life. As people's demands grow, robots with more human-like behavior will gradually become the favorites of the robotics field.
Users hope that robots can be more human-like, and in particular that interaction between robots and humans can come closer to interaction between people. At present, however, the interaction capabilities of most robots are rather limited: most robots merely execute corresponding actions in response to a user's voice or touch commands.
Summary of the invention
In view of this, embodiments of the present application provide a robot that extends the human-machine interaction capability of robots.
An embodiment of the present application provides a robot, comprising: an eye assembly installed on the face of the robot, a position detector, and a controller.
The position detector is configured to detect position information of an interaction object.
The controller is configured to determine adjusting parameters for the eye assembly according to the position information, and to control the eye assembly to make the corresponding adjustment according to those parameters.
An embodiment of the present application further provides a robot control method, comprising:
detecting position information of an interaction object;
determining adjusting parameters for an eye assembly of a robot according to the position information;
controlling the eye assembly to make the corresponding adjustment according to the adjusting parameters.
In the robot and robot control method provided by the present application, the position detector detects position information of the interaction object, and the controller determines the adjusting parameters of the robot's eye assembly from that information and adjusts the eye assembly accordingly. The eye assembly on the robot's face can therefore move up, down, left, and right as the interaction object's position changes, which extends the human-machine interaction capability of the robot.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. The drawings described below are evidently only some embodiments of the present invention; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of robot embodiment one provided by an embodiment of the present application;
Fig. 2 is a flow chart of robot control method embodiment one provided by an embodiment of the present application;
Fig. 3 is a flow chart of robot control method embodiment two provided by an embodiment of the present application.
Detailed description of the embodiments
To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Fig. 1 is a schematic structural diagram of robot embodiment one provided by an embodiment of the present application. As shown in Fig. 1, the robot includes an eye assembly 11 installed on the face of the robot, a position detector 12, and a controller 13.
The position detector 12 is configured to detect position information of an interaction object.
The controller 13 is configured to determine adjusting parameters for the eye assembly according to the position information, and to control the eye assembly 11 to make the corresponding adjustment according to those parameters.
Optionally, the controller 13 may be implemented with one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), microcontrollers, microprocessors, or other electronic components.
Optionally, in the embodiments of the present application, the position information may include the distance and/or the orientation angle between the robot and the interaction object.
In the embodiments of the present application, the position information is obtained by the position detector 12. The position detector 12 may be one or more of a microphone array, an image recognizer, a range sensor, an infrared sensor, a laser sensor, and an ultrasonic sensor.
Optionally, the robot is provided with a microphone array (mic array). A mic array is a group of omnidirectional microphones placed at different points in space in a regular geometric arrangement; it spatially samples propagating acoustic signals, so the collected signal carries spatial position information. In the embodiments of the present application, sound-source localization with the mic array yields the position information of the interaction object, and after receiving this position information the controller 13 sets the rotation parameters of the eye assembly 11. For example, if the mic array detects that the interaction object is to the left of the robot body, the current rotation of the robot's eye assembly can be set to turn left, with a rotation angle of, say, 5 degrees.
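By way of illustration, a minimal sketch of how a controller might map a mic-array bearing to rotation parameters for the eye assembly; the function name, the 0.25 gain, and the 30-degree cap are assumptions, not values from the patent:

```python
def eye_params_from_bearing(azimuth_deg):
    """Map a sound-source bearing from the microphone array to eye
    rotation parameters; negative azimuth means the source is to the
    robot's left."""
    direction = "left" if azimuth_deg < 0 else "right"
    # Scale the bearing down for a subtle, human-like glance, capped
    # at the assumed travel limit of the eye assembly.
    angle = min(abs(azimuth_deg) * 0.25, 30.0)
    return {"direction": direction, "angle_deg": angle}

# A speaker localized 20 degrees to the robot's left reproduces the
# 5-degree left turn used as the example above.
print(eye_params_from_bearing(-20.0))  # {'direction': 'left', 'angle_deg': 5.0}
```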
The image recognizer may be an integrated device with capture, image-recognition, and computing capabilities. By collecting images or a video stream containing a face with a camera, and detecting and tracking the face in the images, the interaction object can be found within the camera's field of view and its position information obtained. Because the interaction object's angle relative to the image recognizer affects how it appears in the image, the position of the interaction object relative to the robot can be computed from the captured image; this computation can be implemented with reference to the related art and is not repeated in this embodiment.
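The patent defers the computation to the related art; as one hedged illustration, here is a sketch using OpenCV's stock Haar-cascade face detector, assuming a pinhole camera with a known horizontal field of view:

```python
import cv2

# Stock frontal-face Haar cascade shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_bearing(frame, horizontal_fov_deg=60.0):
    """Estimate the interaction object's bearing (degrees, positive to
    the right) from where its face sits in the image, or None if no
    face is found. The 60-degree FOV is an assumed camera parameter."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    # Horizontal offset of the face centre from the image centre, as a
    # fraction of half the image width, scaled to half the FOV.
    half_width = frame.shape[1] / 2.0
    offset = (x + w / 2.0 - half_width) / half_width
    return offset * (horizontal_fov_deg / 2.0)
```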
Range sensors, infrared sensors, laser sensors, and ultrasonic sensors all act as position detectors by measuring distance and relative orientation angle. In general, the distance and orientation-angle detection described above is realized by combining several sensors. The working process is as follows: several position detectors emit detection signals simultaneously; the signals are reflected on reaching the measured object; after receiving the reflection, each detector records the round-trip time of its signal; the distance to the measured object is computed from the propagation speed of the detection signal; and the orientation of the object is obtained by jointly analyzing the distance results of the several sensors.
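The underlying arithmetic is simple time-of-flight ranging; a sketch, in which the far-field bearing formula and the sensor baseline are assumptions for illustration:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, e.g. for an ultrasonic sensor at room temperature

def distance_from_echo(round_trip_s, speed=SPEED_OF_SOUND):
    # The pulse travels out and back, so the range is half the path.
    return speed * round_trip_s / 2.0

def bearing_from_pair(d_left_m, d_right_m, baseline_m):
    # Far-field approximation: the range difference seen by two sensors
    # a known baseline apart is baseline * sin(bearing).
    ratio = max(-1.0, min(1.0, (d_left_m - d_right_m) / baseline_m))
    return math.degrees(math.asin(ratio))  # positive = toward the right sensor
```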
The eye assembly 11 may be a display screen or a mechanical component.
When the eye assembly 11 is a display screen, the screen can display different types of eye images according to the adjusting parameters of the eye assembly. Specifically, the controller 13 can control the eye image shown on the screen through the following process: after receiving the position information, the controller 13 queries a preset mapping table to obtain the identifier of the eye image corresponding to that position information, determines from the identifier which eye image needs to be displayed, and controls the display screen to show it.
Specifically, several eye images can be stored in the robot in advance, each with a unique identifier (id). A mapping table between eye-image identifiers and position information is also stored in the robot, the position information being the position of the interaction object relative to the robot. The correspondence between each group of position information and an eye-image identifier in the table is configured by advance measurement. For example, when the position information is 1 meter to the left-front at a deflection angle of 45 degrees, the id of the eye image found in the table is image 1; when the position information is 1 meter straight ahead, the id found in the table is likewise image 1.
The exact positions and angles above are given as precise numbers only for illustration; in practice each entry can instead cover a relatively small distance range and an angle range.
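A sketch of such a range-based mapping table; the interval boundaries and image ids below are illustrative assumptions:

```python
# Each entry: (min_distance_m, max_distance_m, min_angle_deg,
#              max_angle_deg, eye-image id). Angles: negative = left.
EYE_IMAGE_TABLE = [
    (0.0, 1.5, -60.0, -15.0, "image_1"),  # close, left-front
    (0.0, 1.5, -15.0,  15.0, "image_1"),  # close, straight ahead
    (0.0, 1.5,  15.0,  60.0, "image_2"),  # close, right-front
    (1.5, 5.0, -60.0,  60.0, "image_3"),  # farther away
]

def lookup_eye_image(distance_m, angle_deg):
    """Return the id of the eye image to display for this position,
    or None if it falls outside every configured interval."""
    for d_min, d_max, a_min, a_max, image_id in EYE_IMAGE_TABLE:
        if d_min <= distance_m < d_max and a_min <= angle_deg < a_max:
            return image_id
    return None
```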
When the eye assembly 11 is a mechanical component, the controller 13 can control the rotation direction and/or rotation angle of the eye assembly based on the position information.
Specifically, the mechanical component may include an eyeball rotation mechanism and an eyelid opening-and-closing structure. The controller 13 can therefore control the rotation angle and direction of the eyeball rotation mechanism according to the position information of the interaction object, and can also control the opening and closing of the eyelid structure.
Optionally, the controller's control of the eye assembly 11 based on the position information can be realized through preset correspondences between different position information and different adjusting parameters. For example, a linear relationship can be set between the position information and the adjusting parameters, or the position information can be divided into numerical intervals with a corresponding adjusting parameter configured for each interval. The adjusting parameters here are the rotation angle and direction of the eye assembly.
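Both correspondence schemes can be sketched as follows; the gain, travel limit, and interval boundaries are assumed values:

```python
def rotation_linear(angle_deg, gain=0.5, limit_deg=35.0):
    # Linear correspondence: eyeball rotation proportional to the
    # object's bearing, clamped at the mechanism's assumed travel limit.
    return max(-limit_deg, min(limit_deg, gain * angle_deg))

def rotation_intervals(angle_deg):
    # Interval correspondence: one preset rotation per bearing band.
    bands = [(-30.0, -30.0), (-10.0, -15.0), (10.0, 0.0), (30.0, 15.0)]
    for upper_bound, rotation in bands:
        if angle_deg <= upper_bound:
            return rotation
    return 30.0
```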
In this embodiment, the position detector installed on the robot detects the position information of the interaction object, and the controller determines the adjusting parameters of the robot's eye assembly from that information and adjusts the eye assembly accordingly. The eye assembly on the robot's face can thus move up, down, left, and right as the interaction object's position changes, which makes the robot more human-like and extends its modes of interaction.
Fig. 2 is a flow chart of robot control method embodiment one provided by an embodiment of the present application. As shown in Fig. 2, the method comprises the following steps:
Step 101: detect position information of an interaction object.
Step 102: determine adjusting parameters for the eye assembly of the robot according to the position information.
Step 103: control the eye assembly to make the corresponding adjustment according to the adjusting parameters.
In this embodiment, the detection of the position information and the adjustment of the robot's eye assembly can refer to the description of the embodiment shown in Fig. 1 and are not repeated here.
In this embodiment, the position information of the interaction object is detected, the robot determines the adjusting parameters of its eye assembly from that information, and the eye assembly is controlled to make the corresponding adjustment. The eye assembly on the robot's face can thus move up, down, left, and right as the interaction object's position changes, which extends the robot's human-machine interaction capability and makes interaction more human-like.
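Steps 101 to 103 amount to a sense-decide-act loop; a minimal sketch, in which the detector, controller, and eye-assembly interfaces are hypothetical stand-ins for the hardware:

```python
import time

def run_control_loop(position_detector, controller, eye_assembly, period_s=0.1):
    """Run one detect/decide/adjust cycle per period; the three
    arguments are hypothetical objects wrapping the real hardware."""
    while True:
        position = position_detector.detect()               # step 101
        params = controller.adjusting_parameters(position)  # step 102
        eye_assembly.apply(params)                          # step 103
        time.sleep(period_s)
```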
In addition, in the embodiments of the present application, the position information may include the distance between the robot and the interaction object. When looking at something near, a person's upper and lower eyelids open relatively wide; when looking at something far away, a person usually narrows the eyes to see more clearly, i.e., the eyelid opening is smaller. To further improve the robot's human-machine interaction, as the interaction object moves, the distance between the robot and the interaction object is obtained in real time and the eye assembly is adjusted according to the distance value. For example, when the eye assembly is a mechanical component, the rotation angle of the upper and lower eyelids can be set to vary linearly with the distance between the robot and the interaction object, or a corresponding eyelid rotation angle can be configured for each distance interval; when the eye assembly is a display screen, eye images with different degrees of eyelid opening can be displayed.
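A sketch of the linear relation between distance and eyelid opening; the near/far distances and the aperture range are assumed values:

```python
def eyelid_aperture(distance_m, d_near=0.5, d_far=3.0,
                    open_max=1.0, open_min=0.3):
    """Eyelid opening as a linear function of distance: fully open for
    a near interaction object, narrowed for a distant one, clamped at
    both ends of the assumed range."""
    if distance_m <= d_near:
        return open_max
    if distance_m >= d_far:
        return open_min
    t = (distance_m - d_near) / (d_far - d_near)
    return open_max - t * (open_max - open_min)
```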
When the position information includes the orientation angle between the robot and the interaction object, the following example illustrates the behavior. People usually make eye contact while talking to each other. To give the robot similar "eye contact" with the person it interacts with, the position detector obtains the orientation angle between the robot and the interaction object; when the interaction object's orientation changes, the robot's eye assembly follows the change and rotates. In particular, when there are several interaction objects, the orientation angles of all of them are obtained, so that "eye contact" can be made while talking with each of them in turn. For example, suppose the interaction objects are two people standing to the robot's left-front and right-front. While the person on the left-front speaks, the controller, based on the position detector's measurement of that person's orientation angle, controls the eyeballs in the eye assembly to rotate toward the left-front orientation; while the person on the right-front speaks, the controller, based on the detector's measurement of that person's orientation angle, controls the eyeballs to rotate toward the right-front orientation. The robot thereby makes flexible "eye contact" with its interaction objects.
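A sketch of selecting the gaze target among several interaction objects; the speaker-activity flag is assumed to come from the microphone array or position detector:

```python
def gaze_target(objects):
    """Pick the bearing of the interaction object currently speaking;
    `objects` is a list of (bearing_deg, is_speaking) pairs. Returns
    None to hold the current gaze when nobody is speaking."""
    for bearing_deg, is_speaking in objects:
        if is_speaking:
            return bearing_deg
    return None

# Two people at the robot's left-front and right-front; the person on
# the right-front is talking, so the eyes turn toward +45 degrees.
print(gaze_target([(-45.0, False), (45.0, True)]))  # 45.0
```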
Fig. 3 is a flow chart of robot control method embodiment two provided by an embodiment of the present application. As shown in Fig. 3, the robot control method comprises the following steps:
Step 201: detect position information of an interaction object.
Step 202: detect a characteristic attribute of the interaction object.
Step 203: determine adjusting parameters for the eye assembly of the robot according to the characteristic attribute and the position information.
Step 204: control the eye assembly to make the corresponding adjustment according to the adjusting parameters.
Optionally, the characteristic attribute includes at least one of the emotion, age, and gender of the interaction object.
Detecting the characteristic attribute of the interaction object can be realized with a microphone array, an image recognizer, and the controller. Specifically, when the microphone array is used to obtain the characteristic attribute, the array samples the interaction object's acoustic signal, and the controller performs acoustic recognition and classification on the sampled signal, obtaining information such as the gender, age, and emotion of the interaction object from its voice characteristics. The image recognizer can perform expression recognition on the face and derive the gender, age, and emotional state of the interaction object from the recognition result. It should be understood that these two methods for recognizing the interaction object's characteristic attributes can be used alone or in combination; the embodiments of the present application impose no restriction.
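A sketch of combining the two recognition routes; both classifiers are hypothetical placeholders, since the patent does not prescribe specific models:

```python
def detect_attributes(voice_classifier, face_classifier, audio, frame):
    """Fuse voice-based and expression-based estimates of the
    interaction object's characteristic attributes; either source can
    also be used alone, as the embodiment allows."""
    attrs = {}
    if audio is not None:
        attrs.update(voice_classifier(audio))  # e.g. gender, age, emotion
    if frame is not None:
        attrs.update(face_classifier(frame))   # expression-derived values
    return attrs  # e.g. {"gender": "female", "age": 30, "emotion": "excited"}
```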
The robot's eye assembly is then controlled by adjusting parameters determined from both the position information and the characteristic attribute of the interaction object. For example, when the interaction object's emotion is detected as excited, the direction and angle of eye rotation can be left unchanged while the eyelid structure is controlled to widen the opening between the upper and lower eyelids, or an eye image with an excited expression can be shown on the display screen.
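A sketch of folding the detected emotion into the adjusting parameters; the aperture values are assumptions:

```python
def adjust_for_emotion(params, emotion):
    """Keep the rotation derived from the position information, but
    widen the eyelid aperture when the interaction object is excited."""
    if emotion == "excited":
        aperture = params.get("eyelid_aperture", 0.7)
        params["eyelid_aperture"] = min(1.0, aperture + 0.3)
    return params
```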
In this embodiment, the attribute information of the interaction object is analyzed so that the robot's eye assembly presents different states for different attributes of the interaction object, which further improves the user experience and extends the robot's human-machine interaction capability.
The device embodiments described above are only schematic. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units: they may be located in one place or distributed over several network elements. Some or all of the modules can be selected according to actual needs to achieve the purpose of the embodiment's solution. Those of ordinary skill in the art can understand and implement the embodiments without creative effort.
From the description of the embodiments above, those skilled in the art can clearly understand that each embodiment can be realized by software plus a necessary general-purpose hardware platform, or of course by hardware. Based on this understanding, the technical solutions, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions that cause a computer device (a personal computer, a server, a network device, or the like) to execute the methods described in each embodiment or in parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to describe the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent substitutions for some technical features; such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A robot, characterized by comprising: an eye assembly installed on the face of the robot, a position detector, and a controller;
the position detector being configured to detect position information of an interaction object;
the controller being configured to determine adjusting parameters for the eye assembly according to the position information, and to control the eye assembly to make the corresponding adjustment according to the adjusting parameters.
2. The robot according to claim 1, characterized in that the position detector comprises:
one or more of a microphone array, an image recognizer, a range sensor, an infrared sensor, a laser sensor, and an ultrasonic sensor.
3. The robot according to claim 1, characterized in that the eye assembly comprises:
a display screen or a mechanical component.
4. The robot according to claim 3, characterized in that when the eye assembly is the mechanical component, the eye assembly includes an eyeball rotation mechanism and an eyelid opening-and-closing structure;
correspondingly, the adjusting parameters of the eye assembly include: the rotation direction and/or rotation angle of the eyeball rotation mechanism; and/or the rotation angle of the eyelid opening-and-closing structure.
5. The robot according to claim 3, characterized in that when the eye assembly is the display screen, the controller is specifically configured to:
query a preset mapping table to obtain the identifier of the eye image corresponding to the position information;
control the display screen to display the eye image corresponding to the identifier.
6. A robot control method, characterized by comprising:
detecting position information of an interaction object;
determining adjusting parameters for an eye assembly of a robot according to the position information;
controlling the eye assembly to make the corresponding adjustment according to the adjusting parameters.
7. The method according to claim 6, characterized in that the method further comprises:
detecting a characteristic attribute of the interaction object, wherein the characteristic attribute includes at least one of the following: the emotion, age, and gender of the interaction object;
and in that determining the adjusting parameters of the eye assembly of the robot according to the position information comprises:
determining the adjusting parameters of the eye assembly of the robot according to the position information and the characteristic attribute.
8. The method according to claim 7, characterized in that detecting the characteristic attribute of the interaction object comprises:
collecting an acoustic signal of the interaction object;
analyzing the acoustic signal to obtain the characteristic attribute of the interaction object.
9. The method according to claim 7 or 8, characterized in that detecting the characteristic attribute of the interaction object comprises:
performing facial expression recognition on the interaction object;
obtaining the characteristic attribute of the interaction object according to the result of the facial expression recognition.
CN201610936946.8A 2016-11-01 2016-11-01 Robot and robot control method Pending CN106346475A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610936946.8A CN106346475A (en) 2016-11-01 2016-11-01 Robot and robot control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610936946.8A CN106346475A (en) 2016-11-01 2016-11-01 Robot and robot control method

Publications (1)

Publication Number Publication Date
CN106346475A 2017-01-25

Family

ID=57864076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610936946.8A Pending CN106346475A (en) 2016-11-01 2016-11-01 Robot and robot control method

Country Status (1)

Country Link
CN (1) CN106346475A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002045916A1 (en) * 2000-12-06 2002-06-13 Sony Corporation Robot device, method for controlling motion of robot device, and system for controlling motion of robot device
CN102566474A (en) * 2012-03-12 2012-07-11 上海大学 Interaction system and method for robot with humanoid facial expressions, and face detection and tracking method
CN102915044A (en) * 2012-10-25 2013-02-06 上海大学 Robot head-eye coordination motion control method based on bionic principle
CN103679203A (en) * 2013-12-18 2014-03-26 江苏久祥汽车电器集团有限公司 Robot system and method for detecting human face and recognizing emotion
CN105590084A (en) * 2014-11-03 2016-05-18 贵州亿丰升华科技机器人有限公司 Robot human face detection tracking emotion detection system
CN105234945A (en) * 2015-09-29 2016-01-13 塔米智能科技(北京)有限公司 Welcome robot based on network voice dialog and somatosensory interaction
CN205325695U (en) * 2015-12-29 2016-06-22 广东奥飞动漫文化股份有限公司 Machine people is accompanied to intelligence

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109108960A (en) * 2017-06-23 2019-01-01 卡西欧计算机株式会社 Robot, the control method of robot and storage medium
CN109108960B (en) * 2017-06-23 2022-03-01 卡西欧计算机株式会社 Robot, robot control method, and storage medium
CN109895140A (en) * 2017-12-10 2019-06-18 湘潭宏远电子科技有限公司 A kind of robotically-driven trigger device
WO2019178863A1 (en) * 2018-03-23 2019-09-26 深圳市大疆创新科技有限公司 Control method, control device, control system, and computer readable storage medium
CN108647633A (en) * 2018-05-08 2018-10-12 腾讯科技(深圳)有限公司 Recognition and tracking method, recognition and tracking device and robot
CN108647633B (en) * 2018-05-08 2023-12-22 腾讯科技(深圳)有限公司 Identification tracking method, identification tracking device and robot

Similar Documents

Publication Publication Date Title
CN105159111B (en) Intelligent interaction device control method and system based on artificial intelligence
US10646966B2 (en) Object recognition and presentation for the visually impaired
CN105554385B (en) A kind of remote multi-modal biological characteristic recognition methods and its system
JP2024045273A (en) System and method for detecting human gaze and gesture in unconstrained environments
JP5456832B2 (en) Apparatus and method for determining relevance of an input utterance
US9753119B1 (en) Audio and depth based sound source localization
CN108200334B (en) Image capturing method, device, storage medium and electronic device
CN106335071A (en) The robot and robot control method
CN108369653A (en) Use the eyes gesture recognition of eye feature
KR102463806B1 (en) Electronic device capable of moving and method for operating thereof
US11671739B2 (en) Adjustment mechanism for tissue transducer
CN106346475A (en) Robot and robot control method
CN103516985A (en) Mobile terminal and image acquisition method thereof
CN113692750A (en) Sound transfer function personalization using sound scene analysis and beamforming
US10552675B2 (en) Method and apparatus for eye detection from glints
US11638110B1 (en) Determination of composite acoustic parameter value for presentation of audio content
KR101759444B1 (en) Expression recognition sysyem and method using a head mounted display
CN103279188A (en) Method for operating and controlling PPT in non-contact mode based on Kinect
JP2007257088A (en) Robot apparatus and communication method thereof
US10665243B1 (en) Subvocalized speech recognition
CN107111363A (en) Monitoring
CN111243624B (en) Method and system for evaluating personnel state
CN206475182U (en) Robot
US11234090B2 (en) Using audio visual correspondence for sound source identification
US20230095350A1 (en) Focus group apparatus and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170125