CN105844673B - Full-angle human tracking system based on natural human-computer interaction technology and control method - Google Patents
- Publication number
- CN105844673B · CN201610342160.3A · CN201610342160A
- Authority
- CN
- China
- Prior art keywords
- tracking
- human
- principle
- computer interaction
- person
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Abstract
A full-angle human tracking system and control method based on a natural human-computer interaction technology. The system comprises: a natural human-computer interaction data processing unit, which analyzes the collected natural human-computer interaction data and processes it into a control instruction packet for output; and a control instruction packet conversion unit, which converts the control instruction packet into a control signal conforming to the full-angle human tracking system, thereby completing control of the monitoring equipment. The invention also provides a full-angle human tracking control method based on the natural human-computer interaction technology, comprising the following steps: collecting natural human-computer interaction data; analyzing the natural human-computer interaction data, determining the tracked person, and outputting a control instruction packet; and converting the control instruction packet into a control signal conforming to the full-angle human tracking system. The invention not only enables the monitoring equipment to track the tracked person without dead angles, but also intelligently excludes other external factors and locks onto the tracked person.
Description
Technical Field
The invention relates to the field of human-computer interaction control systems, in particular to a full-angle human tracking system and a control method based on a natural human-computer interaction technology.
Background
Video tracking is widely applied at the present stage in fields such as live video broadcasting, video conferencing, and distance teaching; it shortens the distance between two places and provides an interaction function.
In the prior art, the patent document with publication number CN101290681B discloses a video target tracking method, device, and automatic video tracking system, which performs gradient vector flow (GVF) deformation on each candidate position in the current frame to obtain deformation curves, calculates video characteristics of the deformation curves, and determines a candidate position as the target position according to the calculated video characteristics. That patent also discloses a video target tracking device and an automatic video tracking system that can improve target tracking accuracy. Its defects are that the tracked person cannot be monitored through 360 degrees and that, when several people are present, the tracked person cannot be accurately monitored.
Disclosure of Invention
The invention adopts a natural human-computer interaction technology to determine the person who enters the monitoring picture as the tracked person, so that the tracking equipment always follows the tracked person and the tracked person remains accurately locked even when other people enter the monitoring picture. When the tracked person moves out of the tracking device's view, the tracking device adjusts to its initial position and waits for the tracked person to re-enter the monitoring picture. The tracked person can thus be monitored through 360 degrees, solving the blind-area problem in the prior art; and the first person to enter the monitoring picture can be locked as the tracked person, solving the prior-art problem that the tracked person is easily confused with others when several people enter the picture.
The specific technical scheme of the invention is as follows:
the invention provides a full-angle human tracking system based on a natural human-computer interaction technology, which comprises a natural human-computer interaction data acquisition module for acquiring natural human-computer interaction data, and preferably comprises the following modules:
the tracking identification module, which identifies and locks the tracked person;
the control center module, which analyzes the collected natural human-computer interaction data and processes the data into a control instruction packet for output;
the instruction conversion module, which converts the control instruction packet into a control signal conforming to the full-angle human tracking system;
the intelligent tracking module, which receives the control signal and automatically tracks according to the movement of the human body.
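As a non-authoritative illustration, the data flow through these four modules could be sketched as follows in Python; the class names, packet fields, and signal encoding are assumptions for illustration only, since the invention does not prescribe an implementation.

```python
from dataclasses import dataclass

@dataclass
class ControlInstructionPacket:
    """Hypothetical structure for the control center's output packet."""
    command: str        # e.g. "track", "wait", "return_to_initial"
    pan_degrees: float  # desired pan relative to the current heading
    zoom_level: float   # desired camera zoom factor

class TrackingIdentificationModule:
    def identify_and_lock(self, frame_data: dict) -> dict:
        # Placeholder: apply the configured identification principles and
        # return information describing the locked tracked person.
        return {"person_id": 1, "position": frame_data.get("position", (0.0, 0.0))}

class ControlCenterModule:
    def analyze(self, tracked_info: dict) -> ControlInstructionPacket:
        # Placeholder: turn tracked-person information into an instruction packet.
        x, _ = tracked_info["position"]
        return ControlInstructionPacket("track", pan_degrees=10.0 * x, zoom_level=1.0)

class InstructionConversionModule:
    def convert(self, packet: ControlInstructionPacket) -> bytes:
        # Placeholder: encode the packet as a control signal for the pan/tilt head.
        return f"{packet.command}:{packet.pan_degrees:.1f}:{packet.zoom_level:.1f}".encode()

class IntelligentTrackingModule:
    def execute(self, signal: bytes) -> None:
        # Placeholder: drive the tracking device according to the control signal.
        print("pan/tilt control signal:", signal.decode())

# One pass through the pipeline on a fabricated frame:
frame = {"position": (0.4, 0.0)}
info = TrackingIdentificationModule().identify_and_lock(frame)
packet = ControlCenterModule().analyze(info)
IntelligentTrackingModule().execute(InstructionConversionModule().convert(packet))
```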
In the above solution, it is preferable that the tracking identification module has a function of locking the tracked person according to an identification principle.
In the foregoing solution, it is preferable that the recognition principle includes at least one of a position recognition principle, a gesture recognition principle, an action recognition principle, a face recognition principle, a voice recognition principle, a first-entry locking principle, and an intelligent ignoring principle.
In the above aspect, it is preferable that the position recognition principle is to determine the tracked person based on the relative position between the person in the recognition area and the sensor.
In the above solution, preferably, the gesture recognition principle is that when a person in the recognition area takes the same static body gesture as that preset by the system, the person is determined to be a tracked person.
In the above solution, preferably, the action recognition principle is that when a person in the recognition area performs the same action as that preset by the system, the person is determined to be the tracked person, and the action recognition includes recognizing at least one of an action track and an action time limit.
In the above scheme, it is preferable that the face recognition principle is to determine whether the person is a tracked person according to face data pre-stored in the system.
In the above-mentioned solution, it is preferable that the voice recognition principle determines whether the person is a tracked person according to audio data prestored in the system.
In the above aspect, the first-entry locking principle is preferably to determine that the first person entering the identification area is the tracked person.
In the above solution, it is preferable that the intelligent ignoring principle means that other persons entering the identification area are intelligently ignored after the tracked person is locked.
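To make the combination of identification principles concrete, here is a minimal sketch, assuming hypothetical candidate fields and thresholds (none of which are specified by the invention): each enabled principle is a predicate, and a person is locked only when every enabled predicate holds.

```python
from typing import Callable, Optional

Candidate = dict  # e.g. {"position": (x, z), "gesture": "...", "entry_order": 0}

def position_principle(c: Candidate) -> bool:
    # Locked only if the person stands within an assumed zone relative to the sensor.
    x, z = c["position"]
    return abs(x) < 1.0 and z < 3.0

def gesture_principle(c: Candidate, preset: str = "raise_right_hand") -> bool:
    # Locked only if the person holds the static posture the system preset defines.
    return c.get("gesture") == preset

def first_entry_principle(c: Candidate) -> bool:
    # Locked only if this person was the first to enter the identification area.
    return c.get("entry_order") == 0

def lock_tracked_person(candidates: list[Candidate],
                        principles: list[Callable[[Candidate], bool]]) -> Optional[Candidate]:
    """Return the first candidate satisfying every enabled principle, else None."""
    for person in candidates:
        if all(p(person) for p in principles):
            return person
    return None

people = [{"position": (2.5, 4.0), "gesture": None, "entry_order": 1},
          {"position": (0.2, 2.0), "gesture": "raise_right_hand", "entry_order": 0}]
locked = lock_tracked_person(people, [position_principle, gesture_principle,
                                      first_entry_principle])
print("locked:", locked)
```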
In the above aspect, it is preferable that the tracking identification module further has a function of tracking the tracked person according to a tracking principle.
In the above aspect, it is preferable that the tracking principle includes at least one of a skeletal data tracking principle, a depth data assisted tracking principle, a sound tracking principle, and an automatic return principle.
In the above scheme, preferably, the skeletal data tracking principle is to extract skeletal data of a tracked person and track the skeletal data.
In the above aspect, the depth data-aided tracking principle is preferably that when the skeleton data is disordered, the depth data is retrieved from the acquired data to assist in tracking the tracked person.
In the above aspect, the sound tracking principle is preferably that when the tracked person moves out of the recognition range, the sound of the tracked person can be automatically tracked, and then the person recognition is performed again.
In the above scheme, preferably, the automatic return principle means that if no data of the tracked person is acquired within a set time interval, the tracking device automatically returns to a preset position and performs person identification again.
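The four tracking principles behave like a fallback chain: skeletal data first, depth data when the skeleton is disordered, sound when the person leaves the recognition range, and automatic return after the set interval. A hedged sketch follows; the field names and the timeout value are illustrative assumptions, not values from the patent.

```python
RETURN_TIMEOUT_S = 5.0  # assumed "set time interval" before automatic return

def track_step(sample: dict, last_seen: float) -> tuple[str, float]:
    """Choose a tracking source for one data sample; returns (action, last_seen)."""
    now = sample["t"]
    if sample.get("skeleton") is not None:       # skeletal data tracking
        return "track_by_skeleton", now
    if sample.get("depth") is not None:          # depth data assists when the
        return "track_by_depth", now             # skeleton data is disordered
    if sample.get("sound_bearing") is not None:  # sound tracking when the person
        return "track_by_sound", now             # moves out of recognition range
    if now - last_seen > RETURN_TIMEOUT_S:       # no data for too long: return to
        return "return_to_initial", last_seen    # the preset position, re-identify
    return "wait", last_seen

last_seen = 0.0
for sample in ({"t": 1.0, "skeleton": "joints"},
               {"t": 2.0, "depth": "depth-map"},
               {"t": 3.0, "sound_bearing": 42.0},
               {"t": 9.5}):
    action, last_seen = track_step(sample, last_seen)
    print(sample["t"], "->", action)
```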
In the above aspect, it is preferable that the control center module has a function of receiving the tracked person information sent by the tracking identification module.
In the above solution, it is preferable that the control center module further has a function of recording information of a tracked person.
In the above aspect, the tracked person information preferably refers to at least one of skeleton data, position data, posture data, motion data, face data, and audio data.
In the above aspect, preferably, the control center module further has a function of analyzing a voice command, an action command, and a limb command of the tracked person.
In the above solution, preferably, the control center module further has a function of predicting real-time actions of the tracked person.
In the above scheme, preferably, the control center module further has a function of outputting instructions that automatically correct the real-time motion of the pan/tilt head.
In the above aspect, it is preferable that the control center module has a function of processing the analyzed command into a control command and outputting the control command.
In the above aspect, it is preferable that the instruction conversion module has a function of receiving a control instruction of the control center.
In the above solution, preferably, the instruction conversion module has a function of converting a control instruction of the control center into an action instruction for controlling the intelligent tracking module.
In the above aspect, it is preferable that the intelligent tracking module has a function of tracking the tracked person according to the motion instruction.
In the above aspect, it is preferable that the intelligent tracking module has a function of controlling the motion of the pan/tilt head.
In the above aspect, it is preferable that the pan/tilt head motion includes at least one of a basic action, a switching action, and an advanced action.
In the above aspect, it is preferable that the basic action includes at least one of opening, closing, and pausing the module.
In the above aspect, it is preferable that the switching action includes at least one of mode switching and multi-camera switching.
In the above aspect, it is preferable that the advanced action includes at least one of rotation to a specified angle, adjustment of the camera focal length, adjustment of speed, forward and backward movement, left and right movement, and up and down movement.
In the above aspect, it is preferable that the rotation angle is 0 to 360 degrees.
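The pan/tilt action set above can be modeled as a small command vocabulary; the sketch below is an assumption-laden illustration (the enum names, dictionary encoding, and the clamp to 0-360 degrees follow the text but are otherwise invented).

```python
from enum import Enum

class BasicAction(Enum):
    OPEN = "open"    # open the module
    CLOSE = "close"  # close the module
    PAUSE = "pause"  # pause the module

class SwitchingAction(Enum):
    MODE = "mode_switch"
    MULTI_CAMERA = "multi_camera_switch"

def rotate_to(angle_degrees: float) -> dict:
    """Advanced action: rotate to a specified angle, clamped to 0-360 degrees."""
    return {"action": "rotate", "angle_degrees": max(0.0, min(360.0, angle_degrees))}

def adjust_focal_length(zoom_factor: float) -> dict:
    """Advanced action: adjust the camera focal length (assumed zoom factor)."""
    return {"action": "zoom", "factor": zoom_factor}

print(rotate_to(400.0))  # clamped to 360 degrees
print(adjust_focal_length(2.5))
print(BasicAction.PAUSE.value, SwitchingAction.MULTI_CAMERA.value)
```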
The second aspect of the present invention further provides a full-angle human tracking control method based on a natural human-computer interaction technology, which includes collecting natural human-computer interaction data and preferably comprises the following steps:
Step 1: identifying and locking the tracked person;
Step 2: analyzing the collected natural human-computer interaction data and processing the data into a control instruction packet for output;
Step 3: converting the control instruction packet into a control signal conforming to the full-angle human tracking system;
Step 4: receiving the control signal and automatically tracking according to the movement of the human body.
In the above solution, it is preferable that the step 1 is to lock the tracked person according to the identification principle.
In the foregoing solution, it is preferable that the recognition principle includes at least one of a position recognition principle, a gesture recognition principle, an action recognition principle, a face recognition principle, a voice recognition principle, a first-entry locking principle, and an intelligent ignoring principle.
In the above aspect, it is preferable that the position recognition principle is to determine the tracked person based on the relative position between the person in the recognition area and the sensor.
In the above solution, preferably, the gesture recognition principle is that when a person in the recognition area takes the same static body gesture as that preset by the system, the person is determined to be a tracked person.
In the above solution, preferably, the action recognition principle is that when a person in the recognition area performs the same action as that preset by the system, the person is determined to be the tracked person, and the action recognition includes recognizing at least one of an action track and an action time limit.
In the above scheme, it is preferable that the face recognition principle is to determine whether the person is a tracked person according to face data pre-stored in the system.
In the above-mentioned solution, it is preferable that the voice recognition principle determines whether the person is a tracked person according to audio data prestored in the system.
In the above aspect, the first-entry locking principle is preferably to determine that the first person entering the identification area is the tracked person.
In the above solution, it is preferable that the intelligent ignoring principle means that other persons entering the identification area are intelligently ignored after the tracked person is locked.
In the above scheme, preferably, the step 1 is tracking the tracked person according to a tracking principle.
In the above aspect, it is preferable that the tracking principle includes at least one of a skeletal data tracking principle, a depth data assisted tracking principle, a sound tracking principle, and an automatic return principle.
In the above scheme, preferably, the skeletal data tracking principle is to extract skeletal data of a tracked person and track the skeletal data.
In the above aspect, the depth data-aided tracking principle is preferably that when the skeleton data is disordered, the depth data is retrieved from the acquired data to assist in tracking the tracked person.
In the above solution, it is preferable that the sound tracking principle is that when the tracked person moves out of the recognition range, the sound of the tracked person is automatically tracked, and then person recognition is performed again.
In the above scheme, preferably, the automatic return principle means that if no data of the tracked person is acquired within a set time interval, the tracking device automatically returns to a preset position and performs person identification again.
In the above solution, preferably, the step 2 is to receive the tracked person information sent by the tracking identification module.
In the above scheme, preferably, the step 2 is to record the tracked person information.
In the above aspect, the tracked person information preferably refers to at least one of skeleton data, position data, posture data, motion data, face data, and audio data.
In the above aspect, preferably, the step 2 is to analyze the tracked person's voice commands, action commands, and limb commands.
In the above solution, preferably, the step 2 further includes predicting the real-time actions of the tracked person.
In the above scheme, preferably, the step 2 further includes outputting instructions that automatically correct the real-time motion of the pan/tilt head.
In the above aspect, preferably, the step 2 is to process the analyzed command into a control command and output the control command.
In the above solution, preferably, the step 3 is to receive a control instruction from a control center.
In the above solution, preferably, the step 3 is to convert a control command of the control center into an action command for controlling the intelligent tracking module.
In the above solution, preferably, the step 4 is tracking the tracked person according to the motion instruction.
In the above scheme, preferably, the step 4 is to control the motion of the pan/tilt head.
In the above aspect, it is preferable that the pan/tilt head motion includes at least one of a basic action, a switching action, and an advanced action.
In the above aspect, it is preferable that the basic action includes at least one of opening, closing, and pausing the module.
In the above aspect, it is preferable that the switching action includes at least one of mode switching and multi-camera switching.
In the above aspect, it is preferable that the advanced action includes at least one of rotation to a specified angle, adjustment of the camera focal length, adjustment of speed, forward and backward movement, left and right movement, and up and down movement.
In the above aspect, it is preferable that the rotation angle is 0 to 360 degrees.
The invention is applicable in a wide range of fields, effectively solves the problem of monitoring the tracked person through a full 360 degrees, can accurately monitor the tracked person even when several people are present, and has broad market prospects.
Drawings
Fig. 1A is a schematic flow chart of a full-angle human tracking control method based on a natural human-computer interaction technology according to the present invention.
FIG. 1B is a schematic diagram of the module operation of the full-angle human tracking control method based on the natural human-computer interaction technology.
Fig. 2 is a schematic diagram of a teaching demonstration course recorded in the full-angle human tracking control method based on the natural human-computer interaction technology.
Fig. 3 is a schematic diagram of a remote video conference in the full-angle human tracking control method based on the natural human-computer interaction technology.
Fig. 4 is a schematic diagram of remote video interactive teaching in the full-angle human tracking control method based on the natural human-computer interaction technology.
Fig. 5 is a schematic view of course recording and playing in the full-angle human tracking control method based on the natural human-computer interaction technology.
Detailed Description
Fig. 1A is a schematic flow chart of the full-angle human tracking control method based on the natural human-computer interaction technology according to the present invention. FIG. 1B is a schematic diagram of the module operation of the method. As shown in FIGS. 1A-1B, in step 110 the tracking system is turned on and the tracking identification module 191 begins to operate. Step 120 is then executed to connect and turn on the tracking device, and in step 130 the tracking system performs initialization setting, where the initialization setting includes setting an initial monitoring area, an identification principle, and a tracking principle: the identification principle comprises one or more of the position recognition principle, the gesture recognition principle, the action recognition principle, the face recognition principle, the voice recognition principle, the first-entry locking principle, and the intelligent ignoring principle; the tracking principle comprises one or more of the skeletal data tracking principle, the depth data assisted tracking principle, the sound tracking principle, and the automatic return principle.
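A minimal sketch of what the step-130 initialization settings might look like; the dictionary keys, area bounds, and default principle choices are assumptions for illustration.

```python
initialization = {
    # Assumed monitoring area in metres relative to the sensor.
    "initial_monitoring_area": {"x_range": (-1.5, 1.5), "z_range": (0.5, 3.5)},
    "identification_principles": ["gesture", "face", "first_entry"],
    "tracking_principles": ["skeleton", "depth_assisted", "sound", "auto_return"],
}

def initialize_tracking_system(settings: dict) -> None:
    # Placeholder: hand the chosen principles to the tracking identification
    # module and aim the tracking device at the initial monitoring area.
    print("monitoring area:", settings["initial_monitoring_area"])
    print("identify by:", ", ".join(settings["identification_principles"]))
    print("track by:", ", ".join(settings["tracking_principles"]))

initialize_tracking_system(initialization)
```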
In step 160, the control center module 192 sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts the control instruction into a tracking movement instruction and sends it to the intelligent tracking module 194; and the intelligent tracking module 194 controls the tracking device to rotate with the movement of the tracked person according to the tracking movement instruction.
In step 165, when the tracked person leaves the monitoring area, the tracking identification module 191 sends information to the control center module 192; the control center module 192 analyzes and processes the information and sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts the control instruction into a waiting judgment instruction and sends it to the intelligent tracking module 194; and the intelligent tracking module 194 puts the tracking device into a waiting state according to the waiting judgment instruction.
Step 170 determines whether anyone enters the monitored area. If no one enters the monitoring area and no voice command from the tracked person is received before the waiting time expires, the control center module 192 sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts the control instruction into a return-to-initial-state instruction and sends it to the intelligent tracking module 194; and the intelligent tracking module 194 controls the tracking device to return to the initial position, after which execution continues from step 130.
If someone enters the monitoring area, step 180 is executed: the tracking device judges whether the person is the tracked person according to the set identification principle. The tracking identification module 191 sends information to the control center module 192; the control center module 192 analyzes and processes the information and sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts the control instruction into a waiting judgment instruction and sends it to the intelligent tracking module 194; and the intelligent tracking module 194 keeps the tracking device in a waiting state according to the waiting judgment instruction.
If the person is not the tracked person, the control center module 192 sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts it into a continue-waiting instruction and sends it to the intelligent tracking module 194; the intelligent tracking module 194 keeps the tracking device in a waiting state, and execution continues from step 165.
If the person is the tracked person, the control center module 192 sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts it into a continue-tracking instruction and sends it to the intelligent tracking module 194; the intelligent tracking module 194 controls the tracking device to track the tracked person, and execution continues from step 160.
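Steps 160 through 180 amount to a three-state loop (tracking, waiting, return to initial). The sketch below is one way to express it, assuming event fields and a waiting timeout that the text leaves unspecified.

```python
WAIT_TIMEOUT_S = 10.0  # assumed waiting time before returning to the initial state

def next_state(state: str, event: dict) -> str:
    if state == "TRACKING":
        # Step 165: the tracked person leaves the monitoring area.
        return "WAITING" if event.get("subject_left") else "TRACKING"
    if state == "WAITING":
        if event.get("person_entered"):
            # Step 180: decide whether the newcomer is the tracked person.
            return "TRACKING" if event.get("is_tracked_person") else "WAITING"
        if event.get("elapsed", 0.0) > WAIT_TIMEOUT_S:
            # Step 170: nobody returned in time; go back to the initial position.
            return "INITIAL"
        return "WAITING"
    return "INITIAL"  # step 130 re-runs initialization from here

state = "TRACKING"
for ev in ({"subject_left": True},
           {"person_entered": True, "is_tracked_person": False},
           {"elapsed": 12.0}):
    state = next_state(state, ev)
    print(ev, "->", state)
```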
Fig. 2 is a schematic diagram of recording a teaching demonstration course with the full-angle human tracking control method based on the natural human-computer interaction technology. Referring to fig. 2, a video recording device 201 is hung at the top center of a classroom and the tracking system is installed in a computer 202; the recognition principles are set to the gesture recognition principle, the face recognition principle, and the first-entry locking principle, the tracking principles are set to the skeletal data tracking principle, the depth data assisted tracking principle, the sound tracking principle, and the automatic return principle, and the initial monitoring area is the area near the platform. When the lecturing teacher 203 enters the shot capture area, the teacher is automatically locked as the tracked person. Whenever control becomes uncertain, the lens automatically returns to the blackboard position and waits for the teacher to regain control; the focal length is automatically adjusted according to the teacher's position, and the tracking action can be adjusted by the teacher's voice commands. The method is not affected by ambient light levels and can still track accurately even in weak light.
Fig. 3 is a schematic diagram of a remote video conference using the full-angle human tracking control method based on the natural human-computer interaction technology. Referring to fig. 3, a video recording device 301 is placed on a table in a conference room and the tracking system is installed in a computer 302; the recognition principles are set to the gesture recognition principle, the action recognition principle, and the face recognition principle, the tracking principles are set to the skeletal data tracking principle and the sound tracking principle, and the initial monitoring area is the area around the main seat. When the company leader 303 enters the shot capture area, the leader is automatically locked as the tracked person. The leader's movements are tracked in real time so that the leader always stays at the center of the lens, the focal length is automatically adjusted according to the leader's position, and the picture always remains clear.
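Keeping the locked person centred and focused, as described for the conference scenario, reduces to turning the subject's offset in the frame into a pan correction and scaling zoom with distance. A hedged sketch, with invented gains and a simple pinhole-style frame model:

```python
def pan_correction(subject_x_px: float, frame_width_px: int,
                   fov_degrees: float = 60.0) -> float:
    """Degrees to pan so the subject's x position moves to the frame centre."""
    offset = subject_x_px - frame_width_px / 2.0
    return (offset / frame_width_px) * fov_degrees

def zoom_for_distance(distance_m: float, reference_m: float = 2.0) -> float:
    """Scale zoom with subject distance so apparent size stays roughly constant."""
    return max(1.0, distance_m / reference_m)

print(round(pan_correction(1520, 1920), 1), "degrees")  # subject right of centre
print(zoom_for_distance(4.0), "x zoom")
```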
Fig. 4 is a schematic diagram of remote interactive video teaching using the full-angle human tracking control method based on the natural human-computer interaction technology. Referring to fig. 4, a video recording device 401 is placed on a table in a conference room and the tracking system is installed in a computer 402; the recognition principles are set to the gesture recognition principle, the face recognition principle, and the voice recognition principle, the tracking principles are set to the skeletal data tracking principle and the sound tracking principle, and the initial monitoring area is the area around the platform. When the lecturing teacher 403 enters the shot capture area, the teacher is automatically locked as the tracked person. When remote teaching starts, the video signal is transmitted over the network 404 to the computer 408 of the student 405 and projected onto the large screen 409 by the projector 407. The resolution of the transmitted video is automatically adjusted according to the network bandwidth: when the network speed is low, the video resolution is automatically reduced, sacrificing image quality to reduce dropped frames and keep the picture smooth.
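The bandwidth-adaptive behaviour described above could be approximated by a resolution ladder; the bit-rate thresholds and resolutions below are assumptions, not values from the patent.

```python
RESOLUTION_LADDER = [  # (minimum sustainable Mbit/s, (width, height))
    (8.0, (1920, 1080)),
    (4.0, (1280, 720)),
    (2.0, (960, 540)),
    (0.0, (640, 360)),
]

def pick_resolution(measured_mbps: float) -> tuple[int, int]:
    """Return the largest resolution whose minimum rate the link can sustain."""
    for min_rate, resolution in RESOLUTION_LADDER:
        if measured_mbps >= min_rate:
            return resolution
    return RESOLUTION_LADDER[-1][1]

for rate in (10.0, 3.1, 0.8):
    print(f"{rate} Mbit/s ->", pick_resolution(rate))
```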
Fig. 5 is a schematic view of course recording and playback in the full-angle human tracking control method based on the natural human-computer interaction technology. Referring to fig. 5, a video recording device 501 is placed on a table in a conference room and the tracking system is installed in a computer 502; the recognition principles are set to the gesture recognition principle, the face recognition principle, and the voice recognition principle, the tracking principles are set to the skeletal data tracking principle and the sound tracking principle, and the initial monitoring area is the area around the platform. When the lecturing teacher 503 enters the shot capture area, the teacher is automatically locked as the tracked person. After recording finishes, the course is uploaded to the cloud server 505 over the network 504. Students 506 can watch the course videos on a personal computer 507 either by streaming them online or by downloading them from the cloud server.
Claims (60)
1. The full-angle human tracking system based on the natural human-computer interaction technology comprises a natural human-computer interaction data acquisition module for acquiring natural human-computer interaction data, and is characterized by comprising the following modules:
an initialization module, configured to perform initialization setting, where the initialization setting includes: setting an initial monitoring area, an identification principle and a tracking principle; the tracking principle comprises a sound tracking principle, wherein the sound tracking principle is that when the tracked person moves out of the identification range, the sound of the tracked person can be automatically tracked, and then person identification is performed again; the tracking identification module is used for identifying and locking the tracked person, and the identification principle comprises at least one of a position identification principle, a posture identification principle, an action identification principle, a face identification principle, a voice identification principle, a first-entry locking principle and an intelligent ignoring principle;
the control center module analyzes the collected natural man-machine interaction data and processes the data into a control instruction packet to be output;
the instruction conversion module is used for converting the control instruction packet into a control signal which conforms to the full-angle human tracking system;
the intelligent tracking module is configured to receive the control signal and automatically track according to the movement of the human body.
2. The natural human-computer interaction technology-based all-angle human tracking system according to claim 1, wherein the tracking identification module has a function of locking the tracked human according to an identification principle.
3. The system according to claim 1, wherein the position recognition principle is to determine the tracked person according to the relative position of the person in the recognition area and the sensor.
4. The system of claim 1, wherein the gesture recognition principle is to determine a person in the recognition area as the tracked person when the person takes the same static body gesture as that preset by the system.
5. The system of claim 1, wherein the action recognition principle is to determine a person in the recognition area as the tracked person when the person performs the same action as that preset by the system, and the action recognition includes recognizing at least one of an action track and an action time limit.
6. The system according to claim 1, wherein the face recognition principle is to determine whether the person is a tracked person according to face data pre-stored in the system.
7. The system according to claim 1, wherein the voice recognition principle is to determine whether the person is the tracked person according to audio data pre-stored in the system.
8. The system of claim 1, wherein the first-entry locking principle is to determine the first person entering the identification area as the tracked person.
9. The system according to claim 1, wherein the intelligent ignoring principle means that other people entering the identification area are intelligently ignored after the tracked person is locked.
10. The natural human-computer interaction technology-based all-angle human tracking system according to claim 1, wherein the tracking identification module further has a function of tracking the tracked human according to a tracking principle.
11. The natural human-computer interaction technology based full-angle human tracking system of claim 10, wherein the tracking principle further comprises at least one of a skeletal data tracking principle, a depth data assisted tracking principle and an automatic return principle.
12. The system according to claim 11, wherein the skeletal data tracking principle is to extract skeletal data of a tracked person and track the skeletal data.
13. The system according to claim 11, wherein the depth data aided tracking principle is that when the skeleton data is disordered, the depth data is retrieved from the collected data to assist in tracking the tracked person.
14. The system of claim 11, wherein the automatic return principle is that if no data of the tracked person is collected within a set time interval, the tracking device automatically returns to a predetermined position for person recognition again.
15. The natural human-computer interaction technology-based all-angle human tracking system according to claim 1, wherein the control center module has a function of receiving information of the tracked human sent by the tracking identification system.
16. The natural human-computer interaction technology-based all-angle human tracking system according to claim 15, wherein the control center module further has a function of recording information of a tracked human.
17. The system according to claim 16, wherein the tracked person information is at least one of bone data, position data, posture data, motion data, human face data, and audio data.
18. The natural human-computer interaction technology-based all-angle human tracking system according to claim 16, wherein the control center module further has a function of analyzing a voice command, an action command and a limb command of the tracked human.
19. The system according to any one of claims 15, 16 and 17, wherein the control center module further has a function of predicting real-time actions of the tracked person.
20. The natural human-computer interaction technology-based full-angle human tracking system according to claim 19, wherein the control center module further has a function of outputting instructions that automatically correct the real-time motion of the pan/tilt head.
21. The natural human-computer interaction technology-based all-angle human tracking system according to claim 1, wherein the control center module has a function of processing the analyzed command into a control command and outputting the control command.
22. The natural human-computer interaction technology-based all-angle human tracking system according to claim 1, wherein the instruction conversion module has a function of receiving a control instruction of a control center.
23. The natural human-computer interaction technology-based all-angle human tracking system according to claim 22, wherein the instruction conversion module has a function of converting a control instruction of the control center into an action instruction for controlling the intelligent tracking module.
24. The natural human-computer interaction technology-based all-angle human tracking system according to claim 1, wherein the intelligent tracking module has a function of tracking the tracked human according to action instructions.
25. The natural human-computer interaction technology-based full-angle human tracking system according to claim 24, wherein the intelligent tracking module has a function of controlling the motion of a pan-tilt.
26. The natural human-computer interaction technology based full-angle human tracking system of claim 25, wherein the pan-tilt motion comprises at least one of a basic motion, a switching motion and a high-level motion.
27. The natural human-computer interaction technology based full-angle human tracking system of claim 26, wherein the basic action comprises at least one of opening, closing and pausing the module.
28. The natural human-computer interaction technology based full-angle human tracking system of claim 27, wherein the switching action comprises at least one of mode switching and multi-camera switching.
29. The natural human-computer interaction technology based full-angle human tracking system of claim 28, wherein the high-level actions include at least one of rotation to a specified angle, camera focus adjustment, speed adjustment, forward and backward movement, left and right movement, and up and down movement.
30. The natural human-computer interaction technology-based full-angle human tracking system of claim 29, wherein the specified angle is in the range of 0-360 degrees.
31. A full-angle human tracking control method based on a natural human-computer interaction technology, comprising collecting natural human-computer interaction data, characterized by comprising the following steps:
step 1: performing initialization setting, wherein the initialization setting comprises: setting an initial monitoring area, an identification principle and a tracking principle; the tracking principle comprises a sound tracking principle, wherein the sound tracking principle is that when a tracked person moves out of an identification range, the sound of the tracked person can be automatically tracked, and then person identification is carried out again;
identifying and locking the tracked person, wherein the identification principle comprises at least one of a position identification principle, a posture identification principle, an action identification principle, a face identification principle, a voice identification principle, a first-entry locking principle and an intelligent ignoring principle;
Step 2: analyzing the collected natural human-computer interaction data and processing the data into a control instruction packet for output;
Step 3: converting the control instruction packet into a control signal conforming to the full-angle human tracking system;
Step 4: receiving the control signal and automatically tracking according to the movement of the human body.
32. The method for controlling the full-angle human tracking based on the natural human-computer interaction technology as claimed in claim 31, wherein the step 1 is to lock the tracked human according to the recognition principle.
33. The method as claimed in claim 31, wherein the position recognition principle is to determine the tracked person according to the relative position of the person in the recognition area and the sensor.
34. The method as claimed in claim 31, wherein the gesture recognition principle is that when the person in the recognition area has a static body gesture the same as the static body gesture preset by the system, the person is determined to be the tracked person.
35. The method as claimed in claim 31, wherein the action recognition principle is that when a person in the recognition area performs the same action as that preset by the system, the person is determined to be the tracked person, and the action recognition includes recognizing at least one of an action track and an action time limit.
36. The method according to claim 31, wherein the face recognition principle is to determine whether the person is the tracked person according to face data pre-stored in the system.
37. The method as claimed in claim 31, wherein the voice recognition principle is to determine whether the person is the tracked person according to audio data pre-stored in the system.
38. The method as claimed in claim 31, wherein the first-entry locking principle is to determine the first person entering the identification area as the tracked person.
39. The method as claimed in claim 31, wherein the intelligent ignoring principle means that other people entering the identification area are intelligently ignored after the tracked person is locked.
40. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 31, wherein the step 1 is tracking the tracked human according to a tracking principle.
41. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 40, wherein the tracking principle further comprises at least one of a skeletal data tracking principle, a depth data assisted tracking principle and an automatic return principle.
42. The method as claimed in claim 41, wherein the skeletal data tracking principle is to extract skeletal data of a tracked person and track the skeletal data.
43. The method of claim 41, wherein the depth data assisted tracking principle is that when the skeleton data is disordered, the depth data is retrieved from the collected data to assist in continuing to track the tracked person.
44. The method as claimed in claim 41, wherein the automatic return principle is that if no data of the tracked person is collected within a set time interval, the tracking device automatically returns to a predetermined position for person recognition again.
45. The full-angle human tracking control method based on the natural human-computer interaction technology as claimed in claim 31, wherein the step 2 is to receive the tracked person information sent by the tracking identification system.
46. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 45, wherein the step 2 is to record the information of the tracked human.
47. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 46, wherein the tracked human information refers to at least one of bone data, position data, posture data, motion data, human face data and audio data.
48. The natural human-computer interaction technology-based all-angle human tracking control method according to claim 46, wherein the step 2 is to analyze a voice command, an action command and a limb command of the tracked human.
49. The full-angle human tracking control method based on the natural human-computer interaction technology as claimed in any one of claims 45, 46 and 48, wherein the step 2 further comprises predicting the real-time action of the tracked human.
50. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 49, wherein the step 2 further comprises outputting instructions that automatically correct the real-time motion of the pan/tilt head.
51. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 31, wherein the step 2 is to process the parsed command into a control command and output the control command.
52. The method for controlling the full-angle human tracking based on the natural human-computer interaction technology as claimed in claim 31, wherein the step 3 is receiving a control command from a control center.
53. The method for controlling the full-angle human tracking based on the natural human-computer interaction technology as claimed in claim 52, wherein the step 3 is to convert the control command of the control center into the action command for controlling the intelligent tracking module.
54. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 31, wherein the step 4 is tracking the tracked human according to the action command.
55. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 54, wherein the step 4 is controlling a pan-tilt motion.
56. The natural human-computer interaction technology-based full-angle human tracking control method of claim 55, wherein the pan-tilt motion comprises at least one of a basic motion, a switching motion and a high-level motion.
57. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 56, wherein the basic action comprises at least one of opening, closing and pausing the module.
58. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 56, wherein the switching action comprises at least one of mode switching and multi-camera switching.
59. The natural human-computer interaction technology-based full-angle human tracking control method according to claim 56, wherein the advanced action comprises at least one of rotation to a specified angle, camera focus adjustment, speed adjustment, forward and backward movement, left and right movement, and up and down movement.
60. The natural human-computer interaction technology-based full-angle human tracking control method of claim 59, wherein the range of the specified angle is 0-360 degrees.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610342160.3A CN105844673B (en) | 2016-05-20 | 2016-05-20 | Full-angle human tracking system based on natural human-computer interaction technology and control method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105844673A CN105844673A (en) | 2016-08-10 |
CN105844673B true CN105844673B (en) | 2020-03-24 |
Family
ID=56593026
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610342160.3A Active CN105844673B (en) | 2016-05-20 | 2016-05-20 | Full-angle human tracking system based on natural human-computer interaction technology and control method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105844673B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102799191A (en) * | 2012-08-07 | 2012-11-28 | 北京国铁华晨通信信息技术有限公司 | Method and system for controlling pan/tilt/zoom based on motion recognition technology |
CN103024344A (en) * | 2011-09-20 | 2013-04-03 | 佳都新太科技股份有限公司 | Automatic PTZ (Pan/Tilt/Zoom) target tracking method based on particle filter |
CN103136899A (en) * | 2013-01-23 | 2013-06-05 | 宁凯 | Intelligent alarming monitoring method based on Kinect somatosensory equipment |
CN104125433A (en) * | 2014-07-30 | 2014-10-29 | 西安冉科信息技术有限公司 | Moving object video surveillance method based on multi-PTZ (pan-tilt-zoom)-camera linkage structure |
CN105025260A (en) * | 2015-07-07 | 2015-11-04 | 合肥指南针电子科技有限责任公司 | Monitoring system intelligent tracing-back method |
CN105407283A (en) * | 2015-11-20 | 2016-03-16 | 成都因纳伟盛科技股份有限公司 | Multi-target active recognition tracking and monitoring method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6175382B1 (en) * | 1997-11-24 | 2001-01-16 | Shell Oil Company | Unmanned fueling facility |
CN101290681B (en) * | 2008-05-26 | 2010-06-02 | 华为技术有限公司 | Video frequency object tracking method, device and automatic video frequency following system |
CN102045497A (en) * | 2009-10-26 | 2011-05-04 | 鸿富锦精密工业(深圳)有限公司 | Image videotaping equipment and method for monitoring sound event |
KR20130122516A (en) * | 2010-04-26 | 2013-11-07 | 캠브리지 메카트로닉스 리미티드 | Loudspeakers with position tracking |
CN104065923B (en) * | 2014-06-23 | 2017-05-17 | 阔地教育科技有限公司 | On-line synchronization classroom tracking control method and system |
CN104635209A (en) * | 2014-12-01 | 2015-05-20 | 李士斌 | Sound localization system |
Also Published As
Publication number | Publication date |
---|---|
CN105844673A (en) | 2016-08-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||