CN112333419B - Monitoring tracking method, device, system and computer readable storage medium - Google Patents
- Publication number
- CN112333419B (application CN202010855738.1A)
- Authority
- CN
- China
- Prior art keywords
- tracked
- person
- target
- determining
- monitoring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000012544 monitoring process Methods 0.000 title claims abstract description 165
- 238000000034 method Methods 0.000 title claims abstract description 65
- 230000006399 behavior Effects 0.000 claims description 39
- 238000004364 calculation method Methods 0.000 description 8
- 230000014759 maintenance of location Effects 0.000 description 7
- 238000004891 communication Methods 0.000 description 6
- 238000005516 engineering process Methods 0.000 description 4
- 238000010586 diagram Methods 0.000 description 3
- 230000001174 ascending effect Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000012216 screening Methods 0.000 description 2
- 241000282326 Felis catus Species 0.000 description 1
- 241000282414 Homo sapiens Species 0.000 description 1
- 238000013473 artificial intelligence Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Alarm Systems (AREA)
Abstract
The invention discloses a monitoring and tracking method, device, system and computer readable storage medium. The method includes: if a person to be tracked is acquired through a camera, determining a target scene where the person to be tracked is located and behavior information of the person to be tracked in the target scene; determining a target person among the persons to be tracked according to the target scene and the behavior information; and controlling the camera to monitor and track the picture of the target person, and displaying the monitored and tracked picture of the target person on a preset terminal. According to the invention, when a person to be tracked is acquired through the camera, the target scene and the corresponding behavior information of the person to be tracked are determined, the target person among the persons to be tracked is thereby determined, and the camera is controlled to monitor and track the target person, ensuring that the monitored and tracked target person is not lost due to interference from other moving objects.
Description
Technical Field
The present invention relates to the field of intelligent monitoring and tracking technologies, and in particular, to a monitoring and tracking method, device, system, and computer readable storage medium.
Background
In recent years, with the rapid development of computer technology and Internet technology, the technological revolution driven by artificial intelligence has been profoundly changing human life, particularly in the field of the smart home. In smart homes, people often install smart cameras to monitor people indoors. Installing a camera not only deters theft, but more importantly allows the user to know the situation at home at any time. When a home camera is set to monitoring mode, a conventional camera only uses image change at a single location in the surveillance video to identify the tracked object. When the surveillance video changes at multiple locations, the camera cannot determine which object to track. For example, a user may mainly care about a child's activities, but when a kitten or other family members are also moving, the camera's focus may shift to those other moving objects rather than the child the user cares about. The user then cannot view video of the child when entering monitoring mode, and the system loses the key tracked object.
Disclosure of Invention
The main purpose of the present invention is to provide a monitoring and tracking method, device, system and computer readable storage medium, which aim to determine, from the persons acquired by the camera, the target person to be monitored and tracked, so that the target person is tracked and the system does not lose the monitored and tracked target person due to interference from moving objects.
In order to achieve the above object, the present invention provides a monitoring and tracking method, including:
If the person to be tracked is acquired through the camera, determining a target scene where the person to be tracked is located and behavior information of the person to be tracked in the target scene;
Determining a target person in the persons to be tracked according to the target scene and the behavior information;
And controlling the camera to monitor and track the picture of the target person, and displaying the monitored and tracked picture of the target person on a preset terminal.
In addition, to achieve the above object, the present invention further provides a monitoring and tracking device, including:
The first determining module is used for determining a target scene where the person to be tracked is and behavior information of the person to be tracked in the target scene if the person to be tracked is acquired through the camera;
The second determining module is used for determining the target person among the persons to be tracked according to the target scene and the behavior information;
The monitoring tracking module is used for controlling the camera to monitor and track the picture of the target person;
and the display module is used for displaying the monitored and tracked picture of the target person on a preset terminal.
In addition, to achieve the above object, the present invention further provides a monitoring and tracking system, including: the system comprises a memory, a processor and a monitoring tracking program stored in the memory and capable of running on the processor, wherein the monitoring tracking program realizes the steps of the monitoring tracking method when being executed by the processor.
In addition, in order to achieve the above object, the present invention further provides a computer readable storage medium, on which a monitoring tracking program is stored, which when executed by a processor, implements the steps of the monitoring tracking method as described above.
According to the monitoring and tracking method provided by the invention, if a person to be tracked is acquired through the camera, the target scene where the person to be tracked is located and the behavior information of the person to be tracked in that scene are determined, and the target person among the persons to be tracked is determined accordingly, so that the camera is controlled to monitor and track the target person. The camera therefore does not lose the target person due to interference from other moving objects, and the picture finally presented by the camera is guaranteed to be the picture the user cares about, realizing intelligent monitoring.
Drawings
FIG. 1 is a schematic diagram of a system architecture of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart of a first embodiment of a monitoring and tracking method according to the present invention;
Fig. 3 is a schematic structural diagram of a first embodiment of the monitoring and tracking device of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a monitoring and tracking system of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 1, the system may include: a processor 1001 such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally further include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the system architecture shown in fig. 1 is not limiting of the monitoring and tracking system and may include more or fewer components than shown, or may combine certain components, or may be a different arrangement of components.
As shown in fig. 1, an operating system, a network communication module, a user interface module, and a monitoring tracking program may be included in the memory 1005 as one type of computer storage medium.
The operating system is a program for managing and controlling the monitoring and tracking system and its software resources and supports the operation of the network communication module, the user interface module, the monitoring tracking program and other programs or software; the network communication module is used to manage and control the network interface 1004; the user interface module is used to manage and control the user interface 1003.
In the monitoring tracking system shown in fig. 1, a monitoring tracking program stored in a memory 1005 is called by a processor 1001, and operations in the respective embodiments of the monitoring tracking method described below are performed.
Based on the above hardware structure, embodiments of the monitoring and tracking method are provided. The monitoring and tracking method is applied to the monitoring and tracking system of a smart home scene; for convenience of description, the monitoring and tracking system is simply called the monitoring system.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of a monitoring and tracking method according to the present invention, the method includes:
Step S10, if a person to be tracked is acquired through a camera, a monitoring system determines a target scene where the person to be tracked is located and behavior information of the person to be tracked in the target scene;
step S20, the monitoring system determines a target person in the persons to be tracked according to the target scene and the behavior information;
In step S30, the monitoring system controls the camera to monitor and track the picture of the target person, and displays the monitored and tracked picture of the target person on the preset terminal.
In smart home scenes, a camera is often used to identify and look after people in the home. When the camera is set to the care mode, that is, the monitoring mode, a conventional camera only uses image change at a single location in the video for identification. When the surveillance video changes in several places, the camera cannot make a decision: when multiple target persons are identified, the camera cannot preferentially identify the target person the user focuses on, and the monitoring system may even break down because of an untimely response, so that the data of the key tracked person is lost.
The person to be tracked in the home may be a human or a moving object such as a cat.
In the monitoring system of this embodiment, the camera is used to collect the persons to be tracked, and the target scene where they are located and the corresponding behavior information are determined, so that the tracked target person is determined and the camera is controlled to monitor the target person, ensuring that the system does not lose the target person the user wants to focus on.
The following will explain each step in detail:
Step S10, if a person to be tracked is acquired through a camera, a monitoring system determines a target scene where the person to be tracked is located and behavior information of the person to be tracked in the target scene;
In this embodiment, cameras are installed in advance in each scene to be monitored, and the monitoring system stores the surveillance videos previously collected by each camera. If a camera captures persons to be tracked, they can be identified according to the characteristics of different persons; at the same time, the monitoring system determines the target scene where the persons to be tracked are located through the camera, and then acquires the surveillance video of the persons to be tracked in the target scene, so as to determine the behavior information of each person to be tracked. Scenes include, for example, a bedroom and a living room.
Specifically, step S10 includes:
Step a1, a monitoring system determines the position of a camera, and determines a target scene where a person to be tracked is located according to the position of the camera;
In this embodiment, if the camera captures a person to be tracked, person identification can be performed through face recognition or other means. At the same time, the monitoring system determines the target scene where the person to be tracked is located by determining the specific position of the camera that captured the person; that is, the scene covered by the camera that captured the person to be tracked is the target scene. For example, if the current camera is installed in a study, the target scene of the person to be tracked is determined to be the study.
Step a2, the monitoring system acquires the pre-acquired person groups of the target scene, and determines the behavior information of the persons to be tracked in the target scene according to the pre-acquired person groups.
In this embodiment, the pre-acquired person groups exist before the current camera captures the person to be tracked. Before this acquisition, the monitoring system identifies the people in each indoor scene through the cameras, sets a corresponding person group for each different person identified in each scene, and records the collected videos or pictures of that person into the corresponding person group by screening the collected surveillance videos, thereby forming the pre-acquired person groups. That is, every person who has appeared in a scene has a dedicated person group, namely a pre-acquired person group, for that scene in the monitoring system. Therefore, the monitoring system can determine the identity of a person to be tracked by acquiring the pre-acquired person groups of the target scene, and thus determine the behavior information of each person to be tracked in the target scene.
It should be noted that if there is only one person to be tracked, that person is the target person, and there is no need to acquire the corresponding person group to determine a priority order. If the monitoring system cannot acquire a pre-acquired person group for a certain person to be tracked in the target scene, that person is a new person in the target scene; a new person group can be set up for this person through acquisition and screening by the camera, so the person group of a new person is initially empty and all behavior information is zero.
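To make the bookkeeping concrete, the per-scene, per-person record described above can be sketched as a small data structure. This is a minimal illustration in Python; the class and field names (PersonGroup, appearances, dwell_seconds, clicks) are assumptions for the sketch, not terms defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class PersonGroup:
    """Per-scene behavior record for one identified person (illustrative fields)."""
    person_id: str
    scene: str
    appearances: int = 0        # number of times the person appeared in this scene
    dwell_seconds: float = 0.0  # total residence time in this scene
    clicks: int = 0             # times the user clicked this person on the terminal

class SceneRegistry:
    """Pre-acquired person groups, keyed by (scene, person_id)."""
    def __init__(self):
        self._groups = {}

    def get_or_create(self, scene: str, person_id: str) -> PersonGroup:
        key = (scene, person_id)
        if key not in self._groups:
            # A person not seen in this scene before: group starts empty, all behavior info zero.
            self._groups[key] = PersonGroup(person_id=person_id, scene=scene)
        return self._groups[key]

    def record_visit(self, scene: str, person_id: str, dwell_seconds: float) -> None:
        group = self.get_or_create(scene, person_id)
        group.appearances += 1
        group.dwell_seconds += dwell_seconds
```

In such a sketch, `record_visit` would be called each time a person leaves the camera's view, and the accumulated counts feed the priority calculations described below.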
Step S20, the monitoring system determines a target person in the persons to be tracked according to the target scene and the behavior information;
In this embodiment, the monitoring system determines the priority order of the persons to be tracked based on the target scene where they are located, the corresponding behavior information of each person to be tracked, and similar conditions, so as to determine the target person among the persons to be tracked.
Specifically, step S20 includes:
Step b1, the monitoring system calculates the occurrence percentage and the time percentage of the person A to be tracked according to the number of occurrences and the residence time of the person A to be tracked in the target scene, wherein the person A to be tracked is any one of at least two persons to be tracked;
In this embodiment, if the number of persons to be tracked is greater than or equal to two and the behavior information includes the number of occurrences and the residence time, then in the process of determining the target person the monitoring system obtains the number of occurrences and the residence time of each person to be tracked in the target scene from their behavior information, sums the occurrences and the residence times of all persons to be tracked to obtain the total number of occurrences and the total residence time in the target scene, and finally calculates the occurrence percentage and the time percentage of each person to be tracked.
Taking any one of at least two people to be tracked as a person A to be tracked as an example, the monitoring system calculates the occurrence percentage and the time percentage of the person A to be tracked according to the occurrence times and the stay time of the person A to be tracked in the target scene.
Specifically, the calculation formula of the occurrence percentage of the person A to be tracked is:
percentage occurrence = number of occurrences/total number of occurrences x 100%
That is, the number of occurrences of the person A to be tracked in the target scene is divided by the total number of occurrences of all the persons to be tracked in the target scene.
The calculation formula of the percentage of time of the person A to be tracked is as follows:
percent time = residence time/total residence time x 100%
That is, the residence time of the person A to be tracked in the target scene is divided by the total residence time of all the persons to be tracked in the target scene.
Step b2, the monitoring system determines the corresponding priority percentage of the person A to be tracked according to the occurrence percentage and the time percentage of the person A to be tracked;
then, the monitoring system determines the priority percentage of the person A to be tracked according to the determined occurrence percentage and time percentage, and specifically, the calculation formula of the priority percentage is as follows:
priority percentage = percentage of occurrence + percentage of time
That is, the occurrence percentage of the person A to be tracked is added to the time percentage to obtain the priority percentage of the person A to be tracked.
In this way, the priority percentages of the individual persons to be tracked are determined in turn.
And b3, the monitoring system determines the target person among the persons to be tracked according to the priority percentage of each of the at least two persons to be tracked.
In this embodiment, the monitoring system calculates the sum of the occurrence percentage and the time percentage to obtain the priority percentage of each person to be tracked, and sorts the priority percentages in descending or ascending order, thereby obtaining a one-to-one priority order of the persons to be tracked and determining the target person among them. In a specific implementation, the monitoring system may take the person ranked first in the priority order as the target person, or may take the person ranked last as the target person.
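A minimal sketch of this ranking in Python follows; the function name, the person names and the counts in the example are made-up assumptions for illustration, not anything defined by the patent.

```python
def priority_percentages(behavior):
    """behavior: {person: (appearances, dwell_seconds)} for one target scene.
    Priority percentage = occurrence percentage + time percentage."""
    total_appearances = sum(a for a, _ in behavior.values()) or 1
    total_dwell = sum(d for _, d in behavior.values()) or 1
    return {
        person: (a / total_appearances) * 100 + (d / total_dwell) * 100
        for person, (a, d) in behavior.items()
    }

# Example: three persons captured in the living room.
scene_behavior = {"child": (12, 5400), "mom": (8, 3600), "dad": (4, 1800)}
ranking = sorted(priority_percentages(scene_behavior).items(),
                 key=lambda kv: kv[1], reverse=True)
target_person = ranking[0][0]  # person with the highest priority percentage
```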
In step S30, the monitoring system controls the camera to monitor and track the picture of the target person, and displays the monitored and tracked picture of the target person on the preset terminal.
In this embodiment, the monitoring system is connected to the preset terminal in advance. After determining the target person in each scene, the monitoring system controls the camera in the target scene to monitor and track the target person; that is, wherever the target person moves, the monitoring system controls the camera to turn toward them, and the monitored picture of the target person is displayed on the preset terminal.
According to the monitoring and tracking method of this embodiment, if a person to be tracked is acquired through the camera, the target scene where the person is located and the behavior information of the person in that scene are determined, and the target person among the persons to be tracked is determined accordingly, so that the camera is controlled to monitor and track the target person. The camera therefore does not lose the target person due to interference from other moving objects, and the picture finally presented by the camera is guaranteed to be the picture the user cares about, realizing intelligent monitoring.
Further, based on the first embodiment of the monitoring and tracking method of the present invention, a second embodiment of the monitoring and tracking method of the present invention is provided.
The second embodiment of the monitoring and tracking method differs from the first embodiment of the monitoring and tracking method in that step S20 includes:
Step c, the monitoring system calculates the time percentage of the person B to be tracked according to the stay time of the person B to be tracked in the target scene, wherein the person B to be tracked is any one of at least two persons to be tracked;
step d, the monitoring system determines the priority proportion of the person B to be tracked according to the time percentage of the person B to be tracked and the clicking times of the person B to be tracked in the target scene;
Step e, the monitoring system determines the priority sequence of at least two people to be tracked according to the priority proportion of each of the at least two people to be tracked;
and f, the monitoring system determines the target person among the persons to be tracked according to the priority order.
In the monitoring and tracking method of this embodiment, if the number of persons to be tracked is greater than or equal to two and the behavior information includes the residence time and the number of clicks, then in the process of determining the target person the monitoring system acquires the behavior information of each person to be tracked to obtain the residence time and the number of clicks of each person in the target scene, and thereby determines the target person among the persons to be tracked.
It should be noted that the number of clicks refers to the click counts recorded by the monitoring system for each person in the monitoring picture, accumulated from the user clicking on the monitoring picture displayed on the preset terminal before the current acquisition. That is, when the monitoring picture is displayed on the preset terminal, the user can select the monitored object by clicking a person in the picture, thereby switching the tracked target; the more often the user clicks a person, the more it indicates that the clicked person is the target the user wants to track.
The following will explain each step in detail:
Step c, the monitoring system calculates the time percentage of the person B to be tracked according to the stay time of the person B to be tracked in the target scene, wherein the person B to be tracked is any one of at least two persons to be tracked;
In this embodiment, the target scene where the persons to be tracked are located is determined through the camera, and the behavior information of each person to be tracked, namely the residence time and the number of clicks, is acquired. The residence times and the click counts of all persons to be tracked in the target scene are summed to obtain the total residence time and the total number of clicks, from which the time percentage and the click percentage of each person to be tracked are calculated; the priority ratio of each person is then obtained by adding that person's time percentage and click percentage.
Taking the person B to be tracked as an example, where the person B is any one of the at least two persons to be tracked, the monitoring system calculates the time percentage of the person B according to the residence time of the person B in the target scene; the calculation is the same as the time-percentage calculation in the first embodiment and is not repeated here.
Step d, the monitoring system determines the priority proportion of the person B to be tracked according to the time percentage of the person B to be tracked and the clicking times of the person B to be tracked in the target scene;
then, the monitoring system determines the click percentage of the person to be tracked B according to the click times of the person to be tracked B in the target scene, wherein the specific calculation formula is as follows:
percentage of clicks = number of clicks/total number of clicks x 100%
That is, the number of clicks of each of the at least two persons to be tracked is determined and summed to obtain the total number of clicks, and the number of clicks of the person B to be tracked is divided by the total number of clicks to obtain the click percentage of the person B.
Finally, the monitoring system determines the priority proportion of the person B to be tracked according to a preset formula, wherein the preset formula is as follows:
priority ratio = percentage of time + percentage of click
In this way, the priority ratio of each of the at least two persons to be tracked is calculated in turn.
And e, determining the priority sequence of the at least two persons to be tracked according to the priority proportion of each of the at least two persons to be tracked by the monitoring system.
In this embodiment, the priority ratios of the persons to be tracked are sorted in descending or ascending order, thereby obtaining the priority order of the at least two persons to be tracked in the target scene.
And f, the monitoring system determines the target person among the persons to be tracked according to the priority order.
Finally, the monitoring system determines the target person among the persons to be tracked according to the priority order. In a specific implementation, the person ranked first in the priority order, such as the person B to be tracked, may be taken as the target person; alternatively, the person ranked last, such as the person A to be tracked, may be taken as the target person.
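A minimal sketch of this second ranking scheme in Python is shown below; as before, the function name and the example numbers are assumptions for illustration only.

```python
def priority_ratios(behavior):
    """behavior: {person: (dwell_seconds, clicks)} for one target scene.
    Priority ratio = time percentage + click percentage."""
    total_dwell = sum(d for d, _ in behavior.values()) or 1
    total_clicks = sum(c for _, c in behavior.values()) or 1
    return {
        person: (d / total_dwell) * 100 + (c / total_clicks) * 100
        for person, (d, c) in behavior.items()
    }

# Example: ranking by residence time plus user clicks.
scene_behavior = {"child": (5400, 9), "mom": (3600, 3), "dad": (1800, 1)}
ratios = priority_ratios(scene_behavior)
priority_order = sorted(ratios, key=ratios.get, reverse=True)  # highest ratio first
target_person = priority_order[0]
```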
According to the monitoring and tracking method of this embodiment, the camera captures the persons to be tracked, the target scene where they are located and the corresponding behavior information are determined, and the residence time and number of clicks of each person in the target scene are obtained; the priority order of the persons to be tracked is then determined, the target person to be tracked by the camera is selected, and the camera is controlled to monitor the target person, ensuring that the system does not lose the target person.
Further, based on the first and second embodiments of the monitoring and tracking method of the present invention, a third embodiment of the monitoring and tracking method of the present invention is provided.
The third embodiment of the monitoring and tracking method differs from the first and second embodiments in that, after step S30, the monitoring and tracking method further includes:
step g, if the target person is detected to leave the target scene, the monitoring system acquires a first updating time and updates the person group corresponding to the target person according to the first updating time;
And h, the monitoring system acquires the priority order according to the target scene, and determines the next target person monitored and tracked by the camera according to the priority order.
After the target person to be monitored and tracked is determined, if the camera detects that the target person leaves the target scene, the monitoring system of this embodiment updates the person group corresponding to the target person, and determines the next target person to be monitored and tracked by the camera by acquiring the priority order of the persons to be tracked in the target scene.
The following will explain each step in detail:
step g, if the target person is detected to leave the target scene, the monitoring system acquires a first updating time and updates the person group corresponding to the target person according to the first updating time;
In this embodiment, since the monitoring system controls the camera to track the target person in each scene, when the target person leaves the target scene, the first update time is acquired, and the behavior information of the target person in the target scene is updated, that is, the person group corresponding to the target person is updated.
And h, the monitoring system acquires the priority order according to the target scene, and determines the next target person monitored and tracked by the camera according to the priority order.
In this embodiment, when the monitoring system detects that the target person leaves the target scene through the camera, the priority order of the current person to be tracked in the target scene is obtained, and the next target person tracked by the camera is redetermined according to the priority order.
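The hand-over when a target leaves can be sketched as follows; the dictionary layout and field names here are assumptions made for the sketch, not structures specified by the patent.

```python
import time

def on_target_left(scene, target, person_groups, priority_order):
    """person_groups: {(scene, person): {"dwell_seconds": float, "entered_at": float, ...}}
    priority_order: persons currently in the scene, highest priority first.
    Returns the next person the camera should track, or None."""
    first_update_time = time.time()
    group = person_groups[(scene, target)]
    # Update the departing target's person group using the first update time.
    group["dwell_seconds"] += first_update_time - group.get("entered_at", first_update_time)
    group["last_update"] = first_update_time
    # Re-read the scene's priority order and pick the next target.
    remaining = [p for p in priority_order if p != target]
    return remaining[0] if remaining else None
```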
According to the monitoring and tracking method of this embodiment, if the monitoring system detects through the camera that the target person has left the target scene, the person group of the target person is updated and the priority order of the persons currently to be tracked in the target scene is obtained, so that the next target person tracked by the camera is determined and the monitoring system can flexibly switch the tracked object.
Further, based on the first, second and third embodiments of the monitoring and tracking method of the present invention, a fourth embodiment of the monitoring and tracking method of the present invention is provided.
The fourth embodiment of the monitoring and tracking method differs from the first, second and third embodiments in that, after step S30, the monitoring and tracking method further includes:
Step i, the monitoring system acquires a second updating time corresponding to the target scene;
Step j, the monitoring system determines the switching time of the target person according to at least two persons to be tracked and the second updating time;
and k, the monitoring system controls the cameras to sequentially switch and monitor and track the target person according to the switching time and the priority order.
The monitoring system of this embodiment determines the switching time of the camera from the second update time corresponding to the target scene and the number of persons to be tracked (at least two), and controls the camera to switch among the target persons in turn according to the priority order of the persons to be tracked in the target scene.
The following will explain each step in detail:
Step i, the monitoring system acquires a second updating time corresponding to the target scene;
In this embodiment, after the target person is determined, the second update time corresponding to the target scene is acquired. The second update time may be set according to the actual situation, for example 60 minutes, and may be the same as or different from the update times of other scenes.
Step j, the monitoring system determines the switching time of the target person according to at least two persons to be tracked and the second updating time;
in this embodiment, the switching time of the camera to switch the tracked target person is determined by the preset updating time of the target scene and the number of persons to be tracked.
Specifically, the specific calculation formula of the switching time of the camera for switching the target person is as follows:
switching time = second update time/number of people
That is, each person to be tracked is given the same amount of camera time; therefore, the switching time for each person is obtained by dividing the second update time by the number of persons to be tracked.
And k, the monitoring system controls the cameras to sequentially switch and monitor and track the target person according to the switching time and the priority order.
In this embodiment, if it is detected that the switching time of the camera in the target scene has elapsed, the priority order of the persons to be tracked is obtained and the camera is controlled to switch among the target persons in turn. For example, suppose the camera captures three persons to be tracked, such as a child, the mother and the father, the target scene is the living room, and the second update time of the living room is 60 minutes; the switching time of the camera in the living room is then 60/3 = 20 minutes, that is, the camera switches the tracked target person every 20 minutes. If the priority order of the persons in the living room is the child, the mother and then the father, the camera first tracks the child; after 20 minutes the monitoring system controls the camera to switch to and track the mother, after another 20 minutes the camera switches to and tracks the father, and so on.
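The round-robin rotation can be sketched in a few lines of Python; the person names and the 60-minute period are taken from the example above, while the function name is an assumption.

```python
def rotation_schedule(persons_by_priority, second_update_minutes):
    """Return (person, minutes_tracked) pairs covering one update period,
    giving each person to be tracked an equal share of camera time."""
    switching_time = second_update_minutes / len(persons_by_priority)
    return [(person, switching_time) for person in persons_by_priority]

# Living room, 60-minute second update time, priority order child > mom > dad.
print(rotation_schedule(["child", "mom", "dad"], 60))
# [('child', 20.0), ('mom', 20.0), ('dad', 20.0)]
```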
According to the monitoring and tracking method of this embodiment, after the priority order of the persons to be tracked in the target scene is determined, the switching time of the camera is determined, and the camera switches among and tracks the persons in turn according to the priority order and the switching time, flexibly switching the tracked target person.
Further, based on the first, second, third and fourth embodiments of the monitoring and tracking method of the present invention, a fifth embodiment of the monitoring and tracking method of the present invention is provided.
The fifth embodiment of the monitoring and tracking method differs from the first, second, third, and fourth embodiments in that the monitoring and tracking method further includes:
if a monitoring instruction of a preset terminal is detected, the monitoring system determines a target camera corresponding to the monitoring instruction;
The monitoring system controls the target camera to monitor and track the monitored person corresponding to the monitoring instruction.
That is, in an embodiment, the user may perform remote control on the display interface of the preset terminal, for example by clicking a camera on the display interface, thereby triggering a monitoring instruction. After detecting the monitoring instruction, the monitoring system determines the target camera corresponding to the instruction, that is, determines which scene's camera the user wants to operate, and then controls that camera to monitor and track the monitored person corresponding to the instruction.
For example, if the user wants to monitor and track the movements of a home caregiver, the user can remotely control the camera on a preset terminal, such as a mobile phone, to trigger a monitoring instruction, and the monitoring system controls the corresponding camera, such as the living-room camera, to monitor and track the caregiver's movements according to the instruction.
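The mapping from a terminal click to the camera that gets controlled can be sketched as below; the instruction fields and the scene-to-camera table are hypothetical, since the patent does not define a message format.

```python
# Hypothetical scene-to-camera mapping maintained by the monitoring system.
CAMERAS_BY_SCENE = {"living_room": "cam_01", "study": "cam_02"}

def handle_monitoring_instruction(instruction):
    """instruction: {"scene": ..., "person": ...}, triggered by a click on the terminal.
    Returns which camera to control and whom it should track, or None if unknown."""
    target_camera = CAMERAS_BY_SCENE.get(instruction["scene"])
    if target_camera is None:
        return None
    # The monitoring system would now steer this camera to follow the chosen person.
    return {"camera": target_camera, "track": instruction["person"]}

print(handle_monitoring_instruction({"scene": "living_room", "person": "caregiver"}))
```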
This embodiment provides a way to manually control the camera, offering another monitoring mode and improving the operability of the monitoring system.
Referring to fig. 3, the present invention further provides a monitoring and tracking device, including:
the first determining module 10 is configured to determine a target scene in which the person to be tracked is located and behavior information of the person to be tracked in the target scene if the person to be tracked is acquired through the camera;
a second determining module 20, configured to determine a target person from the persons to be tracked according to the target scene and the behavior information;
a monitoring tracking module 30, configured to control the camera to monitor and track the picture of the target person;
and a display module 40 for displaying the monitored and tracked picture of the target person on the preset terminal.
Preferably, the first determining module is further configured to:
Determining the position of a camera;
Determining a target scene where a person to be tracked is located according to the position of the camera;
and acquiring a pre-acquired person group of the target scene, and determining the behavior information of the person to be tracked in the target scene according to the pre-acquired person group.
Preferably, the number of people to be tracked is greater than or equal to two, the behavior information includes a number of occurrences and a residence time, and the second determining module is further configured to:
Calculating the occurrence percentage and the time percentage of the person A to be tracked according to the occurrence times and the residence time of the person A to be tracked of the target scene, wherein the person A to be tracked is any one of at least two persons to be tracked;
determining the priority percentage of the person A to be tracked according to the occurrence percentage and the time percentage of the person A to be tracked;
and determining the target person among the persons to be tracked according to the priority percentage of each of the at least two persons to be tracked.
Preferably, the behavior information includes a stay time and a number of clicks, and the second determining module is further configured to:
Calculating the time percentage of the person B to be tracked according to the stay time of the person B to be tracked in the target scene, wherein the person B to be tracked is any one of at least two persons to be tracked;
Determining the priority proportion of the person B to be tracked according to the time percentage of the person B to be tracked and the clicking times of the person B to be tracked in a target scene;
determining the priority sequence of at least two people to be tracked according to the priority proportion of each of the at least two people to be tracked;
determining the target person among the persons to be tracked according to the priority order.
Preferably, the monitoring tracking device further comprises an update tracking module, and the update tracking module is used for:
if the target person is detected to leave the target scene, acquiring a first update time, and updating a person group corresponding to the target person according to the first update time;
and acquiring a priority sequence according to the target scene, and determining the next target person monitored and tracked by the camera according to the priority sequence.
Preferably, the monitoring tracking device further comprises a switching tracking module, and the switching tracking module is used for:
acquiring a second updating time corresponding to the target scene;
determining the switching time of the target person according to at least two persons to be tracked and the second updating time;
and controlling the cameras to sequentially switch and monitor and track the target person according to the switching time and the priority order.
Preferably, the monitoring and tracking device further comprises an instruction tracking module, and the instruction tracking module is used for:
If a monitoring instruction of a preset terminal is detected, determining a target camera corresponding to the monitoring instruction;
the control target camera monitors and tracks the monitoring person corresponding to the monitoring instruction.
The invention also provides a computer readable storage medium, wherein the computer readable storage medium stores a monitoring tracking program, and the monitoring tracking program realizes the steps of the monitoring tracking method when being executed by a processor.
The method implemented when the monitoring and tracking program running on the processor is executed may refer to various embodiments of the monitoring and tracking method of the present invention, which are not described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as above, comprising instructions for causing an end system (which may be a mobile phone, a computer, a server, an air conditioner, or a network system, etc.) to perform the method of the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein, or any application, directly or indirectly, in the field of other related technology.
Claims (9)
1. A method of monitoring and tracking comprising:
If the person to be tracked is acquired through the camera, determining a target scene where the person to be tracked is located and behavior information of the person to be tracked in the target scene;
determining a target person in the persons to be tracked according to the target scene and the behavior information;
Controlling the camera to monitor and track the picture of the target person, and displaying the monitored and tracked picture of the target person on a preset terminal;
when the number of the people to be tracked is greater than or equal to two, the behavior information includes the occurrence times and the stay time, and the determining the target people in the people to be tracked according to the target scene and the behavior information includes:
Calculating the occurrence percentage and the time percentage of the person A to be tracked according to the occurrence times and the residence time of the person A to be tracked of a target scene, wherein the person A to be tracked is any one of at least two persons to be tracked;
Determining the priority percentage of the person A to be tracked according to the occurrence percentage and the time percentage of the person A to be tracked;
and determining the target person among the persons to be tracked according to the priority percentage of each of the at least two persons to be tracked.
2. The method of claim 1, wherein the determining the target scene in which the person to be tracked is located and the behavior information of the person to be tracked in the target scene comprise:
determining the position of the camera;
determining a target scene where the person to be tracked is located according to the position of the camera;
And acquiring a pre-acquired person group of the target scene, and determining behavior information of the person to be tracked in the target scene according to the pre-acquired person group.
3. The method of claim 1, wherein the behavior information includes a dwell time and a number of clicks, and wherein the determining a target person of the persons to be tracked based on the target scene and the behavior information comprises:
calculating the time percentage of the person B to be tracked according to the stay time of the person B to be tracked of the target scene, wherein the person B to be tracked is any one of at least two persons to be tracked;
Determining the priority proportion of the person B to be tracked according to the time percentage of the person B to be tracked and the clicking times of the person B to be tracked in the target scene;
Determining the priority sequence of the at least two people to be tracked according to the priority proportion of each of the at least two people to be tracked;
and determining the target person among the persons to be tracked according to the priority order.
4. The method of claim 3, wherein after the picture of the monitored and tracked target person is displayed on the preset terminal, the method further comprises:
If the target person is detected to leave the target scene, acquiring a first updating time, and updating a person group corresponding to the target person according to the first updating time;
and acquiring the priority sequence according to the target scene, and determining the next target person monitored and tracked by the camera according to the priority sequence.
5. The method of claim 4, wherein after the determining the target one of the people to be tracked according to the order of preference, the method further comprises:
Acquiring a second updating time corresponding to the target scene;
determining the switching time of the target person according to the at least two persons to be tracked and the second updating time;
and controlling the cameras to sequentially switch, monitor and track the target person according to the switching time and the priority order.
6. The method of any one of claims 1-5, wherein the method further comprises:
if the monitoring instruction of the preset terminal is detected, determining a target camera corresponding to the monitoring instruction;
and controlling the target camera to monitor and track the monitoring person corresponding to the monitoring instruction.
7. A monitoring and tracking device, comprising:
The first determining module is used for determining a target scene where the person to be tracked is located and behavior information of the person to be tracked in the target scene if the person to be tracked is acquired through the camera;
The second determining module is used for determining the target person among the persons to be tracked according to the target scene and the behavior information;
The monitoring tracking module is used for controlling the camera to monitor and track the picture of the target person;
the display module is used for displaying the monitored and tracked picture of the target person on a preset terminal;
When the number of the persons to be tracked is greater than or equal to two and the behavior information comprises the number of occurrences and the residence time, the second determining module is further configured to calculate the occurrence percentage and the time percentage of the person A to be tracked according to the number of occurrences and the residence time of the person A to be tracked in the target scene, where the person A to be tracked is any one of at least two persons to be tracked; determine the priority percentage of the person A to be tracked according to the occurrence percentage and the time percentage of the person A to be tracked; and determine the target person among the persons to be tracked according to the priority percentage of each of the at least two persons to be tracked.
8. A monitoring and tracking system, comprising:
A memory, a processor and a monitoring trace program stored on the memory and executable on the processor, the monitoring trace program when executed by the processor implementing the steps of the monitoring trace method according to any one of claims 1 to 6.
9. A computer-readable storage medium, wherein a monitoring trace program is stored on the computer-readable storage medium, which when executed by a processor, implements the steps of the monitoring trace method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010855738.1A CN112333419B (en) | 2020-08-21 | 2020-08-21 | Monitoring tracking method, device, system and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010855738.1A CN112333419B (en) | 2020-08-21 | 2020-08-21 | Monitoring tracking method, device, system and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112333419A CN112333419A (en) | 2021-02-05 |
CN112333419B true CN112333419B (en) | 2024-08-23 |
Family
ID=74303733
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010855738.1A Active CN112333419B (en) | 2020-08-21 | 2020-08-21 | Monitoring tracking method, device, system and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112333419B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114827464B (en) * | 2022-04-19 | 2023-03-03 | 北京拙河科技有限公司 | Target tracking method and system based on mobile camera |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019180025A (en) * | 2018-03-30 | 2019-10-17 | セコム株式会社 | Monitoring device |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040084222A (en) * | 2003-03-27 | 2004-10-06 | 주식회사 케이티 | System and Method for Tracking and Monitering a Moving Object |
JP4140567B2 (en) * | 2004-07-14 | 2008-08-27 | 松下電器産業株式会社 | Object tracking device and object tracking method |
WO2007119355A1 (en) * | 2006-03-15 | 2007-10-25 | Omron Corporation | Tracking device, tracking method, tracking device control program, and computer-readable recording medium |
JP2008152328A (en) * | 2006-12-14 | 2008-07-03 | Hitachi Information & Control Solutions Ltd | Suspicious person monitoring system |
JP4985742B2 (en) * | 2009-10-19 | 2012-07-25 | 日本電気株式会社 | Imaging system, method and program |
JP5459674B2 (en) * | 2010-09-13 | 2014-04-02 | 株式会社東芝 | Moving object tracking system and moving object tracking method |
WO2014136979A1 (en) * | 2013-03-08 | 2014-09-12 | 株式会社デンソーウェーブ | Device and method for monitoring moving entity |
CN105635654B (en) * | 2014-10-30 | 2018-09-18 | 杭州萤石网络有限公司 | Video frequency monitoring method, apparatus and system, video camera |
CN107948581A (en) * | 2017-10-31 | 2018-04-20 | 易瓦特科技股份公司 | The method, system and device being identified based on unmanned plane to destination object |
JP7128577B2 (en) * | 2018-03-30 | 2022-08-31 | セコム株式会社 | monitoring device |
CN108898079A (en) * | 2018-06-15 | 2018-11-27 | 上海小蚁科技有限公司 | A kind of monitoring method and device, storage medium, camera terminal |
CN110222640B (en) * | 2019-06-05 | 2022-02-18 | 浙江大华技术股份有限公司 | Method, device and method for identifying suspect in monitoring site and storage medium |
CN110688896A (en) * | 2019-08-23 | 2020-01-14 | 北京正安维视科技股份有限公司 | Pedestrian loitering detection method |
CN110751116B (en) * | 2019-10-24 | 2022-07-01 | 银河水滴科技(宁波)有限公司 | Target identification method and device |
CN110691197A (en) * | 2019-11-04 | 2020-01-14 | 上海摩象网络科技有限公司 | Shooting tracking method and device and electronic equipment |
CN110889346B (en) * | 2019-11-15 | 2021-07-02 | 云从科技集团股份有限公司 | Intelligent tracking method, system, equipment and readable medium |
CN111311649A (en) * | 2020-01-15 | 2020-06-19 | 重庆特斯联智慧科技股份有限公司 | Indoor internet-of-things video tracking method and system |
GB2593931A (en) * | 2020-04-09 | 2021-10-13 | Kraydel Ltd | Person monitoring system and method |
CN113674325A (en) * | 2021-09-02 | 2021-11-19 | 中科海微(北京)科技有限公司 | Personnel trajectory tracking method and system |
-
2020
- 2020-08-21 CN CN202010855738.1A patent/CN112333419B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019180025A (en) * | 2018-03-30 | 2019-10-17 | セコム株式会社 | Monitoring device |
Also Published As
Publication number | Publication date |
---|---|
CN112333419A (en) | 2021-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11830251B2 (en) | Video monitoring apparatus, method of controlling the same, computer-readable storage medium, and video monitoring system | |
US20190238786A1 (en) | Image processing system, image processing method, and program | |
CN105336077B (en) | Data processing equipment and its method of operation | |
US8174572B2 (en) | Intelligent camera selection and object tracking | |
WO2014208575A1 (en) | Video monitoring system, video processing device, video processing method, and video processing program | |
CN112770182B (en) | Video playing control method, device, equipment and storage medium | |
CN112333419B (en) | Monitoring tracking method, device, system and computer readable storage medium | |
CN114116089A (en) | Data visualization method, device, equipment and storage medium | |
JP7632524B2 (en) | Information processing device, information processing method, and program | |
CN112333502A (en) | Intelligent television display method, intelligent television and computer readable storage medium | |
CN108474576A (en) | Control device, control method and program | |
US20250209080A1 (en) | Information aggregation in a multi-modal entity-feature graph for intervention prediction for a medical patient | |
JP7389955B2 (en) | Information processing device, information processing method and program | |
CN118212571A (en) | Intelligent terminal user emotion recognition data processing method and equipment | |
JP5836407B2 (en) | Advertisement display control method, advertisement display control apparatus, and program | |
CN117743634A (en) | Object retrieval method, system and equipment | |
JP4610005B2 (en) | Intruding object detection apparatus, method and program by image processing | |
CN115981517A (en) | VR multi-terminal collaborative interaction method and related equipment | |
CN114501747A (en) | Hospital ward light intelligent control method, device, equipment and storage medium | |
CN111225250B (en) | Video extended information processing method and device | |
CN118584824B (en) | Home control system, method, device and medium based on user behavior prediction | |
KR101081916B1 (en) | Apparatus and method for providing adaptive service for an user | |
CN112182049A (en) | Wearable clothes determination method and device, storage medium and electronic device | |
CN115086527B (en) | Household video tracking and monitoring method, device, equipment and storage medium | |
CN117931357B (en) | Intelligent mirror based on interactive data processing, mirror cabinet and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |