CN113470109A - Passenger flow statistical method, electronic equipment and computer storage medium
- Publication number
- CN113470109A CN113470109A CN202110642328.3A CN202110642328A CN113470109A CN 113470109 A CN113470109 A CN 113470109A CN 202110642328 A CN202110642328 A CN 202110642328A CN 113470109 A CN113470109 A CN 113470109A
- Authority
- CN
- China
- Prior art keywords
- target
- camera
- radar
- coordinate system
- passenger flow
- Prior art date
- 2021-06-09
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications

- G06T7/70 — Image analysis; determining position or orientation of objects or cameras (G—Physics; G06—Computing, calculating or counting; G06T—Image data processing or generation, in general)
- G06T7/246 — Image analysis; analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T2207/10044 — Indexing scheme for image analysis or image enhancement; image acquisition modality; radar image
- G06T2207/30196 — Indexing scheme for image analysis or image enhancement; subject of image; human being; person
- G06T2207/30201 — Indexing scheme for image analysis or image enhancement; subject of image; face
- G06T2207/30242 — Indexing scheme for image analysis or image enhancement; subject of image; counting objects in image
Abstract
The application discloses a passenger flow statistics method, an electronic device and a computer storage medium. The method comprises: in response to a first target entering the overlapping monitoring area of a camera and a radar, acquiring a first image of the first target captured by the camera and a first track of the first target detected by the radar, wherein the object jointly monitored by the camera and the radar is located in the overlapping monitoring area; determining, based on the first track, whether the first target crosses the object; and if so, counting the first image and the first track corresponding to the first target as first passenger flow information. In this way, the present application can improve the accuracy of passenger flow statistics.
Description
Technical Field
The present application relates to the field of data statistics, and in particular, to a passenger flow statistics method, an electronic device, and a computer storage medium.
Background
With the advent of the big data era, the demand for monitoring passenger flow in commercial venues keeps growing. In the prior art, passenger flow is usually monitored by photographing entering and leaving targets with a camera mounted opposite the doorway of the venue: once the camera has captured images of pedestrians in the doorway area, those pedestrians are counted as passenger flow information. With this approach, however, pedestrians who merely pass by the venue without entering through the doorway are also counted, so the passenger flow statistics contain errors. In view of the above, how to improve the accuracy of passenger flow statistics has become an urgent problem to be solved.
Disclosure of Invention
The technical problem mainly addressed by the present application is to provide a passenger flow statistics method, an electronic device and a computer storage medium that can improve the accuracy of passenger flow statistics.
In order to solve the above technical problem, a first aspect of the present application provides a passenger flow statistics method, comprising: in response to a first target entering an overlapping monitoring area of a camera and a radar, acquiring a first image corresponding to the first target captured by the camera and a first track corresponding to the first target detected by the radar, wherein an object jointly monitored by the camera and the radar is located in the overlapping monitoring area; determining whether the first target crosses the object based on the first track; and if so, counting the first image corresponding to the first target and the first track as first passenger flow information.
The passenger flow statistics method further comprises: obtaining initial positions of the camera and the radar, and acquiring the overlapping monitoring area of the camera and the radar based on these initial positions; and obtaining the width of the object, and setting a threshold line corresponding to the object in the overlapping monitoring area based on the initial position of the camera and the width of the object.
The step of acquiring the overlapping monitoring area of the camera and the radar based on their initial positions comprises: calibrating a first coordinate system of the radar and a second coordinate system of the camera based on the initial positions of the camera and the radar to obtain the overlapping monitoring area, and establishing a conversion relation between the first coordinate system and the second coordinate system.
The step of obtaining the width of the object and setting the threshold line comprises: obtaining the width of the object photographed by the camera, and calibrating the position of the object in the second coordinate system based on the initial position of the camera and the width of the object; and converting the position of the object in the second coordinate system into the first coordinate system, based on the conversion relation between the two coordinate systems, to obtain the threshold line in the first coordinate system. The step of determining whether the first target crosses the object based on the first track then comprises: judging whether the first track crosses the threshold line.
The step of acquiring the first image and the first track in response to the first target entering the overlapping monitoring area comprises: in response to the camera and/or the radar detecting that the human body of the first target has entered the overlapping monitoring area, configuring a corresponding identifier for the human body of the first target; acquiring at least one first image corresponding to the first target captured by the camera, the first image comprising a human body image corresponding to the human body of the first target and a face image corresponding to the face of the first target; and acquiring the first track of the human body of the first target detected by the radar in the first coordinate system.
The step of acquiring the first track of the human body of the first target in the first coordinate system comprises: acquiring the distance and angle, detected by the radar, of the human body of the first target relative to the camera in the first coordinate system, and generating the first track of the human body of the first target in the first coordinate system based on the changes in that distance and angle.
The step of counting the first image and the first track corresponding to the first target as first passenger flow information comprises: binding the human body image corresponding to the human body of the first target having the same identifier with the first track; generating a binding relationship between the first track and the face image based on the correspondence between the human body image and the face image in the first image of the first target with the same identifier acquired by the camera; and counting the first target for which the binding relationship is generated, the face image corresponding to the first target and the first track as the first passenger flow information.
After the step of counting the first image and the first track corresponding to the first target as first passenger flow information, the method comprises: acquiring a face image corresponding to a preset second target, and discarding the face image corresponding to the second target in the first passenger flow information, together with the first track corresponding to that face image, to obtain second passenger flow information.
In order to solve the above technical problem, a second aspect of the present application provides an electronic device, which includes a memory and a processor coupled to each other, wherein the memory stores program data, and the processor calls the program data to execute the passenger flow statistics method according to the first aspect.
In order to solve the above technical problem, a third aspect of the present application provides a computer storage medium, having program data stored thereon, where the program data, when executed by a processor, implements the passenger flow statistics method of the first aspect.
The beneficial effects of the present application are as follows. The first target and the monitored object are jointly monitored by a camera and a radar. When the first target enters the overlapping monitoring area of the camera and the radar, the camera acquires a first image of the first target and the radar detects a first track corresponding to the first target; when the first track crosses the monitored object, the first image and the first track corresponding to the first target are counted together as first passenger flow information. In this way, the camera can acquire images of the first target and distinguish whether it is a pedestrian or an object, the first image of a pedestrian can be further analyzed, and the radar can assist the camera in accurately judging whether the first target crosses the monitored object, thereby improving the accuracy of passenger flow statistics.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort. Wherein:
FIG. 1 is a schematic flow chart of an embodiment of the passenger flow statistics method of the present application;
FIG. 2 is a schematic flow chart of another embodiment of the passenger flow statistics method of the present application;
FIG. 3 is a schematic flow chart of an embodiment before step S201 in FIG. 2;
FIG. 4 is a schematic diagram of an application scenario of an embodiment of the passenger flow statistics method of the present application;
FIG. 5 is a schematic structural diagram of an embodiment of an electronic device of the present application;
FIG. 6 is a schematic structural diagram of an embodiment of a computer storage medium according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the related objects before and after it are in an "or" relationship. Further, the term "plurality" herein means two or more.
Referring to fig. 1, fig. 1 is a schematic flow chart of an embodiment of a passenger flow statistics method according to the present application, the method including:
step S101: and responding to the fact that the first target enters an overlapping monitoring area of the camera and the radar, and acquiring a first image corresponding to the first target shot by the camera and a first track corresponding to the first target detected by the radar, wherein an object monitored by the camera and the radar together is located in the overlapping monitoring area.
Specifically, a camera and a radar are mounted in advance at a position facing the object to be monitored; the radar may be integrated into the camera. After installation, the overlapping monitoring area of the camera and the radar is calibrated. When a first target enters the overlapping monitoring area, the camera and the radar monitor it respectively: the camera acquires a first image corresponding to the first target, and the radar acquires a first track corresponding to the first target.
In one application, the camera and the radar are mounted adjacent to each other at one end of the object, facing it. The areas monitored by the camera and the radar are determined from their positions, the overlapping monitoring area is determined from their parameters and mounting positions, and the position of the jointly monitored object is calibrated within the overlapping monitoring area. When the first target is detected entering the overlapping monitoring area, the camera performs video monitoring of the first target and extracts a first image of it from the frames of the video stream, while the radar detects the first target and, as it moves, generates its first track from the point cloud images detected by the radar.
In one application scenario, the camera and the radar are arranged facing a shop doorway, with the shop threshold as the object jointly monitored by the camera and the radar, and the overlapping monitoring area is set accordingly. After a first target enters the overlapping monitoring area, the camera records video and captures a first image corresponding to the first target, while the radar detects the first target to generate a series of point cloud images, which are superimposed to obtain the first track of the first target.
It should be noted that when the camera and/or the radar detects one or more first targets entering the overlapping monitoring area, the camera and the radar assign a unique identifier to each first target to distinguish different first targets, and the first image and first track of each first target are associated with its identifier.
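As a non-limiting illustration of this bookkeeping (the patent prescribes no data layout, and every name below is invented for the sketch), the per-identifier association between images and track might be organized as follows in Python:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class TargetRecord:
    """Everything gathered for one first target, keyed by its unique identifier."""
    target_id: int
    body_images: List[object] = field(default_factory=list)  # camera crops of the human body
    face_images: List[object] = field(default_factory=list)  # camera crops of the face
    track: List[Tuple[float, float]] = field(default_factory=list)  # radar points (x, y), first coordinate system

records: Dict[int, TargetRecord] = {}
_next_id = 0

def on_target_entered() -> TargetRecord:
    """Assign a fresh, unique identifier when the camera and/or radar first
    detects a target in the overlapping monitoring area."""
    global _next_id
    _next_id += 1
    records[_next_id] = TargetRecord(target_id=_next_id)
    return records[_next_id]
```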
Step S102: and judging whether the first target crosses the object or not based on the first track, and if so, counting a first image and the first track corresponding to the first target as first passenger flow information.
Specifically, after the first track detected by the radar is acquired, it is judged whether the first track intersects the position of the object calibrated in the overlapping area, thereby determining whether the first target crosses the jointly monitored object.
Further, in response to the first target crossing the jointly monitored object, the first image and the first track corresponding to the first target with the same identifier are uploaded to the server, and the first target together with its first image and first track is counted as first passenger flow information.
In one application, when the first track crosses the position of the object in the overlapping monitoring area, the first image of the first target corresponding to the first track is obtained, image features are extracted from the first image, and it is judged from these features whether the first target is a pedestrian; if so, the first image and the first track corresponding to the first target are bound and stored, and the first target is counted as first passenger flow information.
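For concreteness, the crossing judgment can be reduced to a standard segment-intersection test between consecutive track points and the calibrated object position. The following is a minimal sketch (not taken from the patent; the threshold line is assumed to be a straight segment in the radar coordinate system):

```python
from typing import List, Tuple

Point = Tuple[float, float]

def _cross(a: Point, b: Point, c: Point) -> float:
    # Signed area of triangle (a, b, c); the sign tells which side c lies on.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def _segments_intersect(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """True if segment p1-p2 strictly crosses segment q1-q2."""
    d1, d2 = _cross(q1, q2, p1), _cross(q1, q2, p2)
    d3, d4 = _cross(p1, p2, q1), _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def track_crosses(track: List[Point], line: Tuple[Point, Point]) -> bool:
    """Judge whether the first track crosses the threshold line: test every
    consecutive pair of radar track points against the line segment."""
    q1, q2 = line
    return any(_segments_intersect(track[i], track[i + 1], q1, q2)
               for i in range(len(track) - 1))
```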
In the above scheme, the first target and the monitored object are jointly monitored by the camera and the radar. When the first target enters the overlapping monitoring area, the camera acquires a first image of the first target and the radar detects the corresponding first track; when the first track crosses the monitored object, the first image and the first track corresponding to the first target are counted together as first passenger flow information. In this way, the camera can acquire images of the first target and distinguish whether it is a pedestrian or an object, the first image of a pedestrian can be further analyzed, and the radar can assist the camera in accurately judging whether the first target crosses the monitored object, thereby improving the accuracy of passenger flow statistics.
Referring to fig. 2, fig. 2 is a schematic flow chart of another embodiment of the passenger flow statistics method of the present application, the method comprising:
step S201: and responding to the condition that the human body of the first target enters the overlapped monitoring area monitored by the camera and/or the radar, and configuring a corresponding identifier for the human body of the first target.
Specifically, when a first target enters an overlapping monitoring area, a human body of the first target is monitored by a camera and/or a radar, the camera and/or the radar sets a corresponding identifier for the human body of the first target, and the identifier is unique and is bound with the human body of the first target to distinguish different human bodies of the first target.
In an application mode, when any one of the camera or the radar monitors that the first target enters the overlapping monitoring area, the camera or the radar configures an identifier for the first target, the device which detects the first target continuously tracks the first target, waits for the other device to detect the first target, and then configures the same identifier for the same first target.
In another application, when the first target enters the overlapping monitoring area, the camera and the radar jointly configure the same identifier for the first target only when both the camera and the radar detect the first target.
In an embodiment, referring to fig. 3, fig. 3 is a schematic flowchart of an embodiment of steps performed before step S201 in fig. 2, which include:
Step S301: obtain initial positions of the camera and the radar, and acquire the overlapping monitoring area of the camera and the radar based on these initial positions.
Specifically, after the user has finished installing the camera and the radar, their initial positions are entered, the initial positions comprising the distance and angle of the camera and of the radar relative to the object. The overlapping monitoring area of the camera and the radar is then obtained from these initial positions and from the initial parameters of the camera and the radar.
In one application, when installing the camera and the radar, the user enters their initial positions several times, so that when the overlapping monitoring area is determined, the monitoring areas of the camera and the radar overlap as much as possible. Both sensors can then monitor the first target simultaneously as far as possible, improving monitoring precision.
In one application scenario, the first coordinate system of the radar and the second coordinate system of the camera are calibrated based on the initial positions of the camera and the radar to obtain the overlapping monitoring area, and a conversion relation between the first coordinate system and the second coordinate system is established.
Specifically, according to the initial positions of the camera and the radar, the first coordinate system of the radar and the second coordinate system of the camera are calibrated through a spatial calibration algorithm, and the conversion relation between them is established so that the camera and the radar can be associated with the same detected target: the coordinates of every pixel in the picture captured by the camera correspond one-to-one to a point in the radar detection area, and every point or point cloud in the radar detection area likewise corresponds to coordinates in the picture captured by the camera.
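The patent does not name a particular spatial calibration algorithm. Assuming the monitored scene is approximately a ground plane, one common realization of the conversion relation is a planar homography fitted from a few point correspondences, as in this hedged OpenCV sketch (all coordinate values below are illustrative):

```python
import numpy as np
import cv2

# Calibration correspondences observed in both sensors: pixel (u, v) in the
# camera picture (second coordinate system) and the matching ground-plane
# point (x, y), in metres, in the radar frame (first coordinate system).
pixels = np.array([[320, 400], [900, 410], [330, 700], [880, 690]], dtype=np.float32)
radar_xy = np.array([[-1.5, 4.0], [1.5, 4.0], [-1.2, 1.5], [1.2, 1.5]], dtype=np.float32)

# H converts camera pixel coordinates into radar coordinates.
H, _ = cv2.findHomography(pixels, radar_xy)

def pixel_to_radar(u: float, v: float) -> tuple:
    """Map one camera pixel into the first (radar) coordinate system."""
    p = H @ np.array([u, v, 1.0])
    return (float(p[0] / p[2]), float(p[1] / p[2]))
```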
In a specific application scenario with insufficient light, the radar detects the first target first while the camera temporarily fails to detect it. The radar then assigns an identifier to the first target and tracks its first track until, at a first moment, the camera extracts a first image of the first target from the monitoring picture. The position of the first target detected by the radar at the first moment in the first coordinate system is converted into the second coordinate system of the camera based on the conversion relation between the two coordinate systems, so that the camera assigns the same identifier to the same first target. This facilitates binding the first image and the first track of the first target, and the radar's detection of the first track is unaffected, so nothing is missed when judging whether the first target crosses the jointly monitored object.
Step S302: the width of the object is obtained, and a threshold line corresponding to the object is set in the overlapping monitoring area based on the initial position of the camera and the width of the object.
Specifically, the width of the monitored object is acquired, the position of the object is located in the monitoring picture of the camera according to the initial position of the camera and the width of the object, that position is marked in the monitoring picture, and it is used as the threshold line corresponding to the object in the overlapping monitoring area.
In one application, the width of the object photographed by the camera is obtained, and the position of the object in the second coordinate system is calibrated based on the initial position of the camera and the width of the object; the position of the object in the second coordinate system is then converted into the first coordinate system, based on the conversion relation between the two coordinate systems, to obtain the threshold line in the first coordinate system.
Specifically, with the second coordinate system as reference, the position of the object is marked in the monitoring picture captured by the camera and its width is drawn, giving the position of the object in the second coordinate system. This position is converted into the first coordinate system based on the conversion relation between the two coordinate systems, and the resulting position of the object in the first coordinate system is used as the threshold line.
In a specific application scenario, the object jointly monitored by the camera and the radar is the doorsill of a shop. The camera photographs the shop doorway to obtain a monitoring picture, the doorsill is drawn in the picture to obtain its width, and the position of the doorsill is marked in the second coordinate system. The pixel points of the doorsill in the monitoring picture are then converted into the first coordinate system based on the conversion relation between the two coordinate systems, so that the radar, taking the first coordinate system as reference, obtains the position of the doorsill in the first coordinate system as the threshold line. When a pedestrian serves as the first target, the threshold line is used as the reference for judging whether the pedestrian crosses the doorsill and enters the shop.
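Continuing the calibration sketch above (again with invented pixel values), the doorsill drawn in the camera picture becomes a threshold line in the radar frame simply by converting its two endpoints:

```python
# Doorsill endpoints marked in the monitoring picture (second coordinate system).
sill_left_px = (350, 650)    # illustrative pixel coordinates
sill_right_px = (860, 655)

# Threshold line in the first (radar) coordinate system, usable by track_crosses().
threshold_line = (pixel_to_radar(*sill_left_px), pixel_to_radar(*sill_right_px))
```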
Step S202: the method comprises the steps of obtaining at least one first image corresponding to a first target shot by a camera, and obtaining a first track of a human body of the first target detected by a radar in a first coordinate system.
Specifically, the first image comprises a human body image corresponding to the human body of the first target and a face image corresponding to the face of the first target. When a pedestrian enters the overlapping monitoring area as the first target, the camera captures the pedestrian's body and face to obtain the human body image and the face image, while the radar tracks the first target to obtain the first track it traces as it moves in the first coordinate system.
In one application, the distance and angle, detected by the radar, of the human body of the first target relative to the camera in the first coordinate system are obtained, and the first track of the human body of the first target in the first coordinate system is generated based on the changes in the distance and angle.
Specifically, referring to fig. 4, fig. 4 is a schematic view of an application scenario of an embodiment of the passenger flow statistics method of the present application. The radar and the camera are disposed adjacent to each other: the star represents their installation position, the sector represents the overlapping monitoring area of the camera and the radar, and the horizontal dotted line represents the threshold line w0. When the radar and the camera detect the first target, an ID is assigned to it; the camera collects a first image including the human body image and the face image of the first target, while the radar continuously tracks the human body of the first target and detects the distance L_ID and angle δθ of the first target relative to the camera. The first track of the human body of the first target in the first coordinate system is obtained from the changes in the distance L_ID and angle δθ. By monitoring L_ID and δθ in real time, the motion track of the first target within the overlapping monitoring area is obtained accurately, providing a basis for judging whether the first target crosses the object.
Further, the direction of motion of the first track is recorded based on the distance L_ID and angle δθ: the first target approaching the camera is recorded as entering through the jointly monitored object, and the first target moving away from the camera is recorded as leaving through it. During subsequent passenger flow statistics, only first targets entering through the jointly monitored object need be taken as actual passenger flow, according to the direction of the first track, which improves the accuracy of the statistics.
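A minimal sketch of this polar-to-Cartesian bookkeeping follows (notation mirrors FIG. 4; the angle convention, degrees from the camera boresight, is an assumption, since the patent does not fix one):

```python
import math
from typing import List, Tuple

def polar_track_to_xy(samples: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Convert radar samples (L_ID in metres, delta-theta in degrees, measured
    from the camera boresight) into (x, y) points of the first track."""
    return [(L * math.sin(math.radians(theta)), L * math.cos(math.radians(theta)))
            for L, theta in samples]

def travel_direction(samples: List[Tuple[float, float]]) -> str:
    """Record the direction of the first track: a shrinking distance L_ID means
    the target approaches the camera, i.e. it is entering through the object."""
    return "entering" if samples[-1][0] < samples[0][0] else "leaving"
```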
Step S203: and judging whether the first target crosses the object or not based on the first track, and if so, counting a first image and the first track corresponding to the first target as first passenger flow information.
Specifically, it is determined whether the first trajectory crosses the threshold line. And judging whether the first target crosses the monitored object or not based on whether the first track corresponding to the first target detected by the radar crosses the position of the threshold line or not.
It can be understood that the threshold line corresponds to the position of the object in the first coordinate system, the radar continuously detects the first target to generate a first track corresponding to the first target in the first coordinate system, and then the threshold line and the first track are both in the first coordinate system, and whether the first target crosses the monitored object can be accurately and quickly determined by judging whether the first track crosses the threshold line, so as to improve the accuracy of passenger flow statistics.
Further, when the first track does not intersect with the threshold line, the first track and the first image of the first target are discarded, when the first track intersects with the threshold line, the first image corresponding to the first target is bound with the first track, and the first target generating the binding relationship is counted as the first passenger flow information.
In one application, the human body image corresponding to the human body of the first target with the same identifier is bound with the first track; a binding relationship between the first track and the face image is generated based on the correspondence between the face image and the body image in the first image of the first target with the same identifier acquired by the camera; and the first target for which the binding relationship is generated, the face image corresponding to the first target and the first track are counted as first passenger flow information.
Specifically, when a first track crosses the threshold line, the identifier of the first target corresponding to that track is obtained, the human body of the first target with the same identifier is found, and its human body image is bound with the first track. The human body image and face image of the first target with the same identifier collected by the camera are then obtained, and the first track is bound with the face image based on the correspondence between the human body image and the face image. With the human body as the medium, an accurate binding relationship between the first track and the face image is thus finally generated. The first target for which the binding relationship is generated, together with its face image and first track, is counted as first passenger flow information; the face image can then be used to analyze the age and gender of a customer, or compared with face images in a gallery to determine the customer's identity.
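Pulling the earlier sketches together (TargetRecord, track_crosses and threshold_line are the illustrative names introduced above, not terms from the patent), the counting of first passenger flow information could look like this:

```python
def count_first_passenger_flow(records, threshold_line):
    """Keep only targets whose radar track crosses the threshold line, and bind
    track and face image through the human body image sharing the same identifier."""
    first_flow = []
    for rec in records.values():
        if not track_crosses(rec.track, threshold_line):
            continue  # the first target never crossed the jointly monitored object
        if not rec.face_images:
            continue  # no face was captured, so no binding relationship can be generated
        first_flow.append({
            "id": rec.target_id,
            "face": rec.face_images[0],  # face bound to the track via the body image
            "track": rec.track,
        })
    return first_flow
```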
Step S204: and acquiring a face image corresponding to a preset second target, and discarding the face image corresponding to the second target in the first passenger flow information and the first track corresponding to the face image to acquire second passenger flow information.
Specifically, staff members serve as second targets, and their face images are collected and stored in advance. After the first passenger flow information is obtained, the face images corresponding to the first targets in it are compared with the face images of the second targets; any matching face image of a second target is extracted from the first passenger flow information and discarded. Removing these preset targets yields more accurate passenger flow information.
In one application scenario, a shop employee serves as the second target, and the employee's face image is collected and stored for later use. After the first images and first tracks of first targets have been collected, the first targets whose tracks intersect the threshold line are taken as first passenger flow information, i.e. those who crossed the shop threshold. The employee's face image is then searched for in the first passenger flow information, and the employee's face image together with the motion track bound to it are deleted, so that all of the employee's data are removed from the first passenger flow information. The employee is thus not counted in the final second passenger flow information for the shop doorway, improving the accuracy of the passenger flow statistics.
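As a final hedged sketch (the patent specifies no face-matching algorithm, so the comparison function is injected as a parameter rather than assumed), excluding the pre-registered second targets could be done as follows:

```python
STAFF_MATCH_THRESHOLD = 0.6  # illustrative similarity cut-off

def filter_out_staff(first_flow, staff_faces, face_similarity):
    """Derive second passenger flow information by discarding every entry whose
    face matches a pre-registered staff (second target) face. `face_similarity`
    is any function returning a score in [0, 1] for two face images."""
    second_flow = []
    for entry in first_flow:
        is_staff = any(face_similarity(entry["face"], staff) >= STAFF_MATCH_THRESHOLD
                       for staff in staff_faces)
        if not is_staff:
            second_flow.append(entry)
    return second_flow
```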
In this embodiment, the coordinate systems of the camera and the radar are calibrated separately and a conversion relation between them is generated; the position of the object is calibrated in the second coordinate system and converted into the first coordinate system to obtain the threshold line. The radar monitors the first target to generate a first track, and whether the first track intersects the threshold line is judged in the first coordinate system, so that first targets that do not cross the object are excluded and the accuracy of passenger flow statistics is improved. A first track that intersects the threshold line is bound with the face image of the first target carrying the same identifier and counted as first passenger flow information, and the face images of second targets are further excluded from the first passenger flow information to obtain more accurate passenger flow information.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of an electronic device 50 of the present application, the electronic device 50 includes a memory 501 and a processor 502 coupled to each other, wherein the memory 501 stores program data (not shown), and the processor 502 invokes the program data to implement the passenger flow statistics method in any of the above embodiments.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an embodiment of a computer storage medium 60 of the present application, the computer storage medium 60 stores program data 600, and the program data 600 is executed by a processor to implement the passenger flow statistics method in any of the above embodiments, and the description of the related contents refers to the detailed description of the above method embodiments, which is not repeated herein.
It should be noted that, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.
Claims (10)
1. A passenger flow statistics method, the method comprising:
in response to a first target entering an overlapping monitoring area of a camera and a radar, acquiring a first image corresponding to the first target captured by the camera and a first track corresponding to the first target detected by the radar; wherein an object jointly monitored by the camera and the radar is located in the overlapping monitoring area;
determining whether the first target crosses the object based on the first track; and if so, counting the first image corresponding to the first target and the first track as first passenger flow information.
2. The passenger flow statistics method of claim 1, further comprising:
obtaining initial positions of the camera and the radar, and acquiring the overlapping monitoring area of the camera and the radar based on the initial positions of the camera and the radar; and
obtaining the width of the object, and setting a threshold line corresponding to the object in the overlapping monitoring area based on the initial position of the camera and the width of the object.
3. The passenger flow statistics method of claim 2, wherein the step of acquiring the overlapping monitoring area of the camera and the radar based on the initial positions of the camera and the radar comprises:
calibrating a first coordinate system of the radar and a second coordinate system of the camera based on the initial positions of the camera and the radar to obtain the overlapping monitoring area, and establishing a conversion relation between the first coordinate system and the second coordinate system.
4. The passenger flow statistics method of claim 3, wherein the step of obtaining the width of the object and setting the threshold line corresponding to the object within the overlapping monitoring area based on the initial position of the camera and the width of the object comprises:
obtaining the width of the object photographed by the camera, and calibrating the position of the object in the second coordinate system based on the initial position of the camera and the width of the object; and
converting the position of the object in the second coordinate system into the first coordinate system based on the conversion relation between the first coordinate system and the second coordinate system to obtain the threshold line in the first coordinate system;
and wherein the step of determining whether the first target crosses the object based on the first track comprises:
judging whether the first track crosses the threshold line.
5. The passenger flow statistics method of claim 3, wherein the step of acquiring a first image corresponding to a first target captured by the camera and a first track corresponding to the first target detected by the radar, in response to the first target entering the overlapping monitoring area of the camera and the radar, comprises:
in response to the camera and/or the radar detecting that the human body of the first target enters the overlapping monitoring area, configuring a corresponding identifier for the human body of the first target;
acquiring at least one first image corresponding to the first target captured by the camera, the first image comprising a human body image corresponding to the human body of the first target and a face image corresponding to the face of the first target; and
acquiring the first track of the human body of the first target detected by the radar in the first coordinate system.
6. The passenger flow statistics method of claim 5, wherein the step of acquiring the first track of the human body of the first target in the first coordinate system detected by the radar comprises:
acquiring the distance and angle, detected by the radar, of the human body of the first target relative to the camera in the first coordinate system, and generating the first track of the human body of the first target in the first coordinate system based on the changes in the distance and angle.
7. The passenger flow statistics method of claim 5, wherein the step of counting the first image and the first track corresponding to the first target as first passenger flow information comprises:
binding the human body image corresponding to the human body of the first target having the same identifier with the first track;
generating a binding relationship between the first track and the face image based on the correspondence between the human body image and the face image in the first image of the first target with the same identifier acquired by the camera; and
counting the first target for which the binding relationship is generated, the face image corresponding to the first target and the first track as the first passenger flow information.
8. The passenger flow statistics method of claim 7, wherein after the step of counting the first image and the first track corresponding to the first target as first passenger flow information, the method further comprises:
acquiring a face image corresponding to a preset second target, and discarding the face image corresponding to the second target in the first passenger flow information, together with the first track corresponding to that face image, to obtain second passenger flow information.
9. An electronic device, comprising: a memory and a processor coupled to each other, wherein the memory stores program data, and the processor calls the program data to perform the method of any one of claims 1-8.
10. A computer storage medium having program data stored thereon, wherein the program data, when executed by a processor, implements the method of any one of claims 1-8.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202110642328.3A | 2021-06-09 | 2021-06-09 | Passenger flow statistical method, electronic equipment and computer storage medium
Publications (1)

Publication Number | Publication Date
---|---
CN113470109A (en) | 2021-10-01
Family
ID=77869455
Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202110642328.3A (Pending) | Passenger flow statistical method, electronic equipment and computer storage medium | 2021-06-09 | 2021-06-09

Country Status (1)

Country | Link
---|---
CN | CN113470109A (en)
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102768803A (en) * | 2012-07-31 | 2012-11-07 | 株洲南车时代电气股份有限公司 | Vehicle intelligent monitoring and recording system and method based on radar and video detection |
CN103021059A (en) * | 2012-12-12 | 2013-04-03 | 天津大学 | Video-monitoring-based public transport passenger flow counting method |
CN109327676A (en) * | 2017-07-31 | 2019-02-12 | 杭州海康威视数字技术股份有限公司 | A kind of falling object from high altitude monitoring system, method and device |
CN109816702A (en) * | 2019-01-18 | 2019-05-28 | 苏州矽典微智能科技有限公司 | A kind of multiple target tracking device and method |
CN110457993A (en) * | 2019-06-26 | 2019-11-15 | 广州鹰云信息科技有限公司 | Passenger flow statistical method and device based on recognition of face |
CN111582171A (en) * | 2020-05-08 | 2020-08-25 | 济南博观智能科技有限公司 | Method, device and system for monitoring pedestrian running red light and readable storage medium |
CN111738134A (en) * | 2020-06-18 | 2020-10-02 | 北京市商汤科技开发有限公司 | Method, device, equipment and medium for acquiring passenger flow data |
CN111784387A (en) * | 2020-06-23 | 2020-10-16 | 大连中维世纪科技有限公司 | Multi-dimensional big data-based consumer brand loyalty analysis method |
CN111783588A (en) * | 2020-06-23 | 2020-10-16 | 大连中维世纪科技有限公司 | Distributed intelligent passenger flow statistics effective de-duplication method |
CN111983603A (en) * | 2020-08-31 | 2020-11-24 | 杭州海康威视数字技术股份有限公司 | Motion trajectory relay method, system and device and central processing equipment |
CN112016483A (en) * | 2020-08-31 | 2020-12-01 | 杭州海康威视数字技术股份有限公司 | Relay system, method, device and equipment for target detection |
CN112465855A (en) * | 2021-02-02 | 2021-03-09 | 南京甄视智能科技有限公司 | Passenger flow statistical method, device, storage medium and equipment |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115034350A (en) * | 2022-04-27 | 2022-09-09 | 青岛民航凯亚系统集成有限公司 | Passenger flow monitoring device, method and storage medium in each area of terminal building |
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination