WO2014010174A1 - Angle-of-view variation detection device, angle-of-view variation detection method, and angle-of-view variation detection program - Google Patents
Angle-of-view variation detection device, angle-of-view variation detection method, and angle-of-view variation detection program
- Publication number
- WO2014010174A1 (PCT/JP2013/003762)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- angle
- view
- fixed point
- feature
- imaging device
- Prior art date
Links
- 238000001514 detection method Methods 0.000 title claims abstract description 78
- 238000003384 imaging method Methods 0.000 claims abstract description 66
- 230000008859 change Effects 0.000 claims abstract description 45
- 238000000605 extraction Methods 0.000 claims description 23
- 238000000034 method Methods 0.000 claims description 21
- 238000004364 calculation method Methods 0.000 claims description 18
- 239000000284 extract Substances 0.000 claims description 10
- 238000012545 processing Methods 0.000 claims description 7
- 230000008569 process Effects 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 14
- 230000006870 function Effects 0.000 description 7
- 238000012544 monitoring process Methods 0.000 description 5
- 238000012937 correction Methods 0.000 description 4
- 230000004048 modification Effects 0.000 description 4
- 238000012986 modification Methods 0.000 description 4
- 238000004458 analytical method Methods 0.000 description 2
- 238000003702 image correction Methods 0.000 description 2
- 239000000523 sample Substances 0.000 description 2
- 230000009471 action Effects 0.000 description 1
- 238000007664 blowing Methods 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
- 239000013598 vector Substances 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/14—Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- The present invention relates to an angle-of-view variation detection device, an angle-of-view variation detection method, and an angle-of-view variation detection program for detecting a variation in the angle of view of an imaging device.
- When the angle of view of a camera is shifted by an earthquake or the like, the angle of view is returned to its original state by performing calibration again.
- Cameras are also known in which the camera orientation is stored so that, for example, when the camera recovers from a power failure or the like, the camera posture can be automatically restored based on the stored angle.
- Planetary probes that automatically correct the direction of their cameras are also known. Such a probe detects the position of a specific star such as Canopus using a star tracker and automatically corrects the direction of the camera.
- Patent Document 1 describes a correction device for a monitoring camera.
- The correction device described in Patent Document 1 stores a reference pattern that gives a positional reference when correcting deviation of a captured image. This reference pattern models an initial positional relationship with respect to a specific part existing at a known position in the monitoring area.
- The correction device described in Patent Document 1 performs image correction so that a comparison pattern obtained from the captured image matches the reference pattern.
- A method for correcting the orientation of the camera of a planetary probe has been described above, but the environment photographed by a camera used in a general surveillance system is completely different from outer space.
- For example, a foreground object may move in front of the camera due to wind blowing at the installation location.
- Therefore, it is desirable that a change in the angle of view can be detected appropriately even when the shooting environment of the camera is affected in various ways.
- In Patent Document 1, the captured image is described as being taken by a monitoring camera attached near a door mirror of a vehicle. Surveillance cameras mounted near vehicle door mirrors are particularly susceptible to wind and the like, so the angle of view is assumed to change frequently. However, Patent Document 1 does not mention any way of dealing with this. That is, the correction device described in Patent Document 1 has the problem that image correction is performed every time the device is affected by the imaging environment.
- An object of the present invention is to provide an angle-of-view variation detection device, an angle-of-view variation detection method, and an angle-of-view variation detection program capable of appropriately detecting a variation in the angle of view in accordance with the shooting environment of the imaging device.
- The angle-of-view variation detection device according to the present invention includes angle-of-view variation detection means for detecting a variation in the angle of view of an imaging device from video captured by the imaging device, based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point, and state determination means for determining that the angle of view of the imaging device has changed when the change in the angle of view is stable for a certain period.
- In the angle-of-view variation detection method according to the present invention, a variation in the angle of view of an imaging device is detected from video captured by the imaging device based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point, and it is determined that the angle of view of the imaging device has changed when the variation in the angle of view is stable for a certain period.
- The angle-of-view variation detection program according to the present invention causes a computer to execute an angle-of-view variation detection process of detecting a variation in the angle of view of an imaging device from video captured by the imaging device, based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point, and a state determination process of determining that the angle of view of the imaging device has changed when the change in the angle of view is stable for a certain period.
- According to the present invention, it is possible to appropriately detect a change in the angle of view in accordance with the shooting environment of the imaging device.
- FIG. 1 is a block diagram showing a configuration example of a first embodiment of the angle-of-view variation detection device according to the present invention.
- The angle-of-view variation detection device of this embodiment includes video acquisition means 10, position calculation means 11, camera calibration information storage means 12, angle-of-view variation determination means 13, and camera control information generation means 14.
- The video acquisition means 10 captures video and inputs it to the position calculation means 11 and the angle-of-view variation determination means 13.
- The video acquisition means 10 is realized by an imaging device such as a camera.
- The angle-of-view variation detection device detects the angle-of-view variation of the video acquisition means 10.
- In the first embodiment, the video acquisition means 10 is an imaging device whose angle of view can be controlled from the outside.
- The position calculation means 11 extracts the position of an object or person from the video input from the video acquisition means 10. Specifically, the position calculation means 11 converts the two-dimensional position obtained from the input video into a coordinate system in real space using camera calibration information described later, and calculates the position and size of the person or object in real space.
- The position data calculated in time series constitutes flow line information. Therefore, the behavior of a person can be analyzed by analyzing this information.
- In security applications, for example, it is possible to detect a person intruding into a specific area or a person loitering in a specific area.
- The flow line information can also be used to improve the work efficiency of store clerks, and the flow lines of customers can be obtained as well. Therefore, information that can be used for marketing can be obtained from the flow line information.
- Since the position calculation means 11 also calculates the size of the object, objects can be classified (for example, a person can be distinguished from an object).
- The information calculated by the position calculation means 11 can be used as base data for such flow line analysis and object analysis.
- The camera calibration information storage means 12 stores the camera calibration information used by the position calculation means 11 to calculate the position and size of a person or an object in real space.
- The camera calibration information is stored in the camera calibration information storage means 12 in advance, for example by a user.
- Camera calibration information is also called camera parameters, and is classified into internal parameters and external parameters.
- The internal parameters include parameters describing image distortion due to lens distortion and parameters describing the focal length (zoom factor).
- The external parameters represent the position where the camera is installed and the orientation (posture) of the camera.
- The position calculation means 11 calculates a three-dimensional (real-space) position from the two-dimensional coordinates of the image using these parameters. For example, when specifying the position of a person, the position calculation means 11 may calculate the coordinates of the person's feet on the image and convert this position, based on the camera calibration information, into the corresponding floor-surface position in real space (that is, the surface on which the person stands).
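- As a minimal sketch of this foot-to-floor conversion (not part of the patent text), the mapping from image coordinates to floor-plane coordinates can be expressed with a 3x3 homography; the homography values and function names below are illustrative assumptions, with the homography presumed to be derived from the internal and external parameters during calibration.
```python
import numpy as np

def image_to_floor(foot_xy, H):
    """Map a 2D image point (a person's feet) to floor-plane coordinates.

    foot_xy: (u, v) pixel coordinates of the feet in the image.
    H: 3x3 homography from the image plane to the floor plane, assumed to have
       been derived from the camera calibration information (internal/external parameters).
    """
    u, v = foot_xy
    p = H @ np.array([u, v, 1.0])   # homogeneous transform
    return p[:2] / p[2]             # normalize to (X, Y) on the floor

# Example values chosen only to show the call shape.
H = np.array([[0.01, 0.0, -3.2],
              [0.0, 0.01, -1.8],
              [0.0, 0.0, 1.0]])
print(image_to_floor((640, 980), H))
```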
- The angle-of-view variation determination means 13 determines whether or not the angle of view has changed using the input video. If there is a change, the angle-of-view variation determination means 13 calculates the amount of change or a parameter indicating the change.
- In the following, information indicating the variation in the angle of view is referred to as angle-of-view variation information.
- The camera control information generation means 14 calculates the pan and tilt amounts necessary to restore the angle of view of the video acquisition means 10, based on the angle-of-view variation information generated by the angle-of-view variation determination means 13.
- The obtained information is input to the video acquisition means 10 as camera control information.
- The video acquisition means 10 adjusts the angle of view based on the camera control information and performs processing to return the angle of view to its pre-change state.
- The method for calculating the pan and tilt amounts necessary to restore the angle of view is arbitrary. That is, the camera control information is a parameter for moving the camera in the direction that cancels the variation amount determined by the angle-of-view variation determination means 13. For example, the amount of pixel movement caused by a change in pan angle is determined by the degree of zoom. Therefore, the camera control information generation means 14 may store in advance the relationship between the pan angle and the pixel movement amount for the current zoom level, calculate the apparent pixel movement of the video when the angle of view changes, and obtain the pan angle from that movement amount.
- The tilt angle can be calculated in a similar manner.
- The method by which the camera control information generation means 14 calculates the camera control information is not limited to the above.
- For example, the camera control information generation means 14 may define in advance a function for calculating the pan and tilt amounts from the variation amount, and calculate the camera control information based on that function. Since methods for calculating pan and tilt amounts are widely known, detailed description is omitted here.
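- The following is a hedged illustration of recovering pan and tilt commands from the apparent pixel shift; the linear pixels-per-degree model and the lookup-table values are assumptions made for the sketch, not values given in the patent.
```python
# Assumed mapping from zoom level to pixels of image shift per degree of pan/tilt.
PIXELS_PER_DEGREE = {1.0: 35.0, 2.0: 70.0, 4.0: 140.0}

def pan_tilt_from_shift(dx_px, dy_px, zoom):
    """Convert an apparent image shift (pixels) into pan/tilt commands (degrees)."""
    scale = PIXELS_PER_DEGREE[zoom]
    pan_deg = -dx_px / scale    # move opposite to the apparent image shift
    tilt_deg = -dy_px / scale
    return pan_deg, tilt_deg

# e.g. the image appears shifted 52 px right and 14 px down at 2x zoom
print(pan_tilt_from_shift(52, 14, 2.0))
```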
- FIG. 2 is a block diagram illustrating a configuration example of the angle-of-view variation determination means 13 of the present embodiment.
- The angle-of-view variation determination means 13 includes angle-of-view variation detection means 101, fixed point information storage means 102, feature amount extraction means 103, and feature amount tracking means 104.
- The fixed point information storage means 102 stores fixed point information including the position of each fixed point and a feature amount indicating the features of the fixed point.
- A fixed point is a point specified from the video captured by the video acquisition means 10 whose position and feature amount are stable for a certain time or longer.
- The position of a fixed point is represented by information that can specify its position in the image (for example, two-dimensional coordinates).
- The feature amount of a fixed point is represented by, for example, luminance gradient information, SIFT (Scale Invariant Feature Transform), SURF (Speeded-Up Robust Features), or the like.
- The fixed points are determined in advance by the user or the like, and their positions and feature amounts are stored in the fixed point information storage means 102 in advance.
- For example, the user may determine fixed points by visually selecting non-moving points such as the corners of buildings.
- The angle-of-view variation detection means 101 detects, based on the fixed point information, whether or not the angle of view of the video acquisition means 10 has changed from the input video.
- In the following, information indicating whether or not the angle of view has changed, together with information indicating the content of the change when it has changed, is collectively referred to as angle-of-view variation detection information.
- Specifically, the angle-of-view variation detection means 101 calculates the feature amount of the input video at the same position as each fixed point position stored in the fixed point information storage means 102, compares it with the fixed point feature amount, and counts the number of fixed points whose feature amounts substantially match. When the number of matching fixed points is equal to or larger than a certain number, the angle-of-view variation detection means 101 determines that the angle of view has not changed and generates angle-of-view variation detection information indicating that the angle of view has not changed.
- Otherwise, the angle-of-view variation detection means 101 determines that the angle of view has changed, and generates, as angle-of-view variation detection information, information indicating that the angle of view has changed and information indicating the content of the change.
- The angle-of-view variation detection means 101 inputs the generated angle-of-view variation detection information to the feature amount extraction means 103.
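- A minimal sketch of this fixed-point matching test is shown below; the cosine-similarity measure, the thresholds, and the data layout are assumptions, not values specified by the patent.
```python
import numpy as np

def angle_of_view_changed(frame_descriptors, fixed_points,
                          sim_threshold=0.9, min_matches=10):
    """Return True when the angle of view is judged to have changed.

    fixed_points: list of (position, descriptor) stored in advance.
    frame_descriptors: callable returning the descriptor of the current frame
    at a given image position.
    """
    matches = 0
    for pos, stored_desc in fixed_points:
        desc = frame_descriptors(pos)                 # same position in the new frame
        sim = np.dot(desc, stored_desc) / (
            np.linalg.norm(desc) * np.linalg.norm(stored_desc) + 1e-9)
        if sim >= sim_threshold:                      # "substantially match"
            matches += 1
    return matches < min_matches   # too few matches -> angle of view has changed
```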
- Above, the method in which the angle-of-view variation detection means 101 detects whether or not the angle of view has changed by counting the points that match the fixed points has been described.
- However, the method of detecting whether or not the angle of view has changed is not limited to counting the points that match the fixed points.
- For example, the angle-of-view variation detection means 101 may regard the similarity between the input image and the image indicated by the fixed point information as higher as a statistic (average, median, percentile value, etc.) of the distances between the feature amount vectors becomes smaller, and determine that the angle of view has changed when this similarity falls below a predetermined threshold.
- Alternatively, the angle-of-view variation detection means 101 may determine that the angle of view has changed when the difference between the input image and the image indicated by the fixed point information is larger than a predetermined amount.
- The feature amount extraction means 103 extracts feature points from the input video and extracts the feature amount at the position of each feature point.
- For example, the feature amount extraction means 103 calculates a feature amount such as SIFT or SURF, but the feature amount is not limited to these. Since methods for extracting feature points and for calculating SIFT and SURF are widely known, detailed description is omitted here.
- The feature amount extraction means 103 calculates a feature amount that can be compared with the feature amounts included in the fixed point information.
- In the following, information including the coordinates of the feature points extracted by the feature amount extraction means 103 and the feature amounts of those feature points is referred to as feature point information.
- Note that the feature amount extraction means 103 may also calculate feature amounts when the angle of view has not changed. Since the feature amount need not be calculated frequently at points where it is stable, the feature amount extraction means 103 may reduce the frequency of calculating the feature amount for such feature points.
- The feature amount tracking means 104 collates the fixed points with the feature points within a search range and associates the two.
- Specifically, the feature amount tracking means 104 calculates a parameter for converting the position of each feature point to the position of the corresponding fixed point (hereinafter referred to as an angle-of-view variation parameter). The feature amount tracking means 104 then selects the angle-of-view variation parameter that converts the largest number of feature points into their corresponding fixed points.
- For example, the feature amount tracking means 104 may calculate the distance or similarity between the calculated feature amount of a feature point and the feature amount of a fixed point.
- The feature amount tracking means 104 extracts pairs of a feature point and a fixed point for which the calculated distance is equal to or smaller than a predetermined threshold, or the calculated similarity is equal to or larger than a predetermined threshold, and calculates candidates for the angle-of-view variation parameter using the extracted pairs. That is, each pair indicates a fixed point regarded as corresponding to a feature point.
- For example, the feature amount tracking means 104 calculates rotation and translation parameters as a parameter candidate under the assumption that the feature point moves to the fixed point extracted as its pair.
- The feature amount tracking means 104 may calculate these parameters using, for example, an affine transformation.
- The feature amount tracking means 104 may calculate parameter candidates for a plurality of extracted pairs and adopt the most frequent candidate among them as the angle-of-view variation parameter. At this time, the feature amount tracking means 104 may randomly select pairs from the plurality of pairs.
- Alternatively, the feature amount tracking means 104 may apply arbitrary movements and transformations to the input video and calculate, for each transformation, the distance or similarity between the feature amounts of the fixed points and the feature amounts of the feature points.
- In this case, the feature amount tracking means 104 may use, as the angle-of-view variation parameter, the parameter representing the movement and transformation for which the calculated distance is equal to or smaller than a predetermined threshold, or the calculated similarity is equal to or larger than a predetermined threshold.
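- The sketch below illustrates the "most frequent candidate" idea for estimating the angle-of-view variation parameter; a pure-translation model and the vote quantization step are simplifying assumptions (the patent also mentions rotation and affine models).
```python
from collections import Counter

def estimate_shift(pairs, quant=2):
    """pairs: list of ((fx, fy), (px, py)) = (feature point, matched fixed point).
    Returns the (dx, dy) translation candidate voted for by the most pairs."""
    votes = Counter()
    for (fx, fy), (px, py) in pairs:
        dx, dy = px - fx, py - fy
        # quantize so that nearly identical candidates vote for the same bin
        votes[(round(dx / quant) * quant, round(dy / quant) * quant)] += 1
    (dx, dy), _ = votes.most_common(1)[0]
    return dx, dy

pairs = [((100, 50), (112, 47)), ((300, 200), (311, 196)), ((50, 400), (61, 397))]
print(estimate_shift(pairs))   # (12, -4): the dominant candidate
```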
- The feature amount tracking means 104 temporarily stores the calculated angle-of-view variation parameter.
- The feature amount tracking means 104 calculates the angle-of-view variation parameter in the same manner for the next input video (that is, the video of the next frame).
- The feature amount tracking means 104 then compares the temporarily stored angle-of-view variation parameter with the newly calculated angle-of-view variation parameter.
- When the newly calculated parameter substantially matches the stored parameter over a certain period, the feature amount tracking means 104 determines that there has been a change in the angle of view and outputs the angle-of-view variation parameter as angle-of-view variation information. That is, the feature amount tracking means 104 determines that the angle of view of the video acquisition means 10 has changed when the change in the angle of view is stable for a certain period.
- Here, being stable for a certain period means that the change in the angle-of-view variation parameter between frames remains within a predetermined threshold for a predetermined period.
- In this way, the feature amount tracking means 104 determines that the angle of view of the video acquisition means 10 has changed when the change in the angle-of-view variation parameter between frames of the input video is stable for a certain period. This makes it possible to avoid erroneous detection caused by a momentary change in the angle of view.
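- An illustrative stability check is sketched below; the window length and the pixel tolerance are assumptions chosen for the example.
```python
def parameter_stable(history, window=30, tol=1.5):
    """history: list of (dx, dy) angle-of-view variation parameters, one per frame, newest last.
    Returns True when the parameter has varied by no more than `tol` pixels over the
    last `window` frames, i.e. the change is "stable for a certain period"."""
    if len(history) < window:
        return False
    recent = history[-window:]
    xs = [p[0] for p in recent]
    ys = [p[1] for p in recent]
    return (max(xs) - min(xs)) <= tol and (max(ys) - min(ys)) <= tol
```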
- The feature amount tracking means 104 may also determine whether or not the calculated angle-of-view variation parameter is within the range of normal angle-of-view variation. The feature amount tracking means 104 may then instruct the camera control information generation means 14 to generate the camera control information when the parameter is within the normal range. On the other hand, when the parameter is not within the normal range, the video acquisition means 10 may be unable to return to its pre-change state based on the camera control information. In this case, the feature amount tracking means 104 may output only an alert indicating that a change in the angle of view has occurred and prompt the user to perform recalibration manually.
- The angle-of-view variation detection means 101, the feature amount extraction means 103, and the feature amount tracking means 104 are realized by the CPU of a computer that operates according to a program (an angle-of-view variation detection program).
- For example, the program may be stored in a storage unit (not shown) of the angle-of-view variation determination means 13, and the CPU may read the program and operate as the angle-of-view variation detection means 101, the feature amount extraction means 103, and the feature amount tracking means 104 according to the program.
- Alternatively, each of the angle-of-view variation detection means 101, the feature amount extraction means 103, and the feature amount tracking means 104 may be realized by dedicated hardware.
- The fixed point information storage means 102 is realized by, for example, a magnetic disk.
- FIG. 3 is a flowchart illustrating an operation example of the angle-of-view variation detection device according to the first embodiment.
- The video acquisition means 10 inputs the captured video to the angle-of-view variation determination means 13 (step S1).
- The angle-of-view variation detection means 101 detects the angle-of-view variation of the video acquisition means 10 from the video captured by the video acquisition means 10 based on the fixed point information (step S2). For example, the angle-of-view variation detection means 101 may count the number of fixed points whose feature amounts at the same positions in the input video substantially match the fixed point feature amounts, and determine that the angle of view has changed when the number of matching fixed points is less than a certain number.
- Feature amount extraction means 103 extracts feature points from the input video (step S3).
- The feature amount tracking means 104 associates the fixed points with the feature points and calculates the angle-of-view variation parameter. The feature amount tracking means 104 then determines whether or not the angle-of-view variation parameter has been stable for a certain period (step S4). When the angle-of-view variation parameter has not been stable for a certain period (N in step S4), the processing from step S1 onward for detecting the angle-of-view variation is repeated on the input video.
- When the angle-of-view variation parameter has been stable for a certain period (Y in step S4), the feature amount tracking means 104 determines that the angle of view of the video acquisition means 10 has changed, and inputs angle-of-view variation information including the angle-of-view variation parameter to the camera control information generation means 14 (step S5). The camera control information generation means 14 then generates camera control information based on the input angle-of-view variation information and inputs it to the video acquisition means 10 (step S6). The video acquisition means 10 adjusts the angle of view based on the input camera control information and performs processing to return to the pre-change state (step S7).
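- A hedged end-to-end sketch of the loop in FIG. 3 (steps S1 to S7) is given below; it wires together the earlier sketches (angle_of_view_changed, estimate_shift, parameter_stable, pan_tilt_from_shift), while describe, extract_feature_points, match_to_fixed_points, and the camera object are hypothetical placeholders for the means described in the text, not APIs defined by the patent.
```python
# Placeholder names: describe, extract_feature_points, match_to_fixed_points, camera.
history = []   # per-frame angle-of-view variation parameters

def process_frame(frame, fixed_points, camera):
    # S2: no variation detected -> reset and wait for the next frame
    if not angle_of_view_changed(lambda pos: describe(frame, pos), fixed_points):
        history.clear()
        return
    feats = extract_feature_points(frame)                  # S3
    pairs = match_to_fixed_points(feats, fixed_points)
    history.append(estimate_shift(pairs))                  # S4: track the parameter
    if parameter_stable(history):                          # S5: variation is stable
        dx, dy = history[-1]
        pan, tilt = pan_tilt_from_shift(dx, dy, camera.zoom)   # S6
        camera.move(pan, tilt)                              # S7: restore the angle of view
        history.clear()
```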
- As described above, the angle-of-view variation detection means 101 detects the angle-of-view variation of the video acquisition means 10 from the video captured by the video acquisition means 10 based on the fixed point information. The feature amount tracking means 104 then determines that the angle of view of the video acquisition means 10 has changed when the change in the angle of view is stable for a certain period. Therefore, the change in the angle of view can be detected appropriately in accordance with the shooting environment of the imaging device.
- A camera having a pan/tilt/zoom function is controlled to return to its original angle when the power is turned on again.
- However, the accuracy of this control varies depending on the camera. Therefore, the accuracy of returning to the original angle can be improved by applying the angle-of-view variation detection device of this embodiment to a camera having a pan/tilt/zoom function.
- In the first embodiment described above, the video acquisition means 10 is an imaging device whose angle of view can be controlled from the outside. That is, in the first embodiment, the video acquisition means 10 adjusts the angle of view based on the camera control information generated by the camera control information generation means 14 and returns the image to the state before the change.
- FIG. 4 is a block diagram illustrating a configuration example of a modification of the angle-of-view variation detection device of the first embodiment.
- The angle-of-view variation detection device of this modification includes video acquisition means 20, position calculation means 11, camera calibration information storage means 22, angle-of-view variation determination means 13, and camera calibration information generation means 24.
- The video acquisition means 20 captures video and inputs it to the position calculation means 11 and the angle-of-view variation determination means 13.
- The video acquisition means 20 differs from the video acquisition means 10 of the first embodiment in that it is an imaging device whose angle of view cannot be controlled from the outside.
- For example, the video acquisition means 20 may be realized not only by a camera but also by a VTR (Video Tape Recorder) or a hard disk recorder that plays back recorded video and acquires the played-back video.
- The camera calibration information generation means 24 generates the camera calibration information used by the position calculation means 11 and stores it in the camera calibration information storage means 22. The camera calibration information generation means 24 may use any method to generate the camera calibration information.
- For example, the camera calibration information generation means 24 may, like the camera control information generation means 14, store in advance the pixel movement amounts corresponding to predetermined zoom, pan, and tilt amounts, and calculate the movement amount from the apparent movement of the video. Since methods for calculating camera calibration information are widely known, detailed description is omitted here.
- By updating the camera parameters in this way, the three-dimensional (real-space) position can still be calculated appropriately from the two-dimensional coordinates of the image.
- In this modification, the feature amount tracking means 104 may determine whether or not the calculated angle-of-view variation parameter is within the range of normal angle-of-view variation. The feature amount tracking means 104 may then instruct the camera calibration information generation means 24 to generate the camera calibration information when the parameter is within the normal range. On the other hand, when the parameter is not within the normal range, the position calculation means 11 may be unable to return to the pre-change state based on the camera calibration information. In this case, the feature amount tracking means 104 may output only an alert indicating that a change in the angle of view has occurred and prompt the user to perform recalibration manually.
- Embodiment 2. Next, a second embodiment of the angle-of-view variation detection device according to the present invention will be described.
- The configuration of the angle-of-view variation detection device of this embodiment is the same as the configuration illustrated in FIG. 1, but the content of the angle-of-view variation determination means 13 differs from that of the first embodiment.
- FIG. 5 is a block diagram illustrating a configuration example of the angle-of-view variation determination means 13 of the second embodiment.
- The angle-of-view variation determination means 13 of this embodiment includes angle-of-view variation detection means 101, fixed point information storage means 102, feature amount extraction means 103, feature amount tracking means 104, and fixed point detection means 204.
- The same components as those in FIG. 2 are denoted by the same reference numerals, and description thereof is omitted.
- In this embodiment, the feature amounts of the feature points calculated by the feature amount extraction means 103 are also input to the fixed point detection means 204.
- Here, the positions of the feature points and the feature amounts of the feature points input from the feature amount extraction means 103 are collectively referred to as feature point information.
- The fixed point detection means 204 detects, as fixed points, feature points that remain continuously at the same position for a predetermined time among the feature points extracted by the feature amount extraction means 103.
- Specifically, the fixed point detection means 204 extracts, as fixed point candidates, the feature point information of those feature points that were not matched with any fixed point among the feature points included in the feature point information input from the feature amount extraction means 103, and retains them.
- When new feature point information is input, the fixed point detection means 204 compares it with the retained fixed point candidates and determines whether or not the extracted feature points continue to exist at the same positions. Whether a feature point continues to exist at the same position can be determined based on the change in the feature amount at the corresponding point.
- At this time, the fixed point detection means 204 may update the feature amount of a feature point detected as existing at the same position.
- The fixed point detection means 204 may also count the number of times a feature point is detected within a certain period, and detect the feature point as a fixed point when the number of detections within that period exceeds a predetermined threshold.
- The fixed point detection means 204 stores the feature point information of a feature point detected as a fixed point in the fixed point information storage means 102. That is, the fixed point detection means 204 updates the fixed point information stored in the fixed point information storage means 102.
- When the fixed point detection means 204 counts the number of detections within the period, it may use the number of detections as a reliability indicating how likely the feature point is to be a fixed point, and store the reliability in association with the feature point information.
- The reliability is used, for example, as a weight when the feature amount tracking means 104 collates fixed points with feature points.
- For example, the feature amount tracking means 104 may preferentially select pairs with high reliability when selecting pairs of a fixed point and a feature point. In this way, highly accurate fixed point information can be used preferentially.
- The fixed point detection means 204 may also record the detection time of each feature point and store it in the fixed point information storage means 102.
- In this case, when the elapsed time since a feature point was last detected exceeds a certain time, the fixed point detection means 204 may decide that the feature point is no longer a fixed point and delete its feature point information. In this way, highly accurate fixed point information can be maintained.
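- A minimal sketch of this fixed-point learning with detection counts and expiry follows; the promotion threshold, the expiry time, the key quantization, and the dictionary layout are assumptions made for the example.
```python
import time

candidates = {}   # (x, y) -> {"desc": ..., "count": int, "last_seen": float}

def update_fixed_points(feature_points, fixed_points,
                        promote_after=100, expire_after=3600.0):
    """feature_points: iterable of ((x, y), descriptor) from the current frame.
    fixed_points: dict of promoted fixed points, keyed by quantized position."""
    now = time.time()
    for (x, y), desc in feature_points:
        key = (round(x), round(y))
        entry = candidates.setdefault(key, {"desc": desc, "count": 0, "last_seen": now})
        entry["count"] += 1
        entry["last_seen"] = now
        if entry["count"] >= promote_after:   # seen often enough -> treat as a fixed point
            fixed_points[key] = {"desc": desc, "reliability": entry["count"]}
    # drop points that have not been re-detected for a long time
    stale = [k for k, v in candidates.items() if now - v["last_seen"] > expire_after]
    for key in stale:
        candidates.pop(key)
        fixed_points.pop(key, None)
```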
- Embodiment 3. Next, a third embodiment of the angle-of-view variation detection device according to the present invention will be described.
- The configuration of the angle-of-view variation detection device of this embodiment is the same as the configuration illustrated in FIG. 1, but the content of the angle-of-view variation determination means 13 differs from that of the first embodiment.
- In this embodiment, the fixed points are switched according to the situation in which the imaging device captures images.
- FIG. 6 is a block diagram illustrating a configuration example of the angle-of-view variation determination means 13 of the third embodiment.
- The angle-of-view variation determination means 13 of this embodiment includes angle-of-view variation detection means 101, fixed point information storage means 302, feature amount extraction means 103, feature amount tracking means 104, and situation determination means 306.
- The same components as those in FIG. 2 are denoted by the same reference numerals, and description thereof is omitted.
- The situation determination means 306 determines the situation in which the imaging device is capturing images. Specifically, the situation determination means 306 determines the current shooting situation of the imaging device based on the video captured by the imaging device, for example from the brightness of the entire video and the time. For example, the situation determination means 306 may determine the time zone in which the image was taken (morning, noon, night, etc.) from the brightness of the entire video and the time. The situation determination means 306 stores information identifying the situation (hereinafter referred to as situation identification information) in the fixed point information storage means 302.
- As the situation identification information, for example, an index identifying the situation is used.
- For example, the situation identification information may be defined such that an index of 1 indicates daytime and an index of 2 indicates nighttime.
- When the situation cannot be determined uniquely, the situation determination means 306 may generate situation identification information indicating both situations. In this case, the situation determination means 306 may quantify the probability of each situation and include the value in the situation identification information as reliability information.
- The fixed point information storage means 302 stores, as fixed point information, not only the position of each fixed point and the feature amount indicating its features but also information indicating the situation in which the fixed point is used.
- The situation in which a fixed point is used may be information indicating the situation in which the fixed point was specified, or information indicating the situation to which the fixed point is to be applied. For a fixed point applicable to a plurality of situations, the reliability for each situation may be used as the information indicating the situation in which the fixed point is used.
- The fixed point information storage means 302 may also store, as a group, fixed points that are used in common and fixed points for each situation. In this case, the fixed points for each situation are added to the commonly used fixed points according to the situation and are used together.
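- A short sketch of selecting fixed points by situation identification information is given below; the index values and the data layout are illustrative assumptions.
```python
DAY, NIGHT = 1, 2   # assumed situation identification indices

fixed_point_db = [
    {"pos": (120, 80),  "desc": "...", "situations": {DAY, NIGHT}},  # commonly used point
    {"pos": (430, 210), "desc": "...", "situations": {DAY}},          # daytime only
    {"pos": (55, 300),  "desc": "...", "situations": {NIGHT}},        # visible only when lit at night
]

def select_fixed_points(situation):
    """Return the common fixed points plus those registered for this situation."""
    return [fp for fp in fixed_point_db if situation in fp["situations"]]

print(len(select_fixed_points(NIGHT)))   # -> 2
```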
- In this embodiment, the angle-of-view variation detection means 101 selects the fixed point information to be used based on the situation identification information, and detects from the input video, based on the selected fixed point information, whether or not the angle of view of the video acquisition means 10 has changed.
- Similarly, the feature amount tracking means 104 selects the fixed point information to be used based on the situation identification information, and calculates the angle-of-view variation parameter for converting the position of each feature point to the position of the selected fixed point.
- The effectiveness of a fixed point specified from the angle of view of the imaging device varies depending on the time zone, such as day, night, or dawn.
- Since in this embodiment the fixed points are selected according to the situation in which the imaging device is shooting, the accuracy of detecting the angle-of-view variation can be further improved.
- Note that the situation determination means of this embodiment may also be applied to the angle-of-view variation determination means 13 illustrated in the second embodiment.
- Embodiment 4. Next, a fourth embodiment of the angle-of-view variation detection device according to the present invention will be described.
- The configuration of the angle-of-view variation detection device of this embodiment is the same as the configuration illustrated in FIG. 1, but the content of the angle-of-view variation determination means 13 differs from that of the first embodiment.
- FIG. 7 is a block diagram illustrating a configuration example of the angle-of-view variation determination means 13 according to the fourth embodiment.
- The angle-of-view variation determination means 13 of this embodiment includes angle-of-view variation detection means 101, fixed point information storage means 102, feature amount extraction means 103, feature amount tracking means 104, and fixed point learning means 406.
- The same components as those in FIG. 2 are denoted by the same reference numerals, and description thereof is omitted.
- The fixed point learning means 406 automatically extracts fixed points from the background image during a time period in which it is determined that no foreground exists.
- For example, the fixed point learning means 406 may determine that there is no foreground when the video does not change between frames for a certain period, or may identify a time zone with no foreground according to a manual instruction.
- The fixed point learning means 406 extracts fixed points using a method similar to that by which the fixed point detection means 204 of the second embodiment detects fixed points, and stores them in the fixed point information storage means 102. Therefore, highly accurate fixed point information can be acquired.
- FIG. 8 is a block diagram showing an outline of the angle-of-view variation detecting device according to the present invention.
- The angle-of-view variation detection device according to the present invention includes angle-of-view variation detection means 81 (for example, the angle-of-view variation detection means 101) that detects a variation in the angle of view of an imaging device (for example, the video acquisition means 10) from video captured by the imaging device, based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point, and state determination means 82 (for example, the feature amount tracking means 104) that determines that the angle of view of the imaging device has changed when the change in the angle of view is stable for a certain period.
- The angle-of-view variation detection device may include feature point extraction means (for example, the feature amount extraction means 103) that extracts feature points from the video captured by the imaging device, and parameter calculation means that calculates an angle-of-view variation parameter, which is a parameter for converting the position of each feature point into the position of the corresponding fixed point.
- In this case, the state determination means 82 may determine that the angle of view of the imaging device has changed when the change in the angle-of-view variation parameter between frames of the video captured by the imaging device is stable for a certain period.
- Specifically, the feature point extraction means may calculate feature amounts of the feature points extracted from the video captured by the imaging device, and the parameter calculation means may calculate the angle-of-view variation parameter by determining that a fixed point corresponds to a feature point when the distance between the feature amount of the feature point and the feature amount of the fixed point is equal to or less than a predetermined threshold, or when the similarity between the two feature amounts is equal to or greater than a predetermined threshold.
- The angle-of-view variation detection device may include fixed point detection means (for example, the fixed point detection means 204) that detects, as a fixed point, a feature point that remains at the same position for a predetermined time among the feature points extracted by the feature point extraction means. In this case, the accuracy of the fixed points to be used can be improved.
- The angle-of-view variation detection means 81 may calculate the number of fixed points whose feature amounts substantially match the feature amounts at the same positions in the video input from the imaging device, and determine that the angle of view has changed when the number of matching fixed points is less than a predetermined number.
- The angle-of-view variation detection device may include fixed point storage means (for example, the fixed point information storage means 102) that stores the fixed point information. The angle-of-view variation detection means 81 may then detect the angle-of-view variation of the imaging device based on the stored fixed point information.
- The angle-of-view variation detection device may further include situation determination means (for example, the situation determination means 306) that determines the situation in which the imaging device captures images.
- In this case, the fixed point storage means stores fixed point information including information indicating the situation in which each fixed point is used, and the angle-of-view variation detection means 81 may select the fixed point information corresponding to the determined situation from the stored fixed point information and detect the angle-of-view variation of the imaging device from the video captured by the imaging device based on the selected fixed point information.
- The present invention is suitably applied to detecting a change in the angle of view of an imaging device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
FIG. 1 is a block diagram showing a configuration example of the first embodiment of the angle-of-view variation detection device according to the present invention. The angle-of-view variation detection device of this embodiment includes video acquisition means 10, position calculation means 11, camera calibration information storage means 12, angle-of-view variation determination means 13, and camera control information generation means 14.
Next, a second embodiment of the angle-of-view variation detection device according to the present invention will be described. The configuration of the angle-of-view variation detection device of this embodiment is the same as the configuration illustrated in FIG. 1, but the content of the angle-of-view variation determination means 13 differs from that of the first embodiment.
Next, a third embodiment of the angle-of-view variation detection device according to the present invention will be described. The configuration of the angle-of-view variation detection device of this embodiment is the same as the configuration illustrated in FIG. 1, but the content of the angle-of-view variation determination means 13 differs from that of the first embodiment. In the angle-of-view variation detection device of this embodiment, the fixed points are switched according to the situation in which the imaging device captures images.
Next, a fourth embodiment of the angle-of-view variation detection device according to the present invention will be described. The configuration of the angle-of-view variation detection device of this embodiment is the same as the configuration illustrated in FIG. 1, but the content of the angle-of-view variation determination means 13 differs from that of the first embodiment.
11 Position calculation means
12, 22 Camera calibration information storage means
13 Angle-of-view variation determination means
14 Camera control information generation means
24 Camera calibration information generation means
101 Angle-of-view variation detection means
102, 302 Fixed point information storage means
103 Feature amount extraction means
104 Feature amount tracking means
204 Fixed point detection means
306 Situation determination means
406 Fixed point learning means
Claims (11)
- 1. An angle-of-view variation detection device comprising: angle-of-view variation detection means for detecting a variation in the angle of view of an imaging device from video captured by the imaging device, based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point; and state determination means for determining that the angle of view of the imaging device has changed when the change in the angle of view is stable for a certain period.
- 2. The angle-of-view variation detection device according to claim 1, further comprising: feature point extraction means for extracting feature points from the video captured by the imaging device; and parameter calculation means for calculating an angle-of-view variation parameter, which is a parameter for converting the position of each feature point into the position of the corresponding fixed point, wherein the state determination means determines that the angle of view of the imaging device has changed when the change in the angle-of-view variation parameter between frames of the video captured by the imaging device is stable for a certain period.
- 3. The angle-of-view variation detection device according to claim 2, wherein the feature point extraction means calculates feature amounts of the feature points extracted from the video captured by the imaging device, and the parameter calculation means calculates the angle-of-view variation parameter by determining that a fixed point corresponds to a feature point when the distance between the feature amount of the feature point and the feature amount of the fixed point is equal to or less than a predetermined threshold, or the similarity between the feature amount of the feature point and the feature amount of the fixed point is equal to or greater than a predetermined threshold.
- 4. The angle-of-view variation detection device according to claim 2 or 3, further comprising fixed point detection means for detecting, as a fixed point, a feature point that remains continuously at the same position for a predetermined time among the feature points extracted by the feature point extraction means.
- 5. The angle-of-view variation detection device according to any one of claims 1 to 4, wherein the angle-of-view variation detection means calculates the number of fixed points whose feature amounts substantially match the feature amounts at the same positions in the video input from the imaging device, and determines that the angle of view has changed when the number of matching fixed points is not equal to or greater than a certain number.
- 6. The angle-of-view variation detection device according to any one of claims 1 to 5, further comprising fixed point storage means for storing the fixed point information, wherein the angle-of-view variation detection means detects the angle-of-view variation of the imaging device based on the stored fixed point information.
- 7. The angle-of-view variation detection device according to claim 6, further comprising situation determination means for determining the situation in which the imaging device captures images, wherein the fixed point storage means stores fixed point information including information indicating the situation in which each fixed point is used, and the angle-of-view variation detection means selects the fixed point information corresponding to the determined situation from the fixed point information stored in the fixed point storage means and detects the angle-of-view variation of the imaging device from the video captured by the imaging device based on the selected fixed point information.
- 8. An angle-of-view variation detection method comprising: detecting a variation in the angle of view of an imaging device from video captured by the imaging device, based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point; and determining that the angle of view of the imaging device has changed when the change in the angle of view is stable for a certain period.
- 9. The angle-of-view variation detection method according to claim 8, comprising: extracting feature points from the video captured by the imaging device; calculating an angle-of-view variation parameter, which is a parameter for converting the position of each feature point into the position of the corresponding fixed point; and determining that the angle of view of the imaging device has changed when the change in the angle-of-view variation parameter between frames of the video captured by the imaging device is stable for a certain period.
- 10. An angle-of-view variation detection program for causing a computer to execute: an angle-of-view variation detection process of detecting a variation in the angle of view of an imaging device from video captured by the imaging device, based on fixed point information including the position of a fixed point specified from the angle of view of the imaging device and a feature amount indicating a feature of the fixed point; and a state determination process of determining that the angle of view of the imaging device has changed when the change in the angle of view is stable for a certain period.
- 11. The angle-of-view variation detection program according to claim 10, causing the computer to execute: a feature point extraction process of extracting feature points from the video captured by the imaging device; and a parameter calculation process of calculating an angle-of-view variation parameter, which is a parameter for converting the position of each feature point into the position of the corresponding fixed point, wherein in the state determination process, the angle of view of the imaging device is determined to have changed when the change in the angle-of-view variation parameter between frames of the video captured by the imaging device is stable for a certain period.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/414,154 US9922423B2 (en) | 2012-07-12 | 2013-06-17 | Image angle variation detection device, image angle variation detection method and image angle variation detection program |
JP2014524621A JP6036824B2 (ja) | 2012-07-12 | 2013-06-17 | 画角変動検知装置、画角変動検知方法および画角変動検知プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-156772 | 2012-07-12 | ||
JP2012156772 | 2012-07-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014010174A1 true WO2014010174A1 (ja) | 2014-01-16 |
Family
ID=49915659
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/003762 WO2014010174A1 (ja) | 2012-07-12 | 2013-06-17 | 画角変動検知装置、画角変動検知方法および画角変動検知プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9922423B2 (ja) |
JP (1) | JP6036824B2 (ja) |
WO (1) | WO2014010174A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150269778A1 (en) * | 2014-03-20 | 2015-09-24 | Kabushiki Kaisha Toshiba | Identification device, identification method, and computer program product |
JP2016157268A (ja) * | 2015-02-24 | 2016-09-01 | 日本電信電話株式会社 | 画像照合装置、及び方法 |
WO2016191181A1 (en) * | 2015-05-26 | 2016-12-01 | Crown Equipment Corporation | Systems and methods for image capture device calibration for a materials handling vehicle |
US9921067B2 (en) | 2015-05-26 | 2018-03-20 | Crown Equipment Corporation | Systems and methods for materials handling vehicle odometry calibration |
CN112184664A (zh) * | 2020-09-27 | 2021-01-05 | 杭州依图医疗技术有限公司 | 一种椎骨检测方法及计算机设备 |
CN112215894A (zh) * | 2019-07-09 | 2021-01-12 | 杭州萤石软件有限公司 | 限位检测方法、装置、电子设备及可读存储介质 |
WO2022091182A1 (ja) * | 2020-10-26 | 2022-05-05 | 日本電気株式会社 | 撮像状況監視システム、撮像状況監視方法、及び記録媒体 |
JP7675369B2 (ja) | 2021-04-12 | 2025-05-13 | パナソニックIpマネジメント株式会社 | 監視装置および監視システム |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102185131B1 (ko) * | 2014-03-27 | 2020-12-01 | 삼성전자주식회사 | 썸네일 생성 방법 및 그 전자 장치 |
US10565712B2 (en) * | 2016-06-21 | 2020-02-18 | Canon Kabushiki Kaisha | Image processing apparatus and method for controlling the same |
CN106504270B (zh) * | 2016-11-08 | 2019-12-20 | 浙江大华技术股份有限公司 | 一种视频中目标物体的展示方法及装置 |
KR101969550B1 (ko) * | 2018-09-12 | 2019-04-16 | 한국지질자원연구원 | 사용자 생활 공간의 내진 취약성 분석 시스템 및 이를 이용한 사용자 생활 공간의 내진 취약성 분석 방법 |
CN110909823B (zh) * | 2019-12-03 | 2024-03-26 | 携程计算机技术(上海)有限公司 | 图片特征点提取及相似度的判断方法、系统、设备和介质 |
CN111083361A (zh) * | 2019-12-11 | 2020-04-28 | 维沃移动通信有限公司 | 图像获取方法及电子设备 |
CN112135122A (zh) * | 2020-09-21 | 2020-12-25 | 北京百度网讯科技有限公司 | 用于监测成像设备的方法、装置、电子设备以及路侧设备 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004343737A (ja) * | 2003-04-24 | 2004-12-02 | Sumitomo Electric Ind Ltd | 画像補正方法、プログラム及び装置 |
JP2009267466A (ja) * | 2008-04-22 | 2009-11-12 | Mega Chips Corp | 動体検知機能付監視カメラ |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030012410A1 (en) * | 2001-07-10 | 2003-01-16 | Nassir Navab | Tracking and pose estimation for augmented reality using real features |
US7688381B2 (en) * | 2003-04-08 | 2010-03-30 | Vanbree Ken | System for accurately repositioning imaging devices |
US7616807B2 (en) * | 2005-02-24 | 2009-11-10 | Siemens Corporate Research, Inc. | System and method for using texture landmarks for improved markerless tracking in augmented reality applications |
JP4906628B2 (ja) | 2007-08-01 | 2012-03-28 | 富士重工業株式会社 | 監視用カメラの補正装置 |
JP2009122859A (ja) * | 2007-11-13 | 2009-06-04 | Toyota Motor Corp | 物体検出装置 |
GB0810427D0 (en) * | 2008-06-07 | 2008-07-09 | El Electronic Ideas Ltd | Levelling apparatus |
EP2339537B1 (en) * | 2009-12-23 | 2016-02-24 | Metaio GmbH | Method of determining reference features for use in an optical object initialization tracking process and object initialization tracking method |
US8755633B2 (en) * | 2012-03-14 | 2014-06-17 | Sony Corporation | Automated synchronized navigation system for digital pathology imaging |
-
2013
- 2013-06-17 WO PCT/JP2013/003762 patent/WO2014010174A1/ja active Application Filing
- 2013-06-17 JP JP2014524621A patent/JP6036824B2/ja active Active
- 2013-06-17 US US14/414,154 patent/US9922423B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004343737A (ja) * | 2003-04-24 | 2004-12-02 | Sumitomo Electric Ind Ltd | 画像補正方法、プログラム及び装置 |
JP2009267466A (ja) * | 2008-04-22 | 2009-11-12 | Mega Chips Corp | 動体検知機能付監視カメラ |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150269778A1 (en) * | 2014-03-20 | 2015-09-24 | Kabushiki Kaisha Toshiba | Identification device, identification method, and computer program product |
JP2016157268A (ja) * | 2015-02-24 | 2016-09-01 | 日本電信電話株式会社 | 画像照合装置、及び方法 |
KR102490138B1 (ko) * | 2015-05-26 | 2023-01-19 | 크라운 이큅먼트 코포레이션 | 자재 취급 차량을 위한 이미지 캡처 디바이스 교정을 위한 시스템들 및 방법들 |
KR20180012287A (ko) * | 2015-05-26 | 2018-02-05 | 크라운 이큅먼트 코포레이션 | 자재 취급 차량을 위한 이미지 캡처 디바이스 교정을 위한 시스템들 및 방법들 |
US9921067B2 (en) | 2015-05-26 | 2018-03-20 | Crown Equipment Corporation | Systems and methods for materials handling vehicle odometry calibration |
US10455226B2 (en) | 2015-05-26 | 2019-10-22 | Crown Equipment Corporation | Systems and methods for image capture device calibration for a materials handling vehicle |
WO2016191181A1 (en) * | 2015-05-26 | 2016-12-01 | Crown Equipment Corporation | Systems and methods for image capture device calibration for a materials handling vehicle |
CN112215894A (zh) * | 2019-07-09 | 2021-01-12 | 杭州萤石软件有限公司 | 限位检测方法、装置、电子设备及可读存储介质 |
CN112184664A (zh) * | 2020-09-27 | 2021-01-05 | 杭州依图医疗技术有限公司 | 一种椎骨检测方法及计算机设备 |
CN112184664B (zh) * | 2020-09-27 | 2023-05-26 | 杭州依图医疗技术有限公司 | 一种椎骨检测方法及计算机设备 |
WO2022091182A1 (ja) * | 2020-10-26 | 2022-05-05 | 日本電気株式会社 | 撮像状況監視システム、撮像状況監視方法、及び記録媒体 |
JP7521590B2 (ja) | 2020-10-26 | 2024-07-24 | 日本電気株式会社 | 撮像状況監視システム、撮像状況監視方法、及び記録媒体 |
JP7675369B2 (ja) | 2021-04-12 | 2025-05-13 | パナソニックIpマネジメント株式会社 | 監視装置および監視システム |
Also Published As
Publication number | Publication date |
---|---|
US20150146988A1 (en) | 2015-05-28 |
US9922423B2 (en) | 2018-03-20 |
JPWO2014010174A1 (ja) | 2016-06-20 |
JP6036824B2 (ja) | 2016-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6036824B2 (ja) | 画角変動検知装置、画角変動検知方法および画角変動検知プログラム | |
JP5001260B2 (ja) | オブジェクト追跡方法及びオブジェクト追跡装置 | |
JP6428266B2 (ja) | 色補正装置、色補正方法および色補正用プログラム | |
KR101071352B1 (ko) | 좌표맵을 이용한 팬틸트줌 카메라 기반의 객체 추적 장치 및 방법 | |
US9098748B2 (en) | Object detection apparatus, object detection method, monitoring camera system and storage medium | |
KR101910542B1 (ko) | 객체 검출을 위한 영상분석 서버장치 및 방법 | |
JP7334432B2 (ja) | 物体追跡装置、監視システムおよび物体追跡方法 | |
KR102144394B1 (ko) | 영상 정합 장치 및 이를 이용한 영상 정합 방법 | |
KR102002812B1 (ko) | 객체 검출을 위한 영상분석 서버장치 및 방법 | |
JP2004227160A (ja) | 侵入物体検出装置 | |
JP6551226B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6444283B2 (ja) | 姿勢判定装置 | |
US20220366570A1 (en) | Object tracking device and object tracking method | |
JP5279517B2 (ja) | 物体検知装置及び物体検知方法 | |
US20160210756A1 (en) | Image processing system, image processing method, and recording medium | |
US7982774B2 (en) | Image processing apparatus and image processing method | |
JP4999794B2 (ja) | 静止領域検出方法とその装置、プログラム及び記録媒体 | |
JP2020149642A (ja) | 物体追跡装置および物体追跡方法 | |
JP4578864B2 (ja) | 自動追尾装置及び自動追尾方法 | |
US11716448B2 (en) | Information processing apparatus, information processing method, and storage medium | |
JP2007156655A (ja) | 変動領域検出装置及びその方法 | |
KR20130058172A (ko) | 침입자의 얼굴감지 시스템 및 그 방법 | |
JP2020160901A (ja) | 物体追跡装置および物体追跡方法 | |
JP5947588B2 (ja) | 監視装置 | |
JP6752317B2 (ja) | 画像処理装置及び方法、並びに命令を格納する記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13817199 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014524621 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14414154 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13817199 Country of ref document: EP Kind code of ref document: A1 |