WO2025013285A1 - Object detection device - Google Patents
Object detection device
- Publication number
- WO2025013285A1 (PCT/JP2023/025889)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- vehicle
- width
- detection device
- height
- Prior art date
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- The present invention relates to an object detection device for vehicles.
- The technology of Patent Document 1 monitors the rear of a vehicle using a stereo camera and a monocular camera.
- The width (or height) of a following vehicle in the recognition range of the stereo camera is determined and stored, and when the following vehicle moves from the recognition range of the stereo camera into the recognition range of the monocular camera, the distance to the following vehicle is determined based on the stored width (or height) and the width (or height) of the following vehicle determined from the recognition result of the monocular camera.
- The width of a motorcycle seen from the front is narrower than that of an automobile, but when a motorcycle makes a lane change or other movement that makes its side visible to each sensor, the apparent change in vehicle width at each sensor tends to be greater than for an automobile.
- If the technology of Patent Document 1 is used to calculate distance based on the vehicle width from the recognition result of the monocular camera, the monocular camera may recognize the vehicle width as having changed significantly more than the vehicle width obtained from the recognition result of the stereo camera, and the distance to the object may be calculated as closer than the actual value. Conversely, if the monocular camera recognizes the vehicle width as having changed only slightly, the distance to the object may be calculated as farther than the actual value.
- In other words, the distance calculated by the method of Patent Document 1 may differ from the actual distance to the object, and there is room for improvement.
- This problem stems from a characteristic of monocular cameras, which are prone to distance errors caused by dimensional changes of the object in the image; it applies not only to the change in the object's width (vehicle width) noted above, but equally to cases where the sensor detects an apparent change in the object's height.
- The object of the present invention is to provide an object detection device that can prevent the calculated distance between the host vehicle and a detected object from deviating from the actual distance when distance calculation is handed over from a first sensor to a second sensor.
- The present application includes multiple means for solving the above problem. One example is an object detection device for a vehicle having at least one processor, wherein the processor calculates multiple feature amounts of an object from the detection result of a first sensor mounted on the vehicle, calculates the amounts of change in the multiple feature amounts obtained from the first sensor, determines, based on those amounts of change, a reference feature amount to be used in calculating the relative distance between the vehicle and the object from among the feature amounts of the object calculated from the detection result of a second sensor mounted on the vehicle, calculates the reference feature amount from the detection result of the second sensor when the object moves from the recognition range of the first sensor to the recognition range of the second sensor, and calculates the relative distance based on the reference feature amount and the feature amount corresponding to the reference feature amount among the multiple feature amounts obtained from the first sensor.
- The present invention makes it possible to prevent the calculated distance between the vehicle and a detected object from deviating from the actual distance when distance calculation is handed over from the first sensor to the second sensor.
- FIG. 1 is a configuration diagram including an object detection device according to a first embodiment of the present invention and its peripheral devices.
- FIG. 2 is a diagram showing an example of the locations where a stereo camera 101 and a monocular camera 102 for monitoring the rear are installed.
- FIG. 3 is a software configuration diagram of the ECU 105.
- FIG. 4 is an explanatory diagram of a two-dimensional bounding box.
- FIG. 5 shows a processing flow of rear monitoring executed by the ECU 105 (processor 105a) according to the first embodiment.
- FIG. 6 is a diagram illustrating the operation of rear monitoring according to the first embodiment.
- FIG. 7 shows a processing flow of rear monitoring executed by the ECU 105 (processor 105a) according to the second embodiment.
- FIG. 8 is a diagram illustrating an example of a weighting map.
- FIG. 9 is a diagram illustrating an example of a weighting map based on the type of the following vehicle.
- FIG. 10 is a diagram illustrating an example of a weighting map based on the relative distance to the following vehicle.
- FIG. 11 is an explanatory diagram of a three-dimensional bounding box.
- FIG. 12 is a diagram showing a processing flow of rear monitoring according to the third embodiment.
- FIG. 13 is a diagram showing an example of the locations where a stereo camera 101 and a monocular camera 102 for monitoring the forward direction are installed.
- FIG. 14 is a diagram illustrating the operation of forward monitoring according to the fourth embodiment.
- The subject of the embodiments is a detection device for objects, such as automobiles and motorcycles, that approach the host vehicle while it is traveling on a road.
- FIG. 1 is a block diagram including an object detection device according to a first embodiment of the present invention and its peripheral devices.
- The object detection device according to this embodiment includes a first sensor 101, a second sensor 102, a first processing circuit 103, a second processing circuit 104, and an electronic control unit (hereinafter sometimes referred to as ECU) 105.
- The ECU 105 can be connected to a vehicle control device (another ECU) 107 and an alarm device (monitor, speaker, etc.) 108 via a CAN interface (CAN IF) 106.
- The first sensor 101 is a sensor mounted on the host vehicle to monitor the area behind the host vehicle, and is any one of a stereo camera, a radar, a sonar, a LiDAR, and a monocular camera.
- In this embodiment, the first sensor 101 is a stereo camera.
- The camera image of the stereo camera 101 is image-processed by the first processing circuit 103 and input to the ECU 105.
- The second sensor 102 is a sensor mounted on the host vehicle to monitor the rear side of the host vehicle, and is preferably a monocular camera.
- The camera image of the monocular camera 102 is image-processed by the second processing circuit 104 and input to the ECU 105.
- The ECU 105 includes a memory 105b, which is a data storage device, and at least one processor 105a that executes various processes according to programs stored in the memory 105b; within the ECU 105, information on an object (for example, another vehicle) approaching the host vehicle is processed using the processing results of the first processing circuit 103 and the second processing circuit 104 (in other words, the detection results of the first sensor 101 and the second sensor 102).
- This information includes relative distance data between the host vehicle and the object.
- Information on an object approaching from the rear side of the host vehicle is sent to the CAN bus via the CAN IF 106 and output to the vehicle control device 107 and the alarm device 108.
- Here, a system including signal input units (not shown) that receive the signals from the stereo camera 101 and the monocular camera 102, the first processing circuit 103, the second processing circuit 104, the ECU 105, and the CAN IF 106 is shown as an example, but these components are not necessarily contained within a single unit.
- For example, the cameras 101 and 102 and their corresponding processing circuits 103 and 104 may each exist as separate units connected by signal lines to some control device that includes the ECU 105.
- Alternatively, the first processing circuit 103 and the second processing circuit 104 may be contained within the ECU 105, or the first processing circuit 103, the second processing circuit 104, and the ECU 105 may be contained within the vehicle control device 107.
- FIG. 2 shows an example of the installation locations of the stereo camera 101 and the monocular camera 102 for rear monitoring.
- The stereo camera 101 can be installed, for example, on the rear bumper to monitor the area behind the host vehicle 201.
- The monocular camera 102 can be installed, for example, on a door mirror to monitor the rear side of the host vehicle.
- Area 202-S in the figure indicates the recognition area of the stereo camera 101, and area 203-M indicates the recognition area of the monocular camera 102.
- Here, a recognition area may be the camera viewing angle (angle of view), or it may be an area within the camera viewing angle that can be processed by the image processing unit.
- For the stereo camera 101, the range that can be viewed in stereo is the shaded area 202-S where the fields of view of the left and right cameras overlap.
- The stereo camera 101 may also be installed inside the host vehicle 201.
- For example, it may be installed on the ceiling on the rear-seat side so that it monitors the area behind the vehicle through the rear window. This prevents water droplets, mud, and the like from adhering to the stereo camera 101 in rainy weather.
- FIG. 3 is a diagram showing the software configuration of the ECU 105 inside the ECU 105.
- the ECU 105 can function as a first feature amount calculation unit 301, a second feature amount calculation unit 302, a sensor recognition area movement determination unit 303, an object distance estimation unit 304, and a collision risk determination unit 305 by executing software stored in the memory 105b with the processor 105a.
- The first feature amount calculation unit 301 assigns a recognition ID to a following vehicle traveling behind the host vehicle that is detected in the recognition area 202-S of the stereo camera 101, calculates multiple feature amounts (width and height) of the following vehicle from the detection result (camera image) of the stereo camera 101, and calculates the amounts of change in these feature amounts caused by the movement of the following vehicle.
- The first feature amount calculation unit 301 can also determine, from among the feature amounts of the following vehicle calculated from the detection result of the monocular camera 102, the feature amount (hereinafter sometimes referred to as the "reference feature amount") to be used in calculating the relative distance between the host vehicle and the following vehicle, based on the amounts of change in the multiple feature amounts caused by the movement of the following vehicle.
- The reference feature amount is a feature amount obtained by the monocular camera 102 that is used in calculating the relative distance when the following vehicle is present in the recognition area 203-M of the monocular camera 102.
- The feature amounts of the following vehicle can be calculated, for example, using bounding box detection, which is known as a general technique for detecting objects.
- The first feature amount calculation unit 301 uses deep learning or the like to assign, to an object in the camera images of the stereo camera 101 and the monocular camera 102, a frame line (called a two-dimensional bounding box) 402 displayed as the smallest rectangle circumscribing the object (following vehicle 401), as shown in FIG. 4.
- The first feature amount calculation unit 301 (processor 105a) can then calculate the width and height (feature amounts) of the following vehicle from the width and height of the two-dimensional bounding box.
- The first feature amount calculation unit 301 can also calculate the amounts of change in the width and height (feature amounts) of the following vehicle by monitoring the changes in the width and height of the two-dimensional bounding box caused by the movement of the following vehicle.
- The calculated feature amounts and their change amounts are used, for example, by the sensor recognition area movement determination unit 303 and the object distance estimation unit 304.
- Note that, instead of the width and height of the bounding box, the width and height of the following vehicle may be calculated from, for example, the edges of the following vehicle in the camera image of the stereo camera 101.
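- As an illustration of the bookkeeping described above, the following is a minimal sketch, not taken from the patent, of how the width/height feature amounts and their change amounts could be tracked per recognition ID from two-dimensional bounding boxes. The box format (x_min, y_min, x_max, y_max), the FeatureTracker helper, and the first-to-latest change metric are illustrative assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class FeatureTracker:
    """Hypothetical helper: width/height history per recognition ID."""
    history: dict = field(default_factory=dict)  # recognition_id -> [(w, h), ...]

    def update(self, recognition_id: int, box: tuple) -> tuple:
        """Store and return the feature amounts (width, height) of a 2-D
        bounding box given as (x_min, y_min, x_max, y_max)."""
        x_min, y_min, x_max, y_max = box
        w, h = x_max - x_min, y_max - y_min
        self.history.setdefault(recognition_id, []).append((w, h))
        return w, h

    def change_amounts(self, recognition_id: int) -> tuple:
        """Relative change of width and height between the first and the
        latest observation (one possible definition of the "amount of
        change"); assumes non-degenerate boxes (w0, h0 > 0)."""
        obs = self.history[recognition_id]
        (w0, h0), (w1, h1) = obs[0], obs[-1]
        return abs(w1 - w0) / w0, abs(h1 - h0) / h0
```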
- The second feature amount calculation unit 302 calculates multiple feature amounts (width and height) of a following vehicle traveling to the rear side of the host vehicle, detected in the recognition area 203-M of the monocular camera 102, from the detection result (camera image) of the monocular camera 102.
- The feature amounts can be calculated in the same way as in the first feature amount calculation unit 301, for example based on a two-dimensional bounding box.
- The calculated feature amounts include the reference feature amount.
- The calculated feature amount data is used, for example, by the sensor recognition area movement determination unit 303 and the object distance estimation unit 304.
- The sensor recognition area movement determination unit 303 determines whether the same following vehicle has moved from the recognition area 202-S of the stereo camera 101 to the recognition area 203-M of the monocular camera 102, and if it determines that the same following vehicle has moved, it assigns the following vehicle in the recognition area 203-M the same recognition ID as when it was in the recognition area 202-S (the recognition ID is inherited).
- When such a move is determined, the object distance estimation unit 304 calculates the relative distance between the following vehicle and the host vehicle based on the reference feature amount, used for the relative distance calculation, among the feature amounts of the following vehicle obtained from the monocular camera 102, and the feature amount corresponding to the reference feature amount among the multiple feature amounts of the following vehicle obtained from the stereo camera 101.
- The collision risk determination unit 305 determines from the relative distance calculated by the object distance estimation unit 304 whether there is a possibility of a collision between the host vehicle and the following vehicle, and transmits permission or prohibition of collision avoidance control by the vehicle control device 107 via the CAN IF 106 based on the determination result. If it determines that there is a possibility of a collision, it may also transmit a message to that effect to the alarm device 108.
- FIG. 5 shows the flow of the rear monitoring processing executed by the ECU 105 (processor 105a) according to the first embodiment.
- First, when the stereo camera 101 detects a vehicle following the host vehicle, the processor 105a assigns a recognition ID to the following vehicle and calculates the feature amounts (width and height) of the following vehicle based on the camera image of the stereo camera 101 (S501).
- The processor 105a then calculates, based on the camera image of the stereo camera 101, the amounts of change in these feature amounts (width and height) caused by movement of the following vehicle, such as a lane change (S502).
- Next, based on the amounts of change in the width and height of the following vehicle, the processor 105a determines the reference feature amount (that is, either the width or the height of the following vehicle) to be used in the relative distance calculation from among the width and height (feature amounts) of the following vehicle obtained from the monocular camera 102 in subsequent processing (S503).
- In this embodiment, the processor 105a selects as the reference feature amount the feature amount (width or height) with the smaller amount of change among the feature amounts of the following vehicle obtained from the stereo camera 101 up until just before the following vehicle leaves the recognition area 202-S of the stereo camera 101 (S503). Thereafter, the feature amount corresponding to the one selected in S503, among the feature amounts of the following vehicle obtained from the monocular camera 102, is used in the relative distance calculation in S506.
- For example, if it is determined in S503 that the height has the smaller amount of change among the feature amounts from the stereo camera 101, the height captured by the monocular camera 102 is used as the feature amount in the relative distance calculation in S506 using the image from the monocular camera 102.
- Note that the method described here of selecting the reference feature amount (the feature amount used when calculating the relative distance) from the image of the monocular camera 102 is only an example, and other selection methods may be used as long as the spirit of this embodiment is preserved. For example, if the amount of change in the width captured by the stereo camera 101 exceeds a predetermined threshold, the height captured by the monocular camera 102 may be selected and used as the reference feature amount.
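- A compact sketch of the two selection strategies named above (smallest change wins, or a threshold on the width change); the threshold value 0.3 is an illustrative assumption, not a value from the patent:

```python
def select_reference_feature(width_change: float, height_change: float,
                             width_threshold: float = 0.3) -> str:
    """Return which monocular-camera feature amount to use in S506.

    Threshold variant: if the width seen by the stereo camera changed more
    than a predetermined threshold, fall back to height. Otherwise, pick
    the feature amount with the smaller amount of change.
    """
    if width_change > width_threshold:  # threshold variant
        return "height"
    return "width" if width_change <= height_change else "height"  # smallest change
```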
- Next, the processor 105a determines, from the recognition ID of the following vehicle, whether the following vehicle has moved out of the recognition area 202-S of the stereo camera 101 and into the recognition area 203-M of the monocular camera 102 (S504), and proceeds to S505 if it is determined that the following vehicle has moved into the recognition area 203-M.
- After that determination, the processor 105a calculates the feature amounts (width and height) of the following vehicle based on the camera image of the monocular camera 102 (S505). Since the feature amount used for the relative distance calculation has already been determined in S503, only the determined feature amount may be calculated from the camera image of the monocular camera 102 here.
- The processor 105a then calculates the relative distance to the following vehicle (object) located to the rear side using the feature amount calculated in S505 that corresponds to the feature amount determined in S503 and one of the following formulas (1) and (2) (S506). Which formula is used depends on the feature amount determined in S503: formula (1) is used if the determined feature amount is "width", and formula (2) is used if it is "height".
- The following formula (1) is the formula for calculating the relative distance (first relative distance) dw when the feature amount determined in S503 is "width".
- dw is the relative distance (first relative distance) from the host vehicle to the following vehicle (object),
- f is the focal length of the monocular camera 102,
- W is the width of the following vehicle (object) calculated by the stereo camera 101, and
- Δx is the width of the following vehicle (object) in the image of the monocular camera 102.
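- The formula itself appears only as an image in the source and is not reproduced here. Given the variable definitions above, formula (1) is presumably the standard pinhole-camera relation:

$$d_w = \frac{f \cdot W}{\Delta x} \qquad \text{(1)}$$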
- The following formula (2) is the formula for calculating the relative distance (second relative distance) dh when the feature amount determined in S503 is "height".
- dh is the relative distance (second relative distance) from the host vehicle to the following vehicle (object),
- f is the focal length of the monocular camera 102,
- H is the height of the following vehicle (object) calculated by the stereo camera 101, and
- Δy is the height of the following vehicle (object) in the image of the monocular camera 102.
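- Likewise, under the same pinhole-model assumption, formula (2) presumably reads:

$$d_h = \frac{f \cdot H}{\Delta y} \qquad \text{(2)}$$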
- Next, the processor 105a determines, from the calculation results of the relative distances dw, dh between the host vehicle and the rear-side vehicle, whether there is a possibility that the rear-side vehicle will collide with the host vehicle (S507). If it is determined that there is a possibility of a collision, a decision to permit control intervention for collision avoidance is made and transmitted (S508). If it is determined that there is no possibility of a collision, a decision to prohibit control intervention for collision avoidance is made and transmitted (S509).
- (Operation) FIG. 6 is a diagram illustrating the operation of rear monitoring according to this embodiment.
- Reference numerals 401-404 in the diagram indicate positions of a vehicle (motorcycle) following the host vehicle 201; the following vehicle moves from position 401 to position 404 by changing lanes.
- First, the stereo camera 101 detects the following vehicle at position 401 behind the host vehicle, and the processor 105a (first feature amount calculation unit 301) calculates the feature amounts (width and height) of the following vehicle and their amounts of change from position 401 through position 402 until the following vehicle leaves the recognition area 202-S (S501, S502). At this time, the processor 105a (first feature amount calculation unit 301) may also calculate the relative speed and relative position between the host vehicle 201 and the following vehicle.
- When the stereo camera 101 detects the following vehicle, a recognition ID is assigned to it.
- Next, the processor 105a determines the feature amount (width or height) that will be used as the reference for calculating the relative distance from the camera image of the monocular camera 102, according to the changes in the feature amounts (width and height) of the following vehicle in the recognition area 202-S, which covers positions 401 to 402 (S503).
- Next, the processor 105a determines that the following vehicle has moved from the recognition area 202-S of the stereo camera 101 to the recognition area 203-M of the monocular camera 102 (S504). Whether the following vehicle detected by the stereo camera 101 and the one detected by the monocular camera 102 are the same can be determined, for example, by checking whether the difference in relative position between the following vehicles detected simultaneously by the two cameras is within a predetermined value; if so, they are determined to be the same following vehicle, and the recognition ID is passed on.
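- A minimal sketch of this same-vehicle check; the 2-D relative positions and the 1.0 m gap are illustrative assumptions:

```python
def is_same_vehicle(pos_stereo: tuple, pos_mono: tuple,
                    max_position_gap: float = 1.0) -> bool:
    """S504 sketch: the two sensors are deemed to observe the same vehicle
    if the relative positions (x, y) they report at the same instant differ
    by no more than a predetermined value (1.0 m is an assumed placeholder);
    in that case the recognition ID is inherited."""
    dx = pos_stereo[0] - pos_mono[0]
    dy = pos_stereo[1] - pos_mono[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_position_gap
```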
- Next, the processor 105a calculates data related to the following vehicle.
- The data calculated at this time includes the feature amounts (width and height) of the following vehicle calculated from the camera image of the monocular camera 102 (S505).
- The processor 105a then calculates the relative distance dw or dh between the following vehicle at position 403 and the host vehicle 201 based on, for example, the feature amounts (width Δx and height Δy) of the following vehicle captured by the monocular camera 102 at position 403, the feature amount (width W or height H) of the following vehicle captured by the stereo camera 101 and selected in S503, and the above formula (1) or (2) (S506).
- Finally, the presence or absence of a collision risk is determined from the calculation results of the relative distances dw, dh between the following vehicle at position 403 and the host vehicle 201 (S507).
- The result of this determination is transmitted from the CAN IF 106 to the vehicle control device 107 and the alarm device 108 (S508, S509). Note that monitoring by the monocular camera 102 (calculation of the relative distance) may continue even after the following vehicle moves to position 404 and completes the lane change.
- As described above, in this embodiment the processor 105a calculates a plurality of feature amounts (width and height) of an object (following vehicle) from the detection result of the first sensor (stereo camera 101) mounted on the host vehicle 201, calculates the amount of change in each of the plurality of feature amounts (width and height) obtained from the first sensor, and determines, from among the feature amounts of the object calculated from the detection result of the second sensor (monocular camera 102) mounted on the host vehicle 201, a reference feature amount used in calculating the relative distance between the host vehicle 201 and the object, based on the amounts of change in the multiple feature amounts.
- When the object moves from the recognition range of the first sensor to the recognition range of the second sensor, the reference feature amount is calculated from the detection result of the second sensor (monocular camera 102), and the relative distance is calculated based on the reference feature amount and the one of the multiple feature amounts obtained from the first sensor (stereo camera 101) that corresponds to the reference feature amount.
- In other words, the relative distance in the recognition range 203-M of the second sensor is calculated based on the feature amount (reference feature amount) of the object obtained from the second sensor (monocular camera 102), determined from the amounts of change of the object's feature amounts obtained from the first sensor (stereo camera 101), and the corresponding feature amount of the object obtained from the first sensor.
- Because the relative distance is calculated using the accurate feature amount of the object obtained from the first sensor (stereo camera 101) together with the corresponding feature amount obtained from the second sensor, the calculation accuracy of the relative distance by the second sensor can be improved.
- The feature amounts of the object calculated and used in the above calculation are related to the dimensions of the object.
- The first sensor is preferably a stereo camera, radar, sonar, LiDAR, or monocular camera,
- and the second sensor is preferably a monocular camera.
- As the first sensor, any of a radar, a sonar, a LiDAR, and a monocular camera may be used instead of the stereo camera 101, but considering the balance between monetary cost and calculation accuracy, it is most preferable to use the stereo camera 101 (the same applies to each of the following embodiments).
- In this embodiment, the multiple feature amounts are the width and height of the object, and the processor 105a determines the reference feature amount based on the amounts of change in the width W and height H of the object obtained from the stereo camera 101 (first sensor).
- Specifically, the processor 105a determines as the reference feature amount the one of the width W and height H of the object that has the smaller amount of change.
- That is, the processor 105a calculates the width W and height H of the following vehicle (object) from the detection results of the stereo camera (first sensor) 101 mounted on the host vehicle 201 and calculates the changes in the width W and height H of the following vehicle obtained from the stereo camera 101; when the following vehicle moves out of the recognition range 202-S of the stereo camera 101 and into the recognition range 203-M of the monocular camera (second sensor) 102 mounted on the host vehicle 201, the processor selects the one of the width and height of the following vehicle that has changed least, based on the changes in the width W and height H obtained from the stereo camera 101, and calculates the relative distance dw or dh based on the feature amount (Δx or Δy) corresponding to the selected feature amount among the width Δx and height Δy of the following vehicle obtained from the monocular camera 102.
- When the object detection device is configured in this manner, if the object to be detected moves out of the recognition range 202-S of the stereo camera 101 and into the recognition range 203-M of the monocular camera 102, the relative distance is calculated based on a feature amount that changes little even when the object moves due to a lane change or the like, so the accuracy of the relative distance calculated by the monocular camera 102 can be prevented from decreasing.
- In the above description, the processor 105a "determines the feature amount with the smaller amount of change among the width and height of the following vehicle (object) obtained from the stereo camera (first sensor) 101 as the reference feature amount, and calculates the relative distance dw or dh based on the feature amount (Δx or Δy) corresponding to the reference feature amount among the width Δx and height Δy of the following vehicle (object) obtained from the monocular camera (second sensor) 102". Instead, the processor 105a may be configured to "determine the height Δy of the following vehicle (object) obtained from the monocular camera (second sensor) 102 as the reference feature amount when the amount of change in the width W of the following vehicle (object) obtained from the stereo camera (first sensor) 101 exceeds a predetermined threshold, and use it to calculate the relative distance dh".
- In this case as well, the relative distance dh is calculated based on the "height", which changes less with object movement than the "width", so, as described above, the accuracy of the relative distance calculated by the monocular camera 102 can be prevented from decreasing.
- In the second embodiment, the relative distance d is calculated by weighting the relative distance calculated from formula (1) (first relative distance dw) and the relative distance calculated from formula (2) (second relative distance dh) according to the amount of change in a feature amount of the following vehicle calculated from the camera image of the stereo camera 101 (for example, the rate of change of the width of the following vehicle).
- FIG. 7 shows the flow of the rear monitoring process executed by the ECU 105 (processor 105a) according to the second embodiment.
- Steps S501 and S502 at the beginning of the flow are the same as those in the first embodiment shown in FIG. 5.
- Next, the processor 105a calculates a weight from the amount of change calculated in S502 and a weighting map (S503-W).
- FIG. 8 is an example of a weighting map.
- In this embodiment, the rate of change of the width of the following vehicle as seen by the stereo camera 101 is calculated as the "amount of change".
- Various methods can be used to calculate the rate of change; for example, the ratio of the width of the following vehicle just before it leaves the recognition area 202-S to its width when it entered the recognition area 202-S may be calculated.
- The rate of change calculated in S502 is converted into a weight wf (0 ≤ wf ≤ 1) for the second relative distance dh by the map of FIG. 8.
- In FIG. 8, the weighting is performed such that the weight wf of the second relative distance dh increases as the rate of change (amount of change) of the width of the following vehicle obtained from the stereo camera 101 increases.
- Note that this weighting map based on the rate of change of a feature amount (width or height) is only one example, and any map may be set; in other words, other weightings may be used as long as the purpose of this embodiment is achieved.
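- For illustration, one possible weighting map of the FIG. 8 kind is a clamped linear ramp; the breakpoints 0.1 and 0.5 are assumed placeholders, not values from the patent:

```python
def weight_from_change_rate(rate: float,
                            rate_lo: float = 0.1, rate_hi: float = 0.5) -> float:
    """Convert the rate of change of the width seen by the stereo camera
    into the weight wf (0 <= wf <= 1) of the second relative distance dh,
    increasing monotonically as in FIG. 8."""
    if rate <= rate_lo:
        return 0.0
    if rate >= rate_hi:
        return 1.0
    return (rate - rate_lo) / (rate_hi - rate_lo)  # linear ramp in between
```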
- For example, the weighting may differ depending on the type of the following vehicle (object), as in the weighting map based on vehicle type shown in FIG. 9. As shown in FIG. 9, a motorcycle, for example, has a larger length-to-width ratio than a passenger car, so it is preferable to make the weight wf of the second relative distance dh larger for a motorcycle than for a passenger car.
- The weighting may also be changed depending on the relative distance between the host vehicle and the following vehicle (object), as in the weighting map based on relative distance shown in FIG. 10.
- For example, since the rate of change of the feature amounts (width and height) becomes harder to capture due to the influence of noise and the like as the relative distance between the host vehicle and the following vehicle (object) increases, the weighting may be set smaller at larger relative distances.
- In S506 of this embodiment, the processor 105a first calculates the first relative distance dw and the second relative distance dh from the feature amounts (width W and height H) obtained by the stereo camera 101 in S501, the feature amounts (width Δx and height Δy) obtained by the monocular camera 102 in S505, and formulas (1) and (2). (In the first embodiment, only one of the two relative distances dw, dh was calculated; in this embodiment, both are calculated.)
- The processor 105a then calculates the relative distance d between the host vehicle and the following vehicle based on the calculated first relative distance dw and second relative distance dh, the weight wf calculated in S503-W, and the following formula (3).
- The following formula (3) is the formula for calculating the relative distance d from the weight wf and the first and second relative distances dw and dh.
- wf represents the weight of the second relative distance dh,
- dw represents the first relative distance calculated from formula (1) above, and
- dh represents the second relative distance calculated from formula (2) above.
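- Formula (3) is likewise not reproduced in this text. Since d is described below as a weighted average of the two distances with wf as the weight of dh, formula (3) presumably reads:

$$d = w_f \cdot d_h + (1 - w_f) \cdot d_w \qquad \text{(3)}$$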
- In other words, the relative distance d is calculated as a weighted average of the first and second relative distances, using the weight wf determined according to the amount of change in the width of the following vehicle obtained from the stereo camera 101.
- In this embodiment, weighting was performed according to the amount of change (rate of change) in the width of the following vehicle obtained from the stereo camera 101, but weighting may instead be performed according to the amount of change (rate of change) in the height of the following vehicle obtained from the stereo camera 101.
- In that case, a weight hf for the first relative distance dw is used in place of the weight wf for the second relative distance dh.
- As described above, in this embodiment the processor 105a calculates the first relative distance dw between the host vehicle and the following vehicle (object) based on the width of the following vehicle obtained from the stereo camera 101 and the width of the following vehicle obtained from the monocular camera 102 (second sensor), calculates the second relative distance dh between the host vehicle and the following vehicle based on the height of the following vehicle obtained from the stereo camera 101 and the height of the following vehicle obtained from the monocular camera 102, and calculates the relative distance d by weighting the first relative distance dw and the second relative distance dh according to the amount of change in the width (or height) of the following vehicle obtained from the stereo camera 101 (first sensor) and summing the two.
- With this configuration, the relative distance is calculated taking into account both the width and the height of the following vehicle obtained from the stereo camera 101 (first sensor), thereby improving the accuracy of the relative distance.
- Furthermore, the relative distance d is calculated so that the weight given to the height of the following vehicle, relative to its width, increases as the rate of change of the width of the following vehicle increases.
- In this case, the relative distance is calculated with emphasis on the height, whose rate of change is relatively small, improving the calculation accuracy of the relative distance.
- In the third embodiment, a detection confidence level is used. The detection confidence level is an index value indicating how accurate the detected feature amounts (width, height, length, type) of an object are; here, the larger the value, the higher the confidence. For example, if the feature amounts (width, height, length, type) of the object are captured continuously during object detection, the detection confidence level can be said to be high.
- 3D bounding box detection is object detection in which the first processing circuit 103 and the second processing circuit 104 use AI (artificial intelligence) trained on objects in the camera images of the stereo camera 101 and the monocular camera 102 to extract the feature amounts of an object, capture it three-dimensionally, and display a three-dimensional frame (3D bounding box) surrounding the object as a cuboid, as shown in FIG. 11.
- The first feature amount calculation unit 301 and the second feature amount calculation unit 302 can obtain the feature amounts (width, height, length, type) of the object, detection confidence information, and the distance from the host vehicle to the object from the detected cuboid frame, and assign a recognition ID.
- FIG. 12 shows the processing flow for rear monitoring according to this embodiment.
- First, the processor 105a calculates the feature amounts of the following vehicle (width, height, length, type), the distance to the following vehicle, and the detection confidence level (S551).
- Next, the processor 105a compares the detection confidence level obtained in S551 with a predetermined value (S552), and if the detection confidence level is equal to or lower than the predetermined value, it executes the processing from S501 onward in FIG. 5 or FIG. 7.
- That is, the processor 105a assigns a 3D bounding box to the following vehicle (object) in the image obtained by the stereo camera 101, and if the detection confidence of the 3D bounding box is equal to or lower than the predetermined value, it performs the processing from S501 onward in FIG. 5 or FIG. 7 that is required to calculate the relative distance.
- In this way, the flow of FIG. 5 or FIG. 7 is executed only when the confidence of the 3D bounding box detection is low, so the present invention can also be applied to systems that use 3D bounding boxes for object detection.
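- A minimal sketch of this confidence gating; the threshold 0.7 and the callback-style fallback are illustrative assumptions:

```python
from typing import Callable


def distance_with_confidence_gate(bbox3d_confidence: float,
                                  bbox3d_distance: float,
                                  fallback_flow: Callable[[], float],
                                  confidence_threshold: float = 0.7) -> float:
    """S551/S552 sketch: if the 3D bounding box is detected with high
    confidence, use the distance obtained from the box itself; otherwise
    run the handover flow of FIG. 5 or FIG. 7 (S501 onward) instead."""
    if bbox3d_confidence > confidence_threshold:
        return bbox3d_distance  # trust the 3D bounding box
    return fallback_flow()      # execute the S501... processing
```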
- FIG. 13 shows an example of the installation locations of the stereo camera 101 and monocular camera 102 for forward monitoring.
- In this embodiment, the stereo camera 101 is installed at the front of the vehicle interior to monitor the area ahead of the host vehicle 201.
- The monocular camera 102 is installed at a position where it can monitor the area ahead of the host vehicle 201, including the blind spot of the stereo camera 101 located directly in front of the vehicle 201.
- The recognition areas obtained in this way are, for example, 204-S and 205-M, respectively.
- As before, a recognition area may be the camera viewing angle (angle of view), or it may be an area within the camera viewing angle that can be processed by the image processing unit.
- For the stereo camera 101, the range that can be viewed in stereo is the area 204-S where the fields of view of the left and right cameras overlap.
- Accordingly, the range within a predetermined distance in front of the host vehicle 201 is a blind spot of the stereo camera 101.
- The hardware configuration of the object detection device is the same as that shown in FIG. 1.
- The software configuration of the ECU 105 is also the same as that shown in FIG. 3.
- The sensor recognition area movement determination unit 303 of this embodiment determines whether a vehicle ahead of the host vehicle has entered the blind spot of the recognition area 204-S of the stereo camera 101 and moved into the area covered only by the recognition area 205-M of the monocular camera 102, and if it determines that the same vehicle has moved, it assigns the vehicle ahead in the recognition area 205-M the same recognition ID as when it was in the recognition area 204-S (the recognition ID is inherited).
- The processing flow for forward monitoring executed by the ECU 105 (processor 105a) is the same as that shown in FIG. 5, with "following vehicle" read as "vehicle ahead".
- FIG. 14 shows an explanatory diagram of the forward monitoring operation according to this embodiment.
- Reference numerals 601-604 in the figure indicate positions of a vehicle (motorcycle) ahead of the host vehicle 201; the vehicle ahead moves from position 601 to position 604 by changing course.
- First, the stereo camera 101 detects the vehicle ahead of the host vehicle at position 601, and the processor 105a (first feature amount calculation unit 301) calculates the feature amounts (width and height) of the vehicle ahead and their amounts of change from position 601 through position 603 until the vehicle ahead enters the blind spot of the recognition area 204-S (S501, S502). At this time, the processor 105a (first feature amount calculation unit 301) may also calculate the relative speed and relative position between the host vehicle 201 and the vehicle ahead. When the stereo camera 101 detects the vehicle ahead, a recognition ID is assigned to it.
- Next, the processor 105a determines the feature amount (width or height) that will be used as the reference for calculating the relative distance from the camera image of the monocular camera 102, according to the changes in the feature amounts (width and height) of the vehicle ahead in the recognition area 204-S, which covers positions 601 to 603 (S503).
- Next, the processor 105a determines that the vehicle ahead has moved into the area where the blind spot of the recognition area 204-S and the recognition area 205-M overlap (S504). Whether the vehicle ahead detected by the stereo camera 101 and the one detected by the monocular camera 102 are the same can be determined, for example, by checking whether the difference in relative position between the vehicles detected simultaneously by the two cameras is within a predetermined value; if so, they are determined to be the same vehicle, and the recognition ID is passed on.
- Next, the processor 105a calculates data related to the vehicle ahead.
- The data calculated at this time includes the feature amounts (width and height) of the vehicle ahead calculated from the camera image of the monocular camera 102 (S505).
- The processor 105a then calculates the relative distance dw or dh between the vehicle ahead at position 604 and the host vehicle 201 based on, for example, the feature amounts (width Δx and height Δy) of the vehicle ahead captured by the monocular camera 102 at position 604, the feature amount (width W or height H) of the vehicle ahead captured by the stereo camera 101 and selected in S503, and the above formula (1) or (2) (S506).
- Finally, the presence or absence of a collision risk is determined based on the calculation results of the relative distances dw, dh between the vehicle ahead at position 604 and the host vehicle 201 (S507).
- The result of this determination is transmitted from the CAN IF 106 to the vehicle control device 107 and the alarm device 108 (S508, S509).
- As in the first embodiment, the relative distance is calculated based on a feature amount that changes little even when the object moves due to a course change or the like, so the accuracy of the relative distance calculated by the monocular camera 102 can be prevented from decreasing.
- That is, the calculated relative distances dw and dh between the host vehicle and the detected vehicle can be prevented from deviating from the actual distance.
- And because the accuracy of the relative distance calculation is improved, malfunction or non-operation of the collision avoidance function can be reduced.
- Note that weighting as in the second embodiment or a 3D bounding box as in the third embodiment may also be used for forward monitoring.
- The present invention is not limited to the above-described embodiments and includes various modifications that do not depart from the gist of the invention.
- The present invention is not limited to configurations that include all of the elements described in the above embodiments; configurations in which some elements are omitted are also included. It is also possible to add part of the configuration of one embodiment to, or replace it with, the configuration of another embodiment.
- The configurations of the ECU 105 and the functions and processes executed by those configurations may be realized in part or in whole by hardware (for example, by designing the logic that executes each function as an integrated circuit).
- The configurations of the ECU 105 may also be realized as a program (software) that is read and executed by an arithmetic processing device (for example, a CPU) to realize each function of the ECU 105.
- Information related to the program can be stored, for example, in semiconductor memory (flash memory, SSD, etc.), magnetic storage devices (hard disk drives, etc.), and recording media (magnetic disks, optical discs, etc.).
- The control lines and information lines shown are those considered necessary for explaining the embodiments; not all control lines and information lines of an actual product are necessarily shown. In practice, almost all components may be considered to be interconnected.
- 101...Stereo camera (first sensor)
- 102...Monocular camera (second sensor)
- 105...ECU, 105a...Processor
- 201...Host vehicle
- 202-S...Stereo camera recognition area (recognition range)
- 203-M...Monocular camera recognition area (recognition range)
- 204-S...Stereo camera recognition area (recognition range)
- 205-M...Monocular camera recognition area (recognition range)
- 401-404...Positions of the rear vehicle
- 601-604...Positions of the front vehicle
Description
The present invention relates to an object detection device for vehicles.
There has been active development of driving support technologies that ensure safety by equipping automobiles with various sensors and monitoring the area around the host vehicle. Examples include lane departure warning systems, which use a camera to monitor the area ahead of the vehicle and recognize white lines so as to warn the driver before the host vehicle deviates from its lane, and collision mitigation brakes, which monitor obstacles ahead and automatically apply the brakes when there is a possibility of a collision.
These support technologies are expanding beyond the front of the vehicle to monitor the rear, the sides, and so on. Accordingly, detection means are no longer limited to cameras; radar, sonar, and other sensors are appropriately combined to monitor objects that require the vehicle's attention. This is called FUSION technology, and the combination of sensors and the optimization of each sensor's recognition processing are important. Furthermore, since the monitorable directions, ranges, accuracy, and environmental conditions differ from sensor to sensor, it may become necessary to hand over a target (monitored object) from one sensor to another while the target is approaching the vehicle.
For example, in the technology described in Patent Document 1, for rear monitoring using a stereo camera and a monocular camera, the width (or height) of a following vehicle within the recognition range of the stereo camera is determined and stored, and when the following vehicle moves from the recognition range of the stereo camera into the recognition range of the monocular camera, the distance to the following vehicle is determined based on the stored width (or height) and the width (or height) of the following vehicle determined from the recognition result of the monocular camera.
However, when an object to be detected moves from the recognition range of a first sensor to the recognition range of a second sensor in such a way that the object's attitude relative to each sensor changes, the width and height of the object detected by each sensor (its apparent dimensions as seen by each sensor) may be recognized as having changed significantly.
For example, the width of a motorcycle seen from the front is narrower than that of an automobile, but when the motorcycle makes a lane change or other movement that makes its side visible to each sensor, the apparent change in vehicle width at each sensor tends to be greater than for an automobile. Under such circumstances, if the technology of Patent Document 1 is used to calculate the distance based on the vehicle width from the recognition result of the monocular camera, the monocular camera may recognize the vehicle width as having changed significantly more than the vehicle width obtained from the recognition result of the stereo camera, and the distance to the object may be calculated as closer than the actual value. Conversely, if the monocular camera recognizes the vehicle width as having changed only slightly, the distance to the object may be calculated as farther than the actual value. That is, the distance calculation based on Patent Document 1 may produce a discrepancy from the actual distance to the object, and there is room for improvement. This problem stems from a characteristic of monocular cameras, which are prone to distance errors caused by dimensional changes of the object in the image, and it applies not only to the change in the object's width (vehicle width) noted above but also to cases where the sensor detects an apparent change in the object's height.
The object of the present invention is to provide an object detection device that can prevent the calculated distance between the host vehicle and a detected object from deviating from the actual distance when distance calculation is handed over from a first sensor to a second sensor.
The present application includes multiple means for solving the above problem. One example is an object detection device for a vehicle having at least one processor, wherein the processor calculates multiple feature amounts of an object from the detection result of a first sensor mounted on the vehicle, calculates the amounts of change in the multiple feature amounts obtained from the first sensor, determines, based on those amounts of change, a reference feature amount to be used in calculating the relative distance between the vehicle and the object from among the feature amounts of the object calculated from the detection result of a second sensor mounted on the vehicle, calculates the reference feature amount from the detection result of the second sensor when the object moves from the recognition range of the first sensor to the recognition range of the second sensor, and calculates the relative distance based on the reference feature amount and the feature amount corresponding to the reference feature amount among the multiple feature amounts obtained from the first sensor.
According to the present invention, it is possible to prevent the calculated distance between the host vehicle and a detected object from deviating from the actual distance when distance calculation is handed over from the first sensor to the second sensor.
Embodiments of the present invention will be described below with reference to the drawings. The subject of the embodiments is a device for detecting objects, such as automobiles and motorcycles, that approach the host vehicle while it is traveling on a road. Although multiple embodiments are described below, parts common to the embodiments are given the same reference numerals, and repeated explanations may be omitted. To make the explanation clearer, the drawings may be drawn more schematically than the actual aspect, but this is merely an example and does not limit the interpretation of the present invention.
<First Embodiment>
FIG. 1 is a block diagram including an object detection device according to the first embodiment of the present invention and its peripheral devices. The object detection device according to this embodiment includes a first sensor 101, a second sensor 102, a first processing circuit 103, a second processing circuit 104, and an electronic control unit (hereinafter sometimes referred to as ECU) 105. The ECU 105 can be connected to a vehicle control device (another ECU) 107 and an alarm device (monitor, speaker, etc.) 108 via a CAN interface (CAN IF) 106.
The first sensor 101 is a sensor mounted on the host vehicle to monitor the area behind the host vehicle, and is any one of a stereo camera, a radar, a sonar, a LiDAR, and a monocular camera. In this embodiment, the first sensor 101 is a stereo camera. The camera image of the stereo camera 101 is image-processed by the first processing circuit 103 and input to the ECU 105.
The second sensor 102 is a sensor mounted on the host vehicle to monitor the rear side of the host vehicle, and is preferably a monocular camera. The camera image of the monocular camera 102 is image-processed by the second processing circuit 104 and input to the ECU 105.
The ECU 105 includes a memory 105b, which is a data storage device, and at least one processor 105a that executes various processes according to programs stored in the memory 105b; within the ECU 105, information on an object (for example, another vehicle) approaching the host vehicle is processed using the processing results of the first processing circuit 103 and the second processing circuit 104 (in other words, the detection results of the first sensor 101 and the second sensor 102). This information includes relative distance data between the host vehicle and the object. Information on an object approaching from the rear side of the host vehicle is then sent to the CAN bus via the CAN IF 106 and output to the vehicle control device 107 and the alarm device 108.
Here, a system including signal input units (not shown) that receive the signals from the stereo camera 101 and the monocular camera 102, the first processing circuit 103, the second processing circuit 104, the ECU 105, and the CAN IF 106 is shown as an example, but these components are not necessarily contained within a single unit. For example, the cameras 101 and 102 and their corresponding processing circuits 103 and 104 may each exist as separate units connected by signal lines to some control device that includes the ECU 105. Alternatively, the first processing circuit 103 and the second processing circuit 104 may be contained within the ECU 105, or the first processing circuit 103, the second processing circuit 104, and the ECU 105 may be contained within the vehicle control device 107.
FIG. 2 shows an example of the installation locations of the stereo camera 101 and the monocular camera 102 for rear monitoring. The stereo camera 101 can be installed, for example, on the rear bumper to monitor the area behind the host vehicle 201, while the monocular camera 102 can be installed, for example, on a door mirror to monitor the rear side of the host vehicle. Area 202-S in the figure indicates the recognition area of the stereo camera 101, and area 203-M indicates the recognition area of the monocular camera 102. Here, a recognition area may be the camera viewing angle (angle of view), or it may be an area within the camera viewing angle that can be processed by the image processing unit. Note that it is desirable to install the monocular camera 102 with its optical axis as far from the host vehicle 201 as possible so that the host vehicle 201 appears in the image as little as possible. For the stereo camera 101, the range that can be viewed in stereo is the shaded area 202-S where the fields of view of the left and right cameras overlap. The stereo camera 101 may also be installed inside the host vehicle 201, for example on the ceiling on the rear-seat side so that it monitors the area behind the vehicle through the rear window; this prevents water droplets, mud, and the like from adhering to the stereo camera 101 in rainy weather.
FIG. 3 is a diagram showing the software configuration of the ECU 105. The ECU 105 can function as a first feature amount calculation unit 301, a second feature amount calculation unit 302, a sensor recognition area movement determination unit 303, an object distance estimation unit 304, and a collision risk determination unit 305 by executing, with the processor 105a, software stored in the memory 105b.
The first feature amount calculation unit 301 (processor 105a) assigns a recognition ID to a following vehicle traveling behind the host vehicle that is detected in the recognition area 202-S of the stereo camera 101, calculates multiple feature amounts (width and height) of the following vehicle from the detection result (camera image) of the stereo camera 101, and calculates the amounts of change in these feature amounts caused by the movement of the following vehicle. The first feature amount calculation unit 301 can also determine, from among the feature amounts of the following vehicle calculated from the detection result of the monocular camera 102, the feature amount (hereinafter sometimes referred to as the "reference feature amount") to be used in calculating the relative distance between the host vehicle and the following vehicle, based on the amounts of change in the multiple feature amounts caused by the movement of the following vehicle. The reference feature amount is a feature amount obtained by the monocular camera 102 that is used in calculating the relative distance when the following vehicle is present in the recognition area 203-M of the monocular camera 102.
The feature amounts of the following vehicle can be calculated, for example, using bounding box detection, which is known as a general technique for detecting objects. The first feature amount calculation unit 301 (processor 105a) uses deep learning or the like to assign, to an object in the camera images of the stereo camera 101 and the monocular camera 102, a frame line (called a two-dimensional bounding box) 402 displayed as the smallest rectangle circumscribing the object (following vehicle 401), as shown in FIG. 4. The first feature amount calculation unit 301 (processor 105a) can then calculate the width and height (feature amounts) of the following vehicle from the width and height of the two-dimensional bounding box, and can calculate the amounts of change in the width and height by monitoring the changes in the two-dimensional bounding box caused by the movement of the following vehicle. The calculated feature amounts and their change amounts are used, for example, by the sensor recognition area movement determination unit 303 and the object distance estimation unit 304. Note that, instead of the width and height of the bounding box, the width and height of the following vehicle may be calculated from, for example, the edges of the following vehicle in the camera image of the stereo camera 101.
The second feature amount calculation unit 302 (processor 105a) calculates multiple feature amounts (width and height) of a following vehicle traveling to the rear side of the host vehicle, detected in the recognition area 203-M of the monocular camera 102, from the detection result (camera image) of the monocular camera 102. The feature amounts can be calculated in the same way as in the first feature amount calculation unit 301, for example based on a two-dimensional bounding box, and include the reference feature amount. The calculated feature amount data is used, for example, by the sensor recognition area movement determination unit 303 and the object distance estimation unit 304.
The sensor recognition area movement determination unit 303 (processor 105a) determines whether the same following vehicle has moved from the recognition area 202-S of the stereo camera 101 to the recognition area 203-M of the monocular camera 102, and if it determines that the same following vehicle has moved, it assigns the following vehicle in the recognition area 203-M the same recognition ID as when it was in the recognition area 202-S (the recognition ID is inherited).
When the sensor recognition area movement determination unit 303 determines that the following vehicle has moved from the recognition region 202-S of the stereo camera to the recognition region 203-M of the monocular camera 102, the object distance estimation unit 304 (processor 105a) calculates the relative distance between the following vehicle and the host vehicle based on the reference feature amount, which is the feature amount used for the relative distance calculation among the feature amounts of the following vehicle obtained from the monocular camera 102, and the feature amount corresponding to the reference feature amount among the plurality of feature amounts of the following vehicle obtained from the stereo camera 101.
The collision risk determination unit 305 (processor 105a) determines, from the relative distance between the host vehicle and the following vehicle calculated by the object distance estimation unit 304, whether there is a possibility of a collision between the host vehicle and the following vehicle, and based on the determination result transmits permission or non-permission of collision avoidance control by the vehicle control device 107 via the CAN IF 106. When it determines that there is a possibility of a collision, the collision risk determination unit 305 may also notify the notification device 108 to that effect.
FIG. 5 shows the flow of the rear monitoring process executed by the ECU 105 (processor 105a) according to the first embodiment. First, when a vehicle following the host vehicle is detected by the stereo camera 101, the processor 105a assigns a recognition ID to the following vehicle and calculates the feature amounts (width and height) of the following vehicle based on the camera images of the stereo camera 101 (S501). The processor 105a then calculates, based on the camera images of the stereo camera 101, the amounts of change in those feature amounts (width and height) caused by movement of the following vehicle such as a lane change (S502).
Next, based on the amounts of change in the width and height of the following vehicle, the processor 105a determines the reference feature amount (that is, one of the width and height of the following vehicle) to be used for the relative distance calculation among the width and height (feature amounts) of the following vehicle obtained from the monocular camera 102 in subsequent processing (S503). In this embodiment, the processor 105a selects, as the reference feature amount, the feature amount (width or height) with the smaller amount of change among the feature amounts of the following vehicle obtained from the stereo camera 101 during the period up to just before the following vehicle leaves the recognition region 202-S (S503). Thereafter, among the feature amounts of the following vehicle obtained from the monocular camera 102, the feature amount corresponding to the one selected in S503 is used for the relative distance calculation in S506. That is, for example, when it is determined in S503 that the height has the smaller amount of change among the feature amounts from the stereo camera 101, the height obtained from the monocular camera 102 is used as the feature amount in the relative distance calculation of S506 based on the monocular camera image.
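A minimal sketch of this S503 selection logic, building on the change ratios computed above; the function name is illustrative, not from the patent.

```python
def select_reference_feature(width_ratio: float, height_ratio: float) -> str:
    """S503: pick the feature whose apparent size changed least in the
    stereo camera's recognition region. Ratios of 1.0 mean no change."""
    width_change = abs(width_ratio - 1.0)
    height_change = abs(height_ratio - 1.0)
    return "width" if width_change <= height_change else "height"
```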
Note that the method described here of selecting the reference feature amount (the feature amount used when calculating the relative distance) from the image of the monocular camera 102 is only an example, and other selection methods may be used as long as the intent of this embodiment can be reproduced. For example, when the amount of change in the width obtained from the stereo camera 101 exceeds a predetermined threshold, the height obtained from the monocular camera 102 may be selected and used as the reference feature amount.
Next, the processor 105a determines, based on the recognition ID of the following vehicle, whether the following vehicle has left the recognition region 202-S of the stereo camera 101 and moved into the recognition region 203-M of the monocular camera 102 (S504), and proceeds to S505 when it is determined that the following vehicle has moved into the recognition region 203-M. For example, when it is determined that a following vehicle with a given recognition ID that had been detected within the recognition region 202-S has moved out of the recognition region 202-S, and shortly thereafter a following vehicle is detected in the recognition region 203-M of the monocular camera 102, the same recognition ID is carried over to that following vehicle, and the following vehicle with that recognition ID is regarded as having moved from the recognition region 202-S to the recognition region 203-M.
Next, after determining that the following vehicle has moved into the recognition region 203-M of the monocular camera 102, the processor 105a calculates the feature amounts (width and height) of the following vehicle based on the camera images of the monocular camera 102 (S505). Since the feature amount used for the relative distance calculation has already been determined in S503, only that feature amount may be calculated here from the camera images of the monocular camera 102.
Subsequently, the processor 105a calculates the relative distance of the following vehicle (object) located on the rear side using, among the feature amounts calculated in S505, the feature amount corresponding to the one determined in S503, together with one of the following formulas (1) and (2) (S506). Which of formulas (1) and (2) is used depends on the feature amount determined in S503: formula (1) is used when the determined feature amount is the "width", and formula (2) is used when it is the "height".
The following formula (1) is used to calculate the relative distance (first relative distance) dw when the feature amount determined in S503 is the "width". In formula (1), dw is the relative distance (first relative distance) from the host vehicle to the following vehicle (object), f is the focal length of the monocular camera 102, W is the width of the following vehicle (object) calculated by the stereo camera 101, and Δx is the width of the following vehicle (object) in the image of the monocular camera 102.

dw = f × W / Δx ... (1)
The following formula (2) is used to calculate the relative distance (second relative distance) dh when the feature amount determined in S503 is the "height". In formula (2), dh is the relative distance (second relative distance) from the host vehicle to the following vehicle (object), f is the focal length of the monocular camera 102, H is the height of the following vehicle (object) calculated by the stereo camera 101, and Δy is the height of the following vehicle (object) in the image of the monocular camera 102.

dh = f × H / Δy ... (2)
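The two pinhole-model relations above translate directly into code. A minimal sketch, assuming f, Δx, and Δy are expressed in pixels and W, H in meters; the function names are illustrative.

```python
def relative_distance_from_width(f_px: float, width_m: float,
                                 width_px: float) -> float:
    """Formula (1): dw = f * W / Δx.

    f_px     -- focal length of the monocular camera [px]
    width_m  -- object width W measured earlier by the stereo camera [m]
    width_px -- apparent object width Δx in the monocular image [px]
    """
    return f_px * width_m / width_px

def relative_distance_from_height(f_px: float, height_m: float,
                                  height_px: float) -> float:
    """Formula (2): dh = f * H / Δy."""
    return f_px * height_m / height_px
```

For example, with f = 1000 px, a stereo-measured width W = 0.8 m, and an apparent width Δx = 80 px, formula (1) yields dw = 1000 × 0.8 / 80 = 10 m.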
Next, the processor 105a determines, from the calculation results of the relative distances dw, dh between the host vehicle and the rear-side vehicle, whether there is a possibility that the rear-side vehicle will collide with the host vehicle (S507). When it is determined that there is a possibility of a collision, a determination permitting control intervention to avoid the collision is made and transmitted (S508). When it is determined that there is no possibility of a collision, a determination prohibiting control intervention to avoid the collision is made and transmitted (S509).
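A minimal sketch of the S507-S509 dispatch. The patent does not spell out the collision criterion, so a plain distance threshold is assumed here purely for illustration; a real system would also consider relative speed, time to collision, and so on.

```python
def collision_intervention_decision(relative_distance_m: float,
                                    threshold_m: float = 5.0) -> str:
    """S507: judge collision possibility from the relative distance,
    assuming a simple distance gate (illustrative, not from the patent)."""
    if relative_distance_m < threshold_m:
        return "PERMIT_INTERVENTION"    # S508: allow avoidance control
    return "PROHIBIT_INTERVENTION"      # S509: suppress avoidance control
```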
(Operation)

FIG. 6 is a diagram illustrating the rear monitoring operation according to this embodiment. Reference numerals 401-404 in the figure indicate the positions of a vehicle (a motorcycle) following the host vehicle 201; the following vehicle is assumed to move from position 401 through a lane change to position 404.
The stereo camera 101 detects the following vehicle behind the host vehicle at position 401, and the processor 105a (first feature amount calculation unit 301) calculates the feature amounts (width and height) of the following vehicle and their change amounts while the following vehicle moves from position 401 via position 402 until it leaves the recognition region 202-S (S501, S502). At this time, the processor 105a (first feature amount calculation unit 301) may also calculate the relative speed and relative position between the host vehicle 201 and the following vehicle. When the stereo camera 101 detects the following vehicle, a recognition ID is assigned to it.
Next, the processor 105a (first feature amount calculation unit 301) determines, according to the changes in the feature amounts (width and height) of the following vehicle in the recognition region 202-S including positions 401 to 402, the feature amount (width or height) that serves as the reference for calculating the relative distance using the camera images of the monocular camera 102 (S503).
Next, assume that the following vehicle that was traveling at position 401 behind the host vehicle 201 starts a lane change and moves to position 403 on the rear side of the host vehicle 201. At that time, the processor 105a (sensor recognition area movement determination unit 303) determines, using the stereo camera 101 and the monocular camera 102, that the following vehicle has moved from the recognition region 202-S to the recognition region 203-M (S504). Whether the following vehicles detected by the stereo camera 101 and the monocular camera 102 are the same can be determined, for example, by judging them to be the same following vehicle if the difference in the relative positions of the following vehicle detected simultaneously by both cameras is within a predetermined value. If they are the same following vehicle, the recognition ID is also handed over.
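A minimal sketch of this same-vehicle check and recognition-ID handover, assuming both cameras report a 2D relative position in a common vehicle coordinate frame; the 1.0 m gate is an illustrative value, not taken from the patent.

```python
import math

def hand_over_recognition_id(stereo_pos, mono_pos, stereo_id: int,
                             max_gap_m: float = 1.0):
    """S504: if both cameras see a vehicle at nearly the same relative
    position at the same instant, treat it as the same vehicle and carry
    the stereo camera's recognition ID over to the monocular track.
    Positions are (x, y) tuples in meters; returns None if not matched."""
    gap = math.dist(stereo_pos, mono_pos)
    return stereo_id if gap <= max_gap_m else None
```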
Next, when the following vehicle at position 403 on the rear side of the host vehicle 201 is detected by the monocular camera 102, the processor 105a (second feature amount calculation unit 302) calculates data on the following vehicle. The data calculated at this time include the feature amounts (width and height) of the following vehicle calculated from the camera images of the monocular camera 102 (S505).
The processor 105a (object distance estimation unit 304) calculates the relative distance dw or dh between the following vehicle at position 403 and the host vehicle 201 based on, for example, the feature amounts (width Δx and height Δy) of the following vehicle obtained by the monocular camera 102 at position 403, the feature amount (width W or height H) of the following vehicle obtained by the stereo camera 101 and selected in S503, and formula (1) or (2) above (S506).
The presence or absence of a possible collision is determined from the calculation results of the relative distances dw, dh between the following vehicle 403 and the host vehicle 201 (S507). The determination result is transmitted from the CAN IF 106 to the vehicle control device 107 and the notification device 108 (S508, S509). Note that monitoring by the monocular camera 102 (calculation of the relative distance) may be continued even after the following vehicle has moved to position 404 and completed the lane change.
(Effects)

(1) As described above, in this embodiment, the processor 105a calculates a plurality of feature amounts (width and height) of an object (following vehicle) from the detection results of the first sensor (stereo camera 101) mounted on the host vehicle 201, calculates the amounts of change in the plurality of feature amounts (width and height) obtained from the first sensor (stereo camera 101), determines, based on those amounts of change, the reference feature amount to be used for calculating the relative distance between the host vehicle 201 and the object (following vehicle) from among the feature amounts of the object calculated from the detection results of the second sensor (monocular camera 102) mounted on the host vehicle 201, and, when the object (following vehicle) moves from the recognition range 202-S of the first sensor (stereo camera 101) to the recognition range 203-M of the second sensor (monocular camera 102), calculates the reference feature amount from the detection results of the second sensor (monocular camera 102) and calculates the relative distance based on the reference feature amount and the feature amount corresponding to it among the plurality of feature amounts obtained from the first sensor (stereo camera 101).
In this way, the relative distance in the recognition range 203-M of the second sensor (monocular camera 102) is calculated from the reference feature amount, which is selected from the second sensor's feature amounts according to the amounts of change observed by the first sensor (stereo camera 101), together with the corresponding feature amount of the object obtained from the first sensor. The accurate feature amount measured by the first sensor (stereo camera 101) is thus utilized while the relative distance is calculated from the second sensor's feature amount that changed least, so the calculation accuracy of the relative distance by the second sensor can be improved. In other words, it is possible to suppress the deviation of the relative distance between the host vehicle and the following vehicle from the actual distance when the distance calculation is handed over from the first sensor (stereo camera 101) to the second sensor (monocular camera 102). Furthermore, since the accuracy of the relative distance calculation improves, malfunction or non-operation of the collision avoidance operation can be reduced.
Note that the feature amounts of the object calculated and used in the calculation of (1) above are preferably related to the dimensions of the object.
(2) In (1) above, the first sensor is preferably any one of a stereo camera, a radar, a sonar, a LiDAR, and a monocular camera, and the second sensor is preferably a monocular camera. Note that any of a radar, a sonar, a LiDAR, and a monocular camera may be used in place of the stereo camera 101, but considering the balance between monetary cost and calculation accuracy, using the stereo camera 101 is most preferable (the same applies to the following embodiments).
(3) In (1) above, the plurality of feature amounts are preferably the width and height of the object, and the processor 105a preferably determines the reference feature amount based on the amounts of change in the width and height of the object obtained from the stereo camera 101 (first sensor).
(4) In (1) above, the plurality of feature amounts are preferably the width and height of the object, and the processor 105a preferably determines, as the reference feature amount, the feature amount with the smaller amount of change among the width W and height H of the object based on their amounts of change obtained from the first sensor. Specifically, in this embodiment, the processor 105a calculates the width W and height H of the following vehicle (object) from the detection results of the stereo camera (first sensor) 101 mounted on the host vehicle 201, calculates the amounts of change in that width W and height H obtained from the stereo camera 101, and, when the following vehicle leaves the recognition range 202-S of the stereo camera 101 and moves into the recognition range 203-M of the monocular camera (second sensor) 102 mounted on the host vehicle 201, selects the feature amount with the smaller amount of change among the width and height of the following vehicle (object), and calculates the relative distance dw or dh based on the feature amount (Δx or Δy) corresponding to the selected one among the width Δx and height Δy of the following vehicle (object) obtained from the monocular camera (second sensor) 102.
With the object detection device configured in this manner, when the object to be detected leaves the recognition range 202-S of the stereo camera 101 and moves into the recognition range 203-M of the monocular camera 102, the relative distance is calculated based on the feature amount with the smaller amount of change even if the object moves due to a lane change or the like, so a decrease in the accuracy of the relative distance calculated by the monocular camera 102 can be suppressed.
In the above, the case was described in which the processor 105a determines, as the reference feature amount, the feature amount with the smaller amount of change among the width and height of the following vehicle (object) obtained from the stereo camera (first sensor) 101, and calculates the relative distance dw or dh based on the corresponding feature amount (Δx or Δy) among the width Δx and height Δy obtained from the monocular camera (second sensor) 102. Instead of this, the processor 105a may be configured to determine the height Δy of the following vehicle (object) obtained from the monocular camera (second sensor) 102 as the reference feature amount when the amount of change in the width W of the following vehicle (object) obtained from the stereo camera (first sensor) 101 exceeds a predetermined threshold, and to use it to calculate the relative distance dh.
With this configuration, the relative distance dh is calculated based on the "height", whose change accompanying the object's movement is relatively smaller than that of the "width", so that, as described above, a decrease in the accuracy of the relative distance calculated by the monocular camera 102 can be suppressed.
Second Embodiment

In this embodiment, an example will be described in which the relative distance d is calculated by weighting the relative distance calculated from formula (1) (first relative distance dw) and the relative distance calculated from formula (2) (second relative distance dh) according to the amount of change in a feature amount of the following vehicle calculated from the camera images of the stereo camera 101 (for example, the rate of change in the width of the following vehicle).
FIG. 7 shows the flow of the rear monitoring process executed by the ECU 105 (processor 105a) according to the second embodiment.
First, steps S501 and S502 at the beginning of the flow are the same as those in the first embodiment shown in FIG. 5.
Next, the processor 105a calculates a weight from the amount of change calculated in S502 and a weighting map (S503-W). FIG. 8 shows an example of the weighting map. When this weighting map is used, the rate of change in the width of the following vehicle obtained from the stereo camera 101 is calculated as the "amount of change" in S502. Various ways of calculating the rate of change can be used; for example, the ratio of the width just before the following vehicle leaves the recognition region 202-S to the width when it entered the recognition region 202-S may be calculated. The rate of change calculated in S502 is converted by the map in FIG. 8 into a weight wf (0 < wf ≤ 1) for the second relative distance dh. In FIG. 8, the weighting is set such that the larger the rate of change (amount of change) in the width of the following vehicle obtained from the stereo camera 101, the larger the weight wf of the second relative distance dh.
Note that this weighting map based on the rate of change of the feature amount (width or height) is only an example, and any map setting is possible. In other words, other weightings may be added as long as the intent of this embodiment can be reproduced. For example, in addition to or instead of the weighting in FIG. 8, the weight may be varied according to the type of the following vehicle (object), as in the weighting diagram by following-vehicle type shown in FIG. 9. As shown in FIG. 9, a motorcycle, for example, has a larger length-to-width ratio than a passenger car, so it is preferable to make the weight wf of the second relative distance dh larger for a motorcycle than for a passenger car. Furthermore, the weight may be varied according to the relative distance between the host vehicle and the following vehicle (object), as in the weighting diagram by relative distance shown in FIG. 10. In the example of FIG. 10, the greater the relative distance to the following vehicle (object), the harder it is to capture the rate of change of the feature amounts (width and height) due to the influence of noise and the like, so the weighting may be set lighter.
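A minimal sketch of such a weighting map as a simple lookup with the FIG. 9 and FIG. 10 style adjustments layered on; the breakpoints, the motorcycle factor, and the distance cutoff are all illustrative values, not taken from the figures.

```python
def weight_for_height(width_change_ratio: float,
                      vehicle_type: str = "car",
                      relative_distance_m: float = 10.0) -> float:
    """Map the stereo camera's width change ratio (last/first, so 1.0
    means no change) to the weight wf (0 < wf <= 1) applied to the
    height-based distance dh, in the spirit of FIG. 8."""
    # FIG. 8 analogue: larger width change -> larger wf, clamped.
    wf = min(1.0, max(0.1, width_change_ratio - 1.0))
    # FIG. 9 analogue: motorcycles get a heavier height weight.
    if vehicle_type == "motorcycle":
        wf = min(1.0, wf * 1.5)
    # FIG. 10 analogue: distant objects have noisy change rates,
    # so the weighting is set lighter.
    if relative_distance_m > 30.0:
        wf *= 0.5
    return wf
```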
Steps S504 and S505 are the same as in the first embodiment.
In S506-W, the processor 105a first calculates the first relative distance dw and the second relative distance dh from the feature amounts (width W and height H) obtained by the stereo camera 101 in S501, the feature amounts (width Δx and height Δy) obtained by the monocular camera 102 in S505, and formulas (1) and (2) (in the first embodiment only one of the two relative distances dw, dh was calculated, but in this embodiment both are calculated). Next, the relative distance d between the host vehicle and the following vehicle is calculated based on the calculated first relative distance dw and second relative distance dh, the weight wf calculated in S503-W, and formula (3) below.
The following formula (3) is used to calculate the relative distance d from the weight wf, the first relative distance dw, and the second relative distance dh. In formula (3), wf is the weight of the second relative distance dh, dw is the first relative distance calculated from formula (1) above, and dh is the second relative distance calculated from formula (2) above. That is, the relative distance d in this embodiment is calculated as the average (weighted average) of the first relative distance and the second relative distance, taking into account the weight wf corresponding to the amount of change in the width of the following vehicle obtained from the stereo camera 101.

d = (1 − wf) × dw + wf × dh ... (3)
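Putting the pieces of S506-W together: a minimal sketch of the weighted fusion of formula (3), using the distance helpers sketched earlier (all names illustrative).

```python
def fused_relative_distance(dw: float, dh: float, wf: float) -> float:
    """Formula (3): d = (1 - wf) * dw + wf * dh.

    dw -- width-based distance from formula (1) [m]
    dh -- height-based distance from formula (2) [m]
    wf -- weight of dh (0 < wf <= 1) from the weighting map
    """
    return (1.0 - wf) * dw + wf * dh
```

For instance, with dw = 10.0 m, dh = 12.0 m, and wf = 0.75, the fused distance is 0.25 × 10.0 + 0.75 × 12.0 = 11.5 m, leaning toward the height-based estimate when the width has changed strongly.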
The remaining steps S507, S508, and S509 are the same as in the first embodiment.
In the above explanation, weighting was performed according to the amount of change (rate of change) in the width of the following vehicle obtained from the stereo camera 101, but weighting may instead be performed according to the amount of change (rate of change) in the height of the following vehicle obtained from the stereo camera 101. In that case, it is preferable to use a weighting map that converts the amount of change (rate of change) in the height of the following vehicle into a weight hf for the first relative distance dw (the proportion hf of distance measurement based on the width), and to use the following formula (4) instead of formula (3) above.

d = hf × dw + (1 − hf) × dh ... (4)
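The height-driven variant is symmetric to formula (3); a minimal sketch under the same assumptions as above.

```python
def fused_relative_distance_by_height(dw: float, dh: float,
                                      hf: float) -> float:
    """Formula (4): d = hf * dw + (1 - hf) * dh, where hf is the weight
    of the width-based distance dw, derived from the height change rate."""
    return hf * dw + (1.0 - hf) * dh
```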
In this embodiment configured as described above, the processor 105a calculates the first relative distance dw between the host vehicle and the following vehicle (object) based on the width of the following vehicle (object) obtained from the stereo camera 101 and its width obtained from the monocular camera 102 (second sensor), calculates the second relative distance dh based on the height of the following vehicle (object) obtained from the stereo camera 101 and its height obtained from the monocular camera 102 (second sensor), and calculates the relative distance d by weighting the first relative distance dw and the second relative distance dh according to the amount of change in the width (or height) of the following vehicle (object) obtained from the stereo camera 101 (first sensor) and summing the two.
When the relative distance is calculated in this way as a weighted average of the first relative distance dw and the second relative distance dh, the relative distance takes into account the amounts of change in both the width and height of the following vehicle obtained from the stereo camera 101 (first sensor), so the accuracy of the relative distance can be improved.
In particular, in the above embodiment, the relative distance d is calculated so that the weight of the following vehicle's height relative to its width increases as the rate of change of the following vehicle's width increases. As a result, when the rate of change of the width becomes large, the relative distance is calculated with emphasis on the height, which has a relatively small rate of change, improving the calculation accuracy of the relative distance.
Third Embodiment

The above embodiments referred to using a two-dimensional bounding box for object detection; this embodiment describes the case of using a three-dimensional bounding box (3D bounding box). In this embodiment, a following vehicle is detected using a 3D bounding box, and when the detection confidence is at or below a predetermined value, that is, when the confidence of the object detection is low, the flow of the first or second embodiment (FIG. 5 or FIG. 7) is executed. The detection confidence is an index value indicating how accurate the feature amounts (width, height, length, and type) of the detected object are; here, the larger the value, the higher the confidence. For example, if the feature amounts (width, height, length, and type) of an object are captured continuously during object detection, the detection confidence can be said to be high.
First, 3D bounding box detection will be explained here. 3D bounding box detection is object detection using a three-dimensional frame (3D bounding box) that is displayed so as to enclose the object in a rectangular solid, obtained by extracting the object's features and capturing it three-dimensionally, as shown in FIG. 11, through AI (artificial intelligence) learning of the objects in the camera images of the stereo camera 101 and the monocular camera 102 performed by the first processing circuit 103 and the second processing circuit 104. Through this 3D bounding box detection, the first feature amount calculation unit 301 and the second feature amount calculation unit 302 can obtain, from the frame of the detected rectangular solid, the object's feature amounts (width, height, length, and type), detection confidence information, and the distance from the host vehicle to the object, and can assign a recognition ID.
FIG. 12 shows the processing flow for rear monitoring according to this embodiment.
First, when a following vehicle is detected by 3D bounding box detection with the stereo camera 101, the processor 105a calculates the feature amounts (width, height, length, and type) of the following vehicle, the distance to the following vehicle, and the detection confidence (S551).
Next, the processor 105a compares the detection confidence obtained in S551 with a predetermined value (S552), and when the detection confidence is at or below the predetermined value, executes the processing from S501 onward in FIG. 5 or FIG. 7.
That is, in this embodiment, the processor 105a assigns a 3D bounding box to the following vehicle (object) in the image obtained by the stereo camera 101, and when the detection confidence of the 3D bounding box is at or below a predetermined value, performs the processing from S501 onward in FIG. 5 or FIG. 7 necessary for calculating the relative distance. In other words, according to this embodiment, the flow of FIG. 5 or FIG. 7 is executed only when the confidence of the 3D bounding box detection is low, so the present invention can also be applied to systems that use 3D bounding boxes for object detection.
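A minimal sketch of this S551/S552 gating; the confidence scale, the 0.6 threshold, and the field names of the detection record are illustrative assumptions, and the fallback is left as a placeholder for the FIG. 5 / FIG. 7 flow.

```python
def run_2d_feature_flow(detection: dict) -> float:
    """Placeholder for the S501-onward flow of FIG. 5 / FIG. 7
    (reference-feature selection and formula (1)/(2) distance)."""
    raise NotImplementedError

def rear_monitoring_step(detection: dict,
                         confidence_threshold: float = 0.6) -> float:
    """S551/S552: trust the 3D bounding box while its confidence is high;
    otherwise fall back to the 2D-feature flow of FIG. 5 or FIG. 7.
    `detection` is assumed to carry 'distance' and 'confidence' fields
    produced by the 3D bounding box detector."""
    if detection["confidence"] > confidence_threshold:
        return detection["distance"]       # use the 3D box's own distance
    return run_2d_feature_flow(detection)  # S501 onward
```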
Fourth Embodiment

Next, an embodiment will be described in which both the stereo camera 101 and the monocular camera 102 are installed facing the front of the host vehicle to detect a vehicle ahead of the host vehicle and estimate its distance.
FIG. 13 shows an example of the installation locations of the stereo camera 101 and the monocular camera 102 for forward monitoring. The stereo camera 101 is installed at the front of the vehicle interior to monitor the area ahead of the host vehicle 201. The monocular camera 102 monitors the area ahead of the host vehicle 201 and is installed at a position where it can monitor the blind-spot region of the stereo camera 101 located in front of the host vehicle 201. The recognition regions obtained thereby are, for example, 204-S and 205-M, respectively. Here, a recognition region may be the camera's viewing angle or angle of view, or it may be the area within the camera's viewing angle that the image processing unit can process. For the stereo camera 101, the range in which stereo vision is possible is the region 204-S where the fields of view of the left and right cameras overlap. For example, the range within a predetermined distance ahead of the host vehicle 201 is the blind-spot region of the stereo camera 101.
The hardware configuration of the object detection device is the same as that shown in FIG. 1.
The software configuration of the ECU 105 is also the same as in FIG. 3. However, the sensor recognition area movement determination unit 303 of this embodiment determines whether a vehicle ahead of the host vehicle has entered the blind spot of the recognition region 204-S of the stereo camera 101 and moved into the recognition region 205-M covered only by the monocular camera 102, and when it determines that the same vehicle has moved, assigns to the vehicle in the recognition region 205-M the same recognition ID it had in the recognition region 204-S (the recognition ID is carried over).
The processing flow for forward monitoring executed by the ECU 105 (processor 105a) is also the same as in FIG. 5, although the following vehicle must be read as the vehicle ahead.
FIG. 14 is an explanatory diagram of the forward monitoring operation according to this embodiment. Reference numerals 601-604 in the figure indicate the positions of a vehicle (a motorcycle) ahead of the host vehicle 201; the vehicle ahead is assumed to move from position 601 through a course change to position 604.
The stereo camera 101 detects the vehicle ahead of the host vehicle at position 601, and the processor 105a (first feature amount calculation unit 301) calculates the feature amounts (width and height) of the vehicle ahead and their change amounts while the vehicle ahead moves from position 601 via position 603 until it enters the blind spot of the recognition region 204-S (S501, S502). At this time, the processor 105a (first feature amount calculation unit 301) may also calculate the relative speed and relative position between the host vehicle 201 and the vehicle ahead. When the stereo camera 101 detects the vehicle ahead, a recognition ID is assigned to it.
Next, the processor 105a (first feature amount calculation unit 301) determines, according to the changes in the feature amounts (width and height) of the vehicle ahead in the recognition region 204-S including positions 601 to 603, the feature amount (width or height) that serves as the reference for calculating the relative distance using the camera images of the monocular camera 102 (S503).
Next, assume that the vehicle ahead that was traveling at position 601 in front of the host vehicle 201 starts a course change and moves to position 604 in front of the host vehicle 201. At that time, the processor 105a (sensor recognition area movement determination unit 303) determines, using the stereo camera 101 and the monocular camera 102, that the vehicle ahead has moved into the area where the blind-spot region of the recognition region 204-S and the recognition region 205-M overlap (S504). Whether the vehicles ahead detected by the stereo camera 101 and the monocular camera 102 are the same can be determined, for example, by judging them to be the same vehicle if the difference in the relative positions of the vehicle ahead detected simultaneously by both cameras is within a predetermined value. If they are the same vehicle ahead, the recognition ID is also handed over.
Next, when the vehicle ahead at position 604 in front of the host vehicle 201 is detected by the monocular camera 102, the processor 105a (second feature amount calculation unit 302) calculates data on the vehicle ahead. The data calculated at this time include the feature amounts (width and height) of the vehicle ahead calculated from the camera images of the monocular camera 102 (S505).
The processor 105a (object distance estimation unit 304) calculates the relative distance dw or dh between the vehicle ahead at position 604 and the host vehicle 201 based on, for example, the feature amounts (width Δx and height Δy) of the vehicle ahead obtained by the monocular camera 102 at position 604, the feature amount (width W or height H) of the vehicle ahead obtained by the stereo camera 101 and selected in S503, and formula (1) or (2) above (S506).
The presence or absence of a possible collision is determined from the calculation results of the relative distances dw, dh between the vehicle ahead 604 and the host vehicle 201 (S507). The determination result is transmitted from the CAN IF 106 to the vehicle control device 107 and the notification device 108 (S508, S509).
As described above, also in the case of forward monitoring with the stereo camera 101 and the monocular camera 102 as in this embodiment, when the object to be detected enters the blind spot of the recognition range 204-S of the stereo camera 101 and moves into the recognition range 205-M covered only by the monocular camera 102, the relative distance is calculated based on the feature amount with the smaller amount of change even if the object moves due to a course change or the like, so a decrease in the accuracy of the relative distance calculated by the monocular camera 102 can be suppressed. In other words, it is possible to suppress the deviation of the relative distances dw, dh between the host vehicle and the vehicle ahead from the actual distance when the distance calculation is handed over from the stereo camera (first sensor) 101 to the monocular camera (second sensor) 102. Furthermore, since the accuracy of the relative distance calculation improves, malfunction or non-operation of the collision avoidance operation can be reduced.
It goes without saying that, in forward monitoring as well, weighting as in the second embodiment or a 3D bounding box as in the third embodiment may be used.
The present invention is not limited to the above-described embodiments and includes various modifications within a range not departing from the gist of the invention. For example, the present invention is not limited to configurations that include all of the components described in the above embodiments, and also includes configurations in which some of the components are omitted. It is also possible to add or replace part of the configuration of one embodiment with the configuration of another embodiment.
Furthermore, each configuration of the ECU 105 described above, and the functions and execution processes of each configuration, may be partly or wholly realized in hardware (for example, by designing the logic that executes each function as an integrated circuit). The configuration of the ECU 105 may also be realized as a program (software) in which each function of the ECU 105 is realized by being read and executed by an arithmetic processing unit (for example, a CPU). The information related to the program can be stored in, for example, a semiconductor memory (flash memory, SSD, etc.), a magnetic storage device (hard disk drive, etc.), or a recording medium (magnetic disk, optical disc, etc.).
In the above description of each embodiment, the control lines and information lines shown are those considered necessary for explaining the embodiment, and they do not necessarily represent all the control lines and information lines of the product. In practice, almost all the components may be considered to be interconnected.
101...stereo camera (first sensor), 102...monocular camera (second sensor), 105...ECU, 105a...processor, 201...host vehicle, 202-S...recognition region (recognition range) of the stereo camera, 203-M...recognition region (recognition range) of the monocular camera, 204-S...recognition region (recognition range) of the stereo camera, 205-M...recognition region (recognition range) of the monocular camera, 401-404...positions of the rear vehicle, 601-604...positions of the forward vehicle
Claims (12)
1. An object detection device for a vehicle comprising at least one processor, wherein the processor:
calculates a plurality of feature amounts of an object from a detection result of a first sensor mounted on the vehicle;
calculates respective amounts of change in the plurality of feature amounts obtained from the first sensor;
determines, based on the amounts of change in the plurality of feature amounts, a reference feature amount to be used for calculating a relative distance between the vehicle and the object, from among feature amounts of the object calculated from a detection result of a second sensor mounted on the vehicle; and
when the object moves from a recognition range of the first sensor to a recognition range of the second sensor, calculates the reference feature amount from the detection result of the second sensor, and calculates the relative distance based on the reference feature amount and a feature amount corresponding to the reference feature amount among the plurality of feature amounts obtained from the first sensor.
2. The object detection device according to claim 1, wherein the first sensor is any one of a stereo camera, a radar, a sonar, a LiDAR, and a monocular camera, and the second sensor is a monocular camera.
3. The object detection device according to claim 1, wherein the plurality of feature amounts are a width and a height of the object, and the processor determines the reference feature amount based on amounts of change in the width and height of the object obtained from the first sensor.
4. The object detection device according to claim 1, wherein the plurality of feature amounts are a width and a height of the object, and the processor determines, as the reference feature amount, the feature amount with the smaller amount of change among the amounts of change in the width and height of the object obtained from the first sensor.
5. The object detection device according to claim 1, wherein the plurality of feature amounts are a width and a height of the object, and the processor determines the height of the object as the reference feature amount when an amount of change in the width of the object obtained from the first sensor exceeds a predetermined threshold.
6. The object detection device according to claim 1, wherein the plurality of feature amounts are a width and a height of the object, the reference feature amounts are the width and height of the object, and the processor:
calculates a first relative distance between the vehicle and the object based on the width of the object obtained from the first sensor and the width of the object obtained from the second sensor;
calculates a second relative distance between the vehicle and the object based on the height of the object obtained from the first sensor and the height of the object obtained from the second sensor; and
calculates the relative distance by weighting the first relative distance and the second relative distance according to the amounts of change in the width and height of the object obtained from the first sensor and summing the two.
7. The object detection device according to claim 6, wherein the weight of the second relative distance is set to become larger as the amount of change in the width of the object obtained from the first sensor becomes larger.
8. The object detection device according to claim 6, wherein the weighting based on the amounts of change in the width and height of the object obtained from the first sensor differs depending on the type of the vehicle.
9. The object detection device according to claim 6, wherein the weighting based on the amounts of change in the width and height of the object obtained from the first sensor differs depending on the magnitudes of the first relative distance and the second relative distance.
10. The object detection device according to claim 1, wherein the first sensor is a stereo camera that monitors an area behind the vehicle, and the second sensor is a monocular camera that monitors a rear side of the vehicle.
11. The object detection device according to claim 1, wherein the first sensor is a stereo camera that monitors an area ahead of the vehicle, and the second sensor is a monocular camera that monitors the area ahead of the vehicle and is installed at a position where it can monitor a blind spot of the stereo camera located ahead of the vehicle.
12. The object detection device according to claim 1, wherein the first sensor is a stereo camera, and the processor assigns a three-dimensional bounding box to the object in an image obtained by the stereo camera, and executes the calculations necessary for calculating the relative distance when a detection confidence of the three-dimensional bounding box is at or below a predetermined value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/025889 WO2025013285A1 (en) | 2023-07-13 | 2023-07-13 | Object detection device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2025013285A1 true WO2025013285A1 (en) | 2025-01-16 |
Family
ID=94215024
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/025889 WO2025013285A1 (en) | 2023-07-13 | 2023-07-13 | Object detection device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2025013285A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008123462A (en) * | 2006-11-16 | 2008-05-29 | Hitachi Ltd | Object detection device |
JP2018106334A (en) * | 2016-12-26 | 2018-07-05 | トヨタ自動車株式会社 | Warning device for vehicle |
JP2021168065A (en) * | 2020-04-13 | 2021-10-21 | トヨタ自動車株式会社 | On-vehicle sensor system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9937905B2 (en) | Side collision avoidance system and method for vehicle | |
CN107798699B (en) | Depth map estimation with stereo images | |
JP4420011B2 (en) | Object detection device | |
US20060085131A1 (en) | Path estimation and confidence level determination system for a vehicle | |
GB2541274A (en) | Collision mitigation and avoidance | |
JP7359099B2 (en) | Mobile object interference detection device, mobile object interference detection system, and mobile object interference detection program | |
KR101984520B1 (en) | Apparatus and method for preventing vehicle collision | |
US20170220875A1 (en) | System and method for determining a visibility state | |
JP2013190421A (en) | Method for improving detection of traffic-object position in vehicle | |
JP6332383B2 (en) | Vehicle target detection system | |
JP7122101B2 (en) | Vehicle obstacle detection device | |
JP2018200267A (en) | Upper structure determination device and driving support system | |
EP3530536A1 (en) | Autonomous emergency braking system and method for vehicle at crossroad | |
US12071010B2 (en) | Onboard display device, onboard display method, and computer readable storage medium | |
CN114056325A (en) | Device and method for reducing collision risk | |
CN113276864B (en) | System and method for obstacle proximity detection | |
GB2576206A (en) | Sensor degradation | |
WO2017056724A1 (en) | Periphery recognition device | |
WO2021172532A1 (en) | Parking assistance device and parking assistance method | |
US20230227025A1 (en) | Vehicle drive assist apparatus | |
CN117585006A (en) | System and method for estimating lateral speed of vehicle | |
JP2019172032A (en) | Automatic braking device | |
US20240034286A1 (en) | Collision avoidance assistance device | |
WO2025013285A1 (en) | Object detection device | |
TWI541152B (en) | Traffic safety system and its obstacle screening method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23945158 Country of ref document: EP Kind code of ref document: A1 |