
WO2024204271A1 - Control device, and control method - Google Patents

Control device, and control method

Info

Publication number
WO2024204271A1
WO2024204271A1 (PCT/JP2024/012087)
Authority
WO
WIPO (PCT)
Prior art keywords
reflected wave
real space
image
reflected
irradiation range
Prior art date
Application number
PCT/JP2024/012087
Other languages
French (fr)
Japanese (ja)
Inventor
洵也 岸本
佑介 林
英次郎 渋沢
弘理 外舘
喜央 髙山
Original Assignee
京セラ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京セラ株式会社 filed Critical 京セラ株式会社
Publication of WO2024204271A1 publication Critical patent/WO2024204271A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • This disclosure relates to a control device and a control method.
  • Technology for detecting objects such as fallen objects on the road is known (for example, see Patent Document 1).
  • The road monitoring device described in Patent Document 1 includes an imaging unit that captures images of at least the front or rear of the vehicle and sequentially outputs the image data, and a discrimination unit that determines, based on the image data, whether or not a specified monitoring object is included in the image corresponding to the image data.
  • A control device according to an embodiment of the present disclosure identifies the positions of a first object and a second object in real space based on a first reflected wave of an electromagnetic wave radiated in a first direction in the real space reflected by the first object and a second reflected wave reflected by the second object, and a captured image of the real space including the first object and the second object.
  • A control device according to another embodiment of the present disclosure identifies the positions of the first object and the second object in the real space based on a first reflected wave formed by reflection by a first object and a second reflected wave formed by reflection by a second object of an electromagnetic wave irradiated to a first irradiation range in the real space, a third reflected wave formed by reflection by the first object of an electromagnetic wave irradiated to a second irradiation range in the real space, and an image generated by imaging the real space including the first irradiation range and the second irradiation range.
  • A control method according to an embodiment of the present disclosure includes identifying the positions of a first object and a second object in real space based on a first reflected wave of an electromagnetic wave radiated in a first direction in the real space reflected by the first object and a second reflected wave reflected by the second object, and a captured image of the real space including the first object and the second object.
  • A control method according to another embodiment of the present disclosure includes identifying the positions of the first object and the second object in the real space based on a first reflected wave formed by reflection by a first object and a second reflected wave formed by reflection by a second object of an electromagnetic wave irradiated to a first irradiation range in the real space, a third reflected wave formed by reflection by the first object of an electromagnetic wave irradiated to a second irradiation range in the real space, and an image generated by imaging the real space including the first irradiation range and the second irradiation range.
  • FIG. 1 is a diagram showing a schematic configuration of a detection system according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram of the detection system shown in FIG. 1.
  • FIG. 3 is a diagram showing an example of a captured image.
  • FIG. 4 is a diagram showing an example of data on the intensity of a reflected wave.
  • FIG. 5 is a diagram showing an example of data on the intensity of a reflected wave.
  • FIG. 6 is a diagram showing an example of data on the intensity of a reflected wave.
  • FIG. 7 is a diagram for explaining a detection direction corresponding to the irradiation range shown in FIG. 3.
  • FIG. 8 is a diagram for explaining the direction of light in the configuration shown in FIG. 3.
  • FIG. 9 is a flowchart illustrating an example procedure of an object detection process according to an embodiment of the present disclosure.
  • According to one embodiment of the present disclosure, objects can be detected with high accuracy.
  • a detection system 1 is mounted on a moving object 2.
  • the moving object 2 is, for example, a vehicle, a motorcycle, a robot, or the like.
  • the detection system 1 does not have to be mounted on the moving object 2.
  • the detection system 1 may be incorporated in an electronic device such as a smartphone.
  • the detection system 1 includes a distance measuring device 10, an imaging device 20, and a control device 30.
  • the distance measuring device 10 and the imaging device 20 can communicate with the control device 30 via a communication line.
  • The control device 30 does not have to be mounted on the moving object 2. If the control device 30 is not mounted on the moving object 2, it may be configured to communicate with the distance measuring device 10 and the imaging device 20 mounted on the moving object 2 via a network or the like.
  • the distance measuring device 10 is configured to include, for example, LiDAR (Light Detection And Ranging).
  • LiDAR is configured to include, for example, a laser light source that emits electromagnetic waves, an optical system, and a detection unit that detects the electromagnetic waves.
  • the electromagnetic waves are, for example, infrared rays, visible light, ultraviolet rays, or radio waves.
  • the ranging device 10 emits electromagnetic waves into the real space around the ranging device 10.
  • the electromagnetic waves emitted by the ranging device 10 are reflected by an object and return to the ranging device 10 as reflected waves.
  • the ranging device 10 detects the electromagnetic waves that are reflected by an object and return, i.e., the reflected waves, from the emitted electromagnetic waves.
  • the ranging device 10 measures the distance to the object by detecting the reflected waves.
  • The ranging device 10 may measure the distance to the object by any method, such as the ToF (Time of Flight) method. In the ToF method, the ranging device 10 measures the distance to the object by directly measuring the time from when the electromagnetic wave is emitted to when the reflected wave of the emitted electromagnetic wave is detected.
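As a quick aside, the direct-ToF relationship described above is simple enough to state in code. The following is a minimal sketch under the stated definition (distance = speed of light × round-trip time / 2), not an excerpt of the patent's implementation:

```python
# Minimal direct-ToF sketch: distance from the round-trip time of the emitted
# electromagnetic wave (illustrative only, not the patent's implementation).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the object, given the time from emission of the
    electromagnetic wave to detection of its reflected wave."""
    return C * round_trip_time_s / 2.0  # the wave travels out and back

print(tof_distance(66.7e-9))  # ~10 m for a 66.7 ns round trip
```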
  • the ranging device 10 measures the distance to an object and obtains data on the intensity of the reflected wave reflected back by the object.
  • the intensity of the reflected wave may be a value based on the intensity of the optical signal input to the detection unit.
  • the ranging device 10 measures the distance to an object by scanning real space with electromagnetic waves and detecting the reflected waves that are reflected by objects, and obtains data on the intensity of the reflected wave reflected back by the object.
  • the range in real space that the ranging device 10 scans with electromagnetic waves is also referred to as the "scanning range.”
  • the scanning range includes multiple irradiation ranges.
  • the irradiation range is the range that is irradiated with electromagnetic waves when the ranging device 10 irradiates electromagnetic waves in one direction in real space.
  • The irradiation range corresponds to the angle θ shown in Figures 7 and 8 described below.
  • the ranging device 10 detects the reflected wave of the electromagnetic wave irradiated to one irradiation range as a reflected wave arriving from one direction.
  • one irradiation range corresponds to one detection direction in which the ranging device 10 detects the electromagnetic wave.
  • the ranging device 10 scans the scanning range with electromagnetic waves and acquires data on the distance from the ranging device 10 to the object on which the reflected wave is reflected for each irradiation range.
  • The acquired distance data is also referred to as the "distance measuring image."
  • The distance measuring image includes multiple pixels. One pixel of the distance measuring image corresponds to one irradiation range of the scanning range. Additionally, each pixel in the distance measuring image is associated with the distance data and the reflected-wave intensity data acquired in the corresponding irradiation range.
  • the distance measuring device 10 acquires data on the distance to the object and data on the intensity of the reflected wave for each irradiation range, as shown in Figures 4 to 6 described below.
  • the distance measuring device 10 transmits the acquired data on the distance to the object and data on the intensity of the reflected wave for each irradiation range to the control device 30.
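One way to picture the data transmitted per irradiation range is a small record holding the sampled distances and the corresponding reflected-wave intensities. The field names below are illustrative assumptions, not the patent's data format:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class IrradiationRangeData:
    """Illustrative record for one irradiation range, i.e., one pixel of the
    distance measuring image (field names are assumptions)."""
    row: int                 # pixel position in the distance measuring image
    col: int
    distances_m: np.ndarray  # sampled distances along the detection direction
    intensities: np.ndarray  # reflected-wave intensity at each sampled distance
```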
  • the imaging device 20 is configured to include an imaging optical system and an imaging element.
  • the imaging device 20 generates an image by capturing an image of the real space around the imaging device 20.
  • the captured image includes a plurality of pixels.
  • the range in which the imaging device 20 captures the real space is also referred to as the "imaging range.” At least a portion of the imaging range of the imaging device 20 overlaps with at least a portion of the scanning range of the distance measuring device 10.
  • the imaging device 20 may generate an image at any frame rate.
  • the imaging device 20 transmits data of the generated captured image to the control device 30.
  • the captured image of the imaging device 20 and each irradiation range of the scanning range of the distance measuring device 10 are associated in advance.
  • one pixel of the distance measuring image corresponds to one irradiation range of the scanning range. Therefore, the coordinate system set for the captured image and the coordinate system set for the distance measuring image may be associated in advance. By associating these two coordinate systems in advance, it is possible to associate the captured image of the imaging device 20 with each irradiation range of the scanning range of the distance measuring device 10 in advance.
  • each pixel of the captured image of the imaging device 20 and each pixel of the distance measuring image of the distance measuring device 10 may be associated in advance. By associating each pixel of the captured image with each pixel of the distance measuring image in advance, it is possible to associate the captured image of the imaging device 20 with each irradiation range of the distance measuring device 10 in advance.
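For intuition, if the two images are aligned and the captured image is a uniform integer multiple of the distance measuring image in resolution, the pre-association can be expressed as a simple block mapping. Both assumptions are simplifications; real systems calibrate this correspondence:

```python
def ranging_pixel_to_image_block(row: int, col: int, scale_y: int, scale_x: int):
    """Map one pixel of the distance measuring image to the block of
    captured-image pixels it covers, assuming aligned images and a uniform
    integer scale factor (a simplifying assumption)."""
    return (slice(row * scale_y, (row + 1) * scale_y),
            slice(col * scale_x, (col + 1) * scale_x))

# e.g. ranging pixel (2, 3) covers captured-image rows 16-23, cols 24-31
block = ranging_pixel_to_image_block(2, 3, scale_y=8, scale_x=8)
```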
  • the optical axis of the imaging device 20 and the optical axis of the distance measuring device 10 may coincide.
  • the optical axis of the imaging optical system of the imaging device 20 and the optical axis of the LiDAR optical system of the distance measuring device 10 may coincide.
  • a part of the imaging optical system of the imaging device 20 and a part of the LiDAR optical system of the distance measuring device 10 may be common.
  • the imaging device 20 and the distance measuring device 10 may be configured as a single device.
  • the imaging device 20 and the distance measuring device 10 may be configured as described in JP 2018-132384 A.
  • the optical axis of the imaging device 20 and the optical axis of the distance measuring device 10 do not have to coincide.
  • the control device 30 is, for example, a computer.
  • the control device 30 may be any computer, such as a general-purpose computer, a workstation, or a dedicated computer.
  • the control device 30 includes a communication unit 31, an output unit 32, a memory unit 33, and a control unit 34.
  • the communication unit 31 includes at least one communication module capable of communicating with the distance measuring device 10 and the imaging device 20 via a communication line.
  • the communication module is a communication module that complies with the standard of the communication line.
  • the communication line includes at least one of a wired line and a wireless line.
  • the communication unit 31 may be configured to include at least one communication module capable of communicating with other systems of the mobile body 2.
  • the communication module is a communication module compatible with a communication standard between the control device 30 and other systems.
  • the communication standard is, for example, a standard such as CAN (Controller Area Network).
  • the output unit 32 is capable of outputting data.
  • the output unit 32 includes at least one output interface capable of outputting data.
  • the output interface is, for example, a display. If the mobile object 2 is a vehicle, the display may be disposed on the dashboard of the vehicle.
  • the output unit 32 may include a device capable of outputting information to an external storage medium.
  • the memory unit 33 is configured to include at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or a combination of at least two of these.
  • the semiconductor memory is, for example, a RAM (Random Access Memory) or a ROM (Read Only Memory).
  • the RAM is, for example, an SRAM (Static Random Access Memory) or a DRAM (Dynamic Random Access Memory).
  • the ROM is, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory).
  • the memory unit 33 may function as a main memory device, an auxiliary memory device, or a cache memory.
  • the memory unit 33 stores data used in the operation of the control device 30 and data obtained by the operation of the control device 30.
  • the control unit 34 is configured to include at least one processor, at least one dedicated circuit, or a combination of these.
  • the processor is, for example, a general-purpose processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), or a dedicated processor specialized for a specific process.
  • the dedicated circuit is, for example, an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit), etc.
  • The control unit 34 controls each part of the control device 30 and executes processes related to the operation of the control device 30.
  • the control unit 34 receives data on the distance to the object and data on the intensity of the reflected wave for each irradiation range of the distance measuring device 10 from the distance measuring device 10 via the communication unit 31.
  • the control unit 34 also receives captured image data from the imaging device 20 via the communication unit 31.
  • the control unit 34 receives data of a captured image 40 as shown in FIG. 3.
  • The captured image 40 includes a tire 41 and a road surface 42 as objects.
  • an object included in the captured image means an object depicted in the captured image.
  • the imaging range of the imaging device 20 is the same as the scanning range of the distance measuring device 10.
  • the position of each irradiation range of the scanning range of the distance measuring device 10 is indicated by a dashed line.
  • one irradiation range corresponds to one pixel of the distance measuring image. Therefore, as can be seen from the position of the dashed line, the size of the pixels of the captured image 40 is smaller than the size of the pixels of the distance measuring image. In other words, the resolution of the captured image 40 is higher than the resolution of the distance measuring image.
  • The control unit 34 receives data on the distance to an object and data on the intensity of the reflected wave as shown in Figures 4 to 6.
  • In Figures 4 to 6, the horizontal axis indicates the distance to the object, and the vertical axis indicates the intensity of the reflected wave.
  • the data shown in Figure 4 is data on the distance to an object in the irradiation range 43 shown in Figure 3 and data on the intensity of the reflected wave.
  • the portion of the captured image 40 corresponding to the irradiation range 43 includes the tire 41 and the road surface 42.
  • the data on the intensity of the reflected wave in the irradiation range 43 includes data based on the reflected wave from the tire 41 (first reflected wave) and the reflected wave from the road surface 42 (second reflected wave). Therefore, the data on the intensity of the reflected wave shown in Figure 4 includes an intensity peak P1 caused by the reflected wave of the electromagnetic wave reflected by the tire 41, and an intensity peak P2 caused by the reflected wave of the electromagnetic wave reflected by the road surface 42.
  • the intensity peaks P1 and P2 can be said to be multiple intensity peaks with different detection times detected from reflected waves received during a specified period.
  • The specified period is, for example, the interval within one frame (the period during which the electromagnetic wave is irradiated over the entire scanning range) from when irradiation of the electromagnetic wave onto the irradiation range 43 starts to when it ends, or a period shorter than that interval.
  • the data shown in FIG. 5 is data on the distance to an object in the irradiation range 44 shown in FIG. 3 and data on the intensity of the reflected wave.
  • the portion of the captured image 40 that corresponds to the irradiation range 44 includes the tire 41 but does not include the road surface 42.
  • the data on the intensity of the reflected wave in the irradiation range 44 includes data based on the reflected wave from the tire 41. Therefore, the data on the intensity of the reflected wave shown in FIG. 5 includes an intensity peak P3 that occurs due to the reflected wave when the electromagnetic wave is reflected by the tire 41.
  • the data shown in FIG. 6 is data on the distance to an object in the irradiation range 45 shown in FIG. 3 and data on the intensity of the reflected wave.
  • the portion of the captured image 40 that corresponds to the irradiation range 45 includes the road surface 42 but does not include the tires 41.
  • the data on the intensity of the reflected wave in the irradiation range 45 includes data based on the wave reflected from the road surface 42. Therefore, the data on the intensity of the reflected wave shown in FIG. 6 includes an intensity peak P4 that occurs due to the electromagnetic wave being reflected by the road surface 42.
  • the distance measuring device 10 detects the reflected wave of the electromagnetic wave irradiated to one irradiation range as a reflected wave arriving from one direction.
  • one irradiation range corresponds to one detection direction in which the distance measuring device 10 detects the reflected wave of the electromagnetic wave.
  • FIG. 7 shows detection direction d1 corresponding to the irradiation range 43 shown in FIG. 3.
  • the configuration shown in FIG. 7 corresponds to a top view of the configuration shown in FIG. 1.
  • The angle θ corresponds to one irradiation range.
  • the detection direction d1 corresponding to the irradiation range 43 is the direction from the distance measuring device 10 to the center of the irradiation range 43.
  • the detection direction in which the ranging device 10 detects the reflected wave is regarded as the direction of arrival of the reflected wave reflected by the object.
  • this detection direction is treated as the direction from the ranging device 10 to the object included in the irradiation range corresponding to the detection direction. Therefore, when data of multiple objects is included in one irradiation range, the direction from the ranging device 10 to these multiple objects is treated as the same direction.
  • the irradiation range 43 includes data of the tire 41 and the road surface 42. Therefore, in the irradiation range 43, the direction from the ranging device 10 to each of the tire 41 and the road surface 42 is treated as the same detection direction d1.
  • the actual direction from the ranging device 10 to each of the tire 41 and the road surface 42 is different.
  • the part of the captured image 40 corresponding to the irradiation range 43 includes the tire 41 and the road surface 42.
  • In the data on the intensity of the reflected wave in the irradiation range 43 shown in FIG. 4, only one object (here, the tire 41) is treated as being present at the distance corresponding to the intensity peak P1, where the reflected wave has the greatest intensity.
  • When the control unit 34 determines that data of multiple objects is included in one irradiation range, it executes the following process to identify the direction of arrival of each of the multiple reflected waves reflected by those objects. By identifying the direction of arrival of each of the multiple reflected waves, the control unit 34 can determine the position of each of the multiple objects in real space.
  • the control unit 34 determines whether multiple intensity peaks are detected from the data on the intensity of the reflected wave in one irradiation range of the distance measuring device 10.
  • the control unit 34 may detect intensity peaks that exceed an intensity threshold.
  • the intensity threshold may be set assuming the amount of noise in the reflected wave. For example, in the case of data on the reflected wave in the irradiation range 43 shown in FIG. 4, the control unit 34 determines that two intensity peaks, i.e., intensity peaks P1 and P2, are detected from the data on the intensity of the reflected wave. If the control unit 34 determines that multiple intensity peaks are detected from the data on the intensity of the reflected wave in one irradiation range, it determines that one irradiation range includes data on multiple objects.
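A minimal sketch of this multi-peak check, assuming the per-irradiation-range intensity data is a sampled 1-D array and using SciPy's peak finder with the noise threshold described above:

```python
from scipy.signal import find_peaks

def detect_intensity_peaks(intensities, distances_m, threshold):
    """Return (distance, intensity) for each local intensity maximum above the
    noise threshold. Two or more peaks (e.g. P1 and P2 in irradiation range 43)
    suggest that data of multiple objects is included in the range."""
    indices, _ = find_peaks(intensities, height=threshold)
    return [(distances_m[i], intensities[i]) for i in indices]

# multiple objects present if len(detect_intensity_peaks(...)) > 1
```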
  • When the control unit 34 determines that one irradiation range contains data of multiple objects, it determines the direction of arrival of each of the multiple reflected waves corresponding to each of the multiple intensity peaks based on the captured image. As described above with reference to FIG. 3, the size of a pixel in the captured image is smaller than the size of a pixel in the distance measuring image. Therefore, the control unit 34 determines the direction of arrival of each of the multiple reflected waves in one irradiation range by using information about the pixels of the captured image. The process of determining the direction of arrival is described below.
  • an irradiation range determined to contain data of multiple objects is also referred to as a "first irradiation range.”
  • irradiation range 43 is the first irradiation range.
  • The control unit 34 divides the captured image into a plurality of image regions by applying a bilateral filter and superpixel processing to the captured image.
  • the image regions can be regions in which pixels with similar characteristics are arranged consecutively.
  • the characteristics are, for example, the luminance or color of the pixels.
  • the image regions are composed of pixels with similar luminance information or pixels with similar color information.
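As one possible realization of this division, OpenCV's bilateral filter can smooth texture while preserving edges, and SLIC superpixels from scikit-image can then group neighboring pixels with similar luminance/color. The parameter values are placeholders, not values from the patent:

```python
import cv2
from skimage.segmentation import slic

def segment_captured_image(image_bgr):
    """Divide the captured image into image regions of pixels with similar
    features (parameters are illustrative placeholders)."""
    smoothed = cv2.bilateralFilter(image_bgr, d=9, sigmaColor=75, sigmaSpace=75)
    labels = slic(smoothed, n_segments=200, compactness=10, start_label=0)
    return labels  # per-pixel region id
```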
  • The portion of the captured image that corresponds to the first irradiation range is also referred to as the "first partial image."
  • In FIG. 3, the first partial image is the portion of the captured image 40 that corresponds to the irradiation range 43.
  • The first partial image corresponding to the irradiation range 43 includes at least a portion of the image region that corresponds to the tire 41 and a portion of the image region that corresponds to the road surface 42.
  • the control unit 34 uses the above-mentioned features of the captured image and the data on the multiple intensity peaks and distance of the reflected wave in the first irradiation range to associate the image area corresponding to the object with one of the multiple intensity peaks of the reflected wave.
  • The other irradiation ranges adjacent to the first irradiation range are also referred to as "second irradiation ranges."
  • The part of the captured image corresponding to a second irradiation range is also referred to as a "second partial image."
  • In FIG. 3, the second irradiation ranges are the irradiation ranges 44, 45, 46, 47, 48, 49, 50, and 51.
  • Irradiation ranges 44 to 51 are the irradiation ranges that surround the irradiation range 43.
  • The second partial images are the parts of the captured image 40 that correspond to each of the irradiation ranges 44 to 51.
  • the control unit 34 determines whether the second partial image includes the first image area.
  • the first image area is an image area corresponding to the first object among the multiple image areas after division.
  • The first object is any one of the multiple objects included in the first irradiation range, i.e., in the first partial image.
  • In FIG. 3, the first object is the tire 41.
  • The control unit 34 determines whether the second partial image corresponding to each of the irradiation ranges 44 to 51 includes the first image area corresponding to the tire 41.
  • the control unit 34 may determine whether the second partial image includes the first image area by determining whether the characteristics of the pixels included in the second partial image are the same as the characteristics of the pixels corresponding to the first object.
  • the control unit 34 may consider the characteristics of the pixels included in the second partial image to be the same as the characteristics of the pixels corresponding to the first object when the characteristics of the pixels included in the second partial image and the characteristics of the pixels corresponding to the first object are similar within a predetermined range. For example, when the control unit 34 uses pixel brightness as a feature, the control unit 34 may determine that the features of the plurality of pixels are similar when the difference between values indicating brightness information of the plurality of pixels is within a predetermined range of values.
  • Similarly, the control unit 34 may determine that the features of the plurality of pixels are similar when the difference between values indicating color information of the plurality of pixels (e.g., RGB values) is within a predetermined range of values.
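The tolerance test on pixel features might look like the following sketch; the tolerance value is an illustrative assumption:

```python
import numpy as np

def features_match(pixel_a, pixel_b, tol=20):
    """Treat two pixels as having the same feature when their luminance or
    color values (e.g. RGB) differ by at most a tolerance (illustrative)."""
    a = np.asarray(pixel_a, dtype=float)
    b = np.asarray(pixel_b, dtype=float)
    return bool(np.all(np.abs(a - b) <= tol))
```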
  • When the control unit 34 determines that the feature of a pixel included in the second partial image is the same as the feature of a pixel corresponding to the first object, it determines that the second partial image includes the first image area. For example, in FIG. 3, the control unit 34 determines that the second partial images corresponding to each of the irradiation ranges 44, 46, and 49 include the first image area corresponding to the tire 41.
  • When the control unit 34 determines that a second partial image includes the first image area, it acquires data on the reflected wave in the second irradiation range corresponding to that second partial image.
  • the control unit 34 compares the intensity peak of the acquired reflected wave with each of the multiple intensity peaks of the reflected wave in the first irradiation range, thereby associating the first image area with any of the multiple intensity peaks of the reflected wave in the first irradiation range. For example, in FIG. 3, the control unit 34 determines that the second partial image corresponding to the irradiation range 44 includes the first image area corresponding to the tire 41. In this case, the control unit 34 acquires data on the reflected wave in the irradiation range 44 as shown in FIG. 5.
  • the control unit 34 compares the intensity peak P3 of the reflected wave as shown in FIG. 5 with each of the multiple intensity peaks of the reflected wave in the irradiation range 43 as shown in FIG. 4, thereby associating the first image area corresponding to the tire 41 with the intensity peak P1.
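A sketch of this association step: the lone peak found in the adjacent second irradiation range (e.g. P3) is matched to whichever of the first range's peaks it most resembles. Comparing peak intensities is one plausible reading of the comparison; peaks are the (distance, intensity) pairs produced by the peak-detection sketch above:

```python
def associate_peak(neighbor_peak, first_range_peaks):
    """Return the index of the first-range peak (e.g. P1 or P2) whose intensity
    is closest to the neighbor range's peak (e.g. P3)."""
    return min(range(len(first_range_peaks)),
               key=lambda i: abs(first_range_peaks[i][1] - neighbor_peak[1]))
```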
  • In the same or a similar manner as for the first object, the control unit 34 determines whether or not the second partial image includes a second image area.
  • The second image area is the image area that corresponds to the second object among the multiple image areas after division.
  • The second object is any one of the multiple objects included in the first irradiation range, i.e., in the first partial image.
  • The second object is an object different from the first object.
  • In FIG. 3, the second object is the road surface 42.
  • When the control unit 34 determines that the second partial image includes the second image area, it acquires data (second data) of the reflected wave in the second irradiation range corresponding to that second partial image.
  • The control unit 34 compares the intensity peak of the acquired reflected wave with each of the multiple intensity peaks of the reflected wave in the first irradiation range, thereby associating the second image area with one of the multiple intensity peaks of the reflected wave. For example, in FIG. 3, the control unit 34 determines that the second partial image corresponding to the irradiation range 45 includes the second image area corresponding to the road surface 42.
  • In this case, the control unit 34 acquires data of the reflected wave in the irradiation range 45 as shown in FIG. 6.
  • the control unit 34 compares the intensity peak P4 of the reflected wave as shown in FIG. 6 with each of the multiple intensity peaks of the reflected wave in the irradiation range 43 as shown in FIG. 4, thereby associating the second image area corresponding to the road surface 42 with the intensity peak P2.
  • the control unit 34 repeatedly executes the above-mentioned process for each object included in the first irradiation range, thereby associating the image area corresponding to each object with one of the multiple intensity peaks of the reflected wave in the first irradiation range.
  • After making the association, the control unit 34 identifies the direction of light traveling from a predetermined position included in the image area corresponding to the object toward the imaging device 20.
  • the predetermined position included in this image area is, for example, a position corresponding to the center of the image area included in the first partial image.
  • the control unit 34 identifies the identified light direction as the direction of arrival of the reflected wave whose intensity peak is associated with the image area. In other words, the control unit 34 identifies the identified light direction as the direction of arrival of the reflected wave reflected by an object included in the first irradiation range.
  • the control unit 34 identifies a light direction d2 that travels from a predetermined position included in the first image area corresponding to the tire 41 toward the imaging device 20.
  • the control unit 34 identifies the light direction d2 as the arrival direction of the reflected wave reflected by the tire 41.
  • the control unit 34 also identifies a light direction d3 that travels from a predetermined position included in the second image area corresponding to the road surface 42 toward the imaging device 20.
  • the control unit 34 identifies the light direction d3 as the arrival direction of the reflected wave reflected by the road surface 42.
  • the control unit 34 may specify the position of each object in the real space.
  • the control unit 34 may specify the position of each object included in the first irradiation range in the three-dimensional real space by the data of the distance corresponding to each intensity peak in the first irradiation range and the direction of arrival of the reflected wave corresponding to each intensity peak.
  • For example, the control unit 34 specifies the position of the tire 41 in the three-dimensional real space for the irradiation range 43 by the data of the distance D1 corresponding to the intensity peak P1 shown in FIG. 4 and the light direction d2 shown in FIG. 8 specified as the direction of arrival.
  • the control unit 34 also specifies the position of the road surface 42 in the three-dimensional real space by the data of the distance D2 corresponding to the intensity peak P2 shown in FIG. 4 and the light direction d3 shown in FIG. 8 specified as the direction of arrival.
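Once a distance and an arrival direction are paired, the position in three-dimensional real space follows from a spherical-to-Cartesian conversion. The azimuth/elevation convention below is an assumption about how a direction such as d2 or d3 would be parameterized:

```python
import math

def position_from_range_and_direction(distance_m, azimuth_rad, elevation_rad):
    """3-D position of an object (e.g. the tire 41 at distance D1 along
    direction d2) from its measured distance and arrival direction, using an
    assumed azimuth/elevation convention seen from the sensor."""
    x = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = distance_m * math.sin(elevation_rad)
    z = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    return (x, y, z)
```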
  • the control unit 34 may generate point cloud data based on the positions of the identified objects in three-dimensional real space.
  • the point cloud data is data that indicates the positions in three-dimensional space of objects that exist around the distance measuring device 10.
  • the control unit 34 may display the generated point cloud data on the display of the output unit 32, or may output the generated point cloud data to an external storage medium by the output unit 32.
  • the control unit 34 may transmit the generated point cloud data to another system of the mobile body 2 by the communication unit 31.
  • the process of associating the image area corresponding to the object with any one of the multiple intensity peaks in the first irradiation range is not limited to the above example. Other examples of this association process are described below.
  • the control unit 34 determines whether the first image area includes the second partial image. For example, in FIG. 3, the control unit 34 determines whether the first image area corresponding to the tire 41 includes the second partial images corresponding to each of the illumination ranges 44 to 51.
  • the control unit 34 determines that the first image area includes the second partial image, it acquires distance data of the second irradiation range corresponding to the second partial image.
  • the control unit 34 compares the acquired distance data of the second irradiation range with multiple distance data corresponding to each of the multiple intensity peaks of the reflected wave in the first irradiation range. By comparing these distance data, the control unit 34 associates the first image area with any of the multiple intensity peaks of the reflected wave in the first irradiation range. For example, in FIG. 3, the control unit 34 determines that the first image area corresponding to the tire 41 includes the second partial image corresponding to the irradiation range 44.
  • In this case, the control unit 34 compares the distance data corresponding to the intensity peak P3 in the irradiation range 44 as shown in FIG. 5 with the distance data corresponding to each of the multiple intensity peaks as shown in FIG. 4. By comparing these distance data, the control unit 34 associates the first image area corresponding to the tire 41 with the intensity peak P1.
  • Similarly, the control unit 34 determines whether the second image area includes the second partial image. In the same or a similar manner as for the first object, when the control unit 34 determines that the second image area includes the second partial image, it associates the second image area with one of the multiple intensity peaks of the reflected wave in the first irradiation range. For example, in FIG. 3, the control unit 34 determines that the second image area corresponding to the road surface 42 includes the second partial image corresponding to the irradiation range 45. In this case, the control unit 34 compares the distance data corresponding to the intensity peak P4 in the irradiation range 45 as shown in FIG. 6 with the distance data corresponding to each of the multiple intensity peaks as shown in FIG. 4. By comparing these data, the control unit 34 associates the second image area corresponding to the road surface 42 with the intensity peak P2.
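The distance-based variant of the association matches on range rather than intensity; a minimal sketch under the same assumptions as the earlier one:

```python
def associate_peak_by_distance(neighbor_peak, first_range_peaks):
    """Return the index of the first-range peak whose distance data is closest
    to the neighbor range's peak (e.g. matching P4 at the road surface to P2)."""
    return min(range(len(first_range_peaks)),
               key=lambda i: abs(first_range_peaks[i][0] - neighbor_peak[0]))
```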
  • the control unit 34 repeatedly performs the above-mentioned process for each object included in the first irradiation range, thereby associating the image area corresponding to each object with one of the multiple intensity peaks of the reflected wave in the first irradiation range.
  • the control unit 34 may execute an image recognition process on the captured image.
  • the control unit 34 may execute the image recognition process to identify the front-to-back relationship of multiple objects included in the first partial image.
  • "front" means the one closer to the distance measuring device 10.
  • the control unit 34 executes the image recognition process to identify that the first partial image corresponding to the illumination range 43 includes the tire 41 and a part of the road surface 42.
  • the control unit 34 specifies that the tire 41 is in front of the road surface 42 as viewed from the distance measuring device 10.
  • The control unit 34 may execute the image recognition process to identify an image area corresponding to each object included in the first irradiation range.
  • the control unit 34 associates the image area corresponding to each object with one of the multiple intensity peaks of the reflected wave based on the front-to-back relationship of the identified multiple objects and the magnitude relationship of the multiple distance data corresponding to each of the multiple intensity peaks of the reflected wave in the first irradiation range. For example, in FIG. 3, the control unit 34 associates intensity peak P1 with the first image area corresponding to the tire 41 based on the front-to-back relationship of the tire 41 and the road surface 42 and the magnitude relationship of the distance data corresponding to each intensity peak of the reflected wave as shown in FIG. 4. The control unit 34 also associates intensity peak P2 with the second image area corresponding to the road surface 42.
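When image recognition supplies a front-to-back ordering, the association reduces to sorting the peaks by their distance data and pairing them with the objects in that order. This sketch assumes one peak per recognized object in the irradiation range:

```python
def associate_by_depth_order(objects_front_to_back, peaks):
    """Pair objects ordered front-to-back (e.g. ["tire", "road surface"]) with
    intensity peaks sorted nearest-first by distance (e.g. P1, then P2)."""
    ordered_peaks = sorted(peaks, key=lambda p: p[0])  # (distance, intensity)
    return dict(zip(objects_front_to_back, ordered_peaks))
```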
  • When the control unit 34 is unable to identify the direction of arrival of each of the multiple reflected waves in the first irradiation range based on one of the multiple features described above (e.g., luminance), it may identify the direction of arrival of each of the multiple reflected waves based on another of the multiple features (e.g., color).
  • Furthermore, when the control unit 34 is unable to identify the direction of arrival of each of the multiple reflected waves using any or all of the multiple features described above, it may execute the image recognition process described above.
  • The control unit 34 may also identify the direction of arrival of each of the multiple reflected waves in the first irradiation range based on the above-mentioned features alone.
  • FIG. 9 is a flowchart showing an example procedure of an object detection process according to an embodiment of the present disclosure.
  • the object detection process shown in FIG. 9 corresponds to an example of a control method according to this embodiment.
  • The control unit 34 starts the object detection process, for example, when the moving object 2 starts moving.
  • the control unit 34 may detect that the moving object 2 has started moving by communicating with another system of the moving object 2 via the communication unit 31.
  • the control unit 34 receives data on the distance to the object for each irradiation range and data on the intensity of the reflected wave from the distance measuring device 10 via the communication unit 31 (step S1).
  • the control unit 34 also receives data on the captured image from the imaging device 20 via the communication unit 31 (step S2).
  • the control unit 34 determines whether or not multiple intensity peaks are detected from the data on the intensity of the reflected waves in one irradiation range (step S3). If the control unit 34 determines that multiple intensity peaks are detected from the data on the intensity of the reflected waves in one irradiation range (step S3: YES), it determines that data on multiple objects is included in one irradiation range, and proceeds to processing in step S4. On the other hand, if the control unit 34 does not determine that multiple intensity peaks are detected from the data on the intensity of the reflected waves in one irradiation range (step S3: NO), it does not determine that data on multiple objects is included in one irradiation range, and proceeds to processing in step S5.
  • In step S4, the control unit 34 identifies the direction of arrival of each of the multiple reflected waves corresponding to each of the multiple intensity peaks, based on the captured image received in the process of step S2.
  • In step S5, the control unit 34 determines whether or not the process of step S3 has been performed for all irradiation ranges received in the process of step S1. If the control unit 34 determines that the process of step S3 has been performed for all irradiation ranges (step S5: YES), the control unit 34 proceeds to the process of step S6. On the other hand, if not (step S5: NO), the control unit 34 returns to the process of step S3.
  • In step S6, the control unit 34 generates point cloud data based on the distance image data received in step S1 and the directions of arrival identified in step S4.
  • After the process of step S6, the control unit 34 returns to the process of step S1.
  • the control unit 34 ends the object detection processing, for example, when the moving object 2 stops.
  • the control unit 34 may detect that the moving object 2 has stopped moving by communicating with other systems of the moving object 2 via the communication unit 31.
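Putting the flowchart of FIG. 9 together, one per-frame pass might look like the sketch below. It reuses the illustrative helpers from the earlier sketches; `resolve_arrival_directions` and `detection_direction_of` are unspecified stand-ins for the direction-of-arrival processing and the per-range detection direction described above:

```python
def object_detection_frame(ranging_frames, captured_image, threshold):
    """One pass of the FIG. 9 flow (S1-S6), using the illustrative helpers
    defined in the earlier sketches (an assumption-laden outline, not the
    patent's implementation)."""
    labels = segment_captured_image(captured_image)        # image regions
    points = []
    for rng in ranging_frames:                             # S3: each range
        peaks = detect_intensity_peaks(rng.intensities, rng.distances_m,
                                       threshold)
        if len(peaks) > 1:                                 # S4: multiple objects
            directions = resolve_arrival_directions(rng, peaks, labels)
        else:                                              # single object
            directions = [detection_direction_of(rng)] * len(peaks)
        for (dist, _), (az, el) in zip(peaks, directions):
            points.append(position_from_range_and_direction(dist, az, el))
    return points                                          # S6: point cloud
```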
  • the control unit 34 identifies the positions of the first object and the second object in real space based on the first reflected wave and the second reflected wave and the captured image including the first object and the second object.
  • The first reflected wave and the second reflected wave are the waves of an electromagnetic wave radiated in a first direction in real space (for example, to one irradiation range) reflected by the first object and the second object, respectively.
  • For example, the first reflected wave is the electromagnetic wave irradiated to the irradiation range 43 as shown in FIG. 3 and reflected by the tire 41.
  • The second reflected wave is the electromagnetic wave irradiated to the irradiation range 43 as shown in FIG. 3 and reflected by the road surface 42.
  • The distance measuring device 10 can measure the distance to an object with high accuracy compared to, for example, a stereo camera.
  • On the other hand, the resolution of the distance measuring image generated by the distance measuring device 10 is often lower than the resolution of the distance image generated by a stereo camera or the like.
  • Even in such a case, the control unit 34 can identify the positions of each of the multiple objects in real space as described above. As a result, in this embodiment, objects can be detected with high accuracy.
  • the control unit 34 may identify the positions of the first object and the second object in real space based on the correspondence between the first and second regions and the first and second reflected waves.
  • the first region is a region including an image of the first object in the captured image.
  • the first region is a region including an image of the tire 41 in the captured image 40 as shown in FIG. 3.
  • the second region is a region including an image of the second object in the captured image.
  • the second region is a region including an image of the road surface 42 in the captured image 40 as shown in FIG. 3.
  • the control unit 34 may perform the above-mentioned processing to identify the correspondence between the first and second regions and the first and second reflected waves based on the intensity peaks of the first and second reflected waves detected from the data on the intensity of the reflected waves.
  • The control unit 34 may identify the correspondence between the first and second regions and the first and second reflected waves by associating each of the first and second regions with either of the intensity peaks of the first and second reflected waves detected from the data on the intensity of the reflected waves.
  • the control unit 34 may identify a real space direction corresponding to a first region for the distance measuring device 10 and a real space direction corresponding to a second region for the distance measuring device 10.
  • the real space direction corresponding to the first region for the distance measuring device 10 is, for example, a light direction d2 as shown in FIG. 8.
  • the real space direction corresponding to the second region for the distance measuring device 10 is, for example, a light direction d3 as shown in FIG. 8.
  • the control unit 34 may identify the positions of the first object and the second object in real space based on the identified real space directions corresponding to the first region and the second region, and data on the distances to the first object and the second object. With this configuration, the positions of the objects can be identified with high accuracy.
  • The control unit 34 may identify the positional relationship of the first object and the second object relative to the distance measuring device 10 by performing image recognition processing on the captured image.
  • the control unit 34 may identify the front-to-back relationship of the multiple objects as the positional relationship of the multiple objects relative to the distance measuring device 10, as described above.
  • the control unit 34 may identify the correspondence between the first and second regions and the first and second reflected waves based on the identified positional relationship of the first and second objects. With this configuration, the position of the object can be identified efficiently.
  • the control unit 34 may identify the positions of the first object and the second object in real space based on the first reflected wave, the second reflected wave, the third reflected wave, and the captured image.
  • the third reflected wave is the electromagnetic wave irradiated to the second irradiation range in real space and reflected by the first object.
  • For example, the second irradiation range is the irradiation range 44 shown in FIG. 3, and the third reflected wave is the electromagnetic wave irradiated to the irradiation range 44 and reflected by the tire 41.
  • the third reflected wave is the reflection corresponding to the intensity peak P3 among the data of the reflected wave in the irradiation range 44 shown in FIG. 5.
  • the control unit 34 may also identify the positions of the first object and the second object in real space based on the correspondence between the first and second regions and the first, second, and third reflected waves.
  • The control unit 34 may associate the first and second reflected waves with the first and second regions based on the third reflected wave. For example, as described above, the control unit 34 compares each of the multiple intensity peaks of the reflected wave as shown in FIG. 4 with the intensity peak of the reflected wave as shown in FIG. 5. Through such a comparison, the control unit 34 identifies the correspondence between the first reflected wave corresponding to the intensity peak P1 shown in FIG. 4 and the first region.
  • The control unit 34 may also obtain first data on the intensity of the reflected wave including the first reflected wave and the second reflected wave, and second data on the intensity of the reflected wave including the third reflected wave.
  • the first data is, for example, data on the intensity of the reflected wave in the irradiation range 43 as shown in FIG. 4.
  • the second data is, for example, data on the reflected wave in the irradiation range 44 as shown in FIG. 5.
  • The control device 30 may identify the positions of the first object and the second object in real space based on the directions of arrival of the first reflected wave, the second reflected wave, and the third reflected wave, the first partial image, the second partial image, and data on the distances to the first object and the second object.
  • the data on the distances to the first object and the second object may be based on distance data measured based on the first reflected wave, the second reflected wave, and the third reflected wave.
  • the control unit 34 may identify the direction of arrival of the third reflected wave based on the position of the scanning range in real space relative to the second irradiation range.
  • As described above, the optical axis of the imaging device 20 and the optical axis of the distance measuring device 10 may coincide.
  • An occlusion area is an area that is captured or detected in one of the captured image and the distance measuring image but not captured or detected in the other. When the two optical axes coincide, such occlusion areas are unlikely to occur.
  • Therefore, the control unit 34 can accurately identify the direction of arrival of each of the multiple reflected waves corresponding to each of the multiple intensity peaks, based on the captured image generated by the imaging device 20.
  • In each embodiment of the present disclosure, each functional unit, each means, each step, etc. can be added to other embodiments, or replaced with a functional unit, means, step, etc. of another embodiment, as long as no logical inconsistency arises.
  • Multiple functional units, means, steps, etc. can be combined into one or divided.
  • Each of the above-described embodiments of the present disclosure is not limited to being implemented faithfully as described, and may be implemented by combining features or omitting some features as appropriate.
  • the multiple objects are described as being tires 41 and road surface 42 as shown in FIG. 3.
  • the multiple objects are not limited to this.
  • The multiple objects may be any objects as long as they overlap in the captured image as seen from the detection system 1.
  • the multiple objects may be different types of objects, such as a vehicle and a pedestrian, or a vehicle and an obstacle on the road surface.
  • the multiple objects may also be the same type of object, such as multiple vehicles or multiple obstacles on the road surface.
  • the detection system 1 may be mounted not only on a moving body, but also on an observation device such as a roadside unit installed on the road surface or a surveillance camera that monitors the surroundings.
  • A control device according to an embodiment of the present disclosure identifies the positions of a first object and a second object in real space based on a first reflected wave of an electromagnetic wave radiated in a first direction in the real space reflected by the first object and a second reflected wave reflected by the second object, and a captured image of the real space including the first object and the second object.
  • The positions of the first object and the second object in the real space may be identified based on a correspondence between a first region of the captured image that includes an image of the first object and a second region of the captured image that includes an image of the second object, and the first reflected wave and the second reflected wave.
  • The control device may perform an image recognition process on the captured image to identify a positional relationship of the first object and the second object with respect to the distance measuring device, and may identify the correspondence relationship based on the positional relationship.
  • Any one of the control devices according to (1) to (4) above may acquire data on the distance to an object and the intensity data from a distance measuring device that measures the distance to the object by emitting an electromagnetic wave into real space and detecting the wave reflected by the object.
  • The positions of the first object and the second object in real space may be identified based on the direction in real space corresponding to the first region and the direction in real space corresponding to the second region relative to the distance measuring device, and the data on the distances to the first object and the second object measured based on the first reflected wave and the second reflected wave.
  • A control device according to another embodiment of the present disclosure identifies the positions of the first object and the second object in the real space based on a first reflected wave formed by reflection by a first object and a second reflected wave formed by reflection by a second object of an electromagnetic wave irradiated to a first irradiation range in the real space, a third reflected wave formed by reflection by the first object of an electromagnetic wave irradiated to a second irradiation range in the real space, and an image generated by imaging the real space including the first irradiation range and the second irradiation range.
  • The positions of the first object and the second object in the real space may be identified based on a correspondence between a first region of the captured image that includes an image of the first object and a second region of the captured image that includes an image of the second object, and the first reflected wave, the second reflected wave, and the third reflected wave.
  • A correspondence relationship between the first reflected wave and the second reflected wave and the first region and the second region may be identified based on the third reflected wave.
  • In any one of the control devices according to (6) to (8) above, among the first reflected wave, the second reflected wave, and the third reflected wave, the first reflected wave and the third reflected wave whose intensity peaks differ within a predetermined range may be associated as reflected waves from the first object.
  • A control method according to an embodiment of the present disclosure includes identifying the positions of a first object and a second object in real space based on a first reflected wave of an electromagnetic wave radiated in a first direction in the real space reflected by the first object and a second reflected wave reflected by the second object, and a captured image of the real space including the first object and the second object.
  • A control method according to another embodiment of the present disclosure includes identifying the positions of the first object and the second object in the real space based on a first reflected wave formed by reflection by a first object and a second reflected wave formed by reflection by a second object of an electromagnetic wave irradiated to a first irradiation range in the real space, a third reflected wave formed by reflection by the first object of an electromagnetic wave irradiated to a second irradiation range in the real space, and an image generated by imaging the real space including the first irradiation range and the second irradiation range.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

This control device identifies the positions of a first object and a second object in a real space on the basis of: a first reflected wave and a second reflected wave, the first and second reflected waves being obtained by reflection of electromagnetic waves, emitted in a first direction of the real space, by the first object and the second object, respectively; and a captured image including the first object and the second object, obtained by imaging the real space.

Description

制御装置及び制御方法Control device and control method 関連出願へのクロスリファレンスCROSS-REFERENCE TO RELATED APPLICATIONS

 This application claims priority to Japanese Patent Application No. 2023-056518, filed in Japan on March 30, 2023, the entire disclosure of which is incorporated herein by reference.

 The present disclosure relates to a control device and a control method.

 Technology for detecting objects such as fallen objects on the road is known (see, for example, Patent Document 1). The road monitoring device described in Patent Document 1 includes an imaging unit that captures images of at least the area in front of or behind the vehicle and sequentially outputs the imaging data, and a discrimination unit that determines, based on the imaging data, whether or not a predetermined monitoring object is included in the image corresponding to the imaging data.

JP 2016-151887 A

 A control device according to an embodiment of the present disclosure identifies the positions of a first object and a second object in a real space based on a first reflected wave produced when an electromagnetic wave radiated in a first direction in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, and a captured image of the real space including the first object and the second object.

 A control device according to an embodiment of the present disclosure identifies the positions of a first object and a second object in a real space based on a first reflected wave produced when an electromagnetic wave irradiated onto a first irradiation range in the real space is reflected by the first object, a second reflected wave produced when that electromagnetic wave is reflected by the second object, a third reflected wave produced when an electromagnetic wave irradiated onto a second irradiation range in the real space is reflected by the first object, and a captured image generated by imaging the real space including the first irradiation range and the second irradiation range.

 A control method according to an embodiment of the present disclosure includes identifying the positions of a first object and a second object in a real space based on a first reflected wave produced when an electromagnetic wave radiated in a first direction in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, and a captured image of the real space including the first object and the second object.

 A control method according to an embodiment of the present disclosure includes identifying the positions of a first object and a second object in a real space based on a first reflected wave produced when an electromagnetic wave irradiated onto a first irradiation range in the real space is reflected by the first object, a second reflected wave produced when that electromagnetic wave is reflected by the second object, a third reflected wave produced when an electromagnetic wave irradiated onto a second irradiation range in the real space is reflected by the first object, and a captured image generated by imaging the real space including the first irradiation range and the second irradiation range.

FIG. 1 is a diagram showing a schematic configuration of a detection system according to an embodiment of the present disclosure.
FIG. 2 is a block diagram of the detection system shown in FIG. 1.
FIG. 3 is a diagram showing an example of a captured image.
FIG. 4 is a diagram showing an example of reflected-wave intensity data.
FIG. 5 is a diagram showing an example of reflected-wave intensity data.
FIG. 6 is a diagram showing an example of reflected-wave intensity data.
FIG. 7 is a diagram for explaining the detection direction corresponding to the irradiation range shown in FIG. 3.
FIG. 8 is a diagram for explaining ray directions in the configuration shown in FIG. 3.
FIG. 9 is a flowchart showing an example procedure of object detection processing according to an embodiment of the present disclosure.

 It is desirable to detect objects with high accuracy. According to an embodiment of the present disclosure, objects can be detected with high accuracy.

 Embodiments of the present disclosure are described below with reference to the drawings.

 (Configuration of the detection system)
 As shown in FIG. 1, a detection system 1 according to the present embodiment is mounted on a moving object 2. The moving object 2 is, for example, a vehicle, a motorcycle, or a robot. However, the detection system 1 does not have to be mounted on the moving object 2; when it is not, it may be incorporated in an electronic device such as a smartphone.

 As shown in FIGS. 1 and 2, the detection system 1 includes a distance measuring device 10, an imaging device 20, and a control device 30. The distance measuring device 10 and the imaging device 20 can communicate with the control device 30 via a communication line. The control device 30 does not have to be mounted on the moving object 2. If the control device 30 is not mounted on the moving object 2, it may be configured to communicate with the distance measuring device 10 and the imaging device 20 mounted on the moving object 2 via a network or the like.

 The distance measuring device 10 includes, for example, LiDAR (Light Detection And Ranging). The LiDAR includes, for example, a laser light source that emits electromagnetic waves, an optical system, and a detection unit that detects electromagnetic waves. The electromagnetic waves are, for example, infrared rays, visible light, ultraviolet rays, or radio waves.

 The distance measuring device 10 radiates electromagnetic waves into the real space around it. The radiated electromagnetic waves are reflected by objects and return to the distance measuring device 10 as reflected waves. The distance measuring device 10 detects, among the radiated electromagnetic waves, those that are reflected by an object and return, i.e., the reflected waves, and thereby measures the distance to the object. The distance measuring device 10 may measure the distance to the object by any method, such as the ToF (Time of Flight) method. In the ToF method, the distance measuring device 10 measures the distance to the object by directly measuring the time from when an electromagnetic wave is radiated until the reflected wave of that electromagnetic wave is detected.
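 As a minimal illustration of the ToF relationship described above (a hedged sketch, not the device's actual firmware; the function name and variables are illustrative assumptions), the round-trip time can be converted to a distance as follows:

```python
# Minimal ToF sketch: distance from the round-trip time of an electromagnetic wave.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance(round_trip_time_s: float) -> float:
    """Return the one-way distance for a measured round-trip time."""
    # The wave travels to the object and back, hence the division by two.
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a reflected wave detected 200 ns after emission
# corresponds to an object roughly 30 m away.
print(tof_distance(200e-9))  # -> 29.9792458
```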

 The distance measuring device 10 measures the distance to an object and also acquires data on the intensity of the reflected wave returned from that object. For example, when the distance measuring device 10 includes a LiDAR whose detection unit includes an element such as an APD (Avalanche Photo Diode), the intensity of the reflected wave may be a value based on the intensity of the optical signal input to the detection unit. In the present embodiment, the distance measuring device 10 scans the real space with electromagnetic waves while detecting the waves reflected by objects, thereby measuring the distance to each object and acquiring data on the intensity of the reflected wave returned from it.

 The range in real space that the distance measuring device 10 scans with electromagnetic waves is also referred to as the "scanning range." The scanning range includes a plurality of irradiation ranges. An irradiation range is the range irradiated with electromagnetic waves when the distance measuring device 10 irradiates electromagnetic waves in one direction in the real space. For example, an irradiation range corresponds to the angle θ shown in FIGS. 7 and 8 described below. The distance measuring device 10 treats the reflected wave of the electromagnetic wave irradiated onto one irradiation range as a reflected wave arriving from one direction; in other words, one irradiation range corresponds to one detection direction in which the distance measuring device 10 detects reflected waves. The distance measuring device 10 scans the scanning range with electromagnetic waves and acquires, for each irradiation range, data on the distance from the distance measuring device 10 to the object that reflected the wave. The acquired distance data is also referred to as the "ranging image." The ranging image includes a plurality of pixels, and one pixel of the ranging image corresponds to one irradiation range of the scanning range. Each pixel of the ranging image is associated with the distance data acquired for the corresponding irradiation range and with the data on the intensity of the reflected wave.
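 As a concrete picture of the data organization just described, the following is a minimal sketch under the assumption (ours, not the disclosure's) that each ranging-image pixel carries the sampled intensity-versus-distance profile of its irradiation range:

```python
from dataclasses import dataclass, field

@dataclass
class IrradiationRangeData:
    """Data held by one ranging-image pixel (= one irradiation range)."""
    row: int                                                 # pixel row in the ranging image
    col: int                                                 # pixel column in the ranging image
    distances_m: list[float] = field(default_factory=list)  # sampled distances
    intensities: list[float] = field(default_factory=list)  # intensity per sample

# A ranging image is then a grid of such entries, one per irradiation range.
ranging_image = [[IrradiationRangeData(r, c) for c in range(4)] for r in range(3)]
```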

 For each irradiation range, the distance measuring device 10 acquires data on the distance to the object and data on the intensity of the reflected wave, as shown in FIGS. 4 to 6 described below. The distance measuring device 10 transmits the acquired per-irradiation-range distance data and reflected-wave intensity data to the control device 30.

 The imaging device 20 includes an imaging optical system and an imaging element. The imaging device 20 generates a captured image by imaging the real space around it. The captured image includes a plurality of pixels. The range over which the imaging device 20 images the real space is also referred to as the "imaging range." At least a part of the imaging range of the imaging device 20 overlaps at least a part of the scanning range of the distance measuring device 10. The imaging device 20 may generate captured images at any frame rate. The imaging device 20 transmits the data of the generated captured image to the control device 30.

 The captured image of the imaging device 20 and each irradiation range of the scanning range of the distance measuring device 10 are associated in advance. For example, as described above, one pixel of the ranging image corresponds to one irradiation range of the scanning range. Therefore, the coordinate system set for the captured image and the coordinate system set for the ranging image may be associated in advance; by associating these two coordinate systems, the captured image of the imaging device 20 can be associated in advance with each irradiation range of the scanning range of the distance measuring device 10. Alternatively, each pixel of the captured image of the imaging device 20 and each pixel of the ranging image of the distance measuring device 10 may be associated in advance; by associating the pixels in advance, the captured image of the imaging device 20 can likewise be associated in advance with each irradiation range of the distance measuring device 10.
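 One way to realize this pre-association is a fixed mapping from captured-image pixels to ranging-image pixels. The sketch below assumes, purely for illustration, that the two images cover the same field of view and differ only in resolution; a real system would calibrate this mapping:

```python
def captured_to_ranging_pixel(u: int, v: int,
                              cap_w: int, cap_h: int,
                              rng_w: int, rng_h: int) -> tuple[int, int]:
    """Map a captured-image pixel (u, v) to the ranging-image pixel
    (i.e., irradiation range) that covers it, assuming aligned optics
    and a common field of view."""
    i = min(u * rng_w // cap_w, rng_w - 1)
    j = min(v * rng_h // cap_h, rng_h - 1)
    return i, j

# Example: a 1920x1080 captured image and a 192x108 ranging image;
# captured pixel (1000, 500) falls in irradiation range (100, 50).
print(captured_to_ranging_pixel(1000, 500, 1920, 1080, 192, 108))
```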

 The optical axis of the imaging device 20 and the optical axis of the distance measuring device 10 may coincide. For example, the optical axis of the imaging optical system of the imaging device 20 and the optical axis of the LiDAR optical system of the distance measuring device 10 may coincide. In this case, a part of the imaging optical system of the imaging device 20 and a part of the LiDAR optical system of the distance measuring device 10 may be shared. When a part of the imaging optical system and a part of the LiDAR optical system are shared, the imaging device 20 and the distance measuring device 10 may be configured as a single device, for example as described in JP 2018-132384 A. However, as long as the captured image of the imaging device 20 and each irradiation range of the distance measuring device 10 are associated in advance, the optical axis of the imaging device 20 and the optical axis of the distance measuring device 10 do not have to coincide.

 The control device 30 is, for example, a computer. The control device 30 may be any computer, such as a general-purpose computer, a workstation, or a dedicated computer.

 The control device 30 includes a communication unit 31, an output unit 32, a storage unit 33, and a control unit 34.

 The communication unit 31 includes at least one communication module capable of communicating with the distance measuring device 10 and the imaging device 20 via a communication line. The communication module conforms to the standard of the communication line. The communication line is wired, wireless, or a combination of the two.

 The communication unit 31 may also include at least one communication module capable of communicating with other systems of the moving object 2. That communication module conforms to the communication standard used between the control device 30 and the other systems, for example CAN (Controller Area Network).

 The output unit 32 can output data and includes at least one output interface capable of doing so. The output interface is, for example, a display. When the moving object 2 is a vehicle, the display may be disposed on the dashboard of the vehicle. The output unit 32 may also include a device capable of outputting information to an external storage medium.

 The storage unit 33 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or a combination of at least two of these. The semiconductor memory is, for example, RAM (Random Access Memory) or ROM (Read Only Memory). The RAM is, for example, SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory). The ROM is, for example, EEPROM (Electrically Erasable Programmable Read Only Memory). The storage unit 33 may function as a main storage device, an auxiliary storage device, or a cache memory. The storage unit 33 stores data used in the operation of the control device 30 and data obtained by that operation.

 The control unit 34 includes at least one processor, at least one dedicated circuit, or a combination of these. The processor is, for example, a general-purpose processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), or a dedicated processor specialized for specific processing. The dedicated circuit is, for example, an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit). The control unit 34 executes processing related to the operation of the control device 30 while controlling each part of the control device 30.

 The control unit 34 receives, from the distance measuring device 10 via the communication unit 31, the distance data and reflected-wave intensity data for each irradiation range of the distance measuring device 10. The control unit 34 also receives the captured image data from the imaging device 20 via the communication unit 31.

 For example, the control unit 34 receives data of a captured image 40 as shown in FIG. 3. The captured image 40 includes a tire 41 and a road surface 42 as objects. In the present disclosure, an object included in a captured image means an object depicted in the captured image. In the configuration shown in FIG. 3, the imaging range of the imaging device 20 is the same as the scanning range of the distance measuring device 10. In the captured image 40, the position of each irradiation range of the scanning range of the distance measuring device 10 is indicated by dashed lines. As described above, one irradiation range corresponds to one pixel of the ranging image. Therefore, as the dashed lines show, the pixel size of the captured image 40 is smaller than the pixel size of the ranging image; in other words, the resolution of the captured image 40 is higher than the resolution of the ranging image.

 For example, the control unit 34 receives data on the distance to an object and data on the intensity of the reflected wave as shown in FIGS. 4 to 6. In FIGS. 4 to 6, the horizontal axis indicates the distance to the object, and the vertical axis indicates the intensity of the reflected wave.

 The data shown in FIG. 4 are the distance data and reflected-wave intensity data for the irradiation range 43 shown in FIG. 3. As shown in FIG. 3, the portion of the captured image 40 corresponding to the irradiation range 43 includes the tire 41 and the road surface 42. That is, the reflected-wave intensity data of the irradiation range 43 include data based on the reflected wave from the tire 41 (first reflected wave) and the reflected wave from the road surface 42 (second reflected wave). Therefore, the reflected-wave intensity data shown in FIG. 4 include an intensity peak P1 caused by the electromagnetic wave reflected by the tire 41 and an intensity peak P2 caused by the electromagnetic wave reflected by the road surface 42. The intensity peaks P1 and P2 can be regarded as a plurality of intensity peaks with different detection times detected from the reflected waves received during a predetermined period. The predetermined period is, for example, within one frame (the period over which the entire scanning range is irradiated with electromagnetic waves), the period from when irradiation of the irradiation range 43 starts until irradiation in that frame ends, or a period shorter than that.

 The data shown in FIG. 5 are the distance data and reflected-wave intensity data for the irradiation range 44 shown in FIG. 3. As shown in FIG. 3, the portion of the captured image 40 corresponding to the irradiation range 44 includes the tire 41 but not the road surface 42. That is, the reflected-wave intensity data of the irradiation range 44 include data based on the reflected wave from the tire 41. Therefore, the reflected-wave intensity data shown in FIG. 5 include an intensity peak P3 caused by the electromagnetic wave reflected by the tire 41.

 The data shown in FIG. 6 are the distance data and reflected-wave intensity data for the irradiation range 45 shown in FIG. 3. As shown in FIG. 3, the portion of the captured image 40 corresponding to the irradiation range 45 includes the road surface 42 but not the tire 41. That is, the reflected-wave intensity data of the irradiation range 45 include data based on the reflected wave from the road surface 42. Therefore, the reflected-wave intensity data shown in FIG. 6 include an intensity peak P4 caused by the electromagnetic wave reflected by the road surface 42.

 Here, as described above, the distance measuring device 10 treats the reflected wave of the electromagnetic wave irradiated onto one irradiation range as a reflected wave arriving from one direction. In other words, one irradiation range corresponds to one detection direction in which the distance measuring device 10 detects reflected waves. For example, FIG. 7 shows the detection direction d1 corresponding to the irradiation range 43 shown in FIG. 3. The configuration shown in FIG. 7 corresponds to a top view of the configuration shown in FIG. 1. The angle θ corresponds to one irradiation range. In the configuration shown in FIG. 7, the detection direction d1 corresponding to the irradiation range 43 is the direction from the distance measuring device 10 toward the center of the irradiation range 43.

 The detection direction in which the distance measuring device 10 detects a reflected wave is regarded as the direction of arrival of the wave reflected by the object, i.e., it is treated as the direction from the distance measuring device 10 to the object included in the irradiation range corresponding to that detection direction. Therefore, when one irradiation range contains data of a plurality of objects, the directions from the distance measuring device 10 to those objects are all treated as the same direction. For example, in the configuration shown in FIG. 7, the irradiation range 43 contains data of the tire 41 and the road surface 42, so the directions from the distance measuring device 10 to the tire 41 and to the road surface 42 are both treated as the same detection direction d1, even though the actual directions to the tire 41 and to the road surface 42 differ. Moreover, as shown in FIG. 3, the portion of the captured image 40 corresponding to the irradiation range 43 includes the tire 41 and the road surface 42; nevertheless, in the reflected-wave intensity data of the irradiation range 43 shown in FIG. 4, only one object (here, the tire 41) would be treated as present, at the distance corresponding to the intensity peak P1 with the greatest reflected-wave intensity.

 Therefore, in the present embodiment, when the control unit 34 determines, by executing the following processing, that one irradiation range contains data of a plurality of objects, it identifies the direction of arrival of each of the plurality of reflected waves reflected by those objects. By identifying the direction of arrival of each reflected wave, the control unit 34 can identify the position of each of the plurality of objects in real space.

 First, the control unit 34 determines whether a plurality of intensity peaks is detected in the reflected-wave intensity data of one irradiation range of the distance measuring device 10. The control unit 34 may detect intensity peaks exceeding an intensity threshold; the intensity threshold may be set in consideration of the expected amount of noise in the reflected wave. For example, for the reflected-wave data of the irradiation range 43 shown in FIG. 4, the control unit 34 determines that two intensity peaks, namely the intensity peaks P1 and P2, are detected. When the control unit 34 determines that a plurality of intensity peaks is detected in the reflected-wave intensity data of one irradiation range, it determines that the irradiation range contains data of a plurality of objects.
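 A minimal sketch of this multiple-peak test follows. It assumes the intensity data for one irradiation range arrives as a sampled intensity-versus-distance profile and uses a simple threshold-plus-local-maximum rule; the threshold value and function name are illustrative assumptions:

```python
def detect_intensity_peaks(intensity: list[float], threshold: float) -> list[int]:
    """Return indices of local maxima above the noise threshold."""
    peaks = []
    for i in range(1, len(intensity) - 1):
        if (intensity[i] > threshold
                and intensity[i] >= intensity[i - 1]
                and intensity[i] >= intensity[i + 1]):
            peaks.append(i)
    return peaks

# Example: a profile like FIG. 4, with peaks from a tire and the road surface.
profile = [0.1, 0.2, 3.0, 0.3, 0.2, 0.1, 1.8, 0.2]
peaks = detect_intensity_peaks(profile, threshold=1.0)
# Two peaks detected -> this irradiation range contains data of two objects.
print(peaks, len(peaks) >= 2)  # -> [2, 6] True
```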

 When the control unit 34 determines that one irradiation range contains data of a plurality of objects, it identifies, based on the captured image, the direction of arrival of each of the plurality of reflected waves corresponding to the plurality of intensity peaks. As described above with reference to FIG. 3, the pixel size of the captured image is smaller than that of the ranging image; the control unit 34 therefore uses information on the pixels of the captured image to identify the direction of arrival of each reflected wave within one irradiation range. The processing for identifying the direction of arrival is described below. Hereinafter, an irradiation range determined to contain data of a plurality of objects is also referred to as a "first irradiation range." For example, in FIG. 3, the irradiation range 43 is the first irradiation range.

 <Processing for identifying direction of arrival>
 The control unit 34 divides the captured image into a plurality of image regions by applying a bilateral filter and superpixel processing to the captured image. Each image region can be a region in which pixels with similar features are arranged consecutively. A feature is, for example, the luminance or color of a pixel. In other words, an image region is composed of pixels with similar luminance information or of pixels with similar color information.
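 A sketch of this segmentation step, assuming OpenCV's bilateral filter and scikit-image's SLIC superpixels as stand-ins for whatever implementation the device actually uses (the parameter values are illustrative):

```python
import cv2
import numpy as np
from skimage.segmentation import slic

def segment_captured_image(image_bgr: np.ndarray) -> np.ndarray:
    """Split a captured image into superpixel regions of similar pixels.

    Returns a label map with the same height and width as the image, where
    pixels sharing a label belong to the same image region."""
    # Edge-preserving smoothing so that superpixels follow object boundaries.
    smoothed = cv2.bilateralFilter(image_bgr, d=9, sigmaColor=75, sigmaSpace=75)
    # SLIC groups nearby pixels with similar color into superpixels.
    labels = slic(smoothed[:, :, ::-1], n_segments=200, compactness=10)
    return labels
```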

 Hereinafter, the portion of the captured image corresponding to the first irradiation range is also referred to as the "first partial image." For example, in FIG. 3, the first partial image is the portion of the captured image 40 corresponding to the irradiation range 43. When the captured image 40 is divided into a plurality of image regions as shown in FIG. 3, the first partial image corresponding to the irradiation range 43 includes at least a part of the image region corresponding to the tire 41 and a part of the image region corresponding to the road surface 42.

 The control unit 34 associates an image region corresponding to an object with one of the plurality of intensity peaks of the reflected wave by using the above-described features of the captured image together with the plurality of intensity peaks and the distance data of the reflected wave in the first irradiation range. Hereinafter, another irradiation range adjacent to the first irradiation range is also referred to as a "second irradiation range," and the portion of the captured image corresponding to the second irradiation range is also referred to as a "second partial image." For example, in FIG. 3, the second irradiation ranges are the irradiation ranges 44, 45, 46, 47, 48, 49, 50, and 51, which surround the irradiation range 43. The second partial images are the portions of the captured image 40 corresponding to the irradiation ranges 44 to 51, respectively.

 The control unit 34 determines whether a second partial image includes a first image region. The first image region is, among the plurality of divided image regions, the image region corresponding to a first object. The first object is one of the plurality of objects included in the first irradiation range, i.e., in the first partial image. For example, in FIG. 3, the first object is the tire 41. In this case, the control unit 34 determines whether the second partial image corresponding to each of the irradiation ranges 44 to 51 includes the first image region corresponding to the tire 41. The control unit 34 may determine whether a second partial image includes the first image region by determining whether the features of the pixels included in the second partial image are the same as the features of the pixels corresponding to the first object. The control unit 34 may regard the features of the pixels included in the second partial image as the same as those of the pixels corresponding to the first object when the two are similar within a predetermined range. For example, when pixel luminance is used as the feature, the control unit 34 may regard a plurality of pixels as similar when the difference between the values indicating their luminance information is within a predetermined range. Likewise, when pixel color information is used as the feature, the control unit 34 may regard a plurality of pixels as similar when the difference between the values indicating their color information (for example, RGB values) is within a predetermined range. When the control unit 34 determines that the features of the pixels included in a second partial image are the same as those of the pixels corresponding to the first object, it determines that the second partial image includes the first image region. For example, in FIG. 3, the control unit 34 determines that the second partial images corresponding to the irradiation ranges 44, 46, and 49 include the first image region corresponding to the tire 41.
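 The similarity test described here can be sketched as follows; the tolerance values are illustrative assumptions rather than values given in this disclosure:

```python
def luminance_similar(l1: float, l2: float, tol: float = 10.0) -> bool:
    """Treat two pixels as having the same feature if their luminance
    values differ by no more than a predetermined tolerance."""
    return abs(l1 - l2) <= tol

def color_similar(rgb1: tuple[int, int, int],
                  rgb2: tuple[int, int, int],
                  tol: int = 20) -> bool:
    """Treat two pixels as having the same feature if every RGB channel
    differs by no more than a predetermined tolerance."""
    return all(abs(a - b) <= tol for a, b in zip(rgb1, rgb2))

# Example: a dark tire pixel versus a gray road-surface pixel.
print(color_similar((30, 30, 32), (128, 126, 125)))  # -> False (different objects)
```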

 When the control unit 34 determines that a second partial image includes the first image region, it acquires the reflected-wave data of the second irradiation range corresponding to that second partial image. By comparing the intensity peak of the acquired reflected wave with each of the plurality of intensity peaks of the reflected wave in the first irradiation range, the control unit 34 associates the first image region with one of the plurality of intensity peaks in the first irradiation range. For example, in FIG. 3, suppose the control unit 34 determines that the second partial image corresponding to the irradiation range 44 includes the first image region corresponding to the tire 41. In this case, the control unit 34 acquires the reflected-wave data of the irradiation range 44 as shown in FIG. 5. By comparing the intensity peak P3 of the reflected wave shown in FIG. 5 with each of the plurality of intensity peaks of the reflected wave in the irradiation range 43 shown in FIG. 4, the control unit 34 associates the first image region corresponding to the tire 41 with the intensity peak P1.
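 The comparison between the single peak found in the second irradiation range and the peaks of the first irradiation range can be sketched as a nearest-match search. Using peak intensity as the comparison key is one plausible reading, consistent with the claim-style summary above that associates peaks whose intensity difference falls within a predetermined range; the tolerance is an illustrative assumption:

```python
def associate_peak(peak_intensity_2nd: float,
                   first_range_peaks: list[tuple[float, float]],
                   max_diff: float) -> int | None:
    """Associate the single peak of a second irradiation range with one of
    the (intensity, distance) peaks of the first irradiation range.

    Returns the index of the first-range peak whose intensity is closest
    to the second-range peak, provided the difference is within max_diff."""
    best_idx, best_diff = None, max_diff
    for idx, (intensity, _distance) in enumerate(first_range_peaks):
        diff = abs(intensity - peak_intensity_2nd)
        if diff <= best_diff:
            best_idx, best_diff = idx, diff
    return best_idx

# Example: peak P3 (tire, intensity 2.9) matches P1 (2.8) rather than P2 (1.5).
print(associate_peak(2.9, [(2.8, 10.0), (1.5, 12.5)], max_diff=0.5))  # -> 0
```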

 In the same or a similar manner as for the first object, the control unit 34 determines whether a second partial image includes a second image region. The second image region is, among the plurality of divided image regions, the image region corresponding to a second object. The second object is one of the plurality of objects included in the first irradiation range, i.e., in the first partial image, but is an object different from the first object. For example, in FIG. 3, the second object is the road surface 42.

 In the same or a similar manner as for the first object, when the control unit 34 determines that a second partial image includes the second image region, it acquires the reflected-wave data (second data) of the second irradiation range corresponding to that second partial image. Also in the same or a similar manner, the control unit 34 associates the second image region with one of the plurality of intensity peaks of the reflected wave by comparing the intensity peak of the acquired reflected wave with each of the plurality of intensity peaks in the first irradiation range. For example, in FIG. 3, suppose the control unit 34 determines that the second partial image corresponding to the irradiation range 45 includes the second image region corresponding to the road surface 42. In this case, the control unit 34 acquires the reflected-wave data of the irradiation range 45 as shown in FIG. 6. By comparing the intensity peak P4 of the reflected wave shown in FIG. 6 with each of the plurality of intensity peaks of the reflected wave in the irradiation range 43 shown in FIG. 4, the control unit 34 associates the second image region corresponding to the road surface 42 with the intensity peak P2.

 By repeatedly executing the above processing for each object included in the first irradiation range, the control unit 34 associates the image region corresponding to each object with one of the plurality of intensity peaks of the reflected wave in the first irradiation range. Once these associations are made, the control unit 34 identifies the ray direction from a predetermined position included in the image region corresponding to the object toward the imaging device 20. The predetermined position included in the image region is, for example, the position corresponding to the center of the part of the image region included in the first partial image. The control unit 34 identifies this ray direction as the direction of arrival of the reflected wave whose intensity peak has been associated with that image region, that is, as the direction of arrival of the reflected wave reflected by the object included in the first irradiation range.

 For example, in FIG. 8, the control unit 34 identifies the ray direction d2 from a predetermined position included in the first image region corresponding to the tire 41 toward the imaging device 20 and identifies the ray direction d2 as the direction of arrival of the reflected wave reflected by the tire 41. Likewise, the control unit 34 identifies the ray direction d3 from a predetermined position included in the second image region corresponding to the road surface 42 toward the imaging device 20 and identifies the ray direction d3 as the direction of arrival of the reflected wave reflected by the road surface 42.
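 Under a pinhole-camera assumption, the ray direction through an image position can be computed from the camera intrinsics. This is a hedged sketch; the disclosure does not specify the camera model, and the intrinsic parameters below are illustrative:

```python
import numpy as np

def ray_direction(u: float, v: float,
                  fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Unit vector, in the camera frame, of the ray through pixel (u, v)
    of a pinhole camera with focal lengths (fx, fy) and principal point
    (cx, cy). The reflected wave is taken to arrive along this ray."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

# Example: the center pixel of the first image region (e.g., the tire in FIG. 8).
d2 = ray_direction(980.0, 600.0, fx=1000.0, fy=1000.0, cx=960.0, cy=540.0)
print(d2)
```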

 <Object position identification process>
 Having identified the direction of arrival of the reflected wave reflected by each object included in the first irradiation range, the control unit 34 may identify the position of each object in real space. The control unit 34 may identify the position of each object included in the first irradiation range in three-dimensional real space from the distance data corresponding to each intensity peak in the first irradiation range and the direction of arrival of the reflected wave corresponding to that intensity peak. For example, in FIG. 3, for the irradiation range 43, the control unit 34 identifies the position of the tire 41 in three-dimensional real space from the data of the distance D1 corresponding to the intensity peak P1 shown in FIG. 4 and from the ray direction d2 shown in FIG. 8 identified as the direction of arrival. Similarly, the control unit 34 identifies the position of the road surface 42 in three-dimensional real space from the data of the distance D2 corresponding to the intensity peak P2 shown in FIG. 4 and from the ray direction d3 shown in FIG. 8 identified as the direction of arrival.
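 Combining the two quantities is then a single scaling operation: the object position is the arrival-direction unit vector scaled by the measured distance. A minimal sketch (reusing the hypothetical ray directions above):

```python
import numpy as np

def object_position(distance_m: float, arrival_dir: np.ndarray) -> np.ndarray:
    """3-D position of an object given its measured distance and the
    unit vector of its reflected wave's direction of arrival."""
    return distance_m * arrival_dir

# Example: tire at distance D1 = 10.0 m along direction d2,
# road surface at distance D2 = 12.5 m along direction d3.
d2 = np.array([0.02, 0.06, 0.998])
d3 = np.array([-0.01, 0.10, 0.995])
print(object_position(10.0, d2), object_position(12.5, d3))
```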

 The control unit 34 may generate point cloud data from the identified positions of the objects in three-dimensional real space. The point cloud data indicate the positions, in three-dimensional space, of the objects present around the distance measuring device 10. The control unit 34 may display the generated point cloud data on the display of the output unit 32, may output them to an external storage medium via the output unit 32, or may transmit them to another system of the moving object 2 via the communication unit 31.

 Here, the processing of associating an image region corresponding to an object with one of the plurality of intensity peaks in the first irradiation range is not limited to the example described above. Other examples of this association processing are described below.

 <Other Example 1>
 The control unit 34 determines whether the first image region includes a second partial image. For example, in FIG. 3, the control unit 34 determines whether the first image region corresponding to the tire 41 includes the second partial image corresponding to each of the irradiation ranges 44 to 51.

 When the control unit 34 determines that the first image region includes a second partial image, it acquires the distance data of the second irradiation range corresponding to that second partial image. The control unit 34 compares the acquired distance data of the second irradiation range with the plurality of distance data corresponding to the plurality of intensity peaks of the reflected wave in the first irradiation range, and thereby associates the first image region with one of those intensity peaks. For example, in FIG. 3, suppose the control unit 34 determines that the first image region corresponding to the tire 41 includes the second partial image corresponding to the irradiation range 44. In this case, the control unit 34 compares the distance data corresponding to the intensity peak P3 in the irradiation range 44 shown in FIG. 5 with the distance data corresponding to each of the plurality of intensity peaks shown in FIG. 4, and thereby associates the first image region corresponding to the tire 41 with the intensity peak P1.
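 This distance-based variant can be sketched in the same way as the intensity comparison earlier, with the distance associated with each peak as the comparison key (the tolerance value is again an illustrative assumption):

```python
def associate_by_distance(distance_2nd_m: float,
                          first_range_peak_distances_m: list[float],
                          max_diff_m: float = 0.5) -> int | None:
    """Return the index of the first-range peak whose distance is closest
    to the distance measured in the second irradiation range."""
    best_idx, best_diff = None, max_diff_m
    for idx, d in enumerate(first_range_peak_distances_m):
        diff = abs(d - distance_2nd_m)
        if diff <= best_diff:
            best_idx, best_diff = idx, diff
    return best_idx

# Example: range 44 measures 10.1 m; peaks P1/P2 sit at 10.0 m and 12.5 m.
print(associate_by_distance(10.1, [10.0, 12.5]))  # -> 0 (tire, peak P1)
```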

 In the same or a similar manner as for the first object, the control unit 34 determines whether the second image region includes a second partial image, and when it determines that it does, it associates the second image region with one of the plurality of intensity peaks of the reflected wave in the first irradiation range. For example, in FIG. 3, suppose the control unit 34 determines that the second image region corresponding to the road surface 42 includes the second partial image corresponding to the irradiation range 45. In this case, the control unit 34 compares the distance data corresponding to the intensity peak P4 in the irradiation range 45 shown in FIG. 6 with the distance data corresponding to each of the plurality of intensity peaks shown in FIG. 4, and thereby associates the second image region corresponding to the road surface 42 with the intensity peak P2.

 By repeatedly executing the above processing for each object included in the first irradiation range, the control unit 34 associates the image region corresponding to each object with one of the plurality of intensity peaks of the reflected wave in the first irradiation range.

 <Other Example 2>
 The control unit 34 may execute image recognition processing on the captured image. By executing the image recognition processing, the control unit 34 may identify the front-to-back relationship of the plurality of objects included in the first partial image. In this front-to-back relationship, "front" means the side closer to the distance measuring device 10. For example, in FIG. 3, by executing the image recognition processing, the control unit 34 identifies that the first partial image corresponding to the irradiation range 43 includes the tire 41 and a part of the road surface 42, and further identifies that the tire 41 is in front of the road surface 42 as viewed from the distance measuring device 10. The control unit 34 may also execute the image recognition processing to identify the image region corresponding to each object included in the first irradiation range.

 The control unit 34 associates the image region corresponding to each object with one of the plurality of intensity peaks of the reflected wave based on the identified front-to-back relationship of the plurality of objects and on the magnitude relationship of the plurality of distance data corresponding to the plurality of intensity peaks in the first irradiation range. For example, in FIG. 3, based on the front-to-back relationship of the tire 41 and the road surface 42 and on the magnitude relationship of the distance data corresponding to the intensity peaks of the reflected wave shown in FIG. 4, the control unit 34 associates the intensity peak P1 with the first image region corresponding to the tire 41 and associates the intensity peak P2 with the second image region corresponding to the road surface 42.
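 A sketch of this ordering-based association: sort the recognized objects from front to back, sort the peaks by distance, and pair them in order. The recognition step itself is abstracted away here as given input:

```python
def associate_by_order(objects_front_to_back: list[str],
                       peak_distances_m: list[float]) -> dict[str, float]:
    """Pair objects ordered front-to-back with intensity peaks ordered by
    increasing distance; the nearest object gets the nearest peak."""
    ordered_peaks = sorted(peak_distances_m)
    return dict(zip(objects_front_to_back, ordered_peaks))

# Example: image recognition says the tire is in front of the road surface;
# peaks P1 and P2 lie at 10.0 m and 12.5 m.
print(associate_by_order(["tire", "road_surface"], [12.5, 10.0]))
# -> {'tire': 10.0, 'road_surface': 12.5}
```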

 <Other Example 3>
 When the control unit 34 cannot identify the direction of arrival of each of the plurality of reflected waves in the first irradiation range based on one of the features described above (for example, luminance), it may identify those directions of arrival based on another of the features (for example, color). Furthermore, when the control unit 34 cannot identify the direction of arrival of each of the plurality of reflected waves using any of the features described above, it may execute the image recognition processing described above. Conversely, when the control unit 34 cannot identify the direction of arrival of each of the plurality of reflected waves by executing the image recognition processing, it may identify the direction of arrival of each of the plurality of reflected waves in the first irradiation range based on the features described above.

 FIG. 9 is a flowchart showing an example procedure of object detection processing according to an embodiment of the present disclosure. The object detection processing shown in FIG. 9 corresponds to an example of the control method according to the present embodiment. The control unit 34 starts the object detection processing, for example, when the moving object 2 starts moving. The control unit 34 may detect that the moving object 2 has started moving by communicating with other systems of the moving object 2 via the communication unit 31.

 The control unit 34 receives, from the distance measuring device 10 via the communication unit 31, the object distance data and reflected-wave intensity data for each irradiation range (step S1). The control unit 34 also receives the captured image data from the imaging device 20 via the communication unit 31 (step S2).

 The control unit 34 determines whether a plurality of intensity peaks is detected in the reflected-wave intensity data of one irradiation range (step S3). When the control unit 34 determines that a plurality of intensity peaks is detected (step S3: YES), it determines that the irradiation range contains data of a plurality of objects and proceeds to step S4. Otherwise (step S3: NO), it determines that the irradiation range does not contain data of a plurality of objects and proceeds to step S5.

 In step S4, the control unit 34 identifies, based on the captured image received in step S2, the direction of arrival of each of the plurality of reflected waves corresponding to the plurality of intensity peaks.

 In step S5, the control unit 34 determines whether step S3 has been executed for all the irradiation ranges received in step S1. When step S3 has been executed for all the irradiation ranges (step S5: YES), the control unit 34 proceeds to step S6; otherwise (step S5: NO), it returns to step S3.

 In step S6, the control unit 34 generates point cloud data from the ranging data received in step S1, the directions of arrival identified in step S4, and the like.

After step S6, the control unit 34 returns to step S1. While repeating steps S1 to S6, the control unit 34 ends the object detection processing when, for example, the moving object 2 stops. The control unit 34 may detect that the moving object 2 has stopped moving by communicating with other systems of the moving object 2 via the communication unit 31.
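
Taken together, steps S1 to S6 amount to the loop sketched below, reusing has_multiple_objects from the sketch above. All interfaces are assumptions: directions_from_image and build_point_cloud are hypothetical stand-ins for the step S4 and step S6 processing and are not defined in this disclosure.

```python
def object_detection_loop(rangefinder, camera, vehicle):
    """Hedged sketch of the FIG. 9 procedure; every interface is assumed."""
    while vehicle.is_moving():                       # run while the moving object 2 moves
        ranges = rangefinder.read_ranges()           # S1: distance + intensity per irradiation range
        image = camera.capture()                     # S2: captured image
        arrival_directions = {}
        for r in ranges:                             # S3/S5: examine every irradiation range
            if has_multiple_objects(r.intensity):    # multiple intensity peaks -> multiple objects
                # S4: resolve each peak's arrival direction from the captured image
                arrival_directions[r.id] = directions_from_image(image, r)
        yield build_point_cloud(ranges, arrival_directions)  # S6: point cloud data
```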

As described above, in the control device 30 according to this embodiment, the control unit 34 identifies the positions of the first object and the second object in real space based on the first reflected wave, the second reflected wave, and a captured image including the first object and the second object. The first reflected wave and the second reflected wave are produced when an electromagnetic wave radiated in a first direction in real space, for example onto one irradiation range, is reflected by the first object and the second object, respectively. For example, the first reflected wave is produced when the electromagnetic wave irradiated onto the irradiation range 43 shown in FIG. 3 is reflected by the tire 41, and the second reflected wave is produced when that electromagnetic wave is reflected by the road surface 42. By identifying the positions of the first object and the second object in real space from the first and second reflected waves together with the captured image, the positions of multiple objects in real space can be identified even when one irradiation range contains data of multiple objects.

Here, the distance measuring device 10 can measure the distance to an object more accurately than, for example, a stereo camera. However, the resolution of the distance image generated by the distance measuring device 10 is often lower than that of a distance image generated by a stereo camera or the like. Even when the resolution of the distance image is low and one irradiation range, that is, one pixel, contains data of multiple objects, in this embodiment the control unit 34 can identify the position of each of the multiple objects in real space as described above. As a result, this embodiment can detect objects with high accuracy.

Therefore, according to this embodiment, objects can be detected with high accuracy.

Furthermore, in the control device 30 according to this embodiment, the control unit 34 may identify the positions of the first object and the second object in real space based on the correspondence between the first and second regions and the first and second reflected waves. The first region is the region of the captured image that includes the image of the first object; for example, the region of the captured image 40 shown in FIG. 3 that includes the image of the tire 41. The second region is the region of the captured image that includes the image of the second object; for example, the region of the captured image 40 shown in FIG. 3 that includes the image of the road surface 42. By executing the processing described above, the control unit 34 may identify the correspondence between the first and second regions and the first and second reflected waves based on the intensity peaks of the first and second reflected waves detected from the reflected-wave intensity data. The control unit 34 may identify this correspondence by associating each of the first and second regions with one of the intensity peaks of the first and second reflected waves detected from the intensity data. Using this correspondence, the position of each object can be identified with high accuracy.
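
One way to realize this association, sketched under the assumption that the image regions can be ordered front to back (see the paragraphs that follow), is to pair the regions with the intensity peaks sorted by measured distance. The names below are illustrative only, not part of the disclosed embodiment.

```python
def match_regions_to_peaks(regions_front_to_back, peaks):
    """regions_front_to_back: image regions ordered nearest first
    (e.g., [tire region, road-surface region]);
    peaks: (distance, intensity) tuples detected in one irradiation range.
    Returns (region, peak) pairs: the nearest peak goes to the front region."""
    peaks_near_to_far = sorted(peaks, key=lambda p: p[0])
    return list(zip(regions_front_to_back, peaks_near_to_far))
```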

In the control device 30 according to this embodiment, the control unit 34 may also identify the real-space direction corresponding to the first region with respect to the distance measuring device 10 and the real-space direction corresponding to the second region with respect to the distance measuring device 10. The former is, for example, the ray direction d2 shown in FIG. 8; the latter is, for example, the ray direction d3 shown in FIG. 8. The control unit 34 may identify the positions of the first object and the second object in real space based on the identified real-space directions corresponding to the first and second regions and on the distance data for the first and second objects. With this configuration, the position of each object can be identified with high accuracy.
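
As a worked example in the coordinate frame of the distance measuring device, the position of an object follows from the ray direction of its image region and the measured distance; position_from_ray is a hypothetical helper, not a function named in this disclosure.

```python
import numpy as np

def position_from_ray(direction: np.ndarray, distance: float) -> np.ndarray:
    """direction: real-space ray for an image region (e.g., d2 or d3 in FIG. 8),
    not necessarily unit length; distance: range measured from the reflected wave.
    Returns the 3-D position in the distance measuring device's frame."""
    return distance * direction / np.linalg.norm(direction)

# Example: an object 10.0 m away along a ray pointing slightly upward.
p = position_from_ray(np.array([0.0, 0.1, 1.0]), 10.0)  # approx. [0.0, 0.995, 9.95]
```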

In the control device 30 according to this embodiment, the control unit 34 may also identify the positional relationship of the first object and the second object with respect to the distance measuring device 10 by executing image recognition processing on the captured image. As the positional relationship of the multiple objects with respect to the distance measuring device 10, the control unit 34 may identify the front-to-back order of the multiple objects, as described above. The control unit 34 may then identify the correspondence between the first and second regions and the first and second reflected waves based on the identified positional relationship of the first and second objects. With this configuration, the position of each object can be identified efficiently.
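
A minimal sketch of deriving the front-to-back order by image recognition follows, assuming the recognizer labels each region with a class and that a class-level depth prior is available (such as a tire standing on the road surface being nearer than the road surface behind it, matching the FIG. 3 example). Both the labels and the prior are assumptions; the output could feed match_regions_to_peaks in the earlier sketch.

```python
# Hypothetical class-level depth prior: a lower rank means nearer to the rangefinder.
DEPTH_PRIOR = {"tire": 0, "vehicle": 0, "pedestrian": 0, "road_surface": 1}

def regions_front_to_back(detections):
    """detections: (region, class_label) pairs from an image recognizer.
    Returns the regions ordered from the nearer object to the farther one."""
    ranked = sorted(detections, key=lambda d: DEPTH_PRIOR.get(d[1], 0))
    return [region for region, _ in ranked]
```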

In the control device 30 according to this embodiment, the control unit 34 may also identify the positions of the first object and the second object in real space based on the first reflected wave, the second reflected wave, the third reflected wave, and the captured image. The third reflected wave is produced when an electromagnetic wave irradiated onto a second irradiation range in real space is reflected by the first object. For example, when the second irradiation range is the irradiation range 44 shown in FIG. 3, the third reflected wave is produced when the electromagnetic wave irradiated onto the irradiation range 44 is reflected by the tire 41. In this case, the third reflected wave corresponds to the intensity peak P3 in the reflected-wave data for the irradiation range 44 shown in FIG. 5. Using the third reflected wave in this way allows the position of each object to be identified with high accuracy. The control unit 34 may also identify the positions of the first object and the second object in real space based on the correspondence between the first and second regions and the first, second, and third reflected waves. The control unit 34 may associate the first and second reflected waves with the first and second regions based on the third reflected wave. For example, as described above, the control unit 34 compares each of the multiple intensity peaks of the reflected waves shown in FIG. 4 with the intensity peak of the reflected wave shown in FIG. 5. Through this comparison, the control unit 34 identifies the correspondence between the first reflected wave corresponding to the intensity peak P1 shown in FIG. 4 and the third reflected wave corresponding to the intensity peak P3 shown in FIG. 5. The control unit 34 may also acquire first data on reflected-wave intensity including the first and second reflected waves, and second data on reflected-wave intensity including the third reflected wave. The first data is, for example, the reflected-wave intensity data for the irradiation range 43 shown in FIG. 4; the second data is, for example, the reflected-wave data for the irradiation range 44 shown in FIG. 5.
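
The comparison of peaks across adjacent irradiation ranges can be sketched as below, following the criterion stated in item (9) later in this text that peaks whose intensities differ within a predetermined range are attributed to the same object; the tolerance value is an assumption for illustration.

```python
def same_object_peak(peak_intensity_a: float, peak_intensity_b: float,
                     tolerance: float = 0.05) -> bool:
    """True when two intensity peaks (e.g., P1 in the irradiation range 43 and
    P3 in the irradiation range 44) are close enough to be attributed to
    reflections from the same object; tolerance is hypothetical."""
    return abs(peak_intensity_a - peak_intensity_b) <= tolerance
```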

In the control device 30 according to this embodiment, the positions of the first object and the second object in real space may also be identified based on the arrival directions of the first, second, and third reflected waves, a first partial image, a second partial image, and distance data for the first and second objects. The distance data for the first and second objects may be based on distances measured from the first, second, and third reflected waves. The control unit 34 may identify the arrival direction of the third reflected wave based on the position of the scanning range in real space relative to the second irradiation range.

In this embodiment, the optical axis of the imaging device 20 and the optical axis of the distance measuring device 10 may coincide. Aligning the two optical axes reduces the occlusion region between the imaging device 20 and the distance measuring device 10. An occlusion region is a region that is captured or detected in one of the captured image and the distance image but not in the other. With the occlusion region reduced, the control unit 34 can accurately identify the arrival direction of each of the multiple reflected waves corresponding to the respective intensity peaks, based on the captured image generated by the imaging device 20.

Although the present disclosure has been described based on the drawings and embodiments, it should be noted that a person skilled in the art can easily make various variations and modifications based on the present disclosure, and that such variations and modifications are included in the scope of the present disclosure. For example, the functions included in each functional unit, means, or step can be rearranged so as not to be logically inconsistent, and multiple functional units, means, or steps may be combined into one or divided. Each functional unit, means, or step of one embodiment may be added to another embodiment, or replaced with a functional unit, means, or step of another embodiment, so long as no logical inconsistency arises. The embodiments of the present disclosure described above are not limited to being implemented exactly as described; they may be implemented by combining features as appropriate or by omitting some of them.

In the embodiment described above, the multiple objects were described as the tire 41 and the road surface 42 shown in FIG. 3. However, the multiple objects are not limited to these. The multiple objects may be any objects that overlap in the captured image as seen from the detection system 1. For example, the multiple objects may be objects of different types, such as a vehicle and a pedestrian, or a vehicle and an obstacle on the road surface, or objects of the same type, such as multiple vehicles or multiple obstacles on the road surface. Furthermore, the detection system 1 may be mounted not only on a moving object but also on an observation device such as a roadside unit installed along the road or a surveillance camera that monitors its surroundings.

In one embodiment, (1) a control device identifies positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave radiated in a first direction in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, and a captured image of the real space including the first object and the second object.

(2) The control device of (1) above may identify the positions of the first object and the second object in the real space based on a correspondence between a first region of the captured image including an image of the first object and a second region of the captured image including an image of the second object, and the first reflected wave and the second reflected wave.

(3) The control device of (1) or (2) above may acquire intensity data including an intensity of the first reflected wave and an intensity of the second reflected wave, and may identify the correspondence based on intensity peaks of the first reflected wave and the second reflected wave detected from the intensity data.

(4) The control device of any one of (1) to (3) above may identify a positional relationship of the first object and the second object with respect to the distance measuring device by executing image recognition processing on the captured image, and may identify the correspondence based on the positional relationship.

(5) The control device of any one of (1) to (4) above may acquire distance data for an object and the intensity data from a distance measuring device that measures a distance to the object by radiating an electromagnetic wave into real space and detecting a reflected wave produced when the electromagnetic wave is reflected by the object, and may identify the positions of the first object and the second object in real space based on distances to the first object and the second object obtained from a real-space direction corresponding to the first region and a real-space direction corresponding to the second region with respect to the distance measuring device and from the distance data measured based on the first reflected wave and the second reflected wave.

In one embodiment, (6) a control device may identify positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave irradiated onto a first irradiation range in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, a third reflected wave produced when an electromagnetic wave irradiated onto a second irradiation range in the real space is reflected by the first object, and a captured image generated by imaging the real space including the first irradiation range and the second irradiation range.

(7) The control device of (6) above may identify the positions of the first object and the second object in the real space based on a correspondence between a first region of the captured image including an image of the first object and a second region of the captured image including an image of the second object, and the first reflected wave, the second reflected wave, and the third reflected wave.

(8) The control device of (6) or (7) above may identify the correspondence between the first reflected wave and the second reflected wave and the first region and the second region based on the third reflected wave.

(9) The control device of any one of (6) to (8) above may associate, among the first reflected wave, the second reflected wave, and the third reflected wave, the first reflected wave and the third reflected wave whose intensity peaks differ within a predetermined range with each other as reflected waves from the first object.

(10) The control device of any one of (6) to (9) above may identify the positions of the first object and the second object in the real space based on arrival directions of the first reflected wave, the second reflected wave, and the third reflected wave, a first partial image of the captured image corresponding to the first irradiation range and a second partial image of the captured image corresponding to the second irradiation range, and distances to the first object and the second object obtained based on distance data measured based on the first reflected wave, the second reflected wave, and the third reflected wave.

In one embodiment, (11) a control method includes identifying positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave radiated in a first direction in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, and a captured image of the real space including the first object and the second object.

In one embodiment, (12) a control method includes identifying positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave irradiated onto a first irradiation range in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, a third reflected wave produced when an electromagnetic wave irradiated onto a second irradiation range in the real space is reflected by the first object, and a captured image generated by imaging the real space including the first irradiation range and the second irradiation range.

REFERENCE SIGNS LIST
1 Detection system
2 Moving object
10 Distance measuring device
20 Imaging device
30 Control device
31 Communication unit
32 Output unit
33 Storage unit
34 Control unit
40 Captured image
41 Tire
42 Road surface
43, 44, 45, 46, 47, 48, 49, 50, 51 Irradiation range

Claims (12)

1. A control device that identifies positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave radiated in a first direction in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, and a captured image of the real space including the first object and the second object.

2. The control device according to claim 1, wherein the control device identifies the positions of the first object and the second object in the real space based on a correspondence between a first region of the captured image including an image of the first object and a second region of the captured image including an image of the second object, and the first reflected wave and the second reflected wave.

3. The control device according to claim 2, wherein the control device acquires intensity data including an intensity of the first reflected wave and an intensity of the second reflected wave, and identifies the correspondence based on intensity peaks of the first reflected wave and the second reflected wave detected from the intensity data.

4. The control device according to claim 2, wherein the control device identifies a positional relationship of the first object and the second object with respect to the distance measuring device by executing image recognition processing on the captured image, and identifies the correspondence based on the positional relationship.

5. The control device according to claim 3, wherein the control device acquires distance data for an object and the intensity data from a distance measuring device that measures a distance to the object by radiating an electromagnetic wave into real space and detecting a reflected wave produced when the electromagnetic wave is reflected by the object, and identifies the positions of the first object and the second object in real space based on distances to the first object and the second object obtained from a real-space direction corresponding to the first region and a real-space direction corresponding to the second region with respect to the distance measuring device and from the distance data measured based on the first reflected wave and the second reflected wave.

6. A control device that identifies positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave irradiated onto a first irradiation range in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, a third reflected wave produced when an electromagnetic wave irradiated onto a second irradiation range in the real space is reflected by the first object, and a captured image generated by imaging the real space including the first irradiation range and the second irradiation range.

7. The control device according to claim 6, wherein the control device identifies the positions of the first object and the second object in the real space based on a correspondence between a first region of the captured image including an image of the first object and a second region of the captured image including an image of the second object, and the first reflected wave, the second reflected wave, and the third reflected wave.

8. The control device according to claim 7, wherein the control device identifies the correspondence between the first reflected wave and the second reflected wave and the first region and the second region based on the third reflected wave.

9. The control device according to claim 8, wherein the control device associates, among the first reflected wave, the second reflected wave, and the third reflected wave, the first reflected wave and the third reflected wave whose intensity peaks differ within a predetermined range with each other as reflected waves from the first object.

10. The control device according to claim 8 or 9, wherein the control device identifies the positions of the first object and the second object in the real space based on arrival directions of the first reflected wave, the second reflected wave, and the third reflected wave, a first partial image of the captured image corresponding to the first irradiation range and a second partial image of the captured image corresponding to the second irradiation range, and distances to the first object and the second object obtained based on distance data measured based on the first reflected wave, the second reflected wave, and the third reflected wave.

11. A control method comprising identifying positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave radiated in a first direction in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, and a captured image of the real space including the first object and the second object.

12. A control method comprising identifying positions of a first object and a second object in real space based on a first reflected wave produced when an electromagnetic wave irradiated onto a first irradiation range in the real space is reflected by the first object, a second reflected wave produced when the electromagnetic wave is reflected by the second object, a third reflected wave produced when an electromagnetic wave irradiated onto a second irradiation range in the real space is reflected by the first object, and a captured image generated by imaging the real space including the first irradiation range and the second irradiation range.
PCT/JP2024/012087 2023-03-30 2024-03-26 Control device, and control method WO2024204271A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023056518 2023-03-30
JP2023-056518 2023-03-30

Publications (1)

Publication Number Publication Date
WO2024204271A1 true WO2024204271A1 (en) 2024-10-03

Family

ID=92905487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/012087 WO2024204271A1 (en) 2023-03-30 2024-03-26 Control device, and control method

Country Status (1)

Country Link
WO (1) WO2024204271A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012064026A (en) * 2010-09-16 2012-03-29 Toyota Motor Corp Vehicular object detection device and vehicular object detection method
JP2017166907A (en) * 2016-03-15 2017-09-21 オムロン株式会社 Object detection device, object detection method, and program
JP2019109219A * 2017-10-27 2019-07-04 Baidu USA LLC Three-dimensional lidar system for autonomous vehicle using dichroic mirror
JP2020003236A (en) * 2018-06-25 2020-01-09 株式会社リコー Distance measurement device, moving body, distance measurement method, and distance measurement system
JP2021012133A (en) * 2019-07-08 2021-02-04 株式会社リコー Distance measuring device, information processing apparatus, distance measuring method, on-vehicle device, movable body, and distance measuring system
WO2023007795A1 * 2021-07-26 2023-02-02 Sony Group Corporation Signal-processing device and signal-processing method



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24780379

Country of ref document: EP

Kind code of ref document: A1