WO2025191822A1 - Autonomous moving body, autonomous travel method, and autonomous travel program
- Publication number: WO2025191822A1 (application PCT/JP2024/010188)
- Authority: WIPO (PCT)
- Prior art keywords: self, attitude, normal, orientation, autonomous
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/246—Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/247—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
- G05D1/248—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons generated by satellites, e.g. GPS
Definitions
- This disclosure relates to an autonomously traveling vehicle, an autonomous traveling method, and an autonomous traveling program.
- More specifically, it relates to an autonomously traveling vehicle that travels autonomously based on technology such as SLAM (Simultaneous Localization and Mapping), which uses a surrounding detection sensor to estimate the vehicle's own position and orientation on a point cloud map.
- SLAM: Simultaneous Localization and Mapping
- GNSS: Global Navigation Satellite System
- Patent Document 1 discloses a technology that combines the probability distribution of a vehicle's own position determined from GNSS reception data with the probability distribution determined from LiDAR point cloud data, and takes the peak position of the combined SLAM/GNSS probability distribution as the vehicle's own position.
- because the technology of Patent Document 1 always combines self-location estimation based on GNSS reception data with self-location estimation based on LiDAR-SLAM, it cannot perform self-location estimation by SLAM using a surrounding detection sensor alone.
- the autonomous vehicle disclosed herein therefore aims to return to SLAM using a surrounding detection sensor: if self-location estimation by SLAM using the surrounding detection sensor fails, self-location estimation is performed by another method until SLAM-based estimation can be resumed.
- the autonomous traveling vehicle is an autonomous traveling vehicle that is equipped with a surrounding detection sensor and travels autonomously, and includes: a self-position estimation unit that uses the surrounding detection sensor to estimate a position and orientation on a point cloud map as a self-position and orientation; a self-position determination unit that performs a determination process to determine whether the self-position and orientation are normal; and a driving control unit that controls autonomous driving using the self-position and orientation when the self-position and orientation are determined to be normal. If the self-position and orientation are determined not to be normal, the self-position estimation unit carries out an acquisition process to acquire a normal self-position and orientation, and repeats the acquisition process until the self-position and orientation obtained by the acquisition process are determined to be normal by the determination process.
- when the self-position and orientation obtained by the acquisition process are determined to be normal, the autonomous moving body controls autonomous driving using the self-position and orientation determined to be normal, and returns to the process of estimating the self-position and orientation on the point cloud map using the surrounding detection sensor. The autonomous moving body according to the present disclosure can therefore continue driving based on the process of estimating the self-position and orientation on the point cloud map using the surrounding detection sensor, achieving autonomous driving with high precision at low cost.
- FIG. 1 is a diagram showing an example of the appearance of an autonomous traveling vehicle according to a first embodiment.
- FIG. 2 is a diagram showing a configuration example of the autonomous traveling vehicle according to the first embodiment.
- FIG. 3 is a flowchart showing an autonomous traveling process of the autonomous traveling vehicle according to the first embodiment.
- FIG. 4 is a diagram showing a specific example of the first determination process according to the first embodiment.
- FIG. 5 is a diagram showing example 1 of the acquisition process according to the first embodiment.
- FIG. 6 is a diagram showing example 2 of the acquisition process according to the first embodiment.
- FIG. 7 is a flowchart showing an autonomous traveling process of an autonomous traveling vehicle according to a modification of the first embodiment.
- FIG. 8 is a diagram showing a configuration example of a control device according to a modification of the first embodiment.
- FIG. 1 is a diagram showing an example of the appearance of an autonomously running vehicle 10 according to this embodiment.
- FIG. 2 is a diagram showing an example of the configuration of the autonomously running vehicle 10 according to this embodiment.
- the autonomously traveling vehicle 10 is, for example, a vehicle such as a PMV (personal mobility vehicle) or an AMR (autonomous mobile robot) that travels autonomously or manually on a route within a facility.
- the PMV is a manned vehicle.
- the PMV may be capable of autonomous traveling.
- the PMV may be capable of switching between autonomous traveling and manual traveling.
- the AMR is an unmanned vehicle that travels autonomously within a facility.
- the autonomously traveling vehicle 10 includes a control device 100 , a GNSS receiving antenna 201 , a GNSS receiver 202 , a surroundings detection sensor 203 , and a vehicle traveling device 204 .
- the control device 100 receives GNSS reception data 41 from the satellite system via a GNSS reception antenna 201 and a GNSS receiver 202 .
- the control device 100 estimates its own position and orientation (localization) by SLAM using a surrounding detection sensor 203 such as the LiDAR 21, and outputs control information 31 for autonomous driving to the vehicle driving device 204.
- the control information 31 is information that determines the speed and steering of the autonomous driving vehicle 10.
- the vehicle driving device 204 is a device such as a vehicle motor and a drive shaft, and performs autonomous driving based on the control information 31.
- the autonomous vehicle 10 is equipped with a control device 100 which is a computer.
- the control device 100 includes a processor 910, as well as other hardware such as a memory 921, an auxiliary storage device 922, an input interface 930, an output interface 940, a display device 941, and a communication device 950.
- the processor 910 is connected to the other hardware via signal lines and controls the other hardware.
- the control device 100 has, as its functional elements, a self-position estimation unit 110, a self-position determination unit 120, a driving control unit 130, and a storage unit 150.
- the storage unit 150 stores a point cloud map 51, a self-position and orientation 52, and a position threshold 53.
- the point cloud map 51 is also called an environmental map.
- the point cloud map 51 is obtained by SLAM and is generated from three-dimensional point cloud data obtained by the periphery detection sensor 203.
- the point cloud map 51 may also be obtained by SLAM and generated from two-dimensional point cloud data obtained by the periphery detection sensor 203.
- the processor 910 is a device that executes an autonomous driving program.
- the autonomous driving program is a program that realizes the functions of the self-position estimation unit 110, the self-position determination unit 120, and the driving control unit 130.
- the processor 910 is an IC that performs arithmetic processing. Specific examples of the processor 910 are a CPU, a DSP, and a GPU.
- IC is an abbreviation for Integrated Circuit.
- CPU is an abbreviation for Central Processing Unit.
- DSP is an abbreviation for Digital Signal Processor.
- GPU is an abbreviation for Graphics Processing Unit.
- the memory 921 is a storage device that temporarily stores data. Specific examples of the memory 921 are SRAM and DRAM. SRAM is an abbreviation for Static Random Access Memory. DRAM is an abbreviation for Dynamic Random Access Memory.
- the auxiliary storage device 922 is a storage device that stores data. A specific example of the auxiliary storage device 922 is an HDD.
- the auxiliary storage device 922 may also be a portable storage medium such as an SD (registered trademark) memory card, CF, NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD. Note that HDD is an abbreviation for Hard Disk Drive. SD (registered trademark) is an abbreviation for Secure Digital. CF is an abbreviation for CompactFlash (registered trademark). DVD is an abbreviation for Digital Versatile Disk.
- the input interface 930 is a port that is connected to an input device such as a mouse, keyboard, or touch panel. Specifically, the input interface 930 is a USB terminal. Note that the input interface 930 may also be a port that is connected to a LAN.
- USB is an abbreviation for Universal Serial Bus.
- LAN is an abbreviation for Local Area Network.
- the output interface 940 is a port to which a cable from an output device such as a display is connected.
- the output interface 940 is a USB terminal or an HDMI (registered trademark) terminal.
- the display is an LCD.
- the output interface 940 is also called a display interface.
- HDMI (registered trademark) is an abbreviation for High Definition Multimedia Interface.
- LCD is an abbreviation for Liquid Crystal Display.
- the communication device 950 has a receiver and a transmitter.
- the communication device 950 is connected to a communication network such as a LAN, the Internet, a telephone line, or Wi-Fi (registered trademark).
- the communication device 950 is a communication chip or NIC.
- NIC is an abbreviation for Network Interface Card.
- the autonomous driving program is executed by the control device 100.
- the autonomous driving program is loaded into the processor 910 and executed by the processor 910.
- the OS is also stored in the memory 921.
- OS is an abbreviation for Operating System.
- the processor 910 executes the autonomous driving program while running the OS.
- the autonomous driving program and OS may be stored in the auxiliary storage device 922.
- the autonomous driving program and OS stored in the auxiliary storage device 922 are loaded into the memory 921 and executed by the processor 910. Note that part or all of the autonomous driving program may be incorporated into the OS.
- the control device 100 may be equipped with multiple processors that replace the processor 910. These multiple processors share the execution of the autonomous driving program. Each processor is a device that executes the autonomous driving program, just like the processor 910.
- Data, information, signal values, and variable values used, processed, or output by the autonomous driving program are stored in memory 921, auxiliary storage device 922, or registers or cache memory within processor 910.
- the “unit” in the self-location estimation unit 110, the self-location determination unit 120, and the driving control unit 130 may be read as “circuit,” “step,” “procedure,” “process,” or “circuitry.”
- the autonomous driving program causes a computer to execute a self-location estimation process, a self-location determination process, and a driving control process.
- the “process” in the self-location estimation process, the self-location determination process, and the driving control process may be read as “program,” “program product,” “computer-readable storage medium storing a program,” or “computer-readable recording medium recording a program.”
- the autonomous driving method is a method performed by the control device 100 executing the autonomous driving program.
- the autonomous driving program may be provided by being stored on a computer-readable recording medium. Alternatively, the autonomous driving program may be provided as a program product.
- the operation procedure of the autonomously traveling vehicle 10 corresponds to an autonomous traveling method. Furthermore, a program that realizes the autonomous traveling process, which is the operation of the autonomously traveling vehicle 10, corresponds to an autonomous traveling program.
- FIG. 3 is a flow diagram showing the autonomous driving process of the autonomous driving vehicle 10 according to this embodiment.
- the autonomously traveling vehicle 10 is equipped with a surrounding detection sensor 203 and travels autonomously.
- the autonomously traveling vehicle 10 uses the surrounding detection sensor 203 to estimate the position and orientation on the point cloud map 51 as its own position and orientation.
- the autonomously traveling vehicle 10 then generates control information 31 for autonomous traveling using the estimated self-position and orientation.
- in other words, the autonomously traveling vehicle 10 according to this embodiment travels autonomously based on a technology that uses a surrounding detection sensor to estimate the position and orientation on a point cloud map as its own position and orientation.
- the autonomously traveling vehicle 10 is equipped with a LiDAR 21 as the periphery detection sensor 203.
- the autonomous vehicle 10 estimates its own position and orientation through scan matching using the LiDAR 21.
- the autonomous vehicle 10 is a vehicle that implements LiDAR-SLAM.
- LiDAR-SLAM is a technology that can create a point cloud map and estimate its own position and orientation without relying on a satellite system such as GNSS.
- the technology for estimating the vehicle's position and orientation on a point cloud map using a surrounding detection sensor is not limited to LiDAR-SLAM.
- this embodiment can also be applied to autonomous vehicles based on technologies such as visual SLAM, which uses a camera as the surrounding detection sensor; visual odometry, which estimates the pose from consecutive camera images; visual-inertial odometry, which estimates the pose by combining visual odometry with an inertial measurement unit (IMU); or odometer SLAM, which uses an odometer.
- the autonomous vehicle 10 may perform SLAM using a stereo camera or a depth camera instead of the LiDAR 21 as the surroundings detection sensor 203 to create a point cloud map and estimate its own position and orientation.
- the autonomous vehicle 10 may perform SLAM using a high-resolution millimeter-wave radar or imaging radar and a monocular camera instead of the LiDAR 21 as the surrounding detection sensor 203, to create a point cloud map and estimate its own position and orientation.
- the autonomous vehicle 10 may use, as the surrounding detection sensor 203, a fusion sensor that combines two or more of the following (11) to (14) to create a point cloud map and estimate its own position and orientation:
- (11) the LiDAR 21;
- (12) a millimeter-wave radar or imaging radar and a monocular camera;
- (13) a monocular camera;
- (14) a stereo camera or depth camera.
- this fusion sensor may also be combined with an electronic compass or an inertial measurement unit.
- Step S101: the self-location estimation unit 110 uses the surrounding detection sensor 203 to estimate the position and orientation on the point cloud map 51 as the self-location and orientation. Specifically, the self-location estimation unit 110 estimates the self-location and orientation using LiDAR-SLAM technology, by scan matching the point cloud data acquired by the LiDAR 21 against the point cloud map 51.
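The scan matching in step S101 can be illustrated with a deliberately simplified sketch: a brute-force search over candidate 2D poses that minimizes the distance between the transformed scan and the map. Real LiDAR-SLAM implementations use far more efficient techniques such as ICP or NDT; the function names, the toy map, and the candidate grid below are illustrative assumptions, not taken from the patent.

```python
import math

def transform(points, x, y, theta):
    """Apply a 2D rigid transform (rotation theta, then translation x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * px - s * py + x, s * px + c * py + y) for px, py in points]

def match_score(scan, map_points):
    """Sum of squared distances from each scan point to its nearest map point."""
    return sum(min((sx - mx) ** 2 + (sy - my) ** 2 for mx, my in map_points)
               for sx, sy in scan)

def scan_match(scan, map_points, candidates):
    """Brute-force scan matching: return the candidate pose whose transform
    best aligns the scan with the map."""
    return min(candidates, key=lambda p: match_score(transform(scan, *p), map_points))

# Toy point cloud map: an L-shaped wall with points every 0.2 m
map_points = [(0.2 * i, 0.0) for i in range(26)] + [(0.0, 0.2 * i) for i in range(26)]
# Fake a scan observed from pose (0.4, -0.2, 0.0): the map seen in the sensor frame
scan = transform(map_points, -0.4, 0.2, 0.0)

candidates = [(0.2 * ix, 0.2 * iy, 0.0) for ix in range(6) for iy in range(-3, 3)]
est = scan_match(scan, map_points, candidates)
print(est)  # → (0.4, -0.2, 0.0) up to floating-point noise
```

Both walls of the L must align for the score to vanish, which is why the search recovers the pose uniquely; a single straight wall would leave the pose ambiguous along the wall direction.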
- Step S102: the self-position determination unit 120 performs a first determination process S10 to determine whether the self-position and orientation are normal.
- Specifically, the self-position determination unit 120 compares the current self-position and orientation with the previously estimated self-position and orientation.
- the current self-position and orientation is the self-position and orientation obtained by scan matching in step S101.
- the previously estimated self-position and orientation is assumed to be stored in the storage unit 150 as the self-position and orientation 52.
- the comparison result is, specifically, the difference in position between the current self-position and orientation and the previously estimated self-position and orientation.
- if the position difference obtained as a result of the comparison is equal to or less than the position threshold 53, the self-position determination unit 120 determines that the current self-position and orientation is normal.
- the self-position determination unit 120 stores the current self-position and orientation determined to be normal in the storage unit 150 as the self-position and orientation 52. If the position difference obtained as a result of the comparison is greater than the position threshold 53, the self-position determination unit 120 determines that the current self-position and orientation is not normal.
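The first determination process S10 amounts to a threshold test on the position jump between consecutive estimates. A minimal sketch follows; the threshold value and the function name are illustrative assumptions, not from the patent.

```python
import math

POSITION_THRESHOLD = 0.5  # the position threshold 53 (metres; illustrative value)

def is_normal(current_xy, previous_xy, threshold=POSITION_THRESHOLD):
    """First determination process S10: the current self-position is normal if
    it has not jumped farther than the position threshold from the previous
    estimate stored as the self-position and orientation 52."""
    return math.dist(current_xy, previous_xy) <= threshold

print(is_normal((10.1, 5.0), (10.0, 5.0)))  # small step → True
print(is_normal((14.0, 5.0), (10.0, 5.0)))  # 4 m jump → False
```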
- FIG. 4 is a diagram showing a specific example of the first determination process S10 according to this embodiment.
- FIG. 4 shows the self-position and orientation at the time self-position estimation by SLAM was performed; this self-position and orientation is stored in the storage unit 150 as the self-position and orientation 52.
- FIG. 4 also shows the next frame, i.e., the current self-position and orientation.
- the comparison result is expressed as, for example, a value obtained by calculating an evaluation function.
- the comparison result is also called an estimation result. Specifically, the value obtained by calculating an evaluation function is expressed as a value such as a difference in coordinates or a difference in speed.
- Step S103: if the self-position and orientation are determined to be normal (YES in step S103), the process proceeds to step S107. If the self-position and orientation are determined not to be normal (NO in step S103), the process proceeds to step S104.
- Step S104: when it is determined that the self-position and orientation are not normal, the self-position estimation unit 110 acquires the current self-position and orientation using the GNSS reception data 41 received by the GNSS receiver 202, as an acquisition process S30.
- FIG. 5 is a diagram showing an example of the acquisition process S30 according to this embodiment.
- when the self-position estimation unit 110 determines that the self-position and orientation are not normal, it performs an acquisition process S30 in which it acquires the current self-position and orientation using the GNSS reception data 41 received by the GNSS receiver 202. That is, in the acquisition process S30, the current self-position and orientation are estimated using the GNSS receiver 202, which is a device different from the device used to estimate the self-position and orientation by SLAM using the surrounding detection sensor 203. This estimation is described later in examples 1 to 5 of the acquisition process S30.
- the different device in this case may be a camera and an image processor for the camera image, or other devices. Examples of other devices may include millimeter-wave radar or imaging radar and a camera, an electronic compass, an inertial measurement unit, etc.
- the self-position estimation unit 110 estimates the self-position and orientation using the GNSS reception data 41. Note that the self-position estimation is not always performed using the GNSS reception data 41, but is performed only when a self-position loss is detected.
- the self-position estimation unit 110 estimates the self-position and attitude using the GNSS reception data 41, and sets the self-position and attitude based on the GNSS reception data 41 as the current self-position and attitude.
- Step S105: the self-position determination unit 120 performs a second determination process S20 to determine whether the self-position and orientation obtained in the acquisition process S30 are normal.
- the second determination process S20 is performed in the same manner as the first determination process S10.
- in the first determination process S10, the current self-position and orientation to be compared with the previously estimated self-position and orientation is the self-position and orientation acquired by SLAM in step S101.
- in the second determination process S20, the current self-position and orientation to be compared with the previously estimated self-position and orientation is the self-position and orientation acquired by the acquisition process S30 in step S104, that is, the self-position and orientation acquired from the GNSS reception data 41.
- if the position difference obtained as a result of the comparison is equal to or less than the position threshold 53, the self-position determination unit 120 determines that the current self-position and orientation is normal.
- the self-position determination unit 120 may store the current self-position and attitude determined to be normal in the storage unit 150 as the self-position and attitude 52.
- in the first determination process S10, the final normal self-position and orientation is always the “self-position and orientation obtained by LiDAR-SLAM.” In step S106 of FIG. 3, by contrast, the self-position and orientation determined to be normal may be the self-position and orientation obtained from the GNSS reception data 41 rather than one obtained by LiDAR-SLAM.
- if the position difference obtained as a result of the comparison is greater than the position threshold 53, the self-position determination unit 120 determines that the current self-position and orientation is not normal.
- if the current self-position and orientation is determined to be normal, the self-position determination unit 120 takes the self-position and orientation based on the GNSS reception data 41 as the current self-position and orientation.
- Step S106: if the self-position and orientation are determined to be normal (YES in step S106), the process proceeds to step S107. If the self-position and orientation are determined not to be normal (NO in step S106), the process returns to step S104 and the acquisition process S30 is repeated.
- Step S107: the driving control unit 130 controls autonomous driving using the self-position and orientation determined to be normal.
- the driving control unit 130 generates control information 31 for controlling the autonomous driving using the self-position and attitude determined to be normal, and outputs the control information 31 to the vehicle driving device 204.
- the control information 31 is information that determines the speed and steering of the autonomously driving vehicle 10.
- the vehicle driving device 204 is a device such as a vehicle motor and drive shaft, and performs autonomous driving based on the control information 31.
- the driving control unit 130 may decelerate the autonomous driving until the second determination process S20 determines that the self-position and orientation are normal. Alternatively, the driving control unit 130 may stop the autonomous driving until the second determination process S20 determines that the self-position and orientation are normal.
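The flow of steps S101 to S107 can be sketched as one control cycle. The function names, decomposition, and stub behaviours below are illustrative assumptions; the patent does not prescribe this structure.

```python
def autonomous_driving_step(slam_estimate, acquire_fallback, is_normal, drive, slow_down):
    """One cycle of the FIG. 3 flow (S101-S107), as a hedged sketch.

    slam_estimate()    - S101: self-position/orientation from LiDAR-SLAM
    acquire_fallback() - S104: acquisition process S30 (e.g. GNSS-based)
    is_normal(pose)    - S102/S105: determination processes S10/S20
    drive(pose)        - S107: generate the control information 31
    slow_down()        - optional deceleration (or stop) while recovering
    """
    pose = slam_estimate()
    if is_normal(pose):                # S102 / S103: YES
        drive(pose)                    # S107
        return pose
    slow_down()                        # decelerate while the pose is recovered
    while True:                        # repeat S104-S106 until normal
        pose = acquire_fallback()
        if is_normal(pose):
            drive(pose)                # S107, then return to SLAM next cycle
            return pose

# Minimal stubs to exercise one recovery cycle:
log = []
fallback_poses = iter([(100.0, 100.0), (0.1, 0.0)])  # first fallback fails, second succeeds
pose = autonomous_driving_step(
    slam_estimate=lambda: (99.0, 99.0),              # SLAM has jumped: not normal
    acquire_fallback=lambda: next(fallback_poses),
    is_normal=lambda p: abs(p[0]) < 1.0,
    drive=log.append,
    slow_down=lambda: log.append("decelerate"),
)
print(pose)  # → (0.1, 0.0)
```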
- <Example 1 of acquisition process S30> the self-position estimation unit 110 may acquire the current self-position using the GNSS reception data 41, and may acquire the attitude of the previously estimated self-position and orientation as the current attitude.
- in this case, the position may be estimated by high-precision RTK (Real Time Kinematic) positioning with a position error on the order of centimeters.
- FIG. 6 is a diagram showing a second example of the acquisition process S30 according to the present embodiment.
- the self-position estimation unit 110 determines that the self-position and attitude are incorrect, in acquisition process S30, it acquires the current self-position and attitude using the GNSS reception data 41.
- the self-position estimation unit 110 may set a position obtained by gradually shifting the self-position of the self-position and attitude as the current self-position, and may acquire an attitude shifted by a predetermined angle from the attitude of the self-position and attitude as the current attitude. Then, the self-position estimation unit 110 sets the current self-position and current attitude as the current self-position and attitude, and performs the second determination process S20 and subsequent processes.
- the predetermined angle is, for example, 10 degrees. If the predetermined angle is 1 degree, the acquisition process S30 will be performed a maximum of 360 times until the current self-position and orientation is determined to be normal.
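Example 2, in which the attitude is shifted by a predetermined angle on each attempt, can be sketched as follows. The 10-degree step matches the example in the text, while the acceptance test standing in for the second determination process S20 is an illustrative assumption.

```python
import math

def attitude_candidates(initial_deg, step_deg=10):
    """Example 2 of acquisition process S30: yield attitudes shifted by a
    predetermined angle (here 10 degrees) until the full circle is covered."""
    for i in range(math.ceil(360 / step_deg)):
        yield (initial_deg + i * step_deg) % 360

def recover_attitude(initial_deg, is_normal, step_deg=10):
    """Try each shifted attitude in turn; return the first one that the
    second determination process (here a stand-in predicate) judges normal."""
    for attitude in attitude_candidates(initial_deg, step_deg):
        if is_normal(attitude):
            return attitude
    return None

# With a 1-degree step the process runs at most 360 times, as the text notes.
found = recover_attitude(0, lambda a: 85 <= a <= 95, step_deg=10)
print(found)  # → 90
```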
- <Example 3 of acquisition process S30> when the self-position estimation unit 110 determines that the self-position and orientation are not normal, it may, as the acquisition process S30, acquire the current self-position using the GNSS reception data 41 and the current attitude using an electronic compass. Furthermore, when the self-position estimation unit 110 determines that the self-position and orientation are not normal, it may, as the acquisition process S30, acquire the current self-position using the GNSS reception data 41 and the current attitude using an inertial measurement unit.
- alternatively, the self-position estimation unit 110 may acquire the current self-position using the GNSS reception data 41, and may acquire the current attitude using the vector of the difference between the current self-position based on the GNSS reception data 41 and the previous self-position acquired from the previous GNSS reception data 41, together with an inertial measurement unit.
- the self-position estimation unit 110 may acquire the current self-position and orientation from road landmarks or the like using a camera in acquisition processing S30.
- the acquisition process S30 is repeated until the self-position and orientation obtained by the acquisition process S30 is determined to be normal by the second determination process S20.
- the acquisition process S30 may be repeated according to rules such as (1) to (3) below. For example:
- from the 1st to the Kth time, the attitude is acquired from the attitude of the previous self-position and orientation stored in the storage unit 150;
- from the (L+1)th to the Mth time, the self-position is acquired using the GNSS reception data 41, and the attitude is obtained using an electronic compass.
- here, K, L, and M are arbitrary integers satisfying, for example, 1 < K < L < M.
- the order of (1) to (3) and the combination of methods for obtaining the self-position and orientation are also arbitrary.
- a count threshold may be stored in the storage unit 150.
- the self-position estimation unit 110 may have a function of issuing a warning message indicating that a normal self-position and orientation cannot be acquired when the number of times the acquisition process S30 is repeated exceeds the count threshold.
- the time threshold may be stored in the storage unit 150.
- the self-position estimation unit 110 may have a function of issuing a warning message indicating that a normal self-position and orientation cannot be acquired when the time taken to repeat the acquisition process S30 exceeds the time threshold.
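The count threshold and the time threshold can be combined in the recovery loop as sketched below; the threshold values and the warning mechanism are illustrative assumptions.

```python
import time

def acquire_with_limits(acquire, is_normal, count_threshold=20,
                        time_threshold_s=5.0, warn=print):
    """Repeat the acquisition process S30, but issue a warning message once
    the retry count or the elapsed time exceeds its threshold (thresholds
    are illustrative values, not from the patent)."""
    start = time.monotonic()
    attempts = 0
    while True:
        attempts += 1
        pose = acquire()
        if is_normal(pose):
            return pose
        if attempts > count_threshold or time.monotonic() - start > time_threshold_s:
            warn("warning: a normal self-position and orientation cannot be acquired")
            return None

warnings = []
poses = iter([None, None, (1.0, 2.0)])  # two failed attempts, then success
result = acquire_with_limits(lambda: next(poses), lambda p: p is not None,
                             count_threshold=10, warn=warnings.append)
print(result)  # → (1.0, 2.0)
```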
- <Modification 1> the autonomously traveling vehicle may be equipped with a camera and may travel autonomously based on a method of estimating its own position and orientation using visual SLAM, which uses the camera.
- the autonomously traveling vehicle may also be equipped with an odometer and travel autonomously based on a method of estimating its own position and orientation using odometer SLAM, which uses the odometer.
- <Modification 2> in the first embodiment, after acquiring the self-position and orientation using the GNSS reception data, if the acquired self-position and orientation is normal, the vehicle performs driving control using that (GNSS-based) self-position and orientation, and then the self-position and orientation is estimated using LiDAR-SLAM, which serves as the base. In this modification, on the other hand, after obtaining the self-position and orientation from the GNSS reception data, if the obtained self-position and orientation is normal, the self-position and orientation may be estimated again by LiDAR-SLAM before driving control is resumed.
- FIG. 7 is a flowchart showing the autonomous driving process of the autonomous driving vehicle 10 according to a modification of this embodiment. FIG. 7 differs from FIG. 3 in that, if the answer in step S106 is YES, the process returns to step S101 and the self-position and orientation are estimated using LiDAR-SLAM, which serves as the base. Specifically, in FIG. 7, the following four processes are assumed to be performed:
- (1) the self-position estimated by LiDAR-SLAM becomes unstable;
- (2) the self-position from the GNSS receiver is temporarily stored in memory;
- (3) the information in (2) is passed as complementary information to the LiDAR-SLAM self-position estimation process;
- (4) once the self-position is stabilized by LiDAR-SLAM self-location estimation, autonomous driving resumes, and the information from (2) is deleted from memory.
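The four processes above can be sketched as follows. The function signatures and the stub SLAM behaviour are illustrative assumptions; in particular, how the GNSS pose is used as complementary information inside SLAM re-localisation is not specified by the text.

```python
def recover_with_gnss_hint(slam, gnss_fix, is_normal, max_attempts=100):
    """Modification 2 (FIG. 7), as a hedged sketch: the GNSS-based pose is
    held in a temporary buffer and passed to LiDAR-SLAM as complementary
    information until the SLAM estimate stabilises; the buffer is then
    discarded and the stabilised SLAM pose is returned for driving control."""
    hint = gnss_fix()              # (2) temporarily store the GNSS pose
    while max_attempts:
        pose = slam(hint=hint)     # (3) feed the hint to SLAM re-localisation
        if is_normal(pose):        # (4) SLAM is stable again
            hint = None            #     delete the temporary information
            return pose
        max_attempts -= 1
    return None

# Stub SLAM: unstable for two frames, then stable
stability = iter([False, False, True])
def slam(hint=None):
    return (hint, next(stability))

pose = recover_with_gnss_hint(slam, lambda: (3.0, 4.0), lambda p: p[1])
print(pose)  # → ((3.0, 4.0), True)
```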
- <Example 2 of acquisition process S30> described in the first embodiment may also be implemented in the process of FIG. 7.
- <Example 2 of acquisition process S30> handles the case where the self-position is roughly correct but the attitude is not.
- when the self-position estimation unit 110 determines that the self-position and orientation are not normal (step S103), it acquires the current self-position and orientation using the GNSS reception data 41 as the acquisition process S30 (step S104).
- here, the self-position estimation unit 110 may set a position obtained by gradually shifting the self-position of the self-position and orientation as the current self-position, and may acquire an attitude shifted by a predetermined angle from the attitude of the self-position and orientation as the current attitude (step S104). Then, the self-position estimation unit 110 sets the current self-position and current attitude as the current self-position and orientation, and performs the second determination process S20 and subsequent processes (step S105). In step S106, if the current self-position and orientation is normal, the process returns to step S101 and the LiDAR-SLAM self-position estimation process is performed.
- in this modification, recovery begins when LiDAR-SLAM detects that the vehicle has lost its position; and because the final estimation of the vehicle's position and orientation is performed using LiDAR-SLAM, the vehicle's position and orientation can be estimated even if the GNSS receiver has not obtained a fixed (high-precision) solution.
- the functions of the self-location estimation unit 110, the self-location determination unit 120, and the driving control unit 130 are realized by software.
- the functions of the self-location estimation unit 110, the self-location determination unit 120, and the driving control unit 130 may be realized by hardware.
- the control device 100 includes an electronic circuit 909 instead of a processor 910 .
- FIG. 8 is a diagram showing an example of the configuration of a control device 100 according to a modification of this embodiment.
- the electronic circuit 909 is a dedicated electronic circuit that realizes the functions of the self-position estimation unit 110, the self-position determination unit 120, and the driving control unit 130.
- the electronic circuit 909 is a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a GA, an ASIC, or an FPGA.
- GA is an abbreviation for Gate Array.
- ASIC is an abbreviation for Application Specific Integrated Circuit.
- FPGA is an abbreviation for Field-Programmable Gate Array.
- the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 may be realized by a single electronic circuit, or may be distributed across multiple electronic circuits.
- some of the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 may be implemented by electronic circuits, with the remaining functions being implemented by software. Also, some or all of the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 may be implemented by firmware.
- Each of the processor and electronic circuits is also called processing circuitry.
- the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 are realized by processing circuitry.
- the autonomously running mobile body controls autonomous running using the self-position and attitude determined to be normal, and autonomously runs by estimating the self-position and attitude by SLAM using a periphery detection sensor.
- SLAM scan matching
- a perimeter detection sensor provides highly accurate self-position and orientation.
- the vehicle may lose track of its own position if it rotates or if the approximate location and the entire surrounding field of view are obscured.
- the present embodiment aims to restore scan matching on the spot, rather than moving the autonomous vehicle using an alternative method other than scan matching. Therefore, according to the autonomously driving vehicle of this embodiment, it is possible to continue driving based on autonomous driving using SLAM with a surrounding detection sensor, thereby achieving the effect of enabling autonomous driving to be performed with high precision and low cost.
- each unit of the control device has been described as an independent functional block.
- the configuration of the control device does not have to be the same as that of the above-described embodiment.
- the functional blocks of the control device may have any configuration as long as they can realize the functions described in the above-described embodiment.
- the control device may not be a single device, but may be a system composed of multiple devices.
Abstract
Description
本開示は、自律走行移動体、自律走行方法、および自律走行プログラムに関する。特に、周辺検知センサを使って点群地図における自己位置姿勢を推定するSLAM(Simultaneous Localization and Mapping)といった技術をベースに自律走行する自律走行移動体に関する。 This disclosure relates to an autonomously traveling vehicle, an autonomous traveling method, and an autonomous traveling program. In particular, it relates to an autonomously traveling vehicle that travels autonomously based on technology such as SLAM (Simultaneous Localization and Mapping), which uses surrounding detection sensors to estimate its own position and orientation on a point cloud map.
施設内を自律走行する移動体ロボットとして自律走行可能なPMVあるいはAMRといったパーソナルモビリティの開発が進められている。PMVは、Personal Mobility Vehicleの略語である。AMRは、Autonomous Mobile Robotの略語である。PMVあるいはAMRといった小型の移動体の開発は、例えば、自動車の自律走行におけるラストワンマイル補完、自動車を持っていない交通弱者への対策、自動車が入れない施設内での利用、あるいは施設内における無人のサービスの提供といった目的を有している。PMVあるいはAMRはLiDAR-SLAMによって自己位置推定を行っている。 Development is underway for personal mobility vehicles such as PMVs and AMRs, which are mobile robots capable of autonomously driving within facilities. PMV is an abbreviation for Personal Mobility Vehicle. AMR is an abbreviation for Autonomous Mobile Robot. The development of small mobile vehicles such as PMVs and AMRs has the following aims: to supplement the last mile of autonomous driving for automobiles, to provide assistance to those with limited mobility who do not own cars, to be used in facilities where cars cannot enter, or to provide unmanned services within facilities. PMVs and AMRs estimate their self-location using LiDAR-SLAM.
PMVあるいはAMRは屋内外での利用が期待されている。PMVあるいはAMRの車体に、光ファイバジャイロのような高精度な姿勢計測用のジャイロセンサや、車輪、駆動軸等に車速センサを取り付けて走行距離精度を高めるオドメータを搭載した際に、その設置スペースに伴い車体が大型化することにより利用範囲が狭まってしまう懸念や、その機器設置により低価格化が妨げられることがあった。
また、LiDAR-SLAMのみで自己位置推定を行うと、周囲が似た地形の場合などに自己位置推定を誤ってしまう。また、屋外および屋内を行き来して走行する場合、GNSS受信データのみで自己位置推定を行うと、FIX(位置情報がcm級)以外は誤差が大きく、自己位置を誤ってしまう。GNSSは、Global Navigation Satellite Systemの略語である。
PMVs and AMRs are expected to be used both indoors and outdoors. However, when a PMV or AMR is equipped with a gyro sensor for high-precision attitude measurement, such as an optical fiber gyroscope, or an odometer that improves the accuracy of the distance traveled by attaching a vehicle speed sensor to the wheels or drive shaft, there are concerns that the installation space required will increase the size of the vehicle, narrowing the range of use, and that the installation of such equipment will hinder price reduction.
Furthermore, if self-location estimation is performed using only LiDAR-SLAM, the self-location estimation will be incorrect if the surrounding terrain is similar. Furthermore, when traveling between indoors and outdoors, if self-location estimation is performed using only GNSS reception data, the error will be large except for FIX (position information on the centimeter level), resulting in an incorrect self-location. GNSS is an abbreviation for Global Navigation Satellite System.
特許文献1には、GNSS受信データによる自己位置と、LiDARからの点群データによる自己位置との確率分布を合成し、合成した確率分布であるSLAM/GNSS確率分布のピーク位置を自己位置として決定するという技術が開示されている。SLAMは、Simultaneous Localization and Mappingの略語である。 Patent Document 1 discloses a technology that combines the probability distribution of a vehicle's own position determined by GNSS reception data and that determined by point cloud data from LiDAR, and determines the peak position of the combined SLAM/GNSS probability distribution as the vehicle's own position. SLAM is an abbreviation for Simultaneous Localization and Mapping.
LiDARといった周辺検知センサを利用するSLAMは、GNSS受信データに依存することなく、点群地図(環境地図とも言う)の作成と自己位置姿勢の推定を高精度かつ低コストに行うことができる。よって、周辺検知センサを利用するSLAMをベースとして自己位置推定を行うことが求められている。
一方、特許文献1の技術は、GNSS受信データによる自己位置推定とLiDAR-SLAMによる自己位置推定とを組み合わせたものである。よって、周辺検知センサを利用するSLAMをベースとして自己位置推定を行うことはできないという課題がある。
SLAM, which uses a surrounding detection sensor such as LiDAR, can create a point cloud map (also known as an environmental map) and estimate the vehicle's position and orientation with high accuracy and low cost without relying on GNSS reception data. Therefore, there is a demand for self-position estimation based on SLAM, which uses a surrounding detection sensor.
On the other hand, the technology of Patent Document 1 combines self-location estimation using GNSS reception data with self-location estimation using LiDAR-SLAM, which poses a problem in that it is not possible to perform self-location estimation based on SLAM, which uses a surrounding detection sensor.
本開示に係る自律走行移動体では、周辺検知センサを利用するSLAMによる自己位置推定が失敗した場合に他の手法を用いて自己位置推定を実施することで、周辺検知センサを利用するSLAMに復帰することを目的とする。 The autonomous vehicle disclosed herein aims to return to SLAM using a surrounding detection sensor by performing self-location estimation using another method if self-location estimation using SLAM using a surrounding detection sensor fails.
本開示に係る自律走行移動体は、周辺検知センサを搭載し、自律走行を行う自律走行移動体において、
前記周辺検知センサを使って点群地図における位置および姿勢を自己位置姿勢として推定する自己位置推定部と、
前記自己位置姿勢が正常か否かを判定する判定処理を実施する自己位置判定部と、
前記自己位置姿勢が正常と判定されると、前記自己位置姿勢を用いて前記自律走行を制御する走行制御部と
を備え、
前記自己位置推定部は、
前記自己位置姿勢が正常でないと判定されると、正常な自己位置姿勢を取得するための取得処理を実施し、前記取得処理により得られた自己位置姿勢が前記判定処理により正常と判定されるまで前記取得処理を繰り返す。
The autonomous traveling vehicle according to the present disclosure is an autonomous traveling vehicle equipped with a surrounding detection sensor and performing autonomous traveling,
a self-position estimation unit that estimates a position and orientation on a point cloud map as a self-position and orientation using the surrounding detection sensor;
a self-position determination unit that performs a determination process to determine whether the self-position and orientation are normal;
a driving control unit that controls the autonomous driving using the self-position and attitude when the self-position and attitude are determined to be normal,
The self-position estimation unit
If it is determined that the self-position and attitude is not normal, an acquisition process is carried out to acquire a normal self-position and attitude, and the acquisition process is repeated until the self-position and attitude obtained by the acquisition process is determined to be normal by the determination process.
本開示に係る自律走行移動体では、周辺検知センサを使って点群地図における自己位置姿勢を推定する処理により得られた自己位置姿勢が正常でないと判定されると別の手法である取得処理を用いて正常な自己位置姿勢を取得する。そして、本開示に係る自律走行移動体では、取得処理により得られた自己位置姿勢が判定処理により正常と判定されるまで取得処理を繰り返す。そして、本開示に係る自律走行移動体では、取得処理により得られた自己位置姿勢が正常と判定されると、正常と判定された自己位置姿勢を用いて自律走行を制御し、周辺検知センサを使って点群地図における自己位置姿勢を推定する処理に戻る。よって、本開示に係る自律走行移動体によれば、周辺検知センサを使って点群地図における自己位置姿勢を推定する処理を用いた自律走行をベースとした走行を行うことができ、高精度かつ低コストに自律走行を行うことができるという効果を奏する。 In the autonomous driving mobile body according to the present disclosure, if it is determined that the self-position and attitude obtained by the process of estimating the self-position and attitude on a point cloud map using a periphery detection sensor is not normal, a different method, called acquisition processing, is used to acquire a normal self-position and attitude. Then, in the autonomous driving mobile body according to the present disclosure, the acquisition processing is repeated until the self-position and attitude obtained by the acquisition processing is determined to be normal by a determination processing. Then, in the autonomous driving mobile body according to the present disclosure, if the self-position and attitude obtained by the acquisition processing is determined to be normal, it controls autonomous driving using the self-position and attitude determined to be normal, and returns to the process of estimating the self-position and attitude on a point cloud map using a periphery detection sensor. Therefore, the autonomous driving mobile body according to the present disclosure can perform driving based on autonomous driving using a process of estimating the self-position and attitude on a point cloud map using a periphery detection sensor, achieving the effect of enabling autonomous driving with high precision and low cost.
以下、本実施の形態について、図を用いて説明する。各図中、同一または相当する部分には、同一符号を付している。実施の形態の説明において、同一または相当する部分については、説明を適宜省略または簡略化する。また、以下の図では各構成部材の大きさの関係が実際のものとは異なる場合がある。また、実施の形態の説明において、上、下、左、右、前、後、表、裏といった向きあるいは位置が示されている場合がある。これらの表記は、説明の便宜上の記載であり、装置、器具、あるいは部品などの配置、方向および向きを限定するものではない。 The present embodiment will now be described with reference to the drawings. In each drawing, identical or corresponding parts are designated by the same reference numerals. In describing the embodiment, the description of identical or corresponding parts will be omitted or simplified as appropriate. Furthermore, the size relationships between the components in the drawings below may differ from the actual size. Furthermore, in describing the embodiment, directions or positions such as up, down, left, right, front, rear, front and back may be indicated. These notations are used for convenience of description and do not limit the placement, direction or orientation of devices, instruments, parts, etc.
実施の形態1.
***構成の説明***
図1は、本実施の形態に係る自律走行移動体10の外観の例を示す図である。
図2は、本実施の形態に係る自律走行移動体10の構成例を示す図である。
自律走行移動体10は、例えば施設内の走行路を自律走行あるいは手動走行するPMVあるいはAMRといった移動体である。PMVは、有人の移動体である。PMVは、自律走行可能であっても良い。PMVは、自律走行と手動走行とを切り替え可能であってもよい。AMRは、無人の移動体であり、施設内を自律走行する。
Embodiment 1.
***Configuration Description***
FIG. 1 is a diagram showing an example of the appearance of an autonomously running vehicle 10 according to this embodiment.
FIG. 2 is a diagram showing an example of the configuration of the autonomously running vehicle 10 according to this embodiment.
The autonomously traveling vehicle 10 is, for example, a vehicle such as a PMV or AMR that travels autonomously or manually on a route within a facility. The PMV is a manned vehicle. The PMV may be capable of autonomous traveling. The PMV may be capable of switching between autonomous traveling and manual traveling. The AMR is an unmanned vehicle that travels autonomously within a facility.
自律走行移動体10は、制御装置100、GNSS受信アンテナ201、GNSS受信機202、周辺検知センサ203、および車両走行装置204を備える。
制御装置100は、GNSS受信アンテナ201およびGNSS受信機202を介して、衛星システムからGNSS受信データ41を受信する。
制御装置100は、LiDAR21といった周辺検知センサ203を利用するSLAMにより自己位置姿勢の推定(ローカリゼーション)を行い、自律走行のための制御情報31を車両走行装置204に出力する。制御情報31は、自律走行移動体10の速度・操舵を決定する情報である。
車両走行装置204は車体モータおよび駆動軸といった装置であり、制御情報31に基づいて自律走行を実施する。
The autonomously traveling vehicle 10 includes a control device 100 , a GNSS receiving antenna 201 , a GNSS receiver 202 , a surroundings detection sensor 203 , and a vehicle traveling device 204 .
The control device 100 receives GNSS reception data 41 from the satellite system via a GNSS reception antenna 201 and a GNSS receiver 202 .
The control device 100 estimates its own position and orientation (localization) by SLAM using a surrounding detection sensor 203 such as the LiDAR 21, and outputs control information 31 for autonomous driving to the vehicle driving device 204. The control information 31 is information that determines the speed and steering of the autonomous driving vehicle 10.
The vehicle driving device 204 is a device such as a vehicle motor and a drive shaft, and performs autonomous driving based on the control information 31.
自律走行移動体10は、コンピュータである制御装置100を搭載する。
制御装置100は、プロセッサ910を備えるとともに、メモリ921、補助記憶装置922、入力インタフェース930、出力インタフェース940、表示機器941、および通信装置950といった他のハードウェアを備える。プロセッサ910は、信号線を介して他のハードウェアと接続され、これら他のハードウェアを制御する。
The autonomous vehicle 10 is equipped with a control device 100 which is a computer.
The control device 100 includes a processor 910, as well as other hardware such as a memory 921, an auxiliary storage device 922, an input interface 930, an output interface 940, a display device 941, and a communication device 950. The processor 910 is connected to the other hardware via signal lines and controls the other hardware.
制御装置100は、機能要素として、自己位置推定部110と自己位置判定部120と走行制御部130と記憶部150とを備える。記憶部150には、点群地図51と自己位置姿勢52と位置閾値53が記憶される。点群地図51は環境地図とも言う。点群地図51はSLAMによって得られ、周辺検知センサ203により得た3次元点群データから生成される。点群地図51はSLAMによって得られ、周辺検知センサ203により得た2次元点群データから生成されても良い。 The control device 100 has, as its functional elements, a self-position estimation unit 110, a self-position determination unit 120, a driving control unit 130, and a memory unit 150. The memory unit 150 stores a point cloud map 51, a self-position and attitude 52, and a position threshold 53. The point cloud map 51 is also called an environmental map. The point cloud map 51 is obtained by SLAM and is generated from three-dimensional point cloud data obtained by the periphery detection sensor 203. The point cloud map 51 may also be obtained by SLAM and generated from two-dimensional point cloud data obtained by the periphery detection sensor 203.
自己位置推定部110と自己位置判定部120と走行制御部130の機能は、ソフトウェアにより実現される。記憶部150は、メモリ921に備えられる。なお、記憶部150は、補助記憶装置922に備えられていてもよいし、メモリ921と補助記憶装置922に分散して備えられていてもよい。 The functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 are realized by software. The storage unit 150 is provided in the memory 921. Note that the storage unit 150 may be provided in the auxiliary storage device 922, or may be provided separately in the memory 921 and the auxiliary storage device 922.
プロセッサ910は、自律走行プログラムを実行する装置である。自律走行プログラムは、自己位置推定部110と自己位置判定部120と走行制御部130の機能を実現するプログラムである。
プロセッサ910は、演算処理を行うICである。プロセッサ910の具体例は、CPU、DSP、GPUである。ICは、Integrated Circuitの略語である。CPUは、Central Processing Unitの略語である。DSPは、Digital Signal Processorの略語である。GPUは、Graphics Processing Unitの略語である。
The processor 910 is a device that executes an autonomous driving program. The autonomous driving program is a program that realizes the functions of the self-position estimation unit 110, the self-position determination unit 120, and the driving control unit 130.
The processor 910 is an IC that performs arithmetic processing. Specific examples of the processor 910 are a CPU, a DSP, and a GPU. IC is an abbreviation for Integrated Circuit. CPU is an abbreviation for Central Processing Unit. DSP is an abbreviation for Digital Signal Processor. GPU is an abbreviation for Graphics Processing Unit.
メモリ921は、データを一時的に記憶する記憶装置である。メモリ921の具体例は、SRAM、あるいはDRAMである。SRAMは、Static Random Access Memoryの略語である。DRAMは、Dynamic Random Access Memoryの略語である。
補助記憶装置922は、データを保管する記憶装置である。補助記憶装置922の具体例は、HDDである。また、補助記憶装置922は、SD(登録商標)メモリカード、CF、NANDフラッシュ、フレキシブルディスク、光ディスク、コンパクトディスク、ブルーレイ(登録商標)ディスク、DVDといった可搬の記憶媒体であってもよい。なお、HDDは、Hard Disk Driveの略語である。SD(登録商標)は、Secure Digitalの略語である。CFは、CompactFlash(登録商標)の略語である。DVDは、Digital Versatile Diskの略語である。
The memory 921 is a storage device that temporarily stores data. Specific examples of the memory 921 are SRAM and DRAM. SRAM is an abbreviation for Static Random Access Memory. DRAM is an abbreviation for Dynamic Random Access Memory.
The auxiliary storage device 922 is a storage device that stores data. A specific example of the auxiliary storage device 922 is an HDD. The auxiliary storage device 922 may also be a portable storage medium such as an SD (registered trademark) memory card, CF, NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD. Note that HDD is an abbreviation for Hard Disk Drive. SD (registered trademark) is an abbreviation for Secure Digital. CF is an abbreviation for CompactFlash (registered trademark). DVD is an abbreviation for Digital Versatile Disk.
入力インタフェース930は、マウス、キーボード、あるいはタッチパネルといった入力装置と接続されるポートである。入力インタフェース930は、具体的には、USB端子である。なお、入力インタフェース930は、LANと接続されるポートであってもよい。USBは、Universal Serial Busの略語である。LANは、Local Area Networkの略語である。 The input interface 930 is a port that is connected to an input device such as a mouse, keyboard, or touch panel. Specifically, the input interface 930 is a USB terminal. Note that the input interface 930 may also be a port that is connected to a LAN. USB is an abbreviation for Universal Serial Bus. LAN is an abbreviation for Local Area Network.
出力インタフェース940は、ディスプレイといった出力機器のケーブルが接続されるポートである。出力インタフェース940は、具体的には、USB端子またはHDMI(登録商標)端子である。ディスプレイは、具体的には、LCDである。出力インタフェース940は、表示器インタフェースともいう。HDMI(登録商標)は、High Definition Multimedia Interfaceの略語である。LCDは、Liquid Crystal Displayの略語である。 The output interface 940 is a port to which a cable from an output device such as a display is connected. Specifically, the output interface 940 is a USB terminal or an HDMI (registered trademark) terminal. Specifically, the display is an LCD. The output interface 940 is also called a display interface. HDMI (registered trademark) is an abbreviation for High Definition Multimedia Interface. LCD is an abbreviation for Liquid Crystal Display.
通信装置950は、レシーバとトランスミッタを有する。通信装置950は、LAN、インターネット、電話回線、あるいはWi-Fi(登録商標)といった通信網に接続している。通信装置950は、具体的には、通信チップまたはNICである。NICは、Network Interface Cardの略語である。 The communication device 950 has a receiver and a transmitter. The communication device 950 is connected to a communication network such as a LAN, the Internet, a telephone line, or Wi-Fi (registered trademark). Specifically, the communication device 950 is a communication chip or NIC. NIC is an abbreviation for Network Interface Card.
自律走行プログラムは、制御装置100において実行される。自律走行プログラムは、プロセッサ910に読み込まれ、プロセッサ910によって実行される。メモリ921には、自律走行プログラムだけでなく、OSも記憶されている。OSは、Operating Systemの略語である。プロセッサ910は、OSを実行しながら、自律走行プログラムを実行する。自律走行プログラムおよびOSは、補助記憶装置922に記憶されていてもよい。補助記憶装置922に記憶されている自律走行プログラムおよびOSは、メモリ921にロードされ、プロセッサ910によって実行される。なお、自律走行プログラムの一部または全部がOSに組み込まれていてもよい。 The autonomous driving program is executed by the control device 100. The autonomous driving program is loaded into the processor 910 and executed by the processor 910. In addition to the autonomous driving program, the OS is also stored in the memory 921. OS is an abbreviation for Operating System. The processor 910 executes the autonomous driving program while running the OS. The autonomous driving program and OS may be stored in the auxiliary storage device 922. The autonomous driving program and OS stored in the auxiliary storage device 922 are loaded into the memory 921 and executed by the processor 910. Note that part or all of the autonomous driving program may be incorporated into the OS.
制御装置100は、プロセッサ910を代替する複数のプロセッサを備えていてもよい。これら複数のプロセッサは、自律走行プログラムの実行を分担する。それぞれのプロセッサは、プロセッサ910と同じように、自律走行プログラムを実行する装置である。 The control device 100 may be equipped with multiple processors that replace the processor 910. These multiple processors share the execution of the autonomous driving program. Each processor is a device that executes the autonomous driving program, just like the processor 910.
自律走行プログラムにより利用、処理または出力されるデータ、情報、信号値および変数値は、メモリ921、補助記憶装置922、または、プロセッサ910内のレジスタあるいはキャッシュメモリに記憶される。 Data, information, signal values, and variable values used, processed, or output by the autonomous driving program are stored in memory 921, auxiliary storage device 922, or registers or cache memory within processor 910.
自己位置推定部110と自己位置判定部120と走行制御部130の各部の「部」を「回路」、「工程」、「手順」、「処理」、あるいは「サーキットリー」に読み替えてもよい。自律走行プログラムは、自己位置推定処理と自己位置判定処理と走行制御処理をコンピュータに実行させる。自己位置推定処理と自己位置判定処理と走行制御処理の「処理」を「プログラム」、「プログラムプロダクト」、「プログラムを記憶したコンピュータ読取可能な記憶媒体」、または「プログラムを記録したコンピュータ読取可能な記録媒体」に読み替えてもよい。また、自律走行方法は、制御装置100が自律走行プログラムを実行することにより行われる方法である。
自律走行プログラムは、コンピュータ読取可能な記録媒体に格納されて提供されてもよい。また、自律走行プログラムは、プログラムプロダクトとして提供されてもよい。
The "parts" of the self-location estimation unit 110, the self-location determination unit 120, and the driving control unit 130 may be read as "circuits,""steps,""procedures,""processing," or "circuitry." The autonomous driving program causes a computer to execute a self-location estimation process, a self-location determination process, and a driving control process. The "processing" of the self-location estimation process, the self-location determination process, and the driving control process may be read as a "program,""programproduct,""a computer-readable storage medium storing a program," or "a computer-readable recording medium recording a program." Furthermore, the autonomous driving method is a method performed by the control device 100 executing the autonomous driving program.
The autonomous driving program may be provided by being stored on a computer-readable recording medium. Alternatively, the autonomous driving program may be provided as a program product.
***動作の説明***
次に、本実施の形態に係る自律走行移動体10の動作について説明する。自律走行移動体10の動作手順は、自律走行方法に相当する。また、自律走行移動体10の動作である自律走行処理を実現するプログラムは、自律走行プログラムに相当する。
***Explanation of Operation***
Next, the operation of the autonomously traveling vehicle 10 according to this embodiment will be described. The operation procedure of the autonomously traveling vehicle 10 corresponds to an autonomous traveling method. Furthermore, a program that realizes the autonomous traveling process, which is the operation of the autonomously traveling vehicle 10, corresponds to an autonomous traveling program.
図3は、本実施の形態に係る自律走行移動体10の自律走行処理を示すフロー図である。
自律走行移動体10は、周辺検知センサ203を搭載し、自律走行を行う。自律走行移動体10は、周辺検知センサ203を使って、点群地図51における位置および姿勢を自己位置姿勢として推定する。そして、自律走行移動体10は、推定した自己位置姿勢を用いて自律走行のための制御情報31を生成する。
本実施の形態に係る自律走行移動体10は、このように周辺検知センサを使って点群地図における位置および姿勢を自己位置姿勢として推定する技術をベースとして自律走行を行う。
FIG. 3 is a flow diagram showing the autonomous driving process of the autonomous driving vehicle 10 according to this embodiment.
The autonomously traveling vehicle 10 is equipped with a periphery detection sensor 203 and travels autonomously. The autonomously traveling vehicle 10 uses the periphery detection sensor 203 to estimate the position and orientation in the point cloud map 51 as its own position and orientation. The autonomously traveling vehicle 10 then generates control information 31 for autonomous traveling using the estimated position and orientation.
The autonomously running vehicle 10 according to this embodiment performs autonomous running based on a technology that uses a perimeter detection sensor to estimate the position and orientation on a point cloud map as its own position and orientation.
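One cycle of this flow (estimate by SLAM, judge, fall back to an acquisition process until the pose is judged normal, then drive) can be sketched as below. The callables are hypothetical stand-ins for the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130, not the actual implementation.

```python
# Sketch of one cycle of the autonomous driving process of FIG. 3
# (all callables and values are illustrative assumptions).

def autonomous_driving_step(slam_estimate, is_normal, acquire, control):
    pose = slam_estimate()        # S101: SLAM self-position estimation
    while not is_normal(pose):    # S102/S103: determination process
        pose = acquire()          # S104: acquisition process S30 (e.g. GNSS)
    control(pose)                 # S107: generate control information 31
    return pose

log = []
attempts = iter([(9.9, 0.0), (1.0, 0.0)])    # first acquisition still bad, second good
result = autonomous_driving_step(
    slam_estimate=lambda: (99.0, 0.0),       # SLAM result judged abnormal
    is_normal=lambda p: p[0] < 2.0,          # illustrative position-threshold check
    acquire=lambda: next(attempts),
    control=log.append,
)
print(result, log)  # (1.0, 0.0) [(1.0, 0.0)]
```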
本実施の形態では、自律走行移動体10は、周辺検知センサ203としてLiDAR21を備える。
自律走行移動体10は、LiDAR21を用いたスキャンマッチングにより自己位置姿勢を推定する。すなわち、自律走行移動体10は、LiDAR-SLAMを実施する移動体である。LiDAR-SLAMは、GNSSといった衛星システムに依存することなく、点群地図の作成と自己位置姿勢の推定を行うことができる技術である。
なお、周辺検知センサを使って点群地図における自己位置姿勢を推定する技術は、LiDAR-SLAMに限らない。その他、周辺検知センサとしてカメラを使ったビジュアルSLAM、カメラから得られた連続画像を用いて姿勢を推定するVisual Odometry(ビジュアルオドメトリー)、Visual Odometryに慣性計測装置(IMU:Inertial Measurement Unit)を組み合わせて姿勢を推定するVisual-Inertial Odometry(ビジュアルイナーシャルオドメトリー)、あるいはオドメータを使ったオドメータSLAMといった技術をベースとする自律走行移動体についても、本実施の形態を適用することができる。
また、自律走行移動体10は、周辺検知センサ203として、LiDAR21の代わりにステレオカメラもしくはデプスカメラを用いてSLAMを実施し、点群地図の作成と自己位置姿勢の推定を行っても良い。
また、自律走行移動体10は、周辺検知センサ203として、LiDAR21の代わりに高分解能なミリ波レーダもしくはイメージングレーダと、単眼カメラとを用いてSLAMを実施し、点群地図の作成と自己位置姿勢の推定を行っても良い。
また、自律走行移動体10は、周辺検知センサ203として、以下の(11)から(14)のうちの複数を複合させたフュージョンセンサを用いて、点群地図の作成と自己位置姿勢の推定を行っても良い。
(11)LiDAR21
(12)ミリ波レーダもしくはイメージングレーダおよび単眼カメラ
(13)単眼カメラ
(14)ステレオカメラもしくはデプスカメラ
またこのフュージョンセンサは電子コンパスまたは慣性計測装置を組み合わせても良い。
In this embodiment, the autonomously traveling vehicle 10 is equipped with a LiDAR 21 as the periphery detection sensor 203.
The autonomous vehicle 10 estimates its own position and orientation through scan matching using the LiDAR 21. In other words, the autonomous vehicle 10 is a vehicle that implements LiDAR-SLAM. LiDAR-SLAM is a technology that can create a point cloud map and estimate its own position and orientation without relying on a satellite system such as GNSS.
The technology for estimating the vehicle's position and orientation on a point cloud map using a perimeter detection sensor is not limited to LiDAR-SLAM. This embodiment can also be applied to autonomous vehicles based on technologies such as visual SLAM, which uses a camera as a perimeter detection sensor, visual odometry, which estimates orientation using continuous images obtained from a camera, visual-inertial odometry, which estimates orientation by combining visual odometry with an inertial measurement unit (IMU), or odometer SLAM, which uses an odometer.
In addition, the autonomous vehicle 10 may perform SLAM using a stereo camera or a depth camera instead of the LiDAR 21 as the surroundings detection sensor 203 to create a point cloud map and estimate its own position and orientation.
In addition, the autonomous vehicle 10 may perform SLAM using a high-resolution millimeter-wave radar or imaging radar and a monocular camera instead of the LiDAR 21 as the surrounding detection sensor 203, to create a point cloud map and estimate its own position and orientation.
In addition, the autonomous vehicle 10 may use a fusion sensor that combines two or more of the following (11) to (14) as the surrounding detection sensor 203 to create a point cloud map and estimate its own position and orientation.
(11) LiDAR21
- (12) Millimeter wave radar or imaging radar and monocular camera
- (13) Monocular camera
- (14) Stereo camera or depth camera
- This fusion sensor may also be combined with an electronic compass or an inertial measurement unit.
<自己位置推定処理(SLAM自己位置推定処理):ステップS101>
ステップS101において、自己位置推定部110は、周辺検知センサ203を使って点群地図51における位置および姿勢を自己位置姿勢として推定する。具体的には、自己位置推定部110は、LiDAR-SLAMの技術を用いて自己位置姿勢を推定する。自己位置推定部110は、LiDAR21により取得された点群データと点群地図51とをスキャンマッチングすることにより自己位置姿勢を推定する。
<Self-Location Estimation Process (SLAM Self-Location Estimation Process): Step S101>
In step S101, the self-location estimation unit 110 estimates the position and orientation on the point cloud map 51 as the self-location and orientation using the surrounding detection sensor 203. Specifically, the self-location estimation unit 110 estimates the self-location and orientation using LiDAR-SLAM technology. The self-location estimation unit 110 estimates the self-location and orientation by scan matching the point cloud data acquired by the LiDAR 21 with the point cloud map 51.
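The idea of scan matching in step S101 can be illustrated in greatly simplified form: score each candidate pose by how closely the transformed scan points fall to the point cloud map, and pick the best. A real LiDAR-SLAM implementation refines the pose by iterative optimization (e.g. ICP or NDT) rather than this hypothetical candidate search; all data below are illustrative.

```python
import math

def transform(points, pose):
    """Apply a 2-D pose (x, y, yaw) to sensor points (sensor frame -> map frame)."""
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in points]

def match_score(scan, map_points, pose):
    """Sum of nearest-neighbor distances between the transformed scan and the
    point cloud map: lower means a better match."""
    placed = transform(scan, pose)
    return sum(min(math.dist(p, m) for m in map_points) for p in placed)

def scan_match(scan, map_points, candidates):
    """Pick the candidate pose whose transformed scan best fits the map."""
    return min(candidates, key=lambda pose: match_score(scan, map_points, pose))

map_points = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]  # toy stand-in for point cloud map 51
scan = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]       # LiDAR scan in the sensor frame
candidates = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
best = scan_match(scan, map_points, candidates)
print(best)  # (1.0, 0.0, 0.0): the scan lines up with the map at x = 1
```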
<自己位置判定処理(第1の判定処理S10):ステップS102からステップS103>
ステップS102において、自己位置判定部120は、自己位置姿勢が正常か否かを判定する第1の判定処理S10を実施する。
自己位置判定部120は、現在の自己位置姿勢と前回推定した自己位置姿勢とを比較する。現在の自己位置姿勢とは、ステップS101においてスキャンマッチングにより得られた自己位置姿勢である。前回推定した自己位置姿勢は、記憶部150に自己位置姿勢52として記憶されているものとする。
現在の自己位置姿勢と前回推定した自己位置姿勢とを比較した比較結果は、具体的には、現在の自己位置姿勢と前回推定した自己位置姿勢との位置の差である。
<Self-Position Determination Process (First Determination Process S10): Steps S102 to S103>
In step S102, the self-position determining unit 120 performs a first determination process S10 to determine whether the self-position and orientation are normal.
The self-position determination unit 120 compares the current self-position and orientation with the previously estimated self-position and orientation. The current self-position and orientation is the self-position and orientation obtained by scan matching in step S101. The previously estimated self-position and orientation is assumed to be stored in the storage unit 150 as the self-position and orientation 52.
The comparison result between the current self-position and posture and the previously estimated self-position and posture is specifically the difference in position between the current self-position and posture and the previously estimated self-position and posture.
比較結果である位置の差が位置閾値53よりも小さければ、自己位置判定部120は、現在の自己位置姿勢が正常と判定する。自己位置判定部120は、正常と判定した現在の自己位置姿勢を、自己位置姿勢52として記憶部150に記憶する。
比較結果である位置の差が位置閾値53よりも大きければ、自己位置判定部120は現在の自己位置姿勢が正常でないと判定する。
If the position difference obtained as a result of the comparison is smaller than the position threshold value 53, the self-position determining unit 120 determines that the current self-position and orientation is normal. The self-position determining unit 120 stores the current self-position and orientation determined to be normal in the storage unit 150 as the self-position and orientation 52.
If the position difference obtained as a result of the comparison is greater than the position threshold value 53, the self-position determining unit 120 determines that the current self-position and orientation is not normal.
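The first determination process S10 described above reduces to a threshold test on the position difference. A minimal sketch, with an illustrative value for the position threshold 53 (the disclosure does not specify one):

```python
POSITION_THRESHOLD = 0.5  # position threshold 53 (metres; illustrative value)

def first_determination(current_pose, previous_pose, threshold=POSITION_THRESHOLD):
    """First determination process S10: compare the current self-position with
    the previously estimated one (self-position and attitude 52) and judge
    'normal' if the position difference is below the position threshold 53."""
    dx = current_pose[0] - previous_pose[0]
    dy = current_pose[1] - previous_pose[1]
    return (dx * dx + dy * dy) ** 0.5 < threshold

print(first_determination((1.1, 2.0), (1.0, 2.0)))  # True: normal
print(first_determination((5.0, 2.0), (1.0, 2.0)))  # False: "self-position lost"
```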
図4は、本実施の形態に係る第1の判定処理S10の具体例を示す図である。
図4の左図は、自己位置推定SLAMできている状態の自己位置姿勢である。自己位置推定SLAMできている状態の自己位置姿勢は、自己位置姿勢52として記憶部150に記憶されている。
図4の右図の実線のPMVは、次のフレーム、すなわち現在の自己位置姿勢を示している。図4の右図における点線のPMVは記憶部150に記憶されている自己位置姿勢52を表す。
比較結果が位置閾値53以上の場合、「自己位置ロスト」と判定される。
比較結果は、例えば、評価関数の計算結果の値として表される。比較結果は推定結果ともいう。評価関数の計算結果の値は、具体的には、座標の差、あるいは速度の差といった値で表される。
FIG. 4 is a diagram showing a specific example of the first determination process S10 according to this embodiment.
- The left diagram of FIG. 4 shows the self-position and orientation in a state in which self-position estimation by SLAM is succeeding. The self-position and orientation in this state is stored in the storage unit 150 as the self-position and orientation 52.
- The solid-line PMV in the right diagram of FIG. 4 indicates the next frame, i.e., the current self-position and orientation. The dotted-line PMV in the right diagram of FIG. 4 represents the self-position and orientation 52 stored in the storage unit 150.
If the comparison result is equal to or greater than the position threshold value 53, it is determined that the "self-position is lost."
The comparison result is expressed as, for example, a value obtained by calculating an evaluation function. The comparison result is also called an estimation result. Specifically, the value obtained by calculating an evaluation function is expressed as a value such as a difference in coordinates or a difference in speed.
In this way, by determining whether the self-position and orientation are normal, it is possible to detect "self-position errors" caused by scan matching on similar terrain.
Furthermore, when there are too few feature points for self-position estimation by scan matching to be possible, the loss of the self-position can be detected.
If the self-position and orientation are determined to be normal (YES in step S103), the process proceeds to step S107.
If it is determined that the self-position and orientation are not normal (NO in step S103), the process proceeds to step S104.
<Self-Location Estimation Process (Acquisition Process S30) and Self-Location Determination Process (Second Determination Process S20): Steps S104 to S106>
If the self-position estimation unit 110 determines in step S104 that the self-position and orientation are not normal, it performs an acquisition process S30 to acquire a normal self-position and orientation. Note that the acquisition process S30 is repeated until the self-position and orientation obtained by the acquisition process S30 is determined to be normal by the second determination process S20.
Specifically, when the self-position estimation unit 110 determines that the self-position and attitude are not normal, the self-position estimation unit 110 acquires the current self-position and attitude using the GNSS reception data 41 received by the GNSS receiver 202 as an acquisition process S30.
FIG. 5 is a diagram showing an example of the acquisition process S30 according to this embodiment.
If the self-position estimation unit 110 determines that the self-position and attitude are incorrect, it performs an acquisition process S30 in which it acquires the current self-position and attitude using the GNSS reception data 41 received by the GNSS receiver 202. That is, in the acquisition process S30, the current self-position and attitude are estimated using the GNSS receiver 202, which is a device different from the device used to estimate the self-position and attitude by SLAM using the periphery detection sensor 203. This attitude estimation will be described later in examples 1 to 5 of the acquisition process S30. Note that the different device in this case may be a camera and an image processor for the camera image, or other devices. Examples of other devices may include millimeter-wave radar or imaging radar and a camera, an electronic compass, an inertial measurement unit, etc.
The upper left diagram of FIG. 5, like the right diagram of FIG. 4, shows a state in which the comparison result is equal to or greater than the position threshold value and "self-position lost" has been determined.
When a self-position loss is detected, the self-position estimation unit 110 estimates the self-position and orientation using the GNSS reception data 41. Note that the self-position estimation is not always performed using the GNSS reception data 41, but is performed only when a self-position loss is detected.
As shown in the upper right diagram of FIG. 5, the self-position estimation unit 110 estimates the self-position and attitude using the GNSS reception data 41, and sets the self-position and attitude based on the GNSS reception data 41 as the current self-position and attitude.
In step S105, the self-position determining unit 120 performs a second determination process S20 to determine whether the self-position and orientation obtained in the acquisition process S30 are normal or not.
The second determination process S20 is performed in the same manner as the first determination process S10.
In the first determination process S10, the current self-position and attitude to be compared with the previously estimated self-position and attitude is the self-position and attitude acquired by SLAM in step S101. On the other hand, in the second determination process S20, the current self-position and attitude to be compared with the previously estimated self-position and attitude is the self-position and attitude acquired by the acquisition process S30 in step S104. For example, the current self-position and attitude to be compared with the previously estimated self-position and attitude is the self-position and attitude acquired from the GNSS reception data 41 in step S104.
If the position difference obtained as a result of the comparison is smaller than the position threshold value 53, the self-position determination unit 120 determines that the current self-position and attitude is normal. The self-position determination unit 120 may store the current self-position and attitude determined to be normal in the storage unit 150 as the self-position and attitude 52. Note that the final normal self-position and attitude is the "self-position and attitude obtained by SLAM." In step S106 of FIG. 3, the "self-position and attitude acquired from the GNSS reception data 41" becomes equal to the "self-position and attitude obtained by LiDAR-SLAM," but the self-position and attitude in this case is, in the end, the "self-position and attitude obtained by LiDAR-SLAM."
If the position difference obtained as a result of the comparison is greater than the position threshold value 53, the self-position determining unit 120 determines that the current self-position and orientation is not normal.
As shown in the lower diagram of FIG. 5, the self-position determination unit 120 determines that the current self-position and attitude is normal, taking the self-position and attitude based on the GNSS reception data 41 as the current self-position and attitude.
If the self-position and orientation are determined to be normal (YES in step S106), the process proceeds to step S107.
If it is determined that the self-position and orientation are not normal (NO in step S106), the process returns to step S104 and the acquisition process S30 is repeated.
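The loop over steps S104 to S106, in which the acquisition process S30 is repeated until the second determination process S20 accepts the pose, can be sketched as follows. This is an illustrative sketch only; the function names and the retry bound are assumptions, not part of the disclosure.

```python
def recover_pose(acquire_pose, is_normal, max_attempts=100):
    """Repeat the acquisition process S30 (step S104) until the second
    determination process S20 (steps S105-S106) judges the pose normal.

    acquire_pose: callable returning a candidate pose, e.g. derived from
                  GNSS reception data 41 (hypothetical interface)
    is_normal:    callable implementing the determination process
    Returns the accepted pose and the number of attempts it took.
    """
    for attempt in range(1, max_attempts + 1):
        pose = acquire_pose()          # acquisition process S30
        if is_normal(pose):            # second determination process S20
            return pose, attempt       # YES in S106: proceed to S107
    # The text describes repeating indefinitely; a bound is added here only
    # so the sketch terminates (cf. the count-threshold warning described later).
    raise RuntimeError("could not acquire a normal self-position and attitude")
```

For example, if the first GNSS-based candidate is rejected and the second is accepted, the loop returns on the second attempt.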
<Driving Control Processing: Step S107>
In step S107, the driving control unit 130 controls the autonomous driving using the self-position and attitude determined to be normal. The driving control unit 130 generates control information 31 for controlling the autonomous driving using the self-position and attitude determined to be normal, and outputs the control information 31 to the vehicle driving device 204. The control information 31 is information that determines the speed and steering of the autonomously driving vehicle 10. The vehicle driving device 204 is a device such as a vehicle motor and drive shaft, and performs autonomous driving based on the control information 31.
The driving control unit 130 may decelerate the autonomous driving until the second determination process S20 determines that the self-position and attitude are correct. Alternatively, the driving control unit 130 may stop the autonomous driving until the second determination process S20 determines that the self-position and attitude are correct.
Here, we will explain other examples of the acquisition process S30 in this embodiment.
<Example 1 of Acquisition Process S30>
When the self-position estimation unit 110 determines that the self-position and attitude are incorrect, in the acquisition process S30, the self-position estimation unit 110 may acquire the current self-position using the GNSS reception data 41 and may acquire the attitude of the previously estimated self-position and attitude as the current attitude. When the current self-position is acquired using the GNSS reception data 41, the position may be estimated by high-precision RTK (Real Time Kinematic) positioning with a position error of the order of centimeters.
<Example 2 of Acquisition Process S30>
FIG. 6 is a diagram showing a second example of the acquisition process S30 according to the present embodiment.
If the self-position estimation unit 110 determines that the self-position and attitude are incorrect, in acquisition process S30, it acquires the current self-position and attitude using the GNSS reception data 41. When repeating acquisition process S30, the self-position estimation unit 110 may set a position obtained by gradually shifting the self-position of the self-position and attitude as the current self-position, and may acquire an attitude shifted by a predetermined angle from the attitude of the self-position and attitude as the current attitude.
Then, the self-position estimation unit 110 sets the current self-position and current attitude as the current self-position and attitude, and performs the second determination process S20 and subsequent processes.
Here, the predetermined angle is, for example, 10 degrees. If the predetermined angle is 1 degree, the acquisition process S30 will be performed a maximum of 360 times until the current self-position and orientation is determined to be normal.
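The attitude sweep of Example 2 can be sketched as a generator of candidate attitudes shifted by the predetermined angle. This is only an illustrative sketch; the generator name and the yaw-in-degrees representation are assumptions.

```python
def candidate_attitudes(last_yaw_deg, step_deg=10):
    """Example 2 of acquisition process S30: yield attitudes shifted from the
    last attitude by a predetermined angle, sweeping the full circle.

    With step_deg=1 this yields up to 360 candidates, matching the text's
    worst case of 360 repetitions of the acquisition process.
    """
    count = 360 // step_deg
    for i in range(count):
        yield (last_yaw_deg + i * step_deg) % 360
```

Each candidate attitude would be paired with a slightly shifted self-position and fed to the second determination process S20 until one candidate is judged normal.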
<Example 3 of Acquisition Process S30>
When the self-position estimation unit 110 determines that the self-position and attitude are incorrect, it may acquire the current self-position using the GNSS reception data 41 and the current attitude using an electronic compass as an acquisition process S30.
Furthermore, when the self-position estimation unit 110 determines that the self-position and attitude are incorrect, as an acquisition process S30, it may acquire the current self-position using the GNSS reception data 41 and also acquire the current attitude using an inertial measurement device.
Furthermore, when the self-position estimation unit 110 determines that the self-position and attitude are incorrect, as the acquisition process S30, it may acquire the current self-position using the GNSS reception data 41, and may acquire the current attitude using the vector of the difference between the current self-position based on the GNSS reception data 41 and the previous self-position acquired from the previous GNSS reception data 41, together with an inertial measurement unit.
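The difference-vector variant of Example 3 amounts to deriving a heading from two consecutive GNSS positions. The following sketch is illustrative only; the function name and the convention (east = 0 degrees, counter-clockwise positive) are assumptions, and the IMU contribution mentioned in the text is not modeled.

```python
import math


def heading_from_gnss(prev_xy, curr_xy):
    """Estimate the current attitude from the vector between the previous and
    current GNSS-derived positions (hypothetical helper).

    Returns the heading in degrees, east = 0, counter-clockwise positive.
    """
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return math.degrees(math.atan2(dy, dx))
```

Moving due east yields a heading near 0 degrees, and moving due north yields a heading near 90 degrees under this convention.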
<Example 4 of Acquisition Process S30>
If the self-position estimation unit 110 determines that the self-position and orientation are incorrect, the self-position estimation unit 110 may acquire the current self-position and orientation from road landmarks or the like using a camera in acquisition processing S30.
<Example 5 of Acquisition Process S30>
As described above, the acquisition process S30 is repeated until the self-position and orientation obtained by the acquisition process S30 is determined to be normal by the second determination process S20.
At this time, for example, the acquisition process S30 may be repeated by determining rules such as those in (1) to (3) below.
(1) From 1st to Kth time: The self-position and attitude are acquired using the GNSS reception data 41.
(2) From K+1 to L times: The self-position is acquired using the GNSS reception data 41.
The attitude is acquired from the attitude of the previous self-position and attitude stored in the storage unit 150.
(3) From L+1 to Mth time: The self-position is acquired using the GNSS reception data 41.
The attitude is obtained using an electronic compass.
The numbers K, L, and M are any integers that satisfy the relationship 1<K<L<M, for example.
The order of (1) to (3) and the combination of methods for obtaining the self-position and attitude are also arbitrary.
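The staged rules (1) to (3) can be sketched as a lookup from the retry count to an acquisition strategy. This is only an illustrative sketch; the concrete K, L, M values and the strategy labels are assumptions, as the text leaves them arbitrary.

```python
K, L, M = 5, 10, 15  # arbitrary integers satisfying 1 < K < L < M


def acquisition_rule(attempt):
    """Select how position and attitude are acquired on a given repetition of
    the acquisition process S30, following rules (1)-(3).

    Returns a (position_source, attitude_source) pair, or None once the
    repetition count exceeds M (e.g. the point where a warning could be issued).
    """
    if attempt <= K:
        return ("gnss", "gnss")       # (1) both from GNSS reception data 41
    if attempt <= L:
        return ("gnss", "stored")     # (2) attitude from stored pose 52
    if attempt <= M:
        return ("gnss", "compass")    # (3) attitude from electronic compass
    return None
```

The same table could be reordered or extended with other position/attitude combinations, reflecting that the order and combinations are arbitrary.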
A count threshold may be stored in the storage unit 150. The self-position estimation unit 110 may have a function of issuing a warning message indicating that a normal self-position and orientation cannot be acquired when the number of times the acquisition process S30 is repeated exceeds the count threshold.
Alternatively, a time threshold may be stored in the storage unit 150. The self-position estimation unit 110 may have a function of issuing a warning message indicating that a normal self-position and orientation cannot be acquired when the time taken to repeat the acquisition process S30 exceeds the time threshold.
***Other configurations***
<Modification 1>
In this embodiment, an aspect of autonomous driving based on a method of estimating self-position and orientation using LiDAR-SLAM has been described.
As a modified example, the autonomously traveling vehicle may be equipped with a camera and may travel autonomously based on a method of estimating its own position and orientation using visual SLAM using the camera.
Alternatively, the autonomously traveling vehicle may be equipped with an odometer and may travel autonomously based on a method of estimating its own position and orientation using odometer SLAM, which uses the odometer.
<Modification 2>
In this embodiment, after the vehicle's position and attitude are acquired from the GNSS reception data, if the acquired position and attitude are normal, the vehicle performs driving control using that position and attitude (the position and attitude based on the GNSS reception data). After that, the vehicle's position and attitude are estimated by the base LiDAR-SLAM.
On the other hand, in this modified example, after obtaining the self-position and attitude from the GNSS reception data, if the obtained self-position and attitude is normal, the self-position and attitude may be estimated again using LiDAR-SLAM.
FIG. 7 is a flowchart showing the autonomous driving process of the autonomous driving vehicle 10 according to a modified example of this embodiment.
FIG. 7 differs from FIG. 3 in that, if the answer in step S106 is YES, the process returns to step S101 and the self-position and orientation are estimated by the base LiDAR-SLAM.
Specifically, FIG. 7 can be viewed as performing the following four processes.
(1) The self-position estimated by LiDAR-SLAM becomes unstable.
(2) The vehicle's own position from the GNSS receiver is "temporarily stored in memory."
(3) The information in (2) is transmitted as complementary information to the LiDAR-SLAM self-position estimation process.
(4) Once the vehicle's position is stabilized by LiDAR-SLAM self-location estimation, autonomous driving resumes. The information from (2) is deleted from memory.
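The four steps of this modification can be sketched as follows: the GNSS-derived pose is held temporarily, passed to the SLAM estimation as complementary information, and discarded once SLAM stabilizes. This is only an illustrative sketch; the callable interfaces are assumptions, not the disclosed implementation.

```python
def recover_with_slam(gnss_pose, slam_relocalize, is_normal):
    """Modification 2, steps (1)-(4), sketched.

    gnss_pose:       pose obtained from the GNSS receiver after the
                     LiDAR-SLAM self-position became unstable (steps 1-2)
    slam_relocalize: callable that takes the seed pose as complementary
                     information and returns a LiDAR-SLAM pose (step 3)
    is_normal:       determination process applied to the SLAM pose (step 4)
    """
    seed = gnss_pose                   # (2) temporarily store the GNSS pose
    slam_pose = slam_relocalize(seed)  # (3) feed it to LiDAR-SLAM as a hint
    if is_normal(slam_pose):           # (4) SLAM self-position is stable again
        seed = None                    #     delete the temporary pose from memory
        return slam_pose               #     autonomous driving resumes
    return None                        # otherwise the acquisition repeats
```

The point of the design is that the GNSS pose only seeds LiDAR-SLAM rather than replacing it, so the pose that drives the vehicle is always the SLAM estimate.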
Note that <Example 2 of Acquisition Process S30> described in the first embodiment may be implemented in the process of FIG. 7. <Example 2 of Acquisition Process S30> covers the case in which the self-position is roughly correct but the attitude is not.
When the self-position estimation unit 110 determines that the self-position and attitude are incorrect (step S103), the self-position estimation unit 110 acquires the current self-position and attitude using the GNSS reception data 41 as the acquisition process S30 (step S104). When repeating the acquisition process S30, the self-position estimation unit 110 may set a position obtained by gradually shifting the self-position of the self-position and attitude as the current self-position, and acquire an attitude shifted by a predetermined angle from the attitude of the self-position and attitude as the current attitude (step S104).
Then, the self-position estimation unit 110 sets the current self-position and current attitude as the current self-position and attitude, and performs the second determination process S20 and subsequent processes (step S105).
In step S106, if the current self-position and orientation is normal, the process returns to step S101 and the LiDAR-SLAM self-position estimation process is performed.
As described above, even if LiDAR-SLAM detects that the vehicle has lost its position, it is possible to return to LiDAR-SLAM by obtaining its own position and attitude from GNSS reception data.
Furthermore, since the final estimation of the vehicle's position and attitude is performed by LiDAR-SLAM, the position and attitude can be estimated even when the GNSS receiver has not obtained a FIX solution.
<Modification 3>
In this embodiment, the functions of the self-location estimation unit 110, the self-location determination unit 120, and the driving control unit 130 are realized by software. As a modified example, the functions of the self-location estimation unit 110, the self-location determination unit 120, and the driving control unit 130 may be realized by hardware.
Specifically, the control device 100 includes an electronic circuit 909 instead of the processor 910.
FIG. 8 is a diagram showing an example of the configuration of a control device 100 according to a modification of this embodiment.
The electronic circuit 909 is a dedicated electronic circuit that realizes the functions of the self-position estimation unit 110, the self-position determination unit 120, and the driving control unit 130. Specifically, the electronic circuit 909 is a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a GA, an ASIC, or an FPGA. GA is an abbreviation for Gate Array. ASIC is an abbreviation for Application Specific Integrated Circuit. FPGA is an abbreviation for Field-Programmable Gate Array.
The functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 may be realized by a single electronic circuit, or may be distributed across multiple electronic circuits.
As another variation, some of the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 may be implemented by electronic circuits, with the remaining functions being implemented by software. Also, some or all of the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 may be implemented by firmware.
Each of the processor and electronic circuits is also called processing circuitry. In other words, the functions of the self-position estimation unit 110, self-position determination unit 120, and driving control unit 130 are realized by processing circuitry.
***Description of the Effects of This Embodiment***
In the autonomously running mobile body according to this embodiment, if it is determined that the self-position and attitude obtained by the process of estimating the self-position and attitude by SLAM using a periphery detection sensor such as LiDAR is not normal, a normal self-position and attitude is obtained by an acquisition process, which is a different method. Then, in the autonomously running mobile body according to this embodiment, the acquisition process is repeated until the self-position and attitude obtained by the acquisition process is determined to be normal by a determination process. In the autonomously running mobile body according to this embodiment, if the self-position and attitude obtained by the acquisition process is determined to be normal, the autonomously running mobile body controls autonomous running using the self-position and attitude determined to be normal, and autonomously runs by estimating the self-position and attitude by SLAM using a periphery detection sensor.
SLAM using a surrounding detection sensor, that is, scan matching, provides a highly accurate self-position and orientation. On the other hand, the vehicle may lose its self-position when it rotates, when it is in terrain that closely resembles another location, or when the entire surrounding field of view is obscured. In such cases, the present embodiment aims to restore scan matching on the spot, rather than moving the autonomous vehicle using an alternative method other than scan matching.
Therefore, according to the autonomously driving vehicle of this embodiment, it is possible to continue driving based on autonomous driving using SLAM with a surrounding detection sensor, thereby achieving the effect of enabling autonomous driving to be performed with high precision and low cost.
In the first embodiment described above, each unit of the control device has been described as an independent functional block. However, the configuration of the control device does not have to be the same as that of the above-described embodiment. The functional blocks of the control device may have any configuration as long as they can realize the functions described in the above-described embodiment. Furthermore, the control device may not be a single device, but may be a system composed of multiple devices.
Furthermore, it is possible to combine multiple parts of the first embodiment and implement it. Alternatively, it is possible to implement only one part of the first embodiment. In addition, it is possible to combine the first embodiment in whole or in part in any way.
That is, in the first embodiment, it is possible to freely combine parts of the first embodiment, or to modify any of the components of the first embodiment, or to omit any of the components of the first embodiment.
Note that the above-described embodiments are essentially preferred examples and are not intended to limit the scope of the present disclosure, the scope of applications of the present disclosure, or the scope of uses of the present disclosure. The above-described embodiments can be modified in various ways as needed. For example, procedures explained using flow charts or sequence diagrams may be modified as appropriate.
10 Autonomous driving vehicle, 21 LiDAR, 31 Control information, 41 GNSS reception data, 51 Point cloud map, 52 Self-position and attitude, 53 Position threshold, 100 Control device, 110 Self-position estimation unit, 120 Self-position determination unit, 130 Driving control unit, 150 Memory unit, 201 GNSS receiving antenna, 202 GNSS receiver, 203 Surrounding detection sensor, 204 Vehicle driving device, 909 Electronic circuit, 910 Processor, 921 Memory, 922 Auxiliary storage device, 930 Input interface, 940 Output interface, 941 Display device, 950 Communication device, S10 First determination process, S20 Second determination process, S30 Acquisition process.
Claims (17)
前記周辺検知センサを使って点群地図における位置および姿勢を自己位置姿勢として推定する自己位置推定部と、
前記自己位置姿勢が正常か否かを判定する判定処理を実施する自己位置判定部と、
前記自己位置姿勢が正常と判定されると、前記自己位置姿勢を用いて前記自律走行を制御する走行制御部と
を備え、
前記自己位置推定部は、
前記自己位置姿勢が正常でないと判定されると、正常な自己位置姿勢を取得するための取得処理を実施し、前記取得処理により得られた自己位置姿勢が前記判定処理により正常と判定されるまで前記取得処理を繰り返す
ことを特徴とする自律走行移動体。 In an autonomous vehicle equipped with a surrounding detection sensor and capable of autonomous driving,
a self-position estimation unit that estimates a position and orientation on a point cloud map as a self-position and orientation using the surrounding detection sensor;
a self-position determination unit that performs a determination process to determine whether the self-position and orientation are normal;
a driving control unit that controls the autonomous driving using the self-position and attitude when the self-position and attitude are determined to be normal,
The self-position estimation unit
When it is determined that the self-position and attitude is not normal, an acquisition process is carried out to acquire a normal self-position and attitude, and the acquisition process is repeated until the self-position and attitude obtained by the acquisition process is determined to be normal by the determination process.
前記自己位置推定部は、
前記LiDARを用いたスキャンマッチングにより前記自己位置姿勢を推定する請求項1に記載の自律走行移動体。 The autonomous traveling vehicle is equipped with a LiDAR,
The self-position estimation unit
The autonomous vehicle according to claim 1 , wherein the self-position and orientation are estimated by scan matching using the LiDAR.
前記取得処理を繰り返す回数が回数閾値を超えた場合に、正常な自己位置姿勢を取得できないことを示す警告メッセージを通知する請求項1または請求項2に記載の自律走行移動体。 The self-position estimation unit
3. The autonomous vehicle according to claim 1, wherein, when the number of times the acquisition process is repeated exceeds a threshold number, a warning message is sent to indicate that a normal self-position and orientation cannot be acquired.
前記取得処理を繰り返す時間が時間閾値を超えた場合に、正常な自己位置姿勢を取得できないことを示す警告メッセージを通知する請求項1または請求項2に記載の自律走行移動体。 The self-position estimation unit
3. The autonomous vehicle according to claim 1, wherein, when a time period during which the acquisition process is repeated exceeds a time threshold, a warning message is sent to indicate that a normal self-position and orientation cannot be acquired.
前回推定した自己位置姿勢を記憶部に記憶し、
前記自己位置判定部は、
現在の自己位置姿勢と前記前回推定した自己位置姿勢とを比較し、比較結果が位置閾値よりも小さければ前記現在の自己位置姿勢が正常と判定し、比較結果が前記位置閾値よりも大きければ前記現在の自己位置姿勢が正常でないと判定する請求項1から請求項4のいずれか1項に記載の自律走行移動体。 The self-position estimation unit
The previously estimated self-position and orientation are stored in a memory unit.
The self-position determination unit
5. The autonomously traveling vehicle according to claim 1, wherein the current self-position and attitude are compared with the previously estimated self-position and attitude, and if the comparison result is smaller than a position threshold, the current self-position and attitude are determined to be normal, and if the comparison result is larger than the position threshold, the current self-position and attitude are determined to be abnormal.
The autonomous traveling vehicle according to any one of claims 1 to 5, wherein, when the self-position and attitude are determined to be incorrect, the self-position estimation unit performs, as the acquisition process, position estimation using equipment different from localization using the surrounding detection sensor, thereby acquiring the current self-position and attitude.
The autonomous traveling vehicle according to any one of claims 1 to 5, wherein the autonomous traveling vehicle includes a GNSS (Global Navigation Satellite System) receiver, and, when the self-position and attitude are determined to be incorrect, the self-position estimation unit acquires, as the acquisition process, the current self-position and attitude using GNSS reception data received by the GNSS receiver.
The autonomous traveling vehicle according to claim 5, wherein the autonomous traveling vehicle includes a GNSS receiver, and, when the self-position and attitude are determined to be incorrect, the self-position estimation unit acquires, as the acquisition process, the current self-position using GNSS reception data received by the GNSS receiver, and acquires the attitude of the previously estimated self-position and attitude as the current attitude.
The autonomous traveling vehicle according to any one of claims 1 to 5, wherein the autonomous traveling vehicle includes a GNSS receiver, and, when the self-position and attitude are determined to be incorrect, the self-position estimation unit acquires, as the acquisition process, the current self-position using GNSS reception data received by the GNSS receiver, and acquires the current attitude using an electronic compass.
The autonomous traveling vehicle according to any one of claims 1 to 5, wherein the autonomous traveling vehicle includes a GNSS receiver, and, when the self-position and attitude are determined to be incorrect, the self-position estimation unit acquires, as the acquisition process, the current self-position using GNSS reception data received by the GNSS receiver, and acquires the current attitude using an inertial measurement unit.
The autonomous traveling vehicle according to any one of claims 1 to 5, wherein the autonomous traveling vehicle includes a GNSS receiver, and, when the self-position and attitude are determined to be incorrect, the self-position estimation unit acquires, as the acquisition process, the current self-position using GNSS reception data received by the GNSS receiver, and acquires, as the current attitude, an attitude shifted by a predetermined angle from the attitude of the self-position and attitude.
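The GNSS-based acquisition variants in the claims above share one shape: the position always comes from the GNSS fix, while the attitude comes from one of several sources (the previously estimated pose, an electronic compass, an IMU, or a fixed angular offset). A hedged sketch, with all parameter names as illustrative assumptions:

```python
# Hedged sketch of the GNSS fallback variants: position from the GNSS
# fix, attitude from a selectable source.
from typing import Optional, Tuple

Pose = Tuple[float, float, float]  # (x, y, yaw)


def fallback_pose(gnss_fix: Tuple[float, float],
                  attitude_source: str,
                  previous_pose: Optional[Pose] = None,
                  compass_yaw: Optional[float] = None,
                  imu_yaw: Optional[float] = None,
                  angle_offset: float = 0.0) -> Pose:
    """Build a current pose during the acquisition process."""
    x, y = gnss_fix
    if attitude_source == "previous":    # previously estimated attitude
        yaw = previous_pose[2]
    elif attitude_source == "compass":   # electronic compass heading
        yaw = compass_yaw
    elif attitude_source == "imu":       # inertial measurement unit
        yaw = imu_yaw
    elif attitude_source == "offset":    # shift prior attitude by a fixed angle
        yaw = previous_pose[2] + angle_offset
    else:
        raise ValueError(f"unknown attitude source: {attitude_source}")
    return (x, y, yaw)
```

Each branch corresponds to one claimed variant; in practice the source would be chosen by which sensor is available and trusted at the moment the pose is judged incorrect.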
The autonomous traveling vehicle according to any one of claims 1 to 11, wherein the traveling control unit decelerates the autonomous traveling until the self-position and attitude are determined to be correct by the determination process.
The autonomous traveling vehicle according to any one of claims 1 to 11, wherein the traveling control unit stops the autonomous traveling until the self-position and attitude are determined to be correct by the determination process.
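The two safety behaviors above (decelerate or stop while the pose is still judged abnormal) reduce to a speed command that depends on the judgment result. A minimal sketch, with the mode names and numeric speeds as illustrative assumptions:

```python
# Hedged sketch: while the pose is not yet judged normal, either
# decelerate (one claimed variant) or stop (the other).
def speed_command(pose_is_normal: bool,
                  cruise_speed: float,
                  mode: str = "decelerate",
                  slow_speed: float = 0.2) -> float:
    """Return the commanded speed for the current judgment result."""
    if pose_is_normal:
        return cruise_speed
    return slow_speed if mode == "decelerate" else 0.0
```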
The autonomous traveling vehicle according to claim 1, wherein the autonomous traveling vehicle is equipped with a camera, and the self-position estimation unit estimates the self-position and attitude by visual SLAM (Simultaneous Localization and Mapping) using the camera.
The autonomous traveling vehicle according to claim 1, wherein the autonomous traveling vehicle is equipped with an odometer, and the self-position estimation unit estimates the self-position and attitude by odometer SLAM using the odometer.
An autonomous traveling method used for an autonomous traveling mobile body that is equipped with a surrounding detection sensor and performs autonomous traveling, the method comprising:
a computer estimating, using the surrounding detection sensor, a position and attitude on a point cloud map as a self-position and attitude;
the computer performing a determination process to determine whether the self-position and attitude are normal;
the computer controlling the autonomous traveling using the self-position and attitude when the self-position and attitude are determined to be normal; and
the computer performing, when the self-position and attitude are determined not to be normal, an acquisition process for acquiring a normal self-position and attitude, and repeating the acquisition process until the self-position and attitude obtained by the acquisition process are determined to be normal by the determination process.
An autonomous traveling program used in a computer mounted on an autonomous traveling mobile body that is equipped with a surrounding detection sensor and performs autonomous traveling, the program causing the computer to execute:
a self-position estimation process of estimating, using the surrounding detection sensor, a position and attitude on a point cloud map as a self-position and attitude;
a self-position determination process of performing a determination process to determine whether the self-position and attitude are normal; and
a traveling control process of controlling the autonomous traveling using the self-position and attitude when the self-position and attitude are determined to be normal,
wherein, in the self-position estimation process, when the self-position and attitude are determined not to be normal, an acquisition process for acquiring a normal self-position and attitude is performed, and the acquisition process is repeated until the self-position and attitude obtained by the acquisition process are determined to be normal by the determination process.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2024/010188 WO2025191822A1 (en) | 2024-03-15 | 2024-03-15 | Autonomous moving body, autonomous travel method, and autonomous travel program |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2024/010188 WO2025191822A1 (en) | 2024-03-15 | 2024-03-15 | Autonomous moving body, autonomous travel method, and autonomous travel program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025191822A1 true WO2025191822A1 (en) | 2025-09-18 |
Family
ID=97063295
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/010188 Pending WO2025191822A1 (en) | 2024-03-15 | 2024-03-15 | Autonomous moving body, autonomous travel method, and autonomous travel program |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025191822A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2019016089A (en) * | 2017-07-05 | 2019-01-31 | カシオ計算機株式会社 | Autonomous moving device, autonomous moving method, and program |
| WO2022208588A1 (en) * | 2021-03-29 | 2022-10-06 | 三菱電機株式会社 | Position estimation device, position estimation program, and position estimation method |
| JP2023013441A (en) * | 2021-07-16 | 2023-01-26 | 大成建設株式会社 | Autonomous driving system |
| JP2023066524A (en) * | 2021-10-29 | 2023-05-16 | ソニーグループ株式会社 | Information processor, method for processing information, and information processing system |
- 2024-03-15: WO PCT/JP2024/010188 patent/WO2025191822A1/en active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN110869700B (en) | System and method for determining vehicle position | |
| US11138465B2 (en) | Systems and methods for transforming coordinates between distorted and undistorted coordinate systems | |
| CN111149011B (en) | Method and vehicle system for locating a higher degree of automation vehicle (HAF), in particular a highly automated vehicle | |
| EP3907720B1 (en) | Own position estimating device, automatic driving system comprising same, and own generated map sharing device | |
| JP6456562B1 (en) | Driving support system, driving support method, and driving support program | |
| US11287281B2 (en) | Analysis of localization errors in a mobile object | |
| CN114942025A (en) | Vehicle navigation positioning method and device, electronic equipment and storage medium | |
| WO2018037653A1 (en) | Vehicle control system, local vehicle position calculation device, vehicle control device, local vehicle position calculation program, and vehicle control program | |
| WO2021035748A1 (en) | Pose acquisition method, system, and mobile platform | |
| Choi et al. | LiDAR-based Localization for Autonomous Vehicles-Survey and Recent Trends | |
| JP2019148456A (en) | Calculation device, self-location calculation method and program | |
| WO2025191822A1 (en) | Autonomous moving body, autonomous travel method, and autonomous travel program | |
| CN110375749A (en) | Air navigation aid and navigation system | |
| KR20000013568A (en) | Navigation system displaying photograph image of target point | |
| CN115963521A (en) | Method and system for determining position and acceleration of vehicle | |
| Nastro | Position and orientation data requirements for precise autonomous vehicle navigation | |
| JP2022098635A (en) | Device and method for operating reliability of position of owned vehicle, vehicle controller, and method for controlling vehicle | |
| Li et al. | Sonar image processing based underwater localization method and its experimental studies | |
| CN121091266A (en) | A motion estimation method and device | |
| JP7716938B2 (en) | On-board processing device, vehicle control device, and self-position estimation method | |
| CN114993313B (en) | Trajectory calculation and registration method based on autonomous underwater robot inertial navigation and ultra-short baseline positioning sensor | |
| EP4325169B1 (en) | Vehicle localization based on pose corrections from remote vehicles in parking garages | |
| US20250349085A1 (en) | Capture and display of point clouds using augmented reality device | |
| Hameed et al. | Real-time improvement with gyroscope of IMU, GPS, and laser range finder | |
| JP2025099942A (en) | Self-position estimation device, satellite communication system, self-position estimation method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24929609; Country of ref document: EP; Kind code of ref document: A1 |