WO2021220331A1 - Mobile body system - Google Patents
- Publication number
- WO2021220331A1 (application PCT/JP2020/017937)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- locus
- moving body
- data
- mobile system
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- The present invention relates to a technique for a mobile body system, a technique for measuring the position of a moving body, and the like.
- In a mobile system or the like having a function of measuring the position of a moving body (sometimes referred to as a positioning function), a sensor is installed on the moving body.
- This sensor is a type of sensor that can at least detect its position, or whose position can be calculated based on its sensor data. Examples of this sensor include a distance measuring sensor such as a laser scanner, a GPS receiver, and the like.
- The mobile system can realize a positioning function, a function of creating a map around the moving body, and the like by using the information of such sensors.
- Patent Document 1: JP-A-2017-97402
- Patent Document 1 describes, as a "peripheral map creation method" or the like, that a self-position estimation device of a mobile robot creates the latest self-position/attitude data by matching the distance data of a laser range finder (LRF) against a map.
- a plurality of (for example, two) sensors may be installed in one mobile body.
- the purpose of this installation depends on the details such as the function and application of the mobile system, and examples thereof include securing a wide detection range, calculating one position using a plurality of sensor data, and realizing a redundant configuration.
- a plurality of distance measuring sensors may be installed on the moving body.
- the type and shape of the moving body to be applied, the position and direction in which the sensor is installed, and the like vary depending on the application environment such as a factory.
- the positions and directions of installation of a plurality of sensors may be changed even for the same moving body.
- the relative positional relationship between the sensors can be initially set in advance, and there is no problem if there is no change.
- When the installation positions of the two sensors on the moving body are changed, it may be difficult to measure the relative positional relationship between the sensors after the change with high accuracy, and even if the user sets it manually, it may take considerable time and effort. If the accuracy of the setting regarding the relative positional relationship between the sensors after the change is low, functions such as the positioning function and the map creation function of the mobile system may be affected, and the accuracy of those functions may also be low.
- An object of the present invention is to provide, for moving body system technology, a technique that can obtain the relative positional relationship between sensors even when a plurality of sensors are installed on a moving body or when their installation positions are changed, and that can thereby improve the accuracy of the positioning function and the like.
- The moving body system of one embodiment includes a moving body; a plurality of sensors, including a first sensor and a second sensor installed at different positions in the moving body coordinate system of the moving body; and a control device that realizes a positioning function of measuring at least the position of the moving body in a spatial coordinate system based on the sensor data of the plurality of sensors. The first sensor and the second sensor are each a type of sensor whose own position in the spatial coordinate system can be detected.
- Based on the first sensor data of the first sensor and the second sensor data of the second sensor obtained when the moving body moves in the environment, the control device identifies the position of the first sensor and the position of the second sensor in the spatial coordinate system; based on the position identification results, it acquires the first locus of the first sensor and the second locus of the second sensor on the time series; using the first locus and the second locus, it compares and collates the shapes of the loci to calculate the relative positional relationship between the position of the first sensor and the position of the second sensor in the moving body coordinate system of the moving body; and it sets information representing the calculated relative positional relationship in the moving body.
- According to the representative embodiment, the relative positional relationship between the sensors can be obtained even when a plurality of sensors are installed on the moving body or when the installation positions or the like are changed, and the accuracy of the positioning function and the like can be improved.
- FIG. 1 shows the configuration of the mobile system of Embodiment 1 of the present invention.
- FIG. 2 shows the configuration of the moving body in Embodiment 1.
- FIG. 3 shows a configuration example of the detection range of a sensor in Embodiment 1.
- FIG. 4 shows a configuration example of the moving mechanism in Embodiment 1.
- FIG. 5 shows a functional block configuration example of the mobile system in Embodiment 1.
- FIG. 6 shows a configuration example of the software and hardware of the position identification device in Embodiment 1.
- FIG. 7 shows the main processing flow of the position identification device in Embodiment 1.
- FIG. 8 shows an example of the loci of the moving body and the first sensor in Embodiment 1.
- FIG. 9 shows an example of the loci of the moving body and the second sensor in Embodiment 1.
- FIG. 10 shows an example of the shape of the locus according to the sensor position in Embodiment 1.
- FIG. 11 shows a generation example of the relative position parameter in Embodiment 1.
- FIG. 12 shows an example of the tentative locus of the temporary position of the second sensor in Embodiment 1.
- FIG. 13 shows an example of the matching process in Embodiment 1 in the case where time is not synchronized between the sensor data.
- FIG. 14 shows an example of detection by the first sensor in the environment in Embodiment 1.
- FIG. 15 shows an example of detection by the second sensor in the environment in Embodiment 1.
- FIG. 16 shows an example of the first map by the first sensor in Embodiment 1.
- FIG. 17 shows an example of the association between the first map and the second map in Embodiment 1.
- Hereinafter, the mobile system according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 19.
- The mobile system of the first embodiment has a function (sometimes referred to as a relative positional relationship calculation function) capable of automatically adjusting (in other words, calibrating) the relative positional relationship between a plurality of sensors provided on the moving body. Even if the installation position of a sensor on the moving body is changed, the installation position of each sensor can be set with high accuracy by the automatic adjustment of this function, with little trouble for the user. Therefore, the positioning function of the mobile system can be maintained with high accuracy.
- FIG. 1 shows the configuration of the mobile system of the first embodiment.
- This mobile system is a system applied to the environment 101 such as a factory.
- a production facility 102 is installed in a building.
- This mobile system has a moving body 1.
- The moving body 1 is an AGV (or an autonomous traveling robot or the like) capable of unmanned transport of a load 103 such as a product or a component.
- The moving body 1 travels along a predetermined route in the factory and supplies the load 103 to, for example, a production facility 102.
- the moving body 1 includes a control device 100, a sensor 2, a moving mechanism 3, a mounting mechanism 4, and the like in a housing 10.
- the control device 100 is a device that controls the moving body 1.
- the spatial coordinate system CS of the environment 101 is represented by (X, Y, Z).
- the origin of the spatial coordinate system CS is set at any position.
- the moving body 1 of FIG. 1 is arranged so that the front and rear correspond to the X direction, the left and right correspond to the Y direction, and the vertical and height correspond to the Z direction.
- the coordinate system in the moving body 1 is defined as the moving body coordinate system CM and is represented by (x, y, z).
- the origin of the moving body coordinate system CM is set to any position, for example, a representative position of the moving body 1.
- the moving mechanism 3 is a mechanism including, for example, wheels and a drive unit.
- the drive unit includes, for example, a motor, a drive circuit, and the like.
- the moving mechanism 3 is a mechanism capable of traveling forward and backward and turning left and right by the wheels (FIG. 4 described later), but is not limited to this.
- The mounting mechanism 4 is a structural portion for stably mounting the load 103, and the details are not limited.
- the mounting mechanism 4 has various types depending on the application and the like, and is, for example, a structural part including a conveyor and the like.
- In this example, the housing 10 of the moving body 1 has a flat plate-shaped first portion 10a parallel to a horizontal plane and a flat plate-shaped second portion 10b standing vertically from a part of the first portion 10a, but is not limited to this.
- a moving mechanism 3 including two front and rear axles and four front, rear, left and right wheels is provided in the first portion 10a.
- a control device 100 is built in the second portion 10b. The control device 100 may be installed so as to be exposed to the outside in the second portion 10b or the like.
- In this example, the side with the second portion 10b is defined as the front, the side without the second portion 10b as the rear, and the left and right directions are defined with respect to the front-rear direction, but the definitions are not limited to this.
- A plurality of sensors 2, two sensors 2 (2A, 2B) in this example, are installed on the moving body 1.
- Of the two sensors 2, let the sensor 2A be the first sensor and the sensor 2B be the second sensor.
- each sensor 2 is a distance measuring sensor, and in particular, a two-dimensional laser scanner (laser range finder: sometimes called LRF or the like).
- The control device 100 can calculate the position of the sensor 2 from the distance measurement data of the sensor 2.
- Here, "two-dimensional" means that the distance to an object can be detected in a plane (a horizontal plane in this example) centered on the direction of the sensor 2.
- the sensor 2 which is a laser scanner detects and measures a distance by using an object in each direction around the moving body 1 as a feature point.
- the sensor 2 in this example plays a role like a safety sensor in a function of realizing safe automatic transportation of the moving body 1 in the moving body system.
- Each sensor 2 may be any type of sensor with which the moving body system can measure the position of the sensor 2 itself in the spatial coordinate system CS (particularly, the locus of the position on the time series), and the details are not limited.
- the sensor 2 may be at least a type of sensor that can realize a positioning function.
- the sensor 2 may be a type of sensor 2 capable of detecting its own position, or a type of sensor 2 capable of calculating the position by the control device 100 or the like from the sensor data of the sensor 2.
- the positioning function may be realized by the sensor 2 alone, or may be realized in combination with the control device 100 or the like.
- the sensor data output by the sensor 2 includes information on the position and orientation of the sensor 2.
- the control device 100 calculates the position and orientation of the sensor 2 based on the sensor data of the sensor 2.
- the sensor 2 is a type of sensor that can detect or calculate the position and orientation of the sensor 2.
- the posture of the sensor 2 is a state of direction and rotation.
- the control device 100 calculates the position and posture state of each sensor 2 based on the distance measuring data from the sensor 2.
- the two sensors 2 (2A, 2B) are installed at different positions (sometimes referred to as installation positions) in the moving body coordinate system CM of the moving body 1.
- The sensor 2A, which is the first sensor, is installed on the front side of the moving body 1, and the sensor 2B, which is the second sensor, is installed at a position near the left corner of the upper surface on the rear side of the first portion 10a.
- The plurality of sensors 2 (2A, 2B) installed on the moving body 1 of FIG. 1 are distance measuring sensors having the same function and specifications, but are not limited to this, and may be a plurality of sensors 2 of different types and specifications.
- the sensor 2 is not limited to a distance measuring sensor such as a laser scanner, and an acceleration sensor, a gyro sensor, a geomagnetic sensor, a GPS receiver, or the like may be used.
- FIG. 2 shows (A) a side view and (B) a top view as the configuration of the moving body 1 of FIG.
- The position of the sensor 2A (indicated by a black dot) in the moving body coordinate system CM is denoted by PA, and its height position in particular by ZA; the position of the sensor 2B is denoted by PB, and its height position in particular by ZB.
- the height positions (ZA, ZB) of the two sensors 2 (2A, 2B) are different.
- the height position ZA of the sensor 2A is higher than the height position ZB of the sensor 2B (ZA> ZB> 0).
- When a single height position is used, the positioning function of the moving body system is a function of measuring the position of the moving body in the horizontal plane corresponding to that one height position.
- the map creation function is a function of creating a map showing the shape of an object in a horizontal plane corresponding to the one height position.
- the positioning function in the first embodiment is a function of measuring the position of the moving body 1 based on the positions of the respective sensors 2 on the two horizontal planes corresponding to the two height positions.
- the map creation function (described later) is a function of creating a map showing the shape of an object in two horizontal planes corresponding to two height positions.
- the coordinate system CA of the sensor 2A is shown by (x, y) with the position PA of the sensor 2A as a reference / origin.
- the x-axis is the installation direction of the sensor 2A.
- the coordinate system CB of the sensor 2B is indicated by (x, y) with the position PB of the sensor 2B as a reference / origin.
- the x-axis is the installation direction of the sensor 2B.
- the sensor 2A and the sensor 2B each perform detection within a range of a predetermined angle (FIG. 3 described later) about the x-axis.
- the coordinate system (CA, CB) of each sensor 2 is (x, y, z) in three dimensions.
- the sensor 2 has a direction (sometimes referred to as an installation direction) that serves as a reference for installation and detection.
- The installation direction of the sensor 2A is denoted by θA, and the installation direction of the sensor 2B is denoted by θB.
- the direction of the sensor 2 is a reference direction when emitting laser light.
- the direction of the sensor 2A is the forward direction on the X-axis and the x-axis direction on the coordinate system CA.
- The direction of the sensor 2B differs from the X-axis by the relative angle Δθ, substantially a direction diagonally rearward to the left, and is the x-axis direction in the coordinate system CB.
- the installation direction of the sensor 2 may change along with the installation position.
- the relative positional relationship calculation function in the first embodiment is a function of calculating the relative positional relationship 105 including the relationship between the installation position and the direction between the sensors 2.
- The position PA of the sensor 2A and the position PB of the sensor 2B can be expressed as coordinate values (xA, yA, zA) and (xB, yB, zB) in the moving body coordinate system CM (x, y, z). Further, the position PA of the sensor 2A and the position PB of the sensor 2B are represented by different coordinate values in the spatial coordinate system CS.
- The relationship of the position coordinates is represented by the illustrated values Δx, Δy, Δz.
- The values Δx, Δy, Δz are the difference values of the origin position PB (xB, yB, zB) of the coordinate system CB of the sensor 2B with respect to the origin position PA (xA, yA, zA) of the coordinate system CA of the sensor 2A.
- For example, Δx = xB − xA.
- The value Δx is the difference value of the X coordinate value in the spatial coordinate system CS of the position PB of the sensor 2B with respect to the X coordinate value in the spatial coordinate system CS of the position PA of the sensor 2A.
- The same applies to Δy and Δz.
- In obtaining the relative positional relationship 105, the relationship of the position and direction of the sensor 2B, which is the second sensor, is obtained with reference to the sensor 2A, which is the first sensor. The reverse is also possible.
- The relative positional relationship 105, particularly the relationship of the position coordinates, is also represented as a vector vAB.
- The relationship of direction in particular is represented by the illustrated value Δθ.
- The value Δθ is the difference value between the directions (θA, θB) of the sensors 2 (2A, 2B).
- The direction θA of the sensor 2A is the positive direction (angle of 0 degrees) of the x-axis of the coordinate system CA.
- The direction θB of the sensor 2B is the positive direction (angle of 0 degrees) of the x-axis of the coordinate system CB.
- The relative positional relationship calculation function in this mobile system obtains, as at least the relative positional relationship 105 between the sensors 2, the values (Δx, Δy) representing the above positional relationship.
- In the first embodiment, this relative positional relationship calculation function also obtains the value Δθ representing the above directional relationship.
- The relative positional relationship calculation function may also be a function capable of obtaining the value Δz in the Z direction of FIG. 2 (A) in consideration of three dimensions.
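- As a concrete illustration of the quantities (Δx, Δy, Δθ), the sketch below models the relative positional relationship 105 as a two-dimensional rigid transform between the sensor coordinate systems CA and CB; the class and function names are illustrative assumptions, not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class RelativePose2D:
    """Relative positional relationship 105: pose of sensor 2B seen from sensor 2A."""
    dx: float      # Δx [m], x offset of PB relative to PA
    dy: float      # Δy [m], y offset of PB relative to PA
    dtheta: float  # Δθ [rad], direction θB relative to θA

    def point_from_b_to_a(self, xb: float, yb: float) -> tuple[float, float]:
        # Rotate a point expressed in coordinate system CB by Δθ,
        # then translate by (Δx, Δy) to express it in coordinate system CA.
        c, s = math.cos(self.dtheta), math.sin(self.dtheta)
        return (self.dx + c * xb - s * yb, self.dy + s * xb + c * yb)

# Example: a feature point 1 m ahead of sensor 2B, expressed in sensor 2A's frame.
rel = RelativePose2D(dx=-0.8, dy=0.3, dtheta=math.radians(170.0))
print(rel.point_from_b_to_a(1.0, 0.0))
```

- With such a transform, the distance measurement data of the two sensors can be expressed in one common frame, which is what the positioning function and the map creation function rely on.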
- The installation position and direction of each sensor 2 are selected so as to form a predetermined detection range, as in the example of FIG. 3. At that time, depending on the shape of the moving body 1, an appropriate installation must be selected so that, for example, the laser light is not blocked by a portion such as the mounting mechanism 4.
- The shape and the like of the moving body 1 vary with the application environment and the application, and the installation position and direction of the sensor 2 can be changed accordingly. Further, even after a sensor 2 is installed at a certain position with the intention that it be fixed, a deviation from that position may occur contrary to the user's intention. For example, it is conceivable that the sensor 2 hits something and its position is slightly shifted.
- That is, for the plurality of sensors 2, the position and direction of installation on the moving body 1 are determined so as to obtain a predetermined detection range (FIG. 3) and the like, according to the application environment, the application and functions (map creation function, etc.) of the mobile system, the type and shape of the moving body 1, user operation, and the like, and can be changed as appropriate.
- The relative positional relationship calculation function in the mobile system of the first embodiment can automatically adjust the relative positional relationship 105 between the sensors 2 with high accuracy and easily in response to such changes in the installation state of the sensors 2.
- This relative positional relationship calculation function calculates the relative positional relationship 105 by a mechanism that matches the shapes of the loci of the sensors 2, as described later (FIG. 7). The mobile system can then realize the positioning function, the map creation function, and the like with high accuracy based on the resulting high-precision relative positional relationship 105.
- In the first embodiment, the height positions (ZA, ZB) of the sensors 2 are different, as shown in FIG. 2 and the like, due to a restriction on the installation of the sensors 2 or an intentional design.
- In that case, according to the conventional technology, the maps that can be created by the map creation function described later are basically different (two) maps, one for the height position of each sensor 2.
- When the relative positional relationship 105 is unknown or of low accuracy, the relationship between those maps is also unknown or of low accuracy. Therefore, it may be difficult for the user or the moving body 1 to grasp the same object in the environment 101 of FIG. 1, such as the shape of the building or the production facility 102, from those maps.
- In contrast, in the first embodiment, the map around the moving body 1 can be created with high accuracy by the map creation function based on the highly accurate relative positional relationship 105 obtained by the relative positional relationship calculation function.
- Further, this mobile system associates the plurality of (two) created map data, corresponding to the height position that differs for each sensor 2, with each other by using the relative positional relationship 105.
- As a result, those maps can be treated in an integrated manner as one map. This makes it easier to grasp the shape and the like of the same object in the environment 101.
- FIG. 3 shows a configuration example of the detection range of the sensor 2.
- The sensor 2 emits laser light while scanning in each direction of the surroundings (in a horizontal plane in this example) within the detection range from the installation position (PA, PB), and the laser light returning from a feature point where it hits an object in the environment 101 is incident on the sensor.
- The sensor 2 calculates the distance to the feature point in each direction from the time between the emission of the laser light and its incidence, by the so-called TOF (time of flight) method.
- The sensor data output from the sensor 2, in other words the distance measurement data, includes, at least at each time point in the time series, the angle (θ, φ) representing the direction in which the sensor 2 looks at its surroundings and the distance value (d) corresponding to that angle.
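- As a rough sketch of the TOF relation and of converting such (angle, distance) samples into feature point coordinates, under the assumption of a two-dimensional scan in the horizontal plane (the function names are illustrative):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # [m/s]

def tof_distance(round_trip_time_s: float) -> float:
    # TOF method: the laser light travels to the feature point and back,
    # so the one-way distance is half the round-trip path length.
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def to_feature_point(d: float, scan_angle: float,
                     sensor_x: float, sensor_y: float,
                     sensor_heading: float) -> tuple[float, float]:
    # Convert a (distance d, scan angle) sample into feature point
    # coordinates, measuring the scan angle from the sensor's
    # installation direction (θA or θB).
    a = sensor_heading + scan_angle
    return (sensor_x + d * math.cos(a), sensor_y + d * math.sin(a))
```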
- FIG. 3 schematically shows an example of the detection range of the sensor 2 corresponding to the configuration of the moving body 1 of FIG. 2 on a horizontal plane (XY plane).
- Each sensor 2 takes a horizontal plane, namely the xy plane in its sensor coordinate system, as the detection target, in other words, the distance measurement target.
- the detection range 301 indicates the detection range of the sensor 2A and is defined by the angle range 305 in the horizontal plane.
- the detection range 302 indicates the detection range of the sensor 2B and is defined by the angle range 306 in the horizontal plane. In this example, the angle ranges 305 and 306 are greater than 180 degrees.
- The detection direction of each sensor 2 is expressed by an angle (θ, φ) with respect to the reference direction (the direction θA or θB) in the horizontal plane.
- The black dot in the illustrated direction is an example of a feature point 303 of an object, at the illustrated distance 304 (value d).
- Note that the actual feature points and detection ranges also cover positions farther from the moving body 1; the distance that can be measured depends on the type of sensor 2 and the like.
- The angles (θ, φ) in this example can take positive and negative values.
- The detection ranges of the sensors 2 differ as illustrated; the detection ranges may partially overlap, and there may be a partial range around the moving body 1 that cannot be detected. Even a range that cannot be detected at one time can be detected by changing the position and posture of the moving body 1. As shown in the figure, by providing a plurality of sensors 2, a wide detection range can be secured for the mobile system as a whole.
- FIG. 3B shows another installation example of the same two sensors 2 (2A, 2B) as in (A) in the example of the moving body 1 having a shape different from that of FIG.
- the moving body 1 does not have the second portion 10b as shown in FIG. 2, and has a flat plate-shaped first portion 10a on a horizontal plane and a mounting mechanism 4 on the first portion 10a.
- The sensor 2A is installed on the front side on the x-axis of the moving body coordinate system CM, at the left-right center position PA, facing the forward direction θA.
- The sensor 2B is installed on the rear side on the x-axis, at the left-right center position PB, facing the rearward direction θB.
- the two sensors 2 are installed at positions symmetrical with respect to the moving body 1 (PA, PB).
- the detection ranges (301 and 302) of the two sensors 2 are configured as symmetrical detection ranges in the front-rear direction.
- In this way, the installation positions, directions, detection ranges, and the like of the plurality of sensors 2 can be changed as appropriate, and various environments and applications can be accommodated by changing them.
- When the sensor 2 is a laser scanner, laser light can be emitted in each surrounding direction by scanning that rotationally drives the laser irradiation unit, and distance information for each direction can be obtained.
- The distance information can be converted into position information of feature points based on the position of the sensor 2. Since the position information of surrounding objects seen from the moving body 1 or the sensor 2 represents the geometric shape of the objects, it may be described as shape data.
- A camera or the like can also be applied as the sensor 2; for example, with a stereo camera, the distance to an object can be calculated from the images of the left and right cameras.
- Alternatively, a positioning system or the like using devices (for example, RFID tags or beacons) installed in the environment rather than on the moving body 1 can be applied.
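- For the stereo camera case mentioned above, a minimal sketch of the standard disparity-to-depth relation (the focal length, baseline, and pixel coordinates here are assumptions for illustration, not values from the patent):

```python
def stereo_depth(x_left_px: float, x_right_px: float,
                 focal_length_px: float, baseline_m: float) -> float:
    # Pinhole stereo relation: depth Z = f * B / disparity, where the
    # disparity is the horizontal pixel shift of the same feature
    # between the left and right camera images.
    disparity = x_left_px - x_right_px
    if disparity <= 0.0:
        raise ValueError("feature must shift leftward in the right image")
    return focal_length_px * baseline_m / disparity

# Example: 12 px disparity with f = 700 px and B = 0.1 m gives about 5.8 m.
print(stereo_depth(412.0, 400.0, 700.0, 0.1))
```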
- FIG. 4 shows a configuration example of the moving mechanism 3 of the moving body 1 of FIG. 1 as a schematic view on a horizontal plane (XY plane).
- the moving mechanism 3 is a mechanism capable of traveling forward and backward, stopping traveling, and turning left and right, and includes two axes and four wheels, for example, a rear wheel drive type mechanism.
- the moving mechanism 3 has a left wheel 401 and a right wheel 402 of the front axle 410, and a left wheel 403 and a right wheel 404 of the rear axle 420.
- the moving mechanism 3 is, for example, a mechanism in which the speeds of the left and right wheels can be controlled independently, and turning or the like can be controlled by controlling the difference in speed between the left and right wheels.
- (A) of FIG. 4 shows the state when traveling forward (in the X direction).
- The position PM indicates the front-rear and left-right center point, as an example of a representative position of the moving body 1 in the moving body coordinate system CM.
- The position PM1 is the left-right center point of the front axle 410, and the position PM2 is the left-right center point of the rear axle 420.
- The moving mechanism 3 realizes forward traveling by driving each wheel at the same rotational speed.
- The broken-line locus 431 indicates the locus when traveling forward from the current position PM.
- (B) of FIG. 4 shows the state when turning to the right.
- A right turn is realized by controlling the rotational speeds of the right wheels 402 and 404 to be smaller than those of the left wheels 401 and 403.
- The broken-line locus 432 indicates the future locus when turning right from the current position PM.
- the moving mechanism 3 may be any mechanism capable of traveling and changing direction.
- the moving mechanism 3 may be a mechanism in which the direction of the wheels is fixed, a mechanism in which the direction of the wheels can be steered, or a mechanism using a structure other than the axle or the wheel, for example, a caster, a caterpillar, or a leg structure.
- the moving mechanism 3 may be, for example, a mechanism used in a cleaning robot or the like, for example, a mechanism capable of independently controlling the direction and rotation speed of each wheel.
- the turning motion is not limited to the turning motion accompanied by the locus of an arc as shown in the figure.
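- As an illustrative sketch of the differential speed control described above, the following uses the standard differential drive (unicycle) kinematics; these equations are common robotics practice rather than text from the patent, and the names are placeholders.

```python
import math

def diff_drive_step(x: float, y: float, heading: float,
                    v_left: float, v_right: float,
                    track_width: float, dt: float) -> tuple[float, float, float]:
    # Equal wheel speeds -> straight travel; a slower right wheel ->
    # a right turn, as in the turning example of FIG. 4.
    v = (v_left + v_right) / 2.0              # speed of the axle midpoint
    omega = (v_right - v_left) / track_width  # turning rate
    heading += omega * dt
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    return x, y, heading

# Right wheels driven slower than the left wheels -> an arc to the right.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = diff_drive_step(*pose, v_left=1.0, v_right=0.8,
                           track_width=0.5, dt=0.05)
print(pose)  # the heading has decreased, i.e., the body turned right
```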
- the relative positional relationship calculation function in the mobile body system is a function of obtaining the relative relationship regarding the installation position and the installation direction of the plurality of sensors 2 in the mobile body coordinate system CM.
- The representative position of the moving body 1 can be specified in advance, and the method of specifying it is not limited.
- This representative position may be defined using the positions (PA, PB) of the sensors 2 (2A, 2B).
- This representative position may be, for example, the same as the position PA of one specific sensor 2, for example the sensor 2A, or may be an intermediate position between the two sensors 2 (2A, 2B).
- This representative position may be a predetermined position in the shape of the housing 10 or the like, for example, a center position or an intermediate position on an axle (for example, the positions PM1 and PM2).
- This representative position may be a position having a predetermined relative relationship (direction and distance) from the position of a sensor 2.
- FIG. 5 shows a functional block configuration of the mobile system of the first embodiment.
- the mobile body 1 of this mobile body system includes a control device 100, two sensors 2 (2A, 2B), a moving mechanism 3, and the like.
- the control device 100 includes a position identification device 5 and a movement mechanism control device 6.
- the position identification device 5 is mounted on, for example, a microcomputer or the like.
- the movement mechanism control device 6 is implemented by, for example, a PLC (programmable logic controller) or the like.
- the position identification device 5 and the movement mechanism control device 6 are integrally mounted as the control device 100, but the present invention is not limited to this.
- the control device 100 may include a portion that drives and controls the operation of the mounting mechanism 4.
- the position identification device 5 has a positioning function (in other words, a position / orientation estimation function), an automatic transport control function, a map creation function, a relative positional relationship calculation function, and the like (FIG. 6).
- the position identification device 5 realizes each part such as the sensor control unit 51 based on the program processing by the processor 601 of FIG.
- the position identification device 5 includes a sensor control unit 51, a position identification unit 52, a map creation unit 53, a data storage unit 54, an adjustment unit 55, and the like as each unit.
- The data storage unit 54 stores data and information such as the first position identification result 41A and the second position identification result 41B as position identification result data, the first map data 42A and the second map data 42B as map data, and the relative positional relationship data 43.
- the sensor control unit 51 includes a first sensor control unit 51A and a second sensor control unit 51B.
- the first sensor control unit 51A controls the sensor 2A and obtains the sensor data SDA from the sensor 2A.
- the second sensor control unit 51B controls the sensor 2B and obtains the sensor data SDB from the sensor 2B.
- the sensor data SDA and the sensor data SDB include distance measurement data for each time point on the time series, that is, distance information for each angle representing a direction.
- the sensor control unit 51 holds time-series sensor data, data for at least a certain period of time or longer, in the memory.
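- A minimal sketch of such time-series buffering (the window length and the record format are illustrative assumptions):

```python
from collections import deque

class SensorDataBuffer:
    # Holds time-series distance measurement data for at least a fixed
    # window, dropping the oldest samples automatically.
    def __init__(self, max_samples: int = 1000):
        self._buf: deque = deque(maxlen=max_samples)

    def push(self, timestamp: float, ranges: list[float]) -> None:
        # One record per scan: (time point, distance value for each angle).
        self._buf.append((timestamp, ranges))

    def window(self) -> list[tuple[float, list[float]]]:
        # Return the retained time series, e.g., for locus computation.
        return list(self._buf)
```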
- the position identification unit 52 is an element that constitutes a positioning function (particularly a position / orientation estimation function), and is a part that identifies the position and orientation of the sensor 2 in the spatial coordinate system using sensor data.
- the position identification unit 52 has a first position identification unit 52A and a second position identification unit 52B.
- the first position identification unit 52A estimates the position and orientation of the sensor 2A based on the sensor data SDA, and sets the result as the first position identification result 41A.
- the second position identification unit 52B estimates the position and orientation of the sensor 2B based on the sensor data SDB, and sets the result as the second position identification result 41B.
- For example, the position identification unit 52 creates shape data representing the geometric shape of surrounding objects from the distance measurement data, which is the sensor data, and compares and collates the shape data with the existing map data of the data storage unit 54. Then, the position identification unit 52 estimates the position and orientation of the sensor 2 in the spatial coordinate system from the collation result.
- the map creation unit 53 is an element constituting the map creation function, and is a part that creates and updates the map data of the environment 101 (FIG. 1) while using the processing result by the position identification unit 52.
- the map creation unit 53 uses the shape data created by the position identification unit 52 to create new map data and update existing map data.
- the map creation unit 53 has a first map creation unit 53A and a second map creation unit 53B.
- the first map creation unit 53A creates the first map data 42A by using the sensor data SDA and the first position identification result 41A.
- the second map creation unit 53B creates the second map data 42B using the sensor data SDB and the second position identification result 41B.
- the first map data 42A is data representing the shape of an object around the moving body 1 in the horizontal plane at the height position ZA in FIG.
- the second map data 42B is data representing the shape of an object around the moving body 1 in the horizontal plane at the height position ZB.
- The data storage unit 54 temporarily stores each piece of data created by the above processing, such as the first position identification result 41A, the second position identification result 41B, the first map data 42A, and the second map data 42B.
- the adjustment unit 55 is an element that constitutes the relative positional relationship calculation function, in other words, is a relative positional relationship calculation unit.
- The adjusting unit 55 performs the calculation of the relative positional relationship between the sensors 2 (the relative positional relationship 105 in FIGS. 1 and 2) while referring to each piece of data (41A, 41B, 42A, 42B) of the data storage unit 54.
- this process is a process of calibrating between the sensor coordinate systems, and is a process related to setting the position coordinates of each sensor 2 in the mobile coordinate system CM.
- the adjusting unit 55 stores it in the data storage unit 54 as the relative positional relationship data 43. This corresponds to the latest settings related to the sensor 2 and the positioning function, in other words, automatic setting updates.
- The relative positional relationship data 43 is data including sensor relative coordinate information and the like. Specifically, the relative positional relationship data 43 is data including the values (Δx, Δy) representing the positional relationship of the relative positional relationship 105 as shown in FIG. 2 and the value (Δθ) representing the directional relationship.
- the position identification device 5 may output the relative positional relationship data 43 or the like of the data storage unit 54 to the user in the form of display or the like.
- The position identification device 5 provides a setting screen related to the sensors 2 in the form of, for example, a Web page, displays the relative positional relationship information based on the relative positional relationship data 43 on the setting screen, and allows the user to confirm it or set it manually.
- On the setting screen, the relative positional relationship information may be displayed as numbers, for example the above values (Δx, Δy, Δθ), or the relative positional relationship information may be displayed graphically over an image showing the appearance configuration of the moving body 1 as shown in FIG.
- The business operator can set the initial setting values of the relative positional relationship of the sensors 2 in advance on the setting screen of the position identification device 5. After this initial setting, when the relative positional relationship calculation function is enabled, the setting of the relative positional relationship data 43 can be updated automatically even without manual setting by a person.
- the mobile system may be in a form in which a device such as a PC (PC 110 in FIG. 1) is further connected to the control device 100 of the mobile body 1 by communication.
- the device such as a PC is provided with an OS, an application program, and the like.
- Examples of this application program include processing related to user setting processing of a mobile system, positioning function, automatic transportation function, map creation function, and the like.
- Examples of this application program include those that support a function of setting an automatic transportation route, a function of a user browsing a map, and the like.
- the user can operate the device such as the PC and use those functions on the display screen.
- the user can also confirm the relative positional relationship data 43 on the display screen of the device.
- The adjusting unit 55 or the map creation unit 53 of the position identification device 5 further performs a process of associating the plurality of (two) map data (the first map data 42A and the second map data 42B) with each other using the relative positional relationship data 43. As a result, the plurality of (two) map data can be treated roughly as one map data as a whole.
- the position identification device 5 may create one map data from a plurality of (two) map data by synthesis or the like. The user can browse the one map data on the display screen of the PC 110.
- the movement mechanism control device 6 is a part including a drive control circuit and the like, and controls the operation of the movement mechanism 3 by using the position identification result data and the map data by the position identification device 5.
- the movement mechanism control device 6 includes a position identification device control unit 61 and a movement mechanism control unit 62.
- the position identification device control unit 61 communicates with the position identification device 5 and acquires data necessary for control from the position identification device 5.
- the movement mechanism control unit 62 controls the traveling and turning operations of the movement mechanism 3 based on the position of the moving body 1 grasped by the positioning function and the map data created by the map creation function.
- FIG. 6 shows an implementation configuration example including the software and hardware of the position identification device 5 of FIG.
- the position identification device 5 includes a processor 601, a memory 603, an auxiliary storage device 605, a communication interface device 607, an input / output interface device 608, a power supply device 609, and the like, and these are connected to each other via a bus or the like.
- the moving body 1 may be provided with a mechanism such as an operation unit for the user to operate the moving body 1.
- the processor 601 is composed of, for example, a CPU, a ROM, a RAM, or the like, in other words, a controller.
- the processor 601 or the like may be implemented by a programmable hardware circuit or the like such as FPGA.
- the processor 601 reads a program stored in the auxiliary storage device 605 or the like into the memory 603, expands the program, and executes processing according to the program. As a result, each part such as the position identification part 52 in FIG. 5 is realized as an execution module.
- The control program 630, the processing data 640 by the processor 601, and the like are stored in the memory 603.
- The control program 630 includes a sensor control program 631, a position identification program 632, a map creation program 633, an adjustment program 635, and the like, whereby the sensor control unit 51, the position identification unit 52, the map creation unit 53, the adjusting unit 55, and the like of FIG. 5 are realized.
- the processed data 640 includes data such as map data, position identification results, and relative positional relationship data (corresponding to each data of the data storage unit 54 in FIG. 5).
- the auxiliary storage device 605 is composed of a non-volatile memory, a storage device, for example, a storage medium such as a disk or a memory card, or a DB server on a communication network, and stores programs and various data in advance.
- the auxiliary storage device 605 stores, for example, map data 651, sensor position identification result data 652, sensor relative positional relationship data 653, and the like. If necessary, the processor 601 reads the data in the auxiliary storage device 605 into the memory 603, writes the data in the memory 603 into the auxiliary storage device 605, and stores the data.
- The map data 651 is map data of the environment, and may be a map database (DB) that stores a plurality of map data; it includes the map data corresponding to the first map data 42A and the second map data 42B of FIG. 5. Each map data is composed of, for example, an image.
- The sensor position identification result data 652 is the position identification result data of each sensor 2 (2A, 2B), corresponds to the first position identification result 41A and the second position identification result 41B of FIG. 5, and contains information representing the position and orientation of each sensor 2.
- The sensor relative positional relationship data 653 is data representing the relative positional relationship between the sensors 2 (2A, 2B), in other words, setting data for the automatic adjustment of the sensors 2, and corresponds to the relative positional relationship data 43 of FIG. 5.
- the communication interface device 607 performs communication processing according to each communication interface between the sensor 2 and the mobile mechanism control device 6, or between an external device such as a PC or a server.
- the communication interface may be wired or wireless, short-range communication or remote communication.
- An input device (for example, a keyboard) or an output device (for example, a display device) may be mounted on the moving body 1.
- the power supply device 609 is composed of a battery or the like, and supplies electric power to each part.
- the processor 601 has at least the above-mentioned positioning function (in other words, a position / orientation estimation function), an automatic transport control function, a map creation function, and a relative positional relationship calculation function as functions realized by program processing or the like.
- the positioning function is a function of measuring the position of the moving body 1 based on the sensor 2.
- the position / orientation estimation function is a function of estimating the position and orientation of the sensor 2.
- the positioning function and the position / orientation estimation function are mainly realized by the position identification unit 52 of FIG.
- the automatic transport control function is a function of controlling the automatic transport of the moving body 1, for example, a function of traveling a route set in the environment 101 (FIG. 1) so as not to collide with a surrounding object.
- the map creation function is a function of creating and updating a map of the environment 101 based on the movement in the environment 101 and the sensor 2, and is mainly realized by the map creation unit 53 of FIG.
- the relative positional relationship calculation function is a function of calculating the relative positional relationship 105 (FIG. 1 and the like) between the sensors 2 and performing automatic adjustment, and is mainly realized by the adjustment unit 55 of FIG.
- The positioning function is a function in which the control device 100 of FIG. 1 and the like measures or estimates the position and orientation of the moving body 1 in the spatial coordinate system CS of the environment 101 by calculation using the sensor data from the two sensors 2.
- the control device 100 uses the locus of the position of each sensor 2 in the spatial coordinate system CS grasped by this function for the calculation by the relative positional relationship calculation function.
- the automatic transport control function is a function realized by using the positioning function or the position / attitude estimation function.
- The moving body 1 controls suitable automatic transport, for example safe transport on a route, based on the state of the position and posture of the moving body 1 grasped by those functions. Since this mobile system includes a plurality of (two) sensors 2, the overall detection range can be widened and the position and the like can be detected stably by the positioning function, and as a result, suitable automatic transport can be realized.
- Note that this mobile system may have, as functions configured by using the plurality of sensors 2, not only the positioning function but also other functions; in the first embodiment, it has the map creation function.
- the map creation function constitutes a so-called SLAM (Simultaneous Localization and Mapping) function together with a position / orientation estimation function.
- SLAM is a method in which a moving object such as an automatic guided vehicle estimates its own position and posture based on the detection of the surrounding situation by a sensor, and at the same time creates and updates a map of the surroundings. That is, the map creation function is a function that can automatically create or update a map around the moving body 1 in the environment 101 based on the sensor data of the sensor 2 accompanying the traveling of the moving body 1.
- this map corresponds to the type of sensor 2 (two-dimensional laser scanner) and is a map on a horizontal plane, and is configured as image data representing the shape of an object in the environment 101. Since this mobile system includes a plurality of (two) sensors 2, it is possible to more preferably perform map creation by the map creation function.
- The relative positional relationship calculation function is a function of calculating and automatically setting the relative positional relationship 105 between the sensors 2 in the moving body coordinate system CM based on the sensor data of the two sensors 2 (2A, 2B).
- the moving body 1 automatically activates this function during normal traveling and automatically adjusts the relative positional relationship 105.
- the control device 100 detects and measures surrounding objects by the sensor 2 when moving in the environment 101 (FIG. 1), for example, when automatically transporting on a set route.
- the control device 100 creates shape data representing the shape of an object around the moving body 1 with reference to the position of the sensor 2 based on the distance measurement data which is the sensor data from the sensor 2.
- The control device 100 estimates the current position and orientation of the moving body 1 in the environment 101 (the corresponding map data) by comparing and collating the shape data with the existing map data stored in the control device 100. This is the position identification result.
- This estimation is, for example, a process of evaluating and judging the degree of matching or similarity between the shape data and the map data within a predetermined search range, and the degree can be evaluated, for example, from the number of overlaps in pixel units.
- the control device 100 controls suitable movement while estimating the current position and posture of the moving body 1 for each section on the route.
- the control device 100 determines the next target position on the route from the current position and posture, and controls the movement mechanism 3 for movement to the target position. At that time, the control device 100 controls the moving mechanism 3 so that the relationship between the surrounding object and the current position and posture of the moving body 1 becomes suitable.
- The control device 100 creates and registers new map data using the shape data created above while estimating the position and posture. Alternatively, the control device 100 updates the existing map data using the shape data created above. The control device 100 similarly repeats the above-mentioned local planning process for each section on the route.
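- A minimal sketch of this kind of grid-based collation, scoring candidate poses by the pixel-overlap count described above (the brute-force search and all names are illustrative assumptions, not the patent's exact procedure):

```python
import math
from itertools import product

def overlap_score(scan_pts, occupied_cells, resolution, pose):
    # Count how many scan points, transformed by the candidate pose,
    # land on occupied cells of the existing map (overlap in pixel units).
    x0, y0, th = pose
    c, s = math.cos(th), math.sin(th)
    return sum(1 for px, py in scan_pts
               if (int((x0 + c * px - s * py) / resolution),
                   int((y0 + s * px + c * py) / resolution)) in occupied_cells)

def estimate_pose(scan_pts, occupied_cells, resolution, guess,
                  search=0.2, step=0.05):
    # Evaluate poses around an initial guess within a predetermined search
    # range; the pose with the highest overlap is the identification result.
    gx, gy, gth = guess
    offsets = [i * step - search for i in range(int(2 * search / step) + 1)]
    return max(((gx + dx, gy + dy, gth + dth)
                for dx, dy, dth in product(offsets, repeat=3)),
               key=lambda p: overlap_score(scan_pts, occupied_cells,
                                           resolution, p))
```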
- FIG. 7 shows a flow of main processing (particularly relative positional relationship calculation) by the position identification device 5 of the control device 100 of FIG. 5, particularly the adjusting unit 55.
- the flow of FIG. 7 has steps S1 to S5.
- As the moving body 1 travels, the adjusting unit 55 inputs and acquires the data of the position identification results (41A, 41B) on the time series over at least a certain period of time, in other words, the locus data.
- The position identification results include information on the position and orientation (the angle representing the direction) of each sensor 2 in the spatial coordinate system CS.
- In step S1, the adjusting unit 55 generates relative position parameters 700 (Δx, Δy).
- The relative position parameter 700 is a parameter representing the relationship between the positions (PA, PB) of the sensors 2 (2A, 2B) in the moving body coordinate system CM of the moving body 1, as shown in FIG. 2.
- The relative position parameter 700 has, for example, a difference value Δx in the x-axis direction and a difference value Δy in the y-axis direction of the position PB of the sensor 2B with reference to the position PA of the sensor 2A.
- In the first embodiment, the height positions (ZA, ZB) and the difference (Δz) in the z-axis direction are excluded from the calculation because they are set values.
- the subscript k represents a certain point in time
- the subscript T represents the last point in time.
- In step S2, for each relative position parameter 700 from step S1, the adjusting unit 55 generates, with respect to the locus 701 (sometimes referred to as the first locus) of the first position identification result 41A of the sensor 2A, which is the first sensor, a locus 703 (sometimes referred to as a tentative locus) of a "temporary position" (referred to as VPB) of the sensor 2B, which is the second sensor.
- This "temporary position" sets a provisional position of the sensor 2B for the matching process, and corresponds to the position reached by applying the relative position parameter 700 (Δx, Δy) as a vector from the position PA of the sensor 2A at each time point (t).
- The temporary position VPB is generated for a plurality of candidates (VPB1, ..., VPBn).
- The locus 701 of the first position identification result 41A of the sensor 2A is expressed as time series data, for example, {(xA_1, yA_1), ..., (xA_k, yA_k), ..., (xA_T, yA_T)}.
- The locus 703 of the temporary position of the sensor 2B is expressed as time series data, for example, {(vxB_1, vyB_1), ..., (vxB_k, vyB_k), ..., (vxB_T, vyB_T)}.
- In step S3, the adjusting unit 55 matches the locus 703 of each temporary position VPB generated in step S2 against the locus 702 of the second position identification result 41B of the sensor 2B (sometimes described as the second locus).
- This matching process is a process of evaluating and determining the degree of matching or similarity of shapes between the loci.
- Here, the degree of coincidence is referred to as K, and a rotation parameter is referred to as R.
- The adjusting unit 55 stores the calculated degree of coincidence K and rotation parameter R in the memory for each candidate relative position parameter 700.
- The locus 702 of the second position identification result 41B is expressed as time series data, for example, {(xB_1, yB_1), ..., (xB_k, yB_k), ..., (xB_T, yB_T)}.
- In step S4, from the results of step S3, the adjusting unit 55 determines the relative position parameter 700 (Δx, Δy) corresponding to the pair of loci having the highest degree of coincidence K as the optimum relative position 704 (Δx_opt, Δy_opt), extracts it, and stores it in the memory. Up to this point, the positional relationship (Δx, Δy) of the relative positional relationship 105 shown in FIG. 1 and the like has been grasped.
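- A compact sketch of steps S1 to S4 under simplifying assumptions: a shared time base between the two loci, a brute-force candidate grid for (Δx, Δy), the parameter vector applied in sensor 2A's heading frame, and a negated point-wise squared distance standing in for the degree of coincidence K (the rotation parameter R is omitted). All names are illustrative.

```python
import math
from itertools import product

def tentative_locus(locus_a, headings_a, dx, dy):
    # Step S2: from the first locus and a candidate (Δx, Δy), generate the
    # tentative locus of the temporary position VPB of the second sensor.
    out = []
    for (xa, ya), th in zip(locus_a, headings_a):
        c, s = math.cos(th), math.sin(th)
        out.append((xa + c * dx - s * dy, ya + s * dx + c * dy))
    return out

def coincidence_k(locus_v, locus_b):
    # Step S3 (simplified): shape agreement between a tentative locus and
    # the second locus, as the negated mean squared point distance.
    return -sum((xv - xb) ** 2 + (yv - yb) ** 2
                for (xv, yv), (xb, yb) in zip(locus_v, locus_b)) / len(locus_v)

def optimum_relative_position(locus_a, headings_a, locus_b,
                              span=1.0, step=0.05):
    # Steps S1 + S4: enumerate candidate (Δx, Δy) on a grid and keep the
    # candidate whose tentative locus best matches the second locus.
    offsets = [i * step - span for i in range(int(2 * span / step) + 1)]
    return max(product(offsets, repeat=2),
               key=lambda d: coincidence_k(
                   tentative_locus(locus_a, headings_a, *d), locus_b))
```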
- step S5 the adjusting unit 55 further calculates the optimum value regarding the relationship ( ⁇ ) of the direction ( ⁇ ) between the sensors 2.
- the adjusting unit 55 uses the optimum relative position 704 ( ⁇ x_opt, ⁇ y_opt) in step S4 to generate an amount ⁇ B_t + R ⁇ A_t ⁇ that represents the relationship ( ⁇ ) of the direction ( ⁇ ).
- the angle θB_t is an angle representing the posture of the sensor 2B at each time point (t), included in the second position identification result 41B of the sensor 2B.
- the angle θA_t is an angle representing the posture of the sensor 2A at each time point (t), included in the first position identification result 41A of the sensor 2A.
- “+ R” is the addition of the rotation parameter R. This addition is an operation for adjusting to the coordinate system CA of the first sensor (sensor 2A) in obtaining the relationship of directions.
- "− θA_t" denotes the operation of taking the difference between the angle θB_t and the angle θA_t.
- the adjusting unit 55 takes the average value of the quantities {θB_t + R − θA_t} generated at the respective time points, determines it as the optimum relative direction 705 (Δθ_opt), and stores it in the memory.
- This average value is represented by Σ{θB_t + R − θA_t} / T.
- the optimum relative direction 705 (Δθ_opt) is the optimum value for the directional relationship (Δθ) between the sensors 2, expressed as a difference of angles.
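- A minimal sketch of this average follows, assuming a circular mean to handle angle wrap-around (the patent specifies only a plain average; all names are illustrative):

```python
import math

def optimum_relative_direction(theta_b, theta_a, rotation_r):
    """Average of {theta_B_t + R - theta_A_t} over the time points t.
    theta_b, theta_a: per-time-point sensor headings in radians.
    rotation_r: the rotation parameter R from the matching step."""
    diffs = [tb + rotation_r - ta for tb, ta in zip(theta_b, theta_a)]
    # Circular mean: average unit vectors so angles near +/-pi work.
    s = sum(math.sin(d) for d in diffs)
    c = sum(math.cos(d) for d in diffs)
    return math.atan2(s, c)  # delta_theta_opt
```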
- Step S5 has shown one example of calculating the directional relationship, but the present invention is not limited to this.
- the adjusting unit 55 stores the information including the optimum relative position 704 (Δx_opt, Δy_opt) and the optimum relative direction 705 (Δθ_opt) obtained as described above as the relative positional relationship data 43 of FIG. 5.
- the relative positional relationship data 43 corresponds to the values (Δx, Δy, Δθ) representing the relative positional relationship 105 in FIG. 2.
- FIG. 8A shows an example of the movement of the moving body 1 and the locus at that time in a horizontal plane.
- at the time point t1, the moving body 1 (represented here by the typical position PM of FIG. 4) is at the position P1 facing forward (the X direction in the spatial coordinate system CS, the x direction in the moving body coordinate system CM), and travels straight ahead to the position P2 at the time point t2.
- the moving body 1 makes a right turn from the position P2 so as to pass through the position P3 at the time point t3.
- the moving body 1 is in a state of facing right at the position P4 at the time point t4.
- the locus 800 shown by the broken line indicates the locus of the movement of the moving body 1 (position PM) described above, in particular the locus through which the position PM2 of the rear axle 420 of FIG. 4 passes.
- the locus 701 shown by the solid line indicates a locus (first locus) through which the position PA of the sensor 2A on the front side passes along with this movement.
- FIG. 8B shows only the locus 701 of the sensor 2A (position PA), extracted from (A).
- This locus 701 includes a straight line portion 701a when traveling straight forward, a curved section 701b (in other words, an arc) when turning right, and a straight line portion 701c when traveling straight after the right turn.
- the first position identification unit 52A in FIG. 5 obtains such trajectory data based on the sensor 2A.
- This locus data has position coordinates in the horizontal plane corresponding to the height position ZA of FIG. 2 at each time point in the time series. Although the locus is drawn as a line, in detail it is a point cloud.
- (A) of FIG. 9 corresponds to (A) of FIG. 8, and the locus 800 of the moving body 1 is the same.
- the locus 702 shown by the solid line indicates a locus (second locus) through which the position PB of the sensor 2B on the rear side passes along with this movement.
- FIG. 9B shows only the locus 702 of the sensor 2B (position PB), extracted from (A).
- This locus 702 includes a straight line portion 702a when traveling straight forward, a curved section 702b (in other words, an arc) when turning right, and a straight line portion 702c when traveling straight after the right turn.
- the second position identification unit 52B in FIG. 5 obtains such trajectory data based on the sensor 2B.
- This locus data is data having position coordinates in the horizontal plane corresponding to the height position ZB in FIG. 2 at each time point on the time series.
- FIG. 10 shows, together with the locus 800 of the moving body 1 and the locus 701 (first locus) of the sensor 2A as in FIG. 8, loci 7011 and 7012 as examples of two loci 702 (second loci) obtained when the installation position PB of the sensor 2B of FIG. 9 is varied.
- the locus 7011 is an example of a locus based on actual measured values when the position PB of the sensor 2B in the moving body coordinate system CM is the position PB1, and the locus 7012 is an example when it is the position PB2.
- it can be seen that, in each locus (7011, 7012) of the sensor 2B, the radius of the arc portion of the locus changes according to the distance (for example, distances 1001, 1002) from the turning center of the moving body 1 (for example, the locus 800 of the position PM of the moving body 1) to the position PB (PB1, PB2).
- in this way, the mobile system can calculate the relative positional relationship by the matching process in step S3 of FIG. 7, using locus data, obtained over a certain period of time or more, that includes a portion such as a straight line corresponding to straight-ahead motion and a portion such as an arc corresponding to a direction-change motion such as turning.
- each relative position parameter 700 is set so that each such candidate position PB becomes a temporary position VPB.
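- The dependence of the arc radius on the sensor's offset can be reproduced numerically; the sketch below, with an assumed circular body path and hypothetical offsets, is illustrative only:

```python
import math

def rigid_offset_trajectory(body_poses, offset):
    """Positions traced by a sensor mounted at a fixed body-frame
    offset (dx, dy), given body poses (x, y, heading) over time."""
    dx, dy = offset
    return [(x + dx * math.cos(th) - dy * math.sin(th),
             y + dx * math.sin(th) + dy * math.cos(th))
            for x, y, th in body_poses]

# The body turns along a circle of radius 1.0 about the origin;
# sensors at different forward offsets trace arcs of different radii
# (cf. loci 7011 and 7012 in FIG. 10).
poses = [(math.cos(p), math.sin(p), p + math.pi / 2)
         for p in (math.radians(i) for i in range(90))]
for off in [(0.5, 0.0), (1.0, 0.0)]:          # hypothetical PB1, PB2
    r = math.hypot(*rigid_offset_trajectory(poses, off)[0])
    print(off, round(r, 3))                    # 1.118 and 1.414
```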
- FIG. 11A shows an example of generating the relative position parameters 700 (Δx, Δy) in step S1 of FIG. 7.
- the adjusting unit 55 can select a plurality of relative position parameters 700 (Δx, Δy) corresponding to various directions and distances in the horizontal plane from the position PA of the reference sensor 2A at each time point (t).
- the relative position parameters 700 (Δx, Δy) are generated as a plurality of candidate values shifted within a predetermined range based on, for example, the initial setting value of the relative positional relationship (the relative positional relationship data 43 in FIG. 5).
- the adjusting unit 55 sets a temporary position VPB according to the relative position parameter 700 on a grid (a grid having a plurality of position coordinate points) as shown in the figure.
- the temporary position VPB0 is the value generated by the relative position parameters (Δx0, Δy0) corresponding to the initial setting value of the relative positional relationship.
- the temporary positions VPB1 and VPB2 are values generated by other relative position parameters (Δx1, Δy1) and (Δx2, Δy2).
- the range in which the adjusting unit 55 generates the relative position parameters 700 may be the range in which the sensor 2 can be installed based on the shape of the moving body 1, or a predetermined range centered on the initial setting value of the relative positional relationship.
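- A minimal sketch of this candidate generation follows, assuming a square grid around the initial setting value (grid spacing, range, and names are illustrative):

```python
def generate_candidates(init_dx, init_dy, half_range, step):
    """Step S1 sketch: grid of relative position parameter candidates
    (dx, dy) within +/-half_range around the initial setting value."""
    n = int(half_range / step)
    return [(init_dx + i * step, init_dy + j * step)
            for i in range(-n, n + 1)
            for j in range(-n, n + 1)]

# e.g. candidates around a hypothetical initial value (-0.8, 0.3) [m]
candidates = generate_candidates(-0.8, 0.3, half_range=0.2, step=0.05)
```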
- the direction (x-axis, y-axis) of the sensor 2B (coordinate system CB) at each temporary position VPB is a constant direction based on the initial setting value of the relative positional relationship.
- FIG. 11B shows an example of the parameters used when setting a temporary direction ("temporary direction": Vθ) of the sensor 2B at the temporary position VPB.
- various temporary directions (Vθ) may be generated by using the parameter of the temporary direction (Vθ) of the sensor 2B at each temporary position VPB corresponding to each relative position parameter 700.
- as the parameters of the tentative direction (Vθ), for example, a plurality of candidates are generated within a predetermined range based on the initial setting value of the relative positional relationship.
- three examples Vθ0, Vθ1, and Vθ2 are shown as parameters of the tentative direction (Vθ).
- the parameter of the tentative direction (Vθ) can be defined by using the angle difference with respect to the direction (θA, x-axis) of the reference sensor 2A, or the rotation parameter R described above.
- In step S2, the locus 703 of the temporary position VPB of the sensor 2B is generated according to the relative position parameter 700 (Δx, Δy) from step S1, from the locus 701 that is the first position identification result for the position PA of the sensor 2A.
- for the locus 703 of the temporary position VPB corresponding to a certain candidate, the same relative position parameter 700 is applied at every time point (t).
- FIG. 12 shows an example in which the locus 703 (provisional locus) of each temporary position VPB corresponding to each relative position parameter 700 is generated in step S2.
- the tentative locus is indicated by a long-dotted chain line.
- the temporary position VPB is set from each position PA (PA1, PA2, ...) having a direction (θA) at each time point (t = t1, t2, ...), using the relative position parameter 700 (Δx, Δy).
- the positions at the respective time points (t) are shown as {(vxB_1, vyB_1), (vxB_2, vyB_2), (vxB_3, vyB_3), (vxB_4, vyB_4), (vxB_5, vyB_5)}.
- one tentative locus corresponds to the case where the relative position parameter 700 (Δx, Δy) is the same at every time point.
- the temporary trajectories 1201 and 1202 of each temporary position VPB after generation have different radii of arcs (curved portions 1201b and 1202b) depending on the distance from the turning center.
- the shape of the tentative locus is different from the shape of the first locus.
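- A minimal sketch of this step S2 generation follows, assuming the first sensor's per-time-point heading θA_t is available from the position identification result (names are illustrative):

```python
import math

def tentative_locus(first_locus, headings, rel_param):
    """From the first locus {(xA_t, yA_t)} and the heading theta_A_t of
    the first sensor at each time point, generate the tentative locus of
    the temporary position VPB by applying the same relative position
    parameter (dx, dy) at every time point."""
    dx, dy = rel_param
    locus = []
    for (xa, ya), th in zip(first_locus, headings):
        # Rotate the offset from the first sensor's frame into the
        # spatial coordinate system, then translate by (xA_t, yA_t).
        locus.append((xa + dx * math.cos(th) - dy * math.sin(th),
                      ya + dx * math.sin(th) + dy * math.cos(th)))
    return locus
```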
- FIG. 13 is an explanatory diagram of an example of the matching process in step S3 of FIG. 7.
- Using the rotation parameter R, the tentative locus of the temporary position VPB of the second sensor, generated based on the first locus of the first sensor and the relative position parameter 700, and the second locus of the second sensor are compared and collated in shape over various relative orientations.
- FIG. 13A shows, as the pairs of loci to be compared, the locus 703 (provisional locus) shown by the alternate long and short dash line and a plurality of loci (for example, loci 7021, 7022) generated based on the locus 702 (second locus) shown by the solid line.
- the locus 703 is a provisional locus of a certain temporary position VPB corresponding to a certain relative position parameter 700 (Δx, Δy).
- the plurality of loci (7021, etc.) are a plurality of loci generated in different directions using the rotation parameter R.
- a plurality of loci (7021 and the like) are superposed on the locus 703, assuming that the position at the first time point (t1) is the same.
- the adjusting unit 55 compares the locus 703 of the temporary position VPB with a plurality of loci (7021 etc.) and calculates the degree of coincidence K.
- the adjusting unit 55 calculates, at each time point (t), the distance 1301 between the corresponding positions on the locus 703 of the temporary position VPB (for example, the position 1300 corresponding to the time point t3) and on each of the loci 7021 to 7025.
- the adjusting unit 55 then calculates the matching degree K from the total sum of these distances; roughly, the matching degree K is defined and calculated so that the smaller the sum, the higher the matching degree K.
- in this example, the rotation parameter R is applied to the locus 702 (second locus) to generate the comparison targets, but the present invention is not limited to this; the rotation parameter R may instead be applied to the locus 703 (provisional locus) to generate the comparison targets.
- An important point of the matching process is to determine the degree to which the shapes of the loci, including the curved portions, match; the provisional locus may be generated from either the first locus or the second locus.
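- A minimal sketch of one such comparison follows, assuming K is scored as the negated sum of point-to-point distances (the patent states only that a smaller sum means a higher K; the rotation and overlay conventions here are assumptions):

```python
import math

def matching_degree(tentative, second, rotation_r):
    """Step S3 sketch: rotate the second locus by R about its first
    point, overlay it on the tentative locus so the positions at the
    first time point (t1) coincide, and score the shape agreement."""
    x0, y0 = second[0]
    tx0, ty0 = tentative[0]
    c, s = math.cos(rotation_r), math.sin(rotation_r)
    total = 0.0
    for (xt, yt), (xs, ys) in zip(tentative, second):
        # Rotate about the second locus's first point, then translate
        # so both loci share the position at time t1.
        rx = c * (xs - x0) - s * (ys - y0) + tx0
        ry = s * (xs - x0) + c * (ys - y0) + ty0
        total += math.hypot(xt - rx, yt - ry)
    return -total  # higher K means a better shape match
```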
- In step S5, the optimum relative direction (Δθ_opt) is calculated from the quantity {θ2_t + R − θ1_t}.
- FIG. 13B shows an example of the relationship among the locus 701 of the sensor 2A, the locus 702 of the sensor 2B, the locus 703 of the temporary position VPB, the optimum relative position 704 (Δx_opt, Δy_opt), and the rotation parameter R.
- the quantity {θ2_t + R − θ1_t} indicates, at the time point t1, the relationship between the direction of the sensor 2A (θ1_t1) and the direction of the sensor 2B (θ2_t1).
- the control device 100 performs the locus shape matching process of step S3 in FIG. 7 as part of the relative positional relationship calculation function. At that time, the matching process is possible regardless of whether the detection times (t) in the sensor data SDA of the first sensor and the sensor data SDB of the second sensor are synchronized. Even when the times are not synchronized between the sensor data SDA and the sensor data SDB, the control device 100 can, for example, compare the shapes of the first locus of the first sensor (specifically, the tentative locus generated based on the first locus) and the second locus of the second sensor, as follows.
- FIG. 14 shows an example of matching processing when the time is not synchronized between the sensor data.
- the locus 1401 (provisional locus) shown by the alternate long and short dash line is generated from, for example, the locus of the first position identification result 41A based on the sensor data SDA of the sensor 2A and a certain relative position parameter 700; it is the locus of the temporary position VPB of the sensor 2B.
- the locus 1402 shown by the solid line is, for example, the locus (second locus) of the second position identification result 41B based on the sensor data SDB of the sensor 2B.
- the time points (t) of the two sensor data are not synchronized with each other.
- the time point t1 on the locus 1401 and the time point t1 on the locus 1402 are different times.
- For simplicity, the rotation parameter R is not considered in FIG. 14.
- the control device 100 tries comparisons in which, for example, the position at a certain time point on the locus 1401 is made to correspond to the position at each time point (at least some candidate time points) on the locus 1402.
- FIG. 14B shows a case where the position p1 at the time point t1 of the locus 1401 is aligned with the position p11 at the time point t1 of the locus 1402 when the locus 1402 is superimposed on the locus 1401.
- FIG. 14C shows a case where the position p1 at the time point t1 of the locus 1401 is aligned with the position p12 at the time point t2 of the locus 1402 when the locus 1402 is superimposed on the locus 1401.
- the method of taking the distance between the loci is not limited to the example of the distance 1301 in FIG. 13; for example, the shortest distance from a point on one locus to a point on the other locus may be used.
- the control device 100 may select, from the results of the trials described above, the one having the highest matching degree K.
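- A minimal sketch of such trials follows, modeling the unsynchronized case as an index shift between the two loci (the shift model, `max_shift`, and `score_fn` are illustrative assumptions):

```python
def best_time_alignment(locus_a, locus_b, max_shift, score_fn):
    """FIG. 14 sketch: try aligning a point of one locus with several
    time points of the other and keep the trial whose matching degree
    K is highest. score_fn(a, b) scores two equal-length loci."""
    best_k, best_shift = float("-inf"), 0
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            a, b = locus_a[shift:], locus_b
        else:
            a, b = locus_a, locus_b[-shift:]
        n = min(len(a), len(b))
        if n == 0:
            continue
        k = score_fn(a[:n], b[:n])  # e.g. negated sum of distances
        if k > best_k:
            best_k, best_shift = k, shift
    return best_shift, best_k
```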
- the moving body 1 may be made to perform a specific preset operation, and the sensor data at that time may be acquired.
- This specific operation is an operation that satisfies the condition under which the relative positional relationship can be calculated and determined, and is an operation that includes a direction change such as turning within a period of a certain length or more. That is, the locus data based on the sensor data obtained in this period includes curved portions (701b, 702b) such as arcs corresponding to the turning, as shown in FIGS. 8 and 9, for example. With such locus data, the above-described matching process is well posed and a solution as an optimum value can be obtained. As a result, situations in which no solution can be obtained or in which obtaining a solution takes a long time can be avoided, so that the calibration can be performed efficiently.
- based on examinations including experiments, the present inventor has confirmed that the relative positional relationship between the sensors 2 can be calculated and determined from trajectory data that includes at least arcs and the like corresponding to turning as described above. From this, the necessary conditions and the specific operations can be specified. As shown in FIG. 10 and the like described above, the matching process described above is effective because the shape of the locus differs depending on the position of the sensor 2 on the moving body 1.
- the mobile system may be provided with a user interface that allows the user to set a specific operation for the calibration. For example, on the display screen of a device such as a PC 101 (FIG. 1) connected to the mobile body 1, a route for the specific operation can be set by the user.
- [Map creation function] FIGS. 15 to 19 are explanatory views relating to the map creation function.
- two map data (42A, 42B) are associated with each other based on the relative positional relationship data 43 (FIG. 5) obtained by the relative positional relationship calculation function.
- FIG. 15 shows a schematic configuration example of an environment 1500 such as a factory to which the moving body 1 is applied in a horizontal plane (XY plane).
- FIG. 15 particularly shows the configuration detected by the sensor 2A in the environment 1500. Based on the sensor data of the sensor 2A at the height position ZA in FIG. 2, the surrounding shapes and the locus are grasped as the first position identification result 41A, and the first map data 42A is created or updated based on them.
- the object 1501 shown in the shaded hatched region is an example of an object detected as a feature point by the sensor 2A of the moving body 1.
- FIG. 15 shows an example of the traveling and locus of the moving body 1.
- In the spatial coordinate system CS (X, Y, Z), the moving body 1 moves straight from the position P1 in the positive direction (for example, south) of the X-axis. From the position P2, the moving body 1 turns left as viewed from the moving body 1. After the left turn, the moving body 1 faces the positive direction (for example, east) of the Y-axis at the position P3, turns further left from the position P3, and comes to face the negative direction (for example, north) of the X-axis.
- the moving body 1 is moving straight up to the position P4 in the negative direction (for example, north) of the X axis.
- the broken line locus 1510 shows a locus with respect to a typical position PM of the moving body 1.
- the solid line locus 1511 shows the locus of the sensor 2A with respect to the position PA.
- the range 1502 shows an example of the emission range by scanning the laser beam from the sensor 2A (position PA).
- the range 1502 shows a case where it is larger than 180 degrees, as in FIG. 3.
- the solid arrow indicates the laser beam.
- the laser beam 1503 hits the object 1501 and is reflected back to the sensor 2A to be detected as a feature point.
- the laser beam 1504 hits a wall or the like of a factory building, is reflected, returns to the sensor 2A, and is detected as a feature point.
- An object in the white area is not detected by the sensor 2A when it is not at the height position ZA.
- the position identification and map creation described above are possible in this manner, and their detailed technical contents are not limited.
- FIG. 16 shows the configuration detected by the sensor 2B in the same environment 1500 as in FIG.
- the locus 1510 of the moving body 1 is the same as that in FIG.
- the solid line locus 1512 is the locus of the position PB of the sensor 2B.
- range 1602 shows an example of an emission range due to scanning of laser light from sensor 2B (position PB).
- the laser beam 1603 hits the object 1601, is reflected, returns to the sensor 2B, and is detected as a feature point. Since this object 1601 is at the height position ZB, it is detected by the sensor 2B.
- objects such as production equipment in factories can have various shapes, and the height may differ for each part.
- since the moving body 1 is provided with a plurality of (two) sensors 2 at different positions, including different height positions, as in the first embodiment, the detection range that each sensor 2 can cover differs, and the object shapes that can be measured differ accordingly.
- with this mobile system, it is therefore possible to create a map that measures and reflects the shape of the environment 1500 in more detail.
- a plurality of map data (42A, 42B) are created as different maps for each sensor 2.
- the plurality of map data can be associated with each other based on the calibration of the relative positional relationship between the sensors 2.
- [Map data] FIGS. 17 and 18 show configuration examples of map data created based on the environment 1500 and the measurements of FIGS. 15 and 16.
- FIG. 17 shows a map 1700 corresponding to the first map data 42A created based on the sensor data of the sensor 2A of FIG.
- line 1701 is a line corresponding to the contour (corresponding feature point group) of the object 1501 in FIG.
- the case is shown where the origin is the first position PA of the sensor 2A at the time the measurement by the moving body 1 is started, the x-axis direction of the sensor 2A is taken as the X-axis, and the y-axis direction orthogonal to it is taken as the Y-axis.
- FIG. 18 shows a map 1800 corresponding to the second map data 42B created based on the sensor data of the sensor 2B of FIG.
- line 1801 is a line corresponding to the contour of the object 1601 in FIG.
- the case is shown where the origin is the first position PB of the sensor 2B at the time the measurement by the moving body 1 is started, the x-axis direction of the sensor 2B is taken as the X-axis, and the y-axis direction orthogonal to it is taken as the Y-axis.
- FIG. 19 shows an example in which the above two map data (42A, 42B) are associated into one and output to the user, using the relative positional relationship data 43 obtained by the above calculation.
- with the coordinate system of the map 1700 as the reference, the map 1800 of the sensor 2B is rotated and superposed on the map 1700 by using the relative positional relationship 105 between the position PA and the position PB.
- the user can refer to the two maps in a state of being associated with each other as one map in this way.
- the map data is configured as, for example, image data, and has information such as a position and the presence / absence of an object for each pixel.
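- A minimal sketch of this association follows, representing each map as a list of occupied points (the patent describes image-like map data; the point representation and names here are assumptions):

```python
import math

def associate_maps(points_b, rel_pos, rel_dir):
    """FIG. 19 sketch: transform the points of the second sensor's map
    (map 1800) into the coordinate system of the first sensor's map
    (map 1700) using the calibrated relative position (dx_opt, dy_opt)
    and relative direction dtheta_opt, so both can be overlaid."""
    dx, dy = rel_pos
    c, s = math.cos(rel_dir), math.sin(rel_dir)
    return [(c * x - s * y + dx, s * x + c * y + dy)
            for x, y in points_b]
```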
- the relative positional relationship calculation function makes it possible to easily obtain the relative positional relationship between the sensors 2 with high accuracy.
- since the relative positional relationship calculation function performs the adjustment automatically as the moving body 1 travels, the time and effort required for the user to manually set the sensors 2 are reduced.
- in the embodiment, the detection direction and the detection range of the sensor 2 are set in the horizontal plane, but the present invention is not limited to this; the same approach can be applied to detection directions and detection ranges other than the horizontal plane.
- the sensor 2 may be a type of sensor capable of three-dimensional positioning or distance measurement including the height direction.
- Although the present invention has been specifically described above based on the embodiments, the present invention is not limited to the above-described embodiments and can be variously modified without departing from the gist thereof.
- As the moving body, an autonomously movable moving body such as an AGV has been described, but the present invention is not limited to this and can also be applied to a moving body operated by a user.
- the moving body is not limited to a vehicle, but may be a ship or a flying object such as a drone.
Abstract
Provided is a technology with which it is possible to determine the relative positional relationship between sensors even when a plurality of sensors are installed in a mobile body and even when their installation positions or the like are changed. This mobile body system comprises: a mobile body 1; a plurality of sensors 2 including a first sensor (2A) and a second sensor (2B) installed at different positions in a mobile body coordinate system CM; and a control device 100 that realizes at least a position-measuring function on the basis of sensor data. The sensors 2 are of a type capable of detecting the position of the sensor within a space coordinate system (CS). When the mobile body 1 moves within an environment 101, the control device 100 identifies the first sensor position and the second sensor position within the space coordinate system CS on the basis of the pieces of sensor data, acquires a first trajectory of the first sensor and a second trajectory of the second sensor in time series and, on the basis of a comparison of the trajectory shapes, calculates a relative positional relationship 105 between the first sensor and the second sensor in the mobile body coordinate system CM, and sets said relationship in the mobile body 1.
Description
The present invention relates to a technique for a mobile system, a technique for measuring the position of a moving body, and the like.
In a mobile system or the like having a function of measuring the position of a moving body (sometimes referred to as a positioning function), a sensor is installed on the moving body. This sensor is a type of sensor that can at least detect the position or calculate the position based on the sensor data. Examples of this sensor include a distance measuring sensor such as a laser scanner, a GPS receiver, and the like. The mobile system can realize a positioning function, a function of creating a map around the mobile, and the like by using the information of the sensor.
Examples of prior art related to this kind of mobile system include JP-A-2017-97402 (Patent Document 1). Patent Document 1 describes, as a "peripheral map creation method" or the like, that a self-position estimation device of a mobile robot creates the latest self-position / attitude data by matching the distance data of a laser range finder (LRF) against a map.
In the mobile system of the prior art example, for example, a positioning system, a map creation system, etc., a plurality of (for example, two) sensors may be installed in one mobile body. The purpose of this installation depends on the details such as the function and application of the mobile system, and examples thereof include securing a wide detection range, calculating one position using a plurality of sensor data, and realizing a redundant configuration.
As an example, in a mobile body system in which the moving body is an automated guided vehicle (AGV), an autonomous traveling robot, or the like, a plurality of distance measuring sensors may be installed on the moving body. The type and shape of the moving body to be applied, the position and direction in which the sensor is installed, and the like vary depending on the application environment such as a factory. The positions and directions of installation of a plurality of sensors may be changed even for the same moving body.
Conventionally, when the installation positions of a plurality of sensors on a moving body are changed, it may be difficult to grasp the relative positional relationship between the sensors. For example, when the installation positions of two sensors on one moving body are fixed, the relative positional relationship between the sensors can be initially set in advance, and there is no problem as long as there is no change. However, when the installation positions of the two sensors on the moving body are changed, it may be difficult to measure the changed relative positional relationship between the sensors with high accuracy, and setting it manually may take considerable time and effort for the user. If the accuracy of the setting of the changed relative positional relationship between the sensors is low, functions such as the positioning function and the map creation function of the mobile system are affected, and the accuracy of those functions may also become low.
An object of the present invention is, with respect to mobile system technology, to provide a technique that can obtain the relative positional relationship between sensors even when a plurality of sensors are installed on a moving body or when their installation positions or the like are changed, and that can thereby improve the accuracy of the positioning function and the like.
A typical embodiment of the present invention has the following configuration. The mobile system of one embodiment includes a moving body, a plurality of sensors including a first sensor and a second sensor installed at different positions in the moving body coordinate system of the moving body, and a control device that realizes a positioning function of measuring at least the position of the moving body in a spatial coordinate system based on a plurality of sensor data from the plurality of sensors. The first sensor and the second sensor are sensors of a type capable of detecting the position of the sensor itself in the spatial coordinate system. When the moving body moves in an environment, the control device identifies the position of the first sensor and the position of the second sensor in the spatial coordinate system based on first sensor data of the first sensor and second sensor data of the second sensor, acquires a first locus of the first sensor and a second locus of the second sensor in time series based on the position identification results, calculates, using the first locus and the second locus and based on comparison and collation of the shapes of the loci, the relative positional relationship between the position of the first sensor and the position of the second sensor in the moving body coordinate system of the moving body, and sets information representing the calculated relative positional relationship in the moving body.
According to a typical embodiment of the present invention, with respect to mobile system technology, the relative positional relationship between sensors can be obtained even when a plurality of sensors are installed on the mobile body or when the installation positions or the like are changed, and the accuracy of the positioning function and the like can be improved.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In principle, the same parts are designated by the same reference numerals in all drawings, and repeated description will be omitted.
(Embodiment 1)
The mobile system according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 19. The mobile system of the first embodiment has a function (sometimes referred to as a relative positional relationship calculation function) capable of automatically adjusting (in other words, calibrating) the relative positional relationship between a plurality of sensors provided on the moving body. Even if the installation positions or the like of the sensors on the moving body are changed, the automatic adjustment by this function allows the installation positions and the like of the sensors to be set with high accuracy and with little effort by the user. Therefore, the positioning function and the like of the mobile system can be maintained with high accuracy.
[Mobile system]
FIG. 1 shows the configuration of the mobile system of the first embodiment. This mobile system is a system applied to an environment 101 such as a factory. In the environment 101 of a factory or the like, for example, production equipment 102 is installed in a building. This mobile system has a moving body 1. In this example, the moving body 1 is an AGV (or an autonomous traveling robot or the like) capable of unmanned transport of luggage 103 such as products or components. The moving body 1 travels along a predetermined route in the factory and supplies, for example, the luggage 103 to the production equipment 102. The moving body 1 includes a control device 100, sensors 2, a moving mechanism 3, a mounting mechanism 4, and the like on a housing 10. The control device 100 is a device that controls the moving body 1.
For the sake of explanation, symbols such as X, Y, and Z may be used to represent the coordinate system and direction. In FIG. 1, the spatial coordinate system CS of the environment 101 is represented by (X, Y, Z). The origin of the spatial coordinate system CS is set at any position. In this spatial coordinate system CS, the moving body 1 of FIG. 1 is arranged so that the front and rear correspond to the X direction, the left and right correspond to the Y direction, and the vertical and height correspond to the Z direction. Further, the coordinate system in the moving body 1 is defined as the moving body coordinate system CM and is represented by (x, y, z). The origin of the moving body coordinate system CM is set to any position, for example, a representative position of the moving body 1.
The moving mechanism 3 is a mechanism including, for example, wheels and a drive unit. The drive unit includes, for example, a motor, a drive circuit, and the like. In this example, the moving mechanism 3 is a mechanism capable of traveling forward and backward and turning left and right by the wheels (FIG. 4 described later), but is not limited to this. The mounting mechanism 4 is a structural portion for stably mounting the luggage 103, and the details are not limited. The mounting mechanism 4 has various types depending on the application and the like, and is, for example, a structural part including a conveyor and the like.
In the example of FIG. 1, the housing 10 of the moving body 1 has, as its shape, a flat plate-shaped first portion 10a parallel to the horizontal plane and a flat plate-shaped second portion 10b standing vertically from a part of the first portion 10a, but is not limited to this. In the first portion 10a, a moving mechanism 3 including two front and rear axles and four front, rear, left and right wheels is provided. A control device 100 is built into the second portion 10b. The control device 100 may instead be installed so as to be exposed to the outside on the second portion 10b or the like. In this example, as a rule regarding the directions seen from the moving body 1, as shown in the figure, the side with the second portion 10b is defined as the front, the side without it as the rear, and the left and right directions are defined with respect to this front-rear direction, but the present invention is not limited to this.
A plurality of sensors 2, in this example two sensors 2 (2A, 2B), are installed on the moving body 1. The two sensors 2 are referred to as the sensor 2A, which is the first sensor, and the sensor 2B, which is the second sensor. In the first embodiment, each sensor 2 is a distance measuring sensor, in particular a two-dimensional laser scanner (sometimes called a laser range finder: LRF or the like). The control device 100 can calculate the position of the sensor 2 from the distance measurement data of the sensor 2. "Two-dimensional" here means that the distance to an object can be detected in a plane (in this example, the horizontal plane) centered on the direction of the sensor 2. The sensor 2, which is a laser scanner, detects and measures distances using objects in each direction around the moving body 1 as feature points. The sensor 2 in this example plays a role like a safety sensor in the function of realizing safe automatic transportation of the moving body 1 in the mobile system.
Each sensor 2 may be of any type as long as the position of the sensor 2 itself in the spatial coordinate system CS (particularly the locus of the position in time series) can be measured by the sensor or calculated by the mobile system, and the details are not limited. In other words, the sensor 2 may be at least a type of sensor with which the positioning function can be realized. The sensor 2 may be a type of sensor 2 that can detect its own position, or a type of sensor 2 whose position can be calculated by the control device 100 or the like from the sensor data of the sensor 2. The positioning function may be realized by the sensor 2 alone, or may be realized in combination with the control device 100 or the like. In the former case, the sensor data output by the sensor 2 includes information on the position and posture of the sensor 2. In the latter case, the control device 100 calculates the position and posture of the sensor 2 based on the sensor data of the sensor 2.
In particular, the sensor 2 is a type of sensor that can detect or calculate the position and orientation of the sensor 2. In other words, the posture of the sensor 2 is a state of direction and rotation. In the first embodiment, since the sensor 2 is a distance measuring sensor, particularly a laser scanner, the control device 100 calculates the position and posture state of each sensor 2 based on the distance measuring data from the sensor 2.
As shown in the figure, the two sensors 2 (2A, 2B) are installed at different positions (sometimes referred to as installation positions) in the moving body coordinate system CM of the moving body 1. In this example, the sensor 2A, which is the first sensor, is installed at a position slightly to the right of the center of the upper surface of the second portion 10b on the front side of the first portion 10a of the housing 10. The sensor 2B, which is the second sensor, is installed at a position near the left corner of the upper surface on the rear side of the first portion 10a.
The plurality of sensors 2 (2A, 2B) installed on the moving body 1 of FIG. 1 are distance measuring sensors having the same function and specifications, but the sensors are not limited to this and may be a plurality of sensors 2 of different types and specifications. The sensor 2 is not limited to a distance measuring sensor such as a laser scanner; an acceleration sensor, a gyro sensor, a geomagnetic sensor, a GPS receiver, or the like may also be used.
[Moving body]
FIG. 2 shows (A) a side view and (B) a top view of the configuration of the moving body 1 of FIG. 1. In the side view (A) (corresponding to the X-Z plane), the position of the sensor 2A (indicated by a black dot) in the moving body coordinate system CM is denoted PA, with its height position denoted ZA, and the position of the sensor 2B is denoted PB, with its height position denoted ZB. The height position of the ground 200, the horizontal plane on which the moving body 1, particularly the wheels of the moving mechanism 3, travels, is Z = 0.
In particular, in the first embodiment, the installation height positions (ZA, ZB) of the two sensors 2 (2A, 2B) are different. The height position ZA of the sensor 2A is above the height position ZB of the sensor 2B (ZA > ZB > 0). As a comparative example, in a configuration where the moving body 1 has one sensor 2, or where the height positions of the plurality of sensors 2 are the same, the positioning function of the mobile system is a function of measuring the position of the moving body in the horizontal plane corresponding to that one height position. Likewise, the map creation function is a function of creating a map representing the shapes of objects in the horizontal plane corresponding to that one height position. In contrast, the positioning function in the first embodiment is a function of measuring the position of the moving body 1 based on the positions of the sensors 2 in the two horizontal planes corresponding to the two height positions. Likewise, the map creation function (described later) is a function of creating maps representing the shapes of objects in the two horizontal planes corresponding to the two height positions.
In the top view (corresponding to the XY plane) of (B), the coordinate system CA of the sensor 2A is shown by (x, y) with the position PA of the sensor 2A as a reference / origin. The x-axis is the installation direction of the sensor 2A. Similarly, the coordinate system CB of the sensor 2B is indicated by (x, y) with the position PB of the sensor 2B as a reference / origin. The x-axis is the installation direction of the sensor 2B. The sensor 2A and the sensor 2B each perform detection within a range of a predetermined angle (FIG. 3 described later) about the x-axis. The coordinate system (CA, CB) of each sensor 2 is (x, y, z) in three dimensions.
The sensor 2 has a direction (sometimes referred to as an installation direction) that serves as a reference for installation and detection. The installation direction of the sensor 2A is also shown as θA, and the installation direction of the sensor 2B is also shown as θB. The direction of the sensor 2 is a reference direction when emitting laser light. In this example, the direction of the sensor 2A is the forward direction on the X-axis and the x-axis direction on the coordinate system CA. The direction of the sensor 2B is a direction different from the X-axis due to the relative relationship of the angle Δθ, substantially a direction diagonally to the left of the rear, and is an x-axis direction in the coordinate system CB.
The installation direction of the sensor 2 may change along with the installation position. The relative positional relationship calculation function in the first embodiment is a function of calculating the relative positional relationship 105 including the relationship between the installation position and the direction between the sensors 2.
The position PA of the sensor 2A and the position PB of the sensor 2B can be expressed as coordinate values (xA, yA, zA) and (xB, yB, zB) in the moving body coordinate system CM (x, y, z). In the spatial coordinate system CS, the position PA of the sensor 2A and the position PB of the sensor 2B are expressed by other coordinate values.
Of the relative positional relationship 105 between the two sensors 2 (2A, 2B), the relationship of the position coordinates in particular is represented by the illustrated values Δx, Δy, Δz. The values Δx, Δy, Δz are the difference values of the position PB (xB, yB, zB) of the origin of the coordinate system CB of the sensor 2B with respect to the position PA (xA, yA, zA) of the origin of the coordinate system CA of the sensor 2A.
In this example, in obtaining the relative positional relationship 105, the relationship between the position and direction of the second sensor 2B with respect to the first sensor is obtained with reference to the sensor 2A which is the first sensor. This is also true in reverse. In FIG. 2, the relative positional relationship 105, particularly the relationship of the position coordinates, is also represented as a vector vAB.
Further, of the relative positional relationship 105 between the two sensors 2 (2A, 2B), the relationship of particular direction is represented by the illustrated value Δθ. The value Δθ is a difference value with respect to the direction (θA, θB) of the sensor 2 (2A, 2B). The direction θA of the sensor 2A is the positive direction (angle is 0 degrees) of the x-axis of the coordinate system CA. The direction θB of the sensor 2B is the positive direction (angle is 0 degrees) of the x-axis of the coordinate system CB. As shown in the figure, the value Δθ representing the directional relationship is an angle from the direction θA of the sensor 2A to the direction θB of the sensor 2B, and Δθ = θB−θA.
Further, of the relative positional relationship 105 between the two sensors 2 (2A, 2B), the directional relationship in particular is represented by the illustrated value Δθ. The value Δθ is a difference value relating to the directions (θA, θB) of the sensors 2 (2A, 2B). The direction θA of the sensor 2A is the positive direction of the x-axis of the coordinate system CA (angle 0 degrees). The direction θB of the sensor 2B is the positive direction of the x-axis of the coordinate system CB (angle 0 degrees). As illustrated, the value Δθ representing the directional relationship is the angle from the direction θA of the sensor 2A to the direction θB of the sensor 2B, and Δθ = θB − θA.
[Sensor]
There are restrictions on the positions and directions in which the plurality of (two) sensors 2 can be installed on the moving body 1. The plurality of sensors 2 cannot be installed at the same position, and even if they are installed close together, their exact positions differ. Further, the plurality of sensors 2 are installed at vacant selected positions so as not to interfere with other parts of the moving body 1, for example, the moving mechanism 3 and the mounting mechanism 4. Further, the plurality of sensors 2 are installed at selected positions so as not to impair the functions of the sensors 2 themselves, for example, so that the laser light is not blocked.
The installation position and direction of each sensor 2 are selected so as to form a predetermined detection range as shown in the example of FIG. At that time, depending on the shape of the moving body 1, it is necessary to select an appropriate installation so that, for example, the laser beam is not blocked by a portion such as the mounting mechanism 4. The shape and the like of the moving body 1 are variously different depending on the application environment and the application, but the installation position and direction of the sensor 2 can be changed according to them. Further, even after the sensor 2 is installed at a certain position with the intention of being fixed, it may deviate from the intention of the user and a deviation from that position may occur. For example, it is conceivable that the sensor 2 hits something and the position is slightly shifted.
As described above, the installation positions and directions of the plurality of sensors 2 on the moving body 1 are determined so as to form the predetermined detection ranges (FIG. 3) and the like in accordance with the application environment, use, and functions (map creation function, etc.) of the mobile system, the type and shape of the moving body 1, user operations, and the like, and may be changed as appropriate. The relative positional relationship calculation function in the mobile system of the first embodiment can automatically adjust the relative positional relationship 105 between the sensors 2, obtaining it easily and with high accuracy, in response to such changes in the installation state of the sensors 2. This relative positional relationship calculation function calculates the relative positional relationship 105 by a mechanism that matches the shapes of the loci of the sensors 2, as described later (FIG. 7). The mobile system can then realize the positioning function, the map creation function, and the like with high accuracy based on the resulting highly accurate relative positional relationship 105.
In particular, in the first embodiment, as a constraint on the installation of the sensors 2 or as an intentional design, the height positions (ZA, ZB) of the sensors 2 are different, as shown in FIG. 2 and the like. Consequently, the maps that can be created by the map creation function described later are, if the conventional technology is followed, basically a plurality of (two) different maps corresponding to the different height positions of the respective sensors 2. When there are a plurality of different maps in this way and the relative positional relationship 105 is unknown or of low accuracy, the relationship between those maps is also unknown or of low accuracy. Therefore, it may be difficult for the user or the moving body 1 to grasp from those maps the same object in the environment 101 of FIG. 1, such as the shape of a building or the production equipment 102.
In the first embodiment, therefore, the map creation function can create the map around the moving body 1 with high accuracy based on the highly accurate relative positional relationship 105 obtained by the relative positional relationship calculation function. In addition, this mobile body system uses the relative positional relationship 105 to associate the plurality of (two) map data created for the different height positions of the sensors 2 with each other. Those maps can thereby be handled in an integrated manner, roughly as one map, which makes it easier to grasp the shape of the same object in the environment 101, among other effects.
[Sensor detection range]
FIG. 3 shows a configuration example of the detection ranges of the sensors 2. A sensor 2 emits laser light from its installation position (PA, PB) while scanning each direction of the surroundings (the horizontal plane in this example) within its detection range, takes the point at which the light strikes an object in the environment 101 as a feature point, and receives the laser light returning from that feature point. The sensor 2 then calculates the distance to the feature point in each direction from the time between emission and reception of the laser light, by the so-called TOF (time of flight) method. The sensor data output from the sensor 2, in other words the distance measurement data, contains at least, for each time point in the time series, the angles α and β representing the direction in which the sensor 2 looks at its surroundings and the distance value (d) for that angle.
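As one reading of this measurement model, the sketch below converts one scan of per-direction angles and TOF distances into feature-point coordinates in the sensor's own x-y plane; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def scan_to_points(angles_rad, distances):
    """Convert one scan (angle per direction, TOF distance d per angle) into
    2D feature-point coordinates in the sensor's own coordinate system."""
    a = np.asarray(angles_rad, dtype=float)
    d = np.asarray(distances, dtype=float)
    valid = np.isfinite(d) & (d > 0.0)      # keep only directions with a return
    return np.stack([d[valid] * np.cos(a[valid]),
                     d[valid] * np.sin(a[valid])], axis=1)
```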
FIG. 3(A) schematically shows, in the horizontal plane (X-Y plane), an example of the detection ranges of the sensors 2 corresponding to the configuration of the moving body 1 of FIG. 2. Each sensor 2 takes the horizontal plane, the x-y plane of its sensor coordinate system, as its detection target, in other words its distance measurement target. The detection range 301 is the detection range of the sensor 2A and is defined by the angle range 305 in the horizontal plane. The detection range 302 is the detection range of the sensor 2B and is defined by the angle range 306 in the horizontal plane. In this example, the angle ranges 305 and 306 are greater than 180 degrees.
The detection direction of a sensor 2 is expressed as an angle (α, β) with respect to a reference direction (direction θA, θB) in the horizontal plane. The sensor 2A takes the positive x-axis direction from the position PA as the reference direction at angle α = 0 degrees, and emits laser light in the direction (indicated by the broken line) corresponding to each angle α within the detection range 301. The black dot in that direction is an example of a feature point 303 of an object, at the distance 304 (value d). The actual feature points and detection ranges cover positions farther from the moving body 1; the measurable distance depends on the type of sensor 2 and the like. Similarly, the sensor 2B takes the positive x-axis direction from the position PB as the reference direction at angle β = 0 degrees and emits laser light in the direction corresponding to each angle β. The angles (α, β) in this example can take positive and negative values.
The detection ranges of the sensors 2 differ as illustrated; they may partially overlap, and there may be a partial range around the moving body 1 that cannot be detected. Even such an undetectable range becomes detectable when the moving body 1 changes its position or posture. As illustrated, providing a plurality of sensors 2 secures a wide detection range for the mobile body system.
FIG. 3(B) shows another installation example of the same two sensors 2 (2A, 2B) as in (A), for a moving body 1 having a shape different from that of FIG. 2. This moving body 1 has no second portion 10b as in FIG. 2, but has a flat-plate-shaped first portion 10a in the horizontal plane and the mounting mechanism 4 on it. On the upper surface of the first portion 10a, the sensor 2A is installed on the front side along the x-axis of the moving body coordinate system CM, at the laterally centered position PA, facing forward in the direction θA. The sensor 2B is installed on the rear side along the x-axis, at the laterally centered position PB, facing rearward in the direction θB. In this example, the two sensors 2 are installed at positions (PA, PB) symmetrical front to rear with respect to the moving body 1, and correspondingly their detection ranges (301, 302) are configured to be symmetrical front to rear.
As in the above example, even for mobile body systems having basically the same functions, the installation positions and directions of the plurality of sensors 2, their detection ranges, and so on may be changed as appropriate. Such changes make it possible to accommodate various environments and uses.
When a sensor 2 is a laser scanner, laser light can be emitted in each of the surrounding directions by a scan that rotationally drives the laser emitting unit, yielding distance information for each direction. The distance information can be converted into position information of the feature points relative to the position of the sensor 2. Because such position information of surrounding objects, as seen from the moving body 1 or the sensor 2, represents the geometric shape of the objects, it is sometimes referred to as shape data.
A camera or the like is also applicable as the sensor 2. Alternatively, a positioning system using sensors installed in the environment rather than on the moving body 1 (for example, RFID tags or beacons) is also applicable. For example, when a stereo camera method is used, the distance to an object can be calculated from the images of the left and right cameras.
[Movement mechanism]
FIG. 4 shows a configuration example of the movement mechanism 3 of the moving body 1 of FIG. 1 as a schematic view in the horizontal plane (X-Y plane). In this example, the movement mechanism 3 enables forward and backward travel, stopping, and turning to the left and right; it has two axles and four wheels and is, for example, a rear-wheel-drive mechanism. It has the left wheel 401 and right wheel 402 on the front axle 410, and the left wheel 403 and right wheel 404 on the rear axle 420. In this mechanism, for example, the speeds of the left and right wheels can be controlled independently, and turning and the like can be controlled through the difference between the left and right wheel speeds.
FIG. 4(A) shows the state during forward travel (in the X direction). The position PM indicates the center point in the front-rear and left-right directions, as an example of a representative position of the moving body 1 in the moving body coordinate system CM. The position PM1 is the lateral center point of the front axle 410, and the position PM2 is the lateral center point of the rear axle 420. In this state, forward travel is realized by driving all wheels at the same rotational speed. The broken-line locus 431 indicates the locus of subsequent forward travel from the current position PM.
FIG. 4(B) shows the state during a turn to the right. In this example, such a right-turning operation is realized by controlling the rotational speed of the right wheels 402 and 404 to be lower than that of the left wheels 401 and 403. The broken-line locus 432 indicates the locus of a subsequent right turn from the current position PM.
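As one standard way to realize the turning behavior just described, the following differential-drive pose update shows how a left/right wheel-speed difference produces a turn while equal speeds give straight travel. This is a textbook unicycle model with illustrative names, not an implementation spelled out in the patent.

```python
import math

def step_pose(x, y, theta, v_left, v_right, track_width, dt):
    """Advance the body pose by one time step dt.
    v_left == v_right  -> straight travel as in FIG. 4(A);
    v_right < v_left   -> a right (clockwise) turn as in FIG. 4(B)."""
    v = 0.5 * (v_left + v_right)               # forward speed of the body
    omega = (v_right - v_left) / track_width   # yaw rate from the speed difference
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```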
The movement mechanism 3 is not limited to this example and may be any mechanism capable of traveling and changing direction. It may be a mechanism with fixed wheel directions, one with steerable wheels, or one using structures other than axles and wheels, such as casters, crawler tracks, or leg structures. It may also be a mechanism such as is used in cleaning robots, for example one in which the direction and rotational speed of each wheel can be controlled independently. The direction-changing operation is not limited to a turning operation with an arc-shaped locus as illustrated.
Note that the representative position of the moving body 1 in the moving body coordinate system CM (the position PM in FIG. 4) and the installation positions of the sensors 2 are independent concepts. Likewise, the position of a sensor 2 and the like in the spatial coordinate system CS is a different concept from its position and the relative positional relationship 105 in the moving body coordinate system CM. The relative positional relationship calculation function of the mobile body system determines the relative relationship of the installation positions and installation directions of the plurality of sensors 2 in the moving body coordinate system CM.
The representative position of the moving body 1 (the position PM in FIG. 4) can be defined in advance, and the manner of definition is not limited. It may be defined using the positions (PA, PB) of the sensors 2 (2A, 2B); for example, it may coincide with the position PA of one specific sensor 2, such as the sensor 2A, or be the midpoint between the two sensors 2 (2A, 2B). It may be a predetermined position in the shape of the housing 10 or the like, for example the center position or the midpoint of an axle (for example, the positions PM1 and PM2). It may also be a position in a predetermined relative relationship (direction and distance) to the position of a sensor 2.
[Functional block configuration]
FIG. 5 shows the functional block configuration of the mobile body system of the first embodiment. The moving body 1 of this system includes the control device 100, the two sensors 2 (2A, 2B), the movement mechanism 3, and the like. The control device 100 includes the position identification device 5 and the movement mechanism control device 6. The position identification device 5 is implemented by, for example, a microcomputer, and the movement mechanism control device 6 by, for example, a PLC (programmable logic controller). In this example, the position identification device 5 and the movement mechanism control device 6 are implemented integrally as the control device 100, but this is not restrictive. The control device 100 may also include a portion that drives and controls the operation of the mounting mechanism 4.
The position identification device 5 has the positioning function (in other words, the position and posture estimation function), the automatic conveyance control function, the map creation function, the relative positional relationship calculation function, and the like (FIG. 6). The position identification device 5 realizes each unit, such as the sensor control unit 51, based on program processing by the processor 601 of FIG. 6. Its units include the sensor control unit 51, the position identification unit 52, the map creation unit 53, the data storage unit 54, and the adjustment unit 55. The data storage unit 54 stores data and information such as the first position identification result 41A and the second position identification result 41B as position identification result data, the first map data 42A and the second map data 42B as map data, and the relative positional relationship data 43.
The sensor control unit 51 includes the first sensor control unit 51A and the second sensor control unit 51B. The first sensor control unit 51A controls the sensor 2A and obtains sensor data SDA from it; the second sensor control unit 51B controls the sensor 2B and obtains sensor data SDB from it. The sensor data SDA and SDB contain the distance measurement data for each time point in the time series, that is, the distance information for each angle representing a direction. The sensor control unit 51 holds the time-series sensor data, covering at least a certain length of time, in memory.
The position identification unit 52 is an element constituting the positioning function (in particular, the position and posture estimation function) and identifies the position and posture of each sensor 2 in the spatial coordinate system using the sensor data. It includes the first position identification unit 52A and the second position identification unit 52B. The first position identification unit 52A estimates the position and posture of the sensor 2A based on the sensor data SDA and outputs the result as the first position identification result 41A. Similarly, the second position identification unit 52B estimates the position and posture of the sensor 2B based on the sensor data SDB and outputs the result as the second position identification result 41B.
As a processing example, the position identification unit 52 creates, from the distance measurement data that is the sensor data, shape data representing the geometric shape of the surrounding objects, and compares and collates that shape data with the existing map data in the data storage unit 54. From the result of the collation, the position identification unit 52 estimates the position and posture of the sensor 2 in the spatial coordinate system.
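One concrete way to score such a comparison is to count how many scan points land on occupied cells of the existing map. The occupancy-grid representation and names below are assumptions; the patent only describes comparing shape data with map data.

```python
import numpy as np

def match_score(points_world, grid, origin, resolution):
    """Count scan points that fall on occupied (value 1) cells of a 2D
    occupancy grid; a higher count means better agreement with the map."""
    ij = np.floor((points_world - origin) / resolution).astype(int)  # (col, row)
    shape_xy = np.array(grid.shape)[::-1]       # (cols, rows)
    inside = ((ij >= 0) & (ij < shape_xy)).all(axis=1)
    ij = ij[inside]
    return int(grid[ij[:, 1], ij[:, 0]].sum())
```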
The map creation unit 53 is an element constituting the map creation function and creates and updates the map data of the environment 101 (FIG. 1) using the processing results of the position identification unit 52. Using the shape data created by the position identification unit 52, it creates new map data or updates existing map data. It includes the first map creation unit 53A and the second map creation unit 53B. The first map creation unit 53A creates the first map data 42A using the sensor data SDA and the first position identification result 41A; similarly, the second map creation unit 53B creates the second map data 42B using the sensor data SDB and the second position identification result 41B. The first map data 42A represents the shapes of the objects around the moving body 1 in the horizontal plane at the height position ZA of FIG. 2, and the second map data 42B represents those in the horizontal plane at the height position ZB.
The data storage unit 54 temporarily stores the data created by the above processing, such as the first position identification result 41A, the second position identification result 41B, the first map data 42A, and the second map data 42B.
The adjustment unit 55 is the element constituting the relative positional relationship calculation function, in other words the relative positional relationship calculation unit. Referring to the data (41A, 41B, 42A, 42B) in the data storage unit 54, it calculates the relative positional relationship between the sensors 2 (the relative positional relationship 105 in FIGS. 1 and 2). This processing is, in other words, a calibration between the sensor coordinate systems, and concerns the setting of the position coordinates and the like of each sensor 2 in the moving body coordinate system CM. When the adjustment unit 55 obtains the latest relative positional relationship by the calculation described later (FIG. 7), it stores it in the data storage unit 54 as the relative positional relationship data 43. This corresponds to the latest setting related to the sensors 2, the positioning function, and so on, in other words an automatic settings update.
The relative positional relationship data 43 is data including sensor relative coordinate information and the like. Specifically, it contains the values (Δx, Δy) representing the positional relationship and the value (Δθ) representing the directional relationship, which express the relative positional relationship 105 as in FIG. 2.
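A minimal sketch of how the relative positional relationship data 43 might be laid out in code, assuming only the three quantities named above; the field names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class RelativePosition:
    """Relative positional relationship 105 of sensor 2B seen from sensor 2A
    in the moving body coordinate system CM."""
    dx: float      # Δx: offset along the body x-axis [m]
    dy: float      # Δy: offset along the body y-axis [m]
    dtheta: float  # Δθ: difference of installation directions [rad]
```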
The position identification device 5 may output the relative positional relationship data 43 and the like in the data storage unit 54 to the user, for example as a display. For example, it may provide a settings screen for the sensors 2, for example in the form of a Web page, display relative positional relationship information based on the relative positional relationship data 43 on that screen, and allow confirmation and manual setting by the user. In that case, the relative positional relationship information may be displayed, for example, as the values (Δx, Δy, Δθ), or graphically together with an image representing the external configuration of the moving body 1 as in FIG. 2. For example, an operator can set initial values of the relative positional relationship of the sensors 2 in advance on the settings screen of the position identification device 5. After this initial setting, the relative positional relationship data 43 can be updated automatically, without manual setting by a person, while the relative positional relationship calculation function is enabled.
The mobile body system may further take a form in which a device such as a PC (the PC 110 in FIG. 1) is communicatively connected to the control device 100 of the moving body 1. Such a device is provided with an OS, application programs, and the like. Examples of such application programs include those that perform user setting processing for the mobile body system and processing related to the positioning function, the automatic conveyance function, the map creation function, and the like, and those that support, for example, a function for setting automatic conveyance routes or a function for the user to browse maps. The user can operate the PC or similar device and use those functions on its display screen, and can also confirm the relative positional relationship data 43 there.
The adjustment unit 55 or the map creation unit 53 of the position identification device 5 further performs, using the relative positional relationship data 43, processing that associates the plurality of (two) map data (the first map data 42A and the second map data 42B) with each other. The plurality of (two) map data can thereby be handled roughly as one map data as a whole. The position identification device 5 may also create one map data from the plurality of (two) map data, for example by composition, and the user can browse that one map data on the display screen of the PC 110.
The movement mechanism control device 6 is a portion including a drive control circuit and the like, and controls the operation of the movement mechanism 3 using the position identification result data, the map data, and the like from the position identification device 5. It includes the position identification device control unit 61 and the movement mechanism control unit 62. The position identification device control unit 61 communicates with the position identification device 5 and acquires the data necessary for control from it. The movement mechanism control unit 62 controls the traveling and turning operations of the movement mechanism 3 based on the position of the moving body 1 grasped by the positioning function and the map data created by the map creation function.
[Software and hardware]
FIG. 6 shows an implementation configuration example, including software and hardware, of the position identification device 5 of FIG. 5. The position identification device 5 includes the processor 601, the memory 603, the auxiliary storage device 605, the communication interface device 607, the input/output interface device 608, the power supply device 609, and the like, which are connected to one another through a bus or the like. Although not illustrated, the moving body 1 may also be provided with a mechanism such as an operation unit for the user to operate it.
The processor 601 is composed of, for example, a CPU, ROM, RAM, and the like; in other words, it is a controller. The processor 601 and the like may also be implemented by a programmable hardware circuit such as an FPGA. The processor 601 reads a program stored in the auxiliary storage device 605 or the like into the memory 603, expands it, and executes processing according to the program. Each unit of FIG. 5, such as the position identification unit 52, is thereby realized as an execution module. The memory 603 stores the control program 630, the processing data 640 handled by the processor 601, and the like. The control program 630 includes the sensor control program 631, the position identification program 632, the map creation program 633, the adjustment program 635, and the like, which realize the sensor control unit 51, the position identification unit 52, the map creation unit 53, and the adjustment unit 55 of FIG. 5. The processing data 640 includes data such as the map data, the position identification results, and the relative positional relationship data (corresponding to the data of the data storage unit 54 in FIG. 5).
The auxiliary storage device 605 is composed of a nonvolatile memory, a storage device, for example a storage medium such as a disk or a memory card, or a DB server on a communication network, and stores programs and various data in advance. It stores, for example, the map data 651, the sensor position identification result data 652, and the sensor relative positional relationship data 653. The processor 601 reads data from the auxiliary storage device 605 into the memory 603 and writes data from the memory 603 back into the auxiliary storage device 605 as necessary. The map data 651 is map data of the environment; it may be a map database (DB) storing a plurality of map data, and includes the map data corresponding to the first map data 42A and the second map data 42B of FIG. 5, each composed of, for example, an image. The sensor position identification result data 652 is the position identification result data of each sensor 2 (2A, 2B); it corresponds to the first position identification result 41A and the second position identification result 41B of FIG. 5 and includes information representing the position and posture in the environment at each time point. The sensor relative positional relationship data 653 represents the relative positional relationship between the sensors 2 (2A, 2B); in other words, it is the setting data for the automatic adjustment of the sensors 2 and corresponds to the relative positional relationship data 43 of FIG. 5.
The communication interface device 607 performs communication processing, according to the respective communication interface, with the sensors 2, with the movement mechanism control device 6, and with external devices such as a PC or a server. The communication interface may be wired or wireless, and short-range or remote. The input/output interface device 608 can be connected to input devices (for example, a keyboard) and output devices (for example, a display device) and performs processing according to the interface of each device; input and output devices may also be mounted on the moving body 1. The power supply device 609 is composed of a battery or the like and supplies electric power to each part.
As functions realized by program processing and the like, the processor 601 has at least the aforementioned positioning function (in other words, the position and posture estimation function), the automatic conveyance control function, the map creation function, and the relative positional relationship calculation function. The positioning function measures the position of the moving body 1 based on the sensors 2. The position and posture estimation function estimates the position and posture of each sensor 2. These two functions are realized mainly by the position identification unit 52 of FIG. 5. The automatic conveyance control function controls the automatic conveyance of the moving body 1, for example making it travel along a route set in the environment 101 (FIG. 1) without colliding with surrounding objects. The map creation function creates and updates the map of the environment 101 based on movement within it and on the sensors 2, and is realized mainly by the map creation unit 53 of FIG. 5. The relative positional relationship calculation function calculates the relative positional relationship 105 between the sensors 2 (FIG. 1 and elsewhere) and performs automatic adjustment, and is realized mainly by the adjustment unit 55 of FIG. 5.
[Functions]
Each function of the mobile body system is described in more detail below. With the positioning function, or position and posture estimation function, the control device 100 of FIG. 1 and elsewhere measures or estimates the position, posture, and so on of the moving body 1 in the spatial coordinate system CS of the environment 101 by calculation using the sensor data from the two sensors 2. The control device 100 uses the locus of the position of each sensor 2 in the spatial coordinate system CS, grasped by this function, in the calculation of the relative positional relationship calculation function.
The automatic conveyance control function is realized using the positioning function or the position and posture estimation function. Based on the position and posture of the moving body 1 grasped by those functions, the moving body 1 controls suitable automatic conveyance, for example safe conveyance along a route. Because this mobile body system includes a plurality of (two) sensors 2, the overall detection range can be widened and positions and the like can be detected stably by the positioning function, with the result that suitable automatic conveyance can be realized.
Furthermore, this mobile body system may have not only the positioning function but also other functions configured using the plurality of sensors 2; in the first embodiment it has the map creation function. Together with the position and posture estimation function, the map creation function constitutes a so-called SLAM (Simultaneous Localization and Mapping) function. SLAM is a method in which a moving body such as an unmanned conveyance robot estimates its own position and posture based on sensing of the surrounding situation while simultaneously creating and updating a map of the surroundings. That is, the map creation function can automatically create or update a map of the surroundings of the moving body 1 in the environment 101 based on the sensor data of the sensors 2 acquired as the moving body 1 travels. In this example, corresponding to the type of the sensors 2 (two-dimensional laser scanners), the map is a map in the horizontal plane and is configured as image data representing the shapes of the objects in the environment 101. Because this mobile body system includes a plurality of (two) sensors 2, map creation by the map creation function can be performed more suitably.
The relative positional relationship calculation function calculates and automatically sets the relative positional relationship 105 between the sensors 2 in the moving body coordinate system CM based on the sensor data of the two sensors 2 (2A, 2B). The moving body 1 automatically enables this function, for example during normal traveling, and automatically adjusts the relative positional relationship 105.
[SLAM function]
A configuration example of the SLAM function in the moving body 1 is described below as a supplement. When moving within the environment 101 (FIG. 1), for example during automatic conveyance along a set route, the control device 100 detects and measures the surrounding objects with the sensors 2. Based on the distance measurement data that is the sensor data from a sensor 2, the control device 100 creates shape data representing the shapes of the objects around the moving body 1 relative to the position of that sensor 2. By comparing and collating that shape data with the stored existing map data, the control device 100 estimates the current position and posture of the moving body 1 within the environment 101 (the corresponding map data) and takes them as the position identification result. This estimation is, for example, a process of evaluating and judging the degree of match or similarity between the shape data and the map data within a predetermined search range; that degree can be evaluated, for example, from the number of overlapping pixels.
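A brute-force sketch of this estimation: search a small window of poses around the previous estimate and keep the pose whose scan best overlaps the map. It reuses the match_score sketch given earlier; the window sizes and step widths are illustrative assumptions.

```python
import numpy as np

def identify_pose(points_sensor, grid, origin, resolution, pose0,
                  span=0.2, step=0.05, ang_span=0.1, ang_step=0.05):
    """Search (x, y, theta) near pose0 for the placement of the scan that
    best overlaps the existing map (see match_score above)."""
    x0, y0, th0 = pose0
    best_k, best_pose = -1, pose0
    for dx in np.arange(-span, span + 1e-9, step):
        for dy in np.arange(-span, span + 1e-9, step):
            for dth in np.arange(-ang_span, ang_span + 1e-9, ang_step):
                c, s = np.cos(th0 + dth), np.sin(th0 + dth)
                # rotate sensor-frame points into the space frame, then translate
                pw = points_sensor @ np.array([[c, s], [-s, c]]) \
                     + np.array([x0 + dx, y0 + dy])
                k = match_score(pw, grid, origin, resolution)
                if k > best_k:
                    best_k, best_pose = k, (x0 + dx, y0 + dy, th0 + dth)
    return best_pose
```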
The control device 100 controls suitable movement while estimating the current position and posture of the moving body 1 for each section of the route. From the current position and posture, it determines the next target position on the route and controls the movement mechanism 3 to move toward that target position, while keeping the relationship between the surrounding objects and the current position and posture of the moving body 1 suitable.
Further, while estimating the position and posture as above, the control device 100 creates and registers new map data using the created shape data, or updates the existing map data with it. The control device 100 repeats this kind of local planning processing in the same way for each section of the route.
[Processing flow]
FIG. 7 shows the flow of the main processing (in particular, the relative positional relationship calculation) by the position identification device 5 of the control device 100 of FIG. 5, especially its adjustment unit 55. The flow of FIG. 7 has steps S1 to S5. As a premise, based on the configuration of FIG. 5, the adjustment unit 55 receives as input the data of the time-series position identification results (41A, 41B) accumulated over at least a certain length of time as the moving body 1 travels; in other words, locus data. Each position identification result includes information on the position and the posture (an angle representing the direction) of the sensor 2 within the spatial coordinate system CS.
In step S1, the adjustment unit 55 generates relative position parameters 700 (Δx, Δy). A relative position parameter 700 expresses the relationship between the positions (PA, PB) of the sensors 2 (2A, 2B) in the moving body coordinate system CM of the moving body 1 of FIG. 2 and elsewhere. Taking, for example, the position PA of the sensor 2A as a reference, it has the difference value Δx in the x-axis direction and the difference value Δy in the y-axis direction with respect to the position PB of the sensor 2B. In the first embodiment, the positions in the z-axis direction (ZA, ZB) and their difference (Δz) are treated as fixed setting values and excluded from the calculation. The relative position parameter 700 has a value for each time point in the time series (t = 1, ..., k, ..., T) and is expressed, for example, as {(Δx1, Δy1), ..., (Δxk, Δyk), ..., (ΔxT, ΔyT)}, where the subscript k denotes some time point and the subscript T the last time point.
In step S2, with respect to the locus 701 (sometimes referred to as the first locus) of the first position identification result 41A of the sensor 2A, which is the first sensor, the adjustment unit 55 generates, for each relative position parameter 700 of step S1, a locus 703 (sometimes referred to as a temporary locus) of a "temporary position" (denoted VPB) of the sensor 2B, which is the second sensor. This "temporary position" sets a provisional position of the sensor 2B for the matching processing and corresponds, at each time point (t), to the position reached from the position PA of the sensor 2A along the vector given by the relative position parameter 700 (Δx, Δy). Temporary positions VPB are generated as a plurality of candidates (VPB1, ..., VPBn). The locus 701 of the first position identification result 41A of the sensor 2A is expressed as time-series data, for example {(xA_1, yA_1), ..., (xA_k, yA_k), ..., (xA_T, yA_T)}. The locus 703 of the temporary positions of the sensor 2B is expressed as time-series data, for example {(vxB_1, vyB_1), ..., (vxB_k, vyB_k), ..., (vxB_T, vyB_T)}.
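A sketch of this generation step follows. The patent describes each temporary position as the point offset from PA by the vector (Δx, Δy); the code below additionally rotates that body-frame offset by sensor 2A's identified heading θA_t at each time step, which is an assumption, but the natural reading that makes the arc radius of the temporary locus depend on the offset, as FIG. 10 later illustrates.

```python
import numpy as np

def temporary_locus(traj_a, theta_a, dxy):
    """Step S2: from sensor 2A's locus 701 (T x 2 array) and headings
    theta_a (length-T array of θA_t), generate the locus 703 of temporary
    positions VPB for one candidate relative position parameter (Δx, Δy)."""
    dx, dy = dxy
    c, s = np.cos(theta_a), np.sin(theta_a)
    vx = traj_a[:, 0] + c * dx - s * dy     # offset rotated into the space frame
    vy = traj_a[:, 1] + s * dx + c * dy
    return np.stack([vx, vy], axis=1)
```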
In step S3, the adjustment unit 55 performs matching processing between the locus 703 of the temporary positions VPB generated in step S2 and the locus 702 (sometimes referred to as the second locus) of the second position identification result 41B of the sensor 2B. This matching processing evaluates and judges the degree of match or similarity between the shapes of the loci. As described in detail later, it handles a "degree of match" (denoted K) as an evaluation value expressing the degree of agreement between the two locus data, and uses a rotation parameter (denoted R) concerning the orientation of the loci. For each candidate relative position parameter 700, the adjustment unit 55 stores the calculated degree of match K and rotation parameter R in memory. The locus 702 of the second position identification result 41B is expressed as time-series data, for example {(xB_1, yB_1), ..., (xB_k, yB_k), ..., (xB_T, yB_T)}.
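The patent states only that a degree of match K and a rotation parameter R are computed between the two loci; one way to realize that is a 2D orthogonal-Procrustes fit, where centering removes any translation between the two sensors' map frames, an SVD yields the best aligning rotation, and the residual error is turned into a score. This specific method is an assumption, not the patent's prescription.

```python
import numpy as np

def match_loci(temp_locus, locus_b):
    """Step S3: shape-match the temporary locus 703 against sensor 2B's
    locus 702. Returns (K, R): degree of match and aligning rotation [rad]."""
    va = temp_locus - temp_locus.mean(axis=0)   # remove translation between frames
    vb = locus_b - locus_b.mean(axis=0)
    u, _, vt = np.linalg.svd(va.T @ vb)         # cross-covariance -> Kabsch fit
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against reflections
    rot = vt.T @ np.diag([1.0, d]) @ u.T        # rotation matrix mapping va onto vb
    R = float(np.arctan2(rot[1, 0], rot[0, 0]))
    resid = np.mean(np.sum((va @ rot.T - vb) ** 2, axis=1))
    K = 1.0 / (1.0 + resid)                     # higher K = better shape agreement
    return K, R
```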
In step S4, from the results of step S3, the adjustment unit 55 determines and extracts the relative position parameter 700 (Δx, Δy) corresponding to the pair of loci with the highest degree of match K as the optimum relative position 704 (Δx_opt, Δy_opt), and stores it in memory. At this point, the positional relationship (Δx, Δy) within the relative positional relationship 105 of FIG. 1 and elsewhere has been grasped.
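Putting steps S1 to S4 together, a brute-force search over the candidates keeps the parameter with the highest degree of match K; this reuses the temporary_locus and match_loci sketches above.

```python
def optimum_relative_position(traj_a, theta_a, locus_b, candidates):
    """Steps S1-S4: score every candidate (Δx, Δy) and return the best one
    together with its rotation parameter R for use in step S5."""
    best = (-1.0, None, 0.0)                    # (K, (Δx, Δy), R)
    for dxy in candidates:
        K, R = match_loci(temporary_locus(traj_a, theta_a, dxy), locus_b)
        if K > best[0]:
            best = (K, dxy, R)
    _, dxy_opt, R_opt = best
    return dxy_opt, R_opt                       # (Δx_opt, Δy_opt) and R
```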
In step S5, the adjustment unit 55 further calculates the optimum value of the directional relationship (Δθ) between the sensors 2. Using the optimum relative position 704 (Δx_opt, Δy_opt) of step S4, it generates the quantity {θB_t + R − θA_t}, which expresses the relationship (Δθ) of the directions (θ). Here, the angle θB_t, included in the second position identification result 41B, represents the posture of the sensor 2B at each time point (t), and the angle θA_t, included in the first position identification result 41A, represents the posture of the sensor 2A at each time point (t). The addition "+R" of the rotation parameter R is an operation for aligning with the coordinate system CA of the first sensor (the sensor 2A) when determining the directional relationship, and "−θA_t" is the operation taking the difference between the angle θB_t and the angle θA_t.
The adjustment unit 55 then takes the average of the quantities {θB_t + R − θA_t} generated for the time points, determines it as the optimum relative direction 705 (Δθ_opt), and stores it in memory. This average is expressed as Σ{θB_t + R − θA_t}/T, where Σ denotes the sum over the time points t = 1 to T. The optimum relative direction 705 (Δθ_opt) is the optimum value of the directional relationship (Δθ) between the sensors 2, expressed as an angle difference. Step S5 shows one example of calculating the directional relationship, and the calculation is not limited to this.
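Step S5 amounts to a few lines of code. Wrapping each term to (−π, π] before averaging is an added safeguard against angle wrap-around that the patent does not mention.

```python
import numpy as np

def optimum_relative_direction(theta_a, theta_b, R):
    """Step S5: Δθ_opt = mean over t of (θB_t + R − θA_t)."""
    d = np.asarray(theta_b) + R - np.asarray(theta_a)
    d = np.arctan2(np.sin(d), np.cos(d))        # wrap each difference to (−π, π]
    return float(np.mean(d))
```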
The adjustment unit 55 saves the information obtained as above, including the optimum relative position 704 (Δx_opt, Δy_opt) and the optimum relative direction 705 (Δθ_opt), as the relative positional relationship data 43 of FIG. 5. This relative positional relationship data 43 corresponds to the values (Δx, Δy, Δθ) expressing the relative positional relationship 105 of FIG. 2.
[Locus]
FIG. 8(A) shows, in the horizontal plane, an example of the movement of the moving body 1 and the resulting loci. In this example, the moving body 1 initially faces forward (the X direction in the spatial coordinate system CS, the x direction in the moving body coordinate system CM) at the position P1 at time point t1 (here the representative position PM of FIG. 4 is used), and travels forward to the position P2 at time point t2. From the position P2 the moving body 1 turns right, passing through the position P3 at time point t3, so that at the position P4 at time point t4 it faces to the right. The locus 800 shown by the broken line is the locus of this movement of the moving body 1 (position PM), specifically the locus traced by the position PM2 of the rear axle 420 of FIG. 4. The locus 701 shown by the solid line is the locus (first locus) traced by the position PA of the front sensor 2A during this movement.
FIG. 8(B) shows only the locus 701 of the sensor 2A (position PA) extracted from (A). This locus 701 includes a straight portion 701a during forward straight travel, a curved portion 701b (in other words, an arc) during the right turn, and a straight portion 701c during straight travel to the right. The first position identification unit 52A of FIG. 5 obtains such locus data based on the sensor 2A. This locus data has, for each time point in the time series, position coordinates in the horizontal plane at the height position ZA of FIG. 2. Although the locus is drawn as a line, in detail it is a point cloud.
FIG. 9(A) corresponds to FIG. 8(A), with the same locus 800 of the moving body 1. The locus 702 shown by the solid line is the locus (second locus) traced by the position PB of the rear sensor 2B during this movement.
FIG. 9(B) shows only the locus 702 of the sensor 2B (position PB) extracted from (A). This locus 702 includes a straight portion 702a during forward straight travel, a curved portion 702b (in other words, an arc) during the right turn, and a straight portion 702c during straight travel to the right. The second position identification unit 52B of FIG. 5 obtains such locus data based on the sensor 2B. This locus data has, for each time point in the time series, position coordinates in the horizontal plane at the height position ZB of FIG. 2.
[処理例(1)]
FIG. 10 shows, against the same locus 800 of the moving body 1 and locus 701 (first locus) of the sensor 2A as in FIG. 8, two loci 7011 and 7012 as examples of the second locus for cases in which the installation position PB of the sensor 2B of FIG. 9 differs. The locus 7011 is an example based on actual measured values when the position PB of the sensor 2B in the moving body coordinate system CM is the position PB1, and the locus 7012 is an example for the position PB2. Looking at the shape of each locus (7011, 7012) of the sensor 2B, the radius of the arc portion of the locus changes according to the distance (for example, the distances 1001, 1002) from the turning center of the moving body 1 (for example, the locus 800 of the position PM of the moving body 1) to the position PB (PB1, PB2). By using locus data obtained over at least a certain period of time that includes a locus such as a straight line corresponding to straight-ahead motion and a locus such as an arc corresponding to a direction-change motion such as a turn, the mobile body system can calculate the relative positional relationship through the matching process of step S3 of FIG. 7. In step S1, the relative position parameters 700 are taken such that each such position PB becomes a temporary position VPB.
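The geometry behind this observation can be made explicit with a short formulation; the symbols $p_M$, $\theta_M$, $u_B$, $c$, and $R(\cdot)$ are introduced here purely for illustration and are not reference signs of the embodiment. With the moving body pose $(p_M(t), \theta_M(t))$ and the installation offset $u_B$ of the sensor 2B expressed in the moving body coordinate system CM, the sensor position in the world frame is

$$p_B(t) = p_M(t) + R(\theta_M(t))\,u_B,$$

where $R(\cdot)$ is the 2D rotation matrix. During a turn about a fixed center $c$, the whole body frame rotates rigidly about $c$, so $\lVert p_B(t) - c \rVert$ stays constant and equals the sensor's distance from the turning center. This is why the arc portions of the loci 7011 and 7012 have radii that grow with the distances 1001 and 1002.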
[Processing example (2)]
(A) of FIG. 11 shows an example of generating the relative position parameters 700 (Δx, Δy) in step S1 of FIG. 7. As illustrated, the adjusting unit 55 generates, from the position PA of the reference sensor 2A, a plurality of candidate relative position parameters 700 (Δx, Δy) for each time point (t), in various directions and at various distances within the horizontal plane. These relative position parameters 700 (Δx, Δy) are generated as a plurality of candidate values shifted within a predetermined range, for example based on the initial setting value of the relative positional relationship (the relative positional relationship data 43 of FIG. 5). For example, the adjusting unit 55 sets temporary positions VPB corresponding to the relative position parameters 700 on a grid (a grid having a plurality of position coordinate points), as illustrated. Only three examples of the relative position parameters 700 are shown here, but many exist within the range. The temporary position VPB0 is the value generated from the relative position parameters (Δx0, Δy0) corresponding to the initial setting value of the relative positional relationship. The temporary positions VPB1 and VPB2 are values generated from other relative position parameters (Δx1, Δy1) and (Δx2, Δy2). The range over which the adjusting unit 55 generates the relative position parameters 700 may be the range in which a sensor 2 can be installed given the shape of the moving body 1, a predetermined range centered on the initial setting value of the relative positional relationship, or the like. In (A), the direction (x-axis, y-axis) of the sensor 2B (coordinate system CB) at each temporary position VPB is a fixed direction based on the initial setting value of the relative positional relationship.
(B) of FIG. 11 shows an example of the parameters used when setting a temporary direction ("temporary direction": Vθ) of the sensor 2B at a temporary position VPB. In this way, various temporary directions (Vθ) may be generated at each temporary position VPB corresponding to each relative position parameter 700, using a temporary-direction (Vθ) parameter of the sensor 2B. For the temporary-direction (Vθ) parameter as well, a plurality of candidates are generated within a predetermined range, for example based on the initial setting value of the relative positional relationship. This example shows three temporary-direction parameters (Vθ0, Vθ1, Vθ2). The temporary-direction (Vθ) parameter can be defined as an angle difference with respect to the direction (θA, x-axis) of the reference sensor 2A, or by using the rotation parameter R described above.
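As a rough illustration of this candidate generation, the following minimal Python sketch lays out (Δx, Δy) grid offsets and temporary directions Vθ around the initial setting. All function names, ranges, and step sizes are assumptions for illustration, not values from the embodiment.

```python
# Minimal sketch of step S1 candidate generation (illustrative only):
# (dx, dy) candidates on a grid around the initially configured offset,
# combined with temporary-direction candidates v_theta around the
# initially configured relative direction.
import itertools
import math

def generate_candidates(dx0, dy0, vtheta0,
                        span=0.5, step=0.25,
                        ang_span=math.radians(10), ang_step=math.radians(5)):
    """Yield (dx, dy, v_theta) triples within +/-span metres and
    +/-ang_span radians of the initial setting (dx0, dy0, vtheta0)."""
    n = int(round(span / step))
    m = int(round(ang_span / ang_step))
    offsets = [i * step for i in range(-n, n + 1)]
    angles = [j * ang_step for j in range(-m, m + 1)]
    for ox, oy, oa in itertools.product(offsets, offsets, angles):
        yield (dx0 + ox, dy0 + oy, vtheta0 + oa)

# Example: sensor 2B initially believed to sit 1.2 m behind sensor 2A,
# facing the opposite direction.
for cand in itertools.islice(generate_candidates(-1.2, 0.0, math.pi), 5):
    print(cand)
```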
In step S2, the locus 703 of the temporary position VPB of the sensor 2B is generated from the locus 701, which is the first position identification result for the position PA of the sensor 2A, according to the relative position parameters 700 (Δx, Δy) of step S1. In this example, the locus 703 of the temporary position VPB for a given candidate applies the same relative position parameters 700 at every time point (t). This is not limiting; it is also possible to apply different relative position parameters 700 at each time point (t) when generating the locus 703.
[Processing example (3)]
FIG. 12 shows an example of generating, in step S2, the locus 703 (temporary locus) of each temporary position VPB corresponding to each relative position parameter 700. The temporary loci are indicated by dash-dot lines. This example shows only the two temporary loci 1201 and 1202 for the two temporary positions VPB (the temporary positions VPB1 and VPB2 of FIG. 11). On the locus 701 of the first position identification result 41A of the sensor 2A, from each position PA (PA1, PA2, ...) having a direction (θA) at each time point (t = t1, t2, ...), a temporary position VPB is set at the point reached by applying the relative position parameters 700 (Δx, Δy). For example, for the temporary locus 1202 of the temporary position VPB2 corresponding to the relative position parameters (Δx2, Δy2), the positions at the respective time points (t) are expressed as {(vxB_1, vyB_1), (vxB_2, vyB_2), (vxB_3, vyB_3), (vxB_4, vyB_4), (vxB_5, vyB_5)}, as illustrated. In this example, one temporary locus corresponds to the case where the relative position parameters 700 (Δx, Δy) are the same at every time point. As illustrated, the generated temporary loci 1201 and 1202 of the respective temporary positions VPB have arcs (curved portions 1201b, 1202b) whose radii differ according to the distance from the turning center. The shape of a temporary locus thus differs from the shape of the first locus.
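A minimal sketch of this generation step, under the assumption that the first position identification result is available as a list of poses (x, y, θ) per time point; the function and variable names are illustrative:

```python
# Minimal sketch of step S2 (illustrative names): each pose of sensor 2A on
# the first locus is combined with one candidate relative position parameter
# (dx, dy), expressed in sensor 2A's frame, to obtain the temporary position
# VPB at the same time point; the list of results is the temporary locus.
import math

def temporary_locus(first_locus, dx, dy):
    """first_locus: [(x, y, theta), ...] poses of the first sensor per time
    point. The offset is rotated by the sensor's heading into the world
    frame and added to the sensor position."""
    return [(x + dx * math.cos(th) - dy * math.sin(th),
             y + dx * math.sin(th) + dy * math.cos(th))
            for x, y, th in first_locus]

# Example: five poses along a straight run followed by a turn.
poses = [(0, 0, 0.0), (1, 0, 0.0), (2, 0, 0.0),
         (2.7, 0.3, math.radians(45)), (3.0, 1.0, math.radians(90))]
print(temporary_locus(poses, dx=-1.2, dy=0.0))
```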
[Processing example (4)]
FIG. 13 is an explanatory diagram of an example of the matching process in step S3 of FIG. 7. In this example, the temporary locus of the temporary position VPB of the second sensor, generated from the first locus of the first sensor and a relative position parameter 700, is compared and collated in shape against the second locus of the second sensor in various orientation relationships using the rotation parameter R. (A) of FIG. 13 shows, as pairs of loci to be compared, the locus 703 (temporary locus) indicated by a dash-dot line and a plurality of loci (for example, the loci 7021, 7022, 7023, 7024, 7025) based on the locus 702 (second locus) indicated by solid lines. The locus 703 is the temporary locus of a certain temporary position VPB corresponding to certain relative position parameters 700 (Δx, Δy). The plurality of loci (7021, etc.) are generated in different orientations using the rotation parameter R, and are superimposed on the locus 703 so that the positions at the first time point (t1) coincide. The plurality of loci (7021, etc.) are generated by shifting the angle from the locus 702 at R = 0 degrees, with the angle of the rotation parameter R taken as 0 degrees relative to, for example, the x-axis of the coordinate system CA of the sensor 2A. Comparison in various orientations using the rotation parameter R is effective in this way.
The adjusting unit 55 compares the locus 703 of the temporary position VPB with each of the plurality of loci (7021, etc.) and calculates a degree of coincidence K. In this example, as illustrated, the adjusting unit 55 calculates, for each time point (t), a distance 1301 between a point on the locus 703 of the temporary position VPB (for example, the position 1300 corresponding to time point t3) and the corresponding position on each of the loci 7021 to 7025. The adjusting unit 55 then takes, for each matching pair, the sum of the distances 1301 over all time points. With the distance 1301 denoted D, the sum is ΣD, where Σ runs over the time points t = 1 to T. The adjusting unit 55 calculates the degree of coincidence K according to this sum; roughly, K is defined and calculated such that the smaller the sum, the higher the degree of coincidence K.
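The following Python sketch illustrates one way this scoring could look. The mapping from the sum ΣD to the degree of coincidence K is an assumption here, since the text only requires that a smaller sum yield a higher K, and the alignment of the first time points follows the superposition described above.

```python
# Minimal sketch of the step S3 scoring (illustrative): the second locus is
# shifted so its first point coincides with the temporary locus, rotated by a
# candidate rotation parameter r about that point, and the per-time-point
# distances D are summed; K is a monotone decreasing function of the sum.
import math

def rotate_about(points, pivot, r):
    px, py = pivot
    c, s = math.cos(r), math.sin(r)
    return [(px + c * (x - px) - s * (y - py),
             py + s * (x - px) + c * (y - py)) for x, y in points]

def coincidence(temp_locus, second_locus, r):
    """Both loci: [(x, y), ...] with one entry per time point."""
    tx = temp_locus[0][0] - second_locus[0][0]
    ty = temp_locus[0][1] - second_locus[0][1]
    shifted = [(x + tx, y + ty) for x, y in second_locus]
    rotated = rotate_about(shifted, temp_locus[0], r)
    total = sum(math.dist(p, q) for p, q in zip(temp_locus, rotated))
    return 1.0 / (1.0 + total)  # one possible choice: smaller sum -> higher K

def best_rotation(temp_locus, second_locus, r_candidates):
    """Return (best K, best r) over the candidate rotation parameters."""
    return max((coincidence(temp_locus, second_locus, r), r)
               for r in r_candidates)
```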
In this example, for comparison in various orientations, the comparison targets are generated by varying the rotation parameter R on the locus 702 (second locus) side, in consideration of processing efficiency; however, this is not limiting, and the comparison targets may instead be generated by varying the rotation parameter R on the locus 703 (temporary locus) side. The important point of the matching process is that the degree of shape coincidence is judged between loci that include curved portions; the temporary locus may be generated from either the first locus or the second locus.
(B) of FIG. 13 shows the case where, as a result of the matching in (A), the degree of coincidence K is highest for the pair of the locus 703 of the temporary position VPB corresponding to certain relative position parameters 700 (Δx, Δy) and the locus 702 at a certain rotation parameter R (in particular, the locus 702x). Similarly, the highest degree of coincidence K is computed for each relative position parameter 700. Then, in step S4, among the plurality of relative position parameters 700 of these pairs, the one with the highest degree of coincidence K is selected as the optimum relative position 704 (Δx_opt, Δy_opt).
In step S5, the optimum relative direction (Δθ_opt) is calculated from the quantity {θ2_t + R − θ1_t}. (B) of FIG. 13 shows an example of the relationship among the locus 701 of the sensor 2A, the locus 702 of the sensor 2B, the locus 703 of the temporary position VPB, the optimum relative position 704 (Δx_opt, Δy_opt), and the rotation parameter R. The quantity {θ2_t + R − θ1_t} expresses the relationship between the direction of the sensor 2A (θ1_t1) and the direction of the sensor 2B (θ2_t1) at the time point t1.
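Written out, with $R_{\mathrm{opt}}$ denoting the rotation parameter of the best-matching pair (a label introduced here for illustration):

$$\Delta\theta_{\mathrm{opt}} = \theta_{2,t} + R_{\mathrm{opt}} - \theta_{1,t},$$

evaluated at a time point such as $t_1$. For the matched pair this quantity should be the same at every $t$, so averaging it over $t$ would be a natural way to suppress measurement noise, though that refinement is an assumption beyond what is stated here.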
[Processing example (5)]
As part of the relative positional relationship calculation function, the control device 100 performs the locus shape matching process in step S3 of FIG. 7. At that time, the matching process is possible regardless of whether the detection time point (t) information of the sensor data SDA of the first sensor and the sensor data SDB of the second sensor is synchronized. Even when the times of the sensor data SDA and the sensor data SDB are not synchronized, the control device 100 may, for example, when comparing the first locus of the first sensor (more precisely, the temporary locus generated from the first locus) with the second locus of the second sensor, try the comparison while pairing the position at a certain time point (for example, t = k) on one locus with various time points (for example, time points around the time point k) on the other locus.
FIG. 14 shows an example of the matching process when the times of the sensor data are not synchronized. In (A) of FIG. 14, the locus 1401 (temporary locus), indicated by a dash-dot line, is the locus of the temporary position VPB of the sensor 2B generated, for example, from the locus of the first position identification result 41A based on the sensor data SDA of the sensor 2A and certain relative position parameters 700. On this locus 1401, white dots indicate the positions (for example, p1 to p9) at the respective time points (for example, t = t1 to t9). Meanwhile, the locus 1402, shown by a solid line, is, for example, the locus (second locus) of the second position identification result 41B based on the sensor data SDB of the sensor 2B. On this locus 1402, black dots indicate the positions (for example, p21 to p31) at the respective time points (for example, t = t1 to t11). The time points (t) of the two sets of sensor data are not time-synchronized; for example, the time point t1 on the locus 1401 and the time point t1 on the locus 1402 are different times. Note that FIG. 14 does not consider the rotation parameter R. When matching these loci, the control device 100 tries comparisons for the respective pairings so that, for example, a position at a certain time point on the locus 1401 is made to correspond to the positions at the various time points (at least some candidates) on the locus 1402.
(B) of FIG. 14 shows the case where, in superimposing the locus 1402 on the locus 1401, the position p11 at time point t1 of the locus 1402 is aligned with the position p1 at time point t1 of the locus 1401. (C) of FIG. 14 shows the case where the position p12 at time point t2 of the locus 1402 is aligned with the position p1 at time point t1 of the locus 1401. Comparing (B) and (C), for example, it can be seen that in the case of (C) the distances between the loci are smaller and the degree of coincidence K is higher. The way of taking the distance between loci is not limited to the example of the distance 1301 of FIG. 13; for example, the line giving the shortest distance from a point on one locus to a point on the other locus may be used. The control device 100 may then select, from the results of the trials described above, the one with the highest degree of coincidence K.
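A minimal Python sketch of this trial-and-select procedure; the index range and the scoring are illustrative assumptions, and the rotation parameter R is omitted here, as in FIG. 14:

```python
# Minimal sketch of matching without time synchronization (illustrative):
# pair the first point of one locus with successive points of the other,
# score each overlap by the mean point distance after aligning the first
# paired points, and keep the best trial.
import math

def best_time_alignment(locus_a, locus_b, max_shift=3):
    """locus_a, locus_b: [(x, y), ...]. Tries pairing locus_a[0] with
    locus_b[k] for k = 0..max_shift; returns (best mean distance, best k)."""
    best = (float("inf"), 0)
    for k in range(max_shift + 1):
        pairs = list(zip(locus_a, locus_b[k:]))
        if not pairs:
            continue
        (ax, ay), (bx, by) = pairs[0]
        tx, ty = ax - bx, ay - by          # align the first paired points
        total = sum(math.dist(p, (qx + tx, qy + ty)) for p, (qx, qy) in pairs)
        best = min(best, (total / len(pairs), k))
    return best
```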
[Calibration operation]
For the automatic adjustment (calibration) that calculates the relative positional relationship between the sensors 2, the moving body 1 may be made to perform a specific preset motion, with the sensor data acquired during that motion. This specific motion is one that satisfies the condition that the relative positional relationship can be calculated and determined, namely a motion that includes a change of direction, such as a turn, over at least a certain period of time. That is, the locus data based on the sensor data obtained over this period includes curved portions (701b, 702b), such as arcs corresponding to the turn, as shown for example in FIGS. 8 and 9. Given such locus data, the matching process described above succeeds and a solution is obtained as the optimum value. This avoids cases in which no solution is obtained or in which obtaining a solution takes a long time, so calibration can be performed efficiently.
Based on studies including experiments, the present inventor has confirmed that the relative positional relationship between the sensors 2 can be calculated and determined from locus data that includes at least arcs or the like corresponding to turns as described above. From this, the necessary conditions and the specific motion can be prescribed. As in FIG. 10 and the like described above, the shape of the locus differs according to the position of the sensor 2 on the moving body 1, which is what makes the matching process described above effective. The mobile body system may also be provided with a user interface that allows the user to configure the specific motion for the calibration. For example, the route for the specific motion can be made user-configurable on the display screen of a device such as the PC 101 (FIG. 1) connected to the moving body 1.
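As a rough illustration of such a condition check (thresholds and names are assumptions; the embodiment does not specify concrete values), recorded locus data could be screened before running the calibration:

```python
# Minimal sketch (illustrative thresholds): the calibration motion must last
# long enough and include a direction change, so the recorded poses should
# contain enough samples and a sufficiently large total heading change.
import math

def satisfies_calibration_condition(poses, min_points=50,
                                    min_turn=math.radians(45)):
    """poses: [(x, y, theta), ...] over time. True if there are enough
    samples and the heading sweeps through at least min_turn radians.
    (Headings assumed unwrapped; wrap-around handling omitted for brevity.)"""
    if len(poses) < min_points:
        return False
    headings = [th for _, _, th in poses]
    return max(headings) - min(headings) >= min_turn
```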
[Map creation function]
FIGS. 15 to 19 are explanatory diagrams relating to the map creation function. Based on the relative positional relationship data 43 (FIG. 5) obtained by the relative positional relationship calculation function, for example, the two sets of map data (42A, 42B) are associated with each other.
FIG. 15 shows, in the horizontal plane (X-Y plane), an example of the schematic configuration of an environment 1500, such as a factory, to which the moving body 1 is applied. FIG. 15 shows, in particular, the configuration of the environment 1500 detected by the sensor 2A. Based on the sensor data from the sensor 2A at the height position ZA of FIG. 2, the surrounding shapes and the locus as the first position identification result 41A are obtained, and the first map data 42A is created or updated based on them. Among the objects in this environment 1500, the object 1501 shown in the hatched region is an example of an object detected as feature points by the sensor 2A of the moving body 1.
FIG. 15 also shows an example of the travel and loci of the moving body 1. In the spatial coordinate system CS (X, Y, Z), the moving body 1 travels straight from the position P1 in the positive direction of the X-axis (for example, south). From the position P2, the moving body 1 turns to the left as seen from the moving body 1. After the left turn, at the position P3 the moving body 1 faces the positive direction of the Y-axis (for example, east); it then turns further to the left from the position P3 and comes to face the negative direction of the X-axis (for example, north). The moving body 1 travels straight in the negative direction of the X-axis (for example, north) up to the position P4. The broken-line locus 1510 is the locus of the representative position PM of the moving body 1. The solid-line locus 1511 is the locus of the position PA of the sensor 2A.
For example, at the position P1, the range 1502 is an example of the emission range of the laser beam scanned from the sensor 2A (position PA). As in FIG. 3, the range 1502 is shown as larger than 180 degrees. The solid arrows indicate laser beams. For example, the laser beam 1503 strikes the object 1501, is reflected, returns to the sensor 2A, and is detected as feature points. For example, the laser beam 1504 strikes a wall or the like of the factory building, is reflected, returns to the sensor 2A, and is detected as feature points. There are also objects in the unshaded regions, but since they are not at the height position ZA, they are not detected by the sensor 2A. The position identification and map creation described above are possible based on such sensor data from the sensor 2A; the detailed techniques therefor are not limited.
FIG. 16 shows, for the same environment 1500 as FIG. 15, the configuration detected by the sensor 2B. The locus 1510 of the moving body 1 is the same as in FIG. 15. In FIG. 16, the solid-line locus 1512 is the locus of the position PB of the sensor 2B. For example, at the position P1, the range 1602 is an example of the emission range of the laser beam scanned from the sensor 2B (position PB). For example, the laser beam 1603 strikes the object 1601, is reflected, returns to the sensor 2B, and is detected as feature points. The object 1601 is at the height position ZB and is therefore detected by the sensor 2B.
As in the example above, objects such as production equipment in a factory can have various shapes, and their heights may differ from part to part. When, as in the first embodiment, the moving body 1 is provided with a plurality of (two) sensors 2 whose positions differ, including in height, the detection ranges covered by the respective sensors 2 differ, and the object shapes that can be measured differ. As a result, this mobile body system can create a map that measures and reflects the shape of the environment 1500 in greater detail. However, since the positions of the sensors 2 differ, a plurality of sets of map data (42A, 42B) are created as separate maps, one per sensor 2. Even in that case, this mobile body system can associate the plurality of sets of map data with one another based on the calibration of the relative positional relationship between the sensors 2.
[Map data]
FIGS. 17 and 18 show configuration examples of the map data created based on the environment 1500 and the measurements of FIGS. 15 and 16. FIG. 17 shows a map 1700 corresponding to the first map data 42A created based on the sensor data of the sensor 2A of FIG. 15. For example, the line 1701 corresponds to the contour (the corresponding group of feature points) of the object 1501 of FIG. 15. The coordinate system (X, Y) of this map 1700 is shown for the case where, for example, the origin is the initial position PA of the sensor 2A at the time measurement by the moving body 1 is started, the X-axis is the x-axis direction of that sensor 2A, and the Y-axis is the y-axis direction orthogonal to it.
FIG. 18 shows a map 1800 corresponding to the second map data 42B created based on the sensor data of the sensor 2B of FIG. 16. For example, the line 1801 corresponds to the contour of the object 1601 of FIG. 16. The coordinate system (X, Y) of this map 1800 is shown for the case where, for example, the origin is the initial position PB of the sensor 2B at the time measurement by the moving body 1 is started, the X-axis is the x-axis direction of that sensor 2B, and the Y-axis is the y-axis direction orthogonal to it.
FIG. 19 shows an example in which, using the relative positional relationship data 43 obtained by the calculation described above, the two sets of map data (42A, 42B) are associated into one and output to the user. Here, with the map 1700 of the sensor 2A (its coordinate system) as the reference, the map 1800 of the sensor 2B is rotated and superimposed on the map 1700 using the relative positional relationship 105 between the position PA and the position PB. The user can thus view the two maps in a state associated as one map. The map data is configured, for example, as image data, with information such as position and the presence or absence of an object for each pixel.
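A minimal sketch of this association step (function names are illustrative, and occupied cells are treated as points): each point of the map 1800 is rotated by the calibrated relative direction and shifted by the relative position so that it lands in the coordinate system of the map 1700.

```python
# Minimal sketch (illustrative): transform points from sensor 2B's map frame
# into sensor 2A's map frame using the calibrated relative pose
# (dx, dy, dtheta), then overlay them on map 1700.
import math

def to_reference_frame(points_b, dx, dy, dtheta):
    """points_b: [(x, y), ...] in map 1800's frame; returns the same points
    expressed in map 1700's frame."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    return [(dx + c * x - s * y, dy + s * x + c * y) for x, y in points_b]

# Example: an occupied cell 0.5 m ahead of sensor 2B, with 2B mounted 1.2 m
# behind 2A and facing backwards (dtheta = pi).
print(to_reference_frame([(0.5, 0.0)], dx=-1.2, dy=0.0, dtheta=math.pi))
```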
[Effects, etc.]
As described above, according to the mobile body system of the first embodiment, the relative positional relationship between the sensors 2 can be obtained even when a plurality of sensors 2 are installed on the moving body 1 or when the installation positions of the sensors 2 are changed, so the accuracy of the positioning function and the like can be improved. In particular, the relative positional relationship calculation function makes it possible to obtain the relative positional relationship between the sensors 2 easily and with high accuracy. Moreover, because the relative positional relationship calculation function performs automatic adjustment as the moving body 1 travels, little manual user effort is required to configure the sensors 2.
(Modifications)
The following modifications of the first embodiment are also possible. A form in which the moving body 1 is provided with three or more sensors 2 is likewise possible. In this form, for example, at least two of the following relative relationships are each calculated in the same way by the relative positional relationship calculation function described above: the relative relationship between the first sensor and the second sensor, the relative relationship between the second sensor and the third sensor, and the relative relationship between the third sensor and the first sensor.
In the first embodiment, the detection direction and detection range of the sensors 2 are set in the horizontal plane, but this is not limiting; the scheme is equally applicable when the detection direction and detection range of a sensor 2 lie outside the horizontal plane. The sensor 2 may also be a type of sensor capable of three-dimensional positioning, distance measurement, or the like, including the height direction.
Although the present invention has been described above concretely based on an embodiment, the present invention is not limited to the embodiment described above and can be variously modified without departing from its gist. As the moving body, an autonomously movable body such as an AGV has been described, but this is not limiting; the invention is also applicable to a moving body steered by a user. The moving body is not limited to a vehicle and may be a ship or a flying object such as a drone.
1 ... moving body, 2, 2A, 2B ... sensor, 3 ... movement mechanism, 4 ... mounting mechanism, 5 ... position identification device, 6 ... movement mechanism control device, 10 ... housing, 10a ... first portion, 10b ... second portion, 100 ... control device, 101 ... environment, 102 ... production equipment, 103 ... load, 105 ... relative positional relationship, 110 ... PC, CS ... spatial coordinate system, CM ... moving body coordinate system.
Claims (12)
- A mobile body system comprising:
a moving body;
a plurality of sensors including a first sensor and a second sensor installed at different positions in a moving body coordinate system of the moving body; and
a control device that realizes a positioning function of measuring at least a position of the moving body in a spatial coordinate system based on a plurality of sensor data of the plurality of sensors,
wherein the first sensor and the second sensor are of a type capable of detecting their own positions in the spatial coordinate system, and
wherein the control device:
identifies, when the moving body moves in an environment, the position of the first sensor and the position of the second sensor in the spatial coordinate system based on first sensor data of the first sensor and second sensor data of the second sensor, and acquires, based on the position identification results, a first locus of the first sensor and a second locus of the second sensor in time series; and
calculates, using the first locus and the second locus and based on comparison and collation of the shapes of the loci, a relative positional relationship between the position of the first sensor and the position of the second sensor in the moving body coordinate system of the moving body, and sets information representing the calculated relative positional relationship in the moving body.
- The mobile body system according to claim 1, wherein, in the comparison and collation, the control device:
generates a plurality of candidate temporary positions of the second sensor using a relative position parameter concerning the position of the second sensor relative to the position of the first sensor in the moving body coordinate system;
generates, for each temporary position, a temporary locus using the first locus;
compares and collates, for each temporary position, the shapes of the temporary locus and the second locus to calculate a degree of coincidence; and
determines, as the relative positional relationship, the relative position parameter corresponding to the highest degree of coincidence.
- The mobile body system according to claim 2, wherein, in the comparison and collation, the control device generates a plurality of candidate temporary directions of the second sensor using a relative direction parameter concerning the direction of the second sensor relative to the direction of the first sensor in the moving body coordinate system, generates the temporary locus using the first locus for each temporary position and each temporary direction, and determines, as information included in the relative positional relationship, the relative direction parameter corresponding to the highest degree of coincidence.
- The mobile body system according to claim 1, wherein the loci include curved portions.
- The mobile body system according to claim 1, wherein the moving body comprises a movement mechanism capable of turning.
- The mobile body system according to claim 1, wherein, for the calculation of the relative positional relationship, the control device performs control such that the moving body performs, as a specific movement, a motion including a change of direction over at least a certain period of time.
- The mobile body system according to claim 1, wherein the control device displays information on the calculated relative positional relationship on a display screen.
- The mobile body system according to claim 1, wherein, in the comparison and collation, the control device performs the comparison and collation on pairs in which the orientation relationship of the second locus to the first locus is variously changed using a rotation parameter.
- The mobile body system according to claim 1, wherein, when the sensor data of the first sensor and the sensor data of the second sensor are not time-synchronized, the control device performs the comparison and collation on pairs in which the correspondence between positions at time points on the first locus and positions at time points on the second locus is variously changed.
- The mobile body system according to claim 1, wherein the moving body is an automated guided vehicle, the sensors are distance measuring sensors, and the control device creates shape data representing the shapes of objects around the moving body based on distance measurement data as the sensor data, estimates the position and orientation of the moving body in the spatial coordinate system based on the shape data and map data of the environment, controls the movement of the moving body in the environment based on the estimation results, and creates or updates the map data.
- The mobile body system according to claim 10, wherein the control device uses the information on the calculated relative positional relationship to associate first map data created based on the position identification results of the first sensor with second map data created based on the position identification results of the second sensor.
- The mobile body system according to claim 1, wherein the first sensor and the second sensor differ in installation height position in the moving body coordinate system, and their installation directions are directions within a horizontal plane.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202080099292.2A CN115362423A (en) | 2020-04-27 | 2020-04-27 | Moving body system |
PCT/JP2020/017937 WO2021220331A1 (en) | 2020-04-27 | 2020-04-27 | Mobile body system |
JP2022518431A JP7338048B2 (en) | 2020-04-27 | 2020-04-27 | mobile system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/017937 WO2021220331A1 (en) | 2020-04-27 | 2020-04-27 | Mobile body system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021220331A1 true WO2021220331A1 (en) | 2021-11-04 |
Family
ID=78332347
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/017937 WO2021220331A1 (en) | Mobile body system | 2020-04-27 | 2020-04-27 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7338048B2 (en) |
CN (1) | CN115362423A (en) |
WO (1) | WO2021220331A1 (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8768558B2 (en) * | 2007-01-05 | 2014-07-01 | Agjunction Llc | Optical tracking vehicle control system and method |
JP2014191689A (en) * | 2013-03-28 | 2014-10-06 | Hitachi Industrial Equipment Systems Co Ltd | Traveling object attached with position detection device for outputting control command to travel control means of traveling object and position detection device |
JP6080189B2 (en) * | 2014-08-15 | 2017-02-15 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Inline sensor calibration method and calibration apparatus |
DE102015205088C5 (en) * | 2015-03-20 | 2024-09-26 | Kuka Deutschland Gmbh | Method for determining a calibration parameter of a vehicle and vehicle therefor |
JP6913339B2 (en) * | 2017-01-26 | 2021-08-04 | Chiba Institute of Technology | Movement locus calculation system, control method and program of movement locus calculation system |
- 2020-04-27 JP JP2022518431A patent/JP7338048B2/en active Active
- 2020-04-27 WO PCT/JP2020/017937 patent/WO2021220331A1/en active Application Filing
- 2020-04-27 CN CN202080099292.2A patent/CN115362423A/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012226675A (en) * | 2011-04-22 | 2012-11-15 | Hitachi Industrial Equipment Systems Co Ltd | Mobile body |
JP2015127664A (en) * | 2013-12-27 | 2015-07-09 | 株式会社国際電気通信基礎技術研究所 | Calibration apparatus, calibration method, and calibration program |
JP2015222541A (en) * | 2014-05-23 | 2015-12-10 | 株式会社日立産機システム | Carriage conveying system, carrier, and carriage conveying method |
JP2017096813A (en) * | 2015-11-25 | 2017-06-01 | 株式会社国際電気通信基礎技術研究所 | Calibration device, calibration method, and calibration program |
JP2018022215A (en) * | 2016-08-01 | 2018-02-08 | 村田機械株式会社 | Movement teaching device and movement teaching method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2021220331A1 (en) | 2021-11-04 |
JP7338048B2 (en) | 2023-09-04 |
CN115362423A (en) | 2022-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3779357B1 (en) | Localisation of a surveying instrument | |
CN111511620B (en) | Dynamic window method using optimal interaction collision avoidance cost assessment | |
JP6825712B2 (en) | Mobiles, position estimators, and computer programs | |
JP6433122B2 (en) | Enhanced mobile platform positioning | |
US11537140B2 (en) | Mobile body, location estimation device, and computer program | |
CN112840285A (en) | Autonomous map traversal with waypoint matching | |
US20230064071A1 (en) | System for 3d surveying by an autonomous robotic vehicle using lidar-slam and an estimated point distribution map for path planning | |
CN110789529B (en) | Vehicle control method, device and computer-readable storage medium | |
JP6074205B2 (en) | Autonomous mobile | |
JP2020064011A (en) | Laser scanner calibration method, and transporting machine | |
CN114714357A (en) | Sorting and carrying method, sorting and carrying robot and storage medium | |
WO2019194079A1 (en) | Position estimation system, moving body comprising said position estimation system, and computer program | |
KR102564663B1 (en) | Coordinates recognition apparatus of automatic guided vehicle and method thereof | |
JP7300413B2 (en) | Control device, moving body, movement control system, control method and program | |
US20230333568A1 (en) | Transport vehicle system, transport vehicle, and control method | |
JPWO2018179960A1 (en) | Moving object and self-position estimation device | |
WO2021220331A1 (en) | Mobile body system | |
CN113534810A (en) | Logistics robot and logistics robot system | |
US20230316567A1 (en) | Localization of a surveying instrument | |
Buck et al. | Multi-sensor payload detection and acquisition for truck-trailer AGVs | |
JP2018013860A (en) | Autonomous movable object control device | |
García-Gutierrez et al. | Obstacle Coordinates Transformation from TVS Body-Frame to AGV Navigation-Frame | |
JP2017130006A (en) | Autonomous mobile body control device | |
KR102716669B1 (en) | Robots for transporting automobile parts, and control system thereof | |
JP2020077162A (en) | Traveling vehicle |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20934195; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022518431; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20934195; Country of ref document: EP; Kind code of ref document: A1 |