
CN116804763A - Obstacle judging method, device, equipment and storage medium - Google Patents


Info

Publication number
CN116804763A
CN116804763A
Authority
CN
China
Prior art keywords
obstacle
point cloud
initial point
sensor
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310750561.2A
Other languages
Chinese (zh)
Inventor
朱俊安
张涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Pudu Robot Co ltd
Shenzhen Pudu Technology Co Ltd
Original Assignee
Chengdu Pudu Robot Co ltd
Shenzhen Pudu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Pudu Robot Co ltd, Shenzhen Pudu Technology Co Ltd filed Critical Chengdu Pudu Robot Co ltd
Priority to CN202310750561.2A
Publication of CN116804763A
Priority to PCT/CN2024/084470 (published as WO2025001375A1)
Legal status: Pending


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/88 — Lidar systems specially adapted for specific applications
    • G01S 17/89 — Lidar systems specially adapted for mapping or imaging
    • G01S 17/93 — Lidar systems specially adapted for anti-collision purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present application relates to an obstacle determination method, an obstacle determination device, a computer apparatus, a storage medium, and a computer program product. The method comprises the following steps: acquiring, at the same moment, a first initial point cloud collected by a first sensor and a separation distance, collected by a second sensor, between the device and an obstacle; acquiring a field angle of the second sensor and obtaining a second initial point cloud based on the field angle and the separation distance; determining whether an overlap region exists between the first initial point cloud and the second initial point cloud; if no overlap region exists, judging that the obstacle is a first-type obstacle; and if an overlap region exists, judging that the obstacle is a second-type obstacle. By adopting the method, the operating efficiency of the device can be improved.

Description

Obstacle judging method, device, equipment and storage medium
Technical Field
The present application relates to the field of automation technology, and in particular, to a method, an apparatus, a device, a storage medium, and a computer program product for determining an obstacle.
Background
With the development of automation technology, various intelligent devices have emerged that automatically execute tasks along planned paths, bringing great convenience to people's life and work.
In the conventional art, an intelligent device detects obstacles using a lidar sensor and a depth sensor. However, it cannot accurately locate laser-invisible obstacles such as mirrors, glass, and objects made of transparent materials, or must locate them with auxiliary tools, so the operating efficiency of the intelligent device is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an obstacle determining method, apparatus, computer device, computer readable storage medium, and computer program product that can improve the operating efficiency of an intelligent device.
In a first aspect, the present application provides a method of determining an obstacle. The method comprises the following steps:
acquiring, at the same moment, a first initial point cloud collected by a first sensor and a separation distance, collected by a second sensor, between the device and an obstacle;
acquiring a field angle of the second sensor, and obtaining a second initial point cloud based on the field angle and the interval distance;
determining whether a coincidence region exists between the first initial point cloud and the second initial point cloud;
if the first initial point cloud and the second initial point cloud do not have a superposition area, judging that the obstacle is a first type of obstacle;
And if the first initial point cloud and the second initial point cloud have the overlapping area, judging that the obstacle is a second type obstacle.
In one embodiment, the second sensor is an ultrasonic sensor, the acquiring the field angle of the second sensor, and the obtaining the second initial point cloud based on the field angle and the separation distance includes:
dividing the view angle based on a preset dividing angle to obtain a plurality of offset angles;
calculating initial coordinates of obstacle points corresponding to the offset angle based on the interval distance and the offset angle; a plurality of said obstacle points are used to characterize said obstacle;
and obtaining a second initial point cloud based on the initial coordinates of the obstacle points corresponding to the plurality of offset angles.
In one embodiment, the first sensor is a lidar and/or a depth camera, and the determining whether there is a region of coincidence between the first initial point cloud and the second initial point cloud includes:
converting the first initial point cloud into a first intermediate point cloud in a target coordinate system based on a first external parameter of the first sensor, and converting the second initial point cloud into a second intermediate point cloud in the target coordinate system based on a second external parameter of the second sensor;
Acquiring the size of a grid map and the resolution corresponding to the grid map;
based on the size and the resolution, projecting the first intermediate point cloud into the grid map to obtain a first grid point cloud, and projecting the second intermediate point cloud into the grid map to obtain a second grid point cloud;
comparing the second coordinates in the second grid point cloud with the first coordinates in the first grid point cloud, and determining whether a superposition area exists between the first initial point cloud and the second initial point cloud according to a comparison result;
the first initial point cloud and the second initial point cloud have no overlapping area, and the obstacle is judged to be a first type of obstacle, including:
if the first coordinate which is the same as the second coordinate does not exist, determining that the obstacle is a first type of obstacle, wherein the first type of obstacle is a laser invisible obstacle;
the first initial point cloud and the second initial point cloud have a coincidence area, and the obstacle is judged to be a second type obstacle, including:
and if the first coordinates which are the same as the second coordinates exist, determining that the obstacle is a second type of obstacle, wherein the second type of obstacle is a laser visible obstacle.
In one embodiment, the method further comprises:
acquiring an error threshold of a target map and the second sensor; the target map is a map used in the running process of the equipment;
under the condition that the obstacle is determined to be a first type obstacle, if the interval distance is smaller than the error threshold value, updating the target map based on the second initial point cloud to obtain an updated target map;
and under the condition that the obstacle is determined to be a second type obstacle, updating the target map based on the first initial point cloud to obtain an updated target map.
In one embodiment, the updating the target map based on the second initial point cloud, after obtaining the updated target map, further includes:
and based on a preset clearing time, clearing the obstacle corresponding to the second initial point cloud in the updated target map to obtain the cleared target map.
In one embodiment, in the event that the obstacle is determined to be a first type of obstacle, the method further comprises:
determining an obstacle avoidance mode corresponding to the obstacle based on the interval distance;
If the interval distance meets a preset deceleration distance interval and the detection direction of the second sensor is the same as the running direction of the equipment, decelerating;
and if the interval distance is smaller than the error threshold value of the second sensor, generating an obstacle avoidance path corresponding to the obstacle based on the interval distance, and avoiding the obstacle based on the obstacle avoidance path.
In one embodiment, the generating the obstacle avoidance path corresponding to the obstacle based on the separation distance includes:
acquiring a reserved distance, and determining a detour distance corresponding to the obstacle based on the interval distance and the reserved distance;
and generating an obstacle avoidance path corresponding to the obstacle based on the detour distance.
In a second aspect, the application further provides an obstacle judging device. The device comprises:
the acquisition module is used for acquiring a first initial point cloud collected by the first sensor and a separation distance, collected by the second sensor, between the device and the obstacle;
the dividing module is used for acquiring the field angle of the second sensor and obtaining a second initial point cloud based on the field angle and the interval distance;
the judging module is used for judging that the obstacle is a first type obstacle if the first initial point cloud and the second initial point cloud do not have a superposition area; and if the first initial point cloud and the second initial point cloud have the overlapping area, judging that the obstacle is a second type obstacle.
In a third aspect, the present application also provides a computer device, including a memory and a processor, where the memory stores a computer program, and the processor executes the computer program to implement the steps in the obstacle determining method provided in the above aspects.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps in the obstacle determination method provided in the above aspects.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the steps of the obstacle determination method provided in the above aspects.
With the obstacle determination method, device, computer equipment, storage medium, and computer program product, the device acquires, at the same moment, a first initial point cloud collected by the first sensor and a separation distance, collected by the second sensor, between the device and an obstacle; acquires the field angle of the second sensor; obtains a second initial point cloud based on the field angle and the separation distance; and determines whether an overlap region exists between the two point clouds, judging the obstacle to be a first-type obstacle if no overlap region exists and a second-type obstacle if one does. If the obstacle detected by the second sensor is a first-type obstacle, the first sensor cannot collect a point cloud of that obstacle; any first initial point cloud it does collect belongs to other, second-type obstacles. Because the second initial point cloud representing the obstacle is derived from the separation distance and the field angle of the second sensor, the absence of an overlap region between the two point clouds indicates that the obstacle detected by the second sensor is a first-type obstacle, which improves the accuracy of obstacle determination. Once the obstacle is determined to be a first-type obstacle, it can be avoided based on the separation distance, preventing the device from colliding with it and from making a large detour or stopping, thereby improving the operating efficiency of the device.
Drawings
FIG. 1 is a diagram of an application environment of an obstacle determination method in one embodiment;
FIG. 2 is a flow chart of an obstacle determination method according to an embodiment;
FIG. 3 is a flowchart illustrating a second initial point cloud acquisition step according to an embodiment;
FIG. 4 is a flow diagram of determining whether a coincident region exists in one embodiment;
FIG. 5 is a flowchart illustrating a step of determining an obstacle avoidance mode according to an embodiment;
FIG. 6 is a schematic diagram of an apparatus in one embodiment;
FIG. 7 is a schematic diagram of a map in one embodiment;
fig. 8 is a block diagram showing the structure of an obstacle deciding apparatus in one embodiment;
fig. 9 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The obstacle determination method provided by the embodiments of the application can be applied to the application environment shown in fig. 1, in which the device 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process; it may be integrated on the server 104 or located on a cloud or other network server. The device 102 and the server 104 may each perform the obstacle determination method provided in the embodiments of the application alone, or may cooperate to perform it; multiple devices 102 may also cooperate to perform it. Taking the device 102 alone executing the method as an example: the device acquires, at the same moment, a first initial point cloud collected by a first sensor and a separation distance, collected by a second sensor, between the device and an obstacle; acquires the field angle of the second sensor and obtains a second initial point cloud based on the field angle and the separation distance; determines whether an overlap region exists between the first initial point cloud and the second initial point cloud; judges the obstacle to be a first-type obstacle if no overlap region exists; and judges it to be a second-type obstacle if an overlap region exists. The device 102 may be, but is not limited to, any of various smart devices, automated devices, cleaning robots, delivery robots, navigation robots, autonomous vehicles, drones, and the like.
The server 104 may be implemented as a stand-alone server or as a server cluster of multiple servers.
In one embodiment, as shown in fig. 2, there is provided an obstacle determining method, which is described by taking application of the method to the device 102 as an example, and includes the following steps:
step 202, acquiring a first initial point cloud acquired by a first sensor and a spacing distance between equipment acquired by a second sensor and an obstacle at the same moment.
The first sensor is a sensor that emits light waves and determines obstacle information from the received reflected light waves. The first sensor includes, but is not limited to, a lidar sensor, a depth sensor, and an infrared sensor. A point cloud refers to a set of point coordinates. The first initial point cloud refers to the obstacle point cloud collected by the first sensor; its point coordinates may be located in a coordinate system with the first sensor as the origin. The second sensor is a sensor that emits sound waves and determines the separation distance from an obstacle based on the received reflected sound waves. The second sensor includes, but is not limited to, an ultrasonic sensor, a sonar sensor, and a sound sensor. The separation distance refers to the straight-line distance between the device and the obstacle.
The device may be configured to acquire a first initial point cloud acquired by a first sensor at a same time and a separation distance between the device and an obstacle acquired by a second sensor.
In one embodiment, the device acquires a first initial point cloud acquired by a first sensor and a separation distance between the device and an obstacle acquired by a second sensor at the same time and in the same detection direction.
Step 204, obtaining a field angle of the second sensor, and obtaining a second initial point cloud based on the field angle and the interval distance.
The Field of View (FOV) refers to the angular range that the sensor can detect; it may be expressed as an angle or as physical dimensions. The larger the field angle, the wider the range detected by the second sensor; the smaller the field angle, the narrower the range. The second initial point cloud refers to a set of point coordinates characterizing the obstacle; its point coordinates may be located in a coordinate system with the second sensor as the origin.
The device obtains a field angle of view of the second sensor, and then calculates based on the field angle and the separation distance, resulting in a second initial point cloud characterizing the obstacle.
In one embodiment, the device establishes an initial coordinate system with the second sensor as an origin, obtains a curve with the origin as a center and the interval distance as a radius based on the origin and the interval distance, samples a part of the curve corresponding to the view angle as a target curve, obtains a plurality of sampling points, and obtains a second initial point cloud representing the obstacle based on point coordinates corresponding to the plurality of sampling points.
Step 206, determining whether there is a coincidence region between the first initial point cloud and the second initial point cloud.
Illustratively, the device compares the first initial point cloud with the second initial point cloud to make this determination.
In step 208, if the first initial point cloud and the second initial point cloud do not have the overlapping area, the obstacle is determined to be a first type of obstacle.
Wherein, the first type of obstacle refers to an obstacle that light waves can pass through, that is, a laser-invisible obstacle, such as glass, a mirror, or a transparent baffle.
The device compares the first initial point cloud with the second initial point cloud, and judges that the obstacle is a first type of obstacle if the first initial point cloud and the second initial point cloud have no overlapping area.
Step 210, if there is a coincidence region between the first initial point cloud and the second initial point cloud, determining that the obstacle is a second type obstacle.
The second type of obstacle refers to an obstacle which cannot be penetrated by light waves. The second type of obstacle is a laser visible obstacle. Such as tables, stools, and walls.
In one embodiment, the device obtains a first conversion relationship between a first coordinate system (with the first sensor as the origin) and a reference coordinate system, and a second conversion relationship between a second coordinate system (with the second sensor as the origin) and the reference coordinate system. It converts the first initial point cloud based on the first conversion relationship to obtain a first reference point cloud, and converts the second initial point cloud based on the second conversion relationship to obtain a second reference point cloud. The two reference point clouds are then compared: if the same point coordinates exist in both, the obstacle is judged to be a second-type obstacle; if not, it is judged to be a first-type obstacle.
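The comparison in this embodiment can be sketched roughly as follows (the function name and return labels are illustrative, not from the patent; it assumes both reference point clouds have already been discretized to integer coordinates so that exact comparison is meaningful):

```python
def classify_obstacle(first_ref_cloud, second_ref_cloud):
    """Compare two reference point clouds (lists of (x, y) cell coordinates).

    If any point coordinate appears in both clouds, the two sensors agree and
    the obstacle is judged laser-visible (second type); otherwise it is judged
    laser-invisible (first type).
    """
    overlap = set(first_ref_cloud) & set(second_ref_cloud)
    return "second type" if overlap else "first type"
```

In practice some tolerance (for example, comparing grid cells rather than raw coordinates, as the grid-map embodiment later in the description does) is needed, since floating-point coordinates from two sensors rarely coincide exactly.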
In the obstacle determination method above, if the obstacle detected by the second sensor is a first-type obstacle, the first sensor cannot collect a point cloud of that obstacle; any first initial point cloud it does collect belongs to other, second-type obstacles. The second initial point cloud representing the obstacle is derived from the separation distance between the device and the obstacle collected by the second sensor and the field angle of the second sensor. If no overlap region exists between the first and second initial point clouds, the obstacle detected by the second sensor can be judged to be a first-type obstacle, improving the accuracy of obstacle determination. Once this is determined, the obstacle can be avoided based on the separation distance, preventing the device from colliding with it and from making a large detour or stopping, thereby improving the operating efficiency of the device.
In one embodiment, as shown in fig. 3, the second sensor is an ultrasonic sensor, obtaining the field angle of the second sensor, and obtaining the second initial point cloud based on the field angle and the separation distance includes:
Step 302, dividing the field angle based on the preset division angle to obtain a plurality of offset angles.
The ultrasonic sensor is a sensor that detects and measures using ultrasonic technology and can measure the separation distance between a target object and the sensor. The preset division angle refers to an angle preset for dividing the field angle: the smaller the preset division angle, the more offset angles are obtained; the larger it is, the fewer. The offset angle refers to the angle between the coordinate axis and the straight line from the origin to an obstacle point.
The apparatus obtains a preset division angle, and divides the angle of view into a plurality of offset angles according to the preset division angle.
In one embodiment, the device obtains an initial coordinate system with the second sensor as the origin, converts the field angle into a detection angle range in that coordinate system, and divides the detection angle range based on the preset division angle to obtain a plurality of offset angles. For example, for a field angle of 60°, the detection angle range in the initial coordinate system is -30° to 30°; with a preset division angle of 5°, the resulting offset angles are -30°, -25°, -20°, -15°, -10°, -5°, 0°, 5°, 10°, 15°, 20°, 25°, and 30°.
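This division step can be sketched minimally as follows (the helper name and the float-tolerance handling are assumptions, not from the patent):

```python
def divide_fov(fov_deg, step_deg):
    """Divide a field angle, centered on the sensor axis, into offset angles.

    A 60-degree field angle spans -30 to 30 degrees in the sensor's initial
    coordinate system; a 5-degree division angle yields -30, -25, ..., 25, 30.
    """
    half = fov_deg / 2.0
    angles = []
    a = -half
    while a <= half + 1e-9:  # small tolerance guards against float drift
        angles.append(a)
        a += step_deg
    return angles
```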
Step 304, calculating initial coordinates of the obstacle points corresponding to the offset angle based on the interval distance and the offset angle; a plurality of obstacle points are used to characterize the obstacle.
Where an obstacle point refers to a point used to characterize the obstacle; it can be understood as a point at the separation distance from the origin, within the field angle. The initial coordinates refer to the coordinates of the obstacle point in the coordinate system with the second sensor as the origin.
The device calculates the initial coordinates of the obstacle point corresponding to each offset angle based on the separation distance and the offset angle.
In one embodiment, the device calculates a sine value and a cosine value of the offset angle, obtains an X-axis coordinate based on the separation distance and the cosine value, obtains a Y-axis coordinate based on the separation distance and the sine value, and obtains an initial coordinate of the obstacle point corresponding to the offset angle based on the X-axis coordinate and the Y-axis coordinate.
In one embodiment, the initial coordinate P_i(x, y) is calculated as follows:

x = d * cos(θ_i)    Formula (1)
y = d * sin(θ_i)    Formula (2)

where P_i(x, y) is the initial coordinate of the obstacle point corresponding to the i-th offset angle, x is the X-axis coordinate, y is the Y-axis coordinate, d is the separation distance, and θ_i is the i-th offset angle.
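Formulas (1) and (2) can be applied per offset angle as in the following sketch (the function name is illustrative):

```python
import math

def obstacle_points(d, offset_angles_deg):
    """Compute P_i = (d*cos(theta_i), d*sin(theta_i)) for each offset angle.

    d is the separation distance measured by the second sensor; the offset
    angles are given in degrees relative to the sensor axis.
    """
    points = []
    for angle in offset_angles_deg:
        theta = math.radians(angle)
        points.append((d * math.cos(theta), d * math.sin(theta)))
    return points
```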
Step 306, obtaining a second initial point cloud based on the initial coordinates of the obstacle points corresponding to the plurality of offset angles.
The device illustratively composes the initial coordinates of the plurality of offset angles corresponding to the obstacle points into a second initial point cloud.
In this embodiment, the initial coordinates of the obstacle point corresponding to each offset angle are calculated from the separation distance between the device and the obstacle and the field angle of the second sensor. The obstacle points can be understood as points where the obstacle may exist; their initial coordinates together form the second initial point cloud, which represents the obstacle detected by the second sensor relatively accurately. Locating the obstacle through the second initial point cloud therefore improves the accuracy of positioning the obstacle.
In one embodiment, as shown in fig. 4, the first sensor is a laser radar and/or a depth camera, and determining whether a coincidence region exists between the first initial point cloud and the second initial point cloud includes:
step 402, converting the first initial point cloud into a first intermediate point cloud in the target coordinate system based on the first external parameter of the first sensor, and converting the second initial point cloud into a second intermediate point cloud in the target coordinate system based on the second external parameter of the second sensor.
The lidar is a sensor that detects and measures using laser beams and can measure the distance, shape, and position of a target object relative to itself. A depth camera, also called a depth sensor, is a sensor used to generate a depth image containing distance information. External parameters are parameters characterizing the position of a sensor relative to the device; they can be obtained from the structural design diagram of the device. The first external parameter refers to the positional parameter of the first sensor relative to the device, and the second external parameter refers to that of the second sensor. The target coordinate system is a coordinate system with the center of the device as the origin.
The device multiplies each first initial coordinate in the first initial point cloud by a first external parameter to obtain a first intermediate coordinate corresponding to each first initial coordinate, forms a plurality of first intermediate coordinates into a first intermediate point cloud, multiplies each second initial coordinate in the second initial point cloud by a second external parameter to obtain a second intermediate coordinate corresponding to each second initial coordinate, and forms a plurality of second intermediate coordinates into a second intermediate point cloud.
In one embodiment, the first intermediate coordinate is calculated as follows:

Pr_i = T_rc * Pc_i    Formula (3)

where T_rc is the first external parameter, Pc_i is the i-th first initial coordinate, and Pr_i is the i-th first intermediate coordinate.
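Formula (3) can be illustrated with a 2D homogeneous transform (a sketch only; the helper names and the idea of building T_rc from a planar mounting pose are assumptions, not details given in the patent):

```python
import math

def make_extrinsic(tx, ty, yaw):
    """Build a 3x3 homogeneous transform T_rc for a sensor mounted at
    (tx, ty) on the device, rotated by yaw radians."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, tx],
            [s,  c, ty],
            [0.0, 0.0, 1.0]]

def transform_point(T, point):
    """Apply Pr_i = T_rc * Pc_i to a 2D point (homogeneous coordinate 1)."""
    x, y = point
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])
```

Applying `transform_point` to every coordinate in an initial point cloud yields the corresponding intermediate point cloud in the target coordinate system.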
Step 404, obtaining the size of the grid map and the resolution corresponding to the grid map.
The grid map is a map centered on the center of the device with a preset size; for example, a grid map one meter long and one meter wide centered on the center of the device. The size refers to the length and width of the grid map, and may be expressed as length and width values. The resolution refers to the ratio between the actual map (the map used during operation of the device) and the grid map.
Illustratively, the device obtains the size and corresponding resolution of the grid map.
Step 406, based on the size and the resolution, projecting the first intermediate point cloud into the grid map to obtain a first grid point cloud, and projecting the second intermediate point cloud into the grid map to obtain a second grid point cloud.
The device translates each first intermediate coordinate in the first intermediate point cloud based on the size to obtain a first translated point cloud, and obtains a first grid point cloud based on the first translated point cloud and the resolution; likewise, the device translates each second intermediate coordinate in the second intermediate point cloud based on the size to obtain a second translated point cloud, and obtains a second grid point cloud based on the second translated point cloud and the resolution.
In one embodiment, the device translates each first intermediate coordinate in the first intermediate point cloud based on the size to obtain a first translated point cloud, divides each first translated coordinate in the first translated point cloud by the resolution to obtain a first grid point cloud, translates each second intermediate coordinate in the second intermediate point cloud based on the size to obtain a second translated point cloud, and divides each second translated coordinate in the second translated point cloud by the resolution to obtain a second grid point cloud.
In one embodiment, the first grid coordinates are calculated as follows:
Ps_xi = (Pr_xi - size_x/2) // resolution    Formula (4)
Ps_yi = (Pr_yi - size_y/2) // resolution    Formula (5)
wherein Ps_xi and Ps_yi are the X-axis and Y-axis coordinates of the i-th first grid coordinate, Pr_xi and Pr_yi are the X-axis and Y-axis coordinates of the i-th first intermediate coordinate, size_x is the length of the grid map, size_y is the width of the grid map, resolution is the resolution corresponding to the grid map, and // is the integer division symbol.
Step 408, comparing the second coordinates in the second grid point cloud with the first coordinates in the first grid point cloud, and determining whether a coincidence region exists between the first initial point cloud and the second initial point cloud according to the comparison result.
If the first initial point cloud and the second initial point cloud have no coincident region, that is, no first coordinate is identical to any second coordinate, the obstacle is determined to be a first type of obstacle, wherein the first type of obstacle is a laser invisible obstacle.
If the first initial point cloud and the second initial point cloud have a coincident region, that is, a first coordinate identical to a second coordinate exists, the obstacle is determined to be a second type of obstacle, wherein the second type of obstacle is a laser visible obstacle.
The device obtains a current second coordinate in the second grid point cloud, compares the current second coordinate with a first coordinate in the first grid point cloud, and if the first coordinate identical to the current second coordinate exists, determines that the obstacle is a second type obstacle, and stops comparison; if the first coordinate which is the same as the current second coordinate does not exist, the next second coordinate is obtained as the current second coordinate, the steps are repeated until the last second coordinate in the second grid point cloud is reached, and the obstacle is determined to be the first type of obstacle.
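The early-stopping comparison described above can be sketched with set lookups over grid cells. The names are hypothetical; "laser-visible" and "laser-invisible" correspond to the second and first obstacle types:

```python
def classify_obstacle(lidar_cells, ultra_cells):
    """Compare each second (ultrasonic) grid coordinate against the first
    (lidar/depth) grid coordinates. Stop at the first match; if the last
    second coordinate is reached without a match, the obstacle is of the
    first type (laser invisible)."""
    for cell in ultra_cells:
        if cell in lidar_cells:
            return "laser-visible"   # coincident region found: second type
    return "laser-invisible"         # no coincident region: first type

print(classify_obstacle({(1, 2), (3, 4)}, {(3, 4)}))  # laser-visible
print(classify_obstacle({(1, 2)}, {(5, 6)}))          # laser-invisible
```

Storing the grid point clouds as sets makes each membership check O(1), so the cost grows only with the (already reduced) number of second coordinates.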
In this embodiment, the first initial point cloud and the second initial point cloud are converted through the external parameters corresponding to their sensors to obtain the corresponding first intermediate point cloud and second intermediate point cloud, which lie in the same target coordinate system and are therefore comparable. The first intermediate point cloud and the second intermediate point cloud are then projected into the grid map, where each intermediate coordinate is divided by the resolution and only the integer part of the result is kept. Because several intermediate coordinates can yield the same integer result, the number of first coordinates in the first grid point cloud is smaller than the number of first intermediate coordinates in the first intermediate point cloud, and likewise the number of second coordinates in the second grid point cloud is smaller than the number of second intermediate coordinates in the second intermediate point cloud. The number of comparisons is therefore reduced, and the judging efficiency is improved.
In one embodiment, the obstacle judging method further includes:
acquiring an error threshold of the target map and the second sensor; the target map is a map used in the running process of the device; under the condition that the obstacle is determined to be the first type of obstacle, if the interval distance is smaller than the error threshold, updating the target map based on the second initial point cloud to obtain an updated target map; and under the condition that the obstacle is determined to be the second type of obstacle, updating the target map based on the first initial point cloud to obtain an updated target map.
The error threshold refers to the minimum distance beyond which the second sensor may produce false detections; detections closer than the error threshold are considered accurate. The error threshold may be set based on the characteristics of the second sensor. For example, if ultrasonic sensor A detects obstacles within 0.4 meter accurately but may falsely detect obstacles beyond 0.4 meter, the error threshold of ultrasonic sensor A may be set to 0.4 meter.
The device acquires a target map used in the driving process and an error threshold of the second sensor, compares the interval distance with the error threshold when the obstacle is determined to be the first obstacle, updates the target map based on the second initial point cloud if the interval distance is smaller than the error threshold to obtain an updated target map, and does not update the target map if the interval distance is greater than or equal to the error threshold; and under the condition that the obstacle is determined to be the second type obstacle, updating the target map based on the first initial point cloud to obtain an updated target map.
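The update decision above can be sketched as follows; the 0.4 m default threshold and all names are illustrative assumptions, not the method's fixed values:

```python
def choose_update_source(obstacle_type, separation, error_threshold=0.4):
    """Select which point cloud updates the target map. Laser-visible
    (second type) obstacles use the first initial point cloud; laser-invisible
    (first type) obstacles use the second initial point cloud only when the
    measured separation is inside the sensor's accurate range."""
    if obstacle_type == "laser-visible":
        return "first_initial_point_cloud"   # lidar/depth data is directly measured
    if separation < error_threshold:
        return "second_initial_point_cloud"  # ultrasonic reading is reliable here
    return None                              # unreliable reading: do not update

print(choose_update_source("laser-invisible", 0.3))  # second_initial_point_cloud
```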
In one embodiment, under the condition that the obstacle is determined to be a first obstacle, the device acquires an error threshold value of the target map and the second sensor, compares the interval distance with the error threshold value, judges whether the detection direction of the second sensor is the same as the running direction of the device if the interval distance is smaller than the error threshold value, and updates the target map based on a second initial point cloud if the detection direction of the second sensor is the same as the running direction of the device, so as to obtain an updated target map; if the detection direction of the second sensor is different from the running direction of the equipment or the interval distance is greater than or equal to the error threshold value, the target map is not updated.
In one embodiment, under the condition that the obstacle is determined to be a first type of obstacle, the device acquires an error threshold of the target map and the second sensor and compares the interval distance with the error threshold. If the interval distance is smaller than the error threshold, the device acquires a second external parameter of the second sensor and converts the second initial point cloud into a second intermediate point cloud in the target coordinate system based on the second external parameter, the target coordinate system being a coordinate system with the center of the device as the origin. The device then acquires a conversion relation between the target coordinate system and the target map, converts the second intermediate point cloud based on the conversion relation to obtain a second target point cloud, and adds the second target point cloud to the target map to obtain an updated target map.
In one embodiment, under the condition that the obstacle is determined to be a first obstacle, the device acquires an error threshold of the target map and the second sensor, compares the interval distance with the error threshold, acquires a second external parameter of the second sensor if the interval distance is smaller than the error threshold, converts the second initial point cloud into a second intermediate point cloud in a target coordinate system based on the second external parameter, the target coordinate system is a coordinate system taking the center of the device as an origin, then acquires the size and the corresponding resolution of the grid map and the conversion relation between the grid map and the target map, projects the second intermediate point cloud into the grid map based on the size and the resolution to obtain a second grid point cloud, obtains a second target point cloud based on the second grid point cloud and the conversion relation, and newly adds the second target point cloud into the target map to obtain an updated target map.
In one embodiment, under the condition that the obstacle is determined to be a first obstacle, the device acquires an error threshold of the target map and the second sensor, compares the interval distance with the error threshold, acquires a second grid point cloud corresponding to the second initial point cloud and a conversion relation between the grid map and the target map if the interval distance is smaller than the error threshold, converts the second grid point cloud based on the conversion relation to obtain a second target point cloud, and newly adds the second target point cloud in the target map to obtain an updated target map.
In one embodiment, after the device obtains the updated target map, generating an obstacle avoidance path corresponding to the obstacle based on the updated target map, where the obstacle avoidance path is used for avoiding the obstacle by the device.
In this embodiment, when the obstacle is determined to be a first type of obstacle and the interval distance is smaller than the error threshold, the target map is updated based on the second initial point cloud. An interval distance smaller than the error threshold not only indicates that the measurement is accurate, but also means that the arc length corresponding to the field angle is small, so the position range of the obstacle in the target map is small. Running on the updated target map, the device can avoid both a large-scale detour and a collision with the first type of obstacle, thereby improving its operation efficiency. When the obstacle is determined to be a second type of obstacle, the target map is updated based on the first initial point cloud to obtain an updated target map: the first initial point cloud is data directly acquired by the first sensor, whereas the second initial point cloud is calculated from the interval distance, so updating with the first initial point cloud gives a more accurate obstacle position.
In one embodiment, updating the target map based on the second initial point cloud, after obtaining the updated target map, further includes:
and clearing obstacles corresponding to the second initial point cloud in the updated target map based on the preset clearing time length to obtain the cleared target map.
The preset clearing duration is a preset length of time. Taking the moment when the updated target map is obtained as the starting moment, adding the preset clearing duration to the starting moment yields the clearing moment, at which the device clears the obstacle from the updated target map. The preset clearing duration may be determined according to the operation capability of the device, or according to the operation speed of the device and the field angle of the second sensor.
Illustratively, once the time since the updated target map was obtained reaches the preset clearing duration, the device clears the obstacle corresponding to the second initial point cloud from the updated target map, obtaining the cleared target map.
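The timed clearing can be sketched as a small holder of transient obstacle cells. The class and its API are hypothetical, and `time.monotonic()` is used only as a default clock; the tests below pass explicit timestamps:

```python
import time

class TransientObstacleLayer:
    """Holds ultrasonic obstacle cells together with the time they were added
    and drops them once the preset clearing duration has elapsed, so stale
    obstacles no longer influence obstacle avoidance."""
    def __init__(self, clear_after_s):
        self.clear_after_s = clear_after_s
        self._cells = {}  # cell -> time it was added to the map

    def add(self, cells, now=None):
        now = time.monotonic() if now is None else now
        for c in cells:
            self._cells[c] = now

    def active(self, now=None):
        """Prune expired cells, then return the cells still considered live."""
        now = time.monotonic() if now is None else now
        self._cells = {c: t for c, t in self._cells.items()
                       if now - t < self.clear_after_s}
        return set(self._cells)
```

With a 5-second clearing duration, a cell added at t=0 is still active at t=1 but cleared by t=6, matching the behavior described above.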
In this embodiment, the device generates obstacle avoidance paths according to the obstacles in the target map during the preset clearing duration, and clears the obstacle data from the target map promptly once that duration has elapsed since the updated target map was obtained. This avoids invalid avoidance by the device after the obstacle has been removed, reduces the time the device spends avoiding obstacles, and improves the operation efficiency of the device.
In one embodiment, as shown in fig. 5, in the case where the obstacle is determined to be a first type of obstacle, the method further includes:
step 502, determining an obstacle avoidance mode corresponding to the obstacle based on the interval distance.
The obstacle avoidance mode refers to a mode of avoiding an obstacle. Obstacle avoidance modes include, but are not limited to, deceleration and detour.
Illustratively, the apparatus determines an obstacle avoidance mode corresponding to the obstacle according to the separation distance.
In one embodiment, the device obtains a first threshold and a second threshold, the second threshold being smaller than the first threshold, and compares the interval distance with the two thresholds. If the interval distance is larger than the first threshold, the device keeps running in its current state. If the interval distance is smaller than or equal to the first threshold and larger than or equal to the second threshold, the detection direction of the second sensor is obtained; if the detection direction is the same as the running direction of the device, deceleration is used as the obstacle avoidance mode. If the interval distance is smaller than the second threshold, changing the running path is used as the obstacle avoidance mode.
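The two-threshold selection above can be sketched as follows. The names are hypothetical, and the behavior in the middle band when the detection direction differs from the running direction is assumed to be keeping the current state, which the embodiment leaves unspecified:

```python
def avoidance_action(distance, first_threshold, second_threshold,
                     facing_travel_direction):
    """Pick an avoidance mode from the separation distance, assuming
    second_threshold < first_threshold as in the embodiment above."""
    if distance > first_threshold:
        return "keep-current-state"
    if distance >= second_threshold:
        # Decelerate only if the sensor looks along the travel direction
        return "decelerate" if facing_travel_direction else "keep-current-state"
    return "change-path"

print(avoidance_action(0.8, 1.0, 0.5, True))  # decelerate
```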
In one embodiment, the device determines an obstacle avoidance mode corresponding to the obstacle based on the interval distance, and if the obstacle avoidance mode is a changed running path, updates the target map based on the second initial point cloud, where the target map is a map used in the running process of the device, and obtains an updated target map.
Step 504, if the interval distance meets the preset deceleration distance interval and the detection direction of the second sensor is the same as the running direction of the device, decelerating.
The preset deceleration distance interval refers to a preset distance range; if the interval distance falls within this range, the device decelerates. The detection direction refers to the direction in which the second sensor emits sound waves. For example, the detection direction may be the running direction, the left of the running direction, the right of the running direction, or the reverse of the running direction.
The device acquires a preset deceleration distance interval, compares the interval distance with the preset deceleration distance interval, acquires the detection direction of the second sensor and the running direction of the device if the interval distance is within the preset deceleration distance interval, and decelerates the device if the detection direction is the same as the running direction of the device.
In one embodiment, during deceleration the device continuously acquires an updated interval distance between the device and the obstacle through the second sensor. If the updated interval distance still falls within the preset deceleration distance interval, the device continues decelerating or runs at a set speed; if the updated interval distance is smaller than the error threshold of the second sensor, the device updates the target map based on the updated interval distance and the field angle of the second sensor to obtain an updated target map, generates an obstacle avoidance path based on the updated target map, and avoids the obstacle based on the obstacle avoidance path.
Step 506, if the interval distance is smaller than the error threshold of the second sensor, generating an obstacle avoidance path corresponding to the obstacle based on the interval distance, and avoiding the obstacle based on the obstacle avoidance path.
The obstacle avoidance path is a path for avoiding an obstacle.
The device compares the interval distance with an error threshold value of the second sensor, generates an obstacle avoidance path corresponding to the obstacle based on the interval distance if the interval distance is smaller than the error threshold value of the second sensor, and performs obstacle avoidance based on the obstacle avoidance path.
In one embodiment, the device compares the separation distance with an error threshold of the second sensor, if the separation distance is smaller than the error threshold of the second sensor, a conversion relationship between the target map and the grid map is obtained, based on the conversion relationship, a second grid point cloud obtained based on the separation distance is converted to obtain a second target point cloud, the target map is updated based on the second target point cloud to obtain an updated target map, based on the updated target map, an obstacle avoidance path corresponding to the obstacle is generated, and based on the obstacle avoidance path, the obstacle is avoided.
In this embodiment, the device selects a suitable obstacle avoidance mode according to the interval distance and avoids the obstacle in that mode, improving its obstacle avoidance efficiency and thus its operation efficiency. When the interval distance falls within the preset deceleration distance interval, the distance between the device and the obstacle is still relatively large and the device does not yet need to detour; detouring at this point would mean a relatively long detour, so decelerating instead avoids a collision with the first type of obstacle while improving the operation efficiency of the device. When the interval distance is smaller than the error threshold, the distance measurement is accurate, the position range of the obstacle is small, and the device is close enough to the obstacle that a collision is likely; generating an obstacle avoidance path based on the interval distance and avoiding the obstacle along that path prevents both a large-scale detour and a collision with the first type of obstacle, thereby improving the operation efficiency of the device.
In one embodiment, generating an obstacle avoidance path corresponding to an obstacle based on the separation distance comprises:
acquiring a reserved distance, and determining a detour distance corresponding to the obstacle based on the interval distance and the reserved distance; and generating an obstacle avoidance path corresponding to the obstacle based on the detour distance.
The reserved distance refers to a preset fixed distance. It may be determined according to the width or thickness of the first type of obstacle: first-type obstacles transmit light waves, and obstacles that transmit light waves are usually thin, so the reserved distance can represent the width or thickness of a first-type obstacle. The detour distance refers to the distance the device moves to the other side of the obstacle in order to avoid it.
The device acquires the reserved distance, adds the reserved distance to the interval distance to obtain a detour distance corresponding to the obstacle, and then generates an obstacle avoidance path corresponding to the obstacle based on the detour distance.
In one embodiment, the device acquires the reserved distance, adds the reserved distance to the interval distance to obtain a detour distance corresponding to the obstacle, determines an obstacle point cloud based on the field angle and the detour distance of the second sensor, and generates an obstacle avoidance path corresponding to the obstacle based on the obstacle point cloud.
In this embodiment, an interval distance smaller than the error threshold of the second sensor indicates that the distance between the device and the obstacle is small, so the device needs to move to the other side of the obstacle to bypass it. The reserved distance is added to the interval distance to obtain the detour distance corresponding to the obstacle, and an obstacle avoidance path is generated according to the detour distance. Moving along this path, the device avoids colliding with the obstacle, thereby improving its operation efficiency.
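The detour computation reduces to adding the reserved margin to the measured separation; a minimal sketch with an illustrative call (the 0.1 m margin is an assumption, not a value from the method):

```python
def detour_distance(separation, reserved):
    """Detour distance = interval distance + reserved distance, where the
    reserved margin stands in for the width/thickness of the thin,
    light-transmitting first-type obstacle."""
    return separation + reserved

d = detour_distance(0.3, 0.1)  # device must clear roughly 0.4 m to pass the obstacle
```

Keeping the margin as a separate configurable value lets it be tuned per deployment, e.g. larger for thick glass panels than for wire mesh.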
In one exemplary embodiment, the sensor layout of the robot is shown in fig. 6, where 602 is a robot equipped with five ultrasonic sensors, one laser radar sensor, and one depth camera sensor. The five ultrasonic sensors are ultrasonic sensors No. 1 to No. 5, whose detection areas are 604, 606, 608, 610, and 612 respectively; the field angle of each ultrasonic sensor is 60 degrees, the detection range of the depth camera sensor is 614, and the detection range of the laser radar sensor is 616.
The method comprises the steps of obtaining laser radar point clouds collected by a laser radar sensor at the same moment and depth point clouds collected by a depth camera sensor, obtaining laser radar external parameters of the laser radar sensor and depth camera external parameters of the depth camera sensor from a structural design diagram of a robot, converting the laser radar point clouds into a robot coordinate system by using a formula (3) to obtain middle laser radar point clouds, and converting the depth point clouds into the robot coordinate system by using the formula (3) to obtain middle depth point clouds. Obtaining the length and the width of the grid map and the resolution corresponding to the grid map, projecting the intermediate laser radar point cloud into the grid map by using formulas (4) and (5) to obtain the grid laser radar point cloud, and projecting the intermediate depth point cloud into the grid map by using formulas (4) and (5) to obtain the grid depth point cloud.
A spacing distance d between the robot and the obstacle, acquired by an ultrasonic sensor at the same moment, is obtained together with the field angle of the ultrasonic sensor. The field angle is 60 degrees and the preset dividing angle is 5 degrees, so the offset angles obtained are -30, -25, -20, -15, -10, -5, 0, 5, 10, 15, 20, 25, and 30 degrees. The point coordinates of the obstacle point corresponding to each offset angle are calculated using formulas (1) and (2), and the plurality of point coordinates form an ultrasonic point cloud. The ultrasonic external parameters of the ultrasonic sensor are obtained from the structural design drawing of the robot, and the ultrasonic point cloud is converted into the robot coordinate system using formula (3) to obtain an intermediate ultrasonic point cloud. Based on the length and width of the grid map and the resolution corresponding to the grid map, the intermediate ultrasonic point cloud is projected into the grid map using formulas (4) and (5) to obtain a grid ultrasonic point cloud.
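The fan-out of a single ultrasonic range into an ultrasonic point cloud can be sketched as follows. Formulas (1) and (2) are not reproduced in this section, so the per-point formula x = d·cos(θ), y = d·sin(θ) in the sensor frame is an assumption standing in for them:

```python
import math

def ultrasonic_point_cloud(d, fov_deg=60.0, step_deg=5.0):
    """Spread the single ultrasonic range d across the field of view.
    Offset angles run from -fov/2 to +fov/2 in step_deg increments
    (-30 to +30 degrees in 5-degree steps here), giving one obstacle
    point per offset angle."""
    half = fov_deg / 2.0
    n = int(round(fov_deg / step_deg))
    angles = [math.radians(-half + i * step_deg) for i in range(n + 1)]
    return [(d * math.cos(a), d * math.sin(a)) for a in angles]

pts = ultrasonic_point_cloud(0.5)
print(len(pts))  # 13 obstacle points, one per offset angle
```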
Comparing the grid ultrasonic point cloud with the grid laser radar point cloud and the grid depth point cloud, if at least one point cloud of the grid laser radar point cloud and the grid depth point cloud has the same point coordinates as the grid ultrasonic point cloud, determining that the obstacle detected by the ultrasonic sensor is a laser visible obstacle, as shown in fig. 7, wherein 702 is a robot, 704 is the grid laser radar point cloud or the grid depth point cloud, 706 is the grid ultrasonic point cloud, and 704 and 706 have overlapping areas, and determining that the obstacle detected by the ultrasonic sensor is the laser visible obstacle. And if the grid laser radar point cloud and the grid depth point cloud do not have the same point coordinates as the grid ultrasonic point cloud, determining that the obstacle detected by the ultrasonic sensor is a laser invisible obstacle.
When the obstacle detected by the ultrasonic sensor is determined to be the laser visible obstacle, discarding the grid ultrasonic point cloud, updating the target map based on the grid laser radar point cloud and the grid depth point cloud to obtain an updated target map, generating an obstacle avoidance path based on the updated target map, and enabling the robot to run along the obstacle avoidance path to avoid the laser visible obstacle.
When the obstacle detected by the ultrasonic sensor is determined to be a laser invisible obstacle, a preset deceleration interval and an error threshold are acquired. If the interval distance falls within the preset deceleration interval and the obstacle is detected by the No. 2 or No. 3 ultrasonic sensor, a laser invisible obstacle exists in front of the robot, and the robot decelerates. If the interval distance is smaller than the error threshold, the target map is updated based on the grid ultrasonic point cloud to obtain an updated target map, an obstacle avoidance path is generated based on the updated target map, and the robot runs along the obstacle avoidance path to avoid the laser invisible obstacle.
According to the obstacle judging method, the ultrasonic point cloud, the laser radar point cloud, and the depth point cloud at the same moment are converted into the grid map to obtain the grid ultrasonic point cloud, the grid laser radar point cloud, and the grid depth point cloud. Placing the three point clouds in the same map makes them comparable, and the integer-division calculation reduces the number of point coordinates in each grid point cloud, which reduces the number of comparisons between the grid ultrasonic point cloud and the grid laser radar and grid depth point clouds and improves the efficiency of determining the type of obstacle. If the obstacle detected by the ultrasonic sensor is a laser invisible obstacle, a corresponding obstacle avoidance mode is determined according to the distance between the robot and the laser invisible obstacle, which prevents the robot from colliding with the laser invisible obstacle while avoiding a large-scale detour, thereby improving the running efficiency of the robot.
It should be understood that, although the steps in the flowcharts related to the embodiments described above are shown sequentially as indicated by arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the steps are not strictly limited to that order and may be performed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and are not necessarily performed sequentially but may be performed in turn or alternately with at least some of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides an obstacle judging device for realizing the obstacle judging method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in the embodiments of the one or more obstacle determination devices provided below may refer to the limitation of the obstacle determination method hereinabove, and will not be described herein.
In one embodiment, as shown in fig. 8, there is provided an obstacle judging device including: an acquisition module 802, a division module 804, a judgment module 806, and a determination module 808, wherein:
an acquisition module 802, configured to acquire a first initial point cloud acquired by a first sensor, and a separation distance between the first sensor and an obstacle acquired by a second sensor;
the dividing module 804 is configured to obtain a field angle of the second sensor, and obtain a second initial point cloud based on the field angle and the separation distance;
a judging module 806, configured to determine whether a coincidence region exists between the first initial point cloud and the second initial point cloud;
a determining module 808, configured to determine that the obstacle is a first type of obstacle if the first initial point cloud and the second initial point cloud do not have a coincident region; and if the first initial point cloud and the second initial point cloud have the overlapping area, judging that the obstacle is a second type obstacle.
In one embodiment, the partitioning module 804 is further configured to: divide the field angle based on a preset dividing angle to obtain a plurality of offset angles; calculate, based on the interval distance and each offset angle, the initial coordinates of the obstacle point corresponding to that offset angle, the plurality of obstacle points being used for characterizing the obstacle; and obtain the second initial point cloud based on the initial coordinates of the obstacle points corresponding to the plurality of offset angles.
In one embodiment, the determining module 806 is further configured to: converting the first initial point cloud into a first intermediate point cloud in the target coordinate system based on a first external parameter of the first sensor, and converting the second initial point cloud into a second intermediate point cloud in the target coordinate system based on a second external parameter of the second sensor; acquiring the size of a grid map and the resolution corresponding to the grid map; based on the size and the resolution, projecting the first intermediate point cloud into a grid map to obtain a first grid point cloud, and projecting the second intermediate point cloud into the grid map to obtain a second grid point cloud; and comparing the second coordinates in the second grid point cloud with the first coordinates in the first grid point cloud, and determining whether a superposition area exists between the first initial point cloud and the second initial point cloud according to the comparison result.
In one embodiment, the determining module 808 is further configured such that: determining that the obstacle is a first type of obstacle if the first initial point cloud and the second initial point cloud do not have an overlapping region includes: if no first coordinate identical to any second coordinate exists, determining that the obstacle is a first type of obstacle, the first type of obstacle being a laser-invisible obstacle; and determining that the obstacle is a second type of obstacle if the first initial point cloud and the second initial point cloud have an overlapping region includes: if a first coordinate identical to a second coordinate exists, determining that the obstacle is a second type of obstacle, the second type of obstacle being a laser-visible obstacle.
In one embodiment, the obstacle judging device further includes an updating module configured to: acquire a target map and an error threshold of the second sensor, the target map being a map used during the running of the equipment; when the obstacle is determined to be a first type of obstacle, update the target map based on the second initial point cloud to obtain an updated target map if the interval distance is smaller than the error threshold; and when the obstacle is determined to be a second type of obstacle, update the target map based on the first initial point cloud to obtain an updated target map.
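The updating module's branching can be sketched as follows, with the target map simplified to a plain list of occupied points; the function and parameter names are illustrative assumptions, not the application's API:

```python
def update_map(target_map, obstacle_type, interval_distance,
               error_threshold, first_cloud, second_cloud):
    """Update the target map (modelled here as a plain list of occupied
    points) according to the obstacle type: for a first-type
    (laser-invisible) obstacle, only trust the ultrasonic cloud once the
    interval distance is inside the sensor's error threshold; for a
    second-type (laser-visible) obstacle, use the first sensor's cloud."""
    if obstacle_type == "first":
        if interval_distance < error_threshold:
            target_map.extend(second_cloud)
    else:
        target_map.extend(first_cloud)
    return target_map
```

Gating the ultrasonic update on the error threshold reflects that ultrasonic range readings are only reliable enough to write into the map at close range.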
In one embodiment, the updating module is further configured to: clear, based on a preset clearing duration, the obstacle corresponding to the second initial point cloud from the updated target map to obtain a cleared target map.
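The timed clearing step can be sketched as follows, with ultrasonic obstacle points stored alongside their insertion timestamps; the dictionary representation and the names are illustrative assumptions:

```python
import time

def clear_expired(obstacles, clear_after_s, now=None):
    """Drop obstacle points whose insertion timestamp is older than the
    preset clearing duration; `obstacles` maps point -> insertion time."""
    if now is None:
        now = time.monotonic()
    return {p: t for p, t in obstacles.items() if now - t < clear_after_s}
```

Expiring ultrasonic points after a fixed duration prevents stale detections of a moved obstacle from permanently blocking cells in the target map.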
In one embodiment, the obstacle judging device further includes an obstacle avoidance module configured to: determine an obstacle avoidance mode corresponding to the obstacle based on the interval distance; control the equipment to decelerate if the interval distance falls within a preset deceleration distance interval and the detection direction of the second sensor is the same as the running direction of the equipment; and if the interval distance is smaller than the error threshold of the second sensor, generate an obstacle avoidance path corresponding to the obstacle based on the interval distance and avoid the obstacle based on the obstacle avoidance path.
In one embodiment, the obstacle avoidance module is further configured to: acquire a reserved distance and determine a detour distance corresponding to the obstacle based on the interval distance and the reserved distance; and generate an obstacle avoidance path corresponding to the obstacle based on the detour distance.
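The obstacle avoidance selection described in the two embodiments above can be sketched as follows; the action labels and parameter names are illustrative assumptions:

```python
def plan_avoidance(interval_distance, decel_interval, error_threshold,
                   same_direction, reserved_distance):
    """Pick an obstacle avoidance mode from the interval distance: slow
    down inside the deceleration distance interval when the second
    sensor's detection direction matches the running direction, or plan
    a detour once the distance is below the sensor's error threshold."""
    low, high = decel_interval
    if low <= interval_distance <= high and same_direction:
        return ("decelerate", None)
    if interval_distance < error_threshold:
        # Detour distance = measured gap plus a reserved safety margin.
        return ("detour", interval_distance + reserved_distance)
    return ("continue", None)
```

Adding the reserved distance on top of the measured gap gives the generated detour a safety margin against ranging error.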
Each module in the above obstacle judging device may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor in the computer device in hardware form, or stored in a memory in the computer device in software form, so that the processor can invoke them and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be device 102, the internal structure of which may be as shown in FIG. 9. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, the memory, and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used to exchange information between the processor and an external device. The communication interface of the computer device is used for wired or wireless communication with an external terminal, where the wireless mode may be implemented through Wi-Fi, a mobile cellular network, NFC (Near Field Communication), or other technologies. The computer program, when executed by the processor, implements an obstacle judging method.
It will be appreciated by persons skilled in the art that the architecture shown in fig. 9 is merely a block diagram of some of the architecture relevant to the present inventive arrangements and is not limiting as to the computer device to which the present inventive arrangements are applicable, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
The user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that all or part of the processes of the methods described above may be implemented by a computer program, stored on a non-volatile computer-readable storage medium, which when executed may perform the processes of the method embodiments described above. Any reference to the memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. The volatile memory may include Random Access Memory (RAM), an external cache memory, or the like. By way of illustration and not limitation, the RAM is available in many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database and the like. The processor referred to in the embodiments provided herein may be, but is not limited to, a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, or a data processing logic unit based on quantum computing.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, any such combination shall be considered within the scope of this specification as long as it contains no contradiction.
The foregoing embodiments represent only a few implementations of the application and are described in relative detail, but they are not to be construed as limiting the scope of the application. It should be noted that several variations and modifications may be made by those of ordinary skill in the art without departing from the concept of the application, all of which fall within the protection scope of the application. Accordingly, the protection scope of the application shall be subject to the appended claims.

Claims (10)

1. A method of determining an obstacle, the method comprising:
acquiring a first initial point cloud acquired by a first sensor and a spacing distance between equipment acquired by a second sensor and an obstacle at the same moment;
acquiring a field angle of the second sensor, and obtaining a second initial point cloud based on the field angle and the interval distance;
determining whether an overlapping region exists between the first initial point cloud and the second initial point cloud;
if the first initial point cloud and the second initial point cloud do not have an overlapping region, determining that the obstacle is a first type of obstacle; and
if the first initial point cloud and the second initial point cloud have an overlapping region, determining that the obstacle is a second type of obstacle.
2. The method of claim 1, wherein the second sensor is an ultrasonic sensor, and wherein obtaining the field angle of the second sensor and obtaining the second initial point cloud based on the field angle and the interval distance comprises:
dividing the view angle based on a preset dividing angle to obtain a plurality of offset angles;
calculating initial coordinates of an obstacle point corresponding to each offset angle based on the interval distance and the offset angle, wherein a plurality of the obstacle points are used to characterize the obstacle;
and obtaining a second initial point cloud based on the initial coordinates of the obstacle points corresponding to the plurality of offset angles.
3. The method of claim 1, wherein the first sensor is a lidar and/or a depth camera, and wherein the determining whether the first initial point cloud and the second initial point cloud have a region of coincidence comprises:
Converting the first initial point cloud into a first intermediate point cloud in a target coordinate system based on a first external parameter of the first sensor, and converting the second initial point cloud into a second intermediate point cloud in the target coordinate system based on a second external parameter of the second sensor;
acquiring the size of a grid map and the resolution corresponding to the grid map;
based on the size and the resolution, projecting the first intermediate point cloud into the grid map to obtain a first grid point cloud, and projecting the second intermediate point cloud into the grid map to obtain a second grid point cloud;
comparing the second coordinates in the second grid point cloud with the first coordinates in the first grid point cloud, and determining whether a superposition area exists between the first initial point cloud and the second initial point cloud according to a comparison result;
wherein, if the first initial point cloud and the second initial point cloud do not have an overlapping region, determining that the obstacle is a first type of obstacle comprises:
if the first coordinate which is the same as the second coordinate does not exist, determining that the obstacle is a first type of obstacle, wherein the first type of obstacle is a laser invisible obstacle;
wherein, if the first initial point cloud and the second initial point cloud have an overlapping region, determining that the obstacle is a second type of obstacle comprises:
and if the first coordinates which are the same as the second coordinates exist, determining that the obstacle is a second type of obstacle, wherein the second type of obstacle is a laser visible obstacle.
4. A method according to any one of claims 1 to 3, further comprising:
acquiring an error threshold of a target map and the second sensor; the target map is a map used in the running process of the equipment;
under the condition that the obstacle is determined to be a first type obstacle, if the interval distance is smaller than the error threshold value, updating the target map based on the second initial point cloud to obtain an updated target map;
and under the condition that the obstacle is determined to be a second type obstacle, updating the target map based on the first initial point cloud to obtain an updated target map.
5. The method of claim 4, wherein, after updating the target map based on the second initial point cloud to obtain the updated target map, the method further comprises:
And based on a preset clearing time, clearing the obstacle corresponding to the second initial point cloud in the updated target map to obtain the cleared target map.
6. The method of claim 1, wherein in the event that the obstacle is determined to be a first type of obstacle, the method further comprises:
determining an obstacle avoidance mode corresponding to the obstacle based on the interval distance;
if the interval distance falls within a preset deceleration distance interval and the detection direction of the second sensor is the same as the running direction of the equipment, controlling the equipment to decelerate;
and if the interval distance is smaller than the error threshold value of the second sensor, generating an obstacle avoidance path corresponding to the obstacle based on the interval distance, and avoiding the obstacle based on the obstacle avoidance path.
7. The method of claim 6, wherein generating an obstacle avoidance path corresponding to the obstacle based on the separation distance comprises:
acquiring a reserved distance, and determining a detour distance corresponding to the obstacle based on the interval distance and the reserved distance;
and generating an obstacle avoidance path corresponding to the obstacle based on the detour distance.
8. An obstacle deciding apparatus, comprising:
the acquisition module is used for acquiring a first initial point cloud acquired by a first sensor and an interval distance, acquired by a second sensor at the same moment, between the equipment and an obstacle;
the dividing module is used for acquiring the field angle of the second sensor and obtaining a second initial point cloud based on the field angle and the interval distance;
the judging module is used for determining whether a coincidence area exists between the first initial point cloud and the second initial point cloud;
the determining module is used for determining that the obstacle is a first type of obstacle if the first initial point cloud and the second initial point cloud do not have an overlapping region, and determining that the obstacle is a second type of obstacle if the first initial point cloud and the second initial point cloud have an overlapping region.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202310750561.2A 2023-06-25 2023-06-25 Obstacle judging method, device, equipment and storage medium Pending CN116804763A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310750561.2A CN116804763A (en) 2023-06-25 2023-06-25 Obstacle judging method, device, equipment and storage medium
PCT/CN2024/084470 WO2025001375A1 (en) 2023-06-25 2024-03-28 Obstacle determination method and apparatus, and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310750561.2A CN116804763A (en) 2023-06-25 2023-06-25 Obstacle judging method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116804763A true CN116804763A (en) 2023-09-26

Family

ID=88080433

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310750561.2A Pending CN116804763A (en) 2023-06-25 2023-06-25 Obstacle judging method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN116804763A (en)
WO (1) WO2025001375A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025001375A1 (en) * 2023-06-25 2025-01-02 深圳市普渡科技有限公司 Obstacle determination method and apparatus, and device and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955266A (en) * 2016-05-11 2016-09-21 深圳乐行天下科技有限公司 Map building device and map building method
CN108227523A (en) * 2017-11-01 2018-06-29 深圳乐动机器人有限公司 robot control method, device, storage medium and computer equipment
WO2021052403A1 (en) * 2019-09-20 2021-03-25 杭州海康机器人技术有限公司 Obstacle information sensing method and device for mobile robot
CN114115210A (en) * 2020-08-25 2022-03-01 莱克电气绿能科技(苏州)有限公司 Obstacle avoidance method, device and obstacle avoidance device for self-moving equipment
CN114474065A (en) * 2022-03-04 2022-05-13 美智纵横科技有限责任公司 Robot control method and device, robot and storage medium
CN114779761A (en) * 2022-03-22 2022-07-22 广东博智林机器人有限公司 Mobile robot fault stopping control method, device, equipment and storage medium
CN116107316A (en) * 2023-03-14 2023-05-12 盐城工学院 Sweeping robot path planning and control system
CN116310773A (en) * 2021-12-08 2023-06-23 深圳市普渡科技有限公司 Robot, obstacle detection method and related device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2910245B2 (en) * 1990-12-11 1999-06-23 株式会社豊田自動織機製作所 Driverless vehicle safety devices
CN109814112A (en) * 2019-01-15 2019-05-28 北京百度网讯科技有限公司 A kind of ultrasonic radar and laser radar information fusion method and system
CN110007680B (en) * 2019-04-23 2021-11-23 吉林农业大学 Robot obstacle avoidance algorithm based on topological relation
CN111308491A (en) * 2020-03-09 2020-06-19 中振同辂(江苏)机器人有限公司 Obstacle sensing method based on multi-sensor combination
CN111272183A (en) * 2020-03-16 2020-06-12 达闼科技成都有限公司 Map creating method and device, electronic equipment and storage medium
CN113111905B (en) * 2021-02-25 2022-12-16 上海水齐机器人有限公司 Obstacle detection method integrating multiline laser radar and ultrasonic data
CN113665500B (en) * 2021-09-03 2022-07-19 南昌智能新能源汽车研究院 All-weather-operation environment sensing system and method for unmanned transport vehicle
CN116804763A (en) * 2023-06-25 2023-09-26 深圳市普渡科技有限公司 Obstacle judging method, device, equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Di Huijun et al.: "Target Detection and Motion Tracking for Unmanned Vehicles" (无人驾驶车辆目标检测与运动跟踪), Beijing Institute of Technology Press, 30 April 2021, pages 143-146 *


Also Published As

Publication number Publication date
WO2025001375A1 (en) 2025-01-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination