
CN110587597B - SLAM closed loop detection method and detection system based on laser radar - Google Patents


Info

Publication number
CN110587597B
CN110587597B (application CN201910707655.5A)
Authority
CN
China
Prior art keywords
closed
loop detection
module
detection result
loop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910707655.5A
Other languages
Chinese (zh)
Other versions
CN110587597A (en)
Inventor
闫瑞君
任娟娟
张国栋
叶力荣
孙振坤
Current Assignee
Shenzhen Silver Star Intelligent Group Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Technology Co Ltd filed Critical Shenzhen Silver Star Intelligent Technology Co Ltd
Priority to CN201910707655.5A
Priority to PCT/CN2019/102850 (WO2021017072A1)
Publication of CN110587597A
Application granted
Publication of CN110587597B

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664: Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05: Geographic models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application is applicable to the field of cleaning equipment and provides a lidar-based SLAM closed-loop detection method and detection system. The method includes: performing closed-loop detection through a visual closed-loop detection module, based on camera data acquired by a camera, to obtain a first closed-loop detection result; on the basis of the first closed-loop detection result, sending closed-loop detection information to a laser simultaneous localization and mapping (SLAM) module through the visual closed-loop detection module; and, when the laser SLAM module acquires the closed-loop detection information, executing a map optimization operation. This improves the accuracy of environment-map construction and subsequent path planning.

Description

SLAM closed loop detection method and detection system based on laser radar
Technical Field
The application belongs to the field of cleaning equipment, and particularly relates to a SLAM closed-loop detection method and a detection system based on a laser radar.
Background
With progress in signal processing, artificial intelligence, and mechanical manufacturing technologies, demand is growing for robots with environmental awareness, autonomous navigation, and human-computer interaction. An autonomous robot must perceive, judge, and adapt to its environment well enough to perform the tasks people wish it to perform. Autonomy here means that the robot completes tasks by its own decisions in an unknown environment. In such an environment, SLAM (simultaneous localization and mapping) technology is usually adopted to implement environment detection and path navigation.
Closed-loop detection, also called loop-closure detection, refers to recognizing that the robot has returned to a previously visited position, forming a loop. It is a key component of SLAM because it reduces the cumulative error incurred while building an environment map.
Closed-loop detection is also one of the difficulties in SLAM. If detection succeeds, the accumulated error can be reduced significantly, helping the robot carry out obstacle-avoidance and navigation work more accurately and quickly; a wrong detection result, however, can deform the map. For lidar-based SLAM in particular, closed-loop detection has two main problems. First, laser SLAM closed-loop detection lags: by the time the laser SLAM detects the loop and performs map optimization, the local map may already be deformed. Second, in similar-looking environments, or under the accumulated error of a large-scale environment, a non-loop may be mistaken for a closed loop, which can also deform the local map. Either problem degrades the mapping result and adversely affects subsequent path planning.
Disclosure of Invention
In view of this, embodiments of the present application provide a lidar-based SLAM closed-loop detection method and detection system, so as to solve the prior-art problems that laser SLAM closed-loop detection lags, accumulates large errors, and is prone to closed-loop false detection, which degrades environment-map construction and subsequent path planning.
A first aspect of an embodiment of the present application provides a SLAM closed loop detection method based on a laser radar, including:
performing closed-loop detection through a visual closed-loop detection module based on camera data acquired by a camera to obtain a first closed-loop detection result;
on the basis of the first closed-loop detection result, sending closed-loop detection information to a laser synchronous positioning and mapping SLAM module through the visual closed-loop detection module;
and when the laser SLAM module acquires the closed-loop detection information, executing a map optimization operation.
A second aspect of the embodiments of the present application provides a SLAM closed loop detection system based on a laser radar, including:
the visual closed-loop detection module is used for carrying out closed-loop detection on the basis of camera data acquired by the camera to obtain a first closed-loop detection result; on the basis of the first closed-loop detection result, sending closed-loop detection information to a laser synchronous positioning and mapping SLAM module;
and the laser SLAM module is used for executing a map optimization operation when the closed-loop detection information is acquired.
Therefore, in the embodiments of the application, adding the visual closed-loop detection module assists the laser SLAM module: closed loops can be detected in time, a local closed loop can even be confirmed in advance, accumulated error is reduced, and wrong closed loops are filtered out. The mapping of the laser SLAM module thus becomes more timely and accurate, closed-loop detection for the laser SLAM module is realized, and the accuracy of environment-map construction and subsequent path planning is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the embodiments or the prior-art descriptions are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a first flowchart of a SLAM closed-loop detection method based on a laser radar according to an embodiment of the present disclosure;
fig. 2 is a flowchart ii of a SLAM closed-loop detection method based on a laser radar according to an embodiment of the present application;
fig. 3 is a block diagram of a SLAM closed-loop detection system based on a laser radar according to an embodiment of the present disclosure;
fig. 4 is a structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
It should be understood that the sequence numbers of the steps in this embodiment do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not limit the implementation of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, fig. 1 is a first flowchart of a SLAM closed-loop detection method based on a laser radar according to an embodiment of the present application. As shown in fig. 1, a SLAM closed loop detection method based on a laser radar includes the following steps:
step 101, performing closed-loop detection through a visual closed-loop detection module based on camera data acquired by a camera to obtain a first closed-loop detection result.
The camera data are specifically image data acquired by a camera capturing images of the surroundings as the robot moves along its travel path.
The visual closed-loop detection module compares the camera data acquired at different times and judges whether the robot has previously passed through the current position point, thereby realizing closed-loop detection. Specifically, closed-loop detection can be realized by comparing visual information of the current point with that of past travel points, acquired by a monocular, binocular, multi-view, or fisheye camera, and judging whether the robot has already walked through the current position point.
Optionally, the visual closed-loop detection module may be the visual closed-loop detection part of an existing robot's visual SLAM module.
Specifically, as an optional implementation manner, the performing closed-loop detection by the visual closed-loop detection module based on the camera data acquired by the camera to obtain a first closed-loop detection result includes:
matching the first image frame of the current position point with the image frame of the historical position point obtained by recording through a visual closed-loop detection module;
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold value, determining that the first closed-loop detection result is a closed loop; and/or,
if the similarities between the first image frame and both the previous image frame and the next image frame of the second image frame among the image frames of the historical position points are greater than a threshold value, determining that the first closed-loop detection result is a closed loop.
The current position point is specifically a current position point where the robot is located on the travel path.
Whether it is a laser SLAM module or a visual closed loop detection module, closed loops are easily detected incorrectly in similar environments, so that matching of the current frame with the key frames in the key frame library is required to determine whether closed loops really exist.
While travelling, the robot collects several images at each travel point along its path, determines a key image frame from the images collected at that point, and stores the key image frame together with the travel point, forming a database (i.e., a key frame library) that records the image frames of historical position points. The visual closed-loop detection module then matches the first image frame of the current position point against the recorded image frames of the historical position points in this database.
The robot is, for example, a sweeper. In a specific implementation, the historical positions of the robot may be put in one-to-one correspondence with the frame numbers of the key images captured by the camera; each time a key frame is saved, the current position of the sweeper is also recorded, e.g. as position(sweeper, key frame number).
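As a minimal illustration (not taken from the patent itself), the one-to-one correspondence between key frame numbers and robot positions described above can be kept in a simple lookup structure; the class and method names here are hypothetical:

```python
class KeyframeLibrary:
    """Hypothetical sketch: store each key image frame number together with
    the robot position at which it was captured, mirroring the
    position(sweeper, key frame number) record described above."""

    def __init__(self):
        self._frames = []  # list of (frame_number, (x, y)) in capture order

    def add(self, frame_number, position):
        # Record the current position every time a key frame is saved.
        self._frames.append((frame_number, position))

    def position_of(self, frame_number):
        # Look up the historical position corresponding to a key frame.
        for num, pos in self._frames:
            if num == frame_number:
                return pos
        return None

    def neighbors(self, frame_number):
        # Return the frame numbers immediately before and after a key frame,
        # used when comparing against adjacent historical position points.
        nums = [num for num, _ in self._frames]
        i = nums.index(frame_number)
        prev_num = nums[i - 1] if i > 0 else None
        next_num = nums[i + 1] if i + 1 < len(nums) else None
        return prev_num, next_num
```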
In the above step, the first image frame is a key image frame in a plurality of image frames acquired by the robot at the current position point, and the key image frame may be an image frame with optimal definition.
In the process of closed-loop detection and judgment, any of the following checks may be performed:
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold value, the first closed-loop detection result is determined to be a closed loop;
if the similarities between the first image frame and the image frames immediately before and after a second image frame among the image frames of the historical position points are greater than a threshold value, the first closed-loop detection result is determined to be a closed loop;
if the similarities between the first image frame and a second image frame, as well as the frames before and after that second image frame, are all greater than a threshold value, the first closed-loop detection result is determined to be a closed loop.
In each case the image frame at the current position is compared against the key image frames in the key frame library; when the relevant similarity exceeds the threshold, for example 50%, a closed loop is established and a visual closed-loop signal is provided to the laser SLAM module.
Here, the previous and next image frames are the image frames corresponding to the position points immediately before and after the position point of the second image frame, i.e., the frames of the two adjacent position points. Performing the image comparison comprehensively over these frames improves closed-loop detection accuracy.
Specifically, when judging similarity, image feature points may be extracted from the image frames, and it may be checked whether the displacement of the feature points matches the displacement between the position points; when they match, the similarity is determined to be greater than the set threshold.
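A toy sketch of such a similarity test follows. It substitutes a coarse grayscale-histogram intersection for real feature-point extraction and matching, so everything except the 50% threshold mentioned above is an assumption made purely for illustration:

```python
def histogram(pixels, bins=8):
    # Coarse grayscale histogram over 0..255 values, normalized to sum to 1.
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

def similarity(frame_a, frame_b, bins=8):
    # Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint.
    ha, hb = histogram(frame_a, bins), histogram(frame_b, bins)
    return sum(min(a, b) for a, b in zip(ha, hb))

def is_closed_loop(current_frame, key_frame, threshold=0.5):
    # First closed-loop detection result: a closed loop is established when
    # the similarity exceeds the threshold (e.g. 50%, as in the text above).
    return similarity(current_frame, key_frame) > threshold
```

A real implementation would compare feature-point displacements against the displacement between position points, as the paragraph above describes; the histogram here only stands in for that comparison.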
Step 102: on the basis of the first closed-loop detection result, closed-loop detection information is sent to the laser simultaneous localization and mapping (SLAM) module through the visual closed-loop detection module.
The closed-loop detection information is generated based on the first closed-loop detection result. Specifically, the closed-loop detection information may include a first closed-loop detection result.
Closed-loop detection is carried out by the visual closed-loop detection module. When a closed loop is detected, a visual closed-loop signal is sent directly to the laser SLAM module: the detection result is delivered to the laser SLAM module, actively informing it that a closed loop exists on the current path, so the laser SLAM module can use the result directly to execute a map optimization operation.
Step 103: when the laser SLAM module acquires the closed-loop detection information, a map optimization operation is executed.
The laser SLAM module uses the closed-loop detection information of the visual closed-loop detection module directly as its closed-loop detection basis, i.e., as the basis for judging whether the map needs to be updated, and decides accordingly whether to perform the map optimization and updating operation.
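The decision logic this paragraph describes can be sketched as follows; this is a hypothetical illustration, not the patent's actual implementation, and the class and field names are invented:

```python
class LaserSlamModule:
    """Sketch: the laser SLAM module treats visual closed-loop information
    as the basis for deciding whether to run the map optimization step."""

    def __init__(self):
        self.optimizations_run = 0

    def on_closed_loop_info(self, info):
        # info is assumed to be a dict carrying the first closed-loop
        # detection result from the visual closed-loop detection module.
        if info.get("is_closed_loop"):
            self._optimize_map()
            return True
        # No closed loop reported: the map is left unchanged.
        return False

    def _optimize_map(self):
        # Placeholder for the map (pose-graph) optimization and update step.
        self.optimizations_run += 1
```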
In the embodiments of the application, adding the visual closed-loop detection module assists the laser SLAM module: closed loops can be detected in time, a local closed loop can even be confirmed in advance, accumulated error is reduced, and wrong closed loops are filtered out. The mapping of the laser SLAM module thus becomes more timely and accurate, closed-loop detection for the laser SLAM module is realized, and the accuracy of environment-map construction and subsequent path planning is improved.
The embodiment of the application also provides different implementation modes of the SLAM closed loop detection method based on the laser radar.
Referring to fig. 2, fig. 2 is a second flowchart of a SLAM closed-loop detection method based on a laser radar according to an embodiment of the present application. As shown in fig. 2, a SLAM closed loop detection method based on a laser radar includes the following steps:
step 201, performing closed-loop detection through the laser SLAM module to obtain a second closed-loop detection result, and sending a verification request to the visual closed-loop detection module.
Wherein the verification request carries the second closed-loop detection result.
This step is performed before the step of sending, on the basis of the first closed-loop detection result, closed-loop detection information to the laser simultaneous localization and mapping SLAM module through the visual closed-loop detection module.
Before the visual closed-loop detection module sends closed-loop detection information to the laser SLAM module, the laser SLAM module can scan the environment to collect a laser point cloud while the robot moves, perform its own closed-loop detection on the collected point cloud to obtain a detection result, and generate a verification request to the visual closed-loop detection module based on that result. The visual closed-loop detection module then helps verify whether the second closed-loop detection result is correct, so that wrong closed-loop detection results of the laser SLAM module are eliminated and mapping accuracy is further ensured.
When the laser SLAM module detects a closed-loop signal, the signal is sent to the visual closed-loop detection module to confirm whether the map information flagged by the detection really corresponds to a closed loop. If a closed loop exists, a visual closed-loop signal is sent to the laser SLAM module; if not, the current image frame is returned and the traversal comparison of images ends.
Step 202, performing closed-loop detection through a visual closed-loop detection module based on camera data acquired by a camera to obtain a first closed-loop detection result.
This step is implemented in the same way as step 101 in the foregoing embodiment and is not described again here. Steps 201 and 202 may occur in either order.
Step 203, when the visual closed-loop detection module receives the verification request, on the basis of the first closed-loop detection result, sending closed-loop detection information to a laser synchronous positioning and mapping SLAM module through the visual closed-loop detection module.
Further, as an optional implementation manner, on the basis of the first closed-loop detection result, sending, by the visual closed-loop detection module, closed-loop detection information to a laser synchronous positioning and mapping SLAM module, includes:
and if the first closed-loop detection result is consistent with the second closed-loop detection result and the first closed-loop detection result and the second closed-loop detection result are both closed loops, outputting closed-loop detection feedback information passing verification to the laser SLAM module through the visual closed-loop detection module.
When the visual closed-loop detection module verifies the closed-loop detection result of the laser SLAM module, it specifically compares the laser SLAM module's detection result with its own closed-loop detection result to judge whether the two are consistent. If they are consistent and both indicate a closed loop, the verification passes and the corresponding feedback information is output to the laser SLAM module.
Further, the method also comprises the following steps: and if the first closed-loop detection result is inconsistent with the second closed-loop detection result, sending verification information of closed-loop detection errors to the laser SLAM module through the visual closed-loop detection module.
When the visual closed-loop detection module verifies the closed-loop detection result of the laser SLAM module, it compares the laser SLAM module's detection result with its own closed-loop detection result to judge whether they are consistent. If not, the visual closed-loop detection module sends verification information of a closed-loop detection error to the laser SLAM module, and the laser SLAM module performs closed-loop detection again until the results of the two modules are consistent and both indicate a closed loop; alternatively, the laser SLAM module directly takes the closed-loop detection result of the visual closed-loop detection module as authoritative and proceeds on that basis.
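The cross-check between the two detection results can be summarized in one small helper; the function name and return values are illustrative only, not the patent's interface:

```python
def verify_closed_loop(first_result, second_result):
    """Compare the visual (first) and laser (second) closed-loop detection
    results, mirroring the verification rules described above.
    Each argument is True (closed loop) or False (no closed loop)."""
    if first_result == second_result and first_result is True:
        # Consistent and both indicate a closed loop: verification passes.
        return "verified"
    if first_result != second_result:
        # Inconsistent: report a closed-loop detection error so the laser
        # SLAM module re-detects (or defers to the visual result).
        return "error"
    # Consistent but neither indicates a closed loop: nothing to optimize.
    return "no_loop"
```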
Step 204: when the laser SLAM module acquires the closed-loop detection information, a map optimization operation is executed.
This step is the same as the step 103 in the previous embodiment, and is not described here again.
Before the laser SLAM module executes the map optimization operation, the closed-loop detection result must be determined to be a closed loop and must pass the verification of the visual closed-loop detection module. When the results of the laser SLAM module and the visual closed-loop detection module are consistent, the detection result is considered reliable enough, the final result is determined to be a closed loop, and the subsequent map optimization is executed. Assisted in this way by the visual closed-loop detection module, the laser SLAM module rejects closed-loop false detections and improves detection accuracy.
In the embodiments of the application, adding the visual closed-loop detection module assists the laser SLAM module: closed loops can be detected in time, a local closed loop can even be confirmed in advance, accumulated error is reduced, and wrong closed loops are filtered out. The mapping of the laser SLAM module thus becomes more timely and accurate, closed-loop detection for the laser SLAM module is realized, and the accuracy of environment-map construction and subsequent path planning is improved.
Referring to fig. 3, fig. 3 is a structural diagram of a SLAM closed-loop detection system based on a laser radar according to an embodiment of the present application, and for convenience of explanation, only a part related to the embodiment of the present application is shown.
The SLAM closed loop detection system based on the laser radar comprises:
the visual closed-loop detection module is used for carrying out closed-loop detection on the basis of camera data acquired by the camera to obtain a first closed-loop detection result; on the basis of the first closed-loop detection result, sending closed-loop detection information to a laser synchronous positioning and mapping SLAM module;
and the laser SLAM module is used for executing a map optimization operation when the closed-loop detection information is acquired.
Wherein, the laser SLAM module is further configured to:
performing closed-loop detection to obtain a second closed-loop detection result, and sending a verification request to the visual closed-loop detection module, wherein the verification request carries the second closed-loop detection result;
the visual closed-loop detection module is further configured to:
and when the verification request is received, the step of sending closed-loop detection information to a laser synchronous positioning and mapping SLAM module through the visual closed-loop detection module on the basis of the first closed-loop detection result is executed.
Wherein the visual closed-loop detection module is further configured to:
and if the first closed-loop detection result is consistent with the second closed-loop detection result and the first closed-loop detection result and the second closed-loop detection result are both closed loops, outputting closed-loop detection feedback information passing verification to the laser SLAM module.
Wherein the visual closed-loop detection module is further configured to:
and if the first closed loop detection result is inconsistent with the second closed loop detection result, sending verification information of closed loop detection errors to the laser SLAM module.
Wherein the visual closed-loop detection module is further configured to:
matching the first image frame of the current position point with the image frame of the historical position point obtained by recording;
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold value, determining that the first closed-loop detection result is a closed loop; and/or,
and if the similarity between the first image frame and the previous image frame and the similarity between the next image frame and the first image frame of the second image frame in the image frames of the historical position points are both greater than a threshold value, determining that the first closed-loop detection result is a closed loop.
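The frame-matching criterion above can be sketched as follows. The description leaves the similarity metric open (in practice a bag-of-visual-words score or a descriptor match ratio is typical), so the scalar "frames" and the `similarity` callback below are purely illustrative:

```python
def detect_loop(current, history, similarity, threshold=0.8, require_neighbors=False):
    """Return (is_loop, match_index) for the current frame against history.

    With require_neighbors=False, a single history frame exceeding the
    similarity threshold closes the loop. With require_neighbors=True, the
    frames before and after the match must also exceed the threshold, which
    filters out one-off false matches at the cost of stricter acceptance.
    """
    for i, past in enumerate(history):
        if similarity(current, past) <= threshold:
            continue
        if not require_neighbors:
            return True, i
        prev_ok = i > 0 and similarity(current, history[i - 1]) > threshold
        next_ok = i + 1 < len(history) and similarity(current, history[i + 1]) > threshold
        if prev_ok and next_ok:
            return True, i
    return False, None

# Toy similarity on scalar "frames": closer values are more similar.
sim = lambda a, b: 1.0 - abs(a - b)
```

The two modes correspond to the "and/or" in the text: either the single-frame threshold test alone, or the stricter variant that also checks the matched frame's neighbors.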
In the embodiments of the present application, the visual closed-loop detection module is added to assist the laser SLAM module, so that closed loops can be detected in time, local closed loops can even be confirmed in advance, accumulated errors are reduced, and erroneous closed loops are filtered out. The mapping of the laser SLAM module therefore becomes more timely and accurate, closed-loop detection for the laser SLAM module is achieved, and the accuracy of environment-map construction and subsequent path planning is improved.
The lidar-based SLAM closed-loop detection system provided in the embodiments of the present application can implement each process of the embodiments of the lidar-based SLAM closed-loop detection method and achieve the same technical effects; to avoid repetition, details are not repeated here.
Fig. 4 is a structural diagram of a terminal according to an embodiment of the present application. As shown in the figure, the terminal 4 of this embodiment includes: a processor 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processor 40. The terminal may be a robot, for example a sweeping robot or a warehouse goods-handling robot.
The terminal 4 may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will appreciate that Fig. 4 is only an example of the terminal 4 and does not constitute a limitation of the terminal 4, which may include more or fewer components than those shown, combine some components, or have different components; for example, the terminal may also include input/output devices, network access devices, buses, and the like.
The Processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the terminal 4, such as a hard disk or an internal memory of the terminal 4. The memory 41 may also be an external storage device of the terminal 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card (Flash Card) provided on the terminal 4. Further, the memory 41 may include both an internal storage unit and an external storage device of the terminal 4. The memory 41 is used to store the computer program and other programs and data required by the terminal, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. A SLAM closed-loop detection method based on laser radar, characterized by comprising the following steps:
performing closed-loop detection through a visual closed-loop detection module based on camera data acquired by a camera to obtain a first closed-loop detection result;
performing closed-loop detection through a laser simultaneous localization and mapping (SLAM) module to obtain a second closed-loop detection result, and sending a verification request to the visual closed-loop detection module, wherein the verification request carries the second closed-loop detection result;
when the visual closed-loop detection module receives the verification request, on the basis of the first closed-loop detection result, closed-loop detection information is sent to a laser SLAM module through the visual closed-loop detection module;
and when the laser SLAM module acquires the closed-loop detection information, performing a graph optimization operation.
2. The lidar-based SLAM closed-loop detection method of claim 1, wherein sending, by the visual closed-loop detection module, closed-loop detection information to the laser simultaneous localization and mapping (SLAM) module based on the first closed-loop detection result comprises:
if the first closed-loop detection result is consistent with the second closed-loop detection result and both results are closed loops, outputting, through the visual closed-loop detection module, closed-loop detection feedback information that passes verification to the laser SLAM module.
3. The method of claim 2, further comprising:
if the first closed-loop detection result is inconsistent with the second closed-loop detection result, sending, through the visual closed-loop detection module, verification information indicating a closed-loop detection error to the laser SLAM module.
4. The SLAM closed-loop detection method based on lidar of claim 1, wherein performing closed-loop detection through a visual closed-loop detection module based on the camera data acquired by the camera to obtain a first closed-loop detection result comprises:
matching, through the visual closed-loop detection module, a first image frame of the current position point against the recorded image frames of historical position points;
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold, determining that the first closed-loop detection result is a closed loop; and/or,
if the similarities between the first image frame and both the frame preceding and the frame following the second image frame among the image frames of the historical position points are greater than a threshold, determining that the first closed-loop detection result is a closed loop.
5. A SLAM closed-loop detection system based on laser radar, comprising:
a visual closed-loop detection module, configured to perform closed-loop detection based on camera data acquired by a camera to obtain a first closed-loop detection result, and, when a verification request is received, to send closed-loop detection information to a laser simultaneous localization and mapping (SLAM) module based on the first closed-loop detection result;
the laser SLAM module, configured to perform closed-loop detection to obtain a second closed-loop detection result and send the verification request to the visual closed-loop detection module, wherein the verification request carries the second closed-loop detection result; and further configured to perform a graph optimization operation when the closed-loop detection information is acquired.
6. The lidar-based SLAM closed-loop detection system of claim 5, wherein the visual closed-loop detection module is further configured to:
if the first closed-loop detection result is consistent with the second closed-loop detection result and both results are closed loops, output closed-loop detection feedback information that passes verification to the laser SLAM module.
7. The lidar-based SLAM closed-loop detection system of claim 6, wherein the visual closed-loop detection module is further configured to:
if the first closed-loop detection result is inconsistent with the second closed-loop detection result, send verification information indicating a closed-loop detection error to the laser SLAM module.
8. The lidar-based SLAM closed-loop detection system of claim 5, wherein the visual closed-loop detection module is further configured to:
match a first image frame of the current position point against the recorded image frames of historical position points;
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold, determine that the first closed-loop detection result is a closed loop; and/or,
if the similarities between the first image frame and both the frame preceding and the frame following the second image frame among the image frames of the historical position points are greater than a threshold, determine that the first closed-loop detection result is a closed loop.
CN201910707655.5A 2019-08-01 2019-08-01 SLAM closed loop detection method and detection system based on laser radar Active CN110587597B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910707655.5A CN110587597B (en) 2019-08-01 2019-08-01 SLAM closed loop detection method and detection system based on laser radar
PCT/CN2019/102850 WO2021017072A1 (en) 2019-08-01 2019-09-27 Laser radar-based slam closed-loop detection method and detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910707655.5A CN110587597B (en) 2019-08-01 2019-08-01 SLAM closed loop detection method and detection system based on laser radar

Publications (2)

Publication Number Publication Date
CN110587597A CN110587597A (en) 2019-12-20
CN110587597B true CN110587597B (en) 2020-09-22

Family

ID=68853316

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910707655.5A Active CN110587597B (en) 2019-08-01 2019-08-01 SLAM closed loop detection method and detection system based on laser radar

Country Status (2)

Country Link
CN (1) CN110587597B (en)
WO (1) WO2021017072A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111145634B (en) * 2019-12-31 2022-02-22 深圳市优必选科技股份有限公司 Method and device for correcting map
CN113552864B (en) * 2020-04-15 2024-12-27 深圳市镭神智能系统有限公司 Positioning method and device for self-moving subject, self-moving subject and storage medium
CN111856441B (en) * 2020-06-09 2023-04-25 北京航空航天大学 Train positioning method based on vision and millimeter wave radar fusion
CN112595322B (en) * 2020-11-27 2024-05-07 浙江同善人工智能技术有限公司 ORB closed loop detection fused laser SLAM method
CN113246136B (en) * 2021-06-07 2021-11-16 深圳市普渡科技有限公司 Robot, map construction method, map construction device and storage medium
CN113744236B (en) * 2021-08-30 2024-05-24 阿里巴巴达摩院(杭州)科技有限公司 Loop detection method, device, storage medium and computer program product
CN113947716B (en) * 2021-10-20 2025-03-21 上海擎朗智能科技有限公司 Closed-loop detection method, device, robot and storage medium
CN114034299B (en) * 2021-11-08 2024-04-26 中南大学 A navigation system based on active laser SLAM
CN113963084A (en) * 2021-11-11 2022-01-21 上海快仓自动化科技有限公司 Method, system and device for establishing graph and computer readable storage medium
CN114608552B (en) * 2022-01-19 2024-06-18 达闼机器人股份有限公司 Robot mapping method, system, device, equipment and storage medium
CN115290066A (en) * 2022-07-06 2022-11-04 杭州萤石软件有限公司 An error correction method, device and mobile device
CN115436968A (en) * 2022-09-23 2022-12-06 福建(泉州)哈工大工程技术研究院 A Bitmap Relocation Method Based on LiDAR

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106153048A (en) * 2016-08-11 2016-11-23 广东技术师范学院 A kind of robot chamber inner position based on multisensor and Mapping System
CN106272423A (en) * 2016-08-31 2017-01-04 哈尔滨工业大学深圳研究生院 A kind of multirobot for large scale environment works in coordination with the method for drawing and location
CN107246876A (en) * 2017-07-31 2017-10-13 中北智杰科技(北京)有限公司 A kind of method and system of pilotless automobile autonomous positioning and map structuring
CN107529650A (en) * 2017-08-16 2018-01-02 广州视源电子科技股份有限公司 Network model construction and closed loop detection method, corresponding device and computer equipment
CN109509230A (en) * 2018-11-13 2019-03-22 武汉大学 A kind of SLAM method applied to more camera lens combined type panorama cameras

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9201133B2 (en) * 2011-11-11 2015-12-01 The Board Of Trustees Of The Leland Stanford Junior University Method and system for signal-based localization
CN106897666B (en) * 2017-01-17 2020-09-08 上海交通大学 Closed loop detection method for indoor scene recognition
CN106885574B (en) * 2017-02-15 2020-02-07 北京大学深圳研究生院 Monocular vision robot synchronous positioning and map construction method based on re-tracking strategy
CN108537844B (en) * 2018-03-16 2021-11-26 上海交通大学 Visual SLAM loop detection method fusing geometric information

Also Published As

Publication number Publication date
CN110587597A (en) 2019-12-20
WO2021017072A1 (en) 2021-02-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Yan Ruijun

Inventor after: Ren Juanjuan

Inventor after: Zhang Guodong

Inventor after: Ye Lirong

Inventor after: Sun Zhenkun

Inventor before: Ye Lirong

Inventor before: Ren Juanjuan

Inventor before: Zhang Guodong

Inventor before: Yan Ruijun

Inventor before: Sun Zhenkun

GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 1701, building 2, Yinxing Zhijie, No. 1301-72, sightseeing Road, Xinlan community, Guanlan street, Longhua District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Yinxing Intelligent Group Co.,Ltd.

Address before: 518000 building A1, Yinxing hi tech Industrial Park, Guanlan street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Silver Star Intelligent Technology Co.,Ltd.