CN119690085A - Obstacle avoidance method, obstacle avoidance device, robot, storage medium, and program product - Google Patents
- Publication number: CN119690085A
- Application number: CN202411923932.3A
- Authority: CN (China)
- Prior art keywords: obstacle, robot, risk area, relative, size
- Prior art date: 2024-12-25
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present disclosure provides an obstacle avoidance method, an obstacle avoidance device, a robot, a storage medium and a program product, which belong to the field of computer technology. The obstacle avoidance method includes: generating a travel route of a moving object; acquiring the size of an obstacle when an obstacle is present ahead of the moving object; determining the position of the obstacle; determining a risk area according to the position of the obstacle and the size of the obstacle; and sending reminder information when the travel route intersects the risk area.
Description
Technical Field
The present disclosure relates to the field of computer technology, and in particular, to an obstacle avoidance method, an obstacle avoidance apparatus, a robot, a storage medium, and a program product.
Background
At present, a health monitoring robot can move freely in a user's home, monitor the user's vital signs, and issue an early warning when those vital signs are in an abnormal state, so that the user's health condition can be effectively monitored.
Disclosure of Invention
The inventors have noted that users tend to stumble over obstacles on the ground when getting up at night, resulting in falls. Because the eyesight of elderly users is degraded, they are more likely to fall when getting up at night, and such falls often cause serious injury. In the related art, the robot does not have the capability of providing a stumbling risk reminder to the user, and therefore cannot reduce the occurrence of fall events.
Accordingly, the obstacle avoidance method provided by the present disclosure can give the user a timely stumbling risk reminder when a stumbling risk exists ahead of the user's movement, thereby effectively reducing the occurrence of fall events.
According to a first aspect of embodiments of the present disclosure, there is provided an obstacle avoidance method, performed by an obstacle avoidance device in a robot, including: generating a travel route of a moving object; acquiring a size of an obstacle in the case where the obstacle exists ahead of travel of the moving object; determining a position of the obstacle; determining a risk area according to the position of the obstacle and the size of the obstacle; and sending reminder information in the case where the travel route intersects the risk area.
In some embodiments, the determining the position of the obstacle includes detecting a distance between the obstacle and the robot and an orientation of the obstacle relative to the robot, determining a relative position of the obstacle relative to the robot based on the distance between the obstacle and the robot, the size of the obstacle, and the orientation of the obstacle relative to the robot, and determining the position of the obstacle based on the current position of the robot and the relative position of the obstacle relative to the robot.
In some embodiments, the detecting the distance between the obstacle and the robot and the orientation of the obstacle relative to the robot includes detecting the distance between the obstacle and the robot and the orientation of the obstacle relative to the robot using an onboard radar.
In some embodiments, the acquiring the size of the obstacle includes acquiring an image of the area ahead of travel with an onboard camera, extracting feature information of the obstacle when an obstacle is identified in the image, and extracting the size of the obstacle corresponding to the feature information from a database.
In some embodiments, the generating the travel route of the moving object includes determining a position of the moving object at a predetermined period using an onboard radar in a case where the moving object is identified from an image acquired by an onboard camera, and generating the travel route using the position of the moving object at different times.
In some embodiments, the determining the position of the moving object includes detecting a current distance between the moving object and the robot and a current orientation of the moving object relative to the robot, determining a relative position of the moving object relative to the robot based on the current distance and the current orientation, and determining the position of the moving object based on the current position of the robot and the relative position of the moving object relative to the robot.
In some embodiments, the detecting the current distance between the moving object and the robot and the current orientation of the moving object relative to the robot includes detecting the current distance and the current orientation using an onboard radar.
In some embodiments, the determining the risk area includes obtaining a length of the obstacle from a size of the obstacle, and determining the risk area based on a position of the obstacle and the length of the obstacle.
In some embodiments, the determining the risk area according to the position of the obstacle and the length of the obstacle includes determining a risk area radius according to the length of the obstacle, and determining the risk area using the risk area radius with the position of the obstacle as a center.
In some embodiments, the risk area radius is a sum of a length of the obstacle and a predetermined length.
In some embodiments, the reminder information is sent in the case where a predetermined risk area exists ahead of the travel of the moving object and the travel route intersects the predetermined risk area.
In some embodiments, in the case where the robot is deployed to a designated environment, characteristic information and a size of an nth object in the designated environment are collected, where 1 ≤ n ≤ N and N is the total number of objects in the designated environment, and the characteristic information and the size of the nth object are stored in a database.
In some embodiments, the acquiring the characteristic information and the size of the nth object includes acquiring an image of an mth side with the onboard camera in the case where the mth side of the nth object is in a field of view of the onboard camera, where 1 ≤ m ≤ M and M is the total number of sides of the nth object, extracting the characteristic information of the mth side, and detecting the length of the mth side in the horizontal direction using an onboard radar.
In some embodiments, the storing the characteristic information and the size of the nth object in a database includes storing the characteristic information of the M sides of the nth object and the lengths of the M sides in the database, wherein the maximum value among the lengths of the M sides is taken as the length of the nth object.
According to a second aspect of the embodiment of the present disclosure, there is provided an obstacle avoidance apparatus including a travel route generation module configured to generate a travel route of a moving object, an obstacle information acquisition module configured to acquire a size of the obstacle and determine a position of the obstacle in the presence of the obstacle in front of travel of the moving object, a risk area determination module configured to determine a risk area according to the position of the obstacle and the size of the obstacle, and a reminder module configured to send reminder information in the event that the travel route intersects the risk area.
According to a third aspect of embodiments of the present disclosure, there is provided an obstacle avoidance device comprising a memory, a processor coupled to the memory, the processor configured to perform a method according to any of the embodiments described above based on instructions stored by the memory.
According to a fourth aspect of embodiments of the present disclosure, there is provided a robot comprising an obstacle avoidance device as described in any of the embodiments above, an on-board camera configured to acquire images, and an on-board radar configured to detect a distance of a target object from the robot and an orientation of the target object relative to the robot.
In some embodiments, the robot further comprises a positioning device configured to determine a position of the robot in the current environment.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium, wherein the computer readable storage medium stores computer instructions which, when executed by a processor, implement a method as referred to in any of the embodiments above.
According to a sixth aspect of embodiments of the present disclosure, there is provided a computer program product comprising computer instructions which, when executed by a processor, implement a method as referred to in any of the embodiments above.
Other features of the present disclosure and its advantages will become apparent from the following detailed description of exemplary embodiments of the disclosure, which proceeds with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the solutions in the prior art, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present disclosure, and that other drawings may be obtained according to these drawings without inventive faculty for a person skilled in the art.
FIG. 1 is a schematic flow chart of an obstacle avoidance method according to an embodiment of the disclosure;
FIG. 2 is a schematic illustration of a robot measuring object dimensions according to one embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a robotic obstacle avoidance pre-warning according to one embodiment of the disclosure;
fig. 4 is a schematic diagram of a robot obstacle avoidance pre-warning according to another embodiment of the disclosure;
FIG. 5 is a schematic diagram of a robotic obstacle avoidance pre-warning according to yet another embodiment of the present disclosure;
FIG. 6 is a schematic diagram of an obstacle avoidance device according to an embodiment of the disclosure;
FIG. 7 is a schematic view of an obstacle avoidance apparatus according to another embodiment of the present disclosure;
FIG. 8 is a schematic view of a robot according to an embodiment of the present disclosure;
Fig. 9 is a schematic structural view of a robot according to another embodiment of the present disclosure.
Detailed Description
The following description of the technical solutions in the embodiments of the present disclosure will be made clearly and completely with reference to the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, not all embodiments. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. All other embodiments, which can be made by one of ordinary skill in the art without inventive effort, based on the embodiments in this disclosure are intended to be within the scope of this disclosure.
The relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but should be considered part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that like reference numerals and letters refer to like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Fig. 1 is a flow chart of an obstacle avoidance method according to an embodiment of the disclosure. In some embodiments, the following obstacle avoidance method is performed by an obstacle avoidance device in the robot, including steps 11-15.
In step 11, a travel route of the moving object is generated.
For example, the moving object includes a user in a walking state.
In some embodiments, in the event that a moving object is identified from an image acquired by an onboard camera, the location of the moving object is determined at a predetermined period using an onboard radar. Next, a travel route is generated using the positions of the moving object at different times.
For example, the step of determining the position of the moving object includes the steps of:
1) A current distance between the moving object and the robot, and a current orientation of the moving object relative to the robot, are detected.
For example, the current distance between the moving object and the robot, and the current orientation of the moving object relative to the robot, are detected using the onboard radar.
2) The relative position of the moving object with respect to the robot is determined based on the current distance between the moving object and the robot and the current orientation of the moving object relative to the robot.
3) The position of the moving object is determined based on the current position of the robot and the relative position of the moving object with respect to the robot.
Here, it should be noted that, since a positioning device is provided in the robot, the robot can determine its own position in the current environment. For example, the position of the charging station in the current environment may be used as the origin of coordinates, and the robot can obtain its current position relative to this origin through the positioning device.
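As a minimal illustrative sketch (not part of the disclosure), the following Python code shows how a radar reading expressed as a distance and a bearing relative to the robot could be combined with the robot's own pose to obtain the moving object's position in the environment frame, and how a travel route could be approximated from successive periodic samples. The names Pose, object_position, and travel_route, and the choice of the charging-station origin, are assumptions made for illustration only.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres, in the environment frame (origin at the charging station)
    y: float
    heading: float  # radians, robot heading in the environment frame

def object_position(robot: Pose, distance: float, bearing: float) -> tuple[float, float]:
    """Convert an onboard-radar reading (distance, bearing relative to the robot's
    heading) into a position in the environment coordinate frame."""
    angle = robot.heading + bearing
    return (robot.x + distance * math.cos(angle),
            robot.y + distance * math.sin(angle))

def travel_route(samples: list[tuple[float, float]]) -> tuple[tuple[float, float], tuple[float, float]]:
    """Approximate the travel route from the two most recent periodic samples of
    the moving object's position (earlier point first)."""
    if len(samples) < 2:
        raise ValueError("need at least two sampled positions")
    return samples[-2], samples[-1]
```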
In step 12, in the case where there is an obstacle ahead of the travel of the moving object, the size of the obstacle is acquired.
In some embodiments, the step of obtaining the size of the obstacle comprises the steps of:
1) An image of the area ahead of travel is acquired with the onboard camera.
2) Characteristic information of the obstacle is extracted when an obstacle is identified in the image.
3) The size of the obstacle corresponding to the characteristic information is extracted from the database.
Here, in the case where the robot is deployed to a designated environment, the characteristic information and the size of an nth object in the designated environment are acquired, where 1 ≤ n ≤ N and N is the total number of objects in the designated environment, and the characteristic information and the size of the nth object are stored in a database.
In some embodiments, the characteristic information and dimensions of the nth object are collected and stored in a database by the following steps.
1) In the case where the mth side of the nth object is located in the field of view of the onboard camera, an image of the mth side is acquired with the onboard camera, where 1 ≤ m ≤ M and M is the total number of sides of the nth object.
2) The characteristic information of the mth side is extracted.
3) The length of the mth side in the horizontal direction is detected by the onboard radar.
4) The characteristic information of the M sides of the nth object and the lengths of the M sides are stored in the database, wherein the maximum value among the lengths of the M sides is taken as the length of the nth object.
Here, it is to be noted that an article used in a home (for example, a small stool) generally has an overall shape of a rectangular parallelepiped; that is, of the 4 sides of the rectangular parallelepiped, the first side and the third side opposite to it have similar shapes and characteristics, and the second side adjacent to the first side and the fourth side opposite to the second side have similar shapes and characteristics. In this case, only the characteristic information and the size information of the first side and the second side of the article need to be collected.
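A minimal sketch of how such a database entry might be organised is given below. The dictionary layout and the names register_object and lookup_length are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical in-memory database: object identifier -> characteristic info and sizes.
object_db: dict[str, dict] = {}

def register_object(object_id: str, side_features: list[str], side_lengths: list[float]) -> None:
    """Store the characteristic information and horizontal side lengths of an object.
    The maximum side length is recorded as the object's length."""
    object_db[object_id] = {
        "features": side_features,     # e.g. one visual descriptor per collected side
        "side_lengths": side_lengths,  # metres, measured by the onboard radar
        "length": max(side_lengths),   # used later as the obstacle length
    }

def lookup_length(object_id: str) -> float:
    """Retrieve the stored length of a previously registered object."""
    return object_db[object_id]["length"]

# For a cuboid article only two adjacent sides need to be registered:
register_object("small_stool", ["stool_side_1", "stool_side_2"], [0.40, 0.30])
```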
For example, as shown in fig. 2, the robot collects an image of the object using the onboard camera and adjusts its own position according to the collected image so that its onboard radar faces the object. In order to facilitate this adjustment, the onboard camera and the onboard radar of the robot may be arranged on the same vertical line. Using the point cloud data acquired by the onboard radar, the distance between the robot and the side of the object facing it is calculated to be d, and the included angles between the radar's perpendicular to that side and its two edges are θ1 and θ2, respectively. A first length L1 and a second length L2 of the side of the object facing the robot are then calculated using formula (1) and formula (2):

L1 = d·tan(θ1)  (1)

L2 = d·tan(θ2)  (2)

In this case, the length L in the horizontal direction of the side of the object facing the robot is as shown in formula (3):

L = L1 + L2  (3)

Here, if θ1 = θ2, i.e. the projection of the onboard radar onto the facing side falls exactly at the midpoint of that side's horizontal length, the length L of the facing side in the horizontal direction can be calculated directly by formula (4):

L = 2d·tan(θ1)  (4)
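A short sketch of the measurement in formulas (1)-(4) is given below, assuming d is the perpendicular distance from the radar to the facing side and the angles are measured from that perpendicular to the side's two edges; the function name is illustrative.

```python
import math

def facing_side_length(d: float, theta1: float, theta2: float) -> float:
    """Horizontal length of the object side facing the onboard radar.

    d      -- perpendicular distance from the radar to the facing side (metres)
    theta1 -- angle (radians) from the perpendicular to one edge of the side
    theta2 -- angle (radians) from the perpendicular to the other edge
    """
    l1 = d * math.tan(theta1)  # formula (1)
    l2 = d * math.tan(theta2)  # formula (2)
    return l1 + l2             # formula (3); equals 2*d*tan(theta1) when theta1 == theta2
```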
In step 13, the position of the obstacle is determined.
In some embodiments, the step of determining the position of the obstacle comprises the steps of:
1) The distance between the obstacle and the robot, and the orientation of the obstacle relative to the robot, are detected.
For example, the distance between the obstacle and the robot, and the orientation of the obstacle relative to the robot, are detected using the onboard radar.
2) The relative position of the obstacle with respect to the robot is determined based on the distance between the obstacle and the robot, the size of the obstacle, and the orientation of the obstacle relative to the robot.
3) The position of the obstacle is determined based on the current position of the robot and the relative position of the obstacle with respect to the robot.
For example, the relative position of the geometric center of the obstacle with respect to the robot is determined based on the distance between the obstacle and the robot, the size of the obstacle, and the orientation of the obstacle with respect to the robot.
In this case, the position of the geometric center of the obstacle is determined from the current position of the robot and the relative position of the geometric center of the obstacle with respect to the robot.
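A hedged sketch of this step is shown below. The half-depth offset to the geometric centre and the name obstacle_center are assumptions made for illustration, since the disclosure only states that the distance, size, and orientation are combined.

```python
import math

def obstacle_center(robot_x: float, robot_y: float, robot_heading: float,
                    distance_to_face: float, obstacle_depth: float,
                    bearing: float) -> tuple[float, float]:
    """Estimate the geometric centre of the obstacle in the environment frame.

    distance_to_face -- radar distance to the side of the obstacle facing the robot (metres)
    obstacle_depth   -- obstacle extent along the viewing direction, taken from its stored size
    bearing          -- direction of the obstacle relative to the robot's heading (radians)
    """
    # The centre is assumed to lie about half the obstacle depth beyond the facing side.
    r = distance_to_face + obstacle_depth / 2.0
    angle = robot_heading + bearing
    return (robot_x + r * math.cos(angle),
            robot_y + r * math.sin(angle))
```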
In step 14, a risk area is determined based on the location of the obstacle and the size of the obstacle.
In some embodiments, the length of the obstacle is obtained from the size of the obstacle, and then the risk area is determined based on the position of the obstacle and the length of the obstacle.
Here, an obstacle generally has a plurality of side surfaces whose horizontal lengths are not all equal; in order to avoid a collision between the user and the obstacle, the maximum value among the side lengths is taken as the length of the obstacle.
In some embodiments, the risk area radius is determined from the length of the obstacle. Next, the risk area is determined using the risk area radius with the position of the obstacle as the center of the circle.
For example, the risk area radius is the sum of the length of the obstacle and the predetermined length.
Here, in order to effectively reduce the probability of collision between the user and the obstacle, the sum of the length of the obstacle and the predetermined length is taken as the radius of the risk area.
For example, the predetermined length is 3-10 cm. This enlarges the risk area, further reducing the probability of the user colliding with the obstacle.
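A minimal sketch of this risk-area construction is shown below, with a 5 cm margin chosen from the 3-10 cm range above; the function name and the default margin value are illustrative assumptions.

```python
def risk_area(center: tuple[float, float], obstacle_length: float,
              margin: float = 0.05) -> tuple[tuple[float, float], float]:
    """Risk area as a circle centred on the obstacle position.

    The radius is the obstacle length plus a predetermined margin (all in metres)."""
    return center, obstacle_length + margin
```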
In step 15, a reminder is sent in case the travel route intersects the risk area.
For example, as shown in fig. 3, a user walks from west to east at home, with an obstacle ahead. The robot detects the travel route of the user and detects the presence of an obstacle ahead of the user's travel. The robot determines a risk area according to the position of the obstacle and the size of the obstacle, and sends reminder information in the case where the travel route intersects the risk area.
In some embodiments, as shown in fig. 4, the current coordinates of the robot are (x0, y0), the current coordinates of the user are (x1, y1), and the travel route is shown as a dotted line. The current coordinates of the obstacle are (x2, y2); by identifying the characteristic information of the obstacle, the robot acquires the obstacle size corresponding to that characteristic information from the database. Next, the robot determines the risk area according to the position of the obstacle and the size of the obstacle, as shown by the circular area in fig. 4. As can be seen from fig. 4, the travel route of the user intersects the risk area, which indicates a risk of collision between the user and the obstacle; in this case, the robot sends a reminder to the user to change the route, thereby effectively avoiding a fall caused by the user colliding with the obstacle.
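As an illustrative sketch of the intersection test, the travel route can be treated as a segment from the user's current position to a point further along the dotted line and checked against the circular risk area; send_reminder is a hypothetical placeholder for whatever reminder channel the robot uses.

```python
import math

def route_intersects_risk_area(p1: tuple[float, float], p2: tuple[float, float],
                               center: tuple[float, float], radius: float) -> bool:
    """True if the route segment p1 -> p2 passes within `radius` of `center`."""
    (x1, y1), (x2, y2), (cx, cy) = p1, p2, center
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:  # degenerate segment: just check the single point
        return math.hypot(cx - x1, cy - y1) <= radius
    # Project the circle centre onto the segment and clamp to the endpoints.
    t = ((cx - x1) * dx + (cy - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    nearest = (x1 + t * dx, y1 + t * dy)
    return math.hypot(cx - nearest[0], cy - nearest[1]) <= radius

# Example usage with the coordinates from fig. 4:
# if route_intersects_risk_area((x1, y1), predicted_point, (x2, y2), risk_radius):
#     send_reminder("Obstacle ahead on your route - please change direction.")
```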
In some embodiments, the reminder information is sent in the case where a predetermined risk area exists ahead of the travel of the moving object and the travel route intersects the predetermined risk area.
Here, in a home environment, there may be regions in a room, such as a threshold or a step, where a user is prone to falling. Reminding the user when the user is about to enter such a region can therefore effectively reduce the risk of the user falling.
For example, as shown in fig. 5, the current coordinates of the robot are (x0, y0), the current coordinates of the user are (x1, y1), and the travel route of the user is shown as a dotted line. A step exists ahead of the user's travel route, and the travel route intersects the step area; in this case, the robot sends reminder information to the user, thereby effectively preventing the user from falling at the step.
In the obstacle avoidance method provided in the foregoing embodiments of the present disclosure, a travel route of a moving object is generated; in the case where an obstacle exists ahead of the travel of the moving object, the size of the obstacle is acquired and the position of the obstacle is determined; a risk area is determined according to the position of the obstacle and the size of the obstacle; and reminder information is sent when the travel route intersects the risk area. Therefore, when a stumbling risk exists ahead of the user's movement, a stumbling risk reminder can be provided to the user in time, effectively reducing the occurrence of fall events.
Fig. 6 is a schematic structural diagram of an obstacle avoidance device according to an embodiment of the disclosure. As shown in fig. 6, the obstacle avoidance apparatus includes a travel route generation module 61, an obstacle information acquisition module 62, a risk area determination module 63, and a reminder module 64.
The travel route generation module 61 is configured to generate a travel route of the moving object.
For example, the moving object includes a user in a walking state.
In some embodiments, the travel route generation module 61 determines the position of the moving object at a predetermined period using the onboard radar in the case where the moving object is identified from the image acquired by the onboard camera. Next, a travel route is generated using the positions of the moving object at different times.
For example, the operation of determining the position of the moving object includes the following:
1) A current distance between the moving object and the robot, and a current orientation of the moving object relative to the robot, are detected.
For example, the current distance between the moving object and the robot, and the current orientation of the moving object relative to the robot, are detected using the onboard radar.
2) The relative position of the moving object with respect to the robot is determined based on the current distance between the moving object and the robot and the current orientation of the moving object relative to the robot.
3) The position of the moving object is determined based on the current position of the robot and the relative position of the moving object with respect to the robot.
Here, it should be noted that, since a positioning device is provided in the robot, the robot can determine its own position in the current environment. For example, the position of the charging station in the current environment may be used as the origin of coordinates, and the robot can obtain its current position relative to this origin through the positioning device.
The obstacle information acquisition module 62 is configured to acquire the size of an obstacle and determine the position of the obstacle in the case where there is an obstacle ahead of the travel of the moving object.
In some embodiments, the operation of obtaining the size of the obstacle includes the following:
1) An image of the area ahead of travel is acquired with the onboard camera.
2) Characteristic information of the obstacle is extracted when an obstacle is identified in the image.
3) The size of the obstacle corresponding to the characteristic information is extracted from the database.
Here, in the case where the robot is deployed to a designated environment, the characteristic information and the size of an nth object in the designated environment are acquired, where 1 ≤ n ≤ N and N is the total number of objects in the designated environment, and the characteristic information and the size of the nth object are stored in a database.
In some embodiments, the characteristic information and dimensions of the nth object are collected and stored in a database by the following operations.
1) In the case where the mth side of the nth object is located in the field of view of the onboard camera, an image of the mth side is acquired with the onboard camera, where 1 ≤ m ≤ M and M is the total number of sides of the nth object.
2) The characteristic information of the mth side is extracted.
3) The length of the mth side in the horizontal direction is detected by the onboard radar.
4) The characteristic information of the M sides of the nth object and the lengths of the M sides are stored in the database, wherein the maximum value among the lengths of the M sides is taken as the length of the nth object.
Here, it is to be noted that an article used in a home (for example, a small stool) generally has an overall shape of a rectangular parallelepiped; that is, of the 4 sides of the rectangular parallelepiped, the first side and the third side opposite to it have similar shapes and characteristics, and the second side adjacent to the first side and the fourth side opposite to the second side have similar shapes and characteristics. In this case, only the characteristic information and the size information of the first side and the second side of the article need to be collected.
In some embodiments, determining the location of the obstacle includes:
1) The distance between the obstacle and the robot, and the orientation of the obstacle relative to the robot, are detected.
For example, the distance between the obstacle and the robot, and the orientation of the obstacle relative to the robot, are detected using the onboard radar.
2) The relative position of the obstacle with respect to the robot is determined based on the distance between the obstacle and the robot, the size of the obstacle, and the orientation of the obstacle relative to the robot.
3) The position of the obstacle is determined based on the current position of the robot and the relative position of the obstacle with respect to the robot.
For example, the relative position of the geometric center of the obstacle with respect to the robot is determined based on the distance between the obstacle and the robot, the size of the obstacle, and the orientation of the obstacle with respect to the robot.
In this case, the position of the geometric center of the obstacle is determined from the current position of the robot and the relative position of the geometric center of the obstacle with respect to the robot.
The risk area determination module 63 is configured to determine a risk area based on the position of the obstacle and the size of the obstacle.
In some embodiments, the risk area determination module 63 obtains the length of the obstacle from the size of the obstacle, and then determines the risk area based on the position of the obstacle and the length of the obstacle.
Here, an obstacle generally has a plurality of side surfaces whose horizontal lengths are not all equal; in order to avoid a collision between the user and the obstacle, the maximum value among the side lengths is taken as the length of the obstacle.
In some embodiments, the risk area radius is determined from the length of the obstacle. Next, the risk area is determined using the risk area radius with the position of the obstacle as the center of the circle.
For example, the risk area radius is the sum of the length of the obstacle and the predetermined length.
Here, in order to effectively reduce the probability of collision between the user and the obstacle, the sum of the length of the obstacle and the predetermined length is taken as the radius of the risk area.
For example, the predetermined length is 3-10 cm. This enlarges the risk area, further reducing the probability of the user colliding with the obstacle.
The reminder module 64 is configured to send reminder information in the event that the travel route intersects a risk area.
Fig. 7 is a schematic structural diagram of an obstacle avoidance device according to another embodiment of the disclosure.
As shown in fig. 7, obstacle avoidance device 70 is in the form of a general purpose computing device. The obstacle avoidance device 70 includes a memory 71, a processor 72, and a bus 73 that connects the various system components.
The memory 71 may include, for example, a system memory, a nonvolatile storage medium, and the like. The system memory stores, for example, an operating system, application programs, boot Loader (Boot Loader), and other programs. The system memory may include volatile storage media, such as Random Access Memory (RAM) and/or cache memory. The non-volatile storage medium stores, for example, instructions for performing a corresponding embodiment of at least one obstacle avoidance method. Non-volatile storage media include, but are not limited to, disk storage, optical storage, flash memory, and the like.
The processor 72 may be implemented as discrete hardware components such as a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gates or transistors, and the like. Accordingly, each module, such as the travel route generation module, the obstacle information acquisition module, the risk area determination module, and the reminder module, may be implemented by a Central Processing Unit (CPU) executing instructions stored in a memory for performing the corresponding steps, or may be implemented by a dedicated circuit that performs the corresponding steps.
For example, the processor 72 is configured to perform a method as referred to in any of the embodiments of fig. 1 based on the instructions stored by the memory.
Bus 73 may use any of a variety of bus architectures. For example, bus structures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, and a Peripheral Component Interconnect (PCI) bus.
The obstacle avoidance device 70 further includes an input/output interface 74, a network interface 75, and a storage interface 76, which, together with the memory 71 and the processor 72, may be connected by the bus 73. The input/output interface 74 provides a connection interface for input/output devices such as a display, a mouse, and a keyboard. The network interface 75 provides a connection interface for various networking devices. The storage interface 76 provides a connection interface for external storage devices such as floppy disks, USB flash drives, and SD cards.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions specified in the flowchart and/or block diagram block or blocks.
These computer readable program instructions may also be stored in a computer readable memory that can direct a computer to function in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture including instructions which implement the function specified in the flowchart and/or block diagram block or blocks.
The present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
The present disclosure also provides a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement a method as referred to in any of the embodiments of fig. 1.
The present disclosure also provides a computer program product comprising computer instructions which, when executed by a processor, implement a method as referred to in any of the embodiments of fig. 1.
Fig. 8 is a schematic structural view of a robot according to an embodiment of the present disclosure. As shown in fig. 8, the robot includes an obstacle avoidance device 81, an onboard camera 82, and an onboard radar 83. The obstacle avoidance device 81 is one according to any one of the embodiments shown in fig. 6 or 7.
The onboard camera 82 is configured to acquire images so that the obstacle avoidance device 81 recognizes and processes the images acquired by the onboard camera 82.
The onboard radar 83 is configured to detect the distance of a target object from the robot and the orientation of the target object relative to the robot.
Here, the target object may be an obstacle or a user.
For example, the onboard radar 83 is a lidar.
Fig. 9 is a schematic structural view of a robot according to another embodiment of the present disclosure. Fig. 9 differs from fig. 8 in that in the embodiment shown in fig. 9 the robot further comprises positioning means 84.
The positioning device 84 is configured to determine the position of the robot in the current environment.
For example, the positioning device 84 includes an inertial navigation device.
By implementing the embodiments of the present disclosure, when a stumbling risk exists ahead of the user's movement, a stumbling risk reminder can be provided to the user in time, thereby effectively reducing the occurrence of fall events.
In some embodiments, the functional block modules described above may be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic devices, discrete hardware components, or any suitable combination thereof for performing the functions described herein.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The description of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202411923932.3A CN119690085A (en) | 2024-12-25 | 2024-12-25 | Obstacle avoidance method, obstacle avoidance device, robot, storage medium, and program product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN119690085A true CN119690085A (en) | 2025-03-25 |
Family
ID=95038752
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |