Disclosure of Invention
In order to solve the above problems, the present invention provides a cleaning robot capable of improving obstacle detection accuracy.
A first aspect of an embodiment of the present invention provides a cleaning robot, including a host and a camera device, where the camera device is located on a central axis of the host, and the central axis is parallel to an advancing direction of the cleaning robot when the cleaning robot moves.
In one implementation, the camera device is fixed to a forward portion of the host, the camera device is located between a top surface and a bottom surface of the host, and the top surface and the bottom surface are arranged opposite to each other.
In one implementation, an optical axis of the camera device coincides with the central axis.
In one implementation, the optical axis of the camera device is disposed obliquely to the central axis.
In one implementation, an inclination angle of the optical axis of the camera device with respect to the central axis of the cleaning robot is 0.5° to 10°.
In one implementation, the center of the lens of the camera device is located on the central axis of the cleaning robot.
In one implementation, the number of the camera devices is even, the camera devices are symmetrically arranged on two sides of a central axis of the cleaning robot, and the central axis is parallel to the advancing direction of the cleaning robot when the cleaning robot moves.
In one implementation, the number of the camera devices is an odd number, a lens center of one camera device is located on a central axis of the cleaning robot, and the central axis is parallel to an advancing direction of the cleaning robot when the cleaning robot moves.
In one implementation, the maximum longitudinal field of view of the camera device ranges from 50° to 70°, and the maximum lateral field of view of the camera device ranges from 110° to 130°.
In one implementation, the cleaning robot further includes a bumper and a laser radar. The bumper is at least partially disposed on a peripheral side of the forward portion of the host, the laser radar is disposed on the host and exposed from the host in the advancing direction of the cleaning robot when the cleaning robot moves, and a distance from the nearest obstacle that the laser radar can detect to a center of the laser radar is smaller than a distance from the bumper to the center of the laser radar.
In one implementation, the host is further provided with an accommodating portion, the top surface defines an opening communicating with the accommodating portion, the laser radar is fixed in the accommodating portion and exposed from the top surface, the camera device is spaced apart from the laser radar, the laser radar is located on a central axis of the cleaning robot, and the central axis is parallel to the advancing direction of the cleaning robot when the cleaning robot moves.
In one implementation, the laser radar is symmetric about the central axis.
In one implementation, the cleaning robot further includes a protective cover. The protective cover is disposed on the laser radar, and a light-transmitting area is provided on the protective cover at a position corresponding to the laser emitting hole, for passing the laser emitted by the laser radar. The light-transmitting area is a through hole, and an aperture of the through hole is smaller than 8.5 mm.
According to the cleaning robot provided by the embodiments of the present invention, the camera device captures image information to acquire environmental information about the environment in which the cleaning robot is located. Because the camera device is located on the central axis of the host and fixed to the forward portion of the host, the cleaning robot can accurately detect obstacles ahead of it during operation, the detection blind area of the cleaning robot is reduced, and the obstacle detection accuracy of the cleaning robot is improved. Furthermore, positioning the camera device on the central axis of the host simplifies data processing such as obstacle ranging, image recognition, map construction, and movement path planning for the cleaning robot, improving data processing precision and thus obstacle detection accuracy.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 and fig. 2, fig. 1 is a side view of a cleaning robot according to an embodiment of the present invention, and fig. 2 is a partially exploded perspective view of the cleaning robot shown in fig. 1. The cleaning robot 100 includes a host 10, a laser radar 30, and a camera device 70. The laser radar 30 is disposed on the host 10, and the laser emitting hole 31 of the laser radar 30 is exposed from the top surface 15 of the host 10. The host 10 further includes a bottom surface 17 disposed opposite the top surface 15. The camera device 70 is fixed to the forward portion 11 of the host 10 and is spaced apart from the laser radar 30, and the camera device 70 is located between the top surface 15 and the bottom surface 17. In the present embodiment, the forward portion 11 is the portion of the host 10 that faces the forward direction when the cleaning robot 100 moves forward.
Since the laser emitting hole 31 of the laser radar 30 is exposed from the top surface 15 of the host 10, the cleaning robot 100 can acquire environmental information above the top surface 15 of the host 10 through the laser emitted from the laser radar 30. The camera device 70 is fixed on the forward portion 11, and the cleaning robot 100 acquires environmental information in the direction the forward portion 11 faces through images captured by the camera device 70. Because the field of view of the camera device 70 is large, the environmental information acquired by the camera device 70 includes information about obstacles in the traveling direction of the cleaning robot 100 whose height is lower than the top surface 15 of the host 10. In other words, the camera device 70 can acquire obstacle information within the detection blind area of the laser radar 30, thereby improving the detection accuracy of the cleaning robot 100, reducing the possibility that the cleaning robot 100 collides with an obstacle during operation, and improving the operation efficiency of the cleaning robot 100.
Referring to fig. 3, fig. 3 is a cross-sectional view of the cleaning robot shown in fig. 1 moving along a forward direction on a walking surface. The host 10 includes the forward portion 11 and a rearward portion 13. As described above, the forward portion 11 is the portion of the host 10 that faces the forward direction when the cleaning robot 100 moves forward; the rearward portion 13 is the portion opposite the forward portion 11. The host 10 further includes the top surface 15 and the bottom surface 17 disposed opposite to each other. When the cleaning robot 100 travels on the walking surface 200, the bottom surface 17 is the surface of the cleaning robot 100 closer to the walking surface 200 than the top surface 15. The host 10 is further provided with an accommodating portion 19 for fixing the laser radar 30; the accommodating portion 19 has a groove-like structure, and the top surface 15 defines an opening 151 communicating with the accommodating portion 19. The host 10 has a central axis γ, and the advancing direction of the cleaning robot 100 when moving is parallel to the central axis γ. It is understood that the central axis γ is not limited to the position shown in fig. 3; the central axis γ may be a central axis at another position on the longitudinal section of the host 10 passing through the central axis. For example, the central axis γ may be a central axis whose height from the bottom surface 17 is one half of the thickness of the host 10.
The cleaning robot 100 further includes a bumper 20. The bumper 20 is at least partially mounted on the peripheral side of the forward portion of the host 10 to absorb the impact force generated when the cleaning robot 100 collides with an obstacle. In the present embodiment, the bumper 20 is disposed on the peripheral side of the forward portion of the host 10.
The laser radar 30 is fixed in the accommodating portion 19. The laser emitting hole 31 (also shown in fig. 6) of the laser radar 30 is exposed from the top surface 15 of the host 10 and emits laser to detect environmental information above the top surface 15 of the host 10. In one embodiment, in the forward direction of the cleaning robot 100 when moving, the distance from the center of the laser radar 30 to the nearest obstacle that the laser radar 30 can detect is smaller than the distance from the center of the laser radar 30 to the bumper 20. Thus, when the cleaning robot 100 moves forward, the blind area of the laser radar 30 is small, and an obstacle in front of the bumper 20 can be detected before the bumper 20 is triggered by a collision. The cleaning robot 100 can therefore avoid the obstacle in time without the bumper 20 striking it, which greatly reduces the probability of triggering the bumper 20 and reduces damage to the cleaning robot.
Referring to fig. 4, fig. 4 is a block diagram of the laser radar 30 according to an embodiment. The laser radar 30 includes a laser emitting portion 33, a laser receiving portion 35, and a laser radar processor 37. In the present embodiment, the laser emitted from the laser radar 30 is substantially parallel to the central axis γ, so the laser radar 30 can detect an obstacle whose height is not lower than the top surface 15, such as the obstacle 201 shown in fig. 3. The area whose height is lower than the top surface 15 is the detection blind area of the laser radar 30, and the laser radar 30 cannot detect an obstacle within it, such as the obstacle 203 shown in fig. 3. The laser emitting portion 33 emits laser, the laser is reflected when it strikes an obstacle, and the reflected laser is received by the laser receiving portion 35. Obstacles at different distances from the cleaning robot 100 are imaged at different positions on the laser receiving portion 35; that is, there is a correspondence between the distance from an obstacle to the cleaning robot 100 and the imaging position of the obstacle on the laser receiving portion 35, so the distance of an obstacle can be obtained from its imaging position on the laser receiving portion 35. The laser radar processor 37 performs calculation on the image acquired by the laser receiving portion 35 to obtain environmental information such as the relative distance between the obstacle and the cleaning robot 100.
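The correspondence between imaging position and obstacle distance described above can be sketched with the standard laser-triangulation relation, in which distance is inversely proportional to the spot's offset on the receiver. The focal-length and baseline values below are illustrative assumptions, not parameters of the laser radar 30:

```python
def triangulation_distance(offset_m: float, focal_length_m: float,
                           baseline_m: float) -> float:
    """Estimate obstacle distance from the reflected spot's imaging offset.

    In laser triangulation the emitter and receiver are separated by a
    baseline; a nearer obstacle images the reflected spot farther from the
    receiver's optical center, so distance = focal_length * baseline / offset.
    """
    if offset_m <= 0.0:
        raise ValueError("offset must be positive; zero offset means the target is at infinity")
    return focal_length_m * baseline_m / offset_m


# Illustrative values: 4 mm focal length, 50 mm baseline, 1 mm spot offset.
print(triangulation_distance(0.001, 0.004, 0.05))  # 0.2 (metres)
```

A halved offset doubles the estimated distance, which is why resolution degrades for far targets under this scheme.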
Referring to fig. 3 again, the cleaning robot 100 further includes a protective cover 50. The protective cover 50 covers the laser radar 30 to protect it, reducing the possibility that the laser radar 30 is damaged by pressure during the movement of the cleaning robot 100. The protective cover 50 is provided with a light-transmitting area 51, and the light-transmitting area 51 is exposed from the top surface 15 of the host 10. The light-transmitting area 51 passes both the laser emitted from the laser emitting hole 31 and the laser reflected by an obstacle back to the laser receiving portion 35 of the laser radar 30. In this embodiment, the light-transmitting area 51 is a through hole, and the aperture of the through hole is smaller than 8.5 mm, so as to prevent an external object (e.g., a finger) from entering the protective cover 50 and damaging the laser radar 30. It is understood that the form of the light-transmitting area 51 is not limited; the light-transmitting area 51 of the protective cover 50 may instead be made of a light-transmitting material, such as transparent glass, to prevent impurities such as dust from entering the protective cover 50 and damaging the laser radar 30.
The camera device 70 is fixed to the forward portion 11 and captures images to acquire environmental information in the direction the forward portion 11 of the cleaning robot 100 faces. The camera device 70 is located between the top surface 15 and the bottom surface 17 of the host 10. The bumper 20 defines a light-transmitting area 21 at a position corresponding to the camera device 70 so that light outside the cleaning robot 100 can reach the camera device 70. Because the field of view of the camera device 70 is large, the environmental information that the camera device 70 can acquire includes information about obstacles, in the direction the forward portion 11 faces, whose height is lower than the top surface 15. In other words, the camera device 70 can acquire obstacle information within the detection blind area of the laser radar 30, which improves the detection accuracy of the cleaning robot 100, reduces the possibility that the cleaning robot 100 collides with an obstacle during operation, and improves the operation efficiency of the cleaning robot 100. In the present embodiment, the camera device 70 can recognize an obstacle from the captured image. The camera device 70 has a lens center 700 (shown in fig. 6), and the optical axis of the camera device 70 passes through the lens center 700.
In the present embodiment, the height of the laser emitting hole 31 from the bottom surface 17 is greater than the height of the lens center 700 from the bottom surface 17, the laser radar 30 is located on the central axis γ of the cleaning robot 100, and the optical axis of the camera device 70 is coaxial with the central axis γ. This arrangement simplifies data processing such as image recognition, obstacle ranging, path planning, and map construction performed by the cleaning robot 100, thereby improving the accuracy with which the cleaning robot 100 detects obstacles. That the laser radar 30 is located on the central axis γ means that the center of the laser radar 30 may lie on the central axis γ, or that the center of the laser radar 30 may be offset from the central axis γ while the central axis γ still passes through the laser radar 30.
It is understood that the positions of the laser radar 30 and the camera device 70 are not limited. For example, the laser radar 30 may be disposed near the central axis γ of the host 10 without the central axis γ passing through it. In some embodiments, the optical axis of the camera device 70 does not coincide with the central axis γ. In some embodiments, the lens center of the camera device 70 is located on the central axis γ. It suffices that the laser radar 30 is disposed on the host 10 to emit laser and that the camera device 70 is fixed to the host 10 to capture images.
It is understood that there may be a plurality of camera devices 70. In some embodiments, the number of camera devices 70 is even, and the camera devices 70 are disposed on the host 10 symmetrically about the central axis γ; in other words, the number of camera devices 70 on one side of the central axis γ equals the number on the other side, and their positions correspond one-to-one across the central axis γ. In some embodiments, an even number of camera devices 70 may instead be disposed asymmetrically, adjacent to the central axis γ.
In some embodiments, the number of camera devices 70 is odd, and the lens center 700 of one camera device 70 is disposed on or adjacent to the central axis γ, so as to simplify data processing of the cleaning robot 100, such as image recognition, obstacle ranging, path planning, and map construction, and to improve obstacle detection accuracy.
In some embodiments, a plurality of camera devices 70 may be integrated into a module; modularizing the camera devices 70 improves the ranging accuracy and the installation consistency of the cleaning robot 100.
Assume the vertical angle of view of the camera device 70 is a (as shown in fig. 3) and the horizontal angle of view of the camera device 70 is b (as shown in fig. 5); the field of view of the camera device 70 is then the two-dimensional field formed by the vertical angle of view a and the horizontal angle of view b. A larger angle of view gives the camera device 70 a larger field of view but a smaller optical magnification, and an excessively large angle of view tends to distort the image acquired by the camera device 70, affecting detection accuracy. In one embodiment, the maximum longitudinal angle of view a ranges from 50° to 70°, and the maximum lateral angle of view b ranges from 110° to 130°, so that the camera device 70 can capture high-quality images while retaining a good field of view.
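As a rough illustration of what these angles mean, the linear extent visible at a given distance grows with the tangent of half the angle of view. This is a simplified pinhole-camera sketch with illustrative numbers, not a specification of the camera device 70:

```python
import math

def visible_extent(distance_m: float, angle_of_view_deg: float) -> float:
    """Linear extent (width or height) covered at a given distance by an
    angle of view, under a simple pinhole-camera model."""
    return 2.0 * distance_m * math.tan(math.radians(angle_of_view_deg) / 2.0)


# With a 120-degree horizontal angle of view, a wall 1 m ahead is visible
# across roughly 3.46 m; with a 60-degree vertical angle, about 1.15 m.
print(round(visible_extent(1.0, 120.0), 2))  # 3.46
print(round(visible_extent(1.0, 60.0), 2))   # 1.15
```

The tangent grows steeply as the angle approaches 180°, which is one reason very wide lenses trade coverage for distortion at the image edges.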
The cleaning robot 100 further includes a traveling unit 80, and the traveling unit 80 is movably disposed on the host 10 and partially protrudes from the bottom surface 17.
Referring to fig. 7, fig. 7 is a block diagram of a cleaning robot according to an embodiment. The cleaning robot 100 further includes a processor 110 and a driving unit 120, both of which are disposed on the host 10. The driving unit 120 is used to drive the traveling unit 80 to move, and may be a driving device such as a motor. The laser radar 30, the camera device 70, and the driving unit 120 are all connected to the processor 110. The processor 110 constructs a map and plans a motion path according to the environmental information acquired by the laser radar 30 and the camera device 70, and controls the driving unit 120 to drive the traveling unit 80 along the planned path. The environmental information acquired by the laser radar 30 is defined as first environmental information, and the environmental information acquired by the camera device 70 is defined as second environmental information.
In some embodiments, the optical axis of the camera device 70 is inclined with respect to a horizontal plane, and the central axis γ of the cleaning robot 100 is parallel to that horizontal plane; that is, the optical axis of the camera device 70 is inclined with respect to the central axis γ of the host 10. The lens center 700 of the camera device 70 is located on the central axis γ, and the included angle between the optical axis of the camera device 70 and the central axis γ of the host 10 ranges from 0.5° to 10°. It is understood that the lens center 700 of the camera device 70 may also be off the central axis γ. The optical axis of the camera device 70 may be inclined upward with respect to the horizontal plane. Referring to fig. 9, fig. 9 is a schematic diagram in which the optical axis of the camera device is inclined in a first direction relative to the central axis γ of the host; the first direction (the Y direction shown in fig. 9) is perpendicular to the central axis γ and points away from the bottom surface 17, and the optical axis δ of the camera device 70 is inclined in the first direction at 5° relative to the central axis γ of the host 10. Compared with when the optical axis of the camera device 70 is coaxial with the central axis γ of the host 10, when the optical axis is inclined in the first direction, the visible angle of the camera device 70 with respect to the walking surface 200 in the forward direction of the host 10 decreases, and the shooting blind zone with respect to the walking surface 200 increases.
Referring to fig. 10, fig. 10 is a schematic diagram in which the optical axis of the camera device is inclined in a second direction relative to the central axis of the host of the cleaning robot according to an embodiment. The optical axis δ of the camera device 70 is inclined, relative to the central axis γ of the host 10, in a second direction opposite to the first direction. Compared with when the optical axis of the camera device 70 lies on the central axis γ of the host 10, when the optical axis is inclined in the second direction, the visible angle of the camera device 70 with respect to the walking surface 200 in the forward direction of the host 10 increases, and the shooting blind zone with respect to the walking surface 200 decreases. However, if the inclination angle between the optical axis δ of the camera device 70 and the central axis γ of the host 10 is too large, most of the image collected by the camera device 70 is the walking surface 200, which degrades the user's viewing and monitoring experience. In a specific implementation, whether the optical axis of the camera device 70 is inclined with respect to the central axis γ of the host 10 is chosen according to the structural space constraints and the required imaging characteristics.
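The trade-off described above can be made concrete with simple geometry: the nearest floor point the camera can see is set by the lens height and the downward angle of the lower edge of the vertical field of view. The lens height and angles below are illustrative assumptions, not dimensions of the cleaning robot 100:

```python
import math

def ground_blind_distance(lens_height_m: float, vertical_fov_deg: float,
                          tilt_deg: float) -> float:
    """Horizontal distance from the lens to the nearest visible floor point.

    A positive tilt tips the optical axis toward the floor (the "second
    direction"); a negative tilt tips it away (the "first direction").
    """
    lower_edge_deg = tilt_deg + vertical_fov_deg / 2.0  # angle below horizontal
    if lower_edge_deg <= 0.0:
        return math.inf  # lower edge of the FOV never reaches the floor
    return lens_height_m / math.tan(math.radians(lower_edge_deg))


# Illustrative: lens 6 cm above the floor, 60-degree vertical field of view.
level = ground_blind_distance(0.06, 60.0, 0.0)   # optical axis horizontal
tilted = ground_blind_distance(0.06, 60.0, 5.0)  # tilted 5 degrees downward
print(round(level, 3), round(tilted, 3))  # tilting down shrinks the blind zone
```

Running the same comparison with a negative tilt shows the opposite effect, matching the behavior described for the first direction in fig. 9.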
Referring to fig. 7 again, the cleaning robot 100 further includes a memory 130, a power supply unit 150, and a communication bus 170. The laser radar 30, the camera device 70, the memory 130, the power supply unit 150, and the processor 110 are connected by the communication bus 170.
The memory 130 may be integrated in the processor 110 or may be provided separately from the processor 110. In some embodiments, the laser radar processor 37 in the laser radar 30 may be omitted, and the processor 110 obtains the first environmental information from the information of the laser receiving portion 35; similarly, the camera processor 7335 in the camera device 70 shown in fig. 8 may be omitted, and the processor 110 obtains the second environmental information from the image information provided by the image sensor 7331.
For ease of illustration, only one memory and processor are shown in FIG. 7. In actual practice, there may be multiple processors and memories.
The processor 110 may be implemented by a dedicated processing chip, a processing circuit, a processor, or a general-purpose chip.
It should be understood that in the embodiments of the present application, the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
The memory 130 is used to store instructions and data, including but not limited to map data and temporary data generated when controlling the operation of the cleaning robot 100, such as position data and speed data of the cleaning robot 100. The processor 110 reads the instructions stored in the memory 130 to perform the corresponding functions. The memory 130 may include random access memory (RAM) and non-volatile memory (NVM). The non-volatile memory may include a hard disk drive (HDD), a solid-state drive (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
The power supply unit 150 includes one or more rechargeable batteries, a charging circuit connected to the rechargeable batteries, and electrodes of the rechargeable batteries, and supplies the power required for the operation of the cleaning robot 100. The electrodes may be disposed on a side of the body of the cleaning robot 100 or at the bottom of the body. The power supply unit 150 may also include a battery parameter detection component for detecting battery parameters such as voltage, current, and battery temperature. When the operation mode of the cleaning robot 100 is switched to the recharging mode, the cleaning robot 100 starts to search for a charging pile and charges itself at the charging pile.
It should be noted that the connection relationship between the units or components in the cleaning robot 100 is not limited to the connection relationship shown in fig. 7. For example, the processor 110 may be communicatively coupled to other units or components via wireless communication.
Referring to fig. 11, fig. 11 is a bottom view of a cleaning robot according to an embodiment. The cleaning robot 100 further includes a cleaning unit 190. Under the control of the processor 110, the driving unit 120 drives the cleaning unit 190 and the traveling unit 80: it drives the traveling unit 80 to move so that the cleaning robot 100 moves, and drives the cleaning unit 190 to clean the walking surface 200.
The traveling unit 80 includes a first traveling wheel 81 and a second traveling wheel 83 symmetrically arranged on opposite sides of the bottom of the host 10 of the cleaning robot 100 (the first traveling wheel and the second traveling wheel may also be referred to as the left wheel and the right wheel), and a guide wheel 85. During a task, the traveling unit 80 performs motion operations including moving forward, moving backward, and rotating. The guide wheel 85 may be disposed at the front or the rear of the host 10.
The cleaning unit 190 includes a main brush 191 and one or more side brushes 192. The main brush 191 is mounted on the host 10 of the cleaning robot 100 and protrudes from the bottom surface 17 of the host 10. Optionally, the main brush 191 is a drum-shaped brush that rotates relative to the contact surface in a roller manner. The side brushes 192 are mounted at the left and right edge portions of the front end of the bottom surface 17 of the cleaning robot 100; that is, a side brush 192 is mounted roughly in front of the first traveling wheel 81 and/or the second traveling wheel 83. The side brushes 192 clean areas that the main brush 191 cannot reach. A side brush 192 can not only rotate in place but can also be mounted so as to protrude beyond the outline of the cleaning robot 100, enlarging the area the cleaning robot 100 can sweep. It is understood that the cleaning unit 190 is not limited to brushes; it may instead use a rag or another cleaning element.
It is understood that in one or more embodiments, the cleaning robot 100 may further include an input-output unit, a wireless communication unit, a display unit, and the like.
It should be noted that the cleaning robot 100 may further include other units or components, or only include some of the units or components, or lack some of the units or components (for example, in other embodiments, the cleaning robot 100 may not include the cleaning unit 190), which is not limited in this embodiment, and only the cleaning robot 100 is taken as an example for description.
The above disclosure merely illustrates preferred embodiments of the present invention and is not intended to limit the scope of the invention, which is defined by the appended claims.