CN107678038A - Robot collision-proof method, robot and storage medium - Google Patents
- Publication number
- CN107678038A (application CN201710891402.9A)
- Authority
- CN
- China
- Prior art keywords
- robot
- image
- distance
- laser
- reflector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Manipulator (AREA)
Abstract
The present invention provides a robot anti-collision method, a robot, and a storage medium. The method includes: using a laser installed on a first robot to illuminate a second robot to which a reflector is attached, and collecting the reflected light information thus formed; forming a light image from the collected reflected light information; determining the intensity image that the reflector on the second robot presents in the light image; and determining the distance between the first robot and the second robot according to the size of the intensity image that the reflector presents in the light image. In the invention, a reflector of fixed length is attached to the robot; when the distance is short, the reflector occupies a relatively large area in the lidar light image, and when the distance is long, it occupies a small area. Determining the distance between the first robot and the second robot from the size of the intensity image that the reflector presents in the light image effectively solves the prior-art problem that robots cannot avoid each other effectively and easily collide with one another.
Description
Technical field
The invention belongs to the field of robot technology, and in particular relates to a robot anti-collision method, a robot, and a storage medium.
Background technology
With the rapid development of the prior art, robots have begun to enter ordinary people's lives, and their fields of use are relatively broad. Taking the service industry as an example, robots can be engaged in catering, maintenance, repair, transport, cleaning, rescue, or monitoring work, completing services that benefit people and bringing great convenience to everyday life.

At present, the robots in question use a laser radar (lidar) as the main sensor. To protect the lidar, for example against collisions, dust, and stray light, nearly all lidar-equipped devices follow the same practice of recessing the lidar a certain distance into the device body. This causes a common defect. As shown in the figure, suppose two identical devices move toward each other, each with its lidar mounted 30 cm inside the device. When the two devices are already close to touching, the distance recognized by the two facing lidars is still 60 cm. If the safe distance of the device is set to more than 60 cm, the device naturally cannot pass through an environment narrower than 1.2 m.

Current robots therefore have the problem that they cannot avoid each other effectively and easily collide with one another.
Summary of the invention
In view of the above shortcomings of the prior art, it is an object of the invention to provide a robot anti-collision method, a robot, and a storage medium, for solving the prior-art problem that robots cannot avoid each other effectively and easily collide with one another.

In order to achieve the above and other related objects, the present invention provides a robot anti-collision method. The robot anti-collision method includes: using a laser installed on a first robot to illuminate a second robot to which a reflector is attached, and collecting the reflected light information thus formed; forming a light image from the collected reflected light information; determining the intensity image that the reflector on the second robot presents in the light image; and determining the distance between the first robot and the second robot according to the size of the intensity image that the reflector presents in the light image.
In an embodiment of the invention, determining the intensity image that the reflector on the second robot presents in the light image specifically includes: identifying the pixels whose light intensity value is greater than a preset value; and determining the bar-shaped intensity image formed by the plurality of pixels whose light intensity values are greater than the preset value, the bar-shaped intensity image being the reflected light of the reflector on the second robot.
In an embodiment of the invention, determining the distance between the first robot and the second robot according to the size of the intensity image that the reflector presents in the light image specifically includes: determining the width of the bar-shaped intensity image; determining the distance between the laser on the first robot and the reflector on the second robot according to the proportional relationship between the width of the bar-shaped intensity image and the distance between the laser on the first robot and the reflector on the second robot; and determining the distance between the first robot and the second robot from the distance between the laser on the first robot and the reflector on the second robot.
In an embodiment of the invention, the proportional relationship between the width of the bar-shaped intensity image and the distance between the laser on the first robot and the reflector on the second robot is D = C / L, where D is the width of the bar-shaped intensity image, L is the distance between the laser on the first robot and the reflector on the second robot, and C is an experimentally determined constant.
In an embodiment of the invention, a preset value is subtracted from the determined distance between the first robot and the second robot to form the finally output distance between the first robot and the second robot.
In an embodiment of the invention, the light image further includes a range image, formed from the reflected light information, that the second robot presents in the light image.
In an embodiment of the invention, the intensity image and the range image are presented in different colors in the light image.
Embodiments of the invention also provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the robot anti-collision method described above is implemented.
Embodiments of the invention also provide a robot including a processor, a memory, and a laser. The memory stores program instructions, and the processor runs the program instructions to implement the robot anti-collision method described above.
As described above, the robot anti-collision method, robot, and storage medium of the present invention have the following beneficial effects:

1. In the present invention, a reflector of fixed length is attached to the robot. When the distance is short, the reflector occupies a relatively large area in the lidar light image, and when the distance is long, it occupies a small area. Determining the distance between the first robot and the second robot from the size of the intensity image that the reflector presents in the light image effectively solves the prior-art problem that robots cannot avoid each other effectively and easily collide with one another.

2. The robot anti-collision method of the robot of the invention makes the robot more intelligent and better able to serve users.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the robot anti-collision method of the present invention in an embodiment.
Fig. 2 is a schematic view of the installation of the lasers on the first robot and the second robot in the robot anti-collision method of the present invention.
Fig. 3 and Fig. 4 are schematic views of the light image formed from the collected reflected light information in the robot anti-collision method of the present invention.
Fig. 5 is a schematic view of determining the intensity image that the reflector on the second robot presents in the light image in the robot anti-collision method of the present invention.
Fig. 6 is a schematic view of determining the distance between the first robot and the second robot in the robot anti-collision method of the present invention.
Fig. 7 and Fig. 8 are schematic views of identifying the width of the bar-shaped intensity image in the robot anti-collision method of the present invention.
Description of reference numerals
100 first robot
110 laser
120 groove
200 second robot
210 laser
220 groove
Embodiment
The embodiments of the present invention are illustrated below by specific examples, and those skilled in the art can easily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention can also be implemented or applied through other different specific embodiments, and the details in this specification can also be modified or changed in various ways based on different viewpoints and applications without departing from the spirit of the present invention. It should be noted that, in the absence of conflict, the following embodiments and the features in the embodiments can be combined with one another.
Referring to Fig. 1 to Fig. 8, it should be noted that the drawings provided in the following embodiments only illustrate the basic concept of the present invention in a schematic way. The drawings show only the components related to the present invention rather than being drawn according to the number, shape, and size of the components in actual implementation; the form, quantity, and proportion of each component in actual implementation may be changed arbitrarily, and the component layout may also be more complex.
The purpose of this embodiment is to provide a robot anti-collision method, a robot, and a storage medium, for solving the prior-art problem that robots cannot avoid each other effectively and easily collide with one another. The principle and implementation of the robot anti-collision method, robot, and storage medium of the present invention are described in detail below, so that those skilled in the art can understand them without creative work.
The principle of the robot anti-collision method and robot of this embodiment is as follows. Most lidars provide light-intensity information. A reflective strip can be attached in a groove of the device, or a reflective coating can be applied. Using the intensity information of the lidar, in such a reflective environment the laser produces very strong intensity readings. The environments in which most devices operate, such as airports, warehouses, roads, hotels, and hospitals, contain few highly reflective surfaces, so misidentification rarely occurs. If two devices move toward each other, the identity and position of the other device can be determined from its high reflectivity, and by using the intensity information of the lidar the robot can pass through environments of lesser width, even environments less than 1.2 m wide, so the two sides can avoid each other in time. The robot anti-collision method, robot, and storage medium of this embodiment are specifically described below.
This embodiment provides a robot anti-collision method. As shown in Fig. 1, the robot anti-collision method includes the following steps:

Step S110: use the laser installed on the first robot to illuminate the second robot, to which a reflector is attached, and collect the reflected light information thus formed.

Step S120: form a light image from the collected reflected light information.

Step S130: determine the intensity image that the reflector on the second robot presents in the light image.

Step S140: determine the distance between the first robot and the second robot according to the size of the intensity image that the reflector presents in the light image.
The above steps of the robot anti-collision method of this embodiment are described in detail below.
Step S110: use the laser installed on the first robot to illuminate the second robot, to which a reflector is attached, and collect the reflected light information thus formed.
In this embodiment, as shown in Fig. 2, the laser 110 on the first robot 100 is located in a groove 120 on the first robot 100, where the groove 120 has a certain depth, for example but not limited to 10 cm to 30 cm. This embodiment is described taking a groove 120 of 30 cm depth on the first robot 100 as an example. Similarly, the laser 210 on the second robot 200 is located in a groove 220 on the second robot 200, where the groove 220 has a certain depth, for example but not limited to 10 cm to 30 cm. This embodiment is described taking a groove 220 of 30 cm depth on the second robot 200 as an example. In this embodiment, the first robot 100 and the second robot 200 are used only as an example to illustrate the principle of the robot anti-collision method; in a practical application scenario, the laser on any robot is located in a groove on that robot, where the groove has a certain depth, for example but not limited to 10 cm to 30 cm.
In this embodiment, a reflector is provided in the groove of each robot. For example, the laser 110 on the first robot 100 illuminates the second robot 200, to which a reflector is attached, the laser 110 on the first robot 100 receives the reflected light, and the reflected light information obtained by the laser 110 on the first robot 100 is collected.
A highly reflective object is attached in the groove of the robot, and the laser is installed in the groove of the robot. The laser can collect intensity information: the laser illuminates the groove to which the reflector is attached, the laser collects the reflected light of the illumination, and the intensity information output by the laser is then collected. The laser can be regarded as a dense range sensor that can measure distance values, in millimetres, over 270° in front of the robot. The same laser can also measure the light intensity, in cd, over the 270° in front. By detecting positions with especially strong light intensity, the corresponding range can be identified.
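As an illustration of the scan data described above, the following Python sketch shows one possible in-memory representation, assuming a scan of 1080 beams spread over the 270° field of view with a range value in millimetres and an intensity value per beam. The names LidarScan, make_angles, and strong_beams are illustrative and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LidarScan:
    """One lidar sweep: e.g. 1080 beams covering 270 degrees (45 to 315 degrees)."""
    angles_deg: List[float]   # beam angle of each beam, in degrees
    ranges_mm: List[float]    # measured distance of each beam, in millimetres
    intensities: List[float]  # measured light intensity of each beam

def make_angles(n_beams: int = 1080, start: float = 45.0, end: float = 315.0) -> List[float]:
    """Evenly spaced beam angles across the field of view."""
    step = (end - start) / (n_beams - 1)
    return [start + i * step for i in range(n_beams)]

def strong_beams(scan: LidarScan, intensity_threshold: float) -> List[int]:
    """Indices of beams whose intensity exceeds the preset threshold."""
    return [i for i, v in enumerate(scan.intensities) if v > intensity_threshold]
```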
The intensity information can serve as a feature of the laser position on the robot, since a general scene contains no highly reflective places. In this embodiment, a reflector is attached at the laser position in order to achieve a better recognition effect, where the reflector is a highly reflective object with a reflectivity of national standard class 4 or above. The reflector is, for example but not limited to, a reflective sticker or a reflective strip.
Step S120: form a light image from the collected reflected light information.

Forming a light image from the reflected light information collected by the laser is an ordinary technical means in the art and is not described again here. The formed light image is shown in Fig. 3.
Step S130: determine the intensity image that the reflector on the second robot 200 presents in the light image.

In this embodiment, the light image further includes a range image, formed from the reflected light information, that the second robot 200 presents in the light image.
The detection range of the laser is from 45° to 315°, in which there are, for example, 1080 laser beams; every beam displays both a distance and a light intensity at its position in the intensity image. These 1080 beams therefore form a plan view of distance and light intensity. Fig. 4 shows the distance and light-intensity image of the laser.
In this embodiment, the intensity image and the range image are presented in different colors in the light image; for example, green is distance and pink is light intensity.
In this embodiment, as shown in Fig. 5, determining the intensity image that the reflector on the second robot 200 presents in the light image specifically includes:

Step S131: identify the pixels whose light intensity value is greater than a preset value.

As shown in Fig. 4, the light intensity value facing the groove with the reflector is very large, while the light intensity values in other directions in the environment, where there is no reflective strip, differ greatly from those of the reflective strip, so the pixels whose light intensity value is greater than the preset value are easy to identify.

Step S132: determine the bar-shaped intensity image formed by the plurality of pixels whose light intensity values are greater than the preset value; the bar-shaped intensity image is the reflected light of the reflector on the second robot 200.
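A minimal sketch of steps S131 and S132 under the same assumptions as the previous block: beams whose intensity exceeds the preset value are grouped into contiguous runs, and the widest run is taken as the bar-shaped intensity image. The preset value and the helper names are illustrative.

```python
from typing import List, Optional, Tuple

def bar_segments(intensities: List[float], preset: float) -> List[Tuple[int, int]]:
    """Group consecutive beams whose intensity exceeds `preset` into (start, end) runs."""
    runs: List[Tuple[int, int]] = []
    start: Optional[int] = None
    for i, v in enumerate(intensities):
        if v > preset and start is None:
            start = i                       # a bright run begins
        elif v <= preset and start is not None:
            runs.append((start, i - 1))     # the bright run just ended
            start = None
    if start is not None:
        runs.append((start, len(intensities) - 1))
    return runs

def bar_intensity_image(intensities: List[float], preset: float) -> Optional[Tuple[int, int]]:
    """Take the widest bright run as the reflector's bar-shaped intensity image."""
    runs = bar_segments(intensities, preset)
    return max(runs, key=lambda r: r[1] - r[0] + 1) if runs else None
```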
Step S140: determine the distance between the first robot 100 and the second robot 200 according to the size of the intensity image that the reflector presents in the light image.

The reflector has a fixed length; when the distance is short, the bar-shaped intensity image occupies a relatively large area, and when the distance is long, the area occupied by the bar-shaped intensity image is small.
In this embodiment, as shown in Fig. 6, determining the distance between the first robot 100 and the second robot 200 according to the size of the intensity image that the reflector presents in the light image specifically includes:

Step S141: determine the width of the bar-shaped intensity image.

Step S142: determine the distance between the laser 110 on the first robot 100 and the reflector on the second robot 200 according to the proportional relationship between the width of the bar-shaped intensity image and the distance between the laser 110 on the first robot 100 and the reflector on the second robot 200.
For example, Fig. 7 shows the intensity image obtained by one of the robots when the two robots are about 40 cm apart. The bar-shaped column is the light intensity value and is very apparent; the corresponding region is circled in Fig. 7 and occupies 63 of the 1080 beams. As another example, in Fig. 8 the two robots are 1.5 m apart, and the circled region identified as the reflector occupies 21 beams. The same reflective section at different distances thus yields a consistent relationship between width and distance, so the calibrated reflective strip can be identified accurately.
Specifically, in the present embodiment, the proportional relationship between the width of the bar-shaped intensity image and the distance between the laser 110 on the first robot 100 and the reflector on the second robot 200 is D = C / L, where D is the width of the bar-shaped intensity image, L is the distance between the laser 110 on the first robot 100 and the reflector on the second robot 200, and C is an experimentally determined constant, for example 20000 according to a specific test procedure.
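The relationship D = C / L can be inverted to recover the laser-to-reflector distance from the measured bar width. The sketch below assumes D is the width of the bar-shaped intensity image in beams and that C has been obtained from test measurements (the value 20000 is only the example given in the text; the resulting L is in whatever unit was used when calibrating C). The function names are illustrative.

```python
from typing import List, Tuple

def distance_from_bar_width(bar_width_beams: float, c: float = 20000.0) -> float:
    """Invert D = C / L to obtain L, the laser-to-reflector distance."""
    if bar_width_beams <= 0:
        raise ValueError("bar width must be positive")
    return c / bar_width_beams

def calibrate_c(samples: List[Tuple[float, float]]) -> float:
    """Estimate C from (bar_width, known_distance) test pairs by averaging D * L."""
    return sum(d * l for d, l in samples) / len(samples)
```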
Step S143: determine the distance between the first robot 100 and the second robot 200 from the distance between the laser 110 on the first robot 100 and the reflector on the second robot 200.

In this embodiment, a preset value is subtracted from the determined distance between the first robot 100 and the second robot 200 to form the finally output distance between the first robot 100 and the second robot 200, where the preset value is, for example but not limited to, 0.2 m to 0.5 m.
Through the above steps, this embodiment can identify the highly reflective region and then find the corresponding region. After the distance between the first robot 100 and the second robot 200 is determined, the obstacle in this region is artificially lengthened. For example, if the laser recognizes the reflective region at 1 m, the obstacle is automatically lengthened by 0.4 m, so the laser considers the obstacle ahead to be 0.6 m away rather than 1 m. This solves the head-on collision problem well and effectively solves the prior-art problem that robots cannot avoid each other effectively and easily collide with one another.
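A minimal sketch of how the finally output distance might be formed under the embodiment's description: the measured distance is reduced by a preset margin (0.2 m to 0.5 m in the embodiment), which is equivalent to lengthening the detected obstacle toward the robot, so a reflector recognized at 1 m with a 0.4 m margin is reported as an obstacle 0.6 m ahead. All names are illustrative.

```python
def reported_obstacle_distance(measured_distance_m: float,
                               safety_margin_m: float = 0.4) -> float:
    """Shrink the measured distance by the preset margin (i.e. lengthen the obstacle)."""
    return max(0.0, measured_distance_m - safety_margin_m)

# Example from the description: a reflector recognized at 1 m with a 0.4 m margin
# is treated as an obstacle 0.6 m ahead rather than 1 m.
assert abs(reported_obstacle_distance(1.0, 0.4) - 0.6) < 1e-9
```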
This embodiment also provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the robot anti-collision method described above is implemented. The robot anti-collision method has been described in detail above and is not repeated here.
This embodiment also provides a robot including a processor, a memory, and a laser. The memory stores program instructions, and the processor runs the program instructions to implement the robot anti-collision method described above. The robot anti-collision method has been described in detail above and is not repeated here. In this embodiment, for example, the robot follows a walking person while keeping a relative distance by means of a wheeled drive device.
In summary, in the present invention a reflector is attached to the robot. The reflector can have a fixed length; when the distance is short, it occupies a relatively large area in the lidar light image, and when the distance is long, the area it occupies is small. Determining the distance between the first robot 100 and the second robot from the size of the intensity image that the reflector presents in the light image effectively solves the prior-art problem that robots cannot avoid each other effectively and easily collide with one another. The robot anti-collision method of the robot of the invention makes the robot more intelligent and better able to serve users. The present invention therefore effectively overcomes various shortcomings of the prior art and has high industrial utilization value.
The above embodiments only exemplarily illustrate the principles and effects of the present invention and are not intended to limit the present invention. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the present invention. Therefore, all equivalent modifications or changes completed by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall be covered by the claims of the present invention.
Claims (10)
1. A robot anti-collision method, characterized in that the robot anti-collision method includes:
using a laser installed on a first robot to illuminate a second robot to which a reflector is attached, and collecting the reflected light information thus formed;
forming a light image from the collected reflected light information;
determining the intensity image that the reflector on the second robot presents in the light image;
determining the distance between the first robot and the second robot according to the size of the intensity image that the reflector presents in the light image.
2. The robot anti-collision method according to claim 1, characterized in that determining the intensity image that the reflector on the second robot presents in the light image specifically includes:
identifying the pixels whose light intensity value is greater than a preset value;
determining the bar-shaped intensity image formed by the plurality of pixels whose light intensity values are greater than the preset value, the bar-shaped intensity image being the reflected light of the reflector on the second robot.
3. The robot anti-collision method according to claim 2, characterized in that determining the distance between the first robot and the second robot according to the size of the intensity image that the reflector presents in the light image specifically includes:
determining the width of the bar-shaped intensity image;
determining the distance between the laser on the first robot and the reflector on the second robot according to the proportional relationship between the width of the bar-shaped intensity image and the distance between the laser on the first robot and the reflector on the second robot;
determining the distance between the first robot and the second robot from the distance between the laser on the first robot and the reflector on the second robot.
4. The robot anti-collision method according to claim 3, characterized in that the proportional relationship between the width of the bar-shaped intensity image and the distance between the laser on the first robot and the reflector on the second robot is:
D = C / L;
where D is the width of the bar-shaped intensity image, L is the distance between the laser on the first robot and the reflector on the second robot, and C is an experimentally determined constant.
5. The robot anti-collision method according to claim 3 or 4, characterized in that a preset value is subtracted from the determined distance between the first robot and the second robot to form the finally output distance between the first robot and the second robot.
6. The robot anti-collision method according to claim 1, characterized in that the light image further includes a range image, formed from the reflected light information, that the second robot presents in the light image.
7. The robot anti-collision method according to claim 6, characterized in that the intensity image and the range image are presented in different colors in the light image.
8. A computer-readable storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the robot anti-collision method according to any one of claims 1 to 7 is implemented.
9. A robot, characterized by comprising a processor, a memory, and a laser, the robot being provided with a reflector, wherein the memory stores program instructions, and the processor forms a light image from the reflected light information collected by the laser and runs the program instructions to implement the robot anti-collision method according to any one of claims 1 to 7.
10. The robot according to claim 9, characterized in that the reflector is attached in the groove in which the laser is installed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710891402.9A CN107678038A (en) | 2017-09-27 | 2017-09-27 | Robot collision-proof method, robot and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710891402.9A CN107678038A (en) | 2017-09-27 | 2017-09-27 | Robot collision-proof method, robot and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107678038A true CN107678038A (en) | 2018-02-09 |
Family
ID=61138502
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710891402.9A Pending CN107678038A (en) | 2017-09-27 | 2017-09-27 | Robot collision-proof method, robot and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107678038A (en) |
- 2017
- 2017-09-27: Application CN201710891402.9A filed in China; publication CN107678038A (status: pending)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1437063A (en) * | 1995-06-22 | 2003-08-20 | 3DV Systems Ltd. | Method and apparatus for generating range subject distance image |
CN104272321A (en) * | 2012-05-01 | 2015-01-07 | 讯宝科技公司 | Apparatus for and method of electro-optically reading direct part marking indicia by image capture |
CN103941306A (en) * | 2014-01-13 | 2014-07-23 | 苏州爱普电器有限公司 | Cleaning robot and method for controlling same to avoid obstacle |
CN104731092A (en) * | 2014-12-22 | 2015-06-24 | 南京阿凡达机器人科技有限公司 | Multi-directional barrier avoiding system of mobile robot |
CN104827482A (en) * | 2015-05-22 | 2015-08-12 | 上海思岚科技有限公司 | Robotic platform capable of moving automatically |
WO2017006544A1 (en) * | 2015-07-09 | 2017-01-12 | Canon Kabushiki Kaisha | Measurement apparatus for measuring shape of target object, system and manufacturing method |
CN107053219A (en) * | 2017-06-16 | 2017-08-18 | 齐鲁工业大学 | A kind of method for positioning mobile robot based on laser scanner Yu strong reflecting sign |
Non-Patent Citations (1)
Title |
---|
Wang Binming: "Research on Obstacle Avoidance of Mobile Robots Based on Multi-Sensor Information Fusion", China Master's Theses Full-text Database, Information Science and Technology Series *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108536142A (en) * | 2018-03-18 | 2018-09-14 | 上海交通大学 | Industrial robot anti-collision early warning system based on digital fringe projection and method |
CN108536142B (en) * | 2018-03-18 | 2020-06-12 | 上海交通大学 | Industrial robot anti-collision early warning system and method based on digital grating projection |
CN111390917A (en) * | 2020-05-08 | 2020-07-10 | 苏州博众机器人有限公司 | Robot anti-collision device and method and robot |
CN113741424A (en) * | 2021-08-04 | 2021-12-03 | 深圳市普渡科技有限公司 | Robot cooperative obstacle avoidance system, method, robot and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112749594B (en) | Information completion method, lane line identification method, intelligent driving method and related products | |
US11681746B2 (en) | Structured prediction crosswalk generation | |
Soilán et al. | Safety assessment on pedestrian crossing environments using MLS data | |
CN107808123B (en) | Image feasible region detection method, electronic device, storage medium and detection system | |
CN107678038A (en) | Robot collision-proof method, robot and storage medium | |
Shang et al. | Lidar based negative obstacle detection for field autonomous land vehicles | |
US20130107010A1 (en) | Surface segmentation from rgb and depth images | |
CN113378760A (en) | Training target detection model and method and device for detecting target | |
Fortin et al. | Instance segmentation for autonomous log grasping in forestry operations | |
Dabbiru et al. | Lidar data segmentation in off-road environment using convolutional neural networks (cnn) | |
Li et al. | A survey of 3D object detection algorithms for intelligent vehicles development | |
Chong et al. | Integrated real-time vision-based preceding vehicle detection in urban roads | |
CN107728633A (en) | Obtain object positional information method and device, mobile device and its control method | |
Tsai et al. | Automating the crack map detection process for machine operated crack sealer | |
Morton et al. | Positive and negative obstacle detection using the HLD classifier | |
WO2024016877A1 (en) | Roadside sensing simulation system for vehicle-road collaboration | |
CN114468843A (en) | Cleaning device, cleaning system, cleaning control method and device thereof, and medium | |
CN111813125A (en) | An indoor environment detection system and method based on a wheeled robot | |
CN108090459A (en) | A kind of road traffic sign detection recognition methods suitable for vehicle-mounted vision system | |
Eric et al. | Kinect depth sensor for computer vision applications in autonomous vehicles | |
CN113409282A (en) | Deformation detection method and device for box-type structure, electronic equipment and storage medium | |
CN118968470A (en) | Target recognition model detection method, device, equipment, medium and program product | |
Li et al. | Intelligent detection method with 3D ranging for external force damage monitoring of power transmission lines | |
Li et al. | Safety monitoring method for powerline corridors based on single‐stage detector and visual matching | |
Mehta et al. | Supermarket shelf monitoring using ros based robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20180209 |
RJ01 | Rejection of invention patent application after publication |