CN110376593B - Target sensing method and device based on laser radar - Google Patents
- Publication number
- CN110376593B (application CN201910715446.5A)
- Authority
- CN
- China
- Prior art keywords
- target
- unmanned ship
- laser radar
- holder
- photoelectric system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 29
- 238000004458 analytical method Methods 0.000 claims abstract description 11
- 238000012544 monitoring process Methods 0.000 claims abstract description 10
- 238000000605 extraction Methods 0.000 claims abstract description 6
- 238000004364 calculation method Methods 0.000 claims description 15
- 238000001514 detection method Methods 0.000 claims description 14
- 238000012549 training Methods 0.000 claims description 12
- 230000003287 optical effect Effects 0.000 claims description 6
- 230000005693 optoelectronics Effects 0.000 claims description 6
- 230000008447 perception Effects 0.000 claims description 5
- 238000012795 verification Methods 0.000 claims description 4
- 238000009434 installation Methods 0.000 claims description 3
- 238000012545 processing Methods 0.000 description 8
- 230000000694 effects Effects 0.000 description 2
- 238000011160 research Methods 0.000 description 2
- 238000012935 Averaging Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000007123 defense Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/203—Instruments for performing navigational calculations specially adapted for water-borne vessels
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a target sensing method and device based on a laser radar. The method comprises the following steps. Step 1: the laser radar sensing module detects the water area around the unmanned ship to acquire position information of a target. Step 2: the pan-tilt control module calculates the pan-tilt parameters of the photoelectric system according to the position information of the target and controls the photoelectric system to accurately position the target. Step 3: the video analysis module detects and identifies the target monitoring picture frame by frame, acquiring the target category, the position of the target in the image, the extracted target contour, the size and the color. Step 4: the sensed target data are uploaded to the unmanned ship control center. In this method and device, the laser radar and the photoelectric system are linked to actively detect the water area around the unmanned ship; the target position is used to calculate the pan-tilt parameters of the photoelectric system, so that the target is accurately positioned and the target identification accuracy is improved.
Description
Technical Field
The present invention relates to a target sensing method and device, and in particular, to a target sensing method and device based on a laser radar.
Background
The unmanned surface vehicle (USV) is small, highly maneuverable and highly intelligent, and can replace manned vessels in completing complex and dangerous tasks in extreme environments; it therefore has wide application prospects in production, scientific research, national defense and other fields. Environment sensing is a key technology for realizing autonomous navigation of the unmanned ship and has extremely high research value.
Patent document CN109282813A discloses a global obstacle identification method for an unmanned surface vehicle, which includes: scanning obstacles with a navigation radar; calculating a corrected position of each obstacle; capturing the obstacle with a photoelectric tracker and calculating its size; averaging the acquired obstacle data; and carrying out global obstacle-avoidance planning. The method uses a navigation radar to detect targets around the unmanned ship and a photoelectric tracker to capture and identify them, and thus belongs to the active perception systems. However, the detection precision of a navigation radar is lower than that of a laser radar, so accurate target position information cannot be acquired; the method does not address how to calculate the pan-tilt parameters of the photoelectric tracker; and it only extracts the size of the target, without identifying or classifying it.
Patent document CN109255820A discloses an active sensing device and method based on an unmanned ship, which includes: an unmanned ship navigation control unit, a shipborne laser radar device, a camera calibration unit, a data acquisition and display unit, a data unit, a shore-based server control unit and a shore-based target detection unit. The shore-based target detection unit determines the position of a target after detecting it in a complex water area; the unmanned ship navigation control unit controls the unmanned ship to navigate to the vicinity of the target to be observed; the shipborne laser radar device detects the specific position of the target; the camera calibration unit calibrates all Cell groups through the camera; the data acquisition and display unit shoots a clear image of the target; and the shore-based server control unit stores the image data acquired by the data acquisition and display unit. This method calibrates the camera through the camera calibration unit so that the camera observes preset Cell areas, which makes the calibration work complicated and the positioning precision low; moreover, it only collects image data of the target and cannot identify or classify it.
Therefore, there is a need for a device that can actively sense the position information of water-area targets around the unmanned boat.
Disclosure of Invention
The invention aims to solve the technical problem of providing a target sensing method and device based on a laser radar.
To solve the above technical problem, the invention provides a target sensing method based on a laser radar, comprising the following steps:
step 1: the laser radar sensing module detects the water area around the unmanned ship to acquire position information of a target;
step 2: the pan-tilt control module calculates the pan-tilt parameters of the photoelectric system according to the position information of the target and controls the photoelectric system to accurately position the target;
step 3: the video analysis module detects and identifies the target monitoring picture frame by frame, acquiring the target category, the position of the target in the image, the extracted target contour, the size and the color;
step 4: the sensed target data are uploaded to the unmanned ship control center.
Preferably, the video analysis module detects and identifies the target monitoring pictures frame by frame using a yolov3 water-surface target detection model trained on the Darknet framework.
Preferably, step 1 specifically comprises: the laser radar detects the water area around the unmanned ship and acquires the distance, azimuth angle and radial dimension of the target relative to the unmanned ship, denoted D_VT, θ_A and L_T respectively; the position of the unmanned ship is acquired through a high-precision GPS and recorded as (x_lon, y_lat); the position information (x, y) of the target is then calculated as follows:
Preferably, step 2 specifically comprises: recording the pan-tilt parameters of the photoelectric system as (p, t, z); adjusting the pan-tilt to the zero azimuth position, i.e. pan-tilt parameters (0, 0, 0); and acquiring the included angle between the lens optical axis and geographic true north and the included angle between the optical axis and the horizontal plane, recorded as θ_1 and θ_2 respectively.
The pan-tilt parameters (p, t, z) are calculated as follows:
p = θ_A + θ_1
where θ_A is the azimuth angle of the target relative to the unmanned boat;
where h is the height of the photoelectric system installation position above sea level and D_VT is the distance of the target relative to the unmanned boat;
where M_max is the maximum magnification of the photoelectric system pan-tilt camera, f_min and f_max are respectively the minimum and maximum focal lengths of the camera lens, and f is the optimal focal length for focusing and displaying the current target, calculated as follows:
where D_VT is the distance between the target and the unmanned boat, L is the width of the photosensitive device (CMOS or CCD) of the photoelectric system camera, and L_T is the target radial dimension.
Preferably, training the yolov3 water-surface target detection model on the Darknet framework comprises: collecting pictures of a plurality of water-surface targets covering six categories (ships, reefs, islands, driftwood, floating ice and other floating objects), with one part of the pictures used as training data and the other part as verification data.
Preferably, step 4 includes uploading the geographic position of the target, the distance and azimuth of the target relative to the unmanned ship, and the contour, size and category of the target to the unmanned ship control center and storing them.
Another technical solution adopted by the present invention to solve the above technical problem is to provide a target sensing apparatus based on a lidar, including:
the system comprises a laser radar sensing module, a high-precision GPS and a radar data processing unit, wherein the laser radar is used for acquiring the distance, azimuth angle and radial size of a target relative to an unmanned ship, the high-precision GPS is used for calibrating the position of the laser radar, and the radar data processing unit is used for acquiring the position information of the target;
the photoelectric system, connected with the pan-tilt control module and the video analysis module;
the pan-tilt control module, comprising a pan-tilt parameter calculation unit and a high-speed pan-tilt control unit, wherein the pan-tilt parameter calculation unit calculates the pan-tilt parameters of the photoelectric system according to the position information of the target, and the high-speed pan-tilt control unit controls the photoelectric system to accurately position the target;
the video analysis module, comprising an image processing unit and a target identification unit, wherein the image processing unit detects and identifies the target monitoring picture frame by frame, and the target identification unit acquires the target category, the position of the target in the image, the extracted target contour, the size and the color;
and the unmanned ship control center is used for acquiring the sensed target data.
Compared with the prior art, the invention has the following beneficial effects. The laser radar and the photoelectric system are linked to actively detect the water area around the unmanned ship, improving detection efficiency. The target position is used to calculate the pan-tilt parameters of the photoelectric system, so the target is accurately positioned, the observation effect is enhanced, and the target identification accuracy is improved. A yolov3 water-surface target detection model trained on the Darknet framework classifies and identifies water-surface targets with a high identification rate. The method provides rich perception information, including the geographic position of the target, its distance and azimuth relative to the unmanned ship, its radial size, its contour and its category.
Drawings
FIG. 1 is a schematic view of an active sensing system of an unmanned surface vehicle with a target sensing device based on a laser radar installed in an embodiment of the invention;
FIG. 2 is a top view of the angle relationships used in the pan-tilt parameter calculation of the lidar-based target sensing device in an embodiment of the present invention;
FIG. 3 is a side view of the angle relationships used in the pan-tilt parameter calculation of the lidar-based target sensing device in an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a target sensing device based on a lidar in an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the figures and examples.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. Accordingly, the particular details set forth are merely exemplary and may be varied while still remaining within the spirit and scope of the present invention.
In the target sensing method and device based on the laser radar, the laser radar and the photoelectric system are linked to actively detect the water area around the unmanned ship; the target position is used to calculate the pan-tilt parameters of the photoelectric system, so that the target is accurately positioned and the target identification accuracy is improved.
Referring now to fig. 1, 1 is the unmanned surface vehicle, 2 is the mounting strut, 3 is the photoelectric pod, 4 is the laser radar, 5 is a water-surface target, and 6 is the unmanned ship control center.
The embodiment discloses a target perception method based on a laser radar, which comprises the following steps:
step 1: the laser radar sensing module detects a water area around the unmanned ship to acquire position information of a target;
step 2: the pan-tilt control module calculates the pan-tilt parameters of the photoelectric system according to the position information of the target and controls the photoelectric system to accurately position the target;
step 3: the video analysis module detects and identifies the target monitoring picture frame by frame, acquiring the target category, the position of the target in the image, the extracted target contour, the size and the color;
step 4: the sensed target data are uploaded to the unmanned ship control center.
Preferably, the video analysis module detects and identifies the target monitoring pictures frame by frame using a yolov3 water-surface target detection model trained on the Darknet framework.
Preferably, step 1 specifically comprises: the laser radar detects the water area around the unmanned ship and acquires the distance, azimuth angle and radial dimension of the target relative to the unmanned ship, denoted D_VT, θ_A and L_T respectively; the position of the unmanned ship is acquired through a high-precision GPS and recorded as (x_lon, y_lat); the position information (x, y) of the target is then calculated as follows:
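The position formula referenced here is reproduced only as an image in the published text. The sketch below is a minimal flat-earth reconstruction, assuming θ_A is in degrees clockwise from true north and that ranges are short enough for a metres-per-degree approximation; the function name, variable names and conversion constants are illustrative, not taken from the patent:

```python
import math

def target_position(x_lon, y_lat, d_vt, theta_a_deg):
    """Estimate the target (lon, lat) from the USV's GPS fix (x_lon, y_lat),
    the lidar range d_vt in metres, and the azimuth theta_a_deg in degrees
    clockwise from true north (flat-earth approximation for short ranges)."""
    theta = math.radians(theta_a_deg)
    d_east = d_vt * math.sin(theta)    # metres east of the boat
    d_north = d_vt * math.cos(theta)   # metres north of the boat
    # Approximate metres per degree of latitude/longitude at this latitude.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(y_lat))
    x = x_lon + d_east / m_per_deg_lon
    y = y_lat + d_north / m_per_deg_lat
    return x, y
```

For example, a target detected 100 m due east of the boat (θ_A = 90°) shifts only the longitude component of the GPS fix.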
Preferably, step 2 specifically comprises: recording the pan-tilt parameters of the photoelectric system as (p, t, z); adjusting the pan-tilt to the zero azimuth position, i.e. pan-tilt parameters (0, 0, 0); and acquiring the included angle between the lens optical axis and geographic true north and the included angle between the optical axis and the horizontal plane, recorded as θ_1 and θ_2 respectively.
The pan-tilt parameters (p, t, z) are calculated as follows:
p = θ_A + θ_1
where θ_A is the azimuth angle of the target relative to the unmanned boat;
where h is the height of the photoelectric system installation position above sea level and D_VT is the distance of the target relative to the unmanned boat;
where M_max is the maximum magnification of the photoelectric system pan-tilt camera, f_min and f_max are respectively the minimum and maximum focal lengths of the camera lens, and f is the optimal focal length for focusing and displaying the current target, calculated as follows:
where D_VT is the distance between the target and the unmanned boat, L is the width of the photosensitive device (CMOS or CCD) of the photoelectric system camera, and L_T is the target radial dimension.
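The tilt and focal-length formulas above appear only as images in the published text. The sketch below is a geometric reconstruction consistent with the surrounding definitions: p = θ_A + θ_1, a tilt derived from the mounting height h and range D_VT, and a focal length f chosen so that a target of radial size L_T roughly fills a sensor of width L, clamped to [f_min, f_max]. The exact role of θ_2 and of the zoom clamp are assumptions, as are all names:

```python
import math

def pan_tilt_zoom(theta_a, theta_1, theta_2, h, d_vt, l_sensor, l_target,
                  f_min, f_max):
    """Reconstructed (p, t, z) calculation.  Angles are in degrees; h, d_vt
    and l_target share one length unit (e.g. metres), while l_sensor, f_min
    and f_max share another (e.g. millimetres)."""
    # Pan: lidar azimuth plus the zero-position offset from true north.
    p = theta_a + theta_1
    # Tilt: depression angle of the target seen from height h, corrected
    # by the zero-position elevation offset theta_2 (assumed sign).
    t = math.degrees(math.atan2(h, d_vt)) - theta_2
    # Focal length so the target roughly fills the sensor width,
    # clamped to the physical range of the lens.
    f = d_vt * l_sensor / l_target
    f = min(max(f, f_min), f_max)
    z = f / f_min     # zoom multiple, between 1 and M_max = f_max / f_min
    return p, t, z
```

With a 6.4 mm sensor, a 4.3-129 mm lens, a camera 3 m above sea level and a 5 m target at 100 m range, this yields a pan of θ_A + θ_1 and a tilt of roughly 1.7° below the optical-axis zero.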
Preferably, training the yolov3 water-surface target detection model on the Darknet framework comprises collecting pictures of a plurality of water-surface targets covering six categories (ships, reefs, islands, driftwood, floating ice and other floating objects), with one part of the pictures used as training data and the other part as verification data. In one embodiment, 30000 pictures of the six categories were collected, of which 18000 were used as training data and 12000 as verification data, and the yolov3 water-surface target detection model was trained on the Darknet framework. The trained model detects and identifies the target monitoring picture frame by frame to obtain the target category and the position of the target in the image, and extracts the contour, size and appearance characteristics of the target.
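The 18000/12000 division described above is a 60/40 split of the 30000 collected pictures. A minimal sketch of such a split (file names, ratio parameter and seed are illustrative, not from the patent) is:

```python
import random

def split_dataset(image_paths, train_ratio=0.6, seed=0):
    """Shuffle the collected water-target images and split them into
    training and verification lists (the embodiment uses 18000/12000,
    i.e. a 60/40 split of 30000 pictures)."""
    rng = random.Random(seed)       # fixed seed for a reproducible split
    paths = list(image_paths)
    rng.shuffle(paths)
    n_train = int(len(paths) * train_ratio)
    return paths[:n_train], paths[n_train:]
```

The two lists can then be written to the `train`/`valid` list files that a Darknet data configuration expects.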
Preferably, step 4 includes uploading the geographic position of the target, the distance and azimuth of the target relative to the unmanned ship, and the contour, size and category of the target to the unmanned ship control center and storing them, providing data support for route planning.
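The record uploaded in step 4 bundles the target's geographic position, its distance and azimuth relative to the unmanned ship, and its contour, size and category. One plausible JSON encoding, with entirely illustrative field names, is:

```python
import json

def target_report(lon, lat, d_vt, theta_a, category, contour, size, color):
    """Serialize one perceived-target record for upload to the unmanned
    ship control center (field names are illustrative, not from the patent)."""
    return json.dumps({
        "position": {"lon": lon, "lat": lat},
        "distance_m": d_vt,
        "azimuth_deg": theta_a,
        "category": category,
        "contour": contour,   # list of [x, y] image points
        "size_m": size,
        "color": color,
    })
```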
Referring now to fig. 4, this embodiment further discloses a target sensing apparatus based on lidar, including:
the system comprises a laser radar sensing module, a high-precision GPS and a radar data processing unit, wherein the laser radar is used for acquiring the distance, azimuth angle and radial size of a target relative to an unmanned ship, the high-precision GPS is used for calibrating the position of the laser radar, and the radar data processing unit is used for acquiring the position information of the target;
the photoelectric system, connected with the pan-tilt control module and the video analysis module;
the pan-tilt control module, comprising a pan-tilt parameter calculation unit and a high-speed pan-tilt control unit, wherein the pan-tilt parameter calculation unit calculates the pan-tilt parameters of the photoelectric system according to the position information of the target, and the high-speed pan-tilt control unit controls the photoelectric system to accurately position the target;
the video analysis module, comprising an image processing unit and a target identification unit, wherein the image processing unit detects and identifies the target monitoring picture frame by frame, and the target identification unit acquires the target category, the position of the target in the image, the extracted target contour, the size and the color;
and the unmanned ship control center is used for acquiring the sensed target data.
In summary, the target sensing method and device based on the laser radar provided by this embodiment link the laser radar and the photoelectric system to actively detect the water area around the unmanned ship, improving detection efficiency. The target position is used to calculate the pan-tilt parameters of the photoelectric system, so the target is accurately positioned, the observation effect is enhanced, and the target identification accuracy is improved. A yolov3 water-surface target detection model trained on the Darknet framework classifies and identifies water-surface targets with a high identification rate. The method provides rich perception information, including the geographic position of the target, its distance and azimuth relative to the unmanned ship, its radial size, its contour and its category.
Although the present invention has been described with respect to the preferred embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (4)
1. A target perception method based on a laser radar is characterized by comprising the following steps:
step 1: the laser radar sensing module detects a water area around the unmanned ship to acquire position information of a target;
step 2: the pan-tilt control module calculates the pan-tilt parameters of the photoelectric system according to the position information of the target and controls the photoelectric system to accurately position the target;
step 3: the video analysis module detects and identifies the target monitoring picture frame by frame, acquiring the target category, the position of the target in the image, the extracted target contour, the size and the color;
step 4: the sensed target data are uploaded to the unmanned ship control center;
the video analysis module detects and identifies the target monitoring pictures frame by frame using a yolov3 water-surface target detection model trained on the Darknet framework;
step 1 specifically comprises: the laser radar detects the water area around the unmanned ship and acquires the distance, azimuth angle and radial dimension of the target relative to the unmanned ship, denoted D_VT, θ_A and L_T respectively; the position of the unmanned ship is acquired through a high-precision GPS and recorded as (x_lon, y_lat); the position information (x, y) of the target is calculated as follows:
2. The lidar-based target sensing method according to claim 1, wherein step 2 specifically comprises: recording the pan-tilt parameters of the photoelectric system as (p, t, z); adjusting the pan-tilt to the zero azimuth position, i.e. pan-tilt parameters (0, 0, 0); and acquiring the included angle between the lens optical axis and geographic true north and the included angle between the optical axis and the horizontal plane, recorded as θ_1 and θ_2 respectively;
The pan-tilt parameters (p, t, z) are calculated as follows:
p = θ_A + θ_1
where θ_A is the azimuth angle of the target relative to the unmanned boat;
where h is the height of the photoelectric system installation position above sea level and D_VT is the distance of the target relative to the unmanned boat;
where M_max is the maximum magnification of the photoelectric system pan-tilt camera, f_min and f_max are respectively the minimum and maximum focal lengths of the camera lens, and f is the optimal focal length for focusing and displaying the current target, calculated as follows:
3. The lidar-based target sensing method of claim 2, wherein training the yolov3 water-surface target detection model on the Darknet framework comprises: collecting pictures of a plurality of water-surface targets covering six categories (ships, reefs, islands, driftwood, floating ice and other floating objects), with one part of the pictures used as training data and the other part as verification data.
4. The lidar-based target sensing method of claim 1, wherein step 4 comprises uploading the geographic position of the target, the distance and azimuth of the target relative to the unmanned ship, and the contour, size and category of the target to the unmanned ship control center and storing them.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910715446.5A CN110376593B (en) | 2019-08-05 | 2019-08-05 | Target sensing method and device based on laser radar |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910715446.5A CN110376593B (en) | 2019-08-05 | 2019-08-05 | Target sensing method and device based on laser radar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110376593A CN110376593A (en) | 2019-10-25 |
CN110376593B true CN110376593B (en) | 2021-05-04 |
Family
ID=68257964
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910715446.5A Active CN110376593B (en) | 2019-08-05 | 2019-08-05 | Target sensing method and device based on laser radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110376593B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112835055A (en) * | 2020-12-30 | 2021-05-25 | 潍柴动力股份有限公司 | A positioning method and system for laser SLAM equipment |
CN112927233A (en) * | 2021-01-27 | 2021-06-08 | 湖州市港航管理中心 | Marine laser radar and video combined target capturing method |
CN113064157B (en) * | 2021-06-01 | 2022-05-27 | 北京高普乐光电科技股份公司 | Radar and photoelectric linkage early warning method, device and system |
CN114863180A (en) * | 2022-05-20 | 2022-08-05 | 上海交通大学 | Unmanned ship intelligent target detection method and system adopting binocular camera |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108471497A (en) * | 2018-03-02 | 2018-08-31 | 天津市亚安科技有限公司 | A kind of ship target real-time detection method based on monopod video camera |
KR20190005413A (en) * | 2017-07-06 | 2019-01-16 | 세한대학교기술지주회사 주식회사 | Collision detection device of Marina leisure ship based on laser sensor |
CN109298708A (en) * | 2018-08-31 | 2019-02-01 | 中船重工鹏力(南京)大气海洋信息系统有限公司 | A kind of unmanned boat automatic obstacle avoiding method merging radar and photoelectric information |
CN109375633A (en) * | 2018-12-18 | 2019-02-22 | 河海大学常州校区 | System and method for river clearing path planning based on global state information |
CN109444911A (en) * | 2018-10-18 | 2019-03-08 | 哈尔滨工程大学 | A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion |
CN109784278A (en) * | 2019-01-17 | 2019-05-21 | 上海海事大学 | The small and weak moving ship real-time detection method in sea based on deep learning |
- 2019-08-05: application CN201910715446.5A filed; patent CN110376593B granted, status active
Non-Patent Citations (2)
Title |
---|
Optimal Path Planning of a Mini USV using Sharp Cornering Algorithm; Muhammad Asrofi et al.; 2016 International Conference on Information Technology Systems and Innovation (ICITSI); 2016-10-27; pp. 1-6 * |
Research Status and Development Trends of Intelligent Ships (智能船舶的研究现状与发展趋势); Yan Xinping (严新平); 《交通与港航》; 2016-02, No. 1; pp. 25-28 * |
Also Published As
Publication number | Publication date |
---|---|
CN110376593A (en) | 2019-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110376593B (en) | Target sensing method and device based on laser radar | |
CN111523465B (en) | Ship identity recognition system based on camera calibration and deep learning algorithm | |
CN110414396B (en) | A deep learning-based perception fusion algorithm for unmanned boats | |
CN113627473A (en) | Water surface unmanned ship environment information fusion sensing method based on multi-mode sensor | |
Schwendeman et al. | A horizon-tracking method for shipboard video stabilization and rectification | |
EP3881221A1 (en) | System and method for measuring the distance to an object in water | |
CN106373159A (en) | Simplified unmanned aerial vehicle multi-target location method | |
CN112857356B (en) | Unmanned aerial vehicle water body environment investigation and air route generation method | |
CN115717867B (en) | A bridge deformation measurement method based on airborne dual cameras and target tracking | |
KR20210007767A (en) | Autonomous navigation ship system for removing sea waste based on deep learning-vision recognition | |
CN114004977A (en) | Aerial photography data target positioning method and system based on deep learning | |
CN111260539A (en) | Fisheye image target recognition method and system | |
CN115950435B (en) | Real-time positioning method for unmanned aerial vehicle inspection image | |
WO2019189381A1 (en) | Moving body, control device, and control program | |
CN113780246B (en) | UAV 3D track monitoring method, system and 3D monitoring device | |
CN110398760B (en) | Pedestrian coordinate capture device based on image analysis and use method thereof | |
WO2022121024A1 (en) | Unmanned aerial vehicle positioning method and system based on screen optical communication | |
CN110001945A (en) | Device and method for fine oblique aerial photography of cliff facades
CN111090283A (en) | Unmanned ship combined positioning and orientation method and system | |
CN112611361A (en) | Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle | |
CN115830140B (en) | Offshore short-range photoelectric monitoring method, system, medium, equipment and terminal | |
CN104613928A (en) | Automatic tracking and air measurement method for optical pilot balloon theodolite | |
CN117830934A (en) | Road target tracking and identifying method and system based on millimeter wave radar and variable-focus camera | |
CN111402324B (en) | Target measurement method, electronic equipment and computer storage medium | |
CN110393165A (en) | Offshore aquaculture net-cage bait-feeding method based on an automatic bait-casting boat
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||