CN117346792A - Positioning method for underwater robot in ocean engineering environment - Google Patents
Positioning method for underwater robot in ocean engineering environment
- Publication number: CN117346792A
- Application number: CN202311643883.3A
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01C21/203: Instruments for performing navigational calculations specially adapted for water-borne vessels
- G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- Y02A90/30: Assessment of water resources (technologies having an indirect contribution to adaptation to climate change)
Abstract
This application provides a positioning method for an underwater robot in an ocean engineering environment, comprising: obtaining the initial position of the underwater robot based on satellite and underwater acoustic positioning systems; determining, in a set of coded cooperative targets, the initial coded cooperative target corresponding to the initial position, where the set comprises coded cooperative targets pre-installed on underwater structures; determining the instantaneous precise position and instantaneous precise attitude of the underwater robot from the initial coded cooperative target using underwater photogrammetry; determining, based on the operation path composed of the coded cooperative targets, the next coded cooperative target the robot heads to after leaving the initial one as the destination coded cooperative target; and performing positioning and navigation with the inertial navigation system according to the instantaneous precise position and instantaneous precise attitude, thereby achieving long-endurance underwater positioning in the ocean engineering environment.
Description
Technical field
This application relates to the technical field of underwater robots, and in particular to a positioning method for an underwater robot in an ocean engineering environment.
Background
Typical ocean engineering projects include submarine immersed-tube tunnel construction, port and wharf construction, and offshore wind farm construction. Take immersed-tube tunnel construction, which has the most stringent positioning requirements to date, as an example: because the tube elements are geometrically large, the construction environment is complex, and the construction process strongly disturbs the surrounding water, divers must work underwater continuously to meet the high-precision positioning requirements, which carries a high safety risk. Replacing divers with underwater robots for underwater observation and operations has therefore become a development trend.
Accurate positioning is a prerequisite for the normal operation of an underwater robot. At present, underwater robots are usually equipped with an inertial navigation system (INS) for positioning. However, INS error diverges with navigation time: the longer the navigation time, the larger the error, making it difficult for the robot to maintain high-precision positioning.
Summary of the invention
The purpose of the embodiments of this application is to provide a positioning method for an underwater robot in an ocean engineering environment, which extends the time over which a robot equipped with an inertial navigation system retains its accuracy, enabling the robot to maintain high-precision positioning.
To solve the above technical problem, embodiments of this application provide a positioning method for an underwater robot in an ocean engineering environment, comprising: obtaining the initial position of the underwater robot based on satellite and underwater acoustic positioning systems; determining, in a set of coded cooperative targets, the initial coded cooperative target corresponding to the initial position, where the set comprises coded cooperative targets pre-installed on underwater structures; determining the instantaneous precise position and instantaneous precise attitude of the underwater robot from the initial coded cooperative target using underwater photogrammetry; determining, based on the operation path composed of the coded cooperative targets, the next coded cooperative target the robot heads to after leaving the initial one as the destination coded cooperative target; and performing positioning and navigation with the inertial navigation system according to the instantaneous precise position, the instantaneous precise attitude, and the operation path.
In the embodiments of this application, coded cooperative targets are installed on underwater structures, and underwater photogrammetry is used to obtain the instantaneous precise position and instantaneous precise attitude of the underwater robot from these targets. Combining this precise position and attitude with the robot's operation path yields a long-endurance, high-precision underwater robot positioning method built on the inertial navigation system.
Brief description of the drawings
To explain the technical solutions of the embodiments of this application or of the prior art more clearly, the drawings needed in describing them are briefly introduced below. The drawings described below are obviously only some embodiments recorded in this application; those of ordinary skill in the art may derive other drawings from them without creative effort.
Figure 1 is a schematic flowchart of a positioning method for an underwater robot in an ocean engineering environment provided by an embodiment of this application;
Figure 2 is a schematic diagram of the positioning method provided by an embodiment of this application;
Figure 3 is another schematic flowchart of the positioning method provided by an embodiment of this application;
Figure 4 is another schematic flowchart of the positioning method provided by an embodiment of this application;
Figure 5 is another schematic diagram of the positioning method provided by an embodiment of this application;
Figure 6 is a schematic structural diagram of a positioning device for an underwater robot in an ocean engineering environment provided by an embodiment of this application.
Detailed description
As noted above, constrained by cost, volume, and payload, underwater robots generally carry an inertial navigation system of limited performance and aid it with surface satellites, underwater sonar, vision, and other sensors; this multi-source combined positioning approach has been widely applied in underwater observation and operations. However, underwater acoustic positioning is prone to multipath effects, which degrade its accuracy, and disturbance of the water during construction changes local water quality, causing poor underwater visibility that greatly reduces imaging range and quality and hence the accuracy of visual matching. In such an engineering environment, the multi-source combined approach struggles to meet the high-precision positioning requirements of underwater robots. Obtaining more reliable, higher-precision positioning data and supplying corrective information whenever the INS needs it therefore remains the most promising route to underwater robot positioning. On this basis, this application proposes a positioning method for an underwater robot in an ocean engineering environment.
To help those skilled in the art better understand the technical solutions of this application, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of this application. All other embodiments obtained by those of ordinary skill in the art from these embodiments without creative effort fall within the scope of protection of this application.
Figure 1 is a schematic flowchart of a positioning method for an underwater robot in an ocean engineering environment provided by an embodiment of this application. As shown in the figure, the method may include the following steps.
Step S110: obtain the initial position of the underwater robot based on satellite and underwater acoustic positioning systems.
For example, the initial position of the underwater robot is obtained through multi-source combined positioning based on a global navigation satellite system (GNSS) and an underwater acoustic system; this initial position also serves as the robot's coarse position. Satellites give a high-precision position in the geodetic coordinate system, reaching centimeter-level accuracy with differential corrections, and combining satellites with underwater acoustics yields the underwater robot's position. Underwater acoustic positioning includes long-baseline, short-baseline, and ultra-short-baseline systems.
The inertial navigation system (also called an inertial sensor) navigates from the initial position obtained by the satellite and acoustic systems. Specifically, starting from that position, it measures acceleration and angular velocity and integrates each over the navigation time to navigate. The initial position can be expressed as the robot's coordinates in the geodetic coordinate system, for example g(x0, y0, z0). A position obtained this way is affected by navigation time: the longer the navigation time, the larger its error.
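A minimal 1-D sketch of this divergence (function names and numbers are illustrative, not from the patent): the INS double-integrates measured acceleration, so even a small constant accelerometer bias grows into a position error of roughly 0.5*b*t^2, which is why the error increases with navigation time.

```python
def dead_reckon_1d(x0, v0, accel_samples, dt):
    """Propagate position by double-integrating acceleration (Euler steps)."""
    x, v = x0, v0
    for a in accel_samples:
        v += a * dt
        x += v * dt
    return x

# A robot actually at rest, observed through a biased accelerometer:
# 0.01 m/s^2 bias over 100 samples at dt = 0.1 s (10 s of navigation).
true_accel = [0.0] * 100
biased_accel = [0.01] * 100
err = dead_reckon_1d(0.0, 0.0, biased_accel, 0.1) - dead_reckon_1d(0.0, 0.0, true_accel, 0.1)
# err is about 0.5 * 0.01 * 10**2 = 0.5 m after only 10 s.
```

Over an hour of navigation the same bias would grow to kilometers, which is the divergence the coded-target fixes below are meant to bound.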
Step S120: determine, in the set of coded cooperative targets, the initial coded cooperative target corresponding to the initial position.
The initial coded cooperative target satisfies a positional correspondence with the initial position: for example, it lies within a preset range of the initial position, or its distance from the initial position satisfies a predetermined relation or threshold.
The set of coded cooperative targets comprises coded cooperative targets (cooperative targets for short) pre-installed on underwater structures. An underwater structure is a structure fixed in place underwater, such as an already placed immersed tube or a bridge pier. A cooperative target is mounted at an observable location on the structure, for example on the surface of an immersed tube, and its position on the structure is known. Since the structure's position is fixed and the target's position on it is known, the target's position can be determined; for example, its coordinates in the geodetic coordinate system can be written as Ci(xi, yi, zi).
Through this step, the coded cooperative target used to obtain the robot's precise position can be determined.
Step S130: determine the instantaneous precise position and instantaneous precise attitude of the underwater robot from the initial coded cooperative target using underwater photogrammetry.
Underwater photogrammetry measures underwater objects photographically to determine their shape, size, position, and other properties. The equipment used for underwater photogrammetry, such as a vision sensor, may be located above or below the water; no restriction is imposed here.
With the cooperative target's position known, photographing the target allows the robot's position and attitude, i.e. the instantaneous precise position and instantaneous precise attitude, to be computed.
Optionally, depending on measurement needs, only one of the instantaneous precise position and the instantaneous precise attitude may be determined from the initial coded cooperative target.
Step S140: based on the operation path composed of the coded cooperative targets, determine the next coded cooperative target the underwater robot heads to after leaving the initial coded cooperative target as the destination coded cooperative target.
The operation path may include the order in which the underwater robot passes multiple coded cooperative targets; based on this order, the destination coded cooperative target can be determined.
Step S150: perform positioning and navigation with the inertial navigation system according to the instantaneous precise position, the instantaneous precise attitude, and the operation path.
The initial position can be corrected or replaced using the instantaneous precise position and attitude, and the INS then navigates along the operation path from this precise pose. When the initial position is insufficiently accurate, positioning from the precise position and attitude lets the robot maintain high positioning accuracy; in other words, its positioning accuracy is no longer degraded by navigation time.
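The correction step can be sketched as follows; the blend-weight parameter is our illustrative assumption, where a weight of 1.0 corresponds to the outright replacement described above and smaller weights would give a Kalman-like partial update (not specified in the patent).

```python
def correct_ins(ins_pos, fix_pos, fix_weight=1.0):
    """Blend the drifted INS position toward the photogrammetric fix.
    fix_weight=1.0 reproduces outright replacement; smaller weights are a
    hypothetical partial-update variant, not specified in the patent."""
    return tuple(p + fix_weight * (f - p) for p, f in zip(ins_pos, fix_pos))

# Drifted INS estimate vs. the precise fix obtained from a coded target:
corrected = correct_ins((10.3, 5.2, -3.1), (10.0, 5.0, -3.0))
```

The same rule applies to attitude, though attitude blending would use a rotation parameterization (e.g. quaternions) rather than per-component interpolation.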
In the embodiments of this application, the method obtains the initial position of the underwater robot from satellite and underwater acoustic positioning systems; determines, in the set of coded cooperative targets pre-installed on underwater structures, the initial coded cooperative target corresponding to the initial position; photogrammetrically determines the robot's instantaneous precise position and instantaneous precise attitude from that target; determines the destination coded cooperative target along the operation path; and navigates with the inertial navigation system using the precise pose and the path. The instantaneous precise position and attitude determined photogrammetrically from the coded cooperative targets thus aid the inertial navigation system, yielding a long-endurance, high-precision underwater robot positioning method built on the INS.
In one possible implementation, step S130 includes: performing a photogrammetric resection on the initial coded cooperative target with the underwater robot's vision sensor to determine the pose of the vision sensor relative to the target; and determining the robot's instantaneous precise position and instantaneous precise attitude from the target's position, the target's attitude, and that relative pose.
The vision sensor, mounted on the underwater robot, photographs the cooperative targets. It is a camera suitable for underwater use; there may be one or several, without restriction. A cooperative target may be passively retroreflective or actively luminous, and it provides position and attitude information. In an ocean engineering construction environment the water is turbid and rich in plankton, so optical imaging quality is poor; a vision sensor alone images badly and yields low-accuracy visual information. Pairing the vision sensor with reflective or luminous cooperative targets makes it possible to obtain accurate positioning information, such as the robot's position and attitude, even in the construction environment.
Optionally, the layout of the cooperative targets should account for the characteristics of the underwater structure, the structure-induced flow field, and the robot's control performance. After the targets are installed on the structure, each target's position and attitude in the geodetic coordinate system can be obtained by conventional engineering surveying.
Determining the pose of the vision sensor relative to the cooperative target by photogrammetric resection may include, for example, determining it from the transformations among the camera coordinate system, the target coordinate system, and the geodetic coordinate system, together with the robot's position in the target coordinate system.
Optionally, the cooperative target may also be an acoustic cooperative target, paired with an acoustic sensor for underwater measurement.
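A minimal sketch of this transform chain, with helper names and numbers that are illustrative rather than taken from the patent: the surveyed target pose maps target-frame coordinates into the geodetic (world) frame, resection gives the camera pose in the target frame, and composing the two gives the camera (and hence robot) pose in the geodetic frame.

```python
def mat_vec(R, v):
    """Apply a 3x3 rotation matrix (nested lists) to a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_mul(A, B):
    """Multiply two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def compose_pose(R_wt, t_wt, R_tc, t_tc):
    """Chain world<-target (surveyed) with target<-camera (from resection):
    p_world = R_wt @ (R_tc @ p_cam + t_tc) + t_wt."""
    R_wc = mat_mul(R_wt, R_tc)
    t_wc = [a + b for a, b in zip(mat_vec(R_wt, t_tc), t_wt)]
    return R_wc, t_wc

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# Illustrative numbers: target surveyed at (100, 50, -20) in the geodetic
# frame, axis-aligned; resection places the camera 2 m along the target's z.
R_wc, t_wc = compose_pose(I3, [100.0, 50.0, -20.0], I3, [0.0, 0.0, 2.0])
```

In practice the resection itself would be solved from the imaged reference points (e.g. a PnP solver); only the composition of the resulting poses is shown here.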
Referring to Figure 2, in one possible implementation the coded cooperative target 200 includes at least three reference points 201 for the vision sensor. Step S130 further includes: determining the pose of the vision sensor relative to each reference point 201; and determining the robot's instantaneous precise position and instantaneous precise attitude from each reference point's position and attitude together with the corresponding relative pose.
A reference point 201 is an observable physical point on the cooperative target; for example, it may be a reflective point or a luminous point. The reference points 201 can serve as a pattern that encodes information. The number and arrangement of the reference points 201 on each coded cooperative target 200 can be designed according to measurement needs. Optionally, different targets carry different numbers and/or arrangements of reference points, so the targets can be distinguished by their reference points.
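One simple way such a point pattern can act as an identifier, sketched under our own assumptions (the patent only states that the number and/or arrangement of reference points differs between targets): use the point count together with the sorted pairwise distances, which do not change when the pattern is translated or rotated in the image plane.

```python
from itertools import combinations
import math

def pattern_signature(points, tol=2):
    """Signature of a coded target's reference-point layout: point count
    plus sorted pairwise distances, rounded so the signature is invariant
    to rotation and translation of the whole pattern."""
    dists = sorted(round(math.dist(p, q), tol) for p, q in combinations(points, 2))
    return (len(points), tuple(dists))

# Hypothetical layouts for two distinct coded targets:
target_a = [(0, 0), (1, 0), (0, 1)]          # 3 points in a right triangle
target_b = [(0, 0), (2, 0), (0, 2), (2, 2)]  # 4 points in a square
sig_a, sig_b = pattern_signature(target_a), pattern_signature(target_b)
```

A real detector would also need to handle perspective scaling, which this plain distance signature does not; it is only meant to show how count plus arrangement can disambiguate targets.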
Figure 3 is another schematic flowchart of the positioning method for an underwater robot in an ocean engineering environment provided by an embodiment of this application. As shown in the figure, the method may include the following steps.
Step S310: obtain the initial position of the underwater robot based on satellite and underwater acoustic positioning systems.
Step S321: determine one of the coded cooperative targets in the set as the initial coded cooperative target according to the distance between each target and the initial position.
Optionally, the cooperative target closest to the initial position is chosen as the initial coded cooperative target, subject to its being a target the robot has not yet passed.
For example, coded cooperative target A is 5 m from the initial position, target B is 3 m away, and target C is 1 m away; the robot has already passed target C but not targets A and B, so target B is determined to be the initial coded cooperative target.
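The selection rule in the A/B/C example can be sketched as follows (identifiers and coordinates are illustrative; only the distances match the example):

```python
import math

def pick_initial_target(initial_pos, targets, visited):
    """Choose the nearest coded cooperative target that the robot has not
    yet passed; `targets` maps target id -> geodetic (x, y, z)."""
    unvisited = {tid: p for tid, p in targets.items() if tid not in visited}
    return min(unvisited, key=lambda tid: math.dist(initial_pos, unvisited[tid]))

# Distances 5 m, 3 m, 1 m as in the example; C has already been passed.
targets = {"A": (5.0, 0.0, 0.0), "B": (3.0, 0.0, 0.0), "C": (1.0, 0.0, 0.0)}
chosen = pick_initial_target((0.0, 0.0, 0.0), targets, visited={"C"})  # "B"
```

With no targets yet passed, the same rule would pick C, the nearest target overall.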
Step S322: determine the robot's operation path according to the robot's preset operation path and the initial coded cooperative target.
The preset operation path is the pre-planned operation path of the underwater robot. Taking the determined initial coded cooperative target with geodetic coordinates Ci(xi, yi, zi) as an example, an operation path with Ci(xi, yi, zi) as a key node, also called the actual operation path, is obtained. The actual operation path can be written p{Ci, Ci+1, ..., Cn, ..., C1}, where i = 1, 2, ..., n and Ci is the initial coded cooperative target, which serves as the starting target of the path. Optionally, combined with the actual job requirements, the path from the initial position to the initial target Ci(xi, yi, zi) is computed and planned.
Optionally, the preset operation path can be used to determine the order in which the cooperative targets are visited, e.g. Ci, Ci+1, ..., Cn; the actual operation path derived from it therefore fixes the same order. That is, given the initial cooperative target, the next cooperative target the robot heads to after leaving it, i.e. the destination cooperative target, can be determined: from the initial target Ci, the destination target Ci+1 follows.
Optionally, after the robot reaches the destination target Ci+1, the initial target Ci is marked as "already passed", and repeating steps S321-S322 determines the next target the robot heads to after leaving the destination target, e.g. target Ci+2.
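The traversal loop described above can be sketched as follows (names are illustrative): each leg runs between two consecutive targets of the path, the target just left is marked as passed, and the INS only ever has to bridge a single leg.

```python
def traverse(path):
    """Walk the operation path p{C_i, C_i+1, ..., C_n}: at each coded
    target the robot obtains a precise pose fix and marks the target as
    passed, so the INS navigates only one leg at a time."""
    passed = []
    legs = []
    for current, nxt in zip(path, path[1:]):
        passed.append(current)       # target marked "already passed"
        legs.append((current, nxt))  # leg bridged by the INS alone
    passed.append(path[-1])
    return passed, legs

passed, legs = traverse(["C1", "C2", "C3"])
```

Because each leg is short relative to the whole mission, the INS drift accumulated per leg stays small and is reset at the next target fix.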
如此,水下机器人每次遇到编码合作靶标,就可以得到准确的位置和姿态;惯导的导航时间不超过水下机器人自一个合作靶标到下一个合作靶标的时间,惯导的误差始终维持在两个编码合作靶标之间,从而能够保障长航时、高精度导航。In this way, every time the underwater robot encounters a coded cooperative target, it can obtain an accurate position and attitude; the navigation time of the inertial navigation does not exceed the time of the underwater robot from one cooperative target to the next, and the error of the inertial navigation is always maintained. Between the two coded cooperation targets, long-endurance, high-precision navigation can be guaranteed.
Step S340: Determine the next coded cooperative target that the underwater robot heads for after leaving the initial coded cooperative target as the destination coded cooperative target.
Step S310, step S340 and the subsequent steps can adopt the description of the corresponding steps in the previous embodiment and achieve the same or corresponding technical effects; the repeatable parts are not described again here.
In this embodiment of the present application, while the actual operating path of the underwater robot is generated with the coded cooperative targets as key points and the underwater robot performs underwater operations along that path, the robot's current precise position can be obtained at each preset coded cooperative target, so that an accurate position is continuously supplied to the inertial navigation system and positioning accuracy is maintained over long navigation times.
Figure 4 shows another schematic flowchart of a method for positioning an underwater robot in a marine engineering environment provided by an embodiment of the present application. As shown in the figure, the method may include the following steps.
Step S410: Obtain the initial position of the underwater robot based on satellite and hydroacoustic positioning systems.
Step S420: Determine the initial coded cooperative target corresponding to the initial position.
Steps S410 and S420 can adopt the description of the corresponding steps in the foregoing embodiments and achieve the same or corresponding technical effects; the repeatable parts are not described again here.
Step S431: Obtain structured information of the underwater structure.
While the underwater robot travels to the initial coded cooperative target and/or the destination coded cooperative target, the structured information of the underwater structure is acquired by the underwater robot.
Optionally, satellite and hydroacoustic positioning are used to guide the underwater robot to the vicinity of the corresponding cooperative target. Since underwater photogrammetry is strongly affected by water conditions, bringing the robot close to a cooperative target further improves the accuracy of the pose determined from that target. For example, the underwater robot is guided from the initial position to the initial coded cooperative target, and from the initial coded cooperative target to the destination coded cooperative target.
Structured information of the underwater structure is acquired by the different kinds of sensors carried by the underwater robot. This structured information characterizes the morphology of the underwater structure, for example its outline and shape. The standard structured information is, for example, a pre-stored standard structural model of the underwater structure. Since underwater structures are man-made and have standard geometric models, their absolute coordinates in the geodetic coordinate system can be obtained underwater. The initial position and the position of the underwater structure are expressed in the same coordinate system. The standard structured information may include the pose information of each cooperative target on the underwater structure.
Step S432: Determine the real-time position of the underwater robot.
When the structured information is successfully matched against the pre-stored standard structured information of the underwater structure, for example when the outline represented by the structured information matches the outline represented by the standard structured information, or when the morphology represented by the structured information matches the morphology represented by the standard structured information, the real-time position of the underwater robot is determined from the standard structured information.
The real-time position determined in this way further improves positional accuracy on top of the instantaneous precise position and instantaneous precise attitude, and thereby improves the accuracy of long-duration navigation.
In a possible implementation, if the structured information fails to match the standard structured information, the underwater robot is controlled to return to the previous cooperative target it has already passed, or to end the operation.
Optionally, while the underwater robot travels from the initial coded cooperative target to the destination coded cooperative target, if the morphology of the underwater structure represented by the structured information differs from that in the standard structured information, the underwater robot is controlled to return to the initial coded cooperative target.
Alternatively, while the underwater robot travels from the initial position to the initial coded cooperative target, if the morphology of the underwater structure represented by the structured information differs from that in the standard structured information, the underwater robot is instructed to end the underwater operation, since there is no previous cooperative target on the operating path to return to.
Step S440: Control the inertial navigation system to perform positioning and navigation based on the instantaneous precise position, the instantaneous precise attitude, the operating path and the real-time position.
On the basis of the instantaneous precise position, the instantaneous precise attitude and the operating path, combining the real-time position obtained from the structured information further improves positional accuracy and thereby the accuracy of long-duration navigation.
In this embodiment of the present application, while the underwater robot travels to the initial/destination coded cooperative target, a real-time position is obtained by matching the structured information of the underwater structure against its standard model, which further maintains the accuracy of the robot's navigation and positioning.
In a possible implementation, step S431 includes at least one of the following: acquiring, by a visual sensor of the underwater robot, visual image information representing the structure of the underwater structure; and acquiring, by an acoustic sensor of the underwater robot, sonar image information representing the structure of the underwater structure.
Sonar is suitable for acquiring three-dimensional structures such as underwater terrain and structures over large areas, with positioning performed by feature matching; vision can acquire images or point clouds, with positioning performed by extracting and matching features of the measured object. The acquired visual and sonar images can be matched using structured information, or by other image matching methods.
Because of scattering and suspended matter in the water column, underwater optical imaging suffers from poor quality and low clarity, so commonly used point matching is often inapplicable; sonar images likewise have low resolution and low accuracy due to multipath effects caused by the structure's geometry, making point matching difficult as well. However, some features of the underwater structure are known, and so are their locations; by acquiring images while the robot moves and applying structured matching, for example contour matching, the requirements on image quality can be lowered and effective matching achieved.
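The contour matching idea mentioned above can be sketched with a simple, image-quality-tolerant shape descriptor. The patent does not specify a matching algorithm; the histogram-of-centroid-distances descriptor and threshold below are assumptions chosen only to illustrate structured matching:

```python
import numpy as np

def contour_signature(points, bins=16):
    """Rotation- and scale-tolerant descriptor: histogram of centroid distances."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    d = np.linalg.norm(pts - centroid, axis=1)
    d = d / d.max()                          # scale normalisation
    hist, _ = np.histogram(d, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()                 # relative frequencies

def contours_match(observed, reference, threshold=0.1):
    """Compare two contours by the L1 distance between their signatures."""
    diff = np.abs(contour_signature(observed) - contour_signature(reference)).sum()
    return diff < threshold

# A square contour observed at a different scale still matches the model contour,
# even though no point-to-point correspondence is ever established.
square = [(0, 0), (0, 1), (1, 1), (1, 0)]
scaled = [(0, 0), (0, 3), (3, 3), (3, 0)]
print(contours_match(square, scaled))  # True
```

The point of such a descriptor is that it depends only on the coarse outline, so it tolerates the blur and noise of underwater imagery far better than point matching does.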
Figure 5 shows a schematic diagram of a method for positioning an underwater robot in a marine engineering environment provided by an embodiment of the present application. The method includes the following steps:
(1) Obtain the target information of all coded cooperative targets preset on the underwater structure; the target information includes, but is not limited to, target position, target attitude and target number.
(2) Combine GNSS and hydroacoustic technology to obtain the initial position coordinates g(x0, y0, z0) of the underwater robot in the geodetic coordinate system; the accuracy of the initial position depends on the accuracy of the hydroacoustic positioning.
(3) From the initial position coordinates g(x0, y0, z0), compute the initial coded cooperative target Ci(xi, yi, zi) 501 that is closest to the underwater robot but has not yet been visited by it.
(4) According to the preset operating path of the underwater robot, determine the actual operating path with the initial coded cooperative target 501 as a key node, recorded as p{Ci, Ci+1, …, Cn, …, C1}, where i = 1, 2, …, n and Ci is the initial coded cooperative target 501.
(5) In combination with the actual operation requirements, calculate and plan the actual operating path from g(x0, y0, z0) to Ci(xi, yi, zi), and guide the underwater robot to Ci(xi, yi, zi) using the satellite and hydroacoustic positioning systems.
(6) The underwater robot performs a joint photogrammetric measurement with the initial coded cooperative target 501 to obtain its instantaneous precise position and instantaneous precise attitude, which are used to provide more accurate pose information to the inertial navigation system.
(7) The underwater robot obtains the initial coded cooperative target 501 and the destination coded cooperative target 502 from the path set p, computes the motion path, motion direction, etc., and starts the navigation operation.
(8) Optionally, while travelling from the initial coded cooperative target 501 to the destination coded cooperative target 502, obtain the real-time position of the underwater robot again by the multi-source positioning method and extract, from the standard structural model of the underwater structure, the local structural information near that real-time position; at the same time, acquire visual and sonar images of the underwater structure at that position with the visual and acoustic sensors carried by the robot; compare the visual and sonar images with the local structural information, and if the match succeeds, determine the real-time position from the standard model and further correct the inertial navigation error.
(9) After the underwater robot reaches the destination coded cooperative target 502, mark the initial coded cooperative target 501 as passed and repeat steps (6) to (9).
(10) Repeat steps (1) to (9) as needed to achieve long-endurance, high-precision positioning of the underwater robot in the marine engineering environment.
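Steps (6) to (9) above form the core navigation cycle: reset the inertial pose at each coded target, move to the next one, and mark the departure target as passed. A minimal sketch of that cycle, with hypothetical target names and coordinates that are not from the patent:

```python
def navigation_cycle(path, measure_pose_at):
    """Walk the path target by target, resetting the INS pose at each target.

    `measure_pose_at` stands in for the joint photogrammetric measurement of
    step (6); it returns the precise pose obtained at a coded target.
    """
    visited = []
    pose_log = []
    for target in path:                  # step (7): next destination from the path set p
        pose = measure_pose_at(target)   # step (6): precise pose measured at the target
        pose_log.append(pose)            # the INS receives an accurate reset here
        visited.append(target)           # step (9): mark the target as passed
    return visited, pose_log

# Hypothetical target positions in a geodetic frame (metres).
targets = {"C1": (0.0, 0.0, -10.0), "C2": (50.0, 0.0, -10.0), "C3": (50.0, 40.0, -10.0)}
visited, poses = navigation_cycle(["C1", "C2", "C3"], targets.__getitem__)
print(visited)  # ['C1', 'C2', 'C3']
```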
In this embodiment of the present application, coded cooperative targets are arranged on the structure, and the precise position and attitude information they provide is used to assist inertial navigation. Working together with methods such as satellite positioning, hydroacoustic positioning, and acoustic and optical image matching, this can build a high-precision positioning framework for underwater robots in marine engineering environments, fully exploiting the advantages of the man-made environment in marine engineering construction and forming a long-endurance, high-precision positioning method for underwater robots with inertial navigation at its core.
Figure 6 shows a schematic structural diagram of a device for positioning an underwater robot in a marine engineering environment provided by an embodiment of the present application. The device 600 includes: an acquisition module 610, a first determination module 620, a second determination module 630, a third determination module 640 and a positioning module 650.
The acquisition module 610 is used to obtain the initial position of the underwater robot based on satellite and hydroacoustic positioning systems; the first determination module 620 is used to determine, in a set of coded cooperative targets, the initial coded cooperative target corresponding to the initial position, where the set includes coded cooperative targets pre-installed on an underwater structure; the second determination module 630 is used to determine, based on the initial coded cooperative target and using underwater photogrammetry, the instantaneous precise position and instantaneous precise attitude of the underwater robot; the third determination module 640 is used to determine, based on the operating path composed of the initial coded cooperative target, the next coded cooperative target that the underwater robot heads for after leaving the initial coded cooperative target as the destination coded cooperative target; and the positioning module 650 is used to perform positioning and navigation in combination with an inertial navigation system according to the instantaneous precise position, the instantaneous precise attitude and the operating path.
In a possible implementation, the second determination module 630 is used to perform photogrammetric resection on the initial coded cooperative target through the visual sensor of the underwater robot to determine the pose relationship of the visual sensor relative to the initial coded cooperative target, and to determine the instantaneous precise position and instantaneous precise attitude of the underwater robot based on the target position of the initial coded cooperative target, the target attitude of the initial coded cooperative target and that pose relationship.
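The pose composition described here, obtaining the robot's pose from the known target pose and the camera-to-target relation recovered by resection, can be sketched with 4x4 homogeneous transforms. This is an illustrative sketch; the frame names and numeric values are assumptions, not data from the patent:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Known pose of the coded target in the geodetic (world) frame.
T_world_target = make_transform(np.eye(3), [100.0, 200.0, -15.0])

# Pose of the target in the camera frame, as recovered by photogrammetric
# resection (here the camera sits 2 m in front of the target, axes aligned).
T_camera_target = make_transform(np.eye(3), [0.0, 0.0, 2.0])

# Camera (robot) pose in the world frame:
# T_world_camera = T_world_target @ inv(T_camera_target)
T_world_camera = T_world_target @ np.linalg.inv(T_camera_target)
print(T_world_camera[:3, 3])  # [100. 200. -17.]
```

In a real system the relative transform would come from a resection/PnP solve against the target's known reference points rather than being written down directly.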
In a possible implementation, the coded cooperative target includes at least three reference points for the visual sensor, and the second determination module 630 is used to determine the reference pose relationship of the visual sensor relative to each reference point, and to determine the instantaneous precise position and instantaneous precise attitude of the underwater robot based on the position of each reference point, the attitude of each reference point and the corresponding reference pose relationship.
In a possible implementation, the number of coded cooperative targets is not less than 2, and the first determination module 620 is used to determine one of the multiple coded cooperative targets as the initial coded cooperative target according to the distance between each cooperative target in the set and the initial position, where the initial coded cooperative target is a coded cooperative target that the underwater robot has not yet visited.
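The selection rule in this paragraph, choosing the nearest coded cooperative target that the robot has not yet visited, can be sketched as follows; the coordinates are hypothetical:

```python
import math

def nearest_unvisited(position, targets, visited):
    """Pick the coded target closest to `position` that is not in `visited`."""
    candidates = {tid: xyz for tid, xyz in targets.items() if tid not in visited}
    if not candidates:
        return None  # no remaining targets
    return min(candidates, key=lambda tid: math.dist(position, candidates[tid]))

# Hypothetical geodetic coordinates (metres) of the coded cooperative targets.
targets = {"C1": (10.0, 0.0, -5.0), "C2": (3.0, 4.0, 0.0), "C3": (100.0, 0.0, 0.0)}

# C2 is closest but already visited, so C1 is selected as the initial target.
print(nearest_unvisited((0.0, 0.0, 0.0), targets, visited={"C2"}))  # C1
```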
In a possible implementation, the third determination module 640 is used to determine the operating path of the underwater robot according to the preset operating path of the underwater robot and the initial coded cooperative target.
In a possible implementation, the device 600 is further used to acquire, by the underwater robot, the structured information of the underwater structure while the robot travels to the initial coded cooperative target and/or the destination coded cooperative target; to match the structured information against the pre-stored standard structured information of the underwater structure; and, if the match succeeds, to determine the real-time position of the underwater robot from the standard structured information.
In a possible implementation, the device 600 is used to acquire, by the visual sensor of the underwater robot, a visual image representing the structure of the underwater structure, and to acquire, by the acoustic sensor of the underwater robot, a sonar image representing the structure of the underwater structure.
In a possible implementation, the device 600 is used to control the underwater robot to return to the previous coded cooperative target it has already passed, or to end the operation, if the structured information fails to match the standard structured information.
In a possible implementation, the positioning module 650 is used to control the inertial navigation system to perform positioning and navigation according to the instantaneous precise position, the instantaneous precise attitude, the operating path and the real-time position.
The device 600 provided by this embodiment of the present application can execute each of the methods described in the foregoing method embodiments and achieve their functions and beneficial effects, which are not described again here.
The above descriptions are only preferred embodiments of the present application and are not intended to limit its scope of protection. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall be included in the scope of protection of the present application.
It should also be noted that the terms "comprises", "includes" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprises a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes it.
Each embodiment in this specification is described in a progressive manner; identical or similar parts of the embodiments can be referred to each other, and each embodiment focuses on its differences from the others. In particular, since the device embodiment is substantially similar to the method embodiments, its description is relatively brief, and the relevant parts can be found in the description of the method embodiments.
Claims (9)
Priority Applications (1)
CN202311643883.3A, filed 2023-12-04 (priority date 2023-12-04): "A method for positioning underwater robots in marine engineering environments", granted as CN117346792B.
Publications (2)
CN117346792A, published 2024-01-05
CN117346792B, published 2024-03-15
Family ID: 89363563
Citations (4)
- US 2018/0252820 A1 (priority 2015-09-08, published 2018-09-06), Underwater Communications & Navigation Laboratory (Limited Liability Company): Method for positioning underwater objects and system for the implementation thereof
- FR 3080194 A1 (priority 2018-04-12, published 2019-10-18), CGG Services SAS: Method for guiding an autonomous underwater vehicle and associated system for acquiring underwater analysis data
- JP 2023-050230 A (priority 2021-09-30, published 2023-04-11): Underwater position correction device, underwater position correction method, and underwater position correction program
- CN 115574855 A (priority 2022-09-29, published 2023-01-06): Method for detecting underwater operation robot in butt joint state of immersed tube pipe joints
Non-Patent Citations (1)
- Li Qingquan et al., "Dynamic precision engineering surveying technology and applications" (动态精密工程测量技术及应用), Acta Geodaetica et Cartographica Sinica (测绘学报), vol. 50, no. 9, pp. 1147-1158.
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant