
CN118262071A - Engineering machinery mixed reality digital twin management and control method and system - Google Patents

Engineering machinery mixed reality digital twin management and control method and system

Info

Publication number
CN118262071A
CN118262071A (application CN202410341789.0A)
Authority
CN
China
Prior art keywords
engineering machinery
point cloud
mixed reality
scene
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410341789.0A
Other languages
Chinese (zh)
Inventor
丁伟利
张晓雨
杨玉林
向天鹤
郭欣雅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yanshan University
Original Assignee
Yanshan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yanshan University
Priority to CN202410341789.0A
Publication of CN118262071A
Legal status: Pending

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/10: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration
    • G01C 21/12: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1652: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/10: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration
    • G01C 21/12: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1656: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42: Determining position
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00: Computer-aided design [CAD]
    • G06F 30/20: Design optimisation, verification or simulation
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05: Geographic models
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82: Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2111/00: Details relating to CAD techniques
    • G06F 2111/18: Details relating to CAD techniques using virtual or augmented reality
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/12: Bounding box
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/21: Collision detection, intersection
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a mixed reality digital twin management and control method and system for engineering machinery, belonging to the field of digital twins. Bidirectional digital twin control of the machinery is achieved through real-time dynamic perception of multi-source heterogeneous data in the working environment, generation of a mixed reality working environment by virtual-real data fusion, and interactive task setting. To generate the mixed reality working environment, a server builds a color point cloud map from the collected color point clouds and combines it with pre-built three-dimensional models; GPS positioning data, the acquired machinery motion trajectory, and the machinery's local IMU data are used jointly to register the machinery's virtual and real data and to reconstruct a working scene with high virtual-real consistency. Material states and equipment motion states are then analyzed from semantic information, and danger early warning is performed using collision detection in the virtual scene. The invention optimizes engineering machinery operation plans, improves resource utilization, and reduces operating costs.

Description

A mixed reality digital twin management and control method and system for engineering machinery

Technical Field

The present invention relates to the field of digital twin technology, and in particular to a mixed reality digital twin management and control method and system for engineering machinery.

Background Art

Engineering machinery (including portal cranes, ship loaders, stacker-reclaimers, excavators, loaders, etc.) is essential equipment for loading and unloading bulk cargo such as coal, ore, and grain, and efficient, intelligent handling equipment is the key to building smart bulk ports and realizing intelligent construction. However, existing engineering machinery mainly relies on surveillance video transmitted from the site to perceive the loading and unloading environment, while the positions of the whole machine and of the grab or bucket are remotely controlled to complete handling actions. The whole process demands highly skilled drivers and sustained concentration. Moreover, because fisheye and ordinary cameras capture no three-dimensional information about the working environment, the remote scene suffers from shrinkage, distortion, and a severe lack of depth perception, making it difficult for the driver to judge the three-dimensional positions of targets and obstacles accurately and to complete fine operations. Although some engineering machinery systems integrate lidar for environment modeling, point cloud data is difficult to process in real time, so drivers still depend on video during operation.

With the continuous development of mixed reality (MR) technology, researchers have studied it and applied it in many fields. For example, Chinese invention patent CN110322553A provides a method and system for lofting in mixed reality scenes using lidar point clouds; however, the method matches purely on laser point clouds, is applicable only to local positioning and tracking, has limitations for large-scale scene positioning, and places high demands on the environment. Invention patent WO2021138940A1 provides a remote virtual-real high-precision matching and positioning method for augmented and mixed reality, solving the inaccurate matching of existing methods; however, the matching is marker-based, depends on external markers, applies only to specific scenes or environments, has a limited scope of application, and is sensitive to environmental changes. Chinese invention patent CN117151417A provides a digital-twin-based port operation management method, system, and storage medium that visualizes port operation data so it can be presented to users more intuitively; however, its effectiveness depends heavily on accurate and timely data input, such as cargo loading information and transport vehicle resource data, and any data error or delay may degrade system performance and decision accuracy.

In view of this, it is necessary to develop a mixed reality digital twin management and control method and system for engineering machinery to overcome the above problems.

Summary of the Invention

The technical problem to be solved by the present invention is to provide a mixed reality digital twin management and control method and system for engineering machinery that combines three-dimensional material modeling, dynamic perception, mixed reality virtual-real superposition, and online learning and optimization; it does not rely on externally supplied data input and can achieve a higher level of operation control and improved resource utilization.

To solve the above technical problems, the present invention adopts the following technical solution:

A mixed reality digital twin management and control method for engineering machinery, which realizes bidirectional digital twin control of the machinery through real-time dynamic perception of multi-source heterogeneous data in the working environment, generation of a mixed reality working environment by virtual-real data fusion, and interactive task setting.

In the generation of the mixed reality working environment by virtual-real data fusion, a server builds a color point cloud map from the collected color point clouds and combines it with the pre-built three-dimensional models of the engineering machinery and of the ancillary facilities. Using GPS positioning data, the acquired machinery motion trajectory, and the machinery's local IMU data jointly, the method registers the machinery's virtual and real data and reconstructs a working scene with high virtual-real consistency. Material states and equipment motion states are further analyzed from semantic information, and collision detection in the virtual scene is used to issue danger warnings for equipment-equipment, equipment-personnel, and equipment-surroundings interactions.
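
The danger warning described above rests on collision detection between objects in the virtual scene. A minimal sketch, assuming axis-aligned bounding boxes and a configurable safety margin (the function names and margin value are illustrative, not taken from the patent):

```python
# Hypothetical AABB-based danger warning for the virtual scene.
# Box format: (min_x, min_y, min_z, max_x, max_y, max_z) in scene coordinates.

def aabb_overlap(a, b, margin=0.0):
    """True if boxes a and b, each inflated by `margin` meters, intersect on all three axes."""
    return all(a[i] - margin <= b[i + 3] and b[i] - margin <= a[i + 3] for i in range(3))

def danger_warnings(machine_box, other_boxes, margin=2.0):
    """Return indices of objects (equipment, personnel, surroundings) within the safety margin."""
    return [i for i, box in enumerate(other_boxes) if aabb_overlap(machine_box, box, margin)]
```

In practice the boxes would come from the semantic point cloud's bounding boxes, and the margin could be tuned per object class (e.g. larger for personnel than for static structures).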

A further improvement of the technical solution of the present invention is that the real-time dynamic perception of multi-source heterogeneous data in the working environment specifically includes: mounting a vision-lidar fusion perception system on the engineering machinery to acquire, in real time, a three-dimensional color point cloud of the working scene and the machinery's motion trajectory; analyzing the point cloud to extract semantic point clouds and three-dimensional bounding box information for obstacles, personnel, and the surrounding scene; jointly acquiring motion data of each part of the machinery through the controller and IMUs; obtaining the machinery's global position through GPS; and storing all of this in a database in real time.

A further improvement of the technical solution of the present invention is that the analysis of the three-dimensional color point cloud specifically refers to using a point cloud semantic segmentation network to preprocess, semantically segment, and triangulate the collected color laser point cloud data, thereby extracting semantic point clouds and three-dimensional bounding box information for obstacles, personnel, and the surrounding scene, and enabling three-dimensional modeling and shape analysis of objects, so that the position, form, and characteristics of materials can be identified and detected more accurately. The point cloud preprocessing includes point cloud downsampling, denoising, and resampling.
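
Two of the named preprocessing steps, voxel-grid downsampling and statistical outlier removal (a common denoising choice), can be sketched with NumPy alone; a production pipeline would more likely use a dedicated library such as Open3D or PCL, and the parameter values here are illustrative:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Replace all points falling in one voxel of edge `voxel_size` with their centroid."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    out = np.zeros((counts.size, 3))
    for dim in range(3):  # per-voxel mean of each coordinate
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbors is anomalously large."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)  # column 0 is the zero self-distance
    keep = mean_knn <= mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]
```

The brute-force pairwise distance matrix in `remove_outliers` is O(n²) and only suitable for small clouds; real pipelines use a k-d tree for the neighbor search.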

A further improvement of the technical solution of the present invention is that the color point cloud map is built by treating static and dynamic point cloud data differently: for static regions, updating stops once the 3D point cloud is sufficiently dense; for dynamic regions, incremental detection is performed and the changed point cloud data are updated in real time, yielding a real-time color point cloud map.
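
The static/dynamic update policy above can be sketched as a voxelized map that freezes a static voxel once it is dense enough while always refreshing voxels flagged as dynamic. The class name, density threshold, and the source of the dynamic flag (e.g. an upstream change-detection step) are assumptions for illustration:

```python
# Hypothetical sketch of the map-update policy: static voxels stop accepting
# points once "dense enough"; dynamic voxels are overwritten by the latest scan.

class ColorPointCloudMap:
    def __init__(self, density_threshold=100):
        self.voxels = {}               # voxel key -> list of (x, y, z, r, g, b) points
        self.frozen = set()            # static voxels that reached the density threshold
        self.density_threshold = density_threshold

    def insert(self, key, points, dynamic=False):
        if dynamic:
            self.voxels[key] = list(points)      # refresh with the latest scan
        elif key not in self.frozen:
            bucket = self.voxels.setdefault(key, [])
            bucket.extend(points)
            if len(bucket) >= self.density_threshold:
                self.frozen.add(key)             # stop updating this static region
```

This keeps per-frame work proportional to the changing part of the scene, which is what makes the real-time color point cloud map feasible.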

A further improvement of the technical solution of the present invention is the joint use of the GPS positioning data, the acquired engineering machinery motion trajectory, and the machinery's local IMU data; the specific model is as follows.

The center of the engineering machinery in the virtual scene is set to coincide with the GPS installation position in the actual scene. P_c denotes the coordinates of the machinery center in the virtual-scene coordinate system. P_gps(x, y, z) = ((N + H)·cosB·cosL, (N + H)·cosB·sinL, [N·(1 − e²) + H]·sinB) is the GPS reading converted to Cartesian coordinates, where B is the latitude, L the longitude, H the geodetic height, N the radius of curvature in the prime vertical, and e the first eccentricity of the reference ellipsoid. P_t denotes the coordinates of the machinery at the current moment, computed from the acquired motion trajectory. RT denotes the rotation-translation matrix of the machinery's starting position relative to the GPS installation position. The rotation angles of the machinery center about the x, y, and z axes in the virtual scene are driven to match the machinery's current rotation angles about x, y, and z in the actual scene; likewise, the rotation angles of each key part of the machinery about x, y, and z in the virtual scene are driven to match the angles measured by the IMU installed on the corresponding key part in the actual scene.
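
The GPS conversion in this model is the standard geodetic-to-Cartesian (ECEF) transform. A sketch follows; the patent does not name the reference ellipsoid, so the WGS-84 constants are an assumption, and note the formula only makes sense with B as latitude and L as longitude:

```python
import math

# Geodetic (latitude B, longitude L, height H) to Cartesian conversion,
# matching P_gps = ((N+H) cosB cosL, (N+H) cosB sinL, [N(1-e^2)+H] sinB).
# WGS-84 ellipsoid parameters assumed (not specified in the patent).
A_WGS84 = 6378137.0                 # semi-major axis (m)
F_WGS84 = 1.0 / 298.257223563       # flattening
E2 = F_WGS84 * (2.0 - F_WGS84)      # first eccentricity squared, e^2

def gps_to_cartesian(lat_deg, lon_deg, height_m):
    """Return (x, y, z) in meters for geodetic latitude/longitude in degrees."""
    B, L = math.radians(lat_deg), math.radians(lon_deg)
    N = A_WGS84 / math.sqrt(1.0 - E2 * math.sin(B) ** 2)  # prime-vertical radius
    return ((N + height_m) * math.cos(B) * math.cos(L),
            (N + height_m) * math.cos(B) * math.sin(L),
            (N * (1.0 - E2) + height_m) * math.sin(B))
```

At the equator (B = L = H = 0) this returns (a, 0, 0) with a the semi-major axis, and at the pole the z coordinate equals the semi-minor axis, which is a quick sanity check on the implementation.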

A further improvement of the technical solution of the present invention is that the virtual-real data fusion registration of the engineering machinery is achieved by aligning the coarse machine position obtained from GPS, the IMU data of key components, and the machinery's own motion trajectory and local color 3D point cloud output by visual-lidar SLAM, and by fusion matching of similar feature points between the 3D point cloud and the machinery.

A further improvement of the technical solution of the present invention is that the interactive task setting specifically includes: setting work task objectives for the generated mixed reality working scene according to task requirements through gesture and voice interaction, and dynamically displaying fused information and prompts in the virtual-real scene.

A further improvement of the technical solution of the present invention is that the bidirectional digital twin management and control specifically includes: on the one hand, scene perception through real-time online monitoring of the bulk material, the machinery's own operating state, the surrounding environment, and changes in material state; on the other hand, based on the generated mixed reality map, online learning, optimization, and adjustment of pre-trained machinery operation trajectories in the mixed reality environment, together with experimental verification and safety testing of loading and unloading operations, generating an optimal trajectory for a specific machine type and task and thereby realizing online digital twin control of machinery operations.

A mixed reality digital twin management and control system for engineering machinery, comprising a dynamic real-time port scene perception module, a mixed reality scene fusion and matching module, and a human-machine collaborative interaction and control module.

The dynamic real-time port scene perception module fuses and matches a monocular camera sensor with a lidar sensor to obtain the machinery's motion trajectory and color three-dimensional point cloud data of the working environment; it applies a semantic segmentation network to the point cloud to obtain semantic information for materials, people, equipment, tracks, ground, ships, and vehicles, together with the minimum bounding box of each semantic point cloud; IMU sensors installed at the machinery's key hoisting, slewing, and luffing parts provide motion information, and a GPS receiver on the cab roof provides the machine's global position.
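
The per-class minimum bounding boxes mentioned above can be computed directly from the labeled point cloud. The patent does not say whether the boxes are axis-aligned or oriented; the sketch below assumes the simpler axis-aligned case, and the function name is illustrative:

```python
import numpy as np

def semantic_bounding_boxes(points, labels):
    """One axis-aligned bounding box per semantic label: label -> (min_corner, max_corner).

    `points` is an Nx3 array; `labels` assigns a class (e.g. "person", "truck")
    to each point, as produced by the segmentation network.
    """
    pts = np.asarray(points, dtype=float)
    labels = np.asarray(labels)
    return {lab: (pts[labels == lab].min(axis=0), pts[labels == lab].max(axis=0))
            for lab in np.unique(labels)}
```

An oriented bounding box (e.g. from PCA of each cluster) would fit elongated objects such as tracks or ships more tightly, at extra cost.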

The mixed reality scene fusion and matching module fuses the pre-built virtual model of the machinery and the ancillary facility models with the collected three-dimensional color point cloud according to the GPS data, IMU data, and motion trajectory, establishing more accurate equipment position information. The position information and minimum bounding boxes obtained from positioning and semantic matching are then transformed into the coordinate system of the virtual machinery model and ground facilities built in Unity, and the virtual machinery model is displayed in Unity3D, superimposed on the real scene.

The human-machine collaborative interaction and control module uses voice and gesture recognition to control task setting and the recording of motion trajectories of the machinery in the mixed reality digital twin environment, performs trajectory optimization and prediction, and sends control commands from the Unity side to the machinery in real time to control the actual machinery's motion.

A further improvement of the technical solution of the present invention is that the engineering machinery includes at least the portal cranes, ship loaders, stacker-reclaimers, excavators, and loaders among port equipment.

By adopting the above technical solution, the present invention achieves the following technical progress:

1. Based on color laser point clouds, the present invention combines three-dimensional material modeling, dynamic perception, mixed reality virtual-real superposition, and online learning and optimization. It can improve the operating efficiency and safety of engineering machinery and, through data analysis, motion prediction, and simulation in the mixed reality digital twin environment, optimize operation plans, improve resource utilization, and reduce operating costs.

2. The present invention introduces mixed reality and digital twin technology into the unmanned operation of engineering machinery. By fusing, in real time, the collected three-dimensional color laser point cloud with the machinery's three-dimensional model and the ancillary facility models, it enables real-time three-dimensional modeling of on-site materials and dynamic perception and monitoring of the environment. Through interactive task setting, pre-trained operation trajectories can be learned, optimized, and adjusted online in the mixed reality environment; loading and unloading operations can be experimentally verified and safety-tested; and an optimal trajectory can be generated for a specific machine type and task, thereby realizing operation control of the actual machinery.

BRIEF DESCRIPTION OF THE DRAWINGS

In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required by the embodiments or by the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can derive other drawings from them without creative labor.

FIG. 1 is an overall block diagram of the engineering machinery mixed reality digital twin management and control system provided in an embodiment of the present invention;

FIG. 2 is a flow chart of the engineering machinery mixed reality digital twin management and control method provided in an embodiment of the present invention.

DETAILED DESCRIPTION OF THE EMBODIMENTS

It should be noted that the terms "including" and "having", and any variations thereof, in the specification, claims and drawings of the present invention are intended to cover non-exclusive inclusion. For example, a process, method, system, product or apparatus comprising a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product or apparatus.

The present invention is described in further detail below with reference to the accompanying drawings and embodiments.

The engineering machinery in the present invention includes portal cranes, ship loaders, stacker-reclaimers, excavators, loaders and other port equipment; a portal crane is taken as an example below.

The present invention uses a Livox solid-state LiDAR, a monocular camera and an IMU (inertial measurement unit, a device that measures the three-axis attitude angles and acceleration of an object) to collect the color point cloud of the working scene, following the method of autonomously constructing a three-dimensional true-color point cloud map proposed in Chinese invention patent publication CN117367441A, "A robot system and method for autonomously moving and constructing a three-dimensional true-color point cloud map". The LiDAR, camera and IMU are mounted on the boom near the driver's cab of the portal crane; as the crane moves at a uniform speed along the rails, the three-dimensional color point cloud of the working scene and the crane's motion trajectory are collected.

As shown in FIG. 1, an engineering machinery mixed reality digital twin management and control system includes a dynamic real-time perception module for the port scene, a mixed reality scene fusion and matching module, and a human-machine collaborative interaction and control module.

The dynamic real-time perception module of the port scene fuses and matches the monocular camera sensor and the LiDAR sensor to obtain the motion trajectory of the portal crane and the color three-dimensional point cloud data of the working environment.

Applying the PointNet++ semantic segmentation network to the point cloud yields semantic information for materials, people, equipment, tracks, ground, ships and vehicles, together with the minimum bounding box of each semantic point cloud.
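The per-class minimum bounding box described above can be sketched as follows. This is a minimal pure-Python illustration; the function names are our own, and a production system would operate on the segmentation network's label output rather than on plain lists:

```python
def min_bounding_box(points):
    """Axis-aligned minimum bounding box of a list of (x, y, z) points.

    Returns (center, extents): the box center and its side lengths.
    """
    xs, ys, zs = zip(*points)
    lo = (min(xs), min(ys), min(zs))
    hi = (max(xs), max(ys), max(zs))
    center = tuple((l + h) / 2.0 for l, h in zip(lo, hi))
    extents = tuple(h - l for l, h in zip(lo, hi))
    return center, extents

def boxes_by_label(points, labels):
    """One bounding box per semantic class (material, person, ship, ...)."""
    boxes = {}
    for lab in set(labels):
        cls_pts = [p for p, l in zip(points, labels) if l == lab]
        boxes[lab] = min_bounding_box(cls_pts)
    return boxes
```

In the system, each box then feeds the collision-detection and danger-warning steps described later.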

By installing IMU sensors at the key hoisting, slewing and luffing parts of the portal crane, motion information is obtained; a GPS mounted on top of the crane's cab provides the machine position P_gps(x, y, z).

The mixed reality scene fusion and matching module fuses the pre-built virtual model of the portal crane and the models of ancillary facilities with the collected three-dimensional color point cloud according to the GPS data, IMU data and motion trajectory, so as to establish more accurate equipment position information. The position information and minimum-bounding-box information obtained by positioning and semantic matching are then transformed to align with the coordinate system of the virtual portal crane model and the ground ancillary facilities built in Unity. Based on the transformed position and object information, the virtual crane model is displayed in Unity3D and superimposed on the real scene.
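The coordinate-system alignment step amounts to applying a rigid rotation-translation to each sensed point or box corner so that it lands in the virtual model's frame. A minimal sketch (illustrative only; in practice Unity's own transform utilities would perform this step):

```python
def apply_rigid_transform(points, R, t):
    """Map each point p to R·p + t, e.g. from the sensor frame
    into the Unity virtual-scene frame.

    R is a 3x3 rotation matrix (list of rows), t a 3-vector.
    """
    out = []
    for p in points:
        out.append(tuple(
            sum(R[i][j] * p[j] for j in range(3)) + t[i]
            for i in range(3)
        ))
    return out
```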

The joint registration based on GPS data, the machinery motion trajectory and local IMU data uses the following model:

Here the center of the engineering machinery in the virtual scene is set to coincide with the GPS mounting position in the actual scene. The coordinates of the machinery center in the virtual-scene coordinate system are established from P_gps(x, y, z) = ((N + H)·cosB·cosL, (N + H)·cosB·sinL, [N·(1 − e²) + H]·sinB), the GPS geodetic coordinates converted to Cartesian coordinates (B is latitude, L is longitude, H is geodetic height, N is the radius of curvature in the prime vertical, and e is the first eccentricity of the ellipsoid), together with the coordinates of the machinery at the current moment, which are calculated from the acquired motion trajectory. RT denotes the rotation-translation matrix of the machinery's starting position relative to the GPS mounting position. The rotation angles about x, y and z of the machinery center in the virtual scene are matched to the rotation angles of the machinery in the actual scene at the current moment, and the rotation angles of each key part in the virtual scene are matched to the angles measured by the IMU mounted on the corresponding key part in the actual scene.
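The geodetic-to-Cartesian conversion of P_gps above follows the standard form X = (N+H)cosB·cosL, Y = (N+H)cosB·sinL, Z = [N(1−e²)+H]·sinB. A sketch using the WGS-84 ellipsoid constants — an assumption on our part, since the patent does not name the reference ellipsoid:

```python
import math

# WGS-84 ellipsoid (assumed; the patent does not specify the ellipsoid)
A = 6378137.0            # semi-major axis, metres
E2 = 6.69437999014e-3    # first eccentricity squared, e^2

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Geodetic (B latitude, L longitude, H geodetic height) to Cartesian:
    X = (N+H)cosB cosL, Y = (N+H)cosB sinL, Z = [N(1-e^2)+H]sinB,
    with N the radius of curvature in the prime vertical."""
    B, L = math.radians(lat_deg), math.radians(lon_deg)
    N = A / math.sqrt(1.0 - E2 * math.sin(B) ** 2)
    return ((N + h) * math.cos(B) * math.cos(L),
            (N + h) * math.cos(B) * math.sin(L),
            (N * (1.0 - E2) + h) * math.sin(B))
```

At the equator the result reduces to the semi-major axis on the X axis, and at the pole to the semi-minor axis on Z, which is a quick sanity check on the implementation.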

The scene is displayed through Unity3D as follows: the data acquired by the perception layer is transmitted over TCP/IP to a MySQL database, and the server layer reads and processes the database. The color point cloud data of the bulk carrier is preprocessed using the PCL library (point cloud downsampling, denoising, etc.) and the preprocessed point cloud is then meshed; the control signals of the port machinery are read, processed and used for operation planning; and the surrounding environment and personnel position information is processed for danger warning, fault diagnosis, collision detection and personnel positioning, building the XR environment.
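The downsampling step mentioned above, performed with the PCL library in the text, can be illustrated by a voxel-grid filter that replaces all points falling in one voxel by their centroid. A pure-Python sketch of the idea, not the PCL implementation:

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size=0.5):
    """Voxel-grid downsampling: one centroid per occupied voxel.

    points: iterable of (x, y, z) tuples; voxel_size in the same units.
    """
    cells = defaultdict(list)
    for p in points:
        # Integer voxel index along each axis
        key = tuple(int(c // voxel_size) for c in p)
        cells[key].append(p)
    # Replace each voxel's points by their centroid
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in cells.values()]
```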

Building the XR environment covers three-dimensional visualization of material point clouds, display of the color point cloud map, construction of three-dimensional equipment models, tracking of operation trajectories, construction of personnel models and fused display. The service layer is built with the Unity3D engine and comprises service models and user interfaces; the service models include historical playback, material measurement, equipment monitoring, throughput estimation, operation task planning and measurement range delineation. Several user interfaces are provided: a web browser interface, a client GUI, and a mobile application client.

The human-machine collaborative interaction and control module uses voice and gesture recognition to control task setting and motion trajectory recording for the portal crane in the mixed reality digital twin environment, realizes trajectory optimization and prediction, and, via the Unity end, sends control commands to the crane in real time to control the actual machine. The specific steps are as follows:

(1) The mixed reality digital twin client captures the operator's gesture input through a camera and analyzes and recognizes the gestures with a gesture recognition algorithm.

(2) Audio is recorded through a microphone and converted to text with the Baidu speech recognition API; the recognition result may contain control commands or keywords.

(3) From the gesture and speech recognition results, the corresponding control commands are parsed, such as "move to the first hatch", "stop operation", "rise", "fall" and "rotate", and sent to the mixed reality digital twin end.
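The parsing in step (3) can be as simple as a keyword lookup over the recognized text. A sketch; the phrase-to-command table is hypothetical and only mirrors the example phrases given in the text:

```python
# Hypothetical mapping from recognized phrases to crane commands;
# the phrase set follows the examples in the text.
COMMANDS = {
    "move to the first hatch": "MOVE_HATCH_1",
    "stop operation": "STOP",
    "rise": "HOIST_UP",
    "fall": "HOIST_DOWN",
    "rotate": "SLEW",
}

def parse_command(recognized_text):
    """Return the first command whose keyword occurs in the ASR output,
    or None when nothing matches."""
    text = recognized_text.lower()
    for phrase, cmd in COMMANDS.items():
        if phrase in text:
            return cmd
    return None
```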

The specific control process of the control module is as follows:

(1) On the remote operation end, the operation task, the operation target and the color point cloud of the working environment are recorded to obtain the crane's operation trajectory for a specific task, building a task-specific trajectory library;

(2) By optimizing and fine-tuning the recorded trajectories, the optimal trajectory for a specific task is predicted in the mixed reality environment; a kinematic model of the portal crane is built there for experimental verification and safety testing, and the optimal motion trajectory is finally selected;

(3) Based on the optimal motion trajectory, specific task control is achieved through communication between the Unity mixed reality digital twin system and the crane's PLC control system.
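One elementary form of the fine-tuning in step (2) above is smoothing a recorded trajectory before replaying it. A moving-average sketch; this is illustrative only, since the patent does not specify the optimization method:

```python
def smooth_trajectory(traj, window=3):
    """Moving-average smoothing of a recorded trajectory of (x, y, z) samples.

    The window is clipped at both ends so the output has the same length
    as the input.
    """
    half = window // 2
    out = []
    for i in range(len(traj)):
        seg = traj[max(0, i - half): i + half + 1]
        out.append(tuple(sum(c) / len(seg) for c in zip(*seg)))
    return out
```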

As shown in FIG. 2, a flow chart of the engineering machinery mixed reality digital twin management and control method provided by the present invention: the perception data is processed to build the mixed reality virtual scene, and operation tasks are generated automatically through human-computer interaction. Three-dimensional motion simulation and prediction of the machinery in the mixed reality environment guarantee that a task is reasonable and safe. If the safety criteria are met, the optimal motion trajectory is issued via PLC control signals to perform the actual operation; if a safety hazard exists or the requirements are not met, the task is re-planned or remote manual operation is used.

Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described therein can still be modified, and some or all of their technical features can be replaced by equivalents, without the essence of the corresponding technical solutions thereby departing from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An engineering machinery mixed reality digital twin management and control method, characterized in that bidirectional digital twin management and control of the engineering machinery is realized by real-time dynamic sensing of multi-source heterogeneous data in the working environment of the engineering machinery, generation of a virtual-real data fused mixed reality working environment, and interactive task setting;
the virtual-real data fused mixed reality working environment is generated by using a server to construct a color point cloud map from the acquired color point cloud, combining the color point cloud with the pre-constructed three-dimensional models of the engineering machinery and of ancillary facilities, realizing fusion registration of the virtual and real data of the machinery based on GPS positioning data, the acquired motion trajectory of the machinery and its local IMU data, further analyzing the material state and equipment motion state based on semantic information, and simultaneously giving danger warnings for equipment, personnel and surrounding objects in the virtual scene by means of collision detection.
2. The engineering machinery mixed reality digital twin management and control method according to claim 1, wherein the real-time dynamic sensing of multi-source heterogeneous data in the working environment specifically comprises: a vision-LiDAR fused sensing system is mounted on the engineering machinery to obtain the three-dimensional color point cloud of the working scene and the motion trajectory of the machinery in real time; the point cloud is then analyzed to obtain the semantic point cloud and three-dimensional bounding box information of obstacles, personnel and the surrounding scene; the motion data of each part of the machinery is obtained by combining the controller with the IMU, the global position of the machinery is obtained by GPS, and all data are stored in the database in real time.
3. The engineering machinery mixed reality digital twin management and control method according to claim 2, wherein the analysis of the three-dimensional color point cloud specifically comprises preprocessing, semantic segmentation and triangulation of the collected color laser point cloud data using a point cloud semantic segmentation network, thereby obtaining the semantic point cloud and three-dimensional bounding box information of obstacles, personnel and the surrounding scene, realizing three-dimensional modeling and shape analysis of objects, and identifying and detecting the positions, shapes and characteristics of materials more accurately; the point cloud preprocessing comprises point cloud downsampling, denoising and resampling.
4. The engineering machinery mixed reality digital twin management and control method according to claim 1, wherein the color point cloud map is constructed by applying different processing to dynamic and static point cloud data: for a static region, updating stops once the 3D point cloud is sufficiently dense; for a dynamic region, incremental detection is performed and the changed point cloud data is updated in real time, thereby constructing a real-time color point cloud map.
5. The engineering machinery mixed reality digital twin management and control method according to claim 1, wherein the fusion registration is based on GPS positioning data, the acquired motion trajectory of the machinery and its local IMU data, with the following specific model:
wherein the center of the engineering machinery in the virtual scene is set to coincide with the GPS mounting position in the actual scene; the coordinates of the machinery center in the virtual-scene coordinate system are established from P_gps(x, y, z) = ((N + H)·cosB·cosL, (N + H)·cosB·sinL, [N·(1 − e²) + H]·sinB), the GPS geodetic coordinates converted to Cartesian coordinates, where B is latitude, L is longitude, H is geodetic height, N is the radius of curvature in the prime vertical and e is the first eccentricity; the coordinates of the machinery at the current moment are calculated from the acquired motion trajectory; RT denotes the rotation-translation matrix of the machinery's starting position relative to the GPS mounting position; (h, p, r) denotes the rotation angles about x, y and z of the machinery center in the virtual scene, matched against the rotation angles of the machinery in the actual scene at the current moment; and the rotation angles of the key parts of the machinery in the virtual scene are matched against the angles about x, y and z measured by the IMUs mounted on the corresponding key parts in the actual scene.
6. The engineering machinery mixed reality digital twin management and control method according to claim 1, wherein the fusion registration of the virtual and real data of the machinery is realized by aligning the coarse positioning coordinates of the machine obtained by GPS, the IMU information of its key parts, its motion trajectory information and the local color 3D point cloud of the machinery output by the vision-LiDAR SLAM, and by fusion matching of the 3D point cloud with similar feature points of the machinery.
7. The engineering machinery mixed reality digital twin management and control method according to claim 1, wherein the interactive task setting specifically comprises: setting the task target of the generated mixed reality working scene according to task requirements through gesture and voice interaction, and performing dynamic information fusion display and prompting on the virtual-real fused scene.
8. The engineering machinery mixed reality digital twin management and control method according to claim 1, characterized by specifically comprising: on the one hand, realizing scene sensing through real-time online monitoring of bulk materials, the running state of the machinery and the changing state of the surrounding environment and materials; on the other hand, according to the generated mixed reality map, performing online learning, optimization and adjustment of the pre-trained operation trajectory of the machinery in the mixed reality environment, carrying out experimental verification and safety testing of loading and unloading operations, and generating the optimal trajectory for a specific model and a specific task, thereby realizing online digital twin control of the machinery's operation.
9. An engineering machinery mixed reality digital twin management and control system used by the method according to any one of claims 1-8, comprising a dynamic real-time sensing module for the port scene, a mixed reality scene fusion and matching module, and a human-machine collaborative interaction and control module;
the dynamic real-time sensing module of the port scene is used for fusing, matching and displaying the monocular camera sensor and the LiDAR sensor to obtain the motion trajectory of the machinery and the color three-dimensional point cloud data of the working environment; a semantic segmentation network is invoked on the point cloud data to obtain the semantic information of materials, people, equipment, tracks, ground, ships and vehicles, together with the minimum bounding box of each semantic point cloud; IMU sensors installed at the hoisting, slewing and luffing key parts of the machinery acquire motion information, and a GPS installed on top of the machinery's cab acquires the machine position;
the mixed reality scene fusion and matching module is used for fusing the pre-built virtual model of the machinery and the ancillary facility models with the acquired three-dimensional color point cloud according to the GPS data, IMU data and motion trajectory so as to establish more accurate equipment position information; then transforming the position information and minimum-bounding-box information obtained by positioning and semantic matching to align with the coordinate system of the virtual machinery model and the ground ancillary facilities built in Unity; and displaying the virtual machinery model in Unity3D according to the transformed position and object information, superimposed on the real scene;
the human-machine collaborative interaction and control module is used for controlling the task setting and motion trajectory recording of the machinery in the mixed reality digital twin environment by voice and gesture recognition, realizing trajectory optimization and prediction, and controlling the actual machinery by sending control commands to it in real time from the Unity end.
10. The system according to claim 9, wherein the engineering machinery comprises at least a portal crane, a ship loader, a stacker-reclaimer, an excavator and a loader among port equipment.
CN202410341789.0A 2024-03-25 2024-03-25 Engineering machinery mixed reality digital twin management and control method and system Pending CN118262071A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410341789.0A CN118262071A (en) 2024-03-25 2024-03-25 Engineering machinery mixed reality digital twin management and control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410341789.0A CN118262071A (en) 2024-03-25 2024-03-25 Engineering machinery mixed reality digital twin management and control method and system

Publications (1)

Publication Number Publication Date
CN118262071A true CN118262071A (en) 2024-06-28

Family

ID=91601786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410341789.0A Pending CN118262071A (en) 2024-03-25 2024-03-25 Engineering machinery mixed reality digital twin management and control method and system

Country Status (1)

Country Link
CN (1) CN118262071A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120729880A (en) * 2025-08-29 2025-09-30 上海宝信软件股份有限公司 Virtual human interaction method and system based on digital twin
WO2026011482A1 (en) * 2024-07-09 2026-01-15 中国科学院深圳先进技术研究院 Mixed reality-based building construction method, system, and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination