CN111476893A - Three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facility - Google Patents
Three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facility
- Publication number
- CN111476893A (application number CN202010389171.3A)
- Authority
- CN
- China
- Prior art keywords
- data
- empty
- offshore oil
- modeling
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G: PHYSICS
- G06: COMPUTING; CALCULATING OR COUNTING
- G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05: Geographic models
- G06T15/00: 3D [Three-dimensional] image rendering
- G06T15/04: Texture mapping
Abstract
The application discloses a three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facilities. The method comprises: acquiring image data captured by an unmanned aerial vehicle flying around an offshore oil platform and scanning data obtained by scanning the platform with a three-dimensional laser scanner; performing space-three (aerial triangulation) densification on the image data to obtain aerial triangulation data; selecting control points from the scanning data to correct the aerial triangulation data; fusing the image data and the scanning data to obtain fused point cloud data; and modeling the point cloud data. By fusing data from different sources into a single point cloud before modeling, the method achieves a good modeling effect, an accurate model and clear texture even on the underside of the platform, and can well meet the requirements of practical applications.
Description
Technical Field
The application relates to the technical field of modeling, in particular to a three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facilities.
Background
In the early days of offshore oil production, venturing out to sea was difficult, and building platforms and drilling offshore even more so. The first offshore platforms were therefore simple wooden structures erected in shallow water near the coast, supporting only basic drilling in shallow sea areas; oil and gas could not yet be produced at scale. As science and technology progressed and building materials were better understood, steel replaced wood in offshore drilling platforms, platforms moved progressively farther from the coast into the open ocean, and modeling of offshore oil platforms became increasingly important. Existing modeling methods for offshore oil platforms rely on a single data source: they can only reconstruct the areas captured in the imagery, and the modeling effect is poor for the underside of the platform, which is difficult to photograph.
Disclosure of Invention
The application aims to provide a three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facilities. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended neither to identify key or critical elements nor to delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
According to an aspect of an embodiment of the present application, there is provided a three-dimensional live-action modeling method based on an offshore oil and gas equipment facility, including:
acquiring image data acquired by an unmanned aerial vehicle flying around an offshore oil platform and scanning data obtained by scanning the offshore oil platform by using a three-dimensional laser scanner;
performing space-three (aerial triangulation) densification on the image data to obtain aerial triangulation data;
selecting control points from the scanning data to correct the aerial triangulation data;
fusing the image data and the scanning data to obtain fused point cloud data;
modeling the point cloud data.
Further, the performing space-three densification on the image data includes: extracting feature points; selecting same-name (homologous) image pairs; relative orientation; matching tie points; and area network bundle adjustment.
Further, the performing space-three densification on the image data to obtain aerial triangulation data includes:
acquiring feature points of all images in the image data;
projecting all the images onto a three-dimensional terrain according to each image's exterior orientation elements, camera parameters and projection coordinates, and dividing all the images into a plurality of space-three blocks, wherein each space-three block constitutes a space-three solution task group;
distributing the space-three solution task groups to the respective compute nodes so that each node performs adjustment calculation on its corresponding space-three block, wherein each task group comprises the information of the feature points;
and receiving the space-three solution results returned by each compute node, merging the space-three blocks into an integral area network according to these results, and performing joint adjustment calculation on the integral area network to obtain the aerial triangulation data.
Further, the merging the space-three blocks into an integral area network according to the space-three solution results includes:
calculating the degree of overlap between the blocks corresponding to each space-three block;
and correcting, based on a point-feature global matching algorithm, according to the degree of overlap and each space-three solution result, and merging the space-three blocks into an integral area network.
Further, the modeling the point cloud data includes:
selecting an appropriate maximum memory according to the performance of the computer, and performing tile (block) setting;
and performing texture mapping processing to obtain the finished model.
Further, the performing block setting includes: selecting a region of interest within the photographed area, and splitting it into a plurality of smaller areas.
According to another aspect of the embodiments of the present application, there is provided a three-dimensional live-action modeling apparatus for an offshore oil platform, including:
the acquisition module is used for acquiring image data acquired by the unmanned aerial vehicle flying around the offshore oil platform and scanning data obtained by scanning the offshore oil platform by using a three-dimensional laser scanner;
the densification module is used for performing space-three densification on the image data to obtain aerial triangulation data;
a correction module for selecting control points from the scan data to correct the aerial triangulation data;
the fusion module is used for fusing the image data and the scanning data to obtain fused point cloud data;
and the modeling module is used for modeling the point cloud data.
Further, the modeling module includes:
the blocking module is used for selecting an appropriate maximum memory according to the performance of the computer and performing block setting;
and the texture mapping processing module is used for carrying out texture mapping processing to obtain the built model.
According to another aspect of embodiments of the present application, there is provided an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the above-mentioned three-dimensional live-action modeling method based on an offshore oil and gas field equipment facility.
The technical scheme provided by one aspect of the embodiment of the application can have the following beneficial effects:
according to the three-dimensional live-action modeling method based on the offshore oil and gas field equipment facility, different data source data are fused to obtain fused point cloud data, then modeling is carried out on the point cloud data, the modeling effect is good, the model is more accurate, the bottom texture is clearer, and the requirements of practical application can be well met.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the application, or may be learned by the practice of the embodiments. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application; other drawings can be obtained by those skilled in the art without creative effort.
FIG. 1 illustrates a flow chart of a method of three-dimensional live-action modeling based on an offshore oil and gas equipment facility according to an embodiment of the present application;
FIG. 2 illustrates a block diagram of the structure of a three-dimensional live-action modeling apparatus for an offshore oil platform according to an embodiment of the present application;
FIG. 3 shows a schematic diagram of a fused point cloud in another embodiment of the present application;
FIG. 4 shows a platform patch setup diagram upon modeling in another embodiment of the present application;
fig. 5 shows a schematic diagram of a model established in another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is further described with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As shown in fig. 1, an embodiment of the present application provides a three-dimensional live-action modeling method based on an offshore oil and gas field facility, including:
S1, acquiring image data captured by the unmanned aerial vehicle flying around the offshore oil platform and scanning data obtained by scanning the offshore oil platform with a three-dimensional laser scanner;
S2, performing space-three densification on the image data to obtain aerial triangulation data;
S3, selecting control points from the scanning data to correct the aerial triangulation data;
S4, fusing the image data and the scanning data to obtain fused point cloud data;
and S5, modeling the point cloud data.
In some embodiments, the performing space-three densification on the image data includes: extracting feature points; selecting same-name (homologous) image pairs; relative orientation; matching tie points; and area network bundle adjustment.
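As an illustrative sketch only (assuming nothing about the actual implementation), the five listed stages can be chained into a simple pipeline skeleton. All function names and data shapes below are invented for illustration, and the later photogrammetric stages are placeholders:

```python
# Sketch of the five space-three densification stages as a pipeline.
# Stage bodies are placeholders, not real photogrammetric routines.

def extract_feature_points(images):
    # Placeholder for a per-image keypoint detector.
    return {img: ("keypoints", img) for img in images}

def select_same_name_pairs(images):
    # Placeholder: in practice only images with overlapping footprints pair up.
    return [(a, b) for i, a in enumerate(images) for b in images[i + 1:]]

def space_three_pipeline(images):
    """Run the five stages in order; return the stage names executed."""
    stages = ["extract_feature_points"]
    features = extract_feature_points(images)
    stages.append("select_same_name_pairs")
    pairs = select_same_name_pairs(images)
    # Remaining stages are listed but not implemented in this sketch.
    stages += ["relative_orientation", "tie_point_matching",
               "area_network_bundle_adjustment"]
    return stages, features, pairs
```

The point of the sketch is only the ordering: orientation and bundle adjustment depend on pairs and tie points, which in turn depend on per-image features.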
In some embodiments, step S2, performing space-three densification on the image data to obtain the aerial triangulation data, includes:
S21, acquiring feature points of all images in the image data;
S22, projecting all the images onto a three-dimensional terrain according to each image's exterior orientation elements, camera parameters and projection coordinates, and dividing all the images into a plurality of space-three blocks, wherein each space-three block constitutes a space-three solution task group;
S23, distributing the space-three solution task groups to the respective compute nodes so that each node performs adjustment calculation on its corresponding space-three block, wherein each task group comprises the information of the feature points;
and S24, receiving the space-three solution results returned by each compute node, merging the space-three blocks into an integral area network according to these results, and performing joint adjustment calculation on the integral area network to obtain the aerial triangulation data.
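A minimal sketch of the blocking step in S22/S23, under the simplifying assumption that images are split into contiguous groups of roughly equal size (the real partition follows the terrain projection described above); `partition_into_blocks` is a hypothetical helper:

```python
def partition_into_blocks(image_ids, n_blocks):
    """Split a list of image identifiers into at most n_blocks contiguous
    space-three blocks of near-equal size, one per compute node."""
    size = -(-len(image_ids) // n_blocks)  # ceiling division
    return [image_ids[i:i + size] for i in range(0, len(image_ids), size)]
```

Each resulting block would then be sent to a node for its local adjustment before the merged joint adjustment of S24.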
In some embodiments, the merging the space-three blocks into an integral area network according to the space-three solution results includes:
S241, calculating the degree of overlap between the blocks corresponding to each space-three block;
and S242, correcting, based on a point-feature global matching algorithm, according to the degree of overlap and each space-three solution result, and merging the space-three blocks into an integral area network.
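The degree of overlap between two space-three blocks could be illustrated, for example, as the Jaccard ratio of their shared tie-point identifiers. This measure is an assumption for the sketch, not necessarily the one the patent uses:

```python
def overlap_degree(points_a, points_b):
    """Fraction of shared tie-point IDs between two space-three blocks, in
    [0, 1]: |A intersect B| / |A union B| (0.0 when both sets are empty)."""
    a, b = set(points_a), set(points_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0
```

Block pairs with a high overlap degree would be the natural candidates for the point-feature matching and correction of S242.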
In certain embodiments, step S5, modeling the point cloud data, includes:
S51, selecting an appropriate maximum memory according to the performance of the computer, and performing tile (block) setting;
and S52, performing texture mapping processing to obtain the finished model.
In some embodiments, the performing block setting includes: selecting a region of interest within the photographed area, and splitting it into a plurality of smaller areas.
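A sketch of splitting a rectangular region of interest into a grid of smaller areas (the software offers several tiling modes; the uniform grid below is just one assumed choice, and the helper name is invented):

```python
def split_roi(xmin, ymin, xmax, ymax, nx, ny):
    """Split the bounding box of a region of interest into nx * ny tiles,
    returned as (xmin, ymin, xmax, ymax) tuples in row-major order."""
    dx, dy = (xmax - xmin) / nx, (ymax - ymin) / ny
    return [(xmin + i * dx, ymin + j * dy,
             xmin + (i + 1) * dx, ymin + (j + 1) * dy)
            for j in range(ny) for i in range(nx)]
```

Each tile can then be reconstructed and loaded independently, matching the selective loading described later in the detailed embodiment.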
As shown in fig. 2, the present embodiment further provides a three-dimensional live-action modeling apparatus for an offshore oil platform, including:
the acquisition module 10 is used for acquiring image data acquired by the unmanned aerial vehicle flying around the offshore oil platform and scanning data obtained by scanning the offshore oil platform by using a three-dimensional laser scanner;
the densification module 20 is configured to perform space-three densification on the image data to obtain aerial triangulation data;
a correction module 30, configured to select a control point from the scan data to correct the aerial triangulation data;
a fusion module 40, configured to fuse the image data and the scan data to obtain fused point cloud data;
a modeling module 50 for modeling the point cloud data.
In some embodiments, the modeling module 50 includes:
a block module 501, configured to select a suitable maximum memory according to the performance of the computer, and perform block setting;
and the texture mapping processing module 502 is configured to perform texture mapping processing to obtain a built model.
The embodiment also provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor; the processor executes the program to implement the above three-dimensional live-action modeling method based on offshore oil and gas field equipment facilities.
The present embodiments also provide a computer readable storage medium having stored thereon a computer program for execution by a processor to implement the above-described three-dimensional live-action modeling method based on an offshore oil and gas equipment facility.
Another embodiment of the present application provides a three-dimensional live-action modeling method based on an offshore oil and gas field facility, comprising:
and carrying out space-three encryption on the images acquired by the unmanned aerial vehicle. The space-three encryption is performed by using a ContextCapture Center software aerotrixing (hereinafter referred to as AT) module. The ContextCapture Center software AT module obtains the aerial triangulation result of the photographic area through the operation processing of the steps of Extracting Keypoints (Extracting characteristic Points), Selecting Pairs (Extracting same-name image Pairs), InitializationOrientation (relative orientation), Matching Points (Matching connection Points), Bundle Adjustment and the like. In order to improve the adjustment precision of aerial triangulation, multiple aerial three-iteration operations are carried out on a target area, an accurate result is finally obtained, and an oblique aerial photography aerial triangulation result report is generated.
A region of interest can be selected as required in ContextCapture Center, which avoids building redundant parts of the model and increasing the workload. The software also supports splitting the complete region of interest into several smaller areas; various tiling modes are available and can be chosen according to the actual situation of the photographed area. Once reconstruction is complete, individual tiles can be loaded on demand, which speeds up loading and avoids unnecessary searching for specific locations within a large area. After these settings are made, production can begin: the result type and product format are selected and the model is generated. The generated model can be browsed, viewed and measured in Acute3D Viewer, the free visualization module of ContextCapture.
The photographed area here is an isolated platform, and the available data comprise the image data from the unmanned aerial vehicle flying around it and the point cloud data from the three-dimensional laser scanner. To register the two data sets, the laser scanner data is processed first; distinct feature points are then selected as control points to correct the space-three result obtained from the adjustment of the unmanned aerial vehicle images, and the point clouds of the two data sets are fused. Fig. 3 is a schematic diagram of the fused point cloud, which then participates in the modeling as a whole.
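The control-point correction amounts to estimating a rigid transform between the two point clouds from corresponding control points. A standard least-squares (Kabsch) solution, offered here as an illustrative sketch rather than as the method actually used in the patent, looks like this:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping control points src -> dst.

    src, dst: (N, 3) arrays of corresponding control points, e.g. picked from
    the image-derived cloud (src) and the laser-scan cloud (dst).
    Classic Kabsch/SVD solution: dst ~= src @ R.T + t.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)      # centroids
    H = (src - cs).T @ (dst - cd)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

With the transform estimated from a handful of well-distributed control points, the whole image-derived cloud can be brought into the scanner's coordinate frame before fusion.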
The fused point cloud data is modeled with the ContextCapture Center software. Tiles are set up first: an appropriate maximum memory is chosen for tiling according to the performance of the computer, with the aim of dividing the scene into as few tiles as possible. The resulting tiles differ in size, but this makes the most effective use of the computer's performance while best guaranteeing tile quality. Fig. 4 shows the tile setup for the platform.
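Choosing the number of tiles from the maximum memory can be sketched as a ceiling division over the point budget. The byte-per-point figure and the helper name below are assumptions made purely for illustration:

```python
import math

def tiles_for_memory(n_points, bytes_per_point, max_memory_bytes):
    """Smallest number of equal-sized tiles such that the points of one tile
    fit inside the chosen maximum memory budget."""
    total = n_points * bytes_per_point
    return max(1, math.ceil(total / max_memory_bytes))
```

Fewer, larger tiles mean fewer seams to texture, which matches the stated aim of dividing the scene into as few tiles as the memory budget allows.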
modeling is performed after the blocks are divided, and texture mapping processing is performed to obtain a built model, as shown in fig. 5.
It should be noted that:
the term "module" is not intended to be limited to a particular physical form. Depending on the particular application, a module may be implemented as hardware, firmware, software, and/or combinations thereof. Furthermore, different modules may share common components or even be implemented by the same component. There may or may not be clear boundaries between the various modules.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may be used with the teachings herein. The required structure for constructing such a device will be apparent from the description above. In addition, this application is not directed to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present application as described herein, and any descriptions of specific languages are provided above to disclose the best modes of the present application.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in the creation apparatus of a virtual machine according to embodiments of the present application. The present application may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of these steps is not strictly limited and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which need not be completed at the same time but may be executed at different times, and need not be performed sequentially but may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
The above-mentioned embodiments only express the embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.
Claims (9)
1. A three-dimensional live-action modeling method based on offshore oil and gas field equipment facilities is characterized by comprising the following steps:
acquiring image data acquired by an unmanned aerial vehicle flying around an offshore oil platform and scanning data obtained by scanning the offshore oil platform by using a three-dimensional laser scanner;
performing space-three (aerial triangulation) densification on the image data to obtain aerial triangulation data;
selecting control points from the scanned data to correct the aerial triangulation data;
fusing the image data and the scanning data to obtain fused point cloud data;
modeling the point cloud data.
2. The method of claim 1, wherein the space-three densification of the image data comprises: extracting feature points; selecting same-name (homologous) image pairs; relative orientation; matching tie points; and area network bundle adjustment.
3. The method of claim 1, wherein the space-three densification of the image data to obtain the aerial triangulation data comprises:
acquiring feature points of all images in the image data;
projecting all the images onto a three-dimensional terrain according to each image's exterior orientation elements, camera parameters and projection coordinates, and dividing all the images into a plurality of space-three blocks, wherein each space-three block constitutes a space-three solution task group;
distributing the space-three solution task groups to the respective compute nodes so that each node performs adjustment calculation on its corresponding space-three block, wherein each task group comprises the information of the feature points;
and receiving the space-three solution results returned by each compute node, merging the space-three blocks into an integral area network according to these results, and performing joint adjustment calculation on the integral area network to obtain the aerial triangulation data.
4. The method according to claim 3, wherein the merging the space-three blocks into an integral area network according to the space-three solution results comprises:
calculating the degree of overlap between the blocks corresponding to each space-three block;
and correcting, based on a point-feature global matching algorithm, according to the degree of overlap and each space-three solution result, and merging the space-three blocks into an integral area network.
5. The method of claim 1, wherein the modeling the point cloud data comprises:
selecting an appropriate maximum memory according to the performance of the computer, and performing tile (block) setting;
and performing texture mapping processing to obtain the finished model.
6. The method of claim 5, wherein the performing block setting comprises: selecting a region of interest within the photographed area, and splitting it into a plurality of smaller areas.
7. A three-dimensional live-action modeling device for an offshore oil platform, characterized by comprising:
an acquisition module, configured to acquire image data collected by an unmanned aerial vehicle flying around the offshore oil platform, and scan data obtained by scanning the offshore oil platform with a three-dimensional laser scanner;
an encryption module, configured to perform aerial triangulation ('space-three') densification on the image data to obtain aerial triangulation data;
a correction module, configured to select control points from the scan data to correct the aerial triangulation data;
a fusion module, configured to fuse the image data and the scan data to obtain fused point cloud data;
and a modeling module, configured to model the point cloud data.
8. The apparatus of claim 7, wherein the modeling module comprises:
a blocking module, configured to select an appropriate maximum memory size according to the performance of the computer and perform block partitioning;
and a texture mapping module, configured to perform texture mapping processing to obtain the finished model.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the three-dimensional live-action modeling method based on offshore oil and gas field equipment facilities as claimed in any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010389171.3A CN111476893A (en) | 2020-05-09 | 2020-05-09 | Three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facility |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111476893A true CN111476893A (en) | 2020-07-31 |
Family
ID=71763182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010389171.3A Pending CN111476893A (en) | 2020-05-09 | 2020-05-09 | Three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facility |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111476893A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120013710A1 (en) * | 2009-02-19 | 2012-01-19 | Dimensional Perception Technologies Ltd. | System and method for geometric modeling using multiple data acquisition means |
CN105931234A (en) * | 2016-04-19 | 2016-09-07 | 东北林业大学 | Ground three-dimensional laser scanning point cloud and image fusion and registration method |
CN107907111A (en) * | 2017-11-14 | 2018-04-13 | 泰瑞数创科技(北京)有限公司 | A kind of automatic distributed aerial triangulation calculation method |
CN109978791A (en) * | 2019-03-28 | 2019-07-05 | 苏州市建设工程质量检测中心有限公司 | A kind of bridge monitoring methods merged based on oblique photograph and 3 D laser scanning |
2020-05-09: application CN202010389171.3A filed in China; status: Pending
Non-Patent Citations (3)
Title |
---|
Zhang Ping (张平): "Research on the Fused Application of Oblique Photography and Laser Scanning Technology in Urban 3D Modeling", Urban Geotechnical Investigation & Surveying (《城市勘测》) * |
Li Weizhe (李伟哲): "Building Real-Scene 3D Models in Canyon Terrain", Shanxi Water Resources (《山西水利》) * |
Lin Xiaobo (林小波): "Research on Image Control Point Layout Schemes in Orthorectification of Pléiades Satellite Imagery", Geospatial Information (《地理空间信息》) * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111951402A (en) * | 2020-08-18 | 2020-11-17 | 北京市测绘设计研究院 | Three-dimensional model generation method, device, computer equipment and storage medium |
CN111951402B (en) * | 2020-08-18 | 2024-02-23 | 北京市测绘设计研究院 | Three-dimensional model generation method, three-dimensional model generation device, computer equipment and storage medium |
CN113284247A (en) * | 2021-05-21 | 2021-08-20 | 泰瑞数创科技(北京)有限公司 | Three-dimensional modeling method, system and storage medium for ocean engineering equipment |
CN113284247B (en) * | 2021-05-21 | 2021-12-21 | 泰瑞数创科技(北京)有限公司 | Three-dimensional modeling method, system and storage medium for ocean engineering equipment |
CN114757983A (en) * | 2022-04-27 | 2022-07-15 | 四川大学 | Unmanned aerial vehicle and three-dimensional laser scanning combined monitoring method |
CN116704137A (en) * | 2023-07-27 | 2023-09-05 | 山东科技大学 | A deep learning reverse modeling method for offshore oil drilling platform point cloud |
CN116704137B (en) * | 2023-07-27 | 2023-10-24 | 山东科技大学 | Reverse modeling method for point cloud deep learning of offshore oil drilling platform |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111476893A (en) | Three-dimensional live-action modeling method and device based on offshore oil and gas field equipment facility | |
CA2861555C (en) | Densifying and colorizing point cloud representation of physical surface using image data | |
WO2019127445A1 (en) | Three-dimensional mapping method, apparatus and system, cloud platform, electronic device, and computer program product | |
Barazzetti et al. | True-orthophoto generation from UAV images: Implementation of a combined photogrammetric and computer vision approach | |
CN107907111B (en) | Automatic distributed aerial triangulation calculation method | |
CN106611441B (en) | The treating method and apparatus of three-dimensional map | |
CN111221933A (en) | Three-dimensional tile construction method for fusion of massive map data and building information model | |
CN109559349A (en) | A kind of method and apparatus for calibration | |
CN113034347B (en) | Oblique photography image processing method, device, processing equipment and storage medium | |
CN114757834B (en) | Panoramic image processing method and panoramic image processing device | |
CN113920275B (en) | Triangular mesh construction method and device, electronic equipment and readable storage medium | |
CN110825079A (en) | Map construction method and device | |
CN118212405B (en) | Construction method and device of 3D target detection model based on LiDAR point cloud and RGB image | |
CN105953777B (en) | A kind of large scale based on depth map tilts image plotting method | |
CN113034681B (en) | Three-dimensional reconstruction method and device for spatial plane relation constraint | |
CN108648141B (en) | Image splicing method and device | |
CN112613107A (en) | Method and device for determining construction progress of tower project, storage medium and equipment | |
CN113418448B (en) | Fragment distribution detection system and method | |
CN116091610B (en) | Combined calibration method of radar and camera based on three-dimensional tower type checkerboard | |
Skuratovskyi et al. | Outdoor mapping framework: from images to 3d model | |
Shragai et al. | Automatic tie-point extraction using advanced approaches | |
CN118262030B (en) | Automatic pier point cloud coloring method and system based on deep learning and optical photo | |
CN115937431A (en) | Inclined three-dimensional model construction method, inclined three-dimensional model construction system, computer and storage medium | |
CN118982632A (en) | Real-time 3D reconstruction method and device based on UAV flight | |
CN115661253A (en) | Unmanned aerial vehicle multi-view positioning method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200731 |