CN108282615B - Method and system for scanning surrounding environment
- Publication number
- CN108282615B (application CN201810084872.9A)
- Authority
- CN
- China
- Prior art keywords
- data
- scanning
- environment
- environment data
- ambient environment
- Prior art date
- Legal status: Active
Classifications
- H04N23/60—Control of cameras or camera modules
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
- H04N23/45—Cameras or camera modules generating image signals from two or more image sensors of different type or operating in different modes, e.g. a CMOS sensor for moving images combined with a charge-coupled device [CCD] for still images
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects
Abstract
The invention discloses a method and system for scanning a surrounding environment. The method comprises the following steps: capturing surrounding environment data in multiple directions simultaneously; processing the captured surrounding environment data, including stitching the multi-directional data in real time; and generating composite surrounding environment information from the processing result. The system includes a main frame, a plurality of data capture modules, and a processing module. The invention captures surrounding environment data in multiple directions at the same time, is not limited by detection range or collected data, is more efficient, and ensures scanning accuracy; when processing the captured data, it stitches the multi-directional data in real time, integrating multiple scans with better real-time performance. The invention can be widely applied in the field of environment scanning.
Description
Technical Field
The invention relates to the field of environment scanning, and in particular to a method and system for scanning an indoor surrounding environment.
Background
In recent years, laser and computer technologies have developed rapidly, and environment scanning is increasingly applied to environment recognition, navigation, positioning, and the like. Google Earth and Google Street View, for example, provide high-precision 360-degree panoramic photos tied to GPS positioning information, greatly simplifying navigation, path planning, and other tasks for users, and such techniques have expanded into many fields involving spatial distribution, such as natural environment monitoring and analysis, resource survey and development, and traffic navigation. However, most current environment scanning technologies target outdoor environments; scanning schemes for indoor environments are rare. Driven by large application demands in digital cities, emergency-response simulation and training, digital cultural heritage, and exhibitions (especially typical applications such as counter-terrorism, fire fighting, exhibition, and intelligent buildings), the indoor environment information obtained by indoor scanning is indispensable basic data. Unlike outdoor environments, indoor environments feature short distances, many corners, frequent occlusion, complex illumination, and a lack of absolute positioning, so acquiring indoor environment information efficiently and accurately is a challenging subject. The widely used solution today is to scan the indoor environment manually with a scanning device, but this manual approach is inefficient (especially for large-scale indoor scenes, such as scanning a 10,000-square-meter museum) and makes scanning accuracy hard to guarantee. Moreover, being limited by detection range and collected data, the manual approach cannot scan indoor environment data in multiple directions at the same time or integrate multiple scans, so its real-time performance is poor.
Disclosure of Invention
To solve the above technical problems, the present invention aims to provide a method and system for scanning a surrounding environment that are efficient, high-precision, and real-time.
The first technical solution adopted by the invention is as follows:
A method for scanning a surrounding environment, comprising the following steps:
capturing surrounding environment data in multiple directions simultaneously;
processing the captured surrounding environment data, wherein the processing includes stitching the multi-directional data in real time;
and generating composite surrounding environment information from the processing result.
Further, the number of the multiple directions is 4.
Further, the multi-directional surrounding environment data is captured by an autonomous robot.
Further, the autonomous robot includes at least 4 cameras for capturing surrounding environment data from 4 directions.
Further, the camera comprises a fisheye lens, and the fisheye lens is used for capturing surrounding environment data with an aspherical structure.
Further, the surrounding environment is an indoor environment.
Further, the surrounding environment data from the multiple directions includes environment data of an overlapping area, the overlapping area being the capture area shared when capturing surrounding environment data in different directions.
Further, the processing further comprises the following step:
matching and identifying the environment data of the overlapping area, and recapturing the environment data of the overlapping area when the data captured from the overlapping area differ.
The second technical solution adopted by the invention is as follows:
A system for scanning a surrounding environment, comprising:
a main frame for providing support;
a plurality of data capture modules installed on the main frame for capturing environment data in multiple directions;
and a processing module communicatively connected to the plurality of data capture modules, for stitching the environment data from the multiple directions to generate composite surrounding environment information.
Further, the main frame, the plurality of data capture modules, and the processing module constitute an autonomous robot.
Further, each of the plurality of data capture modules is a camera.
Further, the camera includes a fisheye lens for capturing spherical images.
Further, the composite surrounding environment information is generated in real time.
The invention has the following beneficial effects: the method and system capture surrounding environment data in multiple directions at the same time, are not limited by detection range or collected data, are more efficient, and ensure scanning accuracy; when processing the captured data, they stitch the multi-directional data in real time, integrating multiple scans with better real-time performance.
Drawings
FIG. 1 is a schematic structural diagram of a preferred embodiment of the surrounding environment scanning system of the present invention;
FIG. 2 is a schematic diagram of the scanning system of a preferred embodiment of the present invention scanning an indoor environment;
FIG. 3 is a block diagram of the internal structure of the processing module of the present invention;
FIG. 4 is an overall flowchart of the method for scanning the surrounding environment according to an embodiment of the present invention.
Detailed Description
The invention is further explained and illustrated below with reference to the drawings and the embodiments.
Referring to FIG. 1, the scanning system of this embodiment, which generates real-time surrounding environment information, may be controlled manually, autonomously by a robot, or by a combination of the two, and may also be controlled remotely through a mobile application. As shown in FIG. 1, the scanning system mainly includes a main frame 102 and a plurality of support legs 104. A plurality of data capture modules 106 are mounted on the main frame 102, and each of the data capture modules 106 is configured to capture environment data in a corresponding direction. The main frame 102 may be made of any one or a combination of wood, metal, alloy, plastic, rubber, and fiber. The shape of the main frame 102 shown in FIG. 1 is for illustration only; those skilled in the art will appreciate that the main frame 102 may have any shape and size. The support legs 104 provide support for the main frame 102 and adjust the height of the data capture modules 106, so that at a given height the data capture modules 106 can scan the entire environment area in multiple directions in one pass.
Further as a preferred embodiment, each of the plurality of data capture modules is a camera.
Further as a preferred embodiment, the camera comprises a fisheye lens. The fisheye lens captures a spherical view or an aspherical-structure view of the corresponding direction, so as to highlight an area of interest and enhance visual impact.
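The disclosure does not specify a lens model; as a non-limiting illustration, the following minimal Python sketch shows the common equidistant fisheye model (r = f * theta) that such a lens could follow, mapping an image pixel to a viewing ray. The principal point (cx, cy) and focal length f are assumed calibration parameters, not values from this disclosure.

```python
import numpy as np

def fisheye_pixel_to_ray(u, v, cx, cy, f):
    """Map a pixel (u, v) of an equidistant fisheye image to a unit
    viewing ray under the common r = f * theta model. The principal
    point (cx, cy) and focal length f (in pixels) are assumed
    calibration values, not parameters given in this disclosure."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)          # radial distance from the image center
    theta = r / f                 # angle away from the optical axis
    phi = np.arctan2(dy, dx)      # azimuth around the optical axis
    # Spherical -> Cartesian, with z along the optical axis.
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
```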
Further as a preferred embodiment, the plurality of support legs comprises at least one moving device for moving the entire scanning system. The moving device may be wheels that can roll freely in any direction, so the entire scanning system can be driven to move autonomously (the robot-controlled mode) or under control (the manual mode) to a target position for real-time motion scanning, solving the problem that the existing manual scanning mode cannot perform real-time motion scanning.
The scanning system of this embodiment further includes a processing module (not shown in FIG. 1) communicatively connected to the plurality of data capture modules 106, for receiving the multi-directional environment data captured by the data capture modules 106 and performing further processing (such as alignment and stitching).
Referring to FIG. 2, the scanning system of this embodiment may employ 4 data capture modules. As shown in FIG. 2, the 4 data capture modules 106 capture data from the indoor environment and send it to the processing module 204 for further processing.
Further as a preferred embodiment, the outer side of each of the 4 data capture modules 106 (data capture modules 106A, 106B, 106C, and 106D) may have a corresponding scan extension 202A-202D (collectively, scan extensions 202). The scan extensions 202 widen the field of view of the cameras (e.g., fisheye cameras) so that no indoor environment data is missed during scanning, thereby improving scanning quality.
Further as a preferred embodiment, the 4 scan extensions 202 corresponding to the 4 data capture modules 106 may overlap one another so that no detailed data is missed in an indoor environment scan. The overlap of the scan extensions 202 may be set by fixing them manually, or established automatically by the processing module 204 before scanning begins. The 4 data capture modules 106 can then start the data capture process together.
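As a non-limiting illustration of why the scan extensions overlap, the sketch below computes the angular overlap between adjacent capture modules. The 4-module spacing follows FIG. 2, while the 200-degree fisheye field of view is an assumed value, not taken from this disclosure.

```python
def extension_overlap_deg(num_modules=4, fov_deg=200.0):
    """Angular overlap (degrees) between adjacent capture modules spaced
    evenly around the main frame. With 4 modules every 90 degrees, any
    field of view above 90 degrees leaves a shared region; the
    200-degree fisheye FOV here is an assumption, not a patent value."""
    spacing = 360.0 / num_modules
    return fov_deg - spacing  # positive value = adjacent views overlap

# With the assumed values: 200 - 90 = 110 degrees shared with each neighbor.
assert extension_overlap_deg() == 110.0
```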
Referring to FIG. 3, the internal components of the processing module 204 of this embodiment include a camera control module 302, a camera data capture module 304, a data comparison module 306, a compound environment generator 308, and a memory 310. The processing module 204 may be a stand-alone processor containing these components as hardware, or the components may be implemented in software. The processing module 204 may also be a collection of multiple processors that together provide the same functions as a single processor.
The camera control module 302 controls each of the plurality of data capture modules or cameras. It may set camera parameters such as field of view, angle, diffusion parameters, and motion parameters within certain limits.
The camera data capture module 304 receives the multi-directional environment data captured by the data capture modules 106, identifies which data capture module captured each piece of received data, and stores the data in the memory 310 in a time list kept under the respective data capture module.
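For illustration only, a minimal sketch of such a per-module time list follows; the class and method names are hypothetical, and the real module would write to the memory 310 rather than to an in-process dictionary.

```python
import time
from collections import defaultdict

class CaptureStore:
    """Illustrative stand-in for how the camera data capture module 304
    could file frames in memory 310: one time-ordered list per data
    capture module. Class and method names are hypothetical."""

    def __init__(self):
        self._time_lists = defaultdict(list)  # module_id -> [(timestamp, frame)]

    def record(self, module_id, frame):
        # Identify the source module and append under its own time list.
        self._time_lists[module_id].append((time.time(), frame))

    def latest(self, module_id):
        entries = self._time_lists[module_id]
        return entries[-1] if entries else None
```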
The data comparison module 306 compares data from the plurality of data capture modules, identifies differences in the data, and stores the identified differences in a corresponding difference list in the memory 310. The data comparison module 306 also examines and compares the data captured from the overlap regions of the scan extensions; if data from the same overlap region but from different data capture modules differ, the overlap region is scanned again using the same data capture modules, so that the data for the overlap region remains consistent.
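The disclosure states only that differing overlap-region data triggers a rescan; the mean-absolute-difference test and threshold in the following sketch are illustrative assumptions for how the data comparison module 306 might detect such a difference.

```python
import numpy as np

def overlap_data_differs(frame_a, frame_b, threshold=10.0):
    """Assumed difference test: mean absolute pixel difference between
    the overlap-region crops captured by two different modules. The
    metric and threshold are illustrative; the disclosure only requires
    that a difference be detected."""
    diff = np.mean(np.abs(frame_a.astype(float) - frame_b.astype(float)))
    return diff > threshold

def reconcile_overlap(module_a, module_b, region):
    """Rescan the overlap region until both modules agree, as the data
    comparison module 306 does. capture() is a hypothetical interface."""
    a, b = module_a.capture(region), module_b.capture(region)
    while overlap_data_differs(a, b):
        a, b = module_a.capture(region), module_b.capture(region)
    return a
```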
The compound environment generator 308 aligns and stitches together the various views of the indoor surrounding environment (the data from the multiple data capture modules), and generates, in real time, a composite three-dimensional map of the indoor environment area, which is stored in the memory 310.
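The disclosure names no stitching algorithm; as a rough, assumed stand-in for the alignment-and-stitching step of the compound environment generator 308, the sketch below uses OpenCV's high-level Stitcher, which produces a panorama rather than the full three-dimensional map described above.

```python
import cv2

def stitch_views(frames):
    """Align and stitch the multi-directional views into one composite
    image. OpenCV's Stitcher is an assumed substitute for the unnamed
    algorithm of generator 308."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, composite = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return composite
```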
Referring to FIG. 4, the specific flow of the method for scanning a surrounding environment according to this embodiment is as follows:
step 402: initially, the plurality of data capture modules 106 are launched by the processing module 204 to begin scanning and begin capturing ambient data. Instructions for controlling the plurality of data capture modules 106 of the present embodiment may be stored in the memory 310. Additionally, the processing module 204 of the present embodiment may also receive instructions from a remote device (such as a mobile application) to control the plurality of data capture modules 106.
Step 404: the data captured by the plurality of data capture modules 106 is processed. Specifically, step 404 may include step 4042: matching/aligning the data captured from the multiple directions and stitching them together.
Step 406: composite environment data is generated from the stitched data; the composite environment data describes the indoor surrounding environment as a three-dimensional view.
Step 408: the generated composite environmental data is updated according to the new environmental information captured by the plurality of data capture modules 106.
Steps 402-408 are repeated at predetermined intervals until the scan is complete, that is, until the captured data has not changed over a predetermined number of subsequent periodic updates. In addition, as described above, scanning may also be stopped manually by an instruction from the remote device.
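Tying steps 402-408 together, a minimal top-level loop might look like the following sketch. The module and processor interfaces, the scan interval, and the number of unchanged rounds used as the stopping condition are all illustrative assumptions.

```python
import time

def run_scan(modules, processor, interval_s=1.0, stable_rounds=3):
    """Illustrative top-level loop over steps 402-408; interfaces and
    stopping parameters are assumptions, not taken from the patent."""
    unchanged, previous = 0, None
    while unchanged < stable_rounds:
        frames = [m.capture() for m in modules]              # step 402
        stitched = processor.align_and_stitch(frames)        # step 404/4042
        composite = processor.generate_composite(stitched)   # step 406
        if previous is not None and processor.unchanged(composite, previous):
            unchanged += 1                                   # step 408: no new data
        else:
            unchanged = 0                                    # composite updated
        previous = composite
        time.sleep(interval_s)
    return previous
```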
The above flowchart and/or block diagrams of methods and systems describe in detail embodiments of the invention. It will be understood by those within the art that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the actions specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an apparatus that implements the action specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the actions or steps specified in the flowchart and/or block diagram block or blocks.
In addition, the step numbers or the module numbers in the embodiments of the present invention are provided only for convenience of illustration, the order of the steps or the connection relationship between the modules is not limited at all, and the execution order of the steps and the connection relationship between the modules in the embodiments can be adaptively adjusted according to the understanding of those skilled in the art.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (9)
1. A method for scanning a surrounding environment, characterized by comprising the following steps:
capturing surrounding environment data in multiple directions at the same time and at the same height, the height being adjustable;
processing the captured surrounding environment data, wherein the processing comprises aligning and stitching the multi-directional surrounding environment data in real time;
generating composite surrounding environment information according to the processing result;
wherein the surrounding environment data from the multiple directions comprises environment data of an overlapping area, the overlapping area being the capture area shared when capturing surrounding environment data in different directions; and the environment data of the overlapping area is data captured by the data capture module from the overlap region of the scan extension located on the outer side of the data capture module.
2. The surrounding environment scanning method of claim 1, characterized in that the number of the multiple directions is 4.
3. The surrounding environment scanning method of claim 2, characterized in that the multi-directional surrounding environment data is captured by an autonomous robot.
4. The surrounding environment scanning method of claim 3, characterized in that the autonomous robot includes at least 4 cameras for capturing surrounding environment data from 4 directions.
5. The surrounding environment scanning method of claim 4, characterized in that the camera comprises a fisheye lens for capturing surrounding environment data with an aspherical structure.
6. The surrounding environment scanning method of claim 5, characterized in that the processing further comprises the following step:
matching and identifying the environment data of the overlapping area, and recapturing the environment data of the overlapping area when the data captured from the overlapping area differ.
7. A system for scanning a surrounding environment, characterized by comprising:
a main frame for providing support, the height of the main frame being adjustable;
a plurality of data capture modules installed on the main frame for capturing environment data in multiple directions at the same height;
a processing module communicatively connected to the plurality of data capture modules, for aligning and stitching the environment data from the multiple directions to generate composite surrounding environment information;
wherein the surrounding environment data from the multiple directions comprises environment data of an overlapping area, the overlapping area being the capture area shared when capturing surrounding environment data in different directions; and the environment data of the overlapping area is data captured by the data capture module from the overlap region of the scan extension located on the outer side of the data capture module.
8. The surrounding environment scanning system of claim 7, characterized in that each of the plurality of data capture modules is a camera.
9. The surrounding environment scanning system of claim 8, characterized in that the camera includes a fisheye lens for capturing spherical images.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762580468P | 2017-11-02 | 2017-11-02 | |
US62/580,468 | 2017-11-02 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108282615A CN108282615A (en) | 2018-07-13 |
CN108282615B true CN108282615B (en) | 2021-01-05 |
Family
ID=62805568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810084872.9A Active CN108282615B (en) | 2017-11-02 | 2018-01-29 | Method and system for scanning surrounding environment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108282615B (en) |
WO (1) | WO2019085497A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112017407B (en) * | 2020-09-08 | 2022-11-29 | 中国科学院上海微系统与信息技术研究所 | Integrated wireless ground mark monitoring device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5023725A (en) * | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US5137238A (en) * | 1991-09-09 | 1992-08-11 | Hutten Friedrich W | Fast access camera mounting device |
CN1745337A (en) * | 2002-12-05 | 2006-03-08 | 索尼株式会社 | camera equipment |
CN106572313A (en) * | 2016-10-31 | 2017-04-19 | 腾讯科技(深圳)有限公司 | Video recording box and video recording method and device |
CN107027338A (en) * | 2014-05-06 | 2017-08-08 | 扎卡里亚·尼亚齐 | Imaging systems, methods and applications |
CN107219615A (en) * | 2017-07-31 | 2017-09-29 | 武汉赫天光电股份有限公司 | Panoramic optical systems and electronic equipment |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101437155A (en) * | 2007-11-12 | 2009-05-20 | 詹雨明 | Omnidirection circumstance video monitoring system |
CN101577795A (en) * | 2009-06-17 | 2009-11-11 | 深圳华为通信技术有限公司 | Method and device for realizing real-time viewing of panoramic picture |
US10531071B2 (en) * | 2015-01-21 | 2020-01-07 | Nextvr Inc. | Methods and apparatus for environmental measurements and/or stereoscopic image capture |
US9787896B2 (en) * | 2015-12-29 | 2017-10-10 | VideoStitch Inc. | System for processing data from an omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
CN105678729B (en) * | 2016-02-24 | 2018-03-09 | 段梦凡 | Fish eye lens Panorama Mosaic method |
CN106161937A (en) * | 2016-07-23 | 2016-11-23 | 徐荣婷 | A kind of panoramic shooting machine people |
Events:
- 2018-01-29: CN application CN201810084872.9A filed; granted as CN108282615B (active)
- 2018-06-15: PCT application PCT/CN2018/091535 filed (published as WO2019085497A1)
Also Published As
Publication number | Publication date |
---|---|
WO2019085497A1 (en) | 2019-05-09 |
CN108282615A (en) | 2018-07-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102647449B (en) | Based on the intelligent photographic method of cloud service, device and mobile terminal | |
CN112470092B (en) | Surveying and mapping system, surveying and mapping method, device, equipment and medium | |
CN109387186B (en) | Surveying and mapping information acquisition method and device, electronic equipment and storage medium | |
CN108234927B (en) | Video tracking method and system | |
RU2741443C1 (en) | Method and device for sampling points selection for surveying and mapping, control terminal and data storage medium | |
CN112469967B (en) | Mapping system, mapping method, mapping device, mapping apparatus, and recording medium | |
WO2017133147A1 (en) | Live-action map generation method, pushing method and device for same | |
CN115641401A (en) | Construction method and related device of three-dimensional live-action model | |
CN111868656B (en) | An operation control system, operation control method, device, equipment and medium | |
CN106210647A (en) | Based on the method and system building base station coverage area full-view image of taking photo by plane | |
US11943539B2 (en) | Systems and methods for capturing and generating panoramic three-dimensional models and images | |
CN103763470A (en) | Portable scene shooting device | |
CN113001985A (en) | 3D model, device, electronic equipment and storage medium based on oblique photography construction | |
CN110969696A (en) | Method and system for three-dimensional modeling rapid space reconstruction | |
EP3882846B1 (en) | Method and device for collecting images of a scene for generating virtual reality data | |
CN108287345A (en) | Spacescan method and system based on point cloud data | |
CN111527375B (en) | Planning method and device for surveying and mapping sampling point, control terminal and storage medium | |
CN108282615B (en) | Method and system for scanning surrounding environment | |
CN108364340A (en) | The method and system of synchronous spacescan | |
KR101963449B1 (en) | System and method for generating 360 degree video | |
CN112504263A (en) | Indoor navigation positioning device based on multi-view vision and positioning method thereof | |
CN117332370A (en) | Underwater target acousto-optic panorama cooperative identification device and identification method | |
US20160349409A1 (en) | Photovoltaic shade impact prediction | |
CN108287549A (en) | A kind of method and system improving spacescan time performance | |
Kossieris et al. | Developing a low-cost system for 3D data acquisition |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2021-01-05 | GR01 | Patent grant | Granted publication date |
2022-07-04 | PP01 | Preservation of patent right | Effective date of registration |