
CN117278734B - Rocket launching immersive viewing system - Google Patents


Info

Publication number
CN117278734B
CN117278734B (application CN202311550185.9A)
Authority
CN
China
Prior art keywords
rocket
image
target
monitoring image
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311550185.9A
Other languages
Chinese (zh)
Other versions
CN117278734A
Inventor
刘百奇
刘建设
夏东坤
何艳玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xinghe Power Aerospace Technology Co ltd
Beijing Xinghe Power Equipment Technology Co Ltd
Anhui Galaxy Power Equipment Technology Co Ltd
Galactic Energy Shandong Aerospace Technology Co Ltd
Jiangsu Galatic Aerospace Technology Co Ltd
Original Assignee
Beijing Xinghe Power Aerospace Technology Co ltd
Beijing Xinghe Power Equipment Technology Co Ltd
Anhui Galaxy Power Equipment Technology Co Ltd
Galactic Energy Shandong Aerospace Technology Co Ltd
Jiangsu Galatic Aerospace Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xinghe Power Aerospace Technology Co ltd, Beijing Xinghe Power Equipment Technology Co Ltd, Anhui Galaxy Power Equipment Technology Co Ltd, Galactic Energy Shandong Aerospace Technology Co Ltd, Jiangsu Galatic Aerospace Technology Co Ltd filed Critical Beijing Xinghe Power Aerospace Technology Co ltd
Priority to CN202311550185.9A priority Critical patent/CN117278734B/en
Publication of CN117278734A publication Critical patent/CN117278734A/en
Application granted granted Critical
Publication of CN117278734B publication Critical patent/CN117278734B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H04N13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/017: Head mounted
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106: Processing image signals
    • H04N13/156: Mixing image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H04N13/363: Image reproducers using image projection screens
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a rocket launching immersive viewing system, which relates to the technical field of aerospace and comprises the following components: a head-mounted display device, used for determining the user's viewing angle of the target rocket based on the user's input instruction and displaying a flight viewing picture of the target rocket to the user; and a viewing service device, used for acquiring flight state data of the target rocket and an external monitoring image; generating a virtual rocket body model of the target rocket based on the flight state data; generating a virtual scene model based on the external monitoring image; generating the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle; and sending the flight viewing picture to the head-mounted display device. The system enables a user to remotely watch the whole rocket launching process without visiting the launch site, can satisfy the user's wish and need to watch, and improves the user's rocket launching viewing experience.

Description

Rocket launching immersive viewing system
Technical Field
The application relates to the technical field of aerospace, in particular to a rocket launching immersive viewing system.
Background
With the gradual rise of commercial aerospace, the service offerings of commercial launch vehicle companies are becoming increasingly rich.
After learning publicity information that a rocket is about to be launched, users who are fans and supporters of commercial aerospace hope to watch the launch on site. Some users can travel to the launch site to watch the rocket launch, while others who also wish and want to watch cannot attend in person due to time or physical constraints, which leaves them with great regret.
Therefore, how to enable a user to remotely watch the rocket launching process is a technical problem to be solved in the industry.
Disclosure of Invention
The application provides a rocket launching immersive viewing system which is used for solving the technical problem of how to enable a user to remotely view the launching process of a rocket.
The application provides a rocket launching immersive viewing system, comprising:
The head-mounted display device is used for determining the user's viewing angle of the target rocket based on the user's input instruction and displaying the flight viewing picture of the target rocket to the user;
The viewing service device is used for acquiring flight state data of the target rocket and an external monitoring image; generating a virtual rocket body model of the target rocket based on the flight state data; generating a virtual scene model based on the external monitoring image; generating the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle; and sending the flight viewing picture to the head-mounted display device.
In some embodiments, the viewing service device comprises:
The man-machine interaction module is in communication connection with the head-mounted display device and is used for receiving the viewing angle sent by the head-mounted display device and sending the flight viewing picture to the head-mounted display device;
the data communication module is used for acquiring flight state data of the target rocket and an external monitoring image;
The picture generation module is used for generating a virtual rocket body model of the target rocket based on the flight state data and generating a virtual scene model based on the external monitoring image; and generating the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle.
In some embodiments, the data communication module is in communication connection with a ground control center, and is used for acquiring flight state data of the target rocket and an off-rocket monitoring image acquired by a camera arranged outside the target rocket;
the data communication module is in communication connection with ground monitoring equipment and is used for acquiring ground monitoring images of the target rocket;
the data communication module is in communication connection with air monitoring equipment and is used for acquiring an air monitoring image of the target rocket;
the data communication module is in communication connection with a monitoring satellite and is used for acquiring satellite monitoring images of the target rocket.
In some embodiments, the picture generation module is used to determine the external monitoring image based on at least one of the off-rocket monitoring image, the ground monitoring image, the air monitoring image and the satellite monitoring image.
In some embodiments, the picture generation module is specifically configured to:
Determining image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image;
And determining the flight viewing picture based on any external monitoring image whose image resolution is greater than or equal to a preset resolution.
In some embodiments, the picture generation module is specifically configured to:
identifying the target rocket in any external monitoring image, and determining an image area where the target rocket is located;
and determining any external monitoring image as the flight viewing picture of the target rocket when the ratio of the number of pixels of the image area where the target rocket is located to the total number of pixels of that external monitoring image is greater than or equal to a preset ratio.
In some embodiments, the picture generation module is specifically configured to:
identifying the target rocket in any external monitoring image, and determining an image area where the target rocket is located;
Determining any external monitoring image as a real scene image under the condition that the ratio of the number of pixels of the image area where the target rocket is located to the number of pixels of any external monitoring image is smaller than a preset ratio;
Projecting a virtual rocket body model of the target rocket into the real scene image based on the viewing angle to generate an augmented reality picture;
and determining the augmented reality picture as a flight view picture of the target rocket.
In some embodiments, the picture generation module is specifically configured to:
Determining image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image;
Under the condition that the image resolution of each external monitoring image is smaller than the preset resolution, constructing the virtual scene model based on the external monitoring image with the highest image resolution;
superposing a virtual rocket body model of the target rocket in the virtual scene model, and projecting a model superposition result based on the viewing angle to generate a virtual reality picture;
and determining the virtual reality picture as a flight view picture of the target rocket.
In some embodiments, the picture generation module is specifically configured to:
Inputting the flight state data into a preset data model corresponding to the target rocket to obtain a virtual rocket body model of the target rocket;
The preset data model is determined based on historical flight state data of rockets of the same type as the target rocket and rocket body change data of such rockets during the launching process.
In some embodiments, the head mounted display device is specifically configured to:
Acquiring the user's voice or limb actions; the limb actions include at least one of gestures, head postures and eye movements;
based on the voice or the limb action, an input instruction of the user is determined.
The application provides a rocket launching immersive viewing system comprising a head-mounted display device and a viewing service device. The head-mounted display device determines the user's viewing angle of the target rocket and displays the flight viewing picture of the target rocket to the user. The viewing service device acquires flight state data of the target rocket and an external monitoring image, generates a virtual rocket body model and a virtual scene model of the target rocket, and generates the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle. By processing the flight state data and the external monitoring image of the target rocket, generating the virtual rocket body model and the virtual scene model respectively, generating the flight viewing picture, and displaying it to the user through the head-mounted display device, the system enables the user to remotely watch the whole rocket launching process without approaching the launch site, satisfies the user's wish and need to watch, and improves the user's rocket launching viewing experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the application or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the application, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic view of a rocket launching immersive viewing system provided by the present application;
fig. 2 is a schematic structural diagram of a viewing service device provided by the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present application, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like herein are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules that are expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic structural diagram of a rocket-launching immersive viewing system provided by the present application, and as shown in fig. 1, the rocket-launching immersive viewing system 100 includes a head-mounted display device 110 and a viewing service device 120.
A head-mounted display device 110 for determining a viewing angle of the target rocket by the user based on an input instruction of the user, and displaying a flying viewing screen of the target rocket to the user;
A viewing service device 120 for acquiring flight state data of the target rocket and an external monitoring image; generating a virtual rocket body model of the target rocket based on the flight state data; generating a virtual scene model based on the external monitoring image; generating a flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle; and sending the flight viewing picture to the head-mounted display device 110.
Specifically, the embodiment of the application provides a rocket launching immersive viewing system whose application scenario is to provide a rocket launch viewing service for users who are not at the rocket launch site. The target rocket is a rocket in the launching process, and the launching process is live-broadcast or otherwise displayed to each user through the rocket launching immersive viewing system.
The head-mounted display device is worn on the head of a user, and can display images, videos or virtual contents in a head-mounted manner in front of the eyes of the user. The head-mounted display device may include devices such as smart helmets and smart glasses.
Structurally, the head-mounted display device at least comprises a display module, a motion sensing module, an input module and a playback module. The display module displays the flight viewing picture of the target rocket to the user. The motion sensing module senses the user's limb actions (such as head postures, eyeball movements or gestures). These limb actions may be used to generate different operation instructions, such as switching the viewing angle of the picture, zooming in on the picture, zooming out of the picture, or pausing the picture. The input module acquires the user's input (such as voice input or action input) and may be a touch screen or a microphone. The playback module plays the sound associated with the flight viewing picture to the user.
The viewing angle is the viewing angle or viewing direction of the user to the target rocket. Since the user views by means of the head-mounted display device, the user can select the viewing angle of interest by inputting an instruction, and a corresponding flight viewing picture is acquired. The input instructions may be generated by head gestures, voice and touch actions, and the like.
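As an illustrative sketch only (the action names and instruction names below are assumptions, not taken from the application), the mapping from sensed limb actions to operation instructions could look like:

```python
# Hypothetical mapping from sensed limb actions to operation instructions.
# All action and instruction names here are illustrative assumptions.
GESTURE_TO_INSTRUCTION = {
    "head_turn_left": "switch_view_left",    # head posture changes the angle
    "head_turn_right": "switch_view_right",
    "pinch_out": "zoom_in",                  # hand gesture zooms the picture
    "pinch_in": "zoom_out",
    "palm_forward": "pause",
}

def to_instruction(action: str) -> str:
    """Translate a sensed limb action into an operation instruction."""
    return GESTURE_TO_INSTRUCTION.get(action, "noop")
```

An unrecognized action falls through to a no-op so that stray sensor readings do not disturb playback.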
The viewing service device may be connected to a plurality of head mounted display devices, in particular may be connected via a network. The viewing service devices may be computers, servers, etc.
The viewing service device can be connected with the ground control center and used for acquiring flight state data of the target rocket. The flight state data is data describing the flight state of the target rocket, and can comprise flight altitude, flight speed, flight acceleration, flight attitude, state of an rocket body and the like.
The viewing service device may be connected to an external monitoring device for acquiring external monitoring images or videos acquired by the external monitoring device for the target rocket. The external monitoring devices may include ground-mounted cameras, looking from afar vessels, aircraft, or high-resolution satellites, etc.
The viewing service device can process the flight state data to generate a virtual rocket body model of the target rocket. The virtual rocket body model simulates the rocket body state of the target rocket, and can be embodied in the form of a three-dimensional model and the like. The viewing service device may process the external surveillance image to generate a virtual scene model. The virtual scene model simulates the external environment of the target rocket.
The viewing service device may superimpose the virtual rocket body model into the virtual scene model, perform projection and rendering according to the viewing angle, generate a flight viewing picture corresponding to the viewing angle, and send the flight viewing picture to the head-mounted display device worn by the user.
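As a deliberately simplified sketch of the projection step (a real system would use a full rendering engine; the camera distance and focal length below are made-up parameters), projecting one model point onto the 2-D picture for a given viewing angle might look like:

```python
import math

def project_point(p, yaw_deg, cam_dist=100.0, focal=500.0):
    """Project a 3-D model point onto the 2-D flight viewing picture for a
    virtual camera orbiting the rocket at the given yaw angle (pinhole model).
    cam_dist and focal are illustrative parameters, not values from the text."""
    x, y, z = p
    a = math.radians(yaw_deg)
    # Rotate the scene about the vertical axis so the camera looks along +z,
    # then apply the perspective divide.
    xr = x * math.cos(a) - z * math.sin(a)
    zr = x * math.sin(a) + z * math.cos(a) + cam_dist
    return (focal * xr / zr, focal * y / zr)
```

Points nearer the camera (smaller `zr`) land farther from the picture center, which is what lets the same model serve every user-selected viewing angle.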
The rocket launching immersive viewing system provided by the embodiment of the application comprises a head-mounted display device and a viewing service device. The head-mounted display device determines the user's viewing angle of the target rocket and displays the flight viewing picture of the target rocket to the user. The viewing service device acquires flight state data of the target rocket and an external monitoring image, generates a virtual rocket body model and a virtual scene model of the target rocket, and generates the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle. By processing the flight state data and the external monitoring image of the target rocket, generating the virtual rocket body model and the virtual scene model respectively, generating the flight viewing picture, and displaying it to the user through the head-mounted display device, the system enables the user to remotely watch the whole rocket launching process without approaching the launch site, satisfies the user's wish and need to watch, and improves the user's rocket launching viewing experience.
In some embodiments, fig. 2 is a schematic structural diagram of a viewing service device provided by the present application, as shown in fig. 2, the viewing service device 120 includes:
the man-machine interaction module 121 is in communication connection with the head-mounted display device and is used for receiving the viewing angle sent by the head-mounted display device and sending a flight viewing picture to the head-mounted display device;
the data communication module 122 is used for acquiring flight state data of the target rocket and an external monitoring image;
A picture generation module 123 for generating a virtual rocket body model of the target rocket based on the flight state data and generating a virtual scene model based on the external monitoring image; and generating a flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle.
In particular, from a structural point of view, the viewing service device may include a human-computer interaction module, a data communication module, and a screen generation module.
The man-machine interaction module is mainly used for communication connection with the head-mounted display device; according to the viewing angle sent by the head-mounted display device, it returns a flight viewing picture matching that viewing angle. The man-machine interaction module can also verify the identity validity of the head-mounted display device when the communication connection is established.
The data communication module is mainly connected with the ground control center and various external monitoring devices to acquire flight state data and external monitoring images.
The picture generation module generates the virtual rocket body model and the virtual scene model using digital twin and similar technologies, and generates the flight viewing picture according to the viewing angle sent by the head-mounted display device. Digital twinning is a technology that integrates multi-disciplinary, multi-physical-quantity, multi-scale and multi-probability simulation processes using data such as physical models, sensor data and operation history, completing the mapping in a virtual space so as to generate a virtual model that reflects the corresponding physical equipment.
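A minimal sketch of how such a preset data model could be queried (the class name, the phase labels and the interpolation-by-timeline interface are all assumptions; the application does not specify an implementation): historical launches of same-type rockets are reduced to a timeline of rocket body states, and live flight time is matched against it.

```python
import bisect

class PresetDataModel:
    """Hypothetical digital-twin lookup: maps live flight time to the rocket
    body state recorded for same-type rockets (e.g. whether the boosters or
    fairing are still attached)."""

    def __init__(self, phase_changes):
        # phase_changes: list of (flight_time_s, body_state) sorted by time,
        # built from rocket body change data of same-type historical launches.
        self.times = [t for t, _ in phase_changes]
        self.states = [s for _, s in phase_changes]

    def body_state(self, flight_time_s):
        """Return the rocket body state in effect at the given flight time."""
        i = bisect.bisect_right(self.times, flight_time_s) - 1
        return self.states[max(i, 0)]
```

For example, on a timeline with booster separation at 120 s, querying 150 s would report the post-separation configuration, which the picture generation module could then reflect in the virtual rocket body model.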
In some embodiments, the data communication module is in communication connection with the ground control center and is used for acquiring flight state data of the target rocket and an off-rocket monitoring image acquired by a camera arranged outside the target rocket;
The data communication module is in communication connection with the ground monitoring equipment and is used for acquiring a ground monitoring image of the target rocket;
the data communication module is in communication connection with the air monitoring equipment and is used for acquiring an air monitoring image of the target rocket;
the data communication module is in communication connection with the monitoring satellite and is used for acquiring satellite monitoring images of the target rocket.
Specifically, the ground control center is a command control center for rocket launching, and is a central center for information collection, exchange, processing and control of aerospace measurement and control and data acquisition networks.
The data communication module in the viewing service device is in communication connection with the ground control center and can obtain the flight state data generated in real time during the launch of the target rocket. Furthermore, cameras are also typically arranged on the exterior of the target rocket to capture off-rocket monitoring images. For example, cameras arranged in the rocket's fairing and on the outer walls and tails of the boosters and the instrument cabins of all stages can track and shoot events such as booster separation, second-stage separation, fairing separation and spacecraft-rocket separation. The off-rocket monitoring images are also transmitted simultaneously to the ground control center. The data communication module can acquire these off-rocket monitoring images and select the corresponding monitoring image to display to the user according to the user's input on the head-mounted display device.
The external monitoring devices for monitoring the target rocket can comprise three types. One type is ground monitoring equipment, which mainly monitors the initial stage of the rocket launch; if the rocket is launched at sea, the ground monitoring equipment can be tracking ships and the like. One type is air monitoring equipment, including aircraft and the like, which mainly monitors the intermediate stage of the rocket launch. One type is monitoring satellites and the like, which mainly monitor the final stage of the rocket launch.
The data communication module is respectively in communication connection with the ground monitoring device, the air monitoring device and the monitoring satellite and is used for acquiring ground monitoring images, air monitoring images and satellite monitoring images of the target rocket.
The rocket launching immersive viewing system provided by the embodiment of the application is respectively connected with the ground control center, the ground monitoring equipment, the air monitoring equipment and the monitoring satellite, so that the whole rocket launching process is covered, and the rocket launching viewing experience of a user is improved.
In some embodiments, the picture generation module is used to determine the external monitoring image based on at least one of the off-rocket monitoring image, the ground monitoring image, the air monitoring image and the satellite monitoring image.
Specifically, the picture generation module may select one or more of the off-rocket monitoring image, the ground monitoring image, the air monitoring image and the satellite monitoring image as the external monitoring image.
For example, the viewing service device analyzes the viewing angle sent by the head-mounted display device. If the user selects an off-rocket viewing angle, the off-rocket monitoring image may be taken as the external monitoring image; if the user selects a ground viewing angle, the ground monitoring image may be taken as the external monitoring image; if the user selects an air viewing angle, the air monitoring image may be taken as the external monitoring image; and if the user selects a space viewing angle, the satellite monitoring image may be taken as the external monitoring image.
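The viewing-angle-to-image dispatch above can be sketched as follows (the angle names, key names and dictionary-based dispatch are illustrative assumptions, not the application's literal implementation):

```python
# Hypothetical mapping from the user-selected viewing angle to the
# monitoring image source; names are illustrative assumptions.
ANGLE_TO_SOURCE = {
    "off_rocket": "off_rocket_image",  # cameras mounted on the rocket body
    "ground": "ground_image",          # ground monitoring equipment
    "air": "air_image",                # air monitoring equipment (aircraft)
    "space": "satellite_image",        # monitoring satellite
}

def select_external_image(viewing_angle, images):
    """Return the external monitoring image matching the viewing angle."""
    source = ANGLE_TO_SOURCE.get(viewing_angle)
    if source is None:
        raise ValueError(f"unknown viewing angle: {viewing_angle}")
    return images[source]
```

The same table also documents, in one place, which monitoring device serves each stage of the launch.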
In some embodiments, the screen generating module is specifically configured to:
Determining image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image;
And determining the flight viewing picture based on any external monitoring image whose image resolution is greater than or equal to the preset resolution.
Specifically, the picture generation module may determine the image resolution of each acquired external monitoring image. The greater the image resolution, the clearer and more realistic the image details.
The preset resolution may be set for screening the external monitoring image. If the image resolution of any external monitoring image is greater than or equal to the preset resolution, indicating that the external monitoring image can display enough details and can be used for displaying to a user; if the image resolution of any external monitoring image is less than the preset resolution, the external monitoring image is not clear enough to be displayed to the user.
Even if the image resolution is greater than or equal to the preset resolution, further analysis of the external monitoring image is required. If the flying state of the target rocket can be clearly displayed in the image, the image can be directly displayed to a user; if the content of the target rocket displayed in the image is less, the image needs to be further processed to improve the rocket launching viewing experience of the user.
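The resolution screening step can be sketched as a simple threshold test (the 1280x720 preset is an illustrative assumption; the application only says a preset resolution may be set):

```python
def meets_preset_resolution(image_size, preset=(1280, 720)):
    """True if a monitoring image is at least the preset resolution in both
    dimensions and is therefore sharp enough to consider showing to the user.
    The (1280, 720) default is an assumed value, not taken from the text."""
    width, height = image_size
    return width >= preset[0] and height >= preset[1]
```

Images that fail this test are not shown directly; as described below, a sufficiently sharp image still needs a second check on how much of it actually depicts the rocket.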
In some embodiments, the screen generating module is specifically configured to:
identifying the target rocket in any external monitoring image, and determining an image area where the target rocket is located;
And determining any external monitoring image as a flying view picture of the target rocket under the condition that the ratio of the number of pixels of the image area where the target rocket is positioned to the number of pixels of any external monitoring image is larger than or equal to a preset ratio.
Specifically, the target rocket may be identified in any external monitoring image with an image resolution greater than or equal to a preset resolution, and an image area where the target rocket is located may be determined. For example, a method of color recognition, texture recognition or shape recognition may be used to recognize the target rocket in the external monitoring image, or a method of machine learning may be used to train a neural network model through an image data set of the target rocket, and the trained neural network model is used to recognize the target rocket.
If the ratio of the number of pixels in the image area where the target rocket is located to the number of pixels in the external monitoring image is greater than or equal to the preset ratio, there are enough pixels in the external monitoring image to represent the target rocket; the more such pixels, the more details of the target rocket are displayed, and the external monitoring image can be directly determined as the flight viewing picture of the target rocket. The preset ratio can be set as required.
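The pixel-ratio test can be sketched as follows (the 5% preset ratio is an illustrative assumption; the application leaves the preset ratio configurable):

```python
def rocket_fills_picture(rocket_region_pixels, image_width, image_height,
                         preset_ratio=0.05):
    """Decide whether the monitoring image can be used directly as the flight
    viewing picture, by comparing the pixel count of the identified rocket
    region against the whole image. The 0.05 default is an assumed value."""
    total_pixels = image_width * image_height
    return rocket_region_pixels / total_pixels >= preset_ratio
```

When this returns False, the image is still usable, but only as the real scene backdrop for the augmented reality picture described next.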
In some embodiments, the picture generation module is specifically configured to:
identifying the target rocket in any external monitoring image, and determining an image area where the target rocket is located;
Determining any external monitoring image as a real scene image under the condition that the ratio of the number of pixels of the image area where the target rocket is located to the number of pixels of any external monitoring image is smaller than a preset ratio;
projecting a virtual rocket body model of the target rocket into a real scene image based on a viewing angle to generate an augmented reality picture;
and determining the augmented reality picture as a flight view picture of the target rocket.
Specifically, if the ratio of the number of pixels in the image area where the target rocket is located to the number of pixels in any external monitoring image is smaller than the preset ratio, the external monitoring image is sharp enough to show detail, but only a small part of it depicts the target rocket.
To enhance the user's rocket launch viewing experience, the external monitoring image may be determined as a real scene image. The virtual rocket body model of the target rocket can then be projected and rendered into the real scene image according to the viewing angle input by the user, yielding an augmented reality (AR) picture.
In the augmented reality picture, the target rocket is a virtual rocket body and the background is the real scene, so the user sees a mixed visual effect of the real world and virtual content. Although the user cannot fully see the real target rocket, the flight state of the virtual rocket body shown in the augmented reality picture matches that of the target rocket in the real world, so the user can keep following the rocket launch process.
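As a rough illustration of how the augmented reality picture could be composed, the following sketch alpha-blends a pre-rendered virtual rocket body into the real scene image; the rendering, its alpha mask and its placement are assumed inputs produced by the projection step for the chosen viewing angle:

```python
import numpy as np

def compose_ar_frame(scene, rocket_render, rocket_alpha, top, left):
    """Alpha-blend a rendered virtual rocket body into a real scene image.

    scene:         HxWx3 uint8 real monitoring image (the background)
    rocket_render: hxwx3 uint8 rendering of the virtual rocket body model,
                   already projected for the chosen viewing angle
    rocket_alpha:  hxw float array in [0, 1], per-pixel opacity of the render
    top, left:     where the rendering lands inside the scene
    """
    frame = scene.astype(np.float32)               # work in float, copy scene
    h, w = rocket_alpha.shape
    region = frame[top:top + h, left:left + w]
    a = rocket_alpha[..., None]                    # broadcast over RGB
    frame[top:top + h, left:left + w] = a * rocket_render + (1 - a) * region
    return frame.astype(np.uint8)

scene = np.full((4, 4, 3), 100, dtype=np.uint8)    # flat grey background
render = np.full((2, 2, 3), 200, dtype=np.uint8)   # flat bright rocket patch
frame = compose_ar_frame(scene, render, np.ones((2, 2)), 1, 1)
```

In practice the rendering and alpha mask would come from a 3D engine that rasterizes the virtual rocket body model at the pose derived from the flight state data.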
In some embodiments, the picture generation module is specifically configured to:
Determining image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image;
under the condition that the image resolution of each external monitoring image is smaller than the preset resolution, constructing a virtual scene model based on the external monitoring image with the highest image resolution;
Superposing a virtual rocket body model of the target rocket in the virtual scene model, and projecting a model superposition result based on a viewing angle to generate a virtual reality picture;
And determining the virtual reality picture as a flight view picture of the target rocket.
Specifically, if the image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image are all smaller than the preset resolution, none of the external monitoring images is clear enough, and displaying them to the user would degrade the viewing experience.
In this case, the external monitoring image with the highest image resolution may be selected and a virtual scene model constructed from it. For example, the spatial position of the target rocket relative to the ground can be determined from the flight state parameters of the target rocket, a virtual model of the Earth can be built at that spatial position, and the virtual scene model can then be constructed around it.
The virtual rocket body model of the target rocket is then superimposed in the virtual scene model; the resulting superposition is a fully virtual three-dimensional world model. Projecting this model superposition result according to the viewing angle yields a virtual reality (VR) picture.
In the virtual reality picture, the target rocket is a virtual rocket body and the background is a virtual environment scene, so the user sees fully virtual content. Although the user cannot see the real target rocket, the flight state of the virtual rocket body matches that of the target rocket in the real world, and the virtual scene matches the rocket's real external environment, giving the user an immersive, on-the-scene view and letting them keep following the rocket launch process.
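The resolution-based fallback logic described in this and the preceding embodiments can be sketched as a simple mode selector; representing each image's resolution by its total pixel count is a simplifying assumption:

```python
def choose_picture_mode(images, preset_resolution):
    """Pick the display mode from the external monitoring images.

    images: list of (name, (width, height)) pairs for the ground, air and
    satellite monitoring images. Returns a (mode, image_name) pair, where
    mode is "real" when at least one image meets the preset resolution and
    "vr" when every image falls below it.
    """
    def pixels(entry):
        width, height = entry[1]
        return width * height

    usable = [img for img in images if pixels(img) >= preset_resolution]
    if usable:
        # At least one image is sharp enough: show (or augment) a real image.
        return "real", max(usable, key=pixels)[0]
    # All images fall below the preset resolution: fall back to a fully
    # virtual scene built from the sharpest available image.
    return "vr", max(images, key=pixels)[0]

feeds = [("ground", (640, 480)), ("air", (320, 240)), ("satellite", (1280, 720))]
print(choose_picture_mode(feeds, preset_resolution=1920 * 1080))  # ('vr', 'satellite')
```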
In the rocket launching immersive viewing system provided by the embodiments of the application, as the resolution of the external monitoring images changes, the flight state and external environment of the rocket are presented to the user through real monitoring images, augmented reality pictures or virtual reality pictures, so that the user can watch the entire rocket launch process through a combination of real and virtual content.
In some embodiments, the picture generation module is specifically configured to:
inputting the flight state data into a preset data model corresponding to the target rocket to obtain a virtual rocket body model of the target rocket;
The preset data model is determined based on historical flight state data of rockets of the same type as the target rocket and rocket body change data of those rockets during the launching process.
Specifically, the picture generation module can construct the preset data model corresponding to the target rocket from the historical flight state data of rockets of the same type as the target rocket and the rocket body change data of those rockets during the launching process.
The preset data model is used to describe the flight state and rocket body changes of the target rocket in each flight stage. Rocket body changes refer to events such as booster separation, propulsion stage separation and fairing separation.
The construction process of the preset data model may include:
step one, collecting flight state data and arrow body change data
Various sensor devices, such as altimeters, speedometers, accelerometers, compasses and barometers, are used to obtain historical flight state data of rockets of the same type as the target rocket, and the rocket body change data of those rockets during the launching process is collected. These data can be further divided into a training set and a test set.
Step two, data preprocessing
After the flight data is collected, it needs to be preprocessed to remove noise and outliers. This can be accomplished with various data processing techniques such as filtering, smoothing and normalization. The data can then be reduced to features including altitude, speed, acceleration, direction, and the like.
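A minimal sketch of this preprocessing stage, assuming z-score outlier removal followed by a trailing moving-average smoother and min-max normalization (the concrete filters and the 2.5 cutoff are illustrative choices, not mandated by the embodiment):

```python
import statistics

def preprocess(samples, window=3, z_cutoff=2.5):
    """Drop z-score outliers, smooth with a trailing moving average,
    then min-max normalize the smoothed series to [0, 1]."""
    mean = statistics.fmean(samples)
    sd = statistics.pstdev(samples) or 1.0        # guard against zero spread
    kept = [x for x in samples if abs(x - mean) / sd <= z_cutoff]
    smoothed = [statistics.fmean(kept[max(0, i - window + 1):i + 1])
                for i in range(len(kept))]
    lo, hi = min(smoothed), max(smoothed)
    span = (hi - lo) or 1.0
    return [(x - lo) / span for x in smoothed]

# An altitude-like series with one spurious sensor spike (500).
cleaned = preprocess([10, 11, 12, 11, 10, 11, 12, 11, 10, 500])
print(len(cleaned))  # 9: the spike was rejected as an outlier
```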
Step three, constructing a data model
The preprocessed data is fed into a pre-established data model. This can be implemented with various machine learning algorithms, such as neural networks, support vector machines or decision trees. These algorithms can build a data model from the previously collected flight state data of a large number of launch vehicles and predict their future states, thereby producing a highly realistic animation effect.
In evaluating the performance of the data model, various evaluation metrics such as accuracy, precision, recall, etc. may be used.
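As a toy stand-in for the data model, the following sketch fits a quadratic altitude-versus-time curve to synthetic "historical" samples with a least-squares solve and extrapolates the next state; an actual preset data model would use the richer features above and a learned model such as a neural network:

```python
import numpy as np

# Synthetic "historical" altitude samples following a 5*t^2 trajectory.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # seconds after liftoff
alt = np.array([0.0, 5.0, 20.0, 45.0, 80.0])  # metres

# Fit altitude = a*t^2 + b*t + c by least squares.
X = np.column_stack([t ** 2, t, np.ones_like(t)])
coeffs, *_ = np.linalg.lstsq(X, alt, rcond=None)

def predict_altitude(time_s):
    """Extrapolate the fitted trajectory to a future time."""
    a, b, c = coeffs
    return a * time_s ** 2 + b * time_s + c

print(round(float(predict_altitude(5.0)), 1))  # ≈ 125.0 for this synthetic data
```

The same train/predict split illustrates where the accuracy, precision and recall metrics mentioned above would be computed: on held-out test samples rather than the fitted training data.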
In the rocket launching immersive viewing system provided by the embodiments of the application, the preset data model is determined from the historical flight state data of rockets of the same type as the target rocket and their rocket body change data during the launching process, and the virtual rocket body model of the target rocket is generated from this model, improving the accuracy of the virtual rocket body and the user's rocket launch viewing experience.
In some embodiments, the head mounted display device is specifically for:
Acquiring voice or limb actions of a user; the limb movements include at least one of gestures, head gestures, and eye movements;
Based on the voice or limb movements, the user's input instructions are determined.
In particular, the input module in the head-mounted display device may include a microphone and a camera. The microphone collects the user's voice, and the camera captures the user's gestures. The head-mounted display device can further include a camera for capturing the user's eye movements. The motion sensing module in the head-mounted display device may be a gyroscope or an inertial measurement unit, which captures the user's head pose.
The input instruction may include switching the viewing angle of the picture, enlarging the picture, shrinking the picture, pausing the picture, and the like. Voice commands and limb actions can be placed in one-to-one correspondence with input instructions, so that after detecting a specific voice command or limb action, the head-mounted display device can quickly determine the user's input instruction from this correspondence.
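The one-to-one correspondence can be implemented as a simple lookup table; the command names and trigger events below are hypothetical examples, not part of the embodiment:

```python
# Hypothetical one-to-one command table: each recognized voice phrase or
# limb action maps directly to a single input instruction.
COMMAND_TABLE = {
    ("voice", "zoom in"): "ENLARGE_PICTURE",
    ("voice", "pause"): "PAUSE_PICTURE",
    ("gesture", "swipe_left"): "SWITCH_VIEW_ANGLE",
    ("head", "turn_left"): "ROTATE_VIEW_LEFT",
    ("eye", "double_blink"): "RESET_VIEW",
}

def resolve_instruction(channel, event):
    """Return the input instruction for a detected voice/limb event,
    or None when the event is not a registered command."""
    return COMMAND_TABLE.get((channel, event))

print(resolve_instruction("voice", "zoom in"))  # ENLARGE_PICTURE
```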
In the rocket launching immersive viewing system provided by the embodiments of the application, the corresponding input instructions are generated by capturing the user's gestures, head pose and eye movements, improving the user's rocket launch viewing experience.
An embodiment of the application provides an interaction method among the rocket launching immersive viewing system, the ground control center and the rocket body, with the following specific steps:
step one, a user enters a rocket launching immersive viewing system by wearing a wearable device, starts a viewing mode, and enters a virtual scene (a virtual viewing platform, a virtual launching field and an actual rocket entity picture to be launched exist in the scene).
Step two, after launch, the rocket transmits flight data and external environment detection data for the different flight stages to the ground control center in real time.
And thirdly, the ground control center receives flight data of the rocket, receives external environment detection data (including space pictures, earth pictures and the like shot by the rocket), and sends the external environment detection data to a rocket launching immersive viewing system.
Step four, the rocket launching immersive viewing system inputs the received data into the pre-established data model, outputs a virtual rocket flight animation (the rocket flying out of the atmosphere, stage separation and orbit insertion), and sends the animation to the wearable device for display to the user.
The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the solution without undue effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (6)

1. A rocket-launched immersive viewing system, comprising:
The head-mounted display device is used for determining the viewing angle of the user to the target rocket based on the input instruction of the user and displaying the flying viewing picture of the target rocket to the user;
The watching service device is used for acquiring flight state data of the target rocket and an external monitoring image; generating a virtual rocket body model of the target rocket based on the flight state data; generating a virtual scene model based on the external monitoring image; generating the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle; transmitting the flight view screen to the head mounted display device;
the viewing service device includes:
The man-machine interaction module is in communication connection with the head-mounted display device and is used for receiving the viewing angle sent by the head-mounted display device and sending the flying viewing picture to the head-mounted display device;
the data communication module is used for acquiring flight state data of the target rocket and an external monitoring image;
The image generation module is used for generating a virtual rocket body model of the target rocket based on the flight state data and generating a virtual scene model based on the external monitoring image; generating the flight viewing picture based on the virtual rocket body model, the virtual scene model and the viewing angle;
The data communication module is in communication connection with the ground control center and is used for acquiring flight state data of the target rocket and an off-rocket monitoring image acquired by a camera arranged outside the target rocket;
the data communication module is in communication connection with ground monitoring equipment and is used for acquiring ground monitoring images of the target rocket;
the data communication module is in communication connection with air monitoring equipment and is used for acquiring an air monitoring image of the target rocket;
The data communication module is in communication connection with a monitoring satellite and is used for acquiring satellite monitoring images of the target rocket;
the picture generation module is used for determining the external monitoring image based on at least one of the arrow external monitoring image, the ground monitoring image, the air monitoring image and the satellite monitoring image;
The picture generation module is specifically configured to:
Determining image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image;
Under the condition that the image resolution of each external monitoring image is smaller than the preset resolution, constructing the virtual scene model based on the external monitoring image with the highest image resolution;
superposing a virtual rocket body model of the target rocket in the virtual scene model, and projecting a model superposition result based on the viewing angle to generate a virtual reality picture;
and determining the virtual reality picture as a flight view picture of the target rocket.
2. A rocket launch immersive viewing system according to claim 1 wherein said picture generation module is specifically adapted to:
Determining image resolutions corresponding to the ground monitoring image, the air monitoring image and the satellite monitoring image;
And determining the flight view picture based on any external monitoring image when the image resolution of the external monitoring image is greater than or equal to a preset resolution.
3. A rocket launch immersive viewing system according to claim 2 wherein said picture generation module is specifically adapted to:
identifying the target rocket in any external monitoring image, and determining an image area where the target rocket is located;
and determining any external monitoring image as a flight view picture of the target rocket under the condition that the ratio of the number of pixels of the image area where the target rocket is positioned to the number of pixels of any external monitoring image is larger than or equal to a preset ratio.
4. A rocket launch immersive viewing system according to claim 2 wherein said picture generation module is specifically adapted to:
identifying the target rocket in any external monitoring image, and determining an image area where the target rocket is located;
Determining any external monitoring image as a real scene image under the condition that the ratio of the number of pixels of the image area where the target rocket is located to the number of pixels of any external monitoring image is smaller than a preset ratio;
Projecting a virtual rocket body model of the target rocket into the real scene image based on the viewing angle to generate an augmented reality picture;
and determining the augmented reality picture as a flight view picture of the target rocket.
5. A rocket launch immersive viewing system according to claim 1 wherein said picture generation module is specifically adapted to:
Inputting the flight state data into a preset data model corresponding to the target rocket to obtain a virtual rocket body model of the target rocket;
The preset data model is determined based on historical flight state data of the same type of rockets of the target rocket and rocket body change data of the same type of rockets in the launching process.
6. A rocket-launch immersive viewing system according to claim 1, wherein the head-mounted display device is specifically for:
Acquiring voice or limb actions of the user; the limb movements include at least one of gestures, head gestures, and eye movements;
based on the voice or the limb action, an input instruction of the user is determined.
CN202311550185.9A 2023-11-21 2023-11-21 Rocket launching immersive viewing system Active CN117278734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311550185.9A CN117278734B (en) 2023-11-21 2023-11-21 Rocket launching immersive viewing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311550185.9A CN117278734B (en) 2023-11-21 2023-11-21 Rocket launching immersive viewing system

Publications (2)

Publication Number Publication Date
CN117278734A CN117278734A (en) 2023-12-22
CN117278734B true CN117278734B (en) 2024-04-19

Family

ID=89221893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311550185.9A Active CN117278734B (en) 2023-11-21 2023-11-21 Rocket launching immersive viewing system

Country Status (1)

Country Link
CN (1) CN117278734B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808971B (en) * 2024-01-02 2024-11-12 海南国际商业航天发射有限公司 A rocket launch scene construction method, device, equipment and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206209370U (en) * 2016-11-04 2017-05-31 深圳华德恩教育信息咨询有限公司 A kind of satellite launching tower supervising device
CN212084125U (en) * 2020-10-19 2020-12-04 中国长征火箭有限公司 Rocket flight display system
CN116132602A (en) * 2023-04-12 2023-05-16 东方空间技术(山东)有限公司 Carrier rocket image acquisition system, method, equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7755635B2 (en) * 2006-02-27 2010-07-13 Benman William J System and method for combining satellite imagery with virtual imagery
CN114981846A (en) * 2020-01-20 2022-08-30 索尼集团公司 Image generation device, image generation method, and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206209370U (en) * 2016-11-04 2017-05-31 深圳华德恩教育信息咨询有限公司 A kind of satellite launching tower supervising device
CN212084125U (en) * 2020-10-19 2020-12-04 中国长征火箭有限公司 Rocket flight display system
CN116132602A (en) * 2023-04-12 2023-05-16 东方空间技术(山东)有限公司 Carrier rocket image acquisition system, method, equipment and storage medium

Also Published As

Publication number Publication date
CN117278734A (en) 2023-12-22

Similar Documents

Publication Publication Date Title
US20240267481A1 (en) Scene-aware selection of filters and effects for visual digital media content
JP7068562B2 (en) Techniques for recording augmented reality data
US11681834B2 (en) Test cell presence system and methods of visualizing a test environment
KR102680675B1 (en) Flight controlling method and electronic device supporting the same
CN112150885B (en) Cockpit system based on mixed reality and scene construction method
JP7146662B2 (en) Image processing device, image processing method, and program
US9210413B2 (en) System worn by a moving user for fully augmenting reality by anchoring virtual objects
KR20190126919A (en) Generating device, generating method, and storage medium
CN108292489A (en) Information processing unit and image generating method
CN108629830A (en) A kind of three-dimensional environment method for information display and equipment
WO2013171731A1 (en) A system worn by a moving user for fully augmenting reality by anchoring virtual objects
Yu et al. Intelligent visual-IoT-enabled real-time 3D visualization for autonomous crowd management
Saini et al. Airpose: Multi-view fusion network for aerial 3d human pose and shape estimation
CN117278734B (en) Rocket launching immersive viewing system
CN115131528B (en) Virtual reality scene determination method, device and system
CN111862348A (en) Video display method, video generation method, video display device, video generation device, video display equipment and storage medium
JP2019045991A (en) Generation device, generation method and program
CN117197388A (en) A method and system for constructing real-life three-dimensional virtual reality scenes based on generative adversarial neural networks and oblique photography
WO2017042070A1 (en) A gazed virtual object identification module, a system for implementing gaze translucency, and a related method
CN117689826A (en) Three-dimensional model construction and rendering method, device, equipment and medium
US11961190B2 (en) Content distribution system, content distribution method, and content distribution program
WO2013041152A1 (en) Methods to command a haptic renderer from real motion data
JP2019103126A (en) Camera system, camera control device, camera control method, and program
US20240144573A1 (en) Computer-implemented systems and methods for generating enhanced motion data and rendering objects
CN112288876A (en) Long-distance AR identification server and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant