
CN118977859A - System and method for augmented reality windshield projection in aircraft cockpits - Google Patents

System and method for augmented reality windshield projection in aircraft cockpits

Info

Publication number
CN118977859A
Authority
CN
China
Prior art keywords
aircraft
augmented reality
information
reality image
pilot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411259305.4A
Other languages
Chinese (zh)
Inventor
金梓城
范瑞杰
周梦贝
诸心阳
蒋俊
冯志祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Comac Shanghai Aircraft Design & Research Institute
Commercial Aircraft Corp of China Ltd
Original Assignee
Comac Shanghai Aircraft Design & Research Institute
Commercial Aircraft Corp of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Comac Shanghai Aircraft Design & Research Institute, Commercial Aircraft Corp of China Ltd filed Critical Comac Shanghai Aircraft Design & Research Institute
Priority to CN202411259305.4A priority Critical patent/CN118977859A/en
Publication of CN118977859A publication Critical patent/CN118977859A/en
Pending legal-status Critical Current


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00: Arrangements or adaptations of instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure relates to a system and method for augmented reality windshield projection in an aircraft cockpit. The system comprises: a perception module configured to sense state information of the aircraft itself and information about the aircraft's external environment; an augmented reality image generation module configured to determine the flight phase the aircraft is in based on the aircraft's own state information and to generate an augmented reality image, based on that flight phase, using information from the perception module; and an optical projection module arranged within the front glare shield of the aircraft cockpit and configured to project the augmented reality image onto the windshield of the aircraft.

Description

System and method for augmented reality windshield projection in an aircraft cockpit
Technical Field
The present disclosure relates to human-machine interactive display systems and methods for aircraft cockpits, and more particularly to systems and methods for augmented reality windshield projection in an aircraft cockpit.
Background
The cockpit of an aircraft (such as a large civil aircraft) is the primary workplace in which the pilot performs tasks; a large amount of information is integrated into a limited space, and the interaction scenarios are complex. For example, during critical phases such as takeoff and approach, an aircraft (particularly a civil aircraft) involves complex operating procedures, short reaction times, and a great deal of information that the pilot must attend to. Because the aircraft has many instruments distributed over a wide area, it is difficult for the crew to monitor the outside view and the instrument data at the same time.
However, existing aircraft mount a cockpit head-up display unit in the space above the crew's heads. Such equipment occupies the crew's head space, offers only a single display mode, uses monochrome imagery that lacks vividness, and may injure crew members in turbulence; it therefore needs improvement. Moreover, existing equipment obtains information only from the aircraft's internal indication and recording system, so its means of sensing the external environment are limited.
The present disclosure provides improvements with respect to, but not limited to, the factors described above.
Disclosure of Invention
To this end, the present disclosure provides an augmented reality windshield projection system and method for an aircraft cockpit that senses the external environment in real time during the various phases of flight (e.g., ground taxiing and in-flight phases) and projects the key information the pilot needs onto the cockpit windshield, so that the pilot can view it conveniently while looking outside.
According to a first aspect of the present disclosure, there is provided a system for augmented reality windshield projection in an aircraft cockpit, comprising: a perception module configured to sense state information of the aircraft itself and information about the aircraft's external environment; an augmented reality image generation module configured to determine the flight phase the aircraft is in based on the aircraft's own state information and to generate an augmented reality image, based on that flight phase, using information from the perception module; and an optical projection module arranged within the front glare shield of the aircraft cockpit and configured to project the augmented reality image onto the windshield of the aircraft.
According to an embodiment, the perception module comprises: an on-board indication and recording device for sensing the aircraft's own state information; a visual sensing device arranged outside the cockpit, which captures images of the area ahead of the aircraft within the pilot's field of view and includes a visible-light camera and an infrared camera; and a radar detection device including a lidar and/or a millimeter-wave radar for detecting terrain information and obstacles in the aircraft's surroundings during operation, wherein the aircraft's own state information includes at least one of airspeed, flight altitude, Global Navigation Satellite System (GNSS) positioning, the configuration state of each aircraft system, and warning information.
According to another embodiment, generating an augmented reality image using information from the perception module comprises: integrating the information from the perception module to obtain the picture content of the augmented reality image to be projected; and eliminating, according to the curvature of the windshield, the distortion of the augmented reality image when it is projected onto the windshield.
According to a further embodiment, the perception module further comprises: a cockpit detection device including a camera arranged in the cockpit and an acquisition device that obtains, from the flight control system, the crew's inputs at the side stick, control column/wheel and/or pedals; the camera captures information about the crew's head position and gaze direction.
According to yet another embodiment, the flight crew of the aircraft includes two pilots, and generating the augmented reality image using information from the perception module includes: determining the stick pilot and the non-stick pilot based on the crew's inputs at the side stick, control column/wheel and/or pedals collected from the flight control system; generating, using the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the portion of the windshield in front of the stick pilot and the second augmented reality image onto the portion of the windshield in front of the non-stick pilot.
According to a further embodiment, the augmented reality image generation module is further configured to suspend or dim the display of the second augmented reality image in response to the stick pilot's head turning toward the non-stick pilot.
According to a further embodiment, the augmented reality image generation module is further configured to replace the content of the projected second augmented reality image in response to an input instruction from the non-stick pilot.
According to a further embodiment, the flight phases comprise ground taxiing, takeoff, climb/cruise/descent, approach, and landing, and wherein: in the ground taxiing phase, the augmented reality image includes at least one of: the aircraft's current configuration settings, a map of the current airport, the current destination, route guidance to the current destination, tower traffic control instructions, collision warnings, an image of the direction of travel ahead of the aircraft, and surrounding objects the aircraft may collide with while moving; in the takeoff phase, the augmented reality image includes at least one of: aircraft attitude information, sideslip information, airspeed information, flight altitude information, the direction of the next waypoint, and alerts that are not suppressed during the takeoff phase; in climb/cruise/descent, the augmented reality image includes at least one of: the current flight state of the aircraft, the position of the next waypoint, weather information, and collision warnings; in the approach phase, the augmented reality image includes at least one of: the current flight state, the aircraft's current precise position, the current flight plan and the position of the next waypoint, tower traffic control instructions, the target point and path to be flown, an image of the direction of travel ahead of the aircraft, terrain information around the aircraft, the glidepath route, and the landing runway outline; and in the landing phase, the augmented reality image includes at least one of: the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the direction of travel ahead of the aircraft, and terrain information around the aircraft.
According to yet another embodiment, the windshield is a multi-layer structure, and the layers of the windshield are filled with wedge-shaped interlayers.
According to a further embodiment, the projection position of the center of the augmented reality image on the windshield of the aircraft is the pilot's gaze position when seated in the pilot's seat and looking straight ahead at the windshield, such that the augmented reality image overlays the corresponding real object in the pilot's field of view.
According to a second aspect of the present disclosure, there is provided a method for augmented reality windshield projection in an aircraft cockpit, comprising: determining the current flight phase of the aircraft from information obtained from the on-board indication and recording device; determining, according to the determined flight phase, the aircraft state information and environmental information to be displayed; collecting the aircraft state information and environmental information to be displayed from the on-board indication and recording system and the environmental sensors arranged on the aircraft; generating an augmented reality image to be projected using the acquired aircraft state information and environmental information; and projecting the augmented reality image onto the front windshield of the aircraft cockpit.
According to an embodiment, the flight crew of the aircraft comprises two pilots, and the method further comprises: collecting, from the flight control system of the aircraft, the crew's inputs at the side stick, control column/wheel and/or pedals; determining the stick pilot and the non-stick pilot based on the collected inputs; generating, using the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the cockpit front windshield portion in front of the stick pilot and the second augmented reality image onto the cockpit front windshield portion in front of the non-stick pilot.
According to a third aspect of the present disclosure there is provided an aircraft comprising a system for augmented reality windshield projection of an aircraft cockpit according to the first aspect of the present disclosure.
Aspects generally include a method, apparatus, system, computer program product, and processing system substantially as described herein with reference to and as illustrated by the accompanying drawings.
The foregoing has outlined rather broadly the features and technical advantages of examples in accordance with the present disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described hereinafter. The disclosed concepts and specific examples may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the scope of the appended claims. The features of the concepts disclosed herein, both as to their organization and method of operation, together with associated advantages, will be better understood from the following description when considered in connection with the accompanying drawings. Each of the figures is provided for the purpose of illustration and description and is not intended to limit the claims.
Drawings
So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to aspects, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only certain typical aspects of this disclosure and are therefore not to be considered limiting of its scope, for the description may admit to other equally effective aspects. The same reference numbers in different drawings may identify the same or similar elements.
FIG. 1 illustrates a schematic diagram of a system for augmented reality windshield projection of an aircraft cockpit according to an example embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of a system for augmented reality windshield projection of an aircraft cockpit according to a specific example of the present disclosure;
FIG. 3 shows a comparative schematic of imaging of a windshield with a wedge-shaped interlayer and a windshield without a wedge-shaped interlayer;
FIG. 4 illustrates a projected head-up display effect diagram of a system of augmented reality windshield projections according to an example embodiment of the present disclosure;
FIG. 5 illustrates a flowchart of a method for augmented reality windshield projection of an aircraft cockpit according to an example embodiment of the present disclosure; and
Fig. 6 is a schematic diagram illustrating an example aircraft in accordance with aspects of the present disclosure.
Detailed Description
The inventors have recognized that existing aircraft mount a cockpit head-up display unit in the space above the crew's heads. Such equipment occupies the crew's head space, offers only a single display mode, uses monochrome imagery that lacks vividness, and may injure crew members in turbulence; it therefore needs improvement. Moreover, existing equipment obtains information only from the aircraft's internal indication and recording system, so its means of sensing the external environment are limited.
To this end, the present disclosure provides an augmented reality windshield projection method and system for an aircraft cockpit. The system determines the display information that needs to be provided to the pilot according to the aircraft's current flight phase and the pilot's work tasks, acquires data about the aircraft's surroundings through sensors such as cameras and lidar, acquires the aircraft's own state data from the flight control system, generates an augmented reality image after processing the data with a data fusion method, and then projects the augmented reality image onto the front windshield of the aircraft cockpit, thereby providing the required information to the pilot, enhancing the flight crew's situational awareness, and improving flight safety.
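The overall flow just described can be summarized as a single sense-fuse-generate-project loop. The following minimal Python sketch is illustrative only; the module names, method names, and loop structure are assumptions made for this example and are not taken from the disclosure.

```python
# Illustrative sketch of the sense -> fuse -> generate -> project cycle.
# All object interfaces here (perception, fusion, image_gen, projector) are
# assumed placeholders, not the disclosure's actual components.
def render_frame(perception, fusion, image_gen, projector):
    """Run one display update cycle."""
    state = perception.read_aircraft_state()      # on-board indication/recording system
    environment = perception.read_environment()   # cameras, lidar, millimeter-wave radar
    phase = image_gen.determine_flight_phase(state)
    fused = fusion.merge(state, environment, phase)
    frame = image_gen.compose(fused, phase)       # pick the content needed in this phase
    frame = image_gen.pre_distort(frame)          # compensate for windshield curvature
    projector.project(frame)                      # optics housed in the front glare shield
```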
The methods and systems of the present disclosure may provide a head-up display interface to crew members by arranging a projection device within the front glare shield of the cockpit and using the windshield as the display screen. In particular, when providing the display interface, the methods and systems of the present disclosure may identify crew members' eye positions and gaze directions through an in-cockpit camera and, in combination with crew input information (or pilot input settings) obtained from the flight control system, determine the current division of work within the crew, so that different augmented reality images can be displayed to different pilots without interfering with each other.
The detailed description set forth below in connection with the appended drawings is intended as a description of various configurations and is not intended to represent the only configurations in which the concepts described herein may be practiced. The detailed description includes specific details for the purpose of providing a thorough understanding of the various concepts. It will be apparent, however, to one skilled in the art that these concepts may be practiced without these specific details.
Referring to fig. 1, a schematic diagram of a system 100 for augmented reality windshield projection of an aircraft cockpit according to an example embodiment of the present disclosure is shown.
As shown in fig. 1, a system 100 for augmented reality windshield projection of an aircraft cockpit may include a perception module 101, an augmented reality image generation module 103, and an optical projection module 105.
In an embodiment of the present disclosure, the perception module 101 may be configured to sense state information of the aircraft itself as well as information about the aircraft's external environment. In an example, the perception module 101 may comprise: an on-board indication and recording device for sensing the aircraft's own state information; a visual sensing device arranged outside the cockpit, which captures images of the area ahead of the aircraft within the pilot's field of view and includes a visible-light camera and an infrared camera; and a radar detection device including a lidar and/or a millimeter-wave radar for detecting terrain information and obstacles in the aircraft's surroundings during operation (especially in ground, takeoff, and approach-and-landing scenarios). In yet another example, the perception module 101 may further comprise a cockpit detection device, which may include a camera arranged in the cockpit and an acquisition device that obtains, from the flight control system, the crew's inputs at the side stick, control column/wheel and/or pedals, the camera capturing information about the crew's head and gaze direction. According to this example, the aircraft's own state information may include at least one of airspeed, flight altitude, Global Navigation Satellite System (GNSS) positioning, the configuration state of each aircraft system, warning information, or any other suitable information. Further according to this example, the on-board indication and recording device may be original equipment of the aircraft, responsible for providing the optical projection system with the aircraft's own state information; the visible-light camera captures the scene within the pilot's field of view ahead of the aircraft for subsequent image recognition processing; and the infrared camera can capture images even in dark external environments, providing pictures for the enhanced vision system and for image recognition.
Referring to fig. 2, a schematic diagram of a system 200 for augmented reality windshield projection in an aircraft cockpit according to a specific example of the present disclosure is shown. As can be seen in fig. 2, the system 200 may include a perception module 201, an augmented reality image generation module 203, and an optical projection module 205. As shown in fig. 2, the perception module 201 includes an on-board indication and recording system (such as an air data and inertial reference system for providing airspeed and attitude information, a GNSS for providing position information, and any other suitable means for providing other flight information), a radar detection module (such as a lidar for providing information about objects in the environment and an on-board millimeter-wave radar), a visual sensing device (such as a visible-light camera for providing a forward video image and an infrared camera for providing vision-enhancement information), and a cockpit detection module (such as a cockpit camera for providing crew eye-position information and a crew flight-control acquisition module for providing information about the division of flight tasks among the crew).
In an embodiment of the present disclosure, the augmented reality image generation module 103 may be configured to determine the flight phase the aircraft is in based on the aircraft's own state information, and to generate the augmented reality image using information from the perception module 101 based on the determined flight phase. In an example, the flight phase may be determined from pilot manual inputs, the aircraft's current flight altitude, airspeed, landing gear and flap settings, and so on, which are not described in detail herein. Further according to this embodiment, generating the augmented reality image using the information from the perception module 101 may include integrating the information from the perception module 101 to obtain the picture content of the augmented reality image to be projected, and eliminating, according to the curvature of the windshield, the distortion of the augmented reality image when it is projected onto the windshield.
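As one illustration of how a phase decision could be derived from those signals, the sketch below uses simple threshold rules. The signal names, threshold values, and phase labels are assumptions made for this example and are not values specified by the disclosure.

```python
# Illustrative-only rule set for inferring the flight phase from aircraft state.
def determine_flight_phase(on_ground: bool, airspeed_kts: float,
                           radio_alt_ft: float, gear_down: bool,
                           flaps_extended: bool, vertical_speed_fpm: float) -> str:
    if on_ground:
        # Below a typical rotation speed the aircraft is taxiing; above it,
        # it is on its takeoff or landing roll.
        return "ground_taxi" if airspeed_kts < 80 else "takeoff_or_landing_roll"
    if radio_alt_ft < 1500 and gear_down and flaps_extended:
        # Low, configured, and descending suggests approach; climbing suggests takeoff.
        return "approach" if vertical_speed_fpm < 0 else "takeoff"
    if vertical_speed_fpm > 300:
        return "climb"
    if vertical_speed_fpm < -300:
        return "descent"
    return "cruise"
```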
As shown in the specific example in fig. 2, the augmented reality image generation module 203 may include a data fusion unit and an image generation unit. The data fusion unit may integrate the data from the indication and recording system and the radar detection module, reconciling the data formats of the different information sources to obtain content that can be passed to the image generation unit for direct projection onto the windshield display. The image generation unit may receive the information from the data fusion unit together with the information from the visual sensing device and the cockpit detection module, determine the content of the picture to be projected, and eliminate, according to the curvature of the windshield, the distortion of the picture when it is projected onto the windshield.
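One common way to realize the last step is to pre-warp the rendered frame with the inverse of the distortion introduced by the curved reflecting surface. The sketch below uses a single-coefficient radial model purely for illustration; the model form and the coefficient value are assumptions, since a real system would rely on a calibrated warp map measured for the specific windshield.

```python
import numpy as np

# Minimal sketch of pre-distorting a frame so that, after reflection off a
# curved windshield, it appears undistorted to the pilot. The radial model and
# coefficient k are assumed values for illustration only.
def pre_distort(image: np.ndarray, k: float = -0.12) -> np.ndarray:
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Normalized coordinates centred on the image.
    u = (xs - w / 2) / (w / 2)
    v = (ys - h / 2) / (h / 2)
    r2 = u * u + v * v
    # Inverse radial warp: sample the source pixel that the curvature would
    # otherwise displace to this output location.
    src_x = np.clip((u * (1 + k * r2)) * (w / 2) + w / 2, 0, w - 1).astype(int)
    src_y = np.clip((v * (1 + k * r2)) * (h / 2) + h / 2, 0, h - 1).astype(int)
    return image[src_y, src_x]
```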
In an embodiment of the present disclosure, the optical projection module 105 may be arranged within the front glare shield of the aircraft cockpit and configured to project the augmented reality image from the augmented reality image generation module 103 onto the windshield of the aircraft. In an example, the optical projection module 105 may include optics and a corresponding optical path for projecting the augmented reality image.
As shown in the specific example in fig. 2, the optical projection module 205 may include optics and a corresponding optical path for projecting the augmented reality image from the augmented reality image generation module 203 onto the windshield. In this example, the optics receive the generated augmented reality image, and an image generation unit controlled by a driver board produces the optical image. The generated optical image is projected through the optical path onto the windshield of the aircraft cockpit and reflected into the pilot's eyes, forming a virtual image displayed ahead of the cockpit. According to this embodiment, the position of the projected image can be adjusted according to the crew eye-position and gaze-direction information provided by the perception module 201, so that the crew can read the displayed information at any time; in a two-pilot crew, the image projected for one side is kept out of the other pilot's field of view so that the fields of view do not interfere. In addition, the optical projection module 205 may implement an anti-shake algorithm to eliminate or mitigate jitter caused by vibrations that may be present during flight.
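The two adjustments in this paragraph, anchoring the projection to the tracked eye position and damping vibration-induced jitter, can be combined in a simple low-pass filter on the projected anchor point. The sketch below is a minimal illustration; the filter form and the smoothing constant are assumptions, not parameters given by the disclosure.

```python
# Illustrative stabilizer for the projected image anchor point.
class ProjectionStabilizer:
    def __init__(self, alpha: float = 0.15):
        self.alpha = alpha          # smoothing factor, 0 < alpha <= 1 (assumed value)
        self.filtered = None        # last filtered (x, y) position on the windshield

    def update(self, eye_gaze_point):
        """eye_gaze_point: (x, y) windshield coordinates from the cockpit camera."""
        if self.filtered is None:
            self.filtered = eye_gaze_point
        else:
            fx, fy = self.filtered
            gx, gy = eye_gaze_point
            # Exponential moving average damps high-frequency airframe vibration
            # while still following slower, deliberate head movement.
            self.filtered = (fx + self.alpha * (gx - fx),
                             fy + self.alpha * (gy - fy))
        return self.filtered
```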
In yet another embodiment of the present disclosure, where the flight crew of the aircraft includes two pilots, the augmented reality image generation module 103 may be further configured to: determine the stick pilot and the non-stick pilot based on the crew's inputs at the side stick, control column/wheel and/or pedals collected from the flight control system; generate, using the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and project the first augmented reality image onto the portion of the windshield in front of the stick pilot and the second augmented reality image onto the portion of the windshield in front of the non-stick pilot. In this embodiment, the content of the first and second augmented reality images may differ, since the stick pilot's tasks differ from those of the non-stick pilot and the corresponding display requirements differ as well.
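One simple way to infer the role split from the collected control inputs is to compare recent control activity on each side. The sketch below is illustrative only; the activity metric, the sample format, and the threshold are assumptions for this example.

```python
# Illustrative assignment of stick pilot vs. non-stick pilot from control activity.
def identify_stick_pilot(left_inputs, right_inputs, threshold: float = 0.05):
    """Each argument is a list of recent |deflection| samples in [0, 1] from one
    pilot's side stick, control column/wheel, and pedals, as reported by the
    flight control system."""
    left_activity = sum(left_inputs) / max(len(left_inputs), 1)
    right_activity = sum(right_inputs) / max(len(right_inputs), 1)
    if left_activity < threshold and right_activity < threshold:
        return None  # no clear manual activity (e.g., autopilot engaged)
    return "left" if left_activity >= right_activity else "right"
```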
To prevent the augmented reality image from affecting the pilot's field of view (and particularly that of the stick pilot), in a preferred embodiment the augmented reality image generation module 103 may be further configured to pause or dim the display of the non-stick pilot's second augmented reality image in response to the stick pilot's head turning toward the non-stick pilot.
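A minimal sketch of this cross-view protection follows: when the stick pilot's head yaw toward the other seat exceeds a threshold, the brightness scale applied to the non-stick pilot's image drops. The yaw threshold and dimming factor are assumed values used only for illustration.

```python
# Illustrative brightness control for the non-stick pilot's image.
def second_image_brightness(head_yaw_toward_other_seat_deg: float,
                            yaw_threshold_deg: float = 25.0,
                            dim_factor: float = 0.2) -> float:
    """Return the brightness scale (1.0 = normal) for the second image."""
    if head_yaw_toward_other_seat_deg < yaw_threshold_deg:
        return 1.0        # stick pilot looking ahead or away: display normally
    return dim_factor     # set to 0.0 to suspend the image entirely
```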
In order for the pilot, particularly a non-stick pilot, to conveniently review the various information required, the augmented reality image generation module 103 may also be configured to replace the content of the projected second augmented reality image in response to input instructions from the non-stick pilot. For example, the augmented reality image generation module 103 may generate a plurality of second augmented reality images and display a corresponding one of the second augmented reality images in response to a request from a non-stick pilot. Alternatively, the augmented reality image generation module 103 may be further configured to generate a new second augmented reality image in real time in response to a request from a non-stick pilot.
In yet another embodiment of the present disclosure, the flight phases of the aircraft may include ground taxiing, takeoff, climb/cruise/descent, approach, landing, and the like, and wherein in the ground taxiing phase the augmented reality image may include at least one of: the aircraft's current configuration settings, a map of the current airport, the current destination, route guidance to the current destination, tower traffic control instructions, collision warnings, an image of the direction of travel ahead of the aircraft, and surrounding objects the aircraft may collide with while moving; in the takeoff phase, the augmented reality image may include at least one of: aircraft attitude information, sideslip information, airspeed information, flight altitude information, the direction of the next waypoint, and alerts that are not suppressed during the takeoff phase; in climb/cruise/descent, the augmented reality image may include at least one of: the current flight state of the aircraft, the position of the next waypoint, weather information, and collision warnings; in the approach phase, the augmented reality image may include at least one of: the current flight state, the aircraft's current precise position, the current flight plan and the position of the next waypoint, tower traffic control instructions, the target point and path to be flown, an image of the direction of travel ahead of the aircraft, terrain information around the aircraft, the glidepath route, and the landing runway outline; and in the landing phase, the augmented reality image may include at least one of: the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the direction of travel ahead of the aircraft, and terrain information around the aircraft. The content requirements of the augmented reality image in each flight phase are described below with specific examples.
In the ground taxiing phase (before takeoff and after landing), the system 100 may: read the aircraft's current configuration settings through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; obtain current airport map information from the aircraft information database; obtain the target point and path to be followed by reading the tower's traffic control instructions; capture an image of the direction of travel ahead of the aircraft through the visible-light camera; and perceive, through the millimeter-wave radar, surrounding objects the aircraft may collide with while moving. After obtaining the above information, the system 100 may provide combinations of it to the pilot, for example by combining the aircraft's position, the airport map, the travel target, and the travel route into map-based path guidance, and by fusing the travel target, the travel route, and potential collision objects with the forward image captured by the visible-light camera so that, through augmented reality, the travel route, the destination, and potential collision objects are overlaid as highlighted markers on the corresponding objects in the pilot's field of view. When the picture captured by the visible-light camera is too dark or the illumination intensity is below a predetermined threshold, an Enhanced Vision System (EVS) picture captured by the infrared camera, with the same field of view as the visible-light camera, can be used as an alternative or supplement to the visible-light image.
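The visible-to-infrared switchover described above can be reduced to a simple brightness test on the visible frame. The sketch below is illustrative; the luminance threshold, the blend weights, and the assumption of equally sized grayscale frames are all choices made for this example rather than values from the disclosure.

```python
import numpy as np

# Illustrative selection between the visible-light frame and the EVS infrared frame.
def select_forward_image(visible: np.ndarray, infrared: np.ndarray,
                         manual_evs_on: bool = False,
                         luminance_threshold: float = 40.0) -> np.ndarray:
    """Both frames are assumed to be same-size grayscale arrays in 0..255."""
    mean_luminance = float(visible.mean())
    if manual_evs_on or mean_luminance < luminance_threshold:
        # Blend so that residual visible detail supplements the EVS picture;
        # returning `infrared` alone would be the pure "alternative" case.
        return (0.3 * visible + 0.7 * infrared).astype(visible.dtype)
    return visible
```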
To ensure that the pilot's field of view is not disturbed during the takeoff phase, the system 100 may provide the pilot (and in particular the stick pilot) with only the aircraft's current flight-related information, including, for example: aircraft attitude information; sideslip information; airspeed information; altitude information; the direction of the next waypoint; and alerts that are not suppressed during takeoff. To this end, the system 100 may read information about the current flight state of the aircraft through the flight control system, obtain the aircraft's current precise position through the satellite positioning system, read the flight plan through the flight control system, and so on. Optionally, the pilot may turn on the vision-enhancement function manually, or, when the picture captured by the visible-light camera is too dark or the illumination intensity is below a predetermined threshold, an Enhanced Vision System (EVS) picture captured by the infrared camera with the same field of view as the visible-light camera may be used as an alternative or supplement to the visible-light image, to enhance the pilot's perception of the external environment.
During the climb/cruise/descent phase, the system 100 may primarily provide the pilot with the current flight state of the aircraft, the position of the next waypoint, and other important information that may affect the flight (such as weather and collision warnings). To this end, the system 100 may: read information about the current flight state of the aircraft through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; and obtain the current flight plan and the next waypoint from the aircraft information database.
During the approach phase, the system 100 primarily provides the pilot with information such as the current flight configuration, the current waypoint position, the approach route, and the target airport location. To this end, the system 100 may: read the current flight state data through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; obtain the current flight plan and the position of the next waypoint from the aircraft information database; obtain the target point and path to be followed by reading the tower's traffic control instructions; capture an image of the direction of travel ahead of the aircraft through the camera; and perceive the terrain around the aircraft through the lidar. Thereafter, the system 100 may provide combinations of this information to the pilot, for example by combining the aircraft's position, the airport location, and the glidepath into map-based path guidance; performing image recognition on the forward image captured by the visible-light camera to visually identify the position of the landing runway; and overlaying the glidepath route, the landing runway outline, and so on as highlighted markers on the corresponding objects in the pilot's field of view through augmented reality. Optionally, the pilot may turn on the vision-enhancement function manually, or, when the picture captured by the visible-light camera is too dark or the illumination intensity is below a predetermined threshold, an Enhanced Vision System (EVS) picture captured by the infrared camera with the same field of view as the visible-light camera may be used as an alternative or supplement to the visible-light image, to enhance the pilot's perception of the external environment. Alternatively, when neither the visible-light camera nor the infrared camera can acquire a clear image of the external environment, the pilot may, by manual setting, have a picture constructed from the aircraft's satellite positioning information, the map information stored in the on-board database, and the surrounding-environment information detected by the on-board radar; this constructed picture is combined with the picture captured by the infrared camera and projected onto the windshield.
During the landing phase, the system 100 primarily provides the pilot with information such as the current flight configuration, braking distance cues, and runway exit point cues. To this end, the system 100 may: read the current landing configuration settings through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; obtain the aircraft's current deceleration trend through the brake system; obtain the runway exit point position by reading the tower's traffic control instructions; capture an image of the direction of travel ahead of the aircraft through the visible-light camera; and perceive the terrain around the aircraft through the lidar. Thereafter, the system 100 may provide combinations of this information to the pilot: performing image recognition on the forward image captured by the camera and combining it with the aircraft's deceleration trend to derive a braking distance cue; and overlaying the braking distance and the runway exit the aircraft is expected to take as highlighted markers on the corresponding objects in the pilot's field of view through augmented reality.
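A braking distance cue of this kind can be derived from the current ground speed and the deceleration reported by the brake system: under constant deceleration a, the remaining stopping distance is v^2 / (2a). The sketch below is a minimal illustration with made-up example numbers.

```python
# Illustrative braking-distance cue under a constant-deceleration assumption.
def braking_distance_m(ground_speed_mps: float, deceleration_mps2: float) -> float:
    if deceleration_mps2 <= 0:
        raise ValueError("aircraft is not decelerating")
    return ground_speed_mps ** 2 / (2.0 * deceleration_mps2)

# Example: 70 m/s (about 136 kt) at 2.5 m/s^2 leaves roughly 980 m of roll.
print(braking_distance_m(70.0, 2.5))  # 980.0
```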
Preferably, in one example with a two-pilot crew, during flight phases other than ground taxiing, the system 100 may determine the division of work within the current crew (i.e., the stick pilot and the non-stick pilot) based on the crew's inputs at the side sticks/control columns and pedals (or on pilot manual settings). Preferably, depending on this division of work, the projected display on the stick pilot's side emphasizes flight-related data more strongly than the display on the non-stick pilot's side, and the stick pilot's display has the higher display priority.
In yet another embodiment of the present disclosure, the aircraft windshield may have a multi-layer structure in which the layers are filled with wedge-shaped interlayers to mitigate the ghosting caused by multiple layers of glass. FIG. 3 shows a comparison of imaging through a windshield with a wedge-shaped interlayer and through one without. As shown at (a) in fig. 3, a windshield without a wedge-shaped interlayer causes the ghost image not to coincide with the virtual image, which is troublesome for the pilot; a windshield with a wedge-shaped interlayer, as shown at (b), makes the ghost image coincide with the virtual image, alleviating the ghosting problem associated with the windshield's multi-layer glass structure.
Furthermore, since the optical projection system is arranged at the front of the aircraft cockpit, and given the complex electromagnetic environment at the aircraft's nose during flight, the system 100 may also be equipped with corresponding electromagnetic shielding measures to prevent environmental damage to the system 100.
In an example of the present disclosure, the projection position of the center of the augmented reality image on the windshield of the aircraft is the pilot's gaze position when seated in the pilot's seat and looking straight ahead at the windshield, such that the augmented reality image overlays the corresponding real object in the pilot's field of view. For example, referring to fig. 4, a projected head-up display effect of the augmented reality windshield projection system according to an example embodiment of the present disclosure is shown.
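Placing a symbol over the corresponding real object amounts to finding where the line of sight from the pilot's eye to that object crosses the windshield. The sketch below approximates the windshield as a plane purely for illustration; the plane parameters and reference frame are assumptions, since a real system would use the measured windshield geometry.

```python
import numpy as np

# Illustrative conformal-overlay placement via a ray-plane intersection.
def overlay_point_on_windshield(eye: np.ndarray, target: np.ndarray,
                                plane_point: np.ndarray, plane_normal: np.ndarray):
    """All arguments are 3-D points/vectors in an assumed cockpit reference frame."""
    direction = target - eye
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:
        return None  # sight line is parallel to the windshield plane
    t = float(np.dot(plane_normal, plane_point - eye)) / denom
    if t <= 0:
        return None  # object is behind the pilot
    return eye + t * direction  # 3-D point on the windshield where the symbol goes
```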
Referring now to fig. 5, a flow chart of a method 500 for augmented reality windshield projection of an aircraft cockpit is shown, according to an example embodiment of the present disclosure.
The method 500 may include, at block 510, determining the flight phase the aircraft is currently in based on information obtained from the on-board indication and recording device. In an example, the flight phase may be determined from pilot manual inputs, the aircraft's current flight altitude, airspeed, landing gear and flap settings, and so on, which are not described in detail herein.
At block 520, the method 500 may include determining the aircraft state information and environmental information to be displayed based on the determined flight phase. For example, the flight phases may include ground taxiing, takeoff, climb/cruise/descent, approach, and landing, with the augmented reality image in each phase including content as described above for the system 100: in the ground taxiing phase, at least one of the aircraft's current configuration settings, a map of the current airport, the current destination, route guidance to the current destination, tower traffic control instructions, collision warnings, an image of the direction of travel ahead of the aircraft, and surrounding objects the aircraft may collide with while moving; in the takeoff phase, at least one of aircraft attitude information, sideslip information, airspeed information, altitude information, the direction of the next waypoint, and alerts that are not suppressed during takeoff; in climb/cruise/descent, at least one of the current flight state, the position of the next waypoint, weather information, and collision warnings; in the approach phase, at least one of the current flight state, the aircraft's current precise position, the current flight plan and next waypoint position, tower traffic control instructions, the target point and path to be flown, an image of the direction of travel ahead of the aircraft, terrain information around the aircraft, the glidepath route, and the landing runway outline; and in the landing phase, at least one of the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the direction of travel ahead of the aircraft, and terrain information around the aircraft.
At block 530, the method 500 may include collecting the aircraft state information and environmental information to be displayed from the on-board indication and recording system and the environmental sensors arranged on the aircraft, and at block 540, the method 500 may include generating the augmented reality image to be projected using the acquired aircraft state information and environmental information. For example, the method 500 may integrate the data from the indication and recording system and the radar detection module, reconciling the data formats of the different information sources to obtain content that can be projected directly onto the windshield display; the method 500 may also use the information from the visual sensing device and the cockpit detection module to determine the content of the picture to be projected and to eliminate, according to the curvature of the windshield, the distortion of the picture when it is projected onto the windshield.
At block 550, the method 500 may include projecting the augmented reality image onto the front windshield of the aircraft cockpit.
In yet another embodiment of the present disclosure, the flight crew of the aircraft includes two pilots, and the method 500 may further include: collecting, from the aircraft's flight control system, the crew's inputs at the side stick, control column/wheel and/or pedals; determining the stick pilot and the non-stick pilot based on the collected inputs; generating, using the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the cockpit front windshield portion in front of the stick pilot and the second augmented reality image onto the cockpit front windshield portion in front of the non-stick pilot.
Referring to fig. 6, a schematic diagram of an aircraft 600 is shown according to an example embodiment of the present disclosure. In this embodiment, aircraft 600 may include an augmented reality windshield projection system described in accordance with embodiments of the present disclosure, such as system 100 for augmented reality windshield projection of an aircraft cockpit described with respect to fig. 1.
As described above, while providing an equal or even better head-up display effect, the present disclosure frees the pilot's head space by placing the projection device in the front glare shield of the aircraft cockpit, with no need for a display screen in front of the pilot's head, thereby avoiding the risk of the pilot colliding with a display screen in turbulence and similar conditions.
The disclosed system and method also improve the flight crew's situational awareness and the aircraft's perception of the external environment throughout the flight. According to the crew's different task requirements in different flight phases, the disclosed system and method provide a cockpit head-up display with augmented reality functionality that presents augmented reality imagery relevant to the task the pilot is currently performing, enhancing the pilot's situational awareness and improving the aircraft's operational capability.
For cockpits with a two-pilot crew, the present disclosure also uses an in-cockpit camera to acquire the crew's eye positions and gaze directions, and uses crew inputs to determine each pilot's current work task so as to display separate augmented reality images to the two pilots, while preventing the augmented reality image intended for one pilot from appearing in the other pilot's field of view.
The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments that can be practiced. These embodiments are also referred to herein as "examples". Such examples may include elements in addition to those shown or described; however, examples including only the elements shown or described are also contemplated. Moreover, examples using any combination or permutation of the elements shown or described, whether with reference to a particular example (or one or more aspects thereof) or with reference to other examples (or one or more aspects thereof) shown or described herein, are also contemplated.
In the appended claims, the terms "including" and "comprising" are open-ended; that is, a system, apparatus, article, or process that includes elements in addition to those listed after such a term in a claim still falls within the scope of that claim. Furthermore, in the appended claims, the terms "first," "second," "third," etc. are used merely as labels and are not intended to impose numerical requirements on their objects.
In addition, the order of the operations illustrated in the present specification is exemplary. In alternative embodiments, the operations may be performed in a different order than shown in the figures, and the operations may be combined into a single operation or split into more operations.
The above description is intended to be illustrative, and not restrictive. For example, the examples described above (or one or more aspects thereof) may be used in combination with other embodiments. Other embodiments may be used, for example by one of ordinary skill in the art after reviewing the above description. The Abstract allows the reader to quickly ascertain the nature of the technical disclosure; it is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Furthermore, in the above detailed description, various features may be grouped together to streamline the disclosure. However, the claims may not set forth every feature disclosed herein, as embodiments may feature a subset of the features. Further, embodiments may include fewer features than are disclosed in a particular example. Thus, the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate embodiment. The scope of the embodiments disclosed herein should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (13)

1. A system for augmented reality windshield projection in an aircraft cockpit, comprising: a perception module configured to sense state information of the aircraft itself and information about the aircraft's external environment; an augmented reality image generation module configured to determine the flight phase the aircraft is in based on the aircraft's own state information, and to generate an augmented reality image based on that flight phase using the information from the perception module; and an optical projection module arranged within the front glare shield of the aircraft cockpit and configured to project the augmented reality image onto the windshield of the aircraft.
2. The system according to claim 1, wherein the perception module comprises: an on-board indication and recording device for sensing the aircraft's own state information; a visual sensing device arranged outside the cockpit, which captures images of the area ahead of the aircraft within the pilot's field of view and includes a visible-light camera and an infrared camera; and a radar detection device including a lidar and/or a millimeter-wave radar for detecting terrain information and obstacles in the aircraft's surroundings during operation, wherein the aircraft's own state information includes at least one of airspeed, flight altitude, Global Navigation Satellite System (GNSS) positioning, the configuration state of each aircraft system, and warning information.
3. The system according to claim 2, wherein generating an augmented reality image using the information from the perception module comprises: integrating the information from the perception module to obtain the picture content of the augmented reality image to be projected; and eliminating, according to the curvature of the windshield, the distortion of the augmented reality image when it is projected onto the windshield.
4. The system according to claim 2, wherein the perception module further comprises: a cockpit detection device including a camera arranged in the cockpit and an acquisition device that obtains, from the flight control system, the crew's inputs at the side stick, control column/wheel and/or pedals, the camera capturing information about the crew's head and gaze direction.
5. The system according to claim 4, wherein the flight crew of the aircraft includes two pilots, and generating an augmented reality image using the information from the perception module comprises: determining the stick pilot and the non-stick pilot based on the crew's inputs at the side stick, control column/wheel and/or pedals collected from the flight control system; generating, using the acquired aircraft state information and environmental information, a first augmented reality image and a second augmented reality image to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the windshield portion in front of the stick pilot and projecting the second augmented reality image onto the windshield portion in front of the non-stick pilot.
6. The system according to claim 5, wherein the augmented reality image generation module is further configured to suspend or dim the display of the second augmented reality image in response to the stick pilot's head turning toward the non-stick pilot.
7. The system according to claim 5, wherein the augmented reality image generation module is further configured to replace the content of the projected second augmented reality image in response to an input instruction from the non-stick pilot.
8. The system according to claim 1, wherein the flight phases include ground taxiing, takeoff, climb/cruise/descent, approach, and landing, and wherein: in the ground taxiing phase, the augmented reality image includes at least one of: the aircraft's current configuration settings, a map of the current airport, the current destination, route guidance to the current destination, tower traffic control instructions, collision warnings, an image of the direction of travel ahead of the aircraft, and surrounding objects the aircraft may collide with while moving; in the takeoff phase, the augmented reality image includes at least one of: aircraft attitude information, sideslip information, airspeed information, flight altitude information, the direction of the next waypoint, and warnings that are not suppressed during the takeoff phase; in climb/cruise/descent, the augmented reality image includes at least one of: the current flight state of the aircraft, the position of the next waypoint, weather information, and collision warnings; in the approach phase, the augmented reality image includes at least one of: the current flight state, the aircraft's current precise position, the current flight plan and next waypoint position, tower traffic control instructions, the target point and path to be flown, an image of the direction of travel ahead of the aircraft, terrain information around the aircraft, the glidepath route, and the landing runway outline; and in the landing phase, the augmented reality image includes at least one of: the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the direction of travel ahead of the aircraft, and terrain information around the aircraft.
9. The system according to claim 1, wherein the windshield is a multi-layer structure and the layers of the windshield are filled with wedge-shaped interlayers.
10. The system according to claim 1, wherein the projection position of the center of the augmented reality image on the windshield of the aircraft is the pilot's gaze position when seated in the pilot's seat and looking straight ahead at the windshield, such that the augmented reality image overlays the corresponding real object in the pilot's field of view.
11. A method for augmented reality windshield projection in an aircraft cockpit, comprising: determining the current flight phase of the aircraft from information obtained from the on-board indication and recording device; determining, according to the determined flight phase, the aircraft state information and environmental information to be displayed; collecting the aircraft state information and environmental information to be displayed from the on-board indication and recording system and the environmental sensors arranged on the aircraft; generating an augmented reality image to be projected using the acquired aircraft state information and environmental information; and projecting the augmented reality image onto the front windshield of the aircraft cockpit.
12. The method according to claim 11, wherein the flight crew of the aircraft includes two pilots, the method further comprising: collecting, from the flight control system of the aircraft, the crew's inputs at the side stick, control column/wheel and/or pedals; determining the stick pilot and the non-stick pilot based on the collected inputs; generating, using the acquired aircraft state information and environmental information, a first augmented reality image and a second augmented reality image to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the cockpit front windshield portion in front of the stick pilot and projecting the second augmented reality image onto the cockpit front windshield portion in front of the non-stick pilot.
13. An aircraft comprising the system according to any one of claims 1-10.
CN202411259305.4A 2024-09-09 2024-09-09 System and method for augmented reality windshield projection in aircraft cockpits Pending CN118977859A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411259305.4A CN118977859A (en) 2024-09-09 2024-09-09 System and method for augmented reality windshield projection in aircraft cockpits

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202411259305.4A CN118977859A (en) 2024-09-09 2024-09-09 System and method for augmented reality windshield projection in aircraft cockpits

Publications (1)

Publication Number Publication Date
CN118977859A true CN118977859A (en) 2024-11-19

Family

ID=93455006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411259305.4A Pending CN118977859A (en) 2024-09-09 2024-09-09 System and method for augmented reality windshield projection in aircraft cockpits

Country Status (1)

Country Link
CN (1) CN118977859A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination