CN118977859A - System and method for augmented reality windshield projection in aircraft cockpits
- Publication number
- CN118977859A (application CN202411259305.4A)
- Authority
- CN
- China
- Prior art keywords
- aircraft
- augmented reality
- information
- reality image
- pilot
- Prior art date
- 2024-09-09
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D43/00—Arrangements or adaptations of instruments
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Traffic Control Systems (AREA)
Abstract
The present disclosure relates to a system and method for augmented reality windshield projection in an aircraft cockpit. The system comprises: a perception module configured to sense state information of the aircraft itself and information about the aircraft's external environment; an augmented reality image generation module configured to determine the flight phase the aircraft is in from the aircraft's own state information, and to generate an augmented reality image for that flight phase using information from the perception module; and an optical projection module disposed within the front hood of the aircraft cockpit and configured to project the augmented reality image onto the windshield of the aircraft.
Description
Technical Field
The present disclosure relates to human-machine interactive display systems and methods for aircraft cockpits, and more particularly to systems and methods for augmented reality windshield projection in an aircraft cockpit.
Background
The cockpit of an aircraft (such as a large civil aircraft) is the primary workplace in which pilots perform their tasks: a large amount of information is concentrated in a limited space, and the interaction scenarios are complex. During critical phases such as takeoff and approach, particularly in civil aircraft, operating procedures are complex, reaction times are short, and pilots must attend to a great deal of information. Because the aircraft has many instruments distributed over a wide area, it is difficult for the crew to monitor the external view and the instrument data at the same time.
However, existing aircraft mount the cockpit head-up display in the space above the crew's heads. Such equipment occupies that head space, offers only a single mode of information display, renders in a single color that lacks vividness, and can injure personnel in turbulent conditions; it therefore needs improvement. Moreover, existing equipment obtains information only from the aircraft's internal indication and recording system, so its means of sensing the external environment are limited.
The present disclosure provides improvements with respect to, but not limited to, the factors described above.
Disclosure of Invention
To this end, the present disclosure provides an augmented reality windshield projection system and method for an aircraft cockpit that senses the external environment in real time during the various phases of flight (e.g., ground taxiing and in-flight phases) and projects the critical information the pilot needs onto the cockpit windshield, so that the pilot can view it conveniently while looking at the outside scene.
According to a first aspect of the present disclosure, there is provided a system for augmented reality windshield projection in an aircraft cockpit, comprising: a perception module configured to sense state information of the aircraft itself and information about the aircraft's external environment; an augmented reality image generation module configured to determine the flight phase the aircraft is in from the aircraft's own state information, and to generate an augmented reality image for that flight phase using information from the perception module; and an optical projection module disposed within the front hood of the aircraft cockpit and configured to project the augmented reality image onto the windshield of the aircraft.
According to an embodiment, the perception module comprises: an on-board indication and recording device for sensing state information of the aircraft itself; a visual sensing device, arranged outside the cockpit, that acquires the image ahead of the aircraft within the pilot's field of view and comprises a visible-light camera and an infrared camera; and a radar detection device comprising a lidar and/or a millimeter-wave radar for detecting terrain information and obstacles in the aircraft's surroundings while the aircraft is moving. The state information of the aircraft itself includes at least one of airspeed, altitude, Global Navigation Satellite System (GNSS) position, the configuration state of each aircraft system, and aircraft alert information.
According to another embodiment, generating an augmented reality image using information from the perception module comprises: integrating the information from the perception module to obtain the picture content of the augmented reality image to be projected; and removing, according to the curvature of the windshield, the distortion the augmented reality image would exhibit when projected onto the windshield.
According to a further embodiment, the perception module further comprises a cockpit detection device, including cameras arranged in the cockpit and an acquisition device that collects the crew's inputs at the side sticks, control column, and/or pedals from the flight control system; the cameras acquire information about the crew's head position and gaze direction.
According to yet another embodiment, the flight crew of the aircraft includes two pilots, and generating the augmented reality image using information from the perception module comprises: determining the stick pilot (pilot flying) and the non-stick pilot from the crew's inputs at the side sticks, control column, and/or pedals collected from the flight control system; generating, from the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the portion of the windshield in front of the stick pilot and the second augmented reality image onto the portion of the windshield in front of the non-stick pilot.
According to a further embodiment, the augmented reality image generation module is further configured to suspend or dim the display of the second augmented reality image in response to the stick pilot's head turning toward the non-stick pilot's side.
According to a further embodiment, the augmented reality image generation module is further configured to replace the content of the projected second augmented reality image in response to an input instruction from the non-stick pilot.
According to a further embodiment, the flight phases comprise ground taxiing, takeoff, climb/cruise/descent, approach, and landing, wherein: in the ground taxi phase, the augmented reality image includes at least one of: the aircraft's current configuration settings, an airport map with the aircraft's position, the current destination, a guidance route to the current destination, tower traffic control instructions, collision warnings, an image of the area ahead of the aircraft in its direction of travel, and surrounding objects the aircraft might strike while moving; in the takeoff phase, the augmented reality image includes at least one of: aircraft attitude information, sideslip information, airspeed information, altitude information, the direction of the next waypoint, and alerts that are not suppressed during the takeoff phase; in climb/cruise/descent, the augmented reality image includes at least one of: the aircraft's current flight state, position information for the next waypoint, weather information, and collision warnings; in the approach phase, the augmented reality image includes at least one of: the current flight state, the aircraft's current precise position, the current flight plan and next waypoint position, tower traffic control instructions, the target point and path to be followed, an image of the area ahead of the aircraft, terrain information for the aircraft's surroundings, the glidepath route, and the landing runway outline; and in the landing phase, the augmented reality image includes at least one of: the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the area ahead of the aircraft, and terrain information for the aircraft's surroundings.
According to yet another embodiment, the windshield has a multi-layer structure, and the gaps between the layers of the windshield are filled with wedge-shaped interlayers.
According to a further embodiment, the projection position of the center of the augmented reality image on the aircraft's windshield is the point at which the pilot's gaze meets the windshield when the pilot is seated in the cockpit looking straight ahead, such that the augmented reality image is overlaid on the corresponding real object in the pilot's field of view.
According to a second aspect of the present disclosure, there is provided a method for augmented reality windshield projection in an aircraft cockpit, comprising: determining the flight phase the aircraft is currently in from information obtained from the on-board indication and recording device; determining, according to the determined flight phase, the aircraft state information and environmental information to be displayed; acquiring the aircraft state information and environmental information to be displayed from the on-board indication and recording system and from the environmental sensors arranged on the aircraft; generating the augmented reality image to be projected using the acquired aircraft state information and environmental information; and projecting the augmented reality image onto the cockpit front windshield of the aircraft.
According to an embodiment, the flight crew of the aircraft comprises two pilots, and the method further comprises: collecting the crew's inputs at the side sticks, control column, and/or pedals from the aircraft's flight control system; determining the stick pilot and the non-stick pilot from the collected inputs; generating, from the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the portion of the cockpit front windshield in front of the stick pilot and the second augmented reality image onto the portion in front of the non-stick pilot.
According to a third aspect of the present disclosure there is provided an aircraft comprising a system for augmented reality windshield projection of an aircraft cockpit according to the first aspect of the present disclosure.
Aspects generally include a method, apparatus, system, computer program product, and processing system substantially as described herein with reference to and as illustrated by the accompanying drawings.
The foregoing has outlined rather broadly the features and technical advantages of examples in accordance with the present disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described hereinafter. The disclosed concepts and specific examples may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the scope of the appended claims. The features of the concepts disclosed herein, both as to their organization and method of operation, together with associated advantages, will be better understood from the following description when considered in connection with the accompanying drawings. Each of the figures is provided for the purpose of illustration and description and is not intended to limit the claims.
Drawings
So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to aspects, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only certain typical aspects of this disclosure and are therefore not to be considered limiting of its scope, for the description may admit to other equally effective aspects. The same reference numbers in different drawings may identify the same or similar elements.
FIG. 1 illustrates a schematic diagram of a system for augmented reality windshield projection of an aircraft cockpit according to an example embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of a system for augmented reality windshield projection of an aircraft cockpit according to a specific example of the present disclosure;
FIG. 3 shows a comparative schematic of imaging of a windshield with a wedge-shaped interlayer and a windshield without a wedge-shaped interlayer;
FIG. 4 illustrates a projected head-up display effect diagram of a system of augmented reality windshield projections according to an example embodiment of the present disclosure;
FIG. 5 illustrates a flowchart of a method for augmented reality windshield projection of an aircraft cockpit according to an example embodiment of the present disclosure; and
Fig. 6 is a schematic diagram illustrating an example aircraft in accordance with aspects of the present disclosure.
Detailed Description
The inventors have recognized that existing aircraft mount the cockpit head-up display in the space above the crew's heads. Such equipment occupies that head space, offers only a single mode of information display, renders in a single color that lacks vividness, and can injure personnel in turbulent conditions; it therefore needs improvement. Moreover, existing equipment obtains information only from the aircraft's internal indication and recording system, so its means of sensing the external environment are limited.
To this end, the present disclosure provides an augmented reality windshield projection method and system for an aircraft cockpit. The system determines the display information that should be provided to the pilot according to the aircraft's current flight phase and the pilot's tasks; it acquires data about the aircraft's surroundings through sensors such as cameras and lidar, and acquires the aircraft's own state data through the flight control system. After processing these data with a data fusion method, it generates an augmented reality image and projects it onto the front windshield of the aircraft cockpit, providing the pilot with the required information, enhancing the flight crew's situational awareness, and improving flight safety.
The methods and systems of the present disclosure provide a head-up display interface to crew members by disposing a projection device within the front hood of the cockpit and using the windshield as the display screen. In particular, when providing the display interface, the methods and systems of the present disclosure can identify crew members' eye positions and gaze directions through an in-cockpit camera and, combined with crew input information (or pilot input settings) obtained from the flight control system, determine the crew's current division of tasks, so as to display different augmented reality images to different pilots without the images interfering with each other.
The detailed description set forth below in connection with the appended drawings is intended as a description of various configurations and is not intended to represent the only configurations in which the concepts described herein may be practiced. The detailed description includes specific details for the purpose of providing a thorough understanding of the various concepts. It will be apparent, however, to one skilled in the art that these concepts may be practiced without these specific details.
Referring to fig. 1, a schematic diagram of a system 100 for augmented reality windshield projection of an aircraft cockpit according to an example embodiment of the present disclosure is shown.
As shown in fig. 1, a system 100 for augmented reality windshield projection of an aircraft cockpit may include a perception module 101, an augmented reality image generation module 103, and an optical projection module 105.
In an embodiment of the present disclosure, the perception module 101 may be configured to sense state information of the aircraft itself as well as information about the aircraft's external environment. In an example, the perception module 101 may comprise: an on-board indication and recording device for sensing state information of the aircraft itself; a visual sensing device, arranged outside the cockpit, that acquires the image ahead of the aircraft within the pilot's field of view and comprises a visible-light camera and an infrared camera; and a radar detection device comprising a lidar and/or a millimeter-wave radar for detecting terrain information and obstacles in the aircraft's surroundings while the aircraft is moving (particularly during ground taxi, takeoff, and approach/landing). In yet another example, the perception module 101 may further comprise a cockpit detection device, which may comprise a camera arranged in the cockpit and an acquisition device that collects the crew's inputs at the side sticks, control column, and/or pedals from the flight control system, wherein the camera acquires information about the crew's head position and gaze direction. According to this example, the state information of the aircraft itself may include at least one of airspeed, altitude, Global Navigation Satellite System (GNSS) position, the configuration state of each aircraft system, alert information, or any other suitable information. Further according to this example, the on-board indication and recording device may be part of the aircraft's original equipment, responsible for providing the optical projection system with the aircraft's own state information; the visible-light camera captures the scene ahead of the aircraft within the pilot's field of view for subsequent image recognition; and the infrared camera can capture images even in a dark external environment, providing pictures for the enhanced vision system and for image recognition.
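Purely as an illustration of the kinds of data the perception module 101 gathers, the following sketch shows one possible shape for the fused records; every class and field name here is an assumption for exposition, not part of the disclosure.

```python
# Illustrative sketch only: a possible shape for the perception records
# described above. All names, fields, and units are assumptions.
from dataclasses import dataclass, field

@dataclass
class AircraftState:
    airspeed_kt: float          # from the air data system
    altitude_ft: float          # barometric or radio altitude
    gnss_position: tuple        # (lat, lon) from GNSS
    gear_down: bool             # landing-gear configuration
    flap_setting: int           # flap/slat configuration detent
    alerts: list = field(default_factory=list)  # active, unsuppressed alerts

@dataclass
class EnvironmentState:
    visual_frame: object        # latest visible-light camera image
    infrared_frame: object      # latest IR camera image (for EVS)
    lidar_terrain: list         # terrain/obstacle returns from the lidar
    radar_obstacles: list       # returns from the millimeter-wave radar
```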
Referring to fig. 2, a schematic diagram of a system 200 for augmented reality windshield projection in an aircraft cockpit according to a specific example of the present disclosure is shown. As can be seen in fig. 2, the system 200 may include a perception module 201, an augmented reality image generation module 203, and an optical projection module 205. As shown in fig. 2, the perception module 201 includes an on-board indication and recording system (such as an air data and inertial reference system providing airspeed and attitude information, a GNSS providing position information, and any other suitable means providing other flight information), a radar detection module (such as a lidar providing information about objects in the environment and an on-board millimeter-wave radar), a visual sensing device (such as a visible-light camera providing a forward video image and an infrared camera providing vision enhancement information), and a cockpit detection module (such as a cockpit camera providing crew eye-position information and a crew flight-control acquisition module providing information about the division of flight tasks among the crew).
In an embodiment of the present disclosure, the augmented reality image generation module 103 may be configured to determine the flight phase the aircraft is in from the aircraft's own state information, and to generate the augmented reality image using information from the perception module 101 based on the determined flight phase. In an example, the flight phase may be determined from pilot manual inputs, the aircraft's current altitude, airspeed, landing gear and flap settings, and so on, which are not described in detail herein; a minimal sketch is given below. Further according to this embodiment, generating the augmented reality image using the information from the perception module 101 may include integrating the information from the perception module 101 to obtain the picture content of the augmented reality image to be projected, and removing, according to the curvature of the windshield, the distortion the image would exhibit when projected onto the windshield.
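As a minimal sketch of the phase-determination logic just described (manual override, altitude, airspeed, and gear/flap settings), consider the following; all thresholds, the function name, and the phase labels are illustrative assumptions, reusing the illustrative AircraftState fields from the sketch above.

```python
# A minimal sketch of flight-phase determination. Thresholds are
# illustrative assumptions, not values from the disclosure.
def determine_flight_phase(state, manual_override=None):
    if manual_override:
        return manual_override  # the pilot's manual setting takes precedence
    on_ground = state.altitude_ft < 50 and state.gear_down
    if on_ground:
        # slow on the ground: taxi; fast on the ground: takeoff roll
        return "ground_taxi" if state.airspeed_kt < 80 else "takeoff"
    if state.altitude_ft < 3000 and state.gear_down and state.flap_setting > 0:
        # could also be landing; a real system would disambiguate, e.g.
        # by descent rate and weight-on-wheels history
        return "approach"
    return "climb_cruise_descent"
```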
As shown in the specific example in fig. 2, the augmented reality image generation module 203 may include a data fusion unit and an image generation unit. The data fusion unit may integrate the data from the indication and recording system and the radar detection module, reconciling the data formats of the different information sources into content the image generation unit can use directly for projection onto the windshield. The image generation unit may receive the information from the data fusion unit together with information from the visual sensing device and the cockpit detection module, determine the content of the picture to be projected, and remove, according to the curvature of the windshield, the distortion the picture would exhibit when projected onto the windshield.
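One way to realize the distortion-removal step is to pre-warp the generated frame with the inverse of the windshield's optical distortion. The sketch below uses a simple radial model as a stand-in for the true windshield geometry; the coefficient k, the map-building function, and the use of OpenCV's remap are assumptions, and a real system would calibrate the map against the actual glass.

```python
# Sketch of curvature-based pre-distortion: warp the generated frame with a
# precomputed inverse map so it appears undistorted after reflection off the
# curved windshield. The quadratic radial model is a placeholder assumption.
import numpy as np
import cv2

def build_inverse_map(h, w, k=0.08):
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # normalized coordinates centred on the image
    xn, yn = (xs - w / 2) / (w / 2), (ys - h / 2) / (h / 2)
    r2 = xn ** 2 + yn ** 2
    scale = 1 + k * r2            # simple radial model of the windshield warp
    map_x = (xn * scale) * (w / 2) + w / 2
    map_y = (yn * scale) * (h / 2) + h / 2
    return map_x, map_y

def predistort(frame, map_x, map_y):
    # sample the frame through the inverse map to cancel the glass distortion
    return cv2.remap(frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```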
In an embodiment of the present disclosure, the optical projection module 105 may be disposed within a front hood of the aircraft cockpit and configured to project the augmented reality image from the augmented reality image generation module 103 to a windshield of the aircraft. In an example, the optical projection module 105 may include optics and corresponding optical paths for projection of the augmented reality image.
As shown in the specific example in fig. 2, the optical projection module 205 may include optics and corresponding optical paths for projecting the augmented reality image from the augmented reality image generation module 203 onto the windshield. In this example, the optical device receives the generated augmented reality image, and an image generation element controlled by a driver board produces the optical image. The generated optical image is projected along the optical path onto the windshield of the aircraft cockpit and reflected into the pilot's eyes, forming a virtual image that appears in front of the cockpit. According to this embodiment, the position of the projected image can be adjusted according to the crew eye-position and gaze-direction information provided by the perception module 201, so that the crew can read the displayed information at any time; in a two-pilot crew, the image projected for one side is kept out of the other pilot's field of view so that it does not disturb that pilot. In addition, the optical projection module 205 may implement an anti-shake algorithm to cancel or mitigate the image shake caused by vibrations that may be present during flight.
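The disclosure does not specify the anti-shake algorithm; one plausible scheme, sketched below under stated assumptions, low-pass filters the measured displacement and treats the high-frequency residue as jitter to cancel. The filter constant and class name are assumptions.

```python
# A minimal anti-shake sketch: exponential smoothing separates slow,
# intentional motion from high-frequency jitter. The caller cancels the
# jitter by shifting the projected image by the negative of the return value.
class ShakeCompensator:
    def __init__(self, alpha=0.2):
        self.alpha = alpha          # smoothing constant (assumption)
        self.offset = (0.0, 0.0)    # low-pass estimate of displacement

    def update(self, dx, dy):
        ox, oy = self.offset
        ox = (1 - self.alpha) * ox + self.alpha * dx
        oy = (1 - self.alpha) * oy + self.alpha * dy
        self.offset = (ox, oy)
        # residual high-frequency component = jitter to be cancelled
        return dx - ox, dy - oy
```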
In yet another embodiment of the present disclosure, where the flight crew of the aircraft includes two pilots, the augmented reality image generation module 103 may be further configured to: determine the stick pilot and the non-stick pilot from the crew's inputs at the side sticks, control column, and/or pedals collected from the flight control system; generate, from the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and project the first augmented reality image onto the portion of the windshield in front of the stick pilot and the second augmented reality image onto the portion of the windshield in front of the non-stick pilot. In this embodiment, because the stick pilot's tasks differ from the non-stick pilot's, their display content requirements may also differ, so the content of the first and second augmented reality images may be different.
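A sketch of one way to decide the stick pilot from control activity sampled off the flight control system follows; the activity metric, sampling window, and threshold are assumptions.

```python
# Sketch of pilot-flying detection from recent control deflections.
def determine_stick_pilot(left_inputs, right_inputs, threshold=0.05):
    """Each argument: recent side-stick/column/pedal deflections for one
    seat, normalized to [-1, 1]. Window length and threshold are assumptions."""
    left_activity = sum(abs(x) for x in left_inputs)
    right_activity = sum(abs(x) for x in right_inputs)
    if max(left_activity, right_activity) < threshold:
        return None  # no clear pilot flying; fall back to the manual setting
    return "left" if left_activity >= right_activity else "right"
```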
To prevent the augmented reality image from intruding on a pilot's field of view (particularly the stick pilot's), in a preferred embodiment the augmented reality image generation module 103 may be further configured to pause or dim the display of the non-stick pilot's second augmented reality image in response to the stick pilot's head turning toward the non-stick pilot's side.
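This head-turn rule might be realized as a simple threshold on head yaw measured by the cockpit camera, as in the sketch below; the threshold angle, the sign convention, and the dimming factor are all assumptions.

```python
# Sketch of the head-turn rule: dim (or suspend) the non-stick pilot's image
# when the stick pilot looks toward that side of the cockpit.
def second_image_brightness(head_yaw_deg, yaw_threshold_deg=25.0,
                            dim_factor=0.2):
    """head_yaw_deg > 0 means the stick pilot's head is turned toward the
    non-stick pilot's side (sign convention is an assumption)."""
    if head_yaw_deg > yaw_threshold_deg:
        return dim_factor   # dim; return 0.0 instead to suspend entirely
    return 1.0              # normal brightness
```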
So that the pilots, and particularly the non-stick pilot, can conveniently review the various information they need, the augmented reality image generation module 103 may also be configured to replace the content of the projected second augmented reality image in response to input instructions from the non-stick pilot. For example, the augmented reality image generation module 103 may generate a plurality of second augmented reality images and display the corresponding one in response to a request from the non-stick pilot. Alternatively, the augmented reality image generation module 103 may be further configured to generate a new second augmented reality image in real time in response to such a request.
In yet another embodiment of the present disclosure, the flight phases of the aircraft may include ground taxiing, takeoff, climb/cruise/descent, approach, landing, and the like, wherein: in the ground taxi phase, the augmented reality image may include at least one of: the aircraft's current configuration settings, an airport map with the aircraft's position, the current destination, a guidance route to the current destination, tower traffic control instructions, collision warnings, an image of the area ahead of the aircraft in its direction of travel, and surrounding objects the aircraft might strike while moving; in the takeoff phase, the augmented reality image may include at least one of: aircraft attitude information, sideslip information, airspeed information, altitude information, the direction of the next waypoint, and alerts that are not suppressed during the takeoff phase; in climb/cruise/descent, the augmented reality image may include at least one of: the aircraft's current flight state, position information for the next waypoint, weather information, and collision warnings; in the approach phase, the augmented reality image may include at least one of: the current flight state, the aircraft's current precise position, the current flight plan and next waypoint position, tower traffic control instructions, the target point and path to be followed, an image of the area ahead of the aircraft, terrain information for the aircraft's surroundings, the glidepath route, and the landing runway outline; and in the landing phase, the augmented reality image may include at least one of: the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the area ahead of the aircraft, and terrain information for the aircraft's surroundings. A sketch of this phase-to-content mapping is given below, and the content requirements in each flight phase are then described with specific examples.
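The enumeration above can be summarized as a lookup table selecting which overlay layers the image generator composes; the key names in this sketch are illustrative, not identifiers from the disclosure.

```python
# Sketch of the phase-to-content mapping; each entry lists the overlay
# layers to compose for that phase. All key names are assumptions.
PHASE_CONTENT = {
    "ground_taxi": ["configuration", "airport_map", "destination",
                    "taxi_route", "atc_instructions", "collision_warning",
                    "forward_image"],
    "takeoff": ["attitude", "sideslip", "airspeed", "altitude",
                "next_waypoint", "unsuppressed_alerts"],
    "climb_cruise_descent": ["flight_state", "next_waypoint",
                             "weather", "collision_warning"],
    "approach": ["flight_state", "position", "flight_plan",
                 "atc_instructions", "forward_image", "terrain",
                 "glidepath", "runway_outline"],
    "landing": ["landing_configuration", "position", "deceleration_trend",
                "atc_instructions", "runway_exit", "forward_image",
                "terrain"],
}
```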
In the ground taxi phase (before takeoff or after landing), the system 100 may: read the current aircraft configuration settings through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; obtain current airport map information from the aircraft information database; obtain the target point and path to be followed by reading the tower's traffic control instructions; acquire an image of the area ahead of the aircraft through the visible-light camera; and perceive, through the millimeter-wave radar, surrounding objects the aircraft might strike while moving. After obtaining this information, the system 100 may provide combinations of it to the pilot: for example, combining the aircraft position, the airport map, the travel target, and the travel route into map path guidance; and fusing the travel target, the travel route, and potentially colliding objects with the forward image captured by the visible-light camera, overlaying them as highlighted markers on the corresponding objects in the pilot's field of view by means of augmented reality. When the picture captured by the visible-light camera is too dark, or the illumination intensity is below a preset threshold, an enhanced vision system (EVS) picture captured by the infrared camera with the same field of view as the visible-light camera may be used as an alternative or supplement to the visible-light image; a sketch of this fallback follows.
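The visible/EVS fallback rule can be sketched as a comparison of the visible frame's mean luminance against a threshold, substituting or blending in the co-boresighted infrared frame when the scene is too dark; the threshold value and blend weights are assumptions.

```python
# Sketch of the EVS fallback: swap or blend in the infrared picture when the
# visible-light picture is too dark. Threshold and weights are assumptions.
import numpy as np

def select_forward_image(visual_frame, ir_frame, luminance_threshold=40):
    mean_luminance = float(np.mean(visual_frame))  # assumes 8-bit grayscale
    if mean_luminance < luminance_threshold:
        # too dark: blend the co-boresighted IR picture in as a supplement
        blended = 0.3 * visual_frame + 0.7 * ir_frame
        return blended.astype(visual_frame.dtype)
    return visual_frame
```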
To ensure that the pilot's field of view is not disturbed during the takeoff phase, the system 100 may provide the pilot (and in particular the stick pilot) with only the aircraft's current flight information, including, for example: aircraft attitude information; sideslip information; airspeed information; altitude information; the direction of the next waypoint; and alerts that are not suppressed during the takeoff phase. To this end, the system 100 may read information about the aircraft's current flight state through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; read the aircraft's flight plan through the flight control system; and so on. Optionally, the pilot may turn on the vision enhancement function manually, or, when the picture captured by the visible-light camera is too dark or the illumination intensity is below a predetermined threshold, the enhanced vision system (EVS) picture captured by the infrared camera with the same field of view as the visible-light camera may be used as an alternative or supplement to the visible-light image, enhancing the pilot's perception of the external environment.
During the climb/cruise/descent phases, the system 100 may primarily provide the pilot with the aircraft's current flight state, position information for the next waypoint, and other important information that may affect the flight (such as weather and collision warnings). To this end, the system 100 may: read information about the aircraft's current flight state through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; and obtain the current flight plan and the next waypoint from the aircraft information database.
During the approach phase, the system 100 primarily provides the pilot with information such as the current flight configuration, the current waypoint position, the approach route, and the target airport location. To this end, the system 100 may: read the current flight state data through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; obtain the current flight plan and the next waypoint position from the aircraft information database; obtain the target point and path to be followed by reading the tower's traffic control instructions; acquire an image of the area ahead of the aircraft through the camera; and perceive terrain information around the aircraft through the lidar. The system 100 may then provide combinations of this information to the pilot, such as combining the aircraft position, the airport location, and the glidepath into map path guidance; performing image recognition on the forward image captured by the visible-light camera to visually identify the position of the landing runway; and overlaying the glidepath route, the landing runway outline, and the like as highlights on the corresponding objects in the pilot's field of view by means of augmented reality. Optionally, the pilot may turn on the vision enhancement function manually, or, when the picture captured by the visible-light camera is too dark or the illumination intensity is below a predetermined threshold, the EVS picture captured by the infrared camera with the same field of view may be used as an alternative or supplement to the visible-light image, enhancing the pilot's perception of the external environment. Alternatively, when neither the visible-light camera nor the infrared camera can acquire a clear image of the external environment, the pilot may manually select a mode in which a picture is constructed from the aircraft's satellite position, map information stored in the on-board database, and surrounding-environment information detected by the on-board radar; this synthetic picture is composited with the infrared camera's picture and projected onto the windshield.
During the landing phase, the system 100 primarily provides the pilot with information such as the current flight configuration, braking distance cues, and runway exit position cues. To this end, the system 100 may: read the current landing configuration settings through the flight control system; obtain the aircraft's current precise position through the satellite positioning system; obtain the aircraft's current deceleration trend through the brake system; obtain the runway exit point position by reading the tower's traffic control instructions; acquire an image of the area ahead of the aircraft through the visible-light camera; and perceive terrain information around the aircraft through the lidar. The system 100 may then provide combinations of this information to the pilot: performing image recognition on the forward image captured by the camera and, together with the aircraft's deceleration trend, deriving a braking distance cue; and overlaying the braking distance and the runway exit the aircraft is expected to take, as highlights, on the corresponding objects in the pilot's field of view by means of augmented reality. A sketch of the braking-distance estimate follows.
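The braking-distance cue can be grounded in elementary kinematics: at ground speed v and steady deceleration a, the remaining stopping distance is about v²/(2a). The sketch below adds an illustrative safety margin; the margin, units, and function name are assumptions.

```python
# Worked sketch of the braking-distance cue from the deceleration trend.
def braking_distance_m(ground_speed_mps, deceleration_mps2, margin=1.15):
    if deceleration_mps2 <= 0:
        return float("inf")  # not decelerating; no meaningful estimate
    return margin * ground_speed_mps ** 2 / (2 * deceleration_mps2)

# e.g. 70 m/s at a steady 2.5 m/s^2 -> 980 m, about 1127 m with the
# illustrative 15% margin applied
```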
Preferably, in one example with a two-pilot crew, during flight phases other than ground taxi the system 100 may determine the current division of work (i.e., the stick pilot and the non-stick pilot) from the crew's inputs at the side sticks/control columns and pedals (or from the pilots' manual settings). Preferably, in accordance with this division of labor, the projected display on the stick pilot's side emphasizes flight-related data more than the display on the non-stick pilot's side, and the stick pilot's display has the higher display priority.
In yet another embodiment of the present disclosure, the aircraft windshield may have a multi-layer structure, with the gaps between the layers filled by wedge-shaped interlayers to mitigate the ghosting produced by multiple layers of glass. FIG. 3 shows a comparison of imaging through a windshield with a wedge-shaped interlayer and one without. As shown at (a) in fig. 3, without a wedge-shaped interlayer the ghost image does not coincide with the virtual image, which is distracting for the pilot; with a wedge-shaped interlayer, as shown at (b), the ghost image coincides with the virtual image, alleviating the ghosting caused by the windshield's multi-layer glass structure.
Furthermore, since the optical projection system is arranged at the front of the aircraft cockpit, and given the complex electromagnetic environment at the aircraft's nose during flight, the system 100 may also be fitted with corresponding electromagnetic shielding measures to protect it from environmental interference.
In an example of the present disclosure, the projection position of the center of the augmented reality image on the aircraft's windshield is the point at which the pilot's gaze meets the windshield when seated in the cockpit looking straight ahead, such that the augmented reality image is overlaid on the corresponding real object in the pilot's field of view; a geometric sketch follows. For example, referring to fig. 4, a projected head-up display effect diagram of the augmented reality windshield projection system according to an example embodiment of the present disclosure is shown.
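Conformal placement can be viewed as a ray-plane intersection: the symbol centre sits where the line from the pilot's eye to the real-world target crosses the windshield. The sketch below treats the relevant windshield patch as locally planar, which is an assumption for exposition; a real system would use the calibrated glass surface.

```python
# Sketch of conformal symbol placement via ray-plane intersection.
import numpy as np

def project_to_windshield(eye, target, plane_point, plane_normal):
    """All arguments are 3-D numpy arrays in the cockpit reference frame.
    Returns the intersection point on the windshield patch, or None."""
    d = target - eye                              # sight-line direction
    denom = float(np.dot(plane_normal, d))
    if abs(denom) < 1e-9:
        return None  # sight line parallel to the windshield patch
    t = float(np.dot(plane_normal, plane_point - eye)) / denom
    return eye + t * d  # intersection = symbol centre on the glass
```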
Referring now to fig. 5, a flow chart of a method 500 for augmented reality windshield projection of an aircraft cockpit is shown, according to an example embodiment of the present disclosure.
The method 500 may include, at block 510, determining the flight phase the aircraft is currently in from information obtained from the on-board indication and recording device. In an example, the flight phase may be determined from pilot manual inputs, the aircraft's current altitude, airspeed, landing gear and flap settings, and so on, which are not described in detail herein.
At block 520, the method 500 may include determining the aircraft state information and environmental information to be displayed according to the determined flight phase. For example, the flight phases may include ground taxiing, takeoff, climb/cruise/descent, approach, landing, and the like, wherein: in the ground taxi phase, the augmented reality image may include at least one of: the aircraft's current configuration settings, an airport map with the aircraft's position, the current destination, a guidance route to the current destination, tower traffic control instructions, collision warnings, an image of the area ahead of the aircraft in its direction of travel, and surrounding objects the aircraft might strike while moving; in the takeoff phase, the augmented reality image may include at least one of: aircraft attitude information, sideslip information, airspeed information, altitude information, the direction of the next waypoint, and alerts that are not suppressed during the takeoff phase; in climb/cruise/descent, the augmented reality image may include at least one of: the aircraft's current flight state, position information for the next waypoint, weather information, and collision warnings; in the approach phase, the augmented reality image may include at least one of: the current flight state, the aircraft's current precise position, the current flight plan and next waypoint position, tower traffic control instructions, the target point and path to be followed, an image of the area ahead of the aircraft, terrain information for the aircraft's surroundings, the glidepath route, and the landing runway outline; and in the landing phase, the augmented reality image may include at least one of: the current landing configuration settings, the aircraft's current precise position, the aircraft's current deceleration trend, tower traffic control instructions, the runway exit point position, an image of the area ahead of the aircraft, and terrain information for the aircraft's surroundings.
At block 530, the method 500 may include acquiring the aircraft state information and environmental information to be displayed from the on-board indication and recording system and from the environmental sensors arranged on the aircraft, and at block 540, the method 500 may include generating the augmented reality image to be projected using the acquired aircraft state information and environmental information. For example, the method 500 may integrate the data from the indication and recording system and the radar detection module, reconciling the data formats of the different information sources into content that can be projected directly onto the windshield; the method 500 may also receive information from the visual sensing device and the cockpit detection module to determine the content of the picture to be projected, and remove, according to the curvature of the windshield, the distortion the picture would exhibit when projected onto the windshield.
At block 550, the method 500 may include projecting an augmented reality image to a cockpit front windshield of an aircraft.
In yet another embodiment of the present disclosure, the flight crew of the aircraft includes two pilots, and the method 500 may further include: collecting the crew's inputs at the side sticks, control column, and/or pedals from the aircraft's flight control system; determining the stick pilot and the non-stick pilot from the collected inputs; generating, from the acquired aircraft state information and environmental information, first and second augmented reality images to be projected for the stick pilot and the non-stick pilot, respectively; and projecting the first augmented reality image onto the portion of the cockpit front windshield in front of the stick pilot and the second augmented reality image onto the portion in front of the non-stick pilot.
Referring to fig. 6, a schematic diagram of an aircraft 600 is shown according to an example embodiment of the present disclosure. In this embodiment, aircraft 600 may include an augmented reality windshield projection system described in accordance with embodiments of the present disclosure, such as system 100 for augmented reality windshield projection of an aircraft cockpit described with respect to fig. 1.
As described above, the present disclosure, while providing an equal or even better head-up display effect, frees the space above the pilot's head by placing the projection device in the front hood of the aircraft cockpit; no display screen needs to be placed in front of the pilot's head, avoiding the risk of the pilot striking a display screen under turbulence or similar conditions.
The system and method also improve the flight crew's situational awareness and the aircraft's ability to perceive the external environment throughout the flight. According to the crew's differing task requirements in different flight phases, the disclosed cockpit head-up display with augmented reality provides display images relevant to the task the pilot is currently performing, enhancing the pilot's situational awareness and improving the aircraft's operational capability.
For the two-pilot cockpit, the present disclosure also uses the in-cockpit camera to acquire crew eye positions and gaze directions, and uses crew inputs to determine each pilot's current task, so as to display augmented reality images to the two pilots separately while preventing one pilot's augmented reality image from appearing in the other pilot's field of view.
The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments that can be practiced. These embodiments are also referred to herein as "examples". Such examples may include elements in addition to those shown or described; however, examples including only the elements shown or described are also contemplated. Moreover, examples using any combination or permutation of the elements shown or described, whether with reference to a particular example (or one or more aspects thereof) or to other examples (or one or more aspects thereof) shown or described herein, are also contemplated.
In the appended claims, the terms "including" and "comprising" are open-ended; that is, a system, apparatus, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim. Furthermore, in the appended claims, the terms "first," "second," "third," etc. are used merely as labels and are not intended to impose a numerical order on their objects.
In addition, the order of the operations illustrated in the present specification is exemplary. In alternative embodiments, the operations may be performed in a different order than shown in the figures, and the operations may be combined into a single operation or split into more operations.
The above description is intended to be illustrative, and not restrictive. For example, the examples described above (or one or more aspects thereof) may be used in combination with other embodiments, and other embodiments will be apparent to one of ordinary skill in the art upon reviewing the above description. The abstract allows the reader to quickly ascertain the nature of the technical disclosure; it is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Furthermore, in the above detailed description, various features may be grouped together to streamline the disclosure. However, the claims may not state every feature disclosed herein, as embodiments may characterize a subset of those features, and embodiments may include fewer features than are disclosed in a particular example. Thus the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate embodiment. The scope of the embodiments disclosed herein should be determined with reference to the appended claims, along with the full range of equivalents to which such claims are entitled.
Claims (13)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202411259305.4A CN118977859A (en) | 2024-09-09 | 2024-09-09 | System and method for augmented reality windshield projection in aircraft cockpits |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202411259305.4A CN118977859A (en) | 2024-09-09 | 2024-09-09 | System and method for augmented reality windshield projection in aircraft cockpits |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118977859A | 2024-11-19 |
Family
ID=93455006
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202411259305.4A Pending CN118977859A (en) | 2024-09-09 | 2024-09-09 | System and method for augmented reality windshield projection in aircraft cockpits |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118977859A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11215834B1 (en) | Head up display for integrating views of conformally mapped symbols and a fixed image source | |
US7486291B2 (en) | Systems and methods using enhanced vision to provide out-the-window displays for a device | |
EP2133728B1 (en) | Method and system for operating a display device | |
EP2610590B1 (en) | System and method for selecting images to be displayed | |
EP2167920B1 (en) | Aircraft landing assistance | |
US8487787B2 (en) | Near-to-eye head tracking ground obstruction system and method | |
US9389097B2 (en) | Aircraft display systems and methods for enhanced display of flight path information | |
US8010245B2 (en) | Aircraft systems and methods for displaying a touchdown point | |
EP2461202B1 (en) | Near-to-eye head display system and method | |
US8170729B2 (en) | Method and system for operating a display device on-board an aircraft | |
US9864194B2 (en) | Systems and methods for displaying FOV boundaries on HUDs | |
CN104648683B (en) | The method and apparatus that automatically vector aircraft is slided on the ground | |
EP2200005B1 (en) | Method and system for managing traffic advisory information | |
CA3097448A1 (en) | Display systems and methods for aircraft | |
EP4066079B1 (en) | Aircraft piloting system | |
CN118977859A (en) | System and method for augmented reality windshield projection in aircraft cockpits | |
JP7367922B2 (en) | Pilot support system | |
US10969589B2 (en) | Head up display system, associated display system and computer program product | |
Cheng et al. | A prototype of Enhanced Synthetic Vision System using short-wave infrared |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |