Head up display for integrating views of conformally mapped symbols and a fixed image source
11215834 · 2022-01-04
CPC classification
G06F3/017 (PHYSICS)
G06F3/011 (PHYSICS)
B64D43/00 (PERFORMING OPERATIONS; TRANSPORTING)
G01C23/00 (PHYSICS)
G02B2027/0141 (PHYSICS)
G02B2027/0187 (PHYSICS)
Abstract
A method or system can be used with an aircraft or other vehicle. The system can include or the method can use a head up display for integrating views of conformally mapped symbols and a first image from at least one image source in an environment. The head up display includes a computer and a combiner configured to provide a second image in response to the computer. The second image includes the conformally mapped symbols and a window for viewing the first image on the image source.
Claims
1. An apparatus in an environment, the environment comprising a head down image source, the head down image source being disposed at an image source position, and a head down display (HDD) configured to display one of a plurality of HDD images, the apparatus comprising: a projector; a combiner configured to provide one of a plurality of projector images from the projector and provided in front of the head down image source, the one of the plurality of projector images comprising conformally mapped symbols, the one of the plurality of projector images having a window for viewing the one of the plurality of HDD images, wherein the window has a virtual location matching the image source position, wherein the window is smaller in area than the one of the plurality of projector images, and wherein the combiner provides the conformally mapped symbols outside of the window and conformally with objects viewed within or outside the environment; at least one processor configured to cause the projector to provide the one of the plurality of projector images; a user interface, wherein the at least one processor causes the one of the plurality of HDD images to be viewable on the HDD in response to a user command received by the user interface; and an eye or head tracking device, wherein the at least one processor is configured to: generate the one of the plurality of projector images in accordance with an eye or head position sensed by the eye or head tracking device; and cause the projector to provide the one of the plurality of projector images to the combiner.
2. The apparatus of claim 1, wherein the combiner is configured to be worn by a user.
3. The apparatus of claim 1, wherein the combiner is fixed to a structure in a cockpit.
4. The apparatus of claim 1, wherein the user command is a grab and pull gesture.
5. The apparatus of claim 1, wherein the user command is a mouse, trackball, or joystick interface command.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying drawings, wherein like reference numerals denote like components.
DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
(10) Before describing in detail the particular improved system and method, it should be observed that the inventive concepts include, but are not limited to, a novel structural combination of conventional data/signal processing, displays, optical components and/or communications circuits, and not in the particular detailed configurations thereof. Accordingly, the structure, methods, functions, control and arrangement of various components, optics, software, and circuits have, for the most part, been illustrated in the drawings by readily understandable block representations and schematic diagrams, in order not to obscure the disclosure with structural details which will be readily apparent to those skilled in the art, having the benefit of the description herein. Further, the inventive concepts disclosed herein are not limited to the particular embodiments depicted in the exemplary diagrams, but should be construed in accordance with the language in the claims.
(11) According to some exemplary embodiments, a display system provides a window for viewing a head down display (HDD), other display, gauge, or sensor. In some embodiments, the image on a combiner of a head up display (HUD) includes one or more transparent windows at one or more virtual locations associated with the actual location of the HDD, other display, gauge, or sensor. In some embodiments, the transparent windows (e.g., regions free of overlays provided by the HUD) allow information on the HDDs, displays, gauges, or sensors to be viewed without interference from the symbols or images displayed on the combiner. The display system allows sensed (from an enhanced vision system (EVS)) and generated (from a synthetic vision system (SVS)) real-world features and/or representative icons to be displayed to the flight crew in conjunction with HUD operations without interfering with views of the cockpit instrumentation and HDDs. Advantageously, the system and method of some embodiments allow information more easily viewed on the HDDs to be presented without clutter from information provided on the combiner of the HUD, thereby providing an integrated user interface that unifies the information presented on a HUD overlay and the HDDs.
(12) In some embodiments, the system and method utilizes a processor in communication with a worn display (e.g., a head mounted display (HMD)) and HDDs. Various information or symbols can be provided in the HUD image and moved to the HDD by using gestures or signals from user interfaces. For example, a grab and hold gesture can be used to move airspeed information, altitude tape information, an airspace boundary overlay, airport data, and/or communication data from the HUD view to a view on the HDD. Other gestures can be used to select information from menus or page through information screens on the HDD in some embodiments.
(13) With reference to
(14) The display system 10 includes one or more of a HUD 18 and one or more of a HDD 20, a HDD 28, and a HDD 30 provided below a glare shield 31. The HDDs 20, 28 and 30 and the HUD 18 can be used to provide information to the flight crew, thereby increasing visual range and enhancing decision-making abilities. The HUD 18 includes a combiner 32 and a projector 34. The HDDs 28 and 30 are large area format HDDs in some embodiments.
(15) In some embodiments, the HDDs 20, 28 and 30 and the combiner 32 provide images associated with weather displays, weather radar displays, communication displays, flight data displays, engine instrument information displays, chart displays, mapping displays, flight plan displays, terrain displays, or other flight instrumentation. Further, the HDDs 20, 28 and 30 and the combiner 32 provide a synthetic vision system (SVS) image, an enhanced vision system (EVS) image (e.g., an EFVS image), a radar image, a sensor image or a merged or combined image derived from any two or more of the SVS image, the radar image, the sensor image, and the EVS image in some embodiments. The HDDs 20, 28 and 30 and the combiner 32 are configured to display a three dimensional or perspective image of terrain and/or weather information in some embodiments. Other views of terrain and/or weather information can also be provided (e.g., plan view, horizontal view, vertical view, or combinations thereof).
(16) The HDDs 20, 28 and 30 and the combiner 32 can be implemented using any of a variety of display technologies, including cathode ray tube (CRT), liquid crystal display (LCD), organic LED display, laser-based, and other display technology. The combiner 32 can be any type of device for providing conformal images, including but not limited to, waveguide combiners, reflective combiners, or holographic combiners, in some embodiments. The combiner 32 is embodied as a head worn combiner or a fixed HUD combiner in some embodiments. In some embodiments, the combiner 32 utilizes waveguide optics and diffraction gratings to receive collimated light provided by the projector 34 and provide collimated light to a user. In some embodiments, the combiner 32 is a goggle, glasses, helmet or visor-type combiner.
(17) In some embodiments, the HUD 18 is a head worn display system (e.g., an HMD) with head and/or eye tracking. The HUD 18 utilizes the projector 34 to provide the image to the combiner 32 including at least one virtual region corresponding to the locations of the HDDs 20, 28, and 30 and/or gauges, instrumentation, or other equipment in the flight control center 12.
(18) With reference to
(19) The tracker 36 is a head or eye tracker. In some embodiments, the tracker 36 provides gaze information associated with the user (e.g., pilot) to the computer 56. The tracker 36 can be any type of sensor or set of sensors for determining head position and/or eye position, including but not limited to camera-based sensors, magnetic sensors, mechanical sensors, infrared sensors, etc. In some embodiments, the tracker 36 can be or include one or more cameras or sensors to provide gaze information. The cameras can be fixed within the aircraft control center 12 (
(20) In operation, the HUD 18 provides images from the image source 58 via the optics 60 to a pilot or other operator so that he or she can simultaneously view the images and the real world scene on the combiner 32 in some embodiments. The images can include graphic and/or text information (e.g., flight path vector, target icons, symbols, fuel indicators, course deviation indicator, or pitch indicator). The image can also include information from other sensors or equipment (e.g., a vertical traffic collision avoidance display, terrain avoidance and awareness display, a weather radar display, flight control sensors, an electronic flight bag, a navigation system, and environmental sensors) in some embodiments. In addition, the images can include synthetic or enhanced vision images. In some embodiments, collimated light representing the image from the image source 58 is provided on the combiner 32 so that the pilot can view the image conformally on the real world scene through the combiner 32 with the virtual windows 40 and 41 for viewing the HDDs 28 and 30. The virtual windows 40 and 41 do not include information on the combiner 32 and appear as transparent regions in some embodiments.
(21) The computer 56 can use gaze information, eye position and/or head position from the tracker 36 to determine the user's field of view and appropriately place the windows 40 and 41 as well as conformal symbols in some embodiments. In some embodiments, the user can select information on the image provided on the combiner 32 to be viewed on the HDDs 28 and 30 through windows 40 and 41. Advantageously, HUD 18 allows seamless integration of information displayed on the HDDs 28 and 30 and the combiner 32 using the windows 40 and 41 in some embodiments.
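The geometry described in paragraph (21) — using tracked head position to place the transparent windows over the fixed HDDs — can be sketched as follows. This is not part of the patent; it is an illustrative Python sketch under assumed conventions (a cockpit-fixed coordinate frame, windows expressed as azimuth/elevation bounds), and all function names are hypothetical.

```python
import math

def window_angular_bounds(head_pos, hdd_corners):
    """Compute azimuth/elevation bounds (radians) of a transparent combiner
    window so that it overlays a fixed HDD as seen from the tracked head.

    head_pos: (x, y, z) tracked head position in cockpit coordinates.
    hdd_corners: iterable of (x, y, z) corners of the HDD screen.
    Returns ((az_min, az_max), (el_min, el_max)).
    """
    az, el = [], []
    hx, hy, hz = head_pos
    for cx, cy, cz in hdd_corners:
        dx, dy, dz = cx - hx, cy - hy, cz - hz
        az.append(math.atan2(dy, dx))                   # horizontal angle to corner
        el.append(math.atan2(dz, math.hypot(dx, dy)))   # vertical angle to corner
    return (min(az), max(az)), (min(el), max(el))

def in_window(symbol_az, symbol_el, bounds):
    """True if a symbol's line of sight falls inside the window; the renderer
    would then suppress the symbol so the HDD behind it shows through."""
    (az0, az1), (el0, el1) = bounds
    return az0 <= symbol_az <= az1 and el0 <= symbol_el <= el1
```

Because the bounds are recomputed from each new head pose, the window stays registered to the physical HDD as the pilot moves.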
(22) In some embodiments, monochromatic symbols and information are provided on the combiner 32 while colored symbols and colored information are provided on the HDDs 28 and 30 and viewed through the windows 40 and 41. Weather radar information, terrain avoidance system information, and traffic collision avoidance system information, including colored symbology, are provided on the HDDs 28 and 30 in some embodiments. Textual listings are provided on the HDDs 28 and 30 in some embodiments.
(23) The image source 58 can be or include any type of devices for providing an image including but not limited to a CRT display, an LED display, an active matrix liquid crystal display (LCD), a light emitting diode, laser illuminator, etc. In one embodiment, image source 58 can be a micro LCD assembly or liquid crystal on silicon (LCOS) display and can provide linearly polarized light. Image source 58 can include a laser or LED backlight in one embodiment.
(24) The computer 56 can be a HUD computer or HWD computer and controls the provision of images by the image source 58. The computer 56 can be a processing circuit or part of a processing circuit associated with other electronic components in the aircraft control center 12 (
(25) The optics 60 are collimating optics which can be a single optical component, such as a lens, or include multiple optical components, in some embodiments. The optics 60 are integrated with the image source 58 in some embodiments. The optics 60 are separate or partially separate from the image source 58 in some embodiments.
(26) With reference to
(27) The user can view information through a window 43 on the combiner 32 associated with primary flight display panel 206 in some embodiments. The user can view information on combiner 32 associated with the electronic flight bag display panel 212 through the window 43 in some embodiments. The user can view information on combiner 32 associated with the navigation display panel 204 through the window 41 in some embodiments. Although only three display panels 204, 206, and 212 are shown in
(28) With reference to
(29) The image 39 is an image including flight control symbols and/or other HUD symbology with or without a vision system image or SVS image provided conformally on the combiner 32 in some embodiments. In some embodiments, the image 39 does not include flight control symbols and/or other HUD symbology and includes a vision system image and/or a SVS image. The window 40 has a clear background for viewing information on the HDD 28 in some embodiments.
(30) The computer 56 includes a processor 425, an HDD frame module 426, an image renderer 428, a HUD frame module 436, and an image renderer 438 in some embodiments. The processor 425 is coupled to the projector 34 and is coupled to the HDDs 28 and 30 in some embodiments. In some embodiments, the display system 10 receives a synthetic vision frame from a synthetic vision system (SVS) and/or a vision frame from a vision system (VS). The processor 425 serves to provide a conformal image on the combiner 32 and select the information to be displayed on the HDDs 28 and 30 in some embodiments.
(31) The image renderer 428 utilizes display information from the HDD frame module 426 to provide an image on the HDDs 28 and 30. The image renderer 428 can be utilized to provide any type of flight information. The HUD frame module 436 provides information (e.g., HUD symbology) to the image renderer 438 for providing the image 39 on the combiner 32. The image renderer 438 uses data from the tracker 36 to provide the window 40.
(32) The image renderers 428 and 438 can be hardware components or hardware components executing software configured to provide the images 38 and 39 in some embodiments. The frame modules 426 and 436 include memory such as a frame buffer.
(33) The processor 425 can be part of or integrated with a radar system, the SVS, the VS, a HDD display computer for the HDDs 20, 28, and 30, or a HUD computer for the projector 34 in some embodiments. In some embodiments, the processor 425 is an independent platform.
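The rendering path in paragraphs (30)-(32) — a HUD frame composed so that symbols falling inside a window are suppressed, leaving the region transparent — can be illustrated with a minimal sketch. This code is not from the patent; the symbol and rectangle representations are assumptions for illustration.

```python
def render_hud_frame(symbols, window_rects):
    """Compose a HUD overlay frame, suppressing any symbol whose position
    falls inside a transparent window so the HDD behind it stays visible.

    symbols: list of (x, y, glyph) tuples in combiner pixel coordinates.
    window_rects: list of (x0, y0, x1, y1) window rectangles.
    Returns the list of symbols actually drawn on the combiner.
    """
    def occluded(x, y):
        # A symbol inside any window rectangle is left undrawn.
        return any(x0 <= x < x1 and y0 <= y < y1
                   for x0, y0, x1, y1 in window_rects)
    return [(x, y, g) for x, y, g in symbols if not occluded(x, y)]
```

In this sketch the window is simply the absence of overlay pixels, matching the description of the windows as regions free of HUD-provided overlays.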
(34) The display system 10 also includes a data link receiver or data bus for receiving information from one or more of flight management computers and other avionic equipment for receiving phase of flight indications in some embodiments. Phase of flight indications are used to automatically choose information for displaying on the HDDs 28 and 30 and the combiner 32 at landing, approach, cruise, or take off in some embodiments. For example, the window 40 can automatically be removed or provided in response to a phase of flight such as landing. In some embodiments, certain information (e.g., airport information, enhanced vision information, traffic collision avoidance information, or terrain avoidance information) is automatically provided in the image 38 during the landing phase of flight. In another example, an airport moving map can be viewable through the window 40 (e.g., virtual window) on the HDD 28 when taxiing. In some embodiments, flight plan information automatically appears in the window 40 during cruise, and during landing the window 40 is removed or flight parameters (e.g., altitude, roll, pitch, yaw, air speed, vertical speed indications) and position parameters are provided in the window 40.
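The phase-of-flight selection in paragraph (34) amounts to a lookup from the phase indication to a window state and HDD page. The sketch below is illustrative only; the table entries and key names are assumptions drawn from the examples in the paragraph, not a specification from the patent.

```python
# Hypothetical phase-of-flight content table; entries follow the examples
# in paragraph (34) (moving map while taxiing, flight plan during cruise).
PHASE_CONTENT = {
    "taxi":    {"window": True,  "hdd_page": "airport_moving_map"},
    "takeoff": {"window": False, "hdd_page": "primary_flight"},
    "cruise":  {"window": True,  "hdd_page": "flight_plan"},
    "landing": {"window": True,  "hdd_page": "position_parameters"},
}

def content_for_phase(phase):
    """Select whether the combiner window is shown and which HDD page is
    routed to it for a given phase-of-flight indication; unknown phases
    fall back to a default primary-flight page with no window."""
    return PHASE_CONTENT.get(phase, {"window": False, "hdd_page": "primary_flight"})
```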
(35) With reference to
(36) With reference to
(37) At an operation 610, the computer 56 determines if information has been selected for movement from the combiner 32 to the HDDs 28 and 30, from the HDDs 28 and 30 to the combiner 32, or between the HDDs 28 and 30. Grab and hold gestures can be used to virtually move the information from locations on the combiner 32 and the HDDs 28 and 30 in some embodiments. In some embodiments, cursors, pointers, or other symbols are manipulated using user interface devices, such as track balls, mouse devices, buttons, joysticks, or touch panels, associated with the user interface 452 to select and move the information.
(38) At an operation 612, the information is moved according to the gesture or selection in the operation 610. The flow 600 can be advantageously used to use the windows 40 and 41 as drop zones for information that is displayed on the combiner 32 but can be displayed with a higher image quality on the HDDs 28 and 30. In addition, the information can be augmented with data more appropriately displayed on the HDDs 28 and 30 when dropped into the windows 40 and 41 in some embodiments.
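The move step in operations 610-612 can be modeled as transferring an item between display surfaces once a gesture completes. The following Python sketch is illustrative, not the patent's implementation; the surface names and data model are assumptions.

```python
def move_item(displays, item, src, dst):
    """Move a symbol or overlay between display surfaces in response to a
    completed grab-and-hold gesture (operation 612 of flow 600).

    displays: dict mapping surface name -> set of item names currently shown.
    Raises KeyError if the item is not present on the source surface.
    """
    displays[src].remove(item)   # take the item off the source surface
    displays[dst].add(item)      # drop it onto the destination surface
    return displays
```

A drop into a window would additionally trigger the augmentation described above (e.g., expanding an abstract symbol into its full HDD page).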
(39) With reference to
(40) Any type of symbol can be displayed in the windows 40, 41, and 42 and as part of the image 712. The symbols include airspeed and altitude tapes. The symbols can be abstract and represent that more information is available when moved to the windows 40, 41, and 42 in some embodiments. For example, pages of information related to airports associated with an airspace boundary symbol are provided in one or more of the windows 40, 41, and 42 when selected on the image 712 or at any location on the combiner 32. The information includes radio frequencies, instrument approaches, runway length, runway width, elevation, and available services in some embodiments.
(41) With reference to
(42) Although discussed above with respect to the airspace boundary symbol 806, other symbols can be moved to the window 41 and additional information associated with the symbol can be provided in the window 41 when moved. In some embodiments, information can be removed from the display panel 812 and placed on the combiner 32 outside of the window 41 in response to the grab and hold gesture. Information in the window 41 can be placed in another window by the grab and hold gesture.
(43) In some embodiments, the display system 10 can recognize gestures for paging through menus associated with information in the windows 40, 41, and 42. The user can quickly change pages with a swipe gesture in some embodiments. In some embodiments, the grab and hold gesture could be used to drop information in an HDD associated with a co-pilot for the co-pilot's review. Moving a navigation overlay from the combiner 32 to a window 40 can trigger a moving map page to be displayed on the HDDs 28 and 30 or on the combiner 32 in some embodiments. In some embodiments, virtual controls are provided on the combiner 32 and gestures are used to manipulate the virtual controls. In some embodiments, virtual handles adjacent radio equipment or throttle controls could pull up a tuning page or engine monitoring page on the HDDs 28 and 30 (
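The swipe-based paging described in paragraph (43) can be sketched as a small gesture dispatch over the pages shown in a window. This code is not from the patent; the gesture names and window data model are illustrative assumptions.

```python
def handle_gesture(gesture, window):
    """Map a recognized swipe gesture to a page change on the information
    shown in a combiner window, wrapping around at either end.

    window: dict with "pages" (list of page names) and "index" (current page).
    Returns the page displayed after the gesture is applied.
    """
    pages = window["pages"]
    if gesture == "swipe_right":
        window["index"] = (window["index"] + 1) % len(pages)   # next page
    elif gesture == "swipe_left":
        window["index"] = (window["index"] - 1) % len(pages)   # previous page
    return pages[window["index"]]
```

Other gestures (e.g., the grab and hold for virtual handles) would dispatch to different actions in the same way.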
(44) Although exemplary embodiments are described with respect to cockpit environments, the display technology described herein can be utilized in other environments. While the detailed drawings, specific examples, detailed algorithms, and particular configurations given describe preferred and exemplary embodiments, they serve the purpose of illustration only. The inventive concepts disclosed herein are not limited to the specific forms shown. For example, the methods may be performed in any of a variety of sequences of steps or according to any of a variety of computer sequences. The hardware and software configurations shown and described may differ depending on the chosen performance characteristics and physical characteristics of the image and processing devices. For example, the type of system components and their interconnections may differ. The systems and methods depicted and described are not limited to the precise details and conditions disclosed. The flow charts show exemplary operations only. The specific data types and operations are shown in a non-limiting fashion. Furthermore, other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the exemplary embodiments without departing from the scope of the invention as expressed in the appended claims.