SYSTEM AND METHOD FOR DISPLAYING A VIEW OF AN EXTERNAL ENVIRONMENT OF AN AIRCRAFT INSIDE THE AIRCRAFT

20230415894 · 2023-12-28

    Abstract

    A system for displaying an external environment of an aircraft inside the aircraft comprises optical couplers distributed across an outer surface of a fuselage and configured to capture light from the external environment of the aircraft; optical waveguides that transmit the captured light from the optical couplers through the fuselage; at least one display arranged on the inside of the fuselage; and at least one of: a processing unit configured to receive the captured light, to reconstruct a digital live image by combining optical information contained in the captured light across the respective optical couplers, and to display the digital live image on the at least one display; and an optical beam expander configured to receive the captured light, to expand the captured light into a light beam carrying a visual live image of the external environment, and to project the visual live image on the at least one display.

    Claims

    1. A system for displaying a view of an external environment of an aircraft inside the aircraft, the system comprising: a plurality of optical couplers distributed across an outer surface of a fuselage of the aircraft and configured to capture light from the external environment of the aircraft; optical waveguides arranged through the fuselage to transmit the captured light from the optical couplers through the fuselage; at least one display arranged on the inside of the fuselage; and at least one of: a processing unit configured to receive the captured light from the optical waveguides, to reconstruct a digital live image of the external environment by combining optical information contained in the captured light across the respective optical couplers and to display the digital live image on the at least one display, and an optical beam expander configured to receive the captured light from at least one of the optical waveguides, to expand the captured light into a light beam carrying a visual live image of the external environment, and to project the visual live image on the at least one display.

    2. The system according to claim 1, wherein the at least one display comprises a digital display for displaying the digital live image provided by the processing unit, or a screen for projecting the visual live image provided by the optical beam expander, or both.

    3. The system according to claim 1, wherein the at least one display comprises a hybrid display configured to selectively display on a shared screen the digital live image provided by the processing unit and the visual live image provided by the optical beam expander.

    4. The system according to claim 3, wherein the shared screen is semi-transparent to transmit the visual live image provided by the optical beam expander.

    5. The system according to claim 3, wherein the hybrid display is configured to dynamically adjust the digital live image provided by the processing unit to match the visual live image provided by the optical beam expander.

    6. The system according to claim 1, wherein the at least one display comprises several displays arranged on an inner wall of the fuselage and shaped to represent aircraft windows.

    7. The system according to claim 1, wherein the optical couplers are configured as fiber collimators and the optical waveguides comprise optical fibers.

    8. The system according to claim 1, wherein the optical couplers are arranged recessed in the outer surface of the fuselage.

    9. The system according to claim 1, wherein the processing unit is configured to employ at least one of compressed sensing, sensor fusion, and computational imaging to reconstruct the digital live image.

    10. An aircraft comprising: the system according to claim 1.

    11. A method for displaying a view of an external environment of an aircraft inside the aircraft, the method comprising: capturing light from the external environment of the aircraft with a plurality of optical couplers distributed across an outer surface of a fuselage of the aircraft; transmitting the captured light from the optical couplers through the fuselage with optical waveguides arranged through the fuselage; and at least one of: reconstructing a digital live image of the external environment with a processing unit receiving the captured light from the optical waveguides by combining optical information contained in the captured light across the respective optical couplers and displaying the digital live image on at least one display arranged inside the fuselage; and expanding the captured light into a light beam carrying a visual live image of the external environment with an optical beam expander receiving the captured light from at least one of the optical waveguides and projecting the visual live image on the at least one display.

    12. The method according to claim 11, wherein the at least one display comprises a hybrid display with a shared screen, on which the digital live image provided by the processing unit and the visual live image provided by the optical beam expander are selectively displayed.

    13. The method according to claim 12, wherein the hybrid display dynamically adjusts the digital live image provided by the processing unit to match the visual live image provided by the optical beam expander.

    14. The method according to claim 12, wherein the processing unit employs at least one of compressed sensing, sensor fusion, and computational imaging to reconstruct the digital live image.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0038] FIG. 1 schematically depicts a cross-sectional view of an aircraft fuselage with a system according to an embodiment of the invention.

    [0039] FIG. 2 schematically depicts a view of an optical coupler and an optical waveguide as used in the system of FIG. 1.

    [0040] FIG. 3 is a schematic flow diagram of a method according to an embodiment of the invention.

    [0041] FIG. 4 schematically depicts another view of the system of FIG. 1.

    [0042] FIGS. 5 and 6 depict alternatives for the optical coupler of FIG. 2.

    [0043] FIG. 7 schematically depicts a system according to another embodiment of the invention.

    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

    [0044] Although specific embodiments are illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described without departing from the scope of the present invention. Generally, this application is intended to cover any adaptations or variations of the specific embodiments discussed herein. As used herein, "at least one of A and B" shall mean A, or B, or A and B.

    [0045] FIG. 1 schematically depicts a cross-sectional view of an aircraft fuselage 5 with a system 10 according to an embodiment of the invention. Further views of the system 10 or components of the system 10 are shown in FIGS. 2 and 4. FIG. 3 depicts a schematic flow diagram of a corresponding method M.

    [0046] The aim of the system 10 is to realize a quasi-windowless cabin, in which the windows are replaced by virtual windows (displays 6a, 6b) connected to the outside via optical fibers. The underlying idea is to provide a view of the environment of an aircraft 100 from inside the aircraft 100 without having to weaken the fuselage structure with window apertures. The displays 6a, 6b are arranged on an inner wall 5b of the fuselage 5 and may be shaped to represent conventional aircraft windows (cf. FIG. 4). It is to be understood however that the present solution can also be used with completely different arrangements of virtual windows of different shapes.

    [0047] The system 10 comprises several fiber collimators as optical couplers 1, which are distributed across an outer surface 5a of a fuselage 5 of the aircraft 100, each optical coupler 1 being mounted in a respective recess (or hole) in the outer surface 5a. The optical couplers 1 are configured to capture light 7 from the external environment of the aircraft 100 and to couple it into respective optical waveguides 2, e.g. fiber bundles or cables, which are routed through the fuselage 5 to transmit the captured light 7 from the optical couplers 1 through the fuselage 5. One such optical coupler 1 is depicted in detail in FIG. 2.

    [0048] As can be seen here, light 7 from the environment enters the optical coupler 1 via a phase and/or amplitude mask 1a (similarly to what is done in lensless cameras) and is then focused into the waveguide 2 by means of a lens 1b. The waveguide 2 (fiber cable) may carry multiple optical signals, which allows an image to be reconstructed, similarly to what is done in lensless imaging. For this purpose, the waveguide 2 can be implemented in several possible ways, for example as a bundle of fibers, as a single multi-core fiber or as a multimode fiber. Here, the individual cores of the fibers play a similar role to the pixels of the sensor in a lensless camera.
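The measurement geometry described above can be summarized in a minimal sketch. The dimensions and the random sensing matrix below are illustrative assumptions, not taken from the disclosure: each fiber core is modeled as recording a mask-weighted linear mixture of the scene, which is the standard forward model used in lensless imaging.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a small scene and far fewer fiber cores than pixels.
n_pixels = 16 * 16   # scene resolution (flattened)
n_cores = 64         # fiber cores acting as the "pixels" of the sensor

# The phase/amplitude mask causes each core to see a pseudo-random weighted
# sum of the scene -- modeled here as one row of a sensing matrix A.
A = rng.standard_normal((n_cores, n_pixels)) / np.sqrt(n_cores)

scene = np.zeros(n_pixels)
scene[rng.choice(n_pixels, size=8, replace=False)] = 1.0  # sparse test scene

# Forward model: each core's intensity is a linear mixture of scene pixels.
measurements = A @ scene
print(measurements.shape)  # (64,) -- one value per fiber core
```

From these 64 core intensities, a reconstruction algorithm (see below in the description) can recover the scene even though the system is heavily underdetermined.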

    [0049] Some of the waveguides 2 are coupled to a processing unit 3, which is configured to receive the captured light 7 from the optical waveguides 2, to reconstruct a digital live image of the external environment by combining optical information contained in the captured light 7 across the respective optical couplers 1 and to display the digital live image on one or several digital displays 6b. To this end, the processing unit 3 may employ compressed sensing, sensor fusion and/or computational imaging or similar methods to reconstruct the digital live image from one or several optical couplers 1.

    [0050] The number of fibers required to form an image can be significantly smaller than the number of pixels required for a conventional camera. This is the case because the high pixel count offered by modern cameras (often >10 megapixels) is not required for the present purpose (no images need to be printed). Moreover, and more importantly, computational imaging and other advanced methods make it possible to retrieve an image from very few pixels, that is, fibers in the present case. Sensor fusion between the data produced by the different cameras/fiber bundles will likely allow the number of fibers per bundle to be reduced even further, by exploiting redundancies (overlap between images) and/or by using deep learning (e.g. generative adversarial networks).
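The claim that an image can be retrieved from very few fibers can be illustrated with a compressed-sensing sketch. The following is a minimal, self-contained example, assuming a random sensing matrix and a sparse scene (both hypothetical choices); it recovers a 256-pixel scene from 64 fiber-core measurements via iterative soft thresholding (ISTA), one standard compressed-sensing solver.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_fibers = 256, 64       # far fewer fibers than scene pixels

A = rng.standard_normal((n_fibers, n_pixels)) / np.sqrt(n_fibers)
x_true = np.zeros(n_pixels)
x_true[rng.choice(n_pixels, 8, replace=False)] = rng.uniform(1.0, 2.0, 8)
y = A @ x_true                     # measured fiber-core intensities

# ISTA: minimize 0.5*||A x - y||^2 + lam*||x||_1 by alternating a gradient
# step on the data term with a soft-thresholding (sparsity) step.
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / spectral norm squared
x = np.zeros(n_pixels)
for _ in range(500):
    g = x - step * (A.T @ (A @ x - y))                        # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # shrinkage

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))   # relative error
```

With a sufficiently sparse scene, the recovered vector closely matches the original despite the 4:1 undersampling; in practice the sensing matrix would be calibrated from the actual mask and fiber geometry rather than drawn at random.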

    [0051] An alternative, perhaps more conventional, approach is to use several very small cameras and then merge all the electrical outputs of the mini-cameras into a single large image via sensor fusion techniques. Two possible implementations, one more conventional and one relying on lensless imaging, are presented in FIGS. 5 and 6.

    [0052] FIG. 5 shows an alternative implementation of the proposed solution in which the fiber collimators are replaced by conventional mini-cameras 11. FIG. 6 shows a corresponding embodiment with a smaller lensless camera 12. In the first case, the light rays are focused by a camera lens 11a onto a pixelated sensor 11b. In the second case, the lens 11a is replaced by a phase/amplitude mask 12a. A key difference between this approach and the approach based on fiber couplers is that in this second approach there would be an electrical instead of an optical connection between the camera and the processing unit 3. It is to be understood of course that these connections, as well as the electrical connections between the processing unit 3 and the displays 6a, 6b, could in principle be replaced by wireless connections, thanks to recent improvements in remote powering.
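The fusion of several mini-camera outputs into one large image can be sketched in its simplest form as mosaicking with averaged overlaps. The function and tile geometry below are hypothetical, intended only to show the principle of exploiting redundancy in overlapping fields of view; a production system would also register and blend the tiles.

```python
import numpy as np

def fuse_tiles(tiles, offsets, out_shape):
    """Fuse overlapping camera tiles into one mosaic, averaging overlaps.

    tiles:   list of 2-D intensity arrays (one per mini-camera)
    offsets: (row, col) of each tile's top-left corner in the mosaic
    """
    acc = np.zeros(out_shape)
    weight = np.zeros(out_shape)
    for tile, (r, c) in zip(tiles, offsets):
        h, w = tile.shape
        acc[r:r + h, c:c + w] += tile
        weight[r:r + h, c:c + w] += 1.0
    weight[weight == 0] = 1.0          # avoid division by zero in gaps
    return acc / weight

# Two hypothetical 4x6 camera tiles with a 2-column overlap.
left = np.ones((4, 6))
right = 3 * np.ones((4, 6))
mosaic = fuse_tiles([left, right], [(0, 0), (0, 4)], (4, 10))
print(mosaic[0])  # overlap columns 4-5 average to 2.0
```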

    [0053] Coming back to the embodiment of FIGS. 1 and 4, some of the optical waveguides 2 may also be fed to optical beam expanders 4 in order to directly project the natural light from the environment into the aircraft 100. To this end, each optical beam expander 4 is configured to receive the captured light 7 from at least one of the optical waveguides 2, to expand/magnify the captured light 7 into a light beam carrying a visual live image of the external environment and to project the visual live image on a screen 6c inside the fuselage 5.

    [0054] The screen 6c may be part of a corresponding (analog) display, for example. In the exemplary embodiment of FIG. 1, the screen 6c is however part of a hybrid display 6a and used as a shared screen to selectively display the digital live image and the visual light image. To this end, the hybrid display 6a may be provided with a semi-transparent screen 6c to transmit the visual live image provided by the optical beam expander 4.

    [0055] As can be seen in FIGS. 1 and 4, the waveguide 2 stemming from the optical coupler 1 feeding the hybrid display 6a with the visual live image is split up by means of a beam splitter and/or fiber coupler 8. The other output line is also routed to the processing unit 3, which provides a corresponding digital live image of the respective optical coupler 1. The hybrid display 6a may be adapted to dynamically adjust the digital live image provided by the processing unit 3 to match the visual live image provided by the optical beam expander 4. In this vein, the hybrid display 6a is able to selectively switch between natural light (visual live image) and digitized images (digital live images). For example, the digital live images may be displayed by default. Only under certain circumstances, e.g. during a power outage, does the system 10 switch to the visual live images to ensure that at least some of the windows of the aircraft 100 still provide light to the inside.
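The default-plus-fallback behavior of the hybrid display can be summarized as a small selection rule. The function name and its inputs are hypothetical illustrations of the logic described above, not part of the disclosed system.

```python
def select_source(power_ok: bool, digital_image_available: bool) -> str:
    """Illustrative selection rule for the hybrid display 6a: show the
    digital live image by default, and fall back to the optically projected
    visual live image when the digital path is unavailable, e.g. during a
    power outage."""
    if power_ok and digital_image_available:
        return "digital"
    return "visual"

print(select_source(True, True))    # normal operation -> digital
print(select_source(False, False))  # power outage -> visual
```

Because the visual path carries natural light through passive optics only, it keeps working without electrical power, which is what makes this fallback meaningful.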

    [0056] The method M in FIG. 3 correspondingly comprises under M1 capturing light 7 from the external environment with several optical couplers 1 and under M2 transmitting the captured light 7 from the optical couplers 1 through the fuselage 5 with optical waveguides 2. Moreover, the method M comprises as a first option under M3a reconstructing a digital live image of the external environment with the processing unit 3 by combining optical information contained in the captured light 7 across the respective optical couplers 1 and under M4a displaying the digital live image on the digital display 6b. As a second option, the method M comprises under M3b expanding the captured light 7 into a light beam carrying a visual live image of the external environment with an optical beam expander 4 receiving the captured light 7 from at least one of the optical waveguides 2 and under M4b projecting the visual live image on the hybrid display 6a.

    [0057] FIG. 7 shows a detailed view of another embodiment of the invention. Also in this case, a hybrid display 6a is employed alongside several digital displays 6b. In this case, however, the digital displays 6b are additionally fed by mini-cameras 11 and lensless cameras 12. Moreover, different types of optical waveguides 2 are employed to illustrate that not only fiber bundles (left side in FIG. 7) but also multi-core fibers etc. may be utilized for the present purpose.

    [0058] The systems and devices described herein may include a controller or a computing device comprising a processor and a memory which has stored therein computer-executable instructions for implementing the processes described herein. The processing unit may comprise any suitable devices configured to cause a series of steps to be performed so as to implement the method such that instructions, when executed by the computing device or other programmable apparatus, may cause the functions/acts/steps specified in the methods described herein to be executed. The processing unit may comprise, for example, any type of general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, a central processing unit (CPU), an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, other suitably programmed or programmable logic circuits, or any combination thereof.

    [0059] The memory may be any suitable known or other machine-readable storage medium. The memory may comprise non-transitory computer readable storage medium such as, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. The memory may include a suitable combination of any type of computer memory that is located either internally or externally to the device such as, for example, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like. The memory may comprise any storage means (e.g., devices) suitable for retrievably storing the computer-executable instructions executable by processing unit.

    [0060] The methods and systems described herein may be implemented in a high-level procedural or object-oriented programming or scripting language, or a combination thereof, to communicate with or assist in the operation of the controller or computing device. Alternatively, the methods and systems described herein may be implemented in assembly or machine language. The language may be a compiled or interpreted language. Program code for implementing the methods and systems described herein may be stored on the storage media or the device, for example a ROM, a magnetic disk, an optical disc, a flash drive, or any other suitable storage media or device. The program code may be readable by a general or special-purpose programmable computer for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.

    [0061] Computer-executable instructions may be in many forms, including modules, executed by one or more computers or other devices. Generally, modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Typically, the functionality of the modules may be combined or distributed as desired in various embodiments.

    [0062] In the foregoing detailed description, various features are grouped together in one or more examples or embodiments with the purpose of streamlining the disclosure. It is to be understood that the above description is intended to be illustrative, and not restrictive. It is intended to cover all alternatives, modifications and equivalents. Many other examples will be apparent to one skilled in the art upon reviewing the above specification. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

    [0063] While at least one exemplary embodiment of the present invention(s) is disclosed herein, it should be understood that modifications, substitutions and alternatives may be apparent to one of ordinary skill in the art and can be made without departing from the scope of this disclosure. This disclosure is intended to cover any adaptations or variations of the exemplary embodiment(s). In addition, in this disclosure, the terms comprise or comprising do not exclude other elements or steps, the terms a or one do not exclude a plural number, and the term or means either or both. Furthermore, characteristics or steps which have been described may also be used in combination with other characteristics or steps and in any order unless the disclosure or context suggests otherwise. This disclosure hereby incorporates by reference the complete disclosure of any patent or application from which it claims benefit or priority.

    LIST OF REFERENCE SIGNS

    [0064] 1 optical coupler
    [0065] 1a phase/amplitude mask
    [0066] 1b lens
    [0067] 2 optical waveguide
    [0068] 2a phase/amplitude mask
    [0069] 2b pixelated sensor
    [0070] 3 processing unit
    [0071] 4 optical beam expander
    [0072] 5 fuselage
    [0073] 5a outer surface of fuselage
    [0074] 5b inner wall
    [0075] 6a hybrid display
    [0076] 6b digital display
    [0077] 6c screen
    [0078] 7 light
    [0079] 8 beam splitter/fiber coupler
    [0080] 9 electric line
    [0081] 10 system
    [0082] 11 mini-camera
    [0083] 11a camera lens
    [0084] 11b pixelated sensor
    [0085] 12 lensless camera
    [0086] 12a phase/amplitude mask
    [0087] 12b pixelated sensor
    [0088] 13 pixelated sensor
    [0089] 100 aircraft
    [0090] M method
    [0091] M1-M4b method steps