H04N2213/001

DEPTH PLANE SELECTION FOR MULTI-DEPTH PLANE DISPLAY SYSTEMS BY USER CATEGORIZATION

A display system includes a head-mounted display configured to project light, having different amounts of wavefront divergence, to an eye of a user to display virtual image content appearing to be disposed at different depth planes. The wavefront divergence may be changed in discrete steps, with a switch between steps triggered by a determination that the user is fixating on a particular depth plane. The display system may be calibrated for switching depth planes for a main user. Upon determining that a guest user is utilizing the system, rather than undergoing a full calibration, the display system may be configured to switch depth planes based on a rough determination of the virtual content that the user is looking at. The virtual content has an associated depth plane, and the display system may be configured to switch to the depth plane of that virtual content.
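
One way to picture the guest-user behavior described above is a selector that switches toward the depth plane associated with the gazed-at content, one discrete step at a time. This is a minimal illustrative sketch, not the patented method; `VirtualObject` and `select_depth_plane` are invented names.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    depth_plane: int  # index of the discrete depth plane this content is associated with

def select_depth_plane(gazed_object, current_plane):
    """Switch toward the gazed object's associated plane, one discrete step at a time."""
    if gazed_object is None:
        return current_plane           # no gaze target: keep the current plane
    target = gazed_object.depth_plane
    if target == current_plane:
        return current_plane           # already at the content's plane: no switch
    # wavefront divergence changes in discrete steps, so move one plane per update
    return current_plane + (1 if target > current_plane else -1)
```

Because the content's plane is known in advance, no per-user calibration of fixation depth is needed: the rough determination of *what* the user is looking at suffices.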

DEVICE TO CREATE AND DISPLAY FREE SPACE HOLOGRAM
20230035915 · 2023-02-02

A unique method and device for generating free space “pop-out” and “sink-in” holograms is disclosed herein. The hologram disclosed herein does not use any special medium, mirrors, reflective screens, or wearables such as headgear and special glasses. The hologram can be created in free space, outer space, or in air, without any optical components other than the special display screen of the hologram device. The device demonstrates a free space hologram as well as hologram Augmented Reality and hologram Virtual Reality. Also disclosed is a camera capable of hologram-quality images, equipped with a smart lens that mimics the human eye by changing its aperture according to light intensity, as the pupil of the human eye does, to focus on and capture “pop-out” and “sink-in” hologram images. The audio incorporated with the device provides multi-dimensional, multi-directional audio effects.

Direct camera-to-display system

In one embodiment, an electronic display assembly includes a sensor array located on one side of a circuit board and an electronic display array located on an opposite side of the circuit board from the sensor array. The sensor array includes a plurality of sensor pixel units. Each sensor pixel unit includes a plurality of sensor pixels. The electronic display array includes a plurality of display pixel units. Each display pixel unit includes a plurality of display pixels. Each particular one of the plurality of sensor pixel units is mapped to a corresponding one of the plurality of display pixel units such that display pixels of each particular one of the plurality of display pixel units display light corresponding to light captured by sensor pixels of its mapped sensor pixel unit.
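
The sensor-unit-to-display-unit mapping described above can be sketched as a direct one-to-one correspondence between unit tiles. This is an illustrative sketch only; the grid dimensions and function name are invented for the example.

```python
import numpy as np

SENSOR_UNITS = (4, 4)      # assumed grid of sensor pixel units
PIXELS_PER_UNIT = (8, 8)   # assumed sensor pixels inside each unit

def map_units_to_display(sensor_frame):
    """Each display pixel unit reproduces the light captured by its mapped sensor unit."""
    unit_rows, unit_cols = SENSOR_UNITS
    px_h, px_w = PIXELS_PER_UNIT
    display_frame = np.empty_like(sensor_frame)
    for i in range(unit_rows):
        for j in range(unit_cols):
            # sensor unit (i, j) maps directly to display unit (i, j),
            # so the display tile shows exactly what its paired sensor tile captured
            tile = sensor_frame[i*px_h:(i+1)*px_h, j*px_w:(j+1)*px_w]
            display_frame[i*px_h:(i+1)*px_h, j*px_w:(j+1)*px_w] = tile
    return display_frame
```

Because the mapping is fixed per unit rather than per frame, each display unit can in principle be driven directly by its paired sensor unit on the opposite side of the board.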

Open view, multi-modal, calibrated digital loupe with depth sensing
11611735 · 2023-03-21

A digital loupe system is provided which can include a number of features. In one embodiment, the digital loupe system can include a stereo camera pair and a distance sensor. The system can further include a processor configured to apply a transformation to image signals from the stereo camera pair based on a distance measurement from the distance sensor and on camera calibration information. In some examples, the system can use the depth information and the calibration information to correct for parallax between the cameras to provide a multi-channel image. Ergonomic head mounting systems are also provided. In some implementations, the head mounting systems can be configurable to support the weight of a digital loupe system, including placing one or two oculars in a line of sight with an eye of a user, while improving overall ergonomics, including peripheral vision, comfort, stability, and adjustability. Methods of use are also provided.
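
The distance-based parallax correction can be illustrated with the similar-triangles relation between working distance and pixel disparity. This is a simplified sketch, not the patented transformation; the focal length, baseline, and pixel pitch are assumed calibration values, and a real system would apply a full calibrated rectification.

```python
def parallax_shift_pixels(distance_mm, focal_mm=25.0, baseline_mm=60.0,
                          pixel_pitch_mm=0.005):
    """Pixel disparity between the two cameras for a target at `distance_mm`."""
    # similar triangles: disparity on the sensor = focal length * baseline / distance
    disparity_mm = focal_mm * baseline_mm / distance_mm
    return disparity_mm / pixel_pitch_mm
```

With these assumed values, a loupe working at 300 mm would see a 1000-pixel disparity; shifting each channel by half that amount makes the two oculars converge on the working distance reported by the distance sensor.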

Precision multi-view display

A precision multi-view (MV) display system can accurately and simultaneously display different content to different viewers over a wide field of view. The MV display system may include features that enable individual MV display devices to be easily and efficiently tiled to form a larger MV display. A graphical interface enables a user to graphically specify viewing zones and associate content that will be visible in those zones in a simple manner. A calibration procedure enables the specification of content at precise viewing locations.
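
The zone-to-content association a user specifies through the graphical interface might be recorded as something like the following. This is a hypothetical sketch; the zone names, angular bounds, and content identifiers are invented, and a real system would use the calibrated viewing locations described above.

```python
# assumed viewing zones, each spanning a range of viewer azimuth angles
ZONES = {
    "left":  {"azimuth_deg": (-60, -10), "content": "video_a"},
    "right": {"azimuth_deg": (10, 60),   "content": "video_b"},
}

def content_for_viewer(azimuth_deg):
    """Return the content visible to a viewer at the given azimuth, if any."""
    for zone in ZONES.values():
        lo, hi = zone["azimuth_deg"]
        if lo <= azimuth_deg <= hi:
            return zone["content"]
    return None  # viewer is outside every specified zone
```

The calibration procedure would then map each zone's angular bounds to the per-pixel emission directions of the tiled display devices.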

Mobile phone/device case or cover having a 3D camera
11606449 · 2023-03-14

A mobile phone/device case or cover having a 3D camera, realized by integrating or connecting two cameras and a 3D viewer sheet on the phone/device case or cover. The 3D camera comprises: a V-shaped optical component having left and right arms, which act as light guides to guide light from an object onto an image sensor, and a focusing lens for focusing an image on the image sensor, such that two images are formed of any object in the field of view of the V-shaped optical component.

Electronic gaming machine with emulated three dimensional display
11475729 · 2022-10-18

A gaming machine includes a processor, a video controller coupled to the processor, a display device coupled to the video controller, and an input device coupled to the processor and receiving an input from a player. The display device includes a rear display panel and a front display panel arranged between the player and the rear display panel, the front display panel including an electrochromic display panel that is spaced apart from the rear display panel in a viewing direction of the player. The video controller causes the display device to alternate between a first state for displaying a first image on the rear display panel and a second state for displaying a second image on the front display panel. In the first state, the video controller displays the first image on the rear display panel and causes the front display panel to be transparent, and in the second state, the video controller displays the second image on the front display panel and causes the front display panel to be opaque.
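
The two-state alternation can be sketched as follows. This is an illustrative model only, with dictionaries standing in for the video controller's real panel operations; the function and key names are invented.

```python
def render_frame(state, front_panel, rear_panel, first_image, second_image):
    """Drive the two panels for one alternation state ('rear' or 'front')."""
    if state == "rear":
        front_panel["opaque"] = False       # electrochromic panel transparent:
        rear_panel["image"] = first_image   # the rear image shows through it
    else:
        front_panel["opaque"] = True        # electrochromic panel opaque:
        front_panel["image"] = second_image # it now acts as the visible screen
    return front_panel, rear_panel
```

Alternating these states rapidly lets the spaced-apart panels present imagery at two physical depths, producing the emulated three-dimensional effect.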

DUAL SYSTEM ON A CHIP EYEWEAR

Eyewear devices are disclosed that include two SoCs sharing the processing workload. Instead of using a single SoC located on either the left or right side of the eyewear device, the two SoCs are assigned different responsibilities, operating different devices and performing different processes to balance the workload. In one example, the eyewear device utilizes a first SoC to operate a first color camera, a second color camera, a first display, and a second display. The first SoC and a second SoC are configured to selectively run first and second computer vision (CV) camera algorithms. The first SoC is configured to perform visual-inertial odometry (VIO), track hand gestures of the user, and provide depth from stereo images. This configuration efficiently organizes operation of the various features and balances power consumption.
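
One possible static task partition matching the example split above can be sketched as a simple assignment table. This is a guess at how such a split might be expressed; the task names and the placement of the CV algorithms are assumptions, since the abstract says the two SoCs run them selectively.

```python
# assumed task-to-SoC assignment mirroring the workload split in the example
TASKS = {
    "first_soc":  ["first_color_camera", "second_color_camera",
                   "first_display", "second_display",
                   "vio", "hand_tracking", "stereo_depth",
                   "first_cv_algorithm"],
    "second_soc": ["second_cv_algorithm"],
}

def soc_for(task):
    """Return which SoC a task is assigned to, or None if unassigned."""
    for soc, assigned in TASKS.items():
        if task in assigned:
            return soc
    return None
```

A real implementation would likely reassign tasks dynamically to keep power draw and thermal load balanced between the two sides of the frame.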

Surgery 3D Visualization Apparatus

An apparatus for obtaining an image of a retina has an optical relay that defines an optical path and is configured to relay an image of the iris along the optical path to a pupil; a shutter disposed at the pupil and configured to define at least a first shutter aperture for control of light transmission through the pupil position; a tube lens disposed to direct light from the shutter aperture to an image sensor; and a prismatic input port disposed between the shutter and the tube lens and configured to combine, onto the optical path, light from the relay with light conveyed along a second light path that is orthogonal to the optical path.

ENDOSCOPE SYSTEM
20230157526 · 2023-05-25

An endoscope system includes an endoscope that captures images of living tissue in a body cavity, and an image processing unit. The endoscope includes an objective lens provided on a front side of a light receiving surface of an image sensor and configured to simultaneously form images of the living tissue, obtained through a plurality of windows, on the light receiving surface as a captured image. The image processing unit includes a three-dimensional expansion processor configured to identify a feature part, distinguishable from other parts, that appears in common in the plurality of images obtained through the plurality of windows; to calculate, from position information in each of those images, the different directions in which the feature part is visible through the windows; and to expand the two-dimensional information of the images of the feature part into three-dimensional information.
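
The expansion from per-window positions to three-dimensional information resembles two-view triangulation: the feature's position in each window image defines a viewing direction, and intersecting the directions recovers depth. This is a simplified sketch, not the patented processing; the focal length and inter-window baseline are assumed values.

```python
def triangulate_depth(x_left, x_right, focal_px=800.0, baseline_mm=4.0):
    """Depth (mm) of a feature seen at x_left / x_right in two window images."""
    # the horizontal offset between the two images encodes the difference
    # in viewing direction through the two windows
    disparity_px = x_left - x_right
    if disparity_px <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_mm / disparity_px
```

Repeating this for many feature points across the plurality of windows would yield the three-dimensional information the expansion processor produces from the single simultaneously captured frame.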