H04N13/344

SEPARABLE DISTORTION DISPARITY DETERMINATION
20230007222 · 2023-01-05 ·

Systems and methods for determining disparity between two images are disclosed. Such systems and methods include: obtaining a first pixel image of a scene from a first viewpoint; obtaining a second pixel image of the scene from a second viewpoint (e.g., separated from the first viewpoint in a camera baseline direction such as horizontal or vertical); modifying the first and second pixel images using component-separated correction to create respective first and second corrected pixel images, maintaining pixel scene correspondence in the camera baseline direction from between the first and second pixel images to between the first and second corrected pixel images; determining pixel pairs from corresponding pixels between the first and second corrected pixel images in the camera baseline direction; and determining disparity correspondence for each of the determined pixel pairs from pixel locations in the first and second pixel images corresponding to respective pixel locations of the pixel pairs in the first and second corrected pixel images.
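The abstract does not specify a matching algorithm, but the core step it describes, pairing corresponding pixels along the camera baseline direction, can be sketched as a minimal brute-force disparity search on rectified (corrected) images. Everything below is illustrative; the function name and the absolute-difference cost are assumptions, not part of the disclosure.

```python
import numpy as np

def horizontal_disparity(left, right, max_disp=16):
    """Brute-force disparity search: for each pixel in the left corrected
    image, find the best-matching pixel in the right corrected image along
    the (horizontal) camera baseline direction."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(h):
        for x in range(w):
            # candidate matches lie at column x - d in the right image
            costs = [abs(int(left[y, x]) - int(right[y, x - d]))
                     for d in range(min(max_disp, x + 1))]
            disp[y, x] = int(np.argmin(costs))
    return disp
```

In the claimed method this search runs on the corrected images, and the final disparity is then read back from the corresponding pixel locations in the original (uncorrected) images.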

SMART WEARABLE DEVICE FOR VISION ENHANCEMENT AND METHOD FOR REALIZING STEREOSCOPIC VISION TRANSPOSITION
20230239447 · 2023-07-27 ·

The invention discloses a smart wearable device for vision enhancement and a method for realizing stereoscopic vision transposition. The device comprises a wearable device body provided with camera lenses, image sensors, an image information receiving and transmitting unit, image enhancement units, and near-to-eye optical systems; the optical axis and field angle of each near-to-eye optical system are matched with the optical axis and field angle of the corresponding camera lens; each image sensor is arranged behind its camera lens. The real scene enters the image sensor through an image imaging device for image acquisition, and the image enhancement unit enhances the low-light image collected by the smart wearable device in the low-light environment so that it is displayed clearly. The invention ensures enhancement of real stereoscopic vision in dark environments and the exchange of remote, barrier-free stereoscopic real-scene vision.

USING 6DOF POSE INFORMATION TO ALIGN IMAGES FROM SEPARATED CAMERAS

Techniques for aligning images generated by an integrated camera physically mounted to an HMD with images generated by a detached camera physically unmounted from the HMD are disclosed. A 3D feature map is generated and shared with the detached camera. Both the integrated camera and the detached camera use the 3D feature map to relocalize themselves and to determine their respective 6 DOF poses. The HMD receives the detached camera's image of the environment and the 6 DOF pose of the detached camera. A depth map of the environment is accessed. An overlaid image is generated by reprojecting a perspective of the detached camera's image to align with a perspective of the integrated camera and by overlaying the reprojected detached camera's image onto the integrated camera's image.
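The reprojection step described here, using a depth map plus the relative 6DOF pose to warp the detached camera's view into the integrated camera's perspective, follows the standard back-project/transform/project pattern. A minimal sketch, with assumed pinhole intrinsics and a 4x4 rigid-body pose matrix (none of which the abstract specifies):

```python
import numpy as np

def reproject_pixels(depth, K_src, K_dst, T_src_to_dst):
    """Reproject every pixel of the detached (source) camera into the
    integrated (destination) camera's image plane, using the depth map
    and the relative 6DOF pose between the two cameras."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    rays = np.linalg.inv(K_src) @ pix          # back-project pixels to rays
    pts = rays * depth.reshape(1, -1)          # scale rays by scene depth
    pts_h = np.vstack([pts, np.ones((1, pts.shape[1]))])
    pts_dst = (T_src_to_dst @ pts_h)[:3]       # apply relative 6DOF pose
    proj = K_dst @ pts_dst
    uv = proj[:2] / proj[2]                    # perspective divide
    return uv.reshape(2, h, w)
```

The resulting per-pixel coordinates tell the HMD where each detached-camera pixel lands in the integrated camera's image, which is what the overlay step consumes.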

SYSTEM AND METHOD FOR VIRTUAL REALITY BASED HUMAN BIOLOGICAL METRICS COLLECTION AND STIMULUS PRESENTATION
20230004284 · 2023-01-05 ·

A method of updating a protocol for a Virtual Reality (VR) medical test via a user device having a processor, the VR medical test being performed on a subject via a VR device worn by the subject, wherein the method is performed by the processor and comprises: displaying GUI elements associated with the protocol on the user device, the GUI elements having user-adjustable settings for modifying a functioning of the VR medical test; receiving a selection input from the user device corresponding to a selection of one or more of the GUI elements; receiving a setting input from the user device corresponding to the selected GUI elements; modifying the user-adjustable setting for each of the selected GUI elements according to the corresponding setting input; and updating the protocol based on the user-adjustable setting for each of the selected GUI elements and operations associated with the VR device.
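The select/modify/update flow in the claim can be sketched as a simple mapping update. The data shapes here (a dict of element ids to setting values, and the example ids) are assumptions for illustration only:

```python
def update_protocol(protocol, selected, setting_inputs):
    """Apply setting inputs to the user-adjustable settings of the
    selected GUI elements and return the updated protocol.

    `protocol` maps GUI element ids to their current setting values;
    `selected` lists the element ids chosen on the user device;
    `setting_inputs` maps those ids to the newly received values."""
    for elem_id in selected:
        if elem_id not in protocol:
            raise KeyError(f"unknown GUI element: {elem_id}")
        protocol[elem_id] = setting_inputs[elem_id]
    return protocol
```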

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM

An information processing device (1) according to an embodiment includes a display control unit (34) and a decision unit (31). The display control unit (34) displays a content image on a head-mounted display. During display of the content image by the display control unit (34), the decision unit (31) decides whether or not a surrounding person exists in a front direction of the head-mounted display on the basis of a camera image obtained by capturing an image of a surrounding environment of the head-mounted display. In a case where the decision unit (31) decides that a surrounding person exists, the display control unit (34) moves a display position of the content image.
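The behavior of the display control unit, moving the content image when a surrounding person is detected in front of the head-mounted display, can be sketched as a simple repositioning rule. The box representation and the "move to the freer side" policy are assumptions; the abstract only says the display position is moved:

```python
def adjust_display_position(content_box, person_detected, frame_width, margin=10):
    """Shift the content image's display region sideways when a person is
    detected in the front direction, so the wearer's view of the person
    is not blocked. `content_box` is (x, y, w, h) in display coordinates;
    the content moves to whichever horizontal side has more free space."""
    if not person_detected:
        return content_box
    x, y, w, h = content_box
    left_space = x
    right_space = frame_width - (x + w)
    if right_space >= left_space:
        x = frame_width - w - margin   # slide content to the right edge
    else:
        x = margin                     # slide content to the left edge
    return (x, y, w, h)
```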

STEREOSCOPIC-IMAGE PLAYBACK DEVICE AND METHOD FOR GENERATING STEREOSCOPIC IMAGES

A method for generating stereoscopic images is provided. The method includes: creating a three-dimensional mesh to obtain a stereoscopic scene and capturing a two-dimensional image of the stereoscopic scene; performing image preprocessing to obtain a first image in response to the two-dimensional image not being a side-by-side image; utilizing a graphics processing pipeline to perform depth estimation on the first image to obtain a depth image, to update the three-dimensional mesh according to a depth setting of the depth image, and to map the three-dimensional mesh to a corresponding coordinate system; utilizing the graphics processing pipeline to project the first image onto the mapped three-dimensional mesh to obtain an output three-dimensional mesh, and to capture an output side-by-side image from the output three-dimensional mesh; and utilizing the graphics processing pipeline to weave a left-eye and right-eye image into an output image, and to display the output image.
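The final weaving step, combining the left-eye and right-eye halves of the output side-by-side image into one displayed frame, is commonly done by column interleaving. A minimal sketch under that assumption (the abstract does not name the weave pattern):

```python
import numpy as np

def weave_sbs(sbs):
    """Split a side-by-side frame into left-eye and right-eye halves and
    weave them column-by-column into one interleaved output image."""
    h, w = sbs.shape[:2]
    half = w // 2
    left, right = sbs[:, :half], sbs[:, half:2 * half]
    out = np.empty_like(sbs[:, :2 * half])
    out[:, 0::2] = left     # even columns from the left-eye image
    out[:, 1::2] = right    # odd columns from the right-eye image
    return out
```

In the claimed pipeline this weave runs on the GPU as the last stage, after the side-by-side image has been captured from the output three-dimensional mesh.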

MONITORING POSITION AND ORIENTATION OF A PROJECTOR
20230239443 · 2023-07-27 ·

A projection system includes an illumination light source configured to emit an illumination light beam, a monitor light source configured to emit a monitor light beam, and a projector configured to project both the illumination light beam and the monitor light beam into a projected combined light beam. A first portion of the projected combined light beam is propagated over a first beam path in a first direction, causing an eye of a user to see a display image. A second portion of the projected combined light beam is propagated over a second beam path in a second direction, causing a monitor camera to capture a monitor image. The monitor image is analyzed to determine an orientation or a position of the monitor image. In response to determining that the monitor image is not properly oriented or positioned, an orientation or position of the projector or the illumination image is adjusted.
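The analysis step, determining the orientation and position of the captured monitor image, can be sketched with two fiducial markers compared against their expected positions. The marker-based approach, function names, and tolerances are assumptions; the abstract does not say how the monitor image is analyzed:

```python
import math

def monitor_pose(markers, references):
    """Estimate the in-plane rotation and translation of the captured
    monitor image from two fiducial marker positions, relative to where
    they appear when the projector is properly oriented and positioned."""
    (x1, y1), (x2, y2) = markers
    (r1x, r1y), (r2x, r2y) = references
    angle = math.atan2(y2 - y1, x2 - x1) - math.atan2(r2y - r1y, r2x - r1x)
    return angle, (x1 - r1x, y1 - r1y)

def needs_adjustment(angle, shift, tol_rad=0.01, tol_px=1.0):
    """True when the monitor image deviates beyond tolerance in
    orientation or position, triggering adjustment of the projector
    or the illumination image."""
    return abs(angle) > tol_rad or math.hypot(*shift) > tol_px
```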

CALIBRATION OF STEREOSCOPIC DISPLAY USING WAVEGUIDE COMBINER

Examples are disclosed that relate to calibration of a stereoscopic display system of an HMD via an optical calibration system comprising a waveguide combiner. One example provides an HMD device comprising a first image projector and a second image projector configured to project a stereoscopic image pair, and an optical calibration system. The optical calibration system comprises a first optical path indicative of an alignment of the first image projector, a second optical path indicative of an alignment of the second image projector, a waveguide combiner in which the first and second optical paths combine into a shared optical path, and one or more boresight sensors configured to detect calibration image light traveling along one or more of the first optical path or the second optical path.
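Because both projectors' calibration light reaches the boresight sensor over the shared optical path, each projector's misalignment can be read off as the offset between its detected calibration-image centroid and the expected centroid. A minimal sketch; the centroid comparison and its output convention are assumptions, not details from the disclosure:

```python
def stereo_calibration_offsets(c_first, c_second, c_expected):
    """Per-projector alignment offsets from a boresight sensor on the
    shared optical path: the detected calibration-image centroid of each
    projector is compared against the expected centroid, yielding the
    correction to apply to that projector's output."""
    dx1, dy1 = c_expected[0] - c_first[0], c_expected[1] - c_first[1]
    dx2, dy2 = c_expected[0] - c_second[0], c_expected[1] - c_second[1]
    return (dx1, dy1), (dx2, dy2)
```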