Patent classifications
H04N2213/001
Surgery 3D Visualization Apparatus
An apparatus for obtaining an image of a retina is described herein. The apparatus includes an optical relay that defines an optical path and is configured to relay an image of the iris along the optical path to a pupil, a shutter disposed at the pupil and configured to define at least a first shutter aperture for control of light transmission through the pupil position, a tube lens disposed to direct light from the shutter aperture to an image sensor, and a prismatic input port disposed between the shutter and the tube lens and configured to combine, onto the optical path, light from the relay with light conveyed along a second light path that is orthogonal to the optical path.
Depth plane selection for multi-depth plane display systems by user categorization
A display system includes a head-mounted display configured to project light, having different amounts of wavefront divergence, to an eye of a user to display virtual image content appearing to be disposed at different depth planes. The wavefront divergence may be changed in discrete steps, with the change in steps being triggered based upon whether the user is fixating on a particular depth plane. The display system may be calibrated for switching depth planes for a main user. Upon determining that a guest user is utilizing the system, rather than undergoing a full calibration, the display system may be configured to switch depth planes based on a rough determination of the virtual content that the user is looking at. The virtual content has an associated depth plane and the display system may be configured to switch to the depth plane of that virtual content.
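The guest-user behavior described above can be sketched as a simple selection rule: rather than calibrating gaze tracking for the guest, the system snaps to the discrete depth plane associated with the virtual content the user is roughly determined to be looking at. The function names, the content record, and the plane list below are illustrative assumptions, not from the patent.

```python
# Illustrative sketch (not the patented implementation): pick the discrete
# depth plane nearest to the depth associated with the gazed-at content.

def select_depth_plane(gaze_target_content, depth_planes):
    """Return the discrete depth plane (in meters) closest to the
    depth associated with the virtual content being looked at."""
    content_depth = gaze_target_content["depth_m"]  # hypothetical field
    # Snap to the nearest available discrete plane (discrete-step switching).
    return min(depth_planes, key=lambda plane: abs(plane - content_depth))

planes = [0.5, 1.0, 3.0]                      # hypothetical discrete planes (m)
content = {"name": "menu", "depth_m": 0.9}    # hypothetical content record
print(select_depth_plane(content, planes))    # -> 1.0
```

Snapping to the nearest plane keeps wavefront divergence changing only in discrete steps, as the abstract describes, while avoiding a full per-user calibration for guests.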
Transparent display device, and three-dimensional image display apparatus comprising same
A transparent display device includes an image display bar in which a plurality of light-emitting elements are arranged, and a bar driving unit which moves the image display bar along a predetermined path and provides transparent display using afterimages resulting from the movement of the light-emitting elements, wherein the transparency of the transparent display using afterimages is determined by the equation, transparency (%)=((A−B)/A)*100, where A denotes the entire display area of the transparent display, and B denotes the area of the image display bar.
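The transparency equation above is a direct area ratio and can be checked with a short calculation; the function name and the example areas are illustrative, not from the patent.

```python
def transparency_percent(display_area, bar_area):
    """Transparency (%) = ((A - B) / A) * 100, where A is the entire
    display area and B is the area of the moving image display bar."""
    return (display_area - bar_area) / display_area * 100.0

# e.g. a 1000 cm^2 display swept by a 50 cm^2 bar
print(transparency_percent(1000.0, 50.0))  # -> 95.0
```

A thinner bar (smaller B) thus yields a more transparent display, at the cost of fewer light-emitting elements contributing to each afterimage frame.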
STEREOSCOPIC THREE-DIMENSIONAL DISPLAY SYSTEM AND MANUFACTURING METHOD THEREOF
A stereoscopic three-dimensional display system and a manufacturing method thereof include a three-dimensional display and a depth sensor. The three-dimensional display includes a three-dimensional display module, having a plurality of first depth display portions and a plurality of second depth display portions, and a parallax optical module, having a first parallax optical portion and a second parallax optical portion. The depth sensor is electrically connected to the three-dimensional display module, detects the distance between the user and the display, and selects whether all of the first depth display portions jointly output a first parallax image or all of the second depth display portions jointly output a second parallax image, so that the user can view stereoscopic imaging through the first parallax optical portion or the second parallax optical portion at different positions.
Method and system for stereo-visual localization of object
Embodiments herein provide a method for stereo-visual localization of an object by a stereo-visual localization apparatus. The method includes generating, by the stereo-visual localization apparatus, a stereo-visual interface that displays a first stereo image of the object and a first stereo image of a subject in a first portion, and a second stereo image of the object and a second stereo image of the subject in a second portion. Further, the method includes detecting, by the stereo-visual localization apparatus, a movement of the subject to align the subject in the field of view with the object. Furthermore, the method includes visually aligning, by the stereo-visual localization apparatus, the subject with the object based on the movement by simultaneously changing the apparent positions of the first and second stereo images of the subject in each of the first portion and the second portion of the stereo-visual interface.
Electronic device and method for changing modes via multiple displays
An electronic device includes a transparent first display panel, a second display panel and a processor electrically connected to the two panels. The first display panel is movable with respect to the second display panel. The processor is configured to switch between a plurality of display modes based on relative positioning of the two panels and to provide video signals to the two panels based on a current display mode. When the first display panel is parallel to the second display panel and faces a display area of the second display panel, the processor executes a stereoscopic display mode. When an angle between the two panels is between 0 and 180 degrees, exclusive, the processor executes an augmented reality display mode. When the display areas of the two panels are oriented away from each other, the processor executes a dual display mode.
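The mode-switching logic described above amounts to a small decision rule over the relative positioning of the two panels. The sketch below is an illustrative assumption about how such a rule might look; the function name, parameters, and mode strings are not from the patent.

```python
# Illustrative sketch (not the patented implementation): choose a display
# mode from the relative positioning of the two display panels.

def select_display_mode(angle_deg, facing_display_area, oriented_away):
    """angle_deg: angle between the panels; facing_display_area: whether the
    transparent first panel faces the second panel's display area;
    oriented_away: whether the two display areas face opposite directions."""
    # Parallel and facing the second panel's display area -> stereoscopic.
    if angle_deg == 0 and facing_display_area:
        return "stereoscopic"
    # Open at an intermediate angle (0 < angle < 180) -> augmented reality.
    if 0 < angle_deg < 180:
        return "augmented_reality"
    # Display areas oriented away from each other -> dual display.
    if oriented_away:
        return "dual"
    return "unknown"

print(select_display_mode(90, False, False))  # -> augmented_reality
```

The processor would then route video signals to one or both panels according to the returned mode.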
Near eye display system and operation method thereof
A near eye display system including a display screen, a graphic processing unit, a lens group, a focal length adjustment device, an interpupillary distance adjustment device, a detection unit, and a control unit is provided. Positions of a left-eye image and a right-eye image to be displayed on the display screen may be adjusted based on lateral displacement amounts of a left-eye lens and a right-eye lens, such that centers of the left-eye image and the right-eye image are respectively aligned with centers of the left-eye lens and the right-eye lens. Sizes (magnification) of the left-eye image and the right-eye image to be displayed on the display screen may be adjusted based on longitudinal displacement amounts of the left-eye lens and the right-eye lens, such that sizes of a visual left-eye image and a visual right-eye image seen by a user are identical. Operation methods thereof are also provided.
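The per-eye adjustment described above can be sketched as two corrections derived from the measured lens displacements: a lateral shift that keeps the image center aligned with the lens center, and a rescaling that keeps the perceived left and right images the same size. The function, its parameters, and the linear scaling model are illustrative assumptions, not from the patent.

```python
# Illustrative sketch (assumed linear model): re-center and re-scale one
# eye's displayed image from its lens's measured displacements.

def adjust_eye_image(image_center_px, lens_lateral_shift_px,
                     base_size_px, lens_axial_shift_mm, scale_per_mm):
    """Return (new image center, new image size) for one eye.
    lens_lateral_shift_px: lateral lens displacement, in display pixels.
    lens_axial_shift_mm: longitudinal lens displacement along the optical axis.
    scale_per_mm: assumed magnification change per mm of axial displacement."""
    # Shift the image so its center tracks the displaced lens center.
    new_center = image_center_px + lens_lateral_shift_px
    # Rescale so the visual image size stays constant despite the axial shift.
    new_size = base_size_px * (1.0 + scale_per_mm * lens_axial_shift_mm)
    return new_center, new_size

center, size = adjust_eye_image(960, 12, 800, 2.0, 0.01)
```

Applying the same correction independently per eye keeps both image centers on the lens centers and both perceived images identical in size, as the abstract requires.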
Integrated Camera System Having Two Dimensional Image Capture and Three Dimensional Time-of-Flight Capture With A Partitioned Field of View
An apparatus is described that includes an integrated two-dimensional image capture and three-dimensional time-of-flight depth capture system. The three-dimensional time-of-flight depth capture system includes an illuminator to generate light. The illuminator includes arrays of light sources, each array dedicated to a different partition within a partitioned field of view of the illuminator.
Imaging unit including a chassis and heat transfer member
An imaging unit includes a plurality of imaging devices configured to capture images of an object; a circuit substrate configured to generate image data based on the images captured by the imaging devices; a chassis that holds the imaging devices; and a heat transfer member including a contacting portion configured to contact an installed member in a case where the imaging unit is installed on the installed member. The heat transfer member contacts the chassis or the circuit substrate, and the thermal conductivity of the heat transfer member is greater than that of the chassis.
Illuminator for camera system having three dimensional time-of-flight capture with movable mirror element
An apparatus is described that includes a camera system having a time-of-flight illuminator. The time-of-flight illuminator has a light source and one or more tiltable mirror elements. The one or more tiltable mirror elements direct the illuminator's light to only a region within the illuminator's field of view.