H04N13/286

INTERCHANGEABLE LENS AND IMAGE PICKUP APPARATUS
20230077645 · 2023-03-16

An interchangeable lens, attachable to and detachable from an image pickup apparatus that includes a single image sensor, includes a plurality of lens units that include a first lens unit and a second lens unit, and a communication unit configured to transmit to the image pickup apparatus information for setting, as a photometric area, an imaging area acquired by the first lens unit that is used for monaural image display.

Methods and devices for selecting objects in images
11631227 · 2023-04-18

Methods and devices for manipulating an image are described. The method comprises receiving image data, the image data including a first image obtained from a first camera and a second image obtained from a second camera, the first camera and the second camera being oriented in a common direction; identifying one or more boundaries of an object in the image data by analyzing the first image and the second image; and displaying a manipulated image based on the image data, wherein the manipulated image includes manipulation of at least a portion of the first image based on boundaries of the object.
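
The boundary identification described above relies on analyzing the two registered views. As a minimal illustration (not the patented method), the following sketch segments a near object from a hypothetical per-pixel disparity map, where larger disparity means closer to the cameras, and returns its bounding box; the function name and threshold are assumptions:

```python
import numpy as np

def object_bbox_from_disparity(disparity, fg_threshold):
    """Return the bounding box (row0, row1, col0, col1) of pixels whose
    disparity exceeds fg_threshold (i.e. the near object), or None."""
    mask = disparity > fg_threshold  # closer objects -> larger disparity
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0:
        return None
    return (rows[0], rows[-1], cols[0], cols[-1])
```

In practice the disparity map would come from matching the first and second images; manipulation of the first image (e.g. blurring the background) could then be confined to, or excluded from, this box.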

Stereoscopic endoscope system
09848758 · 2017-12-26

A stereoscopic endoscope system includes a stereoscopic endoscope and an identification information combining section. The stereoscopic endoscope is provided with an R image pickup section and an R output section on the right side of the endoscope body, an L image pickup section and an L output section on the left side of the endoscope body, and an R memory and an L memory which store correction information and identification information for the right or left side. The right and left image pickup sections and the right and left memories, whether paired correctly or incorrectly, are connected to the right and left output sections. The identification information combining section combines an image and identification information inputted from the R output section, or an image and identification information inputted from the L output section, and outputs the combined image.

Illuminator for camera system having three dimensional time-of-flight capture with movable mirror element
09854226 · 2017-12-26

An apparatus is described that includes a camera system having a time-of-flight illuminator. The time-of-flight illuminator has a light source and one or more tiltable mirror elements. The one or more tiltable mirror elements direct the illuminator's light to only a region within the illuminator's field of view.
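
As a rough geometric sketch of the beam-steering idea (a simplified single-pivot model with assumed names, not the patented design), the pan/tilt angles needed to aim the optical axis at a region centered at point (x, y, z) relative to the mirror could be computed as:

```python
import math

def mirror_angles(x, y, z):
    """Pan and tilt (radians) that steer the +z optical axis toward the
    point (x, y, z); mirror at the origin, mirror geometry idealized."""
    pan = math.atan2(x, z)
    tilt = math.atan2(y, math.hypot(x, z))
    return pan, tilt
```

Restricting illumination to such a region concentrates the available optical power where depth capture is actually needed.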

Imaging system and observation method
11678791 · 2023-06-20

A stereo imaging system comprises an observation instrument having an image acquisition unit for detecting first image data and second image data, which can be combined for stereo observation. There is provided at least one position sensor for detecting an orientation of the instrument in relation to a position reference. There is provided a control device that is operable in a first representation mode and a second representation mode, depending on the orientation of the instrument. The control device is configured for outputting an image signal, which comprises a stereo signal that is based on the first image data and the second image data in the first representation mode, and a mono signal that is based on the first image data or the second image data in the second representation mode. The control device is configured to erect images that are output with the image signal in the second representation mode, depending on the orientation.
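
The orientation-dependent switching between the two representation modes can be sketched as follows; the tilt threshold, sensor interface, and function names are assumptions for illustration, not taken from the patent:

```python
def representation_mode(tilt_deg, stereo_limit_deg=30.0):
    """Pick the stereo mode near the reference orientation, else mono."""
    return "stereo" if abs(tilt_deg) <= stereo_limit_deg else "mono"

def image_signal(left, right, tilt_deg):
    """Stereo signal from both channels, or a mono signal from one
    channel; erection (rotation) of the mono image by the detected
    orientation is omitted in this sketch."""
    if representation_mode(tilt_deg) == "stereo":
        return ("stereo", left, right)
    return ("mono", left)
```

Falling back to mono at large tilt angles avoids presenting a stereo pair whose baseline no longer matches the observer's eyes.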

Time-of-flight image sensor and light source driver having simulated distance capability
09812486 · 2017-11-07

An apparatus is described that includes an image sensor and a light source driver circuit having configuration register space to receive information pertaining to a command to simulate a distance between a light source and an object that is different than an actual distance between the light source and the object.
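
One way to read the simulated-distance capability is as a commanded shift of the round-trip time; a minimal sketch of that arithmetic, with the helper name assumed (the patent's register-level interface is not reproduced here):

```python
C = 299_792_458.0  # speed of light, m/s

def simulated_round_trip(tof_s, actual_m, simulated_m):
    """Shift a measured round-trip time so the object appears at
    simulated_m rather than actual_m (light travels out and back,
    hence the factor of 2)."""
    return tof_s + 2.0 * (simulated_m - actual_m) / C
```

Such a capability is useful, for example, for testing a depth pipeline against target distances that are impractical to set up physically.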

THREE-DIMENSIONAL DEPTH PERCEPTION APPARATUS AND METHOD
20170310946 · 2017-10-26

A three-dimensional depth perception apparatus and method comprise a synchronized trigger module, an MIPI receiving/transmitting module, a multiplexing core computing module, a storage controller module, a memory, and an MUX selecting module. The synchronized trigger module generates a synchronized trigger signal that is transmitted to an image acquiring module; the MIPI receiving/transmitting module supports input/output of MIPI video streams and video streams in other formats; the multiplexing core computing module selects a monocular structured-light or binocular structured-light depth perception working mode as needed, and includes a pre-processing module, a block matching disparity computing module, a depth computing module, and a depth post-processing module. The apparatus flexibly adopts a monocular or binocular structured-light depth sensing manner as required by the user, so as to leverage the advantages of each mode; the MIPI-in, MIPI-out working manner is nearly transparent to the user, allowing the apparatus to replace the MIPI camera in an existing system and directly produce the depth map.
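
In the binocular mode, the block matching disparity computing module feeds a standard depth computation. A minimal sketch of that pinhole-stereo relation (parameter names are assumptions; the patent's fixed-point pipeline is not reproduced):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo: Z = f * B / d, with focal length f in pixels,
    baseline B in metres, and d the block-matching disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

The same relation explains why a longer baseline or longer focal length improves depth resolution at range: a given depth change produces a larger disparity change.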

Microdisplay based immersive headset

An immersive headset device is provided that includes a display portion and a body portion. The display portion may include microdisplays having a compact size. The microdisplays may be movable (e.g., rotatable) relative to the body portion and can be moved (e.g., rotated) between a flipped-up position and a flipped-down position. In some instances, when the microdisplays are flipped up, the headset provides an augmented reality (AR) mode to a user, and when the microdisplays are flipped down, the headset provides a virtual reality (VR) mode to the user. In certain implementations, the headset includes an electronics source module to provide power and/or signal to the microdisplays. The electronics source module can be attached to a rear of the body portion in order to provide advantageous weight distribution about the head of the user.