H04N13/236

Optical observation instrument

An optical observation instrument according to the invention, in particular a surgical microscope or exoscope, comprises an optics unit with an objective arrangement and at least one electronic image recorder, wherein the optics unit has a first stereo channel with a first beam path and a second stereo channel with a second beam path for recording a stereo image of an object field with the at least one electronic image recorder and wherein the first and the second beam path extend through the objective arrangement. Further, the observation instrument comprises a retaining apparatus which comprises a retaining bracket, which engages over the optics unit, wherein the retaining bracket comprises an operating device with a number of operating elements for controlling a retaining arm, to which the retaining apparatus is connectable.

PLENOPTIC CAMERA FOR MOBILE DEVICES

A plenoptic camera for mobile devices is provided, having a main lens, a microlens array, an image sensor, and a first reflective element configured to reflect the light rays captured by the plenoptic camera before they arrive at the image sensor, thereby folding the optical path of the captured light. Additional reflective elements may also be used to further fold the light path inside the camera. The reflective elements can be prisms, mirrors, or reflective surfaces of three-sided optical elements having two refractive surfaces that form a lens element of the main lens. By equipping mobile devices with this plenoptic camera, the focal length can be greatly increased while keeping the thickness of the mobile device within current constraints.
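
The benefit of folding can be illustrated with simple arithmetic: the effective optical path is the sum of the folded segments, so it can exceed the device thickness even though no single segment does. A minimal sketch, with purely illustrative segment lengths that are not taken from the abstract:

```python
# Sketch (assumed geometry): a folded optical path lets the effective
# focal-length budget exceed the device thickness. Segment lengths and
# thickness below are illustrative values, not from the patent.

def effective_path_length(segments_mm):
    """Total optical path length is the sum of the folded segments."""
    return sum(segments_mm)

# Three segments folded by two reflective elements inside a 7 mm body:
segments = [6.5, 6.0, 5.5]      # mm, each shorter than the body thickness
device_thickness = 7.0          # mm

path = effective_path_length(segments)
assert path > device_thickness  # folded path exceeds the thickness
print(path)                     # 18.0
```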

STEREOSCOPIC VISUALIZATION CAMERA AND PLATFORM

A stereoscopic imaging apparatus and platform are disclosed. An example stereoscopic imaging apparatus includes a main objective assembly and left and right lens sets defining respective parallel left and right optical paths for light received from a target surgical site through the main objective assembly. Each of the left and right lens sets includes a front lens, first and second zoom lenses configured to be movable along the optical path, and a lens barrel configured to receive the light from the second zoom lens. The example stereoscopic imaging apparatus also includes left and right image sensors configured to convert the light, after passing through the lens barrel, into image data indicative of the received light. The example stereoscopic visualization camera further includes a processor configured to convert the image data into stereoscopic video signals or video data for display on a display monitor.
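
The abstract does not specify how the processor packs the two sensor streams into one stereoscopic signal; a common convention is side-by-side packing, sketched below as an assumption with toy 2x2 frames:

```python
# Hedged sketch: combine left/right image data into one stereoscopic
# frame using side-by-side packing. The packing format is an assumption;
# the patent only states that image data is converted to video signals.

def pack_side_by_side(left, right):
    """Concatenate matching rows of the left and right images."""
    assert len(left) == len(right), "frames must have equal height"
    return [l_row + r_row for l_row, r_row in zip(left, right)]

left  = [[1, 2], [3, 4]]   # toy 2x2 "left sensor" frame
right = [[5, 6], [7, 8]]   # toy 2x2 "right sensor" frame
frame = pack_side_by_side(left, right)
print(frame)               # [[1, 2, 5, 6], [3, 4, 7, 8]]
```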

STEREOSCOPIC CAMERA WITH FLUORESCENCE VISUALIZATION

A stereoscopic camera with fluorescence visualization is disclosed. An example stereoscopic camera includes a visible light source, a near-infrared light source, and a near-ultraviolet light source. The stereoscopic camera also includes a light filter assembly having left and right filter magazines positioned respectively along left and right optical paths and configured to selectively enable certain wavelengths of light to pass through. Each of the left and right filter magazines includes an infrared cut filter, a near-ultraviolet cut filter, and a near-infrared bandpass filter. A controller of the camera is configured to provide for a visible light mode, an indocyanine green (“ICG”) fluorescence mode, and a 5-aminolevulinic acid (“ALA”) fluorescence mode by synchronizing the activation of the light sources with the selection of the filters. A processor of the camera combines image data from the different modes to enable fluorescence emission light to be superimposed on visible light stereoscopic images.
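
The mode synchronization amounts to a mapping from each imaging mode to one light source and one filter per magazine. The specific pairings below are assumptions consistent with the text (NIR excitation with an NIR bandpass filter for ICG, near-UV excitation with a near-UV cut filter for ALA), not the patent's claimed configuration:

```python
# Hedged sketch of the controller's mode/filter synchronization.
# The source/filter pairings are assumptions, not taken from the claims.

MODES = {
    "visible": {"source": "visible",          "filter": "infrared_cut"},
    "icg":     {"source": "near_infrared",    "filter": "near_infrared_bandpass"},
    "ala":     {"source": "near_ultraviolet", "filter": "near_ultraviolet_cut"},
}

def configure(mode):
    """Return the (light source, filter) pair to activate for a mode."""
    cfg = MODES[mode]
    return cfg["source"], cfg["filter"]

print(configure("icg"))  # ('near_infrared', 'near_infrared_bandpass')
```

Selecting the same filter in both the left and right magazines keeps the two stereo channels spectrally matched within each mode.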

PASSIVE THREE-DIMENSIONAL IMAGE SENSING BASED ON REFERENTIAL IMAGE BLURRING
20210352263 · 2021-11-11 ·

Techniques are described for passive three-dimensional image sensing based on referential image blurring. For example, a filter mask is integrated with a lens assembly to provide one or more normal imaging bandpass (NIB) regions and one or more reference imaging bandpass (RIB) regions, the regions being optically distinguishable and corresponding to different focal lengths and/or different focal paths. As light rays from a scene object pass through the different regions of the filter mask, a sensor can detect first and second images responsive to those light rays focused through the NIB region and the RIB region, respectively (according to their respective focal lengths and/or focal paths). An amount of blurring between the images can be measured and correlated to an object distance for the scene object. Some embodiments project additional reference illumination, in the form of reference illumination flooding and/or spotted illumination, to enhance blurring detection.
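
The final correlation step can be pictured as a monotonic calibration from measured blur to object distance. A minimal sketch, assuming a precomputed calibration table with illustrative values and piecewise-linear interpolation (the abstract does not specify the correlation method):

```python
# Hedged sketch: map a measured blur difference (pixels) to an object
# distance (mm) via a monotonic calibration table. Calibration values
# and the interpolation scheme are assumptions, not from the patent.
import bisect

# (blur_px, distance_mm) calibration pairs, blur increasing with distance:
CAL = [(0.0, 100.0), (1.0, 200.0), (2.0, 400.0), (3.0, 800.0)]

def blur_to_distance(blur_px):
    """Piecewise-linear interpolation of distance from measured blur,
    clamped to the calibrated range."""
    blurs = [b for b, _ in CAL]
    i = bisect.bisect_left(blurs, blur_px)
    if i == 0:
        return CAL[0][1]
    if i == len(CAL):
        return CAL[-1][1]
    (b0, d0), (b1, d1) = CAL[i - 1], CAL[i]
    t = (blur_px - b0) / (b1 - b0)
    return d0 + t * (d1 - d0)

print(blur_to_distance(1.5))  # 300.0
```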

Head Mounted Display with Reflective Surface

In an example implementation according to aspects of the present disclosure, an electronic device comprises a head-mountable display (HMD) which comprises a reflective surface coupled to a face plate of the HMD. The HMD comprises a light source to project light toward the reflective surface, wherein the projected light is reflected onto a wearer of the HMD by the reflective surface. The HMD also comprises a camera to capture an image of the wearer as illuminated by the light reflected from the reflective surface, and a processor to identify a gesture of the wearer within the captured image.