Patent classifications
G03B35/10
IMAGING DEVICE
Provided is an imaging device capable of reliably achieving both a wide angle of view and improved productivity. An imaging device 100 includes a pair of camera modules 2, each including an imaging element 4 and a lens unit 3, in which the optical axes OA of the lens units 3 are arranged parallel to each other. Each of the pair of camera modules 2 has a configuration in which the imaging element 4 and the lens unit 3 are arranged relative to each other such that a center C of the imaging element 4 is separated from the optical axis OA by the same distance in the same direction. With respect to the posture of one camera module 2, the other camera module 2 is arranged in an inverted posture, rotated around a rotation axis RA along the optical axis OA. Read directions Dh, Dv in which signals are read from the imaging element 4 of the other camera module 2 are set opposite to the read directions Dh, Dv set in advance in the one camera module 2.
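Reversing both read directions of the inverted module is geometrically equivalent to rotating its raw image 180 degrees, which is why the two modules can share an identical sensor/lens configuration yet produce consistently oriented images. A minimal sketch of this idea, using a hypothetical `normalize_inverted_readout` helper on a NumPy array standing in for a sensor readout:

```python
import numpy as np

def normalize_inverted_readout(frame: np.ndarray) -> np.ndarray:
    """Reverse both read directions (Dh and Dv) of a frame from the
    rotated module; equivalent to a 180-degree image rotation."""
    return frame[::-1, ::-1]

# A hypothetical 2x3 readout as seen by the upright module:
upright = np.array([[1, 2, 3],
                    [4, 5, 6]])

# The inverted module captures the same scene rotated 180 degrees:
inverted_raw = upright[::-1, ::-1]

# Reversing both read directions restores the upright orientation:
assert np.array_equal(normalize_inverted_readout(inverted_raw), upright)
print("orientation restored")
```

In practice the patent sets the read directions in the sensor itself rather than flipping in software, but the mapping between the two readouts is the same.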
Hybrid sensor system and method for providing 3D imaging
Provided is a 3D depth sensing system and method of providing an image based on a hybrid sensing array. The 3D sensing system includes a light source configured to emit light, a hybrid sensing array comprising a 2D sensing region configured to detect ambient light reflected from an object and a 3D depth sensing region configured to detect the light emitted by the light source and reflected from the object, a metalens on the hybrid sensing array, the metalens being configured to direct the ambient light reflected from the object towards the 2D sensing region, and to direct the light emitted by the light source and reflected from the object towards the 3D depth sensing region, and a processing circuit configured to combine 2D image information provided by the 2D sensing region and 3D information provided by the 3D depth sensing region to generate a combined 3D image.
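The processing circuit's fusion step can be illustrated schematically: given a co-registered ambient-light image from the 2D region and a depth map from the 3D region, the simplest "combined 3D image" is an intensity-plus-depth array. The function name and array shapes below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def combine_2d_3d(image_2d: np.ndarray, depth_3d: np.ndarray) -> np.ndarray:
    """Stack a 2D intensity image with a co-registered depth map into a
    single H x W x 2 array, a minimal stand-in for the combined 3D image
    produced by the processing circuit."""
    if image_2d.shape != depth_3d.shape:
        raise ValueError("2D and 3D regions must be co-registered")
    return np.stack([image_2d, depth_3d], axis=-1)

intensity = np.full((4, 4), 0.5)   # ambient-light image from the 2D region
depth = np.full((4, 4), 1.2)       # emitted-light depth from the 3D region
combined = combine_2d_3d(intensity, depth)
print(combined.shape)  # (4, 4, 2)
```

A real system would additionally resample between the two regions, since the 2D and 3D pixels of a hybrid array need not share one resolution.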
IMAGING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM
The present disclosure relates to an imaging device, an information processing method, and a program capable of more easily grasping parallax among a plurality of images viewed from different individual-view optical systems.
Any one of a plurality of images viewed from a plurality of individual-view optical systems having optical paths independent from one another is displayed while selectively and dynamically switching among the images. The present disclosure can be applied to, for example, an imaging device, an electronic device, an interchangeable lens or a camera system equipped with a plurality of individual-view lenses, an information processing method, a program, or the like.
LENS APPARATUS AND IMAGE PICKUP APPARATUS
A lens apparatus includes two optical systems each of which includes, in order from an object side to an image side, a negative first lens unit, a first reflective member, an aperture diaphragm, a second reflective member, and a positive second lens unit. Each optical system satisfies the following inequalities:
5.9<DR/f<13.6
0.7<D2/f2<5.2
10.4<DP/f<19.8
where DR represents a distance on an optical axis from a reflective surface of the first reflective member to a reflective surface of the second reflective member, f represents a focal length of the optical system, f2 represents a focal length of the second lens unit, D2 represents a distance on the optical axis from the reflective surface of the second reflective member to an image plane, and DP represents a distance on the optical axis from the aperture diaphragm to the image plane.
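Whether a candidate optical design falls inside the claimed ranges is a direct arithmetic check. The numerical values below are hypothetical, chosen only to land inside all three ranges (DR/f = 8, D2/f2 = 2, DP/f = 15):

```python
def satisfies_design(DR: float, f: float, f2: float,
                     D2: float, DP: float) -> bool:
    """Check the three claimed inequalities for hypothetical
    optical-system parameters (all lengths in millimetres)."""
    return (5.9 < DR / f < 13.6
            and 0.7 < D2 / f2 < 5.2
            and 10.4 < DP / f < 19.8)

# Hypothetical example system:
print(satisfies_design(DR=16.0, f=2.0, f2=10.0, D2=20.0, DP=30.0))  # True
```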
Reflective optical element and stereo camera device
Provided is a reflective optical element that is lightweight and excellent in damping capacity. In the reflective optical element, a resin layer having an optical surface is formed on a metal substrate, a reflective film is formed on the optical surface, and the metal substrate includes an alloy containing Mg as a main component.
Stereoscopic visualization camera and platform
A stereoscopic imaging apparatus and platform are disclosed. An example stereoscopic imaging apparatus includes a main objective assembly and left and right lens sets defining respective parallel left and right optical paths for light received from a target surgical site through the main objective assembly. Each of the left and right lens sets includes a front lens, first and second zoom lenses configured to be movable along the optical path, and a lens barrel configured to receive the light from the second zoom lens. The example stereoscopic imaging apparatus also includes left and right image sensors configured to convert the light after passing through the lens barrel into image data that is indicative of the received light. The example stereoscopic visualization camera further includes a processor configured to convert the image data into stereoscopic video signals or video data for display on a display monitor.
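One common way for such a processor to serialize the left and right image data into a stereoscopic video signal is side-by-side packing of co-registered frames; the abstract does not specify a format, so the sketch below is an assumption for illustration:

```python
import numpy as np

def to_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack co-registered left and right frames into one side-by-side
    stereoscopic frame, a common stereo video serialization."""
    if left.shape != right.shape:
        raise ValueError("left and right frames must match")
    return np.concatenate([left, right], axis=1)

# Hypothetical 480x640 grayscale frames from the two image sensors:
left = np.zeros((480, 640), dtype=np.uint8)
right = np.ones((480, 640), dtype=np.uint8)
frame = to_side_by_side(left, right)
print(frame.shape)  # (480, 1280)
```

Top-bottom packing or frame-sequential output would be equally valid choices for the display monitor's input format.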
LENS APPARATUS
A lens apparatus includes a lens disposed closest to an object, a holder holding the lens, a cover having a first opening to expose the lens when viewed from an optical axis direction of the lens and being positioned with the holder in the optical axis direction, and an exterior member having a second opening to engage with an outer diameter of the cover. A first gap in a diameter direction orthogonal to the optical axis direction formed between the holder and the cover is larger than a second gap in the diameter direction formed between the exterior member and the cover.