Patent classifications
G02B23/2415
ENDOSCOPE MAGNIFICATION OPTICAL SYSTEM AND ENDOSCOPE
An endoscope magnification optical system includes, in order from the object side, a first lens group having positive power, a second lens group having positive power, and a third lens group including at least a meniscus lens with a concave surface facing the object side and a positive lens. The endoscope magnification optical system is configured to magnify an optical image by moving at least the second lens group in the optical axis direction with respect to the first lens group, which is fixed, while the distance from the lens surface of the first lens group located closest to the object to the image plane is kept constant.
OPTICAL SYSTEM OF A STEREO VIDEO ENDOSCOPE, STEREO VIDEO ENDOSCOPE AND METHOD FOR OPERATING AN OPTICAL SYSTEM OF A STEREO VIDEO ENDOSCOPE
An optical system for a stereo video endoscope, a stereo video endoscope and a method for operating an optical system. The optical system includes a distal optical assembly and a proximal optical assembly with a left lens system channel and a right lens system channel. The distal optical assembly couples light incident from an object space into the left lens system channel and into the right lens system channel of the proximal optical assembly. The distal optical assembly is an optical assembly with an adjustable focal length, wherein a change in the focal length causes a displacement of an axis intersection point in the object space.
SURFACE ESTIMATION METHOD, SURFACE ESTIMATION DEVICE, AND RECORDING MEDIUM
A surface estimation method includes a region-setting step and an estimation step. In the region-setting step, a reference region that is one of a three-dimensional region and a two-dimensional region is set. The three-dimensional region includes three or more points and is set in a three-dimensional space. The three-dimensional space includes three-dimensional coordinates of three or more points on a subject calculated on the basis of a two-dimensional image of the subject. The three-dimensional coordinates of the three or more points are included in three-dimensional image data. The two-dimensional region includes three or more points and is set in the two-dimensional image. In the estimation step, a reference surface that approximates a surface of the subject is estimated on the basis of three or more points of the three-dimensional image data corresponding to the three or more points included in the reference region.
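In its simplest case (a planar reference surface), the estimation step above is a least-squares fit to the three or more 3-D points in the reference region. The sketch below illustrates that idea only, not the patented method; the function name and plane model are assumptions.

```python
import numpy as np

def fit_reference_plane(points):
    """Fit a plane to three or more 3-D points by least squares.

    Returns (normal, centroid): the unit normal of the best-fit plane
    and a point on it. The right singular vector for the smallest
    singular value minimizes the squared orthogonal distances.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal, centroid

# Four points lying exactly on the plane z = 0.
n, c = fit_reference_plane([(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)])
```

For these points the recovered normal is (0, 0, ±1) and the centroid is (0.5, 0.5, 0); a curved reference surface would replace the plane model with a higher-order fit.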
Endoscope with integrated measurement of distance to objects of interest
The present specification describes a method for determining the distance of an object from the tip of an endoscope during an endoscopic procedure, wherein at least one lens is configured to converge light from outside the tip onto a sensor that includes a plurality of photodiodes, a portion of which are arranged as adjacent pairs configured as phase detection pixels. The method includes receiving light, reflected off a surface of the object, into each adjacent pair of photodiodes; determining a first response curve to the light for a first photodiode of the adjacent pair and a second response curve to the light for a second photodiode of the pair; identifying an intersection between the first response curve and the second response curve; and using data derived from the intersection to determine the distance to the object.
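The intersection step can be illustrated numerically: sample both photodiode response curves on a common axis, find the interval where their difference changes sign, and interpolate. This is a sketch under that assumption, not the patented method; in practice the mapping from the intersection to physical distance would come from calibration.

```python
def curve_intersection(xs, ya, yb):
    """Locate the first crossing of two sampled response curves
    by linear interpolation of their difference."""
    for i in range(len(xs) - 1):
        d0 = ya[i] - yb[i]
        d1 = ya[i + 1] - yb[i + 1]
        if d0 == 0:
            return xs[i]
        if d0 * d1 < 0:  # sign change: curves cross in this interval
            t = d0 / (d0 - d1)
            return xs[i] + t * (xs[i + 1] - xs[i])
    return None  # curves do not cross in the sampled range

# Illustrative data: two straight-line responses crossing at x = 2.5.
xs = [0, 1, 2, 3, 4]
ya = [0, 1, 2, 3, 4]   # rising response, first photodiode
yb = [5, 4, 3, 2, 1]   # falling response, second photodiode
x_cross = curve_intersection(xs, ya, yb)  # → 2.5
```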
Apparatus for capturing a stereo image
An apparatus for capturing a stereo image includes a first objective for producing a first image, a second objective for producing a second image, a first viewing direction device rotatable about a first axis and assigned to the first objective, and a second viewing direction device rotatable about a second axis and assigned to the second objective. The viewing direction of the apparatus is rotatable by simultaneous rotation of the first and second viewing direction devices. The first objective, or a part thereof, is movable in a translational fashion. A cam mechanism is provided and configured to couple a rotation of the first viewing direction device to a translational movement of the first objective or the part thereof.
ENDOSCOPE APPARATUS, PROCESSOR FOR ENDOSCOPE IMAGE, AND METHOD FOR GENERATING ENDOSCOPE IMAGE
An endoscope apparatus includes first and second optical lenses that are separately provided at a distal end portion of an endoscope, and form first and second optical images, respectively, an optical path length changing filter that makes optical path lengths of the first and second optical images different, an image pickup device that performs image pickup of the first and second optical images to generate first and second image pickup signals, and a processor that generates first and second images from the first and second image pickup signals, corrects parallax, removes an area including a figure present only in one of the first and second images, and combines respective focusing areas of the first and second images to generate an endoscope image.
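The final combining step resembles per-pixel focus stacking: after parallax correction, keep each pixel from whichever image is locally sharper. The sketch below shows that general idea only; `fuse_by_focus` and the gradient-based sharpness measure are illustrative assumptions, not the patented processing chain.

```python
import numpy as np

def fuse_by_focus(img_a, img_b):
    """Per-pixel fusion of two registered images: keep the pixel from
    whichever image has the higher local gradient magnitude (ties go
    to the first image)."""
    def sharpness(img):
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy)
    mask = sharpness(img_a) >= sharpness(img_b)
    return np.where(mask, img_a, img_b)

# Illustrative inputs: a sharp vertical edge vs. a flat (defocused) patch.
img_a = np.array([[0, 10, 0], [0, 10, 0], [0, 10, 0]])
img_b = np.full((3, 3), 5)
out = fuse_by_focus(img_a, img_b)
```

Because every pixel of `img_a` has gradient energy at least as high as the flat `img_b`, the fused output here equals `img_a`.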
VIRTUAL REALITY SURGICAL CAMERA SYSTEM
A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly and having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly being operably coupled to the trocar. The camera assembly includes a camera support tube having a distal and a proximal end, the stereoscopic camera assembly being operably coupled to the distal end of the support tube, and first and second camera modules having first and second optical axes.
SYSTEMS AND METHODS FOR MEDICAL IMAGING
The present disclosure provides systems and methods for medical imaging. The system may comprise a scope assembly. The scope assembly may comprise a housing unit configured to releasably couple to at least a portion of an elongated scope. The scope assembly may comprise an imaging unit operably coupled to the housing unit, wherein the imaging unit comprises an optics assembly configured to (i) receive one or more light beams that are transmitted through the elongated scope and (ii) direct at least a portion of the one or more light beams onto two or more locations within a subject's body. At least one of the two or more locations may comprise a target site. The imaging unit may be configured to move via a rotational and/or translational motion relative to the housing unit to alter a field of view when imaging within the subject's body.
Virtual reality surgical camera system
A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly and having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly being operably coupled to the trocar. The camera assembly includes a camera support tube having a distal and a proximal end, the stereoscopic camera assembly being operably coupled to the distal end of the support tube, and first and second camera modules having first and second optical axes.
Imaging apparatus having configurable stereoscopic perspective
In some embodiments, a stereoscopic imaging apparatus includes a tubular housing having a bore extending longitudinally through the housing. First and second image sensors are disposed proximate a distal end of the bore, each including light-sensitive elements on a face and mounted facing laterally outward. The apparatus further includes a first beam steering element associated with the first image sensor and a second beam steering element associated with the second image sensor. The beam steering elements receive light from first and second perspective viewpoints and direct the received light onto the faces of the image sensors, forming first and second images. Either the first and second beam steering elements or the first and second image sensors are movable to change a spacing between, or an orientation of, the perspective viewpoints, causing sufficient disparity between the first and second images to provide image data including three-dimensional information.
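The link between viewpoint spacing and three-dimensional information can be made concrete with the standard pinhole-stereo relation Z = f·B/d (depth from focal length, baseline, and disparity): a wider spacing B yields more disparity d at a given depth. The function name and numbers below are purely illustrative, not taken from the patent.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Pinhole-stereo depth: Z = f * B / d.

    focal_px     -- focal length in pixels
    baseline_mm  -- spacing between the two viewpoints, in mm
    disparity_px -- horizontal shift of a feature between images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Illustrative numbers: 800 px focal length, 4 mm baseline,
# 16 px disparity → 200 mm working distance.
z = depth_from_disparity(800, 4.0, 16)  # → 200.0
```

Doubling the viewpoint spacing to 8 mm doubles the disparity observed at the same depth, which is the sense in which a larger spacing provides stronger three-dimensional information.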