A61B1/00194

Visualization of depth and position of blood vessels and robot guided visualization of blood vessel cross section

A system for visualizing an anatomical target includes an imaging device (105) configured to collect real-time images of an anatomical target. A three-dimensional model (136) is generated from pre- or intra-operative images and includes images of structures below a surface of the anatomical target that are not visible in the real-time images from the imaging device. An image processing module (148) is configured to generate an overlay (107) registered to the real-time images that indicates the structures below the surface and the depth of those structures below the surface. A display device (118) is configured to concurrently display the real-time images and the overlay.
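The depth-indicating overlay described above can be sketched as a simple per-pixel blend: where the registered model places a subsurface vessel, tint the live frame with a color that encodes depth. This is an illustrative sketch only; the function name, color mapping, and the assumption that registration has already produced a per-pixel vessel depth map are all mine, not the patent's.

```python
import numpy as np

def depth_overlay(frame, vessel_depth_mm, alpha=0.5, max_depth_mm=10.0):
    """Blend a depth-coded vessel overlay onto a real-time RGB frame.

    frame           : (H, W, 3) float array in [0, 1], the live image.
    vessel_depth_mm : (H, W) array; depth of the vessel below the surface
                      at each pixel, NaN where no vessel lies beneath.
    Shallow vessels render red; deeper vessels shade toward blue.
    (Hypothetical interface; the patent does not specify a color map.)
    """
    out = frame.copy()
    mask = ~np.isnan(vessel_depth_mm)          # pixels with a vessel below
    t = np.clip(vessel_depth_mm[mask] / max_depth_mm, 0.0, 1.0)
    color = np.stack([1.0 - t, np.zeros_like(t), t], axis=-1)  # red -> blue
    out[mask] = (1 - alpha) * frame[mask] + alpha * color
    return out
```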

Pediatric nasal endoscope, gastroscope and aerodigestive scope

Transnasal endoscopy (TNE) provides the opportunity to make the care of children with eosinophilic esophagitis (EoE) and other gastrointestinal or aerodigestive conditions safer, more efficient, and less costly, while simultaneously advancing our understanding of the pathophysiology and natural course of the condition. A pediatric endoscope was developed to facilitate TNE in children with EoE. The pediatric endoscope (a combined gastroscope, bronchoscope, and laryngoscope) comprises a 3-4 mm flexible fiber-optic endoscope that allows HD video viewing, the head of a pediatric bronchoscope that allows four-way tip deflection, a scope-stiffening apparatus to minimize the endoscope's flexibility when needed, foot and hand activation for air/water insufflation and image/video capture, a light source, and a 2 mm biopsy channel.

ARTICULATED STRUCTURED-LIGHT-BASED LAPAROSCOPE
20220394161 · 2022-12-08 ·

In a method of using a structured-light-based system, real-time 2D images of a portion of a field of view are captured using an endoscope. A portion of an object in the field of view is illuminated with a structured light pattern, and light reflected from the field of view is detected. From the reflected light, a 3D image of the field of view is constructed, and the 3D locations of points on a surface of the object are determined. The real-time 3D spatial position of the endoscope and/or a surgical tool is also determined. If the distance between the surface and the endoscope and/or surgical tool, as determined from the 3D spatial position, falls below a predetermined distance, an alert is generated to notify the user.
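The proximity check at the heart of this method reduces to a nearest-point query: compare the tracked tool-tip position against the reconstructed surface points and raise an alert when the minimum distance drops below the threshold. A minimal sketch, assuming the surface points and tool position are already in a common 3D frame; the function name and the 5 mm default are illustrative.

```python
import numpy as np

def proximity_alert(tool_pos, surface_pts, threshold_mm=5.0):
    """Return (alert, min_dist): alert is True when the tracked tool tip
    comes within `threshold_mm` of any reconstructed surface point.

    tool_pos    : (3,) 3D position of the endoscope or surgical tool tip.
    surface_pts : (N, 3) surface points from the structured-light
                  reconstruction, in the same coordinate frame.
    """
    d = np.linalg.norm(surface_pts - tool_pos, axis=1)
    min_dist = float(d.min())
    return min_dist < threshold_mm, min_dist
```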

Method and apparatus for quantitative and depth resolved hyperspectral fluorescence and reflectance imaging for surgical guidance

An imaging system, such as a surgical microscope, laparoscope, or endoscope, or one integrated with these devices, includes an illuminator providing patterned white light and/or fluorescent stimulus light. The system receives and images light hyperspectrally, in embodiments using a hyperspectral imaging array and/or narrowband tunable filters that pass filtered received light to an imager. Embodiments may construct a 3-D surface model from stereo images and estimate optical properties of the target using images taken in patterned light, or using other approximations obtained from white-light exposures. Hyperspectral images taken under stimulus light are displayed as fluorescence images and corrected for the optical properties of tissue to provide quantitative maps of fluorophore concentration. Spectral information from the hyperspectral images is processed to provide the depth of fluorophore below the tissue surface. Quantitative images of fluorescence at depth are also prepared. The images are displayed to a surgeon for use in surgery.
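One common way spectral information yields fluorophore depth is ratiometric: tissue attenuates shorter emission wavelengths more strongly, so the ratio of two emission bands shifts with depth roughly as R(d) = R(0)·exp(-(μ_short - μ_long)·d), which can be inverted for d. The sketch below illustrates that idea only; the exponential model, the function name, and any coefficient values are my assumptions, not the patent's stated method.

```python
import numpy as np

def fluorophore_depth(I_short, I_long, mu_short, mu_long, ratio_at_surface=1.0):
    """Estimate fluorophore depth (mm) from the ratio of two emission bands.

    I_short, I_long : measured intensities in a shorter and a longer
                      emission band.
    mu_short, mu_long : effective tissue attenuation coefficients (1/mm)
                        for the two bands (mu_short > mu_long in tissue).
    ratio_at_surface  : band ratio for fluorophore at zero depth.
    Assumes a single homogeneous attenuating layer (illustrative model).
    """
    ratio = I_short / I_long
    return np.log(ratio_at_surface / ratio) / (mu_short - mu_long)
```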

Surgical system with combination of sensor-based navigation and endoscopy

A set of pre-operative images may be captured of an anatomical structure using an endoscopic camera. Each captured image is associated with a position and orientation of the camera at the moment of capture using image guided surgery (IGS) techniques. This image data and position data may be used to create a navigation map of captured images. During a surgical procedure on the anatomical structure, a real-time endoscopic view may be captured and displayed to a surgeon. The IGS navigation system may determine the position and orientation of the real-time image; and select an appropriate pre-operative image from the navigation map to display to the surgeon in addition to the real-time image.
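Selecting "an appropriate pre-operative image" from the navigation map can be sketched as a nearest-pose lookup: score each stored image by how far its recorded camera position and view direction are from the current tracked pose, and return the best match. The data layout, cost function, and weighting constant below are assumptions for illustration, not the patent's specified algorithm.

```python
import numpy as np

def select_preop_image(nav_map, position, direction, w_angle_mm=20.0):
    """Pick the pre-operative image whose recorded pose best matches the
    current endoscope pose reported by the IGS tracker.

    nav_map    : list of dicts {"image": ..., "pos": (3,), "dir": unit (3,)}
                 (hypothetical navigation-map layout).
    position   : (3,) current camera position.
    direction  : (3,) current unit view direction.
    w_angle_mm : mm of positional error treated as equivalent to one
                 radian of angular error (an assumed tuning weight).
    """
    def cost(entry):
        dp = np.linalg.norm(np.asarray(entry["pos"]) - position)
        ang = np.arccos(np.clip(np.dot(entry["dir"], direction), -1.0, 1.0))
        return dp + w_angle_mm * ang
    return min(nav_map, key=cost)["image"]
```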

Image synchronization without input clock and data transmission clock in a pulsed hyperspectral, fluorescence, and laser mapping imaging system

Pulsed hyperspectral, fluorescence, and laser mapping imaging without input clock or data transmission clock is disclosed. A system includes an emitter for emitting pulses of electromagnetic radiation and an image sensor comprising a pixel array for sensing reflected electromagnetic radiation. The system includes a plurality of bidirectional data pads and a controller in communication with the image sensor. The system is such that at least a portion of the pulses of electromagnetic radiation emitted by the emitter comprises one or more of: electromagnetic radiation having a wavelength from about 513 nm to about 545 nm, from about 565 nm to about 585 nm, from about 900 nm to about 1000 nm, an excitation wavelength of electromagnetic radiation that causes a reagent to fluoresce, or a laser mapping pattern.

SCANNING SYSTEM FOR DETERMINING A HEALTH-CONDITION
20220369907 · 2022-11-24 ·

Disclosed is a scanning system for determining a health-condition based on scanning an intraoral object. More specifically, the present disclosure relates to different methods of providing the health-condition, for example by adapting the scanner with different scanning modes.

Plenoptic endoscope with fiber bundle
11503987 · 2022-11-22 ·

A plenoptic endoscope includes a fiber bundle with a distal end configured to receive light from a target imaging region, a sensor end disposed opposite the distal end, and a plurality of fiber optic strands each extending from the distal end to the sensor end. The plenoptic endoscope also includes an image sensor coupled to the sensor end of the fiber bundle, and a plurality of microlens elements disposed between the image sensor and the sensor end of the fiber bundle, the plurality of microlens elements forming an array that receives light from one or more of the plurality of fiber optic strands and directs the light onto the image sensor. The plurality of microlens elements and the image sensor together form a plenoptic camera configured to capture information about a light field emanating from the target imaging region.

STAPLER APPARATUS AND METHODS FOR USE
20230056943 · 2023-02-23 ·

Apparatus and methods are provided for performing a medical procedure using a stapler apparatus that includes a reusable handle portion with a shaft having proximal and distal ends, and a disposable end effector attached to the distal end of the shaft and carrying one or more staples. For example, the end effector may include first and second jaws movable relative to one another between open and closed positions, the first jaw carrying a cartridge that holds the one or more staples. A Doppler sensor, cutting element, thermal element, and/or grasper may be provided on the end effector. The end effector is introduced into a patient's body, tissue is positioned and locked between the jaws, and a plurality of staples is deployed into the tissue. The Doppler sensor is used to confirm that blood flow has stopped in the stapled tissue, and the cutting element is actuated to sever the stapled tissue.

Systems and methods for projecting an endoscopic image to a three-dimensional volume

A method comprises obtaining an endoscopic image dataset of a patient anatomy from an endoscopic imaging system and retrieving an anatomic model dataset of the patient anatomy obtained by an anatomic imaging system. The method also comprises mapping the endoscopic image dataset to the anatomic model dataset and displaying a first vantage point image using the mapped endoscopic image dataset. The first vantage point image is presented from a first vantage point at a distal end of the endoscopic imaging system. The method also comprises displaying a second vantage point image using at least a portion of the mapped endoscopic image dataset. The second vantage point image is presented from a second vantage point, different from the first vantage point.
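The core of mapping an endoscopic image onto an anatomic model is a camera projection: each model point is projected through the endoscope's intrinsics to sample a color from the image, after which the textured model can be rendered from any second vantage point. The sketch below shows only that texturing step under a simple pinhole model; the function name, interface, and the assumption that registration has already expressed the model in the camera frame are mine, not the patent's.

```python
import numpy as np

def texture_model_points(points_cam, image, fx, fy, cx, cy):
    """Map an endoscopic image onto 3D model points (a minimal sketch).

    points_cam : (N, 3) model points already expressed in the endoscope
                 camera frame (registration assumed done).
    image      : (H, W, 3) endoscopic frame.
    fx, fy, cx, cy : pinhole intrinsics (focal lengths, principal point).
    Returns (N, 3) per-point colors; points behind the camera or outside
    the image get NaN and would keep the model's base texture.
    """
    H, W = image.shape[:2]
    colors = np.full((len(points_cam), 3), np.nan)
    z = points_cam[:, 2]
    valid = z > 1e-6                              # in front of the camera
    u = np.round(fx * points_cam[valid, 0] / z[valid] + cx).astype(int)
    v = np.round(fy * points_cam[valid, 1] / z[valid] + cy).astype(int)
    inside = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    idx = np.flatnonzero(valid)[inside]
    colors[idx] = image[v[inside], u[inside]]     # sample the endoscope frame
    return colors
```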