A61B2090/364

SYSTEMS AND METHODS FOR PROJECTING AN ENDOSCOPIC IMAGE TO A THREE-DIMENSIONAL VOLUME
20230218356 · 2023-07-13

A method comprises obtaining an endoscopic image dataset of a patient anatomy from an endoscopic imaging system and retrieving an anatomic model dataset of the patient anatomy obtained by an anatomic imaging system. The method also comprises mapping the endoscopic image dataset to the anatomic model dataset and displaying a first vantage point image using the mapped endoscopic image dataset. The first vantage point image is presented from a first vantage point at a distal end of the endoscopic imaging system. The method also comprises displaying a second vantage point image using at least a portion of the mapped endoscopic image dataset. The second vantage point image is presented from a second vantage point, different from the first vantage point.
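As a rough illustration of presenting the same mapped surface point from two different vantage points, the sketch below projects a 3D point through a simple pinhole camera at two poses. The camera model, yaw-only rotation, and focal length are assumptions for illustration, not details taken from the disclosure:

```python
import math

def project_point(point, cam_pos, cam_yaw, focal=1.0):
    """Project a 3D point into a pinhole camera at cam_pos, rotated
    about the vertical axis by cam_yaw; returns (u, v) image coords."""
    # Translate into the camera frame.
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    # Rotate about the vertical axis by -cam_yaw.
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    x = c * dx + s * dz
    z = -s * dx + c * dz
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal * x / z, focal * dy / z)

# The same mapped surface point rendered from two vantage points:
# the first at the endoscope tip, the second offset to one side.
surface_point = (0.0, 0.0, 10.0)
first_view = project_point(surface_point, cam_pos=(0, 0, 0), cam_yaw=0.0)
second_view = project_point(surface_point, cam_pos=(2, 0, 0), cam_yaw=0.0)
```

Moving the vantage point shifts where the same mapped point lands in the rendered image, which is the essence of re-presenting the mapped endoscopic data from a second viewpoint.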

DATA PROCESSING APPARATUS, DATA PROCESSING METHOD, AND DATA PROCESSING SYSTEM
20230218374 · 2023-07-13

A data processing apparatus for processing three-dimensional data that includes a position of each point of a point group representing at least a surface of an object, the three-dimensional data being acquired by a three-dimensional scanner. The data processing apparatus includes a scanner interface to which the three-dimensional data acquired by the three-dimensional scanner is input, and processing circuitry configured to generate a data set using a plurality of pieces of the three-dimensional data located within a predetermined range among the plurality of pieces of three-dimensional data input from the scanner interface and, when a plurality of data sets are generated, to set the data set with the largest data amount, or with at least a predetermined data amount, as a true data set among the plurality of data sets.
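The grouping-and-selection logic described above can be sketched minimally as follows. The seed-based clustering rule and the distance metric are assumptions; the abstract specifies only that pieces within a predetermined range form a data set and that the largest set becomes the "true" one:

```python
def build_data_sets(points, predetermined_range):
    """Cluster scanned points into data sets: a point joins an existing
    set if it lies within `predetermined_range` of that set's first
    (seed) point; otherwise it seeds a new set."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

    data_sets = []  # each entry: (seed_point, [member points])
    for p in points:
        for seed, members in data_sets:
            if dist(p, seed) <= predetermined_range:
                members.append(p)
                break
        else:
            data_sets.append((p, [p]))
    return [members for _, members in data_sets]

def true_data_set(data_sets):
    """Select the data set with the largest data amount (point count)."""
    return max(data_sets, key=len)
```

Picking the largest cluster discards sparse stray groups (e.g. scanner noise or points from an unintended object) in favor of the dominant scanned surface.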

DATA PROCESSING APPARATUS, DATA PROCESSING METHOD, AND DATA PROCESSING SYSTEM
20230218375 · 2023-07-13

The data processing apparatus includes a scanner interface to which three-dimensional data acquired by a three-dimensional scanner is input, and processing circuitry configured to verify first three-dimensional data and second three-dimensional data input from the scanner interface by comparing the first three-dimensional data and the second three-dimensional data in a virtual space set with respect to the position of the three-dimensional scanner.
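One simple way to compare two point clouds expressed in the same scanner-centered virtual space is a nearest-neighbor residual check; the specific metric and tolerance here are assumptions for illustration, as the abstract does not specify the comparison criterion:

```python
def nearest_neighbor_rms(cloud_a, cloud_b):
    """RMS of the distance from each point in cloud_a to its nearest
    neighbor in cloud_b, both clouds expressed in the same virtual space."""
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    total = sum(min(d2(p, q) for q in cloud_b) for p in cloud_a)
    return (total / len(cloud_a)) ** 0.5

def clouds_consistent(cloud_a, cloud_b, tolerance):
    """Verify the second scan against the first within a tolerance."""
    return nearest_neighbor_rms(cloud_a, cloud_b) <= tolerance
```

A low residual suggests the two scans captured the same surface from a consistent scanner position; a large residual flags data that should not be merged.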

LIGHTING ARRANGEMENT
20230009128 · 2023-01-12

The present disclosure relates to a lighting arrangement comprising at least one light source, at least one camera, and a control system, wherein the colour and/or colour temperature of the light source is variable and the camera records images of a region illuminated by the light source. According to the disclosure, the control system controls the lighting arrangement such that the light source successively illuminates the region with light of different colours and/or colour temperatures, and the control system separately evaluates and/or outputs a first image recorded under a first colour and/or colour temperature of the lighting.

ENDOTRACHEAL TUBE SIZE SELECTION AND INSERTION DEPTH ESTIMATION USING STATISTICAL SHAPE MODELLING AND VIRTUAL FITTING

An intubation assistance device includes an electronic controller configured to: generate a patient respiratory tract geometry model of at least a portion of a human respiratory tract by inputting one or more patient variables into a statistical shape model (SSM) of at least a portion of the human respiratory tract; select a recommended endotracheal tube (ETT) size by modeling at least one ETT model inserted into the patient respiratory tract geometry model to form a virtual fit model and estimating at least one fit parameter based on the virtual fit model; and display the recommended ETT size on a display device.
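A toy version of the SSM-plus-virtual-fit pipeline can be sketched as below. The one-dimensional shape model (a scalar airway diameter as mean plus weighted mode contributions) and the clearance-based fit criterion are simplifying assumptions; a real SSM operates on full 3D geometry:

```python
def ssm_airway_diameter(mean_diam_mm, modes, coeffs):
    """Minimum airway diameter predicted by a toy statistical shape
    model: the mean shape plus a weighted sum of mode contributions,
    where the coefficients come from patient variables."""
    return mean_diam_mm + sum(m * c for m, c in zip(modes, coeffs))

def recommend_ett_size(airway_diam_mm, ett_outer_diams_mm, clearance_mm=0.5):
    """Virtually 'insert' each candidate ETT: keep those whose outer
    diameter leaves the required clearance inside the modeled airway,
    and recommend the largest tube that fits."""
    fitting = [d for d in ett_outer_diams_mm
               if d + clearance_mm <= airway_diam_mm]
    if not fitting:
        raise ValueError("no candidate ETT fits this airway")
    return max(fitting)
```

The fit parameter here is simply radial clearance; the disclosure's virtual fit model could estimate richer parameters (contact area, insertion depth) from the full geometry.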

SYSTEMS AND METHODS FOR SURGICAL NAVIGATION
20230008222 · 2023-01-12

Imaging systems and methods may facilitate positioning an imaging device in a procedure room. A 3D image of a subject may be obtained, where the subject is to have a procedure performed thereon. A view of the 3D image of the subject may be adjusted to a desired view, and an associated 2D image reconstruction at the desired view may be obtained. A position for the imaging device that is associated with the desired view of the 3D image of the subject may be identified. Adjusting a view of the 3D image to a desired view and obtaining a 2D image reconstruction may be performed pre-procedure, such that a user may be able to create a list of desired views pre-procedure. A user may adjust a physical position of the imaging device to obtain reconstructed 2D preview images at the adjusted physical position of the imaging device prior to capturing an image.

SYSTEMS AND METHODS FOR USING REGISTERED FLUOROSCOPIC IMAGES IN IMAGE-GUIDED SURGERY

A method performed by a computing system comprises receiving a fluoroscopic image of a patient anatomy while a portion of a medical instrument is positioned within the patient anatomy. The fluoroscopic image has a fluoroscopic frame of reference. The portion has a sensed position in an anatomic model frame of reference. The method further comprises identifying the portion in the fluoroscopic image and identifying an extracted position of the portion in the fluoroscopic frame of reference using the identified portion in the fluoroscopic image. The method further comprises registering the fluoroscopic frame of reference to the anatomic model frame of reference based on the sensed position of the portion and the extracted position of the portion.
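Registering the two frames of reference from corresponding point pairs (sensed positions in the model frame, extracted positions in the fluoroscopic frame) is a classic rigid-alignment problem. The sketch below uses the closed-form 2D least-squares solution (rotation angle from the atan2 of summed cross and dot products of centered pairs); restricting to 2D is an assumption for brevity, since fluoroscopic images are planar projections:

```python
import math

def register_2d(model_pts, fluoro_pts):
    """Least-squares rigid transform (rotation + translation) taking
    model-frame points onto corresponding fluoroscopic-frame points."""
    n = len(model_pts)
    mcx = sum(p[0] for p in model_pts) / n   # model centroid
    mcy = sum(p[1] for p in model_pts) / n
    fcx = sum(p[0] for p in fluoro_pts) / n  # fluoro centroid
    fcy = sum(p[1] for p in fluoro_pts) / n
    s_dot = s_cross = 0.0
    for (mx, my), (fx, fy) in zip(model_pts, fluoro_pts):
        ax, ay = mx - mcx, my - mcy
        bx, by = fx - fcx, fy - fcy
        s_dot += ax * bx + ay * by     # sum of dot products
        s_cross += ax * by - ay * bx   # sum of cross products
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated model centroid onto the fluoro centroid.
    tx = fcx - (c * mcx - s * mcy)
    ty = fcy - (s * mcx + c * mcy)
    return theta, (tx, ty)

def apply(theta, t, p):
    """Apply the registered transform to a model-frame point."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])
```

With the transform recovered, any point in the anatomic model frame can be overlaid onto the fluoroscopic image, which is the practical payoff of the registration.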

Attachments for tracking handheld implements

Devices and systems are provided for tracking a position and orientation of a handheld implement, such that the handheld implement may be trackable with an overhead tracking system. A support member secures one or more markers relative to a longitudinal portion of the handheld implement, and a marker plane containing the markers is oriented at an angle relative to a longitudinal axis of the longitudinal portion. A marker assembly may include a support member for supporting the markers, and a connector for removably attaching the marker assembly to one or more handheld implements. The marker assembly may be configured to be removably attachable to a plurality of connection adapters, where each connection adapter is further connectable to a handheld implement, optionally at a calibrated position, such that a single connection adapter can be optionally employed to track a plurality of handheld implements. The handheld implement may be a medical instrument.

System for real-time organ segmentation and tool navigation during tool insertion in interventional therapy and method of operation thereof

An interventional therapy system may include at least one catheter configured for insertion within an object of interest (OOI); and at least one controller configured to: obtain a reference image dataset including a plurality of image slices which form a three-dimensional image of the OOI; define restricted areas (RAs) within the reference image dataset; determine location constraints for the at least one catheter in accordance with at least one of planned catheter intersection points, a peripheral boundary of the OOI, and the RAs defined in the reference image dataset; determine at least one of a position and an orientation of the distal end of the at least one catheter; and/or determine a planned trajectory for the at least one catheter in accordance with the determined at least one position and orientation for the at least one catheter and the location constraints.
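The constraint check at the heart of such trajectory planning can be sketched as follows. Modeling restricted areas as spheres and the trajectory as a sampled straight line are assumptions for illustration; the disclosure's RAs are regions defined on the reference image dataset:

```python
def violates_restricted_areas(start, goal, restricted_spheres, n_samples=50):
    """Sample the straight-line trajectory from start to goal and flag
    any sample that falls inside a restricted area, modeled here as a
    (center, radius) sphere in the reference-image coordinate frame."""
    for i in range(n_samples + 1):
        t = i / n_samples
        p = tuple(s + t * (g - s) for s, g in zip(start, goal))
        for center, radius in restricted_spheres:
            d2 = sum((pi - ci) ** 2 for pi, ci in zip(p, center))
            if d2 <= radius ** 2:
                return True
    return False
```

A planner would reject (or re-plan around) any candidate trajectory for which this check returns True, keeping the catheter path within the location constraints.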

Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display

A system includes a robotic arm, an autostereoscopic display, a user image capture device, an image processor, and a controller. The robotic arm is coupled to a patient image capture device. The autostereoscopic display is configured to display an image of a surgical site obtained from the patient image capture device. The image processor is configured to identify a location of at least part of a user in an image obtained from the user image capture device. The controller is configured to, in a first mode, adjust a three-dimensional aspect of the image displayed on the autostereoscopic display based on the identified location, and, in a second mode, move the robotic arm or instrument based on a relationship between the identified location and the surgical site image.