
METHOD OF FITTING A KNEE PROSTHESIS WITH ASSISTANCE OF AN AUGMENTED REALITY SYSTEM

A method of fitting a knee prosthesis in a knee of a patient includes displaying, by a visual display device of a mobile augmented reality navigation system worn or carried by a user performing or assisting in at least one navigation-assisted stage of the method, at least one visual element superposed on a view of at least a portion of the surgical scene of the fitting of the knee prosthesis. The visual element can be a 3D model of at least one portion of the knee of the patient, a 3D model of another component of the surgical scene, information relative to the at least one portion of the knee of the patient, and/or information relative to the other component of the surgical scene.

PREDICTING CURVED PENETRATION PATH OF A SURGICAL DEVICE
20230045709 · 2023-02-09

A surgical device comprising an elongated body, a tissue-penetrating apparatus, and a light projector is disclosed. The elongated body can reach, with a distal end thereof, a surface of an organ within a subject's body. The tissue-penetrating apparatus can be extended from the distal end of the elongated body along a curved penetration path restricted to a chosen penetration plane. The light projector can generate a shaped illumination on the surface of the organ indicative of an intersection of the penetration plane with the surface of the organ.
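The shaped illumination traces where the penetration plane cuts the organ surface. As an illustrative sketch (not the patent's implementation), that curve can be computed on a triangulated surface model by intersecting the plane with each triangle:

```python
import numpy as np

def plane_triangle_intersection(n, d, tri):
    """Segment where the plane {x : n.x = d} cuts a triangle (3x3 vertex array).
    Returns the two crossing points, or None if the triangle does not
    straddle the plane."""
    s = tri @ n - d                      # signed distances of the 3 vertices
    pts = []
    for i in range(3):
        j = (i + 1) % 3
        if s[i] * s[j] < 0:              # this edge crosses the plane
            t = s[i] / (s[i] - s[j])     # linear interpolation parameter
            pts.append(tri[i] + t * (tri[j] - tri[i]))
    return np.array(pts) if len(pts) == 2 else None
```

Running this over every triangle of the surface mesh yields the polyline that the light projector would draw on the organ.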

CA IX-NIR dyes and their uses

The present disclosure relates to compounds that are useful as near-infrared fluorescence probes, wherein the compounds include i) a ligand that binds to the active site of carbonic anhydrase, ii) a dye molecule, and iii) a linker molecule that comprises an amino acid, amide, ureido, or polyethylene glycol derivative thereof. The disclosure further describes methods and compositions for making and using the compounds, methods incorporating the compounds, and kits incorporating the compounds.

Systems and methods for imaging communication and control

A telesurgical mentoring platform includes a wheeled base, a lower rack mounted on the base, an upper rack extending vertically from the lower rack, a compactly foldable articulated arm configured to extend horizontally outward away from the upper rack and to connect to a connector piece holding an end effectuator at its distal end, and a tablet personal computer. The console is configured to be readily mobilized on the floor of an existing operating room and is capable of providing a connectivity point for communication, audiovisual, and data-transfer services in an operating room.

Mixed-reality surgical system with physical markers for registration of virtual models

An example method includes obtaining a virtual model of a portion of an anatomy of a patient from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.
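The registration step can be illustrated with a standard least-squares rigid point-set alignment (the Kabsch algorithm). This is a generic sketch, not the patented method, and it assumes the marker positions are available as matched 3D point sets:

```python
import numpy as np

def register_points(model_pts, observed_pts):
    """Kabsch: find rotation R and translation t that best map the
    model marker positions (Nx3) onto their observed positions (Nx3)."""
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # sign correction guards against a reflection
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```

At least three non-collinear markers are needed for the rotation to be uniquely determined.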

Usage and technique analysis of surgeon/staff performance against a baseline to optimize device utilization and performance for both current and future procedures

A situationally aware surgical system configured for use during a surgical procedure performed on a patient by an operating clinician is disclosed, including a surgical instrument configured to generate a signal and a cloud-based analytics subsystem including a memory and a control circuit. The memory is configured to store a plurality of baseline variables. The control circuit is configured to receive the signal, determine, based at least in part on the received signal, a type of surgical procedure being performed, determine that a baseline variable of the plurality of baseline variables corresponds to the determined type of surgical procedure, determine a procedural variable of the surgical procedure based, at least in part, on the received signal, compare the determined procedural variable to the corresponding baseline variable, and generate an alert for the operating clinician based, at least in part, on the comparison.
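The compare-and-alert logic reduces to a lookup-and-threshold check. A minimal sketch, in which the variable names, dictionary layout, and 10% tolerance are all assumptions for illustration rather than details from the patent:

```python
def check_against_baseline(procedure_type, variable_name, measured,
                           baselines, tolerance=0.10):
    """Compare a measured procedural variable against the stored baseline
    for the detected procedure type; return an alert string when the
    relative deviation exceeds the tolerance, else None."""
    baseline = baselines[procedure_type][variable_name]
    deviation = abs(measured - baseline) / baseline
    if deviation > tolerance:
        return (f"ALERT: {variable_name} = {measured} deviates "
                f"{deviation:.0%} from baseline {baseline}")
    return None
```

A real system would derive the procedure type and the procedural variable from the instrument signal rather than take them as arguments.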

Visualization systems using structured light

A visualization system including multiple light sources, an image sensor configured to detect imaging data from the multiple light sources, and a control circuit is disclosed. At least one of the light sources is configured to emit a pattern of structured light onto an anatomical structure. The control circuit is configured to receive the imaging data from the image sensor, generate a three-dimensional digital representation of the anatomical structure from the pattern of structured light detected in the imaging data, obtain metadata from the imaging data, overlay the metadata on the three-dimensional digital representation, receive updated imaging data from the image sensor, and generate an updated three-dimensional digital representation of the anatomical structure based on the updated imaging data. The visualization system can be communicatively coupled to a situational awareness module configured to determine a surgical scenario based on input signals from multiple surgical devices.

Image processing device, image processing method, and surgical navigation system
11707340 · 2023-07-25

Provided is an image processing device including a matching unit, a shift amount estimation unit, and a 3D model update unit. The matching unit performs matching between a predetermined pattern on the surface of a 3D model of a biological tissue including an operating site, generated on the basis of a preoperative diagnosis image, and a predetermined pattern on the surface of the biological tissue included in an image captured during surgery. The shift amount estimation unit estimates an amount of deformation from a preoperative state of the biological tissue on the basis of the result of the matching and information regarding the three-dimensional position of the photographing region, which is the region photographed during surgery on the surface of the biological tissue. The 3D model update unit updates the 3D model generated before surgery on the basis of the estimated amount of deformation of the biological tissue.
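A minimal sketch of the shift-estimation and model-update idea, assuming the matching step has already produced corresponding preoperative and intraoperative surface points; the inverse-distance-weighted interpolation is an illustrative stand-in, not the device's actual update scheme:

```python
import numpy as np

def estimate_deformation(pre_pts, intra_pts):
    """Per-landmark displacement vectors between matched surface points."""
    return intra_pts - pre_pts

def update_model(vertices, pre_pts, disp, eps=1e-9):
    """Warp model vertices by inverse-distance-weighted interpolation
    of the landmark displacements (a crude model-update step)."""
    # pairwise distances: each vertex to each matched landmark
    d = np.linalg.norm(vertices[:, None, :] - pre_pts[None, :, :], axis=2)
    w = 1.0 / (d + eps)
    w /= w.sum(axis=1, keepdims=True)
    return vertices + w @ disp
```

Vertices near a matched landmark move almost exactly with it, while distant vertices receive a blend of all observed displacements.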

Generation of three-dimensional scans for intraoperative imaging

A system for executing a three-dimensional (3D) intraoperative scan of a patient is disclosed. A 3D scanner controller projects the object points included in a first 2D intraoperative image onto a first image plane and the object points included in a second 2D intraoperative image onto a second image plane. The 3D scanner controller determines first epipolar lines associated with the first image plane and second epipolar lines associated with the second image plane based on an epipolar plane that triangulates the object points included in the first 2D intraoperative image to the object points included in the second 2D intraoperative image. Each epipolar line provides a depth of each object point as projected onto the first image plane and the second image plane. The 3D scanner controller converts the first 2D intraoperative image and the second 2D intraoperative image to the 3D intraoperative scan of the patient based on the depth of each object point provided by each corresponding epipolar line.
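The depth recovery described here follows classical two-view epipolar geometry. A generic linear (DLT) triangulation sketch, not the patented controller, assuming calibrated 3x4 camera matrices for the two image planes:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from its pixel
    coordinates x1, x2 in two views with camera matrices P1, P2 (3x4)."""
    # each view contributes two linear constraints on the homogeneous point
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector = homogeneous 3D point
    return X[:3] / X[3]
```

Applying this to every matched pair of object points converts the two 2D intraoperative images into a 3D scan.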

Systems and methods for surgical navigation

Disclosed are systems, methods, and techniques for registering an HMD coordinate system of a head-mounted display (HMD) and a localizer coordinate system of a surgical navigation localizer. A camera of the HMD captures at least one image of a registration device having a registration coordinate system and a plurality of registration markers. The registration markers are analyzed in the at least one image to determine a pose of the HMD coordinate system relative to the registration coordinate system. One or more position sensors comprised in the localizer detect a plurality of tracking markers comprised in the registration device to determine a pose of the registration coordinate system relative to the localizer coordinate system. The HMD coordinate system and the localizer coordinate system are registered using the registration device, wherein positions of the registration markers are known with respect to positions of the tracking markers in the registration coordinate system.
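The final registration amounts to chaining the two measured poses through the shared registration frame. A minimal sketch with 4x4 homogeneous transforms, where `T_a_from_b` denotes the transform taking coordinates in frame `b` to frame `a` (the naming convention is an assumption, not the patent's):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def register_frames(T_hmd_from_reg, T_localizer_from_reg):
    """Chain the two poses measured via the registration device:
    T_localizer_from_hmd = T_localizer_from_reg @ inv(T_hmd_from_reg)."""
    return T_localizer_from_reg @ np.linalg.inv(T_hmd_from_reg)
```

Once this transform is known, anything tracked by the localizer can be drawn at the correct position in the HMD view, and vice versa.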