A61B2034/2057

Virtual reality surgical camera system

A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly and having a stereoscopic camera assembly, and at least one rotational positional sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having inner and outer diameters, and a seal sub-assembly, comprising at least one seal, operably coupled to the trocar. The camera assembly includes a camera support tube having distal and proximal ends, the stereoscopic camera assembly operably coupled to the distal end of the support tube and including first and second camera modules having first and second optical axes.

Method for monitoring object flow within a surgical space during a surgery

One variation of a method for tracking objects within a surgical space during a surgery includes: based on a first image depicting the surgical space at a first time, detecting a first object and a constellation of objects in the surgical space, estimating distances from each object—in the constellation of objects—to the first object, and calculating a contamination risk of the first object based on contamination scores and distances to the first object for each object in the constellation of objects; calculating a contamination score of the first object based on a combination of the contamination risks of the first object during the surgery; and, in response to the contamination score of the first object exceeding a threshold contamination score prior to contact between the first object and a patient, serving a prompt within the surgical space to address sterility of the first object.
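The scoring logic described above can be sketched as follows. The inverse-distance weighting and the mean-based combination are illustrative assumptions; the abstract states only that risk is computed from per-object contamination scores and distances, and that per-frame risks are combined into a running score.

```python
import math

def contamination_risk(first_obj, constellation):
    """Weight each neighboring object's contamination score by its
    proximity to the first object. Inverse-distance weighting is an
    assumed model, not specified in the abstract."""
    risk = 0.0
    for obj in constellation:
        d = math.dist(first_obj["pos"], obj["pos"])
        risk += obj["score"] / max(d, 1e-6)  # avoid division by zero
    return risk

def update_score(risk_history, new_risk, threshold=5.0):
    """Combine per-frame contamination risks into a running score and
    flag when it exceeds the threshold (assumed combination: mean)."""
    risk_history.append(new_risk)
    score = sum(risk_history) / len(risk_history)
    return score, score > threshold
```

When the flag is raised before the object contacts the patient, the system would serve the sterility prompt described above.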

Optical tracking device with built-in structured light module

A system is disclosed that includes an optical tracking device and a surgical computing device. The optical tracking device includes a structured light module and an optical module that includes an image sensor and is spaced from the structured light module at a known distance. The surgical computing device includes a display device, a non-transitory computer readable medium including instructions, and processor(s) configured to execute the instructions to generate a depth map from a first image captured by the image sensor during projection of a pattern into a surgical environment by the structured light module. The pattern is projected in a near-infrared (NIR) spectrum. The processor(s) are further configured to execute the stored instructions to reconstruct a 3D surface of anatomical structure(s) based on the generated depth map. Additionally, the processor(s) are configured to execute the stored instructions to output the reconstructed 3D surface to the display device.
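Because the optical module sits at a known distance from the structured light module, depth can be triangulated from the disparity between the observed NIR pattern and a stored reference. A minimal sketch of that geometry, using the standard structured-light relation Z = f·b/d (the function name and array handling are assumptions, not from the patent):

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate per-pixel depth from the disparity (in pixels)
    between the projected NIR pattern and a reference pattern.
    baseline_m is the known module-to-sensor spacing; pixels with
    non-positive disparity are marked invalid (infinite depth)."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth
```

The resulting depth map would then feed the 3D surface reconstruction output to the display device.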

Real-time surgical reference indicium apparatus and methods for astigmatism correction
11497561 · 2022-11-15

A system, method, and apparatus for guiding an astigmatism correction procedure on an eye of a patient are disclosed. An example apparatus includes a photosensor configured to record a pre-operative still image of an ocular target surgical site of the patient. The apparatus also includes a real-time, multidimensional visualization module configured to produce a real-time multidimensional visualization of the ocular target surgical site during an astigmatism correction procedure. The apparatus further includes a data processor configured to determine a virtual indicium that includes data for guiding the astigmatism correction procedure. The data processor uses the pre-operative still image to align the virtual indicium with the multidimensional visualization such that the virtual indicium is rotationally accurate. The data processor then displays the multidimensional visualization of the ocular target surgical site in conjunction with the virtual indicium.

Systems and methods for capturing, displaying, and manipulating medical images and videos

A surgical image capture and display system includes a handheld image capture and pointing device and a display assembly. An image is captured by an image sensor of the handheld device and displayed on the display assembly. The image sensor detects light emitted by one or more beacons of the display assembly. The system determines, based on the light emitted by the one or more beacons, a position or orientation of the handheld device relative to the display assembly. The system updates display of a graphical user interface comprising the image on the display assembly in accordance with the determined position or orientation of the handheld device.
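Determining the handheld device's orientation from beacon light on the image sensor can be sketched with a single-beacon, yaw/pitch-only model; this is an illustrative simplification of the multi-beacon tracking described, and the function name and pinhole-camera assumption are not from the patent:

```python
import math

def pointer_offset(beacon_px, image_center_px, focal_px):
    """Estimate the device's angular offset toward the display from
    where a beacon appears on the image sensor, assuming a pinhole
    camera with the given focal length in pixels."""
    dx = beacon_px[0] - image_center_px[0]
    dy = beacon_px[1] - image_center_px[1]
    yaw = math.atan2(dx, focal_px)    # horizontal pointing angle
    pitch = math.atan2(dy, focal_px)  # vertical pointing angle
    return yaw, pitch
```

The estimated angles would drive updates to the on-screen cursor or other GUI elements as the handheld device moves.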

Systems And Methods For Tracking Objects

Systems and methods to track objects within an operating room with a navigation system that includes an optical sensor including sensing elements and a controller in communication with the optical sensor. The controller controls the optical sensor to process a first quantity of the sensing elements to view a first region of interest. The object is tracked within the first region of interest with the optical sensor. The controller obtains data related to the object within the operating room. The data may include a type of the object and/or prior pose data of the object being tracked. Based on the data, the controller controls the optical sensor to process a second quantity of the sensing elements, different from the first quantity, to view a second region of interest different from the first region of interest. The object is tracked within the second region of interest with the optical sensor.
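The controller's choice of how many sensing elements to process can be sketched as a windowing decision driven by object type and prior pose. The window sizes and the clamping heuristic below are illustrative assumptions; the abstract specifies only that the second quantity differs from the first and is chosen based on that data:

```python
def select_window(prior_pose_px, obj_type, full_res=(2048, 2048)):
    """Choose a readout window (x, y, width, height) of sensing
    elements: a small window centered on the predicted pose when
    prior pose data exists, the full array otherwise. Sizes per
    object type are assumed values."""
    if prior_pose_px is None:
        return (0, 0, full_res[0], full_res[1])  # full region of interest
    half = 128 if obj_type == "pointer" else 256
    cx, cy = prior_pose_px
    x0 = max(0, min(cx - half, full_res[0] - 2 * half))  # clamp to sensor
    y0 = max(0, min(cy - half, full_res[1] - 2 * half))
    return (x0, y0, 2 * half, 2 * half)
```

Reading out fewer elements for a well-predicted object allows a higher frame rate over the second region of interest than over the full array.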

Surgical Tracker With Emitters Triggered By Electromagnetic Radiation

A tracker, a surgical tracking system, and a method for operating the tracker are provided. The tracker comprises an interface configured to attach the tracker to a surgical object that is to be tracked. The tracker further comprises circuitry comprising a detector configured to detect electromagnetic radiation, wherein the circuitry is configured to generate a trigger signal upon detection of a change of intensity of electromagnetic radiation by the detector. The circuitry further comprises a plurality of emitters configured to emit electromagnetic radiation, wherein the circuitry is configured to control the plurality of emitters to emit electromagnetic radiation responsive to the trigger signal.

ROBOTIC SURGERY

Teleoperative, partially automated, and fully automated robotic surgery systems and methods are described herein. These systems and methods relate to at least improvement of robotic movements, three-dimensional tracking and pose correction for robots interacting with deformable objects, controlling and optimizing the redundant axis of a seven-degree-of-freedom robotic arm, virtual robotic surgery and simulation, and task coordination and optimization for multi-robot surgery.

Patella Tracking

Disclosed herein are a surgical system for patella tracking and a method for selecting a properly-sized patellar implant utilizing the same. The surgical system may include first and second trackers and a patellar tracking system. The first tracker may be configured to contact an unresected or a resected patella, and the second tracker may be configured to contact a bone. The patellar tracking system may be configured to track the first and second trackers during patellar flexion and extension to generate patellar range of motion and patellar trial range of motion. A method for selecting a patellar implant may utilize the first and second trackers and the patellar tracking system.

POSE MEASUREMENT CHAINING FOR EXTENDED REALITY SURGICAL NAVIGATION IN VISIBLE AND NEAR INFRARED SPECTRUMS
20230040292 · 2023-02-09

A surgical system includes a camera tracking system that determines a first pose transform between a first object coordinate system and a first tracking camera coordinate system based on first object tracking information from the first tracking camera which indicates pose of the first object. The camera tracking system determines a second pose transform between the first object coordinate system and a second tracking camera coordinate system based on first object tracking information from the second tracking camera indicating pose of the first object, and determines a third pose transform between a second object coordinate system and the second tracking camera coordinate system based on second object tracking information from the second tracking camera indicating pose of the second object. The camera tracking system determines a fourth pose transform between the second object coordinate system and the first tracking camera coordinate system based on combining the first, second, and third pose transforms.
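The chaining of the three measured transforms into the fourth can be sketched with 4×4 homogeneous matrices. Mapping object 2's frame into camera 1's frame goes object2 → camera2 → object1 → camera1, which is the composition below; the function name is an assumption:

```python
import numpy as np

def chain_pose(T_cam1_obj1, T_cam2_obj1, T_cam2_obj2):
    """Combine the first, second, and third pose transforms into the
    fourth: the pose of object 2 in tracking camera 1's frame.
      T_cam1_obj2 = T_cam1_obj1 @ inv(T_cam2_obj1) @ T_cam2_obj2
    All arguments are 4x4 homogeneous transforms mapping the named
    object frame into the named camera frame."""
    return T_cam1_obj1 @ np.linalg.inv(T_cam2_obj1) @ T_cam2_obj2
```

This is why object 2 need not be visible to camera 1: its pose in camera 1's frame follows from transforms measured by both cameras against the shared first object.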