
Systems and methods for detection and illumination of regions of interest

An illumination system for a lighting assembly comprises a light assembly configured to selectively illuminate an operating region in a surgical suite and a plurality of light sources positioned within the light assembly and configured to emit light. The system further comprises at least one imager configured to capture image data and a controller. The controller is configured to scan the image data in at least one region of interest for a shaded region and identify a location of the shaded region within the region of interest. The controller is further configured to control the light assembly to activate at least one of the light sources to emit light impinging on the shaded region within the region of interest.
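The controller's scan step can be illustrated with a simple sketch: find dark pixels inside a region of interest of a grayscale frame and report their centroid, which a controller could then map to a light source. All names and the threshold here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def find_shaded_region(frame, roi, shade_threshold=60):
    """Return the (row, col) centroid of shaded pixels in the ROI, or None.

    frame : 2-D uint8 array of luminance values
    roi   : (top, left, height, width) bounding box within the frame
    """
    top, left, h, w = roi
    window = frame[top:top + h, left:left + w]
    shaded = window < shade_threshold          # pixels darker than threshold
    if not shaded.any():
        return None                            # ROI is adequately lit
    rows, cols = np.nonzero(shaded)
    # Centroid in full-frame coordinates; a real controller would map this
    # location to the light source whose beam covers it.
    return (top + rows.mean(), left + cols.mean())

frame = np.full((100, 100), 200, dtype=np.uint8)
frame[40:50, 60:70] = 30                       # simulated shadow
loc = find_shaded_region(frame, (0, 0, 100, 100))
```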

Patient-specific guides for Latarjet procedure

Patient-specific guides for the Latarjet procedure, as well as surgical systems and methods of performing the Latarjet procedure to treat glenohumeral instability using such patient-specific guides, are disclosed. A patient-specific coracoid guide and a patient-specific glenoid guide may be configured based on preoperatively generated three-dimensional models of the patient's shoulder anatomy. Guides may be configured for coracoid graft preparation and glenoid decortication. The coracoid graft may be placed in the desired position based on three-dimensional (3D) preoperative planning.

SYSTEMS AND METHODS FOR CONTROLLING A SURGICAL ROBOTIC ASSEMBLY IN AN INTERNAL BODY CAVITY

Methods and systems for performing a surgery within an internal cavity of a subject are provided herein. An example method for controlling a robotic assembly of a surgical robotic system includes, while at least a portion of the robotic assembly is disposed in an internal cavity of a subject, receiving a first control mode selection input from an operator and changing a current control mode of the surgical robotic system to a first control mode in response to the first control mode selection input; while the surgical robotic system is in the first control mode, receiving a first control input from hand controllers; and, in response to receiving the first control input, changing a position and/or an orientation of at least a portion of a camera assembly, at least a portion of a robotic arm assembly, or both, while maintaining a stationary position of instrument tips of end effectors disposed at distal ends of the robotic arms.
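The mode-selection flow can be sketched as a small state machine: an operator input switches the system into a mode in which hand-controller input moves the camera assembly while the instrument tips stay fixed. All class and method names below are assumptions for illustration only.

```python
from enum import Enum, auto

class ControlMode(Enum):
    IDLE = auto()
    REPOSITION = auto()   # move camera/arm bodies, hold instrument tips fixed

class SurgicalRobotController:
    def __init__(self):
        self.mode = ControlMode.IDLE
        self.camera_pose = [0.0, 0.0, 0.0]
        self.tip_position = (10.0, 0.0, 5.0)   # must not move in REPOSITION

    def select_mode(self, mode):
        # Corresponds to the "control mode selection input" in the abstract.
        self.mode = mode

    def handle_hand_controller(self, delta):
        if self.mode is ControlMode.REPOSITION:
            # Apply the commanded motion to the camera assembly only;
            # the end-effector tips are left untouched.
            self.camera_pose = [p + d for p, d in zip(self.camera_pose, delta)]

ctrl = SurgicalRobotController()
ctrl.select_mode(ControlMode.REPOSITION)
ctrl.handle_hand_controller((0.5, 0.0, -0.2))
```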

Stereo microscope for use in microsurgical operations on a patient and method for controlling the stereo microscope

A stereo microscope includes a stand; two optical image acquisition units configured to connect to the stand to capture a stereoscopic image, whose two optical axes define an imaging plane; a pair of video glasses including two optical image reproduction units, each having an optical axis and a display for reproducing an image, which together define an image plane, wherein the optical image reproduction units are arranged to produce a stereoscopic image impression and their two optical axes define an image reproduction plane; a detection device configured to determine the spatial orientation of the video glasses, the image reproduction plane, the image plane, and the imaging plane; and a control unit configured to pivot the stand so that the intersection lines of the image plane and the imaging plane on the image reproduction plane are made parallel. Methods are also disclosed.
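The alignment criterion has a compact geometric form: representing each plane by a unit normal, the line where two planes meet has direction given by the cross product of their normals, and the control goal is that the two intersection lines on the reproduction plane are parallel. The helper names below are illustrative, not from the patent.

```python
import numpy as np

def intersection_direction(n_a, n_b):
    """Direction vector of the line where two (non-parallel) planes meet."""
    return np.cross(n_a, n_b)

def lines_parallel(n_image, n_imaging, n_repro, tol=1e-9):
    d1 = intersection_direction(n_image, n_repro)
    d2 = intersection_direction(n_imaging, n_repro)
    # Two directions are parallel when their cross product vanishes.
    return np.linalg.norm(np.cross(d1, d2)) < tol

n_repro = np.array([0.0, 0.0, 1.0])
n_image = np.array([1.0, 0.0, 0.5])
n_imaging = np.array([1.0, 0.0, -0.3])   # tilted differently about same axis
aligned = lines_parallel(n_image, n_imaging, n_repro)
```

Both example planes are tilted about the same axis, so their traces on the reproduction plane come out parallel and the check succeeds.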

Rotary motion passive end effector for surgical robots in orthopedic surgeries
11510684 · 2022-11-29

A passive end effector of a surgical system includes a base connected to a rotational disk, and a saw attachment connected to the rotational disk. The base is attached to an end effector coupler of a robot arm positioned by a surgical robot, and includes a base arm extending away from the end effector coupler. The rotational disk is rotatably connected to the base arm and rotates about a first location on the rotational disk relative to the base arm. The saw attachment is rotatably connected to the rotational disk and rotates about a second location on the rotational disk. The first location on the rotational disk is spaced apart from the second location on the rotational disk. The saw attachment is configured to connect to a surgical saw including a saw blade configured to oscillate for cutting. The saw attachment rotates about the rotational disk and the rotational disk rotates about the base arm to constrain cutting of the saw blade to a range of movement along arcuate paths within a cutting plane.
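The two stacked pivots (disk about base arm, saw about an offset point on the disk) form a planar two-link mechanism, so the blade tip traces arcs confined to the cutting plane. A minimal forward-kinematics sketch, with link lengths and names chosen purely for illustration:

```python
import math

def saw_tip_position(disk_angle, saw_angle, r_disk=0.10, r_saw=0.15):
    """2-D tip position in the cutting plane.

    disk_angle : rotation of the disk about the base arm (radians)
    saw_angle  : rotation of the saw attachment about its offset pivot
    r_disk     : distance between the two pivot locations on the disk
    r_saw      : distance from the saw pivot to the blade tip
    """
    # First pivot: the disk rotates relative to the base arm.
    px = r_disk * math.cos(disk_angle)
    py = r_disk * math.sin(disk_angle)
    # Second pivot: the saw attachment rotates about a point on the disk.
    tx = px + r_saw * math.cos(disk_angle + saw_angle)
    ty = py + r_saw * math.sin(disk_angle + saw_angle)
    return (tx, ty)   # no z component: motion stays in the cutting plane

tip = saw_tip_position(math.pi / 2, 0.0)
```

Because only the two planar angles vary, every reachable tip position lies in one plane, which is the constraint the abstract describes.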

Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications

A surgical system includes an XR headset and an XR headset controller. The XR headset is configured to be worn by a user during a surgical procedure and includes a see-through display screen configured to display an XR image for viewing by the user. The XR headset controller is configured to receive a plurality of two-dimensional (“2D”) image data associated with an anatomical structure of a patient. The XR headset controller is further configured to generate a first 2D image from the plurality of 2D image data based on a pose of the XR headset. The XR headset controller is further configured to generate a second 2D image from the plurality of 2D image data based on the pose of the XR headset. The XR headset controller is further configured to generate the XR image by displaying the first 2D image in a field of view of a first eye of the user and displaying the second 2D image in a field of view of a second eye of the user.
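One way to picture the per-eye rendering is selecting one 2D slice per eye from a stack (e.g. DICOM images) using the headset pose, with a small per-eye offset producing the stereoscopic pair. The selection rule below is a simplified assumption, not the patent's method.

```python
def select_slice_index(num_slices, pose_depth, eye_offset):
    """Map a normalized pose depth (0..1) plus a per-eye offset to a slice."""
    idx = round(pose_depth * (num_slices - 1)) + eye_offset
    return max(0, min(num_slices - 1, idx))   # clamp to the stack

def generate_xr_views(slices, pose_depth, disparity=1):
    left = slices[select_slice_index(len(slices), pose_depth, -disparity)]
    right = slices[select_slice_index(len(slices), pose_depth, +disparity)]
    return left, right   # displayed to the first and second eye respectively

slices = [f"slice_{i}" for i in range(10)]
left, right = generate_xr_views(slices, pose_depth=0.4)
```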

VIRTUAL REALITY SURGICAL CAMERA SYSTEM

A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly and having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly operably coupled to the trocar. The camera assembly includes a camera support tube having a distal end and a proximal end, the stereoscopic camera assembly operably coupled to the distal end of the support tube, and first and second camera modules having first and second optical axes.

HEAD-MOUNTED VISION DETECTION EQUIPMENT, VISION DETECTION METHOD AND ELECTRONIC DEVICE
20220369924 · 2022-11-24

The present disclosure relates to head-mounted vision detection equipment, a vision detection method, and an electronic device, in the technical field of vision detection. The head-mounted vision detection equipment includes a virtual reality headset, a sound collection device and a fundus detection device that are arranged on the virtual reality headset, and a processor. The virtual reality headset is configured to display content to be recognized under control of the processor; the sound collection device is configured to obtain a recognition voice of a wearer for the content to be recognized; the fundus detection device is configured to obtain a fundus image of the wearer; and the processor is configured to acquire the recognition voice and the fundus image.

GESTURE BASED SELECTION OF PORTION OF CATHETER
20220370145 · 2022-11-24

In one embodiment, a medical system includes a catheter configured to be inserted into a body part of a living subject, a display configured to provide a view of at least part of a hand of a user, and a processor configured to track a position of the catheter in the body part, render to the display a three-dimensional view of an interior of an anatomical map of the body part and a representation of the catheter inside the anatomical map responsively to the tracked position, while the display is providing the view of the at least part of the hand of the user, recognize a gesture of the at least part of the hand of the user selecting a portion of the catheter, and perform an action responsively to recognizing selection by the user of the portion of the catheter.
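The gesture-selection step can be sketched as a nearest-neighbor test: given a tracked fingertip position and the rendered positions of selectable catheter portions (e.g. electrodes along the shaft), pick the closest portion within a reach threshold. The names and threshold below are assumptions for illustration.

```python
import math

def select_catheter_portion(fingertip, portions, max_distance=0.03):
    """Return the index of the catheter portion nearest the fingertip.

    fingertip : (x, y, z) position from hand tracking
    portions  : list of (x, y, z) positions of selectable portions
    """
    best, best_d = None, max_distance
    for i, p in enumerate(portions):
        d = math.dist(fingertip, p)
        if d < best_d:
            best, best_d = i, d
    return best   # None when no portion is close enough

electrodes = [(0.0, 0.0, z / 100) for z in range(5)]   # spaced along the shaft
picked = select_catheter_portion((0.001, 0.0, 0.021), electrodes)
```

The returned index could then drive whatever action the system performs in response to the selection.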

Surgical Simulation System With Coordinated Imaging

An interactive and dynamic surgical simulation system may be used in the context of a computer-implemented interactive surgical system. The surgical simulation system may provide coordinated surgical imaging. A processor may be configured to execute a simulation of a surgical procedure. The surgical procedure may be simulated in a simulated surgical environment. The processor may generate a first visual representation and a second visual representation, each of a first portion of the simulated surgical environment. The processor may coordinate generation of the first visual representation and the second visual representation such that they correspond to a common event in the surgical procedure. The processor may then present the first visual representation and the second visual representation for user interaction within the simulated surgical environment.
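The coordination idea can be sketched minimally: both renderers read one shared event timeline, so the two views always depict the same event in the simulated procedure. All names here are illustrative assumptions.

```python
class SimulationClock:
    """Single source of truth for the current simulated event."""
    def __init__(self):
        self.event = None

    def advance(self, event):
        self.event = event

def render_view(view_name, clock):
    # Both renderers read the same clock, so their outputs stay coordinated.
    return f"{view_name}:{clock.event}"

clock = SimulationClock()
clock.advance("incision_started")
first_view = render_view("endoscope", clock)
second_view = render_view("overview", clock)
```

Because the event state lives in one place, the two representations cannot drift apart, which is the coordination property the abstract describes.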