Patent classifications
A61B2090/372
Tracker for a head-mounted display
A tracker 30 for a Head-Mounted Display, HMD, unit is provided. The tracker 30 comprises a carrier element 10 carrying one or more markers 16a, 16b that are configured to permit determining a position of the tracker 30. The carrier element 10 comprises at least one magnetic element 32 configured to cooperate with at least one magnetic element 22 provided on the HMD unit 62, or on a base element 20 that is to be fixed to the HMD unit 62, for detachably attaching the carrier element 10 to the HMD unit 62.
SURGICAL VIRTUAL REALITY USER INTERFACE
A surgical virtual reality user interface generating system comprising a sensor and tracking unit for sensing and tracking a position of a user and generating position data based on movement of the user, and a computing unit for receiving and processing the position data and generating control signals. The system also includes a surgical robot system for receiving the control signals and having a camera assembly for generating image data, and a virtual reality computing unit for generating a virtual reality world. The virtual reality computing unit includes a virtual reality rendering unit for generating an output rendering signal for rendering the image data for display, and a virtual reality object generating unit for generating virtual reality informational objects and for emplacing the informational objects in the virtual reality world. A display unit is provided for displaying the virtual reality world and the informational objects to the user.
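The dataflow in this abstract can be sketched as a simple pipeline. All class and method names below are illustrative placeholders, not taken from the patent, and the payloads stand in for real sensor, control, and image data:

```python
# Minimal sketch of the described pipeline: tracking -> computing unit ->
# robot/camera -> VR computing unit -> rendered world with info objects.
# Names and data shapes are assumptions for illustration only.

class SensorTrackingUnit:
    def position_data(self, user_motion):
        return {"pose": user_motion}           # tracked user position

class ComputingUnit:
    def control_signals(self, position_data):
        return {"cmd": position_data["pose"]}  # position data -> control signals

class SurgicalRobot:
    def image_data(self, control):
        return f"frame@{control['cmd']}"       # camera assembly output

class VRComputingUnit:
    def render(self, image_data, info_objects):
        # rendering unit plus object generating unit: the camera image
        # with informational objects emplaced in the VR world
        return {"scene": image_data, "objects": info_objects}

def run_pipeline(user_motion, info_objects):
    pos = SensorTrackingUnit().position_data(user_motion)
    ctl = ComputingUnit().control_signals(pos)
    img = SurgicalRobot().image_data(ctl)
    return VRComputingUnit().render(img, info_objects)
```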
CUP ALIGNMENT SYSTEMS AND METHODS
A system can include a module to measure mobility, such as pre-operative pelvic mobility, for surgical planning. The module can include one or more inertial sensors that can be positioned relative to the anatomy of a patient. Hip navigation systems can guide an acetabular cup to patient-specific target angles based, in part, on the pre-operative pelvic mobility of the patient.
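One way such patient-specific targets are commonly derived is by adjusting nominal supine cup angles for measured pelvic tilt. The linear coefficient below is a widely cited rule of thumb, not a value stated in the abstract, and the function name is hypothetical:

```python
def adjust_cup_targets(nominal_inclination: float, nominal_anteversion: float,
                       pelvic_tilt_deg: float,
                       tilt_to_anteversion: float = 0.7):
    """Sketch of tilt-adjusted cup targets. Functional anteversion is
    assumed to change roughly linearly with pelvic tilt (~0.7 deg per
    deg of tilt, an assumption); inclination is left unchanged here."""
    return (nominal_inclination,
            nominal_anteversion + tilt_to_anteversion * pelvic_tilt_deg)
```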
CONTROL ACCESS VERIFICATION OF A HEALTH CARE PROFESSIONAL
A computing system may identify a surgical instrument for a surgical procedure in an operating room (OR). The computing system may detect a control input by a health care professional (HCP) to control the surgical instrument. The computing system may determine the HCP's access control level associated with the surgical instrument. The computing system may determine whether the HCP has an authorization to control the surgical instrument. If the computing system determines that the HCP is unauthorized to control the surgical instrument based on the access control level associated with the HCP, the computing system may block the control input by the HCP. If the computing system determines that the HCP is authorized to control the surgical instrument based on the access control level associated with the HCP, the computing system may effectuate the control input by the HCP to control the surgical instrument.
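The block-or-effectuate decision above reduces to a simple authorization check. The instrument names and required levels below are invented for illustration; the abstract does not specify any particular mapping:

```python
# Assumed mapping from instrument to the minimum access control level
# required to operate it (illustrative values only).
REQUIRED_LEVEL = {
    "energy_device": 3,
    "stapler": 2,
    "grasper": 1,
}

def effectuate_control(hcp_level: int, instrument: str) -> bool:
    """Return True to effectuate the HCP's control input, or False to
    block it, based on the HCP's access control level."""
    required = REQUIRED_LEVEL.get(instrument)
    if required is None:
        return False  # unknown instrument: block by default (an assumption)
    return hcp_level >= required
```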
SYSTEMS AND METHODS FOR CONTROLLING A SURGICAL ROBOTIC ASSEMBLY IN AN INTERNAL BODY CAVITY
Methods and systems for performing a surgery within an internal cavity of a subject are provided herein. An example method for controlling a robotic assembly of a surgical robotic system includes, while at least a portion of the robotic assembly is disposed in an interior cavity of a subject, receiving a first control mode selection input from an operator and changing a current control mode of the surgical robotic system to a first control mode in response to the first control mode selection input; while the surgical robotic system is in the first control mode, receiving a first control input from hand controllers; and, in response to receiving the first control input, changing a position and/or an orientation of at least a portion of the camera assembly, of at least a portion of the robotic arm assembly, or of both, while maintaining a stationary position of instrument tips of the end effectors disposed at distal ends of the robotic arms.
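The mode-selection behavior can be viewed as a small state machine: the same hand-controller input drives different hardware depending on the current control mode, and in the first mode the instrument tips are deliberately held fixed. The mode names and data layout below are assumptions for illustration:

```python
from enum import Enum, auto

class ControlMode(Enum):
    # Mode names are illustrative; the abstract only says "first control mode".
    INSTRUMENT = auto()   # hand controllers drive the end effectors
    CAMERA_ARM = auto()   # hand controllers reposition camera/arm bodies

class RoboticSystem:
    def __init__(self):
        self.mode = ControlMode.INSTRUMENT
        self.tip_positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
        self.camera_pose = (0.0, 0.0, 0.0)

    def select_mode(self, mode: ControlMode):
        """Change the current control mode in response to a selection input."""
        self.mode = mode

    def apply_hand_input(self, delta):
        if self.mode is ControlMode.CAMERA_ARM:
            # Reposition the camera assembly; the instrument tips are
            # intentionally left untouched (stationary), as the abstract states.
            self.camera_pose = tuple(c + d for c, d in zip(self.camera_pose, delta))
```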
Stereo microscope for use in microsurgical operations on a patient and method for controlling the stereo microscope
A stereo microscope includes a stand and two optical image acquisition units configured to connect to the stand to capture a stereoscopic image, the two optical axes of the image acquisition units defining an imaging plane. It further includes a pair of video glasses including two optical image reproduction units, each having an optical axis and a display for reproducing an image, which together define an image plane; the optical image reproduction units are arranged to produce a stereoscopic image impression, and their two optical axes define an image reproduction plane. A detection device is configured to determine the spatial orientation of the video glasses, the image reproduction plane, the image plane and the imaging plane, and a control unit is configured to pivot the stand so that the intersection lines of the image plane and the imaging plane on the image reproduction plane are made parallel. Methods are also disclosed.
Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
A surgical system includes an XR headset and an XR headset controller. The XR headset is configured to be worn by a user during a surgical procedure and includes a see-through display screen configured to display an XR image for viewing by the user. The XR headset controller is configured to receive a plurality of two-dimensional (“2D”) image data associated with an anatomical structure of a patient. The XR headset controller is further configured to generate a first 2D image from the plurality of 2D image data based on a pose of the XR headset. The XR headset controller is further configured to generate a second 2D image from the plurality of 2D image data based on the pose of the XR headset. The XR headset controller is further configured to generate the XR image by displaying the first 2D image in a field of view of a first eye of the user and displaying the second 2D image in a field of view of a second eye of the user.
VIRTUAL REALITY SURGICAL CAMERA SYSTEM
A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly operably coupled to the trocar. The camera assembly includes a camera support tube having a distal end and a proximal end, the stereoscopic camera operably coupled to the distal end of the support tube, and first and second camera modules having first and second optical axes.
HEAD-MOUNTED VISION DETECTION EQUIPMENT, VISION DETECTION METHOD AND ELECTRONIC DEVICE
The present disclosure relates to head-mounted vision detection equipment, a vision detection method and an electronic device, in the technical field of vision detection. The head-mounted vision detection equipment includes a virtual reality headset, a sound collection device and a fundus detection device arranged on the virtual reality headset, and a processor. The virtual reality headset is configured to display content to be recognized under control of the processor; the sound collection device is configured to obtain a recognition voice of a wearer for the content to be recognized; the fundus detection device is configured to obtain a fundus image of the wearer; and the processor is configured to acquire the recognition voice and the fundus image.
GESTURE BASED SELECTION OF PORTION OF CATHETER
In one embodiment, a medical system includes a catheter configured to be inserted into a body part of a living subject, a display configured to provide a view of at least part of a hand of a user, and a processor configured to track a position of the catheter in the body part, render to the display a three-dimensional view of an interior of an anatomical map of the body part and a representation of the catheter inside the anatomical map responsively to the tracked position, while the display is providing the view of the at least part of the hand of the user, recognize a gesture of the at least part of the hand of the user selecting a portion of the catheter, and perform an action responsively to recognizing selection by the user of the portion of the catheter.
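Selecting a portion of the catheter from a hand gesture can be sketched as a nearest-point query: once the gesture (e.g., a pinch) is recognized upstream, the system maps its 3D location to the closest catheter point within a tolerance. The threshold value and function names are assumptions for illustration:

```python
import numpy as np

def select_catheter_portion(gesture_point: np.ndarray,
                            catheter_points: np.ndarray,
                            threshold_mm: float = 5.0):
    """Return the index of the catheter point (e.g., an electrode) closest
    to the recognized gesture location, or None if nothing lies within
    `threshold_mm`. Gesture recognition itself is assumed to happen upstream."""
    dists = np.linalg.norm(catheter_points - gesture_point, axis=1)
    i = int(np.argmin(dists))
    return i if dists[i] <= threshold_mm else None
```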