
Method and system of providing visual information about a location and shape of a tumour under a body surface of a human or animal body

In a method and system for providing visual information about a tumour location in a human or animal body, an electromagnetic tumour sensor is provided in the tumour and tracked to determine its location in space, which is mapped to a tumour model. A surgical tool sensor is provided on a surgical tool and tracked to determine its location in space, which is mapped to a surgical tool model. The body is scanned to obtain information about an anatomical structure. A reference sensor is provided on the body and tracked to determine its location in space, which is mapped to the anatomical structure. A virtual image is displayed showing the tumour model, located with the tumour sensor, in spatial relationship to the surgical tool model, located with the surgical tool sensor, and the anatomical structure, located with the reference sensor.
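The mapping of a tracked sensor location in tracker space onto a model could be sketched as a rigid-body transform. This is a minimal illustration, not the patented method; the function name and the example registration (a 90-degree rotation plus an offset) are hypothetical.

```python
import numpy as np

def map_to_model(p_sensor, R, t):
    """Map a tracked sensor position (tracker frame) into the model frame
    via a rigid transform: rotation matrix R, translation vector t."""
    return R @ np.asarray(p_sensor, dtype=float) + t

# Hypothetical registration: 90-degree rotation about z, plus a translation.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([10.0, 0.0, 0.0])

p_model = map_to_model([1.0, 2.0, 3.0], R, t)  # position in the model frame
```

In practice the same transform would be applied to the tumour, tool, and reference sensors, each with its own registration to the corresponding model.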

SURGICAL VIRTUAL REALITY USER INTERFACE
20220387128 · 2022-12-08 ·

A surgical virtual reality user interface generating system comprising a sensor and tracking unit for sensing and tracking a position of a user and generating position data based on movement of the user, and a computing unit for receiving the position data, processing the position data, and generating control signals. The system also includes a surgical robot system for receiving the control signals and having a camera assembly for generating image data, and a virtual reality computing unit for generating a virtual reality world. The virtual reality computing unit includes a virtual reality rendering unit for generating an output rendering signal for rendering the image data for display, and a virtual reality object generating unit for generating virtual reality informational objects and for emplacing the informational objects in the virtual reality world. A display unit is provided for displaying the virtual reality world and the informational objects to the user.

SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT FOR IMPROVED MINI-SURGERY USE CASES
20220387129 · 2022-12-08 ·

An imaging system (a 3D camera) operative in conjunction with a tube having two open ends, the system comprising active portions small enough to fit into the tube and an electronic subsystem including a hardware processor operative to receive one or more images from the active portions and to generate therefrom at least one 3D image of a scene visible through one of the tube's open ends. The system may comprise a tracker configured to be secured to the tube, and a method for monitoring the location, e.g. the absolute location, of the tube accordingly.

INSTRUMENT MONITORING DEVICE AND INSTRUMENT MONITORING METHOD
20220392227 · 2022-12-08 ·

A tool is judged to be lost at a second time point after a certain amount of time or longer has elapsed since a first time point at which the tool is judged not to be recognized at all in the images. For a tool judged to have been lost, trace data corresponding to a heads-up period, spanning from immediately before to immediately after the first time point, is read with reference to a corresponding identifier, and a movement locus of the tool based on the trace data is displayed on a screen.
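The lost-tool judgement and the heads-up trace read could be sketched as follows. This is an illustrative interpretation only; the threshold, margin, and function names are assumptions, not values from the publication.

```python
LOST_THRESHOLD_S = 2.0  # hypothetical "certain amount of time"

def first_unrecognized(timestamps, recognized):
    """Return the first time point at which the tool is not recognized."""
    for t, seen in zip(timestamps, recognized):
        if not seen:
            return t
    return None

def is_lost(timestamps, recognized, now):
    """Judge the tool lost when it stays unrecognized for LOST_THRESHOLD_S
    seconds or longer after the first unrecognized time point."""
    t0 = first_unrecognized(timestamps, recognized)
    if t0 is None:
        return False
    seen_again = any(seen for t, seen in zip(timestamps, recognized) if t > t0)
    return (not seen_again) and (now - t0 >= LOST_THRESHOLD_S)

def trace_window(trace, t0, margin_s=0.5):
    """Read trace points from immediately before to immediately after
    the first time point t0 (the heads-up period)."""
    return [(t, p) for t, p in trace if t0 - margin_s <= t <= t0 + margin_s]
```

The returned window would then be rendered as the tool's movement locus on screen, keyed by the tool's identifier.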

Systems and methods for measuring bone joint laxity

A system and device (110) for determining bone joint laxity. For example, the system includes a tracked probe (300) comprising at least one probe marker (310) and a computer assisted surgical (CAS) system (100). The CAS system includes a navigation system (130) and a processing device (110) operably connected to the navigation system and a computer readable medium configured to store one or more instructions that, when executed, cause the processing device to receive location information from the navigation system, generate (820) a surgical plan comprising a post-operative laxity assumption (720), collect (850) first motion information related to movement of the joint through a first range of motion, collect (860) second motion information related to movement of the joint through a second range of motion, determine (870) a post-operative laxity (710), and compare the post-operative laxity and the post-operative laxity assumption to determine laxity results.
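The final comparison step could be sketched as below. The tolerance, units, and status labels are illustrative assumptions; the publication does not specify them.

```python
def laxity_results(measured_mm, assumed_mm, tolerance_mm=1.0):
    """Compare measured post-operative laxity against the surgical plan's
    post-operative laxity assumption (all values hypothetical)."""
    delta = measured_mm - assumed_mm
    if abs(delta) <= tolerance_mm:
        status = "within plan"
    elif delta > 0:
        status = "looser than planned"
    else:
        status = "tighter than planned"
    return {"delta_mm": delta, "status": status}
```

A surgeon-facing system would presumably report both the numeric delta and the qualitative status.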

Systems and methods for tracking objects

Systems and methods track objects within an operating room. A machine vision system includes a camera and a controller. A navigation system includes a camera unit with a sensor array comprising a plurality of sensing elements. The controller identifies a first subset of the plurality of sensing elements to be active based on the position of the object. The controller is also configured to track a movement of the object within the operating room using the first subset of the plurality of sensing elements while preventing use of a second subset of the plurality of sensing elements.
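Selecting the active subset of sensing elements from the object's position could be sketched as a simple proximity query. The radius criterion and names here are assumptions for illustration, not the claimed selection logic.

```python
def active_subset(element_positions, object_pos, radius):
    """Identify the sensing elements within `radius` of the object's
    projected position; only these are used for tracking, while the
    remaining (second) subset is prevented from being used."""
    ox, oy = object_pos
    active = set()
    for idx, (ex, ey) in enumerate(element_positions):
        if (ex - ox) ** 2 + (ey - oy) ** 2 <= radius ** 2:
            active.add(idx)
    return active
```

Restricting readout to the first subset in this way would reduce the data processed per frame as the object moves.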

Patella tracking

Disclosed herein are a surgical system for patella tracking and a method for selecting a properly-sized patellar implant utilizing the same. The surgical system may include first and second trackers and a patellar tracking system. The first tracker may be configured to contact an unresected or a resected patella, and the second tracker may be configured to contact a bone. The patellar tracking system may be configured to track the first and second trackers during patellar flexion and extension to generate patellar range of motion and patellar trial range of motion. A method for selecting a patellar implant may utilize the first and second trackers and the patellar tracking system.

Rotary motion passive end effector for surgical robots in orthopedic surgeries
11510684 · 2022-11-29 ·

A passive end effector of a surgical system includes a base connected to a rotational disk, and a saw attachment connected to the rotational disk. The base is attached to an end effector coupler of a robot arm positioned by a surgical robot, and includes a base arm extending away from the end effector coupler. The rotational disk is rotatably connected to the base arm and rotates about a first location on the rotational disk relative to the base arm. The saw attachment is rotatably connected to the rotational disk and rotates about a second location on the rotational disk. The first location on the rotational disk is spaced apart from the second location on the rotational disk. The saw attachment is configured to connect to a surgical saw including a saw blade configured to oscillate for cutting. The saw attachment rotates about the rotational disk and the rotational disk rotates about the base arm to constrain cutting of the saw blade to a range of movement along arcuate paths within a cutting plane.
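The two stacked rotations (disk about the base arm, saw attachment about an offset point on the disk) constrain the blade tip to arcuate paths in the cutting plane. A planar forward-kinematics sketch, with hypothetical link lengths and angle conventions, could look like:

```python
import math

def saw_tip_position(theta_disk, theta_saw, r_disk, r_saw):
    """Illustrative planar kinematics: the rotational disk pivots about the
    base arm, and the saw attachment pivots about a second, offset point on
    the disk, keeping the blade tip within the cutting plane."""
    # Second pivot location after the disk rotates about the base-arm pivot.
    px = r_disk * math.cos(theta_disk)
    py = r_disk * math.sin(theta_disk)
    # Blade tip after the saw attachment rotates about the second pivot.
    tx = px + r_saw * math.cos(theta_disk + theta_saw)
    ty = py + r_saw * math.sin(theta_disk + theta_saw)
    return tx, ty
```

Because both joints are revolute and coplanar, every reachable tip position lies on an arc within the plane, which is the constraint the abstract describes.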

Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications

A surgical system includes an XR headset and an XR headset controller. The XR headset is configured to be worn by a user during a surgical procedure and includes a see-through display screen configured to display an XR image for viewing by the user. The XR headset controller is configured to receive a plurality of two-dimensional (“2D”) image data associated with an anatomical structure of a patient. The XR headset controller is further configured to generate a first 2D image from the plurality of 2D image data based on a pose of the XR headset. The XR headset controller is further configured to generate a second 2D image from the plurality of 2D image data based on the pose of the XR headset. The XR headset controller is further configured to generate the XR image by displaying the first 2D image in a field of view of a first eye of the user and displaying the second 2D image in a field of view of a second eye of the user.
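Picking a distinct 2D image for each eye from a set of views, based on the headset pose, could be sketched as a nearest-view lookup with a small per-eye angular offset. The view angles and the offset are illustrative assumptions, not the claimed method.

```python
def eye_view_indices(view_angles_deg, headset_yaw_deg, eye_offset_deg=3.0):
    """For 2D images captured at known angles, pick the image closest to
    each eye's viewing direction (headset yaw offset by a small angular
    disparity per eye) to form a stereoscopic XR image."""
    def nearest(target_deg):
        return min(range(len(view_angles_deg)),
                   key=lambda i: abs(view_angles_deg[i] - target_deg))
    left = nearest(headset_yaw_deg - eye_offset_deg)
    right = nearest(headset_yaw_deg + eye_offset_deg)
    return left, right
```

Displaying slightly different views to each eye in this way is what lets flat DICOM-style imagery read as three-dimensional in the headset.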

VIRTUAL REALITY SURGICAL CAMERA SYSTEM

A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly, operably coupled to the console assembly, having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly operably coupled to the trocar. The camera assembly includes a camera support tube having a distal and a proximal end, the stereoscopic camera assembly operably coupled to the distal end of the support tube, and first and second camera modules having first and second optical axes.