A61B90/37

Hub for device navigation with optical shape sensed guidewire

A hub for an optical shape sensing reference includes a hub body (606) configured to receive, within a path formed in the hub body, an elongated flexible instrument (622) having a shape sensing system coupled to it. A profile (630) formed in the path imparts a hub template configured to distinguish, in shape sensing data, the portion of the elongated flexible instrument within the hub. An attachment mechanism (616) formed on the hub body detachably connects the hub body to a deployable instrument such that a change in the position of the hub body indicates a corresponding change in the deployable instrument.
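A minimal sketch of how such a hub template might be located in shape sensing data, here by matching a short, distinctive curvature template against a measured 1-D curvature sequence. The function name, matching rule, and tolerance are assumptions for illustration; the patent abstract does not specify the detection method:

```python
def find_hub_template(curvature, template, tol=0.05):
    """Return the start index where the hub's known curvature template
    best matches the measured curvature sequence, or None if no window
    matches within tolerance. (Illustrative sketch; the matching rule
    is an assumption, not the patented method.)"""
    best_idx, best_err = None, float("inf")
    n, m = len(curvature), len(template)
    for i in range(n - m + 1):
        window = curvature[i:i + m]
        # Mean squared error between the window and the template
        err = sum((w - t) ** 2 for w, t in zip(window, template)) / m
        if err < best_err:
            best_idx, best_err = i, err
    return best_idx if best_err <= tol else None

# Example: a distinctive double-bend template embedded in otherwise flat data
template = [0.0, 0.8, -0.8, 0.0]
curvature = [0.0] * 10 + template + [0.0] * 10
print(find_hub_template(curvature, template))  # → 10
```

Once the template is found, every sample of the shape data can be referenced to the hub's known position, which is what makes the hub useful as an optical shape sensing reference.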

Image processing system and method

A system for image processing (IPS), in particular for lung imaging. The system (IPS) comprises an interface (IN) for receiving at least a part of a 3D image volume (VL) acquired by an imaging apparatus (IA1) of a lung (LG) of a subject (PAT) by exposing the subject (PAT) to a first interrogating signal. A layer definer (LD) of the system (IPS) is configured to define, in the 3D image volume, a layer object (LO) that includes a representation of a surface (S) of the lung (LG). A renderer (REN) of the system (IPS) is configured to render at least a part of the layer object (LO) in 3D at a rendering view (V.sub.p) for visualization on a display device (DD).

Use of eye tracking for tool identification and assignment in a robotic surgical system
11690677 · 2023-07-04

A robotic surgical system includes an eye gaze sensing system in conjunction with a visual display of a camera image from a surgical work site. Detected gaze of a surgeon toward the display is used as input to the system. This input may be used by the system to assign an instrument to a control input device (when the user is prompted to look at the instrument), or it may be used as seed input to a computer vision algorithm to aid in object differentiation, facilitating identification and differentiation of instruments, anatomical features, or regions.
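A hedged sketch of the assignment step described above: given a detected gaze point on the display and the known on-screen positions of instruments, pick the nearest instrument within a distance threshold. The function name, the nearest-neighbor rule, and the pixel threshold are assumptions for illustration only:

```python
import math

def assign_instrument_by_gaze(gaze_xy, instruments, max_dist=100.0):
    """Return the name of the instrument whose on-screen position is
    nearest the detected gaze point, if within max_dist pixels;
    otherwise None. (Illustrative sketch; the patent does not
    specify this selection rule.)"""
    best_name, best_d = None, float("inf")
    for name, (x, y) in instruments.items():
        d = math.hypot(gaze_xy[0] - x, gaze_xy[1] - y)
        if d < best_d:
            best_name, best_d = name, d
    return best_name if best_d <= max_dist else None

# Example: the surgeon is prompted to look at an instrument on screen
instruments = {"grasper": (320, 240), "scissors": (900, 500)}
print(assign_instrument_by_gaze((310, 250), instruments))  # → grasper
```

The same gaze point could also seed a segmentation or tracking algorithm, narrowing its search to the region the surgeon is looking at.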

Surgical arm system with internally driven gear assemblies

Example embodiments relate to robotic arm assemblies. The robotic arm assembly includes forearm and upper arm segments; the upper arm segment includes a distal motor. The assembly further includes an elbow coupling joint assembly connecting the distal end of the upper arm segment to the proximal end of the forearm segment via a serial arrangement of proximal and distal elbow joints. The proximal elbow joint is located between the upper arm segment and the distal elbow joint, and the distal elbow joint is located between the proximal elbow joint and the forearm segment. The proximal elbow joint forms a proximal main elbow axis, and the distal elbow joint forms a distal main elbow axis. The elbow coupling joint assembly includes a distal elbow joint subassembly connected to the forearm segment and a proximal elbow joint subassembly connecting the upper arm segment to the distal elbow joint subassembly. The proximal elbow joint subassembly is configured to be driven to rotate the forearm segment relative to the proximal main elbow axis.

Segmented control inputs for surgical robotic systems

A robotic surgical system for treating a patient is disclosed including a surgical tool movable relative to the patient and a user input device configured to remotely control the surgical tool. The surgical tool includes a shaft and an end effector. The user input device includes a base and a controller movable to effect a first control motion and a second control motion. The controller includes a first accessibility mode and a second accessibility mode. The robotic surgical system further includes a control circuit configured to receive a motion control signal from the user input device, determine a controller accessibility mode, permit the first control motion in response to the motion control signal in the first accessibility mode and in the second accessibility mode, and permit the second control motion in response to the motion control signal in the second accessibility mode but not the first accessibility mode.
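The claim's gating logic can be sketched directly: the first control motion is permitted in both accessibility modes, while the second is permitted only in the second mode. Names and structure below are illustrative, not from the patent:

```python
from enum import Enum

class Mode(Enum):
    FIRST = 1   # first accessibility mode
    SECOND = 2  # second accessibility mode

def permitted_motions(mode):
    """Return the set of control motions the control circuit permits
    in the given accessibility mode, per the claim language above.
    (Illustrative sketch; names are assumptions.)"""
    motions = {"first"}          # first motion allowed in both modes
    if mode is Mode.SECOND:
        motions.add("second")    # second motion only in the second mode
    return motions

print(permitted_motions(Mode.FIRST))   # → {'first'}
print(permitted_motions(Mode.SECOND) == {"first", "second"})  # → True
```

A control circuit implementing this would check the current mode before translating a motion control signal into tool movement, rejecting the second motion whenever the first mode is active.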

Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment

A virtual model of a planned instrument attachment can be provided to ensure correct selection of a physical instrument attachment. An XR headset controller can generate a shape and a pose of the virtual model of the planned instrument attachment based on predetermined information associated with the planned instrument attachment and based on a pose of an instrument relative to the XR headset. An XR headset can display the virtual model on a see-through display screen of the XR headset that is configured to allow at least a portion of a real-world scene to pass therethrough.

IMAGE-GUIDED SURGICAL SYSTEMS WITH QUANTITATIVE EVALUATION OF IN VIVO THERMAL TREATMENTS AND RELATED METHODS

Methods and systems provide quantitative assessments of in vivo thermal treatments, such as ablations, during image-guided surgeries. A high-resolution pre-operative MRI image is segmented with shape-constrained, deformable mesh representations of brain structures, and 3-D visualizations of thermally treated volumes are generated during the thermal treatment, providing near-real-time visual and quantitative feedback to a clinician.

SURGICAL VIEWING SYSTEM
20230000501 · 2023-01-05

A surgical viewing system including an X-ray source, a surgical tool and an actuator. The X-ray source creates a beam of radiation used in an image creating process. The surgical tool has the X-ray source coupled thereto, and the surgical tool has an axis of rotation. The actuator is coupled to the surgical tool causing the beam of radiation to be shifted relative to the axis of rotation.

IMAGE SPACE CONTROL FOR ENDOVASCULAR TOOLS
20230000566 · 2023-01-05

Systems and methods for image space control of a medical instrument are provided. In one example, a system is configured to display a two-dimensional medical image including a view of at least a distal end of an instrument. The system can determine, based on one or more fiducials on the instrument, a roll estimate of the instrument. The system can further receive a user input comprising a heading command to change a heading of the instrument within a plane of the medical image, or an incline command to change an incline of the instrument into or out of the plane of the medical image. Based on the roll estimate and the user input, the system can generate one or more motor commands configured to cause a robotic system coupled to the medical instrument to move the instrument accordingly.
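One plausible way roll compensation like this could work, sketched as a simple rotation: the in-plane heading command and out-of-plane incline command, expressed in the image frame, are rotated by the roll estimate into the instrument's bending frame before being mapped to actuators. This model and all names are assumptions; the abstract gives no equations:

```python
import math

def image_to_instrument_command(heading_cmd, incline_cmd, roll_rad):
    """Rotate an in-plane heading command and an out-of-plane incline
    command from the image frame into the instrument's bending frame
    using the roll estimate, yielding two bending-actuator commands.
    (Hypothetical model for illustration only.)"""
    c, s = math.cos(roll_rad), math.sin(roll_rad)
    bend_a = c * heading_cmd + s * incline_cmd
    bend_b = -s * heading_cmd + c * incline_cmd
    return bend_a, bend_b

# With zero roll, the image-plane axes align with the instrument's axes:
print(image_to_instrument_command(1.0, 0.0, 0.0))  # → (1.0, 0.0)
```

The point of the roll estimate is exactly this decoupling: without it, a "move left in the image" command could bend the instrument in an arbitrary direction depending on how it has rolled inside the patient.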

Device and system for generating ultrasonic waves in a target region of a soft solid and method for locally treating a tissue

This device (2) for generating ultrasonic waves in a target region of a soft solid includes at least two ultrasound sources (32); light sources (40), distributed around a central axis (X2) of the device (2), for illuminating a zone of the soft solid via subsurface scattering; and a video camera (50) for capturing images of the zone illuminated by the light sources. The ultrasound sources (32), the light sources (40) and the video camera (50) are mounted on a body (20) of the device (2) and oriented toward a common target zone which includes a focal point of the ultrasound sources (32). A boresight of the video camera is aligned with the central axis (X2).