Patent classifications
A61B2017/00207
Leveraging two-dimensional Digital Imaging and Communications in Medicine (DICOM) imagery in three-dimensional extended reality applications
A surgical system includes an XR headset and an XR headset controller. The XR headset is configured to be worn by a user during a surgical procedure and includes a see-through display screen configured to display an XR image for viewing by the user. The XR headset controller is configured to receive a plurality of two-dimensional (“2D”) image data associated with an anatomical structure of a patient. The XR headset controller is further configured to generate a first 2D image from the plurality of 2D image data based on a pose of the XR headset. The XR headset controller is further configured to generate a second 2D image from the plurality of 2D image data based on the pose of the XR headset. The XR headset controller is further configured to generate the XR image by displaying the first 2D image in a field of view of a first eye of the user and displaying the second 2D image in a field of view of a second eye of the user.
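The abstract's core idea — deriving a distinct 2D image for each eye from the same slice stack based on headset pose — can be sketched roughly as follows. This is an illustrative guess at one way such per-eye selection might work, not the patent's actual method; the pose fields, the interpupillary-distance offset, and all numbers are invented for demonstration.

```python
from dataclasses import dataclass

@dataclass
class HeadsetPose:
    depth_mm: float       # hypothetical viewing depth along the slice axis
    ipd_mm: float = 63.0  # assumed interpupillary distance

def slice_index(depth_mm, slice_spacing_mm, n_slices):
    """Map a depth along the stack axis to the nearest slice index."""
    idx = round(depth_mm / slice_spacing_mm)
    return max(0, min(n_slices - 1, idx))

def stereo_pair(stack, slice_spacing_mm, pose):
    """Pick one slice per eye, offset by half the IPD along the stack axis."""
    half = pose.ipd_mm / 2.0
    left = slice_index(pose.depth_mm - half, slice_spacing_mm, len(stack))
    right = slice_index(pose.depth_mm + half, slice_spacing_mm, len(stack))
    return stack[left], stack[right]

# Example: 100 slices spaced 2 mm apart, viewer focused at 100 mm depth.
stack = [f"slice_{i}" for i in range(100)]
pose = HeadsetPose(depth_mm=100.0)
left, right = stereo_pair(stack, 2.0, pose)
```

Presenting slightly offset slices to each eye is one plausible way a flat image stack could acquire apparent depth in an XR headset.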
GESTURE BASED SELECTION OF PORTION OF CATHETER
In one embodiment, a medical system includes a catheter configured to be inserted into a body part of a living subject, a display configured to provide a view of at least part of a hand of a user, and a processor configured to track a position of the catheter in the body part, render to the display a three-dimensional view of an interior of an anatomical map of the body part and a representation of the catheter inside the anatomical map responsively to the tracked position, while the display is providing the view of the at least part of the hand of the user, recognize a gesture of the at least part of the hand of the user selecting a portion of the catheter, and perform an action responsively to recognizing selection by the user of the portion of the catheter.
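A minimal sketch of the selection step described above: given a fingertip position from hand tracking and the rendered catheter as a 3D polyline, pick the catheter portion nearest the fingertip. The hit-test-by-segment-midpoint approach and all coordinates are assumptions for illustration, not details from the patent.

```python
import math

def nearest_portion(fingertip, catheter_points):
    """Return the index of the catheter segment closest to the fingertip."""
    best, best_d = 0, float("inf")
    for i in range(len(catheter_points) - 1):
        # Midpoint of segment i as a cheap stand-in for point-to-segment distance.
        mid = [(a + b) / 2 for a, b in zip(catheter_points[i],
                                           catheter_points[i + 1])]
        d = math.dist(fingertip, mid)
        if d < best_d:
            best, best_d = i, d
    return best

# Catheter rendered as a straight polyline; fingertip hovers near the tip end.
catheter = [(0, 0, 0), (0, 0, 10), (0, 0, 20), (0, 0, 30)]
idx = nearest_portion((0, 1, 24), catheter)
```

A real system would likely use full point-to-segment distance and a selection threshold, but the structure of the gesture-to-portion mapping is the same.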
VIDEO BASED MICROSCOPE ADJUSTMENT
The present disclosure relates to an optical observation device which is controlled in a sterility-preserving manner, and to a corresponding controlling program and/or program storage medium. The optical observation device includes a main structure having at least one optical camera, a support structure adapted to variably position the main structure, and a control unit that receives a sequence of images from the at least one optical camera, searches a current image from the sequence of images for a trackable object, tracks the trackable object shown in the sequence of images subsequent to the current image, and controls adjustable properties of AR content superimposed on the sequence of images in accordance with a motion of the trackable object, wherein the adjustable properties of the AR content include a location, orientation, and/or size of the AR content with respect to the sequence of images.
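The control idea — AR content following the motion of a tracked object so the user never touches the device — can be sketched in a few lines. The update rule and the 2D coordinate API here are invented for illustration; the patent does not specify this implementation.

```python
def update_overlay(overlay_xy, prev_obj_xy, curr_obj_xy):
    """Shift the AR content's location by the tracked object's displacement."""
    dx = curr_obj_xy[0] - prev_obj_xy[0]
    dy = curr_obj_xy[1] - prev_obj_xy[1]
    return (overlay_xy[0] + dx, overlay_xy[1] + dy)

# The tracked object (e.g. an instrument tip) moves between frames;
# the overlay follows by the same displacement.
pos = (100, 100)
pos = update_overlay(pos, (400, 300), (412, 290))
```

Orientation and scale could follow the same pattern, driven by the object's rotation and apparent size change between frames.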
DEVICE AND SYSTEM FOR MULTIDIMENSIONAL DATA VISUALIZATION AND INTERACTION IN AN AUGMENTED REALITY, VIRTUAL REALITY, OR MIXED REALITY IMAGE GUIDED SURGERY
The present technology relates to devices and systems for multidimensional data visualization and interaction in augmented reality, virtual reality, or mixed reality image-guided surgery. The disclosed embodiment provides a tool for a physician or other medical specialist to load and review medical scans in an AR/VR/MR environment, assisting with medical diagnosis, surgical planning, medical education, or patient engagement.
Image guidance methods and apparatus for glaucoma surgery
An imaging probe comprises a camera or endoscope with an external detector array, in which the probe is sized and shaped for surgical placement in an eye to image the eye from an interior of the eye during treatment. The imaging probe and a treatment probe can be coupled together with a fastener or contained within a housing. The imaging probe and the treatment probe can be sized and shaped to enter the eye through an incision in the cornea and image one or more of the ciliary body band or the scleral spur. The treatment probe may comprise a treatment optical fiber or a surgical placement device to deliver an implant. A processor coupled to the detector can be configured with instructions to identify a location of one or more of the ciliary body band, the scleral spur, Schwalbe's line, or Schlemm's canal from the image.
Optical tracking device with built-in structured light module
A system is disclosed that includes an optical tracking device and a surgical computing device. The optical tracking device includes a structured light module and an optical module that includes an image sensor and is spaced from the structured light module at a known distance. The surgical computing device includes a display device, a non-transitory computer readable medium including instructions, and processor(s) configured to execute the instructions to generate a depth map from a first image captured by the image sensor during projection of a pattern into a surgical environment by the structured light module. The pattern is projected in a near-infrared (NIR) spectrum. The processor(s) are further configured to execute the stored instructions to reconstruct a 3D surface of anatomical structure(s) based on the generated depth map. Additionally, the processor(s) are configured to execute the stored instructions to output the reconstructed 3D surface to the display device.
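The depth-map step rests on standard structured-light triangulation: with the projector and camera separated by a known baseline, depth follows from the disparity of a projected pattern feature as depth = focal_length x baseline / disparity. The sketch below illustrates only that relationship; the focal length, baseline, and disparity values are made up, and the patent's actual pipeline is not described at this level.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Triangulate depth (mm) from the pattern's disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

def depth_map(disparities, focal_px=800.0, baseline_mm=100.0):
    """Convert a 2D grid of per-pixel disparities into a depth map."""
    return [[depth_from_disparity(d, focal_px, baseline_mm) for d in row]
            for row in disparities]

# Larger disparity -> nearer surface; e.g. 800 * 100 / 40 = 2000 mm.
dm = depth_map([[40.0, 80.0], [160.0, 200.0]])
```

The 3D surface reconstruction mentioned in the abstract would then back-project each depth value through the camera intrinsics into a point cloud or mesh.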
DEVICE AND SYSTEM INCLUDING MECHANICAL ARMS
A device sized and shaped for insertion into a body comprising: at least one mechanical limb comprising: a support segment; a first flexible section extending from the support segment and terminating in a coupling section; and a second flexible section extending from the coupling section and terminating in a tool or a connector for a tool; wherein a long axis of one or more of the flexible sections is bendable in a single bending plane; wherein a long axis length of the first flexible section is at least double a maximum extent of the first flexible section perpendicular to a flexible section long axis; wherein a long axis length of the second flexible section is at least double a maximum extent of the second flexible section perpendicular to a flexible section long axis.
Systems And Methods For Tracking Objects
Systems and methods for tracking objects within an operating room use a navigation system that includes an optical sensor with sensing elements and a controller in communication with the optical sensor. The controller controls the optical sensor to process a first quantity of the sensing elements to view a first region of interest. The object is tracked within the first region of interest with the optical sensor. The controller obtains data related to the object within the operating room. The data may include a type of the object and/or prior pose data of the object being tracked. Based on the data, the controller controls the optical sensor to process a second quantity of the sensing elements, different from the first quantity, to view a second region of interest different from the first region of interest. The object is tracked within the second region of interest with the optical sensor.
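One way to picture the two-region scheme: search the full sensor when no prior pose exists, then process only a tight window of sensing elements around the object's last known position. This ROI heuristic and its margin/sensor parameters are assumptions for illustration, not the patent's specified logic.

```python
def roi_from_pose(prior_pose, margin, sensor_w, sensor_h):
    """Return an (x0, y0, x1, y1) region of interest clipped to the sensor."""
    if prior_pose is None:
        return (0, 0, sensor_w, sensor_h)  # no prior pose: use the full sensor
    x, y = prior_pose
    return (max(0, x - margin), max(0, y - margin),
            min(sensor_w, x + margin), min(sensor_h, y + margin))

# First pass processes every sensing element; once the object's pose is
# known, a much smaller second region suffices.
full = roi_from_pose(None, 64, 1920, 1080)
tight = roi_from_pose((900, 500), 64, 1920, 1080)
```

Processing fewer sensing elements in the second region would allow higher frame rates or lower latency while the object stays near its predicted pose.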
MEDICAL SUPPORT ARM AND MEDICAL SYSTEM
A medical support arm includes: a support arm that supports an endoscope; an actuator that drives the support arm; a measurement unit that measures a load applied to the actuator; a generation unit that generates three-dimensional information in a body into which the endoscope is inserted; and a correction unit that corrects the three-dimensional information on a basis of the measured load.
OPERATING LAMP ASSEMBLY COMPRISING AN AUTOMATICALLY ORIENTABLE CAMERA
The present disclosure relates to an operating lamp assembly which comprises at least one operating lamp for illuminating an operating field and at least one camera for acquiring an image of the operating field. The operating lamp assembly comprises a controller which automatically orients the image of the camera.