
Dual mode augmented reality surgical system and method

A dual mode augmented reality surgical system configured to operate in both a tracking mode and a non-tracking mode includes a head mounted display configured to provide an optical view of a patient and to inject received data content over the optical view to form an augmented reality view of the patient, the display comprising internal tracking means configured to determine a surgeon's position, as well as angle and direction of view, relative to the patient. The system further includes an augmented reality computing system comprising one or more processors, one or more computer-readable tangible storage devices, and program instructions stored on at least one of the one or more storage devices for execution by at least one of the one or more processors.

DETERMINING RELATIVE 3D POSITIONS AND ORIENTATIONS BETWEEN OBJECTS IN 2D MEDICAL IMAGES
20220398776 · 2022-12-15

Systems and methods are provided for processing X-ray images, wherein the methods are implemented as a software program product executable on a processing unit of the systems. Generally, an X-ray image is received by the system, the X-ray image being a projection image of a first object and a second object. The first and second objects are classified and a respective 3D model of each object is received. For the first object, a geometrical aspect such as an axis or a line is determined, and for the second object, another geometrical aspect such as a point is determined. Finally, a spatial relation between the first object and the second object is determined based on the 3D model of the first object, the 3D model of the second object, and the information that the point of the second object is located on the geometrical aspect of the first object.
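The core geometric step described above — locating a point of the second object that is constrained to lie on an axis of the first — can be reduced, illustratively, to a line–line closest-point problem: back-project the 2D image point into a viewing ray and find the point on the 3D axis nearest that ray. This is a hedged sketch of that reduction, not the patented method; all function and parameter names are hypothetical.

```python
import numpy as np

def closest_point_on_axis(axis_origin, axis_dir, ray_origin, ray_dir):
    """Return the point on the 3D axis closest to the projection ray.

    Solves the standard two-line closest-point problem: minimise
    |(axis_origin + s*axis_dir) - (ray_origin + t*ray_dir)| over s, t.
    """
    ad = axis_dir / np.linalg.norm(axis_dir)
    rd = ray_dir / np.linalg.norm(ray_dir)
    w = axis_origin - ray_origin
    a, b, c = ad @ ad, ad @ rd, rd @ rd
    d, e = ad @ w, rd @ w
    denom = a * c - b * b
    if abs(denom) < 1e-12:      # lines are parallel; pick the axis origin
        s = 0.0
    else:
        s = (b * e - c * d) / denom
    return axis_origin + s * ad
```

If the ray through the detected image point actually intersects the axis, the returned point is that intersection; otherwise it is the axis point of minimum distance to the ray.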

AUGMENTED REALITY-ASSISTED METHOD FOR PERFORMING SURGERY
20220395328 · 2022-12-15

An augmented reality-assisted method for performing surgery comprises: disposing a position sensing element at a facial positioning point of a patient before craniotomy to obtain skull space and intracranial space information for defining a coordinate space; obtaining a brain anatomical image for constructing a three-dimensional graphic, the graphic comprising a graphic positioning point and a feature associated with a gyrus feature; defining a relative positional relationship between the graphic and the space, aligning the facial positioning point with the graphic positioning point; using a probe to obtain a spatial position of the gyrus feature after craniotomy, using the gyrus feature as a calibration reference point; generating a displacement and rotation parameter based on a coordinate difference of the feature relative to the reference point; adjusting a position and/or an angle of the graphic on a display according to the parameter, and the display displaying the calibrated three-dimensional graphic.
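The calibration step — generating a displacement parameter from the coordinate difference of the graphic's feature relative to the probed reference point, then adjusting the graphic — can be sketched minimally as below. This is an illustrative simplification, assuming a single point pair (which only fixes translation; the rotation parameter mentioned in the abstract would need additional reference points). All names are hypothetical.

```python
import numpy as np

def calibrate_graphic(graphic_points, feature_point, probe_point):
    """Shift the three-dimensional graphic so its gyrus feature lands on
    the position probed after craniotomy.

    graphic_points: (N, 3) vertices of the 3D graphic
    feature_point:  (3,) position of the gyrus feature within the graphic
    probe_point:    (3,) probed spatial position of the same feature

    Returns the displaced vertices and the displacement vector.
    """
    displacement = np.asarray(probe_point) - np.asarray(feature_point)
    return np.asarray(graphic_points) + displacement, displacement
```

The displacement vector would drive the position adjustment of the graphic on the display; a rotation correction would be composed with it once more than one reference point is available.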

Systems and methods for computer-aided orthognathic surgical planning

Systems and methods for orthognathic surgical planning are described herein. An example computer-implemented method can include generating a composite three-dimensional (3D) model of a subject's skull, defining a global reference frame for the composite 3D model, performing a cephalometric analysis on the composite 3D model to quantify at least one geometric property of the subject's skull, performing a virtual osteotomy to separate the composite 3D model into a plurality of segments, performing a surgical simulation using the osteotomized segments, and designing a surgical splint or template for the subject.
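A cephalometric analysis quantifies geometric properties of the skull from landmark positions; a typical property is the angle formed at one landmark by two others. The sketch below shows that computation in isolation — it is not the patented planning pipeline, and the SNA example named in the docstring is just one common cephalometric measurement.

```python
import numpy as np

def landmark_angle(a, b, c):
    """Angle at vertex b (in degrees) formed by landmarks a-b-c,
    e.g. the SNA angle from the Sella, Nasion and A-point landmarks
    in a cephalometric analysis."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```

Landmark coordinates would come from the composite 3D model expressed in the global reference frame the method defines.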

Optical-based input for medical devices

A system for adjusting an operating state of a medical electronic device is described. In an aspect, the system includes an optical tracking system configured to detect three or more tracking markers. The system also includes a processor coupled with the optical tracking system. The processor is programmed with instructions which, when executed, configure the processor to: configure an input command by assigning at least one operating state of the medical electronic device to a particular state of at least one of the tracking markers; after receiving a priming command, identify a present state of the tracking markers based on data from the optical tracking system; compare the present state with the particular state assigned to the operating state; and based on the comparison, determine that an input command has been received and adjust the operating state of the medical electronic device to the assigned operating state.
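The comparison step — matching the present state of the tracking markers against the state assigned to an operating state — can be sketched as a simple lookup. This is a hedged illustration only: marker "state" is reduced here to per-marker visibility flags, and the command table and state names are hypothetical.

```python
def match_command(present_state, command_table):
    """Return the operating state assigned to the present marker state,
    or None when no configured input command matches."""
    return command_table.get(tuple(present_state))

# Hypothetical configuration: each key is a tuple of per-marker
# visibility flags; occluding the second marker switches the device
# into "standby", occluding the first two switches it into "record".
commands = {
    (True, False, True): "standby",
    (False, False, True): "record",
}
```

In the described system this lookup would run only after a priming command is received, so that incidental occlusions are not interpreted as input.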

Bone registration methods for robotic surgical procedures

A computer-implemented method to improve the point collection process during registration of a bone for a computer-assisted surgical procedure is provided. Based on bone digitization data, a simulation is performed to confirm the accuracy of the registration for different digitization regions. Results are tested to identify which digitization regions meet a predefined accuracy requirement. The resulting information is used to perform a computer-assisted surgical procedure. A computerized simulation method for registration of a bone is also provided, in which a processor randomly strokes an expected exposed surface of a bone model with multiple stroke curves, covering most of the bone model surface, and applies uniform noise; a random sample consensus is then applied to remove outlying points and find the best-overlapping subset, yielding the best registration results. A method to perform computer-assisted surgery is also provided.
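The outlier-rejection idea — applying random sample consensus to noisy digitized points so that outlying points do not degrade registration — can be sketched with a minimal RANSAC that separates inliers from outliers against a locally planar surface patch. This is a simplified stand-in for the patented simulation, not its implementation; the planar model and all parameters are assumptions.

```python
import numpy as np

def ransac_plane(points, threshold=1.0, iters=200, seed=0):
    """Minimal RANSAC: fit a plane to digitized surface points and
    return a boolean inlier mask, discarding outlying points."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_mask = np.zeros(len(pts), dtype=bool)
    for _ in range(iters):
        # Hypothesise a plane from three random points.
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal /= norm
        # Keep the hypothesis supported by the most points.
        dist = np.abs((pts - sample[0]) @ normal)
        mask = dist < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask
```

The surviving inliers would then be passed to the actual surface-based registration.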

Registration of an image with a tracking system
11527002 · 2022-12-13

A medical apparatus includes a registration tool, which includes a position sensor. A position-tracking system is configured to acquire position coordinates of the sensor in a first frame of reference defined by the position-tracking system. A processing unit is configured to receive 3D image data with respect to the body of the patient in a second frame of reference, to generate a 2D image of the surface of the patient based on the 3D image data, to render the 2D image to a display screen, and to superimpose onto the 2D image icons indicating locations of respective landmarks. The processing unit receives the position coordinates acquired by the position-tracking system while the registration tool contacts the locations on the patient corresponding to the icons on the display, and registers the first and second frames of reference by comparing the position coordinates to the three-dimensional image data.
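Registering two frames of reference from paired landmark coordinates is classically done with a rigid least-squares fit (the Kabsch/SVD method). The sketch below illustrates that standard technique under the assumption of noise-free paired points; the abstract does not state which registration algorithm the apparatus uses, and all names here are hypothetical.

```python
import numpy as np

def register_frames(tracked_pts, image_pts):
    """Rigid registration (Kabsch/SVD) between the position-tracking
    frame and the image frame from paired landmark coordinates.
    Returns rotation R and translation t with image ~= R @ tracked + t."""
    P = np.asarray(tracked_pts, float)
    Q = np.asarray(image_pts, float)
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (P - pc).T @ (Q - qc)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if the determinant is -1.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t
```

With the frames registered, any coordinate reported by the position-tracking system can be mapped into the 3D image data and vice versa.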

HEAD STABILIZATION SYSTEM WITH SENSING FEATURES

A head stabilization system useable for stabilizing a head of a patient during a medical procedure includes a sensor assembly and a connection assembly connected with a head fixation device. The sensor assembly includes one or more sensors positioned on the head fixation device for detecting one or more characteristics of the head fixation device. The connection assembly receives and processes the detected characteristic to provide feedback of the detected characteristic. The head fixation device can be adjusted or modified based on the provided feedback.

INSTRUMENTS FOR ROBOTIC KNEE REVISION
20220387121 · 2022-12-08 ·

A device for registering a bone for a robotic knee arthroplasty with a surgical robot can include a plate and a registration device. The plate can be engageable with the bone and can include a lateral portion, a medial portion, and a hinge. The registration device can be connected to the plate and can be configured to interface with the surgical robot for registration of the plate and the bone.

Systems and methods for measuring bone joint laxity

A system and device (110) for determining bone joint laxity are described. For example, the system includes a tracked probe (300) comprising at least one probe marker (310) and a computer-assisted surgical (CAS) system (100). The CAS system includes a navigation system (130) and a processing device (110) operably connected to the navigation system, and a computer-readable medium configured to store one or more instructions that, when executed, cause the processing device to receive location information from the navigation system, generate (820) a surgical plan comprising a post-operative laxity assumption (720), collect (850) first motion information related to movement of the joint through a first range of motion, collect (860) second motion information related to movement of the joint through a second range of motion, determine (870) a post-operative laxity (710), and compare the post-operative laxity and the post-operative laxity assumption to determine laxity results.
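The final comparison step — checking the determined post-operative laxity against the plan's laxity assumption — can be sketched as below. This is an illustrative simplification assuming scalar laxity values and a hypothetical tolerance; the abstract does not specify how the comparison is scored.

```python
def laxity_results(post_op_laxity, laxity_assumption, tolerance=1.0):
    """Compare the measured post-operative laxity (e.g. degrees of joint
    opening under stress) with the assumption from the surgical plan.
    Returns the signed difference and whether it is within tolerance."""
    difference = post_op_laxity - laxity_assumption
    return {"difference": difference,
            "within_tolerance": abs(difference) <= tolerance}
```

In practice the measured laxity would be derived from the first and second motion captures collected through the navigation system.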