Patent classifications
A61F2002/4011
NEURAL NETWORK FOR RECOMMENDATION OF SHOULDER SURGERY TYPE
A computing system generates a plurality of training datasets from past shoulder surgery cases and uses the plurality of training datasets to train a neural network. Each output layer neuron in a plurality of output layer neurons of the neural network corresponds to a different class in one or more shoulder pathology classification systems. The computing system may obtain a current input vector that corresponds to a current patient. Additionally, the computing system may apply the neural network to the current input vector to generate a current output vector. The computing system may determine, based on the current output vector, a recommended type of shoulder surgery for the current patient.
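The abstract describes a network whose output neurons each map to a class, with the recommendation read off the output vector. The patent does not specify the architecture; the sketch below assumes a small feed-forward network with softmax outputs and an argmax readout, and the layer sizes, weights, and surgery-type labels are illustrative placeholders.

```python
import numpy as np

# Hypothetical surgery-type labels, one per output neuron.
SURGERY_TYPES = ["anatomical total", "reverse total", "hemiarthroplasty"]

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def recommend(input_vector, w1, b1, w2, b2):
    """Apply the network to a patient input vector; each output neuron
    corresponds to one surgery type, and the argmax is the recommendation."""
    hidden = relu(w1 @ input_vector + b1)
    output = softmax(w2 @ hidden + b2)
    return SURGERY_TYPES[int(np.argmax(output))], output

# Toy random weights standing in for parameters trained on past cases.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(8, 5)), np.zeros(8)
w2, b2 = rng.normal(size=(3, 8)), np.zeros(3)
choice, probs = recommend(rng.normal(size=5), w1, b1, w2, b2)
```

In practice the weights would come from training on the plurality of past-case datasets the abstract describes; the forward pass and argmax readout are the part this sketch illustrates.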
MIXED REALITY-AIDED DEPTH TRACKING IN ORTHOPEDIC SURGICAL PROCEDURES
An example medical device system includes a medical device that includes a rotatable shaft and a tooling bit located on a distal end of the rotatable shaft; a depth aid element positioned at a fixed location relative to the tooling bit along an axis of the rotatable shaft; one or more cameras configured to capture one or more images of the depth aid element; and one or more processors configured to determine a depth of the tooling bit along the axis, based on analysis of the images.
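The abstract does not state how depth is computed from the images; one common approach for a fixture of known physical size is the pinhole-camera relation, where apparent pixel size falls off inversely with distance. The function names, focal length, and marker dimensions below are illustrative assumptions.

```python
def depth_from_marker(focal_length_px, marker_size_mm, apparent_size_px):
    """Pinhole-camera estimate: the distance at which a marker of known
    physical size projects to the observed size in pixels."""
    return focal_length_px * marker_size_mm / apparent_size_px

def bit_depth(camera_to_marker_mm, marker_to_bit_mm):
    """The depth aid element sits at a fixed offset from the tooling bit
    along the shaft axis, so bit depth follows from the marker depth."""
    return camera_to_marker_mm + marker_to_bit_mm

# A 20 mm marker imaged at 50 px by a camera with a 1000 px focal length.
marker_depth = depth_from_marker(1000.0, 20.0, 50.0)  # 400.0 mm
tip_depth = bit_depth(marker_depth, 35.0)             # 435.0 mm
```

Tracking how `tip_depth` changes over successive frames would give the advance of the tooling bit along the axis.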
NEURAL NETWORK FOR DIAGNOSIS OF SHOULDER CONDITION
A computing system generates a plurality of training datasets from past shoulder surgery cases and uses the plurality of training datasets to train a neural network. Each output layer neuron in a plurality of output layer neurons of the neural network corresponds to a different class in one or more shoulder pathology classification systems. The computing system may obtain a current input vector that corresponds to a current patient. Additionally, the computing system may apply the neural network to the current input vector to generate a current output vector. The computing system may determine, based on the current output vector, a classification of a shoulder condition of the current patient.
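Since each output neuron corresponds to a class in one or more classification systems, one plausible readout is to treat the output vector as concatenated per-system segments and take the highest-scoring class in each. The systems and class labels below are illustrative (the Walch glenoid classes are a real shoulder classification, but their presence here is an assumption).

```python
import numpy as np

# Hypothetical classification systems and their classes.
SYSTEMS = {
    "Walch glenoid": ["A1", "A2", "B1", "B2", "C"],
    "rotator cuff":  ["intact", "partial tear", "full tear"],
}

def classify(output_vector):
    """Slice the output vector into one segment per classification system
    and take the highest-scoring class within each segment."""
    result, start = {}, 0
    for system, classes in SYSTEMS.items():
        segment = output_vector[start:start + len(classes)]
        result[system] = classes[int(np.argmax(segment))]
        start += len(classes)
    return result

out = np.array([0.1, 0.6, 0.1, 0.1, 0.1, 0.2, 0.7, 0.1])
labels = classify(out)
# {'Walch glenoid': 'A2', 'rotator cuff': 'partial tear'}
```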
MIXED-REALITY SURGICAL SYSTEM WITH PHYSICAL MARKERS FOR REGISTRATION OF VIRTUAL MODELS
An example method includes obtaining a virtual model of a portion of an anatomy of a patient obtained from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.
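Registering a virtual model to observed marker positions is a rigid point-set alignment problem; the abstract does not name a method, but a standard least-squares solution is the Kabsch algorithm, sketched here with illustrative marker coordinates.

```python
import numpy as np

def register(virtual_pts, observed_pts):
    """Least-squares rigid registration (Kabsch algorithm): returns rotation R
    and translation t mapping virtual marker positions onto observed ones."""
    vc, oc = virtual_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (virtual_pts - vc).T @ (observed_pts - oc)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ vc
    return R, t

# Toy check: four virtual markers rotated 90 degrees about z and shifted.
virtual = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
observed = virtual @ Rz.T + np.array([5., 2., 0.])
R, t = register(virtual, observed)
```

Once `R` and `t` are known, applying them to every vertex of the virtual model places it over the observed anatomy.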
MIXED REALITY-AIDED EDUCATION USING VIRTUAL MODELS OR VIRTUAL REPRESENTATIONS FOR ORTHOPEDIC SURGICAL PROCEDURES
An example system for demonstrating at least one aspect of an orthopedic surgical procedure includes a first device and a second device. In this example, the first device is configured to display a presentation to a first user, wherein the presentation includes one or more virtual elements and wherein the one or more virtual elements comprise a three-dimensional (3D) virtual representation of one or more anatomical features associated with the orthopedic surgical procedure. In this example, the second device is configured to display the presentation to a second user. In this example, the one or more virtual elements demonstrate at least one aspect of the orthopedic surgical procedure, and control of at least some of the one or more virtual elements is assignable from the first device to the second device.
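The key mechanism here is that control of shared virtual elements can be handed from one device to another. A minimal sketch of that idea, assuming a simple ownership map (class and method names are illustrative, not from the patent):

```python
class SharedPresentation:
    """Both devices render the same virtual elements, but only the current
    controller of an element may modify or reassign it."""

    def __init__(self, elements, owner):
        self.controller = {e: owner for e in elements}

    def assign(self, element, from_device, to_device):
        """Hand control of a virtual element from one device to another."""
        if self.controller[element] != from_device:
            raise PermissionError(f"{from_device} does not control {element}")
        self.controller[element] = to_device

    def can_modify(self, element, device):
        return self.controller[element] == device

demo = SharedPresentation(["3D humerus model", "cut plane"], owner="instructor")
demo.assign("3D humerus model", "instructor", "student")
```

After the assignment, the student's device can manipulate the humerus model while the instructor retains the cut plane.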
MIXED REALITY-AIDED SURGICAL ASSISTANCE IN ORTHOPEDIC SURGICAL PROCEDURES
An example system includes a first device configured to display a first presentation to a first user, wherein the first presentation includes one or more virtual elements configured to assist the first user in an orthopedic surgical procedure; and a second device configured to display a second presentation to a second user, wherein the second user provides surgical assistance on the orthopedic surgical procedure. In this example, the second device is further configured to display content that informs the second user on one or more previously-executed steps of the orthopedic surgical procedure.
MULTI-USER COLLABORATION AND WORKFLOW TECHNIQUES FOR ORTHOPEDIC SURGICAL PROCEDURES USING MIXED REALITY
An example mixed reality (MR) system includes a first MR device configured to present first medical information and first real-world information to a first user via a first MR presentation; a second MR device configured to provide second medical information and second real-world information to a second user via a second MR presentation; and a third device configured to provide third information to a third user, wherein the third information is based at least in part on the first MR presentation or the second MR presentation.
VIRTUAL GUIDANCE FOR ORTHOPEDIC SURGICAL PROCEDURES
An example method includes displaying, via a visualization device and overlaid on a portion of an anatomy of a patient viewable via the visualization device, a virtual model of the portion of the anatomy obtained from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; and displaying, via the visualization device and overlaid on the portion of the anatomy, a virtual guide that guides at least one of preparation of the anatomy for attachment of the prosthetic or attachment of the prosthetic to the anatomy.
EXTENDED REALITY VISUALIZATION OF RANGE OF MOTION
A computing system obtains motion data describing a movement of an appendage of a patient. Additionally, the computing system determines, based on the motion data, a range of motion of the appendage. Furthermore, the computing system generates, for display by an extended reality visualization device, an extended reality visualization of the range of motion of the appendage superimposed on an image of the patient or an avatar of the patient.
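The abstract leaves the range-of-motion computation unspecified; one straightforward formulation computes a joint angle per motion frame from segment direction vectors and reports the spread of observed angles. The trunk and arm vectors below are illustrative toy data.

```python
import numpy as np

def joint_angle(proximal, distal):
    """Angle in degrees between two bone-segment direction vectors."""
    cos = np.dot(proximal, distal) / (
        np.linalg.norm(proximal) * np.linalg.norm(distal))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def range_of_motion(angles):
    """Range of motion as the spread of joint angles observed in the data."""
    return max(angles) - min(angles)

# Toy abduction sweep: an upper-arm vector rotating away from the trunk axis.
trunk = np.array([0.0, -1.0, 0.0])
frames = [np.array([np.sin(t), -np.cos(t), 0.0])
          for t in np.linspace(0.0, np.pi / 2, 10)]
angles = [joint_angle(trunk, arm) for arm in frames]
rom = range_of_motion(angles)  # 90 degrees of abduction
```

The resulting arc (from `min(angles)` to `max(angles)`) is what the system would render superimposed on the patient or avatar.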