Patent classifications
A61B2090/0818
TIME-SPACED ROBOTIC REFERENCE FRAMES
A robotic navigation system includes a robot base (140); a robotic arm (144) comprising a proximal portion secured to the robot base, a distal portion movable relative to the proximal portion, and a tracking marker (156) secured to the robotic arm proximate the distal portion; at least one processor; a navigation system including a tracking marker sensor configured to identify positions of the tracking marker in a first coordinate space; and a memory. The memory stores instructions that cause the at least one processor to: cause the robotic arm (144) to move to a plurality of different poses; receive information relating to a position of the tracking marker (156) in a second coordinate space when the robotic arm is in each of the plurality of different poses; and compare the positions of the tracking marker in the first coordinate space to the positions of the tracking marker in the second coordinate space.
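Comparing the marker positions observed in two coordinate spaces is, at heart, a rigid point-set registration problem. As an illustrative sketch only (not the patented method), the transform between the two spaces can be recovered with the Kabsch algorithm; the function name and data layout here are hypothetical:

```python
import numpy as np

def register_point_sets(p_first, p_second):
    """Estimate the rigid transform (R, t) mapping marker positions
    observed in the second coordinate space onto the first, using the
    Kabsch algorithm. p_first, p_second: (N, 3) arrays of corresponding
    marker positions, one row per robot pose."""
    c1 = p_first.mean(axis=0)
    c2 = p_second.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (p_second - c2).T @ (p_first - c1)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c1 - R @ c2
    return R, t
```

Moving the arm through a plurality of poses, as the abstract describes, is what supplies the multiple corresponding point pairs this kind of fit requires.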
SHAPE SENSOR SYSTEMS WITH REDUNDANT SENSING
A method of operating a shape sensing apparatus comprises receiving shape data from a first shape sensor. The first shape sensor includes a first plurality of optical cores extending between a first section and a second section of an elongated optical fiber. The method further comprises receiving shape data from a second shape sensor. The second shape sensor includes a second plurality of optical cores extending between the first and second sections of the elongated optical fiber. The method further comprises determining a shape of the elongated optical fiber between the first and second sections by combining the shape data from the first and second shape sensors.
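One simple way to combine redundant shape estimates, shown here as a hedged sketch rather than the claimed method, is an inverse-variance-weighted average of the two sensors' sampled fiber shapes; the function name and scalar-variance model are assumptions:

```python
import numpy as np

def fuse_shape_estimates(shape_a, var_a, shape_b, var_b):
    """Inverse-variance-weighted fusion of two redundant shape estimates.
    shape_a, shape_b: (N, 3) sampled 3-D points along the fiber section.
    var_a, var_b: per-sensor measurement variances (scalars, assumed)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    # A noisier sensor (larger variance) contributes less to the fused shape.
    return (w_a * shape_a + w_b * shape_b) / (w_a + w_b)
```

With equal variances this reduces to a plain average; as one sensor's variance grows, the fused shape converges to the other sensor's estimate.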
SYSTEMS AND METHODS FOR CONTROLLING A ROBOTIC MANIPULATOR OR ASSOCIATED TOOL
A system comprises a robotic manipulator for control of motion of a medical tool. The robotic manipulator includes a joint and a link connected to the joint. The link is configured to connect to the medical tool. A processing unit of the system is configured to receive first data from an encoder of the joint. A first tool tip estimate of a first parameter of a tool tip coupled at a distal end of the medical tool is generated using the first data. The first parameter of the tool tip is a position or a velocity of the tool tip. Second data is received from a sensor system located at a sensor portion of the link or the medical tool. The joint is controlled based on a first difference between the first tool tip estimate and a second tool tip estimate generated using the first and second data.
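Controlling the joint from the difference between the two tip estimates can be pictured as a feedback correction. This is a deliberately minimal 1-D sketch under assumed names and a simple proportional gain, not the controller the abstract claims:

```python
def joint_correction(tip_est_encoder, tip_est_fused, gain=0.5):
    """Proportional correction for the joint command, driven by the
    difference between the encoder-only tip estimate and the estimate
    fused from encoder plus link/tool sensor data (1-D for simplicity).
    'gain' is a hypothetical tuning parameter."""
    error = tip_est_fused - tip_est_encoder
    return gain * error
```

The point of the sketch is the structure: the sensor-informed estimate acts as the reference that pulls the encoder-driven model back toward the tool tip's measured behavior.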
Smart cartridge wake up operation and data retention
An electronic system for a surgical instrument is disclosed. The electronic system comprises a main power supply circuit configured to supply electrical power to a primary circuit, a supplementary power supply circuit configured to supply electrical power to a secondary circuit, and a short circuit protection circuit coupled between the main power supply circuit and the supplementary power supply circuit. The supplementary power supply circuit is configured to isolate itself from the main power supply circuit when it detects a short circuit condition at the secondary circuit, and to rejoin the main power supply circuit and resume supplying power to the secondary circuit when the short circuit condition is remedied.
Powered medical device including measurement of closure state of jaws
A powered surgical instrument includes an end effector with jaws that are configured to transition between a plurality of closure states. The surgical instrument includes a sensor configured to measure the closure state of the jaws and a display configured to display information indicative of the detected closure state. The surgical instrument further includes a firing member movable between a first position and a second position to transition the end effector between the plurality of closure states, and a motor configured to drive the firing member between the first position and the second position.
Tracking system for image-guided surgery
Apparatus and methods are described including tracking a tool portion and a patient marker from a first line of sight, using a first tracking device disposed upon a first head-mounted device that includes a display. The tool portion and the patient marker are tracked from a second line of sight, using a second tracking device. When a portion of the patient marker and the tool portion are both within the first line of sight, an augmented reality image is generated upon the first display based upon data received from the first tracking device and without using data from the second tracking device. When at least the patient marker portion and the tool portion are not both within the first line of sight, a virtual image of the tool and anatomy of the patient is generated using data received from the second tracking device. Other applications are also described.
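The fallback behavior described above is essentially a visibility-driven source selection. The following sketch shows that logic only; the function and return labels are hypothetical, not names from the patent:

```python
def choose_render_source(marker_in_los1, tool_in_los1):
    """Fallback logic for the head-mounted display: use the first
    (head-mounted) tracker when both the patient marker and the tool
    portion are within its line of sight; otherwise fall back to the
    second tracker and render a virtual image of tool and anatomy."""
    if marker_in_los1 and tool_in_los1:
        return ("augmented_reality", "first_tracker")
    return ("virtual_image", "second_tracker")
```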
Inertial device tracking system and method of operation thereof
A tracking system for ultrasound imaging includes an imaging probe which acquires ultrasound image information including a plurality of ultrasound image frames; an inertial measurement unit coupled to the imaging probe which synchronously acquires tracking information including a plurality of tracking frames indicative of motion of the imaging probe; and a controller. The controller is configured to obtain the ultrasound image information for at least two of the plurality of ultrasound image frames, and determine a similarity value based upon a comparison of the at least two ultrasound image frames. The controller is configured to compute whether the similarity value (C_frame) is less than a similarity threshold value (C_thresh), and to select a first or a second pose estimation method, each different from the other, based upon the result of that computation.
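The similarity-thresholded method selection can be sketched as follows. The normalized cross-correlation metric, the branch labels, and which estimator is assigned to which branch are all assumptions for illustration; the abstract does not specify them:

```python
import numpy as np

def frame_similarity(frame_a, frame_b):
    """Normalized cross-correlation between two image frames,
    in [-1, 1]; 1.0 means the frames are identical up to offset/scale."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def select_pose_method(c_frame, c_thresh):
    """Low inter-frame similarity suggests large probe motion, so favor
    image-based estimation; high similarity suggests small motion, where
    the IMU-based estimate may suffice (assignment assumed here)."""
    return "image_based" if c_frame < c_thresh else "imu_based"
```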
Medical navigation system employing optical position sensing and method of operation thereof
An apparatus and method that use shape sensing and imaging to record, display and enable return to an imaging probe location or to predetermined imaging parameters. The apparatus includes an ultrasound probe (104, 304, 404, 750); a shape-sensing device (SSD) (102, 302, 602, 740) associated with the ultrasound probe; and a controller (122, 710). The controller may be configured to: determine at least one of location and orientation of the ultrasound probe based upon position sensor information (PSI) received from the SSD; select a view of a plurality of views of a workflow that are stored in the memory; obtain view setting information (VSI) including parameters and a position and/or orientation of the ultrasound probe for each of the views; determine guidance information; and render the determined guidance information on the rendering device and set ultrasound probe parameters based on the parameters of the VSI for the selected view.
Alignment precision
Alignment precision technology, in which a system accesses image data of a bone to which a reference marker array is fixed. The system generates a three-dimensional representation of the bone and the reference markers, defines a coordinate system for the three-dimensional representation, and determines locations of the reference markers relative to the coordinate system. The system accesses intra-operative image data that includes the bone and a mobile marker array that is attached to an instrument used in a surgical procedure. The system co-registers the intra-operative image data with the three-dimensional representation by matching the reference markers included in the intra-operative image data to the locations of the reference markers. The system determines locations of the mobile markers in the co-registered image and determines a three-dimensional spatial position and orientation of the instrument relative to the bone.
Selected image acquisition technique to optimize specific patient model reconstruction
A system and a method are disclosed that allow for generation of a model, or reconstruction of a model, of a subject based upon acquired image data. The image data can be acquired with a substantially mobile system that can be moved relative to a subject to allow for image acquisition from a plurality of orientations relative to the subject. The plurality of orientations can include a first orientation, a final orientation, and a predetermined path along which an image data collector or detector can move to acquire an image data set appropriate for construction of the model.