Graphical user interface for a robotic surgical system
11504191 · 2022-11-22
Assignee
Inventors
CPC classification
A61B34/76
HUMAN NECESSITIES
A61B34/20
HUMAN NECESSITIES
International classification
A61B34/00
HUMAN NECESSITIES
A61B34/20
HUMAN NECESSITIES
Abstract
A method, apparatus, and computer readable medium for schematically representing a spatial position of an instrument used in a robotic surgery system are disclosed. The instrument includes an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace. The method involves causing a processor circuit to calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device. The method also involves causing the processor circuit to generate display signals for displaying a graphical depiction of the surgical workspace on a display in communication with the processor circuit, the graphical depiction including a planar representation that includes an instrument movement region having a boundary indicating limitations to transverse movement of the instrument within the surgical workspace, and a two-dimensional projection of the current spatial position of the positioning device and the end effector onto the planar representation.
Claims
1. A method for schematically representing a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to an articulated arm configured to spatially position the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace, the method comprising: by a processor circuit, calculating a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device; by the processor circuit, causing a display to display a graphical depiction of the surgical workspace, the graphical depiction including a planar representation comprising: an instrument movement region indicating a range of movement of the instrument within the surgical workspace, the instrument movement region representing a virtual depiction of the input device workspace and including a non-anatomical boundary indicating limitations to the range of movement of the instrument within the surgical workspace; and a two-dimensional projection of a current spatial position of the articulated arm and the end effector onto the planar representation; by the processor circuit, receiving an enablement signal including an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state locking the instrument in the current spatial position and inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace; by the processor circuit, in response to the enablement signal transitioning from the active state to the inactive state, causing the display to display a current hand controller position indicator on the graphical depiction as an offset from the two-dimensional projection of a current spatial position of the end effector; and by the processor circuit, 
in response to the enablement signal transitioning from the inactive state to the active state, causing the display to discontinue displaying the current hand controller position indicator while causing the display to continue displaying the planar representation including the instrument movement region and the two-dimensional projection of the current spatial position of the articulated arm and the end effector.
2. The method of claim 1 wherein in the two-dimensional projection the end effector is represented by an indicator and the articulated arm is represented by an area corresponding to two dimensional projected extents of at least a portion of the articulated arm.
3. The method of claim 1 further comprising generating the non-anatomical boundary by: defining a three-dimensional boundary within the surgical workspace; and generating a two-dimensional projection of the three-dimensional boundary onto the planar representation.
4. The method of claim 1 further comprising, by the processor circuit, in response to a determination that the instrument is proximate the non-anatomical boundary of the instrument movement region, causing the display to display an active constraint indication at the non-anatomical boundary.
5. The method of claim 1 wherein the robotic surgery system comprises a plurality of instruments within the surgical workspace and wherein displaying the graphical depiction comprises displaying a graphical depiction for each of the plurality of instruments.
6. The method of claim 1 wherein displaying the graphical depiction comprises displaying the graphical depiction at a peripheral region of the display.
7. The method of claim 1 wherein the graphical depiction further comprises: an instrument depth range indicating limitations to axial movement of the instrument into the surgical workspace; an indicator representing a current depth of the end effector within the instrument depth range; and an input device depth range representing a portion of the instrument depth range that is accessible by the instrument based on a current mapping between the input device workspace and the surgical workspace, wherein the input device workspace defines a limited range of the instrument depth range being accessible by the instrument.
8. The method of claim 7 further comprising, by the processor circuit, in response to a determination that the end effector is proximate an end of the input device depth range, causing the display to display an active constraint indication.
9. The method of claim 7 wherein the input device depth range is depicted as a hatched region superimposed on a depiction of the instrument depth range.
10. The method of claim 1 wherein the input signals include a rotation signal defining a current rotation of the hand controller, the rotation signal being operable to cause rotation of the end effector in the surgical workspace, and wherein the graphical depiction further comprises: an instrument rotation range indicating limitations on rotational movement of the instrument; an indicator representing a current rotation of the end effector; and an input device rotation range representing a portion of the instrument rotation range that is accessible for a current mapping between the input device workspace and the surgical workspace.
11. The method of claim 10 further comprising by the processor circuit: in response to the enablement signal transitioning from the active state to the inactive state, causing the display to display a current hand controller rotation indicator on the graphical depiction as an offset from the indicator representing a current rotation of the end effector; and in response to the enablement signal transitioning from the inactive state to the active state, causing the display to discontinue displaying the current hand controller rotation indicator.
12. The method of claim 1 wherein the non-anatomical boundary is depicted as a hemisphere.
13. The method of claim 1 wherein the instrument movement region further includes at least one keep-out zone identifying a first region of the surgical workspace positioned within a second region of the surgical workspace circumscribed by the non-anatomical boundary, the at least one keep-out zone indicating a region that may not be accessed by the instrument.
14. An apparatus for schematically representing a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to an articulated arm configured to spatially position the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace, the apparatus comprising: a display; and a processor circuit configured to: calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device; cause the display to display a graphical depiction of the surgical workspace on a display, the graphical depiction including a planar representation comprising: an instrument movement region indicating a range of movement of the instrument within the surgical workspace, the instrument movement region representing a virtual depiction of the input device workspace and including a non-anatomical boundary indicating limitations to the range of movement of the instrument within the surgical workspace; and a two-dimensional projection of a current spatial position of the articulated arm and the end effector onto the planar representation; receive an enablement signal, the enablement signal including an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state locking the instrument in the current spatial position and inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace; in response to the enablement signal transitioning from the active state to the inactive state, cause the display to display a current hand controller position indicator on the graphical depiction as an offset from the two-dimensional projection of a current spatial position of the end effector; and in response to the enablement signal 
transitioning from the inactive state to the active state, cause the display to discontinue displaying the current hand controller position indicator while causing the display to continue displaying the planar representation including the instrument movement region and the two-dimensional projection of the current spatial position of the articulated arm and the end effector.
15. The apparatus of claim 14 wherein the processor circuit is configured to cause the display to display an active constraint indication at the non-anatomical boundary in response to a determination that the instrument is proximate the non-anatomical boundary of the instrument movement region.
16. The apparatus of claim 14 wherein the graphical depiction further comprises: an instrument depth range indicating limitations to axial movement of the instrument into the surgical workspace; an indicator representing a current depth of the end effector within the instrument depth range; and an input device depth range representing a portion of the instrument depth range that is accessible by the instrument based on a current mapping between the input device workspace and the surgical workspace, wherein the input device workspace defines a limited range of the instrument depth range being accessible by the instrument.
17. The apparatus of claim 16 wherein the processor circuit is configured to cause the display to display an active constraint indication in response to a determination that the end effector is proximate an end of the input device depth range.
18. The apparatus of claim 14 wherein the input signals include a rotation signal defining a current rotation of the hand controller, the rotation signal being operable to cause rotation of the end effector in the surgical workspace, and wherein the graphical depiction further comprises: an instrument rotation range indicating limitations on rotational movement of the instrument; an indicator representing a current rotation of the end effector; and an input device rotation range representing a portion of the instrument rotation range that is accessible for a current mapping between the input device workspace and the surgical workspace.
19. The apparatus of claim 18 wherein the processor circuit is further configured to: in response to the enablement signal transitioning from the active state to the inactive state, cause the display to display a current hand controller rotation indicator on the graphical depiction as an offset from the indicator representing a current rotation of the end effector; and in response to the enablement signal transitioning from the inactive state to the active state, cause the display to discontinue display of the current hand controller rotation indicator.
20. The apparatus of claim 14 wherein the instrument movement region further includes at least one keep-out zone identifying a first region of the surgical workspace positioned within a second region of the surgical workspace circumscribed by the non-anatomical boundary, the at least one keep-out zone indicating a region that may not be accessed by the instrument.
21. The method of claim 13 wherein the at least one keep-out zone is defined based on input received from an operator and patient imaging data.
22. A non-transitory computer readable medium storing instructions that, when executed by a processor circuit of a robotic surgery system, direct the processor circuit to represent a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace, the instructions further directing the processor circuit to: calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device; cause a display to display a graphical depiction of the surgical workspace, the graphical depiction including a planar representation comprising: an instrument movement region indicating a range of movement of the instrument within the surgical workspace, the instrument movement region representing a virtual depiction of the input device workspace and including a non-anatomical boundary indicating limitations to the range of movement of the instrument within the surgical workspace; and a two-dimensional projection of a current spatial position of the positioning device and the end effector onto the planar representation; receive an enablement signal including an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state locking the instrument in the current spatial position and inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace; in response to the enablement signal transitioning from the active state to the inactive state, cause the display to display a current hand controller position indicator on the graphical depiction as an offset from the two-dimensional projection of a current 
spatial position of the end effector; and in response to the enablement signal transitioning from the inactive state to the active state, cause the display to discontinue displaying the current hand controller position indicator while causing the display to continue displaying the planar representation including the instrument movement region and the two-dimensional projection of the current spatial position of the positioning device and the end effector.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) In drawings which illustrate disclosed embodiments,
DETAILED DESCRIPTION
(16) Referring to
(17) The instrument 106 and instrument mount 108 are shown in more detail in
(18) In the embodiment shown the end effector 210 is a pair of forceps having opposing moveable gripper jaws 216 controlled by the instrument drive for grasping tissue, while the end effector 214 is a pair of curved dissecting forceps. The instrument 106 also includes a camera 218 deployed on an articulated arm 220 that is able to pan and tilt the camera. The camera 218 includes a pair of spaced apart image sensors 222 and 224 for producing stereoscopic views of the surgical workspace. The instruments 208 and 212 and the camera 218 are initially positioned in-line with the insertion tube 202 prior to insertion through the incision and then deployed as shown at 206.
(19) Referring back to
(20) The workstation 102 also includes a display 122 in communication with the workstation processor circuit 120 for displaying real time images and/or other graphical depictions of the surgical workspace. In this embodiment where the camera 218 includes the pair of spaced apart image sensors 222 and 224, the display 122 is configured to provide separate 2D stereoscopic views of the surgical workspace that provide a 3D depth effect when viewed through suitable stereoscopic spectacles worn by the surgeon.
(21) The workstation 102 also includes a footswitch 134, which is actuable by the surgeon to provide an enablement signal to the workstation processor circuit 120. The enablement signal has an active state and an inactive state and in this embodiment depressing the footswitch 134 causes the enablement signal to change from the active state to the inactive state. The active state of the enablement signal permits movement of the instrument 106 in response to the input signals produced by the input device 110 while the inactive state inhibits movement of the instrument.
(22) The input signals are generated by the right and left input devices 116 and 118 in response to movement of the hand controllers 112 and 114 by a surgeon within an input device workspace. The positioning devices 209 and 213 associated with the instruments 208 and 212 spatially position the respective end effectors 210 and 214 in the surgical workspace in response to the input signals.
(23) A block diagram of the processor circuit elements of the system 100 is shown in
(24) In this embodiment the input device 110 communicates using a USB protocol and the USB interface 254 receives input signals produced by the input device in response to movements of the hand controllers 112 and 114. The microprocessor 250 processes the input signals based on a current mapping between the input device workspace and the surgical workspace and causes the motion control interface 258 to transmit control signals, which are conveyed to the instrument processor circuit 130 via the interface cable 132. The mapping may include a scale factor that scales movements in the input device workspace to produce scaled movements in the surgical workspace. For example, a 100 mm translation in the input device workspace may be scaled by a scale factor of 0.5 to produce a 50 mm movement in the surgical workspace for fine movement.
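The scaling described above can be sketched as follows. Only the 0.5 scale factor and the 100 mm to 50 mm example come from the text; the function name and the plain per-axis tuple representation are illustrative assumptions.

```python
# Sketch of the master-to-slave motion scaling described in paragraph (24).
# The function name and tuple representation are assumptions; the 0.5
# scale factor and the 100 mm -> 50 mm example come from the text.

def scale_translation(delta_mm, scale_factor):
    """Scale a hand-controller translation (mm, per axis) into a
    surgical-workspace translation under the current mapping."""
    return tuple(scale_factor * d for d in delta_mm)

# A 100 mm translation along one axis with a 0.5 scale factor
# produces a 50 mm movement for fine work.
fine = scale_translation((100.0, 0.0, 0.0), 0.5)
print(fine)  # (50.0, 0.0, 0.0)
```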
(25) The enablement signal produced by the footswitch 134 is received at the input/output 256. The workstation memory 252 includes a current buffer 320 and a previous buffer 340 including a plurality of stores for storing values associated with the control signals, as described later herein.
(26) The instrument processor circuit 130 includes a microprocessor 280, a memory 282, a communications interface 284, and a drive control interface 286, all of which are in communication with the microprocessor. The microprocessor 280 receives the control signals at the communications interface 284, processes them, and causes the drive control interface 286 to produce drive signals for moving the instruments 208 and 212.
(27) The workstation processor circuit 120 thus acts as a master subsystem for receiving user input, while the instrument processor circuit 130 and instruments 208 and 212 act as a slave subsystem in responding to the user input.
(28) Referring to
(29) The process 300 begins at block 302, which directs the microprocessor 250 to determine whether the enablement signal is active. If the footswitch 134 is not currently being depressed then the instruments 208 and 212 are under control of the input device 110 and block 302 directs the microprocessor 250 to block 306. If the footswitch 134 is currently depressed then movement of the instrument 106 is inhibited and block 302 directs the microprocessor 250 to block 304 to execute a base setting process, which will be described later herein. Following the base setting process at block 304, the microprocessor 250 is directed to block 306.
(30) Block 306 directs the microprocessor 250 to calculate a current three-dimensional (3D) spatial position of the instruments 208 and 212 within the surgical workspace for current input signals received from the input device 110. Referring back to
(31) Block 308 then directs the microprocessor 250 to generate display signals for displaying a graphical depiction of the surgical workspace on the display 122. Referring back to
(32) Block 308 then directs the microprocessor 250 back to block 302 and the process 300 is repeated. In one embodiment the process 300 is repeated at a frequency of about 1 kHz.
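The control flow of blocks 302–308 can be sketched as a simple update cycle. All names and the dictionary-based state here are illustrative assumptions; the patent describes the loop running at about 1 kHz on the workstation processor circuit.

```python
# Illustrative sketch of process 300 (blocks 302-308): check the
# enablement signal, run a base-setting step while it is inactive,
# then update the instrument position and the graphical depiction.
# All function names and the state dictionary are assumptions, not
# the patent's actual implementation.

def run_cycle(enablement_active, state):
    if not enablement_active:          # block 302 -> block 304
        state["base_set"] = True       # stand-in for the base setting process
    # block 306: compute current 3D spatial position (stubbed here as
    # a pass-through of the current input signals)
    state["position"] = state.get("inputs", (0.0, 0.0, 0.0))
    # block 308: regenerate display signals for the graphical depiction
    state["display_updated"] = True
    return state

state = {"inputs": (1.0, 2.0, 3.0)}
run_cycle(enablement_active=False, state=state)
print(state["base_set"], state["display_updated"])  # True True
```

In the real system the loop body repeats at roughly 1 kHz, so the graphical depiction tracks the instrument in near real time.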
(33) Referring to
(34) The graphical depictions 136 and 138 also include a two-dimensional (2D) projection of the current spatial position of the respective positioning devices 209 and 213 and the end effectors 210 and 214. In the embodiment shown the end effectors 210 and 214 are represented by indicators 408 and 410 that indicate at least an approximate orientation of jaws of the respective end effectors. The positioning devices 209 and 213 are represented by areas 412 and 414 corresponding to 2D projected extents of portions of the positioning devices onto the planar representation.
(35) The graphical depictions 136 and 138 also each include an instrument depth range 416 and 418 indicating limitations to axial movement of the instruments into the surgical workspace. The limitations to axial movement of the instrument are represented by ends 424 and 426 of the instrument depth range 416 and ends 428 and 430 of the instrument depth range 418. The instrument depth ranges 416 and 418 also each include a current depth indicator 420 and 422 (in this case a circle) representing a current depth of the end effector within the respective instrument depth ranges. The current depth indicator 420 is closer to the end 424 of the range 416 than the current depth indicator 422, since the right side instrument 208 is located further into the surgical workspace than the left side instrument 212 (as shown in
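Placing the current depth indicator (420/422) along the instrument depth range bar (416/418) amounts to normalizing the current depth against the displayed range. The normalization and clamping scheme below is an assumption for illustration; the patent only shows a circle positioned between the two ends.

```python
# Sketch of positioning the current depth indicator along the
# instrument depth range bar described in paragraph (35). The
# normalization scheme is an assumption, not quoted from the patent.

def depth_fraction(depth_mm, range_min_mm, range_max_mm):
    """Return the indicator's fractional position along the bar
    (0 = shallow end, 1 = deep end), clamped to the displayed range."""
    span = range_max_mm - range_min_mm
    frac = (depth_mm - range_min_mm) / span
    return min(1.0, max(0.0, frac))

# An instrument 80 mm into a 0-100 mm depth range sits near the deep end.
print(depth_fraction(80.0, 0.0, 100.0))  # 0.8
```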
(36) The input signals produced by the input device 110 also include rotation signals defining a current rotation of each of the hand controllers 112 and 114. The rotation signals are used by the workstation processor circuit 120 to produce control signals for causing rotation of the respective end effectors 210 and 214 in the surgical workspace. The graphical depictions 136 and 138 shown in
(37) As disclosed above, blocks 302-308 of the process 300 are repeated at a frequency of about 1 kHz, thus updating the graphical depictions 136 and 138 to provide the surgeon with a near real-time display of the spatial position of the instruments 208 and 212. In the embodiment shown in
(38) Referring to
(39) The boundary surface 485 in
(40) Movements of the hand controller 112 of the input device 116 are able to cause the positioning device 209 of the instrument 208 to move within the surgical workspace 484, while the end effector 210 is capable of extending outwardly to reach into a region 488 for the current mapping. The region 488 represents an additional portion of the surgical workspace that can be accessed by the end effector 210 and has a 3D boundary surface 489.
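A containment test against a hemispherical non-anatomical boundary such as surface 485 can be sketched as below. Centering the hemisphere at the origin with its flat face on the z = 0 plane is an assumption for this sketch; the patent does not specify the boundary's coordinate frame.

```python
import math

# Illustrative containment test for a hemispherical non-anatomical
# boundary such as surface 485. The placement (flat face on z = 0,
# centered at the origin) is an assumption for this sketch.

def inside_hemisphere(point, radius):
    """True if a 3D point lies within a hemisphere of the given radius
    whose flat face sits on the z = 0 plane (z >= 0 is 'inside')."""
    x, y, z = point
    return z >= 0.0 and math.sqrt(x * x + y * y + z * z) <= radius

print(inside_hemisphere((10.0, 0.0, 5.0), 50.0))   # True
print(inside_hemisphere((0.0, 0.0, -1.0), 50.0))   # False: behind the flat face
```

A check like this could drive the active constraint indication of claims 4 and 15 by comparing the distance to the boundary against a proximity threshold.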
(41) The right graphical depiction 136 shown in
(42) Changes in the mapping between the input signals produced by the input device 110 and the control signals produced by the workstation processor circuit 120 at the motion control interface 258 may be made when the footswitch 134 is depressed, allowing the hand controllers 112 and 114 to be repositioned to access a different portion of the surgical workspace 484, or in response to a change of scale factor, allowing a larger or smaller proportion of the surgical workspace to be accessed.
(43) Input Device
(44) The right input device 116 is shown in greater detail in
(45) The input device 116 has sensors (not shown) that sense the position of each of the arms 502-506 and the rotation of the hand controller 112 about each of the x.sub.1, y.sub.1 and z.sub.1 axes, and produce signals representing the position of the hand controller in the workspace and the rotational orientation of the hand controller relative to an input device Cartesian reference frame x.sub.r, y.sub.r, z.sub.r. In this embodiment, the position and orientation signals are transmitted as input signals via a USB connection 518 to the USB interface 254 of the workstation processor circuit 120.
(46) In this embodiment, the gimbal mount 510 has a pin 512 extending downwardly from the mount and the base 500 includes a calibration opening 514 for receiving the pin. When the pin 512 is received in the opening 514 the input device 116 is located in a calibration position that is defined relative to the input device Cartesian reference frame x.sub.r, y.sub.r, z.sub.r. The input device reference frame has an x.sub.r-z.sub.r plane parallel to the base 500 and a y.sub.r axis perpendicular to the base. The z.sub.r axis is parallel to the base 500 and is coincident with an axis 516 passing centrally through the input device 116.
(47) The input device 116 produces current hand controller position signals and current hand controller orientation signals that represent the current position and orientation of the hand controller 112. The signals may be represented by a current hand controller position vector and a current hand controller rotation matrix. The current hand controller position vector is given by:
(48) {right arrow over (P)}.sub.MCURR=[x.sub.1 y.sub.1 z.sub.1].sup.T
(49) where x.sub.1, y.sub.1, and z.sub.1 represent coordinates of the hand controller position 508 (i.e. the origin of the coordinate system x.sub.1, y.sub.1, z.sub.1) relative to the input device reference frame x.sub.r, y.sub.r, z.sub.r. The current hand controller rotation matrix is given by:
(50) R.sub.MCURR=[{right arrow over (x)}.sub.1 {right arrow over (y)}.sub.1 {right arrow over (z)}.sub.1]
(51) where the columns of the matrix represent the axes of the hand controller reference frame x.sub.1, y.sub.1, z.sub.1 relative to the input device reference frame x.sub.r, y.sub.r, z.sub.r. The matrix R.sub.MCURR thus defines the current rotational orientation of the hand controller 112 relative to the x.sub.r, y.sub.r, and z.sub.r fixed master reference frame. The current hand controller position vector {right arrow over (P)}.sub.MCURR and current handle rotation matrix R.sub.MCURR are transmitted as current hand controller position and current hand controller orientation signals via the USB connection 518 to the USB interface 254 of the workstation processor circuit 120. The workstation processor circuit 120 stores the three values representing the current handle position vector {right arrow over (P)}.sub.MCURR in a store 322 and the nine values representing the current hand controller rotation matrix R.sub.MCURR in a store 324 of the current buffer 320 of workstation memory 252.
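The stored pose thus consists of three position values and a nine-value rotation matrix whose columns are the hand controller axes. A useful sanity check on such a matrix is that its columns form an orthonormal basis, sketched below. The list-of-columns layout and the check itself are assumptions; the patent only specifies the value counts in stores 322 and 324.

```python
# Sketch of validating the nine values of R_MCURR: the columns (the
# hand controller axes expressed in the input device reference frame)
# must be mutually perpendicular unit vectors. The column-tuple layout
# is an assumption for this sketch.

def is_rotation_matrix(cols, tol=1e-9):
    """Check that three column vectors form an orthonormal basis,
    as the columns of a rotation matrix must."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    for i in range(3):
        if abs(dot(cols[i], cols[i]) - 1.0) > tol:   # unit length
            return False
        for j in range(i + 1, 3):
            if abs(dot(cols[i], cols[j])) > tol:     # mutually perpendicular
                return False
    return True

identity_cols = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
print(is_rotation_matrix(identity_cols))  # True
```

A full check for a proper rotation would additionally verify that the determinant is +1 (ruling out reflections); that is omitted here for brevity.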
(52) Instrument
(53) The right side instrument 208 is shown in greater detail in
(54) The instrument 208 includes a plurality of the identical “vertebra” 550 as described in U.S. Patent Publication No. 2016/0143633, which is incorporated herein by reference. The vertebra 550 are operable to move with respect to each other when control wires passing through the vertebra are extended or retracted to cause movements of the positioning device 209. The position and orientation of the end effector 210 is defined relative to a fixed slave reference frame having axes x.sub.v, y.sub.v and z.sub.v, which intersect at a point referred to as the fixed slave reference position 552. The fixed slave reference position 552 lies on a longitudinal axis 554 of the instrument 208 and is contained in a plane perpendicular to the longitudinal axis and containing a distal edge of the insertion tube 202.
(55) In the embodiment shown, the end effector 210 includes gripper jaws 216, which may be positioned and oriented within an end effector workspace. A tip of the gripper jaws 216 may be designated as an end effector position 560 defined as the origin of an end effector Cartesian reference frame x.sub.2, y.sub.2, z.sub.2. The end effector position 560 is defined relative to the slave reference position 552 and the end effector may be positioned and orientated relative to the fixed slave reference frame x.sub.v, y.sub.v, z.sub.v, for causing movement of the positioning device 209 and/or the end effector 210.
(56) The current hand controller position signal {right arrow over (P)}.sub.MCURR and current hand controller orientation signal R.sub.MCURR cause movement of the end effector 210 of the instrument 208 to new end effector positions and desired new end effector orientations, which are represented by a new end effector position vector {right arrow over (P)}.sub.EENEW:
(57) {right arrow over (P)}.sub.EENEW=[x.sub.2 y.sub.2 z.sub.2].sup.T
where x.sub.2, y.sub.2, and z.sub.2 represent coordinates of the end effector position 560 within the end effector workspace relative to the x.sub.v, y.sub.v, z.sub.v fixed slave reference frame, and a 3×3 end effector rotation matrix R.sub.EENEW:
(58) R.sub.EENEW=[{right arrow over (x)}.sub.2 {right arrow over (y)}.sub.2 {right arrow over (z)}.sub.2]
where the columns of the R.sub.EENEW matrix represent the axes of the end effector reference frame x.sub.2, y.sub.2, and z.sub.2 written in the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. R.sub.EENEW thus defines a new orientation of the end effector 210 in the end effector workspace, relative to the x.sub.v, y.sub.v, and z.sub.v fixed slave reference frame. Values for the vector {right arrow over (P)}.sub.EENEW and rotation matrix R.sub.EENEW are calculated as described later herein and stored in stores 330 and 332 of the current buffer 320 of the workstation memory 252 respectively.
Base Setting Process
(59) When the system 100 initially starts up, the workstation processor circuit 120 sets a master base position vector {right arrow over (P)}.sub.MBASE equal to the current hand controller position vector {right arrow over (P)}.sub.MCURR and causes a definable master base rotation matrix R.sub.MBASE to define the same orientation as the current hand controller rotation matrix R.sub.MCURR. At startup the following operations are therefore performed:
{right arrow over (P)}.sub.MBASE={right arrow over (P)}.sub.MCURR, and
R.sub.MBASE=R.sub.MCURR.
(60) The hand controller 112 reference frame is represented by the axes x.sub.1, y.sub.1, and z.sub.1 shown in. The workstation processor circuit 120 stores the master base position vector {right arrow over (P)}.sub.MBASE and the definable master base rotation matrix R.sub.MBASE in the stores 326 and 328 of the current buffer 320 of the workstation memory 252.
(61) At startup of the system 100 there would be no previously stored values for the new end effector position vector {right arrow over (P)}.sub.EENEW and the new end effector rotation matrix R.sub.EENEW, and in one embodiment these values are set to home configuration values. A home configuration may be defined that produces a generally straight positioning device 209 of the instrument 208 as shown in
{right arrow over (P)}.sub.EEBASE={right arrow over (P)}.sub.EENEW, and
R.sub.EEBASE=R.sub.EENEW.
(62) The end effector reference frame represented by the axes x.sub.2, y.sub.2, and z.sub.2 shown in
(63) The base setting process (block 304 of the process 300 shown in
(64) Block 602 then directs the microprocessor 250 to determine whether the enablement signal has transitioned from the inactive state to the active state again. If the enablement signal remains in the inactive state, block 602 directs the microprocessor 250 to repeat block 602 and the process 304 is thus effectively suspended while the enablement signal is in the inactive state. When the enablement signal transitions from the inactive state to the active state, block 602 directs the microprocessor 250 to block 604.
(65) Block 604 directs the microprocessor 250 to set new base positions and orientations for the hand controller 112 and end effector 210 respectively. While the footswitch 134 is depressed the surgeon may move the hand controller 112 to a new location to relocate the input device workspace relative to the surgical workspace. When the enablement signal transitions to the active state, block 604 directs the microprocessor 250 to cause current values of the current hand controller position vector {right arrow over (P)}.sub.MCURR and the hand controller rotation matrix R.sub.MCURR to be stored in locations 326 and 328 of the current buffer 320 of the workstation memory 252 as new values for the master base position vector {right arrow over (P)}.sub.MBASE and master base rotation matrix R.sub.MBASE. Block 604 also directs the microprocessor 250 to cause current values for the end effector position signal {right arrow over (P)}.sub.EENEW and the end effector orientation signal R.sub.EENEW to be stored in stores 334 and 336 of the current buffer 320 as the definable end effector base position vector {right arrow over (P)}.sub.EEBASE and definable end effector base rotation matrix R.sub.EEBASE.
(66) The base setting process 304 then continues at block 606, which directs the microprocessor 250 to permit further movement of the instrument 208 while the enablement signal produced by the footswitch 134 remains active.
(67) The base setting process 304 thus allows the instrument 208 to be immobilized by depressing the footswitch 134 while the hand controller 112 of the input device 116 is moved to a new location. When the footswitch 134 is released, control of the instrument 208 resumes at the new position of the hand controller 112. The hand controller 112 may thus be repositioned as desired while the instrument remains immobile, preventing unintended movements that may inflict injury to the patient.
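The base-setting ("clutching") behavior described above can be sketched as a small state machine. The class name, position-only treatment, and NumPy representation are illustrative assumptions; orientation would be handled analogously with rotation matrices.

```python
import numpy as np

class Clutch:
    """Sketch of the enablement-signal logic: while the signal is
    inactive the end effector holds its last commanded position and the
    hand controller moves freely; on the inactive-to-active transition
    both base positions are re-captured (block 604), so control resumes
    from the new hand controller location without a jump."""
    def __init__(self, scale=1.0):
        self.scale = scale           # translational scaling factor A
        self.p_mbase = np.zeros(3)   # master base position
        self.p_eebase = np.zeros(3)  # end effector base position
        self.p_ee = np.zeros(3)      # last commanded end effector position
        self.active = True

    def set_enable(self, active, p_mcurr):
        if active and not self.active:
            # Footswitch released: rebase master to the relocated hand
            # controller and slave to the held end effector position.
            self.p_mbase = np.asarray(p_mcurr, dtype=float).copy()
            self.p_eebase = self.p_ee.copy()
        self.active = active

    def update(self, p_mcurr):
        if self.active:  # Eqn 1a (position part only)
            self.p_ee = (self.scale * (np.asarray(p_mcurr, dtype=float)
                         - self.p_mbase) + self.p_eebase)
        return self.p_ee.copy()
```

A usage sketch: moving the hand controller while the clutch is inactive leaves the end effector stationary, and motion resumes relative to the new hand position after re-enabling.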
(68) In one embodiment, when the footswitch 134 causes the enablement signal to transition to the inactive state, the indicators 408, 412, 410 and 414 in
(69) Instrument Position and Orientation
(70) Further details of block 306 of the process 300 shown in
(71) The process 306 begins at block 630, which directs the microprocessor 250 to read current values for the current hand controller position vector {right arrow over (P)}.sub.MCURR and the current hand controller rotation matrix R.sub.MCURR from the current buffer 320 of the workstation memory 252. Block 632 then directs the microprocessor 250 to calculate new end effector position signals {right arrow over (P)}.sub.EENEW and new end effector orientation signals R.sub.EENEW representing a desired end effector position 560 and desired end effector orientation, relative to the fixed slave reference position 552 and the slave base orientation. Block 632 also directs the microprocessor 250 to store values representing the new end effector position vector {right arrow over (P)}.sub.EENEW in the store 330 and to store values representing the desired end effector orientation matrix R.sub.EENEW in the store 332 of the current buffer 320 of the workstation memory 252.
(72) The new end effector position signals {right arrow over (P)}.sub.EENEW and new end effector orientation signals R.sub.EENEW are calculated according to the following relations:
{right arrow over (P)}.sub.EENEW=A({right arrow over (P)}.sub.MCURR−{right arrow over (P)}.sub.MBASE)+{right arrow over (P)}.sub.EEBASE Eqn 1a
R.sub.EENEW=R.sub.EEBASER.sub.MBASE.sup.−1R.sub.MCURR Eqn 1b
where:
(73) {right arrow over (P)}.sub.EENEW is the new end effector position vector that represents the new desired position of the end effector 210 in the end effector workspace, and is defined relative to the slave base reference position;
A is a scalar value representing a scaling factor in translational motion between the master and the slave;
{right arrow over (P)}.sub.MCURR is the current representation of the hand controller position vector stored in the store 322 of the current buffer 320, the hand controller position vector being defined relative to the fixed master reference frame x.sub.r, y.sub.r, and z.sub.r;
{right arrow over (P)}.sub.MBASE is the last-saved position vector {right arrow over (P)}.sub.MCURR for the hand controller 112, saved at the last transition of the enablement signal from the inactive state to the active state, on system initialization, or on operation of a control interface by an operator;
{right arrow over (P)}.sub.EEBASE is the last-saved position vector {right arrow over (P)}.sub.EENEW for the end effector 210, saved at the last transition of the enablement signal from the inactive state to the active state or on system initialization;
R.sub.EENEW is the new end effector orientation matrix representing the current orientation of the end effector 210, and is defined relative to the fixed slave reference position 552;
R.sub.EEBASE is the last-saved rotation matrix R.sub.EENEW of the end effector 210, saved at the last transition of the enablement signal from the inactive state to the active state;
R.sub.MBASE.sup.−1 is the inverse of rotation matrix R.sub.MBASE, which is the last-saved rotation matrix R.sub.MCURR of the hand controller 112, saved at the last transition of the enablement signal from the inactive state to the active state; and
R.sub.MCURR is the currently acquired rotation matrix representing the orientation of the hand controller 112 relative to the fixed master reference frame x.sub.r, y.sub.r, and z.sub.r.
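Eqn 1a and Eqn 1b can be sketched directly in code. The function name and NumPy representation are illustrative assumptions; the computation itself follows the two relations above, using the fact that the inverse of a proper rotation matrix is its transpose.

```python
import numpy as np

def new_end_effector_pose(p_mcurr, r_mcurr, p_mbase, r_mbase,
                          p_eebase, r_eebase, scale_a):
    """Map the current hand controller pose to a new end effector pose.

    Eqn 1a: P_EENEW = A * (P_MCURR - P_MBASE) + P_EEBASE
    Eqn 1b: R_EENEW = R_EEBASE * R_MBASE^-1 * R_MCURR
    For a rotation matrix, R_MBASE^-1 is simply its transpose."""
    p_eenew = (scale_a * (np.asarray(p_mcurr, dtype=float)
               - np.asarray(p_mbase, dtype=float))
               + np.asarray(p_eebase, dtype=float))
    r_eenew = (np.asarray(r_eebase, dtype=float)
               @ np.asarray(r_mbase, dtype=float).T
               @ np.asarray(r_mcurr, dtype=float))
    return p_eenew, r_eenew
```

With identity bases and a scale of 0.5, a 2-unit master translation yields a 1-unit slave translation, and only the hand controller rotation accumulated since the last rebase is applied to the end effector.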
(74) Block 634 then directs the microprocessor 250 to determine whether the enablement signal is in the active state. If the enablement signal is in the active state, block 636 directs the microprocessor 250 to cause the motion control interface 258 to transmit control signals based on the newly calculated values for {right arrow over (P)}.sub.EENEW and R.sub.EENEW. When the control signals are received at the communications interface 284 of the instrument processor circuit 130, the microprocessor 280 causes drive signals to be produced to cause the end effector 210 to assume a position and orientation determined by the current position and current orientation of the hand controller 112.
(75) Block 638 then directs the microprocessor 250 to copy the current position vector {right arrow over (P)}.sub.MCURR and the current rotation matrix R.sub.MCURR stored in stores 322 and 324 of the current buffer 320 into stores 342 ({right arrow over (P)}.sub.MPREV) and 344 (R.sub.MPREV) of the previous buffer 340 of the workstation memory 252. Block 638 also directs the microprocessor 250 to copy the newly calculated end effector position vector {right arrow over (P)}.sub.EENEW and the newly calculated end effector rotation matrix R.sub.EENEW into stores 346 and 348 of the previous buffer 340. By storing the newly calculated end effector position vector {right arrow over (P)}.sub.EENEW and newly calculated end effector rotation matrix R.sub.EENEW as the previously calculated end effector position vector {right arrow over (P)}.sub.EEPREV and previously calculated end effector rotation matrix R.sub.EEPREV, a subsequently acquired new end effector position vector {right arrow over (P)}.sub.EENEW and subsequently acquired new end effector rotation matrix R.sub.EENEW can be calculated from the next received hand controller position vector {right arrow over (P)}.sub.MCURR and next received hand controller rotation matrix R.sub.MCURR provided by the input device 116.
(76) If at block 634 the enablement signal is in the inactive state, the microprocessor 250 is directed to block 642. Block 642 directs the microprocessor 250 to cause the motion control interface 258 to transmit control signals based on the previously calculated values of {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV in the respective stores 346 and 348 of the previous buffer 340 of the workstation memory 252. The control signals transmitted by the motion control interface 258 are thus derived from the last saved values of {right arrow over (P)}.sub.EENEW and R.sub.EENEW, causing the end effector 210 to remain stationary since the same control signals as previously determined are transmitted to the communications interface 284 of the instrument processor circuit 130. The microprocessor 250 is then directed to block 640.
(77) While the enablement signal remains inactive (i.e. while the footswitch 134 is depressed), the control signals transmitted by the motion control interface 258 are based only on the previously calculated end effector position and orientation signals {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV that were in effect before the enablement signal transitioned to inactive.
(78) In another embodiment certain special functions may be executed before executing block 636 when the enablement signal is determined to be in the active state at block 634. One example of such a special function is an alignment control function, as described in applicant's co-pending U.S. Patent Publication No. 2018/0271607 and U.S. Patent Publication No. 2017/0367777, hereby incorporated by reference in their entirety. For example, in one embodiment an alignment control function may have one of two outcomes. The first outcome may direct the microprocessor 250 to execute block 636, which directs the microprocessor to cause the motion control interface 258 to transmit control signals to the instrument processor circuit 130 based on the newly calculated end effector position and newly calculated end effector orientation {right arrow over (P)}.sub.EENEW and R.sub.EENEW. The second outcome directs the microprocessor 250 to execute block 642, which causes the microprocessor to cause the motion control interface 258 to transmit control signals based on a previously calculated end effector position and previously calculated end effector orientation {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV. This causes the end effector 210 to assume a position and orientation determined by a previous position and previous orientation of the hand controller 112.
(79) Accordingly, when the enablement signal is in the inactive state, the hand controller 112 can be moved and rotated and the calculations of {right arrow over (P)}.sub.EENEW and R.sub.EENEW will still be performed by block 632, but there will be no movement of the end effector 210, since the previous control signals are sent to the instrument processor circuit 130. This allows “clutching” or repositioning of the hand controller 112 without corresponding movement of the end effector 210. The movement may be useful in relocating the hand controller within the input device workspace to a comfortable position and/or providing an increased range of movement for the end effector 210 within the surgical workspace.
(80) The end effector position vector {right arrow over (P)}.sub.EENEW or {right arrow over (P)}.sub.EEPREV and end effector orientation matrix R.sub.EENEW or R.sub.EEPREV produced at block 636 or block 642 provide a desired location of the end effector tip 560 with respect to the fixed slave reference position 552. However, in the embodiment shown in
(81) Motion Control Signals
(82) The right side instrument 208 is shown in a bent pose in
(83) The s-segment 700 extends from the first position 704 to a third position 706 defined as an origin of a third reference frame having axes x.sub.5, y.sub.5, and z.sub.5 and is capable of assuming a smooth s-shape when control wires (not shown) inside the s-segment 700 are pushed and pulled. The s-segment 700 has a mid-point at a second position 708, defined as the origin of a second position reference frame having axes x.sub.4, y.sub.4, and z.sub.4. The s-segment 700 has a length L.sub.1, best shown in
(84) The distal segment 702 extends from the third position 706 to a fourth position 710 defined as an origin of a fourth reference frame having axes x.sub.6, y.sub.6, and z.sub.6. The distal segment 702 has a length L.sub.2, best shown in
(85) Each end effector 210 and 214 also has an end effector length, which in the embodiment shown is a gripper length L.sub.3 extending from the fourth position 710 to the end effector tip position 560 defined as the origin of the axes x.sub.2, y.sub.2, and z.sub.2. The gripper length L.sub.3 is best shown in
(86) As described in U.S. Patent Publication No. 2016/0143633 (hereby incorporated herein by reference in its entirety) by pushing and pulling on control wires inside the positioning devices 209 and 213, the s-segments 700 of the positioning devices 209 and 213 may be bent into various degrees of an s-shape, from the straight condition shown in
(87) In addition, the distal segment 702 lies in a second bend plane containing the third position 706 and the fourth position 710. The second bend plane is at an angle δ.sub.dist to the x.sub.v-z.sub.v plane of the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. The distal segment 702 is bent in the second bend plane at an angle ϑ.sub.dist. Thus, by pushing and pulling the control wires within the positioning device 209, the fourth position 710 can be placed within another volume in space about the fourth position 710. This volume may be referred to as the distal workspace. The combination of the s-segment workspace and the distal workspace may be referred to as the positioning device workspace as this represents the total possible movement of the instrument 208 as effected by the positioning device 209. The left side instrument 212 may be similarly positioned by the positioning device 213.
(88) The distance between the fourth position 710 and the end effector position 560 is the distance between the movable portion of the distal segment 702 and the tip of the gripper end effector 210 in the embodiment shown, i.e. the gripper length L.sub.3 shown in
(89) In the embodiment shown, the end effector 210 includes movable gripper jaws 216 that are rotatable about the z.sub.2 axis in the x.sub.2-y.sub.2 plane of the end effector reference frame, the angle of rotation being represented by an angle γ relative to the positive x.sub.2 axis. Finally, the gripper jaws 216 may be at any of varying degrees of openness from fully closed to fully open (as limited by a hinge joint of the jaws). The varying degrees of openness may be defined as the “gripper”. In summary therefore, the motion control signals are generated based on a kinematic configuration of the positioning device 209 and end effector 210 as defined by the following configuration variables:
q.sub.ins represents a distance from the slave reference position 552 defined by axes x.sub.v, y.sub.v, and z.sub.v to the first position 704 defined by axes x.sub.3, y.sub.3 and z.sub.3 where the s-segment 700 of the positioning device 209 begins;
δ.sub.prox represents a first bend plane in which the s-segment 700 is bent relative to the x.sub.v-y.sub.v plane of the fixed slave reference frame;
ϑ.sub.prox represents an angle at which the first and second sections 712 and 714 of the s-segment 700 are bent in the first bend plane;
δ.sub.dist represents a second bend plane in which the distal segment 702 is bent relative to the x.sub.v-y.sub.v plane of the fixed slave reference frame;
(90) ϑ.sub.dist represents an angle through which the distal segment 702 is bent in the second bend plane;
γ represents a rotation of the end effector 210 about axis z.sub.2; and
gripper represents a degree of openness of the gripper jaws 216 of the end effector 210 (this value is calculated in direct proportion to a signal produced by an actuator (not shown) on the hand controller 112 indicative of an amount of pressure the operator exerts by squeezing the actuator to actuate the jaws 216 to close).
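The seven configuration variables can be collected in a simple structure, sketched below. The field names, the dataclass representation, and the linear actuator-to-gripper mapping are illustrative assumptions; the source states only that the gripper value is directly proportional to the actuator signal.

```python
from dataclasses import dataclass

@dataclass
class ConfigVariables:
    """Illustrative container for the configuration variables that
    define the kinematic pose of the positioning device and end
    effector."""
    q_ins: float       # insertion distance to the start of the s-segment
    delta_prox: float  # first bend plane angle of the s-segment
    theta_prox: float  # bend angle within the first bend plane
    delta_dist: float  # second bend plane angle of the distal segment
    theta_dist: float  # bend angle within the second bend plane
    gamma: float       # end effector rotation about the z_2 axis
    gripper: float     # degree of openness of the gripper jaws

def gripper_from_actuator(squeeze, max_open=1.0):
    """Assumed mapping: openness decreases in direct proportion to the
    squeeze signal (0 = released/fully open, 1 = fully squeezed/closed)."""
    squeeze = min(max(float(squeeze), 0.0), 1.0)  # clamp to [0, 1]
    return max_open * (1.0 - squeeze)
```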
(91) To calculate the configuration variables, it will first be recalled that the end effector rotation matrix R.sub.EENEW is a 3×3 matrix:
(92)
R.sub.EENEW=[x.sub.2x y.sub.2x z.sub.2x; x.sub.2y y.sub.2y z.sub.2y; x.sub.2z y.sub.2z z.sub.2z] Eqn 2
where the last column of R.sub.EENEW is the z-axis of the end effector reference frame written relative to the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. The values ϑ.sub.dist, δ.sub.dist, and γ associated with the distal segment 702 may be calculated according to the relations:
(93)
else
γ=a tan 2(y.sub.2z,−x.sub.2z)−δ.sub.dist Eqn 4b
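The explicit form of Eqn 3 did not survive in this text. The sketch below gives one reconstruction consistent with the surviving statements (the last column of R.sub.EENEW is the end effector z-axis in the slave frame, and γ follows Eqn 4b); the formulas for ϑ.sub.dist and δ.sub.dist are assumptions inferred from that context, not the patent's stated equations.

```python
import numpy as np

def distal_angles(r_eenew):
    """Assumed reconstruction of Eqn 3 / Eqn 4b: the columns of
    R_EENEW are taken as the end effector axes x_2, y_2, z_2 expressed
    in the fixed slave reference frame."""
    x2, y2, z2 = r_eenew[:, 0], r_eenew[:, 1], r_eenew[:, 2]
    # Assumed: bend angle from the z-component of z_2, bend plane
    # azimuth from the transverse components of z_2.
    theta_dist = np.arccos(np.clip(z2[2], -1.0, 1.0))
    delta_dist = np.arctan2(z2[1], z2[0])
    # Eqn 4b (general case; the singular theta_dist = 0 branch of the
    # original "else" structure is omitted in this sketch).
    gamma = np.arctan2(y2[2], -x2[2]) - delta_dist
    return theta_dist, delta_dist, gamma
```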
(94) The third position 706 may then be written in terms of a vector
where:
(95)
where ī is a unit vector in the x direction,
The vector
(96)
Taking a ratio of Eqn 8b and Eqn 8a yields:
δ.sub.prox=a tan 2(−
where ī and
(97)
where ī is the unit vector in the x direction. The equation Eqn 10 is Eqn 8a rewritten in the form f(ϑ.sub.prox)=0. The Newton-Raphson method tends to converge very quickly because in the range 0<ϑ.sub.prox<π the function has a large radius of curvature and has no local stationary points. Following the Newton-Raphson method, successive improved estimates of ϑ.sub.prox can be made iteratively to satisfy equation Eqn 10 using the following relationship:
(98)
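The Newton-Raphson update referenced above (Eqn 11) did not survive in this text, but the paragraph fully describes the standard iteration x.sub.k+1 = x.sub.k − f(x.sub.k)/f′(x.sub.k). The sketch below implements that generic iteration; the residual used in the demonstration is a placeholder and is not the patent's Eqn 10 relating ϑ.sub.prox to the s-segment geometry.

```python
import math

def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson root finder: x <- x - f(x)/df(x),
    iterated until the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Demonstration with a placeholder residual f(t) = cos(t) - t, which,
# like the described Eqn 10, is smooth with no stationary points in
# the search interval, so the iteration converges quickly.
theta = newton_raphson(lambda t: math.cos(t) - t,
                       lambda t: -math.sin(t) - 1.0, 1.0)
```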
(99) Finally, upon determination of ϑ.sub.prox, the following equation can be used to find q.sub.ins:
(100)
where
(102) The above configuration variables are calculated for the end effector position and orientation signals {right arrow over (P)}.sub.EENEW and R.sub.EENEW at block 636 or {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV at block 642 of the process 306. The configuration variables generally define a pose of the positioning device 209 required to position the end effector 210 at the desired location and orientation in the end effector workspace. Configuration variables are produced for each end effector 210 and 214 of the respective right and left side instruments 208 and 212. Two sets of configuration variables, referred to as left and right configuration variables respectively, are thus produced and transmitted by the motion control interface 258 to the instrument processor circuit 130 and used by the microprocessor 280 to generate drive control signals for spatially positioning the positioning device 209 and end effector 210 of the instrument 208 in the surgical workspace.
(103) 3D Spatial Positioning
(104) Further details of block 308 of the process 300 shown in
(105) The process 308 begins at block 740, which directs the microprocessor 250 to select the first reference position (shown at 704 in
(106) Block 744 then directs the microprocessor 250 to determine locations of intermediate points along the first section 712 of the positioning device 209 (i.e. between the first position 704 and the second position 708). The location of the first position 704 determined at block 740 is used to determine locations of all vertebrae 550 in the first section 712 of the s-segment 700. For example in the embodiment shown in
(107)
relative to the first position 704. A vector from the first position 704 to the n.sup.th vertebra position may thus be determined and added to the vector
(108) Block 746 then directs the microprocessor 250 to determine whether all of the reference positions have been processed, and if not, the microprocessor is directed to block 748 where the next reference position is selected for processing. Block 748 then directs the microprocessor 250 back to block 742 and blocks 742 and 744 are repeated for each reference position.
(109) The location of the second position 708 relative to the fixed slave reference position 552 may be determined from the configuration variables q.sub.ins, ϑ.sub.prox, and δ.sub.prox. Determining a vector
(110)
relative to the second position 708. A vector from the second position 708 to the n.sup.th vertebra position may thus be determined and added to the vector
The location of the third position 706 at the end of the s-segment 700 may be expressed in terms of the vector
(111)
relative to the third position 706. A vector from the third position 706 to the n.sup.th vertebra position may thus be determined and added to the vector
(112) The location of the fourth position 710 may be determined from the vector
(113) Finally, the theoretical location of the end effector position 560 may be determined as a vector
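The intermediate-point calculations above sample positions along each bent section of the positioning device. The specific vector equations did not survive in this text; the sketch below uses an assumed constant-curvature model for a single section, with the bend plane rotated by an azimuth angle about the section's base z-axis. The function name and the single-section simplification (the s-segment is actually two such sections forming an s-shape) are assumptions.

```python
import numpy as np

def arc_points(length, theta, delta, n):
    """Sample n points along a section of given arc length, bent by
    theta within a plane at azimuth delta (assumed constant-curvature
    model), expressed relative to the section's base frame."""
    pts = []
    for i in range(1, n + 1):
        t = i / n  # fraction of the way along the section
        if abs(theta) < 1e-9:
            local = np.array([0.0, 0.0, length * t])  # straight section
        else:
            r = length / theta  # radius of the circular arc
            local = np.array([r * (1.0 - np.cos(theta * t)), 0.0,
                              r * np.sin(theta * t)])
        # Rotate the bend plane by delta about the base z axis.
        c, s = np.cos(delta), np.sin(delta)
        pts.append(np.array([c * local[0] - s * local[1],
                             s * local[0] + c * local[1], local[2]]))
    return pts
```

Each sampled point, offset by the vector to the section's base position, would give the location of one vertebra relative to the fixed slave reference position 552.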
(114) If at block 746, each of the reference positions along the positioning device 209 has been processed, the locations of a plurality of points along the positioning device 209 and end effector 210 will have been determined, thus defining the 3D spatial positioning of the instrument 208 in the surgical workspace.
(115) The process 308 then continues at block 748, which directs the microprocessor 250 to generate a two-dimensional projection of the current 3D spatial position of the positioning device 209 to generate the area 412 representing the positioning device shown in the graphical depiction 136 of
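The two-dimensional projection at block 748 can be sketched as a simple orthographic projection of the computed 3D points onto the planar representation. The choice of dropping one coordinate (and which coordinate to drop) is an illustrative assumption; the source does not specify the projection used to generate the area 412.

```python
import numpy as np

def project_to_plane(points_3d, drop_axis=1):
    """Orthographic projection onto the planar representation by
    discarding one coordinate (assumed projection). The default
    drop_axis=1 projects onto the x-z plane."""
    keep = [i for i in range(3) if i != drop_axis]
    return [np.asarray(p, dtype=float)[keep] for p in points_3d]
```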
(116) The process 308 then continues at block 750, which directs the microprocessor 250 to determine whether any projected portion of the positioning device 209 is proximate the boundary 406 in
(117) Block 752 directs the microprocessor 250 to cause an active constraint alert to be generated. In one embodiment a visual alert may be generated by changing a color or displayed intensity of the boundary 402 or 406 or by displaying an alert symbol on the display 122. The alert may alternatively be displayed in the graphical depictions 136 and 138 overlaying the location of the indicators 412 and 414. In other embodiments an audible alert may be generated. Alternatively or additionally, the microprocessor 250 may cause the input device 110 to generate haptic feedback via the hand controller 112. Block 752 then directs the microprocessor 250 back to block 302 in
(118) If at block 750, the positioning device 209 and end effector 210 are not proximate any boundaries, the microprocessor 250 is directed back to block 302 in
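The boundary-proximity test at block 750 can be sketched as follows. The circular boundary model and the margin threshold are illustrative assumptions; the actual boundary 402 or 406 may have any shape derived from the instrument movement region.

```python
import math

def near_boundary(points_2d, boundary_center, boundary_radius, margin):
    """Assumed circular-boundary check: return True when any projected
    instrument point comes within `margin` of the boundary, which would
    trigger the active constraint alert of block 752."""
    for x, y in points_2d:
        d = math.hypot(x - boundary_center[0], y - boundary_center[1])
        if d >= boundary_radius - margin:
            return True
    return False
```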
(119) Depth
(120) The instrument depth range 416 depiction shown in
(121) Rotation
(122) The instrument rotation range 440 shown in
(123) Positioning Device Active Constraints
(124) The intermediate positions of the positioning device 209 of the right side instrument 208 calculated as described define the 3D location of the positioning device 209 of the instrument 208 within the surgical workspace (shown at 484 in
(125) In the next example 802, the positioning device 209 has been moved up and the positioning device 213 has been moved down and intermediate locations at 804 are determined by the microprocessor 250 to be proximate upper and lower portions of the boundary surface 485. The dots depicting the instruments 208 and 212 are shown at locations proximate the boundary. An alert may be generated by coloring portions of the boundary in a conspicuous color to indicate the condition to the surgeon.
(126) An example of left/right limits for the positioning devices 209 and 213 are shown at 806. In the example shown at 808, the positioning devices 209 and 213 are positioned generally as in the example 806 but with the end effectors 210 and 214 turned outwardly. The end effectors 210 and 214 are located proximate the boundary surface 489 of the region 488 shown in
(127) An example 810 shows the instruments 208 and 212 slightly turned in so that the end effector indicators 408 and 410 and the areas 412 and 414 are visible. In the example 812, the end effectors 210 and 214 remain turned inwardly while the positioning devices 209 and 213 have reached the upper and lower limits as shown at 814. In example 816, the end effectors 210 and 214 have turned outwardly and are proximate respective upper and lower portions of the 3D boundary surface 489. In the final example 818, a situation similar to that shown in example 812 is shown for the left/right limits to positioning device movement.
(128) While specific embodiments have been described and illustrated, such embodiments should be considered illustrative of the invention only and not as limiting the invention as construed in accordance with the accompanying claims.