HAND EYE COORDINATION SYSTEM FOR ROBOTIC SURGICAL SYSTEM
20220022984 · 2022-01-27
CPC classification
A61B34/20 (HUMAN NECESSITIES)
A61B90/37 (HUMAN NECESSITIES)
Abstract
A method of controlling a tool of a surgical robot with a processing unit such that the tool follows input of an input handle of a user console includes receiving movement of the input handle in a master frame of the user console, translating the movement of the input handle in the master frame to movement of the tool in a camera frame of a camera providing real-time images of a surgical site, translating the movement of the tool in the camera frame to movement of the tool in a world frame defined in a surgical theater, translating the movement of the tool in the world frame to movement of the tool in a tool frame which is defined by an arm supporting the tool, and transmitting control commands to a motor controlling the tool such that the tool follows movement of the input handle.
Claims
1. A method of controlling a tool of a surgical robot with a processing unit such that the tool follows input of an input handle of a user console, the method comprising: receiving movement of the input handle in a master frame of the user console; translating the movement of the input handle in the master frame to movement of the tool in a camera frame of a camera providing real-time images of a surgical site; translating the movement of the tool in the camera frame to movement of the tool in a world frame defined in a surgical theater; translating the movement of the tool in the world frame to movement of the tool in a tool frame which is defined by an arm supporting the tool; and transmitting control commands to a motor controlling the tool such that the tool follows movement of the input handle.
2. The method according to claim 1, wherein translating the movement of the input handle in the master frame to movement of the tool in the camera frame includes applying a display rotation matrix to movement of the input handle in the master frame to translate movement of the input handle to a display frame of a display of the user console, the display providing visualization of the real-time images of the surgical site.
3. The method according to claim 2, wherein the display rotation matrix rotates the movement 30 degrees about an axis of the master frame.
4. The method according to claim 1, wherein translating the movement of the input handle in the master frame to movement of the tool in the camera frame includes applying a flip rotation matrix to movement of the input handle in the master frame to translate movement of the input handle to the camera frame, the flip rotation matrix rotating movement 180 degrees about an axis of the display frame.
5. The method according to claim 4, further comprising: receiving a flip flag; and applying the flip rotation matrix when the flip flag indicates that the real-time images from the camera are flipped before being viewed on a display.
6. The method according to claim 1, wherein translating the movement of the tool in the camera frame to the movement of the tool in the world frame includes: determining yaw and pitch angles of the camera frame relative to the world frame; and applying a world frame rotation matrix including the yaw and pitch angles of the camera frame relative to the world frame to translate the movement of the tool in the camera frame to movement of the tool in the world frame.
7. The method according to claim 1, wherein translating the movement of the tool in the camera frame to the movement of the tool in the world frame includes: receiving an offset of the camera; and adjusting the movement in the camera frame by the offset of the camera by applying an adjustment rotation matrix including the offset of the camera to the movement of the tool in the camera frame.
8. The method according to claim 1, wherein translating the movement of the tool in the world frame to movement of the tool in the tool frame includes: determining yaw and pitch angles of the tool frame relative to the world frame; and applying a transpose of a tool frame rotation matrix including the yaw and pitch angles of the tool frame relative to the world frame to translate the movement of the tool in the world frame to movement of the tool in the tool frame.
9. The method according to claim 1, further comprising verifying the camera is fixed during the movement of the input handle before transmitting the control commands to the motor controlling the tool.
10. A robotic surgical system comprising: a surgical robot including a first arm supporting a camera and a second arm supporting a tool, the camera defining a camera frame and configured to capture real-time images of a surgical site, the second arm defining a tool frame and configured to manipulate the tool within the tool frame, the surgical robot defining a world frame; a user console including an input handle and a display, the input handle moveable within a master frame defined by the user console, the display configured to display real-time images of a surgical site provided by the camera; and a processing unit configured to: receive movement of the input handle in the master frame; translate the movement of the input handle in the master frame to movement of the tool in the camera frame; translate the movement of the tool in the camera frame to movement of the tool in the world frame; translate the movement of the tool in the world frame to movement of the tool in the tool frame; and transmit control commands to the second arm such that the tool follows movement of the input handle.
11. The robotic surgical system according to claim 10, wherein the display defines a display frame that is rotated relative to one axis of the master frame.
12. The robotic surgical system according to claim 11, wherein the display is rotated 30 degrees about the one axis of the master frame.
13. The robotic surgical system according to claim 11, wherein the processing unit is configured to apply a display rotation matrix to movement of the input handle in the master frame to translate movement of the input handle to the display frame.
14. The robotic surgical system according to claim 10, wherein the display is configured to flip the real-time images provided by the camera 180 degrees before displaying the real-time images.
15. The robotic surgical system according to claim 14, wherein the processing unit is configured to receive a flip flag indicative of the real-time images being flipped on the display and to apply a flip rotation matrix to movement of the tool to translate the movement of the tool to the camera frame.
16. The robotic surgical system according to claim 10, wherein the processing unit is configured to receive yaw and pitch angles of the camera frame relative to the world frame and to apply a world frame rotation matrix including the yaw and pitch angles of the camera frame relative to the world frame to translate the movement of the tool in the camera frame to movement of the tool in the world frame.
17. The robotic surgical system according to claim 10, wherein the processing unit is configured to receive an offset angle of the camera and to apply an adjustment rotation matrix including the offset angle of the camera to adjust the movement of the tool in the camera frame.
18. The robotic surgical system according to claim 10, wherein the processing unit is configured to receive yaw and pitch angles of the tool frame relative to the world frame and to apply a transpose of a tool frame rotation matrix including the yaw and pitch angles of the tool frame relative to the world frame to translate the movement of the tool in the world frame to movement of the tool in the tool frame.
19. The robotic surgical system according to claim 10, wherein the processing unit is configured to verify the camera is fixed during the movement of the input handle before transmitting the control commands.
20. A computer-readable medium having stored thereon sequences of instructions that, when executed by a processor, cause the processor to: receive movement of an input handle in a master frame of a user console; translate the movement of the input handle in the master frame into movement of a tool in a camera frame of a camera providing real-time images of a surgical site; translate the movement of the tool in the camera frame to movement of the tool in a world frame defined in a surgical theater; translate the movement of the tool in the world frame to movement of the tool in a tool frame which is defined by an arm supporting the tool; and transmit control commands to a motor controlling the tool such that the tool follows movement of the input handle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Various aspects of the present disclosure are described hereinbelow with reference to the drawings, which are incorporated in and constitute a part of this specification, wherein:
DETAILED DESCRIPTION
[0024] Embodiments of the present disclosure are now described in detail with reference to the drawings in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term “clinician” refers to a doctor, a nurse, or any other care provider and may include support personnel.
[0025] Referring to
[0026] The user console 40 includes a display device 44 which is configured to display three-dimensional images. The display device 44 displays three-dimensional images of the surgical site “S” which may include data captured by imaging devices 16 positioned on the ends 14 of the linkages 12 and/or include data captured by imaging devices that are positioned about the surgical theater (e.g., an imaging device positioned within the surgical site “S”, an imaging device positioned adjacent the patient “P”, imaging device 56 positioned at a distal end of an imaging arm 52). The imaging devices (e.g., imaging devices 16, 56) may capture visual images, infra-red images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site “S”. The imaging devices transmit captured imaging data to the processing unit 30 which creates three-dimensional images of the surgical site “S” in real-time from the imaging data and transmits the three-dimensional images to the display device 44 for display.
[0027] The user console 40 also includes input handles 42 which are supported on control arms 43 which allow a clinician to manipulate the surgical robot 10 (e.g., move the linkages 12, the ends 14 of the linkages 12, and/or the tools 20). Each of the input handles 42 is in communication with the processing unit 30 to transmit control signals thereto and to receive feedback signals therefrom. Additionally or alternatively, each of the input handles 42 may include input devices (not explicitly shown) which allow the surgeon to manipulate (e.g., clamp, grasp, fire, open, close, rotate, thrust, slice, etc.) the tools 20 supported at the ends 14 of the linkages 12.
[0028] Each of the input handles 42 is moveable through a predefined workspace to move the ends 14 of the linkages 12, e.g., tools 20, within a surgical site “S”. The three-dimensional images on the display device 44 are orientated such that the movement of the input handles 42 moves the ends 14 of the linkages 12 as viewed on the display device 44. The three-dimensional images remain stationary while movement of the input handles 42 is scaled to movement of the ends 14 of the linkages 12 within the three-dimensional images. To maintain an orientation of the three-dimensional images, kinematic mapping of the input handles 42 is based on a camera orientation relative to an orientation of the ends 14 of the linkages 12. The orientation of the three-dimensional images on the display device 44 may be mirrored or rotated relative to the view captured by the imaging devices 16, 56. In addition, the size of the three-dimensional images on the display device 44 may be scaled to be larger or smaller than the actual structures of the surgical site permitting a clinician to have a better view of structures within the surgical site “S”. As the input handles 42 are moved, the tools 20 are moved within the surgical site “S” as detailed below. Movement of the tools 20 may also include movement of the ends 14 of the linkages 12 which support the tools 20.
[0029] For a detailed discussion of the construction and operation of a robotic surgical system 1, reference may be made to U.S. Pat. No. 8,828,023, the entire contents of which are incorporated herein by reference.
[0030] Referring to
[0031] With particular reference to
[0032] Referring back to
[0033] Referring to
[0034] With reference to the robotic surgical system of
[0035] As detailed herein, a vector v in a respective coordinate frame r will be expressed as v.sub.r=[v.sub.x v.sub.y v.sub.z]. In addition, a rotation matrix that relates a coordinate frame F.sub.a and a coordinate frame F.sub.b is expressed as .sup.a.sub.bR such that .sup.a.sub.bRv.sub.b=v.sub.a. From this, it will be understood that .sup.a.sub.bR.sup.b.sub.cR=.sup.a.sub.cR and that .sup.a.sub.bR=(.sup.b.sub.aR).sup.T.
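The composition and transpose identities above can be checked numerically. A minimal sketch in Python/NumPy, where a name like `a_R_b` stands for the rotation matrix .sup.a.sub.bR:

```python
import numpy as np

def rotation_z(theta):
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# a_R_b maps a vector expressed in frame b into frame a.
a_R_b = rotation_z(np.pi / 4)
b_R_c = rotation_z(np.pi / 6)

# Composition: a_R_b b_R_c = a_R_c (rotations about one axis simply add).
a_R_c = a_R_b @ b_R_c
assert np.allclose(a_R_c, rotation_z(np.pi / 4 + np.pi / 6))

# Inverse is transpose: (b_R_a) = (a_R_b)^T, so the product is the identity.
assert np.allclose(a_R_b.T @ a_R_b, np.eye(3))
```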
[0036] As detailed below, the processing unit 30 determines movement in the master frame F.sub.m and translates the movement to the display frame F.sub.d. The movement may then be translated to the camera frame F.sub.c which is related to the world frame F.sub.w. The relation of the camera frame F.sub.c to the world frame F.sub.w allows the processing unit 30 to translate the movement to the world frame F.sub.w. Then the relation between the tool frame F.sub.t and the world frame F.sub.w allows the processing unit 30 to translate the movement from the world frame F.sub.w to the tool frame F.sub.t.
[0037] To map movement of a respective one of the input handles 42 in the master frame F.sub.m to desired movement of the assigned tool 20 in the tool frame F.sub.t, the movement of the input handle 42 in the master frame F.sub.m from a time k−1 to a time k can be expressed as Δp.sub.m=[v.sub.x v.sub.y v.sub.z].
[0038] As shown, the display 44 is angled to provide a comfortable view to a clinician interfacing with the user console 40 such that the display frame F.sub.d is offset or rotated relative to the master frame F.sub.m. Specifically, the display frame F.sub.d is rotated about the X.sub.d axis at a display angle θ.sub.a of about 30° or π/6. However, in some embodiments, the display 44 may be positioned to provide a comfortable view to a clinician such that the display frame F.sub.d is substantially aligned with the master frame F.sub.m or at a different display angle θ.sub.a.
[0039] When the display frame F.sub.d is offset from the master frame F.sub.m, it is necessary to translate movement in the master frame F.sub.m to the display frame F.sub.d. To translate movements in the master frame F.sub.m to the display frame F.sub.d the rotation matrix is
such that movement Δp.sub.d in the display frame F.sub.d can be translated using the transpose of the rotation matrix as Δp.sub.d=(.sup.m.sub.dR).sup.TΔp.sub.m=.sup.d.sub.mRΔp.sub.m.
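The display rotation matrix itself is not reproduced in this excerpt. The following sketch reconstructs it as a standard rotation about the X.sub.d axis by the display angle θ.sub.a; the sign convention of the tilt is an assumption:

```python
import numpy as np

THETA_A = np.pi / 6  # display angle, about 30 degrees

def rotation_x(theta):
    """Rotation about the x-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s,  c]])

# m_R_d expresses the display frame in the master frame (tilt sign assumed).
m_R_d = rotation_x(THETA_A)
# Its transpose maps handle motion from the master frame into the display frame.
d_R_m = m_R_d.T

dp_m = np.array([0.0, 0.0, 1.0])  # handle motion along Z_m
dp_d = d_R_m @ dp_m               # the same motion expressed in the display frame
```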
[0040] In some embodiments, the display 44 is capable of displaying an image of the surgical site “S” that is flipped or rotated about the Z.sub.d axis to allow for a more comfortable manipulation of the tool 20 based on the orientation of the displayed image. In embodiments, the display 44 allows for the flipping of the displayed image by 180° with the manipulation of a single control, e.g., pushing a button or flipping a switch. To translate movements from the display frame F.sub.d to the camera frame F.sub.c, a flip rotation matrix R.sub.f for the orientation of the displayed image is applied to the movement in the display frame F.sub.d such that movement in the camera frame F.sub.c is expressed as Δp.sub.c=R.sub.fΔp.sub.d. This flipping of the displayed image by 180° may be represented by a flag such that when the image is flipped the flip flag is “true” or “1” and when the image is not flipped the flip flag is “false” or “0”. When the flip flag is false, the flip rotation matrix R.sub.f is
such that movement in a positive Z.sub.d or up on the display 44 is translated to movement of the tool 20 in a negative Y.sub.c, movement in a positive Y.sub.d or left on the display 44 is translated to movement of the tool 20 in a positive Z.sub.c, and movement in a positive X.sub.d or into the display 44 is translated to movement of the tool 20 in a positive X.sub.c. Alternatively, when the flip flag is true, the flip rotation matrix R.sub.f is
such that movement in a positive Z.sub.d or up on the display 44 is translated to movement of the tool 20 in a positive Y.sub.c, movement in a positive Y.sub.d or left on the display 44 is translated to movement of the tool 20 in a negative Z.sub.c, and movement in a positive X.sub.d or into the display 44 is translated to movement of the tool 20 in a positive X.sub.c. It is noted that movement along X.sub.d is not affected by a condition of the flip flag.
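The two flip rotation matrices are not reproduced in this excerpt, but the axis mappings described above pin them down; one consistent reconstruction is the following, where the flipped matrix differs from the unflipped one by a further 180° rotation leaving X unchanged:

```python
import numpy as np

# Columns map display-frame basis vectors (X_d, Y_d, Z_d) into the camera frame,
# following the axis mappings described in the text for the unflipped case:
#   +X_d (into display) -> +X_c, +Y_d (left) -> +Z_c, +Z_d (up) -> -Y_c.
R_F_UNFLIPPED = np.array([[1.0, 0.0,  0.0],
                          [0.0, 0.0, -1.0],
                          [0.0, 1.0,  0.0]])

# Flipping the image adds a 180-deg rotation that negates Y_c and Z_c
# while leaving the X component untouched.
R_F_FLIPPED = np.diag([1.0, -1.0, -1.0]) @ R_F_UNFLIPPED

def flip_rotation(flip_flag: bool) -> np.ndarray:
    """Select the flip rotation matrix R_f for the current flip flag."""
    return R_F_FLIPPED if flip_flag else R_F_UNFLIPPED

dp_d = np.array([0.0, 0.0, 1.0])    # "up" on the display
dp_c = flip_rotation(False) @ dp_d  # -> [0, -1, 0], i.e. -Y_c as described
```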
[0041] For the surgical robot 10 to move the tool 20, movement of the tool in the camera frame Δp.sub.c is translated to movement of the tool in the tool frame Δp.sub.t. To do so, movement of the tool in the camera frame Δp.sub.c is first translated to movement of the tool in the world frame Δp.sub.w. Movement of the tool in the world frame Δp.sub.w is then translated to movement of the tool in the tool frame Δp.sub.t.
[0042] To translate movement of a tool or camera, e.g., tool 20 or camera 56, in the world frame F.sub.w to or from a frame of another object, e.g., the camera frame F.sub.c or the tool frame F.sub.t, a relation between the world frame F.sub.w and the other frame can be used. For example, the relation between the camera frame F.sub.c and the world frame F.sub.w can be used to translate movement from the camera frame F.sub.c to the movement in the world frame F.sub.w. The relationship between the camera frame F.sub.c and the world frame F.sub.w can be determined using a setup arm system 15 utilizing lasers 19a, 19b to determine the relation between the camera and world frames F.sub.c, F.sub.w. Specifically, the setup arm system 15 includes the lasers 19a, 19b on each robot base 18 to determine a yaw angle θ.sub.yaw and a pitch angle θ.sub.pitch between the respective robot base and the ground, which is a plane of the world frame F.sub.w. For a detailed description of a suitable setup arm system and a method for determining a relation between a frame of a robot base and another frame, reference can be made to U.S. Provisional Patent Application No. 62/833,876, filed Apr. 15, 2019 (now International Patent Application Serial No. PCT/US2019/036657, filed on Jun. 12, 2019), entitled “SYSTEM AND METHOD FOR ALIGNING A SURGICAL ROBOTIC ARM”, the entire contents of which are hereby incorporated by reference.
[0043] To translate the camera frame F.sub.c to the world frame F.sub.w, the yaw and pitch rotations are applied to the world frame F.sub.w to locate the camera frame F.sub.c within the world frame F.sub.w. The order of the rotations is determinative, with rotation about the Z.sub.w axis by the yaw angle θ.sub.yaw preceding rotation by the pitch angle θ.sub.pitch about an intermediate Y.sub.i axis (not explicitly shown), which is the Y.sub.w axis rotated by the yaw angle θ.sub.yaw about the Z.sub.w axis. This provides the following rotation matrix for translating the camera frame F.sub.c to the world frame F.sub.w:
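Under the yaw-then-pitch order described above, the rotation matrix .sup.w.sub.cR composes, for intrinsic rotations, as the product of a Z rotation by the yaw angle and a Y rotation by the pitch angle. A minimal sketch:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(theta):
    """Rotation about the y-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def w_R_c(yaw, pitch):
    """Yaw about Z_w first, then pitch about the yawed (intermediate) Y axis.
    Composing intrinsic rotations right-multiplies: Rz(yaw) @ Ry(pitch)."""
    return rot_z(yaw) @ rot_y(pitch)

# Zero yaw and pitch leave the camera frame aligned with the world frame.
R = w_R_c(0.0, 0.0)
```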
[0044] In embodiments, the camera 56 is a 0° endoscope such that the rotation matrix for translating the camera frame remains as shown above. However, in some embodiments, the camera 56 may be an offset endoscope such that an additional adjustment rotation matrix R.sub.a of the offset endoscope is required to translate the world frame F.sub.w to the camera frame F.sub.c. The adjustment rotation matrix
where θ.sub.a is an adjustment angle for the offset of the camera 56. For example, when a 30° offset endoscope is used for the camera 56, the adjustment angle θ.sub.a is π/6 such that the adjustment rotation matrix
This provides that the rotation matrix for translating the camera frame F.sub.c to the world frame F.sub.w, including the adjustment rotation matrix R.sub.a, can be expressed as:
[0045] or, when a 30° offset endoscope is used for the camera 56, expressed as:
[0046] Applying the relation between the camera frame F.sub.c to the world frame F.sub.w to movement in the camera frame Δp.sub.c provides movement in the world frame as:
Δp.sub.w=.sup.w.sub.cRR.sub.aΔp.sub.c.
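The camera-to-world translation above can be sketched end to end. The axis of the adjustment rotation R.sub.a is not reproduced in this excerpt, so a rotation about the camera Y axis is assumed here purely for illustration:

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def camera_to_world(dp_c, yaw, pitch, offset_angle=0.0):
    """dp_w = w_R_c R_a dp_c. For a 0-deg endoscope R_a is the identity;
    for an offset scope, R_a's axis is assumed (Y_c) for illustration only."""
    w_R_c = rot_z(yaw) @ rot_y(pitch)
    R_a = rot_y(offset_angle) if offset_angle else np.eye(3)
    return w_R_c @ R_a @ dp_c

# A 0-deg endoscope with the camera aligned to the world frame passes motion through.
dp_w = camera_to_world(np.array([1.0, 0.0, 0.0]), yaw=0.0, pitch=0.0)
```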
[0047] Now that movement in the world frame Δp.sub.w is determined, a relation between the world frame F.sub.w and the tool frame F.sub.t is used to translate movement in the world frame Δp.sub.w to movement in the tool frame Δp.sub.t such that a rotation matrix .sup.w.sub.tR can be determined. As it is necessary to determine the rotation in tool frame F.sub.t from the world frame F.sub.w, a transpose of the .sup.w.sub.tR rotation matrix is used such that:
where the θ.sub.yaw and the θ.sub.pitch are taken from the tool frame F.sub.t relative to the world frame F.sub.w.
[0048] This provides that translating movement in the world frame Δp.sub.w to movement in the tool frame Δp.sub.t can be expressed as Δp.sub.t=(.sup.w.sub.tR).sup.TΔp.sub.w. Combining this translation with the above translations allows a total transform from the master frame F.sub.m to the tool frame F.sub.t to be expressed as a handeye transformation T.sub.handeye such that:
T.sub.handeye=(.sup.w.sub.tR).sup.T·R.sub.a·.sup.w.sub.cR·R.sub.f·.sup.d.sub.mR
As detailed above, the flip rotation matrix R.sub.f is dependent on the flip flag being either true or false to translate the display frame F.sub.d to the camera frame F.sub.c. Thus, to translate movement of an input handle 42 to movement of an assigned tool 20, the transformation provides:
Δp.sub.t=T.sub.handeyeΔp.sub.m
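The full hand-eye transform can then be composed in the printed factor order. A minimal sketch, with each factor passed in as a precomputed 3×3 rotation matrix:

```python
import numpy as np

def hand_eye_transform(w_R_t, R_a, w_R_c, R_f, d_R_m):
    """T_handeye = (w_R_t)^T R_a w_R_c R_f d_R_m, in the printed factor order,
    mapping a motion increment in the master frame into the tool frame."""
    return w_R_t.T @ R_a @ w_R_c @ R_f @ d_R_m

# With every factor the identity, the tool simply mirrors the handle motion.
I = np.eye(3)
T = hand_eye_transform(I, I, I, I, I)
dp_m = np.array([1.0, 2.0, 3.0])
dp_t = T @ dp_m  # -> [1, 2, 3]
```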
[0049] The above transformation relies on the camera 56 being stationary during movement of the input handle 42. Thus, when applying the transformation, a camera flag may be included to pause movement of the tool 20 when the camera 56 is moving, rotating, and/or zooming.
[0050] With reference to
[0051] Initially, movement of an input handle 42 of the user console 40 is determined in a master frame F.sub.m by the processing unit 30 (Step 110). The control arm 43 may include encoders and/or motors to determine movement of the input handle 42. For a detailed description of suitable encoders and motors, reference may be made to International Patent Applications No. PCT/US2017/034433, filed May 25, 2017, and PCT/US2017/035580, filed Jun. 2, 2017. The entire contents of each of these applications are hereby incorporated by reference.
[0052] Before translating movement of the input handle 42 to movement of the assigned tool 20, the processing unit 30 verifies that the camera 56, e.g., the camera frame F.sub.c, was stationary or fixed during the movement of the input handle 42 (Step 120). The processing unit 30 may include an input in the form of a camera flag that is “True” or “1” when the camera 56 is fixed and is “False” or “0” when the camera is not stationary. When the camera flag indicates that the camera is not stationary, the processing unit 30 does not translate movement of the input handle 42 and returns to a ready state for detecting movement of the input handle 42 (Step 122).
[0053] When the camera 56 is stationary, the processing unit 30 translates the movement of the input handle 42 from the master frame F.sub.m to the display frame F.sub.d of the display 44 using the rotation matrix .sup.d.sub.mR (Step 124). As detailed above, the rotation matrix .sup.d.sub.mR accounts for an offset of the display frame F.sub.d from the master frame F.sub.m. After the movement is translated to the display frame F.sub.d, the processing unit 30 translates the movement to the camera frame F.sub.c (Step 130). To translate the movement to the camera frame F.sub.c, the processing unit 30 checks a flip flag to determine if the view provided on the display 44 is flipped from the camera 56 or is not flipped with respect to the camera 56. When the view on the display 44 is flipped, the processing unit applies the flip rotation R.sub.f to the movement (Step 132) before proceeding to translate the movement to the world frame F.sub.w. When the image is not flipped from the camera 56, the processing unit 30 may proceed to translate the movement to the world frame F.sub.w, without applying the flip rotation R.sub.f or may treat the flip rotation R.sub.f as being equal to 1.
[0054] To translate the movement to the world frame F.sub.w, the processing unit 30 receives the yaw and pitch angles of an arm 12 supporting the camera 56 (Step 140). The yaw and pitch angles of the arm 12 may be determined by the setup lasers 19a, 19b of the arm 12 and provided by a base 18 supporting the arm 12 to the processing unit 30. A controller 32 within the base 18 may calculate the yaw and pitch angles of the arm 12. The yaw and pitch angles of the arm 12 are determined for the setup arm 19 and then transferred to the camera 56 by determining positions of joints of the arm 12 supporting the camera 56. With the yaw and pitch angles of the camera 56, the processing unit 30 translates the movement to the world frame F.sub.w using the rotation matrix .sup.w.sub.cR (Step 142). Once the processing unit 30 translates movement to the world frame F.sub.w, the processing unit 30 checks an offset of the camera 56 (Step 144). When the camera 56 is a zero offset camera, the processing unit 30 proceeds to translate the movement to the tool frame F.sub.t or assigns the adjustment rotation R.sub.a a value of “1”. When the offset of the camera 56 is non-zero, the processing unit 30 applies an adjustment rotation R.sub.a to the movement before proceeding to translating movement to the tool frame F.sub.t (Step 146).
[0055] With the movement in the world frame F.sub.w, the processing unit 30 receives yaw and pitch angles of the tool frame F.sub.t from a base 18 supporting the tool 20. The yaw and pitch angles of the tool frame F.sub.t are determined by the setup lasers 19a, 19b of the arm 12 and provided by the base 18 supporting the arm 12 to the processing unit 30 (Step 150). With the yaw and pitch angles of the base 18 supporting the tool 20, the processing unit 30 translates the movement to the tool frame F.sub.t using the transpose of the rotation matrix .sup.w.sub.tR (Step 152).
[0056] With the movement translated to the tool frame F.sub.t, the processing unit 30 sends control commands to the base 18 supporting the tool 20 such that motors, within the base 18 and/or an instrument drive unit, actuate the arm 12 and/or tool 20 such that the tool 20 is moved in a desired manner in response to movement of the input handle 42 in the master frame F.sub.m (Step 160) such that the tool 20 follows movement of the input handle 42. A controller 34 disposed within the base 18 may receive the control commands and actuate the arm 12 and/or the tool 20 in response to the control commands.
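Steps 110 through 160 can be summarized as a single gated pipeline. A minimal sketch, with the camera and flip flags as booleans and all rotation factors precomputed; per the step description, the unflipped case treats R.sub.f as the identity, and the flip axis (X) is an assumption:

```python
import numpy as np

# 180-deg flip rotation; the flip axis (X) is assumed for illustration.
R_FLIP = np.diag([1.0, -1.0, -1.0])

def process_handle_motion(dp_m, camera_fixed, flip_flag,
                          d_R_m, w_R_c, R_a, w_R_t):
    """Sketch of Steps 110-160: gate on the camera flag, then chain the
    frame translations from the master frame to the tool frame."""
    if not camera_fixed:                      # Steps 120/122: camera moving, ignore input
        return None
    dp_d = d_R_m @ dp_m                       # Step 124: master -> display frame
    R_f = R_FLIP if flip_flag else np.eye(3)  # Steps 130/132: apply flip only if flagged
    dp_c = R_f @ dp_d                         # display -> camera frame
    dp_w = w_R_c @ R_a @ dp_c                 # Steps 140-146: camera -> world frame
    dp_t = w_R_t.T @ dp_w                     # Steps 150-152: world -> tool frame
    return dp_t                               # Step 160: basis for motor commands
```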
[0057] Referring to
[0058] In some embodiments, the memory 302 stores data 314 and/or an application 316. In some aspects the application 316 includes a user interface component 318 that, when executed by the processor 304, causes the display device 306 to present a user interface (not shown in
[0059] While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Any combination of the above embodiments is also envisioned and is within the scope of the appended claims. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope of the claims appended hereto.