CALIBRATION METHOD FOR THE AUTOMATED CALIBRATION OF A CAMERA WITH RESPECT TO A MEDICAL ROBOT, AND SURGICAL ASSISTANCE SYSTEM
20240407883 · 2024-12-12
Inventors
CPC classification
A61B2090/397
HUMAN NECESSITIES
G05B2219/39008
PHYSICS
G06T7/80
PHYSICS
G05B2219/39057
PHYSICS
International classification
A61B90/00
HUMAN NECESSITIES
Abstract
A method is used for calibrating a robot camera and external camera system relative to a medical robot. The robot camera is guided on an arm. The camera system has an external camera. The method includes: moving the robot camera via the robot arm during sensing and capturing; detecting a pose of a calibration pattern and/or an external tracker, each having a transformation to a pose of the external camera, and/or detecting a pose of the external camera; determining a transformation between the robot camera and external camera, and determining a field of view; moving a flange into at least three poses in the field of view and sensing the at least three poses via the external camera, and simultaneously sensing a transformation between the robot base and the flange; and performing a hand-eye calibration. The method can be used with a surgical assistance system and computer-readable storage medium.
Claims
1. A calibration method for an automated calibration of a robot camera in relation to a medical robot, the robot camera being movably guided on a robot flange of a robot arm, which is connected to a robot base, and for an automated calibration of an external camera system, which has at least one external camera, in relation to the medical robot, comprising the steps of: moving the robot camera by the robot arm during sensing and capturing by the robot camera; detecting, based on the capturing, a pose of an optical calibration pattern that is predefined and/or of an external tracker, each having a predefined transformation from this detected pose to a pose of the external camera, and/or detecting, based on the capturing, a pose of the external camera; determining, based on the pose of the external camera, a transformation between the robot camera and the external camera, and determining a field of view of the external camera; moving the robot flange into at least three different poses in the field of view of the external camera, and sensing the at least three poses of the robot flange via the external camera, and simultaneously sensing a transformation between the robot base and the robot flange; and carrying out, on the basis of the at least three sensed poses and the at least three sensed transformations, a hand-eye calibration, more particularly with determination of a transformation from the robot flange to the robot camera and/or of a transformation from the external camera to the robot base and/or a transformation between the robot flange and the tracker.
2. The calibration method according to claim 1, wherein, with a tracker fastened on the robot flange, the step of moving the robot flange into the at least three poses comprises the steps of: determining a transformation between the tracker and the external camera in each pose of the at least three poses; and carrying out the hand-eye calibration based on the at least three sensed transformations from the robot base to the robot flange and the at least three sensed transformations from the tracker to the external camera, more particularly with determination of a transformation between the tracker and the robot flange.
3. The calibration method according to claim 2, wherein the steps of moving the robot flange and carrying out the hand-eye calibration of the calibration method are carried out iteratively, and that, after a first round of carrying out the hand-eye calibration, the transformation between the tracker and the robot flange, the transformation between the external camera and the tracker, as well as forward kinematics of the robot are used to determine new poses of the robot flange and to move the robot flange correspondingly into these poses for a next iteration.
4. The calibration method according to claim 1, wherein the step of moving the robot camera: is carried out heuristically and/or systematically on the basis of a first transformation between the robot flange and the robot camera, more particularly on the basis of a stored first transformation on the basis of a 3D model, more particularly a CAD model; and/or is carried out on the basis of random movements until the optical calibration pattern or the external tracker or the external camera is detected.
5. The calibration method according to claim 1, wherein the step of detecting the optical calibration pattern comprises the steps of: comparing sections of the capturing of the robot camera with a stored calibration pattern and when there is conformity: determining the pose of the calibration pattern via image analysis and determining, on the basis of a stored transformation between the pose of the calibration pattern and the pose of the external camera, a pose of the external camera.
6. The calibration method according to claim 1, wherein the step of detecting a pose of the external camera comprises the steps of: comparing sections of the capturing of the robot camera with a stored geometric model of the external camera, and when detecting conformity with the stored geometric model, determining a pose of the external camera by correlating three-dimensional structures.
7. The calibration method according to claim 1, wherein the calibration method further comprises the step of a geometric calibration, more particularly prior to the step of moving the robot camera or after the step of carrying out the hand-eye calibration.
8. The calibration method according to claim 1, wherein the step of moving the robot flange into at least three different poses further comprises the step or the steps of: determining an area within the field of view of the external camera which can be sensed in a particularly accurate manner and moving the robot flange, more particularly the tracker, into this area of the field of view; and/or determining a joint configuration of the robot which allows a particularly accurate sensing of the poses, and moving into same; and/or moving the robot flange, more particularly the tracker, into at least three poses distributed in the field of view of the external camera, more particularly into those poses where an angle between the robot flange, more particularly the tracker, and the external camera may be distributed between small and large.
9. The calibration method according to claim 1, further comprising the steps of: hand-eye calibration between the robot and the external camera; and/or hand-eye calibration between the robot camera and the external camera; and/or hand-eye calibration between the robot camera and the robot and/or determination of a transformation between the tracker and the robot flange via a hand-eye calibration, wherein when all three hand-eye calibrations are carried out and hence redundant transformations exist, error minimisation is carried out.
10. A surgical navigated assistance system, comprising: at least one robot comprising a robot arm with a robot flange connected to a robot base, the robot arm being movable; a robot camera connected to the robot flange and movable via the robot arm; and an external camera system comprising at least one external camera, wherein, the robot flange and/or the robot camera being movable into a field of view of the external camera, the surgical navigated assistance system comprising a control unit adapted: to move the robot camera via the robot arm and to take and process a capturing by the robot camera; to sense, in the capturing, a pose of an optical calibration pattern and/or an external tracker, each having a predefined transformation to the external camera, and/or to determine a pose of the external camera based on the capturing; to determine, based on the pose of the external camera, a transformation between the robot camera and the external camera as well as a field of view of the external camera; to move the robot flange into at least three different poses in the field of view and to sense, via the external camera, the at least three different poses and to simultaneously sense at least three transformations between the robot base and the robot flange; and to carry out, based on the at least three different poses and the at least three transformations, a hand-eye calibration.
11. The surgical assistance system according to claim 10, wherein: the external camera is fastened on a base and additionally the optical calibration pattern is arranged at the base in a rigid manner relative to the external camera, and a static transformation between a pose of the optical calibration pattern and the pose of the external camera is stored in a storage unit and is provided to the control unit for the determination of the pose of the external camera, or the external camera is fastened on a base and the optical calibration pattern is relatively movable to the external camera which tracks the optical calibration pattern and a static transformation to the optical calibration pattern which is stored in a storage unit, wherein, based thereon, the control unit calculates a dynamic transformation from the pose of the optical calibration pattern to the pose of the external camera.
12. The surgical assistance system according to claim 10, wherein the external camera is a stereo camera for tracking, and the tracker on the robot flange is an infrared-based tracker with a plurality of infrared markers spaced apart from each other.
13. The surgical assistance system according to claim 10, wherein the robot flange, the tracker, and the robot camera are rigid with respect to each other.
14. A computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to perform the calibration method according to claim 1.
15. The surgical navigated assistance system according to claim 10, wherein the external camera tracks the optical calibration pattern via a calibration tracker with optical markers which is mounted rigidly on the calibration pattern.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0053] The present disclosure will be explained in detail in the following by means of preferred embodiments with reference to the accompanying Figures.
[0057] The Figures are of schematic nature and shall only serve the understanding of the present disclosure. Equal elements are provided with the same reference signs. The features of the various embodiments are interchangeable.
DETAILED DESCRIPTION
[0059] The surgical assistance system 1 has a robot 2 with a controllable and movable robot arm 4 which comprises an end portion with a robot flange 6. An end effector 8, for instance, in the form of grasping forceps or, as in the instant case, a rod is mounted on this robot flange 6 so as to manipulate an object with the end effector 8. For sensing the object and for a corresponding control of the robot, a camera 10 on the robot 2 (eye-in-hand; in the following called robot camera) is mounted on the robot flange 6, the field of view of which points in the direction of the end effector 8, for serving as an eye of the robot 2 and for optically sensing in particular a portion of the end effector 8. In this manner, objects may be sensed and, after the calibration of robot camera 10 to robot 2, also be controlled and manipulated appropriately. Like the end effector 8, the robot camera 10 may also be controlled and moved. Moreover, a tracker 12 in the form of a geometric tracker with four marking points spaced apart from each other is provided on the robot flange 6 so as to sense the tracker 12 and hence the robot flange 6, by means of an external camera system 14 with an external camera 16, in a particularly precise manner spatially with respect to a pose, i.e. a position and an orientation.
[0060] The external camera 16 on a static base 18 is directed to the robot 2 and senses, when the robot arm 4 with the tracker 12 and the robot camera 10 is moved in correspondence with its kinematics into the field of view of the external camera 16, the robot flange 6, the tracker 12, and the robot camera 10.
[0061] The surgical assistance system 1 comprises, for a function and/or configuration of an automated calibration, furthermore a control unit 20 which is specifically adapted to carry out an automatic calibration between the robot camera 10, the external camera 16, and the robot 2.
[0062] In contrast to the state of the art, not only a camera is thus provided, be it a camera on the robot or an external camera, but specifically two cameras 10, 16 are provided, namely a controllable and actively guidable robot camera 10 on the robot 2 itself and an external camera 16 which is statically fastened on the base 18 and does not move along with the robot 2.
[0063] A plane face with an imprinted optical calibration pattern 22 in the form of a chequered pattern is fastened on the base 18 below the external camera and has a defined pose relative to the external camera 16. Alternatively, the optical calibration pattern may also be displayed by means of a monitor. Specifically, the chequered pattern comprises individual quadrangles with further markings. This optical calibration pattern 22 serves for a particularly easy detection of a pose.
[0064] The control unit 20 is specifically adapted to move, in a first step, more particularly after a coarse positioning of the robot 2 in a field of view of the external camera 16 (eye-on-base (camera)), the robot camera 10 fastened on the robot flange 6 randomly in space by means of random movements. In this process, the robot camera 10 continuously senses the environment and/or takes a continuous (video) capturing A and provides same in a computer-readable manner to the control unit 20. The control unit 20 in turn analyses this capturing A so as to detect the external camera 16 and the pose thereof.
[0065] Concretely, the control unit 20 is adapted to detect, in the capturing A of the robot camera 10, the optical calibration pattern 22 which is also stored in a storage unit 24 and provided to the control unit 20. The optical calibration pattern 22 is particularly easy to detect since it may be arranged in any, but predefined, relation to the external camera 16. For instance, the optical calibration pattern may be arranged at a height above 1.5 m, so that it is not completely concealed by medical professionals or by waist-high objects. Moreover, it is a plane face.
[0066] On the basis of the optical calibration pattern 22 sensed by the robot camera 10, the control unit 20 then determines a pose of this optical calibration pattern 22. Since the calibration pattern 22 is represented in the capturing A in a distorted, but recalculable, manner depending on an angle between a normal of the plane surface and a direct connection line between the robot camera 10 and the calibration pattern 22, a pose, i.e. a position and an orientation, may be determined by means of common methods of image analysis.
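The disclosure does not specify which "common methods of image analysis" are used. One standard approach for a planar pattern with known camera intrinsics is to estimate the plane-to-image homography and decompose it into a rotation and translation. The following NumPy-only sketch illustrates that approach on synthetic data; all function names and parameter values are my own, not from the disclosure:

```python
import numpy as np

def homography(obj_xy, img_uv):
    """DLT estimate of the homography mapping pattern-plane points to pixels."""
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    # The homography is the right null vector of the stacked constraint matrix.
    return np.linalg.svd(np.asarray(A, float))[2][-1].reshape(3, 3)

def planar_pose(obj_xy, img_uv, K):
    """Pose (R, t) of a planar pattern, given intrinsics K and correspondences."""
    B = np.linalg.inv(K) @ homography(obj_xy, img_uv)
    if B[2, 2] < 0:                      # pick the solution with the pattern in front
        B = -B
    lam = 1.0 / np.linalg.norm(B[:, 0])  # the homography is only defined up to scale
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # project onto the rotation group
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt, t
```

In practice the pixel correspondences would come from detecting the corners of the chequered pattern in the capturing A; here they would simply be handed to `planar_pose` together with the camera matrix.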
[0067] In the storage unit 24, apart from the optical calibration pattern 22, also a transformation, here a transformation matrix, between the pose of the calibration pattern 22 and the pose of the external camera 16 is stored. It may also be said that a transformation matrix between a local coordinate system (COS) of the calibration pattern 22 and the local COS of the external camera 16 is stored. The control unit 20 now determines, on the basis of the sensed pose of the calibration pattern 22 in combination with the stored transformation, the pose of the external camera 16 and can thus calculate a (coarse) transformation between the external camera 16 and the robot camera 10, which may even be refined further. The individual local coordinate systems and/or groups are illustrated as dashed boxes in the Figures.
[0068] In other words, a transformation between the eye-on-base camera and the eye-in-hand camera as well as a field of view of the external camera 16 is thus known. In order to continue with the calibration, a coarse transformation between the robot camera 10 (eye-in-hand camera) and the robot flange 6 is helpful. In the storage unit 24, a coarse 3D model (CAD model) is stored for this purpose, from which the control unit 20 determines a first coarse estimation of such a transformation.
[0069] On the basis of the determined transformation between the external camera 16 and the robot camera 10, optionally the first coarse transformation between the robot flange 6 and the robot camera 10, and the known field of view of the external camera 16, all necessary data are available for moving the robot camera 10 into the field of view of the external camera 16 and for carrying out a calibration.
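The determinations above amount to chaining 4x4 homogeneous transformation matrices between the involved coordinate systems. A minimal sketch of the two primitive operations (variable and function names are illustrative, not from the disclosure):

```python
import numpy as np

def compose(*Ts):
    """Chain homogeneous transforms left to right, e.g. T_a_c = compose(T_a_b, T_b_c)."""
    out = np.eye(4)
    for T in Ts:
        out = out @ T
    return out

def invert(T):
    """Invert a rigid transform using the transpose of R instead of a general inverse."""
    Ti = np.eye(4)
    R, t = T[:3, :3], T[:3, 3]
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Example chain from the description: pose of the external camera in the
# robot-camera frame from the detected pattern pose and the stored
# pattern-to-camera transform (names are assumptions):
# T_robotcam_extcam = compose(T_robotcam_pattern, T_pattern_extcam)
```

The same two operations also yield the coarse robot-camera-to-flange relation from the stored CAD-based transform and the forward kinematics.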
[0070] In other words, with the transformation between the eye-in-hand camera and the eye-on-base camera in combination with a known field of view of the eye-on-base camera, it is possible to move the eye-in-hand camera, by means of the robot, within the field of view of the eye-on-base camera. Thus, so to speak, an optical framework is defined within which the robot camera 10 can be moved.
[0071] In a next step, the solution of the problem of the hand-eye calibration takes place. For this step, the tracker 12 is disposed close to the robot flange 6 of the robot 2, so that it may be assumed that the tracker 12, being located next to the robot camera 10 and the robot flange 6, is likewise moved into the field of view of the external camera 16. Alternatively, a coarse pose of the tracker 12 relative to the robot flange 6 may also be known, in particular from a CAD model stored in the storage unit 24.
[0072] For a hand-eye calibration the tracker 12 fastened on the robot flange 6 of the robot 2 is, by means of the control unit 20, moved along with the robot arm 4 in the known field of view of the external camera 16.
[0073] The control unit 20 uses stored and sensed data to calculate the poses for the robot 2 and hence for the tracker 12. The poses of the tracker 12 relative to the external camera 16 are generated such that a hand-eye calibration error is minimised. For the calculation of the poses, especially the following data are taken into account: a most accurate area within the field of view of the external camera 16; a joint configuration of the robot 2 with which the best accuracy is to be expected; a distribution of the poses of the tracker 12 within the entire field of view of the external camera 16, so as to sense poses as different as possible; and/or a distribution of the poses of the tracker 12 within the field of view of the external camera 16, so that angles between the tracker 12 and the external camera 16 may be selected and controlled to vary between small and large. The control unit 20 determines at least three poses for the robot flange 6 with the tracker 12.
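One simple way to distribute candidate tracker positions across the field of view — here modelled as a cone around the optical axis of the external camera — could look as follows. The parameters (depth range, half-angle, area-uniform radial sampling) are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def candidate_positions(n, z_range=(0.8, 2.0), half_angle_deg=25.0, seed=0):
    """Sample n positions inside a cone model of the external camera's field of view.

    Positions are expressed in the camera frame, z along the optical axis.
    """
    rng = np.random.default_rng(seed)
    z = rng.uniform(*z_range, n)                 # depth along the optical axis
    # sqrt gives an area-uniform radial distribution inside the cone cross-section
    r = z * np.tan(np.radians(half_angle_deg)) * np.sqrt(rng.uniform(0.0, 1.0, n))
    phi = rng.uniform(0.0, 2.0 * np.pi, n)       # azimuth around the axis
    return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])
```

A pose generator as described above would additionally attach orientations with varying tracker-to-camera angles and discard configurations that the robot's joint limits or accuracy criteria rule out.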
[0074] Subsequently, the control unit 20 controls the robot 2 such that it is moved into the calculated poses. In this process, data (samples) both for a transformation between the robot flange 6 and a robot base 26 and for a transformation between the external camera 16 and the tracker 12 on the flange are collected.
[0075] Preferably, in an optional step after the sensing of at least three poses, an intermediate hand-eye calibration may be calculated with known methods, more particularly the Tsai-Lenz algorithm. This intermediate hand-eye calibration comprises a transformation between the tracker 12 and the robot flange 6. Using the transformation from the external camera 16 to the tracker 12, the transformation from the tracker 12 to the robot flange 6, and the forward kinematics of the robot 2, the known transformation and/or pose between the external camera 16 and the robot base 26 may be updated with a more exact transformation. In order to collect further (random) samples for an even more exact hand-eye calibration, the afore-calculated poses are now recalculated with the new transformation. Alternatively or additionally, new poses for the robot 2 may also be calculated with the foregoing approach. The robot 2 then continues with the movement of the tracker 12, and appropriate random samples are collected.
[0076] Finally, the control unit 20 then calculates the hand-eye calibration by means of known methods, more particularly the Tsai-Lenz algorithm, on the basis of the samples sensed. The control unit 20 calculates in particular the transformation between the robot camera 10 and the robot flange 6 and provides a calibration and/or registration between the robot camera 10 and the robot 2 as well as a calibration between the external camera 16 and/or the tracker and the robot flange 6 and hence the robot 2. Thus, the problem of hand-eye calibration has been solved.
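The Tsai-Lenz algorithm itself is more involved; the following NumPy sketch solves the same AX = XB problem with a simpler linear two-step scheme (rotation from the alignment of the relative-motion rotation axes via SVD, translation by least squares). It is meant only to illustrate the structure of the computation from synthetic motion pairs; it is not the implementation from the disclosure, and all names are my own:

```python
import numpy as np

def rodrigues(axis, angle):
    """Axis-angle to rotation matrix (Rodrigues' formula)."""
    k = np.asarray(axis, float)
    k = k / np.linalg.norm(k)
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def log_so3(R):
    """Rotation matrix to axis-angle vector (generic case, 0 < angle < pi)."""
    angle = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return angle * w / (2.0 * np.sin(angle))

def hand_eye(As, Bs):
    """Solve A_i X = X B_i for 4x4 rigid motions A_i (camera side), B_i (robot side)."""
    # Rotation: the axis-angle vectors satisfy log(R_A) = R_X log(R_B);
    # the best-aligning rotation is found by SVD (Kabsch).
    M = sum(np.outer(log_so3(A[:3, :3]), log_so3(B[:3, :3])) for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    # Translation: stack (R_A - I) t_X = R_X t_B - t_A and solve least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R, t
    return X
```

At least two motion pairs with non-parallel rotation axes are required for a unique solution, which is consistent with the at least three sensed poses demanded above.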
[0078] In addition to the afore-described configuration, the robot camera 10 may optionally also be calibrated geometrically. This geometric calibration may be carried out optionally by the control unit 20 and is only necessary if the robot camera 10 and/or the external camera 16 are also to be calibrated geometrically. The geometric calibration is independent of the foregoing hand-eye calibration.
[0079] For this step, a camera calibration algorithm is used. For the geometric camera calibration, again a calibration pattern is required. This calibration pattern may especially be the optical calibration pattern 22. For the geometric calibration, the calibration pattern has to be placed in the field of view of the camera, and the calibration takes place with the aid of the known geometry of the calibration pattern and a corresponding calibration algorithm, so as to calculate (optical) camera parameters such as focal length and distortion coefficients from the distortions sensed. It is crucial that the calibration pattern is moved relative to the camera, either by displacing the calibration pattern or by moving the camera.
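The camera parameters mentioned — focal length and distortion coefficients — enter a standard pinhole projection model roughly as follows. This is a simplified sketch with two radial coefficients only (a real calibration typically also fits tangential terms); the function name and values are assumptions for illustration:

```python
import numpy as np

def project(points_cam, fx, fy, cx, cy, k1, k2):
    """Project 3D points given in the camera frame with a pinhole model
    plus two radial distortion coefficients (k1, k2)."""
    x = points_cam[:, 0] / points_cam[:, 2]   # normalised image coordinates
    y = points_cam[:, 1] / points_cam[:, 2]
    r2 = x ** 2 + y ** 2
    d = 1.0 + k1 * r2 + k2 * r2 ** 2          # radial distortion factor
    return np.column_stack([fx * d * x + cx, fy * d * y + cy])
```

A geometric calibration algorithm inverts this model: given many views of the pattern with known geometry, it searches for the parameter set that minimises the reprojection error between projected and detected pattern points.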
[0080] In contrast to the state of the art, however, in the present disclosure the geometric camera calibration is combined with the hand-eye calibration during the movement of the robot. In this scenario the calibration pattern is fixed on the base.
[0081] The control unit 20 is adapted to carry out the following partial steps. In a first step, the robot camera 10 is again moved by means of the robot arm 4, for instance, systematically, on the basis of heuristics starting out from a coarse position of the calibration pattern relative to the robot camera (by means of a known coarse transformation between the robot camera 10 and the robot flange 6), or again with the aid of random movements.
[0082] The robot camera 10 again senses the environment and is adapted to detect the calibration pattern 22 with the aid of image processing and object detection. Once the (geometric) calibration pattern 22 has been detected, the coarse transformation between the robot camera 10 and the (geometric) calibration pattern 22 is known. Using this transformation and a known (coarse) transformation between the robot camera 10 and the robot flange 6 as well as forward kinematics, the transformation between the calibration pattern 22 and the robot base 26 can be determined by the control unit 20.
[0083] The control unit 20 is adapted to place, in a subsequent step, the calibration pattern 22 for calibration in various positions in the remote and close areas of the robot camera 10. Moreover, the control unit 20 is adapted to also place the calibration pattern 22 in various positions, so that it appears on every side and in every corner as well as in the middle of the camera image of the robot camera 10. Moreover, the control unit 20 is adapted to incline the calibration pattern 22 in relation to the robot camera 10. In a traditional approach, these steps have to be performed manually. In accordance with the present disclosure, the poses of the robot 2 (and hence of the robot camera 10 with respect to the calibration pattern 22) are calculated with the aid of the known transformations, so that the calibration pattern 22 appears at any of the afore-described positions of the capturing A taken by the robot camera 10 and may be processed appropriately by the control unit 20 so as to carry out the geometric calibration. The robot 2 is, controlled by the control unit 20, moved into each of these positions, and the robot camera 10 takes capturings A (images) of the calibration pattern 22 in each of these positions. Subsequently, the control unit 20 calculates the geometric calibration of the robot camera 10 by means of the images taken.
[0086] In a first step S1, a robot camera as a camera connected to a robot arm and adapted to be moved by it (eye-in-hand) is moved in space, and a continuous capturing is taken by the camera and a scene and/or environment is sensed.
[0087] In a step S2, on the basis of the capturing, the pose of a predefined optical calibration pattern and/or of an external tracker, each having a predefined transformation to a pose of the external camera, is detected, and/or a pose of the external camera is detected directly.
[0088] In a step S3, on the basis of the ascertained pose of the external camera, a transformation between the robot camera and the external camera and a field of view of the external camera is determined.
[0089] In a step S4, the robot flange with a tracker mounted on the robot flange is moved into at least three different poses in the field of view of the external camera, and the at least three poses of the tracker are sensed by the external camera, and simultaneously a transformation between the robot base and the robot flange is sensed.
[0090] Finally, in step S5, a hand-eye calibration with determination of a transformation to the external camera takes place on the basis of the at least three sensed poses and the at least three transformations between the robot base and the robot flange.
LIST OF REFERENCE SIGNS
[0091] 1 surgical assistance system
[0092] 2 robot
[0093] 4 robot arm
[0094] 6 robot flange
[0095] 8 end effector
[0096] 10 robot camera
[0097] 12 tracker
[0098] 14 external camera system
[0099] 16 external camera
[0100] 18 base
[0101] 20 control unit
[0102] 22 calibration pattern
[0103] 24 storage unit
[0104] 26 robot base
[0105] 28 calibration tracker
[0106] A capturing robot camera
[0107] S1 Step moving of robot camera
[0108] S2 Step detecting of pose of external camera
[0109] S3 Step determining of transformation and field of view of external camera
[0110] S4 Step moving of tracker on the robot flange into several poses in the field of view
[0111] S5 Step carrying out of hand-eye calibration