STEREO CAMERA CALIBRATION USING SCREEN BASED TARGET PROJECTIONS

20230217003 · 2023-07-06

    Abstract

    In a system and method of calibrating a stereoscopic camera, a target pattern is electronically displayed on a display. Images of the target pattern are captured using the stereoscopic camera to be calibrated while the displayed target pattern is altered over time. The captured images are analyzed and calibration parameters for the stereoscopic camera are determined.

    Claims

    1. A method of calibrating a stereoscopic camera, comprising the steps of: (a) displaying a target pattern on a display; (b) capturing images of the target pattern using a stereoscopic camera; (c) altering the displayed target pattern; (d) capturing the altered target pattern; (e) repeating steps (a) through (d) a plurality of times; and (f) analyzing the captured images to determine calibration parameters for the stereoscopic camera.

    2. The method of claim 1, wherein displaying the target pattern includes displaying a pattern having black elements and elements of a first color, and altering the displayed target pattern includes displaying a pattern having black elements and elements of a second color, the first color different from the second color.

    3. The method of claim 1, wherein the method includes displaying a sequence of target patterns, wherein patterns in the sequence vary in position on the display, angular orientation on the display, and perceived planar orientation (perspective).

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0006] FIG. 1A shows an example of a calibration system in accordance with the disclosed embodiments.

    [0007] FIG. 1B shows a screen used to display a calibration target mounted to a fixture on a portion of a robotic manipulator that also supports the camera undergoing calibration.

    [0008] FIG. 2A illustrates one example of a calibration pattern electronically displayed as part of a sequence of patterns. In this illustrated pattern, red, black, and white squares are shown.

    [0009] FIG. 2B shows left and right images of the pattern of FIG. 2A as captured from the stereo camera.

    [0010] FIG. 3A illustrates a second example of a calibration pattern electronically displayed as part of a sequence of patterns. In this illustrated pattern, blue, black, and white squares are shown.

    [0011] FIG. 3B shows left and right images of the pattern of FIG. 3A as captured from the stereo camera.

    [0012] FIGS. 4A and 4B are similar to FIGS. 3A and 3B and show patterns from yet another target sequence.

    DETAILED DESCRIPTION

    [0013] This application describes the use of calibration targets that are displayed in view of the camera to be calibrated. The displayed patterns electronically change so that operating room personnel do not need to move the targets in front of the camera.

    [0014] Referring to FIG. 1A, an embodiment of a calibration system 100 comprises:

    [0015] 1. A stereo camera 10, which may be a stereo or monocular laparoscopic/endoscopic camera of the type used during surgery. This camera is the subject of the calibration procedure. The stereo camera 10 is preferably positioned in a stable location such that it does not move during the calibration procedure. Although this invention is not limited to calibration of cameras used in robotic surgery, in the example shown in FIG. 1B, the camera 10 is one used for a robotic surgical system, and it is shown mounted to a surgical manipulator.

    [0016] 2. A target 12 in the form of a display for displaying a graphical target. The display may be a screen of a tablet computer or other image display. It is positioned in view of the camera 10. It is preferably positioned in a stable location, such as by being mounted to a fixture. In the example shown in FIG. 1B, the screen displaying the target 12 is positioned on a fixed portion of the robotic manipulator that holds the camera.

    [0017] The displayed graphical target is a changing sequence of images presented on the screen, together with the set of corner points located in each image. The software for generating and presenting the pattern images may run on the processor used to capture the images from the camera, or on a different processor. As one particular example, it may be an application running on a tablet computer that also serves as the target display. Known properties of the screen are used to calculate the planar 3D locations of the points presented on the screen (typically multiplication by a factor calculated from the screen resolution and physical size).
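    The pixel-to-physical conversion mentioned above can be sketched as follows. This is an illustrative example only; the function name and the screen dimensions are assumptions, not taken from the patent.

```python
def screen_points_to_3d(pixel_points, res_px, size_mm):
    """Map 2D pixel locations on the display to planar 3D points (in mm)
    on the screen surface (z = 0), using the physical pixel pitch derived
    from the screen resolution and physical size."""
    sx = size_mm[0] / res_px[0]  # mm per pixel, horizontal
    sy = size_mm[1] / res_px[1]  # mm per pixel, vertical
    return [(u * sx, v * sy, 0.0) for (u, v) in pixel_points]

# e.g. a hypothetical 1920x1200 tablet screen measuring 240 x 150 mm:
# the screen-center pixel maps to the physical center of the display plane
pts = screen_points_to_3d([(960, 600)], (1920, 1200), (240.0, 150.0))
```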

    [0018] 3. At least one processor 14 that receives the image data/video from the camera(s) and samples the images. Algorithms stored in memory of the computing unit(s) are executable to perform various functions, including:

    [0019] Analyzing each image captured by each camera to detect the pattern corner points, and solving the calibration problem by optimization (e.g., Levenberg-Marquardt) of the unknown parameters, using the known planar 3D locations of those points and their 2D projections in both camera images.
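    The quantity such an optimizer drives toward zero is the reprojection residual. Below is a minimal sketch of that residual under a pinhole model; the function names, the intrinsic matrix K, and the pose (R, t) values are assumptions for illustration, not details from the patent.

```python
import numpy as np

def project(K, R, t, pts3d):
    """Project planar 3D screen points into pixel coordinates using a
    pinhole model: rotate/translate into the camera frame, perspective
    divide, then apply the intrinsic matrix K."""
    cam = (R @ pts3d.T).T + t          # screen frame -> camera frame
    uv = cam[:, :2] / cam[:, 2:3]      # perspective divide
    return (K[:2, :2] @ uv.T).T + K[:2, 2]

def residuals(K, R, t, pts3d, observed2d):
    """Stacked per-point reprojection errors, the vector a
    Levenberg-Marquardt solver would minimize."""
    return (project(K, R, t, pts3d) - observed2d).ravel()

# illustrative values: screen facing the camera 1 m away
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
t = np.array([0.0, 0.0, 1000.0])
pts3d = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
observed = np.array([[640.0, 360.0], [648.0, 360.0]])
err = residuals(K, np.eye(3), t, pts3d, observed)
```

    In practice the same residual is evaluated for both cameras of the stereo pair, with the inter-camera pose as additional unknowns.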

    [0020] 4. Optionally, a means for changing the relative camera-to-screen pose so that the patterns can be projected in more than one location. This can be manual, or motorized if the camera (or screen) is mounted on a robotic arm, e.g., a laparoscopic camera held by an arm of a robotic surgical system.

    Method

    [0021] To calibrate camera 10 using the disclosed system, a changing series of patterns is projected/displayed on a screen 12 viewed by the stereo camera 10. The grid points of the projected pattern have known 2D locations on the screen, which can be converted to 3D locations on the planar surface of the screen. As can be seen by comparing the image display of FIG. 2A with that of FIG. 3A, the perspective of the grid pattern changes during the displayed sequence, as if a planar grid pattern were being tilted in various directions to change its planar orientation relative to the plane of the display. The position and rotational orientation of the grid pattern also vary throughout the sequence. See also FIGS. 4A and 4B, which show images from another target sequence.
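    One way to realize the perceived tilt described above is to warp the nominal grid corners by a homography before drawing them, so the flat display shows a pattern that appears rotated out of the screen plane. The sketch below is an assumption about how such a sequence could be generated; the grid dimensions and homography values are illustrative.

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to an Nx2 array of 2D points."""
    homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    mapped = (H @ homo.T).T
    return mapped[:, :2] / mapped[:, 2:3]            # back to 2D

# a hypothetical 7x5 grid of corners, 40 px apart
grid = np.array([[x * 40.0, y * 40.0] for y in range(5) for x in range(7)])

# identity leaves the grid unchanged; a nonzero bottom-row entry produces
# the perspective foreshortening of a tilted plane
unchanged = warp_points(np.eye(3), grid)
H_tilt = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [1e-3, 0.0, 1.0]])
tilted = warp_points(H_tilt, grid)
```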

    [0022] The locations of the points are identified by the at least one processor 14 from the image captured by each camera sensor within the stereo camera 10, imposing 3D-to-2D projection constraints. The screen and stereo camera 10 need not be moved during the projection of a predefined series of images; therefore, the 6DOF relative screen-to-camera pose is fixed throughout the projection process, which reduces the number of unknowns to a fixed number. This is simpler than standard calibration techniques using a handheld moving camera (or pattern), where the pose changes in each frame, adding six unknowns per image.
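    A back-of-the-envelope count makes the comparison concrete. The breakdown below is an assumption about how the unknowns are tallied (the patent does not give exact counts), with k intrinsic parameters per camera.

```python
def unknowns_fixed_pose(k_intrinsics, n_poses):
    """Disclosed approach: intrinsics for two cameras, one 6DOF
    inter-camera pose, plus 6DOF per distinct screen-camera pose."""
    return 2 * k_intrinsics + 6 + 6 * n_poses

def unknowns_handheld(k_intrinsics, n_frames):
    """Standard moving-target calibration: every captured frame adds
    its own 6DOF target pose."""
    return 2 * k_intrinsics + 6 + 6 * n_frames

# e.g. 9 intrinsics per camera and 30 captured frames: the handheld case
# grows with the frame count, while a single fixed pose adds only 6 in total
fixed = unknowns_fixed_pose(9, 1)
handheld = unknowns_handheld(9, 30)
```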

    [0023] The optimization problem consists of minimizing the mean squared reprojection error. The unknowns to be determined are the parameters of both cameras in the stereo setting, the relative pose of the two cameras, and the pose of the screen relative to the stereo camera. The 3D locations of the pattern points in screen coordinates are known in each frame, as are their 2D projections in both cameras. To improve accuracy, the pattern projection process may be performed in more than one relative camera-screen location. In that case, it is beneficial to group all the images projected from the same relative pose under the same 6DOF screen-camera parameters, so that there are several such parameter sets, one per relative pose. One relative screen-camera location may be used, but two or more are preferable for ensuring good results for the relative translation parameters between the two stereo cameras. In the configuration shown in FIG. 1B, a change in relative screen-camera location can be achieved by repositioning the manipulator holding the camera, or by repositioning the tablet (e.g., moving it to a different fixture on the manipulator, or, if the fixture itself has joints or other moveable features, adjusting the position of the fixture).
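    The objective named in this paragraph can be stated in a few lines. This is a minimal sketch of the cost value only (not the solver); the function name and the flat list-of-pairs representation are assumptions.

```python
def mean_squared_reprojection_error(projected, observed):
    """Mean squared reprojection error over all pattern points, pooled
    from both cameras of the stereo pair. Each argument is a list of
    (u, v) pixel coordinates in matching order."""
    n = len(projected)
    return sum((pu - ou) ** 2 + (pv - ov) ** 2
               for (pu, pv), (ou, ov) in zip(projected, observed)) / n

# e.g. one point reprojects exactly, the other is off by (1, 1) pixels
mse = mean_squared_reprojection_error([(0.0, 0.0), (1.0, 1.0)],
                                      [(0.0, 0.0), (0.0, 0.0)])
```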

    [0024] If the patterns are not projected by the same computer that captures the camera images (for example, if the screen is driven by a tablet application cyclically presenting the sequence of patterns, not synched to the computer capturing the images), data must be communicated or otherwise known to the processor 14 so that it knows which pattern is presented at any given moment, in order to expect the proper 2D locations (and 3D screen locations, via the screen properties). One method of doing this is to color code the displayed patterns in order to encode the index of the projected image. This also allows the processor 14 to easily detect an image change (via the red-blue code) from the displayed images. Thus, for an exemplary sequence, FIG. 2A illustrates the pattern shown as the ninth image of the sequence. It includes red color codes (the squares marked “R”). FIG. 3A illustrates the pattern shown as the sixteenth displayed image of the sequence. It includes blue color codes (the squares marked “B”). The color codes in this example are binary encoded in the displayed squares, replacing the black color; the codes appear only on the inner squares. The alternation between blue and red throughout the displayed sequence helps the processor quickly determine that there has been a change in the image. Note that the R's and B's shown in the drawings will not typically be part of the calibration pattern; they are used to distinguish the black squares from the colored squares in black-and-white drawings.
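    A decoder for such a binary color code might look like the sketch below. The bit ordering, square indexing, and color names are assumptions for illustration; the patent specifies only that the codes replace black on the inner squares and alternate between red and blue.

```python
def decode_frame_index(code_squares):
    """Recover the displayed image's index from the inner code squares.
    Each entry is the detected color of one code square: 'black' carries
    bit 0, and the frame's code color (red or blue) carries bit 1.
    Most significant bit first (an assumed convention)."""
    index = 0
    for color in code_squares:
        index = (index << 1) | (0 if color == "black" else 1)
    return index

# e.g. the ninth image (binary 1001) coded in red,
# and the sixteenth image (binary 10000) coded in blue
idx9 = decode_frame_index(["red", "black", "black", "red"])
idx16 = decode_frame_index(["blue", "black", "black", "black", "black"])
```

    Because consecutive frames swap between red and blue codes, the processor can detect a frame change from the code color alone, before fully decoding the index.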

    [0025] The concepts described in this application provide a number of advantages over existing calibration methods. They eliminate the need for a user to wave the pattern in front of the camera: the patterns are changed automatically, requiring little user involvement. Moreover, since the patterns are projected from a static camera-screen pose, there are more constraints on the problem and fewer unknowns, facilitating a faster and less complex calibration. In addition, once the target display screen is inside the field of view of the camera, all the projected patterns are in the field of view, whereas in standard calibration the pattern can easily be partially outside the field of view, which reduces the number of useful frames.

    [0026] All patents, applications and articles referred to herein are incorporated by reference.