Technique For Determining A Need For A Re-Registration Of A Patient Tracker Tracked By A Camera System
20230225796 · 2023-07-20
CPC classification: A61B2034/2068 · A61B34/20 · A61B2090/3983 (HUMAN NECESSITIES)
Abstract
A technique for determining a need for a re-registration of an optical patient tracker with medical image data of a patient is presented. A camera system is configured to generate camera image data for tracking the tracker. The camera system comprises an acceleration sensor configured to generate inertial data indicative of an acceleration of the camera system. A method implementation of the technique comprises the following steps performed by a processor: receiving image data from the camera system and analyzing the received image data for a positional change of the tracker indicative of at least one of a drift of the tracker and an impact on the tracker; receiving inertial data acquired by the acceleration sensor and analyzing the received inertial data, or data derived therefrom, with respect to at least one first predetermined condition indicative of an impact on the camera system; and generating, in case a positional change of the tracker indicative of at least one of a drift of the tracker and an impact on the tracker is identified based on the image data and the at least one first predetermined condition is not fulfilled, at least a re-registration signal.
Claims
1. A method for determining a need for a re-registration of a tracker attached to a patient with medical image data of the patient, wherein a camera system is configured to generate camera image data for tracking the tracker, the camera system comprising a first acceleration sensor configured to generate inertial data indicative of an acceleration of the camera system, the method comprising the following steps performed by a processor: receiving image data from the camera system; analyzing the received image data for a positional change of the tracker indicative of at least one of i) a drift of the tracker; and ii) an impact on the tracker; receiving, from the first acceleration sensor, inertial data; analyzing the received inertial data, or data derived therefrom, with respect to at least one first predetermined condition indicative of an impact on the camera system; and generating, in case a positional change of the tracker indicative of at least one of a drift of the tracker and an impact on the tracker is identified based on the image data and the at least one first predetermined condition is not fulfilled, a re-registration signal.
2. The method according to claim 1, wherein the tracker comprises a second acceleration sensor configured to generate inertial data indicative of an acceleration of the tracker, the method further comprising: receiving, from the second acceleration sensor of the tracker, inertial data; analyzing the received inertial data, or data derived therefrom, with respect to at least one second predetermined condition indicative of at least one of i) the drift of the tracker; and ii) the impact on the tracker, wherein the re-registration signal is generated in case a positional change of the tracker indicative of at least one of the drift of the tracker and the impact on the tracker is identified based on the image data and the at least one first predetermined condition is not fulfilled, while the at least one second predetermined condition is fulfilled.
3. The method according to claim 1, wherein the at least one first predetermined condition comprises a threshold decision.
4. The method according to claim 3, wherein the re-registration signal is generated when the received inertial data are indicative of an acceleration above a decision threshold of at least 5 m/s².
5. The method according to claim 1, wherein the step of analyzing the received image data for a positional change of the tracker comprises: deriving a movement pattern of the positional change from the received image data; and comparing the derived movement pattern to at least one predetermined movement pattern, wherein the re-registration signal is generated in case the positional change of the tracker is indicative of the predetermined movement pattern.
6. The method according to claim 1, wherein the at least one first predetermined condition is indicative of at least one predetermined movement pattern, and wherein the step of analyzing the received inertial data, or the data derived therefrom, with respect to the at least one first predetermined condition indicative of an impact on the camera system comprises: deriving a movement pattern from the received inertial data; and comparing the derived movement pattern to the at least one predetermined movement pattern.
7. The method according to claim 6, wherein the at least one predetermined movement pattern is indicative of a damped oscillation.
8. The method according to claim 1, further comprising triggering a re-registration notification based on the re-registration signal.
9. The method according to claim 8, wherein the re-registration notification is at least one of an acoustic notification and an optical notification.
10. The method according to claim 8, wherein a notification device is configured to receive at least the re-registration signal and output the re-registration notification.
11. The method of claim 10, wherein the notification device is part of the camera system.
12. The method according to claim 9, wherein a notification device is configured to receive at least the re-registration signal and output the re-registration notification, and, optionally, wherein the notification device is part of the camera system.
13. The method according to claim 1, wherein a tracker coordinate system associated with the tracker or image data thereof has been registered with a medical image coordinate system associated with the medical image data of the patient, and wherein at least the re-registration signal triggers one of i) re-registering the tracker coordinate system with the medical image coordinate system; and ii) suggesting the re-registration.
14. The method according to claim 1, wherein at least the tracker is imaged in image data continuously generated by the camera system, the method further comprising visualizing the image data at least for a point in time corresponding to a detected positional change of the tracker.
15. The method according to claim 1, wherein the received inertial data and the received image data are each associated with time stamps, and wherein the analyzed image data are associated with corresponding analyzed inertial data based on the time stamps.
16. The method according to claim 1, wherein the first acceleration sensor is configured to measure a gravity vector at its position.
17. The method according to claim 16, further comprising: creating a coordinate system based on the measured gravity vector and a tracker position, wherein the step of analyzing the received inertial data, or data derived therefrom, with respect to the at least one first predetermined condition indicative of an impact on the camera system comprises: verifying a positional change of the tracker based on the created coordinate system.
18. A computer program product comprising a non-transitory computer-readable medium storing instructions configured to be executed on one or more processors to perform the steps of: receiving image data from the camera system; analyzing the received image data for a positional change of the tracker indicative of at least one of i) a drift of the tracker; and ii) an impact on the tracker; receiving, from the first acceleration sensor, inertial data; analyzing the received inertial data, or data derived therefrom, with respect to at least one first predetermined condition indicative of an impact on the camera system; and generating, in case a positional change of the tracker indicative of at least one of a drift of the tracker and an impact on the tracker is identified based on the image data and the at least one first predetermined condition is not fulfilled, a re-registration signal.
19. A data processing system for determining a need for a re-registration of a tracker attached to a patient with medical image data of the patient, wherein a camera system is configured to generate camera image data for tracking the tracker, the camera system comprising an acceleration sensor configured to generate inertial data indicative of an acceleration of the camera system, the data processing system comprising a processor configured for: receiving image data from the camera system; analyzing the received image data for a positional change of the tracker indicative of at least one of i) a drift of the tracker; and ii) an impact on the tracker; receiving, from the acceleration sensor, inertial data; analyzing the received inertial data, or data derived therefrom, with respect to at least one first predetermined condition indicative of an impact on the camera system; and generating, in case a positional change of the tracker indicative of at least one of a drift of the tracker and an impact on the tracker is identified based on the image data and the at least one first predetermined condition is not fulfilled, at least a re-registration signal.
20. The data processing system of claim 19, further comprising the camera system, wherein at least the tracker is imaged in camera image data continuously generated by the camera system.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0033] Further features and advantages of the method, the computer program product and the data processing system presented herein are described below with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0048] In the following description, for purposes of explanation and not limitation, specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details.
[0049] The same reference numerals are used to denote the same or similar components.
[0051] The tracker 100 is attached to a portion of a patient anatomy 200, e.g., to a vertebra 210 of the patient's spine. In some variants, the tracker 100 is clamped to a spinal process of the vertebra 210. In other variants, the tracker 100 is configured to be attached (e.g., via an adhesive or otherwise) to a skin surface.
[0053] The camera system 300 comprises at least one acceleration sensor 310. The at least one acceleration sensor 310 may be configured as, or comprised by, an IMU. The at least one acceleration sensor 310 may be integrated into the camera system 300 so that a movement of the camera system 300 reflected in the image data of the camera system 300 can be detected in inertial data of the at least one acceleration sensor 310. The at least one acceleration sensor 310 may be integrated into an optical component of the camera system 300 or in a structure (e.g., a stand) mechanically supporting the optical component.
[0054] The acceleration sensor 310 of the camera system is configured to generate inertial data indicative of an acceleration of the camera system 300. In some implementations, the camera system 300 comprises at least one of an accelerometer and a gyroscope, e.g., 3 accelerometers and/or 3 gyroscopes (i.e., multiple acceleration sensors 310). In some implementations, the inertial data are indicative of acceleration in multiple DOFs (e.g., in at least 3 translatory DOFs, or in at least 3 rotatory DOFs, or in combined 6 DOFs). The inertial data indicative of acceleration in multiple DOFs are acquired, for example, by a multi-axis accelerometer or by a combination of multiple single-axis accelerometers.
[0055] Two- or three-dimensional medical image data of the patient anatomy 200 is provided. The medical image data are associated with a coordinate system COS_medical image. The medical image data may have been previously generated, for example via a medical imaging modality such as MRI, ultrasound imaging, X-ray projection techniques, angiography or CT.
[0057] The coordinate systems COS_camera and COS_tracker are related by a known or at least derivable coordinate transformation T (and its inverse transformation T^−1). The coordinate transformation T is, for example, derivable based on an at least temporarily fixed position between the camera system 300 and the tracker 100. The transformation T may continuously be updated as the patient anatomy 200 with the tracker 100 is moved relative to the camera system 300 in an intentional manner.
[0058] Since the coordinate systems COS_tracker and COS_camera are related by the transformation T, each of the coordinate systems COS_tracker and COS_camera is suited to serve as a first coordinate system in an initial registration process for registering the first coordinate system with the medical image coordinate system COS_medical image. While both coordinate systems are suited to serve as the first coordinate system in the initial registration process, in practice only one registration is needed. In this regard, COS_tracker is chosen as the first coordinate system in the following description.
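The chain of coordinate systems can be sketched with 4×4 homogeneous matrices. This is only an illustration of the composition logic described above; the concrete rotation and translation values of T and R below are hypothetical, not taken from the description:

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    M = np.eye(4)
    M[:3, :3] = rotation
    M[:3, 3] = translation
    return M

# Hypothetical transform T: COS_tracker -> COS_camera (90 deg about z, small offset).
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T = make_transform(Rz, np.array([0.1, 0.0, 0.5]))

# Hypothetical registration R: COS_tracker -> COS_medical_image.
R = make_transform(np.eye(3), np.array([-0.2, 0.3, 0.0]))

# A point observed in camera coordinates is carried into medical image
# coordinates via the inverse of T followed by the registration R.
p_camera = np.array([0.0, 0.1, 0.5, 1.0])
p_medical = R @ np.linalg.inv(T) @ p_camera
```

An invalidated registration R is exactly what makes the re-registration discussed below necessary: the matrix no longer maps tracker coordinates to the correct medical image coordinates.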
[0059] The initial registration process may be performed in various ways, for example by touching anatomical features of the vertebra 210 with a tracked pointer tool (not shown) and matching the point cloud thus obtained in COS_camera with corresponding vertebra surface information as detected in the medical image data associated with COS_medical image.
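The paired-point portion of such a registration process can be illustrated with the Kabsch algorithm. This is a sketch of one common approach (not necessarily the one used here); the point sets are synthetic and the recovered rotation/translation are only checked against the values used to generate them:

```python
import numpy as np

def kabsch(P: np.ndarray, Q: np.ndarray):
    """Least-squares rotation and translation aligning point set P onto Q
    (both N x 3, with known correspondences)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    Rm = Vt.T @ D @ U.T
    t = cQ - Rm @ cP
    return Rm, t

# Hypothetical landmark points touched with a pointer (in COS_camera) and the
# corresponding surface points detected in the medical image data.
rng = np.random.default_rng(0)
pts_camera = rng.normal(size=(6, 3))
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
true_t = np.array([10.0, -2.0, 3.0])
pts_medical = pts_camera @ true_R.T + true_t

R_est, t_est = kabsch(pts_camera, pts_medical)
```

In practice the point cloud obtained with the pointer tool would additionally be matched against the vertebra surface (e.g., with an ICP-style refinement), which is omitted here.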
[0060] During surgery, there are different kinds of accelerations possibly acting on the patient tracker 100, and these accelerations are associated with different kinds of movements of the tracker 100. For example, the tracker 100 may be accelerated intentionally, e.g., when a surgeon moves the patient anatomy 200 together with the tracker 100, or when an operating table the patient is lying on is moved. Further, the tracker 100 may be accelerated due to a positional drift of the tracker, e.g., as the tracker 100 is clamped to the patient and a clamping force is not sufficient to fixedly attach the tracker 100 to the patient over an extended period of time in view of gravitational forces acting on the tracker. Still further, the tracker 100 may unintentionally be bumped against by a surgeon or a robot, i.e., there may be an acceleration due to an impact on the tracker 100.
[0061] These or other tracker accelerations may lead to a relative movement between the tracker 100 and the patient anatomy 200, in particular the vertebra 210 the tracker 100 is attached to. As a result, the initial registration R is rendered incorrect and a re-registration of COS_tracker with COS_medical image is necessary (e.g., for ensuring correct navigation of a tracked surgical tool by a surgeon or robot).
[0063] The method comprises a step 410 of receiving image data generated by the camera system 300 and a step 420 of analyzing the received image data for a positional change of the tracker 100 that is indicative of at least one of a drift of the tracker and an impact on the tracker, i.e., for a relative movement between the tracker 100 and the camera system 300. The step 420 of analyzing the received image data for a positional change of the tracker 100 comprises in some variants deriving a movement pattern of the positional change from the received image data. The step 420 may further comprise comparing the derived movement pattern to at least one predetermined movement pattern. The predetermined movement pattern may be a damped oscillation (optionally having an amplitude exceeding a predefined amplitude threshold).
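The comparison against a damped-oscillation pattern could be sketched with a simple peak heuristic. This is purely illustrative: the function name, the thresholds, and the decaying-extrema criterion are my assumptions, not taken from the description:

```python
import numpy as np

def looks_like_damped_oscillation(signal, min_amplitude=2.0, min_peaks=3,
                                  decay_tol=0.95):
    """Heuristic check whether a 1-D displacement trace (e.g., a pixel
    coordinate of the tracker center over time) resembles a damped
    oscillation: several local extrema of decreasing magnitude, the first
    one above an amplitude threshold (in pixels)."""
    s = np.asarray(signal, dtype=float)
    peaks = []
    for i in range(1, len(s) - 1):
        # a sign change of the discrete slope marks a local extremum
        if (s[i] - s[i - 1]) * (s[i + 1] - s[i]) < 0:
            peaks.append(abs(s[i]))
    if len(peaks) < min_peaks or peaks[0] < min_amplitude:
        return False
    # successive extrema must shrink (within a tolerance), i.e., the
    # oscillation is damped rather than sustained
    return all(b <= a * decay_tol for a, b in zip(peaks, peaks[1:]))

t = np.linspace(0.0, 1.0, 200)
trace = 5.0 * np.exp(-4.0 * t) * np.cos(40.0 * t)   # decaying oscillation
flat = np.zeros(200)                                # no movement at all
```

A production system would likely use a more robust fit (e.g., of an exponential envelope), but the decision structure — derive a pattern, compare to a predetermined pattern — is the same.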
[0064] Analyzing the received image data for a positional change of the tracker 100 may comprise determining first and second pixel coordinates of a center of at least one of the tracker 100 and each of the one or more markers of the tracker 100. The first pixel coordinates may be determined from image data taken in a situation without any movement of the tracker 100 or the camera system 300, e.g., directly after the initial registration process. The second pixel coordinates may be determined from the image data received in step 410. A difference between the first and second pixel coordinates may be indicative of a positional change of the tracker 100. Based on the magnitude of the difference and/or the duration over which the indicated positional change takes place, the positional change of the tracker 100 may be indicative of at least one of a drift of the tracker and an impact on the tracker 100.
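The drift-versus-impact distinction based on pixel coordinates could be sketched as follows. The classification rule (a large frame-to-frame jump suggests an impact, a large offset accumulated in small steps suggests a drift) and all pixel thresholds are illustrative assumptions:

```python
import numpy as np

def classify_positional_change(ref_center, centers, jump_tol=3.0, total_tol=1.0):
    """Classify a trajectory of tracker-center pixel coordinates relative to
    a reference position taken right after the initial registration.
    Thresholds are in pixels and purely illustrative."""
    c = np.asarray(centers, dtype=float)
    # offset of each frame's center from the reference position
    offsets = np.linalg.norm(c - np.asarray(ref_center, dtype=float), axis=1)
    # frame-to-frame displacement magnitudes
    steps = np.linalg.norm(np.diff(c, axis=0), axis=1)
    if steps.size and steps.max() > jump_tol:
        return "impact"      # sudden jump between consecutive frames
    if offsets[-1] > total_tol:
        return "drift"       # slow, cumulative displacement
    return "stable"
```

Either classification would feed into the decision of step 450 below; only "stable" leaves the registration untouched.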
[0065] It has been observed that movement of the tracker 100 and movement of the camera system 300 may result in similar image data changes (i.e., the image data alone cannot reveal whether the tracker 100 has moved relative to the camera system 300 or vice versa). To address, and possibly resolve, this ambiguity, at least the inertial data generated by the acceleration sensor 310 are received in step 430. The inertial data may be acquired in one or more DOFs. The inertial data may be sensory data as generated by the (at least one) acceleration sensor 310.
[0066] The received inertial data, or data derived therefrom, are analyzed in step 440 with respect to at least one predetermined condition indicative of an impact on the camera system 300. An impact on the camera system 300 can be associated, for example, with the inertial data being indicative of an acceleration exceeding an acceleration threshold, e.g., of 5 m/s² or higher. In another example, an impact on the camera system 300 may be associated with an acceleration indicative of a predefined movement over time, e.g., a damped oscillation having a certain behavior as defined by the at least one first predetermined condition.
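The threshold form of the first predetermined condition reduces to a magnitude test over the 3-axis samples. A minimal sketch, assuming the samples are already gravity-compensated (a raw accelerometer at rest would read about 9.81 m/s², which would trivially exceed the threshold):

```python
import math

ACCEL_THRESHOLD = 5.0  # m/s^2, example threshold from the description

def camera_impact_condition(samples, threshold=ACCEL_THRESHOLD):
    """First predetermined condition: fulfilled if any 3-axis inertial sample
    of the camera-system acceleration sensor exceeds the threshold in
    magnitude. Samples are assumed gravity-compensated, so a camera at rest
    yields magnitudes near zero."""
    return any(math.sqrt(ax * ax + ay * ay + az * az) > threshold
               for ax, ay, az in samples)
```

The pattern-based alternative (damped oscillation of the acceleration signal) would replace this magnitude test with a template comparison like the one sketched earlier for the image data.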
[0067] In case a positional change of the tracker 100 indicative of at least one of a drift of the tracker 100 and an impact on the tracker 100 is identified based on the image data and, at substantially the same point in time, the inertial data generated by the acceleration sensor 310 is not indicative of an impact on the camera system 300, a re-registration signal is generated in step 450. In some variants, the re-registration signal is generated only in case the positional change of the tracker (as determined in step 420) is indicative of the predetermined movement pattern.
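The core decision of step 450 reduces to a simple predicate, sketched here with hypothetical boolean inputs standing in for the outcomes of steps 420 and 440:

```python
def need_re_registration(tracker_moved_in_image: bool,
                         camera_impact_detected: bool) -> bool:
    """Re-registration signal logic of step 450: the signal is generated only
    when the image data indicate a tracker drift/impact AND the camera-system
    inertial data do NOT indicate an impact on the camera system (otherwise
    the apparent tracker movement may merely reflect a bumped camera)."""
    return tracker_moved_in_image and not camera_impact_detected
```

The "camera impact detected" case deliberately suppresses the signal: a bumped camera shifts the whole image, so the apparent tracker movement says nothing about the tracker-to-anatomy relation.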
[0068] The at least one re-registration signal may trigger a re-registration notification. In one variant, the re-registration notification may be indicative of a need for a re-registration. The re-registration notification may be, or may trigger, a user notification suggesting triggering of the re-registration to a user for further facilitating decision-making of a surgeon, e.g., regarding the need of a suggested re-registration of the tracker 100, i.e., of COS_tracker with COS_medical image. In another variant, the re-registration notification may be indicative of a re-registration that is triggered automatically. The automatically triggered re-registration may be a re-registration of COS_tracker with COS_medical image.
[0069] When the re-registration is triggered automatically or manually, new coordinate transformations for the re-registration are determined (e.g., in a similar manner as for the initial registration).
[0070] Data generated by the camera system 300 and the acceleration sensor 310 as described above may be received in near real time. The generated data from the camera system 300 and the acceleration sensor 310 may be received substantially in parallel. In this case, steps 410 and 420 as well as steps 430 and 440 may be performed substantially in parallel.
[0072] The notification device 500 is configured to output, responsive to the re-registration signal, a notification signal. The notification signal may be a re-registration notification for notifying a user that a re-registration has been triggered automatically or that a need for a re-registration has been determined. The notification signal may be generated by switching an LED to a different mode, e.g., to a different color (e.g., from green to red), to a different geometric pattern in case of multiple LEDs (e.g., from a ring to a cross) or to a different operating frequency (e.g., from constant illumination to an on/off modulation at 1 to 10 Hz).
[0073] In other examples (not shown), the notification device 500 is an acoustic device (e.g., a loudspeaker) or a combination of an optical and an acoustical device 500. Accordingly, the user notification signal that is output by the notification device 500 may be an optical or acoustic notification or a combination thereof.
[0074] In some implementations the tracker 100 may comprise a notification device 505 as an addition or as an alternative to the notification device 500 of the camera system 300. The notification device 505 of the tracker 100 may be an optical or acoustical notification device or a combination thereof, analogous to the notification device 500 of the camera system 300. The notification device 505 of the tracker may have a similar functionality as the notification device 500 of the camera system 300.
[0076] The notification device 510 may be an optical or acoustical notification device or a combination thereof, analogous to the notification device 500 described above.
[0079] The image data thus obtained is intended to be visualized, e.g., on a display 530, in the field of view of a user. The image data is, for example, continuously stored in a ring buffer of a certain size (e.g., sufficient to store at least 10 seconds of image data). The image data is configured to be replayed when a positional change of the patient tracker 100 (in particular an impact) and no impact on the camera system 300 is detected. In some examples, the image data is visualized in response to a manual input of a user (e.g., in response to the notification being output by the notification device 500) or automatically. The visualization of the image data may help a user identify the kind of detected tracker movement and decide whether or not a re-registration is necessary. The visualization may reduce the cognitive load on a surgeon and the duration of a surgery.
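Such a bounded replay buffer can be sketched with a `collections.deque`. The class name, the 30 fps figure, and the capacity arithmetic are illustrative assumptions; only the "keep roughly the last 10 seconds" behavior comes from the description:

```python
from collections import deque

class FrameRingBuffer:
    """Keeps the most recent ~`seconds` of frames at a given frame rate;
    deque's maxlen drops the oldest frame automatically, so memory stays
    bounded while the camera streams continuously."""

    def __init__(self, fps: int = 30, seconds: int = 10):
        self._frames = deque(maxlen=fps * seconds)

    def push(self, timestamp, frame):
        self._frames.append((timestamp, frame))

    def replay(self):
        """Return the buffered (timestamp, frame) pairs, oldest first,
        e.g., for visualization after a tracker impact was detected."""
        return list(self._frames)

# Tiny demonstration: capacity 2 fps * 3 s = 6 frames, 10 frames pushed.
buf = FrameRingBuffer(fps=2, seconds=3)
for i in range(10):
    buf.push(i, f"frame-{i}")
```

On a re-registration signal, the application would hand `buf.replay()` to the display 530 instead of discarding it.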
[0081] In a further variant, the tracker 100 comprises a dedicated acceleration sensor 600 configured to generate inertial data indicative of an acceleration of the tracker 100.
[0082] To increase the accuracy of the determination of the need for a re-registration, the method further comprises a step 760 of receiving inertial data generated by the dedicated acceleration sensor 600 of the tracker 100 and a step 770 of analyzing the received inertial data, or data derived therefrom.
[0083] The received data, or data derived therefrom, are analyzed in step 770 with respect to at least one second predetermined condition indicative of at least one of a drift of the tracker 100 and an impact on the tracker 100. The at least one second predetermined condition may be analogous to the at least one first predetermined condition indicative of an impact on the camera system 300 as explained above.
[0084] Analyzing inertial data of both the tracker acceleration sensor 600 and the camera system acceleration sensor 310 further enables distinguishing between a movement of the tracker 100, a movement of the camera system 300, and a movement of both the tracker 100 and the camera system 300. In case a positional change of the tracker 100 indicative of at least one of a drift of the tracker 100 and an impact on the tracker 100 is identified based on the image data (see step 720) and, at the same time, the first predetermined condition is not fulfilled (see step 740) while the second predetermined condition is fulfilled (see step 770), a re-registration signal is generated in step 750.
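The two-sensor variant extends the earlier predicate by one confirming input from the tracker's own sensor. Again a sketch with hypothetical boolean inputs standing in for the outcomes of steps 720, 740 and 770:

```python
def need_re_registration_two_sensors(tracker_moved_in_image: bool,
                                     camera_condition_fulfilled: bool,
                                     tracker_condition_fulfilled: bool) -> bool:
    """Two-sensor decision of step 750: signal a re-registration only when
    the image data show a tracker drift/impact, the camera-system sensor
    shows NO impact on the camera, and the tracker's own acceleration sensor
    confirms the tracker movement."""
    return (tracker_moved_in_image
            and not camera_condition_fulfilled
            and tracker_condition_fulfilled)
```

Compared to the single-sensor logic, the extra conjunct suppresses false positives in which the image-based detection fires without any physical tracker movement (e.g., due to image noise or partial occlusion).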
[0085] In one variant, the re-registration signal generated in step 750 triggers generation of a re-registration notification for further facilitating decision-making of a surgeon, e.g., regarding the need of a suggested re-registration of the tracker 100, i.e., of COS_tracker with COS_medical image.
[0086] By combining an optical data based determination and an inertial data based determination as described above, the accuracy of the determination of the need for a re-registration may be increased, since the optical data based determination may be utilized to compensate for possible deficits of the inertial data based determination and vice versa. For example, a determination based on optical tracking requires a line of sight from the camera system 300 to the tracker 100. A determination based on inertial data, on the other hand, is applicable without the need for any line of sight. As another example, any inertial data generated by an acceleration sensor 310, 600 are subject to integration drift (i.e., a virtual drift). Image data generated by a camera system 300, on the other hand, are not subject to virtual drift.
[0087] Data generated by the camera system 300 and any of the acceleration sensors 310, 600 as described above may be received in near real time. The generated data from the camera system 300 and any or all of the acceleration sensors 310, 600 may be received substantially in parallel. Alternatively, some or all of the generated data may be received in sequence. For example, the inertial data from the dedicated acceleration sensor 600 of the tracker 100 may only be received when a positional change of the first tracker 100 is indicated in the received image data. In this case, the inertial data may be associated with the corresponding image data based on time stamps. As a result, usage of energy and data transmitting resources of the tracker 100 may be reduced.
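The timestamp-based association of inertial samples with image frames can be sketched with a nearest-neighbor lookup over sorted timestamps. The function name, the tolerance `max_skew`, and the sample layout are illustrative assumptions:

```python
import bisect

def associate(inertial_samples, image_timestamp, max_skew=0.05):
    """Return the inertial sample whose timestamp is closest to an image
    frame's timestamp, or None if nothing lies within max_skew seconds.
    inertial_samples must be sorted by time: [(t, (ax, ay, az)), ...]."""
    times = [t for t, _ in inertial_samples]
    i = bisect.bisect_left(times, image_timestamp)
    # the nearest sample is either just before or just at/after the frame time
    candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
    if not candidates:
        return None
    best = min(candidates, key=lambda j: abs(times[j] - image_timestamp))
    if abs(times[best] - image_timestamp) > max_skew:
        return None
    return inertial_samples[best]

samples = [(0.00, (0.0, 0.0, 0.0)),
           (0.01, (1.0, 0.0, 0.0)),
           (0.02, (6.0, 0.0, 0.0))]
```

Returning None when no sample is close enough matches the sequential-reception case: if the tracker's sensor only transmits after an image-based trigger, some frames simply have no inertial counterpart.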
[0090] Since a detected positional change of a patient tracker 100 can be indicative of a relative movement between the tracker 100 and a patient anatomy 200, the technique presented herein enables continuously maintaining a high registration quality. Any interval during which a surgeon operates on the basis of an incorrect registration is minimized, since a possible time gap between an unintended tracker movement relative to the patient anatomy 200 and a re-registration compensating the resulting inaccuracy is minimized.