X-RAY PRE-EXPOSURE CONTROL DEVICE
20170322484 · 2017-11-09
Inventors
CPC classification
A61B6/462
HUMAN NECESSITIES
G16H50/20
PHYSICS
A61B6/04
HUMAN NECESSITIES
International classification
A61B6/00
HUMAN NECESSITIES
Abstract
The invention relates to an X-ray pre-exposure control device (10), an X-ray imaging system (1), an X-ray imaging method, and a computer program element for controlling such device and a computer readable medium having stored such computer program element. The X-ray pre-exposure control device (10) comprises a subject detection unit (11), a subject model unit (12), an interface unit (13), a processing unit (14), and a display unit (15). The subject detection unit (11) is configured to detect subject data of the subject (111) to be exposed. The subject model unit (12) is configured to provide a subject model and to refine the subject model based on the subject data into a refined subject model. The interface unit (13) is configured to provide setting data of an X-ray unit (131) to be used for exposing the subject. The processing unit (14) is configured to calculate a virtual X-ray projection (151) based on the refined subject model and the provided setting data. The display unit (15) is configured to display the virtual X-ray projection (151).
Claims
1. An X-ray pre-exposure control device, comprising: a subject detection unit, a subject model unit, an interface unit, a processing unit, and a display unit, wherein the subject detection unit is configured to detect subject data of the subject to be exposed, wherein the subject model unit is configured to provide a subject model and to refine the subject model based on the subject data into a refined subject model, wherein the interface unit is configured to provide setting data of an X-ray unit to be used for exposing the subject, wherein the processing unit is configured to calculate a virtual X-ray projection based on the refined subject model and the provided setting data, and wherein the display unit is configured to display the virtual X-ray projection.
2. X-ray pre-exposure control device according to claim 1, further comprising an input unit configured to adjust the subject, the subject model and/or the setting data of the X-ray unit, wherein the processing unit is further configured to recalculate the virtual X-ray projection based on this adjustment, and wherein the display unit is configured to display the virtual X-ray projection based on this adjustment.
3. X-ray pre-exposure control device according to claim 2, wherein the setting data is a collimation parameter of the X-ray unit to be used for exposing a sub-region of the subject.
4. X-ray pre-exposure control device according to claim 3, wherein the collimation parameter is a collimation window displayed by the display unit and wherein the input unit is configured to interactively adjust the position, size and/or orientation of the collimation window.
5. X-ray pre-exposure control device according to claim 1, wherein the subject detection unit is configured to detect positions of anatomical landmarks of the subject and to detect an orientation of the subject based on the positions of the anatomical landmarks.
6. X-ray pre-exposure control device according to claim 1, wherein the subject detection unit comprises at least one of the group of an optical, an infrared, an ultrasound, a radar camera or sensor, a weight sensor, a depth sensor, a sensor sensing a breathing cycle, a sensor sensing a heart cycle, a millimeter wave sensor and a backscatter X-ray sensor.
7. X-ray pre-exposure control device according to claim 6, wherein the subject data is dimension data and/or phase data, wherein the dimension data comprise at least one of the group of the subject's shape, size, position and orientation, wherein the phase data comprise a heart cycle and/or a breathing cycle, and wherein the processing unit is configured to continuously recalculate the virtual X-ray projection based on the phase data, and wherein the display unit is configured to continuously display the virtual X-ray projection based on the phase data.
8. X-ray pre-exposure control device according to claim 1, further comprising a patient positioning quality indication unit comprising a positioning quality sensor configured to detect a subject's positioning relative to an X-ray unit.
9. X-ray pre-exposure control device according to claim 8, wherein the positioning quality sensor comprises at least one of the group of a contact sensor, a force sensor, and an optical camera, wherein the latter is configured to track a subject's breathing.
10. X-ray pre-exposure control device according to claim 1, wherein the subject model unit is configured to select the subject model based on at least one of the group of the subject's size, weight, age, sex, thorax volume and distance between landmarks.
11. X-ray pre-exposure control device according to claim 1, wherein the setting data of the X-ray unit to be used for exposing the subject is at least one of the group of position or orientation of an X-ray source, detector, focal spot or collimator, exposure time, availability of a scatter grid and kVp.
12. An X-ray imaging system, comprising: an X-ray pre-exposure control device according to claim 1, and an X-ray unit, wherein the X-ray unit is configured to expose the subject to X-ray radiation.
13. An X-ray imaging method with pre-exposure controlling, comprising the following steps: detecting subject data of the subject to be exposed, providing a subject model, refining the subject model based on the subject data into a refined subject model, providing setting data of an X-ray unit to be used for exposing the subject, calculating a virtual X-ray projection based on the refined subject model and the provided setting data, and displaying the virtual X-ray projection.
14. A computer program element for controlling a device or system according to claim 1, which, when being executed by a processing unit, is adapted to perform the steps of the method according to claim 13.
15. A computer readable medium having stored thereon the computer program element of claim 14.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0041] Exemplary embodiments of the invention will be described in the following with reference to the accompanying drawings.
DETAILED DESCRIPTION OF EMBODIMENTS
[0050] The X-ray pre-exposure control device 10 comprises a subject detection unit 11, a subject model unit 12, an interface unit 13, a processing unit 14, and a display unit 15. The X-ray pre-exposure control device 10 allows computing the virtual X-ray projection based on a refined subject model and current settings (e.g. collimation, orientation, position) of the X-ray unit 131. To this end, data on e.g. the shape and size of the subject 111 are collected with for example optical cameras to measure the body shape. Then, a subject model is automatically selected from the data base 121 and adjusted to e.g. the subject's size. By using the current settings of the X-ray unit 131 for a collimation and for a positioning of e.g. an X-ray source and an X-ray detector, a simulated virtual X-ray projection (CXR) is computed, on which the actual collimation window may be displayed. The virtual X-ray projection is displayed to e.g. a radiographer in order to decide whether the positioning of the subject 111 and the collimation of the X-ray source are suitable for the current examination.
[0051] In detail:
[0052] The subject detection unit 11 detects subject data of the subject 111 to be exposed. The subject data is dimension data and/or phase data. The dimension data comprise at least one of the group of the subject's shape, size, weight, body mass index, sex, age, position and orientation of the subject and/or at least a subject's landmark. The phase data comprise at least a heart cycle and/or a breathing cycle. The detection of the subject data may comprise an automatic detection and/or a manual input.
[0053] The subject detection unit 11 comprises at least one of the group of an optical, a time-of-flight, an infrared, an ultrasound, a radar camera or sensor, a weight sensor, a 3D depth sensor, a sensor sensing a breathing cycle, a sensor sensing a heart cycle, a millimeter wave sensor, a backscatter X-ray sensor and the like. The camera or sensor may be combined with infrared light. The subject detection unit 11 detects or extracts positions or coordinates of anatomical landmarks of the subject 111 and detects an orientation of the subject 111 based on the positions of the anatomical landmarks (see the drawings).
[0054] The subject model unit 12 provides a subject model and refines the subject model based on the subject data into a refined subject model. The subject model unit 12 selects the subject model based on at least one of the group of the subject's size, shape, weight, age, sex, thorax volume, distance between landmarks and/or the like from e.g. a database of pre-defined software models. The database may include models of e.g. different size (small/medium/large), age (child/adult) and sex (male/female). The selection procedure can be based on derived parameters from the body shape such as thorax volume, distance from left to right shoulder, distance from hip to shoulder and combinations thereof. The subject model can be e.g. a thorax model.
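The selection step described in the preceding paragraph can be sketched as follows. The size thresholds, the age boundary and the model names are illustrative assumptions and not part of the original disclosure:

```python
# Illustrative sketch of selecting a pre-defined subject model from
# derived body-shape parameters. Database keys, thresholds and model
# names are assumed for illustration only.

PRE_DEFINED_MODELS = {
    ("small", "adult"): "thorax_small_adult",
    ("medium", "adult"): "thorax_medium_adult",
    ("large", "adult"): "thorax_large_adult",
    ("small", "child"): "thorax_child",
}

def derive_size_class(shoulder_distance_cm: float) -> str:
    """Map the left-to-right shoulder distance onto a size class."""
    if shoulder_distance_cm < 32.0:
        return "small"
    if shoulder_distance_cm < 42.0:
        return "medium"
    return "large"

def select_subject_model(shoulder_distance_cm: float, age_years: float) -> str:
    """Select a pre-defined subject model from derived parameters."""
    age_class = "child" if age_years < 16 else "adult"
    size_class = derive_size_class(shoulder_distance_cm)
    # Fall back to the child model when no exact entry exists.
    return PRE_DEFINED_MODELS.get((size_class, age_class),
                                  PRE_DEFINED_MODELS[("small", "child")])
```

In practice the selection would also weigh further derived parameters such as thorax volume or the hip-to-shoulder distance, as named above.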
[0055] The selected model is refined by an adaptation to the landmarks and/or the subject's orientation. The adaptation step may comprise rigid and/or non-rigid transformations of the selected subject model.
[0056] The interface unit 13 provides setting data of an X-ray unit 131 to be used for exposing the subject 111. The setting data of the X-ray unit 131 is at least one of the group of position or orientation of an X-ray source, an X-ray detector, a focal spot or a collimator, exposure time, availability of a scatter grid, kVp and/or the like. The provision of the setting data can be done automatically and/or manually.
[0057] The setting data of the X-ray unit 131 are used to improve the simulation of the virtual X-ray projection. The virtual X-ray projection is computed through the adapted subject model using the derived settings of the X-ray unit 131. The setting data is here a collimation parameter of the X-ray unit 131 to be used for exposing a sub-region of the subject. The collimation parameter is a collimation window (see the drawings), which is displayed by the display unit 15 and whose position, size and/or orientation can be interactively adjusted via the input unit.
[0058] In other words, the projected collimator window boundaries are computed in the virtual X-ray projection, and the computed virtual X-ray projection image is here displayed on a viewing monitor to the radiographer. For an interactive adjustment of the collimators, a larger field-of-view may be displayed on the monitor together with an indication of the active collimated area. In this way, the radiographer can adjust the position of the collimators with direct visual feedback.
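Under a simplified point-focal-spot geometry, projecting the collimator window boundaries onto the detector plane reduces to a similar-triangles scaling. The following sketch assumes a focal spot at the origin and collimator and detector planes perpendicular to the beam axis; the names and geometry are illustrative assumptions:

```python
def project_collimator_window(window_corners, d_collimator, d_detector):
    """Project collimator window corners from the collimator plane onto
    the detector plane, assuming a point focal spot at the origin and
    planes perpendicular to the beam axis (similar-triangles model)."""
    scale = d_detector / d_collimator
    return [(x * scale, y * scale) for (x, y) in window_corners]
```

The projected corners can then be overlaid on the virtual X-ray projection as the active collimated area.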
[0059] The processing unit 14 calculates a virtual X-ray projection based on the refined subject model and the provided setting data, and the display unit 15 displays the virtual X-ray projection. The processing unit 14 further continuously recalculates the virtual X-ray projection based on the phase data, and the display unit 15 continuously displays the recalculated virtual X-ray projection.
[0060] The X-ray imaging method with pre-exposure controlling comprises the following steps:
[0061] S1: detecting subject data of the subject 111 to be exposed,
[0062] S2: providing a subject model,
[0063] S3: refining the subject model based on the subject data into a refined subject model,
[0064] S4: providing setting data of an X-ray unit 131 to be used for exposing the subject,
[0065] S5: calculating a virtual X-ray projection based on the refined subject model and the provided setting data, and
[0066] S6: displaying the virtual X-ray projection.
[0067] These steps will be explained in further detail below. In the first step S1, the subject or patient is tracked with for example optical cameras, time-of-flight cameras, or 3D depth sensors in combination with infrared light. From the resulting body shape model or subject model, coordinates of landmarks of the subject such as shoulders, neck, hip bones are extracted and used to compute the orientation of the subject's body.
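The orientation computation from the extracted landmark coordinates can be illustrated with a minimal 2D sketch. The choice of the shoulder landmarks and the coordinate convention are illustrative assumptions:

```python
import math

def body_rotation_deg(left_shoulder, right_shoulder):
    """In-plane rotation of the subject, taken as the angle of the
    left-to-right shoulder line against the horizontal detector axis.
    Landmarks are given as (x, y) coordinates from the body tracking."""
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    return math.degrees(math.atan2(dy, dx))
```

A real implementation would combine several landmark pairs (shoulders, hips, neck) in 3D to obtain a full orientation estimate rather than a single angle.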
[0069] From the tracked body shape, a thorax model is selected in a second step S2 from a database of pre-defined software models. The database here includes models of different size (small/medium/large), age (child/adult) and sex (male/female).
[0070] Further data on the subject 111 may additionally be collected with other sensors. For example, the subject 111 might stand on a weight plate (not shown) to measure the subject's weight, from which a body mass index might be derived for a further selection of the subject model and the image acquisition parameters. Furthermore, the breathing and heart cycle of the subject 111 might be tracked to generate a 4D model of the subject.
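The body mass index derived from the weight-plate measurement follows the standard definition, weight in kilograms divided by the squared height in metres; that the subject's height is available as a second input is an assumption here:

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """BMI derived from the weight-plate measurement and subject height."""
    return weight_kg / (height_m ** 2)
```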
[0071] In the third step S3, the selected model is refined by adaptation to the landmarks 112 and the subject's orientation as generated in step S1. The adaptation step S3 may comprise rigid and non-rigid transformations of the selected subject model.
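A minimal sketch of the rigid part of the adaptation step, assuming 2D landmark coordinates. Only the translation component is shown; a full implementation would also estimate rotation and scale (e.g. via a Procrustes fit) and apply non-rigid deformation:

```python
def rigid_translate_to_landmarks(model_points, detected_points):
    """Minimal rigid adaptation: translate the model landmarks so that
    their centroid coincides with the centroid of the detected landmarks.
    Points are (x, y) tuples; both lists must be in correspondence."""
    n = len(model_points)
    mx = sum(p[0] for p in model_points) / n
    my = sum(p[1] for p in model_points) / n
    dx = sum(p[0] for p in detected_points) / n - mx
    dy = sum(p[1] for p in detected_points) / n - my
    return [(x + dx, y + dy) for (x, y) in model_points]
```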
[0072] A viewport of the X-ray system is retrieved in step S4, i.e. the position of the focal spot, the position and orientation of the detector unit and the collimator positions are derived from the system. Further, acquisition settings such as kVp, exposure time, availability of a scatter grid may be derived to improve the following simulation of the virtual X-ray projection.
[0073] The virtual X-ray projection is computed in step S5 through the adapted subject model using the derived settings of the X-ray unit 131. Furthermore, the projected collimator boundaries are computed in the virtual X-ray projection.
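The computation of the virtual X-ray projection through the adapted model can be illustrated with a toy parallel-beam Beer-Lambert line integral. The voxelized attenuation map and the step size are illustrative assumptions; a real implementation would cast diverging rays through the 3D model using the retrieved viewport and acquisition settings:

```python
import math

def ray_attenuation(mu_along_ray, step_cm, i0=1.0):
    """Beer-Lambert line integral along one ray: detected intensity
    after passing through voxels with attenuation coefficients mu (1/cm)."""
    return i0 * math.exp(-sum(mu * step_cm for mu in mu_along_ray))

def virtual_projection(volume, step_cm, i0=1.0):
    """Toy parallel-beam projection: one ray per column of a 2D mu-map."""
    cols = len(volume[0])
    return [ray_attenuation([row[c] for row in volume], step_cm, i0)
            for c in range(cols)]
```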
[0074] As shown in the drawings, the virtual X-ray projection 151 is then displayed to the radiographer together with the projected collimator boundaries.
[0075] The system may be adjusted to raise a warning if automatically computed image measures indicate a poor image quality. Such indications may comprise, for example, a rotation of the subject or lung fields extending outside the collimated area. To this end, computer software may analyze the virtual X-ray projection 151 for standard positioning quality criteria. In another embodiment of this invention, a dynamic virtual X-ray projection 151 is displayed indicating the breathing cycle of the subject to ensure the correct positioning of the subject in the breathing state (e.g. inspiration) in which the X-ray image is to be taken. In this way, the actual X-ray exposure can be triggered with live feedback from the dynamic 2D virtual X-ray projection 151.
[0076] The X-ray pre-exposure control device 10 here further comprises a patient positioning quality indication unit. This patient positioning quality indication unit comprises a positioning quality sensor 171 (see the drawings) configured to detect the subject's positioning relative to the X-ray unit 131.
[0077] The position quality can be automatically derived from various types of at least one positioning quality sensor 171 or from combinations of several positioning quality sensors 171 of the same or a different kind.
[0078] The positioning quality sensors 171 may, for example, comprise contact sensors and/or force sensors.
[0079] The positioning quality sensors 171 can also be optical cameras or electromagnetic sensors to track the subject shape and to send an image to the quality indicator algorithm to analyze a centered subject positioning. These sensors may additionally measure the state of the breathing cycle of the patient.
[0080] The information of all positioning quality sensors 171 can be combined into one quality indicator value, for example by increasing a counter if a respective positioning quality sensor 171 provides a signal above a sensor-specific threshold. The signal is regularly updated from the continuous data of the positioning quality sensors 171. If the overall quality indicator value is above a final quality value threshold, a visual signal is displayed to the radiographer.
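The described combination of the sensor signals into one quality indicator value can be sketched as follows; the sensor names and threshold values are illustrative assumptions:

```python
def quality_indicator(sensor_signals, thresholds, final_threshold):
    """Combine positioning-quality sensor signals into one indicator:
    increase a counter for each sensor whose signal exceeds its
    sensor-specific threshold, then compare the counter against a
    final quality-value threshold to decide on the visual signal."""
    counter = sum(1 for name, value in sensor_signals.items()
                  if value > thresholds[name])
    return counter, counter > final_threshold
```

The returned flag would trigger the visual signal displayed to the radiographer; in operation the counter is regularly recomputed from the continuous sensor data.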
[0081] As shown in the drawings, the X-ray imaging system 1 comprises the X-ray pre-exposure control device 10 and an X-ray unit 131, wherein the X-ray unit 131 is configured to expose the subject 111 to X-ray radiation.
[0082] In another exemplary embodiment of the present invention, a computer program or a computer program element is provided that is characterized by being adapted to execute the method steps of the method according to one of the preceding embodiments, on an appropriate system.
[0083] The computer program element might therefore be stored on a computer unit, which might also be part of an embodiment of the present invention. This computing unit may be adapted to perform or induce a performing of the steps of the method described above. Moreover, it may be adapted to operate the components of the above described apparatus. The computing unit can be adapted to operate automatically and/or to execute the orders of a user. A computer program may be loaded into a working memory of a data processor. The data processor may thus be equipped to carry out the method of the invention.
[0084] This exemplary embodiment of the invention covers both a computer program that uses the invention right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the invention.
[0085] Further on, the computer program element might be able to provide all necessary steps to fulfil the procedure of an exemplary embodiment of the method as described above.
[0086] According to a further exemplary embodiment of the present invention, a computer readable medium, such as a CD-ROM, is presented wherein the computer readable medium has a computer program element stored on it, which computer program element is described by the preceding section.
[0087] A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
[0088] However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network. According to a further exemplary embodiment of the present invention, a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
[0089] It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to the device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter also any combination between features relating to different subject matters is considered to be disclosed with this application. However, all features can be combined providing synergetic effects that are more than the simple summation of the features.
[0090] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing a claimed invention, from a study of the drawings, the disclosure, and the dependent claims.
[0091] In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.