SYSTEM FOR GIVING FEEDBACK BASED ON MOTION BEFORE OR DURING MEDICAL IMAGING AND CORRESPONDING METHOD
20220401039 · 2022-12-22
Inventors
- Pawel Sebastian SOLUCH (Warsaw, PL)
- Mateusz Marek ORZECHOWSKI (Warsaw, PL)
- Krzysztof Mateusz MALEJ (Warsaw, PL)
- Wojciech OBREBSKI (Warsaw, PL)
- Pawel ROGOWSKI (Warsaw, PL)
- Krzysztof WROTKOWSKI (Warsaw, PL)
CPC classification
A61B5/7285
HUMAN NECESSITIES
G06T7/246
PHYSICS
A61B5/055
HUMAN NECESSITIES
A63F13/213
HUMAN NECESSITIES
H04N23/611
ELECTRICITY
H04N23/64
ELECTRICITY
A63F13/428
HUMAN NECESSITIES
A61B5/0033
HUMAN NECESSITIES
A61B5/721
HUMAN NECESSITIES
International classification
A61B5/00
HUMAN NECESSITIES
G06T7/246
PHYSICS
Abstract
Disclosed is a system (100, 200) for giving feedback based on motion before or during medical imaging, comprising: —an optical camera device (110, 210) that is configured to generate image data of at least two images of a subject (408) or of a part of a subject (408) which can be arranged or is arranged at a subject placing location (194) of a medical imaging device (192), and—a feedback signaling unit (120, 124) that is configured to generate, based on movement data obtainable from the image data, a feedback signal (Si1, Si2) that is perceptible by the subject (408) and/or by an operator of the medical imaging device (192) and/or by an MRI technician.
Claims
1. System (100, 200) for giving feedback based on movement before or during medical imaging, comprising: an optical camera device (110, 210) that is configured to generate image data of at least two optical images of a subject (408) or of a part of a subject (408) which can be arranged or which is arranged at a subject placing location (194) of a medical imaging device (192), and a feedback signaling unit (120, 124) that is configured to generate based on movement data obtainable from the image data a feedback signal (Si1, Si2) that is perceptible by the subject (408) and/or by an operator of a medical imaging device (192) and/or by a technician that is responsible for the medical imaging device, and an image processing unit (130, 230, 250, 260, 600, IPU) that is coupled to the optical camera device (110, 210) and that is configured to process the at least two images generated by the camera device (110, 210) in order to determine the movement of the subject (408) or the part of the subject (408) before or during medical imaging, wherein a first trigger signal is issued by the image processing unit (130, 230, 250, 260, 600, IPU) for a medical imaging system to repeat only a part of a current sequence, if movement of the subject (408) exceeded a limit and the patient got back to his/her position.
2. System (100, 200) for giving feedback based on movement before or during medical imaging, comprising: an optical camera device (110, 210) that is configured to generate image data of at least two optical images of a subject (408) or of a part of a subject (408) which can be arranged or which is arranged at a subject placing location (194) of a medical imaging device (192), and a feedback signaling unit (120, 124) that is configured to generate based on movement data obtainable from the image data a feedback signal (Si1, Si2) that is perceptible by the subject (408) and/or by an operator of a medical imaging device (192) and/or by a technician that is responsible for the medical imaging device.
3. System (100, 200) according to claim 2, comprising an image processing unit (130, 230, 250, 260, 600, IPU) that is coupled to the optical camera device (110, 210) and that is configured to process the at least two images generated by the camera device (110, 210) in order to determine the movement of the subject (408) or the part of the subject (408) before or during medical imaging, wherein the image processing unit (130, 230, 250, 260, 600, IPU) generates at least one output signal depending on the determined movement.
4. System (100, 200) according to claim 1, 2 or 3, wherein the feedback signal (Si1, Si2) changes if the orientation and/or location of the subject relative to the medical imaging device (192) changes by more than a predetermined amount, and/or wherein the optical camera device (110, 210) is configured to operate within a static magnetic field of at least one tesla.
5. System (100, 200) according to claim 1, 3 or 4, wherein the image processing unit (130, 230, 250, 260, 600, IPU) is configured to generate an intermediate score that indicates the movement during a time period that is preferably shorter than a time that is needed for the medical imaging, wherein the score is preferably formed by summing up at least two values that indicate the amount of movement during the time period.
6. System (100, 200) according to one of the preceding claims, wherein the feedback signaling unit (120, 124) is integrated into the camera device, or wherein the feedback signaling unit (120, 124) is a device that is separate from the camera device (110, 210), preferably the feedback signaling unit (120, 124) is integrated within a medical imaging device, e.g. an MRI machine (192).
7. System (100, 200) according to one of the preceding claims as far as referenced back to claim 1 or 3, wherein the image processing unit (120, 124, IPU) is comprised in the optical camera device (110, 210) or in a control unit (130, 230) for the optical camera device (110, 210) which is operatively coupleable to the control unit (130, 230).
8. System (100, 200) according to one of the claims 1 to 6 as far as referenced back to claim 1 or 3, wherein the image processing unit (250, 260, 600, IPU) is a device that is separated from the camera device (110, 210) and from a control unit (130, 230) of the camera device (110, 210).
9. System (100, 200) according to one of the preceding claims as far as referenced back to claim 1 or 3, wherein the image processing unit is configured to detect movements that are less than or equal to 3 mm, less than or equal to 2 mm or less than or equal to 1 mm, wherein preferably the image processing unit is also configured to detect only movements that are more than or equal to a threshold value, preferably more than or equal to 0.1 mm, more than or equal to 0.2 mm, more than or equal to 0.3 mm or more than or equal to 0.5 mm.
10. System (100, 200) according to one of the preceding claims as far as referenced back to claim 1 or 3, wherein the image processing unit (130, 230, 250, 260, 600, IPU) is configured to determine movements within a time interval of less than half a second or less than 100 milliseconds between commencement of capturing the last image that is considered for determining the current movement and the generation of the output signal.
11. System (100, 200) according to one of the preceding claims as far as referenced back to claim 1 or 3, wherein the image processing unit (130, 230, 250, 260, 600, IPU) comprises a masking unit or a masking functionality that is configured to identify and/or mask (518) at least one area or other feature of the subject, preferably areas of the eyes and/or of the lips of the subject (408), wherein the image processing unit (130, 230, 250, 260, 600, IPU) is configured to disregard movement inside of the at least one area.
12. System (100, 200) according to claim 11, wherein the masking unit or a masking functionality is configured to determine the areas or features automatically or semi-automatically.
13. System (100, 200) according to one of the preceding claims as far as referenced back to claim 3, wherein a first trigger signal is issued by the image processing unit (130, 230, 250, 260, 600, IPU) for a medical imaging system to repeat only a part of a current sequence, preferably the past k-space line in MRT, if movement of the subject (408) exceeded a limit and the patient got back to his/her position, and/or wherein a second trigger signal is issued by the image processing unit (130, 230, 250, 260, 600) for the medical imaging system if movement exceeded the limit and the subject (408) did not return to his/her previous position, wherein preferably the second trigger signal triggers the MRI system (102) to repeat the whole sequence.
14. Image processing unit (130, 230, 250, 260, 600, IPU), preferably an image processing unit (130, 230, 250, 260, 600, IPU) of a system (100, 200) according to one of the preceding claims, comprising: an input unit (In), an output unit (Out) and a processing unit (Pr), wherein the input unit (In) is configured to receive image data of at least two images from an optical camera device (110, 210), wherein the processing unit (Pr) is configured to process at least two optical images in order to determine the movement of a subject (408) or a part of the subject (408) before or during a medical imaging that is preferably non-optical, wherein the processing unit (Pr) is configured to generate at least one output signal or output data depending on the determined movement, and wherein the output unit (Out) is configured to send the output data or the output signal to a feedback signaling unit (120, 124).
15. Method (500) of giving feedback based on motion before or during medical imaging, comprising: generating (520, 532) movement data of a subject (408) or of a part of the subject (408) from which at least one medical image is generated during and/or before generating the at least one medical image, and sending (526) to the subject (408) a feedback signal (Si1, Si2) that depends on the amount of the determined movement.
16. Method (500) according to claim 15, comprising: determining (522) movement of the subject (408) or of the part of the subject (408) based on at least two images that are generated optically.
17. Method (500) according to claim 15 or 16, comprising generating (529) non-optically at least one medical image of the inside of the subject (408).
18. Method (500) according to any one of the claims 15 to 17, wherein a feedback signal (Si1, Si2) is presented using a computer game wherein preferably an object or a character adjusts its movement and/or its behavior according to the movement of the subject (408).
19. Method (500) according to any one of the claims 15 to 18, wherein the medical image is an MR image of an MRI machine (192).
20. Method (500) according to any one of claims 15 to 19, wherein at least two or at least three movement levels are defined and wherein the determined movement is classified according to the specified movement levels, and wherein preferably different feedback signals (Si1, Si2) are generated depending on the classification.
21. Method (500) according to any one of the claims 15 to 20, wherein the subject (408) is instructed to control and/or to minimize his/her movement to achieve predefined feedback, preferably on a feedback signaling unit (120, 124, 220) and/or preferably constantly over a defined time of a medical imaging procedure.
22. Method (500) according to any one of the claims 15 to 21, comprising: matching (522) at least one keypoint within the at least two optical images and determining a value that represents the movement of the at least one keypoint, comparing (524) the value with at least one threshold value (TH), giving (526) to the subject (408) a feedback signal that indicates that the value is above the at least one threshold value (TH) and/or feeding back (528) to the subject (408) a signal that indicates that the value is below and/or equal to the at least one threshold value (TH).
23. Computer program product comprising computer readable program code with instructions which, when loaded and/or executed on a processor (Pr), cause the processor (Pr) to carry out at least one of, an arbitrarily selected plurality of, or all of the method steps according to any one of the claims 15 to 22.
24. System (100, 200), comprising: an optical camera device (110, 210) that is configured to generate image data of at least two optical images of a subject (408) or of a part of a subject (408) which can be arranged at a subject placing location (194) of a medical imaging device (192), and at least one trigger unit of: a first trigger unit that is configured to generate based on the image data a first trigger signal for a medical imaging system to repeat only a part of a current sequence of images if movement of the subject (408) exceeded a limit and the subject (408) got back to his/her position, and/or a second trigger unit that is configured to generate a second trigger signal based on the image data for the medical imaging system if movement exceeded the limit and the subject (408) did not return to his/her previous position, wherein preferably the second trigger signal triggers the medical imaging system to repeat the whole sequence of images.
Description
[0102] For a more complete understanding of the presently disclosed concepts and the advantages thereof, reference is now made to the following description in conjunction with the accompanying drawings. The drawings are not drawn to scale.
[0113] Camera device 110, Cam and optical output device 120 may be separate devices. However, in the preferred embodiment camera device 110, Cam and optical output device 120 are arranged within the same housing and it may be said that the optical output device 120 is integrated within the camera device 110.
[0114] Camera device 110, Cam and optical output device 120 may be arranged within the interior space 194 of a MRI machine 192 (scanner), i.e. within the inner tube or gantry that is surrounded by big coils that generate a high magnetic field during image acquisition using magnetic resonance tomography (MRT). However, camera device 110 may generate images/pictures or video data using optical sensors, for instance CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensors arranged in a matrix, i.e. in lines and columns. Camera device 110, Cam and optical output device 120 may have to fulfill requirements with regard to MRI shielding and compliance, i.e. they should work properly within high magnetic fields and they should not disturb the MRT.
[0115] Camera device 110, Cam and optical output device 120 may be removable or removably placed within MRI machine 192. A connection segment 170 may connect control unit 130 to camera device 110 and to optical output device 120. Connection segment 170 may comprise flexible cables that form a first connection 172 between control unit 130 and camera device 110 and a second connection 174 between control unit 130 and optical output device 120. However, both connections 172 and 174 may end at camera device 110 if optical output device 120 is integrated within camera device 110.
[0116] Optical output device 120 may comprise an illumination unit 122 and a signaling unit 124. Illumination unit 122 may comprise light sources, for instance for white light, or other radiation sources (for instance IR (Infrared) radiation) that radiate electromagnetic radiation 111 into the field of view of the camera device 110 enabling recording of optical images thereby.
[0117] Signaling unit 124 may comprise a light source that generates light that is used for signaling purposes. The light generated by signaling unit 124 may also be directed mainly to the field of view (FOV) of the camera of camera device 110, see signaling light Si1. This may be the case, for instance if the face of the patient is within the focus of the camera of camera device 110. Alternatively, light generated by signaling unit 124 may be directed mainly to a region that is not within the focus of the camera, see signaling light Si2, for instance if the chest of the patient or subject is within the focus but the signaling light has to be seen by the eyes of the patient. One example for the arrangement of camera device 110, illumination unit 122 and a signaling unit 124 is shown in
[0118] Control unit 130 may comprise an output unit Mon, for instance a screen, display, a monitor or a touchscreen, for showing the video stream that is generated by camera device 110 to an operator or user. Furthermore, control unit 130 may comprise an input device “In” that is used to enter control instructions and/or control data, for instance switching on/off illumination light, switching on/off signaling light, for instance using different colors, selecting video mode of camera (PAL, SECAM, NTSC), etc. Input device In may also be a touchscreen or other input device, for instance a keyboard.
[0119] System 100 may be a system which comprises a recording camera device 110 designed for diagnostics and testing in MRI scanners 192. The use of the camera device 110 may increase the safety of the test subjects or of patients and the effectiveness of MR (magnetic resonance) tests or of MRT. It may allow one to see the patient or subject during MRI and fMRI (functional MRI) tests/imaging and may provide feedback on their activity. Alternatively and/or additionally, a special kind of feedback may be movement feedback. This is described in more detail below with regard to the method that is shown in
[0120] The system may include or comprise a camera device 110, an output device Mon (monitor) and a lighting system 120 mounted for instance on and/or in the camera device 110.
[0121] The camera of the camera device 110 may allow watching the face or other parts of the patient's or subject's body during the MRI scanning procedure. The camera device 110 may provide feedback about the activity of the patient. The camera device 110 may also allow the patient to be observed by the investigator or, in the case of procedures done with children, by the parents. Alternatively and/or additionally, the camera device 110 may be used to generate image data that is used for movement detection or determination of the subject or of the part of the subject that is in the interior space 194 (gantry). The determined level of movement may be the basis for the movement feedback to the subject 408. This is also described in more detail below with regard to the method that is shown in
[0122] An output device Mon (monitor), for instance a touch screen, may be used for viewing the image and setting the lighting parameters. The output device Mon and/or the control device 130 may be mounted on the MRI scanner's 192 housing. The touch screen or another input device “In” may allow the examiner or investigator to adjust some of or all settings of the camera that may be placed inside of the gantry, i.e. within the tube, without leaving the MRI scanning room, making their work easier and more convenient.
[0123] Lights may be mounted on and/or within the camera housing, see
[0124] An image processing unit IPU that is used for movement detection may be comprised within camera device 110, 210 or within control unit 130, 230.
[0132] An image processing unit IPU, see for instance
[0133] Sending and receiving unit 250 may be a separate unit or may be part of computing device 260, i.e. using the same internal power supply unit, being arranged within the same housing etc.
[0134] There may be the following connections within system 200:
[0135] a connection segment 270 between control unit 230 and optical output device 220/camera device 210. Connection segment 270 may correspond to connection segment 170 (see features mentioned above) and may comprise an optical connection 272 (may correspond to 172) and a power line connection 274 (may correspond to 174), for instance via an electrical cable or line, and/or
[0136] a power line connection 280 that delivers electrical current and electrical voltage from power supply 240 to control unit 230, for instance an electrical conductive cable or line, and/or
[0137] an optional optical connection 284 between control unit 230 and sending and receiving unit 250, and/or
[0138] an optional connection 286 or a wireless connection between sending and receiving unit 250 and computer 260, for instance a USB (Universal Serial Bus) connection.
[0139] A splitting unit 600 may be comprised within control unit 230. The splitting unit 600 may comprise at least one or at least two optical splitters, for instance 50%/50% splitter units each having four ports. The splitting unit 600 may allow the forwarding of data within system 200. The splitting unit 600 is used in the embodiment that is shown in
[0140] An MRI machine room 290 may comprise: MRI machine 192, optical output unit 220 (arranged within an interior space 194 that is surrounded by MRI machine 192), camera device 210 (arranged within interior space 194 that is surrounded by MRI machine 192) and power supply device 240. Optical output unit 220, camera device 210 and power supply device 240 may be MRI shielded/protected in order to guarantee proper operation during the MRT imaging process and in order to prevent artefacts within the MRT image due to the operation of these devices. Alternatively, power supply device 240 may be located outside MRI machine room 290. Furthermore, all power lines may comprise additional electrical filtering.
[0141] A wall 292 may separate MRI machine room 290 from a control room 294. Wall 292 may have special shielding, for instance magnetic shielding or EMC (Electro Magnetic Compatibility) shielding. Alternatively or additionally, wall 292 may have an appropriate thickness and/or material, for instance armored concrete. Control room 294 comprises sending and receiving unit 250 and computing device 260 and/or optionally power supply device 240. This also means that sending and receiving unit 250 and computing device 260 and/or power supply device 240 in control room 294 do not have to fulfill special requirements with regard to MRI shielding/protection.
[0142] Thus, a communication between a touch screen unit or control unit 230, a receiver unit (sending and receiving unit 250) and a camera device 210 is described. Control unit 230 and sending and receiving unit 250 may allow controlling some or all camera setting options of the camera within camera device 210 and receiving of video signals. All control signals and/or video signals may pass through touch screen unit, i.e. through control unit 230. Thus, control and monitoring of image/video data may be possible from control unit 230 and from computing device 260. Alternatively, it may only be possible to enter control data using control unit 230 or computing device 260. Furthermore, it is possible to operate the light sources of optical output device 220 using control unit 230 and/or computing device 260, for instance for communication with the person or patient who is examined within MRI machine room 290, i.e. sending signals to this person.
[0144] Frame head 300 may comprise:
[0145] an outer ring 302,
[0146] a housing 304,
[0147] an optional operating element 306,
[0148] a camera 308 of camera device 110, 210,
[0149] at least one illuminating device 310, i.e. one, two, three, four or more than four, and
[0150] only one or at least one signaling device 320, i.e. one, two, three, four or more than four.
[0151] Outer ring 302 may have a circular or elliptical shape. Outer ring 302 may be used to mount and hold housing 304 relative to an arm of a frame that comprises frame head 300, see also
[0152] Housing 304 may comprise camera device 110, 210 and optical output unit 120, 220. Housing 304 may have a disc shape or a disc like shape. There may be only a narrow gap between outer ring 302 and housing 304 enabling a good protection of the housing, especially of the breakable camera 308 against mechanical impact.
[0153] Operating element 306 may be mounted to housing 304, i.e. if operating element 306 is rotated or turned, housing 304 pivots or rotates around an axis A with regard to outer ring 302. Housing 304 may be tilted relative to outer ring 302, see
[0154] Camera 308 may be part of camera device 110, 210. Camera 308 may allow use of several interchangeable photographic objectives or lenses of different angles of view and/or different focal lengths. Alternatively only one lens may be used. An aperture of camera 308 may be located on a central axis of housing 304 that may be arranged coaxially with outer ring 302 if both parts are within the same plane.
[0155] In the example, there are four illuminating devices 310 that may be part of illuminating unit 122 or of a corresponding illuminating unit of optical output device 220. Preferably, optoelectronic devices are used as illuminating devices 310, for instance LEDs. It is possible to use LEDs that radiate white light and/or LEDs that emit IR (infrared) radiation. Alternatively, other types of illuminating devices may be used, for instance lamps with or without a filament.
[0156] Four illuminating LED modules 310, for instance arranged crosswise, may be used in the example that is shown in
[0157] In the example shown in
[0158] The RGB (Red Green Blue) LEDs may be driven by a PWM (Pulse Width Modulated) controlled current source, preferably by a voltage controlled current source. Alternatively, it is possible to use digital-to-analog converters (DACs) of a microprocessor or separate DACs to control the current source. Other examples may comprise more than one RGB LED module or single LEDs of different colors.
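As a non-limiting illustration of the PWM drive described above, the following Python sketch maps an 8-bit RGB color to per-channel PWM on-times. The function name, the 8-bit channel range and the fixed period are assumptions for illustration only and are not part of the disclosure.

```python
def rgb_to_pwm_duty(rgb, period_us=1000):
    """Map an 8-bit RGB color (0-255 per channel) to per-channel
    PWM on-times in microseconds for a PWM-controlled current source."""
    return tuple(round(period_us * channel / 255) for channel in rgb)

# Full red signaling light: red channel fully on, green and blue off.
print(rgb_to_pwm_duty((255, 0, 0)))
```

In a real device the duty cycles would be written to timer/PWM registers of the microcontroller that drives the current source.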
[0159] In the example shown in
[0163] Housing 304 may comprise further parts, for instance screws for holding two or more parts of housing 304 together, or parts that are placed on the rear side that is not visible in
[0168] A connection port 406 may be arranged on housing 304. Connection port 406 may be used to connect optical connection 172 or 272 to housing 304. Furthermore, connection port 406 may be used to connect power cable 174 or 274 to housing 304. Optical connection 172, 272 and power connection 174, 274 may be combined into one physical cable. Connection port 406 may then comprise an optical connection and an electrical connection. Power cable 174, 274 and optical connection 172, 272 may be connected to housing 304 in various ways, for instance using arm 402 or parts of arm 402 for guiding the cable 174, 274.
[0169] An inner tube of MRI machine 192 is also shown in
[0170] Signaling device 320 may be located nearer to the eyes of subject 408 or patient than illuminating devices 310 in order to ease recognition of the signaling. The nose 412 of the patient is nearer to the head 300 of frame 400 than the back of head 410 of the patient, i.e. the back of head 410 rests on foot plate 404. Foot plate 404 of frame 400 may be upholstered. The distance between head 300 of frame 400 and foot plate 404 may be in the range of 30 cm (centimeters) to 50 cm.
[0172] In a method step 512 subject 408 is placed in MRI machine 192. If images of only a part of the body of subject 408 should be taken, then this part is placed in the interior space 194 of MRI machine 192. Subject 408 may be instructed to solve a task that involves the signals that are sent by the signaling unit, e.g. optical output device 220. The task may be to avoid red lights. A more specific task may be to get a movement score that is as low as possible during the whole medical imaging procedure. Alternatively and/or additionally, stickers or markers may be attached to subject 408.
[0173] In a method step 514 the first image or several images are captured by camera device 110, 210. Image processing is performed by the image processing unit (IPU). Keypoints (markers) may be automatically determined.
[0174] In a method step 516 the IPU may calculate digital descriptors for each keypoint. The digital descriptors may make it possible to differentiate between keypoints and may be used as a basis for matching the descriptors in a series of images, i.e. matching a first descriptor in the first image to the descriptor having the same values in the second image; optionally a second descriptor in the first image may be matched to a second descriptor in the second image, and so on. Based on these descriptors, movement detection/matching or determination may be possible as is described in detail below. Method step 516 may be optional, for instance if the difference of two adjacent images in the sequence is calculated by subtraction or if other methods are used for movement recognition.
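As a non-limiting illustration of the descriptor matching of method step 516, the following Python sketch matches each descriptor of a first image to its nearest descriptor in a second image by squared Euclidean distance. The function name and the plain nearest-neighbour strategy are assumptions; a practical implementation would typically rely on an established feature library (e.g. ORB or SIFT descriptors with a ratio test).

```python
def match_descriptors(desc_first, desc_second):
    """For each descriptor of the first image, return (i, j) where j is the
    index of the closest descriptor (squared Euclidean distance) in the
    second image."""
    matches = []
    for i, da in enumerate(desc_first):
        distances = [sum((a - b) ** 2 for a, b in zip(da, db))
                     for db in desc_second]
        j = min(range(len(distances)), key=distances.__getitem__)
        matches.append((i, j))
    return matches

# Two toy 2-dimensional descriptors per image; the keypoints swapped order
# between the images, and the matcher recovers the correspondence.
print(match_descriptors([(0, 0), (5, 5)], [(5, 4), (0, 1)]))
```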
[0175] In an optional step 518, some of the descriptors, e.g. at least two descriptors, may be associated with masked areas or regions, for instance in order to mask the eyes and/or the lips of subject 408 if the camera 110, 210 captures images of the face of subject 408 during, for instance, MRI of head 410. Open source, proprietary or commercial software packages may be used for this purpose. The masking may be done automatically, semi-automatically (e.g. an automatic proposal may be generated and manual correction or manual adaption of the area(s) may be performed) or only manually. The masking may make sure that movement of the eyes and/or of the lids and/or of the lips and/or of eyebrows is not recognized as movement that is detrimental to the medical imaging process.
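The masking of step 518 can be illustrated with the following non-limiting Python sketch, which discards keypoints that fall inside rectangular masked regions (e.g. around the eyes or lips). The function name and the rectangle representation (x0, y0, x1, y1) are assumptions made for this illustration.

```python
def outside_masks(keypoints, masks):
    """Keep only keypoints (x, y) that do not fall inside any masked
    axis-aligned rectangle (x0, y0, x1, y1); masked keypoints are
    disregarded for movement determination."""
    def is_masked(point):
        x, y = point
        return any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in masks)
    return [p for p in keypoints if not is_masked(p)]

# A keypoint on an eye (inside the mask) is dropped; a keypoint on the
# forehead (outside the mask) is kept.
eye_mask = [(10, 10, 20, 20)]
print(outside_masks([(15, 15), (30, 30)], eye_mask))
```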
[0177] In a method step 522 the IPU may determine for instance a translation vector between the last two images. However, the last three images or a longer series of the last images may be used for movement recognition, for instance based on translation vectors. Masked areas, see for instance optional method step 518, may not be considered in method step 522 in order to consider only relevant movements which may distort the medical imaging process, e.g. rotation of the head but not movement of the eyes.
[0178] In a method step 524 it is tested whether the recognized movement or translation T is less than a threshold TH. It is for instance possible to calculate the length of the translation vector between the same descriptors of the same keypoints in two successive images captured by camera device 110, 210. Alternatively, the length of the translation vector may be calculated over a series of more than two images, for instance considering more than three, four or five images. More sophisticated methods may determine the start of a movement and the end of a movement, e.g. a rest point or a point at which the direction of the movement changes, as is the case with a forward-and-back movement. The amount of translation may be determined very precisely if a start point and an end point are available by image processing. The translation vector may be calculated from the start point to the end point. The camera device 110 does not move and may therefore form a fixed reference system for determination of the movement, e.g. there is always the same fixed reference point within each image, for instance the lower left corner, the center of the image, etc.
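The translation-vector computation described above can be sketched as follows (a non-limiting illustration; the function name and the use of the mean translation length over all matched keypoints are assumptions):

```python
import math

def movement_value(points_prev, points_curr):
    """Mean length of the translation vectors between matched keypoints
    of two successive images; the camera is assumed fixed, so image
    coordinates form a fixed reference system."""
    lengths = [math.hypot(xc - xp, yc - yp)
               for (xp, yp), (xc, yc) in zip(points_prev, points_curr)]
    return sum(lengths) / len(lengths)

# Both keypoints shift by the vector (3, 4), i.e. by 5 units.
print(movement_value([(0, 0), (10, 0)], [(3, 4), (13, 4)]))
```

The resulting value T (here in image units; a calibration would map it to millimeters) is then compared against the threshold TH in method step 524.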
[0179] The threshold TH may be selected appropriately, for instance with regard to the largest movement that is still tolerable for the medical imaging process. An example is for instance 0.3 mm or 0.5 mm. In method step 524 the IPU tests whether the calculated movement value T is less than threshold TH. If the calculated movement value T is greater than threshold TH a method step 526 follows immediately after method step 524. This is the case when the movement of subject 408 was too strong. In this case the green light may be switched off in method step 526. Furthermore, the red light may be switched on in order to signal that the subject should try harder not to move. The method goes directly to method step 530 after method step 526, i.e. method steps 528 and 529 are not performed within the same loop 535 of method steps 522 to 534 if method step 526 is performed. If the movement was too strong, the IPU may generate a trigger signal that is sent to the MRI machine in order to stop capturing the current sequence of medical images. Method 500 may also be ended in this case.
[0180] A more sophisticated approach may use several trigger signals from the IPU to the MRI machine depending on a specific criterium. One of these criteria is whether the subjects moves back or whether the part of the subject is moved back to its previous position. If yes, it may only be necessary to cancel or to delete some of the recorded data or medical images of the current sequence. The start and the first part of the sequence may be used later for medical purposes. Alternatively, medical images that are distorted because of too much movement of subject 408 may be marked by some additional data. This data may indicate that the quality of the respective medical image is not good. If the subject does not return to its original or previous position after a stronger movement a different trigger signal may be sent to the MRI that indicates that the whole sequence is distorted and that medical imaging has to be repeated. Other trigger signals may be used as well.
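The trigger selection described in this paragraph may be sketched as follows (a non-limiting illustration; the function and signal names are assumptions and do not correspond to any interface defined in the disclosure):

```python
def select_trigger(moved_too_much, returned_to_position):
    """Choose a trigger signal for the medical imaging system:
    no trigger if movement stayed within the limit, a partial-repeat
    trigger if the subject moved but returned to position, and a
    full-sequence-repeat trigger otherwise."""
    if not moved_too_much:
        return None
    return "REPEAT_PART" if returned_to_position else "REPEAT_SEQUENCE"

# Subject moved beyond the limit but settled back into position:
# only part of the current sequence needs to be repeated.
print(select_trigger(True, True))
```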
[0181] Furthermore, it is possible to sum up all values T for the cases in which T is equal to or greater than TH. The sum may be used as an overall score of movement during the capturing of the complete sequence of medical images. Alternatively, all values of T may be summed up in the score, i.e. independently of the result of the test in method step 524. Intermediate scores may also be signaled to the subject/patient 408.
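Both scoring variants from the paragraph above might be written as follows; `movements` is a hypothetical list of the values T computed in the individual loop iterations:

```python
def movement_score(movements, threshold, only_above_threshold=True):
    """Overall movement score over a sequence, as in paragraph [0181].

    With only_above_threshold=True, only values T >= TH contribute;
    otherwise all values T are summed independently of the threshold test.
    """
    if only_above_threshold:
        return sum(t for t in movements if t >= threshold)
    return sum(movements)
```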
[0182] If the calculated movement value T is less than threshold TH, a method step 528 may follow immediately after method step 524. This is the case if the subject does not move or moves only slightly. In method step 528 it is ensured that the red light is switched off and that the green light is switched on. An action may only be taken if a change of the color of the light is necessary.
[0183] An optional method step 529 may follow after method step 528 if and when method 500 is performed during medical imaging. However, it is also possible to perform method 500 before medical imaging or in another application context. In method step 529 a further medical image may be captured by the MRI machine. The medical imaging may be based on non-optical image capturing.
[0184] In another embodiment of method 500 the movement T is classified into more than two classes. It is for instance possible to use a third LED or a third color of light in order to signal a movement that is not as intense as a movement that results in red light.
[0185] Method step 530 is performed after method step 529 and after method step 526. The counter variable n is incremented in method step 530, for instance by value 1.
[0186] In the following method step 532 a further image may be captured optically by camera device 110, 210. This new image may be tagged or named as image I(n).
[0187] The previous image may be tagged or named as image I(n−1) from the last loop in which method step 532 has been performed or from the preparation phase.
[0188] A method step 534 follows after method step 532. In method step 534 it may be tested whether the medical imaging procedure is already done, for instance using a timer or another appropriate signal or event.
[0189] If the medical imaging procedure is ongoing, method 500 proceeds with method step 522. Thus, method 500 is in a loop 535 comprising the method steps 522 to 534. During performing this loop 535 the medical imaging process is performed and movement recognition is active.
[0190] The loop 535 may be left in method step 534 only if the medical imaging process is done. In this case, a method step 536 follows immediately after method step 534. In method step 536 the subject may leave the MRI machine 192 or may remove the body part from the MRI machine 192. Subject 408 may leave MRI machine room 290. Subject 408 may be interested in knowing the score that he/she has reached. The score may be communicated to the subject and a reward may be given. Method 500 may end in a method step 540. Method steps 536 and 540 may form an end phase of method 500.
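The overall loop 535 (method steps 522 to 534) can be sketched as follows. All callables are hypothetical stand-ins for the camera device, the movement estimation, the end-of-procedure test and the light control; the score accumulation follows the variant that sums only over-threshold values:

```python
def monitoring_loop(capture_image, movement_between, imaging_done,
                    set_lights, threshold):
    """Sketch of loop 535 (method steps 522 to 534).

    capture_image, movement_between, imaging_done and set_lights are
    hypothetical stand-ins for the hardware described in the disclosure.
    """
    prev_img = capture_image()        # image I(n-1) from the preparation phase
    curr_img = capture_image()        # image I(n)
    n = 1
    score = 0.0
    while True:
        t = movement_between(prev_img, curr_img)    # step 522: movement value T
        if t < threshold:                           # step 524
            set_lights(green=True, red=False)       # step 528 (step 529 follows)
        else:
            set_lights(green=False, red=True)       # step 526: movement too strong
            score += t                              # contribution to overall score
        n += 1                                      # step 530
        prev_img, curr_img = curr_img, capture_image()  # step 532: new image I(n)
        if imaging_done():                          # step 534
            return score                            # leave loop 535
```

The sketch omits the trigger signals to the MRI machine and the optional medical image capture of step 529 for brevity.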
[0197] There may be a connection/bus 610 between processor Pr and memory Mem.
[0198] Further units of calculation unit 600 are not shown but are known to the person skilled in the art, for instance a power supply unit, an optional internet connection, etc. Alternatively, a server solution may be used that uses calculation power and/or memory space available on the internet supplied by other service providers or on an intranet of a company.
Example Communication of Video Signals and of Control Signals Over Optical Fiber
[0199] Although a specific embodiment is given in the following, there are plenty of ways to realize the invention using other communication systems and/or protocols. To provide good performance and an easy-to-use setup in an MRI (Magnetic Resonance Imaging) environment, sending and receiving unit 250 should or may send and receive control signals to/from control unit 230 (for instance comprising a touch screen) through optical connection 284 (fiber), i.e. passing through an electromagnetic waveguide for light. To avoid using multiple optical fibers, a single optical fiber may be used for transmitting both video signals and control signals. Splitting unit 600 located within control unit 230 (touch screen) may combine signals coming from the video output with control signals. Optical signals may be transmitted through transmission channels that operate using for instance transmitters HFBR-1414MZ of Broadcom® and receivers HFBR-2416TZ of Broadcom®. However, other devices of Broadcom® or of other companies may also be used. These electronic circuits allow a nominal bandwidth of up to 125 MHz (Megahertz). Video signals may use a bandwidth of up to 60 MHz (Megahertz). This may leave higher frequencies unoccupied and suitable for control signal transmission. In order to simplify the design, wide bandwidth radio transmitters or transceivers (for example using frequencies of 80 MHz and higher, e.g. ADF7020-1BCPZ of Analog Devices®, or corresponding devices of other companies) may be used to control transmission channels for control signals and/or for video signals. Frequency shift keying may be used to transmit control signals.
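Frequency shift keying, as mentioned for the control signals, encodes each bit as a burst of one of two carrier frequencies. A minimal baseband sketch (with placeholder parameter values far below the actual RF band; a real system would choose carriers away from the 64/128 MHz MRI working frequencies discussed below):

```python
import math

def fsk_modulate(bits, f0, f1, samples_per_bit, sample_rate):
    """Return FSK samples: frequency f0 encodes a 0 bit, f1 encodes a 1 bit.

    All parameter values used by callers of this sketch are illustrative.
    """
    wave = []
    for bit in bits:
        f = f1 if bit else f0
        for k in range(samples_per_bit):
            wave.append(math.sin(2.0 * math.pi * f * k / sample_rate))
    return wave
```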
[0200] It should be noted that MRI scanners may use radio frequencies for their operation and this may lead to noise during the operation of the system. 1.5 T (Tesla) MRI scanners may use frequencies of about 64 MHz while 3 T MRI scanners may operate with radio frequencies of about 128 MHz (for instance 127.734 MHz).
[0201] To avoid any artifacts, it could be necessary to use frequencies above this value, i.e. clearly above the 125 MHz bandwidth of the optical channels. It may be highly recommended to avoid using any signal near the MRI working frequency, as well as signals whose multiples are near the MRI working frequency.
[0202] Radio frequency transmitters, however, may have the advantage of being able to operate at a low signal-to-noise ratio and of having a very high dynamic range. A system that comprises a radio transmitter and optical channels may work even without matching to the transmission line's characteristic, i.e. electrical and/or optical, provided that the system comprises separate receive RX and transmit TX lines of the radio transmitter (transceiver) and/or that the radio transceiver is voltage controlled. The transceiver circuits may be voltage controlled by a microprocessor, for instance using TTL (Transistor-Transistor Logic) technology. Current control may be used only for some components of the system, especially for components other than the transceiver, in order to control current changes more precisely. The minimum dynamic range of the transmission line should or must be about 80 dB (Decibel). Radio transceivers may allow coupling an analog or digital video signal with digital control signals in one fiber without degradation of either signal. Alternatively, a multi-fiber connection may be used. However, more fibers may complicate the connection between control unit 130, 230 (Touch Screen Unit) and sending and receiving unit 250.
[0203] However, another transmission system may be used as well, for instance only based on electrical conductive signal transmission or only based on optical signal transmission.
[0204] In other words, an electronic unit for image processing IPU and a corresponding method for camera based feedback of MRI procedures are disclosed to prevent motion artifacts in medical image data.
[0205] The camera 110 and the image processing unit IPU may be used to track a movement of the head 410 of subject 408 during the MRI procedure. Based on some predefined constraints, system 100, 200 may decide whether the movement may introduce artifacts into the medical image and indicate this to the patient using for instance RGB lights. The colors may have the following meaning: [0206] Green light—ok, maintain this position, [0207] Orange—you are moving slightly, [0208] Red—heavy movement.
[0209] This may be intended to be a kind of gamification for the patient or subject 408 that would help to avoid artifacts during MRI procedures.
[0210] The feature may be performed simply using a computer application on computer 260 connected to system 100, 200 or within one of the disclosed electronic units of system 100, 200, for instance in control unit 130, 230. This is explained in more detail in the following and also above.
[0211] Artifacts caused by motion of subject 408 during an MRI procedure may cause difficulties in the proper recording and/or analysis of MR images, which may be done e.g. for medical diagnostics. In many cases the procedure needs to be repeated in order to obtain proper MRI data. This may increase the time and costs required for the medical imaging procedures. To avoid motion artifacts, various solutions might be applied, including MRI compatible positioning devices, such as immobilizing pads, frames and masks. However, these kinds of solutions still allow micro movements. Moreover, they often cause discomfort to the subject that may lead to an increased amount of micro movements.
[0212] The disclosed solution aims at preventing movement of subject 408 in specific time slots of the procedures using a method for feedback or gamification that uses data from MRI compatible optical camera 110 to identify movement of subject 408 and to inform subject 408 about the level of the movement using preferably an optical communication device. The proposed method may influence the cognitive engagement of the patient, helping him/her to remain still and improving the comfort during an MRI procedure through reduced use of physical immobilizing devices.
[0213] The obvious approach for movement tracking and for adjusting the image acquisition process or the data analysis process would be to use image data that is produced by the MRI machine in order to perform motion tracking. Such methods may help to avoid restricting the patient's movement. However, they often require precise and complex additional apparatus and offer limited capabilities.
[0214] The proposed solution is relatively simple in terms of hardware being used. It is also based on hardware that can be used for multiple purposes. Furthermore, it allows for discomfort-free movement prevention/minimization.
[0215] Description of an embodiment of the invention:
[0216] An electronic unit for image processing may comprise: [0217] a processing module, [0218] an input data port, used to input image data from MRI compatible camera 110 to the processing module, [0219] an output port, used to output the control signal to an output device configured to provide information to the subject undergoing the MRI procedure, e.g. optical output device 120, [0220] a program code implementing, when executed on the processing module, the method for camera based gamification of MRI procedures.
[0221] A method for camera based feedback or gamification of MRI procedures may comprise: [0222] movement recognition and tracking, e.g. of a movement of the head 410 or of another part of the body undergoing the MRI procedure, [0223] classification of the recorded movement level or type into two or more classes, [0224] issuing a control signal for the output device based on the classified movement class, [0225] a method where the patient/subject is instructed to control (minimize) his/her movement to achieve predefined information on the output device constantly over a defined time of the procedure.
[0226] The image processing module IPU may be a dedicated electronic unit 130, 230 working together with components disclosed above. The processing module IPU may comprise an embedded system (e.g. using the same power unit and the same input/output devices as the dedicated electronic unit 130, 230) for computer vision, comprising for instance an image signal processor for image data processing, e.g. a graphics processing unit GPU. The processing module IPU may be integrated in a common housing together with one of the previously disclosed components, preferably with control unit 130, 230 or with receiving unit 250. Alternatively, the processing module IPU may be built into a separate housing connected to control unit 130, 230 via optical fiber or other connection systems, or connected to the receiving unit 250 with an electrical connection or with another appropriate connection, e.g. a USB (Universal Serial Bus) cable.
[0227] Alternatively, the processing module IPU may be a computer, for instance a personal computer 260, connected to receiving unit 250, for instance with a USB cable, with computer program code executing image processing and issuing control commands to the system disclosed above.
[0228] The method for camera based feedback and/or gamification of MRI procedures may comprise movement recognition and tracking based on pixel images that are generated from the analog or digital video from the camera 110. The method may use one of the commonly known methods for optical flow calculation, e.g.: [0229] 1) the ORB feature detector to detect features of the image (Ethan Rublee, Vincent Rabaud, Kurt Konolige and Gary Bradski, "ORB: an efficient alternative to SIFT or SURF," IEEE International Conference on Computer Vision, 2011), preferably combined with [0230] 2) the Lucas-Kanade method to calculate optical flow (B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision.", Proceedings of Imaging Understanding Workshop (1981), pages 121 to 130).
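The ORB/Lucas-Kanade pipeline itself requires a computer vision library. As a deliberately simplified, self-contained stand-in for the general idea of motion estimation, a global shift between two scan lines can be estimated by minimizing the mean absolute difference over candidate offsets:

```python
def estimate_shift(prev_row, curr_row, max_shift):
    """Estimate a global integer shift between two 1-D intensity rows.

    This is a toy substitute for the ORB + Lucas-Kanade pipeline named in
    the text: it searches the offset s that minimizes the mean absolute
    difference between prev_row[i] and curr_row[i + s].
    """
    best_shift, best_cost = 0, float("inf")
    n = len(prev_row)
    for s in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:                    # compare only overlapping pixels
                cost += abs(prev_row[i] - curr_row[j])
                count += 1
        cost /= count
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

A production implementation would instead track many sparse features in 2-D, as the cited ORB and Lucas-Kanade references describe.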
[0231] Alternatively, a method of frame subtraction may be used. This means that the corresponding pixel values of two successive pixel images are subtracted from one another to get a difference pixel image which may preferably show the movement directly.
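A minimal sketch of the frame subtraction, assuming frames are given as lists of pixel rows with equal dimensions:

```python
def frame_difference(prev_frame, curr_frame):
    """Pixel-wise absolute difference of two successive frames.

    Nonzero entries in the returned difference image indicate locations
    where something moved between the two frames.
    """
    return [[abs(p - c) for p, c in zip(prev_row, curr_row)]
            for prev_row, curr_row in zip(prev_frame, curr_frame)]
```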
[0232] The signal that is coming from the camera may be an analog signal or a digital signal. There may be at least two ways of further processing: [0233] 1. The analog signal may be transmitted to technical room 294, for instance to receiving unit 250. The analog signal may be transformed into a pixel image by a frame grabber device and may be processed by a dedicated image processor or by a computer 260 with dedicated image processing software. [0234] 2. In the second option, the analog signal may be digitized by a circuit in the control unit and the digitized image data may be processed by an image processor integrated with the control unit 230.
[0235] In both cases a program or dedicated hardware, especially hardware without a processor that performs instructions of a program, may analyze the movement of subject 408 and may, based on the analysis results, issue at least one corresponding command to the optical output device.
[0236] To avoid detecting movements which are irrelevant to the quality of the MR image, in some of the procedures a calibration procedure may be used. Examples of irrelevant movements are: eye blinking, lip movements, chin movement, yawning, nose wrinkling movement, eyebrow movement, cheek movement, etc. The calibration may be done manually. However, an automatic or semi-automatic calibration procedure may comprise automatic face feature detection, see for instance: [0237] 3) Fernando De la Torre, Wen-Sheng Chu, Xuehan Xiong, Francisco Vicente, Xiaoyu Ding, Jeffrey Cohn, "IntraFace", IEEE Int. Conf. Autom. Face Gesture Recognition Workshops, 2015.
[0238] The recognition of eye regions and/or lip regions or areas, etc. may be followed by masking of the detected face features. Alternatively, an MRI technician or an operator may manually tag irrelevant features among the detected features through a graphical user interface GUI, using for instance a touchscreen of control unit 130, 230 or a computer 260 connected to the disclosed system 100, 200. Alternatively, an MRI technician or an operator may manually select the image area to be ignored while detecting moving features.
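The masking of detected but irrelevant face features (or of a manually selected area) can be sketched as zeroing rectangular regions of the frame before the movement analysis; the box format `(row0, row1, col0, col1)` is an assumption standing in for the output of a face-feature detector or of a manual selection:

```python
def mask_regions(frame, regions):
    """Return a copy of the frame with the given boxes zeroed out.

    frame: list of pixel rows; regions: iterable of (row0, row1, col0, col1)
    half-open boxes. Zeroed pixels no longer contribute to frame differencing.
    """
    masked = [row[:] for row in frame]      # do not modify the input frame
    for r0, r1, c0, c1 in regions:
        for r in range(r0, r1):
            for c in range(c0, c1):
                masked[r][c] = 0
    return masked
```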
[0239] The method may further comprise a classification of the detected movement into two or more classes according to characteristics of the movement, including movement speed, displacement etc. In one example, two classes may be used: [0240] with one class defined as “movement within allowed range”, and [0241] the second class defined as “movement out of the allowed range”.
[0242] In a second example, three classes may be used: [0243] first class defined as “no movement”, [0244] second class defined as “light movement”, and [0245] third class defined as “excessive movement”.
[0246] Alternatively, only one, more than two or more than three classes may be used to distinguish various types and intensities of the movement.
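The three-class variant of the movement classification might be written as follows; the two threshold values are illustrative assumptions:

```python
def classify_movement(t, light_th=0.1, excessive_th=0.5):
    """Map a movement value T to one of the three classes of paragraph [0242].

    The threshold values light_th and excessive_th are placeholders; in
    practice they would be tuned to the tolerances of the MRI sequence.
    """
    if t < light_th:
        return "no movement"
    if t < excessive_th:
        return "light movement"
    return "excessive movement"
```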
[0247] The method may further comprise issuing a control signal to the output device 120.
[0248] The control signal may depend on the detected movement class. The control signal may preferably control optical output device 120, 220 to provide the patient with information about the movement level. The information may be presented as various colors of, for instance, an LED (light emitting diode) light produced by optical output device 120, 220, e.g. green for class "no movement", yellow for class "light movement", red for class "excessive movement". However, the colors may be adjusted according to the number of classes and/or the patient's requirements, e.g. to make colors distinguishable by the patient in case of e.g. color blindness or other physiological, anatomical or psychological conditions. Alternatively, other methods of providing information through optical output device 120, 220 or another output device may be used, such as various frequencies of LED blinking, changing the intensity of the generated light or fluent changes of the LED color.
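The mapping from movement class to LED color, including an adjustable palette for e.g. color-blind patients, might be sketched as follows (the alternative palette's colors are illustrative):

```python
# Default palette as named in paragraph [0248].
DEFAULT_PALETTE = {"no movement": "green",
                   "light movement": "yellow",
                   "excessive movement": "red"}

# A hypothetical adjusted palette for patients who cannot distinguish
# the default colors; the concrete colors are assumptions.
ADJUSTED_PALETTE = {"no movement": "blue",
                    "light movement": "white",
                    "excessive movement": "orange"}

def led_color(movement_class, palette=DEFAULT_PALETTE):
    """Return the LED color to signal for a given movement class."""
    return palette[movement_class]
```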
[0249] The disclosed method may comprise a task for the patient to minimize his/her movements in order to keep the information provided by output device 120, 220 as close to the desired one as possible. Thus, the method may establish a movement based biofeedback method to minimize MRI artifacts. The method may gamify the MRI procedure for the patient with a strategy to reward the patient with positive information if he or she achieves a low movement score during the procedure.
[0250] As mentioned above, in one of the embodiments the image processing module or unit IPU may be integrated in a common housing with the control unit 130, 230. In control unit 130, 230 there may be an integrated LCD (liquid crystal display) video processor to digitize the analog signal from the camera. This processor may be responsible for controlling the control unit's screen through its embedded TFT (thin film transistor) panel support. The signals dedicated for the TFT panel may be used simultaneously as input signals to the image processing module or unit IPU, where they can be processed using image processing algorithms. This is only one example. Various other scenarios may be considered. The main scenario may be to use the image processing module or unit IPU connected to the receiving unit 250, i.e. a unit that is outside of the MRI room 290.
[0251] Benefits and capabilities are: [0252] avoiding motion artifacts in MR images saves costs and time for patients and for clinics, [0253] more comfort and ease for the subject 408 during MR imaging, [0254] faster and more cost efficient movement detection compared to motion detection that is based on MRI related medical image data, i.e. motion detection in real time, e.g. in less than 100 milliseconds or in less than 2 milliseconds, [0255] less involvement of the MRI technician in the obligation to follow the patient's movement.
[0256] Other technical aspects may be: [0257] functional housing 304 and frame 400 design may meet medical standards, and/or [0258] lightweight and easy to install structure, especially housing 304 and/or frame 400, and/or [0259] possibility of convenient hanging of the camera device 110, 210 on the scanner's device 192 gantry, and/or [0260] a tripod or stand or frame 400 that allows one to adjust camera device 110, 210 position as desired, and/or [0261] possibility of directing the camera device 110, 210 at any part of the patient's body.
[0262] Although embodiments of the present disclosure and their advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims. For example, it will be readily understood by those skilled in the art that many of the features, functions, processes and methods described herein may be varied while remaining within the scope of the present disclosure. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the system, process, manufacture, method or steps described in the present disclosure. As one of ordinary skill in the art will readily appreciate from the disclosure of the present disclosure, systems, processes, manufacture, methods or steps presently existing or to be developed later that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present disclosure. Accordingly, the appended claims are intended to include within their scope such systems, processes, methods or steps. The embodiments mentioned in the first part of the description may be combined with each other. The embodiments of the description of Figures may also be combined with each other. Further, it is possible to combine embodiments mentioned in the first part of the description with examples of the second part of the description which relates to