ROBOTIC SYSTEM FOR PERFORMING AN ULTRASOUND SCAN

20230404527 · 2023-12-21

Abstract

A robotic system for performing an automatic ultrasound scan on a body part of a patient. The system comprises a support surface for supporting the body part, a display, and a robotic arm. The robotic arm is configured to hold an ultrasound probe and move the ultrasound probe to obtain the automatic ultrasound scan of the body part supported by the support surface. The display is integrated into the support surface and adapted for supporting at least part of the body part to be scanned.

Claims

1. A robotic system for performing an ultrasound scan on a body part of a patient, the system comprising: a support surface for supporting the body part, a display, and a positioning device configured to hold an ultrasound probe and move the ultrasound probe to obtain the ultrasound scan of the body part supported by the support surface, wherein the display is integrated into the support surface and adapted for supporting at least part of the body part to be scanned.

2. A robotic system according to claim 1, wherein the display is configured to display an instruction pattern.

3. A robotic system according to claim 1, wherein the display is a touch sensitive display.

4. A robotic system according to claim 1, wherein the system further comprises a controller for controlling movement of the positioning device.

5. A robotic system according to claim 1, wherein the robotic system is configured to: detect, via the display, an outline of the at least part of the body part supported by the display.

6. A robotic system according to claim 1, wherein the robotic system is configured to: detect, via the display, movement and/or the presence of the at least part of the body part supported by the display.

7. A robotic system according to claim 1, wherein the system further comprises a 2D sensor and/or a 3D sensor configured to obtain data on the body part of the patient supported by the support surface and/or data on the positioning device with the ultrasound probe.

8. A robotic system according to claim 1, wherein the robotic system is configured to utilize the display in a calibration process.

9. A robotic system according to claim 1, wherein a display surface of the display is configured to support at least part of the body part, wherein the display surface extends within a display plane with an angle to a horizontal plane of 0-75 degrees, preferably 0-60 degrees, even more preferred 0-45 degrees.

10. A robotic system according to claim 1, wherein the robotic system is a movable unit.

Description

BRIEF DESCRIPTION OF DRAWINGS

[0074] In the following description, embodiments of the invention will be described with reference to the schematic drawings, in which:

[0075] FIG. 1 is a block diagram of a robotic system according to an embodiment of the invention;

[0076] FIG. 2 is a schematic perspective view of a robotic system according to an embodiment of the invention;

[0077] FIG. 3 is a schematic side view of the robotic system of FIG. 2; and

[0078] FIG. 4 is a flow diagram of a calibration process for a robotic system according to an embodiment of the invention.

DETAILED DESCRIPTION

[0079] The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for thoroughness and completeness.

[0080] Referring initially to FIG. 1, a block diagram of a robotic system 1 according to an embodiment of the invention is depicted. The robotic system 1 is configured for performing an automatic ultrasound scan on a body part of a patient. The robotic system 1 comprises a support surface 2 for supporting the body part to be scanned. The support surface 2 is adapted for receiving a body part and supporting the body part during a scan performed by the robotic system 1. A display 3 constitutes at least part of the support surface 2. The display 3 is adapted for supporting at least part of the body part to be scanned. Thus, if a patient needs to have a hand scan performed by the robotic system 1, the support surface 2 supports the whole hand, and the display 3, constituting at least part of the support surface 2, may support the fingers or the palm of the hand. The display 3 may fully constitute the support surface 2, in which case the display 3 is adapted for supporting the whole body part to be scanned.

[0081] To carry out the automatic scan the robotic system 1 is provided with a robotic arm 4. The robotic arm 4 is configured to hold an ultrasound probe 8 and move the ultrasound probe 8 to obtain the automatic ultrasound scan of the body part supported by the support surface 2.

[0082] In the shown embodiment the display 3 is provided with a display processing device 6. The display processing device 6 is communicatively connected to a controller 5. The controller 5 is for controlling movement of the robotic arm 4. The controller 5 is furthermore communicatively connected to a robotic arm processing device 7 provided with the robotic arm 4. The controller 5 is furthermore communicatively connected to a 3D sensor 9. The 3D sensor 9 is configured to obtain data regarding the support surface 2, a body part supported by the support surface 2, and/or the robotic arm 4. The controller 5 may receive signals from the display processing device 6, the robotic arm processing device 7, and/or the 3D sensor 9. The controller 5 may transmit signals to the display processing device 6, the robotic arm processing device 7, and/or the 3D sensor 9. Preferably, the controller 5 receives data from the 3D sensor 9 and/or the display processing device 6 and generates a movement instruction for controlling movement of the robotic arm 4. After being generated, the movement instruction is transmitted to the robotic arm processing device 7. After receiving the generated movement instruction from the controller 5, the robotic arm processing device 7 may execute the movement instruction to move the robotic arm 4 in accordance with the received movement instruction.
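
Purely as a non-limiting illustration of the data flow described in paragraph [0082], the following Python sketch shows a controller receiving data from the 3D sensor and the display processing device, generating a movement instruction, and transmitting it to the robotic arm processing device. The class names, method names and interfaces are hypothetical and not part of the disclosure.

from dataclasses import dataclass

@dataclass
class MoveInstruction:
    # Hypothetical target pose of the ultrasound probe: x, y, z in metres and
    # roll, pitch, yaw in radians, expressed in the robot base frame.
    position: tuple
    orientation: tuple

class Controller:
    def __init__(self, display_proc, arm_proc, sensor_3d):
        self.display_proc = display_proc   # display processing device (6)
        self.arm_proc = arm_proc           # robotic arm processing device (7)
        self.sensor_3d = sensor_3d         # 3D sensor (9)

    def step(self):
        # Receive data from the 3D sensor and the display processing device.
        body_surface = self.sensor_3d.read_point_cloud()
        touch_outline = self.display_proc.read_touch_outline()

        # Generate a movement instruction from the received data
        # (the planning itself is outside the scope of this sketch).
        instruction = self.plan_next_probe_pose(body_surface, touch_outline)

        # Transmit the instruction; the arm processing device executes it.
        self.arm_proc.execute(instruction)

    def plan_next_probe_pose(self, body_surface, touch_outline) -> MoveInstruction:
        raise NotImplementedError("scan-path planning not shown in this sketch")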

[0083] Referring to FIG. 2, depicting a schematic perspective view of a robotic system 10 according to an embodiment of the invention. In the shown embodiment the robotic arm 14 is provided as an articulated arm 14. The articulated arm 14 consists of a plurality of joints 141, 142, 143, 144, 145, 146. The articulated arm 14 is connected to a system housing 101 via an end joint 141 connected to a base 148 of the robotic arm. The other end joint 146 of the articulated arm 14 is provided with a holder 147. The holder 147 is for holding an ultrasound probe. During scanning the articulated arm 14 moves the holder 147 to obtain a scan of a body part. The joints 141, 142, 143, 144, 145, 146 of the articulated arm 14 may be rotatable independently of each other, thus allowing the articulated arm 14 to perform a wide variety of movements.

[0084] In the shown embodiment, the body part is supported by the display 13 during scanning. The display 13 constitutes the support surface in the shown embodiment. The display 13 consists of a display housing 131 and a display surface 132. The display surface 132 is for displaying a pattern and/or a colour. The display housing 131 houses the display surface 132. The display surface 132 is provided as a substantially planar surface. The display housing 131 is at one end connected to the system housing 101. The display housing 131 may house a display processing device 6 and/or other electronics. The display 13 is a touch sensitive display 13, i.e. an operator or a patient may give input directly to the robotic system 10 via the display surface 132. Input given to the display surface 132 may be transmitted to a controller 5 of the robotic system 10. The controller 5 is for controlling movement of the articulated arm 14. The display 13 in the shown embodiment is configured to display an instruction pattern 133. In the shown embodiment the instruction pattern 133 is an outline of a body part 133 to be supported by the display 13. The outline of the body part 133 indicates to a patient where to place the body part on the display 13. The outline of the body part 133 is displayed dependent on the patient. For example, if the patient is an adult female whose right hand is to be scanned, the patient or an operator of the robotic system 10 may give an input to the display 13 or the controller 5 indicating this, and the display 13 may then display the corresponding outline of the body part 133.
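
As a non-limiting illustration of how the instruction pattern 133 could be selected from patient input as described in paragraph [0084], a minimal Python sketch follows; the outline library, file names and PatientProfile fields are assumptions made for illustration only.

from dataclasses import dataclass
from typing import Optional

# Hypothetical library of pre-drawn outline images keyed by patient attributes.
OUTLINE_LIBRARY = {
    ("adult", "female", "right_hand"): "outlines/adult_female_right_hand.png",
    ("adult", "female", "left_hand"): "outlines/adult_female_left_hand.png",
    ("adult", "male", "right_hand"): "outlines/adult_male_right_hand.png",
    ("child", None, "right_hand"): "outlines/child_right_hand.png",
}

@dataclass
class PatientProfile:
    age_group: str        # e.g. "adult" or "child"
    sex: Optional[str]    # e.g. "female", "male", or None if not recorded
    body_part: str        # e.g. "right_hand"

def select_instruction_pattern(profile: PatientProfile) -> str:
    """Return the outline image file to show on the display for this patient."""
    key = (profile.age_group, profile.sex, profile.body_part)
    # Fall back to a generic outline for the body part if no specific one exists.
    return OUTLINE_LIBRARY.get(key, "outlines/generic_" + profile.body_part + ".png")

# Example: an adult female patient whose right hand is to be scanned.
pattern_file = select_instruction_pattern(PatientProfile("adult", "female", "right_hand"))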

[0085] Connected to the display housing 131 is a 3D sensor 19. The 3D sensor 19 consists of a sensor stand 192 and a 3D sensor unit 191. The sensor stand 192 is connected to the display housing 131 and extends vertically upwards from the display housing 131. Arranged in the sensor stand 192 is the 3D sensor unit 191. The 3D sensor unit 191 is located vertically above the display 13 in the sensor stand 192. The 3D sensor unit 191 is configured to obtain data on the display 13, a body part supported by the display 13, and/or the articulated arm 14. The data obtained by the 3D sensor unit 191 may be position and/or orientation data of the articulated arm 14 or a body part supported by the display 13. The data obtained by the 3D sensor unit 191 may be 3D data of the body part supported by the display 13, and/or 3D data of the articulated arm 14. The 3D sensor unit may transmit the obtained data to the controller 5 of the robotic system 10. The controller 5 may use the received data for controlling movement of the articulated arm 14.

[0086] The robotic system 10 in the shown embodiment is a movable unit 10. The movable unit 10 is achieved by the robotic arm 14 and the display 13 being connected to the system housing 101. The system housing 101 may house the controller 5 and/or other electronics usable by the robotic system 10. The system housing 101 is mounted on a plurality of wheels 102. The plurality of wheels 102 allows for the robotic system 10 to be moved around.

[0087] The robotic system 10 may comprise the controller 5 for controlling movement of the articulated arm 14. The controller 5 may be housed in the system housing 101 or be located remotely from the rest of the robotic system 10. The controller 5 may receive inputs from the articulated arm 14, the display 13, and/or the 3D sensor 19. The controller 5 may transmit instructions to the articulated arm 14, the display 13, and/or the 3D sensor 19.

[0088] The robotic system 10 is configured to detect, via the display 13, an outline of the body part supported by the display 13. The outline of the body part may be detected by the display 13 being a touch sensitive display 13 capable of detecting an outline of an object contacting the display 13. The outline of the body part may alternatively be detected by the display 13 in conjunction with the 3D sensor 19. The 3D sensor 19 may obtain image data of the display 13 while the display 13 is configured to run a specific pattern and/or colour sequence. The obtained image data may then be transmitted to the controller 5, which is configured to process the image data to determine which pixels deviate from the specific pattern and/or colour sequence and thereby determine the outline of the body part supported by the display 13. The detected outline of the body part may be used by the controller 5 for controlling movement of the articulated arm 14.
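
A minimal Python sketch of the pixel-deviation idea described in paragraph [0088] follows. It assumes the camera image has already been registered (pixel-aligned) with the image shown on the display, and the deviation threshold is an arbitrary example value; neither assumption is part of the disclosure.

import numpy as np

def body_part_mask(displayed: np.ndarray, observed: np.ndarray,
                   threshold: float = 30.0) -> np.ndarray:
    """Return a boolean mask of pixels occluded by the body part.

    displayed: H x W x 3 image currently shown on the display.
    observed:  H x W x 3 camera image of the display, aligned to `displayed`.
    """
    # Pixels whose colour deviates strongly from the displayed pattern are
    # assumed to be covered by the body part.
    deviation = np.linalg.norm(observed.astype(float) - displayed.astype(float), axis=2)
    return deviation > threshold

def outline_pixels(mask: np.ndarray) -> np.ndarray:
    """Return an N x 2 array of (row, col) coordinates on the mask boundary."""
    # A boundary pixel is a masked pixel with at least one unmasked 4-neighbour.
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    boundary = mask & ~interior
    return np.argwhere(boundary)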

[0089] The robotic system 10 is configured to detect, via the display 13, movement and/or the presence of the body part supported by the display 13. Movement and/or the presence of the body part supported by the display 13 may be detected by the display 13 being a touch sensitive display capable of detecting movement and/or the presence of an object contacting the display. Movement and/or the presence of the body part may alternatively be detected by the display 13 in conjunction with the 3D sensor 19. The 3D sensor 19 may be configured to obtain image data of the display 13 and a body part supported by the display 13, while the display 13 is configured to run a pattern and/or colour sequence. Image data obtained by the 3D sensor 19 of the display 13 and a body part supported by the display 13 may be transmitted to the controller 5. The controller 5 may be configured to process the image data to determine, over time, which pixels deviate from the pattern and/or colour sequence. Based on the pixels deviating over time, movement of the body part supported by the display 13 may be determined by the controller 5.

[0090] The presence and/or movement of the body part detected may be used in controlling the robotic arm, e.g. the controller 5 may adapt movement of the articulated arm 14 in real time in accordance with the presence and/or movement detected. Furthermore, the presence and/or movement of the body part may be used by the controller 5 as an emergency stop. The robotic system may be configured to stop a scan if excessive movement of the body part is detected. Excessive movement may be defined as the body part moving more than 1 cm, 2 cm, 3 cm, 4 cm or 5 cm.
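
As a non-limiting illustration of the emergency-stop behaviour described in paragraph [0090], the following Python sketch stops the scan when the measured body-part position drifts more than a threshold from its position at the start of the scan; the 2 cm default is one of the example values given above, and the arm interface and the way the position is measured (touch centroid or 3D sensor) are assumptions.

import numpy as np

EXCESSIVE_MOVEMENT_M = 0.02  # e.g. 2 cm; the text allows 1, 2, 3, 4 or 5 cm

class MovementGuard:
    def __init__(self, arm_proc, threshold_m: float = EXCESSIVE_MOVEMENT_M):
        self.arm_proc = arm_proc      # hypothetical robotic arm interface
        self.threshold_m = threshold_m
        self.reference = None         # body-part position at the start of the scan

    def update(self, body_part_position_m) -> bool:
        """Feed the latest measured position; returns True if the scan was stopped."""
        current = np.asarray(body_part_position_m, dtype=float)
        if self.reference is None:
            self.reference = current
            return False
        displacement = np.linalg.norm(current - self.reference)
        if displacement > self.threshold_m:
            # Excessive movement detected: act as an emergency stop for the scan.
            self.arm_proc.stop_scan()
            return True
        return False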

[0091] Referring to FIG. 3, depicting a schematic side view of the robotic system 10 of FIG. 2. The display 13 extends at an angle A1 to a horizontal plane H. The angle A1 is preferably 0-75 degrees, more preferably 0-60 degrees, and even more preferably 0-45 degrees. The display housing 131 is at one end provided with a display connection structure 132 facilitating the connection between the display 13 and the system housing 101. The display connection structure 132 is provided as an L-shape in the display housing 131, wherein the L-shape is configured to be connected to the system housing 101. Furthermore, the articulated arm 14 comprises a base 148. The base 148 is connected to a connection surface 103 of the system housing 101. The connection surface 103 is downwardly angled relative to the horizontal plane H, with a connection angle A2 of 0-60 degrees, preferably 0-45 degrees, even more preferably 0-30 degrees. The connection angle A2 results in the articulated arm 14 being lowered, thus lowering the effective height of the robotic system 10 and resulting in a more compact robotic system 10.

[0092] Referring to FIG. 4, depicting a flow diagram of a calibration process 20 for a robotic system 1, 10 according to an embodiment of the invention. The first step 21 comprises initiating the calibration process 20. The initiation of the calibration process 20 may be done by a controller 5, which is configured to calibrate the robotic system 10 before a scan of a body part is initiated. The initiation of the calibration process 20 may be done periodically by the controller 5 when no scan of a body part is being carried out. The initiation of the calibration process 20 may also be done in response to an operator inputting to the controller 5 that the calibration process is to be initiated.

[0093] The second step 22 comprises displaying, by the display 3, 13, a calibration pattern and/or a calibration colour. The display 3, 13 may be configured to display the calibration pattern and/or calibration colour in response to receiving a signal from the controller 5 that the calibration process 20 has been initiated. The calibration pattern and/or calibration colour may be a stationary image displayed by the display 3, 13, or a sequence of images and/or patterns displayed by the display 3, 13. The calibration pattern and/or calibration colour may be used for displaying one calibration point, or a plurality of calibration points, for use in calibrating the robotic system.
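
As an illustration of the second step 22, the following Python sketch renders a calibration image containing a plurality of calibration points; the number, size and placement of the points, and the display resolution, are assumptions made for illustration only.

import numpy as np

def make_calibration_image(width: int, height: int,
                           points_px: list, radius: int = 10) -> np.ndarray:
    """Return an H x W x 3 image: white background with black circular points."""
    image = np.full((height, width, 3), 255, dtype=np.uint8)
    yy, xx = np.mgrid[0:height, 0:width]
    for (cx, cy) in points_px:
        # Draw a filled black circle at each calibration point (cx = column, cy = row).
        image[(xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2] = 0
    return image

# Four calibration points near the display corners plus one at the centre.
points = [(100, 100), (1820, 100), (100, 980), (1820, 980), (960, 540)]
calibration_image = make_calibration_image(1920, 1080, points)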

[0094] The calibration process 20 may be carried out to calibrate different components of the robotic system 1, 10. If a robotic arm 4, 14 of the robotic system 1, 10 is to be calibrated, a third step 23 and a fourth step 24 may be carried out. If a 3D sensor 9, 19 of the robotic system 1, 10 is to be calibrated, a fifth step 25 and a sixth step 26 may be carried out instead of the third step 23 and the fourth step 24. Alternatively, both the robotic arm 4, 14 and the 3D sensor 9, 19 may be calibrated in parallel, in which case the third step 23 and the fourth step 24 are carried out in parallel with the fifth step 25 and the sixth step 26.

[0095] The third step 23 comprises moving the robotic arm 4, 14 to contact the calibration pattern and/or calibration colour displayed by the display 3, 13. The calibration of the robotic arm 4, 14 may be performed by an operator moving the robotic arm 4, 14 to contact one or more calibration points defined by the calibration pattern and/or calibration colour. The operator may specifically move an ultrasound probe 8 held by an end piece 146 of the robotic arm 4, 14 to contact the one or more calibration points. Alternatively, the movement of the robotic arm 4, 14 to contact the one or more calibration points may be determined by the controller 5. The controller 5 may transmit a movement instruction comprising a determined movement to the robotic arm 4, 14 to move the robotic arm 4, 14 according to the determined movement. The controller 5 may determine the movement in conjunction with the 3D sensor 9, 19 configured to obtain image data of the support surface 2 and image data of the robotic arm 4, 14 with the ultrasound probe 8. The 3D sensor 9, 19 may transmit the image data to the controller 5. The controller 5 is configured to determine, based on the image data received from the 3D sensor 9, 19, where the robotic arm 4, 14 is relative to the one or more calibration points. The controller 5 may then generate a movement instruction based on the location of the robotic arm 4, 14 relative to the one or more calibration points, the movement instruction comprising the movement needed for the robotic arm 4, 14 to reach the one or more calibration points. The movement instruction is transmitted to the robotic arm 4, 14, which then executes the movement. The movement instruction generated by the controller 5 may be a movement instruction for moving the end piece 146 of the robotic arm 14, the holder 147 of the robotic arm 14, or an ultrasound probe 8 held by the robotic arm 14 towards the one or more calibration points.
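
As a non-limiting illustration of how the controller 5 could generate a movement instruction in the third step 23, the following Python sketch computes a small translation step that moves the probe tip towards a calibration point; the coordinate frame, step size and convergence tolerance are assumptions made for illustration only.

import numpy as np

MAX_STEP_M = 0.01  # move at most 1 cm per instruction, an arbitrary safety margin

def movement_instruction(probe_tip_m: np.ndarray,
                         calibration_point_m: np.ndarray) -> np.ndarray:
    """Return a small translation step (dx, dy, dz) towards the calibration point.

    Both inputs are 3D positions in the same frame, e.g. the robot base frame,
    as estimated from the 3D-sensor image data.
    """
    error = calibration_point_m - probe_tip_m
    distance = np.linalg.norm(error)
    if distance < 1e-4:          # within 0.1 mm: treat the point as reached
        return np.zeros(3)
    step = min(distance, MAX_STEP_M)
    return error / distance * step

# The controller would repeat this until step 24 verifies the point is reached:
#   delta = movement_instruction(probe_tip, point)
#   arm_proc.execute_translation(delta)   # hypothetical arm interface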

[0096] The fourth step 24 comprises verifying that the robotic arm 4, 14 has reached the one or more calibration points defined by the calibration pattern and/or calibration colour. The verification may be done by the operator inputting to the robotic system that the robotic arm 4, 14 has reached the one or more calibration points. Alternatively, the controller 5 may verify whether the robotic arm 4, 14 has reached the one or more calibration points by analysing the image data received from the 3D sensor 9, 19.

[0097] The fifth step 25 comprises obtaining image data by the 3D sensor 9, 19 and transmitting the obtained image data to the controller 5.

[0098] The sixth step 26 comprises correlating, by the controller 5, the received image data to one or more known parameters of the calibration pattern and/or colour displayed by the display 3, 13. The one or more known parameters may be stored in a data storage communicatively connected to the controller 5. The correlation by the controller 5 may be performed to obtain a transformation matrix allowing mapping of 3D coordinates in space to 2D image coordinates.
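
As an illustration of the correlation in the sixth step 26, the following Python sketch estimates a 3x4 transformation (projection) matrix mapping 3D coordinates to 2D image coordinates by a direct linear transform over known 3D-2D calibration point correspondences; the direct linear transform is one possible way to obtain such a matrix and is not mandated by the disclosure.

import numpy as np

def estimate_projection_matrix(points_3d: np.ndarray, points_2d: np.ndarray) -> np.ndarray:
    """points_3d: N x 3, points_2d: N x 2, with N >= 6. Returns a 3 x 4 matrix P
    such that [u, v, 1]^T ~ P @ [X, Y, Z, 1]^T (up to scale)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    # The solution is the right singular vector of A with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def project(P: np.ndarray, point_3d: np.ndarray) -> np.ndarray:
    """Map a 3D point to 2D image coordinates with the estimated matrix."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]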

[0099] Additionally, variations to the disclosed embodiments can be understood and effected by the skilled person in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.