Fitting System
20200197195 · 2020-06-25
Inventors
CPC classification
A61B90/06 (HUMAN NECESSITIES)
A61F2002/7635 (HUMAN NECESSITIES)
A61F2/5046 (HUMAN NECESSITIES)
A61F2/76 (HUMAN NECESSITIES)
A61B2562/0295 (HUMAN NECESSITIES)
A61B5/0073 (HUMAN NECESSITIES)
A61B2562/0219 (HUMAN NECESSITIES)
International classification
A61F2/76 (HUMAN NECESSITIES)
A61B5/00 (HUMAN NECESSITIES)
Abstract
A method and apparatus to scan and measure the surface of objects using a radiation source, such as a laser. When combined with bio-data from bio-sensors to create a profile, the invention is particularly useful for measuring the internal surface of a prosthetic socket and improving the fit between the socket and the residual limb.
Claims
1. A method of identifying the differences in shape and the physical contact characteristics between an object and a body part which is engageable with the object comprising: scanning the object with radiation in order to produce a surface map of the object; attaching a plurality of bio-sensors to at least one of a surface of the object or a surface of the body part at locations which are known relative to a reference point; engaging the body part with the object; collecting bio-sensor data from the bio-sensors to record information on the engagement between the body part and the object over the surface of the object; and superimposing the data from the bio-sensors over the surface map of the object in order to identify areas of the object which need to be adjusted in order to improve the fit of the body part with the object.
2. The method of claim 1, wherein the object is one of a prosthetic socket, an orthotic article, an article of furniture, or a wheelchair, and the body part is one of a stump of an amputated limb, a stump of an amputated foot, a complete limb, a skin of a patient, a bottom sitting on a wheelchair, a back lying on a bed, or a liner covering at least part of the body part.
3. (canceled)
4. The method according to claim 1, wherein the scanning comprises: projecting a radiation pattern onto the surface of the object with a radiation source at a first distance from the surface of the object; taking as image data an image of the radiation projected onto the surface using at least one capturing element which is in a fixed and known position relative to the radiation source; analyzing the image data from the capturing element to identify a position in three dimensions of each point illuminated by the projected radiation; varying the distance of the radiation source from the surface in order to change the parts of the surface illuminated by the projected radiation; using the data from the capturing element in order to identify the position of each new point illuminated by the projected radiation; and repeating until all points on the surface have been scanned.
5. The method according to claim 2, wherein the locations of the plurality of bio-sensors are determined by illuminating the plurality of bio-sensors and capturing image data.
6. The method according to claim 2, wherein the scanning is carried out using a radiation pattern in the form of a projected laser pattern.
7. The method according to claim 2, wherein the collected data comprise at least one of pressure data or temperature data at the locations.
8. The method according to claim 2, further comprising displaying a 3D model of the mapped surface to a user with the bio-sensor data superimposed thereon.
9. The method according to claim 1, further comprising projecting onto the surface of the object a pattern which is calibrated to the surface map generated by a computer so as to enable a user easily to correlate the actual surface with the virtual surface and hence easily identify on the real surface areas of the object which need adjusting based on the bio-sensor information.
10. The method according to claim 8, wherein the object is a prosthetic socket and the adjusting comprises forming an interior surface of the prosthetic socket.
11. The method according to claim 1, further comprising using data from at least one motion unit to identify areas of the object which need to be adjusted in order to improve the fit of the body part with the object during movement.
12. An apparatus for identifying the differences in shape and physical contact characteristics between an object and a body part which is engageable with the object, the apparatus comprising: a radiation source for scanning the surface of the object in order to produce a surface map thereof; an adjuster for varying at least one of a distance or orientation of the radiation source from the surface of the object; a plurality of bio-sensors attachable to at least one of the surface of the object, the surface of the body part, or a liner covering the body part, at locations which are known relative to a reference point; data collectors connected to the plurality of bio-sensors for collecting bio-sensor data from the plurality of bio-sensors; a data processing device adapted for superimposing the bio-sensor data onto the surface map to produce a bio-data profile map of the object; and a display for displaying the bio-data profile map to a technician.
13. The apparatus of claim 12, wherein the plurality of bio-sensors are arranged as a bio-sensor strip comprising plastic films, power leads and data leads, and a power and data connector, wherein the plurality of bio-sensors are disposed on the bio-sensor strip (821, 921), at least one power lead and data lead is connected to one or more of the plurality of bio-sensors, and the at least one power lead and data lead are placed on the bio-sensor strip and are in contact with an interface component itself connected with a power supply and the data processing device.
14. The apparatus according to claim 12, wherein the radiation source is a conical laser assembly which comprises a single point laser and an optical element which converts the single laser beam into a two-dimensional pattern.
15. (canceled)
16. (canceled)
17. An apparatus according to claim 12, further comprising a capturing element associated with, and in a known position relative to, the radiation source, for capturing the pattern projected on the surface and measuring the distance between a plurality of points illuminated by the radiation source and the capturing element, and a data processing device for processing the data from the capturing element and converting the data into a map of the surface.
18. An apparatus according to claim 17, wherein the capturing element is arranged in a fixed position relative to the radiation source, the distance from the laser being known, and moves with the radiation source towards and away from the surface.
19. (canceled)
20. The apparatus according to claim 12, further comprising a plurality of light sources for illuminating the surface of the object to identify locations of the plurality of biosensors.
21. (canceled)
22. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0040] In order that the invention may be well understood, there will now be described some embodiments thereof, given by way of example, reference being made to the accompanying drawings.
DETAILED DESCRIPTION OF THE INVENTION
[0056] The invention will now be described on the basis of the drawings. It will be understood that the embodiments and aspects of the invention described herein are only examples and do not limit the protective scope of the claims in any way. The invention is defined by the claims and their equivalents. It will be understood that features of one aspect or embodiment of the invention can be combined with a feature of a different aspect or aspects and/or embodiments of the invention.
[0061] The camera 211 is mounted above a single point laser device 213 which projects a conventional laser beam 236 onto a conical mirror 215. The laser 213 is arranged to focus the laser beam 236 on the vertex of the conical mirror 215, and the mirror surface of the conical mirror 215 reflects the laser beam 236 outwards from the plane of the base of the mirror 215 so as to project the laser line 201 extending from the plane of the base of the mirror 215. The scanned object, a prosthetic socket 216, is mounted on a fixing base 217 which does not move, so that the scanned object remains stationary. A physical origin coordinate 212, identified by a circled cross and placed on the surface of the socket 216, provides a spatial reference point which will be useful to orient and align the physical socket 216 with the virtual 3D model of the socket 216 and with the 3D profile of the bio-data.
[0062] In use, the devices support frame 210 moves vertically, starting from a top position where the laser 213 focuses its laser beam 236 on the conical mirror 215 and begins to scan the top of the socket 216. A line of laser light, the perimeter of which is represented by the line of points 201, is projected on the internal area of the socket 216, whereupon the process previously described of laser line acquisition, segmentation, calibration, distance-to-reference-point calculation and line coordinates calculation is performed. These data are stored in a computer (not shown) and the motor 207 turns the linear screw 208, which in turn moves the moving support assembly 240 to the next incremental position, thereby lowering the devices support frame 210 one unit of movement (typically in increments of 5 mm, but this is not limiting of the invention). The entire process is repeated in a new data acquisition stage until the entire internal surface of the socket 216 is scanned and mapped. At the conclusion of the process the data corresponding to each slice of socket surface are joined and a full 3D map of the lines is formed, thus rendering a virtual image of the socket 216 as previously shown.
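The slice-by-slice acquisition loop of paragraph [0062] can be sketched as follows. This is a minimal simulation, not the patented apparatus: `acquire_slice` is a hypothetical stand-in for the laser-line acquisition step, and the socket interior is idealised as a cylinder of 40 mm radius.

```python
import math

def acquire_slice(z_mm, points_per_ring=36, radius_mm=40.0):
    """Simulate one laser-line acquisition at height z_mm: return a ring
    of (x, y, z) points on the (idealised, cylindrical) socket wall."""
    return [(radius_mm * math.cos(2 * math.pi * k / points_per_ring),
             radius_mm * math.sin(2 * math.pi * k / points_per_ring),
             z_mm)
            for k in range(points_per_ring)]

def scan_socket(depth_mm=150.0, step_mm=5.0):
    """Lower the laser/camera frame one increment at a time (5 mm here,
    as in paragraph [0062]) and join the per-slice data into one 3D map."""
    surface_map = []
    z = 0.0
    while z <= depth_mm:
        surface_map.extend(acquire_slice(z))  # acquire, then step the motor
        z += step_mm
    return surface_map

points = scan_socket()
```

Joining the slices in this way is exactly the "full 3D map of the lines" described above; a real implementation would replace `acquire_slice` with the segmentation and coordinate calculations performed on the camera image.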
[0063] It will be appreciated that there is a blind spot in the mapping of the socket 216 caused by the devices support frame 210, the arms of which block the field of view of the camera 211. The hidden points on the surface of the socket 216 may be acquired by installing a decoupling mechanism (not shown) between the moving support assembly 240 and the devices support frame 210, which allows the support arms of the devices support frame 210 to rotate sufficiently for the previously hidden portion of the laser line 201 to become visible to the camera 211 while the camera 211 stays in the same position.
[0064] Strips of bio-sensors 219 are arranged on the inside of the socket 216 and record various biomedical parameters. These bio-sensors 219 are described in more detail below.
[0066] The laser 313 and the camera 311 are mounted on the device supporting frame 310 that is attached to a moving support assembly 340. The moving support assembly 340 is connected to a linear screw 308 by a bushing 309 so as to be moveable towards and away from the scanned socket 316 (most commonly vertically) along the longitudinal axis of the socket. The linear screw 308 is attached to the anchor frame 339, which will be attached to a solid, immovable surface or point such as a wall or an appropriate floor-standing apparatus housing 341. The linear screw 308 will be attached to a motor 307, which will rotate the linear screw 308 leading to a movement of the bushing 309 and consequently of the moving support assembly 340 and of all elements (310, 313, 311, 315) connected thereto.
[0067] The capturing element in the form of the camera 311 is mounted in a fixed position relative to the laser 313 and moves with the laser 313 and the optical element 315 as a single assembly.
[0068] In use, the laser 313 and the camera 311 move together to scan and map the interior surface of the socket 316. The optical element 315 diffracts the laser light so as to produce a projected laser cone 301 on the surface of the scanned socket 316, whereupon the process previously described of line acquisition, segmentation, calibration, distance-to-reference-point calculation and line coordinates calculation is performed. These data are stored in a computer (not shown) and the motor 307 moves to the next incremental position, thereby moving the device supporting frame 310 one unit of movement (typically, but not limiting of the invention, 5 mm), and the entire process is repeated until the entire surface of the socket 316 is scanned and mapped. At the conclusion of the process the data corresponding to each slice of socket surface are joined and a full 3D map of the lines is formed, thus rendering a virtual image of the socket 316 as previously shown.
[0070] A third aspect of the invention uses two cameras 311 arranged as a stereo pair, as will now be described.
[0074] Similar issues occur with the third aspect of the invention.
[0075] The relative orientation/position and origin of the fields of view of the cameras 311 are known and thus, by identifying a given physical reference (e.g. a point 301 of the projected laser beam) in the image captured by both cameras 311, it is possible to infer the relative position of the point 301 of the projected laser beam 337 to the reference axis. The reference axis has an origin between the cameras 311. The same physical reference of the point 301 captured by both of the cameras 311 is represented by different pairs of pixels in the two-dimensional images taken by the cameras 311, and it is this difference, combined with the known position and orientation between the cameras 311, that enables the calculation of the three-dimensional position of the points 301 on the projected laser beam 337.
[0076] To find the relative position between both of the cameras 311 (after being placed in the camera mount), several images of a given reference figure should be captured simultaneously by both cameras 311, at different distances and orientations relative to the cameras 311. For example, the given reference figure could be a chessboard, but this is not limiting of the invention. By identifying key points in this reference figure in both of the captured images (either by manual picking or automatic processing), and knowing the real physical distances between the key points in the reference figure, it is possible to mathematically determine the relative position between the fields of view of both of the cameras 311.
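The pixel-pair-to-3D-position reasoning of paragraphs [0075] and [0076] can be illustrated with a minimal numpy sketch. The camera matrices and the laser point below are toy values chosen for illustration, not parameters of the apparatus; `triangulate` implements standard linear (DLT) triangulation from two known camera matrices.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover the 3D point whose projections through camera matrices P1
    and P2 are the pixel pairs uv1 and uv2.  The difference between the
    two pixel positions, combined with the known relative pose of the
    cameras, is what fixes the depth."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)       # null space of A is the solution
    X = Vt[-1]
    return X[:3] / X[3]               # de-homogenise

# Two toy cameras: identical intrinsics, the second shifted 100 mm in x.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), [[-100.0], [0], [0]]])

X_true = np.array([30.0, -20.0, 500.0])        # a laser point, in mm
def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noise-free pixel coordinates the recovered point matches the true point to machine precision; in practice the calibration described below supplies P1 and P2.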
[0080] It will be appreciated that the use of the two cameras 311, 511, 711 in the third aspect of the invention means that both cameras 311, 511, 711 need to be calibrated in order to know the precise relative position and pose between the two cameras 311, 511, 711 and the lens distortion in each of the two cameras. These parameters are always different due to manufacturing variability.
[0081] To calibrate the cameras 311, 511, 711 and to compensate for differences in the lens parameters of the cameras 311, 511 and 711, a method based on Zhang, "A Flexible New Technique for Camera Calibration", IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330-1334, 2000, is used. This method requires a custom-designed camera mount which holds both cameras 311, 511, 711 and the laser 313, 513, 713, i.e. the same laser that will be used in the laser scanner system. In the calibration, a chessboard design placed on a flat surface is used as a reference figure. The dimensions of the squares of the chessboard are known. A number of pictures are taken simultaneously from both of the cameras 311, 511, 711, in which the chessboard is placed at different distances from the cameras 311, 511, 711 and at different orientations.
[0082] The corners of the chessboard in every pair of images taken from the cameras 311, 511, 711 are detected. The number of squares of the chessboard is known and thus it is simple to match the corners detected in both of the stereo images taken from the two cameras 311, 511, 711. The chessboard plane is considered to be at z=0, leaving the problem to be solved only in this plane (with the origin in one of the corners of the chessboard).
[0083] Each of the images is automatically processed in order to find the chessboard patterns, acquiring one conversion from the image corner pixels to the real 3D positions of the chessboard. This enables the computation of the intrinsic lens parameters for each of the cameras 311, 511, 711 (i.e. distortion coefficients), by minimizing the 2D-to-3D re-projection error over all images. After carrying out this calculation, it is possible to use these 2D-3D correspondences to calculate the transformation matrix between the images from the two cameras 311, 511, 711.
[0084] This calibration enables the computation of the matrix which projects the images of both cameras 311, 511, 711 onto a common image plane, i.e. it rectifies the images. This makes it easier to find correspondences between the stereo images, because it aligns the images in such a way that, theoretically, it is only necessary to search along a single line (if the calibration is accurate, correspondent points are in the same row of the rectified images). After constructing the undistorted and coplanar image planes, the 3D reconstruction can be achieved by triangulation.
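For rectified images, the single-row search described above yields depth directly from the horizontal disparity between the matched pixels, Z = f·B/d. A one-line sketch (the focal length, baseline and disparity values below are illustrative assumptions, not figures from the document):

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth of a rectified-stereo correspondence: Z = f * B / d,
    with f in pixels, the baseline B in millimetres and the
    horizontal disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: no valid match")
    return focal_px * baseline_mm / disparity_px

# e.g. an 800 px focal length and a 100 mm baseline: a 16 px disparity
# corresponds to a point 5000 mm from the cameras.
```

Nearby points produce large disparities and distant points small ones, which is why the accuracy of the calibration (and hence of the row alignment) matters most for far surfaces.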
[0087] The power leads and data leads 820, made, for example, of silver ink, are not illustrated here.
[0088] Side B1 of the polymer film 825 carries an adhesive finish 826.
[0089] The side A1 of the plastic film 824 will have one or more markings 830 on the surface. These markings are illuminated by the light source 220, 320 to locate the bio-sensors on the inside of the socket 216, 316 as explained previously. In one non-limiting aspect, the markings are multi-coloured roundels (concentric circles). The different colours are used to indicate differing positions or the different types of bio-sensors within the socket 216, 316. Currently, at least two markings per strip are required to uniquely identify the position of the bio-sensor strip 819, but a single marking could be acceptable.
[0091] Before use, the bio-sensor strips 819 are covered on the adhesive side with a peelable cover if dispensed in pre-cut strips, or not covered if they are dispensed in rolls. In use, the peelable covers are removed, the bio-sensor strips 819 are cut to the right size, if needed, and the bio-sensor strips 819 are applied to the internal surface of the prosthetic socket 216, 316, in the orientation best determined by the prosthetic fitting technician's experience.
[0092] The bio-sensor strips 819 can measure different types of bio-data, which include but are not limited to pressure between the stump skin and the prosthetic socket surface, or temperature. In one non-limiting example, the bio-sensor strips are formed of a force-sensing resistor comprising at least one polymer layer whose resistance varies on application of a pressure. The change in the resistance can be measured, for example, by a bridge circuit. In one aspect of the bio-sensor, a plurality of polymer layers with different characteristics is used to allow a wide range of different pressures to be measured.
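One simple way to read such a force-sensing resistor is a voltage divider against a fixed reference resistor (the text mentions a bridge circuit; the divider below is a simpler illustrative stand-in, and all component values are assumptions):

```python
def fsr_resistance(v_out, v_ref=3.3, r_fixed=10_000.0):
    """Resistance of a force-sensing resistor read through a voltage
    divider: V_out = V_ref * R_fixed / (R_fsr + R_fixed), hence
    R_fsr = R_fixed * (V_ref - V_out) / V_out.  Pressure on the
    polymer layer lowers R_fsr and therefore raises V_out."""
    if not 0 < v_out < v_ref:
        raise ValueError("reading outside the divider's range")
    return r_fixed * (v_ref - v_out) / v_out
```

Converting resistance to an absolute pressure would need a per-sensor calibration curve, but as noted below absolute values are not required by the method.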
[0094] Furthermore, the spacing between the bio-sensor strips 921 can be adjusted to vary the resolution of the data obtained: the bio-sensor strips 921 can be arranged closer together, or even on top of each other with an offset, in areas where greater data resolution is required. When evaluating pressure, a correct fit between the stump 932 and the socket 916 will produce a uniform pressure distribution across certain areas of the surface of the socket 916, depending on the socket type, while a poor fit will produce areas of altered pressure, evidenced by more concentrated curves in the bio-data profile 929 in zones where this should not occur.
[0095] It will be appreciated that absolute values from the bio-sensors are not required. The values can be normalised or otherwise mathematically manipulated with respect to the maximum value recorded.
[0096] Artificial colour may be added to the bio-data profile 929 to create a heat map and thus illustrate the areas of pressure. Shifts in the colour may be used to differentiate between equally uncomfortable areas of increased or reduced pressure, such as red for higher pressure and blue for lower pressure. The prosthesis technician can therefore identify high-pressure areas of the socket 916 which need fine tuning and shaping back, as well as areas of lower pressure which indicate regions of the socket 916 which need building up. Other types of bio-data of interest may be represented using the same method.
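Since only relative values matter (paragraph [0095]), the normalise-then-colour step might be sketched as follows; the linear red-to-blue ramp is an assumption for illustration, not a scheme specified in the document:

```python
def heat_colour(value, max_value):
    """Normalise a bio-sensor reading against the maximum recorded value
    (absolute calibration is not needed) and map it linearly to an RGB
    heat-map colour: blue at zero, red at the maximum."""
    v = value / max_value                        # normalised 0.0 .. 1.0
    return (int(255 * v), 0, int(255 * (1 - v)))  # (R, G, B)

readings = [12.0, 48.0, 24.0, 6.0]               # arbitrary example pressures
colours = [heat_colour(r, max(readings)) for r in readings]
```

The resulting per-sensor colours would then be interpolated over the bio-data profile curves 929 to give the technician the heat map described above.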
[0097] The arrangement of the bio-sensors 919 on the bio-sensor strips 921 enables the system to be wearable, non-invasive, autonomous (with long battery time), modular, flexible (with different placement of sensors), scalable (more sensors as needed), and versatile (different type of modules/sensors).
[0098] The images of the bio-sensor strips 921 are drawn on the surface of the stump 932 to indicate their location in relation to the stump 932. The real origin coordinates 912 are actually or virtually drawn on the scanned socket 916 and the scanning and data acquisition apparatus produces the 3D socket surface map 906.
[0099] An example can serve to illustrate this in more detail. Suppose the prosthesis is an artificial leg or an artificial arm. In use, the patient with the artificial leg is made to walk (in the case of a leg) or move (in case of an artificial arm) for a certain amount of time until sufficient bio-sensor data has been obtained from the bio-sensors 919 to produce the virtual 3D bio-data profile 930 comprising bio-data profile curves 929 of the pressure, temperature or any other bio-data of interest. The position of these bio-data profile curves 929 is known by reference to virtual origin coordinates 905 of the 3D bio-data profile 930.
[0100] It is also possible to combine the bio-sensor data with data from one or more inertial motion units carried by the patient and attached to the limb. The inertial motion unit will have three, six or nine axes and provides information about the changes in the data as the patient moves. This data can be used to characterise potential gait anomalies.
[0104] Furthermore, a light grid may be projected onto the patient's stump or over the surface of the actual socket 916 which is calibrated to the 3D socket model so as to help the prosthetic fitting technician to correlate the 3D image with the actual socket surface of the socket 916 and hence help identify the areas that need adjustment on the actual socket surface.
[0106] The purpose of the layering of the various data maps is to evidence the areas where pressure, temperature or other data are more intense, and which may indicate areas of pain or discomfort for the person wearing the prosthesis, or less intense which may indicate areas which lack support.
[0117] The alignment of the socket surface map 1106 and/or the 3D bio-data profile 1130 over the object of interest can be achieved by using the same origin coordinate in the real object (real origin coordinate 1112), in the socket surface map 1106 (virtual origin coordinate 1105) and in the 3D bio-data profile 1130 (virtual origin coordinate 1105), or by resorting to a best-match algorithm that computes the best geometrical match between two three-dimensional objects.
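The best-match alternative can be realised, for example, with the Kabsch/SVD rigid alignment when corresponding points between the two 3D objects are available; this is a sketch under that assumption (without known correspondences an iterative method such as ICP would be needed), not the specific algorithm of the document:

```python
import numpy as np

def best_match_transform(A, B):
    """Rigid (rotation + translation) alignment of point set A onto point
    set B by the Kabsch/SVD method: one way to compute the 'best
    geometrical match' between two 3D objects with known point pairs."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    return R, t

# Toy check: rotate/translate a small cloud, then recover the transform.
rng = np.random.default_rng(0)
A = rng.random((10, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
B = A @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = best_match_transform(A, B)
aligned = A @ R.T + t
```

Applying the recovered rotation and translation superimposes the surface map and the bio-data profile in a common frame, which is exactly what the shared origin coordinate achieves in the first alternative.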
[0118] The method will now be described with respect to the flow diagrams.
[0119] The method for creating the surface map is shown in the flow diagrams.
[0120] The present invention comprising the use of the bio-sensors, the mapping of the internal surface of the prosthetic socket, the generation of bio-data related to socket/stump fit and the identification of socket areas which require adjustment represents several economic and comfort benefits. The method is non-invasive, unobtrusive and does not require a clinician's attendance. It saves considerable time in the fitting process, thereby reducing cost and increasing the patient's quality of life.
REFERENCE NUMERALS
[0121] 101 Laser Line
[0122] 102 Centre
[0123] 103 Data Points
[0124] 104 Laser line
[0125] 105 Origin coordinates
[0126] 106 Surface map
[0127] 200 Conical laser assembly
[0128] 201 Projected radiation line
[0129] 207 Motor
[0130] 208 Linear screw
[0131] 209 Bushing
[0132] 210 Support frame
[0133] 211 Camera
[0134] 212 Physical origin coordinate
[0135] 213 Laser
[0136] 214 Laser lens
[0137] 215 Conical mirror
[0138] 216 Socket
[0139] 217 Fixing base
[0140] 219 Bio-Sensors
[0141] 220 Radiation source
[0142] 236 Radiation beam
[0143] 239 Anchor frame
[0144] 240 Support assembly
[0145] 241 Wall or housing
[0146] 300 Conical laser assembly
[0147] 301 Projected radiation pattern
[0148] 307 Motor
[0149] 308 Linear screw
[0150] 309 Bushing
[0151] 310 Device supporting frame
[0152] 311 Camera
[0153] 312 Origin coordinate
[0154] 313 Laser
[0155] 314 Laser lens
[0156] 315 Optical element
[0157] 316 Scanned socket
[0158] 317 Fixing base
[0159] 319 Bio-sensors
[0160] 320 Radiation source
[0161] 339 Anchor frame
[0162] 340 Support Assembly
[0163] 341 Wall or housing
[0164] 401 Laser Plane
[0165] 411 Camera
[0166] 413 Laser
[0167] 415 Conical mirror
[0168] 418 Field of view
[0169] 501 Laser plane
[0170] 511 Camera
[0171] 513 Laser
[0172] 515 Optical Element
[0173] 518 Field of view
[0174] 601 Laser plane
[0175] 611 Camera
[0176] 613 Laser
[0177] 618 Field of view
[0178] 701 Laser plane
[0179] 711 Camera
[0180] 713 Laser
[0181] 719 Field of view
[0182] 819 Bio-sensors
[0183] 820 Power leads and data leads
[0184] 821 Bio-sensor strip
[0185] 822 Transmitting device
[0186] 823 Power and data connector
[0187] 824 Plastic film
[0188] 825 Plastic film
[0189] 826 Adhesive finish
[0190] 830 Markings
[0191] 905 Virtual origin coordinates
[0192] 906 Socket surface map
[0193] 912 Real origin coordinate
[0194] 916 Prosthetic socket
[0195] 919 Bio-sensors
[0196] 921 Bio-sensor strip
[0197] 927 Data processing device
[0198] 928 Sensorised socket
[0199] 929 Curves
[0200] 930 Bio data profile
[0201] 932 Stump
[0202] 1005 Virtual origin coordinate point
[0203] 1006 3D surface map
[0204] 1016 Socket
[0205] 1029 Bio-data profile maps
[0206] 1030 Bio-data profile
[0207] 1031 Virtual reality vision system
[0208] 1032 Residual member
[0209] 1033 Liner or sock
[0210] 1038 Shaping tools
[0211] 1105 Virtual origin coordinate point
[0212] 1106 Socket surface map
[0213] 1112 Real stump origin coordinate
[0214] 1129 Bio-data profile curves
[0215] 1130 Bio-data profile
[0216] 1132 Residual limb
[0217] 1134 Handheld device
[0218] 1135 Support
[0219] 1138 Shaping tool