A PHOTOBIOMODULATION THERAPY LOW-LEVEL LASER TARGETING SYSTEM
20230233874 · 2023-07-27
CPC classification
A61N2005/0626 (HUMAN NECESSITIES)
A61B5/4836 (HUMAN NECESSITIES)
Abstract
A photobiomodulation therapy low-level laser targeting system has a controller, a low-level laser emitter controlled by the controller and a projector operably coupled to the emitter and controlled by the controller to control the projection direction of light from the emitter. The controller has a targeting controller configured for controlling the projector to project light from the emitter onto a skin surface target area in use to target a subdermal target region.
Claims
1. A photobiomodulation therapy low-level laser targeting system comprising: a controller; a low-level laser emitter controlled by the controller; and a projector operably coupled to the emitter and controlled by the controller to control the projection direction of light from the emitter, wherein: the controller comprises a targeting controller configured for controlling the projector to project light from the emitter onto a skin surface target area in use to target a subdermal target region, and the controller is configured with geospatial data representing the subdermal target region and wherein the targeting controller is configured for controlling the projector depending on relative positioning of the projector with respect to the skin surface target area and the geospatial data.
2. The system as claimed in claim 1, wherein the projector directs the light in two axes.
3. The system as claimed in claim 2, wherein the projector comprises a mechanical gimbal which controls the orientation of the emitter.
4. The system as claimed in claim 2, wherein the projector comprises a mechanical gimbal which adjusts a mirror or prism against or through which the light is reflected or propagated.
5. The system as claimed in claim 2, wherein the projector comprises at least one rotating prism and wherein the emitter is operated at specific rotational offsets of the at least one rotating prism to target the skin surface target area.
6. The system as claimed in claim 2, wherein the projector comprises a beamforming lens.
7. The system as claimed in claim 6, wherein the beamforming lens forms a pinpoint for XY raster scanning.
8. The system as claimed in claim 6, wherein the beamforming lens forms a line which is swept across the skin surface targeted treatment area.
9. The system as claimed in claim 1, wherein the projector is set at a preconfigured position with respect to the subdermal target region.
10. The system as claimed in claim 9, wherein the controller is configured with relative positional coordinates representing a relative position of the projector with respect to the subdermal target region.
11. The system as claimed in claim 1, wherein the controller comprises a data interface for receiving geospatial data obtained from at least one medical scanning device or procedure comprising at least one of a CT scanner, CAT scanner, MRI scanner, colonoscopy, endoscopy, x-ray scanner, mammogram and ultrasound investigation.
12. The system as claimed in claim 1, further comprising a computer aided modelling geospatial editor for editing the geospatial data with reference to a 3D patient model.
13. The system as claimed in claim 1, wherein an incident point on the skin surface target area is controlled according to a penetration depth depending on relative positioning of the projector and the subdermal target region.
14. The system as claimed in claim 1, further comprising a ranging controller operably coupled to a sensor for determining a target region and wherein the targeting controller controls the projector according to the target region determined by the ranging controller.
15. The system as claimed in claim 14, wherein the sensor comprises a thermal sensor configured for determining a skin surface heat map topography.
16. The system as claimed in claim 15, wherein the targeting controller is configured for targeting areas of the surface heat map topography exceeding a temperature threshold.
17. The system as claimed in claim 15, wherein the thermal sensor comprises an infrared camera.
18. The system as claimed in claim 15, wherein the thermal sensor comprises an infrared temperature sensor which emits an infrared energy beam focused by a lens to a surface of the skin surface target area.
19. The system as claimed in claim 14, wherein the sensor comprises a vision sensor configured for identifying a skin marking.
20. The system as claimed in claim 19, wherein the skin marking is a point and wherein the targeting controller is configured for targeting a region around the point.
21. The system as claimed in claim 19, wherein the skin marking is a marked boundary and wherein the targeting controller is configured for targeting a region within the boundary.
22. The system as claimed in claim 21, wherein the targeting controller employs boundary area analysis image processing on image data obtained by the vision sensor to determine the area within a marked boundary for targeting.
23. The system as claimed in claim 19, wherein the skin marking is a visible skin marking.
24. The system as claimed in claim 19, wherein the skin marking is an infrared visible skin marking.
25. The system as claimed in claim 19, wherein the skin marking is indicated with reference to a display of image data captured by the vision sensor and wherein the ranging controller is configured to thereafter target the indicated marking.
26. The system as claimed in claim 14, wherein the sensor is a camera and wherein the ranging controller uses image processing on image data received therefrom to determine the target region.
27. The system as claimed in claim 26, wherein the ranging controller targets a selected portion of a 3D patient model.
28. The system as claimed in claim 27, wherein the ranging controller uses image recognition to recognise the selected portion.
29. The system as claimed in claim 1, wherein the system comprises a small form factor applicator device comprising the emitter and projector therein and wherein the applicator device is operably coupled to a user interface device having a digital display and wherein the digital display displays a user interface for controlling the controller thereon.
30. The system as claimed in claim 29, wherein the applicator device attaches to the user interface device and wherein the controller further comprises a ranging controller operably coupled to a sensor for determining a target region and wherein the targeting controller controls the projector according to the target region determined by the ranging controller irrespective of the relative orientation and position of the user interface device and the subdermal target region.
31. The system as claimed in claim 30, wherein the sensor is a camera of the user interface device.
32. The system as claimed in claim 29, wherein the user interface displays a treatment area augmented with image data obtained from a camera of the user interface device.
33. The system as claimed in claim 29, wherein the applicator device is physically separate from the user interface device and the applicator device comprises gyroscopic sensors to determine the orientation of the applicator device and wherein the projector controls the laser beam depending on the orientation of the applicator device determined by the gyroscopic sensors.
34. The system as claimed in claim 29, wherein the applicator device is physically attached to the user interface device and the projector controls the laser beam depending on the orientation of the user interface device determined by gyroscopic sensors of the user interface device.
35. The system as claimed in claim 29, wherein the user interface device displays an augmented vision map representation of the skin surface target area augmented with image data obtained from a camera of the system.
36. The system as claimed in claim 35, wherein the map representation is interactive for marking the treatment boundary for targeting by the targeting controller.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Notwithstanding any other forms which may fall within the scope of the present invention, preferred embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings in which:
DESCRIPTION OF EMBODIMENTS
[0024] A photobiomodulation laser targeting system 100 comprises a controller 125 and a low-level laser emitter 114 controlled by the controller 125 via an I/O interface 113.
[0025] The system 100 further comprises a projector 115 operably coupled to the emitter 114 and controlled by the controller 125.
[0026] The controller 125 comprises a processor 112 for processing digital data. In operable communication with the processor 112 across a system bus 110 is a memory device 109 configured for storing digital data including computer program code instructions. The computer program code instructions may be logically divided into various computer program code controllers 108 and associated data 105. In use, the processor 112 fetches these computer program code instructions and associated data from the memory device 109 for interpretation and execution for implementing the control functionality described herein.
[0027] The controller 125 comprises a targeting controller 107 configured for controlling the projector 115 to direct light from the emitter 114 onto a skin surface target area 116 to target a subdermal target region 117.
[0028] The emitter 114 may emit red and near infrared light in the range of 660 nm to 905 nm at low power of between 10 mW and 500 mW to deliver a power density (irradiance) of up to approximately 5 W/cm² on the skin surface target area 116.
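The relationship between emitter power, spot size and the stated irradiance can be sketched as follows. This is an illustrative calculation only and not part of the described system; the function name and the 3.6 mm spot diameter are hypothetical values chosen so that the ranges in the paragraph above are mutually consistent.

```python
import math

def irradiance_w_per_cm2(power_mw: float, spot_diameter_mm: float) -> float:
    """Irradiance (power density) on the skin for a circular spot.

    power_mw: emitter output in milliwatts (described range: 10 mW to 500 mW).
    spot_diameter_mm: diameter of the projected spot on the skin (hypothetical).
    """
    radius_cm = spot_diameter_mm / 10 / 2  # mm diameter -> cm radius
    area_cm2 = math.pi * radius_cm ** 2
    return (power_mw / 1000) / area_cm2

# A 500 mW beam focused to a ~3.6 mm spot yields roughly 5 W/cm²,
# the approximate upper irradiance the description mentions.
print(round(irradiance_w_per_cm2(500, 3.6), 2))
```

Note that halving the spot diameter quadruples the irradiance at fixed power, which is why the projector optics, rather than emitter power alone, bound the delivered power density.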
[0029] The projector 115 may direct the light in two axes, thereby allowing the system 100 to direct light onto skin surface target areas 116 of differing shapes and sizes.
[0030] The projector 115 may comprise a mechanical gimbal which controls the orientation of the emitter 114. In alternative embodiments, a mechanical gimbal may adjust a mirror or prism against or through which the light is reflected or propagated.
[0031] In embodiments, the projector 115 comprises at least one rotating prism and wherein the emitter 114 is operated at specific rotational offsets of the at least one rotating prism to target the skin surface target area 116.
[0032] In embodiments, the projector 115 may comprise a beamforming lens. The beamforming lens may form a pinpoint for XY raster scanning or alternatively a line which is swept across the skin surface targeted treatment area 116.
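The pinpoint XY raster scan mentioned above can be sketched as a serpentine traversal of a grid over the target area. This is a minimal illustrative sketch, not the claimed control logic; the function name and grid parameters are hypothetical.

```python
def raster_path(width_steps: int, height_steps: int):
    """Serpentine XY raster order for a pinpoint beam over a grid.

    Yields (x, y) grid indices row by row, reversing direction on
    alternate rows so the projector never slews across the full width
    between rows.
    """
    for y in range(height_steps):
        xs = range(width_steps) if y % 2 == 0 else range(width_steps - 1, -1, -1)
        for x in xs:
            yield (x, y)

path = list(raster_path(3, 2))
# [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1), (0, 1)]
```

The line-sweep alternative in the paragraph above is the degenerate case where each row is illuminated at once and only the Y axis is stepped.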
[0033] The controller 125 may be configured with geospatial data 104 representing the subdermal target region 117.
[0034] The controller 125 may comprise a data interface 111 for receiving geospatial data 104 from a medical scanner device 101 or procedure. The medical scanning device 101 or procedure may comprise a CT scanner, CAT scanner, MRI scanner, colonoscopy, endoscopy, x-ray scanner, mammogram, ultrasound investigation and the like.
[0035] The system 100 may comprise a computer aided modelling geospatial editor 102 for configuring geospatial data received from the patient scanner 101. In embodiments, the geospatial editor 102 may comprise a 3D model representation of a patient body which may be customised according to patient specific parameters.
[0036] With reference to data received from the patient scanner 101, a physician may configure the geospatial data 104 representing the subdermal target region 117 within the 3D model. For example, with reference to frontal and lateral x-ray data, the physician may configure the geospatial data 104 to represent the appropriate subdermal target region 117.
[0037] As such, in use, the targeting controller 107 targets the subdermal target region 117 specified by the geospatial data 104.
[0038] The targeting controller 107 may target the subdermal target region 117 using the geospatial data 104 with reference to relative positioning of the projector 115 to the subdermal target region 117.
[0039] In one embodiment, the projector 115 may be placed at a set position with respect to the patient, wherein the targeting controller 107 targets the skin surface target area 116, and therefore the subdermal target region 117 thereunderneath, with respect to the relative position of the projector 115 and the patient. In further embodiments, the targeting controller 107 may be configured with positional offsets, such as X, Y and Z coordinates representing the relative positioning of the projector 115 from the patient.
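Given such X, Y and Z positional offsets, aiming the projector reduces to computing pan and tilt angles toward a target point in a shared coordinate frame. The sketch below is an illustrative simplification (a straight-line aim with no optics model); the function name and frame conventions are hypothetical.

```python
import math

def aim_angles(projector_xyz, target_xyz):
    """Pan/tilt angles (degrees) to point a beam at a target point.

    projector_xyz, target_xyz: (x, y, z) positions in a shared frame,
    e.g. derived from the configured positional offsets of the
    projector from the patient. Pan rotates about the vertical (z)
    axis; tilt is elevation from the horizontal plane.
    """
    dx = target_xyz[0] - projector_xyz[0]
    dy = target_xyz[1] - projector_xyz[1]
    dz = target_xyz[2] - projector_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```

In use, a gimbal-based projector would drive its two axes to these angles; a rotating-prism projector would instead gate the emitter at the rotational offsets that sweep the beam through the same direction.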
[0040] In embodiments, the controller 125 comprises a ranging controller 106 operably coupled to a sensor for determining a target region (such as the skin surface target area 116 or subdermal target region 117) and wherein the targeting controller 107 controls the projector 115 according to the target region determined by the ranging controller 106.
[0041] In embodiments, the sensor comprises a thermal sensor 119 configured for determining skin surface heat map topography. The thermal sensor 119 may comprise an infrared camera orientated towards the skin of the patient. Alternatively, the thermal sensor 119 may comprise an infrared temperature sensor which emits an infrared energy beam focused by a lens to a surface of the skin surface target area 116 to determine the temperature thereof according to the energy of the reflected beam.
[0042] The ranging controller 106 may determine a region of elevated temperature for targeting by the targeting controller 107. A region of elevated temperature may be indicative of inflammation requiring treatment.
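The elevated-temperature targeting described above amounts to thresholding a skin-surface heat map. The following is a minimal sketch under the assumption that the thermal sensor yields a 2D grid of temperatures; the function name, grid layout and threshold value are hypothetical.

```python
def hot_regions(heat_map, threshold_c):
    """Grid cells of a skin-surface heat map above a temperature
    threshold, as candidate cells for the targeting controller.

    heat_map: 2D list of temperatures in degrees Celsius, one value
    per sampled skin patch.
    """
    return [(x, y)
            for y, row in enumerate(heat_map)
            for x, temp in enumerate(row)
            if temp > threshold_c]

heat_map = [
    [33.1, 33.4, 33.2],
    [33.5, 36.2, 35.9],
    [33.0, 35.8, 33.3],
]
targets = hot_regions(heat_map, 35.0)  # [(1, 1), (2, 1), (1, 2)]
```

A fixed absolute threshold is the simplest policy; a relative threshold (e.g. some margin above the surrounding skin's median temperature) would be a plausible refinement, since baseline skin temperature varies between patients.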
[0043] In further embodiments, the sensor comprises a vision sensor 118. In one embodiment, the vision sensor 118 is configured for identifying a skin marking. For example, a physician may mark a treatment area using a skin marking either using visible or infrared visible dye which is detected by the vision sensor 118. The skin marking may comprise a point and wherein the targeting controller 107 targets a region surrounding the point. In alternative embodiments, the skin marking may comprise a boundary and wherein the targeting controller 107 targets a region within the boundary. The targeting controller 107 may employ boundary area analysis image processing on image data obtained by the vision sensor to determine the area within a marked boundary for targeting.
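The "boundary area analysis" above, which determines the area inside a marked boundary, can be sketched as a flood fill seeded from an interior point. This is one plausible realisation rather than the patent's specific algorithm; the function name, the boolean boundary mask and the seed convention are assumptions.

```python
from collections import deque

def area_within_boundary(mask, seed):
    """Flood-fill the region enclosed by a marked boundary.

    mask: 2D list where True marks boundary pixels detected by the
    vision sensor; seed: an (x, y) interior point, e.g. where the
    physician tapped the display. Returns the set of enclosed pixels.
    """
    h, w = len(mask), len(mask[0])
    region, queue = set(), deque([seed])
    while queue:
        x, y = queue.popleft()
        if not (0 <= x < w and 0 <= y < h) or mask[y][x] or (x, y) in region:
            continue
        region.add((x, y))
        queue.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return region

# A 5x5 marked ring encloses a 3x3 interior of 9 pixels.
mask = [
    [True, True, True, True, True],
    [True, False, False, False, True],
    [True, False, False, False, True],
    [True, False, False, False, True],
    [True, True, True, True, True],
]
interior = area_within_boundary(mask, (2, 2))
```

The "point" marking variant in the same paragraph would instead target a fixed-radius region around the detected point, with no fill step required.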
[0044] In embodiments, when making the skin marking, the physician may indicate the skin marking with reference to image data captured by the vision sensor 118 displayed by a digital display 123 of the system 100, thereby allowing the ranging controller 106 to thereafter target the indicated marking. For example, once having made a marking, the physician may tap the digital display 123 to indicate the marking. Similarly, the physician may tap the display 123 within a marked boundary, thereby allowing the ranging controller 106 to subsequently target the area determined within the boundary.
[0045] In further embodiments, the sensor comprises a camera and wherein the ranging controller 106 uses shape detection and/or object recognition to determine regions of a body for targeting. For example, the ranging controller 106 may recognise a portion of the patient's body using shape and/or object recognition for targeting by the targeting controller 107. For example, for targeting a knee, the ranging controller 106 may determine the boundary of the leg using shape detection and furthermore determine the location of the knee between the upper leg and the lower leg using shape or object recognition.
[0046] In embodiments, using the aforedescribed 3D model, the user may select a portion of the patient's body for treatment. For example, the 3D model may be displayed on the display 123, whereupon the physician may select the knee from the displayed 3D model. As such, the ranging controller 106 may use shape and/or object recognition to recognise the knee selected from the 3D model for targeting.
[0047] The ranging controller 106 and targeting controller 107 may adjust targeting in real-time including if the position of the skin surface targeted treatment area 116 moves with respect to the projector 115 in use.
[0048] The controller 125 may be configured with adjustable settings 103 which, in embodiments, may be used to adjust the treatment program. In embodiments, the settings 103 may be used to control the emitter 114 and the projector 115, including for setting whether constant or pulsed light is applied, the light energy level, the dosage level, the treatment time period and the treatment frequency.
[0049] The emitter 114 and the projector 115 may be controlled by the settings 103 to adjust the penetration depth. Penetration depth may be controlled by the energy level of the emitter 114.
[0050] In alternative embodiments, penetration depth may be controlled geometrically with respect to the relative positioning of the projector 115 and the subdermal target region 117. For example, as the position of the projector 115 moves with respect to the patient, the incident point on the skin surface target area 116 may be controlled by the targeting controller 107 to target the same depth of the subdermal target region 117 irrespectively.
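The geometric depth control described above can be sketched as follows: for a fixed subdermal target point, the incident point is where the straight line from the projector to the target crosses the skin surface. Re-computing this as the projector moves keeps the same point, and hence the same depth, targeted. This is a simplification assuming a locally flat skin plane and negligible refraction; the function name and coordinate conventions are hypothetical.

```python
def incident_point(projector_xyz, target_xyz, skin_z=0.0):
    """Point on the skin plane (z = skin_z) through which a straight
    beam from the projector reaches a fixed subdermal target.

    Intersects the projector-to-target line with the plane z = skin_z.
    The target lies below the plane (tz < skin_z), the projector above.
    """
    px, py, pz = projector_xyz
    tx, ty, tz = target_xyz
    t = (skin_z - pz) / (tz - pz)  # parametric position of the plane
    return (px + t * (tx - px), py + t * (ty - py), skin_z)

# Projector directly above a target 2 units deep: incident point is
# directly above the target; moving the projector sideways shifts the
# incident point so the beam still reaches the same subdermal point.
p1 = incident_point((0, 0, 10), (0, 0, -2))
p2 = incident_point((6, 0, 10), (0, 0, -2))
```

In practice, light scattering in tissue limits how literally this ray model holds, so it would serve as a first-order aiming rule refined by the dosage settings.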
[0051] In embodiments, the controller 125 is in operable communication with a user interface device 124. The user interface device 124 may take the form of a mobile communication device, tablet computing device or the like. The user interface device 124 may execute a software application thereon.
[0052] The user interface device 124 may comprise the digital display 123 configured for displaying a user interface 122 for controlling the operation of the controller 125.
[0053] The user interface 122 may display operational parameters. The user interface 122 may display settings 121 which may be adjusted.
[0054] The user interface 122 may display an augmented vision map representation 120 of the skin surface target area 116, augmented with image data obtained from a camera of the user interface device 124. In embodiments, the map representation 120 is interactive for marking the treatment boundary for targeting by the targeting controller 107.
[0055] In embodiments, a small form factor handheld applicator device 127 comprises the emitter 114 and projector 115. The applicator device 127 may be operably coupled to the user interface device 124. The applicator device 127 may comprise a rechargeable battery therein for powering the emitter 114 or may draw power from the user interface device 124. The applicator device 127 may be physically attached to the user interface device 124 or separated therefrom.
[0056] In accordance with this embodiment, the projector 115 may control the laser beam depending on the orientation and position of the user interface device 124 with respect to the subdermal target region 117.
[0057] For example, where the applicator device 127 is physically attached to the user interface device 124, both can be held in one hand during home-based photobiomodulation therapy, wherein the ranging controller 106 works in conjunction with the vision sensor 118 or thermal sensor 119 to adjust the targeting of the targeting controller 107. In embodiments, the controller uses image data obtained from an image sensor of the user interface device 124 for targeting, thereby avoiding the need for image sensing componentry and associated computation in the applicator device 127 itself.
[0058] In embodiments, the applicator device 127 comprises gyroscopic sensors to determine the orientation of the applicator device and wherein the projector 115 further controls the laser beam depending on the orientation of the applicator device 127 determined by the gyroscopic sensors. Similarly, the system 100 may use gyroscopic sensors of the user interface device 124, thereby avoiding the applicator device 127 requiring separate gyroscopic sensors.
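The gyroscopic compensation described above can be sketched, under a deliberately simplified small-angle, axis-aligned model, as subtracting the hand-held device's orientation from the world-frame aim so the beam stays on target as the device tilts. The function name and angle conventions are hypothetical; a full solution would compose rotation matrices rather than subtract angles.

```python
def compensated_angles(target_pan, target_tilt, device_yaw, device_pitch):
    """Projector angles in the applicator's own frame, compensating
    for the device orientation reported by its gyroscopic sensors.

    All angles in degrees. If the device yaws right by 5 degrees, the
    projector must pan 5 degrees left of the world-frame aim to keep
    the beam on the same skin surface target point.
    """
    return (target_pan - device_yaw, target_tilt - device_pitch)

# World-frame aim (30, 10) with the device yawed +5 and pitched -2:
angles = compensated_angles(30, 10, 5, -2)  # (25, 12)
```

This is the same principle whether the gyroscopic sensors belong to the applicator device 127 or to the attached user interface device 124, since the two are rigidly coupled when attached.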
[0061] The apparatus may comprise a separate applicator 127 having the LLL emitter 114 and projector 115 therein.
[0062] The applicator 127 may be held within an applicator cradle 128 which may comprise a stand plate 130 and a footplate 135.
[0063] The projection head 132 may comprise a face 133 in which the projector 115, having adjustable optics, is located centrally and from which the light is projected onto the skin surface target area 116. The face 133 may further comprise an infrared camera 134 as the vision sensor 118.
[0064] In embodiments, the applicator 127 may remain within the cradle 128 during photobiomodulation therapy. In alternative embodiments, the applicator 127 is handheld during photobiomodulation therapy wherein targeting thereof is controlled by the ranging controller and/or gyroscopic sensors thereof.
[0065] The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practise the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed as obviously many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.
[0066] The term “approximately” or similar as used herein should be construed as being within 10% of the value stated unless otherwise indicated.