Autonomous robotic mobile support system for the mobility-impaired

11806288 · 2023-11-07

Abstract

Disclosed is a robotic mobile support system configured to autonomously follow a subject with impaired mobility from a close but safe distance behind the subject and react to movements of the subject's torso and upper body to provide dynamic support for the subject and stop the subject from falling. The system comprises a mobile base vehicle, a robotic arm installed on the mobile base vehicle, and a LIDAR sensor for detecting the distance to a subject and the direction/speed of the subject. The robotic arm comprises one or more adjustable rear and side soft supports configured to support the subject and stop the subject from falling. The system can also comprise an onboard computer configured to process LIDAR data to control the movement of the mobile base vehicle and the robotic arm to stop the subject from falling.

Claims

1. A robotic mobile support system, comprising: a mobile base vehicle comprising two differentially driven wheels; a robotic arm coupled to the mobile base vehicle and extending from the mobile base vehicle in a direction of a subject, wherein the robotic arm comprises an end effector configured to support the subject in the event of a fall; a LIDAR sensor coupled to the robotic arm, wherein the LIDAR sensor is configured to undertake distance measurements between the LIDAR sensor and defined points along a dorsum region or a posterior truncal region of the subject; and an onboard computing device comprising one or more processors and memory units, wherein the one or more processors are programmed to execute instructions stored in the one or more memory units to use the distance measurements from the LIDAR sensor to generate commands to adjust at least one of a speed and direction of the mobile base vehicle in response to a movement of the subject to allow the robotic mobile support system to autonomously follow the subject from a posterior direction and support the subject in the event of a fall.

2. The system of claim 1, wherein the one or more processors are programmed to execute further instructions to determine a forward velocity (v.sub.x) of the mobile base vehicle using the equation:
v.sub.x=k1*(xlmin−dist) wherein xlmin is a minimum distance from a set of distance measurements made by the LIDAR sensor, dist is a predefined desired distance, and k1 is a first proportional control gain.

3. The system of claim 2, wherein the one or more processors are programmed to execute further instructions to determine an angular velocity (ω) of the mobile base vehicle using the equation:
ω=k2*(ϕ.sub.2+ϕ.sub.3)/2 wherein k2 is a second proportional control gain, and ϕ.sub.2 and ϕ.sub.3 are angles of rotation of the subject determined by the equations:
ϕ.sub.2=atan {[xl(first selected defined point)−xl(second selected defined point)]/[yl(first selected defined point)−yl(second selected defined point)]}
ϕ.sub.3=atan {[xl(third selected defined point)−xl(fourth selected defined point)]/[yl(third selected defined point)−yl(fourth selected defined point)]}
wherein yl and xl represent the (y, x) position of the defined points across the dorsum region or the posterior truncal region of the subject as measured by the LIDAR sensor, wherein an origin is located at a center of the LIDAR sensor, wherein xl(i) represents a distance to a selected defined point in the x-direction where the positive x-axis is directly in front of the LIDAR sensor, and wherein yl(i) represents a distance to the selected defined point in the y-direction where the positive y-axis is toward a right-hand side of the subject.

4. The system of claim 3, wherein the one or more processors are programmed to execute further instructions to determine a right wheel angular velocity (ω.sub.R) of a right wheel of the two differentially driven wheels and determine a left wheel angular velocity (ω.sub.L) of a left wheel of the two differentially driven wheels, using the equations:
ω.sub.R=(v.sub.x+h*ω)/R.sub.w
ω.sub.L=(v.sub.x−h*ω)/R.sub.w
wherein ω is the angular velocity, wherein h is one-half of a distance separating the two differentially driven wheels, and R.sub.w is a radius of at least one of the differentially driven wheels.

5. The system of claim 4, wherein the one or more processors are programmed to execute further instructions to determine a left wheel revolutions per minute (RPM.sub.L_Des) of the left wheel of the two differentially driven wheels and a right wheel revolutions per minute (RPM.sub.R_Des) of the right wheel of the two differentially driven wheels using the following equations:
RPM.sub.L_Des=ω.sub.L*30/π
RPM.sub.R_Des=ω.sub.R*30/π
wherein ω.sub.R is the right wheel angular velocity and ω.sub.L is the left wheel angular velocity.

6. The system of claim 1, wherein a minimum weight of the mobile base vehicle is determined by a height of the subject (H), a weight of the subject (WP), and the equation:
Minimum Weight of Mobile Base Vehicle=H*WP*sin β/(2h) wherein β is a tilt angle of the subject as measured from an upright position, and h is one-half of a distance separating the two differentially driven wheels.

7. The system of claim 1, further comprising a leg LIDAR sensor positioned on the mobile base vehicle and configured to undertake distance measurements between the leg LIDAR sensor and defined points along a leg region of the subject.

8. The system of claim 1, wherein the onboard computing device further comprises a wireless communication module configured to receive commands from a radio frequency (RF) transmitter, and wherein movement of the mobile base vehicle is halted in response to an emergency stop command received from the RF transmitter via the wireless communication module.

9. The system of claim 1, wherein the LIDAR sensor is a 2D LIDAR sensor configured to emit a modulated infrared laser light at the dorsum region or the posterior truncal region of the subject.

10. The system of claim 9, wherein the infrared laser light can have a wavelength of between about 775 nm and 795 nm.

11. The system of claim 9, wherein the LIDAR sensor has a scanning range of up to 8 meters.

12. A method of supporting a subject, the method comprising: measuring, using a LIDAR sensor, distances between the LIDAR sensor and defined points along a dorsum region or a posterior truncal region of the subject, wherein the LIDAR sensor is coupled to a robotic arm extending from a mobile base vehicle in a direction of the subject, and wherein the mobile base vehicle, the LIDAR sensor, and the robotic arm are part of a robotic mobile support system, wherein the robotic arm comprises an end effector configured to support the subject in the event of a fall; and generating commands to adjust at least one of a speed and direction of the mobile base vehicle, using one or more processors of an onboard computing device of the robotic mobile support system, based on the distance measurements made by the LIDAR sensor in response to a movement of the subject to allow the robotic mobile support system to autonomously follow the subject from a posterior direction and support the subject in the event of a fall.

13. The method of claim 12, further comprising determining, using the one or more processors, a forward velocity (v.sub.x) of the mobile base vehicle using the equation:
v.sub.x=k1*(xlmin−dist) wherein xlmin is a minimum distance from a set of distance measurements made by the LIDAR sensor, dist is a predefined desired distance, and k1 is a first proportional control gain.

14. The method of claim 13, further comprising determining, using the one or more processors, an angular velocity (ω) of the mobile base vehicle using the equation:
ω=k2*(ϕ.sub.2+ϕ.sub.3)/2 wherein k2 is a second proportional control gain, and ϕ.sub.2 and ϕ.sub.3 are angles of rotation of the subject determined by the equations:
ϕ.sub.2=atan {[xl(first selected defined point)−xl(second selected defined point)]/[yl(first selected defined point)−yl(second selected defined point)]}
ϕ.sub.3=atan {[xl(third selected defined point)−xl(fourth selected defined point)]/[yl(third selected defined point)−yl(fourth selected defined point)]}
wherein yl and xl represent the (y, x) position of the defined points across the dorsum region or the posterior truncal region of the subject as measured by the LIDAR sensor, wherein an origin is located at a center of the LIDAR sensor, wherein xl(i) represents a distance to a selected defined point in the x-direction where the positive x-axis is directly in front of the LIDAR sensor, and wherein yl(i) represents a distance to the selected defined point in the y-direction where the positive y-axis is toward a right-hand side of the subject.

15. The method of claim 14, further comprising determining, using the one or more processors, a right wheel angular velocity (ω.sub.R) of a right wheel of the two differentially driven wheels and a left wheel angular velocity (ω.sub.L) of a left wheel of the two differentially driven wheels, using the equations:
ω.sub.R=(v.sub.x+h*ω)/R.sub.w
ω.sub.L=(v.sub.x−h*ω)/R.sub.w
wherein ω is the angular velocity, wherein h is one-half of a distance separating the two differentially driven wheels, and R.sub.w is a radius of at least one of the differentially driven wheels.

16. The method of claim 15, further comprising determining, using the one or more processors, a left wheel revolutions per minute (RPM.sub.L_Des) of the left wheel of the two differentially driven wheels and a right wheel revolutions per minute (RPM.sub.R_Des) of the right wheel of the two differentially driven wheels using the following equations:
RPM.sub.L_Des=ω.sub.L*30/π
RPM.sub.R_Des=ω.sub.R*30/π
wherein ω.sub.R is the right wheel angular velocity and ω.sub.L is the left wheel angular velocity.

17. The method of claim 12, further comprising determining a minimum weight of the mobile base vehicle using the equation:
Minimum Weight of Mobile Base Vehicle=H*WP*sin β/(2h) wherein H is a height of the subject, WP is a weight of the subject, β is a tilt angle of the subject as measured from an upright position, and h is one-half of a distance separating the two differentially driven wheels.

18. The method of claim 12, further comprising measuring, using a leg LIDAR sensor positioned on the mobile base vehicle, distances between the leg LIDAR sensor and defined points along a leg region of the subject.

19. The method of claim 12, further comprising: receiving an emergency stop command from a radio frequency transmitter via a wireless communication module of the onboard computing device, wherein power to one or more direct current (DC) motors driving the mobile base vehicle is shut off in response to the emergency stop command.

20. The method of claim 12, wherein the LIDAR sensor is a 2D LIDAR sensor configured to emit a modulated infrared laser light at the dorsum region or the posterior truncal region of the subject.

21. The method of claim 20, wherein the infrared laser light can have a wavelength of between about 775 nm and 795 nm.

22. The method of claim 20, wherein the LIDAR sensor has a scanning range of up to 8 meters.

23. A robotic mobile support system, comprising: a mobile base vehicle comprising two differentially driven wheels; a robotic arm coupled to the mobile base vehicle and extending from the mobile base vehicle in a direction of a subject, wherein the robotic arm comprises an end effector comprising a padded back support and padded adjustable side supports, wherein the padded back support is configured to support the subject in the event of a backward fall, and wherein the padded adjustable side supports are configured to support the subject in the event of a lateral fall; a plurality of geared servomotors configured to operate the robotic arm including the end effector; a LIDAR sensor coupled to the robotic arm, wherein the LIDAR sensor is configured to undertake distance measurements between the LIDAR sensor and defined points along a dorsum region or a posterior truncal region of the subject; and an onboard computing device comprising one or more processors and memory units, wherein the one or more processors are programmed to execute instructions stored in the memory units to use the distance measurements from the LIDAR sensor to generate commands to adjust at least one of a speed and direction of the mobile base vehicle in response to a movement of the subject to allow the robotic mobile support system to autonomously follow the subject from a posterior direction and support the subject in the event of a fall.

24. The system of claim 23, wherein the one or more processors are programmed to execute further instructions to generate a command to rotate the end effector with respect to a horizontal orientation plane using one of the plurality of geared servomotors in response to an initial rotation of the end effector caused by a lateral fall of the subject, wherein the initial rotation is beyond a preset threshold rotation value.

25. The system of claim 23, wherein a length of the robotic arm is adjustable to adapt to a height of the subject, and wherein a width separating the padded adjustable side supports is adjustable to adapt to a girth of the subject.

26. The system of claim 23, wherein the onboard computing device further comprises a wireless communication module configured to receive commands from a radio frequency (RF) transmitter, and wherein movement of the mobile base vehicle is halted in response to an emergency stop command received from the RF transmitter via the wireless communication module.

27. The system of claim 23, wherein the LIDAR sensor is a 2D LIDAR sensor configured to emit a modulated infrared laser light at the dorsum region or the posterior truncal region of the subject.

28. The system of claim 27, wherein the infrared laser light can have a wavelength of between about 775 nm and 795 nm.

29. The system of claim 27, wherein the LIDAR sensor has a scanning range of up to 8 meters.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) FIG. 1 illustrates a front perspective view of an implementation of an autonomous robotic mobile support system.

(2) FIG. 2 illustrates a side perspective view of an implementation of the autonomous robotic mobile support system.

(3) FIG. 3 is a schematic drawing illustrating a top of a mobile base vehicle of the autonomous robotic mobile support system and showing certain geometric parameters for mathematical modeling.

(4) FIG. 4 is a schematic drawing illustrating a side of the autonomous robotic mobile support system and showing certain geometric parameters for mathematical modeling.

(5) FIG. 5A is a schematic drawing illustrating a top of a robotic arm of the autonomous robotic mobile support system and showing certain geometric parameters for mathematical modeling.

(6) FIG. 5B is a schematic drawing illustrating a side of a robotic arm of the autonomous robotic mobile support system and showing certain geometric parameters for mathematical modeling.

(7) FIG. 6A illustrates a perspective view of a component used to construct a back support of an end effector of the robotic arm.

(8) FIG. 6B illustrates a perspective view of components used to construct side supports of the end effector of the robotic arm.

(9) FIG. 7A illustrates example defined points along a dorsum region of a subject used by a LIDAR sensor of the autonomous robotic mobile support system to undertake distance measurements to the subject.

(10) FIG. 7B illustrates an example angle used to determine the subject's direction of motion.

(11) FIG. 8 illustrates the robotic mobile support system autonomously following a subject from a posterior direction and configured to support the subject in the event of a fall.

DETAILED DESCRIPTION

(12) FIG. 1 illustrates a front perspective view of an implementation of an autonomous robotic mobile support system 100. The system 100 can be used to stop falls in and provide support to mobility-impaired or geriatric subjects. The system 100 can be used in a hospital setting, long-term care facility, or other type of healthcare facility. The system 100 can autonomously follow a mobility-impaired subject and dynamically react to a sudden motion or movement of the subject to stop the subject from falling. The system 100 can provide support to subjects using a mobility aid such as a walker or rollator.

(13) FIG. 2 illustrates a side perspective view of an implementation of the autonomous robotic mobile support system 100. FIGS. 1 and 2 illustrate that the system 100 can comprise a mobile base vehicle 102 comprising two differentially driven wheels 104, a robotic arm 106 coupled to the mobile base vehicle 102 and extending from the mobile base vehicle 102 in a direction of a subject. The system 100 can also comprise a LIDAR sensor 108 coupled to the robotic arm 106 and an onboard computing device 110.

(14) In some implementations, the onboard computing device 110 can be coupled to the mobile base vehicle 102. In other implementations, the onboard computing device 110 can be coupled to the robotic arm 106. The onboard computing device 110 can comprise one or more processors and memory units. The one or more processors can be programmed to execute instructions stored in the one or more memory units to control the autonomous operation of the robotic mobile support system 100.

(15) The robotic arm 106 can comprise an end effector 112 configured to support the subject in the event of a fall. The end effector 112 can comprise a padded back support 114 and padded adjustable side supports 116.

(16) The padded back support 114 can be configured to support the subject in the event of a backward fall. The padded side supports 116 can be configured to support the subject in the event of a lateral fall.

(17) A length of the robotic arm 106 can be adjusted to adapt to a height of the subject. A width separating the padded side supports 116 can be adjusted to adapt to a girth of the subject.

(18) The robotic arm 106 can comprise a base arm linkage 118 or segment (also referred to in this disclosure as “link 0”), a first arm linkage 120 or segment (also referred to in this disclosure as “link 1”), and a second arm linkage or segment (also referred to in this disclosure as “link 2” or the end effector 112).

(19) A plurality of geared servomotors can be configured to operate the robotic arm 106 including the end effector 112. The plurality of geared servomotors can comprise a first servomotor 122 (also referred to in this disclosure as “servo 1”) and a second servomotor 124 (also referred to in this disclosure as “servo 2”). The one or more processors can be programmed to execute further instructions to generate a command to rotate the end effector 112 with respect to a horizontal orientation plane using the second servomotor 124 in response to an initial rotation of the end effector 112 caused by the subject applying a force (by putting the subject's weight) on one of the side supports 116 due to a fall or the subject losing his/her balance. The end effector 112 can then be rotated back to a default starting position using the second servomotor 124 when the initial rotation exceeds a preset threshold rotation value.

(20) The base arm linkage 118 or segment (“link 0”) of the robotic arm 106 can be coupled to the mobile base vehicle 102. The length of the base arm linkage 118 can be adjustable or extendible to accommodate a height of the subject. The base arm linkage 118 can be positioned or otherwise set at an angle with respect to a top of the mobile base vehicle 102. The angle can be adjusted to accommodate a height of the subject. When the system 100 autonomously follows the subject, the base arm linkage 118 can be substantially aligned with a sagittal plane of the subject.

(21) The first arm linkage 120 can be coupled to the base arm linkage 118. The first arm linkage 120 can be rotatable with respect to the base arm linkage 118. The first servomotor 122 can rotate the first arm linkage 120 along the sagittal plane. The first arm linkage 120 can also be rotated manually.

(22) The base arm linkage 118, the first arm linkage 120, or a combination thereof can be made in part of a rigid material. For example, the base arm linkage 118, the first arm linkage 120, or a combination thereof can be made in part of a rigid metallic material. As a more specific example, the base arm linkage 118, the first arm linkage 120, or a combination thereof can be made in part of an aluminum alloy (e.g., 6105-T5 aluminum alloy), stainless steel, or titanium.

(23) In some implementations, the base arm linkage 118 and the first arm linkage 120 can be made of metal T-slot frames. For example, the base arm linkage 118 and the first arm linkage 120 can be made of metal T-slot frames (made of 6105-T5 aluminum alloy) provided by 80/20 Inc. (Part No. 2020).

(24) The angle made by the first arm linkage 120 with respect to the base arm linkage 118 can be adjusted until the end effector 112 is oriented substantially horizontal or parallel to the floor. The first arm linkage 120 can be locked in place when the robotic arm 106 is adjusted (i.e., when the length of the base arm linkage 118 and the angles are adjusted) to accommodate a height of the subject.

(25) The robotic arm 106, including the length of the base arm linkage 118 and the various linkage angles, can be adjusted until the height of the end effector 112 is at the waist of the subject and the end effector 112 is oriented substantially parallel to the floor.

(26) The LIDAR sensor 108 can be configured to undertake distance measurements between the LIDAR sensor 108 and defined points along a dorsum region of the subject. The one or more processors of the onboard computing device 110 can be programmed to execute instructions stored in the one or more memory units to use the distance measurements from the LIDAR sensor 108 to generate commands to adjust at least one of a speed and direction of the mobile base vehicle 102 in response to a movement of the subject to allow the robotic mobile support system 100 to autonomously follow the subject from a posterior direction and support the subject in the event of a fall.

(27) The LIDAR sensor 108 can be a two-dimensional (2D) LIDAR sensor. The LIDAR sensor 108 can have a scanning range of up to 8 meters. The LIDAR sensor 108 can undertake distance measurements in an indoor environment and an outdoor environment. The LIDAR sensor 108 can operate based on laser triangulation. For example, the LIDAR sensor 108 can emit a modulated infrared laser light (see, e.g., FIG. 8) at a target (e.g., the subject) and the laser light can be reflected by the target back to the sensor to be detected. The returning light signal can be sampled by one or more processors within the LIDAR sensor 108 and/or by the onboard computing device 110. The infrared laser light (see, e.g., FIG. 8) can have a laser wavelength of between about 775 nm and 795 nm (e.g., about 785 nm).

(28) The system 100 can further comprise a leg LIDAR sensor 126 (see, e.g., FIG. 2) positioned on the mobile base vehicle 102 and configured to undertake distance measurements between the leg LIDAR sensor 126 and defined points along a leg region of the subject.

(29) The onboard computing device 110 can further comprise a wireless communication module configured to receive commands from a radio frequency (RF) transmitter 128. Movement of the mobile base vehicle 102 can be halted in response to an emergency stop command received from the RF transmitter 128 via the wireless communication module.

(30) The system 100 can provide support to subjects using a mobility aid such as a walker or rollator. This is because a walker or rollator can help the subject avoid falling forward while the system 100 stops the subject from falling sideways or backward (the directions of most falls). The system 100 can be configured such that the mobile base vehicle 102 autonomously follows the subject and adjusts its speed and direction so that it always stays a safe, constant distance behind the subject and does not interfere with the subject. The robotic arm 106 of the system 100 can be configured not to make physical contact with the subject until the subject requires supportive assistance. Components of the robotic arm 106 can be configured such that they can easily be adjusted to accommodate the height and girth of most subjects.

(31) In addition to the components disclosed above, the system 100 can also comprise passive caster wheels 300 (see, e.g., FIGS. 3 and 4) positioned near the front of the mobile base vehicle 102, one or more batteries carried by the mobile base vehicle 102, a microcontroller, motor drivers, a servomotor power hub, a servomotor adapter board, a switch, a voltage display unit, a voltage regulator, encoders and decoders, an RF receiver (serving as the wireless communication module), and a charging apparatus for charging the batteries.

(32) The differentially driven wheels 104 can be driven by electric motors (e.g., direct current (DC) motors) which are controlled via a dual motor driver. The motor driver receives the wheel speed commands from a microcontroller and directs the motors to rotate the wheels 104 as commanded. The wheel commands are passed to the microcontroller from the onboard computing device 110 where the autonomous control system software resides.

(33) The LIDAR sensor 108 can be positioned on top of the robotic arm 106. For example, the LIDAR sensor 108 can be positioned on top of the first arm linkage 120. The LIDAR sensor 108 can be positioned on the top of the robotic arm 106 (e.g., on the top of the first arm linkage 120) such that the LIDAR sensor 108 can measure the distance between the LIDAR sensor 108 and defined points 700 (see, e.g., FIG. 7A) along a dorsum region or posterior truncal region of the subject without obstruction. The dorsum region can refer to a back of the subject (including a region encompassing any of the trapezius or latissimus dorsi), a posterior waist region of the subject, and a posterior upper hip region of the subject.

(34) In other implementations, the LIDAR sensor 108 or an additional LIDAR sensor can be positioned below the end effector 112 or below the first arm linkage 120 such that the LIDAR sensor 108 or the additional LIDAR sensor can measure the distance between the LIDAR sensor 108 and defined points 700 along a waist or posterior hip region of the subject without obstruction. For example, the LIDAR sensor 108 or the additional LIDAR sensor can measure the distance between the LIDAR sensor 108 and defined points 700 along a lumbar or sacral region of the subject.

(35) Data from any of the LIDAR sensor 108, the leg LIDAR sensor 126, and the additional LIDAR sensor can be transmitted to the onboard computing device 110 for processing. As will be discussed in more detail in the following sections, a plurality of algorithms and equations can determine a distance of the subject to the LIDAR sensor(s) (and by extension, the system 100), a speed of the subject, and the subject's direction of motion. The control system software can reside in the memory units of the onboard computing device 110 and can instruct the one or more processors to use the processed LIDAR data to determine the required wheel speeds of the differentially driven wheels 104 to follow the mobility-impaired subject. Signals concerning the wheel speeds can then be passed on to the microcontroller where other algorithms ensure timely adjustment of wheel speed commands through the motor driver that drives the DC motors and, consequently, the wheels 104.
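The control pipeline described above — LIDAR distances in, desired wheel RPMs out — can be sketched in Python using the equations from claims 2 through 5. This is an illustrative sketch, not the patent's implementation: the gains k1 and k2, the desired following distance, and the use of exactly four defined points are assumed values; h and R.sub.w follow Table 1.

```python
import math

def wheel_rpm_commands(xl, yl, dist=0.5, k1=1.0, k2=1.0,
                       h=0.152, Rw=0.045):
    """Compute desired left/right wheel RPMs from LIDAR points.

    xl, yl: x/y distances (m) to defined points on the subject's back,
    with the origin at the LIDAR sensor, +x straight ahead, and +y
    toward the subject's right-hand side. Assumes the point pairs used
    for the angle estimates have distinct y-coordinates.
    """
    # Forward velocity: proportional control on the closest point.
    xlmin = min(xl)
    vx = k1 * (xlmin - dist)

    # Angles of rotation of the subject from two pairs of points.
    phi2 = math.atan((xl[0] - xl[1]) / (yl[0] - yl[1]))
    phi3 = math.atan((xl[2] - xl[3]) / (yl[2] - yl[3]))
    omega = k2 * (phi2 + phi3) / 2.0

    # Differential-drive inverse kinematics.
    omega_R = (vx + h * omega) / Rw
    omega_L = (vx - h * omega) / Rw

    # Convert rad/s to RPM (multiply by 30/pi).
    return omega_L * 30.0 / math.pi, omega_R * 30.0 / math.pi
```

For a subject standing square to the sensor (all xl equal), both angle estimates are zero and the two wheels receive identical commands, so the vehicle drives straight to close the distance error.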

(36) The system 100 can be configured such that if the subject starts falling backward (as determined by the one or more processors of the onboard computing device 110 using distance measurements from the LIDAR sensor 108), the mobile base vehicle 102 can advance forward and the soft padded back support 114 of the end effector 112 can push the subject's waist forward to prevent the subject from falling. The system 100 can also be configured such that if the subject starts falling sideways (for example, as little as 2 to 5 degrees), the second servomotor 124 rotates the end effector 112 (in a plane approximately parallel with the floor) and at least one of the soft padded side supports 116 pushes the subject back toward the upright position.
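The lateral-fall response above amounts to a deadband plus a counter-rotation. A minimal sketch, assuming a proportional correction and a 2-degree threshold (both values are assumptions for illustration, not taken from the patent):

```python
def corrective_rotation(measured_rotation_deg, threshold_deg=2.0, k=1.0):
    """Return a corrective end-effector rotation command in degrees.

    If the initial rotation of the end effector (caused by the subject
    leaning on a side support) exceeds the preset threshold, command a
    proportional counter-rotation back toward the default position;
    otherwise hold position.
    """
    if abs(measured_rotation_deg) <= threshold_deg:
        return 0.0  # within deadband: no servo command
    return -k * measured_rotation_deg
```

The deadband prevents the second servomotor from reacting to ordinary swaying while walking; only a sustained lean beyond the threshold triggers the push back toward upright.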

(37) The system 100 can be powered by one or more batteries. The batteries can power the various motors and onboard electronic components. The batteries can be rechargeable and allow the system 100 to be recharged while not in use and overnight. Spare batteries can also be used, if required, for uninterrupted operation of the system 100 while the original batteries recharge.

(38) In some implementations, the system 100 can be powered by two 3000 mAh 11.1v lithium polymer (LiPo) batteries. The two LiPo batteries can power the DC motors, servomotors, motor drivers, microcontroller, and onboard computing device 110, and any other electronic component requiring power.

(39) The operation time of the batteries in hours can be calculated as follows: Time=C/A, where C is the capacity of the battery in amp-hours, and A is the average current draw in amps.
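As a quick check of this formula: a 3000 mAh (3.0 Ah) pack at an assumed average draw of 1.5 A (an illustrative figure, not from the disclosure) runs for two hours.

```python
def runtime_hours(capacity_ah, avg_current_a):
    """Battery operation time in hours: Time = C / A."""
    return capacity_ah / avg_current_a

# 3.0 Ah pack at an assumed 1.5 A average draw:
# runtime_hours(3.0, 1.5) -> 2.0 hours
```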

(40) The system 100 can comprise three modes of operation. The three modes of operation can be selected by the subject or a user (e.g., a healthcare professional) using an RF transmitter 128.

(41) The primary mode of operation can be an autonomous mode. After selecting this mode, the system 100 will not become active unless the subject is properly positioned within the end effector 112 of the robotic arm 106. For example, the subject can be properly positioned within the end effector 112 when the waist or a truncal region of the subject is surrounded on three sides by the back support 114 and the side supports 116. The system 100 can use the LIDAR sensor 108 and the onboard computing device 110 to determine whether the subject is properly positioned within the end effector 112.

(42) The second mode can be a remote-control mode. This mode can be used to move the system 100 from a storage or charging location to the operation area for use by the subject. The operator (e.g., a caregiver or other medical professional) can communicate with the system 100 by applying user input to the RF transmitter 128.

(43) The third mode can be an emergency stop (or E-Stop) mode. The emergency stop mode can be a default mode of the system 100. In order to operate the system 100, the emergency stop mode has to be turned off by pressing a button on the RF transmitter 128 instructing the system 100 to enter another mode. Once the system 100 is in operation in another mode, an emergency stop button can be pressed to shut power to the DC motors and stop the mobile base vehicle 102 from moving (at which point, the system 100 is once again in the emergency stop mode).

(44) The RF transmitter 128 can be a handheld RF transmitter. In some implementations, the handheld RF transmitter can be a specialized controller or remote control with push buttons marked, for example, “A,” “B,” and “C.” Pressing the A button can command the system 100 to enter the autonomous mode, and the system 100 can become active when the subject is properly positioned within the end effector 112 as discussed above. Pressing the B button can command the system 100 to enter the emergency stop mode. Power to the DC motors can be immediately shut down once the B button has been pressed. Pressing the C button can command the system 100 to enter the remote-control mode.
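The three modes and the button mapping described in paragraphs (40) through (44) can be summarized as a small state machine. This is a sketch of the described behavior only; the mode names and the motor-power flag are assumptions for illustration.

```python
class SupportSystemModes:
    """Mode state machine: E-Stop (default), autonomous, remote control."""

    def __init__(self):
        self.mode = "E_STOP"        # emergency stop is the default mode
        self.motor_power = False    # DC motors unpowered until a mode is selected

    def press(self, button):
        if button == "A":           # autonomous mode (becomes active only
            self.mode = "AUTONOMOUS"    # once the subject is properly positioned)
            self.motor_power = True
        elif button == "B":         # emergency stop: cut power to the DC motors
            self.mode = "E_STOP"
            self.motor_power = False
        elif button == "C":         # remote-control mode for repositioning
            self.mode = "REMOTE"
            self.motor_power = True
```

Note that from any operating mode, a single B press returns the system to the emergency stop state with motor power removed.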

(45) The system 100 can further comprise a touch screen display (e.g., a light-emitting diode (LED) or liquid crystal display (LCD) display). The touch screen display can be coupled to the mobile base vehicle 102 or to at least part of the robotic arm 106. The touch screen display can be electrically coupled or in electrical communication with the onboard computing device 110. The touch screen display can provide useful information concerning the system 100 to the subject or a healthcare professional including a distance traveled by the subject, how often the subject required assistance (how often the system 100 engaged the subject to potentially prevent a fall), error messages during operation, and the current mode of operation of the system 100.

(46) FIG. 3 is a schematic drawing illustrating the top of the mobile base vehicle 102. As will be discussed in more detail in the following sections, FIG. 3 depicts certain dimensions and geometric parameters of the mobile base vehicle 102 for mathematical modeling. Table 1 below provides example measurements for one implementation of the mobile base vehicle 102.

(47) FIG. 4 is a schematic drawing illustrating the side of the autonomous robotic mobile support system 100. As will be discussed in more detail in the following sections, FIG. 4 depicts certain dimensions and geometric parameters of the system 100 for mathematical modeling. Table 1 below provides example measurements for one implementation of system 100.

(48) FIGS. 5A and 5B are schematic drawings illustrating a top and side, respectively, of the robotic arm 106. FIGS. 5A and 5B depict certain dimensions of the robotic arm 106 including the lengths of the base arm linkage 118 (“link 0” or l.sub.0), the first servomotor 122, the first arm linkage 120 (“link 1” or l.sub.1), and the second servomotor 124 and the length and width of components of the second arm linkage or end effector 112 (“link 2” or l.sub.2). Table 1 below provides example measurements for one implementation of the robotic arm 106.

(49) For example, the lengths of the base arm linkage 118 and the first arm linkage 120 can be adjusted along with angles α.sub.0 and α.sub.1 to accommodate a height of the subject.

(50) Mathematical Model

(51) Presented below is a mathematical model for constructing and autonomously operating one implementation of the system 100 disclosed herein.

(52) Geometric and Mass Properties

(53) Tables 1 and 2 below list example geometric and mass parameters for one implementation of the mobile base vehicle 102 and robotic arm 106. In addition, the tables list motion parameters required for mathematical modeling and development of a motion controller for autonomous operation of the system 100.

(54) TABLE 1. List of example parameters in SI units

Parameter | Definition | Value
l | Axle to caster distance | 0.330 m
h | Wheel to wheel half distance | 0.152 m
l.sub.b | Base to axle distance | 0.0 m
R.sub.w | Wheel radius | 0.045 m
h.sub.e | Range of total desired height of the end-effector | 0.914-1.291 m
d.sub.e | Desired distance from caster to the end-effector | 0.406 m
l.sub.0.sup.l, l.sub.1.sup.s | Lengths of link 0 and servo 1 | 1.000, 0.050 m
l.sub.1.sup.l, l.sub.2.sup.s | Lengths of link 1 and servo 2 | 0.203, 0.050 m
l.sub.2b | Length of link from servo 2 to the end-effector base | 0.000 m
w.sub.e | Half of the width of the end-effector | 0.305 m
t.sub.e | Additional end-effector tip thickness | 0.025 m
m.sub.1.sup.s | Mass of servo/gear 1 assembly | 0.180 kg
m.sub.2.sup.s | Mass of servo/gear 2 | 0.180 kg
ρ | Density per unit length of 8020 part 2020 | 0.441 kg/m
m.sub.v | Total mass of the base vehicle | 28.0 kg
h.sub.v | Height of the base of the arm from ground | 0.100 m

(55) TABLE 2. List of example parameters in SI units for h.sub.e = 1.067 m

Parameter | Definition | Value
d | Overall center of mass (CM) distance to axle | 0.184 m
α.sub.0 | Link 0 fixed angle from the horizontal | 70.3°
α.sub.1 | Link 1 angle from the horizontal | −59.1° to 32.5°
θ.sub.10 | Link 1 angle relative to link 0 | −129.4° to 27.4°
θ.sub.21 | Link 2 angle relative to link 1 | −32.5° to 59.1°
m.sub.0 | Total mass of link 0 including servo 1 & other parts | 0.621 kg
m.sub.1 | Total mass of link 1 including servo 2 & other parts | 0.27 kg
m.sub.2 | Total mass of link 2 including end-effector & other parts | 0.693 kg
m | Total mass of the system | 29.584 kg
x | Global x-position of the CM of the whole robot | —
y | Global y-position of the CM of the whole robot | —
θ | Orientation of the mobile base vehicle relative to the global X-axis | —
θ.sub.1 | Servo 1 joint angle | Range: ±45° (will likely be between −10° and 30°)
θ.sub.2 | Servo 2 joint angle | Range: ±45° (default at 0°; returns to 0° after angle changes due to contact with the subject)
v.sub.x | Forward velocity | —
v.sub.y | Lateral velocity | —
ω | Angular velocity | —
ω.sub.R | Right wheel angular velocity | —
ω.sub.L | Left wheel angular velocity | —
XYZ & e.sub.x, e.sub.y | Global & body-fixed reference frames | —
Link Lengths and Angles:
$$l_0 = l_0^l + l_1^s,\qquad l_1 = l_1^l + l_2^s\cos\alpha_1,\qquad l_2 = l_{2b} + l_e$$
Overall Geometry:

(56)
$$d_t = l + d_e,\qquad d_l = d_t - l_2 - l_2^s,\qquad h_e' = h_e - h_v$$
$$\theta_1 = -\cos^{-1}\!\left(\frac{d_l^2 + h_e'^2 - l_0^2 - (l_1^l)^2}{2\,l_0\,l_1^l}\right),\qquad \alpha_0 = \phi + \cos^{-1}\!\left(\frac{A}{\sqrt{h_e'^2 + d_l^2}}\right)$$
$$\alpha_1 = \alpha_0 + \theta_1,\qquad A = \frac{d_l^2 + h_e'^2 + l_0^2 - (l_1^l)^2}{2\,l_0},\qquad \phi = \tan^{-1}\!\left(\frac{h_e'}{d_l}\right)$$
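As a check on this geometry, the inverse-kinematics relations above can be coded directly. The following Python sketch is illustrative; the horizontal reach d_l used in the example check is an assumed value (the disclosure does not state it directly), and all other inputs come from Table 1.

```python
import math

def arm_angles(d_l, h_e, h_v, l_0, l_1l):
    """Solve the two-link arm angles from the relations above.

    d_l  : horizontal distance from the arm base to the end-effector (m)
    h_e  : desired total end-effector height above the ground (m)
    h_v  : height of the arm base above the ground (m)
    l_0  : length of link 0 including servo 1 (m)
    l_1l : length of link 1 (m)
    Returns (alpha_0, theta_1, alpha_1) in degrees.
    """
    h_ep = h_e - h_v                      # h_e' = h_e - h_v
    r2 = d_l ** 2 + h_ep ** 2
    # theta_1 = -acos((d_l^2 + h_e'^2 - l_0^2 - (l_1^l)^2) / (2 l_0 l_1^l))
    theta_1 = -math.degrees(
        math.acos((r2 - l_0 ** 2 - l_1l ** 2) / (2 * l_0 * l_1l)))
    # alpha_0 = phi + acos(A / sqrt(h_e'^2 + d_l^2))
    A = (r2 + l_0 ** 2 - l_1l ** 2) / (2 * l_0)
    phi = math.degrees(math.atan2(h_ep, d_l))
    alpha_0 = phi + math.degrees(math.acos(A / math.sqrt(r2)))
    alpha_1 = alpha_0 + theta_1           # alpha_1 = alpha_0 + theta_1
    return alpha_0, theta_1, alpha_1
```

A forward-kinematics round trip (l.sub.0 cos α.sub.0 + l.sub.1.sup.l cos α.sub.1 = d.sub.l, and likewise for the height) recovers the target point, which is a convenient self-test of the solution.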
Masses:
$$m_0 = m_0^l + m_1^s,\qquad m_1 = m_1^l + m_2^s,\qquad m_2 = m_{2b} + 2(m_{l_e} + m_{w_e} + m_{t_e}) + m_e$$
Total mass: $m = m_v + m_0 + m_1 + m_2$
Center of Mass (CM) of Each Link:

(57)
$$a_0 = \left[\frac{m_0^l\,l_0^l}{2} + m_1^s\left(l_0^l + \frac{l_1^s}{2}\right)\right]\Big/\,m_0$$
$$a_1 = \left[\frac{m_1^l\,l_1^l}{2} + m_2^s\left(l_1^l + \frac{l_2^s\cos\alpha_1}{2}\right)\right]\Big/\,m_1$$
$$a_2 = \left[\frac{m_{2b}\,l_{2b}}{2} + m_{w_e}\,l_{2b} + m_{l_e}\left(l_{2b} + \frac{l_e}{2}\right) + (m_{t_e} + m_e)(l_{2b} + l_e)\right]\Big/\,m_2$$
The overall CM is at distance d from the axle:
$$d = \big[m_v d_v + m_0(l_b + a_0\cos\alpha_0) + m_1(l_b + l_0\cos\alpha_0 + a_1\cos\alpha_1) + m_2(l_b + l_0\cos\alpha_0 + l_1\cos\alpha_1 + a_2)\big]/m$$
Tip-Over Stability Requirements

(58) The risk of the system 100 tipping over when subjected to back forces F.sub.b or side forces F.sub.s from the subject is mitigated by several design features. The various forces are illustrated in FIGS. 5A and 5B. The tip-over risk depends on the magnitude of these forces, the height of the robotic arm h.sub.e, the overall CM distance from the axle d, and the wheel to wheel half distance h. The stability conditions are as follows:
$$F_s \cdot h_e \le m g \cdot h \;\Rightarrow\; m g \ge \frac{F_s \cdot h_e}{h} \quad \text{(lateral stability)}$$
$$F_b \cdot h_e \le m g \cdot d \;\Rightarrow\; m g \ge \frac{F_b \cdot h_e}{d} \quad \text{(longitudinal stability)}$$
Considering practical values of h = d = 0.2 m and an arm height of h.sub.e = 1 m (for a relatively tall subject), the system 100 must weigh at least 5 times more than F.sub.b or F.sub.s. The magnitudes of these forces depend in part on the subject's weight. For example, to withstand a 10-lb lateral or longitudinal force, the system 100 must weigh about 50 lbs. Since the system 100 is designed to catch the subject before the subject becomes unstable, no force is expected to exceed 10 lbs, and the system 100 is designed to handle those force levels.

(59) A minimum weight of the mobile base vehicle 102 can be determined from a height of the subject (H), a weight of the subject (W.sub.P), and the equation:
$$W_R = \frac{H \cdot W_P \cdot \sin\beta}{2h}$$
where β is a tilt angle of the subject as measured from an upright position (where 0 ≤ β ≤ 5°), and h is one-half of a distance separating the two differentially driven wheels. For example, if a width of the mobile base vehicle 102 is 24 inches, h = 12 in = 0.305 m. The minimum weight of the mobile base vehicle 102 can be calculated using a 5-degree tilt angle to provide a margin of safety relative to the 2-degree engagement threshold.
Planned Scenario, β ≤ 5°:
For a 200-lb, 6 ft 2 in. subject tilted a full 5° from upright: W.sub.R = 24.4 kg = 53.7 lbs
For a 160-lb, 5 ft 9 in. subject tilted a full 5° from upright: W.sub.R = 18.2 kg = 40.1 lbs
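The two scenarios above can be reproduced with a short Python sketch of the W.sub.R equation. The unit-conversion constants are standard; the function name and default arguments are illustrative, not part of the disclosure.

```python
import math

IN_TO_M = 0.0254        # inches to meters
LB_TO_KG = 0.45359237   # pounds to kilograms

def min_vehicle_weight_kg(height_in, weight_lb, beta_deg=5.0, h=0.3048):
    """Minimum weight of the mobile base vehicle: WR = H * WP * sin(beta) / (2h).

    height_in : subject height H in inches
    weight_lb : subject weight WP in pounds
    beta_deg  : tilt angle from upright (0 <= beta <= 5 degrees)
    h         : half the wheel separation in meters (12 in for a 24-in-wide base)
    """
    H = height_in * IN_TO_M
    WP = weight_lb * LB_TO_KG
    return H * WP * math.sin(math.radians(beta_deg)) / (2 * h)
```

For the 6 ft 2 in, 200-lb subject this evaluates to about 24.4 kg, and for the 5 ft 9 in, 160-lb subject about 18.2 kg, matching the planned-scenario figures above.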
Control System

(60) Below is a description of control system software and hardware that allows the system 100 to operate autonomously. The LIDAR sensor 108 can undertake distance measurements between the LIDAR sensor 108 and defined points 700 along the dorsum region of the subject (see, e.g., FIGS. 7A and 8). The LIDAR sensor 108 can transmit such measurements to the onboard computing device 110 (e.g., an onboard Raspberry Pi® computer) and the onboard computing device 110 can process such data and determine a speed and direction of the mobile base vehicle 102. The onboard computing device 110 can then transmit commands to a microcontroller (e.g., an Arduino® board) electrically coupled or in electrical communication with the onboard computing device 110. Software running on the microcontroller can control the DC motors powering the differentially driven wheels 104 based on commands from the onboard computing device 110. The microcontroller can also maintain the angles (θ.sub.1, θ.sub.2) of the robotic arm 106 through the two servomotors.

(61) The system 100 can be configured to follow a continuous, albeit variable trajectory. The trajectory can be defined by the subject, as the system 100 follows the subject from a safe distance behind the subject, responding to variations in cadence, direction, velocity and angular movements. The software can use distance measurements from the LIDAR sensor 108 (i.e., distance measurements to multiple defined points 700 across the back and/or waist/hips of the subject, see FIG. 7A) to ensure the system 100 follows the subject with the robotic arm ready to help stabilize the subject if the subject begins to fall backwards and/or to the side.

(62) Below are several examples of the system in operation:

(63) 1) When the subject is walking forward or in a substantially straight-line trajectory, the system 100 can sense the changing distance and adjust the speed of the mobile base vehicle 102 in response to variations in the subject's velocity of gait.

(64) 2) If the subject turns left or right, the system 100 can sense a change in the distances separating the LIDAR sensor 108 (and, by extension, the system 100) from the defined points 700 (see, e.g., FIG. 7A) on the back and/or waist/hips of the subject and the system 100 can use such distance measurements to calculate wheel rotational speeds that would allow the mobile base vehicle 102 to also turn left or right.
3) If the subject falls backward, the system 100 can sense a change in the distance separating the LIDAR sensor 108 from the subject, instruct the mobile base vehicle 102 to increase its speed, and use the padded back support 114 (or a combination of the padded back support 114 and at least one of the padded side supports 116) of the end effector 112 to push the subject back to a straight or upright position.

4) If the subject starts falling laterally, the subject will push or apply a force to at least one of the side supports 116 and angle θ.sub.2 (see FIG. 4) will become non-zero. Once this angle reaches or exceeds a preset threshold rotation value (±2 degrees), the system 100 can use the second servomotor 124 to straighten the end effector 112 and push the subject back to a straightened or upright position.

(65) FIG. 6A illustrates a perspective view of a back support component 600 used to construct the back support 114 of the end effector 112 of the robotic arm 106. FIG. 6B illustrates a perspective view of side support components 602 used to construct the side supports 116 of the end effector 112 of the robotic arm 106.

(66) The back support component 600 and the side support components 602 can be made in part of a rigid material such as a rigid metallic material. For example, the back support component 600 and the side support components 602 can be made in part of aluminum or aluminum alloy, stainless steel, titanium, or a combination thereof.

(67) As shown in FIGS. 6A and 6B, each of the side support components 602 can comprise a narrowed connector portion 604. The narrowed connector portions 604 can be configured to fit or extend (at least partially) into a connecting cavity 606 of the back support component 600. Once the narrowed connector portions 604 are extended or slid into the connecting cavity 606, the side support components 602 can be coupled to the back support component 600 using nuts and bolts, screws, or a combination thereof. A distance or width separating the two side support components 602 can be adjusted to accommodate a girth of the subject.

(68) The back support component 600 and the side support components 602 can be covered (at least partially) by a soft padding or cushioning material. The back support component 600 and the side support components 602 can also serve as a frame for one or more padded coverings or padded portions. The padded back support 114 and side supports 116 can be the only parts of the system 100 that come into physical contact with the subject to support the subject and/or prevent the subject from falling.

(69) As discussed above, the system 100 can follow the subject from a safe distance behind the subject and adjust the speed of the mobile base vehicle 102 in response to a speed of the subject. For example, the one or more processors of the onboard computing device 110 can be programmed to execute further instructions to determine a forward velocity (v.sub.x) of the mobile base vehicle 102 using the equation:
$$v_x = k_1 \cdot (xl_{min} - dist)$$
where xlmin is a minimum distance from the set of distance measurements made by the LIDAR sensor 108 (e.g., distance measurements between the LIDAR sensor 108 and the defined points 700 along the dorsum region of the subject), dist is a predefined desired distance, and k1 is a first proportional control gain. The proportional control gain can be a ratio of an output response to an error signal.
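This proportional follow law can be sketched in a few lines of Python. The gain k1 and desired distance dist shown below are illustrative placeholders, not values given in the disclosure.

```python
def forward_velocity(lidar_distances, dist=0.5, k1=1.5):
    """Proportional follow control: v_x = k1 * (xlmin - dist).

    lidar_distances : distances (m) to the defined points along the subject's back
    dist            : predefined desired following distance (m); illustrative value
    k1              : first proportional control gain; illustrative value
    """
    xlmin = min(lidar_distances)   # closest defined point on the subject
    return k1 * (xlmin - dist)
```

The commanded velocity is positive (the base advances) when the subject is farther than the desired distance, and zero when the closest point sits exactly at the setpoint.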

(70) The forward velocity can be augmented by an angular speed to ensure the mobile base vehicle 102 can turn either left or right in response to a left or right turn of the subject.

(71) For example, the one or more processors of the onboard computing device 110 can be programmed to execute further instructions to determine an angular velocity (ω) of the mobile base vehicle 102 using the equation:

(72)
$$\omega = k_2 \cdot \frac{\phi_2 + \phi_3}{2}$$
where k2 is a second proportional control gain, and ϕ.sub.2 and ϕ.sub.3 are angles of rotation of the subject determined by the equations:

(73)
$$\phi_2 = \operatorname{atan}\frac{xl(\text{first selected defined point}) - xl(\text{second selected defined point})}{yl(\text{first selected defined point}) - yl(\text{second selected defined point})}$$
$$\phi_3 = \operatorname{atan}\frac{xl(\text{third selected defined point}) - xl(\text{fourth selected defined point})}{yl(\text{third selected defined point}) - yl(\text{fourth selected defined point})}$$

(74) where yl and xl represent the (y, x) position of the defined points 700 (see FIG. 7A) across the dorsum of the subject as measured by the LIDAR sensor 108.

(75) FIG. 7A illustrates example defined points 700 that can be used by the LIDAR sensor 108 to undertake distance measurements to the subject. In some instances, the defined points 700 can be a horizontally aligned set of points along a back of the subject. In other instances, the defined points 700 can be defined along a waist, lumbar region, and/or hips of the subject. In instances where the system 100 comprises a leg LIDAR sensor 126, the defined points 700 can be defined along the legs (e.g., along the hamstrings or calves) of the subject.

(76) As shown in FIG. 7A, an origin point 702 can be located in line or aligned with a center of the LIDAR sensor 108, x(i) represents a distance to a selected defined point in the x-direction where the positive x-axis is directly in front of the LIDAR sensor 108, and y(i) represents a distance to the selected defined point in the y-direction where the positive y-axis is toward a right-hand side of the subject.

(77) It has been discovered by the applicant that certain defined points 700 can be selected that can yield more accurate angle calculations for the equations above. For example, the first selected defined point 704 can be point 4 in FIG. 7A, the second selected defined point 706 can be point 9 in FIG. 7A, the third selected defined point 708 can be point 5 in FIG. 7A, and the fourth selected defined point 710 can be point 8 in FIG. 7A.

(78) Thus, the angle in each case (ϕ.sub.2, ϕ.sub.3) can be determined using a basic right-angle triangle as shown in FIG. 7B:

(79)
$$\phi_2 = \operatorname{atan}\frac{xl(4) - xl(9)}{yl(4) - yl(9)},\qquad \phi_3 = \operatorname{atan}\frac{xl(5) - xl(8)}{yl(5) - yl(8)}$$

(80) Although 11 points are shown in FIG. 7A, it is contemplated by this disclosure and it should be understood by one of ordinary skill in the art that the defined points 700 can range from 8 points up to 20 points. In other implementations, the defined points 700 can range from 20 points to 30 points or greater.
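The turn estimate from the paired defined points (points 4/9 and 5/8 of FIG. 7A) can be sketched as follows in Python. The gain k2 and the sample coordinates in the usage check are illustrative assumptions, not values from the disclosure.

```python
import math

def turn_rate(points, k2=1.0):
    """Commanded angular velocity omega = k2 * (phi_2 + phi_3) / 2.

    points : dict mapping defined-point index -> (xl, yl) in the LIDAR frame,
             with xl forward and yl toward the subject's right-hand side
    k2     : second proportional control gain; illustrative value
    """
    def pair_angle(i, j):
        # phi = atan([xl(i) - xl(j)] / [yl(i) - yl(j)])
        (xi, yi), (xj, yj) = points[i], points[j]
        return math.atan((xi - xj) / (yi - yj))
    phi_2 = pair_angle(4, 9)
    phi_3 = pair_angle(5, 8)
    return k2 * (phi_2 + phi_3) / 2.0
```

When the subject is square to the sensor, both point pairs sit at equal depth and the commanded angular velocity is zero; when one side of the back moves farther away (the subject turning), the average of the two pair angles yields a nonzero turn command.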

(81) As previously discussed, the system 100 can comprise two differentially driven wheels 104 where each wheel is independently driven by a DC motor. The one or more processors of the onboard computing device 110 can also be programmed to execute further instructions to determine a right wheel angular velocity (ω.sub.R) and a left wheel angular velocity (ω.sub.L) using the kinematic equations:

(82)
$$\omega_R = \frac{v_x + h\,\omega}{R_w},\qquad \omega_L = \frac{v_x - h\,\omega}{R_w}$$
where ω is the angular velocity, where h is one-half of a distance separating the two differentially driven wheels, and R.sub.w is a radius of at least one of the differentially driven wheels.

(83) The right wheel angular velocity (ω.sub.R) and the left wheel angular velocity (ω.sub.L) can then be used to calculate desired wheel speeds in revolutions per minute (RPM).

(84) For example, the one or more processors can be programmed to execute further instructions to determine a left wheel RPM (RPM.sub.L_Des) and a right wheel RPM (RPM.sub.R_Des) using the following equations:

(85)
$$RPM_{L\_Des} = \omega_L \cdot \frac{30}{\pi},\qquad RPM_{R\_Des} = \omega_R \cdot \frac{30}{\pi}$$
where ω.sub.R is the right wheel angular velocity and ω.sub.L is the left wheel angular velocity.
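These two kinematic steps, the wheel angular velocities and their conversion to RPM, can be combined in a short Python sketch. The defaults for h and R.sub.w are the Table 1 values; the function name is illustrative.

```python
import math

def wheel_speeds_rpm(v_x, omega, h=0.152, R_w=0.045):
    """Differential-drive kinematics and RPM conversion.

    omega_R = (v_x + h*omega) / R_w,  omega_L = (v_x - h*omega) / R_w  [rad/s]
    RPM = omega_wheel * 30 / pi
    h   : half the wheel-to-wheel distance (m), R_w : wheel radius (m)
    Returns (RPM_L_Des, RPM_R_Des).
    """
    omega_R = (v_x + h * omega) / R_w
    omega_L = (v_x - h * omega) / R_w
    return omega_L * 30.0 / math.pi, omega_R * 30.0 / math.pi
```

Driving straight (ω = 0) yields equal wheel RPMs, while a positive ω (a left turn in this convention) commands the right wheel faster than the left.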

(86) When a subject begins to fall backwards, the wheel RPMs (the left wheel and right wheel RPMs) can be set to 25 RPM in order to provide a stabilizing push. This value can be adjusted depending on the weight of the subject to provide better support.

(87) Once the wheel RPMs are calculated, these values are sent from the onboard computing device 110 (e.g., the onboard Raspberry Pi® computer) to the microcontroller (e.g., the Arduino® control board), which uses a proportional-integral (PI) controller to maintain the desired wheel speeds. Each motor is equipped with a rotary encoder that determines the actual wheel RPM (RPM.sub.L, RPM.sub.R) as follows:

(88)
$$DCounter_L = Counter_L - lastCounter_L,\qquad DCounter_R = Counter_R - lastCounter_R$$
$$RPM_L = DCounter_L \cdot \tfrac{5}{4},\qquad RPM_R = DCounter_R \cdot \tfrac{5}{4}$$
where DCounter is the change in encoder count, Counter is the current encoder position, and lastCounter is the previous counter position. The error is determined by taking the difference between the actual and desired wheel RPMs as follows:

(89)
$$Error_L = RPM_{L\_Des} - RPM_L,\qquad Error_R = RPM_{R\_Des} - RPM_R$$
The sum of the error Error_sum is given as:

(90)
$$Error\_sum_L = Error_L + lastError_L,\qquad Error\_sum_R = Error_R + lastError_R$$
where lastError is the error sum from the previous step. The controller is then defined as:

(91)
$$PWM_L = k_{pL} \cdot Error_L + k_{iL} \cdot Error\_sum_L,\qquad PWM_R = k_{pR} \cdot Error_R + k_{iR} \cdot Error\_sum_R$$
where (k.sub.pL, k.sub.iL) and (k.sub.pR, k.sub.iR) are the left and right wheel proportional and integral gains, respectively, and PWM.sub.L, PWM.sub.R are the pulse-width modulation scales that determine the DC motor speeds.
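Putting the encoder and PI equations together, one per-wheel control step might look like the following Python sketch. The gains k_p and k_i are illustrative; the disclosure does not specify their values.

```python
def encoder_rpm(counter, last_counter):
    """Actual wheel RPM from the rotary encoder: RPM = DCounter * 5/4."""
    return (counter - last_counter) * 5.0 / 4.0

def pi_step(rpm_des, rpm_meas, error_sum, k_p=2.0, k_i=0.5):
    """One step of the per-wheel proportional-integral speed controller.

    error_sum holds the running Error_sum carried over from the previous step;
    k_p and k_i are illustrative gains. Returns (pwm, new_error_sum).
    """
    error = rpm_des - rpm_meas              # Error = RPM_Des - RPM
    error_sum = error + error_sum           # Error_sum = Error + lastError
    pwm = k_p * error + k_i * error_sum     # PWM = kp*Error + ki*Error_sum
    return pwm, error_sum
```

A persistent speed error makes the accumulated Error_sum, and hence the PWM command, grow each step, which is what drives the steady-state error toward zero.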

(92) FIG. 8 illustrates that the robotic mobile support system 100 can autonomously follow a subject from a posterior direction and the robotic arm 106 can support the subject in the event of a fall. The system 100 can autonomously follow the subject without the subject having to physically contact (e.g., hold on to) any part of the system 100 or use a joystick or other type of controller. The system 100 can react to movements of the subject (e.g., movements indicating that a fall may occur) and provide support for the subject via the robotic arm 106 without the subject having to command or alert the system 100.

(93) Each of the individual variations or embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other variations or embodiments. Modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s) or step(s) to the objective(s), spirit or scope of the present invention.

(94) Methods recited herein may be carried out in any order of the recited events that is logically possible, as well as the recited order of events. Moreover, additional steps or operations may be provided or steps or operations may be eliminated to achieve the desired result.

(95) Furthermore, where a range of values is provided, every intervening value between the upper and lower limit of that range and any other stated or intervening value in that stated range is encompassed within the invention. Also, any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. For example, a description of a range from 1 to 5 should be considered to have disclosed subranges such as from 1 to 3, from 1 to 4, from 2 to 4, from 2 to 5, from 3 to 5, etc. as well as individual numbers within that range, for example 1.5, 2.5, etc. and any whole or partial increments therebetween.

(96) All existing subject matter mentioned herein (e.g., publications, patents, patent applications) is incorporated by reference herein in its entirety except insofar as the subject matter may conflict with that of the present invention (in which case what is present herein shall prevail). The referenced items are provided solely for their disclosure prior to the filing date of the present application. Nothing herein is to be construed as an admission that the present invention is not entitled to antedate such material by virtue of prior invention.

(97) Reference to a singular item, includes the possibility that there are plural of the same items present. More specifically, as used herein and in the appended claims, the singular forms “a,” “an,” “said” and “the” include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.

(98) In understanding the scope of the present disclosure, the term “comprising” and its derivatives, as used herein, are intended to be open-ended terms that specify the presence of the stated features, elements, components, groups, integers, and/or steps, but do not exclude the presence of other unstated features, elements, components, groups, integers and/or steps. The foregoing also applies to words having similar meanings such as the terms, “including”, “having” and their derivatives. Also, the terms “part,” “section,” “portion,” “member” “element,” or “component” when used in the singular can have the dual meaning of a single part or a plurality of parts. As used herein, the following directional terms “forward, rearward, above, downward, vertical, horizontal, below, transverse, laterally, and vertically” as well as any other similar directional terms refer to those positions of a device or piece of equipment or those directions of the device or piece of equipment being translated or moved. Finally, terms of degree such as “substantially”, “about” and “approximately” as used herein mean a reasonable amount of deviation (e.g., a deviation of up to ±0.1%, ±1%, ±5%, or ±10%, as such variations are appropriate) from the specified value such that the end result is not significantly or materially changed.

(99) This disclosure is not intended to be limited to the scope of the particular forms set forth, but is intended to cover alternatives, modifications, and equivalents of the variations or embodiments described herein. Further, the scope of the disclosure fully encompasses other variations or embodiments that may become obvious to those skilled in the art in view of this disclosure.