WALKING ASSISTING SYSTEMS AND METHODS

20220323285 · 2022-10-13

    Abstract

    A walking assisting system for a user. The system includes first and second sub-systems each having at least one optical sensor, for example a camera. The first sub-system is fitted to the user's abdomen region, for example waist, and the second sub-system is fitted to the user's head region, for example eyes. The system is arranged to align between data obtained from optical sensors of the first and second sub-systems in order to indicate to the user presence of obstacles detected by the first sub-system in a coordinate system of the second sub-system.

    Claims

    1. A walking assisting system for a user, the system comprising first and second sub-systems, wherein the first sub-system comprises at least one depth camera and an Inertial Measurement Unit (IMU) comprising three accelerometers and at least one gyroscope, and the second sub-system comprises an Inertial Measurement Unit (IMU) comprising three accelerometers, the first sub-system being a belt-mounted sub-system that is fitted to the user's waist, and the second sub-system being fitted to another region of the user's body, for example the head, in particular the eyes, or a hand, wherein the system is arranged, during walking of the user, to assess by the first sub-system a real time walking pattern of the user and obstacles in a footpath of the user, and to align between data obtained from the first and second sub-systems in order to indicate to the user presence of such footpath dependent obstacles detected by the first sub-system in a coordinate system of, or related to, the second sub-system.

    2. The walking assisting system of claim 1, wherein accelerometers of an IMU are arranged to derive a Cartesian coordinate system associated with a sub-system including the IMU.

    3. The walking assisting system of claim 1, wherein the optical sensor of the first sub-system is arranged to sense towards a direction of movement of the user, for example a walking direction of the user.

    4. The walking assisting system of claim 1, wherein indication to a user of presence of obstacles is by any one of: augmented reality technology, sound, vibration, visual indication by a marker (e.g. a laser marker pointing to the obstacle).

    5. A method for assisting a user in walking, comprising the steps of: providing a system comprising first and second sub-systems, wherein the first sub-system comprises at least one depth camera and an Inertial Measurement Unit (IMU) comprising three accelerometers and at least one gyroscope, and the second sub-system comprises an Inertial Measurement Unit (IMU) comprising three accelerometers; fitting the first sub-system as a belt-mounted sub-system to the user's waist; and fitting the second sub-system to another region of the user's body, for example the head, in particular the eyes, or a hand; wherein potential obstacles detected during walking of the user in coordinates of sensed data of the first sub-system undergo a transformation to be presented in a correct location in coordinates of sensed data of the second sub-system.

    6. The method of claim 5, wherein accelerometers of an IMU are arranged to derive a Cartesian coordinate system associated with a sub-system including the IMU.

    7. The method of claim 5, wherein the optical sensor of the first sub-system is arranged to sense towards a direction of movement of the user, for example a walking direction of the user.

    8. The method of claim 5, wherein indication to a user of presence of obstacles is by any one of: sound, vibration, visual indication by a marker (e.g. a laser marker pointing to the obstacle).

    9. A walking assisting system for a user, comprising a belt-mounted sub-system that is fitted to the user's waist, wherein the sub-system comprises at least one depth camera and an Inertial Measurement Unit (IMU) comprising three accelerometers and at least one gyroscope, and wherein real time data obtained by the sub-system during walking of the user is used to assess a real time walking pattern of the user and to alert the user of obstacles in his/her footpath in response to the real time walking pattern.

    10. The walking assisting system of claim 9, wherein assessing a real time walking pattern of the user comprises comparing it to a pre-measured/assessed normal or natural walking pattern of the same user.

    11. The walking assisting system of claim 9, wherein deriving a walking pattern comprises measuring at least one of the following parameters: stride length, velocity, walking symmetry, tiredness, a user's center of mass, and/or monitoring angular changes in the Cartesian coordinate system derived from the IMU.

    12. The walking assisting system of claim 11, wherein the pre-measured/assessed normal or natural walking pattern is derived from a ‘timed up and go’ (TUG) test, in which the user is tracked as he/she sits on a chair, stands up, and returns to sit down.

    13. The walking assisting system of claim 9, wherein the real time data obtained by the sub-system during walking of a user is arranged to assess also obstacles in a footpath of the user.

    14. The walking assisting system of claim 9, wherein the system is arranged to synchronize between measurements made by the camera and the IMU.

    15. The walking assisting system of claim 9, wherein data obtained by the IMU is used for determining the direction in which the camera is aimed.

    16. The walking assisting system of claim 9, wherein providing an alert to the user is also determined according to a real time distance of the user from an obstacle in his/her footpath, as obtained by the depth camera.

    17. The walking assisting system of claim 16, wherein an alert is provided if the detected obstacle is below a trigger distance ‘D’ from the user in his/her direction of advancement.

    18. The walking assisting system of claim 17, wherein determining whether to provide an alert to the user is according to sensed data obtained by the IMU.

    19. The walking assisting system of claim 18, wherein changes in sensed data obtained by the IMU determine whether to provide an alert to the user.

    20. The walking assisting system of claim 19, wherein the changes in the sensed data are changes in the sensed frequency of the stride of the user as obtained by the at least one gyroscope.

    Description

    BRIEF DESCRIPTION OF THE FIGURES

    [0110] Exemplary embodiments are illustrated in referenced figures. It is intended that the embodiments and figures disclosed herein are to be considered illustrative, rather than restrictive. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying figures, in which:

    [0111] FIGS. 1 to 3 schematically show a walking assisting system in accordance with at least certain embodiments of the present invention, fitted to a user viewed in different postures, where FIGS. 1 and 2 show a side view and FIG. 3 a top view of the user and system;

    [0112] FIG. 4 schematically shows a further possible embodiment of a walking assisting system of the present invention;

    [0113] FIGS. 5A and 5B schematically show a flow diagram relating to at least certain system embodiments and a possible time-lag aspect relating to such diagram; and

    [0114] FIGS. 6 to 10 schematically show various walking assisting systems in accordance with further embodiments of the present invention.

    [0115] It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated within the figures to indicate like elements.

    DETAILED DESCRIPTION

    [0116] Attention is first drawn to FIG. 1 schematically showing a user 5 fitted with a walking assisting system 1 according to at least certain embodiments of the invention. System 1 in this example is shown including a first sub-system 11 and a second sub-system 12. First sub-system 11 is here illustrated fitted to a waist region of the user and second sub-system 12 to a head of the user.

    [0117] First sub-system 11 includes in this example a sensor 111 aimed in a generally frontal direction, directed towards a direction of advancement of the user during walking. In certain embodiments, sensor 111 may be embodied as a depth camera. Second sub-system 12 in this example includes sensor 121 in the form of wearable computer glasses that are arranged to add information alongside or onto what the user sees.

    [0118] The sensor of the first sub-system 11, when embodied as a camera, may be arranged to have a field of view (FOV) 112 that has a central axis 1121 generally extending towards and/or along a center of FOV 112. The sensor of the second sub-system 12, when embodied as a camera, may be arranged to have a field of view (FOV) 122 that has a central axis 1122 generally extending towards and/or along a center of FOV 122.

    [0119] In certain cases, each one of the sensors 111, 121 may be adapted to sense information in a respective local coordinate system. In the illustrated example, a first coordinate system (denoted by a Cartesian coordinate system X1, Y1, Z1) is shown affixed to the first sub-system 11 and a second coordinate system (denoted by a Cartesian coordinate system X2, Y2, Z2) is shown affixed to the second sub-system 12. Hence, sensed data (such as image data captured by a camera type sensor) of any one of the sub-systems 11, 12 may be obtained in the respective local coordinate system of the sub-system.

    [0120] In accordance with various embodiments of the present invention, means may be employed for transforming and aligning data sensed by both the first and second sub-systems one towards the other and/or into a global coordinate system (denoted by a Cartesian coordinate system X.sub.G, Y.sub.G, Z.sub.G).

    [0121] In certain cases, such transforming and aligning may be facilitated by processing information in respective images captured by the first and second sub-systems, and/or by fitting devices such as Inertial Measurement Units (IMU) to both sub-systems in order to track their respective movements in space.

    [0122] In an aspect of the present invention, the first sub-system 11 may act to monitor possible obstacles 18 located ahead of the user that may be assessed for determining a potential hazard to the walker. Such sensed data may then be provided to the user via various means, such as, in the present example, as added information alongside what the user sees.

    [0123] The dotted rectangle at the upper left side of FIG. 1 (and FIGS. 2 and 3) represents what the user may see through his/her wearable computer glasses, and since the example in FIG. 1 demonstrates the user's attention being focused ahead in his/her direction of advancement, obstacle 18 may be seen in this view. Around obstacle 18, a marking 3 may be added to highlight the detected obstacle. Such marking 3 may be added due to processing taking place, e.g. in a processor included in the first sub-system, determining that the obstacle poses a hazard that should be reported to the user.

    [0124] In certain cases, a potential hazard posed by a detected obstacle may be reported to a user via other means, such as by any one of: sound, vibration, visual indication by a marker (e.g. a laser marker pointing to the obstacle) and the like.

    [0125] Attention is drawn to FIG. 2 illustrating a scenario where the user's attention is drawn downwards to his/her close vicinity, e.g. in order to be cautious with a next walking step being taken. In such a scenario the user may not be aware of the upcoming obstacle ahead that is not in his/her FOV, and hence may similarly be provided with a marking 3, here in the form of an arrow encouraging him/her to look upwards.

    [0126] Attention is drawn to FIG. 3 illustrating a scenario where the user's attention may be drawn sideways and thus again the user may not be aware of the upcoming obstacle ahead that is not in his/her FOV. Similarly a marking 3 here in the form of an arrow may be provided in order to encourage him/her to turn his/her attention back towards his/her direction of walking.

    [0127] In at least certain embodiments, the second sub-system 12 may be arranged to monitor a field of view (FOV) of the eyes of the user, while not necessarily making use of sensors in the form of a camera. FIG. 6 schematically illustrates such an embodiment, devoid of a camera in its second sub-system 12.

    [0128] Such monitoring and/or tracking of a FOV of the user's eyes may e.g. be accomplished by making use of a second sub-system 12 that only includes an IMU (and processor etc.) and using data derived from the IMU (i.e. from the inertial accelerometers) that is attached to the user's head region e.g. to his/her eyeglasses (or the like).

    [0129] Typically, during an initial walking phase when starting to walk, people usually tend to look down towards their walking direction, before feeling secure enough to rotate/lift their heads upwards/sidewards (or the like). During such initial walking phase, the IMU accelerometers of the first sub-system 11 and of the second sub-system 12 may be synchronized to form a ‘home’ alignment, and from this point onwards monitoring of movement in head direction vs walking direction may be computed by comparing angular deviations of the second sub-system's IMU accelerometers from the ‘home’ alignment.
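    The ‘home’ alignment and subsequent deviation tracking described in the paragraph above can be sketched as follows. This is an illustrative Python sketch, not part of the application: orientations are represented here as rotation matrices, and all helper names are hypothetical.

```python
import numpy as np

def rotation_z(angle_rad):
    """Rotation matrix about the vertical (Z) axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def home_alignment(r_waist, r_head):
    """Relative rotation of head vs. waist captured at the 'home' moment."""
    return r_waist.T @ r_head

def deviation_deg(r_waist, r_head, r_home):
    """Angle (degrees) by which the head has deviated from the 'home' alignment."""
    r_rel = r_waist.T @ r_head   # current head orientation relative to waist
    r_dev = r_home.T @ r_rel     # deviation from the stored 'home' alignment
    # recover the rotation angle from the trace of the rotation matrix
    cos_a = (np.trace(r_dev) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# 'home': user looks down along the walking direction, head aligned with waist
r_home = home_alignment(rotation_z(0.0), rotation_z(0.0))
# later: the user turns his/her head 40 degrees sideways (cf. FIG. 3)
dev = deviation_deg(rotation_z(0.0), rotation_z(np.radians(40.0)), r_home)
print(round(dev))  # 40
```

In practice the orientations would be integrated from the IMU accelerometer (and gyroscope) readings rather than constructed directly; the sketch only shows the comparison against the ‘home’ alignment.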

    [0130] In a non-binding example, FIG. 2 may represent an initial walking phase of a user where alignment between accelerometers of the second sub-system's 12 IMU and accelerometers of the first sub-system's 11 IMU may be performed to compute a ‘home’ position of the user, and tracking of the FOV of the user's eyesight, e.g. in FIG. 1 or 3, may be performed by computing deviations from this ‘home’ position.

    [0131] Attention is drawn to FIG. 4 illustrating a system embodiment where a mobile device, in this example held by the user, may constitute the second sub-system. In other examples (not shown) other types of mobile devices may be envisioned, such as mobile devices designed to be worn on the user's wrist (e.g. a smart watch) or the like. A user may be provided, as above, with markings on his/her mobile device catching his/her attention as to obstacles or other notifications that the system may be configured to provide.

    [0132] In certain cases, real time data obtained by gyroscopes and/or accelerometers of an IMU that are affixed to a user may be compared to prior recorded data of gyroscopes and/or accelerometers of said same user, in order to monitor in real time changes in a walking pattern of such a user.

    [0133] With attention drawn to FIG. 7, in certain embodiments a first sub-system 11 located at a user's center of mass at his/her waist, which includes a depth camera and an IMU, may be used (not necessarily with sub-system 12) in order to monitor a walking pattern of the user.

    [0134] Comparison to a pre-measured/assessed normal or natural walking pattern of a user may assist in determining various aspects relating to the user, such as whether a user's current walking pattern may be indicative of lack of response or attention, e.g. to an incoming obstacle detected by a depth camera of the sub-system (or system).

    [0135] Attention is drawn to FIG. 9 showing a user advancing in a forward direction and equipped with a sub-system attached to his/her waist in this example in a belt like arrangement. The sub-system in this embodiment includes an IMU and a depth camera that is aimed at the direction of advancement, where both the IMU and camera are attached to the user's center of mass at the waist.

    [0136] Attachment of such sensing devices to a center of mass of a user at the waist has been found by the inventors as advantageous in monitoring walking kinematics of a user. For example, a walking pattern of a user may be sensed by accelerometers of the IMU that collect the user's body acceleration, while the gyroscopes of the IMU detect rotation of the user's waist within a walking cycle and thus can be used to detect changes in walking strides/patterns of a user.
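    One simple way such waist-mounted gyroscope data could yield a stride measure is by counting pelvic-rotation cycles over a time window. The following Python sketch is illustrative only (the application does not specify an algorithm), and the zero-crossing approach and names are assumptions:

```python
import math

def stride_frequency(gyro, dt):
    """Estimate stride frequency (Hz) from a waist gyroscope signal by counting
    rising zero-crossings of the pelvic rotation rate within the window."""
    crossings = sum(1 for a, b in zip(gyro, gyro[1:]) if a < 0.0 <= b)
    duration = dt * (len(gyro) - 1)
    return crossings / duration

# synthetic pelvic-rotation signal: one full left/right cycle per second (1 Hz)
dt = 0.01
signal = [math.sin(2 * math.pi * 1.0 * t * dt) for t in range(1000)]
print(round(stride_frequency(signal, dt), 2))  # close to 1.0
```

A real implementation would filter the raw gyroscope stream first; the point is only that a per-cycle measure of the stride can be tracked over time and compared against a baseline.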

    [0137] In FIG. 9 a user advancing towards an obstacle 18 is seen. A depth camera of the waist sub-system can be used to detect such an obstacle and provide a distance to the obstacle in real time.

    [0138] In certain cases, a trigger distance D may determine a distance to an obstacle below which alerts may be provided to a user. The trigger distance D may be specific to a user and may be determined e.g. according to prior history of the user's walking patterns (and the like).

    [0139] Attention is drawn to FIG. 10 providing a rough schematic illustration of signals picked up by a gyroscope of an IMU over a time span during walking of a user such as that seen in FIG. 9, who is equipped with a sub-system that includes a depth camera and an IMU affixed to the user's waist.

    [0140] With progression of time during a walking action of the user, the signals picked up by the gyroscope can be seen to be indicative of steps accomplished by the user's right R and left L legs. As seen in this example, the sensed steps are initially at a first frequency that then changes (at the vertical ‘dotted line’) to a second frequency, which in this example is higher than the initial, lower frequency. The frequency here is measured from the sensed stride of the user as picked up by the gyroscope; as seen, the stride becomes shorter at the ‘dotted line’ in this example.

    [0141] Providing an alert to a user in certain cases may be dictated according to sensed data arriving from the IMU, in this example from one or more of the gyroscopes of the IMU. A change in frequency of a signal picked up by a gyroscope may be indicative for example of the user being aware of the obstacle, and hence providing an additional alarm to the user may be avoided. In other cases, absence of change in frequency or a change indicative of lack of attention to the obstacle may activate an alarm to the user of proximity to the obstacle.

    [0142] In embodiments where alerts are provided to a user only when the distance to an obstacle is below the trigger distance ‘D’, such alarms may be activated only when the distance to the obstacle as picked up by the depth camera is below the trigger distance ‘D’.
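    The alerting logic of the preceding paragraphs, combining the trigger distance ‘D’ with the gyroscope-sensed stride-frequency change, could be sketched as below. This is an illustrative interpretation, not the application's specification; in particular the 15% change threshold is a made-up value:

```python
def should_alert(distance_m, trigger_d_m, freq_before_hz, freq_now_hz,
                 min_relative_change=0.15):
    """Alert only when the obstacle is closer than the trigger distance 'D'
    AND the stride frequency has not changed (i.e. the user appears unaware)."""
    if distance_m >= trigger_d_m:
        return False  # obstacle still beyond the trigger distance
    relative_change = abs(freq_now_hz - freq_before_hz) / freq_before_hz
    return relative_change < min_relative_change

# obstacle within D, stride frequency unchanged -> user seems unaware -> alert
print(should_alert(1.5, 2.0, 1.0, 1.0))   # True
# obstacle within D, but stride already shortened (cf. FIG. 10) -> no alert
print(should_alert(1.5, 2.0, 1.0, 1.3))   # False
```

Note that the application also contemplates the converse policy (alerting on a change indicative of inattention); the sketch shows only one of the described variants.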

    [0143] In cases where presence of the first sub-system 11 may be inconvenient and/or unsuitable, such as when walking within a home environment and/or when the user is elderly, certain system embodiments may be arranged to operate instead with a smaller module (SM) sub-system 117. FIG. 8 schematically illustrates one possible example of such an embodiment.

    [0144] For example, a home environment where a user moves within a known given location, may permit monitoring a user without need of the more bulky first sub-system being fitted to the user.

    [0145] In certain embodiments, instead of the relatively more bulky first sub-system with its depth camera, IMU, power source, processor (and the like), an SM sub-system 117 may include a relatively smaller and simpler camera, IMU, processor (and the like) that may be attached to the user's clothing (or the like). In certain cases, still static photos/images provided via such SM sub-system may be useful in assessing a location of the user within his/her home environment.

    [0146] Such still photos may be obtained in certain embodiments from a video stream, possibly taken by the SM sub-system's camera, wherein such still photos may be compared with prior obtained images and/or video of the home environment obtained by e.g. a depth camera (or the like). Comparison between image data from the SM sub-system and prior taken image data may lead to obstacle detection (or the like).
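    The comparison between a current still from the SM sub-system and the prior obtained image library could, in its crudest form, look like the following Python sketch. The application does not specify a matching method; mean absolute pixel difference is used here purely as a stand-in, and all names are hypothetical:

```python
def mean_abs_diff(img_a, img_b):
    """Mean absolute pixel difference between two equally sized grayscale images."""
    flat_a = [p for row in img_a for p in row]
    flat_b = [p for row in img_b for p in row]
    return sum(abs(a - b) for a, b in zip(flat_a, flat_b)) / len(flat_a)

def best_library_match(still, library):
    """Return (index, score) of the library image most similar to the still."""
    scores = [mean_abs_diff(still, ref) for ref in library]
    best = min(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]

# toy 2x2 grayscale 'images' of the home environment
library = [[[0, 0], [0, 0]],          # dark corridor view
           [[200, 200], [200, 200]]]  # bright room view
still = [[190, 210], [205, 195]]      # current still from the SM camera
idx, score = best_library_match(still, library)
print(idx)  # 1
```

Once the best-matching library image is found, depth data associated with that image (e.g. from the first sub-system's depth camera) could supply the distance to nearby obstacles, as described below in paragraph [0149].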

    [0147] An SM sub-system 117 may be suitable for use e.g. with a relatively smaller processor and IMU. In a non-binding example such SM sub-system may include at least one of: an ESP32-CAM camera module, a Raspberry Pi Zero W processor, a Raspberry Pi Camera, an IMU such as the ADXL345 3-axis accelerometer (or the like).

    [0148] In certain cases, an SM sub-system 117 may be fitted to a user's shirt or may be hand held, and when a user starts walking, images from the SM sub-system may be transferred (e.g. via Wi-Fi), possibly as low rate video (e.g. a video rate of about 2, 3, or 4 images per second) or a sequence of still photos, to a larger CPU possibly located external to the user (e.g. within his/her first sub-system 11 currently not in use), where alignment/correlation between images from the SM sub-system and prior obtained image data may be performed.

    [0149] Such prior taken image data may be derived from a video stream (image library) and may be used to alert a user, e.g. if identification is made that a current image taken by the SM sub-system while walking may be in a vicinity of a potential obstacle. Provision of image data within such an image library, which was taken by a depth camera, possibly the depth camera of the first sub-system 11, may assist in providing distance to such obstacle(s).

    [0150] Attention is drawn to FIG. 5A illustrating a flow of data that may be performed by the system possibly by processors located within the first or second sub-systems. Block 101 represents a step of gathering sensed data, e.g. via the depth cameras of the first sub-system.

    [0151] Possibly, each obtained image may be assessed at step 102 to determine if it is ‘useful’. Such ‘usefulness’ may be defined by assessing whether the image captured an area of interest (e.g. a route ahead while walking). For example, due to movements of a user during walking, the sensor may move while taking an image, resulting in a blurry image. Or, the sensor may momentarily aim in a different direction (e.g. upwards towards the sky), and the image may thus not contain information representative of the area of interest.
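    One possible realization of such a ‘usefulness’ gate is a sharpness check that discards blurred frames. This Python sketch is illustrative only; the gradient-energy measure and the threshold value are assumptions, not taken from the application:

```python
def sharpness(gray):
    """Mean squared horizontal gradient of a grayscale image; blurry frames score low."""
    total, n = 0.0, 0
    for row in gray:
        for a, b in zip(row, row[1:]):
            total += (b - a) ** 2
            n += 1
    return total / n

def is_useful(gray, threshold=100.0):
    """Crude 'usefulness' gate (step 102): keep only frames that look sharp."""
    return sharpness(gray) >= threshold

# toy frames: high-contrast (sharp) vs. nearly uniform (blurred)
sharp = [[0, 255, 0, 255]] * 4
blurred = [[120, 130, 125, 128]] * 4
print(is_useful(sharp), is_useful(blurred))  # True False
```

A check of whether the frame covers the area of interest at all (e.g. using the IMU-derived aiming direction of paragraph [0128] style reasoning) would be a separate, complementary test.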

    [0152] Image data (either with or without determination of ‘usefulness’) may be processed at step 103 to detect objects of interest within the images, such as potential obstacles that may impede a user's walking.

    [0153] Objects of interest detected within an image taken by the first sub-system may then, at step 104, undergo mapping to compute their position in the FOV of the second sub-system at the user's current position. This may be accomplished by applying a transformation (possibly a matrix or coordinate transformation) formulating the coordinates of a detected object in the second coordinate system in terms of the coordinates at which it was detected in the first coordinate system.
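    The mapping step above amounts to a rigid-body coordinate transformation between the two sub-systems' frames. A minimal Python sketch, with an entirely hypothetical geometry (head frame 0.6 m above the waist frame, identical orientation), follows:

```python
import numpy as np

def transform_point(point_first, r_first_to_second, t_first_to_second):
    """Map an obstacle detected in the first (waist) coordinate system into the
    second (head) coordinate system: p2 = R @ p1 + t."""
    return r_first_to_second @ np.asarray(point_first, dtype=float) + t_first_to_second

r = np.eye(3)                  # same orientation for both frames (assumption)
t = np.array([0.0, 0.0, -0.6]) # waist origin expressed in the head frame: 0.6 m below
obstacle_waist = np.array([3.0, 0.0, 0.0])  # obstacle 3 m ahead in the waist frame
obstacle_head = transform_point(obstacle_waist, r, t)
print(obstacle_head.tolist())  # [3.0, 0.0, -0.6]
```

In the system as described, R and t would be updated in real time from the two IMUs (and/or image processing), since the head moves relative to the waist; the sketch shows a single fixed pose for clarity.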

    [0154] Preferably, the real time location of the second coordinate system may be taken for such formulation, so that the detected object and/or markings relating to such object may be correctly placed in the FOV of the second sub-system to the user.

    [0155] FIG. 5B schematically illustrates in dashed lines a user's previous location (e.g. while walking) during which an image taken by the user's first sub-system was obtained. Due to processing time required for certain steps possibly taken as indicated in FIG. 5A, a “time lag” may exist between the images used for detecting obstacles by the first sub-system and the real-time location of the user when such detected obstacles may be mapped into the FOV of his/her second sub-system.

    [0156] In the description and claims of the present application, each of the verbs “comprise,” “include,” and “have,” and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements or parts of the subject or subjects of the verb.

    [0157] Furthermore, while the present application or technology has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the technology is thus not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed technology, from a study of the drawings, the disclosure, and the appended claims.

    [0158] In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

    [0159] The present technology is also understood to encompass the exact terms, features, numerical values or ranges etc., if herein such terms, features, numerical values or ranges etc. are referred to in connection with terms such as “about, ca., substantially, generally, at least” etc. In other words, “about 3” shall also comprise “3”, or “substantially perpendicular” shall also comprise “perpendicular”. Any reference signs in the claims should not be considered as limiting the scope.

    [0160] Although the present embodiments have been described to a certain degree of particularity, it should be understood that various alterations and modifications could be made without departing from the scope of the invention as hereinafter claimed.