A METHOD FOR OPERATING A MOTIVATION ENHANCING EXERCISE SYSTEM

20230149793 · 2023-05-18

    Abstract

    A method for operating a motivation enhancing exercise system includes an exercise device configured to be driven by an exercising user, a head mounted output device configured to output a visual and/or an audible output for an exercising user, and a processing unit. Movements of a movable part of the exercise device are measured by means of a first sensor unit mounted on the movable part, and movements of a head of the user are measured by means of a second sensor unit comprised in the head mounted output device. The processing unit determines an activity being performed by the exercising user, and generates an output signal for the head mounted output device, in accordance with the determined activity. The head mounted output device provides a visual and/or audible output for the exercising user in accordance with the output signal.

    Claims

    1.-10. (canceled)

    11. A method for operating a motivation enhancing exercise system, the exercise system comprising an exercise device configured to be driven by an exercising user; a first sensor unit mounted on a movable part of the exercise device, the first sensor unit being configured to detect movements of the movable part of the exercise device; a head mounted output device configured to output a visual and/or an audible output for an exercising user, the head mounted output device further comprising a second sensor unit configured to detect movements of a head of the exercising user; and a processing unit being communicatively connected to the first sensor unit and to the head mounted output device, the method comprising the steps of: measuring movements of the movable part of the exercise device by means of the first sensor unit, in response to an exercising user operating the exercise device, measuring movements of a head of the exercising user, by means of the second sensor unit, while the head mounted output device is mounted on the head of the exercising user, and while the exercising user operates the exercise device, providing the measurements performed by the first sensor unit and by the second sensor unit to the processing unit, the processing unit determining an activity being performed by the exercising user, based on the measurements provided from the first sensor unit and from the second sensor unit, the processing unit generating an output signal for the head mounted output device, in accordance with the determined activity, and providing the output signal to the head mounted output device, and the head mounted output device providing a visual and/or audible output for the exercising user in accordance with the output signal, wherein the step of the processing unit determining an activity comprises the processing unit comparing the received measurements to expected movement patterns of the first sensor unit and the second sensor unit for one or more predefined activities, and wherein the expected movement patterns are generated by means of machine learning.

    12. The method according to claim 11, wherein the first sensor unit comprises a three-dimensional accelerometer and a gyroscope, and wherein the step of measuring movements of the movable part of the exercise device by means of the first sensor unit comprises measuring three-dimensional accelerations and gravitational orientation of the movable part of the exercise device.

    13. The method according to claim 11, wherein the step of providing the measurements from the first sensor unit to the processing unit is performed by means of a wireless communication channel.

    14. The method according to claim 11, wherein the processing unit forms part of the head mounted output device, and wherein the step of providing measurements from the second sensor unit to the processing unit is performed by direct communication.

    15. The method according to claim 11, wherein the step of the processing unit determining an activity comprises the processing unit determining a type of the exercise device.

    16. The method according to claim 11, wherein the head mounted output device is a virtual reality or an augmented reality device, and wherein the step of the head mounted output device providing an output comprises providing a visual experience output for the exercising user.

    17. A motivation enhancing exercise system comprising: an exercise device configured to be driven by an exercising user, a first sensor unit mounted on a movable part of the exercise device, the first sensor unit being configured to detect movements of the movable part of the exercise device, a head mounted output device configured to output a visual and/or an audible output for an exercising user, the head mounted output device further comprising a second sensor unit configured to detect movements of a head of the exercising user, and a processing unit connected to the first sensor unit and the head mounted output device, wherein the motivation enhancing exercise system is configured to perform the method according to claim 11.

    18. The motivation enhancing exercise system according to claim 17, wherein the first sensor unit comprises a three-dimensional accelerometer and a gyroscope.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0068] The invention will now be described in further detail with reference to the accompanying drawings in which

    [0069] FIG. 1 illustrates two different types of head mounted output devices for a motivation enhancing exercise system according to embodiments of the invention,

    [0070] FIGS. 2-4 illustrate sensor units mounted on three different types of exercise devices,

    [0071] FIG. 5 is a flow chart illustrating a method according to a first embodiment of the invention,

    [0072] FIG. 6 is a flow chart illustrating a method according to a second embodiment of the invention,

    [0073] FIG. 7 illustrates determining an activity performed by an exercising user in accordance with a method according to an embodiment of the invention, using a neural network, and

    [0074] FIG. 8 illustrates a single node of the neural network of FIG. 7.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0075] FIG. 1 illustrates part of a motivation enhancing exercise system 1 according to an embodiment of the invention. More particularly, FIG. 1 illustrates head mounted output devices in the form of a virtual reality (VR) or augmented reality (AR) device 2 for providing a visual output for a user, and a set of headphones 3 for providing an audible output for the user. An audible output may also be provided via earplugs 4 forming part of the VR or AR device 2. A user using the system 1 may choose whether to apply the VR or AR device 2, thereby obtaining visual as well as audible output, or to apply the headphones 3, thereby obtaining audible output only.

    [0076] The system 1 further comprises a processing unit (not shown) which is arranged to communicate with the VR or AR device 2, with the headphones 3 and with a sensor unit (not shown) which is mounted on a movable part of an exercise device (not shown). The processing unit may form part of the VR or AR device 2 and/or the headphones 3.

    [0077] During use, a user interacts with an exercise device, thereby causing a movable part of the exercise device to move. This will be described in further detail below with reference to FIGS. 2-4. A first sensor unit mounted on the movable part of the exercise device measures these movements and communicates the measurements to the processing unit via a wireless communication channel.

    [0078] The user further has the VR or AR device 2 or the headphones 3 mounted on the head. A second sensor unit (not shown) mounted on or forming part of the VR or AR device 2 or the headphones 3 measures movements of the head of the user. These measurements are also communicated to the processing unit via a wireless communication channel, via a wired connection, or provided directly to the processing unit in the case that the processing unit forms part of the respective output device 2, 3.

    [0079] Based on the received measurements, the processing unit determines an activity being performed by the exercising user. Thus, the activity is determined based on the interaction between the exercising user and the exercise device, as well as based on movements of the head of the user. As described above, the determination of the activity is accurate and fast, since it is performed based on measurements performed by the first sensor unit as well as on measurements performed by the second sensor unit.
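The combination of measurements from the first and second sensor units described above could be sketched as follows in a minimal Python example; the windowing, the choice of summary statistics, and all function names are illustrative assumptions and not part of the described method:

```python
import math

def summarize(samples):
    """Mean and standard deviation of a window of scalar samples."""
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return mean, math.sqrt(var)

def combined_features(device_window, head_window):
    """Concatenate simple statistics from both sensor units into one
    feature vector, so that the activity determination can draw on the
    exercise-device movements and the head movements at once."""
    features = []
    for window in (device_window, head_window):
        mean, std = summarize(window)
        features.extend([mean, std])
    return features
```

In practice the processing unit would compute such features per axis (acceleration, orientation) rather than on a single scalar stream, but the principle of fusing both sensor units into one input is the same.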

    [0080] The processing unit further generates an output signal for the chosen head mounted output device 2, 3, in accordance with the determined activity, and provides the output signal to the output device 2, 3. The output device 2, 3 finally provides a visual and/or audible output for the user in accordance with the output signal. Thus, the exercising user receives a reward, in the form of a specific visual and/or audible output, which reflects the activity which the user is performing.

    [0081] FIG. 2 illustrates a first sensor unit 7 mounted on a movable part, in the form of a pedal arm 8, of an exercise device, in the form of a bicycle. When an exercising user interacts with the bicycle, the pedal arm 8 performs a rotating movement. This movement is measured by the first sensor unit 7, and the first sensor unit 7 provides these measurements to a processing unit (not shown), via a wireless communication channel. The processing unit then applies the measurements in order to determine the activity being performed by the user, in the manner described above.

    [0082] FIG. 3 illustrates a first sensor unit 7 mounted on a movable part, in the form of a movable arm 9, of an exercise device, in the form of a rowing machine 10. Similarly to the embodiment described above with reference to FIG. 2, when an exercising user interacts with the rowing machine 10 by performing rowing movements, the movable arm 9 moves back and forth, and the first sensor unit 7 measures this movement and communicates the measurements to a processing unit, in order to allow the processing unit to determine the activity being performed by the user.

    [0083] FIG. 4 illustrates a first sensor unit 7 mounted on a movable part, in the form of a cable 11, of an exercise device, in the form of a pulley system 12 for weight lifting workouts. Also in this embodiment, when an exercising user interacts with the pulley system 12 by lifting the weights 13, the cable 11 moves up and down, and the first sensor unit 7 measures this movement and communicates the measurements to a processing unit, in order to allow the processing unit to determine the activity being performed by the user.

    [0084] FIG. 5 is a flow chart illustrating a method according to a first embodiment of the invention. At step 14 measurements from a first sensor unit, mounted on a movable part of an exercise device, are received at a processing unit. At step 15 it is investigated whether or not an exercising user has manually entered the activity being performed by the exercising user. If this is the case, the process is forwarded to step 16, where the processing unit determines the activity being performed as the activity which was manually entered by the exercising user, and a corresponding output signal for an output device is generated.

    [0085] If step 15 reveals that the exercising user has not manually entered the activity, the process is forwarded to step 17, where the data received from the first sensor unit is processed, and at step 18 a preliminary determination of the activity being performed by the exercising user is obtained, based on the processed data from the first sensor unit.

    [0086] At step 19 it is investigated whether or not measurement data is available from a second sensor unit forming part of a head mounted output device being worn by the exercising user. If this is the case, the process is forwarded to step 20, where the measurement data from the second sensor unit is processed and combined with the processed data from the first sensor unit, and at step 21 the processing unit determines the activity being performed by the exercising user, based on the combined processed data from the first sensor unit and the second sensor unit.

    [0087] In the case that step 19 reveals that no measurement data is available from the second sensor unit, the process is forwarded directly to step 21, and the activity being performed by the exercising user is determined solely on the basis of measurement data obtained by the first sensor unit, i.e. the determination is in line with the preliminary determination performed at step 18.

    [0088] At step 22, an output signal for a head mounted output device is generated in accordance with the activity which was determined at step 21.

    [0089] In the case that the system has access to a database 23 containing known movement patterns for a number of specified activities, then the process is forwarded from step 21 to step 24, where the processing unit derives a movement pattern from the available data. The derived movement pattern is compared to the stored movement patterns from the database 23, at step 25. At step 26 it is investigated whether or not the derived movement pattern matches one of the stored patterns. If this is the case, the process is forwarded to step 27, where the activity performed by the exercising user is determined as the activity defined by the movement pattern which the derived movement pattern matches. Furthermore, an output signal for a head mounted output device is generated in accordance herewith.

    [0090] In the case that step 26 reveals that no match can be found between the derived movement pattern and any of the stored movement patterns, then the process is forwarded to step 22, and the activity is determined based on the determination performed at step 21.
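The decision flow of FIG. 5 described above could be sketched roughly as follows; `classify`, `derive_pattern`, the cosine-similarity matching and the threshold value are hypothetical stand-ins for the processing described in the text, not details taken from the patent:

```python
import math

def similarity(a, b):
    """Cosine similarity between two equal-length pattern vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def determine_activity(manual_entry, first_data, second_data, pattern_db,
                       classify, derive_pattern, match_threshold=0.9):
    """Sketch of the FIG. 5 flow (steps 14-27)."""
    # Steps 15-16: a manually entered activity takes precedence.
    if manual_entry is not None:
        return manual_entry
    # Steps 17-18: preliminary determination from the first sensor unit.
    activity = classify(first_data)
    # Steps 19-21: refine using head-movement data when available.
    if second_data is not None:
        activity = classify(first_data + second_data)
    # Steps 24-27: prefer a match against stored movement patterns.
    if pattern_db:
        derived = derive_pattern(first_data, second_data)
        best_name, best_score = None, 0.0
        for name, stored in pattern_db.items():
            score = similarity(derived, stored)
            if score > best_score:
                best_name, best_score = name, score
        if best_score >= match_threshold:
            return best_name
    # Step 22: fall back to the determination from step 21.
    return activity
```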

    [0091] FIG. 6 is a flow chart illustrating a method according to a second embodiment of the invention. The process is started at step 28. At step 29 an exercising user is asked to start moving, i.e. to start interacting with an exercise device.

    [0092] At step 30, a processing unit processes measurement data received from sensor units and determines an activity being performed by the exercising user, e.g. in the manner described above with reference to FIG. 5.

    [0093] At step 31, a number of available experiences are presented to the user, and the user is requested to select one of them. In FIG. 6, five experiences 32 are shown. The experiences 32 represent various visual and/or audible outputs which can be provided to the exercising user as a reward for performing the activity. Accordingly, the available experiences 32 which are presented to the user at step 31 have been preselected from a larger group of experiences, based on the determined activity.

    [0094] At step 33, the user may either select one of the presented experiences 32, or indicate that the activity has been incorrectly determined. In the case that the user selects one of the presented experiences 32, this is also a confirmation that the activity was determined correctly. In this case, the process is forwarded to step 34, where it is communicated to an artificial intelligence (AI) or machine learning (ML) engine that the activity was correctly determined. This information is used for improving or training the AI/ML model. Finally, the selected experience is loaded, in step 35.

    [0095] In the case that the user, in step 33, indicated that the activity had been incorrectly determined, the process is forwarded to step 36, where this is communicated to the AI/ML engine. This information is also useful for improving or training the AI/ML model. Furthermore, the process is forwarded to step 37, where the user is requested to enter the correct activity. Based on the entered activity, a new set of available experiences 32 is presented to the user, and the user is requested to select one of them, in the manner described above. However, the new set of available experiences 32 is associated with the entered, correct activity.
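The selection and feedback step of FIG. 6 might be sketched as follows; the data structures, the tuple convention for reporting a wrong determination, and the log standing in for the AI/ML engine are all assumptions, since the patent describes the behaviour only in prose:

```python
def select_experience(determined, experiences_by_activity, user_choice, feedback_log):
    """Sketch of steps 33-37 of FIG. 6. `user_choice` is either the name
    of a selected experience or a ('wrong', correct_activity) tuple."""
    if isinstance(user_choice, tuple) and user_choice[0] == "wrong":
        # Steps 36-37: report the misclassification to the AI/ML engine
        # (modelled here as a log) and re-offer experiences for the
        # activity entered by the user.
        correct = user_choice[1]
        feedback_log.append((determined, correct, False))
        return experiences_by_activity[correct]
    # Steps 34-35: selecting an experience confirms the determination,
    # which is also reported for training, and the experience is loaded.
    feedback_log.append((determined, determined, True))
    return user_choice
```

Either branch produces a training signal, so the model improves both when it is right and when it is wrong.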

    [0096] FIG. 7 illustrates determining an activity performed by an exercising user in accordance with a method according to an embodiment of the invention. Inputs, x.sub.1, x.sub.2, . . . , x.sub.N0, in the form of measured data from sensor units are supplied to nodes, Y.sub.1.sup.1, Y.sub.2.sup.1, . . . , Y.sub.N1.sup.1, of a first hidden layer of the neural network.

    [0097] The inputs x.sub.1, x.sub.2, . . . , x.sub.N0 could, e.g., be in the form of acceleration along the x direction, acceleration along the y direction, acceleration along the z direction, three dimensions of orientation provided by a gyroscope, magnetometer measurements along the x, y and z directions, etc. Furthermore, the inputs x.sub.1, x.sub.2, . . . , x.sub.N0 may originate from a sensor unit mounted on a movable part of an exercising device and/or from a sensor unit mounted on a head mounted output device.

    [0098] The sensor data is processed by the nodes of the first hidden layer, and the processed data is supplied to nodes, Y.sub.1.sup.2, Y.sub.2.sup.2, . . . , Y.sub.N2.sup.2, of a second hidden layer, where further processing is performed before the data is supplied to the next hidden layer, etc., until an output layer of the neural network is reached. With each successive layer of the neural network, progressively deeper features are extracted from the data, thereby identifying patterns in the provided data, and the identified patterns may be compared to patterns identified in similar data obtained while users performed well defined exercising activities, and used for training the neural network.

    [0099] The nodes, y.sub.1.sup.k+1, y.sub.2.sup.k+1, . . . , y.sub.N.sup.k+1, of the output layer output a number of final outputs in the form of values, each representing an activity and a confidence level, i.e. an indication regarding how likely it is that the determined activity is in fact the activity being performed by the exercising user. The confidence level thus reflects the extent to which the patterns identified in the processed data match the corresponding patterns related to well defined exercising activities.
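A forward pass of the kind described above, with hidden layers extracting features and an output layer producing per-activity confidence levels, might be sketched like this; the layer sizes, the tanh activation, and the softmax output are illustrative choices, not details specified by the method:

```python
import math

def forward(x, layers):
    """One forward pass through fully connected layers. `layers` is a
    list of weight matrices, each a list of per-node weight lists
    (biases omitted for brevity). Hidden layers use tanh; the output
    layer uses softmax, so each output value can be read as a
    confidence level for one activity."""
    for i, weights in enumerate(layers):
        # Weighted sum at every node of this layer.
        z = [sum(w * xi for w, xi in zip(node_w, x)) for node_w in weights]
        if i < len(layers) - 1:
            x = [math.tanh(v) for v in z]           # hidden layer
        else:
            exps = [math.exp(v - max(z)) for v in z]  # output layer
            total = sum(exps)
            x = [e / total for e in exps]
    return x
```

Because the softmax outputs sum to one, the largest output identifies the most likely activity and its value serves as the associated confidence level.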

    [0100] It should be noted that the activity being identified is not merely the kind of activity or the kind of exercising equipment being used, but also includes how the activity is being performed, e.g. in terms of speed, intensity, load, duration, etc.

    [0101] FIG. 8 illustrates the node y.sub.j.sup.k of the neural network of FIG. 7. The node receives processed inputs, x.sub.1.sup.k−1, x.sub.2.sup.k−1, . . . , x.sub.N.sup.k−1, from the nodes of the previous layer. The inputs are provided with weights, w.sub.1, w.sub.2, . . . , w.sub.N, and the weighted inputs are summed at the node to form a value, z, to which an activation function, F, is applied. The output of the node, F(z), is supplied to the nodes of the next layer of the neural network.
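The single-node computation of FIG. 8 can be sketched as follows; the sigmoid is used purely as an example of the activation function F, since the description does not name a specific one:

```python
import math

def node_output(inputs, weights):
    """Single node of FIG. 8: weighted sum z of the inputs from the
    previous layer, then an activation function F applied to z.
    F is chosen as a sigmoid here for illustration."""
    z = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-z))  # F(z), passed on to the next layer
```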