A METHOD FOR OPERATING A MOTIVATION ENHANCING EXERCISE SYSTEM
20230149793 · 2023-05-18
Inventors
CPC classification
A63B2071/0638, A63B2220/833, A63B2225/50, A63B2071/0666, A63B22/0076, A63B69/16, A63B24/0062, A63B71/0622 (all in CPC Section A, HUMAN NECESSITIES)
International classification
A63B71/06 (HUMAN NECESSITIES)
Abstract
A method for operating a motivation enhancing exercise system includes an exercise device configured to be driven by an exercising user, a head mounted output device configured to output a visual and/or an audible output for an exercising user, and a processing unit. Movements of a movable part of the exercise device are measured by means of a first sensor unit mounted on the movable part, and movements of a head of the user are measured by means of a second sensor unit comprised in the head mounted output device. The processing unit determines an activity being performed by the exercising user, and generates an output signal for the head mounted output device, in accordance with the determined activity. The head mounted output device provides a visual and/or audible output for the exercising user in accordance with the output signal.
Claims
1.-10. (canceled)
11. A method for operating a motivation enhancing exercise system, the exercise system comprising an exercise device configured to be driven by an exercising user; a first sensor unit mounted on a movable part of the exercise device, the first sensor unit being configured to detect movements of the movable part of the exercise device; a head mounted output device configured to output a visual and/or an audible output for an exercising user, the head mounted output device further comprising a second sensor unit configured to detect movements of a head of the exercising user; and a processing unit being communicatively connected to the first sensor unit and to the head mounted output device, the method comprising the steps of: measuring movements of the movable part of the exercise device by means of the first sensor unit, in response to an exercising user operating the exercise device, measuring movements of a head of the exercising user, by means of the second sensor unit, while the head mounted output device is mounted on the head of the exercising user, and while the exercising user operates the exercise device, providing the measurements performed by the first sensor unit and by the second sensor unit to the processing unit, the processing unit determining an activity being performed by the exercising user, based on the measurements provided from the first sensor unit and from the second sensor unit, the processing unit generating an output signal for the head mounted output device, in accordance with the determined activity, and providing the output signal to the head mounted output device, and the head mounted output device providing a visual and/or audible output for the exercising user in accordance with the output signal, wherein the step of the processing unit determining an activity comprises the processing unit comparing the received measurements to expected movement patterns of the first sensor unit and the second sensor unit for one or more predefined activities, and wherein the expected movement patterns are generated by means of machine learning.
12. The method according to claim 11, wherein the first sensor unit comprises a three-dimensional accelerometer and a gyroscope, and wherein the step of measuring movements of the movable part of the exercise device by means of the first sensor unit comprises measuring three-dimensional accelerations and gravitational orientation of the movable part of the exercise device.
13. The method according to claim 11, wherein the step of providing the measurements from the first sensor unit to the processing unit is performed by means of a wireless communication channel.
14. The method according to claim 11, wherein the processing unit forms part of the head mounted output device, and wherein the step of providing measurements from the second sensor unit to the processing unit is performed by direct communication.
15. The method according to claim 11, wherein the step of the processing unit determining an activity comprises the processing unit determining a type of the exercise device.
16. The method according to claim 11, wherein the head mounted output device is a virtual reality or an augmented reality device, and wherein the step of the head mounted output device providing an output comprises providing a visual experience output for the exercising user.
17. A motivation enhancing exercise system comprising: an exercise device configured to be driven by an exercising user, a first sensor unit mounted on a movable part of the exercise device, the first sensor unit being configured to detect movements of the movable part of the exercise device, a head mounted output device configured to output a visual and/or an audible output for an exercising user, the head mounted output device further comprising a second sensor unit configured to detect movements of a head of the exercising user, and a processing unit connected to the first sensor unit and the head mounted output device, wherein the motivation enhancing exercise system is configured to perform the method according to claim 11.
18. The motivation enhancing exercise system according to claim 17, wherein the first sensor unit comprises a three-dimensional accelerometer and a gyroscope.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0068] The invention will now be described in further detail with reference to the accompanying drawings.
DETAILED DESCRIPTION OF THE DRAWINGS
[0076] The system 1 further comprises a processing unit (not shown) which is arranged to communicate with the VR or AR device 2, with the headphones 3 and with a sensor unit (not shown) which is mounted on a movable part of an exercise device (not shown). The processing unit may form part of the VR or AR device 2 and/or the headphones 3.
[0077] During use, a user interacts with an exercise device, thereby causing a movable part of the exercise device to move. This will be described in further detail below.
[0078] The user further has the VR or AR device 2 or the headphones 3 mounted on the head. A second sensor unit (not shown) mounted on or forming part of the VR or AR device 2 or the headphones 3 measures movements of the head of the user. These measurements are also communicated to the processing unit via a wireless communication channel, via a wired connection, or provided directly to the processing unit in the case that the processing unit forms part of the respective output device 2, 3.
[0079] Based on the received measurements, the processing unit determines an activity being performed by the exercising user. Thus, the activity is determined based on the interaction between the exercising user and the exercise device, as well as based on movements of the head of the user. As described above, the determination of the activity is accurate and fast, since it is performed based on measurements performed by the first sensor unit as well as on measurements performed by the second sensor unit.
[0080] The processing unit further generates an output signal for the chosen head mounted output device 2, 3, in accordance with the determined activity, and provides the output signal to the output device 2, 3. The output device 2, 3 finally provides a visual and/or audible output for the user in accordance with the output signal. Thus, the exercising user receives a reward, in the form of a specific visual and/or audible output, which reflects the activity which the user is performing.
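The determination and reward flow described in paragraphs [0079] and [0080] could be sketched, purely illustratively, as follows. The feature names, thresholds, and activity labels below are invented for the sketch; the patent does not prescribe concrete features or decision rules.

```python
from dataclasses import dataclass

# Hypothetical feature summary for one measurement window from a sensor unit.
@dataclass
class SensorWindow:
    mean_accel: float      # mean acceleration magnitude (assumed units: m/s^2)
    dominant_freq: float   # dominant movement frequency (assumed units: Hz)

def determine_activity(device: SensorWindow, head: SensorWindow) -> str:
    """Illustrative rule-based stand-in for the processing unit's activity
    determination from both the first (device) and second (head) sensor units."""
    # Fast periodic motion of the movable part with a fairly still head could
    # suggest an exercise bike; device motion combined with pronounced head
    # movement could suggest a rowing machine. Thresholds are invented.
    if device.dominant_freq > 1.0 and head.mean_accel < 2.0:
        return "exercise bike"
    if device.dominant_freq > 0.3 and head.mean_accel >= 2.0:
        return "rowing machine"
    return "unknown"

def output_signal(activity: str) -> dict:
    # The output signal drives the visual/audible reward on the head mounted
    # output device, in accordance with the determined activity.
    return {"activity": activity, "experience": f"{activity} scenery"}
```

In a real system the rule-based classifier would be replaced by the machine-learned pattern matching described in claim 11.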
[0085] If step 15 reveals that the exercising user has not manually entered the activity, the process is forwarded to step 17, where the data received from the first sensor unit is processed, and at step 18 a preliminary determination of the activity being performed by the exercising user is obtained, based on the processed data from the first sensor unit.
[0086] At step 19 it is investigated whether or not measurement data is available from a second sensor unit forming part of a head mounted output device being worn by the exercising user. If this is the case, the process is forwarded to step 20, where the measurement data from the second sensor unit is processed and combined with the processed data from the first sensor unit, and at step 21 the processing unit determines the activity being performed by the exercising user, based on the combined processed data from the first sensor unit and the second sensor unit.
[0087] In the case that step 19 reveals that no measurement data is available from the second sensor unit, the process is forwarded directly to step 21, and the activity being performed by the exercising user is determined solely on the basis of measurement data obtained by the first sensor unit, i.e. the determination is in line with the preliminary determination performed at step 18.
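The branching of steps 15 through 21 described above can be sketched as a single decision flow. The `classify` helper is a placeholder for the pattern matching or machine learning the patent actually describes; its threshold and labels are assumptions made for the sketch.

```python
from typing import List, Optional

def classify(samples: List[float]) -> str:
    # Placeholder classifier: in the patent this is movement-pattern matching
    # or machine learning; here it simply thresholds the sample mean.
    return "treadmill" if sum(samples) / len(samples) > 1.0 else "exercise bike"

def determine_activity_flow(manual_entry: Optional[str],
                            first_sensor_data: List[float],
                            second_sensor_data: Optional[List[float]]) -> str:
    # Step 15: a manually entered activity takes precedence.
    if manual_entry is not None:
        return manual_entry
    # Steps 17-18: preliminary determination from the first sensor unit alone.
    preliminary = classify(first_sensor_data)
    # Step 19: is measurement data available from the second (head) sensor unit?
    if second_sensor_data is None:
        # Step 21, fallback branch: keep the preliminary determination.
        return preliminary
    # Steps 20-21: combine both data sets for the final determination.
    return classify(first_sensor_data + second_sensor_data)
```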
[0088] At step 22, an output signal for a head mounted output device is generated in accordance with the activity which was determined at step 21.
[0089] In the case that the system has access to a database 23 containing known movement patterns for a number of specified activities, then the process is forwarded from step 21 to step 24, where the processing unit derives a movement pattern from the available data. The derived movement pattern is compared to the stored movement patterns from the database 23, at step 25. At step 26 it is investigated whether or not the derived movement pattern matches one of the stored patterns. If this is the case, the process is forwarded to step 27, where the activity performed by the exercising user is determined as the activity defined by the movement pattern which the derived movement pattern matches. Furthermore, an output signal for a head mounted output device is generated in accordance herewith.
[0090] In the case that step 26 reveals that no match can be found between the derived movement pattern and any of the stored movement patterns, then the process is forwarded to step 22, and the activity is determined based on the determination performed at step 21.
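The database comparison of steps 23 through 26 could be sketched as a nearest-pattern lookup with a match tolerance. The stored patterns, the feature-vector representation, and the Euclidean-distance criterion are all assumptions for illustration; the patent leaves the comparison method open.

```python
import math
from typing import List, Optional

# Hypothetical database of known movement patterns (step 23): each pattern is
# a short feature vector here; real stored patterns would be far richer.
PATTERN_DB = {
    "exercise bike":  [1.2, 0.3, 0.1],
    "rowing machine": [0.8, 1.5, 0.9],
}

def match_pattern(derived: List[float], tolerance: float = 0.5) -> Optional[str]:
    """Steps 24-26: compare the derived movement pattern to each stored
    pattern and return the matching activity, or None when no stored pattern
    is close enough (the no-match branch of step 26)."""
    best, best_dist = None, float("inf")
    for activity, stored in PATTERN_DB.items():
        dist = math.dist(derived, stored)   # Euclidean distance as similarity
        if dist < best_dist:
            best, best_dist = activity, dist
    return best if best_dist <= tolerance else None
```

When `match_pattern` returns `None`, the process would fall back to the determination of step 21, as described above.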
[0092] At step 30, a processing unit processes measurement data received from sensor units and determines an activity being performed by the exercising user, e.g. in the manner described above.
[0093] At step 31, a number of available experiences are presented to the user, and the user is requested to select one of them.
[0094] At step 33, the user may either select one of the presented experiences 32, or indicate that the activity has been incorrectly determined. In the case that the user selects one of the presented experiences 32, this is also a confirmation that the activity was determined correctly. In this case, the process is forwarded to step 34, where it is communicated to an artificial intelligence (AI) or machine learning (ML) engine that the activity was correctly determined. This information is used for improving or training the AI/ML model. Finally, the selected experience is loaded, in step 35.
[0095] In the case that the user, in step 33, indicated that the activity had been incorrectly determined, the process is forwarded to step 36, where this is communicated to the AI/ML engine. This information is also useful for improving or training the AI/ML model. Furthermore, the process is forwarded to step 37, where the user is requested to enter the correct activity. Based on the entered activity, a new set of available experiences 32 is presented to the user, and the user is requested to select one of them, in the manner described above. However, the new set of available experiences 32 is associated with the entered, correct activity.
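The feedback loop of steps 33 through 37 amounts to collecting labelled training examples for the AI/ML engine. A minimal sketch, in which the training-log representation and function signature are invented for illustration:

```python
from typing import List, Optional, Tuple

def handle_user_choice(determined_activity: str,
                       user_confirms: bool,
                       corrected_activity: Optional[str] = None,
                       training_log: Optional[List[Tuple[str, bool]]] = None) -> str:
    """Sketch of steps 33-37: the user's confirmation or correction is fed
    back to the AI/ML engine as a labelled example, and the activity actually
    used for selecting experiences is returned."""
    log = training_log if training_log is not None else []
    if user_confirms:
        # Step 34: selecting an experience confirms the determination was correct.
        log.append((determined_activity, True))
        return determined_activity
    # Steps 36-37: record the incorrect determination and use the activity
    # entered by the user instead, so a new set of experiences can be presented.
    log.append((determined_activity, False))
    return corrected_activity
```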
[0097] The inputs x.sub.1, x.sub.2, . . . , x.sub.N0 could, e.g. be in the form of acceleration along the x direction, acceleration along the y direction, acceleration along the z direction, three dimensions of orientation provided by a gyroscope, magnetometer measurements along the x, y and z direction, etc. Furthermore, the inputs x.sub.1, x.sub.2, . . . , x.sub.N0 may originate from a sensor unit mounted on a movable part of an exercising device and/or from a sensor unit mounted on a head mounted output device.
[0098] The sensor data is processed by the nodes of the first hidden layer, and the processed data is supplied to nodes, Y.sub.1.sup.2, Y.sub.2.sup.2, . . . , Y.sub.N2.sup.2, of a second hidden layer, where further processing is performed before the data is supplied to the next hidden layer, and so on, until an output layer of the neural network is reached. At each layer of the neural network, progressively deeper features are extracted from the data, thereby identifying patterns in the provided data. The identified patterns may be compared to patterns identified in similar data obtained while users performed well defined exercising activities, and used for training the neural network.
[0099] The nodes, y.sub.1.sup.k+1, y.sub.2.sup.k+1, . . . , y.sub.N.sup.k+1, of the output layer output a number of final outputs in the form of values, each representing an activity and a confidence level, i.e. an indication regarding how likely it is that the determined activity is in fact the activity being performed by the exercising user. The confidence level thus reflects to what extent the patterns identified in the processed data match the corresponding patterns related to well defined exercising activities.
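The forward pass described in paragraphs [0097] through [0099] can be sketched as a minimal feed-forward network whose softmax output layer yields one confidence value per activity. The layer shapes, weights, and activation choice (tanh) below are illustrative assumptions, not part of the patent.

```python
import math
from typing import List, Tuple

# A layer is a (weights, biases) pair: weights is a list of rows, one row per
# output node, each row holding one weight per input node.
Layer = Tuple[List[List[float]], List[float]]

def softmax(values: List[float]) -> List[float]:
    # Normalises the output-layer values so each can be read as a confidence.
    exps = [math.exp(v) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

def forward(x: List[float], layers: List[Layer]) -> List[float]:
    """Minimal feed-forward pass: tanh activation on hidden layers, softmax on
    the output layer, so each final output is an activity confidence level."""
    for i, (w, b) in enumerate(layers):
        x = [sum(wi * xi for wi, xi in zip(row, x)) + bi
             for row, bi in zip(w, b)]
        if i < len(layers) - 1:   # hidden layers only
            x = [math.tanh(v) for v in x]
    return softmax(x)
```

The inputs would be the sensor quantities named in paragraph [0097] (accelerations, gyroscope orientation, magnetometer readings), and each output index would correspond to one predefined activity.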
[0100] It should be noted that the activity being identified is not merely the kind of activity or the kind of exercising equipment being used, but also includes how the activity is being performed, e.g. in terms of speed, intensity, load, duration, etc.