GAME DEVICE

20220314098 · 2022-10-06

    Abstract

    A game device includes a display surface, a game space assigned to the display surface, a sensor system configured to detect an impact site of an object on the display surface, an acquisition system configured to detect the position of the object and/or of a player in at least a part of the game space, and a computing unit configured to determine a target field on the display surface using the position, wherein the game device is configured to display the target field on the display surface and to determine whether the impact site lies in the target field.

    Claims

    1. A game device, comprising: a display surface; a game space assigned to the display surface; a sensor system configured to detect an impact site of an object on the display surface; an acquisition system configured to detect a position of at least one of the object and a player in at least a part of the game space; and a computing unit configured to determine a target field on the display surface based on the position, wherein the game device is configured to display the target field on the display surface and to determine whether the impact site lies in the target field.

    2. The game device according to claim 1, wherein the computing unit is further configured to: determine a target area arranged in the game space based on a predetermined rule using the position, and determine, based on the target area, the position, and a ballistic model of the object, the target field such that when the player plays the object towards the target field, the object enters the target area after bouncing back from the display surface.

    3. The game device according to claim 2, wherein the computing unit is further configured to: determine a target point representative of the target area, calculate, based on the ballistic model, a trajectory of the object extending from the position of the object via the display surface to the target point, and determine the target field about an intersection point of the trajectory on the display surface.

    4. The game device according to claim 2, wherein, in a case in which the acquisition system is not configured to determine the position of the object, the computing unit is configured to: determine the position of the object depending on the position of the player, and equate the position of the object to the position of the player.

    5. The game device according to claim 2, wherein the ballistic model describes a force-free movement of the object.

    6. The game device according to claim 2, wherein the ballistic model comprises an initial speed for the object at the position of the object or a final speed for the object at a target point.

    7. The game device according to claim 2, wherein the ballistic model comprises an acceleration due to gravity.

    8. The game device according to claim 2, wherein the ballistic model takes an energy loss of kinetic energy of the object due to bouncing-back from the display surface into consideration.

    9. The game device according to claim 2, wherein a plurality of predetermined rules are stored in the computing unit, each of which has a different degree of difficulty for the player when playing with the game device.

    10. The game device according to claim 1, wherein the acquisition system is configured to determine a number of players.

    11. The game device according to claim 2, wherein the acquisition system is configured to determine a number of players, and a different one of the predetermined rules or a different set of the predetermined rules is stored in each case in the computing unit for each number of players.

    12. The game device according to claim 1, wherein the acquisition system is configured to determine positions of at least one of the object and the player or of at least two objects or of at least two players.

    13. The game device according to claim 2, wherein: the acquisition system is configured to determine positions of at least two players, and the computing unit is configured to determine the target area based on the positions of the at least two players.

    14. The game device according to claim 1, wherein the acquisition system is configured to determine an eye point position of an eye point of the player in the part of the game space and to adjust a perspective of an image displayed on the display surface to the eye point position.

    15. The game device according to claim 1, wherein the acquisition system is configured to recognize whether the object was played by a hand of the player or by a foot of the player.

    16. The game device according to claim 1, wherein the acquisition system is configured to recognize whether the player is holding a racquet in a hand and whether the object was played by the racquet.

    17. The game device according to claim 1, wherein the acquisition system is configured to recognize whether the object was played in accordance with a set of rules belonging to the game device.

    18. The game device according to claim 1, wherein the object is a ball, a medicine ball, or a frisbee.

    19. The game device according to claim 1, wherein the game device is configured such that a rebound sport, such as squash, tennis, volleyball, a shooting sport, archery, or American football can be played with the game device.

    20. An apparatus, comprising: two game devices according to claim 1, wherein the apparatus is configured to display the position acquired in one of the two game devices on the display surface of the other one of the two game devices.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0039] The disclosure will now be described with reference to the drawings wherein:

    [0040] FIG. 1 shows a perspective view of a game device.

    [0041] FIG. 2 shows a first exemplary embodiment for a predetermined rule.

    [0042] FIG. 3 shows a second exemplary embodiment for the predetermined rule.

    [0043] FIG. 4 shows a third exemplary embodiment for the predetermined rule.

    DESCRIPTION OF EXEMPLARY EMBODIMENTS

    [0044] As can be seen in FIG. 1, a game device 1 includes a display surface 2, a game space 8 assigned to the display surface 2, a sensor system 3 that is configured to detect an impact site 11 of an object 6 on the display surface 2, an acquisition system 4 that is configured to detect the position of the object 6 and/or of a player 5 in at least a part of the game space 8, and a computing unit 15 that is configured to determine a target field 7 on the display surface 2 using the position. The game device 1 is configured to display the target field 7 on the display surface 2 and to determine whether the impact site 11 lies in the target field 7. The game space 8 can be adjacent to the display surface 2. The computing unit 15 can be a personal computer and/or a server. The computing unit can be configured to determine whether the impact site lies in the target field. It is alternatively conceivable that the game device includes a further computing unit that is configured to determine whether the impact site lies in the target field. The further computing unit can also be a personal computer and/or a server. The object can be a flying object. For this purpose, the object 6 can be a ball, a medicine ball or a frisbee. The game device 1 can include the object 6. It is conceivable that the target area 12 extends from a lower end to an upper end of the game space 8. It is also conceivable that the target area 12 only extends in a part of the extent of the full height of the game space 8.

    [0045] An exemplary coordinate system with an x-axis, a y-axis and a z-axis, each of which is arranged perpendicularly to one another, is shown in FIG. 1. The x-axis and the y-axis are arranged horizontally, and the z-axis is arranged vertically. The normal to the display surface 2 is arranged parallel to the x-axis.

    [0046] FIG. 1 shows that the acquisition system 4 can, for example, be arranged at an upper end of the display surface 2. It is conceivable that the acquisition system 4 includes a depth camera. The depth camera is configured to record a two-dimensional or three-dimensional image of the game space 8, wherein each image point of the two-dimensional or three-dimensional image represents a distance value. The distance value can, for example, be the distance of a point of the object 6 and/or of the player 5 from the depth camera. The depth camera can, for example, be configured to determine the distance with a time-of-flight measurement of an electromagnetic pulse.
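
    The time-of-flight measurement mentioned above can be sketched in a few lines. This is an illustrative calculation only, not part of the disclosed acquisition system 4; the function names and the grid representation of the depth image are assumptions:

```python
# Illustrative sketch of the time-of-flight principle: a depth camera emits
# an electromagnetic pulse and converts the measured round-trip time of the
# reflected pulse into a distance value for each image point.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance of a reflecting point from the camera for one image point."""
    # The pulse travels to the object and back, so the one-way distance
    # is half the round-trip path.
    return C * round_trip_time_s / 2.0

def depth_image(round_trip_times):
    """Convert a 2D grid of round-trip times into a grid of distance values,
    i.e. the two-dimensional image in which each image point is a distance."""
    return [[tof_distance(t) for t in row] for row in round_trip_times]
```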

    [0047] Alternatively or in addition, the acquisition system 4 can include a laser scanner and/or a touch-sensitive floor 13. The laser scanner can also be configured to determine a distance of a point of the object 6 and/or of the player 5 from the laser scanner. The touch-sensitive floor 13 can bound the game space 8 at its lower end and be configured to determine the position of the object 6 and/or of the player 5 through their contact with the touch-sensitive floor 13.

    [0048] The game device 1 can include a projector that is configured to project the target field 7 onto the display surface 2. Alternatively, the game device 1 can include a screen that includes the display surface 2.

    [0049] The object can be a real object. It is furthermore conceivable that the object is a virtual object.

    [0050] According to a first exemplary embodiment of the sensor system 3, the sensor system 3 can include a first row of photoelectric sensors arranged in parallel that is configured to determine a y-coordinate, belonging to the y-axis shown in FIG. 1, of the impact site 11. The sensor system 3 can include a camera that is configured to determine a z-coordinate, belonging to the z-axis shown in FIG. 1, of the impact site 11. The camera can be configured to use the speed and the direction of the object 6 before and after bouncing back from the impact site 11 to determine the z-coordinate. As an alternative to the camera, the sensor system 3 can include a second row of photoelectric sensors arranged in parallel that is configured to determine the z-coordinate of the impact site 11. According to the first exemplary embodiment, the object can be the real object.

    [0051] According to a second exemplary embodiment of the sensor system 3, the sensor system 3 typically includes a camera that is configured to determine a y-coordinate, belonging to the y-axis shown in FIG. 1, and a z-coordinate, belonging to the z-axis shown in FIG. 1, of the impact site 11. According to the second exemplary embodiment, the object can be the real object.

    [0052] In a third exemplary embodiment of the sensor system 3, the game device 1 can include the screen, which is implemented as a touchscreen and thus forms at least part of the sensor system 3. The touchscreen can be configured to determine a y-coordinate, belonging to the y-axis shown in FIG. 1, and a z-coordinate, belonging to the z-axis shown in FIG. 1, of the impact site 11. According to the third exemplary embodiment, the object can be the real object.

    [0053] In a fourth exemplary embodiment of the sensor system 3, the sensor system 3 can at least partially be formed by the acquisition system 4, and the sensor system 3 can be configured to derive the impact site from the position and/or a movement of a player 5. A stroke with a tennis racquet can, for example, be simulated for this purpose from an arm movement of the player 5. According to the fourth exemplary embodiment, the object is the virtual object.

    [0054] FIGS. 2 to 4 show that the computing unit 15 can be configured to determine a target area 12 arranged in the game space 8 on the basis of a predetermined rule using the position, and, using the target area 12, the position, and a ballistic model of the object 6, to determine the target field 7 in such a way that when the player 5 plays the object 6 towards the target field 7, the object 6 enters the target area 12 after bouncing back from the display surface 2. For this purpose, the computing unit 15 can be configured to determine a target point 9 representative of the target area 12, to calculate, using the ballistic model, a trajectory 10 of the object 6 extending from the position of the object 6 via the display surface 2 to the target point 9, and to determine the target field 7 about an intersection point 14 of the trajectory 10 on the display surface 2. The computing unit 15 here can be configured as a client computing unit, with which the predetermined rule can be entered into the computing unit 15 and/or manipulated remotely, in particular via the server.
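
    For the simplest ballistic model, the geometry of this determination can be sketched as follows. This is not the claimed implementation, only an illustration under the assumption of a force-free, straight-line flight with a specular bounce off the display surface 2, which is taken as the plane x = 0 in the coordinate system of FIG. 1. Mirroring the target point 9 across that plane turns the two-segment trajectory 10 into one straight line, whose crossing of x = 0 gives the intersection point 14:

```python
# Sketch: find the intersection point on the display surface (plane x = 0)
# for a straight-line trajectory from the object position to the target
# point, assuming a specular (mirror-like) bounce off the surface.

def intersection_on_display(object_pos, target_point):
    """Return the (y, z) intersection point of the bounced trajectory
    on the display surface x = 0."""
    xo, yo, zo = object_pos
    xt, yt, zt = target_point
    # Mirror the target point across the display plane x = 0.
    xm, ym, zm = -xt, yt, zt
    # Parameter s where the line object_pos -> mirror point crosses x = 0.
    s = xo / (xo - xm)
    y = yo + s * (ym - yo)
    z = zo + s * (zm - zo)
    return (y, z)

def in_target_field(impact_site, center, radius):
    """Decide whether a detected impact site lies in a circular target
    field drawn about the intersection point (shape is an assumption)."""
    dy = impact_site[0] - center[0]
    dz = impact_site[1] - center[1]
    return dy * dy + dz * dz <= radius * radius
```

For the symmetric case of an object at (3, 0, 1) and a target point at (3, 2, 1), the intersection lies at the midpoint (1, 1), as expected for a mirror bounce.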

    [0055] To determine the trajectory 10, the acquisition system 4 can be configured to determine the position of the object 6. In the alternative case in which the acquisition system 4 is not configured to determine the position of the object 6, the computing unit 15 can be configured to determine the position of the object 6 depending on the position of the player 5. It is, for example, conceivable that the computing unit 15 is configured to equate the position of the object 6 to the position of the player 5, as is, for example, shown in FIGS. 2 to 4. It is alternatively conceivable that the computing unit 15 can be configured to determine the position of the object 6 in accordance with a predetermined calculation rule at a position other than the position of the player 5. It is conceivable here that the acquisition system 4 is configured to distinguish whether the player 5 is holding a racquet in a forehand or in a backhand, and that the predetermined calculation rule is different according to whether the player 5 is holding the racquet in the forehand or in the backhand.

    [0056] It is conceivable that the ballistic model describes a force-free movement of the object 6. It is conceivable here that the computing unit 15 is configured to determine the trajectory 10 without using a speed of the object. Alternatively, it is conceivable that the ballistic model takes occurring forces into consideration in the calculation of the trajectory 10. It is conceivable here that the computing unit 15 is configured to take a speed of the object 6 into consideration. It is conceivable here that the ballistic model includes an initial speed for the object 6 at the position of the object, or a final speed for the object 6 at the target point 9. To take the occurring forces into consideration, the ballistic model can include an acceleration due to gravity, take an air friction resistance into consideration and/or take an energy loss of the kinetic energy of the object 6 due to the bouncing-back from the display surface 2 into consideration.
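
    The richer variant of the ballistic model can be sketched with a simple numerical integration. This is a minimal illustration under assumed parameters, not the disclosed implementation: gravity acts along the negative z-axis, the display surface 2 is the plane x = 0, and the bounce-back multiplies the normal speed component by a restitution factor below one to model the energy loss of the kinetic energy; air friction is omitted here for brevity:

```python
# Minimal explicit-Euler sketch of a ballistic model with gravity and an
# energy loss at the bounce from the display surface (plane x = 0).

G = 9.81  # acceleration due to gravity in m/s^2

def simulate(pos, vel, restitution=0.8, dt=0.001, t_max=5.0):
    """Integrate the trajectory; return the (y, z) impact site on x = 0
    (or None if the surface is never reached) and the final position."""
    x, y, z = pos
    vx, vy, vz = vel
    impact = None
    t = 0.0
    while t < t_max:
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vz -= G * dt              # gravity along -z
        if x <= 0.0 and impact is None:
            impact = (y, z)       # impact site on the display surface
            vx = -restitution * vx  # bounce back with energy loss
            x = 0.0
        t += dt
    return impact, (x, y, z)
```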

    [0057] Three exemplary embodiments for the predetermined rule are shown in FIGS. 2 to 4, wherein the predetermined rule according to FIG. 2 relates to a game in which only one of the players 5 is located in the game space 8, the predetermined rule according to FIG. 3 relates to a game in which precisely two of the players 5, 5′ are located in the game space 8, and the predetermined rule according to FIG. 4 relates to a game in which precisely two of the players 5, 5′ or more than two of the players 5, 5′, 5″ are located in the game space 8. According to FIG. 3, the game is played by the players 5, 5′ with one another and with only one object 6, whereas according to FIG. 4, the game is played by the players 5, 5′, 5″ next to one another with one respective object 6 for each of the players 5, 5′, 5″.

    [0058] For the exemplary embodiment shown in FIG. 2, the target field 7 is selected in such a way that a straight line that runs parallel to the x-axis and through the intersection point 14 runs next to the player 5. As a result, after the player 5 has played the object 6 into the target field 7, he must move in order to subsequently reach the object 6 again. It is alternatively conceivable that a straight line that runs parallel to the x-axis and through the intersection point 14 meets the player 5. It is conceivable that a plurality of the predetermined rules are stored in the computing unit 15, each of which has a different degree of difficulty for the player 5 when playing with the game device 1. Thus, for example, as the degree of difficulty increases, the distance from the position to the target area 12 can be chosen to be correspondingly longer.

    [0059] In the exemplary embodiment shown in FIG. 3, the acquisition system 4 is configured to determine the positions of two of the players, namely a first player 5 and a second player 5′. In addition, the computing unit 15 is configured to determine the target area 12 using the positions of the first player 5 and of the second player 5′. The first player 5 plays the object 6 at the display surface, and the second player 5′ should reach the object 6. The target area 12 can now be selected such that the target area 12 is arranged at the side of the second player 5′ that faces away from the first player 5. The second player 5′ must therefore move away from the first player 5 in order to reach the object 6. It is alternatively conceivable that the target area 12 is arranged at the position of the second player 5′. The second player 5′ can therefore concentrate on playing the object 6. It is also conceivable that the target area 12 is arranged between the first player 5 and the second player 5′. The second player 5′ must therefore move towards the first player 5 in order to reach the object 6. It is also conceivable that according to the predetermined rule, in a sequence of game plays, there is an alternation between the target area 12 that is arranged on the side of the second player 5′ that faces away from the first player 5, the target area 12 that is arranged at the position of the second player 5′, and/or the target area 12 that is arranged between the first player 5 and the second player 5′. It is conceivable that a plurality of the predetermined rules are stored in the computing unit 15, each of which has a different degree of difficulty for the second player 5′ when playing with the game device 1. Thus, for example, as the degree of difficulty increases, the distance of the target area 12 from the position of the second player 5′ can be chosen to be correspondingly longer.
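
    The three placements of the target area 12 described above, and the alternation between them over a sequence of game plays, can be encoded as a small sketch. The function names, the one-dimensional y-coordinate simplification, and the distance-per-difficulty step are illustrative assumptions, not the claimed rule:

```python
# Sketch of the two-player rule: the target area lies on the far side of
# the second player, at the second player's position, or between the two
# players; a higher degree of difficulty enlarges the distance.

def target_area_center(p1_y, p2_y, mode, difficulty=1, step=0.5):
    """Return a y-coordinate for the target area given the y-positions of
    the first player (p1_y) and the second player (p2_y)."""
    away = 1.0 if p2_y >= p1_y else -1.0  # direction facing away from player 1
    if mode == "far_side":
        return p2_y + away * step * difficulty
    if mode == "at_player":
        return p2_y
    if mode == "between":
        return (p1_y + p2_y) / 2.0
    raise ValueError(f"unknown mode: {mode}")

def alternating_modes(n):
    """Alternate through the three placements over a sequence of n plays."""
    cycle = ["far_side", "at_player", "between"]
    return [cycle[i % 3] for i in range(n)]
```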

    [0060] In the exemplary embodiment shown in FIG. 4, the acquisition system 4 is configured to determine the positions of at least two of the players 5, 5′, 5″. In addition, the computing unit 15 is configured to determine a target area 12 for each of the players 5, 5′, 5″ using the positions in each case. The target area 12 of each of the players 5, 5′, 5″ can be determined in such a way that the target area 12 is arranged between the player 5, 5′, 5″ belonging to the target area 12 and his neighboring players 5, 5′, 5″, or, in the case in which the player 5″ only has one neighboring player 5′, is arranged on the side of the player 5″ facing away from the neighboring player 5′. It is conceivable that a plurality of the predetermined rules are stored in the computing unit 15, each of which has a different degree of difficulty for the players 5, 5′, 5″ when playing with the game device 1. Thus, for example, as the degree of difficulty increases, the distance from the position to the target area 12 can be chosen to be correspondingly longer.
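
    The neighbor-based placement described above can be sketched as follows. The one-dimensional y-coordinate simplification, the midpoint choice, and the edge offset are illustrative assumptions:

```python
# Sketch of the multi-player rule: each player's target area lies between
# him and his neighboring player; the outermost player, who has only one
# neighbor, gets a target area on his facing-away side.

def target_areas(player_ys, edge_offset=1.0):
    """One target-area y-coordinate per player, from the players'
    y-positions sorted along the display surface."""
    ys = sorted(player_ys)
    out = []
    for i, y in enumerate(ys):
        if i + 1 < len(ys):
            out.append((y + ys[i + 1]) / 2.0)  # between neighboring players
        else:
            out.append(y + edge_offset)        # facing-away side of last player
    return out
```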

    [0061] The acquisition system 4 can be configured to determine the number of players 5. Another of the predetermined rules or another set of the predetermined rules can, moreover, be stored in the computing unit 15 for each number of players 5 in each case. For example, the predetermined rules or the sets of the predetermined rules can be the predetermined rules described for FIGS. 2 to 4.

    [0062] The acquisition system 4 can be configured to determine an eye point position of an eye point of the player 5 in the part of the game space 8 and to adjust the perspective of an image displayed on the display surface 2 to the eye point position. For this purpose, the acquisition system 4 can for example be configured to determine the position of both eyes of the player 5 and to determine the eye point position as the center point of the positions of both eyes. Equally, the acquisition system 4 can, for example, be configured to detect the head and to determine the eye point position, for example, as the center point of the head or as the center point of a surface that is located at the end of the head that is facing the display surface.
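
    The first variant described above, determining the eye point as the center point of both eye positions, amounts to a simple midpoint calculation (the function name is an illustrative assumption):

```python
# Sketch: the eye point position is the center point (midpoint) of the
# two detected eye positions, given as 3D coordinates.

def eye_point(left_eye, right_eye):
    """Midpoint of both eye positions, component-wise."""
    return tuple((a + b) / 2.0 for a, b in zip(left_eye, right_eye))
```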

    [0063] The acquisition system 4 can be configured to recognize whether the object 6 was played by a hand of the player 5 or by a foot of the player 5. The acquisition system 4 can also be configured to recognize whether the player 5 is holding a racquet in the hand and, in particular, whether the object 6 was played by the racquet. The acquisition system 4 can, moreover, be configured to recognize whether the object 6 was played in accordance with a set of rules belonging to the game device 1.

    [0064] It is conceivable that the game device 1 is configured such that a rebound sport, in particular squash or tennis, a shooting sport, in particular archery, or American football can be played with the game device 1.

    [0065] It is understood that the foregoing description is that of the exemplary embodiments of the disclosure and that various changes and modifications may be made thereto without departing from the spirit and scope of the disclosure as defined in the appended claims.

    LIST OF REFERENCE NUMERALS

    [0066] 1 Game device
    [0067] 2 Display surface
    [0068] 3 Sensor system
    [0069] 4 Acquisition system
    [0070] 5 Player
    [0071] 6 Object
    [0072] 7 Target field
    [0073] 8 Game space
    [0074] 9 Target point
    [0075] 10 Trajectory
    [0076] 11 Impact site
    [0077] 12 Target area
    [0078] 13 Floor
    [0079] 14 Intersection point