Play apparatus
11517830 · 2022-12-06
Assignee
Inventors
- Simon Parsons (Glasgow, GB)
- Gordon Ross (Auchterarder Perth and Kinross, GB)
- Anthony James Bibby (Brightons Nr. Falkirk Stirling, GB)
CPC classification
A63H33/18
HUMAN NECESSITIES
International classification
A63H33/18
HUMAN NECESSITIES
Abstract
A play apparatus has a pair of devices. Each device has: a capture arrangement such as a cavity to capture an object such as a ball; an object detector to detect that the object has been captured; a transmission module to transmit a release signal; a receiver module to receive a release signal; and a release actuator to release the captured object. A first device detects capture of a first object and transmits a first release signal to cause a second device to release a second object. The first device withholds the first object until releasing it responsive to receipt of a second release signal indicating that another object has been captured by the second device.
Claims
1. A play apparatus comprising a pair of devices, each device comprising: a capture arrangement configured to capture an object, wherein the capture arrangement comprises a cavity configured to conceal the captured object until it is revealed by its release; an object detector configured to detect that the object has been captured; a transmission module configured to transmit a release signal, and a receiver module configured to receive a release signal, wherein the transmission module and receiver modules are embodied in a non-transitory computer-readable medium of the apparatus or of each pair of devices; and a release actuator configured to release the captured object, wherein a first device of the pair of devices is configured to: respond to detection of capture by the first device of a first object, transmit a first release signal configured to cause a second device of the pair of devices to release a second object; and withhold the first object until releasing it, responsive to receipt of a second release signal configured to indicate that another object has been captured by the second device; wherein the play apparatus further comprises: a media output device configured to output to a user of the first device an indication of the second object having been released by the second device in response to the first release signal; and a media acquisition device configured to acquire and transmit media representing the second object having been released by the second device in response to the first release signal, for output by the media output device.
2. The play apparatus of claim 1, wherein the release actuator is configured to release the captured object by ejecting it.
3. The play apparatus of claim 1, wherein the second device is configured to: receive the first release signal transmitted by the first device; responsive to the first release signal, release the second object; detect that the other object has been captured by the second device; and responsive to the detection of the capture of the other object by the second device, transmit the second release signal to the first device.
4. The play apparatus of claim 1, further comprising a media sequence module embodied in a non-transitory computer-readable medium of the apparatus or of each pair of devices configured to generate a media sequence having a duration corresponding to latency from the transmission of the first release signal to the release of the second object, for output by a media output device.
5. The play apparatus of claim 1, further comprising a media sequence module embodied in a non-transitory computer-readable medium of the apparatus or of each pair of devices configured to generate a media sequence responsive to user input, between the detection of the capture of the first object and the release of the second object, for output by a media output device.
6. The play apparatus of claim 1, wherein each device comprises an attribute detector module embodied in a non-transitory computer-readable medium of the apparatus or of each pair of devices configured to detect an attribute of an object upon its capture and wherein the transmission module of the first device is further configured to transmit a first attribute signal configured to cause the second device to apply the attribute to the second object.
7. The play apparatus of claim 6, wherein each device comprises an attribute actuator configured to apply an attribute to an object.
8. The play apparatus of claim 7, wherein applying the attribute comprises imparting a force to the object.
9. The play apparatus of claim 7, wherein applying the attribute comprises selecting an object to release from a plurality of objects captured in the device from which the selection is made.
10. A method comprising the steps: capture a first object by a first device; detect that the first object has been captured by the first device; responsive to the detection, transmit a first release signal from the first device configured to cause a second device to release a second object; outputting to a user of the first device an indication of the second object having been released by the second device in response to the first release signal; withholding of the first object by the first device until releasing it responsive to receipt of a second release signal configured to indicate that the second or a third object has been captured by the second device; and concealing the captured first object in a cavity until it is revealed by its release, wherein the step of outputting to the user an indication of the second object having been released comprises acquiring and transmitting media of the second object having been released by the second device in response to the first release signal.
11. The method of claim 10, wherein the step of releasing the first object comprises ejecting it.
12. The method of claim 10, further comprising the steps: generate and output a media sequence having a duration dependent on latency from the transmission of the first release signal to the release of the second object.
13. The method of claim 10, further comprising the steps: generate and output a media sequence responsive to user input, between the detection of the capture of the first object and the release of the second object.
14. The method of claim 10, further comprising the steps: receive the first release signal; responsive to the first release signal, release of the second object by the second device; detect that the other object has been captured by the second device; and responsive to the detection of the capture of the other object by the second device, transmit the second release signal to the first device.
15. A method comprising the steps: capturing a first object by a first device; detecting that the first object has been captured by the first device; responsive to the detection, transmit a first release signal from the first device configured to cause a second device to release a second object; withholding of the first object by the first device until releasing it responsive to receipt of a second release signal configured to indicate that the second or a third object has been captured by the second device; concealing the captured first object in a cavity until it is revealed by its release; and outputting to a user of the first device an indication of the second or the third object being captured by the second device, wherein the step of outputting to the user of the first device an indication of the second or the third object being captured by the second device comprises acquiring and transmitting media of the second or the third object being captured by the second device.
Description
BRIEF DESCRIPTION OF DRAWINGS
(1) Embodiments of the present invention will now be described, by way of example only, with reference to the drawings.
DESCRIPTION OF EMBODIMENTS
(14) Embodiments of the present invention allow a parent and child to engage in unstructured play, by providing remote, two-way, real-time, tactile “passing” of objects.
(15) Embodiments of the present invention create the impression that a real object has been transferred from one location to another in real time. Embodiments enable dialogic communication and play without the use of language via tangible interfaces that extend the play-space beyond the screen into the physical space around it.
(17) With reference to
(18) In the description of
(19) The device has a communication interface (COMM) 110 with an antenna 112, for wireless communication, such as using WiFi™, although wired communication could be used. A central processing unit (CPU) 114 and power supply (PWR) 116 are also provided. The CPU 114 controls the operation of the device and its communication. Although a CPU is convenient, the present invention does not require a CPU or other computing device; instead, other electrical circuits could be used with the detectors to trigger the sending of a release signal to the other device, or to release an object responsive to a release signal.
(20) The memory 118 stores various program modules, which are executed by the processor to control other components of the device. These include a transmission module 120 operable to transmit a release signal, using the communication interface 110. A receiver module 122 is operable to receive a release signal, using the communication interface 110.
(21) A media sequence module 124 is provided and its operation is described below with reference to
(22) The device 102 has a capture arrangement, in this example a cavity 128, configured to capture (at 206 in
(23) One or more object detector 132 is operable to detect (at 210) that the ball has been captured. The object detector 132 is connected to the CPU to provide detection signals to the CPU.
(24) The device is operable to withhold (at 210-226) the captured ball until releasing it using one or more release actuator 134. The release actuator 134 may be used, under control of the CPU, for example to allow the ball to drop under the influence of gravity. It may be used to eject the ball, with the release actuator being all or part of a firing mechanism that fires the ball. The actuator may use DC motors or stepper motors. The release actuator may be spring-loaded, using a compression spring or a torsion spring. A motor, such as a servo, may pull back the spring.
(25) One or more attribute detector 136 is operable to detect one or more attribute of a ball upon its capture. The attribute detectors 136 are connected to the CPU to provide attribute signals to the CPU. Attributes can include motion attributes such as velocity, angle and spin. Attributes may also include visual attributes such as colour and pattern. An attribute detector may function as the object detector.
(26) In the case where the attribute is intrinsic to the object (for example colour) the play device can identify and select an appropriate coloured object before attaching the remaining attributes to it. Thus, applying an attribute involves selecting an object to release. In that case there are two or more objects captured in the device from which the selection is made.
(27) The transmission module 120 of the first device is further operable to control the communication interface 110 to transmit one or more first attribute signal configured to cause the second device to apply the one or more attribute to the second ball.
(28) One or more attribute actuator 138 is operable to apply one or more attribute to an object. The attribute actuator 138 may be used, under control of the CPU, to apply a motion attribute (at 216) by imparting a force to the ball. The attribute actuator may thus be part of a firing mechanism that fires the ball with a speed, angle and/or spin that corresponds to the motion of the other ball when captured by the other device. An attribute actuator may function as the release actuator.
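By way of illustration only, the attribute signal described above could be modelled as a small serialisable message carrying the detected motion attributes. This is a minimal sketch, not part of the original disclosure; the field names, units and JSON wire format are all assumptions.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AttributeSignal:
    """Hypothetical encoding of the motion attributes detected on capture."""
    velocity_mps: float  # speed of the captured ball (assumed units)
    angle_deg: float     # angle of travel on capture
    spin_rpm: float      # spin rate on capture

    def to_wire(self) -> bytes:
        # Serialise for transmission alongside (or as part of) the release signal.
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def from_wire(cls, payload: bytes) -> "AttributeSignal":
        # The second device decodes the signal and passes the values to
        # its attribute actuator, e.g. a firing mechanism.
        return cls(**json.loads(payload.decode("utf-8")))

sig = AttributeSignal(velocity_mps=3.2, angle_deg=45.0, spin_rpm=120.0)
assert AttributeSignal.from_wire(sig.to_wire()) == sig
```

The receiving device would map these values onto its firing mechanism so that the second ball leaves with a speed, angle and spin corresponding to the motion of the first ball when captured.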
(29) Other attributes may be metadata, which may be stored in the RFID tag of the ball, being read by an RFID reader as the attribute detector 136. The metadata may for example be used to identify a sound attribute, which is applied to the ball when released by the other device by playing the sound.
(30) Attributes may be for example a surface pattern (or texture) detected on a ball (or other object). The surface texture may then be applied to the object by superimposing the identified texture on the ball's representation in a video sequence. For example, the parent may have just one ball with a marker pattern on it. The child may choose one of several balls, such as one having a particular image of its favourite comic character on its surface. The child's play device recognises the attribute of that particular image, and the parent's (or child's) play device or tablet computer can augment the reality of the patterned ball at the parent's end by applying the particular image as a texture to the surface of the parent's ball (using the marker pattern for texture registration) in the video sequence of the parent in real time. The end result is that the child thinks they have passed the selected ball to the parent, whereas the parent needs only one ball. If the parent does not notice which ball the child inserted into their play device, the parent's tablet computer can display a notification of which ball was passed, to help them talk about it with the child. Additionally, or alternatively, an attribute may be a 3D shape of an object, which can be similarly rendered using augmented reality. Other ways of applying a surface texture include illuminating a ball from inside using LEDs or using e-ink to apply patterns to a ball or both balls.
(31) With reference to
(32) A first device is initially empty at 202 with no ball captured in it. A ball B is captured in the second device at 204.
(33) A first ball A is thrown, pushed or dropped into the first device at 206. The ball B is withheld in the second device at 208. The object detector detects the capture of the ball A at 210, or if using detectors close to the hole, at 206.
(34) The first device withholds the first ball at 210, 214, 218, 222, 226 until releasing it at 230.
(35) Responsive to detection of the capture by the first device of the ball A, at 210 the first device transmits a first release signal 211 configured to cause the second device to release a second ball B. The release signal 211 may be an explicit command or code instructing release. The release signal 211 may be in the form of a control signal encoding attributes to be applied to the second object. Thus, the release is implicitly signalled and the release signal comprises an attribute signal. The release signal may be relayed or transformed or the release information may be transferred from one signal to another during its transmission between the devices. For example, an intermediate server may receive the release signal and generate a new release signal that is sent on to the other device. In that example, the release signal is still to be interpreted as being transmitted from one device to another.
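As a sketch only (not part of the disclosure), the two forms of release signal described above, and its relaying by an intermediate server, might be modelled as follows; the message fields are hypothetical, and serialisation (e.g. to JSON) is omitted.

```python
from typing import Optional

def make_release_signal(explicit: bool, attributes: Optional[dict] = None) -> dict:
    """Build a release signal as a message dict (in practice it would be
    serialised before transmission). An explicit signal carries a RELEASE
    command; an implicit one carries only attributes, whose arrival itself
    signals the release. Field names are illustrative."""
    msg = {"type": "RELEASE" if explicit else "ATTRS"}
    if attributes:
        msg["attributes"] = attributes
    return msg

def relay_via_server(msg: dict, sender: str, recipient: str) -> dict:
    """An intermediate server may receive the release signal and generate
    a new signal that is sent on to the other device; logically the signal
    is still transmitted from one device to the other."""
    out = dict(msg)  # the server builds a fresh message from the received one
    out.update(via="server", sender=sender, recipient=recipient)
    return out
```

A usage example: `relay_via_server(make_release_signal(False, {"spin_rpm": 120}), "first", "second")` produces an implicit release signal routed through the server to the second device.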
(36) The second device receives at 212 the first release signal 211. Responsive to the first release signal 211, the second device releases the second ball B at 216.
(37) At this stage, the user of the first device has “passed” their ball A to the second device's user, who receives ball B. The first device at 218 withholds and conceals the captured ball A, while the second device is empty at 220.
(38) The user of the second device can catch the ball B. The user of the first device can see that happening by use of a video conference call. The user of the second device then passes the ball back to the first device's user by throwing, pushing or dropping it into the second device at 224.
(39) The second device detects at 224 or 228 that the other object (which may actually be the ball B or a different object) has been captured by the second device.
(40) Responsive to the detection of the capture of the other object by the second device, the second release signal 227 is transmitted to the first device. Thus the second release signal 227 indicates that another object has been captured by the second device. In this example, the second device transmits the second release signal 227, but it may be transmitted for example by an intermediate server, which has been informed of the detection of the capture at 224 or 228.
(41) At 230, the first device releases the ball A responsive to receipt of the second release signal 227. At 232, the second device withholds the other ball B.
(42) This takes the play apparatus back to the initial state, thus devices at 234 and 236 are in the same state as the devices at 202 and 204.
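The exchange of paragraphs (32) to (42) can be sketched as a pair of two-state machines: each device is either empty or withholding a ball, and a capture on one side triggers a release on the other. This is an illustrative model only; the class and state names are invented for the sketch.

```python
# States for each play device in the back-and-forth exchange.
EMPTY, WITHHOLDING = "empty", "withholding"

class PlayDevice:
    def __init__(self, name, state=EMPTY):
        self.name, self.state, self.peer = name, state, None

    def capture(self):
        """A ball is thrown, pushed or dropped in: detect it, withhold it,
        and transmit a release signal causing the peer to release its ball."""
        self.state = WITHHOLDING
        self.peer.on_release_signal()

    def on_release_signal(self):
        """Release the withheld ball (e.g. via the release actuator)."""
        self.state = EMPTY

# Initial state (202/204): first device empty, second withholding ball B.
first, second = PlayDevice("first"), PlayDevice("second", WITHHOLDING)
first.peer, second.peer = second, first

first.capture()   # 206-216: ball A captured, ball B released
assert (first.state, second.state) == (WITHHOLDING, EMPTY)
second.capture()  # 224-230: ball re-captured, ball A released
assert (first.state, second.state) == (EMPTY, WITHHOLDING)
# 234/236: back to the same states as 202/204.
```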
(44) A small child 328 is a user of a play device 324. A media output device 308, in this case a tablet computer, is positioned behind the play device, from the child's point of view. The child's tablet 308 has a camera 312 with a field of view 320 that encompasses the space above the play device 324 and the child's head and shoulders 328.
(45) The play device 324 and the child's tablet 308 are connected wirelessly to a router 302, which connects via the internet 304 to a remote router 306 for communication with the parent 330. The remote router 306 is connected to the parent's media output device 310, in this case a tablet, and their play device 326. The parent's tablet 310 has a camera 314 with a field of view 322 that encompasses the space above the play device 326 and encompasses the parent's head and shoulders 330. Tablet 310 outputs a video stream 318 acquired by the camera 312.
(46) Tablet 308 is operable to output to a user of the first device 328 an indication 316 of the second object B having been released by the second device 326 in response to the first release signal (211 in
(47) A media acquisition device, in this example tablet 310 with camera 314, is operable to acquire and transmit media representing the second object B having been released by the second device 326 in response to the first release signal, for output by the media output device 308. In this example, the indication is media in the form of a live video (with sound) stream of the ball B. Another type of media is sound. The media acquisition device 314 is operable to acquire and transmit media representing the other object being captured by the second device. Another type of media is animated holographic imagery.
(48) In this example, the tablets 308 and 310 each operate both as media output devices and media acquisition devices. At a given end, the media output device and media acquisition device may be separate devices. At a given end, the media output device and media acquisition device may be integrated in the play device.
(50) The media sequence module (124 in
(51) The media sequence module is operable to superimpose the media sequence 404 over an output real-time media sequence 406 representing a user of the second device.
(52) Once the latency period has ended and the animation is complete, the media sequence 414 is rendered with a hill 418 that is empty 420. Meanwhile the media output device 412 shows 416 the ball B having been released by the second device with the media sequence 414 superimposed.
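The latency-masking behaviour described above, where the animation's duration corresponds to the delay between transmitting the release signal and the remote release, can be sketched as follows. This is not the patented implementation; `frame_fn`, the timestamps and the frame rate are illustrative, and a real renderer would pace frames against a clock rather than in a tight loop.

```python
def play_masking_animation(send_ts: float, release_ts: float,
                           frame_fn, fps: int = 30) -> int:
    """Render a media sequence whose duration matches the measured latency
    from transmitting the release signal (send_ts) to the remote release
    (release_ts), hiding network delay from the user.
    frame_fn(t) draws one frame at normalised time t in [0, 1),
    e.g. the virtual ball travelling over the hill."""
    duration = max(release_ts - send_ts, 0.0)
    n_frames = max(int(duration * fps), 1)  # always draw at least one frame
    for i in range(n_frames):
        frame_fn(i / n_frames)
    return n_frames
```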
(54) A game is instigated when two balls are placed into the paired devices.
(55) As an example of a fetch game, the following sequence of events may be performed. It's time for Junior and Mum to play. They are on different continents. Each device is turned on and they link visually. They decide to play a “fetch” game. The distinctive ball is placed in the first device by Junior. Mum does the same (effectively loading the game). A ball appears on screen, is loaded into a catapult and is propelled over the hill out of sight. Alternatives to the catapult include, but are not restricted to, a cannon firing the ball or a donkey kicking the ball. Optional additional step: Junior activates the game by striking a pressure and/or velocity sensor on the device. The pressure and/or velocity are detected as attributes or are used to modify the attributes. Thus the pressure and/or velocity sensor functions as an attribute detector. This may supplement any attribute sensor in the device. At this point the ball flies from Mum's play device across the floor as if Junior has literally fired the ball into Mum's environment. Mum shrieks with surprise before “fetching” the ball from the floor. Mum replaces the ball into her play device. An optional additional step replicates that in the first device: Mum strikes her pressure and/or velocity sensor. The ball then fires out of Junior's device. Junior is too young to catch it, so it flies past him and lands on the floor. The fun begins as Junior searches for where the ball landed, and returns it to the device hole, to once again send it across the globe to Mum.
(56) With reference to
(57) One or more of the devices may be provided with user input components, such as buttons, joysticks, pressure and/or velocity sensors, or microphones. These components can be used for game controllers and/or attribute detectors.
(58) The media sequence is an animation of a computer-generated virtual play space, in this example a hill 508 with a virtual ball 510 being fired away from the viewpoint by in this example a catapult 512. The catapult 512 may be controlled by user input for example a pressure/velocity sensor.
(59) The media sequence module is operable to superimpose the media sequence 504 over an output real-time media sequence 506 representing a user of the second device.
(60) Once the game has ended and the animation is complete, the media sequence 516 is rendered with a virtual play space, in this case a hill 520 that has an empty launch device—in this case a catapult 522. Meanwhile the media output device 514 shows 518 the real ball B having been released by the second device with the media sequence 516 superimposed.
(61) On-screen games and associated graphics enhance the play. The users' faces remain visible, with only approximately the bottom third of the screen used for play.
(62) The game may be played using peripherals to control the rendered ball, such as a pressure, velocity or tilt sensitive pad, or such as a breath-sensitive “blowstick” as described below.
(63) A blowstick can be used to blow at the screen or over the hole in the device, to control an object on a rendered surface. A blowstick can be used to blow and control a musical instrument in a rendered band.
(64) A tilt pad can be used, for example, to alter a surface rendered on screen to adjust the roll of a ball or other object, to adjust the path of a boat depicted on the high seas, or to adjust the path of a plane around the sky.
(66) The method has the following steps:
(67) 602: Capture 206 a first object by a first device. The object is withheld and may be concealed.
(68) 604: Detect 206 that the first object has been captured by the first device. Attributes of the first object are detected upon its capture by the first device.
(69) 606: Modifying the attributes. The modification may be responsive to user input. For example, the user may set the direction and speed of ejection of the other object upon its release, in a catapult game as described with reference to
(70) 608: Responsive to the detection, transmit 210 a first release signal 211 from the first device configured to cause a second device to release 216 a second object (at step 612). A first attribute signal is transmitted configured to cause the second device to apply the attribute to the second object (at step 612).
(71) 610: Generate and output a media sequence (images and/or sound) having a duration dependent on latency from the transmission of the first release signal 211 to the release 216 of the second object. Additionally or alternatively, a media sequence is generated and output responsive to user input (e.g. a game), between the detection 206 of the capture of the first object and the release 216 of the second object. The media sequence is superimposed over an output real-time media sequence representing a user of the second device.
(72) 612: The first release signal 211 is received 212 at the second device. Responsive to the first release signal, the second object is released 216 by the second device. The attributes detected in step 604, as modified in step 606, are applied to the second object. This may involve imparting a force to the second object.
(73) 614: Acquiring and transmitting media of the second object having been released by the second device in response to the first release signal 211. The media are output to a user of the first device as an indication of the second object B having been released 216 by the second device in response to the first release signal 211.
(74) 616: Detect that another object (which may be the second object or a different object) has been captured 224 by the second device. Responsive to the detection of the capture 224 of the other object by the second device, the second release signal 227 is transmitted (from the second device or an intermediate server) to the first device. Media of the other object being captured by the second device are acquired and transmitted. An indication of an object (ball B or a different object) being captured 224 by the second device is output to a user of the first device.
(75) An attribute of the other object may be detected upon its capture by the second device and one or more second attribute signal may be transmitted configured to cause the first device to apply the one or more attribute to the first object. Attributes may also be modified in a step equivalent to that at 606. Applying the attribute(s) to the first object may involve imparting a force to the first object.
(76) The first object, which has been withheld 210-226 by the first device, is released 230 responsive to receipt of the second release signal 227 configured to indicate that the other object has been captured 224 by the second device.
(77) In the steps above, a captured object may be concealed in the cavity until it is revealed by its release. The object may comprise a ball.
(79) A play device 702 is shown in cross section. The hole 704 has a tapered throat, to aid with catching the ball 710. It has tapered sidewalls 706, 708. A stepper motor 712 is connected via its shaft 714 to an arm 716. The motor and arm assembly act as a release actuator. In this example, the releasing action is an ejecting action. As the motor turns, the end of the arm moves upwards and ejects the ball. The motor can move slowly at first to move the ball around the bend in the sidewalls that has been acting to conceal the captured ball. This reduces energy being lost to the sidewalls. Once past the bend, the motor speeds up and applies more force to the ball, which is ejected out of the throat 704.
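The slow-then-fast ejection described above might be driven as a two-phase stepper sequence. This is a sketch only: `step_motor`, the step counts and the inter-step delays are all illustrative assumptions, not values from the disclosure.

```python
def eject_ball(step_motor, bend_steps: int = 40, eject_steps: int = 120,
               slow_delay: float = 0.02, fast_delay: float = 0.002) -> None:
    """Drive the release-actuator arm in two phases:
    phase 1 - move slowly to ease the ball round the concealing bend in
              the sidewalls, so less energy is lost to the walls;
    phase 2 - speed up to apply more force and eject the ball out of
              the throat.
    step_motor(delay) advances the motor one step with the given
    inter-step delay (a shorter delay means a faster, more forceful turn)."""
    for _ in range(bend_steps):
        step_motor(slow_delay)   # phase 1: slow steps past the bend
    for _ in range(eject_steps):
        step_motor(fast_delay)   # phase 2: fast steps to eject
```

With a real driver, `step_motor` would pulse the stepper's step pin and sleep for the given delay; here it is left as a callback so the sequencing can be shown on its own.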
(80) In the remaining Figures, features with reference numerals the same as shown in earlier Figures are the same features, and the description of those features given above in relation to the earlier Figures should be used to interpret the latter features.
(82) In embodiments, the captured object may be supported on a platform until it is released. The platform may be on top of the play device, or within it, such as in a cavity. For example, there may be no cavity and the object may rest on a platform on the top surface of the play device, held in place by gravity.
(84) Instead of the object detector being a sensor 132 in the play device as shown in
(85) The object detector 932 may be simply processing logic that receives an object detection message, for example from an external object detection sensor, and causes the device to respond to detection of capture of an object.
(87) Having three or more play devices allows users to pass balls in different ways rather than backwards and forwards. They can choose who they pass the ball to. In this example, three players can pass a ball to one or other of two players. Each player may be in a different location and has a play device. The play devices are all connected via the internet 304. Shown here, the user of the third play device 1014 puts the ball A into their right-hand hole 1018. By choosing the right-hand hole, sensors associated with that hole detect the attribute, which is labelled “pass to the right”. This causes the release signal to be sent from play device 1014 to the second play device 1008, rather than play device 1002. This causes the second play device 1008 to release and eject the ball B. If the user of the third play device had put the ball in the left-hand hole 1016, then the release signal would have been sent instead to the first play device 1002.
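The hole-based routing described above amounts to a simple selection of recipient by insertion hole. The sketch below is illustrative only; the device names are placeholders taken from the reference numerals.

```python
def route_release_signal(hole: str, left_peer: str, right_peer: str) -> str:
    """Select which peer device receives the release signal, based on
    which hole the ball was inserted into: the right-hand hole passes to
    the player on the right, the left-hand hole to the player on the left."""
    if hole == "right":
        return right_peer
    if hole == "left":
        return left_peer
    raise ValueError(f"unknown hole: {hole!r}")

# User of device 1014 drops ball A into the right-hand hole 1018,
# so the release signal goes to device 1008 rather than 1002:
assert route_release_signal("right", "1002", "1008") == "1008"
assert route_release_signal("left", "1002", "1008") == "1002"
```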
(88) One player may act as a “games master” and may have control of the game play of the other players. This may involve, for example, modifying, interrupting or overriding the release signals between the other players. Thus, they may for example block a ball pass between the other players, bat a ball back to a player, or steal the ball to their own play device, instead of allowing it to pass to another player. The games master may act to enforce rules of game play or may suspend or terminate play. The role of games master between two or more players may be performed by a person without their own play device, or it may be performed automatically by software running on a processor in a play device or externally.
(91) A pair of play devices 1202, 1208 are configured to act as toy car parking garages. The play devices 1202, 1208 are connected via the internet 304. The play devices may operate in the same way as described with reference to
(92) The object may be a volume of solid, such as a ball, or a volume of liquid or gas.
(93) Advantages of embodiments of the present invention include:
(94) Communication is via tangible objects rather than words. Play is unstructured and mimics play with real objects when participants are in the same room. Play is in real time. Embodiments enable parents to engage with their children even when they are absent.