SYSTEM FOR, AND METHOD OF, CHANGING OBJECTS IN AN ENVIRONMENT BASED UPON DETECTED ASPECTS OF THAT ENVIRONMENT
20190049906 · 2019-02-14
Inventors
CPC classification
G01S5/0027
PHYSICS
International classification
Abstract
This document describes a system in which a sensory instrumentality, while detecting and monitoring its location within a particular environment, detects and monitors aspects of its environment (such as, for example, sounds, movements, lighting, colors, surfaces, smells, tastes, signals or combinations of the foregoing) and then sends signals that trigger changes in the environment (or objects in the environment) based upon certain relational matches of locations and aspects of the environment detected. The method described in this document includes the steps of detecting and monitoring the location and the surroundings of a sensory instrumentality, comparing the combination of location and surroundings readings with specific parameters, and causing a change in one or more aspects of or in the environment (or one or more objects in the environment) when there is a match between a particular location, the detected aspect of the surroundings, and the specific parameter.
Claims
1. A system for interactively controlling aspects of an environment comprising: a sensory instrumentality capable of detecting its position in the environment and detecting at least one audio cue within the environment; means of comparing the position of the sensory instrumentality once the sensory instrumentality detects the audio cue with parameters that coordinate the position of the sensory instrumentality with the audio cue; and means of generating at least one signal when there is a match of the detected position of the sensory instrumentality, the audio cue within the environment and the parameters that coordinate the position of the sensory instrumentality with the audio cue, wherein a receiver of the generated signal causes a change in an aspect of the environment and wherein the sensory instrumentality transmits a signal that can change at least one aspect of a garment to which the sensory instrumentality is attached.
2. The system of claim 1 wherein the external aspect of such environment is from the group of sounds therein, lighting therein, movements therein, colors therein, smells therein, tastes therein and electronic signals therein.
3. The system of claim 1 wherein such sensory instrumentality interprets parameters in correlating such sensory instrumentality's position with relevant detected aspects.
4. The system of claim 1 wherein the sensory instrumentality electronically transmits its position and sensory readings to a means of analyzing information relative to the coordinating parameters.
5. The system of claim 3 wherein, based upon the correlation of such parameters with such sensory instrumentality's position with relevant detected aspects, such sensory instrumentality transmits a signal that can change at least one aspect of a garment to which such sensory instrumentality is attached.
6. The system of claim 1 wherein the sensory instrumentality transmits a signal that can change at least one aspect of another object within the environment.
7. The system of claim 1 wherein the coordinating parameters are pre-established and embedded within the sensory instrumentality.
8. The system of claim 1 wherein the coordinating parameters are receivable from a remote source.
9. The system of claim 1 wherein the coordinating parameters are established by activating a function in which the sensory instrumentality captures and stores, within the sensory instrumentality, the coordinating parameters in at least near real time.
10. A system for interactively controlling aspects of an environment comprising: a sensory instrumentality capable of detecting its position in such environment and at least one external aspect of such environment, wherein (i) the external aspect of such environment is from the group of sounds therein, lighting therein, movements therein, colors therein, smells therein, tastes therein and electronic signals therein, (ii) such sensory instrumentality interprets parameters in correlating such sensory instrumentality's position with relevant detected aspects, and (iii) such sensory instrumentality electronically transmits its position and sensory readings to a means of analyzing such information relative to such parameters; means of comparing such detected position and such external aspects of such environment with other parameters, wherein such other parameters are pre-established and embedded within such sensory instrumentality; and means of generating at least one signal when there is a match of the detected position, such external aspect of such environment and such other parameters, wherein a receiver of such signal causes a change in an aspect of such environment.
11. A method of interactively controlling aspects of an environment comprising the steps of: establishing the parameters that need to be met for there to be a change to be made in the aspects of such environment; detecting, identifying and monitoring the position of a sensory instrumentality within such environment; detecting and monitoring, through such sensory instrumentality, at least one external aspect of the environment; interpreting parameters and correlating such parameters with position of the sensory instrumentality and with relevant detected aspects; and transmitting a signal that can change an element associated with the environment.
12. The method of claim 11 wherein the external aspect of such environment is from the group of sounds in the environment, the movement of the sensory instrumentality in its surroundings, lighting in the area, colors within its sensory range, surfaces with which the sensory instrumentality comes in contact, smells in proximity, tastes of objects placed against the sensory instrumentality, and signals from an external source.
13. The method of claim 11 wherein such sensory instrumentality performs such monitoring when configured as an element of a wearable garment.
14. The method of claim 11 wherein such sensory instrumentality performs such monitoring as, or as contained in, an object.
15. The method of claim 11 wherein such parameters are pre-established and embedded within such sensory instrumentality.
16. The method of claim 11 wherein such parameters are transmitted to such sensory instrumentality over time.
17. The method of claim 11 wherein such parameters are established by causing such sensory instrumentality to register such parameters by activating a function in such sensory instrumentality that captures and stores, within the sensory instrumentality, such parameters in at least near real time.
Description
BRIEF DESCRIPTION OF THE FIGURES
[0013]
[0014]
[0015]
[0016]
[0017]
[0018]
[0019]
DETAILED DESCRIPTION
[0020] In a preferred embodiment of the present inventive system, a sensory instrumentality is used to detect, identify and monitor certain aspects of the environment in which it is situated. As shown in
[0021] In this particular embodiment, sensory instrumentality 100 is set to listen for audio cues. For example, sensory instrumentality 100 can be set to listen for a string of spoken words. As suggested, the listened-for words can be pre-stored within sensory instrumentality 100 or, in another embodiment of the present invention, they can be transmitted to sensory instrumentality 100, possibly any time prior or in real or near real time. Additionally or alternatively, sensory instrumentality 100 can, for example, detect its movement within the environment, lighting in the area in which it is positioned, colors within its sensory range, surfaces with which it comes in contact (e.g., if it comes in contact with skin or a certain fabric), smells in proximity (e.g., a fog machine's output or cookies baking in an on-stage oven), tastes of objects placed against it (bitter vs. sweet), signals generated by external sources, or combinations of the foregoing. As with audio cues, the aspects to be detected and monitored can be pre-stored in sensory instrumentality 100 or transmitted into it after activation. Accordingly, in the operation of the present inventive system, sensory instrumentality 100 can receive the relevant aspect(s) (e.g., sounds, lighting, movement, colors, smells, tastes, signals, etc.) of the environment it is in.
[0022] Additionally, in the operation of the present inventive system, sensory instrumentality 100 would detect, identify and monitor its position in the environment it is in. One of ordinary skill in the art would realize that there are numerous technologies that can be used for sensory instrumentality 100 to detect, identify and monitor its position within the environment. It is the coordination of the position of sensory instrumentality 100 and the detection of the anticipated sensory trigger(s) that, in essence and actuality, interactively control and cause changes in the desired aspects of the environment.
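The coordination described above, namely a joint match between the instrumentality's position, a detected cue, and a pre-established parameter, can be sketched as follows. This is only an illustrative model, not the disclosed implementation; the names (`Parameter`, `match`, `control_step`) and the zone/cue strings are assumptions:

```python
from dataclasses import dataclass


@dataclass
class Parameter:
    """A coordinating parameter pairing a position zone with an expected cue."""
    zone: str  # named region of the environment, e.g. "stage-left" (hypothetical)
    cue: str   # expected audio cue, e.g. a spoken phrase (hypothetical)


def match(position, detected_cue, parameters):
    """True when the detected position and cue jointly satisfy some parameter."""
    return any(p.zone == position and p.cue == detected_cue for p in parameters)


def control_step(position, detected_cue, parameters, send_signal):
    """One pass of the control loop: on a match, signal a receiver that
    then changes an aspect of the environment."""
    if match(position, detected_cue, parameters):
        send_signal("change-environment")
```

Here `send_signal` stands in for whatever transmitter links the instrumentality to a receiver; a practical system would also debounce repeated cues so one spoken phrase does not fire the change twice.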
[0023] Depending upon the additional features and functions incorporated in sensory instrumentality 100, when in the desired position and, for example, when a certain string of spoken words is heard, sensory instrumentality 100 can transmit signal 110 to receiver 116, as shown in
[0024] Based upon the correlation of parameters with the associated information, as shown in
[0025] In a further embodiment of the inventive system, as shown in
[0026]
[0027] In one particular embodiment of the present invention, as discussed above and shown in
[0028]
[0029] Another use in an entertainment environment is shown in
[0030]
[0031] In other instances, if the parameters have not been pre-established, then an additional step would include the setting of such parameters. One of ordinary skill in the art would realize that such parameters can be transmitted to the sensory instrumentality from a remote location or, if the sensory instrumentality includes the necessary functionality, such parameters can be set within the sensory instrumentality in real or near real time.
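As a sketch of this parameter-setting step (all class and method names here are hypothetical, chosen only for illustration), the instrumentality might expose a capture mode that registers observed position/cue pairs as parameters in near real time, alongside a path for parameters transmitted from a remote source:

```python
class SensoryInstrumentality:
    """Illustrative model of parameter registration, not the disclosed device."""

    def __init__(self):
        self.parameters = []   # stored (position, cue) coordinating parameters
        self.capturing = False

    def activate_capture(self):
        """Enter the mode in which observed readings become parameters."""
        self.capturing = True

    def deactivate_capture(self):
        self.capturing = False

    def observe(self, position, cue):
        """While capture is active, store the reading as a new parameter;
        otherwise the reading is simply an observation."""
        if self.capturing:
            self.parameters.append((position, cue))

    def load_remote(self, params):
        """Alternatively, accept parameters transmitted from a remote location."""
        self.parameters.extend(params)
```

The same object thus supports both routes the paragraph describes: in-device capture in near real time, and receipt of pre-established parameters from elsewhere.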
[0032]
[0033] The inventive method also includes the step of coordinating the relationship between the position(s) and other readings in anticipation of detecting a configuration that matches an actionable parameter. A separate step in the inventive method is the transmitting of a signal to a receiver when there is a match of the parameters. It is such receiver or the apparatus with which it is associated that causes the desired change in the environment or in an object therein.
[0034] A sensory instrumentality, with the capability of detecting and monitoring its position in the environment and other external aspects of it, monitors the environment, reading, for example, sounds in the environment, the movement of the sensory instrumentality or in its surroundings, lighting in the area, colors within its sensory range, surfaces with which it comes in contact, smells in proximity, tastes of objects placed against it, signals receivable by it, or combinations of the foregoing. In one embodiment of the present invention, the sensory instrumentality performs such monitoring when configured as an element of a wearable garment. In another embodiment, the sensory instrumentality performs such monitoring as, or as contained in, an object.
[0035] In the performance of the inventive method, the sensory instrumentality detects, identifies and monitors its position in the environment and receives the relevant aspect(s) of the environment (e.g., sounds, lighting, movement, colors, smells, tastes, signals, etc.). Such sensory instrumentality either (x) interprets parameters and correlates its position(s) relative to the detected aspects and/or (y) electronically transmits the pertinent information (e.g., position and sensory readings) to a device that can analyze such information relative to such parameters. Based upon the correlation of parameters with the associated information, the present invention includes the step of transmitting a signal that can change (A) one or more of the aspects of the environment (e.g., it can cause a change in the lighting based upon the position of a costume wearer and sounds detected), (B) one or more aspects of a garment to which it is attached or object in which it is contained, (C) aspects of a combination of the foregoing, or (D) some other element associated with the environment.
[0036] One of ordinary skill in the art would recognize that the present invention can be used as a part of the function in other entertainment, sports, everyday work or specialized environments, such as situationally-dependent safety equipment or periodic activity as part of a job or process. One benefit of the use of the present invention is the enablement of performances and other activities that are more flexible since the actions can be triggered in real time instead of being a cascade of timed events or a pre-set action.
[0037] Some of the examples of the use of the inventive system and/or method include, without limitation, guest interaction, stage performances, and parades. With guest interaction, e.g., in an environment with many guests (i.e., an audience), it is possible to operate the inventive system as part of an interactive experience. Take, for example, a King Arthur sword-in-the-stone exhibit. Only the right person who uses the pre-established words would be able to, unbeknownst to the person, deactivate the magnets holding the sword in the stone. The host can issue costumes with an audio sensory instrumentality embedded to the guests, and the sword will only interact when the person wearing such a costume says the pre-established words while pulling on the sword. A variant can be that every costume reacts to some activities, but not all (e.g., the Jedi Training Academy at Disney World uses costumes for every volunteer and these can be interactive costumes instead of basic fabric). For stage performances, the inventive system can be used for lighting, costuming, and visual effects on a stage. In connection with parades, a float, a prop, or the costume of the performer can react to the performer interacting with the float.
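The sword-in-the-stone scenario reduces to a conjunction of conditions: right place, right words, and a pull on the sword. A minimal sketch, in which the zone name, the phrase, and the function itself are all hypothetical values invented for illustration:

```python
def sword_releases(position, phrase, pulling,
                   required_zone="sword-stone",
                   required_phrase="i am the rightful king"):
    """Deactivate the holding magnets only when the costume's instrumentality
    is at the exhibit, the pre-established words are spoken, and the guest
    is pulling on the sword. All literal values here are illustrative."""
    return (position == required_zone
            and phrase == required_phrase
            and pulling)
```

Any one condition failing (wrong guest position, wrong words, or no pull) leaves the magnets engaged, which is what makes the effect appear selective to the audience.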
ADDITIONAL THOUGHTS
[0038] The foregoing descriptions of the present invention have been provided for the purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner of ordinary skill in the art. Particularly, it would be evident that the examples described herein merely illustrate how the inventive apparatus may look and how the inventive process may be performed. Further, other elements/steps may be used for and provide benefits to the present invention. The depictions of the present invention as shown in the exhibits are provided for purposes of illustration.
[0039] The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others of ordinary skill in the art to understand the invention for various embodiments and with various modifications that are suited to the particular use contemplated.