Lighting control based on interaction with toys in play area
09763307 · 2017-09-12
Assignee
Inventors
- Dzmitry Viktorovich Aliakseyeu (Eindhoven, NL)
- Jonathan David Mason (Waalre, NL)
- Sanae Chraibi (Eindhoven, NL)
CPC classification
A63F9/001
HUMAN NECESSITIES
International classification
Abstract
The present disclosure is directed to inventive methods, systems and apparatus for lighting control. For example, light output of a lighting system (100) that illuminates a play area (101) may be altered, e.g., by a lighting system controller (102, 302), based on characteristics of toys (104) one or more children is playing with in the area, as well as alterations of the toys or relationships between the toys that the one or more children are effecting.
Claims
1. A method for controlling a lighting system having one or more LEDs, comprising: receiving, at a lighting system controller, a signal indicative of a first characteristic of one or more toys present in a play area supplied with ambient light by the lighting system; receiving, at the lighting system controller, a signal indicative of an alteration of the one or more toys; corresponding, by the lighting system controller, with a remote computing system to determine an additional characteristic of the one or more toys based on the first characteristic, said additional characteristic including a color associated with the one or more toys; and energizing, by the lighting system controller, the one or more LEDs of the lighting system to illuminate the play area with light having one or more attributes selected based on the first characteristic of the one or more toys, the additional characteristic of the one or more toys and the alteration of the one or more toys.
2. The method of claim 1, further comprising determining, by the lighting system controller based on the first characteristic of the one or more toys, an identity associated with the toy.
3. The method of claim 2, wherein the determination of the additional characteristic is based on the identity of the toy.
4. The method of claim 1, further comprising: facilitating, by the lighting system controller, an image search via the remote computing system by a search engine associated with the remote computing system by submitting, to the search engine, a search query comprising search criteria that is based on the first characteristic; and selecting, by the lighting system controller, the color based on results of the image search.
5. The method of claim 1, wherein the signal indicative of the alteration of the one or more toys comprises a signal indicative of at least one of a change in proximity or physical contact between two or more toys.
6. The method of claim 1, wherein the signal indicative of an alteration of the one or more toys comprises a signal indicative of a change in orientation of the one or more toys.
7. The method of claim 1, wherein the one or more toys comprises a first toy, and the signal indicative of the alteration of the one or more toys comprises a signal indicative of an addition of a second toy to the play area.
8. The method of claim 1, wherein the signal indicative of the first characteristic of one or more toys present in the play area comprises a signal from an image capture device.
9. The method of claim 1, wherein the signal indicative of the first characteristic of one or more toys present in the play area comprises a wireless signal from a transmitter associated with the one or more toys.
10. The method of claim 1, wherein the one or more attributes are selected further based on a number of lighting units configured to illuminate the play area.
11. The method of claim 1, wherein the one or more attributes are selected further based on a spatial arrangement of lighting units configured to illuminate the play area.
12. The method of claim 1, wherein the one or more attributes are selected further based on light-rendering capabilities of lighting units configured to illuminate the play area.
13. A lighting system, comprising: one or more LEDs; one or more sensors to detect a first characteristic of one or more toys present in a play area illuminated by the lighting system and an alteration of the one or more toys; and a lighting system controller operably coupled with the one or more LEDs and configured to: receive, from the one or more sensors, signals indicative of the first characteristic of one or more toys present in a play area and the alteration of the one or more toys; correspond with a remote computing system to determine an additional characteristic of the one or more toys based on the first characteristic, said additional characteristic including a color associated with the one or more toys; and energize the one or more LEDs of the lighting system to illuminate the play area with light having one or more attributes selected based on the first characteristic, the additional characteristic and the alteration of the one or more toys.
14. The lighting system of claim 13, wherein the lighting system controller is further configured to identify, based on the first characteristic of the one or more toys, an identity associated with the toy.
15. The lighting system of claim 14, wherein the lighting system controller is further configured to correspond with the remote computing system to determine the additional characteristic of the one or more toys based on the identity of the toy.
16. The lighting system of claim 15, wherein the additional characteristic comprises a brightness associated with the one or more toys.
17. The lighting system of claim 13, wherein the lighting system controller is further configured to facilitate an image search via the remote computing system by a search engine associated with the remote computing system by submitting, to the search engine, a search query comprising search criteria that is based on the first characteristic, and wherein the lighting system controller is further configured to select the color based on results of the image search.
18. An apparatus for controlling a lighting system with one or more LEDs, comprising: one or more processors; and memory operably coupled with the one or more processors and containing instructions that, by execution of the instructions by the one or more processors, cause the one or more processors to: receive, from one or more sensors, signals indicative of a first characteristic of one or more toys present in a play area illuminated by the lighting system and an alteration of the one or more toys; correspond with a remote computing system to determine an additional characteristic of the one or more toys based on the first characteristic, said additional characteristic including a color associated with the one or more toys; and energize the one or more LEDs of the lighting system to illuminate the play area with light having one or more attributes selected based on the first characteristic, the additional characteristic and the alteration of the one or more toys.
19. The apparatus of claim 18, wherein the signal indicative of the alteration of the one or more toys comprises a signal indicative of at least one of a change in proximity or physical contact between two or more toys.
20. The apparatus of claim 18, wherein the instructions, by the execution of the instructions by the one or more processors, cause the one or more processors to: facilitate an image search via the remote computing system by a search engine associated with the remote computing system by submitting, to the search engine, a search query comprising search criteria that is based on the first characteristic, and select the color based on results of the image search.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention.
DETAILED DESCRIPTION
(5) Many conventional lighting systems and fixtures incorporate light sources such as LEDs that can be selectively energized to emit light having various attributes. However, light output of such systems and fixtures is typically controlled using interfaces such as a wall-mounted interface and/or a smart phone or tablet computer. Some lighting systems automatically control light output based on parameters such as time of day, product placement in a display, or user interest in a displayed product. However, Applicants have recognized and appreciated that it would be beneficial to configure a lighting system to provide light output with one or more attributes selected based on user interaction with one or more tangible physical objects, such as, for example, toys, in a play area, e.g., to provide ambient, accent, spot or other types of illumination that enhances a child's experience playing with the toys. In view of the foregoing, various embodiments and implementations of the present invention are directed to energizing one or more light sources of a lighting system to emit light having one or more attributes selected based on characteristics of one or more toys being played with and/or alterations of the one or more toys.
(6) Referring to
(7) In various embodiments, lighting system controller 102 may be a computing device such as a bridge component that is configured to communicate with LEDs 104a-d using various wired and/or wireless technologies, including but not limited to Ethernet, WiFi, coded light, ZigBee, Bluetooth, RFID, NFC, and so forth. In various embodiments, lighting system controller 102 may be controlled by an onboard user interface, or it may be controlled by a remote device such as a smart phone 106 or a tablet computer 108. In some embodiments, lighting system controller 102 may be integral with smart phone 106 and/or tablet computer 108, or even with another computing device (not depicted in
(8) Various sensors 112 may be in communication with lighting system controller 102 and/or other computing devices (e.g., smart phone 106, tablet computer 108), and may be configured to detect and provide signals indicative of characteristics of and/or alterations to toys 110a-c. For instance, referring to
(9) In various embodiments lighting system controller 102 may be configured to receive, e.g., from sensors 112a-d, various signals indicative of various characteristics and/or alterations of toys 110a-c. These signals may come in various forms. In some embodiments, such as where the sensor 112 is an image capture device such as a camera (e.g., 112a and 112b), a signal may come in the form of a signal carrying digital image data captured by the camera. Image processing may be performed on the image data carried in the signal, e.g., by lighting system controller 102 or another computing device such as smart phone 106, tablet computer 108, or another remote computing device (see, e.g.,
(10) In some embodiments, a toy may be equipped with a visual indicator such as a bar code or QR code. One or more sensors such as sensor 112a or 112b (e.g., image capture devices that may act as both cameras and barcode/QR code readers) may obtain information about one or more characteristics of one or more toys from the visual indicator. In some embodiments, one or more toys 110 may be equipped with a transmitter (e.g., WiFi, Bluetooth, RFID, NFC, coded light, etc.). In such case, a sensor 112 may obtain information wirelessly from the transmitter associated with the one or more toys.
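Reading toy characteristics from such a tag or transmitter could be sketched as follows. This is a hypothetical illustration only: the `key=value;key=value` payload format and the field names are assumptions, not part of the disclosure.

```python
def parse_toy_beacon(payload: str) -> dict:
    """Parse a hypothetical 'key=value;key=value' payload, as might be
    read from a toy's QR code or NFC tag, into a characteristics dict."""
    fields = dict(part.split("=", 1) for part in payload.split(";") if "=" in part)
    return {
        "toy_id": fields.get("id"),
        "color": fields.get("color"),
        "kind": fields.get("kind"),
    }

# Example payload as a sensor 112 might report it to the controller:
info = parse_toy_beacon("id=robot-42;color=orange;kind=robot")
```

A controller receiving such a dictionary could use `toy_id` directly when present, or fall back to the other fields as identity clues.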
(11) Signals lighting system controller 102 receives from sensors 112a-d may be indicative of various things. For example, lighting system controller 102 may receive a signal indicative of a characteristic of one or more of toys 110a-c present in play area 101. Toys such as plurality of toys 110a-c may have various characteristics, including but not limited to identity, color(s), size, shape, configuration (e.g., position of movable limbs, clothing worn by toy, weapon carried by toy), proximity to other toys, orientation (relative to play area 101 or other toys), various levels of genus and species (e.g., animal→mammal→ape→gorilla), and so forth.
(12) Additionally or alternatively, lighting system controller 102 may receive a signal indicative of an alteration of the one or more toys. For example, assume first and second toy 110a and 110b have NFC transceivers that are configured to detect one another when those toys are brought within a predetermined proximity of each other (e.g., within NFC range). On such detection, one or both toys may emit a signal indicative of the toys' proximity or a change thereof. That signal may be received by one or more sensors 112 and communicated to lighting system controller 102, or lighting system controller 102 itself may receive the signal directly. As another example, lighting system controller 102 may receive a signal indicative of an addition of one or more toys 110 to play area 101. For instance, a camera (e.g., 112a or 112b) of a portable computing device may detect visually when third toy 110c is introduced to play area 101. As another example, the signal may be indicative of a change in orientation of the one or more toys, alone or relative to another toy. For instance, sensor 112 such as first sensor 112a may detect that a first toy representing a female is turned by a child to face a second toy representing a male (suggesting romance).
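The proximity-change detection described above could be sketched as a simple edge detector over toy positions. This is a minimal illustration under stated assumptions: the threshold value, the event names, and the use of planar coordinates are hypothetical, not taken from the disclosure.

```python
import math

NFC_RANGE_CM = 10.0  # assumed detection range; the actual range is not specified

def proximity_event(pos_a, pos_b, was_in_range):
    """Return an alteration event name (or None) when two toys cross the
    proximity threshold, plus the updated in-range state."""
    in_range = math.dist(pos_a, pos_b) <= NFC_RANGE_CM
    if in_range and not was_in_range:
        return "toys_brought_together", in_range
    if was_in_range and not in_range:
        return "toys_separated", in_range
    return None, in_range
```

Only threshold crossings produce events, so the controller is notified of changes in proximity rather than on every sensor reading.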
(13) In various embodiments, a signal indicative of an alteration of the one or more toys may include a signal indicative of physical contact between two or more toys. For instance, instead of a robot and toy cube, assume that first and second toys 110a and 110b are two toy blocks. When one or more sensors 112 (e.g., 112d) detects that those two blocks make physical contact with each other, the one or more sensors may transmit a signal to lighting system controller 102. In addition to physical contact, in embodiments where toys include interlocking building blocks, signals indicative of two or more interlocking blocks being secured together or connected could be provided to lighting system controller 102, e.g., by one or more sensors 112. In some embodiments, a special block that is configured to communicate with lighting system controller 102 (e.g., via Bluetooth, WiFi, NFC, coded light, etc.) may be added to a construction to cause a particular lighting scene to be created by lighting system 100. For instance, when building a castle with a special castle-themed block, altering a connected catapult block may cause lighting system 100 to initiate a dynamic, battle-themed lighting scene (e.g., “castle siege”). Altering the catapult in a different way (e.g., moving it away from a wall) may cause lighting system 100 to initiate a “peaceful” lighting scene.
(14) Physical contact between toys other than blocks, such as physical contact between vehicles or action figures, may also be detected, e.g., by one or more sensors 112. Or, for younger children, appropriate placement of shaped blocks into similarly-shaped recesses may be detected, e.g., by one or more sensors 112.
(15) Physical contact between toys may be detected by one or more sensors 112 in various ways. In some embodiments, physical contact between toys may be detected by sensors on the toys themselves (e.g., 112d). For instance, a capacitive sensor on one or more building blocks may detect changes in capacitance of that block occurring in response to physical contact with other blocks. Additionally or alternatively, toys may be equipped with NFC components that may be activated when the toys are in physical contact. In various embodiments, sensors 112 on the toys may provide a signal indicative of physical contact and/or interconnection between toys to lighting system controller 102, either directly (e.g., via RFID, Bluetooth, NFC if they're close enough, coded light, etc.) or indirectly, e.g., via transmitters on other toys. In other embodiments, sensors 112 separate from toys may detect physical contact between toys. For example, image capture devices such as 112a or 112b may visually detect physical contact between toys. In some cases, one or more sensors 112 in the form of a pressure wave sensor (e.g., microphone) may listen for a noise that results from physical contact between two or more toys, such as an alarm or other noise raised by one or both toys (e.g., when a toddler places the correct block in the correct hole).
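The capacitive contact detection described above could be sketched as a rising-edge detector over a stream of capacitance readings. The baseline and threshold values below are hypothetical assumptions for illustration; the disclosure does not specify them.

```python
def contact_events(readings_pf, baseline_pf, delta_pf=2.0):
    """Return indices at which a block's capacitance first rises at least
    delta_pf above baseline, i.e., moments when new physical contact is made."""
    touching = False
    events = []
    for i, reading in enumerate(readings_pf):
        now_touching = (reading - baseline_pf) >= delta_pf
        if now_touching and not touching:
            events.append(i)  # rising edge: contact just made
        touching = now_touching
    return events
```

Tracking the previous state means a sustained contact is reported once, rather than on every reading while the blocks stay together.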
(16) In response to signals such as those described above, lighting system controller 102 may be configured to energize one or more LEDs 104a-d of lighting system 100 to provide play area 101 with light having one or more attributes selected based on a characteristic of one or more toys 110a-c and/or an alteration of one or more toys 110a-c. Attributes of light (ambient or otherwise) that may be selected include but are not limited to hue, temperature, saturation, brightness, intensity, dynamic lighting effects and sequences, and so forth.
(17) For instance, if lighting system controller 102 determines, e.g., based on one or more signals from sensors 112, that a toy introduced into play area 101 is associated with an evil character, lighting system controller 102 may cause one or more LEDs 104 to emit light with various dynamic lighting effects, such as to emulate flashing lightning or to emit a dark color. As another non-limiting example, if lighting system controller 102 determines, e.g., based on one or more signals from sensors 112, that two or more toys in play area 101 are based on aquatic life forms (fictional or nonfictional), lighting system controller 102 may cause one or more LEDs 104 to emit light having one or more attributes associated with aquatic life, such as a blue color. As another example, if lighting system controller 102 determines, e.g., based on one or more signals from sensors 112, that a projectile toy such as a grenade or missile has been launched, or that a toy configured to mimic being destroyed has in fact been manipulated by a child to mimic such destruction, lighting system controller 102 may cause one or more LEDs 104 to emit a dynamic lighting sequence (e.g., flashing light) to emulate an explosion. As yet another non-limiting example, if lighting system controller 102 determines, e.g., based on one or more signals from sensors 112, that male and female toys are oriented towards each other, lighting system controller 102 may cause one or more LEDs 104 to emit romantic light. As yet another non-limiting example, if lighting system controller 102 determines, e.g., based on one or more signals from sensors 112, that a toddler has correctly placed a toy having a particular shape into a hole having the same shape, lighting system controller 102 may cause one or more LEDs 104 to emit light with congratulatory attributes (e.g., excited blinking, flashing, encouraging color, etc.).
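The examples above amount to a mapping from detected events to light attributes. As a minimal sketch, assuming hypothetical event names and RGB values (none of which appear in the disclosure), such a rule table might look like:

```python
# Hypothetical rule table mapping detected toy characteristics/alterations
# to light attributes; the event names and RGB values are illustrative only.
SCENE_RULES = {
    "evil_character":    {"color": (64, 0, 96),    "effect": "lightning_flash"},
    "aquatic_toys":      {"color": (0, 105, 148),  "effect": "slow_wave"},
    "projectile_launch": {"color": (255, 140, 0),  "effect": "explosion_flash"},
    "romantic_pose":     {"color": (255, 105, 180), "effect": "soft_dim"},
    "shape_matched":     {"color": (0, 200, 0),    "effect": "celebration_blink"},
}

def select_attributes(event):
    """Look up light attributes for a detected event, with a neutral default."""
    return SCENE_RULES.get(event, {"color": (255, 255, 255), "effect": "steady"})
```

A table-driven design like this keeps the event-to-scene policy in data, so new toys or scenes could be added without changing the control logic.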
(18) As mentioned previously, in various embodiments, the signal indicative of an alteration of the one or more toys may include a signal indicative of an addition of an additional toy to the play area. In some such embodiments, the signal is indicative of a characteristic shared between a newly added toy and toys already in play area 101. For instance, if the shared characteristic of the first and second toys is that both are orange, lighting system controller 102 may energize one or more LEDs to emit light having a color complementary to orange, or even orange light. As another example, the characteristic shared between the first and second toys may be an environment inhabited by fictional or nonfictional organisms or characters on which the first and second toys are based, such as a jungle. In such case, lighting system controller 102 may energize one or more LEDs 104 to emit light having attributes associated with a jungle, such as green.
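One simple way to derive a complementary color, offered here purely as an illustrative sketch (the disclosure does not specify how complements are computed), is to reflect each RGB channel:

```python
def complementary(rgb):
    """Complement of an RGB color: each channel reflected within 0-255."""
    return tuple(255 - channel for channel in rgb)

ORANGE = (255, 165, 0)
# Reflecting each channel of orange yields (0, 90, 255), a blue hue.
```

Other color models (e.g., rotating hue by 180° in HSV) would give perceptually different complements; the channel reflection above is merely the simplest arithmetic choice.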
(19) As noted above, in various embodiments, lighting system controller 102 may receive a signal indicative of an identity associated with the toy. In some embodiments, the signal may contain sufficient information for lighting system controller 102 to identify the toy without further action. For example, if the toy has an RFID transceiver or QR code, a sensor 112 may be able to obtain sufficient data from the toy to identify it (e.g., model or serial number, the name of a character on which the toy is based, etc.). In other embodiments, however, the signal may only contain a clue about the toy's identity. In such embodiments, lighting system controller 102 may be configured to take additional action, such as corresponding with a remote computing system over one or more networks 114 (e.g., the Internet), to determine the toy's identity based on the received clue.
(20) Once lighting system controller 102 has the toy's identity, in various embodiments, lighting system controller 102 may correspond with a remote computing system, e.g., over one or more networks 114, to determine an additional characteristic of the one or more toys based on the identity of the toy. In various embodiments, lighting system controller 102 may then energize one or more LEDs 104 to emit ambient light having one or more attributes selected based on the additional characteristic of the one or more toys.
(21) For instance, once the toy is identified, lighting system controller 102 may determine a color associated with the toy's identity. To determine the color, in some embodiments, lighting system controller 102 may facilitate an image search by a search engine 116. In some such embodiments, lighting system controller 102 may select a color of light to be emitted by one or more LEDs 104 based on results of the image search. In other such embodiments, lighting system controller 102 may consult a remote database server 118, e.g., provided by the toy's manufacturer or compiled by enthusiasts, that stores predefined light attributes to be selected by lighting systems for use when particular toys or combinations of toys are in play. For instance, a toy manufacturer may host on remote database server 118 a portal with a predetermined light scene that should be utilized when two or more of its toys are in play. As another example, the portal may have a predetermined light scene that should be utilized when a particular combination of toys are in play.
(22) In various embodiments, lighting system controller 102 may be unable to identify the toy. In such case, lighting system controller 102 may energize one or more LEDs 104 to emit light having attributes selected based on other criteria. For example, lighting system controller 102 may receive a signal from one or more sensors 112 indicative of a color of the unidentifiable toy. Lighting system controller 102 may energize one or more LEDs 104 to emit light of a similar color, or of a complementary color. If two or more toys are present in play area 101, and the toys have different colors, then lighting system controller 102 may energize one or more LEDs 104 to emit a mixture of those toys' colors, or may energize one LED to emit one color and another LED to emit another color. In some embodiments, if the particular identity of a toy is unattainable but a broader genus, or type, of the toy is attainable, lighting system controller 102 may facilitate an image search using that genus or type as a query, and may select a light color based on the results.
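Mixing the colors of several toys could be sketched as a per-channel average; this is one hypothetical mixing rule among many, not the method the disclosure prescribes:

```python
def mix_colors(colors):
    """Average the RGB colors of the toys present in the play area."""
    n = len(colors)
    return tuple(round(sum(color[i] for color in colors) / n) for i in range(3))
```

For example, a red toy and a blue toy would yield a purple mixture, while the alternative mentioned in the text, driving each LED with one toy's color, would preserve both colors unmixed.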
(23) In addition to the examples described previously, in various embodiments, lighting system controller 102 may select one or more attributes of light to be emitted by one or more LEDs 104 based on signals from one or more sensors 112 indicative of actions taken by a user while wearing one or more wearable toys. For instance, one or more sensors 112 may detect that multiple children are wearing costumes associated with fictional or nonfictional characters that inhabit a particular habitat. Based on signals from these sensors, lighting system controller 102 may take various actions, such as energizing one or more LEDs 104 to emit light having one or more attributes associated with that habitat.
(24) As another example, one or more sensors 112 may detect that a child wearing a particular wearable toy is moving quickly and/or in rhythm (e.g., dancing). Based on signals from these sensors, lighting system controller 102 may take various actions, such as energizing one or more LEDs 104 to emit light having one or more attributes associated with the child's activity (e.g., mimic dance floor lighting). In some embodiments, lighting system controller 102 may energize one or more LEDs 104 in sync with the child's movement, to enhance the child's experience while wearing the wearable toy.
(25) As another example, one or more sensors 112 may detect that a child is wearing a particular wearable toy in the form of a puppet on her hand. One or more sensors 112 may also detect that the child is playing with the puppet on a toy stage. Based on signals from these sensors, lighting system controller 102 may take various actions, such as energizing one or more LEDs 104 to emit light having one or more attributes associated with a performance being enacted by the child with the puppet.
(26) In various embodiments, in addition to acting as sensors (or in some cases as lighting system controller 102), smart phone 106 and/or tablet computer 108 may be integrated with child's play. For instance, in some embodiments, smart phone 106 and/or tablet computer 108 may render, e.g., on a touch screen display, images that are related to the child's activity. For instance, a prop for a play being performed by the child with one or more puppets may be displayed. Additionally or alternatively, in various embodiments, smart phone 106 and/or tablet computer 108 may, e.g., based on signals from one or more sensors, provide audio to enhance a child's playing experience. For instance, smart phone 106 and/or tablet computer 108 may emit the sound of thunder to accompany “lightning” produced by lighting system 100.
(27) In various embodiments, lighting system controller 102 may be configured to select one or more attributes of the light emitted by one or more LEDs 104 based on information other than signals from sensors 112. For instance, in some embodiments, lighting system controller 102 may select one or more light attributes based on a number of lighting units configured to illuminate play area 101, types of and/or light-rendering capabilities of lighting units (e.g., incandescent, retrofit LED, LED strip, fluorescent bulb, etc.) utilized to illuminate play area 101, and/or a spatial arrangement of lighting units configured to illuminate play area 101. In other embodiments, lighting system controller 102 may first select the one or more light attributes based solely on signals received from sensors 112, and may then alter the selected attributes based on the number of LEDs 104 present.
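Adapting a selected scene to however many lighting units happen to be present could be sketched as a round-robin assignment. The unit identifiers and palette below are hypothetical; the disclosure leaves the adaptation strategy open.

```python
def assign_scene(unit_ids, palette):
    """Distribute a palette of scene colors across the available lighting
    units round-robin, so the scene adapts to the number of units present."""
    return {uid: palette[i % len(palette)] for i, uid in enumerate(unit_ids)}
```

With three units and a two-color palette, the first color simply repeats; a controller aware of spatial arrangement could instead assign colors by position.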
(29) At block 202, a signal indicative of a characteristic of one or more toys may be received, e.g., from an image capture device or wireless receiver. For instance, a camera may capture a shape, color, size or other characteristic of a toy, and provide that information to lighting system controller 102. In embodiments where the toy provides identifying information using some sort of identifier transmitter or visual marking (e.g., RFID tag, NFC tag, QR code, bar code, etc.), another type of sensor 112, such as an RFID or NFC transceiver, or a QR or bar code reader, may capture the identifying information and provide it to lighting system controller 102.
(30) At block 204, a signal indicative of an alteration of one or more toys may be received, e.g., by lighting system controller 102. For example, one or more sensors 112 may detect, e.g., visually using image capture technology (e.g., camera) or otherwise (e.g., by monitoring a beacon on the toy), that a single toy is reoriented or otherwise manipulated. Or, if there are multiple toys present in play area 101, one or more sensors 112 may detect, e.g., visually or otherwise, that two or more toys are repositioned relative to one another, brought into physical contact or even interconnected with each other.
(31) At block 206, an identity associated with the toy may be determined, e.g., by lighting system controller 102, based on the characteristic of the toy received in the signal at block 202. An identity associated with a toy may include an identifier having any combination of computer readable numbers, characters or symbols. In various embodiments, the identity associated with a particular toy may not be unique to that toy, but rather may be the identity of a fictional or nonfictional character on which that toy is based. Thus, there may be multiple copies of the same toys that have the same identifier. In some instances, multiple versions of toys may be based on a single fictional or nonfictional character. For instance, one toy may include clothing appropriate for the jungle, whereas another toy based on the same character may include clothing appropriate for the tundra. In such a scenario, lighting system controller 102 may ultimately select one or more attributes of ambient lighting based on both the identity associated with the toy and the outfit the toy is wearing. Playing with the jungle toy version of the character may cause jungle-themed ambient lighting to be emitted by lighting system 100. Playing with the tundra toy version of the character may cause tundra-themed ambient lighting to be emitted by lighting system 100.
(32) At block 208, lighting system controller 102 may correspond with a remote computing system to determine an additional characteristic of the toy. In some instances, lighting system controller 102 may initially receive a signal from one or more sensors 112 that is indicative of a characteristic of the toy that is insufficient to identify the toy. However, that characteristic may at least offer a clue of the toy's identity. In such case, lighting system controller 102 may correspond with a remote computing system hosted by, e.g., a toy manufacturer, to inquire about an identity of a toy that has the particular characteristic. Thus, for instance, if a toy having a particular color or brightness is detected, lighting system controller 102 may correspond with a toy manufacturer's computing system to determine that a particular toy is the only toy having that color or brightness.
(33) In other instances, lighting system controller 102 may correspond with a remote computing system hosting a search engine to perform an image search. Based on results from the image search, lighting system controller 102 may select one or more colors to be emitted by one or more LEDs 104. For instance, assume lighting system controller 102 ascertains an identity associated with a toy based on a signal received from one or more sensors 112. That identity may be used as a query in the image search. Lighting system controller 102 may select one or more colors from the resulting images, such as a predominant color or colors, or even a plurality of the most common (e.g., ranked) colors, to be emitted by one or more LEDs 104. For instance, if blue is the most common color found in images returned from the image search, and orange is the second most common color, then lighting system 100 may emit predominantly blue light with an orange accent.
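Ranking the most common colors across result images could be sketched with a frequency count over sampled pixels. This is a minimal illustration; a practical implementation would likely quantize colors into bins first, which is an assumption beyond what the disclosure states.

```python
from collections import Counter

def ranked_colors(pixels, top_n=2):
    """Return the top_n most frequent colors among sampled result-image
    pixels, most common first."""
    return [color for color, _ in Counter(pixels).most_common(top_n)]
```

In the blue-and-orange example from the text, the first-ranked color would drive the predominant light and the second-ranked color the accent.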
(34) At block 210, which may be performed in addition to or instead of block 208, lighting system controller 102 may correspond with a remote computing system, such as one hosted by or associated with a toy manufacturer, to determine a predefined lighting scene or sequence associated with a detected toy characteristic and/or alteration. For example, the toy manufacturer computing system or another remote computing system may provide a predefined lighting scheme that is to be emitted by lighting systems when a particular toy they manufacture is in play.
(35) At block 212, lighting system controller 102 may determine a configuration (e.g., count of light sources, physical characteristics and/or capabilities of light sources, spatial arrangement of light sources, etc.) of one or more light sources under its control. For instance, lighting system controller 102 in
(36) In various embodiments, lighting system controller 102 may determine the configuration of one or more light sources under its control before and/or after it selects one or more light attributes to emit through those light sources. In some instances, the configuration of the one or more light sources may affect which light attributes lighting system controller 102 selects. In other instances, the light attributes selected by lighting system controller 102 may be independent of the configuration of light sources under its control. In such a case, lighting system controller 102 may selectively energize one or more LEDs 104 to emit light having the selected attributes based in part on the configuration of the one or more LEDs 104.
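One way the configuration determined at block 212 can influence energizing is by mapping ranked colors onto whatever fixtures are actually present. The fixture descriptions below are hypothetical; they merely illustrate attribute selection adapting to light-source capabilities:

```python
def assign_attributes(fixtures: list[dict], colors: list[str]) -> dict:
    """Map ranked colors onto fixtures in order; fixtures that cannot
    render color fall back to white, and surplus color-capable fixtures
    reuse the primary color."""
    plan = {}
    color_iter = iter(colors)
    for fixture in fixtures:
        if fixture["color_capable"]:
            plan[fixture["id"]] = next(color_iter, colors[0])
        else:
            plan[fixture["id"]] = "white"
    return plan

# Hypothetical configuration: two color-capable fixtures, one white-only.
fixtures = [
    {"id": "ceiling", "color_capable": True},
    {"id": "strip", "color_capable": True},
    {"id": "desk", "color_capable": False},
]
plan = assign_attributes(fixtures, ["blue", "orange"])
# plan == {"ceiling": "blue", "strip": "orange", "desk": "white"}
```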
(37) While all the light sources shown in
(38) At block 214, lighting system controller 102 may energize one or more LEDs 104, or may facilitate energizing of one or more LEDs 104, based on various signals. These signals may include but are not limited to one or more of the characteristic of one or more toys received at block 202, the alteration of one or more toys received at block 204, a toy identity determined at block 206, another toy characteristic obtained from a remote computing system at block 208, a predetermined lighting scene obtained at block 210, and/or a configuration of one or more light sources under the control of lighting system controller 102 determined at block 212.
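The resolution of these several signals into a single command at block 214 can be sketched with a simple precedence: a manufacturer-provided scene wins over colors looked up remotely, which win over the raw sensed characteristic. The precedence order itself is an assumption for illustration, not mandated by the disclosure:

```python
def select_light_command(signals: dict) -> dict:
    """Resolve block-214 inputs into one command, preferring the most
    specific available source."""
    if signals.get("scene"):
        # A predefined scene from block 210 is used verbatim.
        return dict(signals["scene"])
    if signals.get("colors"):
        # Ranked colors from block 208: take the predominant one.
        return {"color": signals["colors"][0], "effect": "steady"}
    # Fall back to whatever characteristic the sensors reported.
    color = signals.get("characteristic", {}).get("color", "white")
    return {"color": color, "effect": "steady"}
```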
(40) In various embodiments, one or more processors 320 may include one or more microprocessors configured to execute instructions stored, e.g., in memory 322, to perform selected aspects of method 200. In various embodiments, communication interface 324 may implement various technologies to communicate with other computing devices and/or lighting units, e.g., over one or more computer networks 326. Communication technologies that may be implemented by communication interface 324 include but are not limited to WiFi, Ethernet, Bluetooth, RFID, NFC, ZigBee, coded light, and so forth.
(41) As described above, lighting system controller 302 may correspond, e.g., via communication interface 324, with various remote computing systems to determine one or more attributes of ambient light to be emitted. In some embodiments, and as described above, lighting system controller 302 may correspond with a search engine 316, e.g., to obtain one or more colors from one or more image search results. In some embodiments, and as described previously, lighting system controller 302 may correspond with a manufacturer's database 318, e.g., to obtain one or more predefined ambient light attributes and/or a predefined lighting scene to be implemented when particular toys of that manufacturer are in play. In some embodiments, lighting system controller 302 may correspond with other databases 332, such as databases established by enthusiasts that store custom lighting scenes, e.g., to obtain one or more ambient light attributes to be implemented when, for instance, toys from different manufacturers are in play simultaneously.
(42) While remote computing systems such as search engines (116, 316), manufacturer's databases (118, 318) and so forth are shown as being remotely located from lighting system controller 302, this is not meant to be limiting. In some embodiments, lighting system controller 302 may include, e.g., in memory 322, a database of toy identities and associated light attributes. In various embodiments, the information in this database may be obtained from manufacturers, e.g., over the Internet. In various embodiments, the database may be populated manually by a user, such as a child's parent or the child him or herself, e.g., using smart phone 106 or tablet computer 108. In some embodiments, a toy may come with preprogrammed light attribute data which may be automatically (e.g., when brought into NFC range) or manually provided to lighting system controller 302, e.g., via communication interface 324.
(43) While several inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
(44) All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
(45) The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
(46) The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
(47) As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.
(48) As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
(49) It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited. Also, reference numerals appearing in the claims between parentheses, if any, are provided merely for convenience and should not be construed as limiting in any way.
(50) In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.