Visible light communication detecting and/or decoding

11817900 · 2023-11-14

Abstract

A mobile system (1,11) comprises a camera (3) and at least one processor (5). The at least one processor is configured to capture an image with the camera, e.g. of a light beacon (15), to determine whether a certain object is recognized in the captured image, e.g. by querying a database (19), and to start detecting and/or decoding data being communicated via visible light in dependence on the certain object being recognized.

Claims

1. A mobile system configured to detect data communication via visible light emitted by a light source, the mobile system comprising:
a camera; and
at least one processor configured
to capture an image with said camera,
to determine whether any object of a plurality of objects is recognized in said captured image and to start detecting whether data is being communicated via visible light associated with any object of the plurality of objects, regardless of which object of the plurality of objects is recognized in said captured image by the at least one processor, only after said recognized object has been recognized,
to obtain a decryption key associated with said recognized object,
to change a setting of said camera to start said detecting whether data is being communicated via visible light in response to determining that said recognized object is recognized,
to decode at least part of said data which is communicated via visible light using said obtained decryption key,
to activate a further sensor to start said detecting of whether data is being communicated in response to determining that said recognized object is recognized, and
to monitor whether a signal received from said further sensor comprises data communicated via visible light and to deactivate said further sensor in response to determining that said signal does not comprise data communicated via visible light, said further sensor having a wider view than said camera.

2. The mobile system as claimed in claim 1, wherein said at least one processor is configured to provide information to a remote database which allows said remote database to determine what information to provide to said mobile system in relation to the recognized object.

3. The mobile system as claimed in claim 1, wherein said at least one processor is configured to obtain protocol information associated with said recognized object and to decode at least part of said data which is communicated via visible light using said obtained protocol information.

4. The mobile system as claimed in claim 1, wherein said at least one processor is configured to obtain region information associated with said recognized object, said region information identifying a region of said recognized object, and to analyze only said region of said recognized object in the captured image to detect and/or decode said data which is being communicated via visible light.

5. The mobile system as claimed in claim 1, wherein said at least one processor is configured to capture a further image with said camera, to determine whether the recognized object is recognized in said captured further image and to stop detecting data being communicated via visible light in response to determining that the recognized object is not recognized in said captured further image.

6. The mobile system as claimed in claim 1, wherein said at least one processor is configured to determine whether any object of a plurality of objects is recognized in said captured image by recognizing a certain shape or a certain pattern associated with the recognized object in said captured image.

7. A method of visible light communication detecting, comprising:
capturing an image with a camera;
determining whether any object of a plurality of objects is recognized in said captured image;
starting detecting whether data is being communicated via visible light emitted by a light source associated with any object of the plurality of objects, regardless of which object of the plurality of objects is recognized in said captured image by said determining, only after said recognized object has been recognized;
changing a setting of said camera to start said detecting whether data is being communicated via visible light in response to determining that said recognized object is recognized;
activating a further sensor to start said detecting whether data is being communicated via visible light communication in response to determining that said recognized object is recognized, said further sensor having a wider view than said camera, and
monitoring whether a signal received from said further sensor comprises data communicated via visible light, and deactivating said further sensor in response to determining that said signal does not comprise data communicated via visible light.

8. A non-transitory computer readable medium comprising at least one software code portion, the software code portion, when run on a computer system, being configured for causing the computer system to perform the method of claim 7.

9. The mobile system as claimed in claim 1, wherein said at least one processor is configured to determine whether any object of a plurality of objects is recognized in said captured image by querying a remote database.

10. The method as claimed in claim 7, wherein the determining whether any object of a plurality of objects is recognized in said captured image is implemented by querying a remote database.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:

(2) FIG. 1 is a block diagram of embodiments of the mobile system of the invention;

(3) FIG. 2 is a flow diagram of the method of the invention;

(4) FIG. 3 is a flow diagram of a first embodiment of the method of the invention;

(5) FIG. 4 is a flow diagram of a second embodiment of the method of the invention;

(6) FIG. 5 is a flow diagram of a third embodiment of the method of the invention;

(7) FIG. 6 is a flow diagram of a fourth embodiment of the method of the invention;

(8) FIG. 7 shows an example of a captured image to illustrate the method of the invention;

(9) FIG. 8 shows an example of a recognized object to illustrate the method of the invention;

(10) FIG. 9 shows an example of a region of a recognized object to illustrate the embodiment of the method of FIG. 3;

(11) FIG. 10 shows an example of a light beacon that comprises a recognizable light source;

(12) FIG. 11 shows an example of a light beacon that comprises a recognizable pattern; and

(13) FIG. 12 is a block diagram of an exemplary data processing system for performing the method of the invention.

(14) Corresponding elements in the drawings are denoted by the same reference numeral.

DETAILED DESCRIPTION OF THE DRAWINGS

(15) FIG. 1 shows two embodiments of the mobile system of the invention: a mobile system 1 and a mobile system 11. Mobile systems 1 and 11 each comprise a camera 3 and a processor 5. The processor 5 is configured to capture an image with the camera 3, to determine whether a certain object, e.g. a light beacon 15, is recognized in the captured image and to start detecting and/or decoding data being communicated via visible light in dependence on the certain object being recognized.

(16) The processor 5 may be configured to detect from the captured image, from another image captured by the camera 3 and/or from a signal received from a further sensor, e.g. from a further camera, whether data is being communicated via visible light. A camera sensor reads many pixels, but typically at a low rate (e.g. 25 frames per second), whereas visible light communication should be invisible to the eye and is therefore emitted at a much higher rate (200 Hz up to the MHz range). In order to detect the fluctuations with a camera, a special camera may be used, e.g. a camera that uses a combination of slow camera pixels interlaced with fast diode pixels. The camera pixels detect the location of a fast fluctuating light source, and the diode pixels then read out the sensed fluctuations at high speed. It may also be possible to use an existing rolling shutter camera, which is a common type of camera in smartphones. With a rolling shutter camera, there is a small delay between the read-out of the different rows. This means that a fast fluctuating source (or light effect) delivers a different intensity on each camera row, resulting in black and white stripes. When taking a picture, the camera processing might compensate for this.
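The rolling-shutter stripe effect described above can be sketched as a simple row-statistics test. The frame representation (a 2-D intensity array) and the threshold value are illustrative assumptions of this sketch, not part of the description:

```python
import numpy as np

def detect_rolling_shutter_stripes(frame, threshold=10.0):
    """Detect the banding a fast-modulated light source leaves on a
    rolling-shutter frame.

    Each sensor row is exposed at a slightly different time, so a source
    flickering far above the frame rate appears as alternating bright and
    dark horizontal stripes.  An unmodulated scene has a smooth row-mean
    profile; a striped one shows strong row-to-row variation.

    frame: 2-D array of pixel intensities (rows x columns).
    Returns True when stripe-like modulation is present.
    """
    row_means = frame.mean(axis=1)          # average intensity per row
    row_diffs = np.abs(np.diff(row_means))  # change between adjacent rows
    return float(row_diffs.mean()) > threshold
```

A uniformly lit frame yields near-zero row-to-row differences, while a frame striped by a fast-fluctuating source yields large ones, so a simple mean-difference threshold separates the two cases.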

(17) Mobile systems 1 and 11 may be mobile phones, e.g. smartphones. In the embodiments shown in FIG. 1, mobile systems 1 and 11 both comprise a single device. In a different embodiment, a mobile system may comprise a plurality of devices, e.g. a smartphone may comprise processor 5 and a pair of smart glasses may comprise camera 3.

(18) The processor 5 of the mobile system 1 is configured to change a setting of the camera 3 to start detecting whether data is being communicated via visible light upon determining that the certain object is recognized. The setting may be changed to the setting described in the Luxapose paper (“Luxapose: Indoor Positioning with Mobile Phones and Visible Light”, Kuo et al., MobiCom 2014), for example. Mobile system 1 comprises a communication interface 9. The processor 5 of the mobile system 1 is configured to use the communication interface 9 to query a remote database 19 in order to determine whether a certain object is recognized in the captured image.

(19) When the light beacon 15 is connected to a wired or wireless network (not shown), the light beacon 15 may be informed that its VLC signal is being detected and/or decoded by a mobile system, e.g. informed by the mobile system itself. The light beacon 15 may then alter the color of its light source, for example, as an indication to a user of the mobile system that he may want to point his mobile system in the direction of the light beacon 15, for example. The mobile system may inform the light beacon 15 using ZigBee, Bluetooth or Wi-Fi, for example.

(20) The processor 5 of the mobile system 1 may be configured to use the communication interface 9 to provide information to the remote database 19, e.g. a passkey, user identifier and/or group identifier, which allows the remote database 19 to determine what information to provide to the mobile system 1 in relation to a certain object recognized in the captured image, e.g. in view of access control. Mobile system 1 comprises storage means 7. Storage means 7 may store the captured image and/or further data and programs used by mobile system 1, for example.

(21) Mobile system 11 further comprises a further sensor 13. The processor 5 of the mobile system 11 is configured to activate the further sensor 13 to start detecting whether data is being communicated via visible light upon determining that the certain object is recognized. The further sensor 13 may have a wider view than the camera 3. The further sensor 13 may comprise a further camera or a photo diode, for example. The further camera may be, for example, the camera described in “LED and CMOS Image Sensor Based Optical Wireless Communication System for Automotive Applications”, Takai et al., IEEE Photonics Journal Volume 5, Number 5, October 2013.

(22) The processor 5 of the mobile system 11 may be configured to monitor whether a signal received from the further sensor 13 comprises data communicated via visible light and to deactivate the further sensor 13 upon determining that the signal does not comprise data communicated via visible light. Mobile system 11 comprises storage means 7. Storage means 7 may store the captured image, a database with objects which might transmit VLC signals, and/or further data and programs used by the mobile system 11, for example. Mobile system 11 may comprise a communication interface (not shown), e.g. a communication interface similar to communication interface 9 of mobile system 1.
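The activate/monitor/deactivate behaviour of further sensor 13 might be sketched as follows. The sensor interface (activate/deactivate/read_signal) and the VLC-presence check are assumptions of this sketch, not part of the description:

```python
class FurtherSensorController:
    """Sketch of mobile system 11's handling of further sensor 13:
    activate the wider-view sensor once a known object is recognized,
    monitor its signal, and deactivate it again when the signal carries
    no visible light communication (e.g. to save power)."""

    def __init__(self, sensor, contains_vlc_data):
        self.sensor = sensor                        # hypothetical sensor object
        self.contains_vlc_data = contains_vlc_data  # signal -> bool (assumed check)
        self.active = False

    def on_object_recognized(self):
        # Activate the further sensor upon determining that the
        # certain object is recognized.
        self.sensor.activate()
        self.active = True

    def monitor(self):
        # Poll the sensor; deactivate it when its signal does not
        # comprise data communicated via visible light.
        if not self.active:
            return None
        signal = self.sensor.read_signal()
        if not self.contains_vlc_data(signal):
            self.sensor.deactivate()
            self.active = False
            return None
        return signal
```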

(23) In the embodiment shown in FIG. 1, the mobile systems 1 and 11 comprise one processor 5. In an alternative embodiment, the mobile system comprises multiple processors. The processor 5 of the mobile systems 1 and 11 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor. The processor 5 may run an iOS, Windows or Android operating system, for example. The invention may be implemented using a computer program running on one or more processors.

(24) The communication interface 9 of mobile system 1 may use Wi-Fi, GPRS, EDGE, UMTS, LTE and/or 5G technologies to communicate with a base station 17 of a communication network, for example. The base station 17 may be part of a mobile communication network or of a local area network, for example. The base station 17 may be connected (directly or via other components) to the Internet. The mobile systems 1 and 11 may comprise other components typical for a mobile system, e.g. a battery and a display. The remote database 19 may be a server, e.g. a web server, located on the Internet, for example.

(25) In an embodiment, the processor 5 is configured to obtain protocol information associated with the certain object recognized in the captured image and to decode at least part of the data communicated via visible light using the obtained protocol information. In a different or in the same embodiment, the processor 5 is configured to obtain a decryption key associated with the certain object recognized in the captured image and to decode at least part of the data communicated via visible light using the obtained decryption key.

(26) In an embodiment, the processor 5 is configured to obtain region information associated with the certain object recognized in the captured image, the region information identifying a region of the recognized object, and to analyze only the region of the recognized object in a captured image to detect and/or decode the data being communicated via visible light.

(27) In an embodiment, the processor 5 is configured to capture a further image with the camera 3, to determine whether the certain object is recognized in the captured further image and to stop detecting data being communicated via visible light upon determining that the certain object is not recognized in the captured further image.

(28) In an embodiment, the processor 5 is configured to recognize a certain object in the captured image by recognizing a certain shape or a certain pattern associated with a certain object in the captured image.

(29) The method of visible light communication detecting and/or decoding of the invention comprises at least three steps, see FIG. 2. A step 21 comprises capturing an image with a camera. A step 23 comprises determining whether a certain object is recognized in the captured image. A step 25 comprises detecting and/or decoding data being communicated via visible light in dependence on the certain object being recognized. In an embodiment, an intermediate step between steps 23 and 25 may involve informing the user that a certain object or the certain object has been detected and asking the user whether detecting and/or decoding should commence. In this embodiment, step 25 is performed if the user indicates that detecting and/or decoding should commence. In the same or in a different embodiment, the recognized object is displayed separate from or without other image elements or is highlighted in the captured image. This may make a user aware of the recognized object, and if the data being communicated via visible light is (expected to be) important, he might decide to stop to ensure that the VLC signal remains in view of the camera or in view of a further sensor, for example. This may be done as part of the above-mentioned intermediate step, for example.
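The three steps of FIG. 2 can be sketched as a single pass of a detection loop. The capture, recognition and decoding callables are hypothetical stand-ins for the camera and database interactions described above:

```python
def vlc_detection_loop(capture_image, recognize_object, detect_and_decode):
    """One pass of the method of FIG. 2 (steps 21, 23 and 25), as a sketch.

    All three callables are hypothetical stand-ins:
      capture_image()               -> image           (step 21)
      recognize_object(image)       -> object or None  (step 23)
      detect_and_decode(image, obj) -> data or None    (step 25)

    Step 25 runs only in dependence on an object being recognized.
    """
    image = capture_image()                      # step 21: capture an image
    recognized = recognize_object(image)         # step 23: recognize object?
    if recognized is None:
        return None                              # nothing recognized: skip step 25
    return detect_and_decode(image, recognized)  # step 25: detect/decode VLC data
```

The gating shown here is the core of the method: the comparatively expensive VLC detection and decoding never starts unless recognition in the captured image succeeds first.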

(30) An example of a captured image is shown in FIG. 7. The captured image 71 shows a lighthouse 73 emitting a light beam 75. An image of the lighthouse 73 or one or more features (e.g. the shape) of the lighthouse 73 may be stored in a database with objects that might transmit VLC signals, for example. In step 23, the lighthouse 73 may be recognized as a light beacon in the captured image 71, as shown in FIG. 8, for example.

(31) In a first embodiment of the method, steps 21 and 23 are followed by a step 31, see FIG. 3. Step 31 comprises determining whether region information has been associated with the recognized object. If region information has been associated with the recognized object, a step 33 is performed. After step 33, step 35 is performed. If no region information has been associated with the recognized object, step 33 is skipped.

(32) Step 33 comprises obtaining the region information associated with the certain object recognized in the captured image. The region information identifies a region of the recognized object, preferably a light source of a light beacon. FIG. 9 shows an example of a lighthouse 73, which is recognized as a light beacon, with a region 95 corresponding to the light source of the lighthouse 73.

(33) In the embodiment of FIG. 3, step 25 (see FIG. 2) comprises steps 35 and 37. Step 35 comprises analyzing only the region 95 of the recognized object, i.e. the lighthouse 73, in the captured image 71 to detect data being communicated via visible light. If such data is detected, step 37 is performed next. Step 37 comprises decoding this data. If no such data is detected, the method returns to step 21 instead of proceeding to step 37. After step 37 has been performed, the method returns to step 21.
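The region-restricted analysis of step 35 amounts to cropping the captured image before any further processing. The (top, left, height, width) encoding of the region information is an assumption of this sketch; the description does not prescribe a representation:

```python
import numpy as np

def crop_to_region(frame, region):
    """Sketch of the first part of step 35: restrict analysis to the
    region of the recognized object (e.g. region 95, the light source
    of the lighthouse of FIG. 9).

    frame:  2-D array of pixel intensities (rows x columns).
    region: (top, left, height, width) in pixel coordinates (assumed).
    Returns the sub-image that is subsequently examined for VLC data;
    the remainder of the captured image is never processed.
    """
    top, left, height, width = region
    return frame[top:top + height, left:left + width]
```

Analyzing only the crop reduces the per-frame processing cost, which matters on a battery-powered mobile system.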

(34) In a second embodiment of the method, steps 21 and 23 are followed by a step 41, see FIG. 4. Step 41 comprises determining whether protocol information has been associated with the recognized object. If protocol information has been associated with the recognized object, a step 43 is performed. Step 43 comprises obtaining the protocol information associated with the recognized object. After step 43, step 45 is performed. If no protocol information has been associated with the recognized object, step 43 is skipped.

(35) In the embodiment of FIG. 4, step 25 (see FIG. 2) comprises steps 45 and 47. Step 45 comprises analyzing the captured image to detect data being communicated via visible light. If such data is detected, step 47 is performed next. Step 47 comprises decoding this data. If protocol information was obtained in step 43, this protocol information is used to decode the data being communicated via visible light. If no such data is detected in step 45, the method returns to step 21 instead of proceeding to step 47. After step 47 has been performed, the method returns to step 21.

(36) In a third embodiment of the method, see FIG. 5, steps 21, 23, 41 and 43 are the same as in the second embodiment, but step 25 comprises steps 51 and 53 instead of steps 45 and 47. Step 51 comprises detecting data being communicated via visible light with a further sensor, e.g. a photo diode or a camera dedicated to VLC signal reception. If such data is detected, step 53 is performed next. Step 53 comprises decoding the data detected in step 51 with the further sensor. If protocol information was obtained in step 43, this protocol information is used to decode the data being communicated via visible light. If no such data is detected in step 51, the method returns to step 21 instead of proceeding to step 53. After step 53 has been performed, the method returns to step 21.

(37) In a fourth embodiment of the method, steps 21 and 23 are preceded by a step 51, see FIG. 6. Step 51 comprises detecting with a further sensor, e.g. a photo diode, whether data is being communicated via visible light. In this embodiment, the further sensor may be continuously active, for example. Steps 21 and 23 are performed if data being communicated via visible light is detected in step 51. If no such data is detected, step 51 is repeated.

(38) Steps 21 and 23 are followed by a step 61. Step 61 comprises determining whether a decryption key has been associated with the recognized object. If a decryption key has been associated with the recognized object, a step 63 is performed. Step 63 comprises obtaining the decryption key associated with the recognized object. After step 63, step 65 is performed. If no decryption key has been associated with the recognized object, step 63 is skipped.

(39) In the embodiment of FIG. 6, step 25 (see FIG. 2) comprises a step 65. Step 65 comprises decoding the data detected in step 51 with the further sensor. If a decryption key was obtained in step 63, this key is used to decode the data being communicated via visible light in step 65. After step 65 has been performed, the method returns to step 51.
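Steps 61 to 65 might be sketched as follows. A repeating-XOR stream stands in as a placeholder cipher, since the description does not specify the encryption scheme; only the conditional use of the key mirrors the method:

```python
def decode_vlc_payload(payload, decryption_key=None):
    """Sketch of step 65: decode the data detected by the further
    sensor, applying the decryption key only when one was associated
    with the recognized object and obtained in step 63.

    payload:        bytes detected in step 51.
    decryption_key: bytes, or None when step 63 was skipped.
    The repeating-XOR stream below is purely a placeholder cipher.
    """
    if decryption_key is None:
        return payload  # no key associated: pass the payload through
    return bytes(b ^ decryption_key[i % len(decryption_key)]
                 for i, b in enumerate(payload))
```

Because XOR is its own inverse, the same routine can illustrate both the encoding at the light beacon and the decoding at the mobile system.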

(40) In the example of the captured image 71 shown in FIG. 7, the recognized object is itself not a light source. It may alternatively or additionally be possible to recognize light sources of light beacons. For example, a lamp 101 comprises a light source 103 with the shape of a bunny head, see FIG. 10. The shape of this light source 103 may be recognizable.

(41) Instead of or in addition to recognizing a certain shape associated with a certain object, a certain pattern or texture associated with a certain object may be recognized. For example, a city info sign 111 may comprise a top part 119 with a certain pattern or logo, see FIG. 11. The city info sign 111 further comprises route indicators 113 and 115 and a light source 117 which transmits VLC signals.

(42) FIG. 12 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to FIGS. 2 to 6.

(43) As shown in FIG. 12, the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via the system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.

(44) The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 310 during execution.

(45) Input/output (I/O) devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.

(46) In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in FIG. 12 with a dashed line surrounding the input device 312 and the output device 314). An example of such a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”. In such an embodiment, input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.

(47) A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.

(48) As pictured in FIG. 12, the memory elements 304 may store an application 318. In various embodiments, the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 300 may further execute an operating system (not shown in FIG. 12) that can facilitate execution of the application 318. The application 318, being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.

(49) Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.

(50) The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

(51) The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.