APPARATUSES AND METHODS FOR ILLUMINATING OBJECTS
20200396360 · 2020-12-17
Inventors
CPC classification
H05B47/11
ELECTRICITY
G06T3/08
PHYSICS
Y02B20/40
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
International classification
Abstract
An apparatus (1) for illuminating items comprises an imaging device (2) for capturing an image of an area (A) and generating a first signal (IM) representative of the captured image; a processing unit (3) configured to: receive the first signal (IM), process it through an image recognition and edge detection algorithm to identify one or more shapes (F) delimited by edges (B), the processing unit (3) being further configured to process the captured image by applying a first color inside the edge (B) of each shape (F) and a second color outside the edge (B) of each shape (F), and generate a second signal (EL) representative of the processed image; the apparatus (1) further comprises an image projection device (4) configured to receive the second signal (EL), generate a light beam representative of the processed image and direct the light beam to the area (A).
Claims
1. An apparatus for illuminating objects in an area, said apparatus comprising: an imaging device configured to capture an image of said area and generate a first signal representative of said captured image, a processing unit in signal communication with said imaging device and configured to: receive said first signal, process said first signal by an image recognition and edge detection algorithm to identify one or more shapes bounded by edges in said captured image, and process said captured image by uniformly filling each identified shape with a first color, and uniformly filling the parts of said captured image external to said edges with a second color to generate a second signal representative of the processed image, an image projection device in signal communication with said processing unit and configured to: receive said second signal, and generate a light beam representative of the processed image, directing said light beam to said area.
2. The apparatus of claim 1, wherein: said light beam comprises light areas and dark areas, the portions with said first color in the processed image correspond to said light areas of said light beam, the portions with said second color in the processed image correspond to said dark areas of said light beam, whereby said image projection device illuminates the inside of the edges of the shapes so identified.
3. The apparatus of claim 1, wherein said first color is white and said second color is black.
4. The apparatus of claim 1, wherein: said imaging device is configured to be activated to capture said image of said area under uniform illumination of said area.
5. The apparatus of claim 4, wherein: said processing unit is configured to create a uniform image of a rectangular white background, said image projection device is configured to generate a uniform light beam representative of said background image and direct the uniform light beam on said area, said imaging device is configured to capture the image of said area under uniform illumination of said area by means of said uniform light beam.
6. The apparatus of claim 1, wherein: the light beam generated by said image projection device reproduces the processed image on said area.
7. The apparatus of claim 1, wherein: said image projection device is configured to: detect the distance between said image projection device and each shape identified in said captured image, and adjust the illumination of the light beam generated as a function of the detected distance.
8. The apparatus of claim 1, wherein: said processing unit is also configured to process said first signal to detect the texture of the areas inside the edges of the identified shapes, said second signal comprising information representative of said detected texture, said image projection device is configured to adjust the illumination of the light beam generated according to the detected texture.
9. The apparatus of claim 1, wherein said processing unit is configured to process images by means of the Canny algorithm.
10. A method of illuminating objects in an area, said method comprising the steps of: capturing an image of an area; processing said captured image by an image recognition and edge detection algorithm to identify one or more shapes bounded by edges; uniformly filling each identified shape with a first color, and uniformly filling the parts of said captured image external to said edges with a second color to generate a processed image, generating a light beam representative of the processed image, directing said light beam to said area.
11. The method of claim 10, wherein: said light beam comprises light areas and dark areas, the portions with said first color in the processed image correspond to said light areas of said light beam, the portions with said second color in the processed image correspond to said dark areas of said light beam.
12. The method of claim 10, wherein said first color is white and said second color is black.
13. The method of claim 10, wherein: said step of capturing an image of said area comprises an initialization step that is carried out under uniform illumination of said area.
14. The method of claim 13, wherein said initialization step comprises the steps of: creating a uniform image of a rectangular white background, generating a uniform light beam representative of said background image, directing the uniform light beam to said area, capturing the image of said area.
15. The method as claimed in claim 10, wherein: said step of generating a light beam causes the reproduction of said processed image on said area.
16. The method as claimed in claim 10, wherein said step of generating a light beam comprises the step of: detecting the distance between a source of said light beam and each shape identified in said captured image, adjusting the illumination of the light beam generated as a function of the detected distance.
17. The method of claim 10, wherein: said step of processing said image comprises the step of detecting the texture of the areas inside the edges of the shapes so identified, said step of generating a light beam comprises the step of adjusting the illumination of the light beam so generated according to the detected texture.
18. An apparatus for illuminating objects in an area, said apparatus comprising: an imaging device configured to capture an image of said area and generate a first signal representative of said captured image, the imaging device being configured to be activated to capture said image of said area under uniform illumination of said area; a processing unit in signal communication with said imaging device and configured to: receive said first signal, process said first signal by an image recognition and edge detection algorithm to identify one or more shapes bounded by edges in said captured image, and process said captured image by uniformly filling each identified shape with a first color, and uniformly filling the parts of said captured image external to said edges with a second color to generate a second signal representative of the processed image, an image projection device in signal communication with said processing unit and configured to: receive said second signal, and generate a light beam representative of the processed image, directing said light beam to said area.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The characteristics and advantages of the present invention will result from the following detailed description of a possible embodiment, illustrated as a non-limiting example in the annexed drawings, in which:
DETAILED DESCRIPTION
[0025] Referring to the annexed figures, numeral 1 generally designates an apparatus for illuminating objects arranged in an area according to the present invention.
[0026] The apparatus 1 comprises an imaging device 2. Such an imaging device 2 is configured to capture an image of an area A and generate a first signal IM representative of the captured image. By mere way of example, the imaging device 2 may be a dedicated camera, a web cam or a smartphone having a camera incorporated therein.
[0027] The apparatus 1 also comprises a processing unit 3. Such processing unit 3 is particularly put in signal communication with the imaging device 2. Furthermore, the processing unit 3 is designed to receive the above mentioned first signal IM. The processing unit 3 analyzes the first signal IM, processes it and outputs a second signal EL representative of a processed image. Further details on the architecture and operation of the processing unit 3 will be provided hereinbelow.
[0028] The apparatus 1 further comprises an image projection device 4, which is in signal communication with the processing unit 3. Particularly, the image projection device 4 is configured to receive the aforementioned second signal EL, which encodes the image processed by the processing unit 3, and to project it on the area A. For example, the image projection device may be a commercially available video projector.
[0029] Particularly referring to
[0030] The processing unit 3 comprises an acquisition module 5, which is directly connected to the imaging device 2 and receives the aforementioned first signal IM therefrom.
[0031] The processing unit 3 comprises a recognition module 6, which is adapted to process the first signal IM using an image recognition and edge detection algorithm. In the preferred embodiment of the invention, the recognition module 6 implements the Canny algorithm, substantially as described in J. F. Canny, "A Computational Approach to Edge Detection", IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 679-698, 1986. By this arrangement, one or more closed shapes F, i.e. shapes completely delimited by closed edges, are identified in the captured image. Particularly, this identification is carried out by the processing unit 3.
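The cited Canny detector is too involved to reproduce here, but its core step, thresholding the gradient magnitude of the image, can be sketched in a few lines. The sketch below is a simplified stand-in and not the patent's own method: full Canny additionally applies Gaussian smoothing, non-maximum suppression and hysteresis thresholding. NumPy is assumed to be available.

```python
import numpy as np

def sobel_edges(img, threshold=0.5):
    """Boolean edge map from normalized Sobel gradient magnitude.

    A simplified stand-in for the Canny detector named in [0031]:
    pixels whose gradient magnitude exceeds `threshold` (relative
    to the image maximum) are marked as edge pixels B.
    """
    img = img.astype(float)
    # Sobel kernels for horizontal (kx) and vertical (ky) gradients.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()  # normalize to [0, 1]
    return mag >= threshold
```

On a synthetic image containing a bright square on a dark background, the map is True along the square's boundary and False both inside the square and in the empty background.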
[0032] Optionally, the processing unit 3 is configured to process the first signal IM to detect the texture of the zones inside the edges B of the identified shapes F. This operation is particularly carried out by the recognition module 6. Advantageously, this information allows detection of the type of surface enclosed by the edge B (for example, in the case of a painting, such detection will determine whether it is an oil on canvas). This information may be incorporated in the second signal EL, which will then also include information representative of the detected texture.
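The document does not name a texture measure, so the choice of algorithm is left open. One common coarse proxy, shown below purely as an illustrative assumption, is the variance of a discrete Laplacian response over the region: rough surfaces (such as an oil on canvas) produce a higher score than smooth ones.

```python
import numpy as np

def texture_score(patch):
    """Coarse texture measure for the zone inside an edge B.

    Variance of the 4-neighbour discrete Laplacian response: an
    illustrative proxy only, since the patent does not prescribe a
    texture algorithm. Higher scores indicate rougher surfaces.
    """
    p = patch.astype(float)
    # 4-neighbour discrete Laplacian on the interior pixels.
    lap = (p[1:-1, :-2] + p[1:-1, 2:] + p[:-2, 1:-1] + p[2:, 1:-1]
           - 4.0 * p[1:-1, 1:-1])
    return float(lap.var())
```

A perfectly flat patch scores zero, while any patch with pixel-to-pixel variation scores above zero, which is all the downstream illuminance adjustment of paragraph [0038] would need.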
[0033] The processing unit 3 also comprises a control module 7, which completes the image processing operation. Particularly, the control module 7 is designed to apply a first color inside the edge B of each previously identified shape F. Similarly, a second color is applied outside the edges B of the shapes F, i.e. on the side of the image outside the shapes F. Namely, the shapes F are uniformly filled with the first color, whereas the external parts are uniformly filled with the second color. Thus, the processed image will only have two colors, i.e. the first and the second color.
[0034] In the preferred embodiment of the invention, the first color is white and the second color is black. In alternative embodiments of the invention, the first and second colors may be any color as long as a good contrast is ensured therebetween. Thus, the processed image is encoded in the aforementioned second signal EL.
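The two-color filling of paragraphs [0033]-[0034] can be sketched as follows, assuming the edge map is a boolean NumPy array with closed contours. The flood-fill-from-border approach is an illustrative assumption (the patent does not prescribe a filling algorithm); it relies on the edges being closed, as paragraph [0031] requires.

```python
from collections import deque
import numpy as np

def two_color_image(edges):
    """Build the two-color processed image: 255 (white, first color)
    inside each shape F, 0 (black, second color) outside.

    Sketch only: exterior pixels are found by flood-filling the
    background from the image border, so every pixel not reachable
    from the border lies inside some closed edge B. The edge pixels
    themselves are also set to white here.
    """
    h, w = edges.shape
    outside = np.zeros((h, w), bool)
    q = deque()
    # Seed the flood fill with every non-edge border pixel.
    for i in range(h):
        for j in (0, w - 1):
            if not edges[i, j] and not outside[i, j]:
                outside[i, j] = True
                q.append((i, j))
    for j in range(w):
        for i in (0, h - 1):
            if not edges[i, j] and not outside[i, j]:
                outside[i, j] = True
                q.append((i, j))
    # Breadth-first flood fill of the exterior region.
    while q:
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w \
                    and not edges[ni, nj] and not outside[ni, nj]:
                outside[ni, nj] = True
                q.append((ni, nj))
    return np.where(outside, 0, 255).astype(np.uint8)
```

The resulting array is exactly the content a second signal EL would need to carry: white wherever the beam should be lit, black everywhere else.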
[0035] Particularly referring to
[0036] According to the invention, the light beam L comprises lighted areas and dark areas. Particularly, the portions that have been filled with the first color in the processed image correspond to the lighted areas of the light beam L. Likewise, the portions that have been filled with the second color in the processed image correspond to the dark areas of the light beam L. Thus, advantageously, the image projection device 4 illuminates the interior of the edges B of the previously identified shapes F. It shall be noted that, in the preferred embodiment of the invention, the light beam L generated by the image projection device 4 reproduces the processed image on the area A, thereby illuminating the objects therein and leaving the empty parts of the area A dark.
[0037] Optionally, the image projection device 4 is configured to detect its distance from the area A, particularly from each shape F identified in the captured image. The image projection device 4 is designed to adjust the illuminance of the light beam generated according to the detected distance. As used herein, the term illuminance is intended as the ratio of the luminous flux on a surface element to the area of the element. The term luminous flux is intended as the product of the luminous intensity of the light source by the solid angle of the light beam.
[0038] Optionally, the image projection device 4 is configured to adjust the illuminance of the light beam L generated in response to the detected texture if this information is available in the second signal EL.
[0039] A method for illuminating objects arranged in the area A is described below. Particularly, the method is described in its application to the above discussed apparatus 1. In certain alternative embodiments, which form part of the present invention, the method may also be carried out without using the apparatus 1.
[0040] In an optional initialization step, the area A is uniformly illuminated, with a white rectangle projected over the entire area A. Advantageously, this affords a more accurate recognition of the edges B.
[0041] More in detail, during the initialization step the processing unit 3, and particularly the control module 7, generates a rectangular background image having a uniform white color. This background image is encoded in the second signal EL, which is sent to the image projection device 4, which projects it on the area A, thereby generating a uniform light beam L corresponding to the received image.
[0042] Now the projected image may be detected, particularly through the imaging device 2. Advantageously, the perspective distortion caused by the angle of incidence of the light beam L on the area A may be corrected.
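One common way to perform the correction mentioned here is to estimate a homography between the four corners of the white rectangle as observed by the imaging device 2 and the corners of the rectangle actually sent to the projector; warping with that transform removes the perspective distortion. The direct-linear-transform sketch below is an assumption, since the patent does not specify how the correction is computed.

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 projective transform mapping four source
    points to four destination points (direct linear transform,
    with the bottom-right entry fixed to 1).

    Illustrative sketch for the correction of [0042]: `src` would be
    the rectangle corners seen by the imaging device, `dst` the
    corners of the rectangle that was projected.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, pt):
    """Apply homography H to a 2-D point (homogeneous coordinates)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

With four exact point correspondences the 8x8 system is solved exactly, so each source corner maps precisely onto its destination corner.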
[0043] In a first step, an image in the area A is captured, with the purpose of detecting the shapes F. This operation is preferably carried out by the imaging device 2, which sends the aforementioned first signal IM to the processing unit 3, particularly to the acquisition module 5.
[0044] If the initialization step has been carried out, the imaging device 2 captures the image in the area A while it is being illuminated by the uniform light beam L.
[0045] Then, the digitized image is processed using an image recognition and edge detection algorithm. The shapes F with their respective edges B are so defined. Preferably, this operation is carried out by the processing unit 3, particularly the recognition module 6.
[0046] Then, the first color is uniformly applied inside the edge B of each identified shape F. At the same time, the second color is uniformly applied outside the edges B. This operation is particularly carried out by the processing unit 3, more particularly by the control module 7, which generates the aforementioned second signal EL.
[0047] Then, a light beam L representative of the processed image is generated. This light beam L is directed to the area A, to illuminate it. This operation is preferably carried out by the image projection device 4.
[0048] Optionally, the method comprises a step of detecting the distance between the light source, preferably the image projection device 4, and each identified shape F. Based on such distance, the illuminance of the generated light beam L may be adjusted.
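The claims leave the adjustment law open; if a point-source inverse-square model is assumed, keeping the illuminance E = I / d² constant on each shape F reduces to scaling the projector drive level by the squared distance ratio. The following one-liner is an illustrative assumption, not a claimed formula.

```python
def drive_level(base_level, distance_m, ref_distance_m=1.0):
    """Scale the projector drive for a shape F so the illuminance
    landing on it stays roughly constant with distance.

    Assumes a point-source inverse-square falloff (E = I / d^2), an
    illustrative model: the claims only state that illumination is
    adjusted as a function of the detected distance.
    """
    return base_level * (distance_m / ref_distance_m) ** 2
```

For example, a shape twice as far from the projector as the reference distance would be driven at four times the base level.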
[0049] Optionally, the texture of the zones inside the edges B of the identified shapes F may be detected, particularly to detect the type of surface to be illuminated. Thus, the illuminance of the light beam L may be adjusted in response to the detected texture.
[0050] Those skilled in the art will obviously appreciate that a number of changes and variants may be made to the invention as described above, to fulfill particular requirements, without departing from the scope of the invention as defined in the appended claims.