INTERACTION MODULE
20200408411 · 2020-12-31
Inventors
CPC classification
G06F3/0425
PHYSICS
F27D21/02
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F27D2021/026
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F24C7/08
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
International classification
F24C3/12
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F27D21/02
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
An interaction module includes a projector configured to project a first image onto a working surface; and a camera configured to record a second image of an object that is placed on the working surface.
Claims
1-14. (canceled)
15. An interaction module, comprising: a projector configured to project a first image onto a working surface; and a camera configured to record a second image of an object placed on the working surface.
16. The interaction module of claim 15, wherein the projector is configured to project onto the working surface a position marker which indicates a scan region of the camera.
17. The interaction module of claim 16, wherein the position marker indicates a delimitation of the scan region of the camera in a plane of the working surface.
18. The interaction module of claim 17, wherein the camera defines an optical axis and the projector defines an optical axis, with the optical axes of the camera and projector being close to one another, so that the position marker is visible on part of the object, when the object is not completely within the scan region.
19. The interaction module of claim 15, further comprising an optical scanning device configured to determine a gesture of a user in a region above the working surface.
20. The interaction module of claim 15, wherein the first image projected by the projector comprises a representation of the second image.
21. The interaction module of claim 20, wherein the representation of the second image is arranged outside a scan region of the camera.
22. The interaction module of claim 15, wherein the projector is configured to illuminate the object with light of a predetermined spectrum.
23. The interaction module of claim 22, wherein the projector is configured to illuminate different segments of the object with different predetermined spectra.
24. The interaction module of claim 15, wherein the projector is configured to project a predetermined background around the object.
25. The interaction module of claim 24, further comprising an interface for receiving the predetermined background to be projected.
26. The interaction module of claim 15, further comprising: a data storage unit configured to hold a recipe; and a processing facility configured to assign the second image to a recipe in the data storage unit.
27. The interaction module of claim 15, further comprising an interface for supplying the second image to a social network.
28. A method for using an interaction module, said method comprising: projecting a first image onto a working surface using a projector; and recording a second image of an object placed on the working surface using a camera.
29. The method of claim 28, wherein the projector projects onto the working surface a position marker which indicates a scan region of the camera.
30. The method of claim 29, wherein the position marker indicates a delimitation of the scan region of the camera in a plane of the working surface.
31. The method of claim 30, further comprising configuring the camera and projector such that their optical axes are close to one another, so that the position marker is visible on part of the object, when the object is not completely within the scan region.
32. The method of claim 28, further comprising determining a gesture of a user in a region above the working surface by an optical scanning device.
33. The method of claim 28, wherein the first image projected by the projector comprises a representation of the second image.
34. The method of claim 33, further comprising arranging the representation of the second image outside a scan region of the camera.
35. The method of claim 28, further comprising illuminating the object with light of a predetermined spectrum using the projector.
36. The method of claim 35, wherein the projector illuminates different segments of the object with different predetermined spectra.
37. The method of claim 28, further comprising projecting with the projector a predetermined background around the object.
38. The method of claim 37, further comprising receiving the predetermined background to be projected via an interface.
39. The method of claim 28, further comprising: holding a recipe in a data storage unit; and assigning, with a processing facility, the second image to a recipe in the data storage unit.
40. The method of claim 28, further comprising supplying the second image to a social network via an interface.
Description
[0021] The invention is described in more detail below with reference to the accompanying figures, in which:
[0022]
[0023]
[0024]
[0025] The interaction module 105 comprises a projector 120, a camera 125, an optional scanning device 130 and generally a processing facility 135. A data storage unit 140 and/or an interface 145, in particular for wireless data transfer, can optionally also be provided.
[0026] The projector 120, camera 125 and scanning device 130 are substantially directed onto corresponding regions of the working surface 110. For example, the projector 120 can be used to project a button onto the working surface 110. A user can touch the button with their finger, for example; this can be captured by the scanning device 130 and converted to a corresponding control signal. An appliance, for example the extractor hood 115, can in particular be controlled in this manner. The projector 120 is generally designed to display any content, including moving images.
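The mapping from a detected touch to a control signal can be sketched as a simple hit test: the scanning device reports a fingertip position in working-surface coordinates, and the position is compared against the rectangles of the projected buttons. This is a minimal illustrative sketch; the names (`Button`, `hit_test`, `"hood_on"`) and the millimeter coordinate system are assumptions, not part of the patent text.

```python
from dataclasses import dataclass

@dataclass
class Button:
    """A projected button, axis-aligned on the working surface (coordinates in mm)."""
    name: str
    x: float       # left edge
    y: float       # top edge
    width: float
    height: float

def hit_test(buttons, touch_x, touch_y):
    """Return the name of the button containing the touch point, or None."""
    for b in buttons:
        if b.x <= touch_x <= b.x + b.width and b.y <= touch_y <= b.y + b.height:
            return b.name
    return None

# Hypothetical buttons for controlling the extractor hood 115.
buttons = [Button("hood_on", 100, 50, 40, 20), Button("hood_off", 150, 50, 40, 20)]
print(hit_test(buttons, 120, 60))   # -> 'hood_on'
print(hit_test(buttons, 10, 10))    # -> None (outside all buttons)
```

The resulting button name would then be translated into the appliance control signal by the processing facility 135.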
[0027] It is proposed that the interaction module 105 is also equipped with the camera 125, to produce an image of an object 150 arranged on the working surface 110. In the diagram in
[0028] It is further proposed that production of the image is assisted by the projector 120. To this end, for example, a position marker can be projected onto the working surface 110 to show the user which area of the working surface 110 can be imaged by the camera 125. The position marker can be, for example, a spot, crosshair, point, Siemens star or other figure on which the object 150 can be centered. The position marker can also show a delimitation of the region that can be imaged. For example, the entire region of the working surface 110 that can be imaged by the camera 125 can be illuminated using the projector 120. The projector 120 and camera 125 are preferably brought as close as possible to one another within the interaction module 105, so that it can accurately be assumed that only the segments of the object 150 illuminated by the projector 120 will appear in the image. In another variant the position marker can lie outside the region that can be imaged by the camera 125, so that the segments of the object 150 which will lie outside the image detail can be specifically illuminated. In the diagram in
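The delimitation logic described above amounts to comparing the object's footprint with the camera's scan region in the plane of the working surface: if the footprint crosses an edge of the scan region, the position marker becomes visible on the protruding part of the object. A minimal sketch, assuming axis-aligned rectangles; the function name `edges_outside` and the coordinate convention are illustrative assumptions.

```python
def edges_outside(obj, scan):
    """obj and scan are (xmin, ymin, xmax, ymax) rectangles in the plane of
    the working surface. Returns the scan-region edges the object crosses;
    an empty list means the object lies completely within the scan region."""
    ox0, oy0, ox1, oy1 = obj
    sx0, sy0, sx1, sy1 = scan
    crossed = []
    if ox0 < sx0: crossed.append("left")
    if oy0 < sy0: crossed.append("top")
    if ox1 > sx1: crossed.append("right")
    if oy1 > sy1: crossed.append("bottom")
    return crossed

scan_region = (0, 0, 300, 200)                          # hypothetical scan region, mm
print(edges_outside((50, 50, 250, 150), scan_region))   # -> [] (fully inside)
print(edges_outside((280, 50, 350, 150), scan_region))  # -> ['right']
```

With close optical axes of camera and projector, the projector could highlight exactly the returned edges so the user sees where the object protrudes.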
[0029] In further embodiments the projector 120 can illuminate the object 150 or add a projected image or pattern, which extends over the object 150 itself or the working surface 110, while the image is being recorded. For example, a pattern reminiscent of a tablecloth can be projected in a region away from the object. An additional object can also be projected into the region of the image. The projector 120 can also be used to illuminate the object 150, it being possible in particular to tailor a light intensity and/or light temperature to the object 150 to be recorded or to user requirements. In certain circumstances a segment, partial object or detail of the object 150 can be removed from the image or made inconspicuous by projection.
[0030] The camera 125 can be triggered by a user performing a corresponding gesture within a scan region of the scanning device 130. The scan region can in particular correspond as closely as possible to, ideally coincide with, the recording region of the camera 125 or the projection region of the projector 120. A button can be superimposed on the image projected by the projector 120, it being possible for the user to touch said button manually or tactilely to control the production of an image. The camera 125 is preferably triggered with a time delay to give the user time to remove their hand from the recording region of the camera 125 and the projector 120 time to cancel the displayed button.
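The trigger sequence just described can be sketched as three steps: cancel the projected button, wait out a short delay so the user's hand leaves the recording region, then record the image. The sketch below is illustrative; `trigger_capture` and the callback names are assumptions, and `time.sleep` stands in for whatever scheduling the processing facility actually uses.

```python
import time

def trigger_capture(projector_clear, camera_shoot, delay_s=2.0):
    """Delayed trigger: clear the displayed button, wait, then record."""
    projector_clear()          # cancel the displayed button first
    time.sleep(delay_s)        # give the user time to withdraw their hand
    return camera_shoot()      # then record the second image

events = []
image = trigger_capture(
    projector_clear=lambda: events.append("button cleared"),
    camera_shoot=lambda: events.append("image recorded") or "IMG_0001",
    delay_s=0.01,              # shortened for the example
)
print(events)   # -> ['button cleared', 'image recorded']
print(image)    # -> 'IMG_0001'
```

The same delay would apply when the trigger comes from a gesture in the scan region of the scanning device rather than from a projected button.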
[0031] In a further embodiment the first image projected by the projector 120 comprises a representation of the second image, the representation of the second image being arranged outside a scan region of the camera 125. Virtual buttons or operating elements are arranged immediately adjacent to the representation or projection of the second image, allowing the user to trigger the camera 125 to record or store the second image and to change the image background. Operation of the virtual operating elements by the user is recognized by evaluating the user's gestures captured by the scanning device 130.
[0032] A resulting image can be stored in the data storage unit 140. It can also be assigned to a recipe, for example, which can also be stored in the data storage unit 140. The image can also be sent out using the interface 145, optionally for example to a portable mobile computer (smartphone, laptop), a storage or processing service or a social network.
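The assignment of an image to a recipe can be thought of as attaching a reference to a stored recipe record. A minimal sketch, in which a plain dictionary stands in for the data storage unit 140; the structure and names (`storage`, `assign_image`, `"r1"`) are illustrative assumptions.

```python
# Hypothetical in-memory stand-in for the data storage unit 140.
storage = {"recipes": {"r1": {"title": "Apple cake", "images": []}}}

def assign_image(storage, recipe_id, image_ref):
    """Attach an image reference to the recipe with the given id."""
    recipe = storage["recipes"].get(recipe_id)
    if recipe is None:
        raise KeyError(f"unknown recipe {recipe_id!r}")
    recipe["images"].append(image_ref)
    return recipe

assign_image(storage, "r1", "IMG_0001.jpg")
print(storage["recipes"]["r1"]["images"])   # -> ['IMG_0001.jpg']
```

Sending via the interface 145 would then transmit the image reference or the image data itself, for example to a smartphone or a social network.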
[0033]
[0034] In an optional step 205 a background, a pattern, the image of an object 150 or other image information can be uploaded to the interaction module 105. One or more predetermined and/or user-defined backgrounds can later be selected for projection from a collection.
[0035] In an optional step 210 the object 150 in the region of the working surface 110 can be captured. Capturing can be performed using the camera 125, the scanning device 130 or by a user specification. In one embodiment the specification can take place by gesture control: the projector 120 projects a control surface onto the working surface 110, the user touches it, and the contact is captured by means of the scanning device 130.
[0036] In a step 215 a position marker can be projected onto the working surface 110, to make it easier for the user to position the object 150 within an imaging region of the camera 125. An instruction can also be projected for further user guidance for example. One or more buttons can also be projected for further control of the method 200.
[0037] In a further embodiment a marker can also be projected onto the object 150, comprising a proposed garnish or division. This can be used in particular for a round object such as a cake, pizza or fruit. For example, a pattern can be projected onto a cake, making it easier for the user to divide it into a predetermined number of equal pieces. The number of pieces can be predetermined or selected, in particular in dialog form. This also allows an otherwise difficult division into an odd or prime number of pieces to be performed.
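The division pattern for a round object follows from simple geometry: n equal pieces require n radial cutting lines spaced 360/n degrees apart, and it is precisely the odd or prime values of n that are hard to judge by eye. A short sketch; the function name `cut_angles` is an illustrative assumption.

```python
def cut_angles(n, start_deg=0.0):
    """Angles (in degrees) of the n radial cutting lines that divide a
    round object into n equal pieces, spaced 360/n degrees apart."""
    if n < 2:
        raise ValueError("need at least 2 pieces")
    return [(start_deg + i * 360.0 / n) % 360.0 for i in range(n)]

print(cut_angles(4))   # -> [0.0, 90.0, 180.0, 270.0]
print(cut_angles(7))   # seven lines about 51.4 degrees apart -- hard freehand
```

The projector would render these lines, scaled to the object's detected footprint, as the proposed division marker.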
[0038] In a step 220 a background can be projected in the region of the object 150. The background can have been uploaded beforehand in step 205, or it can be otherwise predetermined or dynamically generated.
[0039] In a step 225 a lighting effect can be output using the projector 120. The lighting effect can be adjusted in particular in respect of brightness, color spectrum, light temperature or tone. The lighting effect can, for example, influence the output of the background. In a step 230 the camera 125 can produce an image of the object 150. In this process the object 150 and/or a surrounding region of the working surface 110 can preferably be illuminated using the projector 120.
[0040] In an optional step 235 the resulting image can be assigned to another object. In particular the image can be assigned to a recipe, another image or further information, which can be held in particular in the data storage unit 140.
[0041] In a step 240 the image can be supplied, in particular using the interface 145. This can comprise saving or sending the image, for example to a social network. Before sending, the user can be given the opportunity to confirm the transmission, amend the image, add text or carry out other standard editing operations.
REFERENCE CHARACTERS
[0042] 100 System
[0043] 105 Interaction module
[0044] 110 Working surface
[0045] 115 Extractor hood
[0046] 120 Projector
[0047] 125 Camera
[0048] 130 Scanning device
[0049] 135 Processing facility
[0050] 140 Data storage unit
[0051] 145 Interface
[0052] 150 Object
[0053] 155 Segment
[0054] 200 Method
[0055] 205 Upload background
[0056] 210 Capture object
[0057] 215 Project position marker
[0058] 220 Project background
[0059] 225 Project lighting effect
[0060] 230 Record image
[0061] 235 Assign image
[0062] 240 Supply image