Cooking device with light pattern projector and camera

10228145 · 2019-03-12

Abstract

A cooking appliance includes a cooking chamber having a loading opening which is closable by a door. A light pattern projector is arranged in a fixed manner relative to the cooking chamber and configured to generate and radiate a light pattern into the cooking chamber. A camera captures images from a region being irradiated by the light pattern projector when the cooking chamber is closed, and is arranged in a fixed manner relative to the cooking chamber. Coupled to the camera is an analysis facility which, by light pattern analysis, repeatedly calculates a three-dimensional shape of an object located in the region being irradiated by the light pattern projector during operation of the cooking appliance.

Claims

1. A cooking appliance, comprising: a cooking chamber having a loading opening which is closable by a door; a light pattern projector arranged in a fixed manner relative to the cooking chamber and configured to generate and radiate a light pattern into a region of the cooking chamber; a camera for capturing images from the region being irradiated by the light pattern projector even when the cooking chamber is closed, said camera being arranged in a fixed manner relative to the cooking chamber; an analysis facility coupled to the camera and configured to perform light pattern analysis of the light pattern irradiated into the region of the cooking chamber, the analysis facility configured to repeatedly calculate, based on the light pattern analysis of the light pattern, a three-dimensional shape of an object located in the region of the cooking chamber being irradiated by the light pattern projector during operation of the cooking appliance; and a muffle having predefined calibration markings in the muffle for delimiting the cooking chamber, wherein the analysis facility is configured to use the predefined calibration markings to perform a calibration.

2. The cooking appliance of claim 1, wherein an optical axis of the light pattern projector and an optical axis of the camera extend at an angle of between 20° and 30° to each other.

3. The cooking appliance of claim 1, further comprising a ceiling, said light pattern projector and said camera being arranged behind the ceiling of the cooking chamber.

4. The cooking appliance of claim 1, wherein the light pattern projector is configured to radiate different light patterns into the cooking chamber.

5. The cooking appliance of claim 1, wherein the light pattern projector includes at least one image point-type screen for shaping the light pattern.

6. The cooking appliance of claim 1, wherein the analysis facility is configured to recognize a type of food.

7. The cooking appliance of claim 1, wherein the analysis facility is configured to recognize a type of food support.

8. The cooking appliance of claim 1, wherein the analysis facility is configured to recognize a core temperature of the object.

9. The cooking appliance of claim 1, further comprising a control facility coupled to the analysis facility and configured to adjust operation of the cooking appliance based on at least one object parameter determined by the analysis facility.

10. The cooking appliance of claim 1, further comprising a screen configured to display at least one three-dimensional image of the object captured by the camera.

11. The cooking appliance of claim 1, wherein the light pattern projector is configured to illuminate the cooking chamber.

12. The cooking appliance of claim 1, wherein the analysis facility calculates the three-dimensional shape of the object based on a degree of deformation of the light pattern at the object.

14. The cooking appliance of claim 1, wherein a light emitted by the light pattern projector and captured by the camera is visible light.

15. The cooking appliance of claim 1, wherein a light emitted by the light pattern projector and captured by the camera is infrared light.

16. The cooking appliance of claim 1, wherein the light pattern projector is configured to radiate a plurality of different light patterns into the cooking chamber.

17. The cooking appliance of claim 16, wherein the light pattern projector is configured to radiate the plurality of different light patterns into the cooking chamber in a predefined sequence.

18. The cooking appliance of claim 1, wherein the analysis facility is configured to recognize a plurality of types of food based on the light pattern analysis of the light pattern irradiated in the region by the light pattern projector during operation of the cooking appliance.

19. The cooking appliance of claim 1, wherein the analysis facility is configured to recognize at least one of a plurality of types of food supports or a plurality of types of equipment on a food support based on the light pattern analysis of the light pattern irradiated in the region by the light pattern projector during operation of the cooking appliance.

20. A cooking appliance, comprising: a cooking chamber having a loading opening which is closable by a door; a light pattern projector arranged in a fixed manner relative to the cooking chamber and configured to generate and radiate a light pattern into a region of the cooking chamber; a camera for capturing images from the region being irradiated by the light pattern projector even when the cooking chamber is closed, said camera being arranged in a fixed manner relative to the cooking chamber; an analysis facility coupled to the camera and configured to perform light pattern analysis of the light pattern irradiated into the region of the cooking chamber, the analysis facility configured to repeatedly calculate, based on the light pattern analysis of the light pattern, a three-dimensional shape of an object located in the region of the cooking chamber being irradiated by the light pattern projector during operation of the cooking appliance; and a muffle; and a food support in the muffle, the food support having predefined calibration markings, wherein the analysis facility is configured to use the predefined calibration markings to perform a calibration.

21. The cooking appliance of claim 20, wherein an optical axis of the light pattern projector and an optical axis of the camera extend at an angle of between 20° and 30° to each other.

22. The cooking appliance of claim 20, further comprising a ceiling, said light pattern projector and said camera being arranged behind the ceiling of the cooking chamber.

23. The cooking appliance of claim 20, wherein the light pattern projector includes at least one image point-type screen for shaping the light pattern.

24. The cooking appliance of claim 20, further comprising a screen configured to display at least one three-dimensional image of the object captured by the camera.

25. The cooking appliance of claim 20, wherein the analysis facility is configured to recognize a core temperature of the object.

26. The cooking appliance of claim 20, further comprising a control facility coupled to the analysis facility and configured to adjust operation of the cooking appliance based on at least one object parameter determined by the analysis facility.

27. The cooking appliance of claim 20, wherein the analysis facility calculates the three-dimensional shape of the object based on a degree of deformation of the light pattern at the object.

28. The cooking appliance of claim 20, wherein a light emitted by the light pattern projector and captured by the camera is visible light.

29. The cooking appliance of claim 20, wherein a light emitted by the light pattern projector and captured by the camera is infrared light.

30. The cooking appliance of claim 20, wherein the light pattern projector is configured to radiate a plurality of different light patterns into the cooking chamber.

31. The cooking appliance of claim 30, wherein the light pattern projector is configured to radiate the plurality of different light patterns into the cooking chamber in a predefined sequence.

32. The cooking appliance of claim 20, wherein the analysis facility is configured to recognize at least one of a plurality of types of food based on the light pattern analysis of the light pattern irradiated in the region by the light pattern projector during operation of the cooking appliance.

33. The cooking appliance of claim 20, wherein the analysis facility is configured to recognize at least one of a plurality of types of the food support or a plurality of types of equipment on the food support based on the light pattern analysis of the light pattern irradiated in the region by the light pattern projector during operation of the cooking appliance.

34. The cooking appliance of claim 20, wherein the muffle has predefined calibration markings delimiting the cooking chamber.

35. The cooking appliance of claim 20, wherein the analysis facility is configured to recognize an insertion level of the food support in the muffle based on the predefined calibration markings.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The attributes, features and advantages of this invention as described above as well as the manner in which these are achieved will become clearer and more comprehensible in conjunction with the following schematic description of an exemplary embodiment, which is explained in more detail in conjunction with the drawings.

(2) FIG. 1 shows an outline of an arrangement of a 3D scanner;

(3) FIG. 2 shows an outline of a reconstruction of a shape of an object scanned using a 3D scanner;

(4) FIG. 3 shows a sectional diagram of a side view of an inventive cooking appliance equipped with a 3D scanner; and

(5) FIG. 4 shows a diagram of a temperature and volume profile of a heated object with food determination by means of an inventive cooking appliance.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS OF THE PRESENT INVENTION

(6) FIG. 1 shows an arrangement (3D scanner) for determining a three-dimensional shape of at least one object O, having a light pattern projector 1 directed onto the object O, a camera 2 directed onto the object O, and a control facility C for operating the light pattern projector 1 and for calculating a three-dimensional shape of the object O by means of a light pattern analysis based on at least one image received by the camera 2. A screen 3 is optionally present for displaying the shape of the object O calculated by the control facility C.

(7) The light pattern projector 1 generates a predetermined light pattern L, e.g. a line or dot pattern. The light pattern projector 1 radiates its light in a light bundle with a first optical axis A1.

(8) The camera 2, typically a digital camera, has a field of view F with a second optical axis A2, which is aligned obliquely in relation to the first optical axis A1 of the light pattern projector 1. In other words the camera 2 is aligned obliquely in relation to the light pattern projector 1. It views a region of the object O that is or can be irradiated by the light pattern L.

(9) FIG. 2 shows an outline of a reconstruction of a shape of the object O scanned using the 3D scanner 1, 2, C.

(10) The light pattern projector 1 here has a light source Q, e.g. a field of light emitting diodes, downstream of which is a pattern generation element in the form of a transmissive, freely programmable LCD surface D. Depending on the pattern M generated on the LCD surface D, a corresponding, in particular complementary, light pattern L is emitted from the LCD surface D. Alternatively an LED screen may serve as the light source (not shown), the backlighting integrated therein then dispensing with the need for a separate light source.

(11) FIG. 2 shows an example of how light is radiated in the form of a vertical column or line G from the light pattern projector 1 onto the object O. A projection P(G) of this line G, distorted by the surface contour of the object O, therefore appears on the object O. Because of its oblique position in relation to the light pattern projector 1, the camera 2 captures an image of this projection P(G) which shows the distortion. The camera 2 stores the projection P(G) as correspondingly positioned image points B or pixels of a matrix, which results from a matrix-type arrangement of individual sensors in a sensor array S of the camera, e.g. a CCD sensor array. The height or depth information is defined by the deviation of the image points B from a vertical line.

(12) If the planes of all vertical columns or lines G are known and a light beam r in the space, from which the light striking the respective individual sensor originates, can be assigned to each image point B in the camera image, and if there is also an assignment of the image points to the projection P(G) visible from the image points and therefore also to the corresponding lines G, points on the object surface can be reconstructed by means of a simple beam plane section.
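The beam plane section described above can be sketched as a simple ray-plane intersection. The following is a minimal illustration only, not the patent's implementation; the function and parameter names are hypothetical, assuming a calibrated camera center, a ray direction assigned to an image point B, and a known plane swept by the projected line G.

```python
import numpy as np

def reconstruct_point(cam_origin, ray_dir, plane_point, plane_normal):
    """Intersect the light beam r assigned to an image point B with the
    known plane of a projected line G, yielding a point on the object
    surface (illustrative sketch of a beam plane section)."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the line plane")
    # Solve cam_origin + t * ray_dir lying on the plane for t.
    t = np.dot(plane_normal, plane_point - cam_origin) / denom
    return cam_origin + t * ray_dir
```

Repeating this intersection for every image point B assigned to a line G yields the set of reconstructed surface points.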

(13) Depth resolution here is a function of an angle W between the light beam r leading to the image point B and the direction of the column or line G. A theoretical resolution optimum would be present at a greatest possible angle W. However visibility of the projection P(G) on the surface of the object O and therefore its detectability in the camera image deteriorates as this optimum is approached. As reconstruction is only possible for those points on the surface of the object O which on the one hand are visible from the camera 2 and on the other hand can be irradiated by the light pattern projector 1, a compromise is reached here. Such a 3D scan is known in principle and is therefore not explained further in the following.

(14) FIG. 3 shows a sectional diagram of a side view of a cooking appliance equipped with a 3D scanner, in the form of an oven 4. The oven 4 has a cooking chamber 6 delimited by an oven muffle 5. At its front the oven muffle 5 has a loading opening 8 which can be closed by means of an oven door 7, through which loading opening 8 objects, in particular in the form of food O1, can be moved into the cooking chamber 6. A cooking chamber temperature T can be set by means of one or more, in particular electrically operated, heating units (not shown).

(15) On a ceiling 9 of the oven muffle 5 are two viewing windows 10 and 11, which can be covered with transparent glass panes for example. On the side of the oven muffle 5 facing away from the cooking chamber 6 and at a predefined distance from the oven muffle 5, behind the viewing window 10, is a light pattern projector 1 (e.g. with an LCD display for pattern generation) and behind the viewing window 11 is a camera 2. These are protected thermally by their distance from the oven muffle 5. A cooling air flow may also flow across the ceiling 9, e.g. to cool components arranged there, such as a control facility. The light pattern projector 1 and the camera 2 can also be further cooled by said cooling air flow.

(16) The light pattern projector 1 and the camera 2 are arranged with a lateral offset from one another. Their optical axes A1 and A2 also form an angle of between 20° and 30°, which allows high depth resolution with good visibility. The light pattern projector 1 and the camera 2 are arranged in a fixed manner relative to the cooking chamber 6 and therefore do not move when the oven door 7 is actuated.

(17) The light pattern projector 1 radiates a light pattern L through the viewing window 10 into the cooking chamber 6 so that practically the entire horizontal surface of the cooking chamber 6 can be illuminated with the light pattern L from a predefined distance from the ceiling 9. This may be the case for example in a lower half or in a lower third of the cooking chamber 6. The camera 2 captures images from a region of the cooking chamber 6 which can be irradiated at least partially by the light pattern.

(18) The oven 4 also has an analysis facility 12 which is coupled to the camera 2 for calculating, by means of a light pattern analysis, a three-dimensional shape of, for example, the food O1 and a food support O2 located in the region that can be irradiated by the light pattern L. This is based on a 3D scan based on at least one image captured by the camera 2. The light pattern projector 1, the camera 2 and the analysis facility 12 together form the 3D scanner. As shown here, the analysis facility 12 may be integrated functionally in a central control facility of the oven 4 or may be coupled to a control facility as an independent unit.

(19) A 3D scan, comprising for example a capturing of an image of the projection P(G) of the light pattern L by means of the camera 2 and from this a calculation of the three-dimensional shape of the food O1 and the food support O2, can be performed with the oven door 7 or loading opening 8 closed.

(20) In particular a calibration can be performed first with the oven door 7 closed but before the cooking process has started. To this end the food support O2 has one or more, for example colored, calibration markings K on its upper face, which are of known size and can be easily identified. For example, the distance from the ceiling 9, and therefore the insertion level used, can be identified from the size of the calibration marking(s) K captured by the camera 2. If the insertion level is unsuitable for the 3D scan, because it is too high for example, the oven 4 may output a notification to a user, e.g. a display on a front screen 3 on an operating panel 13. At least one calibration marking may also be present on the oven muffle 5.
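The insertion-level identification described above can be illustrated with a simple pinhole-camera relation: the apparent size of a marking of known real size is inversely proportional to its distance. This is a hypothetical sketch, not the patent's method; the focal length, marker size and level distances below are invented example values.

```python
def distance_from_marker(size_px, real_size_m, focal_px):
    """Pinhole-model distance of a calibration marking K of known real
    size from its apparent size in the camera image (illustrative)."""
    return focal_px * real_size_m / size_px

def insertion_level(distance_m, level_distances_m):
    """Pick the insertion level whose known distance from the ceiling 9
    is closest to the distance measured via the marking."""
    return min(level_distances_m,
               key=lambda lvl: abs(level_distances_m[lvl] - distance_m))
```

For example, with an assumed focal length of 1000 px, a 5 cm marking imaged at 125 px would lie about 0.4 m below the ceiling, which could then be matched against the known distances of the insertion levels.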

(21) After the calibration but before a cooking process or treatment of the food O1, an initial 3D scan of the food O1 may be performed by means of the 3D scanner 1, 2, 12, in order to calculate its original shape. The calculated shape may be displayed on the screen 3. The calculated shape may be used by the cooking appliance 4 to determine the food O1, in particular in addition to image recognition of the food O1 that can also be performed by the camera 2. A type of food support O2 may also be recognized using the 3D scanner 1, 2, 12. One or more cooking parameters of the cooking process may be adjusted based on recognition of the food O1 and optionally of the food support O2. A cooking time and/or the cooking chamber temperature T may therefore be adjusted based on the recognized type and/or volume of the food and/or the recognized food support O2.

(22) During the cooking process 3D scans can be performed repeatedly to determine a change in the shape and/or volume of the food O1. In the event of a change in the shape and/or volume the oven 4 may adjust the cooking process, e.g. change the cooking time and/or the cooking temperature, including switching off the heating units.

(23) In order to improve the accuracy of the depth information and therefore of the volume of the food O1, the light pattern projector 1 may radiate different light patterns L into the cooking chamber 6.
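One common structured-light scheme for radiating a plurality of different patterns in a predefined sequence, not specified in the patent itself, is a Gray-code stripe sequence: n successive binary stripe patterns uniquely encode 2^n projector columns, so each camera pixel can recover which column illuminated it. The sketch below is purely illustrative.

```python
def gray_code_patterns(num_bits, width):
    """Binary stripe patterns encoding `width` projector columns in
    `num_bits` frames; column c is lit in a frame iff the corresponding
    bit of the Gray code of c is 1 (MSB frame first)."""
    patterns = []
    for k in reversed(range(num_bits)):
        patterns.append([((c ^ (c >> 1)) >> k) & 1 for c in range(width)])
    return patterns

def decode_column(bits):
    """Recover the projector column from the on/off observations of one
    camera pixel across the frame sequence (MSB first)."""
    gray = 0
    for b in bits:
        gray = (gray << 1) | b
    # Convert the Gray code back to a binary column index.
    binary, mask = gray, gray >> 1
    while mask:
        binary ^= mask
        mask >>= 1
    return binary
```

With the column known for each pixel, the beam plane section described for FIG. 2 can be applied per pixel, improving the density and accuracy of the depth information.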

(24) FIG. 4 shows a diagram of a profile of a temperature T and a volume V of the food O1 with a food determination by means of the cooking appliance 4 for example.

(25) Purely by way of example, the food O1 is a baked product, for example a round pizza or a sponge cake in a springform pan. Image recognition by the camera 2 alone cannot distinguish between these two types of food O1, as in a two-dimensional view from above (top view) both the pizza and the sponge cake look circular. Both are also similar in color. However both types of food require a different and specific baking environment. If the pizza were treated in the same way as the sponge cake, the result would be unsatisfactory and vice versa. The 3D scan by means of the 3D scanner 1, 2, 12 additionally provides the cooking appliance 4 with spatial information about the food O1. This spatial information relating to the initial state of the food O1 before the cooking process (e.g. an initial volume V0) may already be enough to distinguish the flat pizza from the taller sponge cake. Additionally or alternatively the type of food O1 may also be determined from the change in its shape, in particular a change ΔV in its volume V, by means of the 3D scanner 1, 2, 12.

(26) Thus at an initial time point ts of the cooking process the cooking chamber temperature T has an initial value Ts, e.g. room temperature. As time t progresses, the cooking temperature T increases due to at least one activated heating unit, in the same manner for pizza and sponge cake, as shown by the curve T1+T2. When the cooking chamber temperature T reaches a target temperature Td1 for sponge cake, which is below a target temperature Td2 for pizza, at a time point td, a further 3D scan is performed to determine the type of food O1.

(27) If the height or volume V0 of the food O1 has not changed significantly, it can be assumed that it is pizza, which typically does not rise. Its volume profile is shown as the curve V2. Therefore if pizza is recognized, the cooking appliance 4 can then increase its cooking chamber temperature T for example to the associated target value Td2, as shown by the temperature curve T2. The cooking process ends at an associated end time point te2.

(28) However if the height or volume V0 of the food O1 has increased noticeably by ΔV by time point td, it can be assumed that it is sponge cake, which typically rises. Its volume profile is shown as the curve V1. Therefore if sponge cake is recognized, the cooking appliance 4 may then keep its cooking chamber temperature T at the associated target value Td1, as shown by the temperature curve T1. The cooking process ends at an associated end time point te1.
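The decision at time point td can be sketched as a simple threshold on the relative volume change measured by the 3D scanner. This is an illustrative sketch only; the 10% rise threshold and the function name are assumptions, not values from the patent.

```python
def classify_food(v0, v_at_td, rise_threshold=0.10):
    """Distinguish the two example foods at time point td from the
    relative change in volume: sponge cake rises noticeably by ΔV,
    pizza does not (threshold is an illustrative choice)."""
    if (v_at_td - v0) / v0 > rise_threshold:
        return "sponge cake"  # keep temperature at Td1, end at te1
    return "pizza"            # raise temperature to Td2, end at te2
```

The control facility could then select the matching temperature curve (T1 or T2) and end time point (te1 or te2) accordingly.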

(29) The height and/or volume information from the 3D scan can therefore be used to provide a clear distinguishing feature for food recognition.

(30) The present invention is of course not restricted to the exemplary embodiment shown.

(31) Generally, a reference to one item can also refer to a number of items, in particular in the sense of "at least one" or "one or more", unless this is specifically excluded, for example by the expression "exactly one", etc.

(32) Also, a numerical value can cover both the exact value given and a standard tolerance range, unless this is specifically excluded.