Intelligent identification cooking system for oven

11478108 · 2022-10-25

Abstract

An intelligent identification cooking system of an oven includes an image acquisition system, an image analysis and processing system, and a temperature measurement and monitoring system. The image acquisition system is connected to the image analysis and processing system, and an intelligent menu control system is connected to the image analysis and processing system and the temperature measurement and monitoring system respectively. Through computer vision and identification technology and temperature sensing technology, parameters such as the type, thickness, size, fattiness and temperature of the food are identified and automatically matched and calibrated to a cooking menu. A control program is output to a control terminal and executed.

Claims

1. An intelligent identification cooking system of an oven comprising an image acquisition system, an image analysis and processing system, and a temperature measurement and monitoring system, wherein the image acquisition system is connected to the image analysis and processing system, and an intelligent menu control system is connected to the image analysis and processing system and the temperature measurement and monitoring system respectively; wherein the image acquisition system comprises a camera and a light source, wherein the camera is used to collect image information and the light source is a fixed light source which provides a stable lighting environment for the camera; wherein the image analysis and processing system comprises an image analysis processor and a signal modifier, and an image mode acquired by the image acquisition system is in an RGB color format, wherein the analysis processor first converts the RGB color format into a YIQ format, where the Y value represents a brightness, and the I and Q values are hue values representing an orange-to-green axis and a purple-to-yellow-green axis respectively, wherein the analysis processor is configured to perform calculations on YIQ values of a pixel in an image using Formula 1 to obtain a pixel classification value P.sub.mn:
P.sub.mn=a×Y+b×I+c×Q+A  [Formula 1] wherein in Formula 1, a, b, c and A are variable parameters, and 0≤P.sub.mn≤1; wherein by limiting a threshold of the pixel classification value P.sub.mn, a steak type image is segmented, and the image is divided into an upper surface area (S.sub.1) of the steak type, a side area (S.sub.2) of the steak type, and a background (S.sub.3), wherein the upper surface area (S.sub.1) is subdivided into a lean meat area (S.sub.1-1) and a fatty meat area (S.sub.1-2) according to the brightness value Y, and wherein a type of a steak is identified by thickness h, size or weight, and fattiness; wherein the temperature measurement and monitoring system comprises an infrared temperature sensor and a thermocouple temperature sensor, which are respectively used to measure and monitor a food surface temperature and an oven cavity temperature; and wherein the intelligent menu control system collects signals from the image analysis and processing system and the temperature measurement and monitoring system, and the type of the steak obtained through analysis is matched with a preset standard cooking curve, the standard cooking curve being calibrated with the thickness, the size or weight, the fattiness and an initial temperature to obtain a calibrated cooking curve; wherein a cooking control program is simultaneously obtained according to a rawness requirement input by a user and the control program is output to a control terminal and executed to perform cooking.

2. The intelligent identification cooking system according to claim 1, characterized in that, the camera is directly opposite to a center position of a bakeware/grill on which food is placed, and forms an angle of 30° to 60° with a horizontal plane.

3. The intelligent identification cooking system according to claim 1, characterized in that, the light source is located above an oven cavity, on a same plane as the camera and a center line of the grill/bakeware; and the light source and the camera are fixed respectively on both sides of another center line of the grill/bakeware.

4. The intelligent identification cooking system according to claim 1, characterized in that, the light source is a white or yellow light source and an angle between a connecting line from the light source to a center of the bakeware/grill and a horizontal plane is 60° to 90°.

5. The intelligent identification cooking system according to claim 1, characterized in that, an identification of the type of the steak is performed by limiting the threshold value of the pixel classification value P.sub.mn of pixels in a surface of the lean meat area (S.sub.1-1).

6. The intelligent identification cooking system according to claim 1, characterized in that, an identification of a value of the thickness h of the steak is performed by identifying a bottom edge of the side area (S.sub.2) of the steak type and correcting a height value of a side surface by a function of the bottom edge to obtain the value of the thickness (h) of the steak.

7. The intelligent identification cooking system according to claim 6, characterized in that, the size or weight of the steak is identified by performing a field-of-view correction on the upper surface area (S.sub.1) of the steak, obtaining an upper surface area (S) of the steak by measuring a number of grids occupied by the area (S.sub.1) on a bakeware/grill plane, and then, according to the upper surface area (S) and the value of the thickness (h) obtained, using a formula V=S×h to estimate an approximate effective volume of the steak, wherein the approximate effective volume V corresponds to the size or weight of the steak.

8. The intelligent identification cooking system according to claim 1, characterized in that, a degree of fattiness is identified based on a field-of-view correction by identifying the lean meat area (S.sub.1-1) and the fatty meat area (S.sub.1-2) on the steak type, wherein a lean meat/fat meat ratio Z=S.sub.1-1/S.sub.1-2 is calculated to characterize a fattiness and leanness of the steak.

9. The intelligent identification cooking system according to claim 1, characterized in that, the infrared temperature sensor measures an initial temperature T.sub.0 of a surface of the steak and transmits a signal to the intelligent menu control system; during a baking process a temperature T of the steak is monitored, setting an overload temperature value T.sub.x to prevent burning; and the thermocouple temperature sensor monitors the oven cavity temperature T.sub.s and adjusts an output power of an electric heating tube of the oven to maintain a set baking temperature.

10. The intelligent identification cooking system according to claim 1, characterized in that, the variable parameters a, b, c and A are obtained by experimenters according to different steak types.

Description

DESCRIPTION OF THE FIGURES

(1) FIG. 1 is an illustrative diagram of the structure and implementation principle of the intelligent cooking system of the present invention.

(2) FIG. 2 is an illustrative diagram of the working principle of image processing of the intelligent cooking system of the present invention.

(3) FIG. 3 is an illustrative diagram of image correction of the thickness h and the upper surface area S of the steak type by the intelligent cooking system of the present invention.

DETAILED DESCRIPTION

(4) In order to better understand the present invention, the following further describes the present invention with reference to the accompanying drawings and embodiments, but the scope of protection claimed by the present invention is not limited to the scope of the embodiments.

First Embodiment

(5) As shown in FIG. 1, an intelligent identification cooking system of an oven mainly comprises an image acquisition system, an image analysis and processing system, and a temperature measurement and monitoring system; the image acquisition system is connected to the image analysis and processing system; an intelligent menu control system is connected to the image analysis and processing system and the temperature measurement and monitoring system respectively; in the figure, F represents a camera, L represents a light source, T1 represents an infrared temperature sensor, T2 represents a thermocouple temperature sensor, AA′B′B represents a steak-type food, MNQP represents the bakeware/grill, O point represents a center point of the bakeware/grill.

(6) The image acquisition system is mainly formed by a camera F and a light source L; where the camera F is located on a side of the oven cavity, facing the center point O of the bakeware/grill, and is at 45° to the horizontal plane, that is, ∠FOM=45°. The light source L is a yellow light tube, located above the oven cavity, on the same plane as the camera F and the center line of the grill/bakeware MNQP, and the line between the light source L and the center O of the grill/bakeware is at 75° to the horizontal plane, that is, ∠LON=75°.

(7) It should be noted that, in order to avoid overexposure of the steak-type surface in the collected image, preferably, the color of the bakeware is white, or when the grill is used for baking, white silicone oil paper is laid under the steak type.

(8) The image analysis processing system is formed by an image analysis processor and a signal modifier. The image information collected by the image acquisition system is segmented, counted and analyzed through the image analysis processor, then the analysis result is converted into an electrical signal through the signal modifier to output to the intelligent menu control system.

(9) As shown in FIG. 2, the analysis steps of the image analysis processor are as follows:

(10) 1. Image Segmentation

(11) The analysis processor first converts the collected steak-type images from RGB color format to YIQ format, where the Y value represents a brightness, the I and the Q are hue values, which represent from orange to green and from purple to yellow-green respectively; then performs calculations on YIQ values of a pixel in an image using the following formula to obtain the classification value P.sub.mn;
P.sub.mn=a×Y.sub.mn+b×I.sub.mn+c×Q.sub.mn+A

(12) in the formula, P.sub.mn is the classification value of the pixel, and a, b, c and A are optimized variable parameters such that 0≤P.sub.mn≤1; a, b, c and A are obtained by experimenters according to tests of different steak types. After substituting the YIQ values of the pixel into the above formula for calculation, the classification value P.sub.mn of each pixel in the image can be obtained. The threshold of each pixel P.sub.mn is limited, and the image is divided into the following areas: the side area S.sub.2 (0≤P.sub.mn<K.sub.1) of the steak, the upper surface area S.sub.1 (K.sub.1≤P.sub.mn<K.sub.2) of the steak, and the background S.sub.3 (K.sub.2≤P.sub.mn≤1). According to the brightness value Y, the upper surface area S.sub.1 is subdivided into a lean meat area S.sub.1-1 (Y≤K.sub.3) and a fatty meat area S.sub.1-2 (Y>K.sub.3). Among them, K.sub.1, K.sub.2 and K.sub.3 are optimized classification limit values.
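As a concrete illustration of the segmentation step above, the RGB-to-YIQ conversion and the threshold classification can be sketched as follows. This is a minimal sketch only: the standard NTSC conversion coefficients are used for the RGB-to-YIQ step, while the parameter values a, b, c, A and the limits K.sub.1, K.sub.2, K.sub.3 shown are hypothetical placeholders, since the patent obtains them experimentally for each steak type.

```python
def rgb_to_yiq(r, g, b):
    """Standard NTSC RGB -> YIQ conversion (r, g, b in [0, 1])."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def classify_pixel(rgb, a=0.5, b=0.3, c=0.2, A=0.1,
                   K1=0.3, K2=0.7, K3=0.5):
    """Classify one pixel into S2, S1-1, S1-2 or S3 via P_mn.

    All numeric defaults are hypothetical; the patent determines
    them experimentally per steak type.
    """
    y, i, q = rgb_to_yiq(*rgb)
    p = a * y + b * i + c * q + A
    p = min(max(p, 0.0), 1.0)          # enforce 0 <= P_mn <= 1
    if p < K1:
        return "S2"                    # side area of the steak
    if p < K2:
        # upper surface: subdivide by brightness Y into lean/fat
        return "S1-1" if y <= K3 else "S1-2"
    return "S3"                        # background
```

In practice this classification would be applied to every pixel (m, n) of the captured image to produce the segmentation masks used in the later analysis steps.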

(13) 2. Analysis and Processing

(14) The illustrative diagram of steak-type image correction is shown in FIG. 3, where the cuboid ABCD-A′B′C′D′ represents steak-type food, point O is the center of the field of view. The image is analyzed and processed by the image analysis processor, mainly to obtain the following information and parameters:

(15) a. Type

(16) By limiting the threshold value of the classification value P.sub.mn of the pixel points in the surface lean meat area S.sub.1-1, it is possible to effectively distinguish the main steak types such as chicken steak, pork chop, and beef steak.

(17) b. Thickness

(18) First, the lower bottom edge of the side area S.sub.2 of the steak type is identified, and a function correction is performed on the height value of the side surface through the lower bottom edge to obtain the side height of the steak type; the average of the side height is taken as the thickness value h.

(19) c. Size/Weight

(20) The grid division method of the bakeware/grill plane is used to correct the area S.sub.1 of the upper surface of the steak type, and the upper surface area S of the steak type is obtained by measuring the number of grids occupied by the S.sub.1 area. Then, according to the upper surface area S and the thickness value h obtained by the above method, the effective volume of the steak type may be estimated according to the formula V=S×h. The effective volume V may correspond to the size or weight of the steak type.
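The grid-counting size estimate above reduces to a short computation. The sketch below is illustrative only; the one-square-centimetre grid cell in the example is a hypothetical choice, since the patent specifies only the relation V = S × h.

```python
def estimate_volume(occupied_grids, grid_area_cm2, thickness_h_cm):
    """Estimate the effective volume of the steak type.

    occupied_grids: number of grid cells covered by the corrected
                    upper surface area S1 on the bakeware/grill plane
    grid_area_cm2:  area of one grid cell (hypothetical unit choice)
    thickness_h_cm: average thickness h from the side-area correction
    """
    S = occupied_grids * grid_area_cm2   # corrected upper surface area S
    return S * thickness_h_cm            # V = S x h
```

For example, a steak covering 60 one-square-centimetre grid cells with h = 1.5 cm gives an effective volume of 90 cm³, which the system would map to a size or weight class.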

(21) d. Fattiness

(22) After the field-of-view correction and the identification of the lean meat area S.sub.1-1 and the fatty meat area S.sub.1-2 on the surface of the steak type, the lean/fat meat ratio Z=S.sub.1-1/S.sub.1-2 is calculated, which represents the degree of fattiness of the steak.
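The ratio Z can be computed directly from the corrected pixel counts of the two areas. The guard for an all-lean cut below is an added assumption; the patent does not state how cuts with no fatty area (such as the pork chop and chicken breast in Table 1) are handled.

```python
def fattiness_ratio(s1_1_pixels, s1_2_pixels):
    """Z = S1-1 / S1-2: lean-to-fat area ratio of the upper surface.

    Inputs are field-of-view-corrected pixel counts of the lean area
    (S1-1) and the fatty area (S1-2).
    """
    if s1_2_pixels == 0:
        return float("inf")   # assumed convention for an all-lean cut
    return s1_1_pixels / s1_2_pixels
```

A sirloin with a fat lean ratio of about 1:9, for instance, would yield Z ≈ 9.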

(23) As shown in FIG. 1, the oven is equipped with a temperature measurement and monitoring system. The temperature measurement and monitoring system mainly comprises an infrared temperature sensor T.sub.1 and a thermocouple temperature sensor T.sub.2. Among them, the infrared temperature sensor T.sub.1 is used to measure the initial temperature T.sub.0 of the surface of the steak type; a signal is transmitted to the intelligent menu control system, and the temperature T of the steak type during the baking process is monitored, with an overload temperature value T.sub.x=110° C. set to prevent burning. The thermocouple temperature sensor T.sub.2 is used to monitor the temperature T.sub.s inside the oven cavity, and its reading is fed back to adjust the output power of the oven electric heating tube to maintain the set baking temperature.
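One monitoring cycle of the logic above can be sketched as follows. The simple on/off heater decision is an assumed stand-in for the feedback control; the patent specifies only the overload cutoff T.sub.x and thermocouple feedback to hold the set baking temperature.

```python
def control_step(surface_temp_c, cavity_temp_c, set_temp_c,
                 overload_c=110.0):
    """One monitoring cycle: returns (heater_on, stop_cooking).

    surface_temp_c: steak surface temperature T from the IR sensor T1
    cavity_temp_c:  oven cavity temperature Ts from the thermocouple T2
    set_temp_c:     baking temperature from the calibrated cooking curve
    overload_c:     overload value Tx (110 deg C in this embodiment)
    """
    stop_cooking = surface_temp_c >= overload_c          # T >= Tx: prevent burning
    heater_on = (not stop_cooking) and cavity_temp_c < set_temp_c
    return heater_on, stop_cooking
```

A real controller would call this on every sensor sample and latch the stop condition; that loop structure is likewise an assumption.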

(24) The intelligent menu control system collects the signals from the image analysis and processing system and the temperature measurement and monitoring system. The type of the steak obtained through analysis is matched with a preset standard cooking curve, and the standard cooking curve is then calibrated with the thickness, the size/weight, the fattiness and the initial temperature to obtain a calibrated cooking curve; at the same time, a cooking control program is obtained according to a rawness requirement input by a user; finally, the control program is output to a control terminal and executed to realize intelligent cooking.
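The matching-and-calibration flow above might be organised as in the sketch below. The curve table, the linear thickness scaling and the rawness time offsets are all hypothetical illustrations: the patent discloses that the standard curve is calibrated with thickness, size/weight, fattiness and initial temperature, but not the calibration rule itself.

```python
def calibrated_program(steak_type, thickness_cm, rawness,
                       ref_thickness_cm=1.5):
    """Match a preset curve and calibrate it (hypothetical rules)."""
    # hypothetical preset standard curves: type -> (temp deg C, base time s)
    standard_curves = {"sirloin": (220, 270)}
    # hypothetical extra time per rawness requirement
    rawness_offset_s = {"medium rare": 0, "medium": 60,
                        "medium well": 120, "well done": 180}
    temp_c, base_s = standard_curves[steak_type]
    # assumed linear scaling of cooking time with thickness
    time_s = base_s * (thickness_cm / ref_thickness_cm) \
        + rawness_offset_s[rawness]
    return temp_c, round(time_s)
```

A full implementation would also fold in the size/weight, fattiness ratio Z and initial temperature T.sub.0 as further calibration inputs, and emit the result as a control program for the terminal.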

(25) Table 1 shows the intelligent cooking effect of cooking some steak types using this embodiment. Table 1 lists steak types of different type, thickness, weight, fattiness and initial temperature, together with the cooking effects achieved under the different rawness degrees set by users. The cooking results in Table 1 show that the intelligent identification cooking system of an oven of the present invention may effectively perform intelligent identification of the basic characteristics of the steak type, match the cooking curve, automatically obtain the cooking conditions according to the rawness set by the users, and finally achieve intelligent cooking with satisfactory cooking results.

(26) TABLE 1. Some cooking effects of the intelligent identification cooking system of an oven

| Type | Initial condition parameters | Rawness setting | Intelligent cooking conditions | Cooking effect |
|---|---|---|---|---|
| Sirloin | thickness 1.5 cm, weight 100 g, fat lean ratio about 1:9, initial temperature 20° C. | Medium rare | 220° C., 4 min 30 s | The center area of the cut section is red, the middle area is pink, the outer area is grey, and the center temperature is 57° C. |
| Sirloin | thickness 1.5 cm, weight 100 g, fat lean ratio about 1:9, initial temperature 20° C. | Medium | 220° C., 5 min 30 s | The center area of the cut section is pink, the outer layer is grey, and the center temperature is 63° C. |
| Sirloin | thickness 1.5 cm, weight 100 g, fat lean ratio about 1:9, initial temperature 20° C. | Medium well | 220° C., 6 min 30 s | There is still a small amount of pink in the center of the cut section, and the center temperature is 70° C. |
| Sirloin | thickness 1.5 cm, weight 100 g, fat lean ratio about 1:9, initial temperature 20° C. | Well done | 220° C., 7 min 30 s | The cut section is completely grey, and the center temperature is 77° C. |
| Sirloin | thickness 1.5 cm, weight 100 g, fat lean ratio about 1:9, initial temperature −12° C. | Medium well | 220° C., 8 min 30 s | There is still a small amount of pink in the center of the cut section, and the center temperature is 70° C. |
| Sirloin | thickness 1.0 cm, weight 60 g, fat lean ratio about 1:9, initial temperature 20° C. | Medium well | 220° C., 4 min 30 s | There is still a small amount of pink in the center of the cut section, and the center temperature is 70° C. |
| Sirloin | thickness 2.0 cm, weight 150 g, fat lean ratio about 1:9, initial temperature 20° C. | Medium well | 220° C., 10 min 30 s | There is still a small amount of pink in the center of the cut section, and the center temperature is 70° C. |
| Sirloin | thickness 2.0 cm, weight 100 g, fat lean ratio about 1:9, initial temperature −12° C. | Medium well | 220° C., 10 min 30 s | There is still a small amount of pink in the center of the cut section, and the center temperature is 70° C. |
| Beef short ribs | thickness 1.5 cm, weight 100 g, fat lean ratio about 3:7, initial temperature 20° C. | Medium well | 220° C., 6 min 30 s | There is still a small amount of pink in the center of the cut section, and the center temperature is 70° C. |
| Beef short ribs | thickness 2.0 cm, weight 150 g, fat lean ratio about 3:7, initial temperature −12° C. | Well done | 220° C., 12 min 30 s | The cut section is completely grey, and the center temperature is 77° C. |
| Pork chop | thickness 1.5 cm, weight 100 g, all lean, initial temperature 20° C. | Well done | 250° C., 7 min | The cut section completely turns grey white, and the center temperature is 77° C. |
| Chicken breast steak | thickness 1.5 cm, weight 100 g, all lean, initial temperature 20° C. | Well done | 240° C., min | The cut section becomes completely white, and the center temperature is 77° C. |

Second Embodiment

(27) As shown in FIG. 1, an intelligent identification cooking system of an oven mainly comprises an image acquisition system, an image analysis and processing system, and a temperature measurement and monitoring system; the image acquisition system is connected to the image analysis and processing system; an intelligent menu control system is connected to the image analysis and processing system and the temperature measurement and monitoring system respectively; in the figure, F represents a camera, L represents a light source, T.sub.1 represents an infrared temperature sensor, T.sub.2 represents a thermocouple temperature sensor, AA′B′B represents a steak-type food, MNQP represents the bakeware/grill, O point represents a center point of the bakeware/grill.

(28) Among them, the camera F is located on the oven door, facing the center point O of the bakeware/grill, and is at 30° to the horizontal plane, that is ∠FOM=30°. The light source L is a white lamp tube, located above the oven cavity, on the same plane as the camera F and a center line of the grill/bakeware MNQP, and the line between the light source L and the center O of the grill/bakeware is at 75° to the horizontal plane, that is, ∠LON=75°.

(29) In order to avoid overexposure of the steak-type surface in the collected image, preferably, the color of the bakeware is white, or when the grill is used for baking, white silicone oil paper is laid under the steak type.

(30) The image analysis processing system is mainly formed by an image analysis processor and a signal modifier. The image information collected by the image acquisition system is segmented, counted and analyzed through the image analysis processor, and the analysis result is then converted into an electrical signal through the signal modifier and output to a control panel to match a preset menu to realize smart cooking. The analysis steps and methods adopted by the image analysis processor are shown in FIG. 2 and are the same as those in the first embodiment.

(31) The oven is equipped with a temperature measurement and monitoring system. The temperature measurement and monitoring system mainly comprises an infrared temperature sensor T.sub.1 and a thermocouple temperature sensor T.sub.2. Among them, the infrared temperature sensor T.sub.1 is used to measure the initial temperature T.sub.0 of the surface of the steak type; a signal is transmitted to the intelligent menu control system, and the temperature T of the steak type during the baking process is monitored, with an overload temperature value T.sub.x=115° C. set to prevent burning. The thermocouple temperature sensor T.sub.2 is used to monitor the temperature T.sub.s inside the oven cavity, and its reading is fed back to adjust the output power of the oven electric heating tube to maintain the set baking temperature.

(32) The intelligent menu control system collects the signals from the image analysis and processing system and the temperature measurement and monitoring system. The type of the steak obtained through analysis is matched with a preset cooking curve, and the cooking curve is then calibrated with the thickness, the size/weight, the fattiness and the initial temperature to obtain a calibrated cooking curve; at the same time, a cooking control program is obtained according to a rawness requirement input by a user; finally, the control program is output to a control terminal and executed to realize intelligent cooking.

(33) Cooking steak types using this embodiment can achieve intelligent cooking effects that meet the expectations of users.