A METHOD AND A SYSTEM FOR DETERMINING A WEIGHT ESTIMATE OF A FOOD ITEM
20240037771 ยท 2024-02-01
CPC classification
A22C25/04
HUMAN NECESSITIES
Abstract
A method and a system for determining a weight estimate of food items include receiving an image of an outer surface of a food item. The image has a plurality of pixels, each having a measure. From the measure of the pixels, a surface content of a first tissue which is distinct from a second tissue is identified. The surface content is translated into a volume content, and a density parameter is recorded. An estimate of the weight is determined based on the volume content, the density parameter, and volume data identifying the volume of the food item.
Claims
1.-21. (canceled)
22. A method of determining a weight estimate of food items, the method comprising the steps of: a) receiving an image of an outer surface of a food item, the image comprising a plurality of pixels each having a measure; b) identifying by the measure of the pixels, a surface content of a first tissue which is distinct from a second tissue; c) translating the surface content into a volume content; d) recording a density parameter of the food item, the density parameter comprising at least a first density contribution related to the first tissue and a second density contribution related to the second tissue; e) receiving food item volume data representing a volume of the food item or a section thereof; and f) determining based on the volume content, the density parameter, and the food item volume data, a weight estimate of the food item or a section thereof.
23. The method according to claim 22, wherein the translation of the surface content into the volume content is carried out by linking each pixel or a group of pixels to a height of the food item at the pixel or group of pixels and converting, based on the height, the identified surface content into a volume content contribution.
24. The method according to claim 23, wherein the volume content is based on an accumulation of the volume content contribution from each pixel or group of pixels.
25. The method according to claim 23, comprising extracting from a 3D profile of the food item, the height of the food item at each pixel or group of pixels in the image.
26. The method according to claim 22, comprising extracting from a 3D profile of the food item, the food item volume data.
27. The method according to claim 22, wherein step c) of translating the surface content into a volume content is carried out by use of a transfer function depending on empirical data.
28. The method according to claim 22, comprising: providing a first transfer function and a second transfer function, both configured for translating the surface content into a volume content; and selecting between using the first transfer function and the second transfer function for carrying out step c) depending on empirical data.
29. The method according to claim 22, further comprising determining an actual weight by weighing the food item and amending at least one of the translation in step c) and the density parameter recorded in step d) depending on a deviation between the weight estimate found in step f) and the actual weight found by weighing the food item.
30. The method according to claim 22, further comprising a step of sectionalizing the food item into a plurality of sections, and a step of determining a sectional weight estimate of the food item sections based on the volume content, the density parameter, and the food item volume data.
31. The method according to claim 30, further comprising cutting the food items into food portions in accordance with the sectionalizing.
32. The method according to claim 22, wherein said food item is a fillet of a Salmonidae, such as Salmo salar.
33. A system for processing a food item, the system comprising: at least one conveyor configured to move the food item in a flow of food items from a start position to at least one processing position; an image capturing device configured to provide an image of the food item, the image comprising a plurality of pixels each having a measure; a processing structure configured to: identify by the measure of the pixels, a surface content of a first tissue which is distinct from a second tissue; translate the surface content into a volume content; record a density parameter of the food item, the density parameter comprising at least a first density contribution related to the first tissue and a second density contribution related to the second tissue; receive food item volume data representing a volume of the food item; and determine based on the volume content, the density parameter, and the food item volume data, a weight estimate of the food item.
34. The system according to claim 33, comprising a 3D image providing device configured to provide a 3D profile of the food item, and wherein the processing structure is configured to extract from the 3D profile, the food item volume data and/or a height of the food item.
35. The system according to claim 34, wherein the 3D image providing device is constituted by the image capturing device.
36. The system according to claim 33, forming part of a batching or grading device comprising multiple receiving bins and a controller configured for assigning food items to the bins based on a batching or grading criteria comprising the weight of the food item, wherein the controller is configured to use the weight estimate determined based on the volume content, the density parameter, and the food item volume data for the assigning of the food items into the bins.
37. The system according to claim 36, wherein the controller is configured to receive from a scale, a compiled weight of a plurality of food items contained in one of the bins, and to compare the compiled weight with a summation of the estimated weights of each of the items in the bin.
38. The system according to claim 37, wherein the controller is configured to receive from a scale, a plurality of compiled weights of food items contained in a plurality of the bins, and to compare each compiled weight with a summation of the estimated weights of each of the items in the corresponding bin.
39. The system according to claim 37, where the translation and/or the density parameter is amended based on a deviation between the compiled weight and the summation of the estimated weights.
40. The system according to claim 33, forming part of a food portioning system, wherein the processing structure is configured to sectionalize the food item into a plurality of sections, and to determine a sectional weight estimate of the food item sections based on the volume content, the density parameter, and the food item volume data.
41. The system according to claim 40, further comprising a cutting structure configured to cut the food items into food portions based on the sectionalizing.
42. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to claim 22.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0094] In the following, embodiments of the disclosure will be described in further detail with reference to the drawings, in which:
DETAILED DESCRIPTION OF THE DRAWINGS
[0101] The detailed description and specific examples, while indicating embodiments of the disclosure, are given by way of illustration only, since various changes and modifications within the spirit and scope of this disclosure will become apparent to those skilled in the art from this detailed description.
[0103] In
[0104] The measure, i.e. in this case the grey scale, defines what is considered as fat and what is considered as meat, and the two different tissues can be distinguished from each other by introducing a threshold grey value. Pixels above the threshold represent meat, and pixels below the threshold represent fat.
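The grayscale thresholding described above may be sketched as follows; the threshold value of 128 is an illustrative assumption, not a value from the disclosure:

```cpp
#include <cstdint>

// Illustrative labels for the two distinct tissues.
enum class Tissue { Meat, Fat };

// Classify one pixel by its grayscale measure: pixels above the
// threshold represent meat, pixels at or below represent fat.
inline Tissue classifyPixel(std::uint8_t grey, std::uint8_t threshold = 128) {
    return grey > threshold ? Tissue::Meat : Tissue::Fat;
}
```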
[0105] In this case the first tissue is represented by meat 30, and the second tissue is represented by fat 40.
[0106] In the illustrated embodiment, larger white areas are observed in the fish fillets 2 illustrating that the main part hereof is meat 30.
[0107] When comparing the two food items 1, it can be seen that the areas of the second tissue 40, i.e. fat, of the two fish fillets 2 are different. The fish fillet 2 at the right side has broader stripes of fat 40 than the fish fillet 2 to the left. This indicates a higher amount of the second tissue 40, i.e. of fat, in the fish fillet at the right side, and thereby a lower density for the fish fillet at the right side than for the fish fillet 2 at the left side.
[0108] As the fish fillet 2 to the right has a larger area of fat 40 than the fish fillet 2 to the left, the image for each fish fillet can be used to identify, by the measure of the pixels, a surface content of a first tissue which is distinct from a second tissue. The surface content in this example is a specific separation of pixels relating to fat from other pixels relating to meat. It could also have been an absolute surface area of the fat in the image, or it could have been a ratio of the area of the fat relative to an area of meat in the image, etc. This is referred to herein as the image ratio.
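The image ratio mentioned above can, as a minimal sketch, be computed from a precomputed pixel labelling; the labelling itself (e.g. by grayscale thresholding) is assumed to have been done beforehand, and all pixels are assumed to have equal area:

```cpp
#include <cstddef>
#include <vector>

// Compute the image ratio: the area of fat pixels relative to the
// area of meat pixels. Each entry of isFat marks whether the
// corresponding pixel was identified as fat.
inline double imageRatio(const std::vector<bool>& isFat) {
    std::size_t fat = 0, meat = 0;
    for (bool f : isFat) (f ? fat : meat) += 1;
    return meat == 0 ? 0.0 : static_cast<double>(fat) / static_cast<double>(meat);
}
```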
[0109] As the amount of the first tissue 30 and/or the amount of the second tissue 40 may vary within the food item 1, the image ratio is not necessarily equal to the ratio between the first tissue 30 and the second tissue 40 in the food item as such.
[0112] The illustrated facility comprises a conveyor 203, such as a conveyor belt which transports food items 202 from the intake 204 to the outlet 205.
[0113] At the weight estimation position 206, an image capturing device 207 is located such that it can provide images of the food items. In the illustrated embodiment, the image capturing device is located above the conveyor 203 and lamps 208 may be arranged to provide enough light for recognizing even fine details in the food items. The image capturing device 207 is based on visual light reflection, but in alternative embodiments, the image capturing device could be X-ray based or ultrasound based etc.
[0114] The system comprises a 3D image providing device 209 located at the weight estimation position and configured to provide a 3D image. The 3D image is used as a basis for determining food volume data representing a volume of the food item 202.
[0115] The 3D image providing device 209 may replace the image capturing device 207 if it can capture a pixel image with a measure for each pixel.
[0116] A weighing station 210 is arranged after the weight estimation position, where it is possible to weigh some of the food items. The actual weight can subsequently be used to calibrate the method of estimating the weight of the food items. A lamp 211 is positioned at the weighing station 210 for additional manual, visual scanning of the food item.
[0117] Between the intake 204 and the outlet 205, the food items are handled at a handling station. In this case, the handling station comprises two processing tables 212, 213. Alternatively, such handling stations may comprise automatic processing equipment.
[0118] An operator 214, 215 may be assigned to each processing table.
[0119] The computer 216 comprises a data input configured to receive from the image capturing device 207, the image of the food item, which image includes a plurality of pixels, e.g. arranged in a matrix of pixels, and each having a measure in the form of a grayscale value.
[0120] Distinct tissues of the food item are visible in the image.
[0121] The computer is further configured to receive from the 3D image providing device 209, the food item volume data representing a volume of the food item, e.g. via a Local area network, LAN 217. A CPU with corresponding software code is configured to form a processing structure and configured to determine from the image, a surface content of the first tissue distinct from the second tissue. This surface content could be expressed in said image ratio between a first tissue and a second tissue in the image.
[0122] The computer has a library of different transfer functions each configured to translate the surface content into a volume content. Each transfer function matches a specific situation, e.g. a specific type of food item, a season of the year, or other variables. The most promising transfer function for a specifically encountered situation is selected in a user interface, or automatically recognised by the computer. The computer may automatically shift between different transfer functions depending on the date and thus the time of the year, and the operator may select a new range of transfer functions when shifting from one species of fish to another species of fish.
[0123] A transfer function may, as an example, be:
F(x)=k*x
where x is the surface content of a tissue (e.g. fat), F(x) is the volume content of that tissue, and k is a constant.
[0124] In another example, the transfer function may be:
F(x)=(i1*l1)+k*x
where i1 is an empirical value representing an impact of a lengthwise dimension of the food item, l1 is the lengthwise dimension, e.g. determined from the image, x is the surface content of a tissue (e.g. fat), F(x) is the volume content of that tissue, and k is a constant.
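The two example transfer functions above can be sketched directly; the constants k and i1 are empirical values that would be fitted to measured data, so any concrete numbers passed to these functions are placeholders:

```cpp
// F(x) = k*x: volume content as a linear function of surface content x.
inline double transferSimple(double x, double k) {
    return k * x;
}

// F(x) = (i1*l1) + k*x: as above, with an additive term where i1 is an
// empirical weight and l1 the lengthwise dimension of the food item.
inline double transferWithLength(double x, double k, double i1, double l1) {
    return i1 * l1 + k * x;
}
```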
[0125] In an alternative implementation, the computer is programmed to apply a volume parameter to each pixel or region individually. In this implementation, the computer program determines a thickness related to the pixel or group of pixels. This thickness can be determined e.g. from a 3D image or other scanners known in the art. Subsequently, the computer determines a volume of the tissue which is represented by the pixel or group of pixels. In this case, the transfer function may e.g. have the form:
F=Σi(ai*hi)
where ai is the area of pixel number i, or the area of a group, i, of pixels, and hi is the height of the food item at a pixel identified as pixel number i, or the height of the food item at the group, i, of pixels. F is the volume content of that tissue, the sum running over the pixels or pixel groups identified as that tissue.
[0126] In another implementation, the computer is programmed with the following transfer functions:
F1=Σp g(p)*f1(p)
F2=Σp g(p)*f2(p)
[0127] where p runs over all pixels in the image, the total number of pixels being n. The function F1 returns the total volume of tissue 1 (e.g. fat), and F2 returns the total volume of tissue 2 (e.g. meat). The function g returns the volume for this pixel group (calculated as surface area*height). The function f1 returns a number in the interval [0,1] representing the amount of tissue 1 (e.g. fat) for this pixel, and the function f2 returns the amount of tissue 2 (e.g. meat) for this pixel.
[0128] The functions f1 and f2 will take the surface tissue in this pixel group into consideration, and f1 could return 1.0 if the entire pixel group is tissue 1 at the surface, but it could also know, based on product age/species/etc. or location on the food item, that e.g. only 0.7 is fat.
[0129] The computer can record a density parameter of the food item, the density parameter comprising at least a first density contribution related to the first tissue and a second density contribution related to the second tissue. The density parameter can be received by LAN from another computer system, or it could be manually entered by an operator of the system.
[0130] Based on the volume content, the density parameter, and the food item volume data, the computer can estimate a weight of the food item.
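As a minimal sketch, the weight estimate combines the two volume contents with the two density contributions, following the relation used in the calibration equations later in the description (Vf*Df + Vm*Dm = W); the values used in any test are placeholders, not densities from the disclosure:

```cpp
// Weight estimate: each tissue volume multiplied by its density
// contribution, summed, i.e. W = Vf*Df + Vm*Dm.
inline double estimateWeight(double volumeFat, double volumeMeat,
                             double densityFat, double densityMeat) {
    return volumeFat * densityFat + volumeMeat * densityMeat;
}
```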
[0131] An example of a C++ object for executing the method is provided below. In this example, the first tissue is fat and the second tissue is meat.
TABLE-US-00001
#include <cstddef>
#include <vector>
enum TissueType { FAT, MEAT };
struct Pixel {
  double height;
  double area;
  TissueType tissue_type;
  std::vector<double> location;
};
double calcFatShare(Pixel p) { return 0.8; }
double calcMeatShare(Pixel p) { return 0.98; }
std::vector<double> calcTissueVolumes(const std::vector<Pixel>& pixels) {
  double volume_fat = 0;
  double volume_meat = 0;
  for (std::size_t i = 0; i < pixels.size(); ++i) {
    Pixel p = pixels[i];
    double pixel_volume = p.height * p.area;
    if (p.tissue_type == FAT) {
      double k1 = calcFatShare(p);
      volume_fat = volume_fat + pixel_volume * k1;
      volume_meat = volume_meat + pixel_volume * (1 - k1);
    } else if (p.tissue_type == MEAT) {
      double k2 = calcMeatShare(p);
      volume_meat = volume_meat + pixel_volume * k2;
      volume_fat = volume_fat + pixel_volume * (1 - k2);
    }
  }
  return {volume_fat, volume_meat};
}
[0132] The functions calcFatShare and calcMeatShare could take p.height or p.location into consideration. They could also use the age of the food item. In a more advanced formula, neighbouring pixels could also be taken into consideration; e.g., if all surrounding pixels within a 4 mm radius are fat, calcFatShare() could return 1.0, but if only 20% of the pixels in that radius are fat, it could return 0.15.
[0133] The computer 216 may generate a data file 218, symbolized by the product card 218. The data file may contain the estimated weight of the food item 202 and/or non-cut sections hereof and other kinds of data related to the food item or to the handling of the food item.
[0134] Each processing table 212, 213 may have computer interfaces, e.g. in the form of touch screens 219, 220, enabling the operator to identify the food item and to generate information related to the food item. Such information may specify an ID of the operator, an ID of the table, or visually observed issues related to the food item, e.g. quality parameters observed by the operator. The information can be transmitted by the LAN 217 to the computer 216 or elsewhere, e.g. to the supervisor computer system 221 where a supervisor 222 may review the data. The data could be included in the data file 218, e.g. together with the estimated weight. Data, e.g. as included in the data file, could also be exported, e.g. to adjacent handling stations, or to follow the food item or pieces thereof all the way to the consumer.
[0135] The outlet 205 could be arranged to deliver the food items to adjacent handling stations, e.g. for further processing, packing, inspection, or rejection etc.
[0136] The illustrated system comprises a weighing station 210 for controlling the estimated weight of the food items. The system may form part of a grader capable of grouping a number of food items depending on a characteristic, e.g. in order to obtain batches of a specific weight or number of food items. A weighing station could generally be placed anywhere downstream relative to the image capturing device 207. Particularly, the weighing station may be arranged to weigh the batches, i.e. an accumulation of a number of food items for which the weight is estimated individually. In this case the weight could be stored relative to a number of previously prepared batches, e.g. a weight of the last 1000 batches of food items being prepared in the system. The number of previously prepared batches may define a moving window as explained below.
[0137] The system may also store the volume of one of the tissues, e.g. the content of fat, and/or the content of meat for each fish in the box.
[0138] Every time a weight of a new batch is saved, the computer may be configured to check, by use of the scale, whether the weight is reasonable, e.g. within a few percent of the expected weight, i.e. the weight constituted by the sum of the estimated weights of the food items in that batch.
[0139] If the difference between the estimated weight and the measured weight is within certain tolerances, the oldest box is deleted from the memory.
[0140] The number of batches is divided into two groups (these could be the even and odd batch numbers prepared by the system).
[0141] For the two groups two equations with two unknowns are found:
V1f*Df+V1m*Dm=W1
V2f*Df+V2m*Dm=W2
[0142] Wherein V1f is the volume of fat of food item no. 1, V1m is the volume of meat of food item no. 1, Df is the density of fat, and Dm is the density of meat. W1 is the weight of food item no. 1. The index 1 or 2 indicates that the equation relates to food item no. 1 or no. 2, respectively.
[0143] The equations are solved, and new values for the fat density and the meat density are found.
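The two equations with two unknowns can be solved in closed form, e.g. by Cramer's rule; this is a sketch, with the guard against a singular system added here as an assumption:

```cpp
// Solve
//   V1f*Df + V1m*Dm = W1
//   V2f*Df + V2m*Dm = W2
// for the fat density Df and the meat density Dm by Cramer's rule.
// Returns false (leaving the outputs untouched) if the system is singular.
inline bool solveDensities(double v1f, double v1m, double w1,
                           double v2f, double v2m, double w2,
                           double& df, double& dm) {
    double det = v1f * v2m - v1m * v2f;
    if (det == 0.0) return false;
    df = (w1 * v2m - v1m * w2) / det;
    dm = (v1f * w2 - w1 * v2f) / det;
    return true;
}
```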
[0144] The new density could be used directly, or it could be applied to the existing value with a learning rate, e.g. L=0.1:
Dupdated=Dold*(1-L)+Dnew*L
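The learning-rate update above can be sketched as:

```cpp
// Blend the old and new density estimates with learning rate L:
// D_updated = D_old*(1-L) + D_new*L.
inline double updateDensity(double dOld, double dNew, double learningRate) {
    return dOld * (1.0 - learningRate) + dNew * learningRate;
}
```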
[0145] The above example could use a moving window with a larger number of batches, e.g. 1000 batches. Alternatively, a lower learning rate, e.g. 0.001, could be used with a smaller moving window, e.g. using only the last two batches instead of a large number of batches such as 1000 batches.
[0146] Alternatively, an algorithm could be defined in which more outliers are filtered out. E.g. use the last 2*10 batches; the 2 batches with the most positive deviation and the 2 batches with the most negative deviation are not used. The remaining 16 batches are divided into two groups, and the two equations with two unknowns are created and solved.
[0149] Subsequently, the computer may accumulate the volume content contributions from each pixel or group of pixels and thereby define the volume content. The accumulation may e.g. be carried out by the below illustrated function:
Vc=Σi(vcci)
where vcci is the volume content contribution from pixel no. i or from group i of pixels, and Vc is the volume content.
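A minimal sketch of this accumulation:

```cpp
#include <numeric>
#include <vector>

// Accumulate the volume content contributions vcc_i from each pixel
// or group of pixels into the total volume content Vc.
inline double accumulateVolumeContent(const std::vector<double>& vcc) {
    return std::accumulate(vcc.begin(), vcc.end(), 0.0);
}
```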