A METHOD OF TRACKING A FOOD ITEM IN A PROCESSING FACILITY, AND A SYSTEM FOR PROCESSING FOOD ITEMS
20240000088 · 2024-01-04
Inventors
CPC classification
A22C17/0073
HUMAN NECESSITIES
G06T7/246
PHYSICS
A22C25/08
HUMAN NECESSITIES
A22C25/04
HUMAN NECESSITIES
A22C17/0093
HUMAN NECESSITIES
International classification
A22C25/04
HUMAN NECESSITIES
A22C25/08
HUMAN NECESSITIES
G06K19/06
PHYSICS
G06T7/246
PHYSICS
Abstract
A method and a system for automatically tracking a food item in a processing facility involve a food item that is received in a check-in position, where a check-in image is captured and a unique identification insignia (UII) is generated based on the check-in image. The food item is handled and moved to a check-out position where a check-out image is captured. To facilitate automatic tracking of the food items, a recognition process is carried out to establish whether the check-out image can be assigned to the created UII, and if so, a data-record is generated with the recognized UII.
Claims
1.-31. (canceled)
32. A method for tracking a food item in a flow of food items through a handling facility, the method comprising: receiving the food item in a check-in position; acquiring at least one check-in image of the food item; creating at least one unique identification insignia (UII) from the check-in image; moving the food item or pieces thereof in the flow of food items through a handling station to a check-out position; acquiring at the check-out position, at least one check-out image of the food item or a piece thereof; conducting a recognition process configured to establish if the check-out image can be assigned to the created UII; and if it could be established that the check-out image can be assigned to the created UII, defining a data-record including the assigned UII.
33. The method according to claim 32, further comprising storing the data-record in a database or printing the data-record on a label.
34. The method according to claim 32, further comprising: defining a first handling station for further handling of the food item or the piece thereof, defining a second handling station for further handling of the food item or the piece thereof, selecting one of the first and second handling stations if the recognition process establishes that the check-out image can be assigned to the created UII; and selecting another one of the first and second handling stations if the recognition process establishes that the check-out image cannot be assigned to the created UII.
35. The method according to claim 32, comprising creating a plurality of UIIs from the check-in image.
36. The method according to claim 35, wherein the plurality of UIIs comprises a UII relating to one part of the check-in image and other UIIs relating to other parts of the check-in image such that the plurality of UIIs relate to different parts of the food item.
37. The method according to claim 36, wherein the different parts are overlapping parts.
38. The method according to claim 35, further comprising grouping the UIIs in a set of UIIs when the UIIs are created from the same check-in image or from the same food item.
39. The method according to claim 32, comprising establishing a que of UII's of one or more food items in the flow of food items and deleting from the que, a UII if the recognition process establishes that a check-out image can be assigned to that UII.
40. The method according to claim 38, comprising deleting all UII's of a set of UIIs if the recognition process establishes that one or more check-out images can be assigned to a predetermined number of UIIs from that set of UIIs.
41. The method according to claim 39, further comprising assigning a check-in time to each UII establishing when the UII is created and deleting a UII from the que when a predetermined time has elapsed after the check-in time.
42. The method according to claim 32, comprising receiving meta data related to the food item and adding the meta data to the data-record.
39. The method according to claim 32, comprising establishing a queue of UIIs of one or more food items in the flow of food items and deleting from the queue a UII if the recognition process establishes that a check-out image can be assigned to that UII.
40. The method according to claim 38, comprising deleting all UIIs of a set of UIIs if the recognition process establishes that one or more check-out images can be assigned to a predetermined number of UIIs from that set of UIIs.
41. The method according to claim 39, further comprising assigning a check-in time to each UII establishing when the UII is created and deleting a UII from the queue when a predetermined time has elapsed after the check-in time.
46. The method according to claim 44, comprising grading the food item or the piece thereof depending on identification of the undesired food item part.
43. The method according to claim 42, wherein the meta data relates to at least one of: the handling station or a human operator of the handling station; a place where the food item is originally produced; and a weight, a size, a shape, or a quality of the food item.
48. The method according to claim 32, comprising defining based upon the check-in image, a portioning plan defining intended cutting of the food item into pieces thereof, and creating at least one UII for each of the intended pieces thereof.
49. The method according to claim 48, comprising cutting the food item into the intended pieces in a handling station.
50. The method according to claim 32, wherein the UII is generated based on a structural uniqueness of the food item or generated based on a size or shape of the food item or pieces thereof, or generated based on a color or color distribution of the food item, or combinations thereof.
51. The method according to claim 32, wherein the food item is selected from the group consisting of: vegetables, fruits, meat, poultry, fish, and seafood, or slices thereof.
52. The method according to claim 32, wherein the food item is salmon, such as a fillet of salmon.
53. The method according to claim 32, wherein each food item or the pieces thereof in the flow of food items are categorized in at least a first category and a second category, the categorization being carried out between the check-in position and the check-out position, and wherein a food item or pieces thereof being in the first category is directed through the check-out position and wherein a food item or pieces thereof being in the second category is not directed through the check-out position.
54. The method according to claim 53, wherein a food item or pieces thereof being in the second category is directed through an alternative check-out position.
55. The method according to claim 53, comprising establishing a queue of UIIs of food items for which a check-in image is acquired at the check-in position and deleting from the queue a UII for a food item or a piece thereof which is categorized in the second category.
56. The method according to claim 32, comprising defining a data-record for each food item or pieces thereof at the check-out position, and, if it could be established that the check-out image can be assigned to the created UII, assigning the UII to that food item or pieces thereof.
57. A system for tracking a food item in a flow of food items through a handling facility, the system comprising: a check-in camera arranged at a check-in position of the handling facility and configured to capture a check-in image of the food item; a processing structure configured to create unique identification insignias (UIIs) from the check-in image; and a check-out camera arranged at a check-out position of the handling facility and configured to capture a check-out image of the food item or a piece thereof; wherein the processing structure is configured to conduct a recognition process establishing if the check-out image can be assigned to the created UII; and configured, if it could be established that the check-out image can be assigned to the created UII, to define a data-record including the assigned UII.
58. The system according to claim 57, further comprising a process selector configured to select one of at least two further handling stations if the recognition process establishes that the check-out image can be assigned to the created UII; and configured to select another one of the at least two further handling stations if the recognition process establishes that the check-out image cannot be assigned to the created UII.
59. The system according to claim 57, wherein the processing structure is configured to establish a queue of UIIs of food items for which a check-in image is acquired at the check-in position, and to delete from the queue a UII if the recognition process establishes that a check-out image can be assigned to that UII.
60. The system according to claim 59, wherein the processing structure is further configured to assign a check-in time to each UII when created, and to delete a UII from the queue when a predetermined time has elapsed.
61. The system according to claim 60, comprising a sensor configured to determine a lead time for the food items from the check-in position to the check-out position, and wherein the processing structure is configured to dynamically update the predetermined time based on the lead time.
62. The system according to claim 58, comprising a conveyor system configured to convey the food item from the check-in position to the check-out position.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0103] In the following, embodiments of the disclosure will be described in further detail with reference to the drawings.
DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS
[0111] The detailed description and specific examples, while indicating embodiments, are given by way of illustration only, since various changes and modifications within the spirit and scope of this disclosure will become apparent to those skilled in the art from this detailed description.
[0114] The illustrated system has automatic tracking capability allowing each food item which enters the system to be recognized when it leaves the food processing system. The illustrated facility comprises a conveyor 203 which transports food items from the intake 204 to the outlet 205.
[0115] A check-in position 206 is defined along the conveyor. At this position, a check-in camera 207 is located such that it can capture check-in images of the food items. In the illustrated embodiment, the check-in camera is located above the conveyor, and lamps 208 may be arranged to provide enough light for recognizing even fine details in the food items. The camera 207 is based on visible light reflection, but in alternative embodiments, the camera could be X-ray based or ultrasound based, etc.
[0116] The check-out camera 209 is arranged at the check-out position 210 where it can capture check-out images of the food items or pieces thereof, e.g. with the assistance of lamps 211. Cameras 207 and 209 may be arranged before the corresponding lamps 208 and 211; however, the lamps may also be positioned before the cameras or at any other suitable locations.
[0117] Between the check-in position and the check-out position, the food items are handled at a handling station. In this case, the handling station comprises two processing tables 212, 213. Alternatively, such handling stations may comprise automatic processing equipment.
[0118] An operator 214, 215 may be assigned to each processing table. Each table and each operator have individual IDs, e.g. in the form of an identification number.
[0119] The computer 216 comprises a data input configured to receive the captured images from the check-in camera and from the check-out camera, e.g. via a local area network (LAN) 217. A CPU with corresponding software code is configured to form a processing structure, to create a unique identification insignia (UII) based on the check-in images, and to recognize that UII in the check-out images. The computer 216 may generate a data-record 218, symbolized by the product card 218. The data-record may contain the UII and other kinds of data, e.g. meta data related to the food item or to the handling of the food item, such as the ID of the operator.
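As a purely illustrative sketch, and not part of the disclosure, the data-record 218 could be represented as a small structure coupling the UII with a check-in time and a container for meta data; all names below (DataRecord, build_record, operator_id, table_id) are hypothetical:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Optional

    @dataclass
    class DataRecord:
        # One data-record 218 per food item or piece (illustrative field names).
        uii: str                                   # unique identification insignia
        check_in_time: datetime
        check_out_time: Optional[datetime] = None
        meta: dict = field(default_factory=dict)   # e.g. operator ID, table ID

    def build_record(uii, operator_id, table_id):
        # Attach the IDs of the operator and the processing table as meta data.
        return DataRecord(uii=uii, check_in_time=datetime.now(),
                          meta={"operator_id": operator_id, "table_id": table_id})

Such a record could then be stored in a database or serialized for label printing, as mentioned in claim 33.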
[0120] Each table may have a computer interface, e.g. in the form of a touch screen 219, 220, enabling the operator to identify the food item and to generate information related to the food item. Such information may specify the ID of the operator, the ID of the table, or visually observed issues related to the food item, e.g. quality parameters observed by the operator. The information can be transmitted over the LAN 217 to the computer 216 or elsewhere, e.g. to the supervisor computer system 221 where a supervisor 222 may review the data. The data could be included as meta data in the data-record 218. Data, e.g. as included in the data-record, could also be exported, e.g. to adjacent handling stations, or to follow the food item or pieces thereof all the way to the consumer.
[0121] The outlet could be arranged to deliver the food items or pieces thereof to adjacent handling stations, e.g. for further processing, packing, inspection, or rejection etc.
[0122] The system may comprise a diverter which can be controlled by the processing structure such that, depending on the recognition of the food item or the piece thereof at the check-out position, the food item is directed to one or the other of the subsequent processing stations. By means of an example, the processing structure may have a list of N specific food items with corresponding UIIs. If these N food items are to be packed for a specific customer, the food item diverter could be controlled by the processing structure to direct these specific food items to a packing station dedicated to the specific customer. If a specific food item cannot be recognized at the check-out position, the processing structure may use a preselected subsequent processing station, e.g. a processing station with a human operator who can decide an action for the non-recognized food item.
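A minimal sketch of such a routing decision is given below, assuming a hypothetical list customer_uiis of the N UIIs to be packed for the specific customer and placeholder station names; the disclosure does not prescribe this logic:

    def select_station(recognized_uii, customer_uiis,
                       customer_station="packing_customer_A",
                       fallback_station="manual_inspection"):
        # Recognized items on the customer's list go to the dedicated packing
        # station; non-recognized items go to a preselected station, e.g. one
        # staffed by a human operator who decides an action for the item.
        if recognized_uii is not None and recognized_uii in customer_uiis:
            return customer_station
        return fallback_station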
[0123] The UII could be generated based on a structural uniqueness of the food item. Examples of such structural uniqueness are the shape and size of fat, bone, muscle, or fiber patterns. It could also be generated based on a shape or size of the food item, or generated based on a color or color distribution of the food item.
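By way of example only, and under the assumption that the UII is derived from a colour distribution combined with coarse shape and size cues, a feature vector could be computed roughly as follows; the function name, the bin counts, and the background threshold are illustrative choices not taken from the disclosure:

    import numpy as np

    def check_in_features(image):
        # "image" is assumed to be an H x W x 3 uint8 check-in image. The UII is
        # here represented by a small vector combining an 8-bin colour histogram
        # per channel with a coarse per-quadrant shape/size cue; the bin count
        # and the background threshold of 30 are arbitrary illustrative values.
        hist = np.concatenate(
            [np.histogram(image[..., c], bins=8, range=(0, 255))[0] for c in range(3)]
        ).astype(float)
        hist /= max(hist.sum(), 1.0)
        gray = image.mean(axis=2)
        h, w = gray.shape
        quads = [gray[:h // 2, :w // 2], gray[:h // 2, w // 2:],
                 gray[h // 2:, :w // 2], gray[h // 2:, w // 2:]]
        shape_cue = np.array([(q > 30).mean() for q in quads])
        return np.concatenate([hist, shape_cue])

A real implementation would more likely rely on the fat, bone, muscle, or fiber patterns mentioned above, extracted with dedicated image-recognition techniques.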
[0124] In particular, the check-in image may be an image of a food item which has been prepared to obtain the best possible image of this unique feature.
[0126] The gray area indicated by the box 301 is typically cut away and rejected to be processed differently. The pieces of the salmon within the gray box 301 could be categorized as food item pieces not supposed to pass through the check-out position. Accordingly, it may be an object to remove the UII of such a piece of the food item from a queue of UIIs, thereby facilitating faster or more robust identification of the remaining UIIs in the check-out image.
[0127] The food item illustrated in the drawing comprises an undesired food item part, here a blood stain 402, which is typically cut away.
[0128] Since such cutting could take place between the check-in position and the check-out position, it may be desirable not to use an area around the blood stain for the purpose of generating the UII, or at least to generate a UII based on a principle which is robust against removal of parts of the food item and thus of a part of the image.
[0129] If such an undesired element of the food item is removed, it may impair the ability of the processing structure to recognize the UII in the check-out image. Accordingly, the processing structure is programmed to identify such an undesired food item part, to create a frame 403 around the blood stain 402, and to generate the UII from a part of the image which does not include the frame 403.
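A minimal sketch of excluding the framed region when generating the UII, assuming the frame 403 is given as pixel coordinates (x0, y0, x1, y1); the function name and the masking-by-zeroing approach are illustrative only:

    import numpy as np

    def exclude_region(image, frame):
        # "frame" is assumed to be pixel coordinates (x0, y0, x1, y1) of the
        # frame 403; the framed area is zeroed so it contributes nothing to the
        # UII generated from the remaining part of the check-in image.
        x0, y0, x1, y1 = frame
        masked = image.copy()
        masked[y0:y1, x0:x1] = 0
        return masked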
[0132] The computer 216 may define, for all these UIIs which relate to the same food item, a set of UIIs defining the relationship between the UIIs belonging to the same food item. As an example, all UIIs 61-90 may be defined as UII 1-61, 1-62, 1-63 . . . 1-90, thereby defining that they belong to food item no. 1.
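A sketch of such a grouping, using hypothetical names and mirroring the "1-61 . . . 1-90" numbering used above:

    def group_uiis(food_item_no, piece_uiis, first_piece_no=61):
        # Key each UII as "<food item no.>-<piece no.>" so that UIIs created
        # from the same check-in image stay associated with the same food item,
        # e.g. "1-61", "1-62", ... "1-90" for food item no. 1.
        return {f"{food_item_no}-{first_piece_no + i}": uii
                for i, uii in enumerate(piece_uiis)}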
[0134] In
[0135] The computer 216 comprises a memory in which a queue of UIIs of food items marked B is stored.
[0136] Each time a food item arrives at the check-in position, the queue is updated with a new UII. Each time a food item leaves the check-out position and is recognized from the check-out image as having a specific UII, the corresponding UII is deleted from the queue. In that way, the queue only contains a limited number of available UIIs, and the ability to recognize the UII correctly can be increased. The feature is particularly relevant when the UII is generated based on a feature, e.g. a structure, size, shape, or color, which with statistically good probability may be considered unique within a relatively small number of items, but which could not be considered unique across a larger number of food items.
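The queue handling described above, including the check-in time and the predetermined expiry time of claims 39-41 and 60, could be sketched as follows; the class name, the method names, and the default expiry of 600 seconds are assumptions made for illustration:

    from datetime import datetime, timedelta

    class UIIQueue:
        # Minimal sketch of the queue of UIIs (illustrative names; not part of
        # the disclosure). UIIs are added at check-in, removed when recognized
        # at check-out, and may expire after a predetermined time.
        def __init__(self, max_age_seconds=600):
            self.max_age = timedelta(seconds=max_age_seconds)
            self._entries = {}                       # UII -> check-in time

        def check_in(self, uii):
            self._entries[uii] = datetime.now()      # new UII enters the queue

        def check_out(self, recognized_uii):
            self._entries.pop(recognized_uii, None)  # delete when recognized

        def purge_expired(self):
            now = datetime.now()
            for uii, t in list(self._entries.items()):
                if now - t > self.max_age:
                    del self._entries[uii]

        def candidates(self):
            return list(self._entries)               # only these need matching

In line with claim 61, the expiry time (max_age) could be updated dynamically from a lead time measured between the check-in and check-out positions.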
[0138] In one implementation, the check-in camera and the check-out camera are CCD cameras arranged along a conveyor belt. The cameras communicate digital images to the processing structure. The processing structure comprises an image recognition unit providing a converter capable of converting a digital image from the check-in or check-out camera into a unique data-record. A communication unit is capable of outputting the data-record to other computers, e.g. for label printing purposes or for later authentication of the food item, or of storing the data-record in a database.
[0139] Internally, the data-record is preserved in memory, and an image signal processing unit in the form of a microprocessor is configured to carry out the recognition procedure in which it is verified whether a check-out image and a check-in image are from the same food item. The microprocessor is programmed with software developed with techniques similar to those of standard software for recognizing features in images. Such systems are widely developed, e.g. for face detection or fingerprint recognition.
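As a hedged illustration of this recognition step, and assuming the UIIs are stored as feature vectors such as those sketched after paragraph [0123], a nearest-neighbour comparison against the queued UIIs could look as follows; the tolerance value and the function name are hypothetical:

    import numpy as np

    def recognize(check_out_features, queued_uiis, tolerance=0.1):
        # Compare the feature vector of a check-out image against the feature
        # vectors of the queued UIIs (a dict mapping UII -> feature vector) and
        # return the closest UII if it lies within the tolerance, else None.
        best_uii, best_dist = None, float("inf")
        for uii, feats in queued_uiis.items():
            dist = float(np.linalg.norm(check_out_features - feats))
            if dist < best_dist:
                best_uii, best_dist = uii, dist
        return best_uii if best_dist <= tolerance else None

If no queued UII lies within the tolerance, the food item is treated as non-recognized and may be routed to the preselected handling station mentioned earlier.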
[0140] The processing structure may include a video card with a frame buffer and other features configured for handling input from the camera, and the UII could be generated based on standard processor functions available e.g. for face recognition, fingerprint recognition etc. Such processors can deliver a unique data-string representing a UII for a picture.