A METHOD OF TRACKING A FOOD ITEM IN A PROCESSING FACILITY, AND A SYSTEM FOR PROCESSING FOOD ITEMS

20240000088 · 2024-01-04


    Abstract

    A method and a system for automatically tracking a food item in a processing facility. The food item is received in a check-in position where a check-in image is captured, and a unique identification insignia (UII) is generated based on the check-in image. The food item is handled and moved to a check-out position where a check-out image is captured. To facilitate automatic tracking of the food items, a recognition process is carried out to establish if the check-out image can be assigned to the created UII, and a data-record is, in that case, generated with the recognized UII.

    Claims

    1.-31. (canceled)

    32. A method for tracking a food item in a flow of food items through a handling facility, the method comprising: receiving the food item in a check-in position; acquiring at least one check-in image of the food item; creating at least one unique identification insignia (UII) from the check-in image; moving the food item or pieces thereof in the flow of food items through a handling station to a check-out position; acquiring at the check-out position, at least one check-out image of the food item or a piece thereof; conducting a recognition process configured to establish if the check-out image can be assigned to the created UII; and if it could be established that the check-out image can be assigned to the created UII, defining a data-record including the assigned UII.

    33. The method according to claim 32, further comprising storing the data-record in a database or printing the data-record on a label.

    34. The method according to claim 32, further comprising: defining a first handling station for further handling of the food item or the piece thereof, defining a second handling station for further handling of the food item or the piece thereof, selecting one of the first and second handling stations if the recognition process establishes that the check-out image can be assigned to the created UII; and selecting another one of the first and second handling stations if the recognition process establishes that the check-out image cannot be assigned to the created UII.

    35. The method according to claim 32, comprising creating a plurality of UIIs from the check-in image.

    36. The method according to claim 35, wherein the plurality of UIIs comprises a UII relating to one part of the check-in image and other UIIs relating to other parts of the check-in image such that the plurality of UIIs relate to different parts of the food item.

    37. The method according to claim 36, wherein the different parts are overlapping parts.

    38. The method according to claim 35, further comprising grouping the UIIs in a set of UIIs when the UIIs are created from the same check-in image or from the same food item.

    39. The method according to claim 32, comprising establishing a queue of UIIs of one or more food items in the flow of food items and deleting from the queue a UII if the recognition process establishes that a check-out image can be assigned to that UII.

    40. The method according to claim 38, comprising deleting all UIIs of a set of UIIs if the recognition process establishes that one or more check-out images can be assigned to a predetermined number of UIIs from that set of UIIs.

    41. The method according to claim 39, further comprising assigning a check-in time to each UII, establishing when the UII is created, and deleting a UII from the queue when a predetermined time has elapsed after the check-in time.

    42. The method according to claim 32, comprising receiving meta data related to the food item and adding the meta data to the data-record.

    43. The method according to claim 42, wherein the meta data relates to at least one of: the handling station or a human operator of the handling station; a place where the food item is originally produced, and a weight, a size, a shape, or a quality of the food item.

    44. The method according to claim 32, comprising identifying in the check-in image an undesired food item part and creating at least one UII from a part of the image which does not contain the undesired food item part.

    45. The method according to claim 32, comprising identifying in the check-in image an undesired food item part and creating at least one UII from a part of the image which does contain the undesired food item part.

    46. The method according to claim 44, comprising grading the food item or the piece thereof depending on identification of the undesired food item part.

    47. The method according to claim 32, comprising grading the food item or the piece thereof depending on if the recognition process establishes that a check-out image can be assigned to that UII or if the recognition process does not establish that a check-out image can be assigned to that UII.

    48. The method according to claim 32, comprising defining based upon the check-in image, a portioning plan defining intended cutting of the food item into pieces thereof, and creating at least one UII for each of the intended pieces thereof.

    49. The method according to claim 48, comprising cutting the food item into the intended pieces in a handling station.

    50. The method according to claim 32, wherein the UII is generated based on a structural uniqueness of the food item or generated based on a size or shape of the food item or pieces thereof, or generated based on a color or color distribution of the food item, or combinations thereof.

    51. The method according to claim 32, wherein the food item is selected from the group consisting of: vegetables, fruits, meat, poultry, fish, and seafood, or slices thereof.

    52. The method according to claim 32, wherein the food item is salmon, such as a fillet of salmon.

    53. The method according to claim 32, wherein each food item or the pieces thereof in the flow of food items are categorized in at least a first category and a second category, the categorization being carried out between the check-in position and the check-out position, and wherein a food item or pieces thereof being in the first category is directed through the check-out position and wherein a food item or pieces thereof being in the second category is not directed through the check-out position.

    54. The method according to claim 53, wherein a food item or pieces thereof being in the second category is directed through an alternative check-out position.

    55. The method according to claim 53, comprising establishing a queue of UIIs of food items for which a check-in image is acquired at the check-in position and deleting from the queue a UII for a food item or a piece thereof which is categorized in the second category.

    56. The method according to claim 32, comprising defining a data-record for each food item or pieces thereof at the check-out position, and if it could be established that the check-out image can be assigned to the created UII assigning the UII to that food item or pieces thereof.

    57. A system for tracking a food item in a flow of food items through a handling facility, the system comprising: a check-in camera arranged at a check-in position of the handling facility and configured to capture a check-in image of the food item; a processing structure configured to create unique identification insignias (UIIs) from the check-in image; and a check-out camera arranged at a check-out position of the handling facility and configured to capture a check-out image of the food item or a piece thereof; wherein the processing structure is configured to conduct a recognition process establishing if the check-out image can be assigned to the created UII; and configured, if it could be established that the check-out image can be assigned to the created UII, to define a data-record including the assigned UII.

    58. The system according to claim 57, further comprising a process selector configured to select one of at least two further handling stations if the recognition process establishes that the check-out image can be assigned to the created UII; and configured to select another one of the at least two further handling stations if the recognition process establishes that the check-out image cannot be assigned to the created UII.

    59. The system according to claim 57, wherein the processing structure is configured to establish a queue of UIIs of food items for which a check-in image is acquired at the check-in position and to delete from the queue a UII if the recognition process establishes that a check-out image can be assigned to that UII.

    60. The system according to claim 59, wherein the processing structure is further configured to assign a check-in time to each UII when created, and to delete a UII from the queue when a predetermined time has elapsed.

    61. The system according to claim 60, comprising a sensor configured to determine a lead time for the food items from the check-in position to the check-out position, and wherein the processing structure is configured to dynamically update the predetermined time based on the lead time.

    62. The system according to claim 58, comprising a conveyor system configured to convey the food item from the check-in position to the check-out position.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0103] In the following, embodiments of the disclosure will be described in further detail with reference to the drawings, in which:

    [0104] FIG. 1 illustrates schematically main components of a system according to the disclosure;

    [0105] FIG. 2 illustrates further details of a system according to the disclosure;

    [0106] FIG. 3 illustrates a food item with an identifiable structural uniqueness;

    [0107] FIG. 4 illustrates a food item with an area of a lower quality e.g. with an undesired element;

    [0108] FIG. 5 illustrates the food item from the previous FIG. 4 but with an overlapping portioning plan;

    [0109] FIG. 6 illustrates a food item for which a plurality of UIIs is created in the check-in image; and

    [0110] FIG. 7 illustrates a timeline.

    DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS

    [0111] The detailed description and specific examples, while indicating embodiments, are given by way of illustration only, since various changes and modifications within the spirit and scope of this disclosure will become apparent to those skilled in the art from this detailed description.

    [0112] FIG. 1 illustrates schematically a system comprising three stations, named S1, S2 and S3. The three stations could be connected e.g. by a conveyor or they could simply be three stations arranged at three different locations of a table. S1 could represent a check-in position. S2 could represent a handling station, e.g. a processing station where the food item is processed or handled in other ways. Examples of such handling could be sorting, counting, marking, filleting, trimming, weighing, batching or any similar kind of known handling of food items. S3 could represent a check-out position.

    [0113] FIG. 2 illustrates an example of the system depicted schematically in FIG. 1. In FIG. 2, the system 201 is illustrated in further detail. The exemplified system is configured for operator-assisted processing of food items 202, but it could also perform fully automatic processing, or simple handling of the food item, e.g. simply registering the food item for packing.

    [0114] The illustrated system has automatic tracking capability allowing each food item which enters the system to be recognized when it leaves the food processing system. The illustrated facility comprises a conveyor 203 which transports food items from the intake 204 to the outlet 205.

    [0115] A check-in position 206 is defined along the conveyor. At this position, a check-in camera 207 is located such that it can capture check-in images of the food items. In the illustrated embodiment, the check-in camera is located above the conveyor and lamps 208 may be arranged to provide enough light for recognizing even fine details in the food items. The camera 207 is based on visual light reflection, but in alternative embodiments, the camera could be X-ray based or ultrasound based etc.

    [0116] The check-out camera 209 is arranged at the check-out position 210 where it can capture check-out images of the food items or pieces thereof e.g. by the assistance of lamps 211 etc. Cameras 207 and 209 may be arranged before corresponding lamps 208 and 211, however lamps may also be positioned before cameras or at any other suitable locations.

    [0117] Between the check-in position and the check-out position, the food items are handled at a handling station. In this case, the handling station comprises two processing tables 212, 213. Alternatively, such handling stations may comprise automatic processing equipment.

    [0118] An operator 214, 215 may be assigned to each processing table. Each table and each operator have individual IDs, e.g. in the form of an identification number.

    [0119] The computer 216 comprises a data input configured to receive the captured images from the check-in camera and from the check-out camera, e.g. via a local area network (LAN) 217. A CPU with corresponding software code is configured to form a processing structure, to create a unique identification insignia (UII) based on the check-in images, and to recognize that UII in the check-out images. The computer 216 may generate a data-record 218, symbolized by the product card 218. The data-record may contain the UII and other kinds of data, e.g. meta data related to the food item or to the handling of the food item, such as the ID of the operator.
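By way of illustration only, the data-record described above, pairing a UII with optional meta data, could be sketched as follows; the class and field names are hypothetical and not part of the disclosure:

```python
from dataclasses import dataclass, field


@dataclass
class DataRecord:
    # Hypothetical sketch of the data-record: the assigned UII plus
    # optional meta data such as the operator ID or the table ID.
    uii: str
    meta: dict = field(default_factory=dict)


# Example: a record for a food item handled at table 212 by operator 214.
record = DataRecord(uii="UII-1-61", meta={"operator_id": 214, "table_id": 212})
```

Keeping the meta data in a free-form mapping mirrors the description, where operator IDs, table IDs, or observed quality parameters may all be attached to the same record.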

    [0120] Each table may have computer interfaces, e.g. in the form of touch screens 219, 220, enabling the operator to identify the food item and to generate information related to the food item. Such information may specify the ID of the operator, the ID of the table, or visually observed issues related to the food item, e.g. quality parameters observed by the operator. The information can be transmitted by the LAN 217 to the computer 216 or elsewhere, e.g. to the supervisor computer system 221 where a supervisor 222 may review the data. The data could be included as meta data in the data-record 218. Data, e.g. as included in the data-record, could also be exported e.g. to adjacent handling stations, or to follow the food item or pieces thereof all the way to the consumer.

    [0121] The outlet could be arranged to deliver the food items or pieces thereof to adjacent handling stations, e.g. for further processing, packing, inspection, or rejection etc.

    [0122] The system may comprise a diverter which can be controlled by the processing structure such that, depending on the recognition of the food item or the piece thereof at the check-out position, the food item is directed to one or the other of the subsequent processing stations. By means of an example, the processing structure may have a list of N specific food items with corresponding UIIs. If these N food items are to be packed for a specific customer, the food item diverter could be controlled by the processing structure to direct these specific food items to a packing station dedicated to the specific customer. If a specific food item cannot be recognized at the check-out position, the processing structure may use a preselected subsequent processing station, e.g. a processing station with a human operator who can decide an action for the non-recognized food item.
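The diverter logic of this paragraph can be sketched as a simple selection function; `select_station` and its parameter names are hypothetical, chosen only to illustrate the routing decision:

```python
def select_station(recognized_uii, customer_uiis, customer_station, fallback_station):
    # Route a food item at the check-out position:
    # - if its UII was recognized and is on the customer's list of N items,
    #   direct it to the dedicated packing station;
    # - otherwise (not recognized, or not on the list), direct it to the
    #   preselected fallback station, e.g. a manual inspection table.
    if recognized_uii is not None and recognized_uii in customer_uiis:
        return customer_station
    return fallback_station


# A recognized item on the customer list goes to the dedicated station.
station = select_station("uii-7", {"uii-7", "uii-9"}, "packing-A", "manual-inspection")
```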

    [0123] The UII could be generated based on a structural uniqueness of the food item. Examples of such structural uniqueness are the shape and size of fat, bone, muscle, or fiber patterns. It could also be generated based on a shape or size of the food item, or based on a color or color distribution of the food item.
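As a minimal illustration of deriving an identifier from the brightness distribution of an image, one could compute a coarse average-hash-style fingerprint. This is a generic sketch, not the patented method, and the function name is hypothetical:

```python
def uii_from_image(pixels, grid=4):
    # Illustrative fingerprint: average the grayscale image over a
    # coarse grid of cells, then encode each cell as "1" if it is
    # brighter than the overall mean, else "0". The resulting bit
    # string serves as a simple stand-in for a UII.
    h, w = len(pixels), len(pixels[0])
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            block = [pixels[y][x]
                     for y in range(gy * h // grid, (gy + 1) * h // grid)
                     for x in range(gx * w // grid, (gx + 1) * w // grid)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return "".join("1" if c > mean else "0" for c in cells)


# Example: an 8x8 image whose left half is dark and right half is bright.
fingerprint = uii_from_image([[0] * 4 + [255] * 4 for _ in range(8)], grid=4)
```

In practice a much richer feature, e.g. the fat and muscle pattern of a fillet, would be used, but the principle of mapping image content to a compact unique string is the same.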

    [0124] Particularly, the check-in image may be captured after the food item has been prepared, so that this unique feature appears clearly in the image.

    [0125] FIG. 3 illustrates a fish, particularly a salmon fillet. The illustrated salmon is filleted. This process reveals a structure of meat and fat, and that structure may be used for obtaining a good image representing a unique fingerprint which can define the UII. If filleting is desired, the system may particularly contain a filleting station placed prior to the check-in position to enable a check-in image containing the fingerprint structure of the fillet.

    [0126] The gray area indicated by the box 301 is typically cut away and rejected to be processed differently. The pieces of the salmon within the gray box 301 could be categorized as a food item piece not supposed to pass through the check-out position. Accordingly, it may be an object to remove a UII of such a piece of the food item from a queue of UIIs to thereby facilitate faster or more robust identification of the remaining UIIs in the check-out image.

    [0127] The food item illustrated in FIG. 4 has been captured on a check-in image 401. The processing structure identifies a blood stain 402. Such a blood stain is generally undesirable and therefore typically cut away during the processing.

    [0128] Since such cutting could take place between the check-in position and the check-out position it may be desirable not to use an area around the blood stain for the purpose of generating the UII, or at least to generate a UII based on a principle which is robust against removal of parts of the food item and thus a part of the image.

    [0129] If such an undesired element of the food item is removed, it may impair the ability of the processing structure to recognize the UII in the check-out image. Accordingly, the processing structure is programmed to identify such an undesired food item part, to create a frame 403 around the blood stain 402, and to generate the UII from a part of the image which does not include the frame 403.
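One simple way to realize this, assuming an axis-aligned bounding box for the frame around the defect, is to choose the widest image strip that does not overlap the frame; the helper below is a hypothetical sketch:

```python
def region_avoiding_defect(img_w, img_h, defect):
    # Given a defect frame as a bounding box (x0, y0, x1, y1), return
    # the wider of the two full-height vertical strips to the left and
    # right of the frame as (x0, y0, x1, y1). The UII can then be
    # created from this strip, which is unaffected if the defect and
    # its surroundings are later cut away.
    x0, y0, x1, y1 = defect
    left = (0, 0, x0, img_h)        # strip left of the defect frame
    right = (x1, 0, img_w, img_h)   # strip right of the defect frame
    return left if x0 >= (img_w - x1) else right


# Example: a 100x50 image with a blood-stain frame near the right edge.
region = region_avoiding_defect(100, 50, (70, 10, 90, 30))
```

A production system would more likely use a UII scheme robust to partial removal, as the description also suggests, but the strip selection conveys the idea of excluding the frame 403 from the identifying part of the image.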

    [0130] FIG. 5 illustrates the food item from the previous FIG. 4 but with an overlapping portioning plan defining cutting lines 501 for cutting the food item into pieces. The portioning plan is used for controlling a portioning cutter placed between the check-in position and the check-out position. Due to this process, it may be difficult to recognize the UII and for that reason, the processing structure is configured for creating a UII for each of the pieces according to the portioning plan.

    [0131] FIG. 6 illustrates a food item for which a plurality of UIIs is created in the check-in image. The created UIIs are uniquely numbered 61-90, and they relate to different unique parts of the food item. In alternative embodiments, at least some of the UIIs may relate to overlapping parts of the food item.

    [0132] The computer 216 may define, for all these UIIs which relate to the same food item, a set of UIIs defining the relationship between the UIIs belonging to the same food item. As an example, all UIIs 61-90 may be defined as UII 1-61, 1-62, 1-63 . . . 1-90, thereby defining that they belong to food item no. 1.
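This grouping convention, prefixing each part UII with the food item number, can be sketched as follows; the helper name is hypothetical:

```python
def group_uiis(food_item_no, part_numbers):
    # Prefix each part UII with the food item number, so that e.g.
    # parts 61..90 of food item no. 1 become "1-61" ... "1-90".
    # All UIIs sharing the prefix form one set belonging to that item.
    return {f"{food_item_no}-{p}" for p in part_numbers}


# The set of UIIs for food item no. 1 with parts numbered 61 to 90.
set_1 = group_uiis(1, range(61, 91))
```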

    [0133] FIG. 7 illustrates a timeline T indicating a lead time, defined herein as the duration from when a food item enters the check-in position S1 until it leaves the check-out position S3. Depending on the specific layout of the system, a finite number of food items are processed, e.g. simultaneously or successively. To increase the efficiency of the system, and particularly to increase the ability of the processing structure to recognize a created UII from the check-out image, it may be an advantage to reduce the number of food items which are considered by the processing structure.

    [0134] In FIG. 7, the five food items marked with A are to enter the check-in position S1. The seven food items marked with B are being handled. Five of these are in flow, while the remaining two are currently parked between the check-in position S1 and the check-out position S3. The reason for parking food items could be a lack of available resources, e.g. due to uneven processing needs for the incoming food items. If several successive food items need specific processing, the capacity for handling the food items may necessitate parking of food items until they can be processed.

    [0135] The computer 216 comprises a memory in which a queue of the UIIs of the food items marked B is stored.

    [0136] Each time a food item arrives at the check-in position, the queue is updated with a new UII. Each time a food item leaves the check-out position and is recognized by the check-out image to have a specific UII, the corresponding UII is deleted from the queue. In that way, the queue only contains a limited number of available UIIs, and the ability to recognize a UII correctly can be increased. The feature is particularly relevant when the UII is generated based on a feature, e.g. a structure, size, shape, or color, which with statistically good probability may be considered unique within a relatively small number of items, but which could not be considered unique among a larger number of food items.

    [0137] FIG. 7 indicates by the timeline T that a clock is built into the computer. The clock is indicative of a function of the processing structure to assign a check-in time to each UII when created. Based on knowledge about the specific layout, it may be determined that a food item would normally have left the system at a predetermined point in time. Accordingly, the check-in time could be used for deleting UIIs from the queue when the predetermined period has elapsed. That may occur, e.g., if the parked food items B for some reason are rejected.
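The queue behavior described in these paragraphs, adding a UII at check-in, deleting it on recognition at check-out, and purging UIIs whose predetermined time has elapsed, can be sketched as follows; the class and method names are hypothetical:

```python
import time


class UIIQueue:
    # Illustrative sketch of the UII queue: each UII is stored with its
    # check-in time and removed either when a check-out image is
    # recognized as that UII, or when the predetermined time has elapsed
    # since check-in (e.g. a parked food item was rejected).
    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.entries = {}  # uii -> check-in time

    def check_in(self, uii, now=None):
        # Register a new UII when its food item passes the check-in position.
        self.entries[uii] = time.time() if now is None else now

    def recognize(self, uii):
        # Delete the UII once a check-out image has been assigned to it.
        # Returns True if the UII was in the queue.
        return self.entries.pop(uii, None) is not None

    def purge_expired(self, now=None):
        # Delete every UII whose predetermined time has elapsed.
        now = time.time() if now is None else now
        expired = [u for u, t in self.entries.items() if now - t > self.timeout_s]
        for u in expired:
            del self.entries[u]
        return expired


# Example: two items checked in; one is recognized, the other times out.
q = UIIQueue(timeout_s=10)
q.check_in("item-1", now=0.0)
q.check_in("item-2", now=5.0)
matched = q.recognize("item-1")       # deleted on recognition at check-out
expired = q.purge_expired(now=16.0)   # "item-2" exceeds the 10 s timeout
```

Keeping the queue small in this way is what allows a statistically unique feature to be matched reliably, as the description notes.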

    [0138] In one implementation, the check-in camera and the check-out camera are CCD cameras arranged along a conveyor belt. The cameras communicate digital images with the processing structure. The processing structure comprises an image recognizing unit providing a converter capable of converting a digital image from the check-in or check-out camera to a unique data-record. A communication unit is capable of outputting the data-record to other computers, e.g. for label printing purposes or for later authentication of the food item, or storing the data-record in a database.

    [0139] Internally, the data-record is preserved in memory, and an image signal processing unit in the form of a microprocessor is configured to carry out the recognition procedure in which it is verified if a check-out image and a check-in image are from the same food item. The microprocessor is programmed with software developed with techniques similar to standard software for recognizing features in images. Such systems are widely developed, e.g. for face detection or fingerprint recognition.

    [0140] The processing structure may include a video card with a frame buffer and other features configured for handling input from the camera, and the UII could be generated based on standard processor functions available e.g. for face recognition or fingerprint recognition. Such processors can deliver a unique data-string representing a UII for a given image.