Modular robotic food preparation system and related methods
11577401 · 2023-02-14
Assignee
Inventors
- Ryan W. Sinnet (Pasadena, CA, US)
- Robert Anderson (Pasadena, CA, US)
- William Werst (Pasadena, CA, US)
- David Zito (Pasadena, CA, US)
CPC classification
B25J9/08
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1612
PERFORMING OPERATIONS; TRANSPORTING
B25J21/00
PERFORMING OPERATIONS; TRANSPORTING
B25J11/0045
PERFORMING OPERATIONS; TRANSPORTING
B25J9/0084
PERFORMING OPERATIONS; TRANSPORTING
A23L5/10
HUMAN NECESSITIES
International classification
B25J11/00
PERFORMING OPERATIONS; TRANSPORTING
B25J9/00
PERFORMING OPERATIONS; TRANSPORTING
A23L5/10
HUMAN NECESSITIES
Abstract
A modular robotic kitchen system is conveniently adaptable to perform a wide range of cooking applications. The modular robotic kitchen system can include a plurality of discrete modular units organized in a small footprint such that multiple types of cooking applications can be performed without a need to replace the modular units. Exemplary modular units include an ingredient module, robotic arm module, assembly and packaging module, and warming module. Optionally a transport unit or sled moves the modules into position. The modular kitchen system includes a central processor operable to carry out different cooking applications upon downloading software corresponding to the specific cooking application and without retooling the existing modules. Related methods are also described.
Claims
1. A modular robotic kitchen system for preparing food items in combination with at least one kitchen appliance in a commercial or restaurant kitchen, the modular robotic kitchen system comprising: a main kitchen module, the main kitchen module comprising a cart, a first robotic arm arranged on the cart, and a shielded workspace; at least one sub kitchen module, wherein each of said at least one sub kitchen module comprises a cart and a shielded workspace, and wherein each sub kitchen module is arranged within reach of the first robotic arm and selected from the group consisting of an ingredient unit, assembly unit, packaging unit, and pick-up unit; a sensor array; a central processor operable to compute and provide directions to the first robotic arm to prepare a food item using any one or more of the sub kitchen modules in combination with the kitchen appliance; and a temperature probe, wherein the central processor and sensor array are operable to automatically locate and specify a candidate food item to test with the temperature probe based on volume, and to compute, based on the type of food item being tested, an angle of approach and penetration depth relative to the surface of the candidate food item for the first robotic arm to aim the temperature probe.
2. The modular robotic kitchen system of claim 1, wherein the at least one sub kitchen module comprises the ingredient unit located adjacent the main module, and the ingredient unit adapted for holding multiple food items in separate areas accessible by the robotic arm, wherein at least one separate area is temperature controlled.
3. The modular robotic kitchen system of claim 2, further comprising an assembly unit for assembling a completed entrée, the assembly unit comprising a work surface, the work surface having a designated unsorted food area and an assembly area for plating or packaging the completed entrée, and the assembly unit further comprising at least one sensor aimed at the work surface and operable to send data to the central processor.
4. The modular robotic kitchen system of claim 3, further comprising a pick-up unit for storing the completed entrée until customer pickup, wherein the pick-up unit is temperature controlled.
5. The modular robotic kitchen system of claim 4, wherein the first robotic arm of the main module operates to perform cooking with the kitchen appliance, and the modular robotic kitchen system further comprises an RKA extension module comprising a second robotic arm, and operable to perform at least one skill selected from the group consisting of unpacking with an ingredient unit, cooking with the kitchen appliance, packing with a packaging unit, and assembling with the assembly unit.
6. The modular robotic kitchen system of claim 1, further comprising a freezer-safe package configured to enclose a food item, and to open along a seam when the package is bowed/bent by the robotic arm, thereby allowing the food item to fall from the package.
7. The modular robotic kitchen system of claim 1, wherein the kitchen appliance heats the food item, and the central processor is operable to monitor and control the temperature of the kitchen appliance.
8. The modular robotic kitchen system of claim 1, wherein the at least one sub module is mobile.
9. The modular robotic kitchen system of claim 1, further comprising a scheduling engine to determine a sequence of food preparation steps using the main and sub modules and based on a plurality of inputs selected from the group consisting of camera data, customer orders, inventory, and recipe information.
10. The modular robotic kitchen system of claim 9, wherein the scheduling engine applies an optimization algorithm to determine the schedule of the food preparation steps.
11. The modular robotic kitchen system of claim 9, further comprising a food quantity sensor operable to detect quantity of food, and wherein the quantity of food is an input to the scheduling engine.
12. The modular robotic kitchen system of claim 1, wherein the main module comprises a drawer slidable between a first position outside of the shielded workspace and inaccessible to the robotic arm, and a second position within the shielded workspace accessible to the robotic arm.
13. The modular robotic kitchen system of claim 1, further comprising a conveyor belt assembly for transporting food items between the main module and the at least one sub module.
14. The modular robotic kitchen system of claim 13, wherein the conveyor belt assembly further comprises a belt, an enclosure surrounding the belt, and magnets underneath the belt to hold food items in a position fixed relative to the surface of the belt.
15. The modular robotic kitchen system of claim 1, further comprising a demand engine to compute a quantity of food items to be prepared based on at least one input selected from the group consisting of time and date, inventory, and cooking time.
16. The modular robotic kitchen system of claim 1, wherein the temperature probe comprises a flange and a tip operable to extend beyond the flange to penetrate the food item, and wherein the tip can be retracted such that the flange contacts the food item to detach the food item from the tip.
17. A method of robotically preparing food items in a commercial or restaurant kitchen having at least one kitchen appliance for cooking a food item, the method comprising: providing a main robotic cart, the main robotic cart comprising a robotic arm and shielded workspace; providing a plurality of discrete sub carts, each of said plurality of sub carts having a shielded workspace; arranging the plurality of sub carts and the at least one kitchen appliance around the main robotic cart such that the robotic arm can access and reach the at least one kitchen appliance and the shielded workspace of each sub cart; identifying and locating, with at least one camera, at least one food item from a temperature controlled bin in a first sub cart of the plurality of sub carts; picking, with the robotic arm, the at least one food item from the temperature controlled bin from the first sub cart and transferring the at least one food item to the kitchen appliance for cooking; cooking the at least one food item; measuring, with a temperature probe arranged on the robotic arm, the internal temperature of at least one candidate food item of the at least one food item during the cooking step, wherein the measuring comprises: computing an optimal angle of approach and penetration depth of the temperature probe relative to the surface of the at least one candidate food item based on the type of the at least one candidate food item; advancing the temperature probe into the at least one candidate food item based on the optimal angle of approach and the optimal penetration depth; removing, with the robotic arm, the cooked at least one food item from the kitchen appliance and transferring the cooked at least one food item to a second sub cart of the plurality of sub carts; assembling, with the robotic arm, the cooked at least one food item with other ingredients according to a customer order on the second sub cart to obtain a completed entrée; and storing, by transferring with the robotic arm, the completed order to a third sub cart of the plurality of sub carts for safe storage at a controlled temperature until pickup.
18. The method of claim 17, further comprising robotically packaging, with the robotic arm, after the step of assembling and prior to the step of storing, the completed entrée at a fourth sub cart of the plurality of sub carts to create a packaged order for pickup.
19. The method of claim 18, further comprising providing a second kitchen appliance, and programming the robotic arm to be operable with a second kitchen appliance for cooking.
20. The method of claim 18, further comprising providing a second kitchen appliance, and an extender sub cart and a second robotic arm, wherein the method further comprises programming the second robotic arm to be operable with the second kitchen appliance.
21. The method of claim 20, further comprising determining, using a programmed processor, a set of food preparation steps for the first and second robotic arms to prepare the completed entrée.
22. The method of claim 21, further comprising determining, using the programmed processor, a schedule for the food preparation steps.
23. The method of claim 22, wherein the step of determining the set of food preparation steps and the schedule of the food preparation steps is based on a downloaded recipe, customer order, predicted demand, and inventory.
24. The method of claim 17, wherein the at least one food item comprises a plurality of food items, and the method comprises determining said at least one candidate food item based on evaluating the volume of each food item of the plurality.
25. The method of claim 17, further comprising sensing axial force as the temperature probe is advanced into the candidate food item.
26. The method of claim 17, wherein the computing steps are based on the homogeneity of the at least one food item.
27. The method of claim 26, wherein the computing steps are performed by a trained model.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
(26) Before the present invention is described in detail, it is to be understood that this invention is not limited to particular variations set forth herein as various changes or modifications may be made to the invention described and equivalents may be substituted without departing from the spirit and scope of the invention. As will be apparent to those of skill in the art upon reading this disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present invention. In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s) or step(s) to the objective(s), spirit or scope of the present invention.
(27) Methods recited herein may be carried out in any order of the recited events which is logically possible, as well as the recited order of events. Furthermore, where a range of values is provided, it is understood that every intervening value, between the upper and lower limit of that range and any other stated or intervening value in that stated range is encompassed within the invention. Also, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein.
(28) All existing subject matter mentioned herein (e.g., publications, patents, patent applications and hardware) is incorporated by reference herein in its entirety except insofar as the subject matter may conflict with that of the present invention (in which case what is present herein shall prevail).
(29) Described herein is a modular robotic kitchen system.
(30) Overview
(32) Also shown in
(33) RKA Module/Unit
(34) The robotic kitchen assistant (RKA) module 30 is shown including a shielded workspace, counter-top or bin area, a robotic arm having a plurality of degrees of freedom (preferably, 6 DOF), at least one sensor or camera, and a computer operable to control the motion of the robotic arm to carry out food preparation steps as discussed further herein. Examples of a suitable RKA and robotic arm are described in U.S. patent application Ser. No. 16/490,534, filed Aug. 31, 2019, entitled “ROBOTIC KITCHEN ASSISTANT FOR PREPARING FOOD ITEMS IN A COMMERCIAL KITCHEN AND RELATED METHODS” and US Patent Publication No. 20180345485, filed Aug. 10, 2018, and entitled “MULTI-SENSOR ARRAY INCLUDING AN IR CAMERA AS PART OF AN AUTOMATED KITCHEN ASSISTANT SYSTEM FOR RECOGNIZING AND PREPARING FOOD AND RELATED METHODS.”
(35) Unpacking/Ingredient Module
(36) The unpacking or ingredient cart 20 is shown including a shielded workspace and four separate areas for holding ingredients or bins of ingredients. As discussed further herein, in embodiments, the ingredient cart 20 can hold multiple food items (up to 10), is robot friendly, and includes face protection and a lid or cover to close. Optionally, one or more of the separate areas are refrigerated. Additionally, in embodiments, discussed further herein, the system employs raw food packaging that facilitates robot actions.
(37) Cooking Appliances
(38) The modular robotic kitchen system can operate with a wide range of cooking appliances (e.g. fryer 80, grill 90) as shown in
(39) Preferably, in embodiments, the temperature of the food items being cooked is monitored. The temperature measurements can be input to the scheduler engine, described further herein. Additionally, in embodiments, the temperature in the appliances (e.g., fryer oil, oven temperature, grill surface, etc.) can be monitored and automatically controlled, as discussed further herein.
(40) Additionally, in embodiments, the modular robotic kitchen system can include various utensils to facilitate transferring from one station or cart to another. In a particular embodiment, a fry basket is operable with the fryer and enables convenient and safe transfer of the fried items to another unit or workspace, discussed further herein.
(41) Assembly & Packing Module
(43) Warming Module
(45) Extension Module
(47) System Architecture
(49) Examples of hardware and software for use with embodiments of the invention include, without limitation, a central computer, servers, processors, memory and storage, a communication interface, sensors, cameras, input devices such as keyboards or touchscreen displays, and a display. The processor is programmed or operable to execute various applications described herein as well as enable modules or engines for determining location and identification of food items, doneness, scheduling of steps, demand of food items, and inventory. Examples of food identification and location, scheduling, and demand modules are described in U.S. patent application Ser. No. 16/490,534, filed Aug. 31, 2019, entitled “ROBOTIC KITCHEN ASSISTANT FOR PREPARING FOOD ITEMS IN A COMMERCIAL KITCHEN AND RELATED METHODS”, U.S. patent application Ser. No. 16/490,775, filed Sep. 3, 2019, entitled “AUGMENTED REALITY-ENHANCED FOOD PREPARATION SYSTEM AND RELATED METHODS”, and US Patent Publication No. 20180345485, filed Aug. 10, 2018, and entitled “MULTI-SENSOR ARRAY INCLUDING AN IR CAMERA AS PART OF AN AUTOMATED KITCHEN ASSISTANT SYSTEM FOR RECOGNIZING AND PREPARING FOOD AND RELATED METHODS”, each of which is incorporated by reference in its entirety for all purposes.
(50) The core platform additionally includes skills 140 that are enabled by the hardware and software. Collectively, the core platform is highly flexible and adaptable to perform a wide range of cooking applications 150, which may include specific cooking workflows 160 and use of specific cooking equipment 170, such as a burger workflow and use of a griddle, respectively. The core platform 110, as described further herein, is readily adaptable to run a specific cooking workflow and use the provided equipment without needing to be reworked or rewired.
(51) In embodiments, new cooking workflow software is downloaded to the central computer for execution. Optionally, trained models may be downloaded to the central computer, or the system trains itself based on machine learning algorithms.
(53) The monitoring system 172 is operable to continuously track the status of the system and flag anomalous behavior to be corrected by local or remote staff.
(54) The continuous learning system 174 is operable to utilize these flagged issues to retrain the neural networks in order to improve the performance of the autonomous system for food classification.
(55) A performance analytics system 176 is operable to aggregate performance data at regular intervals to improve store management and give guidance on where to focus efforts. The analytics serve to determine the difference between the amount of food cooked and the amount of food ordered, to produce food safety and quality reports, and to report on the status of the machine and when the next maintenance cycle is due.
(56) Unpacking & Raw Food Packaging
(57) In embodiments, a method for packing, transporting, and unpacking raw food for preparation in kitchens includes providing custom containers designed for ergonomic access by humans and manipulation by mechanized systems.
(58) Preferably, the raw packing system is in a centrally located distribution warehouse and is operable to quickly unpack and repack the modular carts. Additionally, the contents in each cart are tracked throughout the time the contents are in the cart using an automated tracking system.
(59) The raw packing system can include various hardware such as a battery and power management system, a charging interface to supply power to the battery and power management system in the cart, and a wired and/or wireless communication system to maintain in-transit tracking of the cart and also to communicate with the robotic kitchen assistant modular unit, described herein.
(60) In embodiments, an access control system is provided with the cart and operable to obtain a return merchandise authorization (RMA) and to allow the contents in the cart to be returned safely back to the distribution warehouse and repacked for a different store, without risking store-to-store contamination.
(61) The packing and unpacking system can optionally log environmental data of the cart at all times.
(62) The packing and unpacking system may include an environmental control system to control the temperature and other environmental conditions within the warehouse or kitchen. For example, in embodiments, the environmental control system comprises a compressor-based bidirectional heat pump, and optionally the heat pump may be a solid state heat pump using, e.g., Peltier junctions.
(63) In embodiments, the environmental control system comprises a passive thermal reservoir utilizing ice or another similar latent-heat-of-phase-change material and heavy insulation. In embodiments, a combination of the above thermal control systems is used.
(64) In embodiments, the raw food is packaged in a thermally insulative container. In particular embodiments, raw food is packaged in pillow packs that are hermetically sealed via plastic welding. The pillow pack is opened via a blade or a perforation in the packaging material, and the contents can then be dumped into a cooking container, e.g., a fryer basket or pot. The packaging material is then discarded.
(65) In embodiments, the pillow pack container can be grabbed with a suction cup.
(66) In embodiments, the pillow pack container can be grabbed with a molded gripping feature designed for a custom end effector to enhance manipulability of the pillow pack.
(67) In embodiments, small reusable rigid containers are used to contain food product. Preferably, in embodiments, a freezer safe package can be opened without the use of a knife by pulling apart the bag.
(68) With reference to
(69) In embodiments, the freezer safe package encodes information about the product.
(70) In embodiments, a freezer safe package is adapted to dissolve in hot oil to release contents into the oil to cook. Exemplary materials for the freezer safe bag include rice paper, starch, etc.
(71) Temperature Testing
(72) In embodiments, a robotic assisted method for determining the temperature of food being cooked (e.g., batch of fried foods) comprises singulating the pieces of cooked food from a batch, ranking the pieces according to size, and testing the internal temperature of the largest pieces to guarantee food safety requirements.
(74) Step 210 states to insert a bin of the food items in a vibrating rack. With reference to
(75) An example of a vibrating rack mechanism which allows a bin to be agitated easily is shown in
(76) The bin shown in
(77) Step 220 states to vibrate for 30 seconds, or until the food items are separated from one another. Steps 210 and 220 collectively serve to singulate the food items.
(78) Step 230 states to capture images of the food items from a plurality of cameras.
(79) Examples of the sensors include, without limitation, cameras (optical and/or IR) and Time-of-Flight sensors (depth measurements). The array of cameras 310 serves to provide enough information to estimate volume from reconstructed 3D models, discussed further herein. Additionally, the bin can be made of highly transparent material to allow vision from the bottom.
(80) Step 240 states to reconstruct the 3D model of the food items. The robotic temperature testing system performs this analysis using the array of cameras and performing a technique called stereo reconstruction as described in, for example, Ju Yong Chang, Haesol Park, In Kyu Park, Kyoung Mu Lee, Sang Uk Lee, GPU-friendly multi-view stereo reconstruction using surfel representation and graph cuts, Computer Vision and Image Understanding, Volume 115, Issue 5, 2011, Pages 620-634. In embodiments, the images from the plurality of cameras are fused together using Stereo Reconstruction to obtain a 3D scan of the bins and the objects therein.
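The full surfel/graph-cut reconstruction cited above is beyond a short example, but the underlying stereo principle can be illustrated. The following toy sketch (not from the patent; the focal length, baseline, and disparity values are illustrative) recovers depth from disparity for a rectified camera pair:

```python
# Toy illustration of the stereo principle behind 3D reconstruction:
# for a rectified stereo pair, a point's depth Z relates to its pixel
# disparity d by Z = f * B / d, where f is the focal length in pixels
# and B is the camera baseline in meters.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (m) of a scene point from its stereo disparity (px)."""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 10 cm baseline, 40 px disparity.
print(depth_from_disparity(800.0, 0.10, 40.0))  # 2.0 m to the surface point
```

A dense 3D scan applies this per pixel after matching; nearer food items produce larger disparities and hence smaller depths.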
(81) In embodiments, segmentation is achieved using a neural network as in Kaiming He, Georgia Gkioxari, Piotr Dollár, and Ross B. Girshick, Mask R-CNN, arXiv, 2017. Analyzing the segmentation can determine whether the food items have been fully singulated as well as provide a list of objects of interest.
(82) Step 250 states to identify the largest pieces. In embodiments, for each piece of food, the system performs a volumetric analysis. Particularly, the segmented pieces are analyzed with select geometric calculations to determine which are the largest and to find the largest part of each piece of food. The pieces can be ranked according to the thickness of the thickest part. One or more of the thickest pieces are then selected for temperature testing, discussed below.
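The ranking described in step 250 can be sketched as follows. This is an illustrative example only: the per-piece thickness maps and piece names are hypothetical stand-ins for the output of the segmentation and volumetric analysis.

```python
# Illustrative sketch: rank segmented food pieces by the thickness of
# their thickest part and select the top candidates for temperature
# testing. Each piece is represented by a toy thickness map (mm).

def thickest_part(thickness_map):
    """Return the maximum local thickness found in the map."""
    return max(max(row) for row in thickness_map)

def select_candidates(pieces, n=1):
    """Rank pieces by thickest part, descending; return the top n IDs."""
    ranked = sorted(pieces.items(),
                    key=lambda kv: thickest_part(kv[1]),
                    reverse=True)
    return [piece_id for piece_id, _ in ranked[:n]]

# Hypothetical segmented pieces with toy thickness maps (mm).
pieces = {
    "tender_a": [[8, 12, 10], [9, 14, 11]],
    "tender_b": [[7, 9, 8], [8, 10, 9]],
    "tender_c": [[11, 16, 12], [10, 15, 13]],
}
print(select_candidates(pieces, n=2))  # thickest pieces first
```

Testing the thickest pieces gives a conservative food-safety check: if the slowest-heating piece reaches temperature, the batch has.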
(83) Step 260 states to compute the optimal angle of approach and penetration depth for the temperature probe, discussed further herein. The approach and penetration are calculated based on the size and orientation information determined in the above steps. In embodiments, it is desirable to aim toward the center of mass of the food item, preferably of the largest food item.
(84) In order to test a given piece of food properly, an appropriate angle of approach and penetration depth must be selected. For homogeneous items (such as a piece of boneless chicken breast), it is sufficient to locate the largest cross-sectional area and penetrate orthogonally to the surface and up to the middle of the food item.
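For the homogeneous case, the rule above reduces to simple geometry. The sketch below is an assumed formulation (the function name and inputs are hypothetical): approach anti-parallel to the surface normal at the largest cross-section, and penetrate to half the local thickness.

```python
# Sketch of probe planning for a homogeneous food item: aim along the
# surface normal at the largest cross-section and stop at the midpoint.

def probe_plan(surface_normal, local_thickness_mm):
    """Return (approach_vector, penetration_depth_mm)."""
    # Normalize the normal; the approach direction is anti-parallel to it.
    norm = sum(c * c for c in surface_normal) ** 0.5
    approach = tuple(-c / norm for c in surface_normal)
    depth = local_thickness_mm / 2.0  # penetrate to the item's middle
    return approach, depth

# A flat-lying piece (normal +z) that is 24 mm thick at its thickest point.
approach, depth = probe_plan((0.0, 0.0, 1.0), 24.0)
print(approach, depth)  # probe descends along -z to 12 mm
```

The non-homogeneous case (e.g., bone-in items) replaces this closed-form rule with a learned model, as discussed next.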
(85) For items that do not have reasonably homogeneous heat capacity, such as a bone-in chicken breast, it is not sufficient to simply insert into the largest cross-sectional area. For the example of bone-in chicken breast, it is important to test the thickest piece but avoid the bone, since it heats much faster than the surrounding tissue. Therefore, a model is necessary to infer the optimal angle of approach and penetration depth.
(86) Learning a model for angle of approach and penetration depth can be accomplished either through heuristic approaches or using machine learning. With either approach, the goal is to build a model to estimate the pose and parameters of a food item. Using this model, some embodiments use heuristics to specify how to approach and penetrate.
(87) In embodiments, a heuristic model is sometimes used, such as locating the largest cross-sectional area and penetrating orthogonally to it. This type of method can work well on a variety of food items, but some food items require more complicated techniques.
(88) Other embodiments use learning by demonstration to build a model for angle of approach and penetration. In embodiments, a thermal probe that publishes its pose in space is used by a human trainer. The human trainer goes through the motions and the pose of the thermal probe is tracked over time as the human trainer tests many pieces of a type of food item. Using these data, a model can be trained that will allow computation of the optimal angle of approach and penetration depth.
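A minimal form of such a demonstration-trained model can be sketched as below. This is an assumption-laden illustration, not the patent's method: it simply averages the recorded insertion angle and depth across a trainer's demonstrations for one food type, whereas a production system would fit a richer pose model.

```python
# Hypothetical learning-by-demonstration sketch: summarize a human
# trainer's tracked probe poses into a simple per-food-type model.
from statistics import mean

def fit_demo_model(demonstrations):
    """demonstrations: list of (approach_angle_deg, depth_mm) samples."""
    return {
        "angle_deg": mean(a for a, _ in demonstrations),
        "depth_mm": mean(d for _, d in demonstrations),
    }

# Three demonstrations on the same food type (illustrative numbers).
demos = [(85.0, 14.0), (90.0, 15.0), (95.0, 16.0)]
model = fit_demo_model(demos)
print(model)  # {'angle_deg': 90.0, 'depth_mm': 15.0}
```

Pooling demonstrations from many installations, as the next paragraph describes, simply grows the sample set this model is fit on.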
(89) These models for computing optimal angle of approach and penetration depth are generated using shared data via the Internet. This allows multiple robotic temperature testing systems to learn more quickly.
(90) Step 270 states to move and insert the probe. In embodiments, temperature testing is performed with a temperature testing tool 400 and probe 420 attached to the end of a robot arm 302. This robot arm 302 can have 4, 5, 6, 7, or a higher number of degrees of freedom. The robot arm can also take other configurations including but not limited to that of a SCARA arm or a delta arm.
(91) In the embodiment shown in
(94) The extension may be performed by various mechanisms such as, e.g., a loaded spring 440, a pneumatic actuator, or an electromagnetic actuator such as a motor. Retraction can be accomplished with a pneumatic actuator or an electromagnetic actuator. Preferably, the extension action is performed using a sufficiently fast actuator to cause the extending probe to quickly penetrate food. By moving quickly enough, the probe is able to avoid static friction altogether and operate with kinetic friction, which allows for less friction overall. This mitigates undesired motion of the food item being tested that would otherwise occur during insertion of the thermal probe.
(95) The probe may be made of various materials including, e.g., stainless steel or another food-safe material with appropriate thermal properties that can be inserted into a variety of cooked foods including but not limited to bone-in chicken, chicken tenders, and chicken nuggets, hen/turkey parts, boneless chicken/turkey pieces, steaks, hamburgers, fillets, tenders, cutlets, potato wedges, etc.
(96) In embodiments, the thermal probe has axial force sensing. This force sensing provides feedback if the probe makes contact with a bone in a piece of meat or if a probe makes contact with any other impenetrable components in a piece of food. In spring-loaded embodiments of the thermal probe, the force can be sensed by measuring the displacement of the probe from full extensions and applying Hooke's Law. In electromagnetic embodiments, current and dynamics can be measured and compared against a model of expected current.
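The spring-loaded sensing path can be sketched directly from Hooke's Law. The spring constant and force threshold below are illustrative values, not figures from the patent:

```python
# Sketch of spring-loaded axial force sensing: infer force from the
# probe's displacement from full extension (F = k * x), and flag likely
# contact with bone or another impenetrable component when the force
# exceeds a threshold. Constants are assumed for illustration.

SPRING_K = 0.8       # N/mm, assumed spring constant
OBSTRUCTION_N = 6.0  # assumed force threshold for impenetrable contact

def axial_force(full_extension_mm, measured_extension_mm):
    """Hooke's Law: force from displacement relative to full extension."""
    displacement = full_extension_mm - measured_extension_mm
    return SPRING_K * displacement

def hit_obstruction(full_extension_mm, measured_extension_mm):
    """True if the sensed force suggests bone or similar contact."""
    return axial_force(full_extension_mm, measured_extension_mm) > OBSTRUCTION_N

print(axial_force(30.0, 25.0))      # 4.0 N: normal insertion resistance
print(hit_obstruction(30.0, 20.0))  # True: probe likely stopped on bone
```

The electromagnetic variant replaces the displacement measurement with a comparison of motor current against a model of expected current.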
(97) Step 280 states to record the temperature reading.
(98) Step 290 states to inform the user that testing is complete.
(99) Additionally, in embodiments of the invention, a sanitation step is performed when a piece of food is measured to be below the food-safe temperature threshold. The sanitation step may be performed variously. In one embodiment, the probe is sanitized with an attached sanitation bath. The sanitation bath uses approved chemicals to sanitize the thermal probe and flange.
(100) Equipment Temperature Integration with Robotic System
(101) The modular kitchen systems described herein may also monitor and control temperature of the appliances (e.g., a fryer or oven) during operation.
(102) In one embodiment, a method for controlling kitchen equipment temperature includes selecting the optimal input at the present time while optimizing over a time horizon based on future thermal load prediction and oil life preservation goals.
(103) Oil life preservation may be performed, for example, by dropping the temperature of kitchen equipment such as a fryer to extend the lifetime of consumables such as fryer oil during periods when equipment is not in use, as determined by a kitchen production forecasting system.
(104) Additionally, the present invention includes preemptively changing the thermal input into the kitchen equipment before a thermal load is applied. For example, a fryer gas burner can be turned on 20 seconds before a basket of frozen fries is dropped into the fryer.
(105) Preferably, control of the equipment is automated. In embodiments, a controller utilizes a camera or sensors to track workers in the kitchen to predict when food will be added to the system. The controller raises or lowers the temperature of the appliance automatically based on the location and movement of the workers.
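The preemptive heating behavior can be sketched as a simple rule. This is an illustrative controller fragment under stated assumptions: the 20-second lead time comes from the fries example above, while the function name and the predicted drop times are hypothetical.

```python
# Illustrative preemptive burner control: turn the burner on when any
# predicted food drop falls within the preheat lead window.

PREHEAT_LEAD_S = 20  # seconds of preheat, per the frozen-fries example

def burner_on(now_s, predicted_drop_times_s):
    """True if a predicted drop is within the next PREHEAT_LEAD_S seconds."""
    return any(0 <= t - now_s <= PREHEAT_LEAD_S for t in predicted_drop_times_s)

drops = [100, 400]          # predicted drop times (s), hypothetical forecast
print(burner_on(75, drops))  # False: 25 s early, outside the window
print(burner_on(85, drops))  # True: 15 s before the 100 s drop
```

In the described system, the predicted drop times would come from worker tracking, the production forecasting system, or the robotic kitchen assistant's cooking cadence.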
(106) In embodiments, the controller is connected to a production forecasting system based on various inputs. Examples of input to the production forecasting system include, without limitation: prior demand, point-of-sale data, and product stock levels.
(107) In embodiments, the controller is connected to a robotic kitchen assistant, which relays its cooking cadence to the controller for predictive temperature control.
(108) In embodiments, the computer monitors the health of the kitchen equipment by observing the effect of heat input on temperature readouts when the equipment has no thermal load.
(109) In embodiments, the robot is operable to skim contents out of the fryer to preserve the lifetime of the equipment and the oil.
(110) In embodiments, the system determines optimum lifetime of the oil, and when the oil needs to be changed based on tracking the throughput of food cooked in the fryer.
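Throughput-based oil-life tracking can be sketched as below. The class name and the capacity figure are hypothetical; a real system would calibrate the threshold per oil type and food mix:

```python
# Hypothetical oil-life tracker: accumulate the throughput of food
# cooked in the fryer and flag an oil change at an assumed capacity.

class OilLifeTracker:
    def __init__(self, capacity_kg=150.0):
        self.capacity_kg = capacity_kg  # assumed throughput per oil fill
        self.cooked_kg = 0.0

    def record_batch(self, kg):
        """Record the weight of a cooked batch."""
        self.cooked_kg += kg

    def needs_change(self):
        """True once cumulative throughput reaches the oil's capacity."""
        return self.cooked_kg >= self.capacity_kg

tracker = OilLifeTracker(capacity_kg=10.0)
tracker.record_batch(4.0)
print(tracker.needs_change())  # False: capacity not yet reached
tracker.record_batch(7.0)
print(tracker.needs_change())  # True: time to change the oil
```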
(111) Robotic Food Packing System
(113) Step 610 states to insert a bin of unsorted food. With reference to
(114) Step 620 states to place at least one packing container 730 inside packing area 704. In embodiments, one bin sits in the work area to be used for packing. Another bin sits in the work area and contains packing containers. However, the number of the bins and areas may vary.
(115) Step 630 states to capture images of the unsorted food. The cameras or sensors 760, described herein, can be arranged above the worksurface and food items or elsewhere to aim at and obtain images from multiple angles of the unsorted food. With reference to
(116) Indeed, in order to properly portion and plate or pack a container, a Robotic Food Packing System must be able to see in 3D the objects inside a bin of unsorted food. This 3D imaging data can then be used to drive decisions, discussed herein, on how and what to pick out of the bin of unsorted food. High fidelity 3D vision in a Robotic Food Packing System is achieved with an array of optical cameras mounted above the Working Surface of the Cart. These cameras point at the various work areas as in
(117) Step 640 states to reconstruct the 3D model. Preferably, as discussed above, stereo reconstruction is employed for this step.
(118) Step 650 states to segment and classify the food items. This step may be carried out as described above.
(119) Step 660 states to compute an optimal grasp approach for a piece of food. This step may be determined and carried out based on parameters of the end effector tool 770 and the robot arm 710, and the output from step 650.
(120) Step 670 states to execute grasp.
(121) Step 680 states to place food pieces in appropriate configuration in packing container 730. This step is executed by the robotic arm, and based on order information. In embodiments, pick and place is achieved using computer vision. Images are captured by video cameras and processed by convolutional neural networks. Such a network involves a series of convolutional layers and max pool layers as well as other layers. These neural networks can be trained to infer the optimal angle of approach and determine the path necessary to successfully pick up an object.
(122) Step 690 states to remove bin of packed containers. Optionally, similar to step 610, the system is operable to interface with either a human kitchen worker or another robotic kitchen assistant to remove the bin of packed food containers from the packing area 730.
(123) As mentioned herein, the workspace of the modular cart may be shielded to protect workers. In embodiments, and with reference to
(124) In embodiments, various types of gripping, grasping, wedging, squeezing, clamping, scooping, ladling, skewering, and suctioning tools are used to pick up one or more pieces of food. With reference to
(125) In embodiments, sorting and packing is performed with a gripper tool attached to the end of a robot arm. The robot arm can have 4, 5, 6, 7, or more degrees of freedom. Additionally, the robot can have other configurations including but not limited to, a SCARA or delta-type robot.
(126) In embodiments, the robot arm may have a camera on the wrist. The data from this camera can be combined with the data from other cameras to improve the accuracy of pick and place behaviors. In embodiments, the wrist imaging sensor may be RGB, IR, or depth, or some combination of these sensors.
(127) In embodiments, a convolutional neural network is sometimes used to identify packing containers, either in a stack or set out in preparation for packing.
(128) In embodiments, the decision on what and how to pack is driven by external data coming in via sensors and the Internet. Packing contents are determined by recipes.
(129) In embodiments, learning by demonstration is sometimes used to build a model for picking up food items. A human expert goes through the motions of picking up many examples of a food item or various food items. These data can be used to train a model to pick up various food objects.
(130) In embodiments, reinforcement learning (trial and error) is used. In this process, the system makes repeated attempts at picking up food objects. The system uses these attempts to refine the grasping model and eventually the system learns to grasp a variety of objects consistently.
(131) In embodiments, learned models for grasping are shared amongst numerous robots potentially with a wide geographic distribution.
(132) Smart Robotic Kitchen
(133) As discussed herein, the modular robotic kitchen system includes modular carts, appliances, and transports operable to interact and communicate with one another to deliver and prepare food according to an optimal schedule and with limited waste.
(134) With reference to
(135) Food Quantity Sensors
(136) With reference to
(137) The configuration of the hot case may vary. The hot case 950 shown in
(138) Additionally, in embodiments, the contents of a hot case is shared with other participants in the robotic kitchen (and sometimes also with a main controller or computer) upon which scheduling decisions (e.g., scheduling the food preparation steps) are determined.
(139)
(140) Step 1030 states demand model. Inputs 1010 to the demand model shown in
(141) Step 1040 states schedule optimizer. An exemplary scheduling engine is described in US Patent Publication No. 20180345485, entitled “MULTI-SENSOR ARRAY INCLUDING AN IR CAMERA AS PART OF AN AUTOMATED KITCHEN ASSISTANT SYSTEM FOR RECOGNIZING AND PREPARING FOOD AND RELATED METHODS.” In embodiments, a central controller aggregates data to drive scheduling decisions for the entire Smart Robotic Kitchen.
(142) In embodiments, Just-in-Time production scheduling is implemented using data from all participants in the Smart Kitchen and drives mechanical devices to produce.
(143) The scheduler then directs or instructs one or more robotic kitchen assistant 1070, 1072, 1074 to perform the various food preparation tasks as described herein.
(144)
(145)
(146) In embodiments, the conveyor belt assembly comprises a belt, an enclosure surrounding the belt. The enclosure acts as a protective shield to protect moving parts of the conveyor from the food. Additionally, each food item is prepared on a magnetic tray. In embodiments, the conveyor belt has a series of magnets on it. The conveyor is operable to move the magnetic food tray from underneath the protective barrier through a magnetic force.
(147) In embodiments, the conveyor system can include one or more sensors. For example, a sensor module can be arranged on one or more of the carts to obtain image data, or time of flight sensing. The sensor module optionally includes one or more CPUs and GPUs. A processor can be provided that is operable to run convolutional neural networks and geometric analysis of 3D data achieved through stereographic reconstruction, time-of-flight sensing, or other methods.
(148) Novel Fry Basket
(149) A robotic-friendly fry basket 800 for improved packing efficiency and safety, and reduced payload on humans is shown in
(150)
(151)
(152)
(153) Some of the advantages of the basket described above includes enabling a method for containing food for cooking in a fryer while enabling computer vision localization of basket; reducing time required to clean after use; and protecting the human worker. Additionally, in embodiments, smaller baskets are provided and used with the modular robotic system. Maintaining packing efficiency in a fryer while decreasing payload requirements can be accomplished by using many smaller baskets.
Alternative Embodiments
(154) It is to be understood that the modular robotic kitchen system may vary widely except as recited in the appended claims. For example, in another embodiment, and with reference to
(155) In embodiments, the modular cart may contain a tool belt to hold a variety of tools including measuring tools, gripping tools, and calibration tools.
(156) In embodiments, the modular cart may have several fixed fiducial markers to provide constant feedback on calibration accuracy and allow instantaneous calibration.
(157) In embodiments, and with reference to
(158) In the embodiment shown in
(159) Feet 1370 are shown extending from the legs at right angles from the legs for stability. Optionally, the feet may be mounted to the floor.
(160) The carriage and guide cooperate together to axially move the robotic arm along the guide when commanded to do so by the computer processor, which may be located locally, as described above.
(161) Although the linear guide system shows one robotic arm, the invention is not so limited except where recited in the appended claims. The linear rail guide system may include additional robotic arms movable along the rail to further increase the effective reach of the robotic arms. The computer and sensors operate together to determine the food preparation steps, recognize and locate the food items and utensils, and to schedule and carry out the order efficiently.
(162) Additionally, the linear guide system may be oriented variously. In embodiments, a linear guide system extends from the front towards the back (or from the top to bottom) of the cooking area. In addition to such axial motion, the robot manipulator itself enjoys several other degrees of motion (multi-axis). Consequently, the linear guide systems can perform any of the skills and applications described above such as those identified in
(163) The linear movement may be generated using a number of different linear movement systems. In embodiments, a cleanable linear actuator design extends the reach of one or more manipulators. In one embodiment, the linear actuator is composed of a ball screw mechanism with thread and pitch size large enough to easily clean between the threads.
(164) The frame may be made of various materials. In embodiments, the frame is formed of steel tubing, welded together.
(165) Additionally, the linear actuator may be covered to protect it. In embodiments, a barrier is shaped to cover the sliding mechanisms from any splashes from food production. A cover allows access of the carriage to move freely along the rail.
(166) Still other techniques may be employed by the robotic kitchen assistant to automatically remove debris from the fryer including rapidly contacting the rim of a trash receptacle with the skimmer, or brushing the skimmer with a tool.