G06V20/68

Image-based kitchen tracking system with anticipatory preparation management

The subject matter of this specification can be implemented in, among other things, methods, systems, and computer-readable storage media. A method can include receiving, by a processing device, image data including one or more image frames indicative of a current state of a meal preparation area. The processing device determines a first quantity of a first ingredient disposed within a first container based on the image data. The processing device determines a meal preparation procedure associated with the first ingredient based on the first quantity. The processing device causes a notification indicative of the meal preparation procedure to be displayed on a graphical user interface (GUI).
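The quantity-to-procedure step described above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the vision-based quantity estimator is stubbed out, and the refill threshold, `ContainerState` fields, and function names are all invented for the example.

```python
# Hypothetical sketch: once a vision model has estimated how much of an
# ingredient remains in a container, compare the estimate against a refill
# threshold and produce a notification string for the GUI.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContainerState:
    ingredient: str
    quantity: float      # estimated fill level, 0.0 (empty) to 1.0 (full)

def select_preparation_procedure(state: ContainerState,
                                 refill_threshold: float = 0.25) -> Optional[str]:
    """Return a meal preparation procedure for display, or None if stocked."""
    if state.quantity <= refill_threshold:
        return f"Prepare more {state.ingredient} (container at {state.quantity:.0%})"
    return None

print(select_preparation_procedure(ContainerState("diced onions", 0.1)))
```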

Image-based kitchen tracking system with dynamic labeling management

The subject matter of this specification can be implemented in, among other things, methods, systems, and computer-readable storage media. A method can include receiving, by a processing device, image data having one or more image frames indicative of a state of a meal preparation area. The method may further include determining, based on the image data, a first feature characterization of a first meal preparation item associated with the state of the meal preparation area. The method may further include determining that the first feature characterization does not meet object classification criteria for a set of object classifications. The method may further include causing a notification indicating the first meal preparation item and one of an object classification or a classification status corresponding to the first meal preparation item to be displayed on a graphical user interface (GUI).
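The classification-criteria check can be illustrated with a minimal sketch. Here the "feature characterization" is modeled as a per-class similarity score and the criterion as a confidence threshold; both are assumptions for illustration, not the patent's method.

```python
# Illustrative sketch of the dynamic-labeling decision: compare per-class
# scores against a confidence criterion; if no class qualifies, report an
# "unclassified" status so the item can be labeled manually via the GUI.
def classify_item(class_scores: dict, criterion: float = 0.8):
    """Return (label, status) for a meal preparation item."""
    best_class = max(class_scores, key=class_scores.get)
    if class_scores[best_class] >= criterion:
        return best_class, "classified"
    return None, "unclassified"

print(classify_item({"lettuce": 0.91, "cabbage": 0.40}))   # confident match
print(classify_item({"lettuce": 0.55, "cabbage": 0.52}))   # needs manual label
```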

Image-based drive-thru management system

The subject matter of this specification can be implemented in, among other things, methods, systems, and computer-readable storage media. A method can include receiving, by a processing device, image data including one or more image frames indicative of a current state of a drive-thru area. The processing device detects a vehicle disposed within the drive-thru area based on the image data. The processing device receives order data including a pending meal order. The processing device determines a first association between the vehicle and the pending meal order based on the image data. The processing device determines a meal delivery procedure based on the association between the vehicle and the pending meal order. The processing device may perform the meal delivery procedure. The processing device may provide the meal delivery procedure for display on a graphical user interface (GUI).
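The vehicle-to-order association can be sketched with a simple heuristic. First-in, first-out pairing is an assumption chosen for the example (the abstract does not specify the association logic), and all names below are illustrative.

```python
# Rough sketch: pair detected vehicles, in queue order, with pending meal
# orders (oldest first), then derive a delivery procedure from each pairing.
def associate_orders(vehicles_in_queue_order: list, pending_orders: list):
    """Pair each detected vehicle with a pending meal order, FIFO."""
    return list(zip(vehicles_in_queue_order, pending_orders))

def meal_delivery_procedure(vehicle: str, order: dict) -> str:
    return f"Deliver order #{order['id']} ({order['items']}) to {vehicle}"

pairs = associate_orders(["gray sedan", "red truck"],
                         [{"id": 101, "items": "2 burgers"},
                          {"id": 102, "items": "1 salad"}])
for vehicle, order in pairs:
    print(meal_delivery_procedure(vehicle, order))
```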

Beverage dispense monitoring with camera

A beverage dispenser includes a nozzle to dispense a beverage. The beverage dispenser further includes a camera to capture an image of the beverage as the beverage is dispensed from the nozzle. The camera has a field of view that includes the beverage. The beverage dispenser further includes a light source that illuminates the field of view of the camera. The beverage dispenser further includes a computer. The computer analyzes the image of the beverage and determines a characteristic of the beverage.
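A toy sketch of deriving one beverage characteristic from a camera frame follows. The "image" is a list of grayscale pixel rows, and foam is assumed to appear as bright rows at the top of the glass; a real dispenser would use calibrated imaging or a trained model, so everything here is illustrative.

```python
# Toy sketch: estimate foam height as the count of consecutive bright rows
# from the top of the frame, using a brightness threshold.
def foam_height(image_rows, brightness_threshold=200):
    """Count consecutive bright rows from the top of the glass."""
    height = 0
    for row in image_rows:
        if sum(row) / len(row) >= brightness_threshold:
            height += 1
        else:
            break
    return height

frame = [[230, 240, 235],   # foam
         [225, 228, 231],   # foam
         [90, 85, 88],      # liquid
         [92, 88, 86]]
print(foam_height(frame))   # 2
```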

INSPECTING PLANTS FOR CONTAMINATION
20180012344 · 2018-01-11 ·

A method of inspecting plants for contamination includes generating a first series of images of a plant, identifying a region of interest displayed in the first series of images, comparing a color parameter of the region of interest to a color criterion associated with a type of contamination, comparing a morphological parameter of the region of interest to a reference parameter associated with the type of contamination, and upon determining that the color parameter meets the color criterion and that the morphological parameter sufficiently matches the reference parameter, identifying the region of interest as a region of contamination on the plant. The method further includes transmitting an instruction to lift a cutter of a harvester up from a planting bed to avoid harvesting the plant in response to identifying the region of interest as the region of contamination, and generating a second series of images of an additional plant.
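The two-stage test in the abstract (a color criterion AND a morphological match) can be sketched as a conjunction of two checks. The thresholds, field names, and the interpretation of "sufficiently matches" as relative area within a tolerance are all assumptions for illustration.

```python
# Minimal sketch: a region of interest is flagged as contamination only when
# its color parameter meets the color criterion AND its morphological
# parameter (here, region area) sufficiently matches a reference value.
def is_contaminated(region, color_range=(0.6, 1.0), ref_area=50.0,
                    area_tolerance=0.3):
    color_ok = color_range[0] <= region["color"] <= color_range[1]
    # "Sufficiently matches" modeled as relative area deviation within tolerance.
    morph_ok = abs(region["area"] - ref_area) / ref_area <= area_tolerance
    return color_ok and morph_ok

print(is_contaminated({"color": 0.7, "area": 55.0}))   # True -> lift cutter
print(is_contaminated({"color": 0.7, "area": 120.0}))  # False
```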

INSPECTING PLANTS FOR CONTAMINATION
20180012347 · 2018-01-11 ·

A method of inspecting plants for contamination includes generating a first series of images of a plant using a camera mounted to a frame being moved along a planting bed by a harvester, identifying a region of interest displayed in the first series of images as a region of contamination on the plant based on a color criterion and a morphological criterion applied to the region of interest, and transmitting data including an instruction to increase a vertical distance between the plant and a cutter of the harvester to avoid harvesting the plant in response to identifying the region of interest as the region of contamination. The method further includes generating a second series of images of an additional plant as the frame continues to be moved along the planting bed by the harvester while the vertical distance between the plant and the cutter is being increased.
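The harvester-side response (increasing the vertical distance between cutter and plant on a contamination hit) can be sketched as follows. The clearance distance, initial height, and the `Cutter` API are invented for the example.

```python
# Sketch of the control response: on a contamination identification, raise
# the cutter by a clearance so the flagged plant is not harvested, while
# imaging of subsequent plants continues.
class Cutter:
    def __init__(self, height_cm: float = 2.0):
        self.height_cm = height_cm

    def lift(self, clearance_cm: float) -> None:
        self.height_cm += clearance_cm

def process_plant(cutter: Cutter, contaminated: bool,
                  clearance_cm: float = 15.0) -> float:
    """Raise the cutter past the plant if contamination was identified."""
    if contaminated:
        cutter.lift(clearance_cm)
    return cutter.height_cm

cutter = Cutter()
print(process_plant(cutter, contaminated=True))    # 17.0
print(process_plant(cutter, contaminated=False))   # 17.0 (no further lift)
```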

WINE LABEL RECOGNITION METHOD, WINE INFORMATION MANAGEMENT METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM
20230237824 · 2023-07-27 ·

A wine label recognition method, a wine information management method and apparatus, a computer device, and a computer-readable storage medium are provided. The method includes: obtaining a wine image, and performing optical character recognition (OCR) on the wine image in a preset OCR manner to obtain text included in the wine image (S21); performing deep learning recognition on the wine image in a preset deep learning recognition manner to obtain an image feature included in the wine image (S22); and sifting out a target wine label matching the text and the image feature from a preset wine label database according to the text and the image feature, and using the target wine label as a wine label corresponding to the wine image (S23). The advantages of deep learning and OCR are fully utilized, thereby improving the accuracy and efficiency of wine label recognition and the automation efficiency of wine information management.
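The sifting step can be sketched as a combined-score lookup. The token-overlap text score, cosine similarity on a feature vector, and the 50/50 weighting are assumptions chosen for the example; the abstract does not specify the matching scheme.

```python
# Hedged sketch: score each database entry by combining a text-overlap score
# (from OCR output) with a cosine similarity on image features, and return
# the best-matching wine label.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def text_score(ocr_text: str, label_text: str) -> float:
    ocr_tokens = set(ocr_text.lower().split())
    label_tokens = set(label_text.lower().split())
    return len(ocr_tokens & label_tokens) / max(len(label_tokens), 1)

def sift_label(ocr_text, image_feature, database):
    def score(entry):
        return 0.5 * text_score(ocr_text, entry["text"]) + 0.5 * cosine(image_feature, entry["feature"])
    return max(database, key=score)

db = [{"name": "Chateau A 2015", "text": "chateau a grand vin 2015", "feature": [1.0, 0.0]},
      {"name": "Domaine B 2018", "text": "domaine b pinot noir 2018", "feature": [0.0, 1.0]}]
print(sift_label("Grand Vin Chateau A", [0.9, 0.1], db)["name"])   # Chateau A 2015
```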

User terminal apparatus and control method thereof

A user terminal apparatus is provided. The user terminal apparatus includes a camera configured to obtain a captured image; a storage configured to store a food intake history; a processor configured to extract a food image from the captured image and determine a food type of a food item included in the captured image based on feature information of the extracted food image and the previously stored food intake history; and a display configured to display relevant information about the determined food type.
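The role of the stored intake history can be illustrated with a small sketch: model scores from the image are re-weighted by how often the user has eaten each food before, so ambiguous images resolve toward habitual foods. The re-weighting formula is an assumption for illustration, not the patent's method.

```python
# Illustrative sketch: bias image-based food classification with the user's
# previously stored intake history, scaling each score by (1 + history prior).
def determine_food_type(model_scores: dict, intake_history: dict) -> str:
    total = sum(intake_history.values()) or 1
    def adjusted(food):
        prior = intake_history.get(food, 0) / total
        return model_scores[food] * (1.0 + prior)
    return max(model_scores, key=adjusted)

scores = {"ramen": 0.48, "pho": 0.52}          # nearly ambiguous image
history = {"ramen": 9, "pho": 1}               # user mostly eats ramen
print(determine_food_type(scores, history))    # ramen
```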