DISH CLEANING BY DIRT LOCALIZATION AND TARGETING
20220304546 · 2022-09-29
CPC classification
G06T1/0014
PHYSICS
A47L2401/30
HUMAN NECESSITIES
G06F18/214
PHYSICS
Y02B40/00
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
A47L15/4282
HUMAN NECESSITIES
A47L15/46
HUMAN NECESSITIES
A47L15/0089
HUMAN NECESSITIES
A47L15/4295
HUMAN NECESSITIES
A47L2401/04
HUMAN NECESSITIES
G06V20/647
PHYSICS
A47L15/0047
HUMAN NECESSITIES
G06T7/521
PHYSICS
International classification
A47L15/00
HUMAN NECESSITIES
A47L15/46
HUMAN NECESSITIES
G06T7/521
PHYSICS
Abstract
A system and method for cleaning a dish, comprising: capturing at least one image of said dish using at least one camera; computing a bounding polygon for a dirty region in said image, wherein said dirty region is a region of said dish containing unwanted material that needs to be removed; computing a three dimensional representation of said bounding polygon by estimating three dimensional locations of multiple points within said bounding polygon; computing a spray pattern within said three dimensional representation of said bounding polygon such that the spray pattern substantially covers all regions of said three dimensional representation; and spraying a fluid on said dirty region of said dish with a nozzle, wherein said nozzle is reoriented or relocated such that said fluid reaches at least one location within said three dimensional representation of said bounding polygon according to said spray pattern, whereby dirty regions of said dish are targeted for a fast and efficient cleaning of said dish.
Claims
1. A system for cleaning a dish, comprising: a. at least one camera for capturing at least one image of said dish; b. a processor configured to: i. compute a bounding polygon for a dirty region in said image, wherein said dirty region is a region of said dish containing unwanted material that needs to be removed; ii. compute a three dimensional representation of said bounding polygon by estimating three dimensional locations of multiple points within said bounding polygon; iii. compute a spray pattern within said three dimensional representation of said bounding polygon such that the spray pattern substantially covers all regions of said three dimensional representation; and c. a nozzle for spraying a fluid on said dirty region of said dish, wherein said nozzle is reoriented or relocated such that said fluid reaches at least one location within said three dimensional representation of said bounding polygon according to said spray pattern, whereby dirty regions of said dish are targeted for a fast and efficient cleaning of said dish.
2. The system of claim 1, wherein said processor is further configured to compute a plurality of bounding polygons for a plurality of dirty regions in said image; compute a plurality of three dimensional representations of said plurality of bounding polygons; compute a plurality of spray patterns within said plurality of three dimensional representations.
3. The system of claim 1, further comprising a light source for illuminating said dish.
4. The system of claim 3, wherein said light source emits a structured pattern of light such as dots or lines.
5. The system of claim 3, wherein said light source is either configured as a ring that surrounds a camera or configured as a diffuse illumination panel.
6. The system of claim 3, wherein said light source emits infrared light and said camera is designed to capture infrared images.
7. The system of claim 1, wherein said camera captures images when a dish is placed and is ready for cleaning.
8. The system of claim 1, further comprising an ultraviolet light source to disinfect said dish.
9. The system of claim 1, wherein said bounding polygon is computed from the edges of said dirty region of said dish.
10. The system of claim 9, wherein each edge of said bounding polygon is substantially parallel to its closest edge of said dirty region of said dish.
11. The system of claim 1, wherein said bounding polygon is computed by computing multiple feature points along the edges of said dirty region of said dish, wherein said feature points are substantially different from at least some of their surrounding regions.
12. The system of claim 1, wherein said unwanted material in said dirty region is leftover food, dust, germs or any organic matter.
13. The system of claim 1, wherein said bounding polygon fully encompasses said dirty region in said image.
14. The system of claim 1, wherein said bounding polygon, said three dimensional representation or said spray pattern is estimated using a deep learning model.
15. The system of claim 1, wherein said three dimensional locations are computed from a depth map of said dish.
16. The system of claim 15, wherein said depth map is estimated using stereo matching from at least two camera images.
17. The system of claim 15, wherein said depth map is estimated by projecting a structured illumination pattern on said dish, recording an image of said dish with said camera, computing deformations to said illumination pattern from said image, and estimating depth map from said deformations.
18. The system of claim 1, wherein said three dimensional locations are computed by estimating the location of said bounding polygon within a known three dimensional model of said dish.
19. The system of claim 1, wherein said spray pattern comprises a plurality of waypoints such that each waypoint corresponds to a specific position or orientation of said nozzle.
20. A method for cleaning a dish, comprising: a. capturing at least one image of said dish using at least one camera; b. computing a bounding polygon for a dirty region in said image, wherein said dirty region is a region of said dish containing unwanted material that needs to be removed; c. computing a three dimensional representation of said bounding polygon by estimating three dimensional locations of multiple points within said bounding polygon; d. computing a spray pattern within said three dimensional representation of said bounding polygon such that the spray pattern substantially covers all regions of said three dimensional representation; and e. spraying a fluid on said dirty region of said dish with a nozzle, wherein said nozzle is reoriented or relocated such that said fluid reaches at least one location within said three dimensional representation of said bounding polygon according to said spray pattern, whereby dirty regions of said dish are targeted for a fast and efficient cleaning of said dish.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0025] A light source 5 illuminates dish 6, while cameras 3 and 4 capture one or more images of dish 6. A nozzle 1 sprays a fluid on dish 6 with a predetermined spray distribution 2. Fluids include liquids and gases such as water, soap, rinsing agent, sanitizing agent, cleaning agent, or air. The nozzle 1 can reorient or relocate to spray fluid on any region of dish 6 visible in one or more images captured by cameras 3 and 4.
[0026] A processor is configured to compute a bounding polygon 10 for a dirty region in a camera image. A three dimensional representation of the bounding polygon 10 is then computed by estimating three dimensional locations of multiple points within the bounding polygon 10. A spray pattern 11 within the three dimensional representation of the bounding polygon 10 is computed such that the spray pattern 11 substantially covers all regions of the three dimensional representation. In some embodiments, the bounding polygon 10 fully encompasses the dirty region in a camera image.
[0027] In some embodiments, a bounding polygon 10 is computed from the edges of the dirty region 7 of dish 6. In some embodiments, each edge of the bounding polygon 10 is substantially parallel to its closest edge of dirty region 7. In some embodiments, the bounding polygon 10 is computed by computing multiple feature points along the edges of the dirty region 7 of dish 6. In some embodiments, feature points along edges are points that are substantially different from at least some of their surrounding regions.
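As one illustrative sketch (not the disclosed implementation), a bounding polygon that fully encompasses a dirty region can be obtained as the convex hull of thresholded "dirty" pixels; the intensity threshold and grayscale image format here are assumptions:

```python
import numpy as np

def bounding_polygon(image, threshold=0.5):
    """Return a convex bounding polygon (list of (x, y) vertices)
    enclosing all pixels flagged as dirty.

    `image` is a 2-D float array; pixels above `threshold` are treated
    as dirty (an assumed, simplistic dirt detector). The hull is
    computed with Andrew's monotone chain algorithm.
    """
    ys, xs = np.nonzero(image > threshold)
    pts = sorted(set(zip(xs.tolist(), ys.tolist())))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); sign gives turn direction
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # endpoints shared, drop duplicates
```

A convex hull is only one choice; a tighter non-convex polygon traced along detected edges would follow the dirty region's outline more closely, at the cost of a more involved contour-tracing step.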
[0028] In some embodiments, the bounding polygon 10, the three dimensional representation, or the spray pattern 11 is estimated using a deep learning model such as a neural network. In some embodiments, a deep learning model takes a camera image of a dish as an input and returns one or more bounding polygons 10 as output. In some embodiments, a deep learning model is trained with multiple camera images and their corresponding bounding polygons 10.
[0029] In some embodiments, a deep learning model takes a camera image of a dish as an input and returns a three dimensional representation as output. In some embodiments, a deep learning model is trained with multiple camera images and their corresponding three dimensional representations. In some embodiments, a deep learning model takes a bounding polygon 10 as an input and returns a three dimensional representation as output. In some embodiments, a deep learning model is trained with multiple bounding polygons 10 and their corresponding three dimensional representations.
[0030] In some embodiments, a deep learning model takes a camera image of a dish as an input and returns a spray pattern 11 as output. In some embodiments, a deep learning model is trained with multiple camera images and their corresponding spray patterns 11. In some embodiments, a deep learning model takes a three dimensional representation as an input and returns a spray pattern 11 as output. In some embodiments, a deep learning model is trained with multiple three dimensional representations and their corresponding spray patterns 11. In some embodiments, a deep learning model takes a bounding polygon 10 as an input and returns a spray pattern 11 as output. In some embodiments, a deep learning model is trained with multiple bounding polygons 10 and their corresponding spray patterns 11.
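As a minimal sketch of the image-to-polygon mapping described above (not the patented model), a small two-layer neural network can regress a fixed number of polygon vertices from an image. The 32×32 input size, K = 4 vertices, hidden width, and random untrained weights are all illustrative assumptions; a real system would train the weights on (image, polygon) pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

K = 4  # assumed number of polygon vertices regressed per region
W1 = rng.normal(scale=0.01, size=(32 * 32, 64))  # input -> hidden
b1 = np.zeros(64)
W2 = rng.normal(scale=0.01, size=(64, 2 * K))    # hidden -> 2K coords
b2 = np.zeros(2 * K)

def predict_polygon(image):
    """Forward pass: (32, 32) grayscale image -> K (x, y) vertices.

    Outputs are squashed to [0, 1] and interpreted as coordinates
    normalized by the image width and height.
    """
    h = np.maximum(0.0, image.ravel() @ W1 + b1)   # ReLU hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid -> [0, 1]
    return out.reshape(K, 2)
```

The same pattern extends to the other mappings in this paragraph (image to spray pattern, polygon to three dimensional representation) by changing the input and output shapes.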
[0031] In some embodiments, the three dimensional locations of multiple points within the bounding polygon 10 are computed from a depth map of dish 6. In some embodiments, the depth map is estimated using stereo matching from at least two camera images. In some embodiments, the depth map is estimated by projecting a structured illumination pattern on the dish, recording an image of dish 6 with said camera, computing deformations to the illumination pattern from the image, and estimating a depth map from said deformations. In some embodiments, the three dimensional locations are computed by estimating the location of the bounding polygon 10 within a known three dimensional model of dish 6. In some embodiments, spray pattern 11 comprises a plurality of waypoints such that each waypoint corresponds to a specific position or orientation of nozzle 1.
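Once a depth map is available, lifting polygon points to three dimensions is a standard pinhole-camera back-projection. The sketch below assumes calibrated intrinsics (focal lengths fx, fy in pixels and principal point (cx, cy)) and a depth map aligned with the camera image:

```python
import numpy as np

def backproject(pixels, depth_map, fx, fy, cx, cy):
    """Lift 2-D pixel coordinates to 3-D points in the camera frame.

    For each pixel (u, v) with depth Z = depth_map[v, u]:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy
    """
    pts = []
    for u, v in pixels:
        z = depth_map[v, u]
        pts.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return np.array(pts)
```

Applying this to every vertex (or to a grid of interior points) of the bounding polygon yields the three dimensional representation over which a spray pattern can be planned.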
[0032] The nozzle 1 sprays a fluid 2 on the dirty region 7 of dish 6. The nozzle 1 is reoriented or relocated such that the fluid reaches at least one location within the three dimensional representation of the bounding polygon 10 according to the spray pattern 11. Accordingly, dirty regions of said dish are targeted for a fast and efficient cleaning of said dish. The dish 6, nozzle 1, cameras 3 and 4, and light source 5 are enclosed in a module 12.
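One simple way to realize a spray pattern that substantially covers a polygonal region (an illustrative coverage strategy, not the disclosed one) is a boustrophedon raster: sweep rows across the polygon's bounding box, keep the points that fall inside the polygon, and alternate sweep direction row by row. The row spacing `step` would be chosen from the nozzle's spray distribution 2; here it is an assumed parameter:

```python
import numpy as np

def point_in_polygon(x, y, poly):
    """Ray-casting test: count crossings of a rightward ray from (x, y)."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def raster_spray_pattern(poly, step=1.0):
    """Boustrophedon (alternating-direction) waypoints covering a polygon.

    Rows `step` apart sweep the polygon's bounding box and are clipped
    to the polygon; each waypoint would map to one nozzle pose.
    """
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    waypoints, flip = [], False
    y = float(min(ys))
    while y <= max(ys):
        row = [(float(x), y)
               for x in np.arange(min(xs), max(xs) + step, step)
               if point_in_polygon(x, y, poly)]
        waypoints.extend(reversed(row) if flip else row)
        flip = not flip
        y += step
    return waypoints
```

Reversing every other row minimizes nozzle travel between rows, which serves the stated goal of fast and efficient cleaning.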
[0033] In some embodiments, a processor is further configured to compute a plurality of bounding polygons 10 for a plurality of dirty regions in a camera image. In some embodiments, a processor then computes a plurality of three dimensional representations of the plurality of bounding polygons 10. Further, in some embodiments, a processor computes a plurality of spray patterns 11 within the plurality of three dimensional representations.
[0034] Some embodiments comprise a light source for illuminating the dish 6. In some embodiments, the light source emits a structured pattern of light such as dots or lines. In some embodiments, the light source is either configured as a ring that surrounds a camera or configured as a diffuse illumination panel. In some embodiments, the light source emits infrared light. Some embodiments further comprise an ultraviolet light source to disinfect dish 6.
[0035] In some embodiments, a camera is designed to capture infrared images. In some embodiments, a camera captures images when a dish is placed and is ready for cleaning.