Cleaning robot projecting different light patterns
11565423 · 2023-01-31
Assignee
Inventors
CPC classification
A47L2201/06
HUMAN NECESSITIES
A47L11/4061
HUMAN NECESSITIES
G01S17/48
PHYSICS
B25J9/1666
PERFORMING OPERATIONS; TRANSPORTING
A47L2201/04
HUMAN NECESSITIES
International classification
B25J9/00
PERFORMING OPERATIONS; TRANSPORTING
B25J11/00
PERFORMING OPERATIONS; TRANSPORTING
A47L11/40
HUMAN NECESSITIES
Abstract
There is provided a cleaning robot including a light source module, an image sensor and a processor. The light source module projects a line pattern and a speckle pattern toward a moving direction. The image sensor captures an image of the line pattern and an image of the speckle pattern. The processor calculates one-dimensional depth information according to the image of the line pattern and calculates two-dimensional depth information according to the image of the speckle pattern.
Claims
1. A cleaning robot, comprising: a first diffractive optical element; a first light source configured to project a line pattern through the first diffractive optical element; a second diffractive optical element; a second light source configured to project a speckle pattern through the second diffractive optical element, wherein the speckle pattern is for identifying an appearance of an obstacle; and an image sensor configured to acquire an image of the line pattern and an image of the speckle pattern.
2. The cleaning robot as claimed in claim 1, wherein the line pattern is within a region of the speckle pattern.
3. The cleaning robot as claimed in claim 2, wherein the first light source and the second light source are turned on simultaneously, and a dominant wavelength of the first light source is different from that of the second light source.
4. The cleaning robot as claimed in claim 2, wherein the first light source and the second light source are turned on sequentially.
5. The cleaning robot as claimed in claim 1, wherein the line pattern is outside of a region of the speckle pattern.
6. The cleaning robot as claimed in claim 1, wherein a projected angle of the line pattern is different from that of the speckle pattern.
7. The cleaning robot as claimed in claim 1, further comprising a processor configured to identify whether there is the obstacle according to the image of the line pattern.
8. The cleaning robot as claimed in claim 7, wherein the processor is further configured to turn on the first light source but turn off the second light source when identifying no obstacle in the image of the line pattern.
9. The cleaning robot as claimed in claim 7, wherein the processor is further configured to identify a distance from the obstacle according to the image of the line pattern, and control the cleaning robot to turn to move in a direction parallel to the obstacle when the distance is identical to a predetermined distance.
10. The cleaning robot as claimed in claim 9, wherein the image sensor comprises a linear pixel array, and when the cleaning robot is moving parallel to the obstacle, the processor is further configured to control the cleaning robot to maintain the predetermined distance to move parallel to the obstacle according to an image size of the obstacle detected by the linear pixel array.
11. A cleaning robot, comprising: a first diffractive optical element, disposed at a first position of the cleaning robot; a first light source configured to project a line pattern through the first diffractive optical element; a second diffractive optical element, disposed at a second position, different from the first position, of the cleaning robot; a second light source configured to project a speckle pattern through the second diffractive optical element; and an image sensor configured to acquire an image of the line pattern and an image of the speckle pattern.
12. The cleaning robot as claimed in claim 11, wherein the line pattern is within a region of the speckle pattern.
13. The cleaning robot as claimed in claim 12, wherein the first light source and the second light source are turned on simultaneously, and a dominant wavelength of the first light source is different from that of the second light source.
14. The cleaning robot as claimed in claim 12, wherein the first light source and the second light source are turned on sequentially.
15. The cleaning robot as claimed in claim 11, wherein the line pattern is outside of a region of the speckle pattern.
16. The cleaning robot as claimed in claim 11, wherein a projected angle of the line pattern is different from that of the speckle pattern.
17. The cleaning robot as claimed in claim 11, further comprising a processor configured to identify whether there is an obstacle according to the image of the line pattern, and identify an appearance of the obstacle according to the image of the speckle pattern.
18. The cleaning robot as claimed in claim 17, wherein the processor is configured to turn on the first light source but turn off the second light source when identifying no obstacle in the image of the line pattern.
19. The cleaning robot as claimed in claim 17, wherein the processor is further configured to identify a distance from the obstacle according to the image of the line pattern, and control the cleaning robot to turn to move in a direction parallel to the obstacle when the distance is identical to a predetermined distance.
20. The cleaning robot as claimed in claim 19, wherein the image sensor comprises a linear pixel array, and when the cleaning robot is moving parallel to the obstacle, the processor is further configured to control the cleaning robot to maintain the predetermined distance to move parallel to the obstacle according to an image size of the obstacle detected by the linear pixel array.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Other objects, advantages, and novel features of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
DETAILED DESCRIPTION OF THE EMBODIMENT
(12) It should be noted that, wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
(13) Referring to
(14) The cleaning robot 100 of the present disclosure includes a light source module 11, an image sensor 13 and a processor 15 electrically coupled to the light source module 11 and the image sensor 13. The light source module 11 includes at least one active light source, and is used to provide or project a line pattern T1 and a speckle pattern T2 toward a front of a moving direction (e.g., the right of
(15) Referring to
(17) The first light source LD1 is arranged opposite to the first diffractive optical element 113.sub.T1 and used to emit light to pass through the first diffractive optical element 113.sub.T1 to project a line pattern T1 in front of a moving direction of the cleaning robot 100. The second light source LD2 is arranged opposite to the second diffractive optical element 113.sub.T2 and used to emit light to pass through the second diffractive optical element 113.sub.T2 to project a speckle pattern T2 in front of the moving direction of the cleaning robot 100, wherein sizes and shapes of the speckles in the speckle pattern are not particularly limited as long as a plurality of speckles of identical or different shapes are generated on a projected surface.
(19) In
(20) Referring to
(21) When the line pattern T1 and the speckle pattern T2 are overlapped with each other as shown in
(22) In another embodiment, the line pattern T1 and the speckle pattern T2 are overlapped with each other and the first light source LD1 and the second light source LD2 are turned on simultaneously (as shown in
(23) In the embodiment of
(24) The processor 15 is, for example, a digital signal processor (DSP), a microcontroller unit (MCU), a central processing unit (CPU) or an application specific integrated circuit (ASIC) that identifies, by software and/or hardware, whether there is an obstacle (e.g., a wall, table legs, chair legs or the lower part of other furniture or home appliances) according to an image containing the line pattern T1, and identifies the appearance (referred to as two-dimensional depth information) of the obstacle according to an image containing the speckle pattern T2.
(25) For example referring to
(26) When an obstacle smaller than a range of the FOV exists within the FOV, a part of the line section in the image of the line pattern T1 appears at a different height (i.e. not at the position P1). Accordingly, the processor 15 identifies that there is an obstacle in front according to line sections at different positions.
(27) When an obstacle larger than a range of the FOV exists within the FOV, the whole of the line section in the image of the line pattern T1 appears at a different height, e.g., moving upward or downward from the position P1 which is determined according to relative positions between the light source module 11 and the image sensor 13. Accordingly, the processor 15 identifies that there is an obstacle in front according to a position shifting of the line section. In addition, the processor 15 further identifies a distance from the obstacle according to the height (or a shifting amount) of the line section in the image of the line pattern T1. For example, the cleaning robot 100 further includes a memory for storing a relationship between positions of the line section and distances from the obstacle (e.g., forming a lookup table, LUT). When identifying a position of the line section in the image of the line pattern T1, the processor 15 compares the calculated position with the stored information to obtain a distance of the obstacle (also adaptable to the case that a part of the line section appears at different positions).
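The lookup-table step above can be sketched as follows. This is a hypothetical illustration only: the calibration rows, distances and function names are invented, and a real device would use factory-measured values.

```python
import bisect

# Hypothetical calibration: pixel row of the line section -> distance (cm).
# In a real device this table would be measured during factory calibration.
CAL_ROWS = [40, 60, 80, 100, 120]                 # line-section positions (pixel rows)
CAL_DISTANCES = [200.0, 120.0, 80.0, 55.0, 40.0]  # corresponding distances in cm

def distance_from_line_row(row: int) -> float:
    """Interpolate obstacle distance from the row at which the
    projected line section appears in the captured image."""
    if row <= CAL_ROWS[0]:
        return CAL_DISTANCES[0]
    if row >= CAL_ROWS[-1]:
        return CAL_DISTANCES[-1]
    i = bisect.bisect_right(CAL_ROWS, row) - 1
    # Linear interpolation between neighbouring calibration points.
    t = (row - CAL_ROWS[i]) / (CAL_ROWS[i + 1] - CAL_ROWS[i])
    return CAL_DISTANCES[i] + t * (CAL_DISTANCES[i + 1] - CAL_DISTANCES[i])
```

Interpolating between stored entries keeps the table small while still handling line-section positions that fall between calibration points.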
(28) To reduce the power consumption and increase the accuracy, when the processor 15 identifies no obstacle in the image of the line pattern T1, preferably only the first light source LD1 is turned on but the second light source LD2 is not turned on. For example,
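The power-saving rule described above can be sketched as follows; the function and key names are hypothetical, not from the disclosure.

```python
def select_light_sources(obstacle_in_line_image: bool) -> dict:
    """Return on/off states for the two light sources: the line-pattern
    source stays on for 1-D ranging, while the speckle-pattern source is
    enabled only once the line image reveals an obstacle."""
    return {
        "LD1_line": True,                       # always on for 1-D ranging
        "LD2_speckle": obstacle_in_line_image,  # on only when 2-D depth is needed
    }
```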
(29) In the above embodiment, a cleaning robot 100 having only one image sensor 13 is taken as an example to illustrate the present disclosure, and the image sensor 13 captures images of both the line pattern T1 and the speckle pattern T2. In another non-limiting embodiment, the cleaning robot 100 includes a first image sensor for capturing an image of the line pattern T1 and a second image sensor for capturing an image of the speckle pattern T2 to reduce the interference therebetween. In this embodiment, arrangements of the first light source LD1, the first diffractive optical element 113.sub.T1, the second light source LD2 and the second diffractive optical element 113.sub.T2 are not changed, and thus details thereof are not repeated herein.
(30) The first image sensor and the second image sensor acquire images respectively corresponding to operations of the first light source LD1 and the second light source LD2. For example, the first light source LD1 and the second light source LD2 emit light sequentially, and the first image sensor and the second image sensor respectively capture images of the line pattern T1 and the speckle pattern T2 corresponding to the lighting of the first light source LD1 and the second light source LD2. In this embodiment, the line pattern T1 and the speckle pattern T2 are overlapped or not overlapped with each other, and dominant wavelengths of the first light source LD1 and the second light source LD2 are identical or different.
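The sequential lighting described above can be sketched as a simple frame schedule; the alternation and naming below are a hypothetical illustration, not the disclosure's specified timing.

```python
def source_for_frame(frame_index: int) -> str:
    """Time-multiplex the two projectors across capture frames:
    even frames light the line pattern, odd frames the speckle pattern,
    so each captured image is attributable to exactly one source."""
    return "LD1_line" if frame_index % 2 == 0 else "LD2_speckle"
```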
(31) In another embodiment, the first light source LD1 and the second light source LD2 are turned on simultaneously. If the line pattern T1 and the speckle pattern T2 are not overlapped with each other, a dominant wavelength of the first light source LD1 is identical to or different from that of the second light source LD2 without particular limitations. However, if the line pattern T1 and the speckle pattern T2 are overlapped with each other, the dominant wavelength of the first light source LD1 is preferably different from that of the second light source LD2 to avoid interference. In this case, the first image sensor has a light filter that blocks light other than the dominant wavelength of the first light source LD1, and the second image sensor has a light filter that blocks light other than the dominant wavelength of the second light source LD2.
(32) The processor 15 is electrically coupled to the first image sensor and the second image sensor, and used to identify whether there is an obstacle according to the image of the line pattern T1 received from the first image sensor, and identify the appearance of the obstacle according to the image of the speckle pattern T2 received from the second image sensor.
(33) Similarly, to reduce the power consumption and increase the accuracy, when the processor 15 identifies that there is no obstacle in a moving direction according to the image of the line pattern T1, only the first light source LD1 and the first image sensor are turned on, but the second light source LD2 and the second image sensor are not turned on as shown in
(34) In another embodiment, when moving in a direction parallel to the obstacle (e.g., a wall) at a predetermined distance, the cleaning robot 100 of the present disclosure captures the image of the line pattern T1 using the same image sensor 13 to maintain a wall distance without using other sensors.
(35) For example referring to
(36) The operating method herein is adaptable to the above embodiments having a single image sensor and two image sensors, respectively. Referring to
(37) Step S51: Firstly, the cleaning robot 100 is moving toward an obstacle W1 (e.g., a wall). The first light source LD1 emits light to go through the first DOE 113.sub.T1 to project a line pattern T1 toward a first direction (i.e., toward the obstacle W1). In this embodiment, it is assumed that a projected distance of the line pattern T1 is Z. The image sensor 13 then captures a first image Im1 containing the line pattern T1 as shown in
(38) As mentioned above, when the processor 15 identifies that there is at least one obstacle in the captured first image Im1 (the line section therein being moved or broken), the operating method further includes the steps of: controlling the second light source LD2 to emit light to go through the second DOE 113.sub.T2 to project a speckle pattern T2 toward the obstacle W1; and processing, by the processor 15, the image containing the speckle pattern T2 to obtain two-dimensional distance information, and details thereof have been illustrated above and thus are not repeated herein.
(39) Step S53: Next, the processor 15 calculates a position (e.g., the position H1 shown in
(40) Step S55: During the cleaning robot 100 moving toward the obstacle W1, the processor 15 calculates the relative distance at a predetermined frequency (e.g., corresponding to the image capturing frequency). When identifying that the relative distance is shortened to be equal to a predetermined distance (e.g., a wall distance M which is set before shipment), the processor 15 controls the cleaning robot 100 to turn (left or right) the moving direction to be parallel to the obstacle W1, e.g.,
(41) Step S57: Next, the cleaning robot 100 moves in a direction parallel to the obstacle W1 at a predetermined distance M therefrom as shown in
(42) Step S59: To maintain a parallel distance between the cleaning robot 100 and the obstacle W1 to be substantially identical to the predetermined distance M, the processor 15 continuously calculates the parallel distance according to a second image Im2 (referring to
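Steps S51 to S59 amount to a two-state controller: approach the wall, turn when the predetermined wall distance M is reached, then steer to hold that distance while moving parallel. The sketch below is a hypothetical condensation; the state names, steering commands and tolerance are invented for illustration.

```python
def wall_follow_step(state: str, distance_cm: float, wall_distance_cm: float,
                     tolerance_cm: float = 1.0):
    """Return (next_state, steering_command) for one control cycle."""
    if state == "APPROACH":
        if distance_cm <= wall_distance_cm:
            return "PARALLEL", "turn_90"   # step S55: turn to move parallel
        return "APPROACH", "forward"       # steps S51-S53: keep approaching
    # state == "PARALLEL": step S59, hold the wall distance while moving
    if distance_cm > wall_distance_cm + tolerance_cm:
        return "PARALLEL", "steer_toward_wall"
    if distance_cm < wall_distance_cm - tolerance_cm:
        return "PARALLEL", "steer_away_from_wall"
    return "PARALLEL", "forward"
```

Calling this once per captured frame, with the distance estimated from the line-pattern image, reproduces the approach-turn-follow behaviour of the flow described above.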
(43) In one non-limiting embodiment, the image sensor 13 includes a linear pixel array (i.e., a length thereof much larger than a width) for capturing the second image Im2. Meanwhile, the image sensor 13 preferably has a wide-angle lens to allow a field of view (shown as 2θ) of the image sensor 13 to be larger than a diameter of the cleaning robot 100. In this way, when the cleaning robot 100 moves in a direction parallel to the obstacle W1, the second image Im2 acquired by the image sensor 13 still contains the obstacle image, e.g., the region Pn shown in
(44) The method of controlling a moving direction of the cleaning robot 100 (i.e. controlling wheels by a motor) is known to the art and not a main objective of the present disclosure, and thus details thereof are not described herein.
(45) In one non-limiting embodiment, the wide field of view of the image sensor 13 is determined according to a size (e.g., diameter W) of the cleaning robot 100, a projected distance Z of the line pattern T1 and a wall distance (i.e., the predetermined distance M) by trigonometric calculation, e.g., θ=arctan((M+W/2)/Z). If the size W of the cleaning robot 100 is larger, the field of view 2θ becomes larger. In addition, the processor 15 preferably has the function of distortion compensation to eliminate the image distortion caused by the wide-angle lens.
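The relation θ=arctan((M+W/2)/Z) can be checked numerically; the values below are illustrative only and not taken from the disclosure.

```python
import math

def half_fov_deg(wall_distance_m: float, robot_diameter_w: float,
                 projected_distance_z: float) -> float:
    """Half field of view (degrees) needed to keep the wall in view,
    per theta = arctan((M + W/2) / Z)."""
    return math.degrees(
        math.atan((wall_distance_m + robot_diameter_w / 2) / projected_distance_z))

# Example: M = 10, W = 30, Z = 25 (same units) gives (10 + 15) / 25 = 1,
# so theta = 45 degrees and the full field of view 2*theta = 90 degrees.
```

As the paragraph notes, increasing the robot diameter W increases the required field of view 2θ, since the numerator of the arctangent grows.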
(46) In addition, as shown in
(47) It should be mentioned that the “wall distance” mentioned in the above embodiments is not limited to a distance from a “wall”. The “wall distance” is a distance from any obstacle having a large area such that the cleaning robot 100 cleans in a direction parallel to it.
(48) When an obstacle is transparent (e.g., a glass wall), a line pattern T1 projected by a cleaning robot can penetrate the transparent obstacle such that the processor 15 may not correctly identify a relative distance from the transparent obstacle. As a result, the cleaning robot may bump into the transparent obstacle, generating noise and causing damage to the device itself or to the wall. Accordingly, the present disclosure further provides a cleaning robot 100′ capable of identifying a relative distance from a transparent obstacle as shown in
(49) The cleaning robot 100′ of the present disclosure includes a laser light source LD3, a diffractive optical element 113′, a light emitting diode LD4, an image sensor 13 and a processor 15. In one non-limiting embodiment, the laser light source LD3 is implemented by the above first light source LD1, and the diffractive optical element 113′ is implemented by the above first diffractive optical element 113.sub.T1, and thus details thereof are not repeated herein. In this embodiment, the laser light source LD3 projects a line pattern T1 toward a moving direction through the diffractive optical element 113′.
(50) A dominant wavelength of light emitted by the light emitting diode LD4 is identical to or different from a dominant wavelength of light (e.g., 850 nm to 940 nm, but not limited thereto) emitted by the laser light source LD3. The light emitting diode LD4 illuminates light with an emission angle θ2 toward the moving direction. In one non-limiting embodiment, the laser light source LD3 projects the line pattern T1 toward the moving direction below a horizontal direction (i.e., having a dip angle θ1) such that when there is no obstacle in front of the cleaning robot 100′, the line pattern T1 is projected on the ground on which the machine is moving. The light emitting diode LD4 illuminates light right ahead of the moving direction (i.e., with neither a dip angle nor an elevation angle). In some embodiments, the light emitting diode LD4 is arranged to emit light toward the moving direction with a dip angle or an elevation angle smaller than 5 degrees.
(51) The image sensor 13 is implemented by the above image sensor 13 which acquires images with a field of view FOV toward the moving direction. Accordingly, when the laser light source LD3 is lighting, the captured images contain an image of the line pattern T1. As mentioned above, the processor 15 calculates and identifies a relative distance from an obstacle according to an image of the line pattern T1 (e.g., according to the position P1 mentioned above).
(52) The processor 15 is electrically coupled to the laser light source LD3 and the light emitting diode LD4 to control the laser light source LD3 and the light emitting diode LD4 to emit light at a predetermined frequency.
(53) As mentioned above, this embodiment is used to identify a distance from a transparent obstacle. Accordingly, when there is no transparent obstacle in a moving direction of the cleaning robot 100′, a signal-to-noise ratio (SNR) of an image (
(54) For example referring to
(55) In other words, in this embodiment, when the SNR of the image containing the line pattern T1 is within a predetermined threshold range, the processor 15 calculates a relative distance from the obstacle according to the image captured when the laser light source LD3 is emitting light; whereas, when the SNR of the image containing the line pattern T1 exceeds the predetermined threshold range, the processor 15 calculates a relative distance from the obstacle according to the image captured when the light emitting diode LD4 is emitting light. In one non-limiting embodiment, a dominant wavelength of light emitted by the light emitting diode LD4 is selected to have a higher reflectivity corresponding to a specific material (e.g., glass) to facilitate the distance detection.
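The SNR-based selection rule in this paragraph can be sketched as follows; the function name, return labels and threshold handling are hypothetical, chosen only to illustrate the decision described above.

```python
def choose_ranging_image(line_snr: float, snr_low: float, snr_high: float) -> str:
    """Pick which captured image to range with: the laser line-pattern image
    while its SNR stays inside the expected range, else the LED-lit image
    (e.g., when a transparent obstacle degrades the line-pattern SNR)."""
    if snr_low <= line_snr <= snr_high:
        return "laser_line_image"   # normal case: triangulate on the line pattern
    return "led_image"              # SNR out of range: likely glass, use LED frame
```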
(56) Referring to
(57) In addition, the embodiment of
(58) As mentioned above, the conventional cleaning robot can only detect one-dimensional distance information but is unable to detect the appearance of an obstacle. Furthermore, the conventional cleaning robot uses multiple sensors to detect a wall distance and thus suffers from dead zones. Accordingly, the present disclosure further provides a cleaning robot (e.g.,
(59) Although the disclosure has been explained in relation to its preferred embodiment, it is not used to limit the disclosure. It is to be understood that many other possible modifications and variations can be made by those skilled in the art without departing from the spirit and scope of the disclosure as hereinafter claimed.