UNMANNED LAWN MOWER WITH AUTONOMOUS DRIVING
20200042009 · 2020-02-06
Inventors
CPC classification
A01D42/00
HUMAN NECESSITIES
A01D34/84
HUMAN NECESSITIES
G05D1/0088
PHYSICS
G05D1/0038
PHYSICS
International classification
A01D34/84
HUMAN NECESSITIES
G05D1/00
PHYSICS
Abstract
An unmanned lawn mower includes a mower body, a cutting module, a wheel module, a camera module and a central processing unit (CPU). The cutting module is mounted on the mower body and configured to weed. The wheel module is mounted on the mower body and configured to move the mower body. The camera module is mounted on the mower body and configured to capture images of surroundings of the mower body. The CPU is coupled to the cutting module, the wheel module and the camera module. The CPU controls the cutting module and the wheel module to weed within an area according to the images captured by the camera module and control signals from a handheld electronic device, or the CPU controls the cutting module and the wheel module to weed within the area according to the images captured by the camera module.
Claims
1. An unmanned lawn mower with autonomous driving, comprising: a mower body; a cutting module mounted on the mower body and configured to weed; a wheel module mounted on the mower body and configured to move the mower body; a camera module mounted on the mower body and configured to capture images of surroundings of the mower body; and a central processing unit (CPU) mounted in the mower body and coupled to the cutting module, the wheel module and the camera module; wherein the central processing unit controls the cutting module and the wheel module to weed within an area according to the images captured by the camera module and control signals from a handheld electronic device, or the central processing unit controls the cutting module and the wheel module to weed within the area according to the images captured by the camera module.
2. The unmanned lawn mower of claim 1, wherein a boundary within the area for weeding is defined by the control signals sent by the handheld electronic device cooperatively with the images captured by the camera module, and the unmanned lawn mower weeds within the boundary.
3. The unmanned lawn mower of claim 2, wherein the CPU defines a plurality of image characteristics on the boundary according to the images captured by the camera module.
4. The unmanned lawn mower of claim 3, wherein the camera module is a stereo camera, and each of the image characteristics comprises a depth message.
5. The unmanned lawn mower of claim 2, wherein the CPU computes a weeding trajectory within the boundary based on a profile of the boundary.
6. The unmanned lawn mower of claim 1, wherein a route within the area for weeding is defined by the control signals sent by the handheld electronic device cooperatively with the images captured by the camera module, and the unmanned lawn mower weeds along the route.
7. The unmanned lawn mower of claim 6, wherein the CPU defines a plurality of image characteristics on the route according to the images captured by the camera module.
8. The unmanned lawn mower of claim 7, wherein the camera module is a stereo camera, and each of the image characteristics comprises a depth message.
9. The unmanned lawn mower of claim 1, further comprising: a wireless signal based positioning module coupled to the CPU and configured to position the mower body by establishing connection with at least one wireless positioning terminal, wherein a boundary or a route is defined by the control signals sent by the handheld electronic device, the images captured by the camera module and wireless positioning signals transmitted from the at least one wireless positioning terminal, and the unmanned lawn mower weeds within the boundary or along the route.
10. The unmanned lawn mower of claim 9, further comprising: a dead reckoning module coupled to the CPU and configured to position the mower body, wherein the boundary or the route is further defined by the dead reckoning module.
11. The unmanned lawn mower of claim 10, wherein the wireless signal based positioning module comprises at least one of a GPS module, a WiFi signal receiving module and a Bluetooth signal receiving module, and the dead reckoning module comprises a gyroscope and/or an accelerometer.
12. The unmanned lawn mower of claim 1, further comprising: a proximity sensor module coupled to the CPU and configured to detect an object around the mower body, the proximity sensor module generating a proximity warning signal when the object is within a predetermined range relative to the mower body.
13. The unmanned lawn mower of claim 1, further comprising: a remote device communication module coupled to the CPU and configured to establish connection with the handheld electronic device; wherein the handheld electronic device operably sends the control signals to the remote device communication module, and the CPU controls: the wheel module to move based on the control signals; and the camera module to capture the images when the mower body is moved; wherein the CPU controls the remote device communication module to transmit the images to the handheld electronic device.
14. The unmanned lawn mower of claim 1, further comprising: a storage unit coupled to the CPU and configured to store at least one identification image registered; wherein the CPU determines whether an initial user image of a user captured by the camera module matches the at least one identification image registered, and the CPU controls the wheel module to follow a movement of the user according to user motion images captured by the camera module when the initial user image of the user matches the at least one identification image registered, so as to define a boundary within the area for weeding, and the unmanned lawn mower weeds within the boundary.
15. The unmanned lawn mower of claim 14, wherein the CPU defines a plurality of image characteristics on the boundary according to the images captured by the camera module.
16. The unmanned lawn mower of claim 15, wherein the camera module is a stereo camera, and each of the image characteristics comprises a depth message.
17. The unmanned lawn mower of claim 14, wherein the CPU computes a weeding trajectory within the boundary based on a profile of the boundary.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0036] In the following detailed description of the embodiments, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. In this regard, directional terminology, such as top, bottom, etc., is used with reference to the orientation of the Figure(s) being described. The components of the present invention can be positioned in a number of different orientations. As such, the directional terminology is used for purposes of illustration and is in no way limiting. On the other hand, the drawings are only schematic and the sizes of components may be exaggerated for clarity. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of including, comprising, or having and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms connected, and installed and variations thereof herein are used broadly and encompass direct and indirect connections and installations. Accordingly, the drawings and descriptions will be regarded as illustrative in nature and not as restrictive.
[0037] Referring to
[0038] In the present embodiment, the cutting module 2 can include a blade motor 20 and a blade unit 21. The blade unit 21 is configured to weed, and the blade motor 20 is configured to drive the blade unit 21 to weed. Further, the blade motor 20 is coupled to the CPU 5 and the blade unit 21. In such a manner, the CPU 5 is able to activate or shut down the blade unit 21 as practical situations, such as emergencies, require.
[0039] In the present embodiment, the wheel module 3 can include a wheel control unit 30, a wheel rotating motor 31, a rotary speed sensor 32, a front wheel mount 33 and a rear wheel mount 34. The wheel rotating motor 31 is coupled to the rear wheel mount 34 and configured to drive the mower body 1 to move forwards or backwards. The rotary speed sensor 32 is disposed near the rear wheel mount 34 and configured to detect a rotating speed of the rear wheel mount 34. The front wheel mount 33 is mounted on the mower body 1 and configured to change the moving direction of the mower body 1 of the unmanned lawn mower 1000. The wheel control unit 30 is coupled to the CPU 5, the wheel rotating motor 31 and the rotary speed sensor 32. Practically, the wheel control unit 30 can be a circuitry on a main board of the unmanned lawn mower 1000. In such a manner, the CPU 5 is able to control the movement of the mower body 1 of the unmanned lawn mower 1000 through the wheel control unit 30, the wheel rotating motor 31, the rotary speed sensor 32, the front wheel mount 33 and the rear wheel mount 34.
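The closed loop described above, in which the CPU drives the wheel rotating motor through the wheel control unit while the rotary speed sensor feeds back the actual wheel speed, can be sketched as a simple proportional controller. This is an illustrative assumption rather than the patent's implementation; the class name `WheelController` and the gain `kp` are invented for the example.

```python
class WheelController:
    """Minimal sketch of proportional wheel-speed regulation (assumed)."""

    def __init__(self, target_rpm: float, kp: float = 0.5):
        self.target_rpm = target_rpm
        self.kp = kp            # proportional gain (illustrative value)
        self.command = 0.0      # motor drive command, arbitrary units

    def update(self, measured_rpm: float) -> float:
        """One control step: nudge the drive command by the speed error
        reported by the rotary speed sensor."""
        error = self.target_rpm - measured_rpm
        self.command += self.kp * error
        return self.command
```

Starting from rest with a target of 100 RPM, the first update raises the drive command to 50; as the measured speed climbs to 80 RPM, the next update raises it to 60, shrinking the correction as the error shrinks.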
[0040] As shown in
[0041] The blade shutdown module B is coupled to the CPU 5 and configured for tilt and lift sensing. For example, when the mower body 1 is lifted or tilted by an external force while the unmanned lawn mower 1000 is working and the cutting module 2 is activated, the blade shutdown module B senses the attitude of the mower body 1 and sends an attitude warning signal to the CPU 5. The CPU 5 shuts down the cutting module 2 when receiving the attitude warning signal sent by the blade shutdown module B, for the sake of safety.
[0042] As shown in
[0043] In the present embodiment, the wireless signal based positioning module 8 can include at least one of a GPS module 80, a WiFi signal receiving module 81 and a Bluetooth signal receiving module 82. The GPS module 80 is configured to receive signals from satellites, so that the wireless signal based positioning module 8 can position the mower body 1 outdoors. The WiFi signal receiving module 81 is configured to establish connection with WiFi hotspots, i.e., the at least one wireless positioning terminal is a WiFi hotspot, so that the wireless signal based positioning module 8 can position the mower body 1 indoors. The Bluetooth signal receiving module 82 is configured to establish connection with electronic devices with Bluetooth access, i.e., the at least one wireless positioning terminal is an electronic device with Bluetooth access, so that the wireless signal based positioning module 8 can position the mower body 1 indoors.
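Positioning the mower body from signals of terminals at known locations (satellites, WiFi hotspots, Bluetooth devices) commonly reduces to trilateration from range estimates. The patent does not specify the algorithm, so the sketch below is a generic, noise-free 2-D trilateration offered only as an assumption of how such a fix could be computed.

```python
def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Estimate a 2-D position from distances d1..d3 to three terminals at
    known positions p1..p3, by subtracting pairs of circle equations to get
    two linear equations and solving them. Exact for noise-free inputs."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    den = a * e - b * d  # zero if the three terminals are collinear
    return ((c * e - b * f) / den, (a * f - c * d) / den)
```

With real WiFi or Bluetooth ranges the measurements are noisy, so a practical system would use a least-squares fit over more than three terminals rather than this exact intersection.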
[0044] The dead reckoning module 9 is coupled to the CPU 5 and configured to position the mower body 1. In the present embodiment, the dead reckoning module 9 can include a gyroscope 90 and/or an accelerometer 91. The gyroscope 90 is able to detect an orientation of the mower body 1 during a movement of the mower body 1, and the accelerometer 91 is able to detect a current speed of the mower body 1. A combination of the gyroscope 90 and the accelerometer 91 is able to position the mower body 1 without satellite signals, WiFi signals or Bluetooth signals.
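The gyroscope-plus-accelerometer positioning described here is classic dead reckoning: integrate heading and speed over time to advance the position estimate without any external signal. A minimal sketch, assuming the heading and speed have already been extracted from the raw sensor data:

```python
import math

def dead_reckon(x, y, heading_rad, speed, dt):
    """Advance an estimated position (x, y) by one time step dt, using only
    a heading (e.g. integrated from the gyroscope) and a speed (e.g. derived
    from the accelerometer); no satellite, WiFi or Bluetooth signal needed."""
    return (x + speed * math.cos(heading_rad) * dt,
            y + speed * math.sin(heading_rad) * dt)
```

Because each step integrates the previous estimate, errors accumulate over time, which is why the patent pairs the dead reckoning module with the wireless signal based positioning module.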
[0045] The proximity sensor module A is coupled to the CPU 5 and configured to detect an object, e.g., an obstacle, a dog, a baby and so on, around the mower body 1. The proximity sensor module A generates a proximity warning signal when the object is within a predetermined range relative to the mower body 1, wherein the predetermined range depends on the category of the proximity sensor module A. In the present embodiment, the proximity sensor module A can be one or more selected from a sonar sensor module, an infrared sensor module, a light detection and ranging (LiDAR) module and a radar module.
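The proximity warning signal described above reduces to a range threshold test, where the threshold is the predetermined range for the sensor category in use (sonar, infrared, LiDAR or radar). A minimal, illustrative sketch:

```python
def proximity_warning(distance_m: float, warning_range_m: float) -> bool:
    """Return True (i.e. emit a proximity warning signal) when a detected
    object lies within the predetermined range of the mower body."""
    return distance_m <= warning_range_m
```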
[0046] Referring to
[0047] When the activating member F2 pushes the end of the second lever part F5 in the first driving direction D1, the lever member F3 pivots about the second shaft F1 to rotate relative to the casing 10 in a first rotating direction R1, so that the camera module 4 is lifted from a retracted position shown in
[0048] Referring to
[0054] Referring
[0055] For example, when the unmanned lawn mower 1000 is in the start location (i.e., the first position P1 shown in
[0056] Besides the real time display section 61, the user interface 60 of the handheld electronic device 6 further has a control section 62 including a direction button section 620, a mapping section 621, a go button section 622 and a stop button section 623. The direction button section 620, the go button section 622 and the stop button section 623 of the control section 62 are configured to generate the user-initiated commands, so that the user U could operably generate the user-initiated commands for controlling the unmanned lawn mower 1000 in cooperation with the images sent by the remote device communication module 7 of the unmanned lawn mower 1000.
[0057] Afterwards, the CPU 5 is able to define the boundary 100 by directing the unmanned lawn mower 1000 back to the start location according to the images and the control signals with respect to the user-initiated command (Step 102). In other words, after completion of directing the unmanned lawn mower 1000 from the start location (i.e., the first position P1 shown in
[0058] It should be noticed that during the movement of the unmanned lawn mower 1000 from the start location back to the start location, the CPU 5 defines a plurality of image characteristics on the boundary 100 according to the images captured by the camera module 4. For example, when the camera module 4 captures an image of a first geographic feature GF1 shown in
[0059] In the present embodiment, the camera module 4 can be a stereo camera, so that each of the image characteristics includes a depth message, i.e., a distance between the mower body 1 and the corresponding geographic feature is included in the image characteristic through image processing on the binocular fields of view generated by the stereo camera. The boundary 100 can be generated from the depth messages of the surroundings and be shown in the mapping section 621. Preferably, distance information detected by the proximity sensor module A can be referenced by the CPU 5 when generating the mapping section 621. The category of the camera module 4 is not limited to that illustrated in the present embodiment. For example, the camera module 4 can be a depth camera, a monocular camera and so on, depending on practical demands.
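The depth message a stereo camera yields follows the standard pinhole stereo relation Z = f * B / d: depth equals focal length times baseline divided by disparity. This is textbook stereo geometry rather than text from the patent, and the parameter names are illustrative:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a feature from a rectified stereo pair: Z = f * B / d,
    where f is the focal length in pixels, B the baseline between the two
    lenses in meters, and d the disparity of the same feature between the
    left and right images in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 700-pixel focal length, a 10 cm baseline and a 35-pixel disparity, the feature lies about 2 m from the mower body.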
[0060] When the boundary 100 is defined, the CPU 5 computes the weeding trajectory 300 within the boundary 100 based on the profile of the boundary 100 (Step 103). Practically, the CPU 5 computes the weeding trajectory 300 through several algorithms, such as an artificial potential field method, a grid method, a fuzzy control algorithm, a neural network path planning method and so on. Afterwards, the CPU 5 controls the unmanned lawn mower 1000 to weed along the weeding trajectory 300 within the boundary 100.
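Of the path planning methods named here, the grid method is the simplest to sketch: discretize the area inside the boundary into cells and sweep them in a back-and-forth (boustrophedon) pattern. The sketch below assumes a plain rectangular grid and is not the patent's algorithm:

```python
def boustrophedon_path(rows: int, cols: int):
    """Generate a back-and-forth coverage path over a rows x cols grid of
    cells: left-to-right on even rows, right-to-left on odd rows, so that
    every cell is visited exactly once with minimal turning."""
    path = []
    for r in range(rows):
        cells = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cells)
    return path
```

A real planner would first mask out cells that fall outside the boundary profile or contain obstacles; this sketch only shows the sweep order over a full grid.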
[0061] Referring to
[0066] The major difference between the method of the present embodiment and that of the aforesaid embodiment is that the route 400 within the area 200 for weeding is defined by the control signals sent by the handheld electronic device 6 cooperatively with the images captured by the camera module 4, and the unmanned lawn mower 1000 weeds along the route 400. In other words, the route 400 for weeding is assigned by the handheld electronic device 6 from the start location (i.e., a first position P1 shown in
[0067] Since the unmanned lawn mower 1000 can be equipped with the wireless signal based positioning module 8 and/or the dead reckoning module 9, the boundary 100 or the route 400 is further defined, in addition to the control signals sent by the handheld electronic device 6 and the images captured by the camera module 4, by wireless positioning signals transmitted from the at least one wireless positioning terminal and/or by the dead reckoning module 9, and the unmanned lawn mower 1000 weeds within the boundary 100 or along the route 400.
[0068] Referring to
[0078] As shown in
[0079] When the unmanned lawn mower 1000 is desired to weed, at first, an initial user image 500 of the user U, as shown in
[0080] When the initial user image 500 does not match the identification image, the user U does not pass the check and the unmanned lawn mower 1000 idles (Step S303). When the initial user image 500 matches the identification image, the user U passes the check and the CPU 5 controls the mower body 1 to follow the movement of the user U according to the user motion image captured by the camera module 4 through image processing (Step S304), in order to define the boundary or the route. Steps S305 to S308 are similar to those in
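The identification check described above can be sketched as comparing a feature descriptor of the initial user image against descriptors of the registered identification images. The descriptor representation and the cosine-similarity threshold below are assumptions; the patent does not specify how images are matched:

```python
def matches_registered(user_descriptor, registered_descriptors, threshold=0.8):
    """Decide whether the initial user image matches any registered
    identification image. Descriptors are feature vectors (assumed), and
    similarity is their cosine; 0.8 is an illustrative threshold."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0
    return any(cosine(user_descriptor, r) >= threshold
               for r in registered_descriptors)
```

A match corresponds to passing the check (Step S304, follow the user); no match corresponds to idling (Step S303).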
[0081] Referring to
[0087] It should be noticed that certain emergency cases might occur during the weeding process, and hence, procedures are implemented for such emergency cases. When the unmanned lawn mower 1000 weeds along the weeding trajectory 300 within the boundary 100 or along the route 400, the proximity sensor module A detects objects on the weeding trajectory 300 or along the route 400 (Step S400). Herein, an example is illustrated in which the unmanned lawn mower 1000 weeds along the weeding trajectory 300 and the camera module 4 is a stereo camera.
[0088] As shown in
[0089] When the object O detected (or the distance 700) is not within the warning range, the unmanned lawn mower 1000 continues to weed along the weeding trajectory 300 (Step S400). When the object O detected (or the distance 700) is within the warning range, the CPU 5 further determines whether the object O detected is a living creature or not (Step S402). The identification of a living creature can be implemented by comparing the object O with skeleton analysis diagrams stored in the storage unit G. When the object O detected is not a living creature, the CPU 5 controls the unmanned lawn mower 1000 to avoid the object O (Step S403). When the object O detected is a living creature, e.g., living creatures LC1, LC2 are respectively illustrated as a baby and a pet in
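The decision procedure of Steps S400 to S403 can be summarized as a small dispatch function. Note that the text is cut off before stating the action taken for a living creature; shutting down the cutting module is an assumption here, consistent with the safety behavior described for the blade shutdown module:

```python
def emergency_action(distance_m, warning_range_m, is_living_creature):
    """Sketch of the described emergency steps: keep weeding when nothing
    is within the warning range; steer around inanimate obstacles; shut
    down the cutting module for living creatures (assumed action, since
    the source text is truncated at that point)."""
    if distance_m > warning_range_m:
        return "continue"       # Step S400: keep weeding along the trajectory
    if not is_living_creature:
        return "avoid"          # Step S403: avoid the object
    return "shutdown"           # assumed safety response for living creatures
```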
[0090] Compared to the prior art, the unmanned lawn mower of the present invention is equipped with the camera module to capture the images of the surroundings of the mower body, allowing the boundary or the route within the area for weeding to be defined by the images captured by the camera module through image processing. This not only brings convenience of use to the unmanned lawn mower of the present invention, but also makes the unmanned lawn mower of the present invention more intelligent.
[0091] Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.