ROBOT SURVEILLANCE SYSTEM
20180181141 · 2018-06-28
Inventors
CPC classification
H04N23/55
ELECTRICITY
H04W4/80
ELECTRICITY
B25J9/1676
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/40548
PHYSICS
G05D1/0214
PHYSICS
International classification
Disclosed is a robot surveillance system for preventing a robot from falling down or collision by means of dynamic detection of moving object invasion. The system includes a light-emitting unit, an image capture unit, an image processing unit, a turning control unit, and a wireless transceiver unit. The image capture unit combines with an optic pattern to provide a visual module for the moving or surveillance robot. The shape of the optic pattern changes as the robot moves toward the object or pothole, and the image processing unit recognizes the detected object to be standstill, and thus controls the moving direction of the robot to effectively avoid falling or collision. Any moving object in the space is detected by the object motion detection, thereby effectively achieving the goal of spatial surveillance and security.
Claims
1. A robot surveillance system, comprising: at least one light-emitting unit provided on an outer part of a robot, each light-emitting unit emitting a light beam comprising an optic pattern incident on and shown by at least one standstill object or at least one moving object in a space, the optic pattern changing dependent on the standstill object or the moving object; an image capture unit provided on the outer part of the robot, comprising an optic lens and a photo-sensing device connected together, the optic lens capturing an image of the standstill or moving object to cause the photo-sensing device to form a captured image of the optic pattern from the standstill or moving object; an image processing unit electrically connected to the image capture unit for receiving the captured image of the image capture unit, the image processing unit comprising a detection module, a recognition module, and a motion detection module, the detection module detecting the optic pattern in the captured image from the image capture unit, calculating a distance away from the standstill object and generating and transferring a first signal when the optic pattern changes due to the standstill object, the recognition module examining and comparing information of the image from the image capture unit, and generating and transferring a second signal when a comparison result of the image is not consistent, the motion detection module detecting the moving object in the image to generate and transfer a third signal; a turning control unit electrically connected to the detection module for receiving the first signal of the detection module to control the robot to turn to avoid the standstill object; and a wireless transceiver unit electrically connected to the recognition module and the motion detection module for receiving the image, the second signal, and the third signal from the recognition module and the motion detection module to wirelessly transfer them to a user at a remote surveillance device through a wireless communication protocol.
2. The robot surveillance system as claimed in claim 1, wherein the at least one light-emitting unit is one of a laser, a light-emitting diode, and a luminary for emitting the light beam.
3. The robot surveillance system as claimed in claim 1, wherein the optic pattern is one of a grid, a straight line, and dots regularly arranged.
4. The robot surveillance system as claimed in claim 1, wherein the standstill object is at least one of an object placed on the ground, a recess of the ground, and a hump of the ground.
5. The robot surveillance system as claimed in claim 1, wherein the optic lens is one of a fish-eye lens, a wide-angle lens, and a standard lens.
6. The robot surveillance system as claimed in claim 1, wherein the photo-sensing device is one of a charge coupled device (CCD) and a complementary metal oxide semiconductor (CMOS).
7. The robot surveillance system as claimed in claim 1, wherein the recognition module reads the information of the image through means of face recognition, bar code recognition, or pattern recognition, and compares it with a plurality of built-in comparison data in the recognition module.
8. The robot surveillance system as claimed in claim 7, wherein the recognition module further comprises a store module for storing the plurality of built-in comparison data.
9. The robot surveillance system as claimed in claim 1, wherein the motion detection module employs means of object motion detection, optical flow detection, or object outline detection to detect the moving object.
10. The robot surveillance system as claimed in claim 1, wherein the communication protocol is at least one of Bluetooth communication protocol, infrared communication protocol, near-field communication (NFC), wireless local area networks (WLAN), WiGig, Zigbee, Wireless USB, ultra-wide band (UWB), and WiFi for providing the wireless transceiver unit with a function of communication with the remote surveillance device.
11. The robot surveillance system as claimed in claim 10, wherein the remote surveillance device is one of a mobile communication device, a remote device, and a computer device.
12. The robot surveillance system as claimed in claim 1, wherein the light-emitting unit is a flickering light source, having a flickering frequency the same as an image capture frequency of the image capture unit.
13. The robot surveillance system as claimed in claim 12, wherein the detection module of the image processing unit examines and compares differences between two successive images captured by the image capture unit to identify change of the optic pattern.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The present invention will be apparent to those skilled in the art by reading the following detailed description of a preferred embodiment thereof, with reference to the attached drawings, in which:
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
[0023] The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
[0024] Please refer to
[0025] The at least one light-emitting unit 1 is provided on the outer part of the robot 2, and each light-emitting unit 1 emits a light beam 11 comprising an optic pattern 111, which is incident on and shown by at least one standstill object or at least one moving object in the space. The optic pattern will change dependent on the standstill object or the moving object. In addition, the at least one light-emitting unit 1 is one of a laser, a light-emitting diode, and a luminary for emitting the light beam. Also, the optic pattern 111 is one of a grid, a straight line, and dots regularly arranged, and the standstill object is at least one of an object placed on the ground, a recess of the ground, and a hump of the ground. In one embodiment of the present invention, the robot 2 is implemented by a cleaning robot 21, which has an outer part provided with two adjacent lasers as the light-emitting units 1. Specifically, the two light beams 11 emitted by the two lasers comprise optic patterns in the form of straight lines. Further, one of the two light beams 11 travels horizontally forward such that an object standing on the ground or a hump of the ground as the standstill object is detected by use of the straight-line pattern. The other light beam 11 of the light-emitting unit 1 travels obliquely toward the ground such that a recess of the ground as the standstill object is detected by use of the straight-line pattern. However, it should be noted that the above standstill objects and the optic patterns 111 are only exemplary for clear explanation, and are not intended to limit the scope of the present invention. For those skilled in this technical field, all types of the optic pattern 111 performing the similar feature of the present invention are thus included in the scope of the present invention.
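The line-shift detection described above amounts to classic structured-light triangulation: the farther the projected laser line shifts in the captured image relative to its flat-ground position, the closer the obstacle. A minimal sketch of that distance calculation follows; the baseline, focal length, and function name are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

# Hypothetical rig geometry (assumed, not from the disclosure):
BASELINE_M = 0.05   # offset between laser emitter and camera lens, in meters
FOCAL_PX = 600.0    # camera focal length expressed in pixels

def distance_from_line_shift(row_flat, row_obstacle):
    """Estimate distance to an obstacle from how far (in pixel rows) the
    projected laser line shifts between its flat-ground position and the
    position where it appears when striking the obstacle."""
    shift_px = abs(row_obstacle - row_flat)
    if shift_px == 0:
        return float("inf")  # no shift: nothing ahead within range
    # Structured-light triangulation: Z = focal * baseline / disparity
    return FOCAL_PX * BASELINE_M / shift_px
```

Under these assumed parameters, a 60-pixel shift of the line would place the obstacle about 0.5 m ahead of the robot.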
[0026] The image capture unit 3 is provided on the outer part of the robot 2, and comprises an optic lens 31 and a photo-sensing device 32 connected to the optic lens 31. The optic lens 31 is configured to capture an image of the standstill or moving object to cause the photo-sensing device 32 to form a captured image based on the optical pattern from the standstill or moving object. Further, the optic lens 31 is one of a fish-eye lens, a wide-angle lens, and a standard lens, and the photo-sensing device 32 is one of a charge coupled device (CCD) and a complementary metal oxide semiconductor (CMOS). In the embodiment of the present invention, the optic lens 31 and the photo-sensing device 32 are implemented by the fish-eye lens and the CCD, respectively. The image capture unit 3 is provided on the outer part of the robot 2 with respect to the light-emitting unit 1, and the image of the optic pattern 111 is formed on the photo-sensing device 32.
[0027] The image processing unit 4 is electrically connected to the image capture unit 3 for receiving the image captured by the image capture unit 3, and comprises a detection module 41, a recognition module 42, and a motion detection module 43. The detection module 41 detects the optic pattern 111 in the image. When the optic pattern 111 changes due to the standstill object, a distance away from the standstill object is calculated and a first signal S1 is generated and transferred by the detection module 41. The recognition module 42 examines and compares information of the image from the image capture unit 3. When a comparison result of the image is not consistent, the recognition module 42 generates and transfers a second signal S2. The motion detection module 43 detects the moving object in the image to generate and transfer a third signal S3. Moreover, the recognition module 42 reads the information of the image through means of face recognition, bar code recognition, or pattern recognition, and compares it with a plurality of built-in comparison data in the recognition module 42. The recognition module 42 further comprises a store module (not shown) for storing the plurality of built-in comparison data. The motion detection module 43 employs means of object motion detection, optical flow detection, or object outline detection to detect the moving object. In the embodiment of the present invention, the detection module 41 of the image processing unit 4 is configured for receiving the image captured by the image capture unit 3, and determines if the optic pattern 111 is incident on the standstill object and thus changes. If the optic pattern 111 changes, the detection module 41 issues the first signal S1 and calculates the distance away from the standstill object. For example, the recognition module 42 is suitably applicable to the patrol robot of the second embodiment. When the patrol robot is configured to patrol a residence building or a large-scale factory, means of bar code identification is employed to compare the built-in comparison data with a bar code carried by the person, such as an employee ID. When the comparison result is not consistent, the recognition module 42 transfers the person's image and the second signal S2, and at the same time, the motion detection module 43 utilizes means of optical flow detection to implement the goal of security and surveillance by detecting the moving object invading the secured region in the space, and then transfers the image and the third signal S3.
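Of the detection means named above, the simplest, object motion detection by frame differencing, can be sketched as follows. The thresholds and function name are illustrative assumptions; a production system might instead use the optical-flow method the embodiment describes.

```python
import numpy as np

# Tuning constants (assumed, not from the disclosure):
MOTION_THRESHOLD = 25    # per-pixel intensity change counted as "changed"
MIN_CHANGED_PIXELS = 50  # how many changed pixels count as real motion

def detect_motion(prev_frame, curr_frame):
    """Return True if enough pixels changed between two successive
    grayscale frames to suggest a moving object in the scene."""
    # Widen to int16 so the subtraction of uint8 frames cannot wrap around
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = int(np.count_nonzero(diff > MOTION_THRESHOLD))
    return changed >= MIN_CHANGED_PIXELS
```

When this sketch returns True, the surrounding system would emit the third signal S3 together with the current image.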
[0028] The turning control unit 5 is electrically connected to the detection module 41 for receiving the first signal S1 of the detection module 41 to control the robot 2 to correctly turn to avoid the detected standstill object. In the embodiment of the present invention, when the detection module 41 receives the captured image from the image capture unit 3 and confirms that the optic pattern 111 incident on the standstill object changes, the detection module 41 issues the first signal S1 to the turning control unit 5, thereby controlling the robot 2 to appropriately turn and preventing risk of colliding with the standstill object.
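The interaction between the detection module's first signal S1 and the turning control unit reduces to a small decision rule. The safe-distance threshold and command names below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical stop-and-turn threshold in meters (assumed):
SAFE_DISTANCE_M = 0.3

def steering_command(first_signal, distance_m):
    """Map the detection module's output (S1 flag plus the computed
    distance to the standstill object) to a simple motion command."""
    if first_signal and distance_m < SAFE_DISTANCE_M:
        return "turn"     # obstacle confirmed and close: change heading
    return "forward"      # path clear, or obstacle still far away
```

A real turning control unit would of course drive motors rather than return strings, but the decision structure is the same.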
[0029] The wireless transceiver unit 6 is electrically connected to the recognition module 42 and the motion detection module 43. Specifically, the wireless transceiver unit 6 receives the image, the second signal S2, and the third signal S3 from the recognition module 42 and the motion detection module 43 to wirelessly transfer them to a user at a remote surveillance device 7 through a wireless communication protocol. Additionally, the communication protocol is at least one of the Bluetooth communication protocol, infrared communication protocol, near-field communication (NFC), wireless local area networks (WLAN), WiGig, Zigbee, Wireless USB, ultra-wide band (UWB), and WiFi for providing the wireless transceiver unit 6 with a function of communication with the remote surveillance device 7. Moreover, the remote surveillance device 7 is one of a mobile communication device, a remote device, and a computer device. In the embodiment of the present invention, when the recognition module 42 confirms that the comparison result is not consistent, the image of the undesired person and the second signal S2 are transferred, and the wireless transceiver unit 6 employs the wireless communication protocol to further transfer the image and the second signal S2 to the mobile communication device used by the user as the remote surveillance device 7 so as to warn or inform related persons of the situation that the residence building or the factory is invaded by some undesired person. Further, when the motion detection module 43 uses means of optical flow detection to detect the moving object in the space, the image and the third signal S3 are transferred to the wireless transceiver unit 6. Then, the wireless transceiver unit 6 further transfers the image of the moving object to the remote surveillance device 7 of the user, thereby informing the related persons of the invasion event caused by the moving object and, at the same time, storing the image of the moving object.
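Whichever wireless protocol carries the link, the transceiver must frame each alert so the remote surveillance device can distinguish the second signal S2 from the third signal S3 and recover the attached image. One hypothetical byte-level framing is sketched below; the signal encodings and header layout are assumptions for illustration, not part of the disclosure.

```python
import struct

# Assumed numeric encodings for the two alert signals:
SIGNAL_S2 = 2  # recognition mismatch (undesired person)
SIGNAL_S3 = 3  # moving-object detection

def pack_alert(signal_id, image_bytes):
    """Frame an alert as: 1-byte signal id, 4-byte big-endian payload
    length, then the raw image payload."""
    return struct.pack(">BI", signal_id, len(image_bytes)) + image_bytes

def unpack_alert(frame):
    """Inverse of pack_alert: recover (signal_id, image_bytes)."""
    signal_id, length = struct.unpack(">BI", frame[:5])
    return signal_id, frame[5:5 + length]
```

The remote device would dispatch on the recovered signal id, for example raising an intruder warning for S2 and archiving the image for S3.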
[0030] Furthermore, the light-emitting unit 1 is a flickering light source having a flickering frequency, which is the same as the image capture frequency of the image capture unit 3. Also, the detection module 41 of the image processing unit 4 examines and compares the differences between two successive images captured by the image capture unit 3 to identify the change of the optic pattern 111. In the third embodiment of the present invention, when the light emitted by the light-emitting unit 1 in the flicker mode is kept bright, the image capture unit 3 fetches one image of the standstill object and the optic pattern 111 on the standstill object, and when the emitted light turns dark at the next moment, the current image of the standstill object without the optic pattern 111 is fetched. The detection module 41 performs subtraction of the above two successive images to extract only the part of the optic pattern 111 in the image for more precisely detecting and identifying the change of the optic pattern 111 and more strictly controlling the robot 2 to turn.
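The flicker-synchronized subtraction in this paragraph, removing the laser-off frame from the laser-on frame so that only the optic pattern 111 survives, can be sketched as follows; the threshold value and function name are illustrative assumptions.

```python
import numpy as np

def isolate_pattern(bright_frame, dark_frame, threshold=30):
    """Subtract the laser-off (dark) frame from the laser-on (bright)
    frame: static scene content cancels out, leaving only the projected
    optic pattern plus noise, which is then thresholded to a 0/1 mask."""
    # int16 keeps negative differences from wrapping around in uint8
    diff = bright_frame.astype(np.int16) - dark_frame.astype(np.int16)
    return (diff > threshold).astype(np.uint8)
```

The detection module could then search this binary mask for bends or breaks in the line pattern, rather than having to locate the pattern in a cluttered scene.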
[0031] Next, some actual applications of the robot surveillance system are described below to further help understand the key features provided by the present invention, without limiting the scope of the present invention. When the user intends to clean an area, the cleaning robot 21 provided with the robot surveillance system of the present invention is helpful because the cleaning robot 21 is prevented from colliding with a standstill object or falling down. Further, the cleaning robot 21 can surely monitor the moving object in the space by means of motion detection to effectively perform security and surveillance of the desired region, thereby providing the advantages of greatly decreasing the cost of hardware and manpower, improving the precision of detection and surveillance, and implementing instant warning. First, the at least one light-emitting unit 1 is prepared and provided on the outer part of the robot 2. Each light-emitting unit 1, implemented by a laser, emits a laser beam 11 comprising the specific optic pattern 111, which is incident on and shown by at least one standstill object or at least one moving object in the space. As a result, the optic pattern 111 changes due to the standstill object or the moving object. Then, the image capture unit 3 is prepared and provided on the outer part of the robot 2. The image capture unit 3 comprises the optic lens 31 and the photo-sensing device 32 connected together. The optic lens 31 is intended to capture the image of the standstill or moving object to cause the photo-sensing device 32 to form the captured image of the optic pattern 111 on the standstill or moving object. Next, the image processing unit 4 electrically connected to the image capture unit 3 is prepared. The detection module 41 of the image processing unit 4 detects and identifies the optic pattern 111 transferred from the image capture unit 3. When the optic pattern 111 changes due to the standstill object, the distance away from the standstill object is calculated and the first signal S1 is thus generated and transferred to the turning control unit 5 so as to correctly control the robot 2 to turn and avoid the standstill object. Finally, the motion detection module 43 built into the image processing unit 4 is prepared. When the moving object in the space is detected by the motion detection module 43 through means of optical flow detection, the image and the third signal S3 are transferred to the wireless transceiver unit 6. Then, the wireless transceiver unit 6 transfers the received image to the remote surveillance device 7 employed by the user, such as the mobile communication device, so as to inform the related persons that an invasion has been caused by the moving object and to instantly store the image. In this way, specific hardware is designed in the system of the present invention to incorporate the image capture unit 3 with the optic pattern 111 emitted by the light-emitting unit 1, and is intended to provide the cleaning robot 21 with the visual module. In other words, when the optic pattern 111 in the image captured by the image capture unit 3 changes because the cleaning robot 21 is moving and about to collide with the standstill object, the image processing unit 4 recognizes and confirms that the standstill object is present, and controls the cleaning robot 21 to turn so as to instantly avoid collision or falling down. Therefore, the advantage of greatly decreasing the cost of hardware is indeed implemented.
[0032] From the above description, it is obvious that the robot surveillance system of the present invention indeed implements the desired effects through the above embodiments, and was not disclosed before the application date, thereby meeting all the requirements of the patent law.
[0033] Although the present invention has been described with reference to the preferred embodiments thereof, it is apparent to those skilled in the art that a variety of modifications and changes may be made without departing from the scope of the present invention which is intended to be defined by the appended claims.