NAVIGATION SYSTEM FOR NAVIGATING AN AUTONOMOUS MOBILE ROBOT WITHIN A PRODUCTION ENVIRONMENT
20250321582 · 2025-10-16
Inventors
CPC classification
G05D2105/89 (Physics)
G05D1/644 (Physics)
G05D1/2446 (Physics)
G05D1/246 (Physics)
International classification
G05D1/244 (Physics)
G05D1/246 (Physics)
Abstract
A navigation system for navigating an autonomous mobile robot in an environment is provided. The navigation system includes at least one optical sensor attached to the autonomous mobile robot, a controller in communication with the at least one optical sensor, and a plurality of optical identifiers distributed within the environment at fixed locations and detectable by the at least one optical sensor. Each of the plurality of optical identifiers encodes a location within the environment. The controller is configured to obtain pictures of the environment via the at least one optical sensor, detect visible optical identifiers of the plurality of optical identifiers, which are within a field of view of the at least one optical sensor, decode the visible optical identifiers, and navigate the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.
Claims
1. A navigation system for navigating an autonomous mobile robot within an environment, the navigation system comprising: at least one optical sensor attached to the autonomous mobile robot; a controller in communication with the at least one optical sensor; and a plurality of optical identifiers distributed within the environment at fixed locations and detectable by the at least one optical sensor; wherein each of the plurality of optical identifiers encodes a location within the environment; wherein the controller is configured to: obtain pictures of the environment via the at least one optical sensor; detect visible optical identifiers of the plurality of optical identifiers, which are within a field of view of the at least one optical sensor; decode the visible optical identifiers; and navigate the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.
2. The navigation system of claim 1, wherein the controller is configured to estimate a distance to each of the visible optical identifiers and to navigate the autonomous mobile robot by applying a triangulation method using each pair of the visible optical identifiers.
3. The navigation system of claim 2, wherein the controller is configured to assign a weight to each of the visible optical identifiers based on the distance; and wherein optical identifiers closer to the autonomous mobile robot are assigned a higher weight for navigating the autonomous mobile robot.
4. The navigation system of claim 1, wherein the at least one optical sensor comprises at least one of a high-resolution camera and a near-distance low-resolution camera.
5. The navigation system of claim 1, wherein each of the plurality of optical identifiers is a printed or light projected optical identifier and comprises at least one of the following: a QR code; a barcode; a JAB code; an Aztec code; and a reference number.
6. The navigation system of claim 1, wherein the plurality of optical identifiers comprises a first subset of optical identifiers and a second subset of optical identifiers; wherein the first subset is associated with a first region of the environment; and wherein the second subset is associated with a second region of the environment.
7. The navigation system of claim 1, wherein the localizations using the decoded visible optical identifiers are determined by referencing a map of the environment stored in a data storage based on the visible optical identifiers.
8. The navigation system of claim 1, further comprising at least one LiDAR scanner arranged at the autonomous mobile robot and in communication with the controller; wherein the at least one LiDAR scanner is configured to scan a surrounding environment of the autonomous mobile robot; wherein the controller is configured to additionally localize the autonomous mobile robot within the environment based on the scan of the at least one LiDAR scanner; and wherein the controller is configured to compare the localization of the at least one LiDAR scanner with the localization of the at least one optical sensor and to obtain a corresponding variance.
9. The navigation system of claim 8, wherein the controller is configured to: when the variance is below a first threshold, navigate the autonomous mobile robot purely based on the plurality of optical identifiers; and when the variance is higher than a second threshold, stop the autonomous mobile robot.
10. The navigation system of claim 1, wherein the controller is configured to store a navigation history of the autonomous mobile robot; and wherein the navigation history is used as training data for an artificial intelligence module.
11. The navigation system of claim 10, wherein the artificial intelligence module is used to optimize paths of the autonomous mobile robot, to identify anomalies within the environment, or both.
12. The navigation system of claim 1, wherein each of the plurality of optical identifiers is arranged at one of the following: a wall within the environment; a supporting structure for a product to be processed by the autonomous mobile robot; the product to be processed by the autonomous mobile robot; a second autonomous mobile robot or another robot system in communication with the controller; a drone; a handheld device; or a human operator.
13. A handheld device for performing a work task on an object by a human operator, the handheld device comprising: at least one work tool; a camera; and a controller; wherein the controller is configured to: obtain pictures of an environment within which the handheld device is operated via the camera; detect visible optical identifiers of a plurality of optical identifiers that are arranged at fixed locations within the environment, wherein the visible optical identifiers are optical identifiers which are within a field of view of the camera; decode the visible optical identifiers; and correlate data pertaining to the work task with work positions at the object at which the work task has been performed based on real-time localizations of the handheld device within the environment using the decoded visible optical identifiers.
14. An autonomous mobile robot, comprising: at least one optical sensor; and a controller; wherein the controller is configured to: obtain pictures of an environment in which the autonomous mobile robot is located via the at least one optical sensor; detect visible optical identifiers, wherein the visible optical identifiers are located within a field of view of the at least one optical sensor, wherein the visible optical identifiers belong to a plurality of optical identifiers located within the environment at fixed locations, and wherein each of the plurality of optical identifiers encodes a location within the environment; decode the visible optical identifiers; and navigate the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.
15. A method for navigating an autonomous mobile robot according to claim 14 within an environment, the method comprising: obtaining, by a controller, pictures of the environment via at least one optical sensor attached to the autonomous mobile robot; detecting, by the controller, visible optical identifiers of a plurality of optical identifiers, wherein the visible optical identifiers are in a field of view of the at least one optical sensor, and wherein each of the plurality of optical identifiers encodes a fixed location within the environment; decoding, by the controller, the visible optical identifiers; and navigating, by the controller, the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0059] In the following, exemplary embodiments are described in more detail having regard to the attached figures. The illustrations are schematic and not to scale. Identical reference signs refer to identical or similar elements.
[0060]
[0061]
[0062]
[0063]
[0064]
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0065]
[0066] The end effector 102 can, for example, be an optical scanner 102 for performing optical inspection scans of a surface (such as a surface of an aircraft or spacecraft fuselage), in order to detect anomalies such as dents, rivet pull-ins, out-of-contour deformations, blend-outs, and scratches. The high-resolution camera 113 is part of this optical scanner 102 in the depicted configuration and may, for example, be a high-resolution matrix scanner. However, the robot body 104 may also directly carry one or more high-resolution cameras 113.
[0067] Although shown in simplified form, it should be appreciated that the robot arm 101, in general, may be a robot arm 101 having at least three degrees of freedom (rotations), such that the end effector 102 can be moved to any three-dimensional position and orientation.
[0068] A controller 120 is comprised as a resident controller 120 within the autonomous mobile robot 100 in the depicted configuration. The controller 120 comprises a data storage 121 and an optional artificial intelligence module (AI module) 122. The controller 120 may be used to control the general operation of the autonomous mobile robot 100 and may, in particular, be configured to implement the navigation system and method described with regard to the following figures. The optical sensors 110 and the LiDAR scanners 112 are in communication with the controller 120. Although shown as a resident controller 120, at least for the purposes of the navigation system 10 (
[0069]
[0070] The optical identifiers 2 each encode a location within the environment 1, at which the corresponding optical identifier 2 is arranged. For example, each optical identifier 2 may be in the form of a QR code that encodes a data content that contains the coordinates at which the corresponding QR code is arranged. The optical identifiers 2 may be printed, attached (e.g., printed on a sticker that is attached at the corresponding location), light projected, or otherwise arranged at the corresponding locations. Preferably, the optical identifiers 2 all have the same size.
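Purely as an illustration (the disclosure does not specify a concrete data content format, so the JSON layout and field names below are assumptions), decoding the payload of a scanned QR-code identifier into its fixed environment coordinates could look like this in Python:

```python
import json


def parse_identifier_payload(payload: str) -> tuple[float, float, float]:
    """Parse the decoded data content of an optical identifier into the
    fixed environment coordinates (x, y, z) at which it is arranged.
    The JSON layout used here is illustrative; any unambiguous encoding
    that fits into a QR code would work equally well."""
    data = json.loads(payload)
    return (float(data["x"]), float(data["y"]), float(data["z"]))


# Example payload as it might be printed into a QR code:
payload = '{"x": 12.5, "y": 3.0, "z": 1.8}'
```

The actual detection and decoding of the QR code in the camera picture would be done by an image-processing library; only the payload interpretation is sketched here.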
[0071]
[0072] Using the determined locations of the visible optical identifiers 2 and the determined distances to each of the visible optical identifiers (under consideration of the viewing directions, which are known because the arrangement of the optical sensors (cameras) 110 on the autonomous mobile robot 100 is known to the controller 120), the controller 120 determines a current localization of the autonomous mobile robot 100, for example using a triangulation method with each pair of the visible optical identifiers 2, as described herein further above.
[0073] Optionally, in the triangulation method a weight is assigned to each of the visible optical identifiers 2, which is used in the triangulation as well as in determining the final localization. In particular, the controller 120 may assign a higher weight to optical identifiers 2 closer to the autonomous mobile robot 100, because these optical identifiers yield a more accurate localization result. For example, the triangulation with each pair of the visible optical identifiers 2 yields a position of the autonomous mobile robot 100. Depending on the weights assigned to the optical identifiers, the controller 120 may, for example, determine the final localization as a weighted average of the individual localizations.
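A minimal 2-D sketch of this weighted pairwise scheme is given below. It intersects the two distance circles of each pair of identifiers, resolves the two-fold ambiguity with a prior position estimate, and fuses the pairwise results as a weighted average; the weight 1/(r_i + r_j) is one plausible choice for giving closer identifiers a higher weight, not the only one, and is an assumption of this sketch:

```python
import math


def trilaterate_pair(p1, r1, p2, r2, prior):
    """Intersect the two distance circles around landmarks p1 and p2
    (2-D); of the two geometric solutions, keep the one closest to the
    robot's prior position estimate."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    ux, uy = (y2 - y1) / d, -(x2 - x1) / d  # unit normal to the baseline
    candidates = [(mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)]
    return min(candidates, key=lambda c: math.dist(c, prior))


def weighted_localization(landmarks, distances, prior):
    """Fuse all pairwise trilaterations into one position estimate.
    Requires at least two landmarks; pairs of *closer* identifiers get
    a higher weight (here 1/(r_i + r_j), an illustrative choice)."""
    est_x = est_y = total_w = 0.0
    n = len(landmarks)
    for i in range(n):
        for j in range(i + 1, n):
            px, py = trilaterate_pair(landmarks[i], distances[i],
                                      landmarks[j], distances[j], prior)
            w = 1.0 / (distances[i] + distances[j])
            est_x += w * px
            est_y += w * py
            total_w += w
    return (est_x / total_w, est_y / total_w)
```

A practical implementation would additionally reject degenerate pairs (near-collinear geometry, coincident landmarks) and work in 3-D; this sketch only illustrates the weighting principle.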
[0074] The localizations are continuously determined in this way while the autonomous mobile robot 100 moves through the environment 1. Hence, the controller 120 continuously obtains pictures of the environment 1 using the optical sensors 110, detects visible optical identifiers 2 within the pictures, which are within a combined field of view of all the optical sensors 110, decodes the data content of the detected visible optical identifiers 2, determines real-time localizations of the autonomous mobile robot within the environment 1, and navigates the autonomous mobile robot 100 based on the real-time localizations.
[0075] Optionally, a redundancy check may be performed using the LiDAR scanners 112 or another navigation method. For example, in
[0076] In
[0077] Optionally, the autonomous mobile robot 100 may move faster if more of the optical identifiers 2 are currently visible, because in such a case the accuracy of the localization is increased.
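One conceivable speed policy implementing this behavior is sketched below; the linear scaling and all numeric parameters are made-up values for illustration, not taken from the disclosure:

```python
def allowed_speed(n_visible: int, v_min: float = 0.2,
                  v_max: float = 1.5, n_full: int = 6) -> float:
    """Illustrative speed policy: scale the permitted speed (m/s) with
    the number of currently visible optical identifiers, saturating at
    n_full identifiers.  Fewer visible identifiers mean a less accurate
    localization, so the robot moves more cautiously."""
    if n_visible < 2:  # fewer than two identifiers: no triangulation
        return 0.0
    frac = min(n_visible, n_full) / n_full
    return v_min + (v_max - v_min) * frac
```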
[0078] Further, the navigation system 10 may enable navigation between the regions 5, 6 by automatically loading a corresponding map of the new region once the autonomous mobile robot 100 enters it. For example, if the autonomous mobile robot 100 leaves the region 5 in
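A possible hand-over rule for deciding when to load the map of a new region could be sketched as follows, assuming each decoded identifier carries a region tag (an assumption of this sketch; the disclosure only states that subsets of identifiers are associated with regions):

```python
from collections import Counter


def active_region(decoded_ids: list, current: str) -> str:
    """Hypothetical map hand-over rule: switch to a new region's map as
    soon as a strict majority of the currently visible identifiers is
    tagged with that region; otherwise keep the current region."""
    if not decoded_ids:
        return current
    region, count = Counter(d["region"] for d in decoded_ids).most_common(1)[0]
    return region if count > len(decoded_ids) / 2 else current
```

The controller would then load the stored map associated with the returned region whenever it differs from the current one.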
[0079]
[0080]
[0081] Oftentimes, it is necessary to save the position, for example on the fuselage 9, at which a corresponding scan (or, in general, a work task) has been performed (for example, a position of a stringer or a frame (or both) of the fuselage 9 at which the scan has been taken). In order to avoid manually entering or marking the position on the object/fuselage 9, the handheld device 170 uses the same principle as the navigation system 10 described above to obtain its current location. Therefore, the corresponding discussion of how the real-time localizations are obtained by detecting visible optical identifiers 2 within the environment 1 will not be repeated. Any and all features described with regard to the real-time localizations of the autonomous mobile robot 100 using the navigation system 10 are fully valid for the handheld device 170.
[0082] However, the real-time localizations may be obtained not for navigation purposes but, for example, each time a scan (or some other work task) is performed. The localization determined at each work task (i.e., the position at which the handheld device is located within the environment 1) is used to determine a position on the object 9 (e.g., a position on a fuselage 9) at which the work task (here, the optical scan) has been performed. These positions may then be correlated with data pertaining to the work task (e.g., with optical scans taken at that position).
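The correlation of work-task data with the positions at which the tasks were performed could, for example, be kept in a simple log structure; the class and field names below are illustrative only and not part of the disclosure:

```python
from dataclasses import dataclass, field


@dataclass
class WorkRecord:
    """One performed work task: the localization of the handheld device
    when the task was performed, plus the data it produced."""
    position: tuple
    scan_data: str


@dataclass
class WorkLog:
    records: list = field(default_factory=list)

    def record_task(self, localization: tuple, scan_data: str) -> None:
        # Correlate the work-task data with the real-time localization
        # of the handheld device at the moment the task is performed.
        self.records.append(WorkRecord(localization, scan_data))

    def tasks_at(self, position: tuple, tol: float = 0.1) -> list:
        # Look up all work-task data recorded near a given position.
        return [r.scan_data for r in self.records
                if all(abs(a - b) <= tol
                       for a, b in zip(r.position, position))]
```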
[0083] The camera 171 of the handheld device 170 that is used for obtaining pictures of the environment 1 may either be an integrated camera 171 of the handheld device 170, or (as depicted) a camera 171 that can be attached to the handheld device 170, communicates with the controller 173, and can be used by the controller 173 for obtaining the pictures of the optical identifiers 2 within the environment 1. For example, as depicted, the camera 171 may be a camera 171 of a smartphone 174 that is attached to the handheld device 170 at a corresponding adapter 175. The smartphone 174 may then be connected to the controller 173 and used for obtaining the corresponding pictures.
[0084] The process of obtaining the localizations has been described with regard to the navigation system 10 (
[0085] Further, the handheld device 170 may also itself carry at least one optical identifier 2, which can be used by the navigation system 10 described herein, for example with regard to
[0086]
[0087] The method 200 starts in step 210 with obtaining pictures of the environment 1. The pictures may be obtained by the controller 120 via the optical sensors 110 of the autonomous mobile robot 100.
[0088] In step 220, the controller 120 detects visible optical identifiers 2 of a plurality of optical identifiers 2 (such as QR codes 2). The visible optical identifiers 2 are optical identifiers 2 which are currently within a combined field of view of the optical sensors 110, as described above with regard to the navigation system 10. Each of the optical identifiers 2 encodes a fixed location within the environment 1 at which it is arranged.
[0089] In step 230, the controller 120 decodes the visible optical identifiers 2 and determines a current localization of the autonomous mobile robot 100 within the environment 1 based on the decoded optical identifiers 2. Determining the localization may be done by a weighted triangulation method with each pair of visible optical identifiers 2, as described above.
[0090] Steps 210, 220, and 230 are performed continuously while the autonomous mobile robot 100 moves through the environment 1. In other words, the autonomous mobile robot 100 determines, in real-time, its localization within the environment 1 by monitoring the environment 1 for optical identifiers while it moves.
[0091] In step 240, the controller 120 navigates the autonomous mobile robot 100 based on the real-time localizations. Step 240 may be running concurrently with the continuous localization of the autonomous mobile robot 100 within the environment.
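Taken together, steps 210 through 240 form a continuous loop that could be sketched as follows, with each stage injected as a callable placeholder (the concrete sensor, detector, decoder, localizer, and drive implementations are outside this sketch and are assumptions):

```python
def navigation_step(capture, detect, decode, localize, drive, prior):
    """One iteration of the continuous loop of method 200 (steps
    210-240).  Each stage is passed in as a callable so that the sketch
    stays independent of any concrete hardware or algorithm."""
    pictures = capture()                    # step 210: obtain pictures
    visible = detect(pictures)              # step 220: detect visible identifiers
    decoded = [decode(v) for v in visible]  # step 230: decode identifiers
    pose = localize(decoded, prior)         # step 230: determine localization
    drive(pose)                             # step 240: navigate on the result
    return pose                             # becomes the prior of the next step
```

In operation the controller would call `navigation_step` repeatedly while the robot moves, feeding each returned pose back in as the prior for the next iteration.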
[0092] The systems and devices described herein may include a controller, such as controller 120, control unit, control device, controlling means, system control, processor, computing unit or a computing device comprising a processing unit and a memory which has stored therein computer-executable instructions for implementing the processes described herein. The processing unit may comprise any suitable devices configured to cause a series of steps to be performed so as to implement the method such that instructions, when executed by the computing device or other programmable apparatus, may cause the functions/acts/steps specified in the methods described herein to be executed. The processing unit may comprise, for example, any type of general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, a central processing unit (CPU), an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, other suitably programmed or programmable logic circuits, or any combination thereof.
[0093] The memory may be any suitable known or other machine-readable storage medium. The memory may comprise non-transitory computer readable storage medium such as, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. The memory may include a suitable combination of any type of computer memory that is located either internally or externally to the device such as, for example, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like. The memory may comprise any storage means (e.g., devices) suitable for retrievably storing the computer-executable instructions executable by processing unit.
[0094] The methods and systems described herein may be implemented in a high-level procedural or object-oriented programming or scripting language, or a combination thereof, to communicate with or assist in the operation of the controller or computing device. Alternatively, the methods and systems described herein may be implemented in assembly or machine language. The language may be a compiled or interpreted language. Program code for implementing the methods and systems described herein may be stored on the storage media or the device, for example a ROM, a magnetic disk, an optical disc, a flash drive, or any other suitable storage media or device. The program code may be readable by a general or special-purpose programmable computer for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.
[0095] Computer-executable instructions may be in many forms, including program modules, such as AI module 122, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
[0096] It should be noted that "comprising" or "including" does not exclude other elements or steps, and that "one" or "a" does not exclude a plurality. It should further be noted that features or steps that have been described with reference to any of the above embodiments may also be used in combination with other features or steps of other embodiments described above.
[0097] While at least one exemplary embodiment of the present invention(s) is disclosed herein, it should be understood that modifications, substitutions and alternatives may be apparent to one of ordinary skill in the art and can be made without departing from the scope of this disclosure. This disclosure is intended to cover any adaptations or variations of the exemplary embodiment(s). In addition, in this disclosure, the term "or" means "either or both." Furthermore, characteristics or steps which have been described may also be used in combination with other characteristics or steps and in any order unless the disclosure or context suggests otherwise. This disclosure hereby incorporates by reference the complete disclosure of any patent or application from which it claims benefit or priority.
LIST OF REFERENCE SIGNS
[0098] 1 (production) environment [0099] 2 optical identifier [0100] 3 first subset of optical identifiers [0101] 4 second subset of optical identifiers [0102] 5 first region of the environment (assembly hangar; base level) [0103] 6 second region of the environment (assembly hangar; top level) [0104] 7 supporting structure [0105] 8 door [0106] 9 fuselage [0107] 10 navigation system [0108] 11 outdoor path [0109] 12 vertical support beams [0110] 13 first field of view [0111] 14 second field of view [0112] 15 third field of view [0113] 100 autonomous mobile robot [0114] 101 robot arm/manipulator [0115] 102 end effector, optical scanner [0116] 103 tray [0117] 104 robot body [0118] 110 optical sensor [0119] 111 near-distance low-resolution camera [0120] 112 LiDAR scanner [0121] 113 high-resolution camera [0122] 120 controller [0123] 121 data storage [0124] 122 artificial intelligence module (AI module) [0125] 130 wheels [0126] 160 drone [0127] 161 optical scanner of drone [0128] 162 camera of drone [0129] 170 handheld device [0130] 171 camera (of handheld device) [0131] 172 work tool, optical scanner (of handheld device) [0132] 173 controller [0133] 174 smartphone [0134] 175 adapter [0135] 176 handle [0136] 180 human operator [0137] 200 method [0138] 210 obtaining pictures [0139] 220 detecting optical identifiers [0140] 230 decoding optical identifiers [0141] 240 navigating based on optical identifiers