NAVIGATION SYSTEM FOR NAVIGATING AN AUTONOMOUS MOBILE ROBOT WITHIN A PRODUCTION ENVIRONMENT

20250321582 · 2025-10-16

    Abstract

    A navigation system for navigating an autonomous mobile robot in an environment is provided. The navigation system includes at least one optical sensor attached to the autonomous mobile robot, a controller in communication with the at least one optical sensor, and a plurality of optical identifiers distributed within the environment at fixed locations and detectable by the at least one optical sensor. Each of the plurality of optical identifiers encodes a location within the environment. The controller is configured to obtain pictures of the environment via the at least one optical sensor, detect visible optical identifiers of the plurality of optical identifiers, which are within a field of view of the at least one optical sensor, decode the visible optical identifiers, and navigate the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.

    Claims

    1. A navigation system for navigating an autonomous mobile robot within an environment, the navigation system comprising: at least one optical sensor attached to the autonomous mobile robot; a controller in communication with the at least one optical sensor; and a plurality of optical identifiers distributed within the environment at fixed locations and detectable by the at least one optical sensor; wherein each of the plurality of optical identifiers encodes a location within the environment; wherein the controller is configured to: obtain pictures of the environment via the at least one optical sensor; detect visible optical identifiers of the plurality of optical identifiers, which are within a field of view of the at least one optical sensor; decode the visible optical identifiers; and navigate the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.

    2. The navigation system of claim 1, wherein the controller is configured to estimate a distance to each of the visible optical identifiers and to navigate the autonomous mobile robot by applying a triangulation method using each pair of the visible optical identifiers.

    3. The navigation system of claim 2, wherein the controller is configured to assign a weight to each of the visible optical identifiers based on the distance; and wherein optical identifiers closer to the autonomous mobile robot are assigned a higher weight for navigating the autonomous mobile robot.

    4. The navigation system of claim 1, wherein the at least one optical sensor comprises at least one of a high-resolution camera and a near-distance low-resolution camera.

    5. The navigation system of claim 1, wherein each of the plurality of optical identifiers is a printed or light projected optical identifier and comprises at least one of the following: a QR code; a barcode; a JAB code; an Aztec code; and a reference number.

    6. The navigation system of claim 1, wherein the plurality of optical identifiers comprises a first subset of optical identifiers and a second subset of optical identifiers; wherein the first subset is associated with a first region of the environment; and wherein the second subset is associated with a second region of the environment.

    7. The navigation system of claim 1, wherein the localizations using the decoded visible optical identifiers are determined by referencing a map of the environment stored in a data storage based on the visible optical identifiers.

    8. The navigation system of claim 1, further comprising at least one LiDAR scanner arranged at the autonomous mobile robot and in communication with the controller; wherein the at least one LiDAR scanner is configured to scan a surrounding environment of the autonomous mobile robot; wherein the controller is configured to additionally localize the autonomous mobile robot within the environment based on the scan of the at least one LiDAR scanner; and wherein the controller is configured to compare the localization of the at least one LiDAR scanner with the localization of the at least one optical sensor and to obtain a corresponding variance.

    9. The navigation system of claim 8, wherein the controller is configured to: when the variance is below a first threshold, navigate the autonomous mobile robot purely based on the plurality of optical identifiers; and when the variance is higher than a second threshold, stop the autonomous mobile robot.

    10. The navigation system of claim 1, wherein the controller is configured to store a navigation history of the autonomous mobile robot; and wherein the navigation history is used as training data for an artificial intelligence module.

    11. The navigation system of claim 10, wherein the artificial intelligence module is used to optimize paths of the autonomous mobile robot, to identify anomalies within the environment, or both.

    12. The navigation system of claim 1, wherein each of the plurality of optical identifiers is arranged at one of the following: a wall within the environment; a supporting structure for a product to be processed by the autonomous mobile robot; the product to be processed by the autonomous mobile robot; a second autonomous mobile robot or another robot system in communication with the controller; a drone; a handheld device; or a human operator.

    13. A handheld device for performing a work task on an object by a human operator, the handheld device comprising: at least one work tool; a camera; and a controller; wherein the controller is configured to: obtain pictures of an environment within which the handheld device is operated via the camera; detect visible optical identifiers of a plurality of optical identifiers that are arranged at fixed locations within the environment, wherein the visible optical identifiers are optical identifiers which are within a field of view of the camera; decode the visible optical identifiers; and correlate data pertaining to the work task with work positions at the object at which the work task has been performed based on real-time localizations of the handheld device within the environment using the decoded visible optical identifiers.

    14. An autonomous mobile robot, comprising: at least one optical sensor; and a controller; wherein the controller is configured to: obtain pictures of an environment in which the autonomous mobile robot is located via the at least one optical sensor; detect visible optical identifiers, wherein the visible optical identifiers are located within a field of view of the at least one optical sensor, wherein the visible optical identifiers belong to a plurality of optical identifiers located within the environment at fixed locations, and wherein each of the plurality of optical identifiers encodes a location within the environment; decode the visible optical identifiers; and navigate the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.

    15. A method for navigating an autonomous mobile robot according to claim 14 within an environment, the method comprising: obtaining, by a controller, pictures of the environment via at least one optical sensor attached to the autonomous mobile robot; detecting, by the controller, visible optical identifiers of a plurality of optical identifiers, wherein the visible optical identifiers are in a field of view of the at least one optical sensor, and wherein each of the plurality of optical identifiers encodes a fixed location within the environment; decoding, by the controller, the visible optical identifiers; and navigating, by the controller, the autonomous mobile robot based on real-time localizations of the autonomous mobile robot within the environment using the decoded visible optical identifiers.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0059] In the following, exemplary embodiments are described in more detail having regard to the attached figures. The illustrations are schematic and not to scale. Identical reference signs refer to identical or similar elements.

    [0060] FIG. 1 is a schematic view of an autonomous mobile robot that utilizes/implements the navigation system/method of the present disclosure.

    [0061] FIG. 2 is a schematic top view of an environment, in which a navigation system/method according to the present disclosure is used in an exemplary scenario.

    [0062] FIG. 3 is a schematic cut view of an environment, in which a navigation system/method according to the present disclosure is used in a further exemplary scenario.

    [0063] FIG. 4 is a schematic view of a handheld device using the localization method of the navigation system described herein for correlating data pertaining to a work task with corresponding work positions.

    [0064] FIG. 5 is a flow chart of a method for navigating an autonomous mobile robot within an environment, which can, for example, be implemented with the navigation system of FIGS. 2 and 3.

    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

    [0065] FIG. 1 shows an autonomous mobile robot 100. The autonomous mobile robot 100 comprises a robot body 104, a robot arm 101 (sometimes also called a manipulator) and an end effector 102 attached to the robot arm 101. Further, a tray 103 (for example, for holding different end effectors 102 or as a landing platform for an associated drone 160 (FIGS. 2, 3)) is attached to the robot body 104. The autonomous mobile robot 100 further comprises a drive unit (only the wheels 130 are shown) to drive the autonomous mobile robot 100 over a ground surface. The autonomous mobile robot 100 also comprises a plurality of optical sensors 110 in the form of near-distance low-resolution cameras 111 as well as a plurality of LiDAR scanners 112. The optical sensors 110 and the LiDAR scanners 112 are arranged around the robot body 104 such that they can view the surroundings of the robot. It should be appreciated that the cameras 111 can also be any other kind of camera, such as high-resolution cameras or variable-focus cameras (e.g., utilizing fluid lenses). The autonomous mobile robot 100 also comprises an additional optical sensor 110 in the form of a high-resolution camera 113, which is part of the end effector 102. It should be appreciated that more than one high-resolution camera 113 may also be provided and that some of the low-resolution cameras 111 may be replaced by high-resolution cameras 113.

    [0066] The end effector 102 can, for example, be an optical scanner 102 for performing optical inspection scans of a surface (such as a surface of an aircraft or spacecraft fuselage), in order to detect anomalies such as dents, rivet pull-ins, out-of-contour deformations, blend-outs, and scratches. The high-resolution camera 113 is part of this optical scanner 102 in the depicted configuration and may, for example, be a high-resolution matrix scanner. However, the robot body 104 may also directly carry one or more high-resolution cameras 113.

    [0067] Although shown in simplified form, it should be appreciated that the robot arm 101, in general, may be a robot arm 101 having at least three degrees of freedom (rotations), such that the end effector 102 can be moved to any three-dimensional position and orientation.

    [0068] A controller 120 is comprised as a resident controller 120 within the autonomous mobile robot 100 in the depicted configuration. The controller 120 comprises a data storage 121 and an optional artificial intelligence module (AI module) 122. The controller 120 may be used to control the general operation of the autonomous mobile robot 100 and may, in particular, be configured to implement the navigation system and method described with regard to the following figures. The optical sensors 110 and the LiDAR scanners 112 are in communication with the controller 120. Although shown as a resident controller 120, at least for the purposes of the navigation system 10 (FIG. 2) and method described herein, the controller 120 may also be a remote controller that is arranged outside of the autonomous mobile robot 100 but is in communication with the robot, in particular with the optical sensors 110. FIG. 1 also shows an exemplary QR code 2 as one of the optical identifiers 2 that are part of the navigation system 10 described with regard to FIG. 2. This exemplary QR code 2 is attached to the robot body 104 of the autonomous mobile robot 100 and may be used for providing the current location of the autonomous mobile robot 100 (for example, determined by the navigation system 10 of FIG. 2 and communicated via a data connection) to other devices, such as other autonomous mobile robots 100 or the handheld device 170 described with regard to FIG. 4. These other devices may then also use the location of the autonomous mobile robot 100 when implementing the navigation system 10 of FIG. 2.

    [0069] FIG. 2 illustrates the navigation system 10 implemented by the autonomous mobile robot 100, by way of example within an environment 1 consisting of two regions 5, 6 in the form of assembly hangars (highly schematic), which are connected by an outdoor path 11. The assembly hangars 5, 6 are shown in a schematic top view. The navigation system 10 will, in particular, be described with regard to the first region 5 (upper part of FIG. 2). Within each of the regions 5, 6, two aircraft fuselages 9, or rather body sections of such fuselages 9, are shown. Each of the fuselages 9 is supported by a supporting structure 7. Within each of the regions 5, 6, four vertical support beams 12 are present, which, for example, may support an upper platform at the corresponding assembly station, as shown in FIG. 3. In each of the two regions 5, 6, two autonomous mobile robots 100 are currently present, each of which may be configured as described with regard to FIG. 1. As depicted, the autonomous mobile robots 100 are inspection robots for performing surface inspections of the fuselages 9. However, the autonomous mobile robots may also be any other kind of autonomous mobile robot 100 (for example, autonomous riveting robots). The autonomous mobile robots 100 may automatically navigate within the environment 1. As part of the navigation system 10, which is implemented mostly by the autonomous mobile robots 100, a plurality of optical identifiers 2 (for the sake of clarity of the illustration, not all of which are indicated by reference signs), e.g., in the form of QR codes or any other form described further above, are present within each of the regions 5, 6 of the environment 1. The optical identifiers 2 are depicted in a highly schematic representation as small squares. It should be appreciated that the optical identifiers 2 are arranged such that, in general, they can be seen by the optical sensors (cameras) 110 of the autonomous mobile robots 100 if the robots are positioned accordingly. The optical identifiers 2 may be arranged at different heights or all at the same height. In the depicted example, four optical identifiers 2 are arranged at each of the vertical support beams 12. Further, one optical identifier 2 is arranged at each vertical beam of the supporting structures 7 supporting the fuselages 9, and a plurality of further optical identifiers 2 are arranged on the walls of the assembly hangars/regions 5, 6. The assembly hangars 5, 6 can be entered via corresponding doors 8. At each of the doors 8, optical identifiers 2 are also arranged on the outside to facilitate outdoor-to-indoor transfer of the autonomous mobile robots 100.

    [0070] The optical identifiers 2 each encode a location within the environment 1, at which the corresponding optical identifier 2 is arranged. For example, each optical identifier 2 may be in the form of a QR code that encodes a data content that contains the coordinates at which the corresponding QR code is arranged. The optical identifiers 2 may be printed, attached (e.g., printed on a sticker that is attached at the corresponding location), light projected, or otherwise arranged at the corresponding locations. Preferably, the optical identifiers 2 all have the same size.
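
    Purely for illustration (no particular payload format is prescribed by this disclosure), the data content of such an optical identifier 2 could be a short machine-readable text string. The following Python sketch shows one hypothetical encoding of a fixed location into a QR payload and its decoding; the field names and the use of JSON are assumptions made for this example only.

        import json

        # Hypothetical payload format: each optical identifier stores its own
        # fixed position (in metres, in a region coordinate frame) plus the
        # region it belongs to. Neither the field names nor the use of JSON
        # are specified by this disclosure.
        def encode_identifier(region, x, y, z):
            """Build the text content to be printed/projected as a QR code."""
            return json.dumps({"region": region, "x": x, "y": y, "z": z})

        def decode_identifier(payload):
            """Recover region and fixed location from a decoded QR payload."""
            data = json.loads(payload)
            return data["region"], (data["x"], data["y"], data["z"])

        # Example: an identifier on a support beam of region 5 at (12.0, 3.5, 2.0) m.
        payload = encode_identifier("5", 12.0, 3.5, 2.0)
        region, location = decode_identifier(payload)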

    [0071] FIG. 2 shows a situation in which one of the autonomous mobile robots 100 has just entered region 5 through the corresponding door 8 and navigates within the region 5 of the environment 1 utilizing the navigation system 10. Three fields of view 13, 14, 15 of three forward-facing cameras/optical sensors 110 are indicated by dashed lines. The first field of view 13 corresponds to a high-resolution camera 113 (see FIG. 1). The second field of view 14 and the third field of view 15 correspond to near-distance low-resolution cameras 111. Within the first field of view 13, six optical identifiers 2 are currently visible (two on each of the corresponding vertical support beams 12 within the field of view and two on the wall in front of the autonomous mobile robot 100). Further, within each of the second field of view 14 and the third field of view 15, two optical identifiers 2 are visible (on the corresponding vertical support beams 12). Therefore, in total, ten optical identifiers 2 are currently visible, which may be used by the controller 120 to determine a current localization and orientation of the autonomous mobile robot 100. The controller 120 decodes the visible optical identifiers 2 and hence obtains their locations within the environment 1. Further, from the apparent sizes of the currently visible optical identifiers 2 within the recorded pictures, the controller 120 determines a distance to each of the visible optical identifiers 2.
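
    The distance estimation from the apparent size can follow the standard pinhole-camera relation, which is workable here because, preferably, all optical identifiers 2 share the same known physical size. A minimal sketch, assuming a calibrated focal length in pixels and a hypothetical identifier side length (both numeric values are placeholders, not taken from this disclosure):

        def distance_from_apparent_size(side_px, side_m=0.20, focal_px=2500.0):
            """Pinhole-camera model: an object of physical size side_m at
            distance d projects to side_m * focal_px / d pixels, hence
            d = side_m * focal_px / side_px. side_m and focal_px are assumed
            calibration values."""
            return side_m * focal_px / side_px

        # An identifier imaged 125 px wide would be 0.20 * 2500 / 125 = 4.0 m away.
        d = distance_from_apparent_size(125.0)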

    [0072] Using the determined locations of the visible optical identifiers 2 and the determined distances to each of the visible optical identifiers (under consideration of the viewing directions, which are known because the arrangement of the optical sensors (cameras) 110 on the autonomous mobile robot 100 is known to the controller 120), the controller 120 determines a current localization of the autonomous mobile robot 100, for example using a triangulation method with each pair of the visible optical identifiers 2, as described herein further above.

    [0073] Optionally, in the triangulation method, a weight is assigned to each of the visible optical identifiers 2, which is used in the triangulation as well as in determining the final localization. In particular, the controller 120 may assign a higher weight to optical identifiers 2 closer to the autonomous mobile robot 100 because these optical identifiers yield a more accurate localization result. For example, the triangulation with each pair of the visible optical identifiers 2 yields a position of the autonomous mobile robot 100. Depending on the weights assigned to the optical identifiers, the controller 120 may then, for example, determine the final localization as a weighted average of the individual localizations.
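
    One possible, non-limiting realization of this weighted pairwise scheme in two dimensions: each pair of decoded identifiers with estimated distances defines two circles whose intersections are candidate robot positions; the candidate closer to the previous pose estimate is kept, and all pairwise fixes are combined as a weighted average. The disambiguation rule and the weight function 1/(r1*r2), which favors closer identifiers, are assumptions of this sketch, not requirements of the disclosure.

        import math
        from itertools import combinations

        def intersect_circles(p1, r1, p2, r2):
            """Return the (up to two) intersection points of two circles, or []."""
            dx, dy = p2[0] - p1[0], p2[1] - p1[1]
            d = math.hypot(dx, dy)
            if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
                return []
            a = (r1 * r1 - r2 * r2 + d * d) / (2 * d)
            h = math.sqrt(max(r1 * r1 - a * a, 0.0))
            mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
            return [(mx + h * dy / d, my - h * dx / d),
                    (mx - h * dy / d, my + h * dx / d)]

        def localize(identifiers, prev_pose):
            """identifiers: list of ((x, y), distance) for each decoded
            identifier; prev_pose disambiguates the two circle intersections."""
            num, den = [0.0, 0.0], 0.0
            for (p1, r1), (p2, r2) in combinations(identifiers, 2):
                candidates = intersect_circles(p1, r1, p2, r2)
                if not candidates:
                    continue
                fix = min(candidates, key=lambda c: math.dist(c, prev_pose))
                w = 1.0 / (r1 * r2)  # closer identifiers weigh more (assumed)
                num[0] += w * fix[0]
                num[1] += w * fix[1]
                den += w
            return (num[0] / den, num[1] / den) if den else prev_pose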

    [0074] The localizations are continuously determined in this way while the autonomous mobile robot 100 moves through the environment 1. Hence, the controller 120 continuously obtains pictures of the environment 1 using the optical sensors 110, detects visible optical identifiers 2 within the pictures, which are within a combined field of view of all the optical sensors 110, decodes the data content of the detected visible optical identifiers 2, determines real-time localizations of the autonomous mobile robot within the environment 1, and navigates the autonomous mobile robot 100 based on the real-time localizations.

    [0075] Optionally, a redundancy check may be performed using the LiDAR scanners 112 or another navigation method. For example, in FIG. 2, the autonomous mobile robot 100 may use LiDAR scanners 112 arranged on the sides of the autonomous mobile robot 100 to determine a distance to the corresponding opposite wall. This position determined by means of the LiDAR scanners 112 is compared to the localization obtained by the optical navigation (navigation by means of the optical identifiers 2), and a variance between the two localization methods is estimated. If this variance is below a first threshold, the navigation occurs purely based on the optical navigation. If the variance is above a second threshold (which may also be the same as the first threshold), the autonomous mobile robot 100 is stopped and a human operator 180 may be notified to take care of the situation.
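
    A simple way to realize this plausibility check is to take the distance between the two independently obtained position estimates as the variance and compare it against the two thresholds. The threshold values in the sketch below are placeholders; this disclosure does not prescribe specific numbers, nor the in-between behavior, which is assumed here.

        import math

        OPTICAL_ONLY_THRESHOLD_M = 0.05  # first threshold (placeholder value)
        STOP_THRESHOLD_M = 0.25          # second threshold (placeholder value)

        def check_redundancy(optical_pose, lidar_pose):
            """Compare the optical and LiDAR localizations; return an action."""
            variance = math.dist(optical_pose, lidar_pose)
            if variance < OPTICAL_ONLY_THRESHOLD_M:
                return "navigate_optical_only"
            if variance > STOP_THRESHOLD_M:
                return "stop_and_notify_operator"
            return "navigate_fused"  # in-between band: keep both sources (assumed)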

    [0076] In FIG. 2, a drone 160 is further present in each of the regions (assembly hangars) 5, 6. The drone 160 may navigate in the same way as the autonomous mobile robot 100. In particular, the term autonomous mobile robot as used herein, although described for ground-based vehicles, also covers other vehicles such as the drone 160. The drone 160 as well as other autonomous mobile robots 100 and handheld devices 170 (one shown in FIG. 2) may also assist in navigating the subject autonomous mobile robot 100. Optical identifiers 2 may also be attached to each moving object (drone 160 (only indicated by reference sign 2, not explicitly shown), autonomous mobile robots 100, handheld device 170, etc.) or to the fuselages 9 themselves. The moving objects 100, 160, 170 may each continuously determine their current location and may assist each other in determining real-time localizations by continuously transmitting their locations via a network connection to the other moving objects 100, 160, 170. In this way, each moving object 100, 160, 170 may serve as a reference location, just as the fixed optical identifiers 2 do.

    [0077] Optionally, the autonomous mobile robot 100 may move faster if more of the optical identifiers 2 are currently visible, because in such a case the accuracy of the localization can be increased.

    [0078] Further, the navigation system 10 may enable navigating between regions 5, 6 in that a corresponding map of the new region is automatically loaded once the autonomous mobile robot 100 enters the new region. For example, if the autonomous mobile robot 100 leaves the region 5 in FIG. 2 via door 8, a map of the outdoor region, in particular containing the path 11, can be loaded. The autonomous mobile robot 100, or rather the controller 120, may detect such a change of regions by means of the detection of corresponding optical identifiers 2, for example, if an optical identifier 2 that belongs to the new region is decoded for the first time or if a dedicated optical identifier 2 is decoded that indicates the region change. For example, the optical identifiers 2 on the side of the door 8 may contain instructions to switch to a map of the region behind the door 8 once they are passed by the autonomous mobile robot 100. Therefore, for example, once the autonomous mobile robot 100 leaves one of the regions 5, 6, the controller may first load a map of the outdoor region (path 11) and may then load a map of the other one of the regions 5, 6 when entering through the corresponding door 8. In FIG. 2, one of the optical identifiers 2 (in region 6) is also shown in enlarged form, by way of example, as a QR code 2.
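
    The map hand-over can be driven entirely by the decoded identifiers. A sketch, assuming each payload carries the identifier's region (as in the earlier hypothetical encoding example) and a map store keyed by region; the helper name and data layout are assumptions:

        def update_region(current_region, decoded_regions, maps):
            """Switch the active map when an identifier of a new region is decoded.

            decoded_regions: regions of the identifiers decoded in the latest
            pictures; maps: dict mapping a region key (e.g. "5", "6",
            "outdoor") to its stored map."""
            for region in decoded_regions:
                if region != current_region and region in maps:
                    return region, maps[region]  # load the map of the new region
            return current_region, maps[current_region]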

    [0079] FIG. 3 shows a cut view of a part of one of the assembly hangars of FIG. 2. In FIG. 3, a scenario is illustrated in which an autonomous mobile robot 100 works on different levels (i.e., vertically separated areas 5, 6) as the regions of the environment 1. It is illustrated that one autonomous mobile robot 100 works on a top level 6 while another one works on a base level 5 of the corresponding environment 1. Each of the autonomous mobile robots 100 may switch between these levels 5, 6, i.e., change regions of the environment having distinct maps. For example, the autonomous mobile robot 100 may switch levels by means of an elevator (not shown). The transfer from one level 5, 6 to the other may then be done in the same way as the switch between regions described above with regard to FIG. 2; here, however, the elevator corresponds to the transition point. In other words, once the autonomous mobile robot 100 reaches the elevator (the autonomous mobile robot 100 may navigate to the elevator in the described way using the optical navigation), it may detect this by decoding a corresponding optical identifier 2, e.g., arranged on the side of an elevator door. It may then enter the elevator, for example using the LiDAR scanners 112, and may switch the map to the map of the new level 5, 6. FIG. 3 also shows a handheld device 170 that is used by a human operator 180 to work on a surface of the fuselage 9, as described further below with regard to FIG. 4.

    [0080] FIG. 4 shows a handheld device 170 for performing a work task on an object 9, such as a fuselage 9. For example, the handheld device 170 may be a manual inspection scanner having an optical scanner 172 (similar to the optical scanner 102 of the autonomous mobile robot 100) that is used to manually scan the surface of an aircraft fuselage 9 for anomalies. The handheld device 170 as depicted comprises a work tool 172, here in the form of an optical scanner 172. However, the work tool 172 may also be any other work tool 172, such as a riveting tool, etc. The handheld device 170 further comprises a controller 173, a camera 171, and a handle 176 for holding the handheld device 170. Here, the camera 171 is a camera of a smartphone 174 that is attached to the handheld device 170 via an adapter 175.

    [0081] Oftentimes, it is necessary to save the position, for example on the fuselage 9, at which a corresponding scan (or, in general, work task) has been performed (for example, a position of a stringer or a frame (or both) of the fuselage 9 at which the scan has been taken). In order to avoid manually entering or marking the position on the object/fuselage 9, the handheld device 170 uses the same principle as the navigation system 10 described above to obtain its current location. Therefore, the corresponding discussion of how the real-time localizations are obtained by detecting visible optical identifiers 2 within the environment 1 will not be repeated. Any and all features described with regard to the real-time localizations of the autonomous mobile robot 100 using the navigation system 10 are fully valid for the handheld device 170.

    [0082] However, instead of being used for navigation purposes, the real-time localizations may be obtained, for example, each time a scan (or some other work task) is performed. The localization determined at each work task (i.e., the position at which the handheld device is located within the environment 1) is used to determine a position on the object 9 (e.g., a position on a fuselage 9) at which the work task (here, the optical scan) has been performed. These positions may then be correlated with data pertaining to the work task (e.g., with optical scans taken at that position).
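
    A minimal record-keeping sketch for this correlation, assuming the device pose is obtained as described above and that a hypothetical callable object_frame maps a pose in the environment 1 to a position on the object 9 (e.g., stringer/frame coordinates on the fuselage 9):

        from dataclasses import dataclass, field

        @dataclass
        class WorkLog:
            """Correlates data of each performed work task with the position
            on the object at which it was performed."""
            entries: list = field(default_factory=list)

            def record(self, scan_data, device_pose, object_frame):
                # object_frame is an assumed callable mapping a device pose in
                # the environment to a position on the object (e.g., stringer/
                # frame indices on the fuselage).
                work_position = object_frame(device_pose)
                self.entries.append({"position": work_position,
                                     "data": scan_data})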

    [0083] The camera 171 of the handheld device 170 that is used for obtaining pictures of the environment 1 may either be an integrated camera 171 of the handheld device 170 or (as depicted) may be a camera 171 that can be attached to the handheld device 170, communicates with the controller 173, and can be used by the controller 173 for obtaining the pictures of the optical identifiers 2 within the environment 1. For example, as depicted, the camera 171 may be a camera 171 of a smartphone 174 that is attached to the handheld device 170 at a corresponding adapter 175. The smartphone 174 may then be connected to the controller 173 and may be used for obtaining the corresponding pictures.

    [0084] The process of obtaining the localizations has been described with regard to the navigation system 10 (FIGS. 2, 3) and will not be repeated here. As depicted, in the situation in FIG. 4, the camera 171 captures two optical identifiers 2, such as QR codes 2, that are arranged on a supporting structure 7 for the fuselage 9. However, this is only one exemplary situation. Depending on the scan position, the camera may capture other optical identifiers 2 as well as more or fewer optical identifiers 2. The mechanism is exactly the same as described with regard to the localization of the autonomous mobile robot 100 by means of the navigation system 10 (FIGS. 2, 3).

    [0085] Further, the handheld device 170 may also itself carry at least one optical identifier 2, which can be used by the navigation system 10 described herein, for example with regard to FIGS. 2 and 3, if the handheld device 170 is present within the environment 1. The handheld device 170 may, in particular, also send its current location to devices such as drones 160 and autonomous mobile robots 100 that are navigating via the navigation system 10, such that the handheld device 170 can be used as an additional position reference, just like the optical identifiers 2 located at fixed locations within the environment 1, as described with regard to FIGS. 2 and 3 further above.

    [0086] FIG. 5, with continued reference to FIGS. 1 to 3, shows a flow chart of a method 200 for navigating an autonomous mobile robot 100 within an environment 1. The method 200 may, for example, be performed by the navigation system 10 of FIGS. 2, 3 with the autonomous mobile robot 100 of FIG. 1. The steps of the method 200 have been concurrently described with regard to the discussion of the navigation system 10. Therefore, for the sake of brevity, the steps of the method 200 are only described briefly here.

    [0087] The method 200 starts in step 210 with obtaining pictures of the environment 1. The pictures may be obtained by the controller 120 via the optical sensors 110 of the autonomous mobile robot 100.

    [0088] In step 220, the controller 120 detects visible optical identifiers 2 of a plurality of optical identifiers 2 (such as QR codes 2). The visible optical identifiers 2 are optical identifiers 2 which are currently within a combined field of view of the optical sensors 110, as described above with regard to the navigation system 10. Each of the optical identifiers 2 encodes a fixed location within the environment 1 at which it is arranged.

    [0089] In step 230, the controller 120 decodes the visible optical identifiers 2 and determines a current localization of the autonomous mobile robot 100 within the environment 1 based on the decoded optical identifiers 2. Determining the localization may be done by a weighted triangulation method with each pair of visible optical identifiers 2, as described above.

    [0090] Steps 210, 220, and 230 are performed continuously while the autonomous mobile robot 100 moves through the environment 1. In other words, the autonomous mobile robot 100 determines, in real-time, its localization within the environment 1 by monitoring the environment 1 for optical identifiers while it moves.
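
    Taken together, steps 210 through 240 can be read as a single perception-to-action loop. A schematic sketch reusing the hypothetical helpers from the earlier examples; the robot API (is_active, capture_all_cameras, drive_towards_next_waypoint) and the detect_identifiers function are assumptions of this sketch, not part of the disclosed method:

        def navigation_loop(robot, maps, initial_pose, initial_region):
            pose, region = initial_pose, initial_region
            while robot.is_active():
                pictures = robot.capture_all_cameras()        # step 210: obtain pictures
                detections = detect_identifiers(pictures)     # step 220: detect identifiers
                decoded = [(decode_identifier(d.payload), d.apparent_size_px)
                           for d in detections]               # step 230: decode
                landmarks = [(loc[:2], distance_from_apparent_size(px))
                             for (_, loc), px in decoded]
                pose = localize(landmarks, pose)              # step 230: weighted localization
                region, _ = update_region(
                    region, [r for (r, _), _ in decoded], maps)
                robot.drive_towards_next_waypoint(pose)       # step 240: navigate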

    [0091] In step 240, the controller 120 navigates the autonomous mobile robot 100 based on the real-time localizations. Step 240 may be running concurrently with the continuous localization of the autonomous mobile robot 100 within the environment.

    [0092] The systems and devices described herein may include a controller, such as controller 120, control unit, control device, controlling means, system control, processor, computing unit or a computing device comprising a processing unit and a memory which has stored therein computer-executable instructions for implementing the processes described herein. The processing unit may comprise any suitable devices configured to cause a series of steps to be performed so as to implement the method such that instructions, when executed by the computing device or other programmable apparatus, may cause the functions/acts/steps specified in the methods described herein to be executed. The processing unit may comprise, for example, any type of general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, a central processing unit (CPU), an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, other suitably programmed or programmable logic circuits, or any combination thereof.

    [0093] The memory may be any suitable known or other machine-readable storage medium. The memory may comprise non-transitory computer readable storage medium such as, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. The memory may include a suitable combination of any type of computer memory that is located either internally or externally to the device such as, for example, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like. The memory may comprise any storage means (e.g., devices) suitable for retrievably storing the computer-executable instructions executable by processing unit.

    [0094] The methods and systems described herein may be implemented in a high-level procedural or object-oriented programming or scripting language, or a combination thereof, to communicate with or assist in the operation of the controller or computing device. Alternatively, the methods and systems described herein may be implemented in assembly or machine language. The language may be a compiled or interpreted language. Program code for implementing the methods and systems described herein may be stored on the storage media or the device, for example a ROM, a magnetic disk, an optical disc, a flash drive, or any other suitable storage media or device. The program code may be readable by a general or special-purpose programmable computer for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.

    [0095] Computer-executable instructions may be in many forms, including program modules, such as AI module 122, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.

    [0096] It should be noted that "comprising" or "including" does not exclude other elements or steps, and "one" or "a" does not exclude a plurality. It should further be noted that features or steps that have been described with reference to any of the above embodiments may also be used in combination with other features or steps of other embodiments described above.

    [0097] While at least one exemplary embodiment of the present invention(s) is disclosed herein, it should be understood that modifications, substitutions and alternatives may be apparent to one of ordinary skill in the art and can be made without departing from the scope of this disclosure. This disclosure is intended to cover any adaptations or variations of the exemplary embodiment(s). In addition, in this disclosure, the term "or" means either or both. Furthermore, characteristics or steps which have been described may also be used in combination with other characteristics or steps and in any order unless the disclosure or context suggests otherwise. This disclosure hereby incorporates by reference the complete disclosure of any patent or application from which it claims benefit or priority.

    LIST OF REFERENCE SIGNS

    [0098] 1 (production) environment
    [0099] 2 optical identifier
    [0100] 3 first subset of optical identifiers
    [0101] 4 second subset of optical identifiers
    [0102] 5 first region of the environment (assembly hangar; base level)
    [0103] 6 second region of the environment (assembly hangar; top level)
    [0104] 7 supporting structure
    [0105] 8 door
    [0106] 9 fuselage
    [0107] 10 navigation system
    [0108] 11 outdoor path
    [0109] 12 vertical support beams
    [0110] 13 first field of view
    [0111] 14 second field of view
    [0112] 15 third field of view
    [0113] 100 autonomous mobile robot
    [0114] 101 robot arm/manipulator
    [0115] 102 end effector, optical scanner
    [0116] 103 tray
    [0117] 104 robot body
    [0118] 110 optical sensor
    [0119] 111 near-distance low-resolution camera
    [0120] 112 LiDAR scanner
    [0121] 113 high-resolution camera
    [0122] 120 controller
    [0123] 121 data storage
    [0124] 122 artificial intelligence module (AI module)
    [0125] 130 wheels
    [0126] 160 drone
    [0127] 161 optical scanner of drone
    [0128] 162 camera of drone
    [0129] 170 handheld device
    [0130] 171 camera (of handheld device)
    [0131] 172 work tool, optical scanner (of handheld device)
    [0132] 173 controller
    [0133] 174 smartphone
    [0134] 175 adapter
    [0135] 176 handle
    [0136] 180 human operator
    [0137] 200 method
    [0138] 210 obtaining pictures
    [0139] 220 detecting optical identifiers
    [0140] 230 decoding optical identifiers
    [0141] 240 navigating based on optical identifiers