Device, method for controlling the same, and device group or swarm
12339657 · 2025-06-24
Assignee
Inventors
- Jan Behling (Dortmund, DE)
- Mathias ROTGERI (Dortmund, DE)
- Jan Sören EMMERICH (Dortmund, DE)
- Dirk HÖNING (Dortmund, DE)
- Patrick KLOKOWSKI (Dortmund, DE)
- Christian HAMMERMEISTER (Dortmund, DE)
- Michael TEN HOMPEL (Dortmund, DE)
CPC classification
G05D1/243
PHYSICS
International classification
Abstract
A device includes an optical sensing unit configured to sense an object so as to obtain a picture of the object. The device includes a drive unit configured to drive and to move the device. The device includes an evaluation unit configured to evaluate the picture in terms of an at least two-dimensional pattern and to evaluate the pattern in terms of at least a first marking area and a second marking area. The evaluation unit is configured to obtain a marking result by comparing the first marking area and the second marking area, and to determine, on the basis of the marking result, relative localization of the device with regard to the object. The device includes a control unit configured to control the drive unit on the basis of the relative localization.
Claims
1. Device comprising: an optical sensing unit configured to sense an object so as to acquire a picture of the object; a drive unit configured to drive and to move the device; an evaluation unit configured to evaluate the picture in terms of an at least two-dimensional pattern, and to evaluate the pattern in terms of at least a first marking area and a second marking area so as to acquire a marking result by comparing the first marking area and the second marking area; and to determine, on the basis of the marking result, relative localization of the device with regard to the object; a control unit configured to control the drive unit on the basis of the relative localization; wherein the drive unit is configured to move the device along a direction of movement; wherein the optical sensing unit is configured to sense the object along or in parallel with the direction of movement; wherein the evaluation unit is configured to evaluate the picture in terms of an object code and to perform object identification of the object on the basis of the object code, the control unit being configured to control the drive unit on the basis of the object identification; and wherein the evaluation unit is configured to evaluate the picture in terms of a surface code and to perform surface identification of a surface of the object on the basis of the surface code, the control unit being configured to control the drive unit on the basis of the surface identification; wherein the device is adapted to distinguish the object from other objects carrying different object codes and to distinguish, based on the surface code, the surface of the object from other surfaces carrying different surface codes and a same object code, wherein based on the pattern, the device is adapted to identify the object, to identify the surface of the object and to determine a rotation, related to an orientation of the device, of the object about a vertical axis and/or about a horizontal axis.
2. Device as claimed in claim 1, wherein the pattern is a two-dimensional QR code which comprises a plurality of marking areas between which an object code for identifying the object and a surface code for identifying a surface of the object are arranged along different spatial directions.
3. Device as claimed in claim 1, wherein the evaluation unit is configured to determine a rotation of the object with regard to the device by means of a size comparison of the first marking area and the second marking area.
4. Device as claimed in claim 3, the device being a ground vehicle configured to move along horizontally while using the drive unit, the evaluation unit being configured to evaluate the pattern in terms of a horizontal arrangement of the first pattern area and of the second pattern area so as to determine a rotation, related to an orientation of the device, of the object about a vertical axis.
5. Device as claimed in claim 3, wherein the evaluation unit is configured to evaluate the pattern in terms of a vertical arrangement of the first pattern area and of the second pattern area so as to determine a rotation, related to an orientation of the device, of the object about a horizontal axis.
6. Device as claimed in claim 1, wherein the evaluation unit is configured to verify exclusively areas of the picture in terms of the presence of additional information, the evaluation unit being configured to verify the picture in at least a pattern area; wherein pattern areas are spanned along a linear straight-line displacement of a marking area to form an adjacent marking area.
7. Device as claimed in claim 6, wherein an expansion of the pattern area along a direction perpendicular to the straight line is determined by a dimension of the marking area.
8. Device as claimed in claim 6, wherein different pattern areas extend exclusively perpendicularly or in parallel with one another.
9. Device as claimed in claim 1, wherein the evaluation unit is configured to determine an object distance between the device and the object by means of comparing a size of the first or second pattern areas in the picture to a reference size, and to determine the relative localization on the basis of the object distance.
10. Device as claimed in claim 1, wherein the evaluation unit is configured to evaluate the pattern in terms of a third pattern area and a fourth pattern area which span a rectangle along with the first pattern area and the second pattern area, and to determine the relative localization from deviations of the first to fourth pattern areas from the arrangement as a rectangle.
11. Device as claimed in claim 1, wherein the evaluation unit is configured to evaluate the picture in terms of an object code and to perform object identification of the object on the basis of the object code, the control unit being configured to control the drive unit on the basis of the object identification.
12. Device as claimed in claim 1, wherein the evaluation unit is configured to evaluate the picture in terms of a surface code and to perform surface identification of a surface of the object on the basis of the surface code, the control unit being configured to control the drive unit on the basis of the surface identification.
13. Device as claimed in claim 1, wherein the evaluation unit is configured to evaluate the picture in terms of an object code which is arranged between the first and second marking areas and indicates an identity of the object.
14. Device as claimed in claim 1, wherein the evaluation unit is configured to evaluate the pattern in terms of at least one third marking area; and to evaluate the picture in terms of an object code which is arranged between the first and second marking areas and indicates an identity of the object; and to evaluate the picture in terms of a surface code which is arranged between the third marking area and the first or second marking areas and indicates an identity of a side of the object on which the pattern is arranged.
15. Device as claimed in claim 1, wherein the evaluation unit is configured to evaluate the pattern in terms of an object code indicating an identity of the object, and to evaluate the pattern in terms of a surface code which is arranged separately from the former and indicates a specific surface region of the object, and to determine the relative localization with regard to the object and to the surface region.
16. Device as claimed in claim 1, wherein the control unit is configured to perform an instruction which indicates to take up a predetermined relative position with regard to a predetermined side of a predetermined object, and is configured to adapt its own position by controlling the drive unit in accordance with the instruction, on the basis of the relative localization.
17. Device as claimed in claim 1, comprising a coupling unit configured to perform mechanical coupling to a corresponding mechanical coupling unit of the object.
18. Device as claimed in claim 17, configured to orient itself by the two-dimensional pattern so as to mechanically connect the mechanical coupling unit to the corresponding mechanical coupling unit.
19. Device as claimed in claim 17, comprising a two-dimensional pattern on at least one side on which the mechanical coupling unit is arranged.
20. Device as claimed in claim 1, which in a top view spans a polygon surface and comprises a mechanical coupling unit on at least two faces of the polygon.
21. Device as claimed in claim 1, which is a self-driving robot.
22. The device of claim 1, being a robot, wherein the object is another robot; wherein the device is to mechanically couple to the other robot based on the marking result obtained by evaluating the pattern in terms of at least the first marking area and the second marking area; wherein the device is configured for distinguishing between different objects carrying different object codes and to distinguish between different surfaces of an object, the different surfaces carrying a same object code and different surface codes; wherein the device is to mechanically couple to the other robot at a side thereof carrying the pattern based on evaluation of the pattern.
23. The device of claim 1, wherein the device is to mechanically couple to the other robot at a side thereof carrying the pattern based on evaluation of the pattern.
24. The device of claim 1, being a robot, wherein the object is another robot; wherein the evaluation unit is configured to evaluate the picture in terms of an object code and to perform object identification of the object on the basis of the object code, the control unit being configured to control the drive unit on the basis of the object identification; wherein the evaluation unit is configured to evaluate the picture in terms of a surface code and to perform surface identification of a surface of the object on the basis of the surface code, the control unit being configured to control the drive unit on the basis of the surface identification; wherein the device is configured for distinguishing between different objects carrying different object codes and to distinguish between different surfaces of an object, the different surfaces carrying a same object code and different surface codes; wherein the device is to mechanically couple to the other robot based on the object code and based on the surface code.
25. Device group comprising: a plurality of devices comprising: an optical sensing unit configured to sense an object so as to acquire a picture of the object; a drive unit configured to drive and to move the device; an evaluation unit configured to evaluate the picture in terms of an at least two-dimensional pattern, and to evaluate the pattern in terms of at least a first marking area and a second marking area so as to acquire a marking result by comparing the first marking area and the second marking area; and to determine, on the basis of the marking result, relative localization of the device with regard to the object; a control unit configured to control the drive unit on the basis of the relative localization; wherein the drive unit is configured to move the device along a direction of movement; wherein the optical sensing unit is configured to sense the object along or in parallel with the direction of movement; wherein the evaluation unit is configured to evaluate the picture in terms of an object code and to perform object identification of the object on the basis of the object code, the control unit being configured to control the drive unit on the basis of the object identification; and wherein the evaluation unit is configured to evaluate the picture in terms of a surface code and to perform surface identification of a surface of the object on the basis of the surface code, the control unit being configured to control the drive unit on the basis of the surface identification; wherein the device is adapted to distinguish the object from other objects carrying different object codes and to distinguish, based on the surface code, the surface of the object from other surfaces carrying different surface codes and a same object code, wherein based on the pattern, the device is adapted to identify the object, to identify the surface of the object and to determine a rotation, related to an orientation of the device, of the object about a vertical axis and/or 
about a horizontal axis; wherein the plurality of devices comprise, on at least one surface, a device pattern indicating at least one object side or an object identity; wherein the plurality of devices are configured to orient themselves, in relation to one another, on the basis of the respective relative localization.
26. Device group as claimed in claim 25, wherein the devices are configured to sense the device pattern of other devices; and to take up, on the basis of object codes and/or surface codes, a respective relative location with regard to another device of the device group or to a surface thereof.
27. Device group as claimed in claim 25, wherein the plurality of devices are configured to mechanically couple to one another.
28. Method comprising: controlling a drive unit to drive a device and to move it along a direction of movement within a plane of movement; controlling an optical sensing unit of the device to sense an object within the plane of movement or in parallel with the plane of movement so as to acquire a picture of the object; controlling an evaluation unit to evaluate the picture in terms of an at least two-dimensional pattern; and to evaluate the pattern in terms of at least a first marking area and a second marking area, to acquire a marking result by comparing the first marking area and the second marking area; and to determine, on the basis of the marking result, relative localization of the device with regard to the object; such that the picture is evaluated in terms of an object code and object identification of the object is performed on the basis of the object code, the drive unit being controlled on the basis of the object identification; and such that the picture is evaluated in terms of a surface code and surface identification of a surface of the object is performed on the basis of the surface code, the drive unit being controlled on the basis of the surface identification; such that the object is distinguished from other objects carrying different object codes and, based on the surface code, the surface of the object is distinguished from other surfaces carrying different surface codes and a same object code, and such that, based on the pattern, the object and the surface of the object are identified and a rotation, related to an orientation of the device, of the object about a vertical axis and/or about a horizontal axis is determined; and controlling a control unit to control the drive unit on the basis of the relative localization.
29. A non-transitory digital storage medium having a computer program stored thereon to perform the method which comprises: controlling a drive unit to drive a device and to move it along a direction of movement within a plane of movement; controlling an optical sensing unit of the device to sense an object within the plane of movement or in parallel with the plane of movement so as to acquire a picture of the object; controlling an evaluation unit to evaluate the picture in terms of an at least two-dimensional pattern; and to evaluate the pattern in terms of at least a first marking area and a second marking area, to acquire a marking result by comparing the first marking area and the second marking area; and to determine, on the basis of the marking result, relative localization of the device with regard to the object; such that the picture is evaluated in terms of an object code and object identification of the object is performed on the basis of the object code, the drive unit being controlled on the basis of the object identification; and such that the picture is evaluated in terms of a surface code and surface identification of a surface of the object is performed on the basis of the surface code, the drive unit being controlled on the basis of the surface identification; such that the object is distinguished from other objects carrying different object codes and, based on the surface code, the surface of the object is distinguished from other surfaces carrying different surface codes and a same object code, and such that, based on the pattern, the object and the surface of the object are identified and a rotation, related to an orientation of the device, of the object about a vertical axis and/or about a horizontal axis is determined; and controlling a control unit to control the drive unit on the basis of the relative localization, when said computer program is run by a computer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Embodiments of the present invention will be detailed subsequently referring to the appended drawings.
DETAILED DESCRIPTION OF THE INVENTION
(12) Before embodiments of the present invention will be explained in detail with reference to the drawings, it shall be noted that elements, objects and/or structures in the different figures which are identical, identical in function or in effect, are provided with identical reference numerals, so that the descriptions of said elements that are provided in different embodiments are interchangeable and/or mutually applicable.
(13) The present embodiments relate to devices, in particular to self-driving robots. These include, e.g., robots driving autonomously, the term autonomously being understood to mean that, e.g. in a device group such as a robot swarm, solutions to specific tasks are developed autonomously. This does not exclude external influences, e.g. overriding communication of a task, e.g. to transport an object or to perform actions at specific locations. Terms such as self-driving or driving autonomously may be understood to mean that the device itself performs recognition of its surroundings, orientation, and independent locomotion in order to accomplish its tasks.
(14) Even though embodiments described herein relate to driving devices, e.g. while using wheels, rollers or chains, the embodiments are not limited thereto, but relate to any form of one-dimensional, two-dimensional, or three-dimensional locomotion, in particular to flying devices and/or devices where at least one working plane may be varied in position along a direction of height, as is the case with fork-lift trucks, for example.
(16) The device 10 includes drive means 32 configured to drive and to move the device 10. For example, the device 10 may be configured as a ground vehicle so as to move along horizontally, e.g. on a ground area or the like, e.g. within an x/y plane, while using the drive means 32. Alternatively or additionally, at least part of the device 10 may be variable along a z direction, e.g. by means of height adjustment. Alternatively or additionally, the device 10 may be configured to move spatially in a one-dimensional or three-dimensional manner.
(17) In the event of two-dimensional movement, the movement may occur in parallel with the x/y plane; the reference plane may also be curved or tilted as a function of the foundation on which the device moves, e.g. when the ground is uneven. Movement of the device 10 may occur along a variable direction of movement, which may comprise, e.g., an x component and/or a y component. In addition to comprising the drive means 32, which moves the device along the direction of movement, the device may be configured to sense the object 14 along or in parallel with a current or possible direction of movement. To this end, the optical sensing means 12 may be configured to sense the object 14 within the x/y plane or in a manner that is offset or tilted with regard thereto. This means that the optical sensing means 12 may sense the object 14 along or in parallel with the direction of movement. The sensed surface of the object 14 is thus arranged out of plane with regard to the plane of movement, for example perpendicularly thereto or tilted at an angle thereto that differs from 90°, so that sensing of the object 14 enables sensing of the pattern when the line of vision is parallel to the direction of movement. This may be used for recognizing other devices, stations or means with which interaction is desired, and is to be distinguished from ground marks which merely serve navigation purposes, are arranged perpendicularly to the direction of movement and are assumed to be invariable in position.
(18) The evaluation means 22 may be configured to evaluate the picture, obtained by means of the output signal 18, in terms of the pattern 16. This means that the pattern 16 may be evaluated in terms of predefined features. For example, the pattern 16 may comprise two or more marking areas 24.sub.1 and 24.sub.2. The evaluation means 22 may be configured to evaluate the pattern 16 in terms of at least two marking areas 24.sub.1 and 24.sub.2. The evaluation means 22 is further configured to perform a comparison while using the marking areas 24.sub.1 and 24.sub.2. The comparison may include mutual comparison of features of the marking areas 24.sub.1 and 24.sub.2, but alternatively or additionally may also include a comparison of the respective marking area 24.sub.1 and/or 24.sub.2, or of features thereof, with a respective reference quantity. The evaluation means 22 is configured to obtain a marking result on the basis of the comparison. The evaluation means 22 may be configured to determine, on the basis of the marking result, a relative localization of the device 10 with regard to the object 14, in particular to the pattern 16. It is possible, by means of a signal 26, to transmit the relative localization to a control means 28 configured to control a drive means 32. The control means 28 is configured to control the drive means 32 on the basis of the relative localization. The drive means 32 may comprise actuating elements, e.g. wheels, rollers, chains, propellers or the like, so as to change the spatial position of the device 10. This means that the device 10 may move spatially on the basis of the recognized pattern 16 and of the evaluation of the pattern features, in particular of the marking areas 24.sub.1 and 24.sub.2.
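The evaluation chain just described can be sketched in a few lines. This is a minimal illustration, not an implementation from the patent: the function names, the use of apparent widths in pixels as the compared feature, and the tolerance threshold are all assumptions for the sketch.

```python
# Illustrative sketch of the chain: compare two marking areas,
# obtain a marking result, map it to a coarse relative localization.
# Names, the width-in-pixels feature, and the threshold are assumptions.

def compare_marking_areas(width_1, width_2):
    """Return the apparent-size ratio of two marking areas (the 'marking result')."""
    return width_1 / width_2

def relative_localization(marking_result, tolerance=0.05):
    """Classify the device's pose relative to the patterned surface.

    A ratio near 1 suggests the surface is viewed head-on; a ratio far
    from 1 suggests the surface is rotated relative to the device.
    The left/right labels are an illustrative sign convention.
    """
    if abs(marking_result - 1.0) <= tolerance:
        return "facing"
    return "rotated-left" if marking_result < 1.0 else "rotated-right"
```

A control means could then steer the drive means until the result becomes "facing".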
(20) Alternatively or additionally, a dimension 34.sub.5, e.g. a distance between mutually facing edges of the marking areas 24.sub.1 and 24.sub.2, may comprise a predefined value at least at a predefined distance between the sensing means 12 and the pattern 16. The dimension 34.sub.5 may also refer to other edges of the marking areas 24.sub.1 and 24.sub.2.
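If a dimension of the pattern has a known value at a known reference distance, as paragraph (20) describes, an object distance can be estimated under a pinhole-camera assumption, where apparent size is inversely proportional to distance. The function below is an illustrative sketch under that assumption; the names are not from the patent.

```python
def estimate_distance(apparent_px, reference_px, reference_distance):
    """Pinhole-model distance estimate.

    Apparent size scales inversely with distance, so:
        distance = reference_distance * reference_px / apparent_px
    e.g. a dimension that appears half as large as at the reference
    distance indicates roughly twice the reference distance.
    """
    return reference_distance * reference_px / apparent_px
```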
(21) Even though the marking areas 24.sub.1 and 24.sub.2 are depicted as squares, a shape deviating therefrom is also possible, e.g. a free-form surface, a polygon, which is shaped in a regular or irregular manner, an ellipse, in particular a circle, or combinations thereof, e.g. mutually enclosing polygons or ellipses.
(22) In the depicted view of the pattern 16, in which the sensing means 12 views the pattern 16 in a perpendicular manner, for example, the direction a may be aligned, e.g., in parallel with the x direction, and the direction b may be aligned in parallel with the z direction of
(24) As is shown by way of example, distortions may arise within the pattern 16 sensed by the sensing means 12. For example, edges which are otherwise equal in length or are at a certain ratio with one another may be modified, as is depicted, for example, for dimensions 34.sub.3-1 and 34.sub.3-2 corresponding to the dimension 34.sub.3 in the view of
(25) This means that on the basis of the perspective, a distortion within the pattern 16 may arise, which may be ascertained by the evaluation means 22, so as to determine the relative localization of the device with regard to the object, in particular to the pattern 16.
(26) A comparison of the dimensions of
(27) Irrespectively thereof, the evaluation means 22 may be configured to evaluate the pattern 16 in terms of a horizontal arrangement of the marking areas 24.sub.1 and 24.sub.2 so as to determine a rotation of the device 10 with regard to the object 14 about an axis 36. For example, if the device 10 is configured as a ground vehicle, the axis 36 may be vertically aligned, for example in parallel with the b direction or the z direction.
(28) Alternatively or additionally, it is also possible to configure the evaluation means 22 such that same evaluates the pattern 16 in terms of a vertical arrangement of the marking areas 24.sub.1 and 24.sub.2 so as to determine a rotation, related to an orientation of the device 10, of the object 14 about a horizontal axis. For example, a difference in dimensions 34.sub.1 or 34.sub.2 might result at different locations of the pattern due to differences in height, which may be evaluated in an equivalent manner. To this end, the marking areas 24.sub.1 and 24.sub.2 might be arranged along the b direction, for example. Displacement of the marking areas 24.sub.1 and 24.sub.2 along two directions, e.g. along a diagonal, may enable combinatorial evaluation in terms of a rotation of the device with regard to the object about a horizontal axis and a vertical axis.
(29) The embodiments have in common that rotation of the object with regard to the device may be determined by the evaluation means 22 by means of a comparison of the sizes of the marking areas 24.sub.1 and 24.sub.2, which conversely and in the sense of relative localization equivalently means rotation of the device with regard to the object.
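The size comparison of the two marking areas can be turned into a rotation angle under a pinhole-camera assumption: each marker's depth follows from its apparent width, and the depth difference over the markers' baseline gives the rotation about the vertical axis. This sketch is illustrative; the focal length, marker size, baseline, and sign convention are assumed values, not taken from the patent.

```python
import math

def marker_depth(focal_px, marker_width_m, apparent_width_px):
    """Depth of a single marking area under the pinhole model."""
    return focal_px * marker_width_m / apparent_width_px

def yaw_about_vertical_axis(focal_px, marker_width_m, baseline_m,
                            w_left_px, w_right_px):
    """Rotation of the patterned surface about the vertical axis,
    estimated from the depth difference of two horizontally
    arranged marking areas. Sign convention is illustrative."""
    dz = (marker_depth(focal_px, marker_width_m, w_right_px)
          - marker_depth(focal_px, marker_width_m, w_left_px))
    return math.atan2(dz, baseline_m)  # radians
```

Swapping the horizontal baseline for a vertical one gives, analogously, the rotation about a horizontal axis described in paragraph (28).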
(31) The marking areas 24.sub.1 and 24.sub.2 and/or 24.sub.3 may be configured to be identical with or different from one another. By way of example, mutually different numbers of mutually enclosing ring patterns 42.sub.1-1 to 42.sub.1-5, 42.sub.2-1 to 42.sub.2-4, and/or 42.sub.3-1 to 42.sub.3-4 may be arranged in mutually different marking areas 24.sub.1, 24.sub.2, and 24.sub.3, respectively. For example, outer peripheries of the marking areas 24.sub.1 to 24.sub.3 may span, along the directions a and b, which may span a pattern-specific coordinate system, pattern areas 44.sub.1 and 44.sub.2, which are arranged between two adjacent marking areas 24.sub.1 and 24.sub.2, and 24.sub.1 and 24.sub.3, respectively. Thus, it is along the different spatial directions a and b that additional information may be depicted in the pattern areas 44.sub.1 and 44.sub.2 arranged along said directions, which additional information may be sensed by the sensing means 12 and evaluated by the evaluation means 22. For example, mutually different information of the codes 46 and 48 may be contained; it is also possible, alternatively, to depict only one of the codes 46 or 48. Each of the codes 46 and 48 may contain, independently of the other, specific information which may be evaluable by the evaluation means 22. For example, the code 46 may include an object code for identifying the object 14, which code enables, e.g., unambiguous identification of an object within a specific system or swarm of devices. For example, the code 48 may include a surface code for identifying a specific surface of the object 14. In other words, the evaluation means 22 may be configured to evaluate the pattern 16 in terms of the object code 46, which indicates an identity of the object. The evaluation means 22 may further be configured to evaluate the pattern 16 in terms of a surface code 48, which is arranged separately from the object code 46 and which indicates a specific surface region of the object 14, e.g. a side thereof, so as to determine the relative localization with regard to the object 14 and to the surface region. Alternatively or additionally, the codes 46 and 48 may also comprise other information or may be mutually exchanged.
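The two codes running along the two spatial directions can be decoded independently. The sketch below assumes, purely for illustration, that each pattern area has already been binarized into a bit sequence; the bit layout and names are not the patent's actual encoding.

```python
# Illustrative decoding of the two codes carried between the marking
# areas: the object code runs along one spatial direction (e.g. a),
# the surface code along the other (e.g. b). The binary layout here
# is an assumption for the sketch.

def decode_pattern(horizontal_bits, vertical_bits):
    """Interpret the horizontal pattern area as the object code and
    the vertical pattern area as the surface code."""
    object_id = int("".join(str(b) for b in horizontal_bits), 2)
    surface_id = int("".join(str(b) for b in vertical_bits), 2)
    return {"object": object_id, "surface": surface_id}
```

With this, two patterns sharing the object bits but differing in the surface bits identify two sides of the same object, as described above.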
(32) For example, the respective pattern components 52.sub.1 to 52.sub.n are formed to be round; however, they may have any other shape, e.g. be formed along a free-form area, a polygon, an ellipse, or the like. The pattern components 52 within a code may be formed to be identical to or different from one another. Likewise, the pattern components in different codes may be formed to be identical to or different from one another.
(33) The evaluation means 22 may be configured to evaluate the picture in terms of the object code 46 and to perform object identification of the object 14 on the basis of the object code 46. The control means 28 may be configured to control the drive means 32 on the basis of the object identification. This means that the object identification may form part of the relative localization. As a result, it is possible for the device 10, for example, to not only determine the presence of any object and/or to determine relative orientation or rotation with regard to the object, but also to identify the object and to distinguish it, e.g., from other objects.
(34) Alternatively or additionally, the evaluation means 22 may be configured to evaluate the picture in terms of the surface code 48, and to perform, on the basis of the surface code 48, surface identification of a surface of the object 14. The control means 28 may be configured to control the drive means 32 on the basis of the surface identification. This enables the device 10 to not only identify and/or to move toward and/or to avoid the object 14, but also to move toward or circumnavigate a specific side or surface of the object 14. For example, the object 14 may be provided, on different sides, with the same object code but with different surface codes, which makes it possible to move toward a specific side of the object 14. For example, the device 10 may be informed that a specific side of the object 14 offers the possibility of accommodating or receiving energy, information, or objects, and/or of effecting mechanical coupling to the device 10. Driving toward a specific side of the object 14 may thus be relevant to the device 10, which is made possible by the fact that the sides can be distinguished by means of the surface code 48.
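The control behaviour described in paragraphs (33) and (34) reduces to a simple decision over the decoded identifiers. This is an illustrative sketch of that decision logic, with invented command names; the patent does not prescribe these actions.

```python
def drive_command(decoded, target_object, target_surface):
    """Coarse decision logic for the control means: approach only when
    both the object code and the surface code match the commanded
    target; otherwise ignore the object or circumnavigate it to find
    the desired side. 'decoded' maps "object"/"surface" to integers."""
    if decoded["object"] != target_object:
        return "ignore"          # a different object entirely
    if decoded["surface"] != target_surface:
        return "circumnavigate"  # right object, wrong side
    return "approach"            # right object and right side
```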
(35) For example, if patterns in accordance with
(36) Orientations of the directions a and b with regard to the directions x/y/z may be arbitrary within this context. Without any limitations, the pattern 16 may be arranged, for example, on the object 14 and/or may be rotated there. It may also be possible for a spatial relative location of the object 14, for example a rotation about the y axis, to be unambiguously determinable from the location of the marking areas 24.sub.1, 24.sub.2 and 24.sub.3.
(38) In mutually oppositely located pattern areas 44.sub.1 and 44.sub.3 as well as 44.sub.2 and 44.sub.4, redundant information which is identical in each case may be rendered, for example the object code 46, on the one hand, and the surface code 48, on the other hand. Other embodiments provide for encoding mutually different information in oppositely located pattern areas 44.sub.1 and 44.sub.3, and 44.sub.2 and 44.sub.4, respectively. Redundancy enables avoidance of errors, in particular with expected mechanical impairments of the pattern 16.
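The redundancy scheme of paragraph (38) can be exploited with a simple cross-check: a code is accepted when both oppositely located pattern areas agree, and a single surviving copy is used when the other is damaged. This is one plausible policy, sketched for illustration; the patent does not specify the arbitration rule.

```python
def read_with_redundancy(area, opposite_area):
    """Arbitrate between two redundant, oppositely located pattern
    areas. 'None' stands for an unreadable (e.g. damaged) area.
    Conflicting readings are rejected rather than guessed."""
    if area is not None and opposite_area is not None:
        return area if area == opposite_area else None  # conflict -> reject
    return area if area is not None else opposite_area  # fall back to survivor
```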
(39) The evaluation means 22 may be configured to read out the pattern 16 similarly to a QR code. Other patterns in accordance with embodiments described herein provide different encoding, for example as defined by a barcode or other graphic information representations. This includes configuring the evaluation means 22 accordingly.
(40) With renewed reference to
(41) The pattern areas 44 within which the codes 46 and/or 48 are depicted may be determined in that a displacement of a marking area 24 along one of the directions a or b, specifically toward an adjacent marking area, is observed. Within this context, the pattern area 44 may be understood as the surface area within the pattern 16 which is passed over by the displacement along the straight line in parallel with the direction a or b and which is located outside the marking areas 24 at the same time. For example, if a displacement of the marking area 24.sub.1 along the direction a is observed, the dimension b.sub.1 may determine, in this respect, e.g. a width or dimension along the direction b which is available to the pattern area 44. A length a.sub.2, across which the displacement takes place, may determine a dimension of the pattern area 44.sub.1 along the direction a. This means that along one direction, an expansion of the pattern area may be determined by the length of the displacement and, along a direction perpendicular thereto, by a corresponding dimension of the marking area.
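The construction in paragraph (41) can be stated geometrically: the pattern area is the rectangle swept when one axis-aligned marking area is displaced along a straight line toward its neighbour, minus the marking areas themselves. The sketch below assumes axis-aligned markers given as (x, y, width, height) rectangles in pattern coordinates; the representation is an assumption for illustration.

```python
# Sketch of the pattern-area construction: the area swept between two
# marking areas displaced along the x axis (direction a). The swept
# width comes from the displacement length, the height from the
# marker's own dimension (b_1 in the text above).

def pattern_area_between(marker_a, marker_b):
    """Rectangle (x, y, width, height) spanned between two markers
    that lie on a common horizontal line, excluding the markers."""
    ax, ay, aw, ah = marker_a
    bx, by, bw, bh = marker_b
    x0 = ax + aw                  # start just past the first marker...
    x1 = bx                       # ...and stop at the second one
    return (x0, ay, x1 - x0, ah)  # width from displacement, height from marker
```

The analogous construction along the y axis (direction b) yields the perpendicular pattern area.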
(42) The evaluation means 22 may be configured to verify the picture in terms of additional information such as the codes 46 and 48 exclusively within the pattern areas 44.sub.1 to 44.sub.4. Within this context, the evaluation means may be configured to verify one, two or more pattern areas 44. Advantageous embodiments relate to axial extensions of the pattern areas 44 along the main directions, which spatially extend in a linearly independent manner, such as the mutually perpendicular directions a and b, which excludes arranging codes on the diagonal of the triangle 38. In accordance with these embodiments, different pattern areas 44 are arranged exclusively perpendicularly or in parallel with one another. This enables clear, fast, and robust localization of codes by means of the pattern areas 44.sub.1 to 44.sub.3 in
(43)
(44) A pattern 16.sub.1, 16.sub.2 and/or 16.sub.3 may be arranged on one or several or all sides of the device 50. The patterns may be completely or partly identical, e.g. while using a pattern of
(45) The device 50 may be configured such that a side-individual pattern is attached on at least one, several or all sides 58. Attachment may be effected by means of adhesion or a bonding agent, but may also be effected integrally with a sidewall, for example by means of engraving or some other lithography technique.
(46) The device 50 may be configured to orient itself by a corresponding two-dimensional pattern attached on the object 14, e.g. on a different device. While using the mechanical coupling means 54.sub.1 or 54.sub.2, the device 50 may establish a mechanical connection to the corresponding mechanical coupling means.
(47) For high flexibility of the mechanical connection, it may be advantageous to configure the device 50 such that a base body of the device 50 spans a polygon surface. The polygon surface may comprise at least three sides and may be formed to be regular or irregular, advantageously in a manner which enables achieving high area density when connecting it to several devices. For example, regular hexagons or regular octagons are suitable for this purpose. It is advantageous for the device 50 to comprise a mechanical coupling means on at least two possibly oppositely located surfaces.
(48) Implementation of the base body as a polygon surface does not prevent elements such as wheels of the drive means, the mechanical coupling means or the like from jutting out from one or more sides. However, such elements are advantageously implemented so that they cause no blockage or hindrance when the mechanical coupling is effected.
(49) In accordance with embodiments, the device comprises a pattern 16 at least on one side on which the mechanical coupling means 54.sub.1 or 54.sub.2 is arranged. This enables other devices to find sides of the device 50 which are designed for mechanical coupling.
(50) Even though
(51)
(52)
(53) For example, if the device 10 is used once or several times in the device group 70, it may be extended by arranging a corresponding pattern, or device pattern, so as to give other devices the opportunity to orient themselves by that device. In this manner, it may be achieved that the plurality of devices 50.sub.1 to 50.sub.3 each comprise, on at least one surface, a device pattern indicating at least an object side or an object identity. The plurality of devices may be configured to orient themselves, in relation to one another, on the basis of the respective individually determined relative localization.
(54) Embodiments provide for the devices to be configured to sense the device pattern located on the device and to take up, on the basis of the object code and/or surface code, a relative location with regard to another device of the device group or to a surface thereof.
(55) Optionally, and as is shown for the device group 70, two or more, for example three or all of the devices may be mechanically coupled to one another, for example while using mechanical coupling means 54.
(56)
(57) Embodiments provide for the possibility to arrange mechanical coupling means along three spatial directions so as to form a three-dimensional group.
(58)
(59) One or more sides of a device 95.sub.1 and/or 95.sub.2 may be provided with the patterns 16, which may optionally comprise object codes and/or surface codes, or side codes.
(60) A further device 105 of the device group 90 may be mobile or immobile, and comprises, e.g., coupling means 54. The device 105 may be configured to exchange objects with devices 95.sub.1 and/or 95.sub.2, and/or to exchange information or energy, for example by means of the coupling means 54.
(61) In other words,
(62) In yet other words, outer surfaces of the vehicles have codes in the form of bit patterns attached thereto which, on the one hand, assign to each vehicle an unambiguous identification (ID) that may be identical across all of its sides and faces. On the other hand, each face of the vehicle obtains an unambiguous ID which may be recognized separately from the vehicle ID and may be identical across all vehicles, so that, e.g., the same sides of different vehicles may be encoded in the same manner. As a result of this separation, e.g., a side 1 of all vehicles may obtain the same bit sequence in each case, since the assignment stating to which vehicle the observed face belongs may be determined via the bit sequence of the vehicle ID. Alternatively or additionally, other entities with which the vehicle may interact may be made accessible in accordance with the same principle. For example, a reception or delivery station may be provided with a code, whereby said station may then be automatically recognized, or identified.
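The separation of vehicle ID and side ID can be sketched as two independent fields packed into one bit sequence. The field widths below (8 bits for the vehicle, 3 bits for the side) are illustrative assumptions; the text does not specify code lengths:

```python
VEHICLE_BITS = 8   # assumed field width, not specified in the text
SIDE_BITS = 3      # assumed field width, not specified in the text

def encode_ids(vehicle_id, side_id):
    """Pack a vehicle ID and a side ID into one bit sequence.  Because
    the side field is separate, the same side of different vehicles
    carries the same side bits."""
    assert 0 <= vehicle_id < 2 ** VEHICLE_BITS
    assert 0 <= side_id < 2 ** SIDE_BITS
    word = (vehicle_id << SIDE_BITS) | side_id
    return format(word, f"0{VEHICLE_BITS + SIDE_BITS}b")

def decode_ids(bits):
    """Recover (vehicle_id, side_id) from the packed bit sequence; the
    side ID is readable without knowing which vehicle is observed."""
    word = int(bits, 2)
    return word >> SIDE_BITS, word & (2 ** SIDE_BITS - 1)
```

With this layout, side 2 of vehicle 1 and side 2 of vehicle 7 share the same trailing side bits, while the leading vehicle bits disambiguate which vehicle the observed face belongs to.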
(63) The encoded bit sequences may be arranged in lines or two-dimensional areas and be attached between two markers, marking areas 24. For example,
(64) Embodiments enable simple and robust recognition of mutual orientations of vehicles. Separation between vehicle ID and side ID, in combination with the fact that information is encoded only in the area located between two markers, enables robust recognition even in the event of partial concealment of the face on which the code is attached. As soon as two markers are recognized, the information between them can be read out. Consequently, the vehicle ID may still be determined even if part of the code for the face ID is concealed. By accommodating the bits between the markers, it is not necessary to transform the entire image area spanned by the markers, as is the case with QR codes, for example. It may be sufficient to apply the transformation to a line, or a straight line, within the image, which is many times easier and faster. Since the code makes it possible not only to provide information about the alignment of another participant but also to estimate the distance from the latter, the method may be used for achieving localization of other participants with regard to one's own position. With QR codes (and codes which work similarly), part of the existing bits is used for enabling and validating the image transformation of the code (synchronization line), and another part is used for representing the type of the QR code (number of bits used, etc.). Embodiments are not dependent on this since the code is specifically tailored to the field of application of said devices.
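The one-dimensional read-out and the distance estimate described above can be sketched as follows. This is a simplified illustration, assuming the two markers have already been detected on a single image row, that the row is given as grey values with a fixed binarization threshold, and that a pinhole-camera model with known focal length and real marker spacing is used for the distance estimate; all names and parameters are assumptions:

```python
def read_code_between_markers(row, marker_positions, n_bits):
    """Sample n_bits at equidistant bit-cell centres on an image row
    between two detected markers; only this 1-D span is transformed,
    not the full 2-D area spanned by the markers."""
    x0, x1 = marker_positions
    span = x1 - x0
    bits = []
    for i in range(n_bits):
        # centre of the i-th bit cell between the markers
        x = x0 + span * (2 * i + 1) / (2 * n_bits)
        bits.append(1 if row[round(x)] > 127 else 0)
    return bits

def estimate_distance(marker_spacing_px, real_spacing_m, focal_px):
    """Pinhole-camera estimate: distance = f * real width / pixel width."""
    return focal_px * real_spacing_m / marker_spacing_px
```

Because the code is read along a single straight line, partial concealment outside that line does not affect the read-out, and the apparent marker spacing directly yields a range estimate to the observed participant.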
(65) Embodiments enable robust and fast recognition of participants or objects with mobile robots which have cameras. Such embodiments are of interest, in particular, for mobile robots at stations in intralogistics so as to obtain, in a pick-up or transfer operation, in addition to information about the mere position, information about the respective alignments of the other participants with regard to one's own alignment.
(66) Even though some aspects have been described within the context of a device, it is understood that said aspects also represent a description of the corresponding method, so that a block or a structural component of a device is also to be understood as a corresponding method step or as a feature of a method step. By analogy therewith, aspects that have been described in connection with or as a method step also represent a description of a corresponding block or detail or feature of a corresponding device.
(67) Depending on specific implementation requirements, embodiments of the invention may be implemented in hardware or in software. Implementation may be effected while using a digital storage medium, for example a floppy disc, a DVD, a Blu-ray disc, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, a hard disc or any other magnetic or optical memory which has electronically readable control signals stored thereon which may cooperate, or cooperate, with a programmable computer system such that the respective method is performed. This is why the digital storage medium may be computer-readable. Some embodiments in accordance with the invention thus comprise a data carrier which comprises electronically readable control signals that are capable of cooperating with a programmable computer system such that any of the methods described herein is performed.
(68) Generally, embodiments of the present invention may be implemented as a computer program product having a program code, the program code being effective to perform any of the methods when the computer program product runs on a computer. The program code may also be stored on a machine-readable carrier, for example.
(69) Other embodiments include the computer program for performing any of the methods described herein, said computer program being stored on a machine-readable carrier.
(70) In other words, an embodiment of the inventive method thus is a computer program which has a program code for performing any of the methods described herein, when the computer program runs on a computer. A further embodiment of the inventive methods thus is a data carrier (or a digital storage medium or a computer-readable medium) on which the computer program for performing any of the methods described herein is recorded.
(71) A further embodiment of the inventive method thus is a data stream or a sequence of signals representing the computer program for performing any of the methods described herein. The data stream or the sequence of signals may be configured, for example, to be transferred via a data communication link, for example via the internet.
(72) A further embodiment includes a processing means, for example a computer or a programmable logic device, configured or adapted to perform any of the methods described herein.
(73) A further embodiment includes a computer on which the computer program for performing any of the methods described herein is installed.
(74) In some embodiments, a programmable logic device (for example a field-programmable gate array, an FPGA) may be used for performing some or all of the functionalities of the methods described herein. In some embodiments, a field-programmable gate array may cooperate with a microprocessor to perform any of the methods described herein. Generally, the methods are performed, in some embodiments, by any hardware device. Said hardware device may be universally applicable hardware such as a computer processor (CPU), or hardware specific to the method, such as an ASIC.
(75) While this invention has been described in terms of several embodiments, there are alterations, permutations, and equivalents which fall within the scope of this invention. It should also be noted that there are many alternative ways of implementing the methods and compositions of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations and equivalents as fall within the true spirit and scope of the present invention.