METHOD FOR CONTROLLING A MACHINE BY MEANS OF AT LEAST ONE SPATIAL COORDINATE AS CONTROL VARIABLE AND CONTROL SYSTEM OF A MACHINE

20190333230 · 2019-10-31

    Abstract

    A machine is controlled using at least one spatial coordinate as a control variable. Controlling a machine using at least one spatial coordinate as a control variable may include determining a vectorial spatial coordinate by means of a two-dimensional code applied to a carrier plane and readable by means of an optical image processing system, and transmitting the vectorial spatial coordinate as a control variable to a control system of the machine. The spatial position of a normal vector perpendicular to the area centroid of the code may be determined by an image processing system, and the angle of rotation of a rotational movement of the carrier plane of the code about an axis of rotation perpendicular to the carrier plane may be detected by the image processing system, the length of the normal vector being determined from the angle of rotation.

    Claims

    1. A method of controlling a machine, comprising: visually detecting a two-dimensional code on a plane of a carrier medium; determining a spatial position of a normal vector perpendicular to the centroid of the area of the two-dimensional code; detecting an angle of rotation of a rotational movement of the plane about an axis of rotation perpendicular to the plane; determining a length of the normal vector based on the angle of rotation; determining a vectorial spatial coordinate from the spatial position of the normal vector and the length of the normal vector; and transmitting the vectorial spatial coordinate as a control variable to a control system of the machine.

    2. The method according to claim 1, further comprising: detecting a direction of rotation of the rotational movement of the plane; and determining a direction of orientation of the normal vector with respect to the plane from the detected direction.

    3. The method according to claim 1, further comprising: detecting a rotation of the plane about an axis of rotation parallel to the plane; and inverting an orientation direction of the normal vector with respect to the carrier plane using the detected rotation.

    4. The method according to claim 1, further comprising: determining a direction of orientation of the normal vector with respect to the plane by reading out and decoding the code.

    5. The method according to claim 1, further comprising: receiving image data from at least one camera of an optical image processing system; evaluating the image data for the presence of color marks; grouping recognized color marks into color mark groups; determining two-dimensional coordinates of the color marks belonging to at least one of the color mark groups in a coordinate system assigned to the camera; transforming the two-dimensional coordinates of the color marks of the at least one color mark group into a three-dimensional coordinate system assigned to the machine; and determining the normal vector based at least in part on the center of gravity of the area spanned by the color marks of the at least one color mark group.

    6. The method according to claim 5, further comprising: producing at least one bit mask for evaluating the image data, which bit mask is matched to key colors included in the color marks.

    7. The method according to claim 5, wherein each color mark is light-emitting.

    8. The method according to claim 1, wherein the code is designed as a two-dimensional arrangement of at least two color marks, each color mark being set up to display at least two individual color states for the respective color mark, and one of the color marks additionally being set up to change at a carrier frequency between a first and a second color state.

    9. The method according to claim 1, wherein the method is used within an augmented reality model for visualizing objects that can be identified in a virtual space using the vectorial spatial coordinate.

    10. A system for controlling a machine, comprising: an image processing device that reads a two-dimensional code on a plane of a carrier medium, determines a spatial position of a normal vector perpendicular to the centroid of the area of the two-dimensional code, and detects an angle of rotation of a rotational movement of the plane about an axis of rotation perpendicular to the plane; and one or more control components that determine a length of the normal vector based on the angle of rotation, determine a vectorial spatial coordinate from the spatial position of the normal vector and the length of the normal vector, and control transmitting the vectorial spatial coordinate as a control variable to a control system of the machine.

    11. The system according to claim 10, wherein the image processing device detects a direction of rotation of the rotational movement of the plane; and wherein a direction of orientation of the normal vector with respect to the plane is determined from the detected direction.

    12. The system according to claim 10, wherein the image processing device detects a rotation of the plane about an axis of rotation parallel to the plane, and wherein an inversion of an orientation direction of the normal vector with respect to the carrier plane uses the detected rotation.

    13. The system according to claim 10, wherein the image processing device reads out the code, and wherein a direction of orientation of the normal vector with respect to the plane is determined by decoding the code.

    14. The system according to claim 10, wherein the image processing device receives image data from at least one camera of an optical image processing system, and wherein the one or more control components evaluate the image data for the presence of color marks, group recognized color marks into color mark groups, determine two-dimensional coordinates of the color marks belonging to at least one of the color mark groups in a coordinate system assigned to the camera, transform the two-dimensional coordinates of the color marks of the at least one color mark group into a three-dimensional coordinate system assigned to the machine, and determine the normal vector based at least in part on the center of gravity of the area spanned by the color marks of the at least one color mark group.

    15. The system according to claim 14, wherein the one or more control components produce at least one bit mask for evaluating the image data, which bit mask is matched to key colors included in the color marks.

    16. The system according to claim 14, wherein each color mark is light-emitting.

    17. The system according to claim 10, wherein the code is designed as a two-dimensional arrangement of at least two color marks, each color mark being set up to display at least two individual color states for the respective color mark, and one of the color marks additionally being set up to change at a carrier frequency between a first and a second color state.

    18. The system according to claim 10, further comprising: an augmented reality model for visualizing objects that can be identified in a virtual space using the vectorial spatial coordinate.

    19. A method of controlling a machine, comprising: visually detecting a two-dimensional code on a plane of a carrier medium; determining a vectorial spatial coordinate from the detected two-dimensional code; and transmitting the vectorial spatial coordinate as a control variable to a control system of the machine.

    20. The method of claim 19, wherein determining a vectorial spatial coordinate from the detected two-dimensional code includes: determining a spatial position of a normal vector perpendicular to the centroid of the area of the two-dimensional code; detecting an angle of rotation of a rotational movement of the plane about an axis of rotation perpendicular to the plane; determining a length of the normal vector based on the angle of rotation; and determining the vectorial spatial coordinate from the spatial position of the normal vector and the length of the normal vector.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0051] The system described herein is explained in more detail below using an example and the drawings, which show the following:

    [0052] FIG. 1 is a schematic representation of a control procedure according to an embodiment of the system described herein;

    [0053] FIG. 2 is an alternative structure for carrying out the control procedure according to an embodiment of the system described herein;

    [0054] FIG. 3 is a schematic structure of a smartphone display set up as a display device for a dynamic code according to an embodiment of the system described herein; and

    [0055] FIG. 4 is a schematic representation of a coded signal sequence k1 to k8 according to an embodiment of the system described herein.

    DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS

    [0056] FIG. 1 shows a schematic representation of a method of control according to an embodiment of the system described herein. A gesture generator (G) has a sample card (5) which has a machine-readable two-dimensional code printed on one side. Alternatively, the sample card may have a machine-readable two-dimensional code printed on both sides, the contents of the two codes differing from each other in at least one information element. The sample card (5) (or, more specifically, the surface of the sample card bearing the code) defines a carrier plane (HE). As an alternative to a sample card, a display of a smartphone or tablet PC also may be provided.

    [0057] In this embodiment, the machine has an optical image processing system with a camera (100). This camera (100) has a field of view (KSB) which is essentially determined by the viewing direction or optical axis (KR) of the camera, as is known in the art. In a neutral basic position, the carrier plane (HE) is essentially aligned at a right angle to the camera axis (KR).

    [0058] The gesture generator (G) then may use the sample card (5) to define a virtual vector, which points to a spatial coordinate (Z) starting from the area centroid of the code applied to the surface of the sample card facing the camera. In a first step, the sample card (5) may be tilted in space so that the normal vector (N+) is oriented in the direction of the target spatial coordinate (Z). The spatial coordinate may be any point within the first half-space (HR1) facing the camera or within the field of view of the camera (KSB). To control points in the second half-space (HR2) remote from the camera, the direction of the normal vector (N+) may be reversed. This changeover may be effected by a rotary movement in the opposite direction (in the case of a code applied to one side of the carrier medium) or alternatively (in the case of codes applied to two sides of the carrier medium) by turning the carrier medium over and then decoding the coded content.

    [0059] In a second process step, the length of the vector may be set to the length required to reach the target spatial coordinate (Z) by means of a rotational movement (R). To improve usability, rotation angles in the range from −30° to +30° may not be translated into control information (i.e., the length of the vector is not changed for rotational movements within this angle range). For angles of rotation greater than 30° in magnitude, the vector length may be continuously shortened or lengthened, the rate of change increasing disproportionately with increasing angle of rotation.
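    The angle-to-rate mapping described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the ±30° dead zone comes from the text, while the quadratic growth and the `gain` parameter are assumptions standing in for "disproportionately increasing".

```python
import math

# Illustrative sketch: map a detected rotation angle (degrees) to a signed
# rate of change for the vector length. Angles inside the +/-30 degree dead
# zone leave the length unchanged; beyond it, the rate grows super-linearly
# (here: quadratically in the excess angle, an assumed choice).
DEAD_ZONE_DEG = 30.0

def length_change_rate(rotation_deg: float, gain: float = 0.01) -> float:
    """Return the signed rate at which the vector is lengthened (positive)
    or shortened (negative) for a given rotation angle in degrees."""
    excess = abs(rotation_deg) - DEAD_ZONE_DEG
    if excess <= 0.0:
        return 0.0  # inside the dead zone: no change to the vector length
    rate = gain * excess ** 2  # disproportionate growth with the angle
    return math.copysign(rate, rotation_deg)
```

With this choice, doubling the excess angle quadruples the rate, which gives coarse positioning at large rotations and fine positioning near the dead zone.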

    [0060] As soon as the target spatial coordinate (Z) is determined in this way, a dependent process may be started by forwarding the control variables based on this spatial coordinate to a further processing device of the machine to be controlled. Such control may include, e.g., movement of the machine in the direction of the target spatial coordinate (Z) or identification by the machine of a component related to this spatial coordinate.
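    Geometrically, the target coordinate follows directly from the quantities the image processing system has determined. A sketch, assuming the centroid and unit normal are already expressed in the machine's 3D coordinate frame (the function name and frame convention are assumptions):

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

def target_coordinate(centroid: Vec3, unit_normal: Vec3, length: float) -> Vec3:
    """Z = centroid + length * unit_normal: the point the gesture designates,
    starting from the area centroid of the code along the normal vector."""
    return tuple(c + length * n for c, n in zip(centroid, unit_normal))
```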

    [0061] Furthermore, the further processing device of the machine may synchronize the visualization process with data glasses, whereby both the target spatial coordinate (Z) as well as the vector and the identified component may be displayed in the field of vision of the data glasses.

    [0062] FIG. 2 shows an alternative structure for the execution of a procedure according to an embodiment of the system described herein, in which the direction of the camera (100) is oriented away from the gesture generator (G). This may be the case, for example, if the gesture generator holds the camera (e.g., integrated in a smartphone) with a first hand and the carrier medium (5) of the code with a second hand and points the camera at the code.

    [0063] Embodiments of the system described herein are not only applicable in connection with static codes, but also in connection with dynamic two-dimensional codes. FIG. 3 shows the schematic structure of a display device, which is part of a system for authenticating a user to a central instance for releasing user-specific authorizations. In addition to the determination of the control variables according to the system described herein, an authorization check of the user also may take place at the same time. The carrier medium for the code may be formed by the display of a conventional smartphone, which, after activation of a corresponding software application stored on the smartphone, may divide the display area into four rectangular segments of approximately equal size, arranged horizontally and vertically in pairs. Each of these segments may form a color mark (t1, t2, t3, t4). Each of these color marks (t1, t2, t3, t4) may be set up to display two individual color states. A first color mark (t1) may be set up to alternately display the gray and black color states; the remaining color marks may alternate as follows:

    [0064] second color mark (t2) between green and yellow;

    [0065] third color mark (t3) between orange and red; and

    [0066] fourth color mark (t4) between purple and turquoise.
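    The four color marks and their two states each can be captured in a small lookup table. A sketch using the color pairs named above (the dictionary layout and binary state encoding are assumptions for illustration):

```python
# The four segments of the dynamic code and their two color states each,
# as listed in the description. State bit 0 selects the first color of the
# pair, state bit 1 the second.
COLOR_STATES = {
    "t1": ("gray", "black"),     # carrier mark: toggles at the carrier frequency
    "t2": ("green", "yellow"),   # data marks t2..t4 encode the payload
    "t3": ("orange", "red"),
    "t4": ("purple", "turquoise"),
}

def displayed_color(mark: str, state_bit: int) -> str:
    """Return the color a mark shows for a given binary state (0 or 1)."""
    return COLOR_STATES[mark][state_bit]
```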

    [0067] For the data-carrying color marks (t2, t3, t4), saturated colors (e.g., distinct hue angles) may be used. The color tones may be selected so that the color states of the color marks are displayed with approximately the same brightness in order to avoid glare effects in the receiving device.
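    One way to obtain saturated hues of roughly equal brightness, as suggested above, is to hold the lightness fixed in an HLS color model and vary only the hue angle. A sketch (the HLS approach and the specific lightness value are assumptions, not from the patent text):

```python
import colorsys

def hue_to_rgb(hue_deg: float, lightness: float = 0.5) -> tuple:
    """Return an 8-bit RGB triple for a fully saturated hue at a fixed
    HLS lightness, so all generated color states share the same lightness."""
    r, g, b = colorsys.hls_to_rgb(hue_deg / 360.0, lightness, 1.0)
    return (round(r * 255), round(g * 255), round(b * 255))
```

Note that equal HLS lightness only approximates equal perceived brightness; a perceptually uniform color space would be needed for a stricter match.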

    [0068] Thus, all color marks (t1, t2, t3, t4) of this two-dimensional arrangement may show at any time, i.e., independent of their current display state, a color state that may be clearly assigned to the respective color mark. The last three color marks (t2, t3, t4) may be designed in a manner known from the state of the art to display optically coded information by means of color changes. In an embodiment of the system described herein, the additional first color mark (t1) has color states that change at a predeterminable frequency (sometimes referred to herein as the carrier frequency), this carrier frequency corresponding to the color change frequency of the other color marks (t2, t3, t4). By means of a conventional camera (not shown in this example for reasons of clarity), the central release instance may receive the image emitted in this way from the display and, in addition to determining the control variables, evaluate the image with regard to the authentication information encoded in it by color changes.
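    Because the carrier mark t1 toggles once per symbol period, its state changes can clock the sampling of the data marks on the receiver side. A sketch of such a decoder (an assumed illustration; the patent text does not prescribe this decoding logic), with camera frames represented as dictionaries of current state bits:

```python
def decode_symbols(frames):
    """Sample the data marks t2..t4 once per toggle of the carrier mark t1.

    `frames` is an iterable of dicts like {"t1": 0, "t2": 1, "t3": 0, "t4": 1}.
    Returns one (t2, t3, t4) tuple per detected carrier transition.
    """
    symbols = []
    last_carrier = None
    for frame in frames:
        carrier = frame["t1"]
        if carrier != last_carrier:  # t1 changed: a new symbol period began
            symbols.append((frame["t2"], frame["t3"], frame["t4"]))
            last_carrier = carrier
        # frames with an unchanged carrier merely repeat the current symbol
    return symbols
```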

    [0069] FIG. 4 shows color states (c11, . . . c42) displayed on the color marks (t1, t2, t3, t4) of the states k1, k2 . . . k8 over time, in accordance with an embodiment of the system described herein. Each color mark t.sub.i may alternate between its two characteristic color states c.sub.i1 and c.sub.i2 according to a pattern determined by the content of the coded identification data, with the exception of the color mark t1, which may alternate between its two color states c11 and c12 at a fixed carrier frequency. However, the color change of each color mark between a first state k.sub.i and a second state k.sub.i+1 following it in time may not take place in absolute synchronicity with the respective state changes of the other color marks shown on the display. This may be caused by the use of complex software and hardware components, such as a graphics library or the display technology of the display. This means that an image representation may be built up exactly during the change of state from the first state k.sub.i to the second state k.sub.i+1 and then in the result may partly represent the old state k.sub.i, but partly also the new state k.sub.i+1. This is also favored by the fact that the respective changes between the two color states c.sub.i1 and c.sub.i2 may not occur in an absolutely seamless manner, i.e., not immediately or abruptly, but require a certain period of time. The switching edges between the two color states therefore may not be vertical in reality; rather, the switching processes may be oblique and steady when viewed with sufficient precision.

    [0070] As soon as a status change for the color mark (t1) assigned to the carrier signal is detected on the receiver side, a status change in the form of a color state deviating from the previous state k.sub.i should also be detected for each of the other color marks (t2, t3, t4); otherwise, a faulty image may be present. On the receiver side, images may therefore be discarded unless at least two consecutive images represent the same state; in this way, the receiving device may detect the presence of an erroneous intermediate image and reject it.
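    The rejection rule above can be sketched as a simple debounce: a decoded state is accepted only once two consecutive camera frames report it identically, so torn intermediate frames never enter the accepted sequence. This is an assumed illustration of the described behavior, not code from the patent:

```python
def validated_states(decoded_frames):
    """Return the sequence of states confirmed by two identical consecutive
    frames; single-frame (torn or erroneous) states are discarded."""
    accepted = []
    previous = None
    for state in decoded_frames:
        if state == previous and (not accepted or accepted[-1] != state):
            accepted.append(state)  # confirmed by repetition, not yet recorded
        previous = state
    return accepted
```

For example, a torn frame sandwiched between two stable states is silently dropped, since it never repeats.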

    [0071] Various embodiments of the system described herein may be implemented using software, firmware, hardware, a combination of software, firmware and hardware and/or other computer-implemented modules, components or devices having the described features and performing the described functions. Software implementations of embodiments of the invention may include executable code that is stored on one or more computer-readable media and executed by one or more processors. Each of the computer-readable media may be non-transitory and include a computer hard drive, ROM, RAM, flash memory, portable computer storage media such as a CD-ROM, a DVD-ROM, a flash drive, an SD card and/or other drive with, for example, a universal serial bus (USB) interface, and/or any other appropriate tangible or non-transitory computer-readable medium or computer memory on which executable code may be stored and executed by a processor. Embodiments of the invention may be used in connection with any appropriate operating system.

    [0072] Other embodiments of the invention will be apparent to those skilled in the art from a consideration of the specification and/or an attempt to put into practice the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.