SENSOR FOR SECURING A MACHINE
20190007659 · 2019-01-03
CPC classification
B25J9/1676 (PERFORMING OPERATIONS; TRANSPORTING)
G01B11/14 (PHYSICS)
G06V20/52 (PHYSICS)
H04N13/239 (ELECTRICITY)
H04N13/271 (ELECTRICITY)
G05B2219/40203 (PHYSICS)
International classification
H04N7/18 (ELECTRICITY)
G01B11/14 (PHYSICS)
H04N13/239 (ELECTRICITY)
H04N13/271 (ELECTRICITY)
Abstract
A safe optoelectronic sensor (10) is provided for securing a monitored zone (12) comprising at least one machine (26) that forms a hazard area (26a-b), wherein the sensor (10) has a light receiver (16a-b) for generating a received signal from received light from the monitored zone (12); a control and evaluation unit (24) for detecting object positions (28) in the monitored zone (12) from the received signal; and a safe output interface (30) for information acquired from the object positions (28). The control and evaluation unit (24) is here configured to determine the shortest distance between the hazard area (26a-b) and object positions (28) and to provide it at the safe output interface (30).
Claims
1. A safe optoelectronic sensor for securing a monitored zone, the monitored zone comprising at least one machine that forms a hazard area, the safe optoelectronic sensor having: a light receiver for generating a received signal from received light from the monitored zone; a control and evaluation unit for detecting object positions in the monitored zone from the received signal; and a safe output interface for information acquired from the object positions, wherein the control and evaluation unit is configured to determine the shortest distance between the hazard area and object positions and to provide the shortest distance at the safe output interface.
2. The safe optoelectronic sensor in accordance with claim 1, wherein the safe optoelectronic sensor is a 3D camera.
3. The safe optoelectronic sensor in accordance with claim 1, wherein the control and evaluation unit is configured to envelop the machine comprising at least one hazard area.
4. The safe optoelectronic sensor in accordance with claim 1, wherein the control and evaluation unit is configured to ignore object positions within hazard areas.
5. The safe optoelectronic sensor in accordance with claim 1, wherein the control and evaluation unit is configured to monitor changing hazard areas.
6. The safe optoelectronic sensor in accordance with claim 5, wherein the control and evaluation unit is configured to monitor changing hazard areas for different machine states within a process routine.
7. The safe optoelectronic sensor in accordance with claim 1, wherein the control and evaluation unit is configured to provide at least one piece of additional information at the output interface, with the at least one piece of additional information comprising at least one further shortest distance from other sections of the closest object or other objects, an object position, a direction of movement, a speed, an object envelope, or an object cloud.
8. The safe optoelectronic sensor in accordance with claim 1, wherein the safe optoelectronic sensor is configured for a detection capacity in which objects from a minimum size onward are reliably detected, with only objects of the minimum size being considered for the determination of the shortest distance.
9. The safe optoelectronic sensor in accordance with claim 8, wherein the control and evaluation unit is configured to define hazard areas in the form of at least one sphere covering the machine.
10. The safe optoelectronic sensor in accordance with claim 8, wherein the control and evaluation unit is configured to model objects as spheres having centers at the position of the object and radii corresponding to the minimum size in accordance with the detection capacity.
11. The safe optoelectronic sensor in accordance with claim 9, wherein the control and evaluation unit is configured to model objects as spheres having centers at the position of the object and radii corresponding to the minimum size in accordance with the detection capacity.
12. The safe optoelectronic sensor in accordance with claim 1, wherein the control and evaluation unit is configured to add the projective shadow of the hazard area and/or of the object to said hazard area and/or object for the determination of the shortest distance.
13. The safe optoelectronic sensor in accordance with claim 12, wherein the control and evaluation unit is configured to model projective shadows as cones.
14. The safe optoelectronic sensor in accordance with claim 12, wherein the control and evaluation unit is configured, for the determination of the shortest distance, to add only the projective shadow of the object to said object when the object is located closer to the safe optoelectronic sensor than the hazard area and, conversely, to add only the projective shadow of the hazard area to said hazard area when the hazard area is located closer to the safe optoelectronic sensor.
15. The safe optoelectronic sensor in accordance with claim 12, wherein the control and evaluation unit is configured, for the determination of the shortest distance, to model the respective closer partner of a pair of hazard area and object in the form of a first sphere together with the cone as a projective shadow and to model the more remote partner as a second sphere.
16. The safe optoelectronic sensor in accordance with claim 15, wherein the control and evaluation unit is configured to use a sectional plane that is defined by the position of the safe optoelectronic sensor, the center of the first sphere, and the center of the second sphere for the determination of the shortest distance.
17. An arrangement of at least one safe optoelectronic sensor and of a control, the safe optoelectronic sensor being provided to secure a monitored zone, the monitored zone comprising at least one secured machine that forms a hazard area having: a light receiver for generating a received signal from received light from the monitored zone; a control and evaluation unit for detecting object positions in the monitored zone from the received signal; and a safe output interface for information acquired from the object positions, wherein the control and evaluation unit is configured to determine the shortest distance between the hazard area and object positions and to provide the shortest distance at the safe output interface; and the control being connected to the output interface and to the secured machine, wherein the control is configured to evaluate shortest distances provided by the safe optoelectronic sensor and to initiate a safety directed response as required.
18. A method of securing a monitored zone comprising at least one machine that forms a hazard area, wherein a received signal is generated and evaluated from received light from the monitored zone to detect object positions in the monitored zone, and wherein information acquired from the object positions is reliably output, wherein the shortest distance between the hazard area and the object positions is determined and is provided as safe information.
Description
[0051] The invention will be explained in more detail in the following, also with respect to further features and advantages, by way of example with reference to embodiments and to the enclosed drawing.
[0060] Two camera modules 14a, 14b of the stereo camera 10 are mounted at a known fixed distance from one another and each take images of the spatial zone 12 to be detected. An image sensor 16a, 16b, usually a matrix-type recording chip such as a CCD or a CMOS sensor, is provided in each camera module and records a rectangular pixel image. The two image sensors 16a, 16b together form a 3D image sensor for detecting a depth map. An objective 18a, 18b having imaging optics, which in practice can be realized as any known imaging lens, is associated with each of the image sensors 16a, 16b. The maximum angle of view of these optics is indicated in the drawing.
[0061] An illumination unit 22 is provided between the two image sensors 16a, 16b to illuminate the spatial zone 12 with a structured pattern. The stereo camera shown is accordingly configured for active stereoscopy in which the pattern also imparts evaluable contrasts everywhere to scenery that is structure-less per se.
[0062] Alternatively, no illumination or a homogeneous illumination is provided to evaluate the natural object structures in the spatial zone 12, which as a rule, however, results in additional image defects.
[0063] An evaluation and control unit 24 is associated with the two image sensors 16a, 16b and the illumination unit 22. The control and evaluation unit 24 can be implemented in the most varied hardware, for example in digital modules such as microprocessors, ASICs (application specific integrated circuits), FPGAs (field programmable gate arrays), GPUs (graphics processing units) or mixed forms thereof that can be distributed over any desired internal and external components, with external components also being able to be integrated via a network or cloud provided that latencies can be managed or tolerated. Since the generation of the depth map and its evaluation is very computing intensive, an at least partly parallel architecture is preferably formed.
[0064] The control and evaluation unit 24 generates the structured illumination pattern with the aid of the illumination unit 22 and receives image data of the image sensors 16a, 16b. It calculates the 3D image data or the depth map of the spatial zone 12 from these image data with the aid of a stereoscopic disparity estimate. The total detectable spatial zone 12 or also the working region can be restricted via a configuration, for example to mask interfering or unnecessary regions.
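For orientation, the depth value behind each disparity follows from standard stereo triangulation (textbook geometry, not a formula taken from this patent): with the baseline b given by the fixed distance between the camera modules 14a, 14b, the focal length f, and the disparity d estimated per pixel,

$$Z = \frac{f \cdot b}{d}.$$

Large disparities thus correspond to near objects, which is why the mounting distance of the camera modules must be known exactly.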
[0065] An important safety engineering application of the stereo camera 10 is the monitoring of a machine 26 that is symbolized by a robot in the drawing.
[0066] The control connected to the safe interface 30, either a higher ranking control or that of the machine 26, evaluates the shortest distance. In the hazard case, a safety directed response is initiated in order, for example, to stop or brake the machine 26 or to cause it to evade. Whether this is necessary can depend, in addition to the shortest distance, on further conditions such as the speeds or the nature of the object 28 and of the machine zone affected by the impending collision.
[0067] It will now be described in detail in the following how distance monitoring is made possible with the sensor 10, for example for a human-robot collaboration while considering DIN EN ISO 10218 or ISO/TS 15066. The starting point is formed by the positions of the machine parts of the machine 26, at least to the extent that they are safety relevant, or by hazard zones defined on this basis and optionally expanded with reference to response and stopping times or other criteria, and by the objects 28 detected by the stereo camera 10. The latter are, for example, present in the form of a 2D detection map in which, at each pixel position at which an object 28 of a minimum size was detected, the distance value measured for it is entered, while all other pixels remain empty. The respective distance, and in particular the shortest distance, from the machine 26, which forms a hazard area that is preferably also dynamic, is calculated with the aid of these object detections, which can naturally also be represented differently. Depending on the distance, a securing then takes place, optionally by a control connected to the safe interface 30, that can, as mentioned multiple times, also comprise an evasion or a slowing down.
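As an illustration of this representation, a minimal sketch of such a 2D detection map follows; the array layout, resolution, and helper names are assumptions for illustration, not prescribed by the patent:

```python
# Sketch of the 2D detection map: one entry per pixel, empty (NaN) where
# no object of the minimum size was detected, otherwise the measured
# distance value for that object point.
import numpy as np

H, W = 480, 640                          # assumed sensor resolution
detection_map = np.full((H, W), np.nan)  # initially empty everywhere

# An object of the minimum size detected at pixel (240, 320), 2.7 m away:
detection_map[240, 320] = 2.7

# Only occupied pixels enter the subsequent distance evaluation:
for v, u in zip(*np.nonzero(~np.isnan(detection_map))):
    z = detection_map[v, u]   # measured distance for this object detection
    # ... back-project (u, v, z) into 3D and compare with the hazard areas
```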
[0069] In this example, two hazard areas 26a-b have to be monitored, that is machine zones or machines, and four objects 28 are currently recognized in their environment by the sensor 10. Two of the objects 28 are individual persons, without the sensor 10 having to explicitly acquire this information; a further object 28 comprises two persons fused together, either because they are carrying a workpiece together and are thus actually connected or because the segmentation was unable to separate the two persons. There is additionally another object 28 that cannot be identified in any more detail and could be an article or a false detection. If it is beneath the minimum size, it can be ignored; otherwise it must be recognized as a person as a precaution. The non-connected arm of the person at the far left forms, in dependence on the evaluation, a separate further object or is attributed to the person, in particular according to the teaching of EP 3 200 122 A1. The sensor 10 delivers distance data so that a connected control protects the persons from injury by reduced speed, an evasive replanning of the routines, or where necessary a stop of the machines in the hazard areas 26a-b in good time.
[0070] A hazard area 26a-b is a preferred modeling of the hazardous machine 26. The hazard area 26a-b is a spatial zone in which the machine 26 carries out work movements in a respective time period. The hazard area 26a-b can surround the machine 26 with a little spacing to leave sufficient clearance for the work movements. In addition, it is advantageous for the calculations to define geometrically simple hazard areas 26a-b such as parallelepipeds or spheres, for which purpose certain empty spaces can then be accepted. A plurality of hazard areas 26a-b can surround a plurality of machines 26 and/or a plurality of moving part sections of a machine 26. Hazard areas 26a-b can be rigid and can comprise all conceivable work movements. Alternatively, respective hazard areas 26a-b are defined for part sections of the work movement that are utilized in a sequence corresponding to the process and that are smaller and better adapted.
[0071] The control and evaluation unit 24 continuously calculates the shortest distance of the object 28 closest to a respective hazard area 26a-b. Arrows representing these shortest distances are drawn in the drawing.
[0072] The respective shortest distance last determined with respect to a hazard area 26a-b is provided cyclically or acyclically at the safe interface 30. Typical output rates are multiple times a second; however, a more infrequent updating is also conceivable depending on the required and possible response time of the sensor 10. A higher ranking control connected to the safe interface 30, in particular that of the machine 26, plans the next workstep anew, where necessary in dependence on the shortest distance, so that the required safety distance between human and machine is always maintained.
[0073] The control and evaluation unit 24 preferably also determines a speed of the object 28 from which the shortest distance was measured and outputs it with the shortest distance at the safe interface 30. The hazard can thus be differentiated even better. The closest object 28 is admittedly as a rule the most dangerous, or in more precise terms the one most at risk. The safety distance that the machine 26 maintains in its movement planning can additionally be adapted to a maximum speed of a human movement. The safety directed response of the machine is nevertheless best adapted to its environment if more information is present on the closest object 28 and possibly also on further objects 28. A dependence on the machine's own status and on the planned movement of the machine 26, in particular the position and speed of machine parts or even of dangerous tool regions, is also conceivable, with such information preferably being provided by the machine control.
[0074] There are a number of further measurement parameters, or parameters derived therefrom, that the control and evaluation unit 24 can output in addition to the shortest distance at the safe interface 30 so that they can enter into the safety observation of the control connected there. The speed of the closest object 28 from which the shortest distance is measured has already been discussed.
[0075] Additional shortest distances from further objects 28 or from separate object sections of the closest object 28, for example a different arm, are preferably output. A possible criterion here would be that further local distance minima exist within the same object, since the points directly adjacent to the shortest distance are of no interest. For example, the sensor 10 guarantees the monitoring of the five closest distances per active hazard area 26a-b, as sketched below. A sixth object and further objects or object sections are no longer considered, with an additional piece of information being conceivable, however, that there are more than five objects of the minimum size in the monitored zone 12. The connected control can thus also pre-plan for further future danger situations with objects 28 other than the closest object 28. A graphic example is a still somewhat more remote object 28 that approaches a hazard area 26a-b at high speed.
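A sketch of this guarantee of the five closest distances, including the flag that further relevant objects exist (list layout and names are our assumptions):

```python
# Report the k shortest of all candidate distances per active hazard
# area, plus a note whether more than k objects of the minimum size
# were present in the monitored zone.
import heapq

def report_distances(candidates, k=5):
    shortest = heapq.nsmallest(k, candidates)   # k smallest, sorted
    more_present = len(candidates) > k          # additional information
    return shortest, more_present

dists, more = report_distances([2.3, 0.9, 4.1, 1.7, 3.3, 5.0])
print(dists, more)   # [0.9, 1.7, 2.3, 3.3, 4.1] True
```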
[0076] Further conceivable additional pieces of information are, non-exclusively, the size of the closest object 28, its position in the form of a centroid or of the closest point, a direction of movement, an object envelope, an enveloping body surrounding the object 28 or a representation of the object 28 in total as an object cloud, 3D point cloud, or 3D voxel representation.
[0077] A safety application using the sensor 10 in accordance with the invention can be described as follows. This routine is only an example. First, after a suitable installation of the sensor 10, for example with a bird's eye view above the machine 26 to be secured, the hazard areas 26a-b are configured. Alternatively, a sensor combination is installed to acquire an additional field of vision and/or further perspectives. The configuration itself expediently takes place by a set-up engineer in a corresponding software tool, with AR-like configurations, however, also being conceivable directly in the work space similar to EP 2 023 160 B1 named in the introduction where protected fields are configured in this manner. It is conceivable to configure a further set of hazard areas 26a-b in dependence on the process step of the machine 26.
[0078] In operation, the sensor 10 then detects the objects 28 that are respectively located in the monitored zone 12. The recorded depth maps are filtered with the hazard areas 26a-b, which are themselves not monitored, and are optionally filtered with taught background objects. Small interference objects and gaps in which no depth values are detectable are ignored. The control and evaluation unit 24 segments the depth map, in a manner not explained in any more detail here, to separate the objects 28. There are numerous examples in the literature of such segmentations.
[0079] The distance between each hazard area 26a-b and each object 28 is then determined. For this purpose, the distance between each object position, i.e. each object point, and each point of a hazard area 26a-b generally has to be determined. However, more efficient processes can be used, for instance by means of enveloping bodies, so that it is not necessary actually to examine every point, and heuristics can also be used that quickly exclude some objects from consideration. A particularly effective method of locating the shortest distance will be presented further below.
[0080] Depending on the embodiment, the control and evaluation unit 24 determines further parameters in addition to the shortest distance, such as the speed of the object having the shortest distance or shortest distances from further objects 28. The shortest distance and any additional pieces of information are provided at the safe interface 30.
[0081] If a plurality of sensors 10 are used in a combination, a higher ranking control connected to the respective safe interfaces 30 reads the provided data and fuses them. If the sensors 10 are initially calibrated to a common global coordinate system, this fusion is very simple since only the shortest distance over all sensors 10 has to be selected from the shortest distances per hazard area 26a-b delivered locally per sensor 10.
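Because the sensors report per hazard area, this fusion reduces to a minimum per key; a sketch under the assumption of a common global coordinate system (the dictionary layout is ours):

```python
# Fuse the locally determined shortest distances of several sensors:
# per hazard area, only the minimum over all sensors is kept.
per_sensor = {
    "sensor_1": {"hazard_a": 1.8, "hazard_b": 3.2},
    "sensor_2": {"hazard_a": 1.2, "hazard_b": 3.9},
}

fused = {}
for readings in per_sensor.values():
    for hazard, d in readings.items():
        fused[hazard] = min(d, fused.get(hazard, float("inf")))

print(fused)   # {'hazard_a': 1.2, 'hazard_b': 3.2}
```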
[0082] The higher ranking control or a machine control directly connected to the safe interface 30 uses the shortest distances per hazard area 26a-b thus acquired to determine whether a replanning of the current workstep or of a future workstep is required so that the process runs ideally and safety is ensured. The safety directed response only comprises a shutdown of the machine 26 in emergencies.
[0083] The determination of the shortest distances in the control and evaluation unit 24 preferably takes place while taking account of the projective geometry in 3D space. Alternatively, a 2D projection onto the sensor image plane, onto a ground plane, or onto an intermediate plane would also be conceivable.
[0084] An advantageous, particularly efficient sphere model for the distance calculation that takes projective shadows into account will be presented in the following.
[0085] In a step S1, the machine 26 is represented by enveloping spheres for the further evaluation for simplified handling. These spheres cover the machine 26, or at least those machine parts that form a hazard source to be monitored, or the hazardous volume derived therefrom, that is the hazard area 26a-b. In this respect, a trade-off must be made between the effort for the monitoring and the accuracy of the approximation in the number of spheres. In some embodiments, the sphere representation is supplied to the control and evaluation unit 24 via an interface, for example from other sensors or from a control of the machine 26 that knows or monitors its own movement. The space taken up by the machine 26, including its projection onto the image plane of the stereo camera 10, is preferably masked and not observed to avoid the machine 26 itself being recognized as an object 28.
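A possible data structure for step S1, with the hazard area as a list of enveloping spheres, each fully described by center and radius (the concrete numbers are invented for illustration):

```python
# Step S1 sketch: the hazard area as a set of enveloping spheres.
# More spheres approximate the machine more tightly, at higher
# monitoring effort; fewer spheres accept more empty space.
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple   # (x, y, z) in global coordinates
    radius: float   # in the same unit, e.g. meters

hazard_area = [
    Sphere((0.0, 0.0, 0.8), 0.5),   # robot base
    Sphere((0.4, 0.0, 1.2), 0.4),   # arm
    Sphere((0.8, 0.0, 1.5), 0.3),   # tool region
]
```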
[0086] In a step S2, the spatial zone 12 is now recorded by the stereo camera 10 to acquire 3D image data. In some embodiments, the machine 26 is recognized in the 3D image data and a sphere representation is derived therefrom provided that these data are not made available elsewhere. The sequence of the steps S1 and S2 is then preferably reversed.
[0087] In a step S3, objects 28 of the minimum size are detected in the 3D image data. They are then known with their 3D position, for example in the form of a detection map as described above.
[0088] The objects 28, or their surfaces that are visible from the point of view of the stereo camera 10, are likewise modeled as spheres in a step S4. For example, each position in the detection map at which a distance value is entered, and at which an object 28 of the minimum size is accordingly detected, is enveloped with a sphere having a radius corresponding to the minimum size. For objects 28 that are larger than such a sphere, a plurality of adjacent pixels in the detection map are occupied by a distance value, so that spheres arise there that are nested in one another and that ultimately envelop the object as a whole. It is possible in an intermediate step to eliminate those spheres that are covered by other spheres and thus do not have to be evaluated.
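A sketch of step S4 under these assumptions (back_project stands for a hypothetical calibration routine mapping a pixel and its distance value to a 3D point; it is not part of the patent):

```python
# Every occupied detection-map pixel becomes a sphere of the minimum
# size around the back-projected 3D point; spheres already contained in
# a previously kept sphere are dropped in a single pass.
import numpy as np

def object_spheres(detection_map, r_min, back_project):
    spheres = []
    for v, u in zip(*np.nonzero(~np.isnan(detection_map))):
        c = np.asarray(back_project(u, v, detection_map[v, u]))
        spheres.append((c, r_min))
    kept = []
    for c, r in spheres:   # one-pass pruning of covered spheres
        if not any(np.linalg.norm(c - c2) + r <= r2 for c2, r2 in kept):
            kept.append((c, r))
    return kept

# e.g. a pinhole back-projection with assumed calibration values f, cx, cy:
# back_project = lambda u, v, z: (z * (u - cx) / f, z * (v - cy) / f, z)
```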
[0089] In a step S5, which will be explained in more detail further below, the distances between the hazard areas 26a-b and the objects 28, both now represented by spheres, are calculated pairwise.
[0090] However, this is not yet sufficient for a safe evaluation due to the projective covering from the point of view of the stereo camera 10. A further object that is not detectable for the stereo camera 10 can be hidden in the shadow 32 of the machine 26, or in the region masked for this purpose, or equally in the shadow 34 of the objects. For reasons of safety, the respective shadow 32, 34 is treated as an object 28 or as a part of the machine 26. The projective shadow, that is the region covered from the point of view of the stereo camera 10, is a truncated cone for a sphere, with the truncated cone being formed by rays emanating from the stereo camera 10.
[0091] The machine 26 and the objects 28 thus become spheres having a transition into a truncated cone. Different reference points then result for the shortest distance, in dependence on the position of the objects 28 with respect to the machine 26, for example a distance 36a between the machine 26 and an object 28, a distance 36b between the machine 26 and a shadow 34 of an object 28, and a distance 36c between a shadow 32 of the machine 26 and an object 28; a distance, not drawn, between the shadows 32, 34 would also be conceivable.
[0092] In a step S6, the distances calculated for all the pairs are evaluated together; in particular, the shortest distance is determined as the minimum. The pairs can also be restricted in advance; the object 28 at the far right in the drawing, for example, is evidently too remote to come into question for the shortest distance.
[0093] A possible output parameter, and one that is particularly important in a number of cases, is the shortest distance, since a typical safety observation requires that a certain distance is always observed between the machine 26 and objects 28. There can, however, also be other or further output parameters. For example, the n shortest distances from the machine 26 are calculated and output to evaluate a plurality of objects 28 or object regions. It is similarly possible to calculate and output shortest distances from a plurality of part regions of the machine 26 or from a plurality of machines in parallel. There is no basic difference between an individual machine 26 and a plurality of machines or hazard areas due to the sphere representation. It is, however, conceivable that different machine zones are treated differently, for instance due to a differing degree of danger. The coordinates, that is the affected regions of the machine 26 or objects 28 together with an approximated geometry or a geometry detected in the 3D image data, the contact points of the distance line, or the distance line itself can also be output with respect to a distance. The speeds of the involved regions of the machine 26 and objects 28 can also be determined from a plurality of frames of the 3D detection, for instance by object tracking. Such parameters can be relevant and can be calculated and output in any desired combination depending on the subsequent safety observation.
[0095] In a step S11, a pair of spheres is selected for the machine 26 and an object 28 whose distance is to be calculated while considering the projective shadow 32, 34. The complete calculation remains geometrically correct; apart from unavoidable numerical inaccuracies, no approximation takes place.
[0096] The respective enveloping spheres are converted into suitable global coordinates, Cartesian coordinates here, with the aid of a calibration of the 3D camera. The spheres are associated with a truncated cone to take the projective shadow into account. These are consequently the input parameters of a single distance determination: two geometrical bodies that can be pictured as sugarloaves, each comprising a spherical segment with an adjoining cone or truncated cone, and each completely described geometrically by a center and a radius, together with the position of the stereo camera 10 and the fact that the projective shadows 32, 34 extend in principle into infinity and in practice up to the maximum range of the stereo camera 10.
[0097] In a step S12, a simplification is made for the further calculation: a check is made as to which of the two partners of the pair is further remote from the stereo camera 10, with respect to the respective center. The center of the closer partner is denoted m; the center of the more remote partner p. It must be noted that after this determination m and p no longer reveal whether the machine 26 or the object 28 is located there; the geometrical distance problem is, however, independent of this.
[0098] In a step S13, the cone is now omitted for the more remote partner p. This is sufficient because the cones representing the projective shadows 32, 34 each arise in projection from the point of view of the stereo camera 10, so that the distances from the stereo camera 10 become larger and larger along them. In other words, it is impossible that the shortest distance is from the more remote shadow 32, 34. In addition, it is sufficient to consider p as a point in the further calculation: the sphere can be taken into account very simply by deducting its radius from the calculated distance. The closer partner m, in contrast, remains a sphere that is associated with a truncated cone.
[0099] In a step S14, the three-dimensional problem is now reduced to a two-dimensional one by projecting it onto the sectional plane that is defined by the position of the stereo camera 10, taken as the origin c, the center m, and the center p.
[0100] This sectional plane should now be determined. Since it passes through the origin c, the plane equation is $n_1 x + n_2 y + n_3 z = 0$ with the plane normal $n = m \times p$. A coordinate system of two normal vectors $e_1$, $e_2$ is determined in the sectional plane. The following choice is advantageous for the further calculation:

$$e_1 = \frac{m}{\|m\|}, \qquad e_2 = \frac{e_1 \times n}{\|e_1 \times n\|}.$$
[0101] The two normal vectors $e_1$, $e_2$ therefore span a Cartesian coordinate system within the sectional plane, with $e_1$ in the direction of m and $e_2$ perpendicular thereto.
[0102] A 2×3 projection matrix E can then be formed from the two vectors that projects m, c, and p into the 2D sectional plane: $\bar m = (\|m\|, 0)^T$, $\bar c = (0, 0)^T$, and $\bar p = E p$ with

$$E = \begin{pmatrix} e_1^T \\ e_2^T \end{pmatrix},$$

or $\bar p_x = e_1 \cdot p$ and $\bar p_y = e_2 \cdot p$.
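In code, steps S12 to S14 amount to a few vector operations; a numpy sketch with the camera at the origin (the function name is ours):

```python
# Build the in-plane basis e1, e2 and the 2x3 projection matrix E for
# the sectional plane through camera origin, m, and p, then project.
import numpy as np

def project_to_sectional_plane(m, p):
    n = np.cross(m, p)            # plane normal; plane passes through the origin
    e1 = m / np.linalg.norm(m)    # first basis vector, in the direction of m
    e2 = np.cross(e1, n)
    e2 /= np.linalg.norm(e2)      # second basis vector, perpendicular to e1
    E = np.vstack((e1, e2))       # 2x3 projection matrix
    m2 = np.array([np.linalg.norm(m), 0.0])  # m lands on the positive x axis
    p2 = E @ p                    # p2[1] <= 0 with this sign convention
    return m2, p2, E

m2, p2, E = project_to_sectional_plane(np.array([1.0, 0.2, 2.0]),
                                       np.array([0.5, -0.8, 3.0]))
```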
[0103] In the sectional plane, the closer partner thus appears as a circle having the radius r around $\bar m$, while the jacket of its shadow cone appears as the two tangents from the origin $\bar c$ to this circle.
[0104] In a step S15, these tangents are looked at. For the tangent equations in gradient form, let

$$\lambda := \frac{\|m\|}{r}$$

be the ratio of the camera distance to the radius r.

[0105] The gradient of the tangents is then

$$k = \pm \frac{1}{\sqrt{\lambda^2 - 1}}$$

and the tangent equation is thus

$$y = \pm \frac{x}{\sqrt{\lambda^2 - 1}}.$$
[0106] What is specifically required is the closest point on the tangent from the point $\bar p$, or its parameterization along the tangent. Two different points $\bar p_1$, $\bar p_2$ are considered in the following by way of example.
[0107] The Hesse normal form, with which the distance of a point from a straight line can be calculated, can be simply derived from the point-gradient form of the tangent equation. However, the closest point on the straight line to $\bar p = (\bar p_x, \bar p_y)^T$ is also required here (the perpendicular foot); the vector form is more practical for this purpose.
[0108] It follows from the tangent equation that $x = \sqrt{\lambda^2 - 1}\, y$, and thus the respective direction vector for the tangents is

$$a_{1,2} = \begin{pmatrix} \sqrt{\lambda^2 - 1} \\ \pm 1 \end{pmatrix}.$$
[0109] The tangent equation in vector form describes any desired point on the straight line over the free parameter t, in particular also the two potential perpendicular feet $\bar l_{1,2}$:

$$\bar l_{1,2} = t\, a_{1,2}.$$
[0110] The vectors $b_{1,2} := \bar p \to \bar l_{1,2}$ are the connection vectors from $\bar p$ to the perpendicular foot of each tangent:

$$b_{1,2} = t\, a_{1,2} - \bar p.$$
[0111] The perpendicular must stand upright on the direction vector, i.e. $a_i \cdot b_i = 0$ for $i = 1, 2$.
[0112] The following solutions for t result from this:

$$t_{1,2} = \frac{a_{1,2} \cdot \bar p}{\|a_{1,2}\|^2} = \frac{\sqrt{\lambda^2 - 1}\,\bar p_x \pm \bar p_y}{\lambda^2}.$$
[0113] $\bar p$ is now always in the negative y/positive x quadrant due to the coordinate system spanned by $e_1$, $e_2$, which also results from the order of the vectors in the cross product. This means, however, that the descending tangent $t_2$ is always the closer one. The tangent $t_1$ is preferably not considered at all, which simplifies the calculation in operation.
[0114] In addition, due to the filtering of depth values in the preprocessing, there is no object 28 in the machine 26 or its shadow; corresponding detections are masked. It can therefore be precluded that $\bar p$ is located in the cone. In addition, all the parameters are restricted to the range of valid depth values; that is, they lie in the monitored spatial zone 12.
[0115] After the calculation of $t_{1,2}$, the values are compared with the minimum permitted values; these define the valid region of the cone jacket.
[0116] The minimum value $t_{\min}$ for the tangent results from inserting the circle center $\bar m$ into the solution for t:

$$t_{\min} = \frac{\sqrt{\lambda^2 - 1}\,\|m\|}{\lambda^2}.$$
[0117] In a step S16, a case-by-case analysis is now carried out. Depending on which of the shown points $\bar p_1$, $\bar p_2$ corresponds to the location of $\bar p$, the shortest distance is from the tangent or from the circle. As is clear, the tangent corresponds to the shadow 32, 34 and the circle to the machine 26 or to the object 28 itself.
[0118] $t > t_{\min}$ applies to a point in a location such as $\bar p_1$; the perpendicular foot $\bar l$ is then calculated from the tangent equation in a step S17a. The distance results from $d = \|\bar l - \bar p\|$. It is transformed back into 3D coordinates via the plane vectors, $l_{3D} = E^T \bar l$ with $E^T = (e_1\; e_2)$, or $l_{3D} = \bar l_x e_1 + \bar l_y e_2$, where $\bar l_x$, $\bar l_y$ are the coordinates of the closest tangent point. The radius of the sphere around p still has to be deducted for the actual distance to compensate for the earlier reduction of that sphere to a point.
[0119] If conversely $t \le t_{\min}$ for a point in a location such as $\bar p_2$, the circle is directly used instead of the tangent in a step S17b. In this case the distance simply corresponds to the distance of the centers m, p after deduction of the radii of both spheres.
[0120] The distance of the pair is thus known in an exact geometrical calculation while taking account of the projective shadows.
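Condensed into one routine, the calculation of steps S12 to S17 looks as follows; a minimal numpy sketch under the assumptions above (camera at the origin, m the closer center carrying sphere radius r and shadow cone, p the more remote center whose sphere radius r_p is deducted at the end; all names are ours, not the patent's):

```python
# Exact pairwise distance of sphere-plus-shadow-cone (around m) and
# sphere (around p), following steps S12 to S17. Requires that m is the
# closer partner (swap beforehand otherwise), that the camera lies
# outside the sphere (|m| > r), and that p is not inside the cone,
# which the preprocessing guarantees by masking.
import numpy as np

def pair_distance(m, r, p, r_p):
    # Step S14: project into the sectional plane through camera, m, p.
    n = np.cross(m, p)
    e1 = m / np.linalg.norm(m)
    e2 = np.cross(e1, n)
    e2 /= np.linalg.norm(e2)
    p2 = np.array([e1 @ p, e2 @ p])   # p in plane coordinates, p2[1] <= 0
    mu = np.linalg.norm(m)            # camera distance of the closer center

    # Step S15: descending tangent of the circle (radius r around (mu, 0)).
    lam = mu / r
    a = np.array([np.sqrt(lam**2 - 1.0), -1.0])  # direction vector
    t = (a @ p2) / (a @ a)            # parameter of the perpendicular foot
    t_min = a[0] * mu / lam**2        # start of the valid cone jacket

    # Steps S16/S17: shadow cone or sphere, whichever is closer.
    if t > t_min:                     # S17a: perpendicular foot on the tangent
        d = np.linalg.norm(t * a - p2)
    else:                             # S17b: the sphere itself is closest
        d = np.linalg.norm(p2 - np.array([mu, 0.0])) - r
    return d - r_p                    # compensate reducing p to a point

d = pair_distance(m=np.array([0.0, 0.0, 2.0]), r=0.3,
                  p=np.array([1.0, 0.0, 1.0]), r_p=0.1)
print(round(d, 3))   # 1.014: here the sphere, not its shadow, is closest
```

Transforming the perpendicular foot back into 3D coordinates as in step S17a is only needed when the contact points themselves are to be output.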
[0121] In many cases, the movement of the machine 26 is not completely free and dynamic, but is rather largely known at setup time. The control and evaluation unit 24 then does not have to calculate the distances anew at run time, but can rather calculate in advance the distances from a number of positions, or from all possible positions in the spatial zone 12, for instance in a discretized form of 2D pixel addresses and a depth value, for a series of known configurations. A look-up table is thus produced in a configuration phase that permits a very fast evaluation at run time.
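A sketch of such a precomputation (grid sizes, names, and the stand-in distance function are ours; in practice the exact calculation above would fill the table):

```python
# Configuration-time look-up table: for every machine configuration and
# every discretized object position (pixel address plus depth bin) the
# shortest distance is precomputed; run time reduces to a table access.
import numpy as np

def distance_for(cfg, u, v, z_bin):
    # Hypothetical stand-in for the exact sphere/cone calculation above.
    return 0.0

N_CONFIGS, H, W, DEPTH_BINS = 4, 30, 40, 16   # coarse grid for the example

lut = np.empty((N_CONFIGS, H, W, DEPTH_BINS), dtype=np.float32)
for cfg in range(N_CONFIGS):
    for v in range(H):
        for u in range(W):
            for z in range(DEPTH_BINS):
                lut[cfg, v, u, z] = distance_for(cfg, u, v, z)

# Run time, per detection at pixel (v, u) with depth bin z:
# d = lut[current_config, v, u, z]
```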
[0122] In a further embodiment, the machine 26 is not considered with its covering cone, for example because the safety concept does not require it. The machine 26 is then considered as a free-floating sphere and the object 28 as a sphere with a covering cone. If, in addition, all the spheres are the same size, the calculation can work with the squared distance throughout and the laborious taking of roots is omitted.
[0123] A further embodiment provides not only one stereo camera 10, but rather a combination of a plurality of stereo cameras 10. It must be remembered that the stereo camera 10 is only an exemplary sensor and such a combination can also be inhomogeneous, i.e. can have different sensor types. Each sensor determines its own shortest distance with the projective covering applicable to it. The distances are evaluated in combination in that, for example, the shortest distance from the combination is used for the safety directed evaluation.