Optoelectronic sensor and method for detecting objects

11698442 · 2023-07-11

Abstract

An optoelectronic sensor is provided that has at least one light transmitter for transmitting a plurality of mutually separated light beams starting from a respective one transmission point; a transmission optics for the transmitted light beams; at least one light receiver for generating a respective received signal from the remitted light beams reflected from the objects and incident at a respective reception point; a reception optics for the remitted light beams; and an evaluation unit for acquiring information on the objects from the received signals. The reception optics and/or the transmission optics is/are a two-lens objective for an annular image field with an image field angle that has a first lens and a second lens, with the first lens being configured such that bundles of rays of every single transmission point and/or reception point with an image field angle α only impinge on half of the second lens.

Claims

1. An optoelectronic sensor for detecting objects in a monitored zone, the optoelectronic sensor comprising: at least one light transmitter for transmitting a plurality of mutually separated light beams starting from a respective one transmission point; a transmission optics for the transmitted light beams; at least one light receiver for generating a respective received signal from the remitted light beams reflected from the objects and incident at a respective reception point; a reception optics for the remitted light beams; and an evaluation unit for acquiring information on the objects from the received signals, wherein at least one of the reception optics and the transmission optics is a two-lens objective for an annular image field with an image field angle α that has a first lens and a second lens, with the first lens being configured such that light beams of every single transmission point and/or reception point with an image field angle α only impinge on half of the second lens.

2. The optoelectronic sensor in accordance with claim 1, wherein the inequality d≥(D1*f1)/(D1+2*f1*tan α) is satisfied for the two-lens objective, with a focal length f1 and a diameter D1 of the first lens and a distance d between the first lens and the second lens.

3. The optoelectronic sensor in accordance with claim 2, wherein d=(D1*f1)/(D1+2*f1*tan α) applies at least approximately.

4. The optoelectronic sensor in accordance with claim 1, wherein the focal length f2 of the second lens corresponds to the distance between the first lens and the second lens.

5. The optoelectronic sensor in accordance with claim 1, wherein the first lens has a small f-number k1.

6. The optoelectronic sensor in accordance with claim 5, wherein the first lens has an f-number k1=1.

7. The optoelectronic sensor in accordance with claim 1, wherein the transmission points are arranged on a first circular line.

8. The optoelectronic sensor in accordance with claim 1, wherein the reception points are arranged on a second circular line.

9. The optoelectronic sensor in accordance with claim 1, that has a plurality of light transmitters.

10. The optoelectronic sensor in accordance with claim 9, wherein the optoelectronic sensor has one light transmitter per transmission point.

11. The optoelectronic sensor in accordance with claim 1, that has a plurality of light receivers.

12. The optoelectronic sensor in accordance with claim 11, wherein the optoelectronic sensor has one light receiver per reception point.

13. The optoelectronic sensor in accordance with claim 1, wherein the light transmitter and the light receiver form a coaxial arrangement and the transmission optics and the reception optics are combined in a common optics.

14. The optoelectronic sensor in accordance with claim 1, that is configured as a laser scanner and has a movable deflection unit with whose aid the transmitted light beams are periodically conducted through the monitored zone.

15. The optoelectronic sensor in accordance with claim 14, wherein the deflection unit is configured in the form of a rotatable scanning unit in which at least one of the light transmitter and the light receiver is accommodated.

16. The optoelectronic sensor in accordance with claim 1, wherein the evaluation unit is configured to determine a distance of the objects from a time of flight between the transmission of the light beams and the reception of the remitted light beams.

17. A method of detecting objects in a monitored zone in which a plurality of mutually separated light beams are transmitted from a transmission optics starting from a respective transmission point; in which a respective received signal is generated from the remitted light beams reflected from the objects and incident at a respective reception point after passing through a reception optics; and in which the received signals are evaluated to acquire information on the objects, wherein at least one of the reception optics and the transmission optics is a two-lens objective for an annular image field with an image field angle α that has a first lens and a second lens; and wherein due to the design of the first lens, bundles of rays of every single transmission point and/or reception point with an image field angle α only impinge on half of the second lens.

Description

(1) The invention will be explained in more detail in the following also with respect to further features and advantages by way of example with reference to embodiments and to the enclosed drawing. The Figures of the drawing show in:

(2) FIG. 1 a schematic sectional representation of a laser scanner;

(3) FIG. 2a a schematic view of a circular arrangement of image field points;

(4) FIG. 2b a schematic view of a linear arrangement of image field points;

(5) FIG. 2c a schematic view of an annular arrangement of image field points;

(6) FIG. 3 a plan view of circularly arranged transmission points and reception points;

(7) FIG. 4 a schematic view of a two-lens objective for an annular image field with exemplary beam progressions; and

(8) FIG. 5 a schematic plan view of the second lens of the objective in accordance with FIG. 4 to illustrate the optical effect of the first lens.

(9) FIG. 1 shows a schematic sectional representation through an optoelectronic sensor 10 in an embodiment as a laser scanner. The sensor 10 in a rough distribution comprises a movable scanning unit 12 and a base unit 14. The scanning unit 12 is the optical measurement head, whereas further elements such as a supply, evaluation electronics, terminals and the like are accommodated in the base unit 14. In operation, the scanning unit 12 is set into a rotational movement about an axis of rotation 18 with the aid of a drive 16 of the base unit 14 to thus periodically scan a monitored zone 20.

(10) In the scanning unit 12, a light transmitter 22 having a plurality of light sources 22a, for example LEDs or lasers in the form of edge emitters or VCSELs, generates with the aid of a common transmission optics 24 a plurality of transmitted light beams 26 having a mutual angular offset that are transmitted into the monitored zone 20. If the transmitted light beams 26 strike an object in the monitored zone 20, corresponding remitted light beams 28 return to the sensor 10. The remitted light beams 28 are conducted from a common reception optics 30 to a light receiver 32 having a plurality of light reception elements 32a that each generate an electric received signal. The light reception elements 32a can be separate elements or pixels of an integrated matrix arrangement, for example photodiodes, APDs (avalanche photodiodes), or SPADs (single photon avalanche diodes).

(11) The purely exemplary four light sources 22a and light reception elements 32a are shown above one another in the sectional view. In fact, in preferred embodiments of the invention, at least one of the groups is arranged in a circular figure or on a circular line, as will be explained further below. However, this does not have to relate to the physical light sources 22a and light reception elements 32a, but only to the effective transmission points as starting points of the transmitted light beams 26 and to the reception points as end points of the remitted light beams 28; here, however, the two coincide. Differing from FIG. 1, it is conceivable to generate a plurality of transmission points with one physical light source or to accommodate a plurality of reception points on the same physical reception module.

(12) The light transmitter 22 and the light receiver 32 are arranged together in the embodiment shown in FIG. 1 on a circuit board 34 that is disposed on the axis of rotation 18 and that is connected to the shaft 36 of the drive 16. This is only to be understood by way of example; practically any desired numbers and arrangements of circuit boards are conceivable. The basic optical design with a light transmitter 22 and a light receiver 32 biaxially disposed next to one another is also not compulsory and can be replaced with any design known per se from single-beam optoelectronic sensors or laser scanners. An example of this is a coaxial arrangement with or without a beam splitter.

(13) A contactless supply interface and data interface (DI) 38 connects the movable scanning unit 12 to the stationary base unit 14. A control and evaluation unit (CEU) 40 is located there; it can at least partly also be accommodated on the circuit board 34 or at another location in the scanning unit 12. The control and evaluation unit 40 controls the light transmitter 22 and receives the received signals of the light receiver 32 for further evaluation. It additionally controls the drive 16 and receives the signal of an angular measurement unit, not shown and generally known from laser scanners, which determines the respective angular position of the scanning unit 12.

(14) The distance from a scanned object is measured for the evaluation, preferably using a time of flight process. Together with the information on the angular position from the angular measurement unit, two-dimensional polar coordinates of all the object points in a scanning plane are available after every scanning period with angle and distance. The respective scanning plane is likewise known via the identity of the respective remitted light beam 28 and its detection in one of the light reception elements 32a so that a three-dimensional spatial zone is scanned overall.
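The evaluation just described can be sketched numerically. The following snippet is purely illustrative: the function names, the axis convention, and the use of azimuth/elevation angles are assumptions for the sketch and are not taken from the patent; it converts a round-trip time of flight into a distance and one scan measurement into Cartesian coordinates.

```python
import math

# Speed of light in vacuum (m/s).
C = 299_792_458.0

def distance_from_time_of_flight(t_seconds):
    """Distance to the object from the round-trip time of flight.
    The light travels to the object and back, hence the factor 1/2."""
    return C * t_seconds / 2.0

def scan_point_to_cartesian(distance_m, azimuth_deg, elevation_deg):
    """Convert one scan measurement (range, azimuth from the angular
    measurement unit, elevation from the identity of the beam) into
    Cartesian coordinates; the axis convention is illustrative only."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return x, y, z
```

As a sanity check, a round-trip time of roughly 6.7 ns corresponds to one meter of distance.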

(15) The object positions or object contours are thus known and can be output via a sensor interface (SI) 42. The sensor interface 42 or a further terminal, not shown, conversely serves as a parameterization interface. The sensor 10 can also be configured as a safety sensor for use in safety engineering for monitoring a hazard source such as a dangerous machine. In this process, a protected field is monitored that may not be entered by operators during the operation of the machine. If the sensor 10 recognizes an unauthorized intrusion into the protected field, for instance a leg of an operator, it triggers an emergency stop of the machine. Sensors 10 used in safety technology have to work particularly reliably and must therefore satisfy high safety demands, for example the standard EN ISO 13849 for safety of machinery and the device standard EN 61496 for electrosensitive protective equipment (ESPE). The sensor interface 42 can in particular be configured as a safe output (OSSD, output signal switching device) to output a safety-directed switch-off signal on an intrusion into a protected field by an object.

(16) The sensor 10 shown is a laser scanner having a rotating measurement head, namely the scanning unit 12. In this respect, not only can a transmission/reception module rotate along as shown here; further such modules having a vertical offset or an angular offset with respect to the axis of rotation 18 are conceivable. Alternatively, a periodic deflection by means of a rotating mirror or by means of a facet mirror wheel is also conceivable. With a plurality of transmitted light beams 26, it must, however, be noted that the manner in which the plurality of transmitted light beams 26 are incident in the monitored zone 20 depends on the respective rotational position since their arrangement is rotated by the rotating mirror, as known geometrical considerations reveal. A further alternative embodiment pivots the scanning unit 12 to and fro, either instead of the rotational movement or additionally about a second axis perpendicular to the rotational movement, to also generate a scanning movement in elevation.

(17) The embodiment as a laser scanner is likewise only exemplary. A multiple sensor without a periodic movement is also possible that then practically only comprises the stationary scanning unit 12 having corresponding electronics, but without a base unit 14, in particular as a variant of a flash LIDAR.

(18) During the rotation of the sensor 10, a respective area is scanned by each of the transmitted light beams 26. A plane of the monitored zone 20 is here only scanned at a deflection angle of 0°, that is with a horizontal transmitted light beam not present in FIG. 1. The remaining transmitted light beams scan the envelope surface of a cone whose acuteness differs depending on the deflection angle. With a plurality of transmitted light beams 26 that are deflected upward and downward at different angles, a kind of nesting of a plurality of hourglasses arises overall as a scanned structure. These envelope surfaces of a cone are here also sometimes called scanning planes in simplified terms.

(19) In accordance with the invention, the transmission optics 24 and/or the reception optics 30 are configured for an annular image field having an image field angle α. The motivation for this will be explained with reference to FIGS. 2a-c.

(20) In the ideal case, the optics 24, 30 should, as in FIG. 2a, image all the image field positions 46 in focus within the image circle 44. In accordance with the introductory discussion, a single lens, however, only does this for a very small image circle 44, whereas a corresponding objective would be too complex and would additionally bring about other optical limitations.

(21) An areal imaging is not necessarily required for a laser scanner since scanning planes having a mutual offset in elevation are already produced by a linear arrangement of light sources 22a and light reception elements 32a. An optics that provides a sharp imaging on a linear arrangement of image field positions 46 as in FIG. 2b would be sufficient for this purpose. However, this is also only possible for larger image circles 44 using a complex objective.

(22) Instead, the sharp imaging is only required for a single image field angle α such as is shown in FIG. 2c where the ring of the image field positions 46 corresponds to the image field angle α. The optics design is preferably oriented on the fixed image field angle α, which does not preclude the imaging also still being sharp in a certain environment; however, this is no longer a design demand for differing, and in particular smaller, image field angles. The image field angle α having a certain tolerance band of a sufficiently sharp imaging is selected as large as possible in FIG. 2c, for example α=±15°, to obtain distances between the beams 26, 28 of the sensor 10 that are as large as possible. A certain improvement of the angle covered, for example to ±8°, can already be achieved by this restriction to an annular image field. This will, however, be improved considerably more with the configuration of the optics 24, 30 in accordance with the invention explained below with reference to FIGS. 4 and 5.

(23) FIG. 3 shows in a plan view a preferred arrangement of the light sources 22a or of the light reception elements 32a on a circular line 48a-b. As shown, the optical center axis of the optics 24, 30 preferably passes through the center of the circular line 48a-b. FIG. 3 can relate to the transmission path and/or to the reception path depending on the embodiment so that the reference numerals are shown in dual form.

(24) Due to the arrangement on the circular line 48a-b, only the annular image field corresponding to the image field angle α is effectively used. This arrangement is therefore particularly advantageous; exactly the optimized zones of the optics 24, 30 are utilized. Aberrations of the optics 24, 30 for image field angles differing from α are practically irrelevant.

(25) The difference between a light source 22a and a transmission point 22b was already briefly looked at in connection with FIG. 1. A transmission point 22b is the starting point of a transmitted light beam 26. It can simultaneously be the location of a physical light source 22a. On the one hand, however, a light source 22a as a semiconductor component also has a certain basic shape, here a square basic shape, that is larger than the emission surface itself. It is moreover possible to generate transmitted light beams from a plurality of transmission points 22b using one and the same physical light source 22a. The transmission points 22b are naturally strictly speaking not mathematical points, but rather have a finite extent so that only some points, and in particular the center points, can be arranged on the circular line 48a-b. The statements on the transmission points 22b apply accordingly to the reception points 32b. The arrangement of the transmission points 22b or of the reception points 32b is ultimately relevant for the optical properties of the sensor 10, not that of the light transmitters 22, light sources 22a, light receivers 32, or light reception elements 32a.

(26) In the embodiment in accordance with FIG. 1, each transmission point 22b is implemented by its own light source 22a and each reception point 32b is implemented by its own light reception element 32a. It is alternatively possible to deviate from this in the most varied manner. The same light source 22a can generate transmitted light beams 26 from a plurality of transmission points 22b or even from all the transmission points 22b by a beam splitter element or the like. The light source 22a can be moved mechanically to generate transmitted light beams 26 consecutively from a plurality of transmission points 22b or even from all the transmission points 22b. The transmitted light beam 26 can also move over the circular line 48a or over a part thereof without a mechanical movement of the light source 22a, for instance by means of a MEMS mirror, an optical phased array, or an acousto-optical modulator.

(27) A plurality of reception points 32b can in turn equally be achieved by separate light reception elements 32a such as pixels or pixel zones of an integrated multiple arrangement of light reception elements 32a. A mechanical movement of a light reception element 32a along the circular line 48a or along a part thereof or a corresponding deflection of the remitted light beams 28 by means of a moved MEMS mirror or the like is also conceivable on the reception side. In a further embodiment, the received light of a plurality of reception points 32b or of all the reception points 32b is conducted to a common light reception element. To nevertheless be able to determine the identity of the respective remitted light beam 28, a multiplexing is possible with a sequential activation of transmitted light beams 26 or via a time encoding of the multiple pulse sequence of the transmitted beams.

(28) FIG. 3 shows an example with three transmission points 22b or reception points 32b that are evenly distributed over the circular line 48a-b. Differing from this, various numbers 3, 4, 5, 6, 7, 8, … 16 and more are also conceivable, as is an irregular arrangement.
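Purely as an illustration of such an arrangement, the center coordinates of n points evenly spaced on a circular line can be computed as follows; the helper, its names, and the optional phase offset are assumptions for the sketch and are not taken from the patent.

```python
import math

def points_on_circle(n, radius, phase_deg=0.0):
    """Center coordinates of n points evenly spaced on a circular line
    of the given radius. Illustrative helper only."""
    step = 2.0 * math.pi / n
    phase = math.radians(phase_deg)
    return [(radius * math.cos(phase + i * step),
             radius * math.sin(phase + i * step)) for i in range(n)]
```

For the three-point example of FIG. 3, `points_on_circle(3, r)` would yield three centers at mutual angular offsets of 120°.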

(29) FIG. 4 shows a schematic representation of a two-lens objective having a first lens 50 and a second lens 52, with both lenses 50, 52 preferably being converging lenses. This objective can be used as a transmission optics 24 and/or as a reception optics 30. It was explained above with respect to FIG. 2 that image field angles of up to ±8° are possible with a single lens optimized for an annular image field. The two-lens objective substantially improves this to ±20° and more.

(30) Two exemplary bundles of rays 54, 56 that are disposed opposite one another with respect to the optical axis and that correspond to the image field angle α are drawn in FIG. 4. The two-lens objective is also optimized for this image field angle α and the annular image field thereby determined.

(31) The first lens 50 reduces the beam diameter of the bundles of rays 54, 56 to a cross-section that is at a maximum still half as large as on the entry into the first lens 50. This reduced cross-section then only impinges on half of the second lens 52. The second lens 52 is thereby always only illuminated at a given position by light from one field point, but not from the field point disposed opposite with respect to the optical axis.

(32) FIG. 5 illustrates these optical properties of the two-lens objective again in a plan view of the second lens 52. Bundles of rays 54, 56 and 54′, 56′ of oppositely disposed field points do not overlap and do not reach the respective other half of the second lens 52 with respect to the optical axis. Laterally adjacent field points may, however, overlap to a certain extent. The center of the second lens 52 remains without illumination.

(33) These qualitatively explained properties can also be indicated more exactly with reference to the parameters of the two-lens objective. That distance d between the main plane of the first lens 50 and the first optically active surface of the second lens 52 is sought at which all the beams of a bundle of rays 54, 56 to a field point have completely arrived at one side with respect to the optical axis.

(34) The main beam of the bundle of rays 54, 56 through the center of the first lens 50 has a lateral offset of tan α*d for a variably conceived d, and the relevant marginal beam additionally has a lateral offset of (D1/2)/f1*d, where D1 is the diameter used and f1 is the focal length of the first lens 50. Overall, the lateral offset should move the marginal beam beyond the optical axis. A lateral offset of D1/2 is necessary for this. The following inequality therefore has to be satisfied:
[(D1/2)/f1+tan α]*d≥D1/2,

(35) and this can be transformed into
d≥(D1*f1)/(D1+2*f1*tan α).

(36) It is advantageous here to select a numerical value for d at least close to the equality. The greater the remaining difference in the inequality, the closer the second lens 52 comes to the image plane, where it can hardly still develop a useful effect.
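The bound d ≥ (D1*f1)/(D1+2*f1*tan α) from the inequality above can be evaluated numerically. The following sketch is illustrative only; the function and parameter names are assumptions, and the example values in the usage note below are hypothetical rather than taken from the patent's numerical example.

```python
import math

def min_lens_distance(d1_mm, f1_mm, alpha_deg):
    """Smallest distance d between the first lens (diameter D1 = d1_mm,
    focal length f1 = f1_mm) and the second lens for which a bundle of
    rays at image field angle alpha impinges on only one half of the
    second lens: d >= (D1*f1) / (D1 + 2*f1*tan(alpha))."""
    tan_alpha = math.tan(math.radians(alpha_deg))
    return (d1_mm * f1_mm) / (d1_mm + 2.0 * f1_mm * tan_alpha)
```

For instance, D1=20 mm, f1=20 mm, and α=30° give a minimum distance of roughly 9.3 mm; as stated above, a value close to this bound is preferred so that the second lens does not move too close to the image plane.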

(37) The two lenses 50, 52 can be plano-convex, convex-plano, biconvex, and possibly also convex-concave or concave-convex; but in the last two cases still as a converging lens. Classically refractive lenses, Fresnel lenses, or diffractive optics and combinations thereof are possible. The two lenses 50, 52 can differ from one another or coincide with one another in these general shaping properties and effective principles. The two lenses 50, 52 can have different focal lengths f1, f2, different diameters D1, D2, and different shapes.

(38) In an advantageous embodiment, not only the distance between the two lenses 50, 52 is selected with reference to the above-stated inequality, but the selection f2=d′ is also made. d′ here is the distance between the main planes of the lenses 50, 52 that is a little larger than the distance d depending on the center thickness of the second lens 52.

(39) The front focal plane of the second lens 52 is placed in the main plane of the first lens 50 at this focal length f2. This has the consequence that the main beam runs in parallel with the optical axis in the image plane of the objective; the objective is then telecentric at the image side. Inter alia on the use as a transmission optics 24, this makes it possible for the light sources 22a to be aligned in parallel with one another without having to be slanted. Advantages also already result with a non-exact matching of the focal length f2 to the distance d′, that is only f2≈d′, because the main beam angle at the image side is then already considerably reduced in size, albeit not down to 0°.

(40) The diameter D2 of the second lens 52 is furthermore preferably only selected to be as large as the light beams 54, 56 that pass through require. The two-lens objective is then completely determined by only three parameters: the diameter D1 and the focal length f1 of the first lens 50 can be freely selected; the distance d of the second lens 52 results from the above-explained inequality; the focal length f2 is finally placed at the distance d′.

(41) A total focal length f of the objective can also be calculated from these now known parameters using formulas of geometrical (paraxial) optics that are known per se. Conversely, the two-lens objective can be fixed by only its basic paraxial values: the focal length f of the objective, the aperture D=D1 of the objective, and the field angle α of the annular image field.

(42) In a further preferred embodiment having a very small, but still achievable, f-number k1:=f1/D1=1 of the first lens, the relationships are simplified in a very graphic manner:
d=f1/(1+2 tan α), for instance with α=30°: d≈0.5*f1,
f2=d′≈d≈f1/(1+2 tan α).
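The simplification can be checked against the general bound: setting D1=f1 (i.e. k1=1) in d=(D1*f1)/(D1+2*f1*tan α) reproduces f1/(1+2 tan α). A minimal sketch with assumed function names:

```python
import math

def d_general(d1, f1, alpha_deg):
    # General bound d = (D1*f1) / (D1 + 2*f1*tan(alpha)) at equality.
    t = math.tan(math.radians(alpha_deg))
    return (d1 * f1) / (d1 + 2.0 * f1 * t)

def d_unit_fnumber(f1, alpha_deg):
    # Special case k1 = f1/D1 = 1, i.e. D1 = f1:
    # d = f1 / (1 + 2*tan(alpha)).
    return f1 / (1.0 + 2.0 * math.tan(math.radians(alpha_deg)))
```

With α=30°, tan α≈0.577, so d≈f1/2.15≈0.46*f1, which is the d≈0.5*f1 stated above.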

(43) All the focal lengths f1, f2 and distances d or d′ for the design of the two-lens objective are herewith given for this preferred embodiment for every desired field angle α and for every desired aperture D=D1. All these values can again where required also be directly obtained from the desired values f and D of the objective using the formulas known per se for the calculation of the total focal length of two combined lenses.
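The total focal length of the two combined lenses mentioned here follows from the known paraxial formula 1/f = 1/f1 + 1/f2 − d/(f1*f2). The following thin-lens sketch is illustrative only; real lenses with a finite center thickness shift the main planes, as noted in the description for d versus d′.

```python
def combined_focal_length(f1, f2, d):
    """Total focal length of two thin lenses with focal lengths f1 and f2
    at a main-plane distance d, from the paraxial formula
    1/f = 1/f1 + 1/f2 - d/(f1*f2), rearranged to f = f1*f2/(f1+f2-d)."""
    return (f1 * f2) / (f1 + f2 - d)
```

For d=0 two identical lenses halve the focal length; moving the lenses apart lengthens the combined focal length.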

(44) Finally, a numerical example will be shown: Objective focal length f=19 mm Aperture D=20 mm (diameter of lens 1).fwdarw.k=D/f=1.05) First lens F2 glass: f1=29.8 mm, center thickness 4 mm, aspherically convex-plano Second lens F2 glass: f2 21.6 mm, center thickness 5 mm, spherically convex-plano Lens distance d=14.8 mm; distance of second lens from the image plane: 4.2 mm Image field angle α=±15.4° Spot diameter 20 μm (=approx. 1 mrad)