SENSOR FOR CONTROLLING AN AUTOMATIC DOOR
20210011160 · 2021-01-14
Assignee
Inventors
CPC classification
G01S17/58
PHYSICS
G01S17/42
PHYSICS
E05Y2400/44
FIXED CONSTRUCTIONS
G01S7/4802
PHYSICS
International classification
G01S17/42
PHYSICS
G01S7/481
PHYSICS
Abstract
The invention relates to a sensor (10) for controlling an automatic door, where the sensor (10) comprises a laser scanner (12) for detecting the presence of an object with at least one laser curtain (22, 32, 34) in a predefined detection area of the scanning field, where the sensor (10) comprises a distance data acquisition unit (13) that is embodied to acquire the distances of the points of reflection of the reflected signal by evaluating the time of flight, and a presence detection unit (15), where the distance data acquisition unit (13) forwards the distance data to the presence detection unit (15), where the presence detection unit (15) evaluates whether an object is detected within the predefined detection area by analysing the distance data, and where presence detection information is created and fed to at least one sensor output port (18, 18b). The invention is characterized in that the sensor further comprises an object information unit (11) comprising a human body identification unit (16), where the object information unit (11) receives the distance data and the human body identification unit (16) uses the distance data to determine whether the detected object is a human body, where the object information unit (11) creates object information that is fed to the at least one output port (18, 18a).
Claims
1-15. (canceled)
16. Sensor (10) for controlling an automatic door, comprising: a laser scanner (12) for detecting the presence of an object with at least one laser curtain (22, 32, 34) in a predefined detection area of the scanning field; a distance data acquisition unit (13) that is embodied to acquire the distances of the points of reflection of the reflected signal by evaluating the time of flight; a presence detection unit (15), where the distance data acquisition unit (13) forwards the distance data to the presence detection unit (15), the presence detection unit (15) evaluates whether an object is detected within the predefined detection area by analysing the distance data, and presence detection information is created and fed to at least one sensor output port (18, 18b); and an object information unit (11) comprising a human body identification unit (16), where the object information unit (11) receives the distance data, the human body identification unit (16) uses the distance data to determine whether the detected object is a human body, and the object information unit (11) creates object information that is fed to the at least one output port (18, 18a).
17. Sensor according to claim 16 characterized in that the object information unit (11) comprises a counting unit to count the number of human bodies detected by the human body identification unit (16), so that counting information is fed to the at least one output port (18, 18a).
18. Sensor according to claim 16, characterized in that the laser scanner (12) generates multiple laser curtains (32, 34) and the object information unit comprises a motion detection unit to recognize the moving direction of an object.
19. Sensor according to claim 16, characterized in that the output port (18, 18a, 18b) is a physical or wireless port.
20. Sensor according to claim 16 characterized in that the presence information and the object information are fed to the same output port (18).
21. Sensor according to claim 16, characterized in that a first output port (18b) is dedicated to the presence information and a second output port (18a) is dedicated to the object information.
22. Sensor according to claim 20 characterized in that the presence information has a higher priority than the object information.
23. Sensor according to claim 16 characterized in that it comprises a computational unit (14) that is able to execute a method of human body recognition.
24. Method of human body recognition according to claim 23, characterized by analysing a detected object (P) in a monitored area and deciding whether or not the detected object is a human being with a laser scanner (12), comprising: the laser scanner (12) generates at least one laser curtain (22, 32, 34), where each laser curtain (22, 32, 34) is generated by multiple pulses evaluated by time-of-flight (TOF) measurement of single pulses to generate the distance of the points of reflection with respect to the laser scanner position; combining the distances of the points of reflection with the direction of the pulse to retrieve a position in a predefined detection zone within a monitored area; projecting the points of reflection belonging to a detected object into an evaluation plane (EP) as evaluation objects (O1, O2), where the evaluation plane (EP) has a Z-axis that is related to the height and an axis perpendicular to the Z-axis that is related to the width in the direction of the lateral extension of the laser curtain (22, 32, 34), where the evaluation plane (EP) is evaluated based on the density distribution of the points of reflection along the Z-axis and the evaluation result is compared to anthropometric parameters.
25. Method according to claim 24 characterized in that the anthropometric parameters are human body measures and/or human body proportions.
26. Method according to claim 24, characterized in that the points of reflection belonging to an evaluation object (O1, O2) are evaluated based on the density distribution over height, from which a head height (H1) and a shoulder height (H2) are accordingly derived, and the anthropometric parameter is the head height (H1) to shoulder height (H2) ratio, which is compared to a predefined range for a human body.
27. Method according to claim 24 characterized in that the head height (H1) and shoulder height (H2) are derived by evaluating the peaks (24, 26) of the density distribution.
28. Method according to claim 24, characterized in that the evaluation plane (EP) is evaluated based on the density distribution over height, where a head width (W1) and a shoulder width (W2) are derived by taking the widths (W1, W2) at the peaks of the corresponding density distribution.
29. Method according to claim 28 characterized in that the anthropometric parameter is head width (W1) to shoulder width (W2) ratio, which is compared to a predefined range for human body proportion.
30. Method according to claim 24, characterized in that the points of reflection are time-integrated over an acquisition period.
Description
[0057] Throughout the description, the claims and the drawings, those terms and associated reference signs will be used as are notable from the enclosed list of reference signs.
[0072] Furthermore, the processing unit 14 comprises a distance determination unit 13 that employs TOF to determine the distance of a point of reflection. This distance information is fed to the presence detection unit 15, which determines whether the point of reflection was caused by an object in a critical area. Furthermore, the processing unit 14 comprises a direction determination unit 17 that is enabled to derive the motion direction of a human body or an object. Preferably the evaluation unit 16 and the direction determination unit 17 are grouped in the object information unit 11 so that both pieces of information can be merged and communicated to the output port 18a.
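The TOF evaluation and the combination of distance with pulse direction described above can be sketched as follows. This is an illustrative model only, not the patented implementation; the downward-pointing scanner geometry and the function names are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(round_trip_time_s: float) -> float:
    """Distance of the point of reflection from the round-trip
    time of flight of a single pulse (out and back, hence /2)."""
    return C * round_trip_time_s / 2.0

def to_curtain_position(distance_m: float, beam_angle_rad: float):
    """Combine the distance with the pulse direction to obtain a
    (width, height) position in the plane of the laser curtain.
    Assumption: the scanner sits at the origin and scans downward,
    with angle 0 pointing straight down."""
    w = distance_m * math.sin(beam_angle_rad)   # along the width axis W
    z = -distance_m * math.cos(beam_angle_rad)  # along the Z-axis (height)
    return (w, z)
```

A 20 ns round trip, for instance, corresponds to a point of reflection roughly 3 m from the scanner.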
[0073] The laser scanner of the embodiment according to
[0074] According to this setup the evaluation unit 16 receives the data of the point of reflection with regard to the laser scanner.
[0075] The evaluation unit 16 then analyses the points of reflection according to the invention, as will be further described with the following figures, and as a result outputs a signal containing information on whether or not a detected object is a human body.
[0076]
[0077] In contrast to the example of
[0078] A further difference is shown in
[0079] The manner in which the distance data are forwarded is independent of whether a common output port or separate output ports are used. These aspects can therefore be combined on demand.
[0080]
[0081] The evaluation unit of the sensor 20 is set up in such a way that it evaluates an evaluation plane EP that matches the laser curtain 22. The evaluation plane EP therefore has a Z-axis in the vertical direction and the same width axis W as the laser curtain 22.
[0082]
[0083] According to the invention, the evaluation unit 16 now computes a density distribution along the Z-axis of the evaluation plane EP, in which two peaks are supposed to be derivable.
[0084] If there is e.g. only one peak, the measurement is discarded and the evaluation object is not identified as a human body.
[0085] If there are two peaks 24, 26, as would be the case when detecting a human body, the positions H1, H2 of the peaks on the Z-axis are taken. The first peak 24 is assumed to provide the overall height H1 of the object, being the head when viewing the human body, and the second peak 26 is supposed to be the shoulder height H2 of a person. The ratio of the overall height H1 and the shoulder height H2 is compared to a range of predefined human body proportions. Furthermore, the head height (the distance between the shoulder height and the overall height, H1−H2) may be taken into account as well, as human body proportions change with age.
[0086] Accordingly, it is not necessary to limit the measurement to a minimum height that might exclude children from detection, as children can be identified by the evaluation described above.
[0087] Within the evaluation plane EP, the width W2 of the shoulders can be determined at the position H2 of the second density peak 26. In the area of the first peak 24, the width W1 of the head can be determined. These further parameters allow a more precise evaluation of the object with regard to human body recognition.
[0088]
[0089] The laser scanner of the human recognition sensor 30 derives the position of the points of reflection of the detected object relative to the laser scanner, where the evaluation unit projects them into the evaluation plane EP as evaluation objects.
[0090] The persons P, when moving through the laser curtains 32, 34, produce points of reflection during an acquisition period.
[0091] As described in
[0092] In this time-width plane the present points of reflection are clustered into time objects TO_1, TO_2, TO_3. This is done using the DBSCAN algorithm.
[0093] The four detected objects passing the laser curtain during the acquisition period in this case lead to the definition of three time objects TO_1, TO_2, TO_3.
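The patent names DBSCAN for this clustering step. As a self-contained sketch, a minimal DBSCAN over points in the time-width plane might look as follows; the `eps` and `min_samples` values are illustrative assumptions, and a production system would more likely use an existing implementation (e.g. scikit-learn's `DBSCAN`).

```python
import numpy as np

def dbscan(points, eps=0.3, min_samples=4):
    """Minimal DBSCAN: cluster points of reflection in the time-width
    plane into time objects (TO_1, TO_2, ...). Returns one integer
    label per point; -1 marks noise."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    # neighborhoods within radius eps (pairwise Euclidean distances)
    dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    neighbors = [np.flatnonzero(dists[i] <= eps) for i in range(n)]
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_samples:
            continue  # already handled, or not a core point
        # grow a new cluster (time object) from core point i
        seeds = list(neighbors[i])
        while seeds:
            j = seeds.pop()
            if not visited[j]:
                visited[j] = True
                if len(neighbors[j]) >= min_samples:
                    seeds.extend(neighbors[j])  # j is core: expand
            if labels[j] == -1:
                labels[j] = cluster
        cluster += 1
    return labels
```

Two dense groups of points well separated in time then yield two time objects, while isolated points of reflection are rejected as noise.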
[0094] An enlarged view of the time object TO_2 shows that there could be more than one detected object within the time object TO_2.
[0095] The evaluation unit is further configured to take the points of reflection of each time object and project them into the evaluation plane EP, as shown in
[0096] In a next separation step the evaluation unit assigns the points of reflection of each time object TO_1, TO_2, TO_3 to objects.
[0097] This is done by analysing the evaluation plane EP from the top to the bottom and assigning each point to an evaluation object.
[0098] The determination of single evaluation objects O1 is done by the evaluation unit, where the evaluation plane EP contains all points of reflection of the time object TO_2. The evaluation plane EP is parsed by a neighbor zone 40 from the top to the bottom of the evaluation plane EP. Once a point or points of reflection are newly present in the neighbor zone 40, all the points of reflection within the neighbor zone 40 are taken into account and the newly present point of reflection is assigned to an evaluation object; e.g. see
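The patent does not fully specify the geometry of the neighbor zone 40, but the top-to-bottom parsing can be read as a greedy assignment: sweep the points of reflection downward along the Z-axis and attach each newly present point to an existing evaluation object whenever it lies within a neighbor radius of one of that object's points, otherwise start a new object. The following is a hedged sketch under that interpretation; the circular neighbor zone and its radius are assumptions.

```python
import math

def separate_objects(points, neighbor_radius=0.25):
    """Separate the points of reflection of one time object into
    evaluation objects (O1, O2, ...) by parsing the evaluation
    plane EP from top to bottom. Each point is a (w, z) pair;
    a point joins an existing object if it is within the assumed
    neighbor radius of any of that object's points, otherwise it
    starts a new object."""
    objects = []  # each evaluation object is a list of (w, z) points
    for w, z in sorted(points, key=lambda p: p[1], reverse=True):
        for obj in objects:
            if any(math.hypot(w - ow, z - oz) <= neighbor_radius
                   for ow, oz in obj):
                obj.append((w, z))
                break
        else:
            objects.append([(w, z)])
    return objects
```

Two laterally separated columns of points then come out as two evaluation objects, matching the case where one time object TO_2 contains more than one detected object.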
[0099] As a result
[0100] Each object in this evaluation plane as shown in
[0101] According to a further improvement of the invention, the evaluation unit may be enabled to analyse the moving direction of objects. This enables the human recognition sensor to provide direction information with the object information. This allows, for example, counting how many people entered or left a building, or performing the counting internally and providing only the net count on the output port.
[0102] The moving direction is analysed by comparing the accumulated points of reflection of the two curtains 32, 34 over a short period of time, e.g. 500 ms. The points of reflection are projected into a time-width plane, in which the mathematical center of gravity of the present points of reflection is determined for each curtain.
[0103] According to the shift of the center of gravity, indicated by the cross in
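The center-of-gravity comparison can be sketched as follows. This assumes one plausible reading of the shift: each point of reflection is a (time, width) pair in the time-width plane, and an object that appears earlier (smaller mean time) under the first curtain 32 is moving from curtain 32 toward curtain 34. The direction labels are illustrative.

```python
import numpy as np

def moving_direction(points_32, points_34):
    """Compare the mathematical centers of gravity of the points of
    reflection accumulated over a short window (e.g. 500 ms) for the
    first curtain 32 and the second curtain 34, each point given as
    (t, w) in the time-width plane. The shift of the center of
    gravity along the time axis indicates the moving direction."""
    cog_32 = np.mean(np.asarray(points_32, dtype=float), axis=0)
    cog_34 = np.mean(np.asarray(points_34, dtype=float), axis=0)
    if cog_32[0] < cog_34[0]:
        return "from curtain 32 toward curtain 34"
    if cog_32[0] > cog_34[0]:
        return "from curtain 34 toward curtain 32"
    return "undetermined"
```

Accumulating per-curtain direction results over time would then support the entry/exit counting described in paragraph [0101].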
LIST OF REFERENCE SIGNS
[0104] 10 human recognition sensor
[0105] 11 object information unit
[0106] 12 laser scanner
[0107] 13 distance determination unit
[0108] 14 processing unit
[0109] 15 presence detection unit
[0110] 16 evaluation unit
[0111] 17 direction determination unit
[0112] 18a output port
[0113] 18b output port
[0114] 20 human recognition sensor
[0115] 22 laser curtain
[0116] 24 peak
[0117] 26 peak
[0118] 30 human recognition sensor
[0119] 32 first laser curtain
[0120] 34 second laser curtain
[0121] 44 center of gravity
[0122] 46 center of gravity
[0123] TO_1 time object
[0124] TO_2 time object
[0125] TO_3 time object
[0126] O1 object
[0127] O2 object
[0128] EP evaluation plane
[0129] P person
[0130] M moving direction
[0131] Z Z-Axis
[0132] W width axis