PERSONAL IDENTIFICATION FOR MULTI-STAGE INSPECTIONS OF PERSONS
20170365118 · 2017-12-21
CPC classification: G07C9/37 (PHYSICS)
Abstract
The present invention relates to a multi-stage control system comprising at least one control device at a first location and at least one follow-up control point at a second location with a follow-up control device. For the automatic inspection of a person for hidden objects, the control device according to the invention comprises an inspection device for contactless inspection of the person. Said control device is configured to determine a follow-up control area of the person, to store data defining the follow-up control area in a data set, to generate a unique identification feature for the person on the basis of a detected external feature of the person, and to allocate the identification feature to the data set of the person. The follow-up control device comprises a display device for displaying a graphical representation of a person, wherein the display device is designed to display a follow-up control area of the person in a visually recognizable manner for finding hidden objects in accordance with a data set allocated to the person. The follow-up control device can be configured to generate the unique identification feature for the person on the basis of a detected feature of the person. Alternatively, the follow-up control device can be designed to display an identification feature, in particular a recording, more particularly a recording of the facial view of the person, which is allocated to the data set of a follow-up control area at another control point, for visually verifying whether the data set is associated with the person. The invention further relates to a corresponding control method, to a corresponding follow-up control method and to a corresponding multi-stage control method.
Claims
1. A screening device (12; 12a; 12b), which for automatically screening a person (10.1) for concealed objects has an inspection device (16) for contactless inspection of the person (10.1), the inspection device being configured for determining a follow-up screening area (13; 13.1, 13.2, 13.3) of the person (10.1) and storing data defined by the follow-up screening area in a data set, characterized in that the screening device (12; 12a; 12b) is configured for generating, based on a detected external feature of the person (10.1), a unique identification feature for the person (10.1), and associating it with the data set of the person (10.1).
2. The screening device (12; 12a; 12b) according to claim 1, wherein the screening device has or is connected to at least one detection unit (32.1, 34.1, 34.2, 34.3) for at least one biometric feature of the person (10.1) as the basis for the identification feature, the biometric feature being in particular at least one of the following: the papillary pattern of a finger of the person (10.1), the hand geometry and/or the palm lines on a hand of the person (10.1), the iris structure and/or the retina structure of at least one eye of the person (10.1), the voice of the person (10.1), an image, preferably of the face, of the person (10.1), the gender of the person (10.1), the build of the person (10.1), the height of the person (10.1), or the estimated weight of the person (10.1).
3. The screening device (12; 12a; 12b) according to claim 1 or 2, wherein the screening device has or is connected to at least one image acquisition unit as a detection unit (34.1, 34.2, 34.3), the image acquisition unit being configured for generating a facial photograph of the person (10.1) as an identification feature.
4. The screening device (12; 12a; 12b) according to one of claims 1 to 3, wherein the inspection device is configured for i) scanning the person to be screened (10.1) with X-rays or electromagnetic millimeter waves and generating a backscatter image of the bodily surface of the person (10.1), or ii) scanning the person to be screened (10.1) with X-rays and generating a transmission image of the person (10.1).
5. The screening device (12; 12a; 12b) according to one of claims 1 to 4, wherein the screening device is configured for generating for the person (10.1), based on features of the person (10.1) detected by means of the inspection device (16), the identification feature, preferably the gender of the person (10.1), the build of the person (10.1), the height of the person (10.1), or the estimated weight of the person (10.1).
6. A follow-up screening device (26.1, 26.2, 26.3) having a display device (20.2, 20.3, 20.4) for displaying a graphical representation (11.1, 11.2) of a person (10.1), the display device (20.2, 20.3, 20.4) being configured for displaying, corresponding to a data set associated with the person (10.1), a follow-up screening area (13; 13.1, 13.2, 13.3) of the person (10.1) in a visually recognizable manner in order to find concealed objects, characterized in that the follow-up screening device (26.1, 26.2, 26.3) is configured for generating, based on a detected feature of the person (10.1), a unique identification feature for the person (10.1).
7. A follow-up screening device (26.1, 26.2, 26.3) having a display device (20.2, 20.3, 20.4) for displaying a graphical representation (11.1, 11.2) of a person (10.1), the display device (20.2, 20.3, 20.4) being configured for displaying, corresponding to a data set associated with the person (10.1), a follow-up screening area (13; 13.1, 13.2, 13.3) of the person (10.1) with regard to concealed objects, in a visually recognizable manner, characterized in that the follow-up screening device (26.1, 26.2, 26.3) is configured for displaying an identification feature, in particular a photograph, particularly preferably a facial photograph, of the person (10.1), which is associated with the data set of a displayed follow-up screening area (13; 13.1, 13.2, 13.3) at another control point (12), for visually verifying that the data set is associated with the person (10.1).
8. A multi-stage screening system having at least one screening device (12; 12a; 12b) according to one of claims 1 to 5 at a first location, and at least one follow-up screening point at a second location, having at least one follow-up screening device (26.1, 26.2, 26.3) according to claim 6 or 7.
9. A screening method with automatic screening (S10) of a person (10.1) for concealed objects by means of a contactless inspection method, a follow-up screening area (13; 13.1, 13.2, 13.3) of the person (10.1) being determined and the data that define the follow-up screening area being stored in a data set, characterized by detection (T1; T2; T3) of an external feature of the person (10.1); generation (S14) of a unique identification feature for the person (10.1) that is based on the external feature; and association (S16) of the identification feature with the data set.
10. The screening method according to claim 9, wherein at least one biometric feature of the person (10.1) is detected for the detection of an external feature of the person (10.1).
11. The screening method according to claim 9 or 10, wherein during the automatic screening (S10), i) the person to be screened (10.1) is scanned with X-rays or electromagnetic millimeter waves and a backscatter image of the bodily surface of the person (10.1) is generated, or ii) the person to be screened (10.1) is scanned with X-rays and a transmission image of the person (10.1) is generated.
12. The screening method according to one of claims 9 to 11, wherein the identification feature for the person (10.1) is generated based on features of the person (10.1) that are detected by means of the contactless inspection method.
13. A follow-up screening method for follow-up screening of a person (10.1) in order to find concealed objects, with display (S20) of a graphical representation of the person (10.1), and display of a visually recognizable follow-up screening area (13; 13.1, 13.2, 13.3) on the graphical representation, corresponding to a data set that is associated with the person (10.1), characterized by detection (S23) of an external feature of the person (10.1) and generation (S25) of a unique identification feature for the person (10.1), based thereon.
14. A follow-up screening method for follow-up screening of a person (10.1) for concealed objects, with display (S20) of a graphical representation of the person, and display of a visually recognizable follow-up screening area on the graphical representation, corresponding to a data set that is associated with the person, characterized by display (S22) of an identification feature, in particular a photograph, particularly preferably a facial photograph, of the person (10.1), which is associated with the data set of a displayed follow-up screening area at another control point, and visual verification (S24) that the data set is associated with the person (10.1).
15. A multi-stage screening method having at least one first stage that includes a screening method according to one of claims 9 to 12, and at least one second stage that includes a follow-up screening method according to one of claims 13 to 14.
Description
[0061] Within the scope of the security check 1, the carry-on luggage and possibly the shoes of passengers (not illustrated) are screened in a known manner. The carry-on luggage, jackets, shoes, and other objects carried by the passenger are generally inspected by X-ray. In the process, objects and substances inside the carry-on luggage, for example, are made visible, and screening is conducted for concealed or hidden dangerous contents such as weapons, prohibited objects, or hazardous substances.
[0062] In addition to the screening of the carry-on luggage described above, the passengers themselves are likewise screened for hazardous or prohibited objects which the passenger carries in concealment on his/her body or in the clothing.
[0063] For this purpose, a passenger 10.1 is initially subjected to an automatic screening for concealed objects at a first location, by means of a first screening device 12. Additional passengers 10.2, in particular at peak traffic periods, wait in a waiting line 14 in front of the particular first screening device 12 until their turn comes.
[0064] In
[0065] The screening device 12 is generally attended by at least one security agent 18.1. Inspection information concerning the person 10.1 currently being screened in the screening device 12 is displayed to the security agent 18.1 via a display device 20.1 that is integrated into the screening device 12 (or alternatively, operatively connected thereto).
[0066] The screening device 12 automatically carries out the inspection of the person 10.1, wherein all inspection information generated concerning the person 10.1 is automatically evaluated on the device by program control, i.e., by means of appropriate software programs. If the result of the automatic screening of the person 10.1 shows that, based on the security threshold required by applicable regulations, it may be assumed that the screened person is not carrying any hazardous or prohibited objects in concealment, the security agent 18.1 is signaled on the display unit 20.1, for example by means of a green screen, that the person is not of special interest. The person 10.1 may then, for example, walk on the path 22 into the secured area. The person 10.1 may also pick up his/her screened baggage (if present) beforehand, and, retaining the example of the airport, proceed to the appropriate departure gate in the secured area.
[0067] However, if the screening device 12 establishes, based on the automatic screening of the person 10.1, that the person 10.1 is possibly carrying hidden objects concealed on the body under the clothing or in the clothing, an appropriate alarm notification is displayed to the security agent 18.1 on the display unit 20.1. Accordingly, the person 10.1 must be subjected to a follow-up screening by a security agent 18.2, 18.3, or 18.4. That is, the passenger 10.1 must walk on the path 24 to the appropriate follow-up screening devices 26.1, 26.2, 26.3. At that location the passenger 10.1 is then subjected to a follow-up screening, for example with manual scanning, accompanied by a bodily inspection, to rule out the presence of hazardous or prohibited objects on the person 10.1.
[0068] As discussed above, the follow-up screening takes place at a second location that is different from the first location of the screening device 12, so as not to adversely impact the throughput at the first screening device 12 due to the follow-up screenings.
[0069] Since the follow-up screening of a person 10.1 generally takes considerably more time than the automatic screening with the first screening device 12, a waiting line 28 with passengers 30.1, 30.2 to be subjected to a follow-up screening may also form in front of the follow-up screening devices.
[0070] In principle, a person 10.1 or 30.1, 30.2 classified as “of special interest” by the first screening device 12 could be subjected to a full bodily follow-up screening at one of the follow-up screening devices 26.1, 26.2, 26.3. However, this would unnecessarily increase the time for the follow-up screening, and in addition the findings already obtained by use of the first screening device 12 would go unused. For this reason, the screening device 12 displays on the display unit 20.1, on a graphical representation of the screened person 10.1, the bodily area or optionally the bodily areas for which the automatic inspection device has acquired information concerning the presence of concealed objects. One example of such a graphical display is discussed in greater detail below in conjunction with
[0071] One problem with regard to an efficient screening process is ensuring that the screening information obtained at the control point 12 concerning the automatically inspected person 10.1, which serves as the basis for the follow-up screening by a security agent 18.2, 18.3, 18.4 waiting at the follow-up screening devices 26.1, 26.2, 26.3, is transmitted in such a way that it is associated at that location with the correct person.
[0072] For this reason, a display unit 20.2, 20.3, 20.4 is likewise provided at each follow-up screening device 26.1, 26.2, 26.3, on which the follow-up screening areas determined by the first screening device 12 and associated with the person who is to undergo a follow-up screening, for example 30.1 or 30.2, are displayed to the security agent 18.2, 18.3, 18.4 present at that location. That is, the data defining the follow-up screening area of a person 10.1, 30.1, 30.2 who is to undergo a follow-up screening have been stored in a respective data set at the first screening device 12. If these data are relayed, merely in the chronological order of their detection, to whichever follow-up screening device 26.1, 26.2, 26.3 becomes free, it is ensured that the person associated with the particular data set goes to the “correct” follow-up screening device 26.1, 26.2, 26.3 only as long as no waiting line 28 forms on the path to that location. If, however, a waiting line 28 has formed, i.e., if at least two persons 30.1, 30.2 who are to undergo a follow-up screening are waiting in front of the follow-up screening devices 26.1, 26.2, 26.3, it is possible, as a result of the persons 30.1, 30.2 intentionally or unintentionally exchanging places in the waiting line 28, that a person who is to undergo a follow-up screening does not go to the follow-up screening device 26.1, 26.2 to which the data defining the follow-up screening areas associated with that person have been transmitted. In the prior art, this possible security gap is closed, for example, by appropriate staffing to avoid waiting lines.
[0073] In principle, a plurality of measures may be taken, for example to denote a person 10.1 of special interest at the first screening device 12 with an identification feature, so that this identification feature, as an assignment criterion, could be used in a largely tamper-proof manner at one of the follow-up screening devices 26.1, 26.2, 26.3. For example, a tamper-proof band on which a machine-readable barcode is present as an identification feature could be printed out at the first screening device 12. This band could be affixed to the wrist of the person 10.1, so that, based on this identification feature, the person 10.1 may be uniquely identified at one of the follow-up screening devices 26.1, 26.2, 26.3. Based on the identification feature, the correct data set that defines the associated follow-up screening area could also be retrieved. However, this approach is not optimal. First of all, producing and affixing such a security band would mean an additional material and time expenditure. Additional malfunctions may occur with extra technical equipment. In addition, it cannot be ruled out that such a physical identification feature could still be intentionally exchanged between persons 30.1, 30.2 waiting in a line.
[0074] A different approach is therefore proposed below. To this end, it is provided that the screening device 12 is additionally configured for generating a unique identification feature for the person 10.1, based on a detectable external feature of the person 10.1. The identification feature is then associated with the data set of the person 10.1 which determines the follow-up screening area of the person 10.1, as has been determined by the screening device 12 based on the contactless inspection method.
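For illustration only, the generation and association step described above can be sketched in Python. All names are hypothetical; deriving the identification feature by hashing the raw detected feature is merely one conceivable scheme, which the document does not prescribe:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ScreeningRecord:
    """Data set defining the follow-up screening areas of one person."""
    follow_up_areas: list
    identification_feature: str = ""

def generate_identification_feature(external_feature):
    # Hypothetical scheme: derive a unique identifier by hashing the raw
    # detected external feature (e.g. a fingerprint template or an image).
    return hashlib.sha256(external_feature).hexdigest()

def associate(record, external_feature):
    # Attach the generated identification feature to the person's data set.
    record.identification_feature = generate_identification_feature(external_feature)
    return record

record = associate(ScreeningRecord(["left hip"]), b"raw-biometric-template")
```

The data set can then travel to a follow-up screening device keyed by this identifier.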
[0075] The detectable external feature of the person 10.1 is preferably at least one biometric feature of the person 10.1 which directly represents the identification feature or is used as the basis for generating same. For this purpose, the screening device 12 may be equipped with an appropriate detection unit or be connected to same.
[0076] The following are particularly suitable here as externally detectable biometric features: the papillary pattern of a finger of the person (fingerprints), the hand geometry and/or the palm lines on a hand of the person 10.1, the iris structure and/or the retina structure of at least one eye of the person 10.1, or the voice of the person 10.1. These biometric features may be detected at the screening device 12 using a detection unit 32.1, known to those skilled in the art, for the biometric feature in question. For example, the detection unit 32.1 may be a scanner for fingerprints or palm lines, an optical detector for scanning the iris structure and/or the retina structure of an eye, or an acoustic recording device for recording voices.
[0077] The screening device 12 is configured for generating the unique identification feature for the person 10.1 based on the detected external feature, i.e., the biometric feature of the person 10.1, and for associating it with the data set that determines the follow-up screening area for the person 10.1.
[0078] According to this variant, appropriate detection units 32.2 through 32.4 are likewise provided at the follow-up screening devices 26.1 through 26.3. Correspondingly, the follow-up screening devices 26.1 through 26.3 are also configured for generating the corresponding unique identification feature for the person 10.1 based on the detected external biometric feature of the person 10.1.
[0079] Based on the identification feature, the follow-up screening device 26.1 through 26.3 may then verify that a follow-up screening area displayed on the particular display unit 20.2 through 20.4 is associated with the person who is approaching for the follow-up screening.
[0080] Alternatively, it is possible for the follow-up screening device 26.1 through 26.3 to be configured so that it identifies, based on the identification feature, the associated data set at a data source and then retrieves it from that location. The data source may be, for example, a central data server 40 at the control point 1, or may be the first screening device 12 itself. The data server 40 for storing the data sets (screening data) may be situated spatially remotely from the control point 1, and may be connected thereto in a known manner via a computer network (LAN and/or WAN).
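The retrieval variant just described can be sketched as follows; the class and method names are hypothetical stand-ins for the central data server 40 (or the first screening device 12 acting as data source):

```python
class DataServer:
    """Hypothetical stand-in for the central data server 40: data sets
    (screening data) are stored keyed by the identification feature."""

    def __init__(self):
        self._records = {}

    def store(self, ident, data_set):
        # Called by the first screening device 12 after association.
        self._records[ident] = data_set

    def retrieve(self, ident):
        # Called by a follow-up screening device: identify the associated
        # data set by the identification feature and retrieve it.
        return self._records.get(ident)

server = DataServer()
server.store("id-123", {"follow_up_areas": ["left shoulder"]})
data_set = server.retrieve("id-123")
```

In practice the lookup would go over the LAN/WAN connection mentioned in the text; a plain dictionary suffices to show the keying principle.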
[0081] The approach explained in conjunction with the detection units 32.1 through 32.4 reduces the staffing necessary for avoiding waiting lines, but requires additional material expenditure in the form of the mentioned detection units 32.1 through 32.4. In addition, the detection of the mentioned biometric features poses an acceptance problem for the affected persons, since they must essentially trust that the detected biometric data will in fact be deleted after use.
[0082] Therefore, in one particularly preferred embodiment, an image of the person 10.1 automatically inspected in the first screening device 12, in particular an image of the face of the person 10.1, is generated as a biometric feature and used as the identification feature. For this purpose, an appropriate image acquisition unit, for example at least one of the cameras 34.1, 34.2, 34.3, is provided at the first screening device 12.
[0083] An image acquisition unit, as a camera 34.1, may be integrated into the screening device 12 or be mounted on same, so that the person 10.1 may be photographed before, during, or after the automatic screening by means of the contactless inspection device.
[0084] Alternatively or additionally, an image acquisition unit in the form of the camera 34.2 may be set up, in the open or concealed, in the area of the waiting line 14. By use of the camera 34.2, an appropriate image of the person may already be acquired before entry into the screening device 12.
[0085] Alternatively or additionally, a mobile camera 34.3 may be used as the image acquisition unit. In the example in
[0086] Alternatively, the security agent 18.1 could hand the smart phone or the tablet computer with the integrated camera 34.3 to the person 10.1. The person 10.1 can then take a facial photograph (selfie) of him/herself.
[0087] Using an image of the face of the person 10.1 as a unique identification feature for the person 10.1 additionally greatly simplifies the method described above. Here as well, the screening device 12 is configured for associating the image of the face of the person 10.1 as an identification feature with the data set that determines the follow-up screening area of the person 10.1, and for transmitting same to one of the follow-up screening devices 26.1 through 26.3 or to a central data server 40 as a data source for the data sets. The data server 40 is connected in a known manner to the screening device 12 and to the follow-up screening devices 26.1 through 26.3 via a computer network (LAN and/or WAN), and may therefore be situated in the area of the control point 1 or spatially remotely from same.
[0088] For example, the screening device 12 may transmit the data set to whichever follow-up screening device 26.1 through 26.3 is indicated to the system as available. In the example in
[0089] Alternatively, the unoccupied follow-up screening device 26.3 could retrieve the next data set of a person who is to undergo a follow-up screening 10.1 from the screening device 12 or the central data server 40. The screening device 12 may then display the follow-up screening area for the person 10.1, which is determined by the retrieved data set, together with the image of the face of the person 10.1, as an identification feature on the display unit 20.4.
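The pull variant of the preceding paragraph, in which the next data set is handed out in order of detection to whichever follow-up screening device becomes free, can be sketched with a simple queue (all names hypothetical):

```python
from collections import deque

class ScreeningDispatcher:
    """Hypothetical dispatcher: data sets wait in the order of their
    detection; a follow-up screening device that becomes free pulls
    the next one."""

    def __init__(self):
        self._pending = deque()

    def push(self, data_set):
        # Called by the first screening device after a data set is complete.
        self._pending.append(data_set)

    def next_for_free_device(self):
        # Called by an unoccupied follow-up screening device.
        return self._pending.popleft() if self._pending else None

dispatcher = ScreeningDispatcher()
dispatcher.push({"person": "30.1", "areas": ["hip"]})
dispatcher.push({"person": "30.2", "areas": ["ankle"]})
first = dispatcher.next_for_free_device()
```

The facial photograph travels inside each data set, so the security agent at the pulling device can visually match the displayed identification feature to the next person in line even if the queue order and the waiting line diverge.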
[0090] In both cases, the security agent 18.4 at the follow-up screening device 26.3 may thus immediately identify or recognize the passenger in the waiting line 28 who is to be subjected to a follow-up screening by that security agent, and accept that passenger for the follow-up screening.
[0091] In one particular refinement of the system, the first screening device 12 is configured for generating the identification feature for the person based on the inspection data concerning the inspected person 10.1 that have been detected by means of the contactless inspection device 16. It is likewise possible for the first screening device 12 to be configured for determining, based on these inspection data, one or more of the following features as an identification feature, as an alternative or in addition to a representation or an image of the person: the gender of the person 10.1, the build of the person 10.1, the height of the person 10.1, or the estimated weight of the person 10.1.
[0092] For example, an image of the face of the person 10.1 may be used as an identification feature. An image of the face may be generated, for example, based on surface scanning of the person 10.1, using reflected X-rays or millimeter waves. With appropriate quality, for example the graphical representation of the person 10.1, which is displayed on the display unit 20.1 or one of the display units 20.2 through 20.4 at one of the follow-up screening devices 26.1 through 26.3, may be displayed with an image of the face of the person 10.1.
[0093] Alternatively or additionally, one or more of the following features may be used as an identification feature: the gender of the person 10.1, the build of the person 10.1, the height of the person 10.1, or the estimated weight of the person 10.1.
[0094] Accordingly, the graphical representation (avatar) of the person 10.1, which is displayed on the display unit 20.1 or one of the display units 20.2 through 20.4 at one of the follow-up screening devices 26.1 through 26.3, may be appropriately modified, i.e., adapted according to the identification features used. The avatar as a line drawing could be depicted, for example, corresponding to the person as a thin or heavyset man. In addition, the avatar could also be adapted to the height of the person; for example, a measuring bar could be depicted next to the avatar.
[0095] It is thus possible once again for the security agents 18.2 through 18.4 at the follow-up screening devices 26.1 through 26.3 to immediately identify a person who is to undergo a follow-up screening.
[0098] To protect the privacy of the inspected person 10.1, the representation of the results of the screening operation takes place automatically and anonymously. That is, in the course of the automatic screening, the screening device 12a automatically identifies the position of concealed objects by means of the specialized data processing algorithms, and displays same on the display unit 20.1*, on an avatar as a graphical representation of the person 10.1*. The positions 13* marked on the graphical representation correspond to bodily areas, which as follow-up screening areas of the person 10.1* are to be screened once again for concealed objects in a follow-up screening.
[0099] The display content of the display unit 20.1* is illustrated in enlarged form at the left in
[0100] As explained in conjunction with
[0101] Alternatively or additionally, a mobile image acquisition unit in the form of a smart phone or a tablet computer with an integrated camera 34.3* may be provided at the screening device 12a. The security agent 18.1* may then capture a facial image of the person 10.1* as an identification feature, using the mobile camera 34.3*, if necessary (for example, in the event of an ATR alarm). Alternatively, it may be possible for the person 10.1* to use the mobile camera 34.3* to take a facial photograph (selfie) of him/herself. The mobile camera 34.3* may be connected to the screening device 12a via a known short-range radio link, for example Bluetooth or NFC. However, a cable connection via Universal Serial Bus (USB) is also possible.
[0103] As likewise indicated in
[0104] It is noted that the screening devices 12a and 12b explained by way of example in
[0106] A visual marking may comprise, for example, a colored identification of a follow-up screening area. A security agent receives from the representation for the follow-up screening direct information concerning which bodily areas of the person are to be subjected to a follow-up screening. Thus, the information already detected by means of the screening device 12 or 12a, 12b concerning possibly concealed objects may be taken into account during the follow-up screening.
[0107] At each of the follow-up screening devices 26.1, 26.2, 26.3 (
[0109] After the method starts, a person is screened for concealed objects by means of a contactless inspection method in a step S10.
[0110] In step S10, the person to be screened is scanned, for example with X-rays or electromagnetic millimeter waves, in order to generate a backscatter image of the bodily surface of the person as the basis for an automatic detection of concealed objects. It is also conceivable to scan the person to be screened with penetrating X-rays, and on this basis to generate a transmission image of the person as the basis for the automatic detection of concealed objects on or in the person.
[0111] If (in a step S11) areas to be subjected to a follow-up screening for the person are determined during this screening due to the fact that the automatic inspection method has found possible concealed objects on the person, the method continues to step S12.
[0112] Data defining the follow-up screening area are stored in a data set in step S12.
[0113] The method subsequently goes to step S14, in which a unique identification feature for the person is generated, based on a detected external feature of the person.
[0114] The method subsequently goes to step S16, in which the generated identification feature is associated with the data set that determines the follow-up screening area of the person.
[0115] The method then goes to a step S20 as the interface with another screening method, in which a more accurate screening of the follow-up screening areas determined for the person takes place.
[0116] If (in step S11) it has been established that no follow-up screening area was determined during the contactless inspection of the person, this means that the person has automatically been classified by the system as unobjectionable. The screening method then ends for this person and goes to step END.
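The flow of steps S10 through S16 described above can be condensed into a short sketch. The injected callables are assumptions standing in for the inspection device, the detection unit, and the feature generator; the document does not prescribe an implementation:

```python
def screening_method(person, inspect, detect_feature, make_ident):
    """Sketch of steps S10 through S16. The callables are assumptions:
    inspect(person)        -> list of follow-up screening areas (S10/S11)
    detect_feature(person) -> raw external feature (times T1 through T3)
    make_ident(feature)    -> unique identification feature (S14)
    """
    areas = inspect(person)            # S10: contactless inspection
    if not areas:                      # S11: no follow-up area determined,
        return None                    # person classified as unobjectionable
    data_set = {"areas": areas}        # S12: store data defining the areas
    feature = detect_feature(person)
    data_set["ident"] = make_ident(feature)  # S14/S16: generate and associate
    return data_set                    # handed on to the follow-up stage (S20)

result = screening_method("passenger", lambda p: ["left hip"],
                          lambda p: b"feature", lambda f: "id-1")
clean = screening_method("passenger", lambda p: [],
                         lambda p: b"feature", lambda f: "id-2")
```

Returning `None` on the clean path mirrors the method ending at step END for an unobjectionable person.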
[0117] A step for detecting an external feature of the person may in principle be integrated into the method at times T1, T2, or T3; i.e., the detection of an external feature of the person as the basis for the unique identification feature may in principle take place before, during, or after the automatic screening of the person. If the external feature has been detected before or during the automatic screening, it may be immediately deleted as soon as it is established that no follow-up screening area has been determined for the person.
[0118] In one particular refinement it may be provided that in step S12, for example, the basis for the unique identification feature for the person is generated based on features of the person that are detected by means of the contactless inspection method in step S10. As already explained elsewhere, for example an image of the face of the person, which has been acquired by scanning with reflected X-rays or millimeter waves, could be used as a unique identification feature for the person.
[0120] According to a first alternative A1 of the method, the method goes to a step S21, in which the same external feature of the person is detected as when the screening method in
[0121] The unique identification feature for the person, based on the detected external feature, is generated in a next step S23, likewise corresponding to the method in
[0122] In a subsequent step S25, it is verified that a follow-up screening area, displayed on the display device, is associated with a person present there who is to undergo a follow-up screening. For this purpose, the unique identification feature generated at the follow-up screening point is checked for agreement with the identification feature that is associated with a displayed follow-up screening area. Alternatively, by use of the identification feature that is generated at the follow-up screening point, the associated data set, which determines the follow-up screening area for the person, may be identified at a data source and retrieved therefrom. The data source may be the central data server 40 (
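Both variants of step S25, verification against the displayed data set and retrieval from the data source, can be sketched as follows (names hypothetical; a plain dictionary stands in for the central data server 40 or the screening device 12):

```python
def verify_or_retrieve(local_ident, displayed_data_set, data_source):
    """Sketch of step S25. Either verify that the displayed data set
    carries the identification feature generated at the follow-up
    screening point, or, if none is displayed, identify and retrieve
    the matching data set at the data source."""
    if displayed_data_set is not None:
        if displayed_data_set.get("ident") == local_ident:
            return displayed_data_set  # match: follow-up screening may proceed
        return None                    # mismatch: alarm case of step S27
    return data_source.get(local_ident)

source = {"id-7": {"ident": "id-7", "areas": ["ankle"]}}
match = verify_or_retrieve("id-7", {"ident": "id-7", "areas": ["ankle"]}, source)
mismatch = verify_or_retrieve("id-8", {"ident": "id-7", "areas": ["ankle"]}, source)
fetched = verify_or_retrieve("id-7", None, source)
```

A `None` result corresponds to the mismatch case handled in step S27.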
[0123] If it is established in a step S27 that the follow-up screening area displayed on the display device (20.2, for example) at the follow-up screening device (26.1, for example) does not match the person (30.2, for example) who is to undergo a follow-up screening, the person (30.2) in question may have exchanged places with another person (30.1, for example) in the waiting line 28 ahead of the follow-up screening points 26.1, 26.2, 26.3. Accordingly, a corresponding alarm notification can be expected at another follow-up screening point (26.2, for example) as well. In this specific situation, which is only an example, the two persons (30.1, 30.2) must switch follow-up screening points (26.1, 26.2), and the method in each case then begins once more at step S20.
[0124] If it has been established in step S27 that the person who is to undergo a follow-up screening is associated with the displayed follow-up screening area, the appropriate follow-up screening of the person may take place in a subsequent step S30.
[0125] According to an alternative embodiment A2 of the follow-up screening method, after step S20 the method goes to step S22.
[0126] In step S22, together with a displayed follow-up screening area, an identification feature associated with it at a prior control point (for example, the screening device 12) is displayed, in particular a photograph, particularly a facial photograph, of the person.
[0127] In a subsequent step S24, it is verified by visual means whether the displayed follow-up screening area is associated with the person who is present at the follow-up screening point, by comparing the displayed identification feature against the person.
[0128] If the security agent determines a match in a step S26, the method likewise goes to step S30, in which the follow-up screening of the person may take place corresponding to the displayed follow-up screening areas.
[0129] If the visual comparison shows that the person who is present at the follow-up screening point does not match the displayed identification feature, in a step S28 the person is led to one of the other follow-up screening points, analogously to step S29 in alternative A1. At that location the method likewise begins again at step S20.
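The two alternatives A1 and A2 converge on the same decision: perform the follow-up screening (step S30) on a match, otherwise redirect the person and restart at step S20. That branch structure can be sketched as follows (Python; all names are hypothetical illustrations of the described steps, not part of the specification):

```python
def follow_up_decision(detected_feature, data_set, visual_check=None):
    """Sketch of alternatives A1/A2 of the follow-up screening method.

    A1 (visual_check is None): the identification feature regenerated
    from the person (steps S21/S23) is compared automatically with
    the stored one (steps S25/S27).
    A2: a security agent compares the displayed photograph with the
    person (steps S24/S26); the boolean `visual_check` models that
    human decision.
    """
    if visual_check is None:   # alternative A1: automatic comparison
        matched = detected_feature == data_set["id_feature"]
    else:                      # alternative A2: visual verification
        matched = visual_check
    if matched:
        return "S30"  # perform the follow-up screening
    return "S20"      # redirect to another follow-up screening point
```

In both alternatives a mismatch sends the person to another follow-up screening point, where the method begins again at step S20.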
[0130] Lastly, it is noted that the screening method explained above and the follow-up screening method may be combined to form a corresponding multi-stage screening method.
[0131] In summary, a multi-stage screening system having at least one screening device at a first location and at least one follow-up screening point at a second location, having a follow-up screening device, has been proposed herein. For automatically screening a person for concealed objects, the screening device has an inspection device for contactless inspection of the person, and is configured for determining a follow-up screening area of the person and storing data defined by the follow-up screening area in a data set, and for generating, based on a detected external feature of the person, a unique identification feature for the person and associating same with the data set of the person. The follow-up screening device has a display device for displaying a graphical representation of a person, the display device being configured for displaying in a visually recognizable manner a follow-up screening area of the person in order to find concealed objects, corresponding to a data set that is associated with the person. The follow-up screening device may be configured for generating the unique identification feature for the person, based on a detected feature of the person. Alternatively, the follow-up screening device may be configured for displaying an identification feature, in particular a photograph, particularly preferably a facial photograph, of the person, which is associated with the data set of a follow-up screening area at another control point, for visual verification that the data set is associated with the person. Moreover, the invention relates to a corresponding screening method, a corresponding follow-up screening method, and a corresponding multi-stage screening method.