ATTENDANCE RECORDING TERMINAL WITH USER-SPECIFIC INTERACTION

20230196300 · 2023-06-22

    Abstract

    An attendance recording terminal, in particular a time recording terminal, for recording attendance information of a person on the basis of an action of the person includes a control unit, a memory management unit having a master record of personal data, and an identification engine configured for identifying the person and for outputting identification information. The control unit is configured for interacting with the memory management unit in such manner that it checks in the master record, depending on the identification information, what type of interaction the identified person wants with the attendance recording terminal. Furthermore, the control unit is configured in such manner that a user-specific type of interaction is provided for the identified person on the attendance recording terminal and/or on a mobile access medium of the person. A corresponding computer-implemented method for operating an attendance recording terminal is also disclosed.

    Claims

    1. An attendance recording terminal for recording attendance information of a person (40) on the basis of an action of the person (40), the attendance recording terminal comprising: a control unit (11), a memory management unit (20), wherein the memory management unit (20) comprises at least one master record of personal data, and an identification engine (111), wherein the identification engine (111) is configured for identifying the person (40) and for outputting identification information, wherein the control unit (11) is configured for interacting with the memory management unit (20) such that the control unit checks in the master record, depending on the identification information, as to what type of interaction the identified person (40) wants with the attendance recording terminal (10), and wherein the control unit (11) is configured such that a user-specific type of interaction is provided for the identified person (40) on the attendance recording terminal (10), wherein an intent engine (112) is further configured for recording an intent of the interaction of the person (40) with the attendance recording terminal (10), wherein the intent engine (112) is configured such that, in order to record the intent of the interaction of the person (40) with the attendance recording terminal (10), a signal input by the person (40) is used, wherein the intent engine (112) is further configured such that the signal input takes place as follows: the intent engine (112) receives a signal which is triggered by the person (40) by means of a movement of a mobile access medium of the person (40) and which represents and/or confirms the intent of the interaction.

    2. The attendance recording terminal (10) according to claim 1, wherein at least one sensor unit (12) is provided, wherein the sensor unit (12) is configured for monitoring a surrounding zone (30) and for outputting presence information of the person (40) detected from the monitoring of the surrounding zone (30).

    3. The attendance recording terminal (10) according to claim 1, wherein a display is also provided on the attendance recording terminal (10) and/or the attendance recording terminal (10) is configured to be in communicative connection with a mobile access medium of the person (40) comprising a display, and wherein the type of interaction includes at least the following aspect: providing a user-specific user interface on the display of the attendance recording terminal (10) and/or of the mobile access medium of the person (40).

    4. The attendance recording terminal (10) according to claim 3, wherein the user-specific user interface is provided in a user-specific font size and/or in a user-specific language.

    5. The attendance recording terminal (10) according to claim 1, wherein a/the display is provided on the attendance recording terminal (10) and/or the attendance recording terminal (10) is configured to be in communicative connection with a/the mobile access medium of the person (40) comprising a/the display, and wherein the type of interaction includes at least the following aspect: outputting a voice signal at the attendance recording terminal (10) and/or on the mobile access medium of the person (40), wherein the voice signal is provided in a user-specific language and/or at a user-specific volume.

    6. The attendance recording terminal (10) according to claim 1, wherein an intent engine (112) is also configured for recording an intent of the interaction of the person (40) with the attendance recording terminal (10), wherein the intent engine (112) is configured such that, in order to record the intent of the interaction of the person (40) with the attendance recording terminal (10), a signal input by the person (40) is used, wherein, in particular, the intent engine (112) is further configured such that the signal input takes place as follows: the intent engine (112) receives a signal which is input by the person (40) on a/the mobile access medium of the person (40), and which represents the intent of the interaction; and/or the intent engine (112) receives a voice input from the person (40) representing the intent of the interaction or recognizes a gesture input from the person (40) representing the intent of the interaction, and/or the intent engine (112) processes a signal which is input by the person (40) at the attendance recording terminal (10) and/or on a/the mobile access medium of the person (40), and which represents the intent of the interaction.

    7. The attendance recording terminal (10) according to claim 1, wherein a feedback engine (113) is also provided, wherein the feedback engine (113) is configured such that when the interaction of the person (40) with the attendance recording terminal (10) takes place or is requested, a feedback signal is output at the attendance recording terminal (10) and/or on a/the mobile access medium of the person (40), wherein the feedback signal is designed in at least one of the following ways: as an acoustic feedback signal; and/or as a voice signal; and/or as a haptic feedback signal; and/or as a graphic signal.

    8. A computer-implemented method for operating an attendance recording terminal for recording attendance information of a person (40) on the basis of an action of the person (40), the method including the following steps: identifying the person (40) through an identification engine (111), checking a master record of personal data as to what type of interaction the identified person (40) wants with the attendance recording terminal (10), and providing at least one user-specific type of interaction for the identified person (40), wherein the following is provided as a user-specific type of interaction with the attendance recording terminal (10): performing a predefined movement with a mobile access medium of the person (40), which represents and/or confirms the intent of the interaction, wherein the movement comprises placing the mobile access medium of the person (40) twice or once again on an interface of the attendance recording terminal (10).

    9. The computer-implemented method according to claim 8, wherein, in order to identify the person (40), a surrounding zone (30) is first monitored with at least one sensor unit (12).

    10. The computer-implemented method according to claim 8, wherein the type of interaction includes at least the following aspect: providing a user-specific user interface on a display of the attendance recording terminal (10) and/or a mobile access medium of the person (40).

    11. The computer-implemented method of claim 10, wherein the user-specific user interface is provided in a user-specific font size and/or in a user-specific language.

    12. The computer-implemented method according to claim 8, wherein the type of interaction includes at least the following aspect: outputting a voice signal at the attendance recording terminal (10) and/or on a/the mobile access medium of the person (40), wherein the voice signal is provided in a user-specific language and/or at a user-specific volume.

    13. The computer-implemented method according to claim 8, wherein a desired interaction with the attendance recording terminal (10) is selected and/or confirmed by the person (40) in at least one of the following ways: performing a predefined movement with a/the mobile access medium of the person (40), once again placing the mobile access medium of the person (40), on an interface of the attendance recording terminal (10); and/or performing an input on the attendance recording terminal (10) and/or on a/the mobile access medium of the person (40); and/or performing a predefined gesture input; and/or performing a predefined voice input.

    14. The computer-implemented method according to claim 8, wherein the attendance recording terminal (10) and/or a/the mobile access medium of the person (40) outputs a feedback signal about an interaction with the attendance recording terminal (10) that has been carried out or requested, wherein the feedback signal is output in at least one of the following ways: as an acoustic feedback signal; and/or as a voice signal; and/or as a haptic feedback signal; and/or as a graphic signal.

    15. A computer program comprising commands which, when the program is executed by a processor of an attendance recording terminal (10), cause the attendance recording terminal (10) to perform the steps of the method according to claim 8.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0093] Further advantageous and preferred configurations emerge from the following description with reference to the figures. In the drawings, which only show exemplary embodiments,

    [0094] FIG. 1 shows a schematic configuration of an exemplary embodiment of the proposed attendance recording terminal;

    [0095] FIG. 2 shows a flow chart of the method steps of a proposed computer-implemented method for operating an attendance recording terminal;

    [0096] FIG. 3 shows a schematic configuration of an exemplary embodiment of an identification engine of the proposed attendance recording terminal;

    [0097] FIG. 4 shows a schematic plan view of an arrangement of a proposed attendance recording terminal in a corridor; and

    [0098] FIG. 5 shows a schematic configuration of an exemplary embodiment of a control unit of the proposed attendance recording terminal.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0099] The configuration of an exemplary embodiment of a proposed attendance recording terminal 10 is shown schematically in FIG. 1. The attendance recording terminal 10 is a time recording terminal which serves the purpose of enabling time recording by employees, for example at the entrance to a company building. Specifically, different clocking processes can be carried out at the time recording terminal and thus different attendance information can be stored or processed. For example, an employee can clock his/her attendance in or out when he/she takes a break or when his/her working day is over.

    [0100] The attendance recording terminal 10 is accordingly configured to record attendance information of a person 40 on the basis of an action of the person 40. As shown in FIG. 1, the attendance recording terminal 10 comprises at least one control unit 11 and also a memory management unit 20. The memory management unit 20 can be provided locally and, for example, be an integrated part of the attendance recording terminal 10, or the memory management unit 20 can also be provided online in the cloud. This is indicated in FIG. 1 by the dashed lines. The attendance recording terminal 10 can then establish an online connection to the memory management unit 20, for example via the control unit 11, and in this way communicate with the memory management unit 20.

    [0101] Master records of the people 40 who interact with the attendance recording terminal 10 are stored in the memory management unit 20. Personal data, for example employee-specific information such as working hours, can be stored in the master records. The personal data is available to the attendance recording terminal 10 or its control unit 11 via a communication connection between the memory management unit 20 and the control unit 11. The memory management unit 20 also processes and stores the attendance information of the person 40 as intended, for example that they have now come to work and their working time recording should begin.

    [0102] Furthermore, the attendance recording terminal 10 that is shown and is preferred in this respect has a sensor unit 12. The sensor unit 12 of the attendance recording terminal 10 monitors a surrounding zone 30, which surrounding zone 30 surrounds the attendance recording terminal 10 (cf. also FIG. 4). Accordingly, the sensor unit 12 receives signals from the surrounding zone 30, as indicated by the solid arrow pointing from the surrounding zone 30 to the sensor unit 12 in FIG. 1.

    [0103] The sensor unit 12 can have different units for monitoring the surrounding zone 30. In the exemplary embodiment of FIG. 1, the sensor unit 12 has the following four units: a camera 121, a biometric sensor 122, a proximity sensor 123 and a mobile access engine 124. In this case, the sensor unit 12 can also have only one of the units or different combinations of the units. The provision of two units of the same type, such as for example two cameras 121, is also possible. A plurality of separately operating sensor units of different configurations can also be provided.
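    As a non-binding illustration, the combinable units of the sensor unit 12 described above can be sketched as a composition of interchangeable detectors. This is a minimal Python sketch under stated assumptions; the class and detector names are illustrative and not part of the disclosure.

```python
# Illustrative sketch: a sensor unit composed of any combination of
# detector callables (camera, biometric sensor, proximity sensor,
# mobile access engine). Duplicates of the same detector type are
# allowed, mirroring paragraph [0103].
class SensorUnit:
    def __init__(self, *detectors):
        self.detectors = detectors

    def poll(self):
        """Return the first detected presence information, or None if
        no unit detects a person in the surrounding zone."""
        for detect in self.detectors:
            presence = detect()
            if presence is not None:
                return presence
        return None
```

    Each detector is any callable returning presence information or None, so a camera, a proximity sensor, or a mobile access engine can be plugged in interchangeably.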

    [0104] The presence of the person 40 in the surrounding zone 30 is first generally recorded via the mentioned units of the sensor unit 12. As a result of this, the sensor unit 12 can output detected presence information of the person 40. Specifically, for example, the camera 121 directed at the surrounding zone 30 monitors the corresponding room, for example a corridor surrounding the attendance recording terminal 10 (identified with reference numeral 300 in FIG. 4), and recognizes by means of image processing as soon as the person 40 enters the surrounding zone 30.

    [0105] The biometric sensor 122 of the sensor unit 12 can in turn detect a direct action of the person 40 at the sensor unit 12 and thus at the attendance recording terminal 10. For this purpose, for example, the person 40 can perform a fingerprint scan on the biometric sensor 122 or trigger another signal.

    [0106] The proximity sensor 123 of the sensor unit 12 can be designed in various embodiments and can detect the presence of the person 40 in the surrounding zone 30 in a contactless manner.

    [0107] The mobile access engine 124 (mobile access device) of the sensor unit 12 records the presence of the person 40 in the surrounding zone 30 via a communication exchange between the mobile access engine 124 and the person 40, specifically a mobile access medium of the person 40. The mobile access medium of the person 40 can be an access card, such as a key card. A keyless transponder (badge) can also be used, or also a mobile device of the person 40, which mobile device has a specific application for communicating with the mobile access engine 124.

    [0108] The mobile access engine 124 of the sensor unit 12 is configured in such manner that a communication exchange, indicated in FIG. 1 by the dashed double arrow, takes place between the mobile access engine 124 and the mobile access medium of the person in order to be able to record the presence of the person 40 and to output the detected presence information.

    [0109] For example, the keyless transponder (badge) or the access card of the person 40 can also be equipped with RFID or also with UWB technology (ultra-wideband). A corresponding RFID or UWB module of the mobile access engine 124 then records when the person 40 enters the surrounding zone 30. This function can also be implemented via an application on the mobile device of the person 40.

    [0110] The mobile access engine 124 can be implemented as software in the attendance recording terminal 10 or in the sensor unit 12 or comprise software via which software the communication exchange between the person 40 and the sensor unit 12 is evaluated and, if necessary, the detected presence information is generated and output.

    [0111] If the sensor unit 12 observes, for example in one of the ways described, that a person 40 is in the surrounding zone 30 or is entering the surrounding zone 30, then the sensor unit 12 can output detected presence information of the person 40.

    [0112] Furthermore, as shown in FIG. 1, the attendance recording terminal 10 has an identification engine 111 (identification device). This identification engine 111, for example integrated in the control unit 11, can identify the person 40 on the basis of that detected presence information of the person 40. In this case, the identification engine 111 is configured to output identification information, which identification information can be clearly representative of a respective person 40 or in any case can clearly assign that person 40 to a predefined group of people. For example, for some interactions it may not be absolutely necessary to uniquely identify the specific person 40, rather it may be sufficient to simply assign them to a group of people. For example, it can be recognized that the person 40 basically belongs to a “permitted” group of people, such that the person 40 can carry out an interaction as long as the interaction does not require a more in-depth, unambiguous identification.
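    The distinction drawn in paragraph [0112] between uniquely identifying a person and merely assigning them to a permitted group can be sketched as follows. All names, the badge lookup, and the visitor-prefix rule are assumptions made for illustration only.

```python
# Hedged sketch: identification may yield a unique person ID or only a
# group assignment (e.g. "permitted"), as described in [0112].
from dataclasses import dataclass
from typing import Optional

@dataclass
class Identification:
    person_id: Optional[str]   # set only when the person is uniquely known
    group: str                 # always set, e.g. "permitted" or "unknown"

def identify(presence_info: str, known_badges: dict) -> Identification:
    """Map detected presence information to identification information.
    known_badges maps badge tokens to person IDs (illustrative)."""
    person_id = known_badges.get(presence_info)
    if person_id is not None:
        return Identification(person_id, "permitted")
    # No unique match: the person may still belong to a permitted group,
    # e.g. any token with a visitor prefix (assumption for illustration).
    if presence_info.startswith("visitor-"):
        return Identification(None, "permitted")
    return Identification(None, "unknown")
```

    An interaction that needs no in-depth identification can then proceed whenever `group == "permitted"`, even if `person_id` is empty.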

    [0113] The identification engine 111 can be implemented as software in the attendance recording terminal 10 or in the control unit 11 or can comprise software, via which software detected presence information of the person 40 or other personal information can be evaluated.

    [0114] For example, if a person 40 was basically recognized by the sensor unit 12, then this person can be specifically identified by means of the identification engine 111. The aforementioned identification of the person 40 can be based, for example, on the detected presence information of the person 40 and/or on at least one other item of personal information. The information used for identification purposes does not have to include any personal data; it only has to make the person 40, or a person, clearly distinguishable from other people.

    [0115] In the example of a camera 121, the detected presence information can be a photo of the person 40. However, only the signal that the camera 121 has captured a person could also be made available to the control unit 11 or the identification engine 111 as simple detected presence information. The specific photo of the person 40 could be made available as other personal information, on the basis of which the specific identification then takes place.

    [0116] The identification engine 111 is also configured to verify the person 40 and to output verification information. Verification here means one level of detail beyond simple identification. The person 40 can be recognized per se or assigned to a group of people via the identification described. Verification, however, additionally checks whether the identified person 40 is a real person or whether there is a deception, for example because an attempt is being made at a deliberately incorrect clocking process. The person 40 is verified on the basis of the detected presence information of the person 40 and/or on the basis of the other personal information.

    [0117] For this purpose, the identification engine 111 can include an anti-spoofing engine 111a, as can be seen in the exemplary embodiment in FIG. 3, which shows an identification engine 111 in detail. The anti-spoofing engine 111a, which can be implemented as software in the identification engine 111 or can comprise software, can be used to distinguish between real people and photos of a person that are merely held in front of the camera 121. Such an anti-spoofing engine 111a is particularly advantageous if cameras 121 are used as sensor units 12 in order to prevent a deliberately incorrect clocking process at the attendance recording terminal 10.

    [0118] The identification engine 111 is therefore either provided with detected presence information or other personal information for evaluating and identifying the person 40, or the identification engine 111 can also detect corresponding information itself. For this purpose, as shown in the exemplary embodiment of the identification engine 111 in FIG. 3, the identification engine 111 can also comprise at least one of the following units: at least one camera 111b, at least one biometric sensor 111c, and/or at least one mobile access engine 111d. The functionalities of the individual units have already been described above in connection with the sensor unit 12, to which reference can be made. It should be explicitly mentioned that the mobile access engine 111d of the identification engine 111 is configured in such manner that, in order to identify the person 40, a communication exchange takes place between the mobile access engine 111d and a mobile access medium (badge, access card, mobile device or application on a mobile device or the like) of the person 40.

    [0119] In principle, to reduce individual units, provision can also be made for the identification engine 111 to be able to access the units of the sensor unit 12 (camera 121, biometric sensor 122, mobile access engine 124) directly or to be provided with corresponding information from them.

    [0120] Furthermore, provision can also be made for the sensor unit 12 and the control unit 11 or the identification engine 111 to be an integral part of the attendance recording terminal 10 and, for example, together with the corresponding sensors, to be installed together in a housing of the attendance recording terminal 10. However, it can also be advantageous to arrange the sensor unit 12 or a part of the sensor unit 12, for example the camera 121, outside of the attendance recording terminal 10 at a favorable point in the surrounding zone 30, while the identification engine 111, for example with its camera 111b, is an integral part of the attendance recording terminal 10. Then, using the sensor unit 12, a person can advantageously initially be detected further away from the attendance recording terminal 10, while the person 40 is identified by means of the identification engine 111 on the basis of the information generated at the camera 111b on the attendance recording terminal 10 itself.

    [0121] For the purpose of a user-friendly interaction with the attendance recording terminal 10, the generally heterogeneous group of people 40 interacting with the terminal can be provided with different types of interaction. As proposed, it is advantageous for the control unit 11 to be configured for interacting with the memory management unit 20 in such manner that it is checked in the master record, depending on the identification information, which type of interaction the identified person 40 wants with the attendance recording terminal 10. In addition, the control unit 11 is configured in such manner that a user-specific type of interaction is provided for the identified person 40 on the attendance recording terminal 10 and/or on the mobile access medium of the person 40.

    [0122] The corresponding proposed computer-implemented method for operating the attendance recording terminal 10 for recording attendance information of a person 40 on the basis of an action of the person 40 comprises the following steps: [0123] identifying the person 40 through the identification engine 111, [0124] checking a master record of personal data as to what type of interaction the identified person 40 wants with the attendance recording terminal 10, and [0125] providing a user-specific type of interaction for the identified person 40.

    [0126] An exemplary embodiment of the proposed method is shown in FIG. 2, wherein, within the context of step S101, the surrounding zone 30 is first fundamentally monitored with the sensor unit 12. If a person 40 is detected, then this person is identified in step S102 using the identification engine 111. In the course of the subsequent step S103, a specific check is then carried out with regard to the identified person 40 in the master record of the personal data as to whether a specific type of interaction is desired (step S103a). If the result of this check is “yes”, then the desired user-specific type of interaction is provided for the person 40 as part of step S104. However, if the result of the check is “no”, then a predefined default type of interaction is provided (step S105).
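    The S101 to S105 flow of FIG. 2 can be sketched as a single monitoring cycle. This is an illustrative Python sketch only; the function names, the injected callables, and the dictionary-based master record are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the S101-S105 flow from FIG. 2.
DEFAULT_INTERACTION = "default"

def run_terminal_cycle(detect_person, identify, master_record):
    """One monitoring cycle: detect (S101), identify (S102), check the
    master record (S103/S103a), provide an interaction (S104/S105)."""
    presence = detect_person()                 # S101: monitor surrounding zone
    if presence is None:
        return None                            # nobody detected; keep monitoring
    person_id = identify(presence)             # S102: identification engine
    preferred = master_record.get(person_id)   # S103: check master record
    if preferred is not None:                  # S103a: specific type desired?
        return preferred                       # S104: user-specific interaction
    return DEFAULT_INTERACTION                 # S105: predefined default type
```

    For example, with `master_record = {"emp-7": "voice"}`, an identified `"emp-7"` is offered voice control, while any other identified person falls back to the default type of interaction.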

    [0127] One type of interaction includes, for example, providing a user-specific user interface on a display of the attendance recording terminal 10 and/or the mobile access medium of the person 40. For this purpose, the attendance recording terminal 10 can have a display and/or the attendance recording terminal 10 can be configured to be in communicative connection with the mobile access medium of the person 40 comprising a display.

    [0128] The user-specific user interface is or will then be particularly advantageously provided in a user-specific font size and/or in a user-specific language for the type of interaction individually adapted to the preferences of the interacting person 40.

    [0129] Additionally or alternatively, it is proposed that the type of interaction includes at least the aspect that a voice signal is output on the attendance recording terminal 10 and/or on the mobile access medium of the person 40. In this case, the voice signal is or will preferably be provided in a user-specific language and/or at a user-specific volume.

    [0130] It is thus possible for a person 40 to have stored “voice control” as the preferred type of interaction. This could also be the case, for example, if the person 40 has visual or reading difficulties or is illiterate. Then, after this person 40 has been identified by the terminal, voice control is provided as a user-specific interaction type. The various clocking options such as “Clock in”, “Clock out”, “Check account balance”, etc. can be read out on the terminal itself or on the mobile access medium, for example on the mobile device of the person 40. This can preferably be done in the preferred language of the interacting person 40. The person 40 can then confirm or reject the action that has been read out and thus suggested, for example by means of their own voice signal (such as a “yes” or also a “no”). It is also possible to confirm an input by a specific movement, such as a gesture, or by a movement of the mobile access medium, such as the mobile device or badge.
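    The voice-controlled dialog just described can be sketched as reading out each clocking option in the user's preferred language until one is confirmed. The phrase tables, the `speak`/`listen` callables, and the confirmation words are illustrative assumptions.

```python
# Hedged sketch of the voice-controlled clocking dialog from [0130]:
# options are read out in the user-specific language and confirmed or
# rejected by voice. Phrase tables are assumptions for illustration.
PROMPTS = {
    "en": ["Clock in", "Clock out", "Check account balance"],
    "de": ["Kommen buchen", "Gehen buchen", "Saldo abfragen"],
}
CONFIRM = {"en": "yes", "de": "ja"}

def voice_dialog(language, speak, listen):
    """Read out each option; return the first one the user confirms,
    or None if every suggested action is rejected."""
    for option in PROMPTS[language]:
        speak(option)                       # read the suggested action out
        if listen() == CONFIRM[language]:   # user confirms by voice
            return option
    return None
```

    A user who answers "no" to "Clock in" and "yes" to "Clock out" thereby selects the clock-out action without touching the terminal.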

    [0131] In the case of another person 40, in turn, a “silent interaction”, for example by means of an application on the mobile device, could be preferred. No voice signals are then output, but rather the person 40 is shown interaction options on the mobile device, which the person 40 can select and thus perform the clocking process.

    [0132] In principle, an intent engine 112 (intent recognition device) is configured and provided for recording an intent of the interaction of the person 40 with the attendance recording terminal 10. The intent engine 112 can be part of the control unit 11 and can be implemented as software in the attendance recording terminal 10 or in the control unit 11 or can comprise software, via which software specific information resulting from the actions of the person 40 can be evaluated.

    [0133] The intent engine 112 recognizes the intent of the person 40, i.e. generally that the person 40 would like to interact or interacts with the attendance recording terminal 10, or even specifically why. For this purpose, the intent engine 112 basically records the intent of an interaction of the person 40 and, for example, outputs intent information, which intent information represents the intent of the person 40 regarding the interaction. It can simply be the fact that the person 40 now wants to carry out an interaction, or it can also be the specific interaction itself, i.e. the specific intent of the interaction.

    [0134] The specific intent of the interaction can be, for example, that the person 40 wants to clock in because their working time is beginning, or that they want to clock out of the time recording system. Furthermore, the specific intent of the person 40 can also be to call up specific information stored in the master record. For example, the person 40 could want to inquire about their attendance time or working time completed in a certain predefined period of time. It can also be possible to use the attendance recording terminal 10 to find out to what extent there is still credit on an access card or a person-specific employee card if the access card or the person-specific employee card (or a badge) is also suitable for payment in a canteen or the like. The memory management unit 20 preferably comprises the master record of corresponding personal data.
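    The personal data mentioned above, such as completed working time and a canteen credit on the employee card, can be pictured as one master-record entry per person. The field and method names below are assumptions chosen for illustration, not the stored format of the memory management unit 20.

```python
# Illustrative master-record entry holding the personal data described
# in [0134]: preferred interaction type, recorded working time, and an
# optional canteen balance on the access/employee card.
from dataclasses import dataclass

@dataclass
class MasterRecord:
    person_id: str
    preferred_interaction: str = "default"
    worked_minutes: int = 0
    canteen_balance_cents: int = 0

    def book(self, minutes: int) -> None:
        """Add a completed attendance span to the working-time total."""
        self.worked_minutes += minutes
```

    Queries such as "attendance time in a predefined period" or "remaining card credit" would then read these fields instead of triggering a clocking process.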

    [0135] The intent engine 112 is preferably configured in such manner that, in order to record the intent of the interaction of the person 40 with the attendance recording terminal 10, a signal input by the person 40 is used. The signal input takes place as follows: [0136] the intent engine 112 receives a signal which is input by the person 40 on the mobile access medium of the person 40, in particular on the mobile device, and which represents the intent of the interaction; and/or [0137] the intent engine 112 receives a voice input from the person 40 representing the intent of the interaction or recognizes a gesture input from the person 40 representing the intent of the interaction, and/or [0138] the intent engine 112 processes a signal which is input by the person 40 at the attendance recording terminal 10 and/or on the mobile access medium of the person 40, preferably on the display, and which represents the intent of the interaction, and/or [0139] the intent engine 112 receives a signal which is triggered by the person 40 by means of a movement with the mobile access medium of the person 40 and which represents the intent of the interaction.
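    The four signal-input paths listed in paragraphs [0135] to [0139] can be sketched as one normalization step that turns each channel into a single intent value. The channel names and the phrase-to-intent mapping are illustrative assumptions.

```python
# Sketch of the intent engine's signal inputs from [0135]-[0139]: each
# input channel is normalized into one intent string or None.
def record_intent(signal):
    """signal: (channel, payload) tuple from one of the input paths."""
    channel, payload = signal
    if channel == "mobile_app":         # input on the mobile access medium
        return payload                  # the app transmits the intent directly
    if channel == "voice":              # recognized voice input
        return {"clock in": "clock_in", "clock out": "clock_out"}.get(payload)
    if channel == "terminal_display":   # input at the terminal itself
        return payload
    if channel == "medium_movement":    # movement of the access medium,
        return "confirm"                # e.g. placing the badge once again
    return None                         # unknown channel: no intent recorded
```

    The mobile-app path needs no interpretation step, which matches the remark in [0146] that such an input is unambiguous.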

    [0140] By means of the mentioned active inputs by the person 40, i.e. by performing: [0141] a predefined movement with the mobile access medium, in particular once again placing the mobile access medium of the person 40 (preferably a badge) on an interface of the attendance recording terminal 10; and/or [0142] an input on the attendance recording terminal 10 and/or on the mobile access medium of the person 40; and/or [0143] a predefined gesture input; and/or [0144] a predefined voice input

    [0145] the person 40 can actively communicate their intent to the terminal. In this way, the risk of incorrect clocking processes can also be reduced.

    [0146] Specifically, in the above-mentioned case of an active input of the person 40 on their mobile device, it is advantageous that the intent of the interaction can be unambiguously and clearly input by the person 40 without interpretation steps being required. For this purpose, the specific interactions with the attendance recording terminal 10 possible for the person can be displayed in an application on the mobile device of the person 40, particularly advantageously in the manner preferred by the person 40, and then selected by the person 40. The intent of the interaction is then recorded in the attendance recording terminal 10 by a corresponding communication of the mobile device with the attendance recording terminal 10 or specifically with the intent engine 112 and is provided, for example, as intent information and further processed.

    [0147] In the previously alternatively or additionally mentioned case that the signal input takes place via the voice input of the person 40 or via a gesture input of the person 40, a corresponding voice detection engine 112a (voice recognition device) or gesture detection engine 112b (gesture recognition device) is provided. The voice detection engine 112a and the gesture detection engine 112b can be implemented as software in the control unit 11 or in the intent engine 112, or comprise software, via which the incoming voice signal of the person 40 or the detected gesture or movement of the person 40 can be evaluated.

    [0148] For example, the person 40 can communicate their intent to interact with the attendance recording terminal 10 via a voice input, such as calling out the words “clock in”. If that voice signal is recognized by the voice detection engine 112a, the intent engine 112 then knows that an interaction is desired and also specifically which one. In the case of a gesture, provision can be made, for example, for the person 40 to perform a specific movement, which is recorded by the sensor unit 12, for example, so that the gesture detection engine 112b then recognizes this as an intent to clock in or an intent to clock out and generates the corresponding intent information. A gesture can also be that the person 40 performs a predetermined movement with their mobile access medium, which movement is then received wirelessly at the attendance recording terminal 10 and recognized by the gesture detection engine 112b as corresponding intent information.

    [0149] In the case also mentioned above, the person 40 must input a signal, for example on the touchscreen display of the attendance recording terminal 10 itself, in order to express their intent. In this case, the possible corresponding intents can also be predefined by the attendance recording terminal 10 on the display. The intent engine 112 then processes that input signal as corresponding intent information.

    [0150] The surrounding zone 30 can be divided into a plurality of sub-zones, as can be seen in FIG. 1 and, for an example of an arrangement of an attendance recording terminal 10 in a corridor 300, also in FIG. 4. Thus, a near zone 31, which is arranged closest to the attendance recording terminal 10, and a distant zone 32, which is arranged furthest away from the attendance recording terminal 10, are provided. Between the near zone 31 and the distant zone 32 there is a middle zone 33.

    [0151] Advantageously, the sensor unit 12 specifically recognizes in which sub-zone a person 40 is located and, if necessary, outputs corresponding specific detected presence information of the person 40, specifically as a function of one or more zone-specific signals which originate from the person 40 in the mentioned sub-zones.

    [0152] In this case, the zone-specific signal does not have to be actively triggered by the person 40. Rather, the zone-specific signal is merely caused by the person 40, for example by the presence or by the type of advancing movement of the person 40 in the respective sub-zone. For example, a person 40 in the sub-zone or their movement can be tracked via camera tracking and a zone-specific signal can be generated from this. The zone-specific signal can thus be effected or caused by the simple presence or the movement of the person 40 in the sub-zone.

    [0153] The following circumstances can flow into the zone-specific signal and thus into the specifically detected presence information of the person 40:
    [0154] in which sub-zone the person 40 is located, and/or
    [0155] which sub-zone the person 40 is entering, and/or
    [0156] what a movement pattern of the person 40 looks like.

    [0157] The movement pattern of the person 40 can relate to a movement direction of the person 40 and/or a movement speed of the person 40. The movement direction can be represented by a direction vector over time. The movement pattern of the person 40 within a surrounding zone or sub-zone can be recorded and used as a basis for the analysis of the intent of the interaction. Using the information regarding the sub-zone which a person 40 is entering, a change of zone by a person 40 can be detected and can be used as a basis for the analysis.
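As an illustrative sketch only (the coordinate frame, sampling interval, and zone radii are assumed values, not part of the described terminal), the movement pattern and the sub-zone assignment could be derived from successive position samples of the sensor unit 12 as follows:

```python
import math

def movement_pattern(positions, dt=1.0):
    """Derive direction and speed from successive (x, y) position samples.

    positions: list of (x, y) samples of the person, one every dt seconds.
    Returns (heading, speed): heading as an angle in radians, i.e. the
    direction vector over the observed time window, and speed in units/s.
    """
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    elapsed = dt * (len(positions) - 1)
    dx, dy = x1 - x0, y1 - y0
    speed = math.hypot(dx, dy) / elapsed
    heading = math.atan2(dy, dx)
    return heading, speed

def zone_of(position, terminal_pos=(0.0, 0.0), near=1.0, middle=3.0, far=6.0):
    """Map a position to a sub-zone by distance from the terminal (radii assumed)."""
    d = math.dist(position, terminal_pos)
    if d <= near:
        return "near"
    if d <= middle:
        return "middle"
    if d <= far:
        return "distant"
    return "outside"
```

A change of zone can then be detected simply by comparing `zone_of` for two consecutive samples, which is one way the "which sub-zone is the person entering" circumstance could feed into the zone-specific signal.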

    [0158] The technical means for detecting the position, i.e. the presence or the movement of the person 40 in a sub-zone, for example, can advantageously be the sensor unit 12. The sensor unit 12 advantageously includes a camera 121.

    [0159] In this way, for example, individual engines or devices can be woken step by step or, in other words, the attendance recording terminal 10 can be put into different operating states depending on which sub-zone the person 40 is in or how the person 40 is moving for example in a sub-zone. For example, if the person 40 only enters the distant zone 32 and is basically detected there, but that person 40 does not approach the attendance recording terminal 10 any further and therefore does not enter the middle zone 33 or the near zone 31, it can be concluded that the person 40 does not want to interact with the attendance recording terminal 10 at all. Accordingly, the attendance recording terminal 10 can be operated in an energy-saving mode, for example, as long as the person 40 is not detected in the near zone 31 or at least not in the middle zone 33. Individual modules, which are basically necessary for the interaction of a person, can also be switched on step by step, as soon as the person 40 is detected in a predefined sub-zone.
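The step-by-step waking described above could be sketched, under the assumption of four hypothetical operating states, as a simple mapping from sub-zone to operating state:

```python
from enum import Enum

class Mode(Enum):
    ENERGY_SAVING = 0    # no person near enough to warrant activity
    BASIC_DETECTION = 1  # person detected in the distant zone only
    READY = 2            # middle zone: wake identification/intent engines
    INTERACTIVE = 3      # near zone: full interaction possible

# hypothetical assignment of sub-zones to operating states
ZONE_TO_MODE = {
    "outside": Mode.ENERGY_SAVING,
    "distant": Mode.BASIC_DETECTION,
    "middle": Mode.READY,
    "near": Mode.INTERACTIVE,
}

def operating_mode(zone):
    """Select the terminal's operating state from the detected sub-zone."""
    return ZONE_TO_MODE.get(zone, Mode.ENERGY_SAVING)
```

With such a mapping, a person who only crosses the distant zone never raises the terminal out of the basic-detection state, matching the energy-saving behavior described above.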

    [0160] The example shown in FIG. 4 of a corridor 300 having a T-junction illustrates the different sub-zones, represented by dashed partial circles, surrounding the attendance recording terminal 10 at different distances. The person 40 who is located in the near zone 31 interacts with the attendance recording terminal 10. The person 40a, in turn, is located in the distant zone 32, in which a basic detection of the person 40a takes place by means of the sensor unit 12 of the attendance recording terminal 10, but this detection does not yet require the attendance recording terminal 10 to operate. Only when the person 40a also enters the middle zone 33 or, depending on the desired configuration, the near zone 31, is a corresponding intent to interact recognized for the person 40a, so that a change to a more active operating state takes place, for example. The third person 40b, in turn, is located outside the entire surrounding zone 30 and is therefore not detected at all in the configuration shown, since it is clear that no interaction with the attendance recording terminal 10 is desired by the person 40b.

    [0161] At least one further control level or a plurality of control levels are also advantageously integrated into the attendance recording terminal 10. In this way, the person 40 can be given feedback about the clocking process carried out or about the requested interaction, i.e. about the change made or planned to the attendance information in the memory management unit 20, or also about the fact that no change was made.

    [0162] The memory management unit 20 is then advantageously configured in such manner that when the attendance information has been stored, the memory management unit 20 checks in the data of the person 40 in its master record of the personal data whether feedback is desired about the interaction of the person 40 with the attendance recording terminal 10.

    [0163] The attendance recording terminal 10 or the control unit 11 can also have a feedback engine 113 (feedback device) in addition to the identification engine 111 and intent engine 112, as shown in FIG. 5. The feedback engine 113 is configured in such manner that when the person 40 interacts or is requested to interact with the attendance recording terminal 10, a feedback signal can be output. The feedback signal can be output at the attendance recording terminal 10, for example on the display thereof or via loudspeakers, or also on the mobile access medium of the person 40. The feedback signal can be output as follows:
    [0164] as an acoustic feedback signal; and/or
    [0165] as a voice signal; and/or
    [0166] as a haptic feedback signal, in particular a vibration signal, preferably on the mobile access medium of the person 40; and/or
    [0167] as a graphic signal.

    [0168] The feedback signal can, for example, also include a request or user information with which an operator of the attendance recording terminal (e.g. employer) requests the person 40 who is interacting with the attendance recording terminal 10 as a user to do something or informs them about something. A request to the person 40 could be as follows: “Report to the HR department”, “Please reduce overtime” etc. Simple information could be given about the current time account balance.

    [0169] Provision can advantageously be made for sensitive, personal information to be output only on the mobile device of the person 40, while general and non-sensitive information is output only on the attendance recording terminal 10 or on the attendance recording terminal 10 and the mobile device.

    [0170] The feedback engine 113 can be implemented at least partially as software in the attendance recording terminal 10 or in the control unit 11, via which software it can be checked, on the one hand, using the memory management unit 20, whether the person 40 wants feedback and via which, on the other hand, corresponding feedback can be initiated. Accordingly, the feedback engine 113 or the control unit 11 can be in communicative connection with a voice output unit if, for example, a voice signal is desired as the feedback signal.

    [0171] It is particularly advantageous to check in the master record of the personal data beforehand whether the specific person 40 who is performing the interaction and who was previously identified wants feedback at all, and if so, in what form the feedback is wanted. The feedback engine 113 can therefore be configured in such manner that a user-specific feedback signal can be output. For example, a voice signal can be output in the respective language of the person 40. Furthermore, for example, in the case of a person 40 with a hearing impairment, provision could be made for this person 40 to receive a different signal as a feedback signal, for example a graphic or haptic signal.
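The user-specific feedback check could be illustrated as follows; the master-record schema (keys such as `feedback_desired` or `hearing_impaired`) is an assumption for this sketch, not a prescribed data model:

```python
def select_feedback_channel(master_record, person_id):
    """Look up in the master record whether and how the identified person
    wants feedback. Returns None if no feedback is desired, otherwise a
    dict describing the channel and language of the feedback signal."""
    prefs = master_record.get(person_id, {})
    if not prefs.get("feedback_desired", False):
        return None  # no feedback signal is output at all
    if prefs.get("hearing_impaired"):
        # prefer haptic feedback on the mobile access medium, else graphic
        channel = "haptic" if prefs.get("has_mobile_medium") else "graphic"
    else:
        channel = prefs.get("preferred_channel", "acoustic")
    return {"channel": channel, "language": prefs.get("language", "en")}
```

Because the decision is driven entirely by the person's stored data, a voice signal can be output in the person's own language, while a person with a hearing impairment automatically receives a graphic or haptic signal instead.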

    [0172] In order to increase the security of a correct interaction between the person 40 and the attendance recording terminal 10, an interaction confirmation engine 114 (interaction confirmation device) is also provided in the attendance recording terminal 10 or the control unit 11 in the exemplary embodiment of FIG. 5.

    [0173] The interaction confirmation engine 114 may receive clarifying intent information. The clarifying intent information is intended to verify or falsify the already communicated and received intent of the interaction of the person 40 with the attendance recording terminal 10. In this case, it is then advantageous if the clarifying intent information represents further clocking information for the control unit 11. In addition to the identification information and intent information, it is then made dependent on this further clocking information whether or not the clocking process is to be carried out in the memory management unit 20.

    [0174] For this purpose, the interaction confirmation engine 114 can be implemented at least partially as software in the attendance recording terminal 10 or in the control unit 11, via which software corresponding signals or actions of the person 40 can be evaluated to confirm or deny the previously recorded intent.

    [0175] The interaction confirmation engine 114 can be designed as a voice detection engine 114a (voice recognition device) for recording a voice input by the person 40 and/or as a gesture detection engine 114b (gesture recognition device) for recording a gesture input by the person 40. For this purpose, reference can be made to the description of the voice detection engine 112a or gesture detection engine 112b above.

    [0176] In concrete terms, clarifying intent information can consist, for example, of the person 40 confirming via a voice signal (for example a pronounced “yes”) that they desire the recorded interaction and that this should be carried out. Contact-based clarifying intent information is also conceivable, for example in that the person 40 performs an input on the attendance recording terminal 10 or on their mobile access medium. However, voiceless and contactless confirmations by the person 40 are also conceivable, for example by the person 40 performing a specific gesture, for example also with their mobile access medium. A specific movement of the mobile access medium can be identified by the interaction confirmation engine 114 as clarifying intent information, for example via RFID or UWB technology.
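The role of the clarifying intent information as a gate for the clocking process could be sketched as follows; the function and parameter names are hypothetical:

```python
def carry_out_clocking(identification, intent, clarifying_ok, store):
    """Execute the clocking process only if identification information and
    intent information are present AND the clarifying intent information
    verified (rather than falsified) the recorded intent.

    store: a list standing in for the memory management unit; an
    (identification, intent) pair is appended as attendance information.
    Returns True if the clocking process was carried out."""
    if identification is None or intent is None or not clarifying_ok:
        return False  # intent falsified or information missing: no change
    store.append((identification, intent))
    return True
```

A falsifying input (for example a pronounced "no", or the absence of the expected confirming movement of the access medium) simply leaves the attendance information unchanged.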

    [0177] The intent engine 112 is also advantageously configured for assigning distance information to the received intent information. This can be used to identify the distance from which the signal input or initiated by the person 40 comes. An example is that the intent information is communicated by the person 40 via a voice signal and, for example via the volume of the signal, the intent engine 112 recognizes how far away the person 40 is. The clocking process can then be made dependent on the distance information corresponding to a predefined value range. A clocking process can therefore be prevented if the person 40 is still too far away from the attendance recording terminal 10. For the clocking process and the necessary comparison of the detected or received information with stored information, the intent engine 112 can then be configured in such manner that the intent information is only further processed or output if the distance information corresponds to the predefined value range.
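One way to illustrate the distance gating is a rough volume-based distance estimate; the "6 dB per doubling of distance" rule of thumb, the reference level, and the threshold value below are all assumptions for this sketch:

```python
MAX_CLOCKING_DISTANCE_M = 1.5  # hypothetical predefined value range
REFERENCE_DB = 60.0            # assumed level of a voice signal at 1 m

def estimate_distance_from_volume(volume_db, reference_db=REFERENCE_DB):
    """Very rough distance estimate from measured voice volume: roughly
    every 6 dB drop corresponds to a doubling of the distance."""
    return 2 ** ((reference_db - volume_db) / 6.0)

def gate_intent(intent, volume_db):
    """Assign distance information to the intent and forward the intent
    information only if the person is within the predefined value range."""
    distance = estimate_distance_from_volume(volume_db)
    if distance <= MAX_CLOCKING_DISTANCE_M:
        return intent
    return None  # clocking process prevented: person too far away
```

In a real terminal the distance would more plausibly come from the sensor unit (camera or UWB ranging); the volume heuristic merely shows how distance information can gate the further processing of intent information.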

    [0178] Furthermore, the person 40 can advantageously be given an indication by the attendance recording terminal 10 as to when the person 40 is close enough to the attendance recording terminal 10 and a clocking process is therefore possible. According to that exemplary embodiment, the control unit 11 is configured in such manner that if the detected presence information of the person 40 corresponds to a predefined value range, initiation information is output to the person 40 to notify that the interaction is possible.

    [0179] The initiation information can be a voice signal or an acoustic signal and/or a graphic signal and/or a haptic signal, in particular a vibration signal. The initiation information can be output on the attendance recording terminal 10, for example via a loudspeaker or display of the attendance recording terminal 10, or also on a mobile access medium of the person 40, such as their mobile device.

    [0180] The memory management unit 20 is preferably configured in such manner that when the initiation information is to be output to the person 40, the memory management unit 20 checks in the data of the person 40 as to what type of initiation information the person 40 desires. In this way, the individual needs or requirements of the person 40 can be addressed in a particularly advantageous manner. For example, a person 40 with a visual impairment could preferably have an acoustic signal or voice signal output as initiation information and not a graphic signal, while the situation is exactly the opposite for a person 40 with a hearing impairment. Signals containing speech or text can also be output as initiation information individually adapted to the person 40 in their respective language. The same applies to any feedback signals that can also be output in a person-specific manner, since person-specific information is stored in the master record of the data in the memory management unit 20.
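The look-up of the desired type of initiation information in the person's master record data could be illustrated as follows; the record keys are hypothetical:

```python
def select_initiation_signal(person_record):
    """Choose the initiation signal type from the person's stored data:
    an acoustic/voice signal for a visual impairment, a graphic signal
    for a hearing impairment, otherwise the stored preference."""
    if person_record.get("visual_impairment"):
        kind = "voice"
    elif person_record.get("hearing_impairment"):
        kind = "graphic"
    else:
        kind = person_record.get("preferred_signal", "graphic")
    return {"kind": kind, "language": person_record.get("language", "en")}
```

The same person-specific selection applies to feedback signals, since both draw on the same master record in the memory management unit 20.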

    [0181] The features and advantages described in the context of the proposed attendance recording terminal 10 can be transferred accordingly to the proposed computer-implemented method for operating an attendance recording terminal. Specifically, the computer-implemented method is configured for operating the proposed and described attendance recording terminal 10. The proposed attendance recording terminal 10 described here is in turn configured for carrying out the proposed computer-implemented method for operating an attendance recording terminal. In this respect, the features characterizing the attendance recording terminal 10 and the method for operating the attendance recording terminal 10, and their specific advantages, have been described above and are therefore only summarized in general terms below.