Surveillance System and Method for Monitoring the Sterility of Objects in an Operation Room

20220398848 · 2022-12-15

    Abstract

    A method for monitoring and maintaining the sterility of objects in an operation room (1) is proposed, comprising: registering target objects (3) and reference objects (4); attributing to each target object (3) a set of reference objects (4); attributing to each target object (3) a forbidden zone (90) and/or an allowed zone (91) based on a space occupied by the reference objects (4) attributed to this target object (3); tracking the target objects (3) and the reference objects (4); and determining, using data of the tracking, if a violation has occurred, the violation comprising that at least a part of a target object (3) has entered the forbidden zone (90) attributed to that target object (3), and/or has left the allowed zone (91) attributed to that target object (3). Furthermore, a surveillance system for performing this method is proposed.

    Claims

    1.-15. (canceled)

    16. Surveillance system for monitoring and maintaining the sterility of objects in an operation room (1), comprising a tracking system (2) designed for tracking at least one object within the operation room (1); wherein the surveillance system is configured for: registering a first set of objects, these objects being referred to as target objects (3), and a second set of objects, these objects being referred to as reference objects (4), wherein the objects (3,4) preferably are body parts, areas, and/or items, attributing to each target object (3) a subset of the second set of reference objects (4), and attributing to each target object (3) a forbidden zone (90) and/or an allowed zone (91) based on a space occupied by the subset of reference objects (4) attributed to this target object (3); wherein the tracking system (2) is configured for tracking the target objects (3) and the reference objects (4); and wherein the surveillance system is configured for determining, using data provided by the tracking system (2), if a violation has occurred, the violation comprising that at least a part of a target object (3): has entered the forbidden zone (90) attributed to that target object (3), and/or has left the allowed zone (91) attributed to that target object (3).

    17. The surveillance system of claim 16, wherein the surveillance system further comprises an output unit (5) and wherein the surveillance system is configured for outputting a signal via the output unit (5) in case a violation has occurred.

    18. The surveillance system of claim 17, wherein the surveillance system is configured for outputting a signal that indicates the location in which a violation has occurred.

    19. The surveillance system of claim 16, wherein the surveillance system comprises at least one marker (7) that is attached to an object (3,4) and wherein the tracking system (2) is designed for measuring data concerning a position of the at least one marker (7), and/or for retrieving data from the marker (7).

    20. The surveillance system of claim 16, wherein the surveillance system comprises an object recognition unit (21) designed for recognizing at least one object (3,4), wherein the at least one object (3,4) preferably is a person, a body part, an area, and/or an item.

    21. The surveillance system of claim 20, wherein the surveillance system comprises a data structure (66) in which data about objects (3,4) is stored, and wherein the surveillance system is configured for automatically registering an object (3,4) about which data is stored within the data structure (66) and which is recognized by the object recognition unit (21), and/or attributing attributes to an object (3,4) about which data is stored within the data structure (66) and which is recognized by the object recognition unit (21), wherein the attribution of the attribute is based on data about this object (3,4) stored in the data structure (66).

    22. The surveillance system of claim 16, wherein the surveillance system comprises a facing direction recognition unit (25) that is configured for recognizing the facing direction of a person (30,40), and wherein the forbidden zone (90) and/or an allowed zone (91) attributed to a target object (3) depends on the facing direction of the person (30,40), wherein the person (30,40) preferably comprises at least one body part that is the target object (3), and/or comprises at least one body part that is attributed to the target object (3) as a reference object (4).

    23. The surveillance system of claim 16, comprising a command recognition unit (61) designed for recognizing commands, wherein the surveillance system is configured for registering master users (39), recognizing commands given by a master user (39) using the command recognition unit (61), and executing the commands given by the master user (39).

    24. The surveillance system of claim 16, comprising a processing unit (6), wherein the processing unit (6) preferably is configured for processing data in connection with registering objects (3,4), attributing reference objects (4) and the respective forbidden zone (90) resp. allowed zone (91) to a target object (3), and/or determining if a violation has occurred.

    25. Method for monitoring and maintaining the sterility of objects in an operation room (1), using the surveillance system of claim 16, the method comprising: registering a first set of objects, such objects being referred to as target objects (3), and a second set of objects, such objects being referred to as reference objects (4), wherein the objects (3,4) preferably are body parts, areas, and/or items; attributing to each target object (3) a subset of the second set of reference objects (4); attributing to each target object (3) a forbidden zone (90) and/or an allowed zone (91) based on a space occupied by the subset of reference objects (4) attributed to this target object (3); tracking, using the tracking system (2) of the surveillance system, the target objects (3) and the reference objects (4); determining, using data of the tracking, if a violation has occurred, the violation comprising that at least a part of a target object (3) has entered the forbidden zone (90) attributed to that target object (3), and/or has left the allowed zone (91) attributed to that target object (3).

    26. The method of claim 25, comprising outputting a signal in case a violation has occurred, preferably wherein the outputted signal indicates the location in which the violation has occurred.

    27. The method of claim 25, wherein the forbidden zone (90) resp. the allowed zone (91) attributed to a target object (3) satisfies exactly one of the following properties: it comprises the space occupied by the subset of reference objects (4) attributed to this target object (3), it does not comprise any part of the space occupied by the subset of reference objects (4) attributed to this target object (3).

    28. The method of claim 25, wherein a first group of target objects (3) and a second group of target objects (3) are defined, wherein a common first subset of reference objects (4) is comprised in each respective subset of reference objects (4) attributed to a target object (3) of the first group, and/or a common second subset of reference objects (4) is comprised in each respective subset of reference objects (4) attributed to a target object (3) of the second group.

    29. The method of claim 28, wherein a common first forbidden zone (90) resp. a common first allowed zone (91) is comprised in each respective forbidden zone (90) resp. allowed zone (91) attributed to a target object (3) of the first group, and/or a common second forbidden zone (90) resp. a common second allowed zone (91) is comprised in each respective forbidden zone (90) resp. allowed zone (91) attributed to a target object (3) of the second group.

    30. The method of claim 25, wherein the forbidden zone (90) and/or an allowed zone (91) attributed to a target object (3) depends on the facing direction of a person (30,40), wherein the person (30,40) preferably comprises at least one body part that is the target object (3), and/or comprises at least one body part that is attributed to the target object (3) as a reference object (4).

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0172] Preferred embodiments of the invention are described in the following with reference to the drawings, which are solely for the purpose of illustrating the present preferred embodiments of the invention and not for the purpose of limiting the same. In the drawings,

    [0173] FIGS. 1a, 1b show the current state of the art,

    [0174] FIGS. 2a, 2b show a proposed surveillance system,

    [0175] FIGS. 3a, 3b show an adjustment of a forbidden zone,

    [0176] FIG. 4 shows an allowed zone and a forbidden zone,

    [0177] FIGS. 5a, 5b show markers attached to a scrub head,

    [0178] FIG. 5c shows a marker-less scrub head,

    [0179] FIGS. 6a, 6b show a monitoring of body parts,

    [0180] FIGS. 7a, 7b show an allowed zone for the hands of a sterile person,

    [0181] FIG. 8 shows a forbidden zone for the head of a person,

    [0182] FIG. 9 shows another monitoring of body parts,

    [0183] FIGS. 10a, 10b show a forbidden zone depending on facing direction,

    [0184] FIGS. 11a, 11b show an allowed resp. forbidden passing of two sterile bodies,

    [0185] FIG. 12 shows giving a gesture command,

    [0186] FIG. 13 shows object recognition,

    [0187] FIG. 14 shows mutual attribution of two bodies,

    [0188] FIG. 15 shows a processing unit,

    [0189] FIGS. 16a, 16b show indicating the location of a violation using signals,

    [0190] FIG. 17 shows a flow diagram of a surveillance method for monitoring persons, and

    [0191] FIG. 18 shows a flow diagram for commands given by a master user.

    DESCRIPTION OF PREFERRED EMBODIMENTS

    [0192] FIG. 1a shows an operation room 1 comprising sterile items such as an operation table 11 on which a patient is placed and an instrument table 12 on which instruments for use during an intervention are placed. In order for these items to remain sterile, no person considered non-sterile is allowed to come too close to them. Compliance with this rule is typically monitored by the chief surgeon 38 and/or the other personnel in the operation room 1. If, as shown in FIG. 1b, a non-sterile person comes too close to the sterile items, the chief surgeon 38, for example, takes the necessary steps, such as ordering the non-sterile person to move away from the sterile items or having the now possibly contaminated items replaced resp. re-sterilized. However, this requires a lot of attention from the personnel, and this method is prone to failure.

    [0193] A surveillance system for automatically monitoring—and preferably also for maintaining—the sterility of objects, such as body parts, areas, and/or items, in an operation room 1, is proposed, an example of which is shown in FIG. 2a. The surveillance system is configured for attributing a forbidden zone 90 and/or an allowed zone 91 to a so-called target object 3, and is further configured for monitoring if the target object 3 stays out of its forbidden zone 90 resp. stays within its allowed zone 91. The forbidden zone 90 resp. allowed zone 91 is preferably defined using one or more so-called reference objects 4 that are attributed to the target object 3. Thereby it is possible to monitor the sterility of a sterile object by monitoring if it comes too close to a non-sterile object resp. if a non-sterile object comes too close to the sterile object.

    [0194] The depicted surveillance system comprises a tracking system 2 that is designed for tracking objects in the operation room 1, preferably by at least quasi-continuously determining the position of these objects in the operation room 1. The tracking system 2 can for example comprise an AZURE KINECT DK system (by Microsoft), a DYNAMIC VISION SENSOR (by iniVation), and/or a SPECK sensor (by iniVation/aiCTX), each e.g. as available on Aug. 30, 2019. An AZURE KINECT DK system, which comprises a depth sensor 28, a camera 29, and an artificial intelligence unit 60, can e.g. be comprised in an object recognition unit 21, a body part recognition unit 211, a shape recognition unit 22, a facing direction recognition unit 25, a skeleton tracking unit 26, an object perimeter tracking unit 27, a body perimeter tracking unit 270, and/or a gesture command recognition unit 612. A DYNAMIC VISION SENSOR can e.g. be comprised in an object recognition unit 21, a body part recognition unit 211, a shape recognition unit 22, a facing direction recognition unit 25, a skeleton tracking unit 26, an object perimeter tracking unit 27, a body perimeter tracking unit 270, and/or a gesture command recognition unit 612. A SPECK sensor can e.g. be comprised in a face recognition unit 24, a person recognition unit 210, and/or a facing direction recognition unit 25.

    [0195] The surveillance system is configured for registering objects 3, 4, preferably by using a processing unit 6 and/or a data structure 66. By registering an object 3, 4, that object 3, 4 becomes known to the surveillance system, so that the surveillance system can deal with this object 3, 4 in information-technological terms, e.g. store data concerning this object 3, 4 (e.g. data that allows recognition and/or tracking of this object 3, 4 and/or attributes attributed to this object 3,4). The objects 3, 4 preferably are body parts, areas, and/or items. Examples of body parts to be registered with the surveillance system are a person's full body, hand, arm, and/or head. Examples of items to be registered with the surveillance system are an operation table 11 and/or an instrument table 12. An example of an area to be registered with the surveillance system is an area defined by an operation table 11 that is fixed (or at least assumed to be fixed), e.g. the area on and above the operation table. A registered object 3, 4 can be a target object 3, which is being monitored for violations, and at the same time be a reference object 4, which is used for defining the forbidden zone 90 and/or an allowed zone 91 for a target object 3.

    [0196] In the following, multiple examples are described that concern target objects 3 in the form of body parts, in particular full bodies; it is however understood that—where applicable and with the necessary modifications—these examples could as well be described for other target objects 3, such as items and/or areas.

    [0197] The surveillance system is configured for attributing to each target object 3 a set of registered objects 4, and based thereon a forbidden zone 90 and/or an allowed zone 91. In the example of FIG. 2a, the target object 3, namely a full body of a first person 30, is considered non-sterile and the reference objects 4 attributed to it are two items, namely the operation table 11 and the instrument table 12, and one body part, namely a full body of a second person 40, e.g. the chief surgeon 38, each of which is considered sterile. In order to maintain the sterility of these sterile reference objects 4, a forbidden zone 90 that comprises a 30-centimeter-environment of the reference objects 4, i.e. the reference objects 4 themselves and everything within 30 centimeters thereof, is attributed to the target object 3. In the example of FIG. 2a, the depicted 30-centimeter-environment of the chief surgeon 38 is defined by the distance in all directions; and the depicted 30-centimeter-environment of the operation table 11 and the instrument table 12 is defined by a distance in a vertical projection onto a plane parallel to the floor (effectively defining a cylinder whose base is parallel to the polygon indicated by the broken line in the figure). Similarly, an allowed zone 91 for this target object 3 can be defined, e.g. as the complement of the depicted forbidden zone 90. In this case, the allowed zone 91 for this target object 3 is everything but said 30-centimeter-environments of the attributed reference objects 4.
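The two environment geometries just described can be sketched as simple membership tests. This is an illustrative simplification only, not the patented implementation: objects are reduced to reference points, the function names are hypothetical, and 0.30 m is just the example margin from the text.

```python
import math

def in_spherical_environment(point, ref, margin=0.30):
    """All-directions distance environment, as for the chief surgeon's
    body in FIG. 2a: True if `point` lies within `margin` metres of
    `ref` in three dimensions."""
    return math.dist(point, ref) <= margin

def in_cylindrical_environment(point, ref, margin=0.30):
    """Vertically projected environment, as for the tables: only the
    horizontal (x, y) distance counts, which defines a vertical
    cylinder around the reference point."""
    return math.hypot(point[0] - ref[0], point[1] - ref[1]) <= margin
```

A point 0.5 m above a table but horizontally within 0.30 m of it would thus still lie inside the cylindrical environment, while it would lie outside a spherical one.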

    [0198] The tracking system 2 depicted in FIG. 2a is inter alia configured for recognizing objects by measuring data concerning the respective markers 7 attached to these objects. The markers 7 in this example are image patterns, which can be recognized using a camera 29 of the tracking system 2. The markers 7 allow the surveillance system to differentiate between the two depicted persons 3, 4, and to decide, e.g. using data stored in a data structure 66 of the surveillance system, that the one person is considered a target object 3 and the other person is attributed thereto as a reference object 4. In case two or more target objects 3 are to be monitored, the markers 7 can allow differentiating between these target objects 3, e.g. in case different reference objects 4, different forbidden zones 90, and/or different allowed zones 91 are attributed to different target objects 3. In addition or alternatively to markers 7, an object recognition unit 21, a person recognition unit 210, a body part recognition unit 211, a shape recognition unit 22, a voice recognition unit 23, and/or a face recognition unit 24 can allow for recognizing the objects.

    [0199] In addition, the tracking system 2 shown in FIG. 2a comprises an object perimeter tracking unit 27, which allows for tracking the perimeters of the respective objects 3, 4, and thereby for
    [0200] determining the space currently occupied by the reference objects 4, which can lead to an adjustment of the attributed forbidden zones 90 resp. allowed zones 91; and/or
    [0201] determining the space currently occupied by the target objects 3, which in turn allows for determining if a part of the target objects 3 has entered resp. has left a certain zone.

    [0202] In case of body parts, in particular of full bodies, a skeleton tracking unit 26 can be used for estimating, and in this sense determining, the space occupied by these body parts. In case of items, a shape recognition unit 22 can be used for determining the space occupied by these items.

    [0203] The depicted surveillance system is further configured for determining (e.g. by estimating), by using data provided by the tracking system 2, if a violation has occurred, namely that a target object 3
    [0204] has entered the forbidden zone 90 attributed to that target object 3, and/or
    [0205] has left the allowed zone 91 attributed to that target object 3.
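The violation test of paragraph [0203] can be sketched as follows. Again an illustrative simplification rather than the patented implementation: a target object is reduced to a set of sampled points, and zones are modelled as membership predicates (point -> bool).

```python
def violation_occurred(target_points, forbidden=None, allowed=None):
    """A violation occurs as soon as at least one part (here: one
    sampled point) of the target object lies inside its forbidden zone
    and/or outside its allowed zone."""
    for p in target_points:
        if forbidden is not None and forbidden(p):
            return True  # a part of the target entered the forbidden zone
        if allowed is not None and not allowed(p):
            return True  # a part of the target left the allowed zone
    return False
```

Evaluating this test at the quasi-continuous rate of the tracking system yields the monitoring loop described above.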

    [0206] Such a violation is depicted in FIG. 2b, where the target object 3, i.e. at least a part of it, has entered its forbidden zone 90. In the depicted example, the surveillance system comprises an output unit 5 designed as a loudspeaker that outputs an acoustic signal indicating the sterility violation by the target object 3. In addition or alternatively, the output unit 5 can comprise a visual output unit such as
    [0207] a display on which a warning message can be displayed (e.g. a monitor and/or an optical head-mounted display),
    [0208] a warning light, and/or
    [0209] a light beam emitter that allows for pointing at the location at which a violation has occurred.

    [0210] The forbidden zone 90 resp. allowed zone 91 attributed to a target object is preferably defined based on the space occupied by the reference objects 4 that are attributed to the target object, e.g. as an environment thereof. The tracking system 2 is preferably configured for tracking the reference objects 4, which allows for adjusting a forbidden zone 90 resp. an allowed zone 91 according to a displacement of the attributed reference objects 4. Such an example is shown in FIG. 3a and FIG. 3b, wherein the forbidden zone 90 is inter alia defined by a distance environment around the instrument table 12 and thus the forbidden zone 90 shown in FIG. 3a is adjusted according to the new position of the instrument table 12 in FIG. 3b. As described before, items, such as the instrument table 12, can be tracked using shape recognition. In the example of FIGS. 3a and 3b, markers 7 are attached to two diagonally opposed corners of the instrument table 12, which allows the tracking system 2 to track the position of the instrument table 12 by tracking the markers, e.g. in connection with data about the shape of the instrument table 12. As exemplified in FIG. 3a, the forbidden zone 90 can of course comprise multiple connected components.
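The adjustment of a zone to a displaced reference object (FIGS. 3a/3b) can be sketched by letting the zone predicate close over the live tracked position, so that re-evaluating it always uses the current position. This is a hypothetical sketch; a one-element list stands in for the live output of the tracking system.

```python
import math

def forbidden_zone_around(tracked_position, margin=0.30):
    """Return a zone predicate tied to a movable reference object.
    `tracked_position` is a mutable one-element list holding the
    object's current position; updating it (as the tracking system
    would) automatically shifts the zone."""
    def zone(point):
        ref = tracked_position[0]
        return math.dist(point, ref) <= margin
    return zone
```

When the instrument table is moved, only the tracked position changes; the zone definition itself, and the violation test using it, stay the same.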

    [0211] In the example shown in FIG. 4, the reference objects 4 attributed to a target object 3, in this case the full body of a chief surgeon 38 (who is considered sterile), are an operation table 11 (which is considered sterile) and a full body of a support person 40 (who is considered non-sterile). In the depicted example, a forbidden zone 90 attributed to the target object 3 is defined as a 30-centimeter-environment of the full body of the support person 40; and an allowed zone 91 attributed to the target object 3 is defined as a 1-meter-environment of the operation table 11. Of course, in this example the forbidden zone could instead be defined to consist of the union of said 30-centimeter-environment of the non-sterile support person 40 and the complement of said 1-meter-environment of the operation table 11.

    [0212] The forbidden zone (resp. the allowed zone) attributed to each target object is preferably based on the space occupied by the reference objects attributed thereto in that either the space occupied by the reference objects is comprised in the forbidden zone (resp. allowed zone) or no part of the space occupied by the reference objects is comprised in the forbidden zone (resp. allowed zone). However, the forbidden zone (resp. allowed zone) can e.g. also be based on the space occupied by the reference objects attributed to a target object in that it is the intersection and/or union of such zones and/or their complements.
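The zone algebra of paragraph [0212] is straightforward once zones are modelled as membership predicates, as in the earlier sketches (an illustration under that assumption, not the patented implementation):

```python
def union(*zones):
    """A point lies in the union if it lies in any constituent zone."""
    return lambda p: any(z(p) for z in zones)

def intersection(*zones):
    """A point lies in the intersection if it lies in every zone."""
    return lambda p: all(z(p) for z in zones)

def complement(zone):
    """A point lies in the complement if it does not lie in the zone."""
    return lambda p: not zone(p)
```

The alternative formulation of the FIG. 4 example then reads, e.g., `forbidden = union(near_support_person, complement(near_operation_table))`.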

    [0213] As shown in FIGS. 5a and 5b, markers 7 comprising an image pattern can be attached to the head of a person, which can be beneficial for its recognition using optical tracking systems that are installed at a height, such as a fisheye lens camera installed at a ceiling. As depicted in FIG. 5a, the marker 7 can comprise an image pattern printed on a dimensionally stable plate that is attached to a scrub. As shown in FIG. 5b, the image pattern 7 can also be printed on the scrub head. Two or more markers 7 can be different, which can allow for recognizing individual objects, classes of objects, attributes to be attributed to objects, and/or the position of the attached marker. FIG. 5c shows an example where no such marker is used. In this case, the person can e.g. be recognized using a face recognition unit and/or a voice recognition unit, which possibly comprise an artificial intelligence unit.

    [0214] The surveillance system can be configured for monitoring target objects 3 in the form of body parts as exemplified in FIG. 6a. In this example, radio markers 7, e.g. an RFID tag or a BLUETOOTH® chip, are attached to each of the body parts 3, in this case two hands, and the tracking system 2 is designed for tracking the radio markers 7, e.g. using a radio technology, and thereby the position of the body parts 3. In the depicted example, the same reference objects 4 and the same forbidden zone 90 are attributed to each hand 3, and if at least one of the hands 3 enters the forbidden zone 90, as depicted in the example of FIG. 6b, the output unit 5 indicates this violation.

    [0215] As shown in FIGS. 6a, 6b, body parts, which can be target objects and/or reference objects, can be tracked by using tracking markers 7 attached thereto. Alternatively or in addition, the tracking system 2 can comprise an object perimeter tracking unit 27, in particular a body perimeter tracking unit 270, and/or a skeleton tracking unit 26, which can be used for tracking, e.g. at least estimating, the position of a body part, such as e.g. the arms/hands and/or the head of a person, possibly supported by a body part recognition unit 211.

    [0216] In many cases, body parts other than the full body are monitored only for persons that are considered sterile, such as surgeons. Namely, while for non-sterile persons typically the whole body is considered non-sterile, for sterile persons it is efficient to only keep certain body parts, e.g. the hands and/or the front, sterilized; and thus it can be beneficial to monitor individual body parts of such a sterile person.

    [0217] Body parts of a person can be reference objects 4 attributed to target objects 3 in the form of other body parts of the same person. Such an example is shown in FIGS. 7a and 7b, wherein the allowed zone 91 attributed to a target object 3 in the form of the hands of a person is a zone that is
    [0218] bounded from below by the level of a sterile field, such as that of an operation table 11,
    [0219] bounded from above by the level of the armpits of that person, and
    [0220] bounded from the sides by the elbows of that person.

    [0221] In particular, the allowed zone 91 attributed to the hands of the person in FIGS. 7a, 7b varies with the position of further body parts of that person, namely its armpits or elbows.
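The box-like allowed zone of FIGS. 7a/7b can be sketched as a bounds check. All parameter names are hypothetical, the lateral elbow bound is simplified to an x-interval, and in the real system the bounds would follow the tracked positions of the respective body parts:

```python
def hand_in_allowed_zone(hand, sterile_field_z, armpit_z,
                         left_elbow_x, right_elbow_x):
    """Membership test for a hand position (x, y, z) in metres:
    between sterile-field level and armpit level vertically, and
    between the elbows laterally."""
    x, _, z = hand
    return (sterile_field_z <= z <= armpit_z
            and left_elbow_x <= x <= right_elbow_x)
```

Because `armpit_z`, `left_elbow_x`, and `right_elbow_x` are re-read from the tracking data at each evaluation, the zone varies with the person's posture, as paragraph [0221] describes.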

    [0222] In the example shown in FIG. 8, the forbidden zone 90 for the head is the air domain above the operation table 11, while the same air domain is possibly comprised in an allowed zone of the hands of the person.

    [0223] FIG. 9 shows an example where the reference object 4 is an area, which e.g. can be registered by inputting coordinates. The tracking system 2 tracks such a fixed reference object 4 e.g. in that the tracking system 2 resp. the surveillance system knows the fixed position thereof and/or in that the tracking system 2 measures data concerning a reference system for locating the fixed position of that area within that reference system, e.g. in cases where the tracking system 2 is not fixed and/or is restarted.

    [0224] In the depicted example, two persons 30, 40 wear different markers 7, 7′ and from information retrieved from the markers 7, 7′, e.g. by consulting a data structure 66 in which information on the different markers 7, 7′ is stored, the surveillance system knows that of the first person 30 only the arms and hands are allowed to enter the area; and that the second person 40 is not allowed to enter the area at all. Therefore, the body of the first person 30 without the arms/hands and the full body of the second person 40 are registered as target objects 3, and to both the area 4 is attributed as their respective forbidden zone 90.

    [0225] The depicted tracking system 2 comprises a depth sensor 28 that supports the surveillance system in determining if something is positioned inside the area 4 or not; and a skeleton tracking unit 26 that supports the surveillance system in estimating the position of the respective skeleton of each of the persons 30, 40. Based on data about the respective skeleton, the surveillance system estimates the space occupied by the first person's 30 body minus its arms/hands and the space occupied by the second person's 40 full body, i.e. the space occupied by the respective target objects 3. Based on these estimates, the surveillance system estimates—and in this sense determines—if any of the target objects 3 have entered their respective forbidden zone 90, namely the area 4. If that is the case, e.g. because the second person 40 has entered the area 4 as shown in FIG. 9, the output unit 5 outputs a warning signal. This depicted scenario could e.g. be the case if the second person 40 is a non-sterile support person, the first person 30 is a sterile surgeon, and the area 4 is a sterile area, e.g. the area above the operation table (not shown in FIG. 9). The surveillance system can be configured for outputting different warning signals depending on which person has caused the violation.

    [0226] In some cases, it may be beneficial to define that the set of reference objects, the forbidden zone, the allowed zone, and/or the cases in which a violation occurs depend on further parameters, e.g. the facing direction of a person.

    [0227] An example of such a case is depicted in FIGS. 10a and 10b, wherein the tracking system 2 comprises a facing direction recognition unit 25 that is designed for tracking the facing direction of a person, and which can e.g. comprise a face recognition unit 24 and a body perimeter tracking unit 270. By recognizing the position of the face and the perimeter of the body, the surveillance system can estimate, and in this sense determine, in which direction the person's front and back are facing. Alternatively or in addition, markers 7 can be attached to the front and/or the back of a person, whereby the facing direction recognition unit 25 can determine the direction to which the front resp. the back are facing. In the depicted case, the target object 3 is the back of the depicted person, to which—if, as shown in FIG. 10a, the back faces the operation table 11—the sterile area 4 is attributed as a forbidden zone 90; and to which—if, as shown in FIG. 10b, the back does not face the operation table 11—no forbidden zone is attributed.

    [0228] FIGS. 11a and 11b exemplify a possible monitoring of the passing of two sterile persons. For guaranteeing sterility, two sterile persons should preferably only pass front-to-front or back-to-back, but not front-to-back. In the depicted scenario, the target object 3 is the front of the first person 30, and if that front faces the front of the second person 40, as shown in FIG. 11a, no reference object 4 resp. forbidden zone 90 is attributed to the front of the first person 30, which is why the two persons can pass each other that way without triggering a violation. However, if, as shown in FIG. 11b, the front of the first person 30 faces the back of the second person 40, that second person's 40 back (as a reference object 4) and a forbidden zone 90 around that back are attributed to the front of the first person 30, which is why, if the two persons 30, 40 pass each other that way and in proximity to each other, a violation occurs. Of course, the front of the second person 40 can also be monitored as a target object in a similar manner.
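One way to formalize the passing rule of FIGS. 11a/11b (a hypothetical sketch, not the patented method) uses the dot product of the two facing directions: when two persons pass front-to-front or back-to-back, their facing vectors are roughly anti-parallel, whereas front-to-back passing means roughly aligned facing vectors.

```python
def passing_is_violation(facing_a, facing_b):
    """Facing directions as unit vectors in the floor plane.
    Anti-parallel facings (negative dot product) cover both the
    front-to-front and the back-to-back case and are allowed;
    aligned facings (positive dot product) mean one person's front
    faces the other's back, which is not allowed."""
    dot = sum(a * b for a, b in zip(facing_a, facing_b))
    return dot > 0.0
```

In practice a threshold other than zero could be used to tolerate oblique passings; the sign test shown here is the simplest variant.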

    [0229] The surveillance system is preferably configured for registering one or more master users 39 and for executing processes based on commands given by these master users 39. As shown in FIG. 12, the surveillance system, e.g. the tracking system 2 thereof, can comprise a command recognition unit 61 designed to recognize a command of a master user 39. The command recognition unit 61 preferably comprises a voice command recognition unit 611, a gesture command recognition unit 612, and/or an artificial intelligence unit 60.

    [0230] In the depicted example, a human target object 3 requests that an attribute, e.g. its sterility status, be changed, which possibly leads to the attribution of a new set of reference objects, a new forbidden zone, and/or a new allowed zone to at least one of its body parts. The surveillance system signals the request to the master user 39, e.g. via an acoustic output unit 5 or via an optical output unit 5 (e.g. a display 8). The master user 39 can react by giving a command, e.g., as depicted here, a gesture command, thereby approving or rejecting the attribution of the new attribute to the target object 3. A display 8 for use with the surveillance system can be designed as a monitor (as shown in FIG. 12) or as a head-mounted display (not shown).

    [0231] As shown in FIG. 13, the surveillance system, e.g. the tracking system 2 thereof, can comprise an object recognition unit 21 for recognizing objects, which can support the registration of recognized objects. The object recognition unit 21 can for example comprise an artificial intelligence unit 60 that is designed for identifying objects based on an analysis of an image of the objects. In the depicted example, the artificial intelligence unit is trained to recognize the instrument table 12 and is thereby able to recognize and register the instrument table 12 automatically. Preferably, the surveillance system is configured for automatically attributing attributes to recognized objects, e.g. based on entries in a data structure 66.

    [0232] In addition or alternatively, the surveillance system can use markers (not shown in FIG. 13) for recognizing and registering an object, possibly by additionally using information about the markers, the objects, and/or their relative placement.

    [0233] In addition or alternatively, the surveillance system can be designed for registering objects based on user input, e.g. inputted via a mouse, a keyboard, a touchpad, a touchscreen, and/or gesture commands. Similarly, attributes can be attributed via user input.

    [0234] The object recognition unit 21 can in particular be designed as a person recognition unit 210, which can allow for recognizing individual persons. The person recognition unit 210 can e.g. use biometric data of a person, a password, and/or a marker for recognizing a person. Preferably, the surveillance system is configured for automatically registering the recognized person, or a body part thereof, e.g. as a target object, as a reference object, and/or as a master user. A person recognition unit 210 can for example comprise a voice recognition unit 23 and/or a face recognition unit 24.

    [0235] In the example of FIG. 14, a first human body 3 and a second human body 3′ are registered as target objects. To the first body 3, the second body 3′ is attributed as a reference object 4, and based thereon an environment of the second body 3′ is attributed to the first body 3 as its forbidden zone 90. Furthermore, to the second body 3′, the first body 3 is attributed as a reference object 4′, and based thereon an environment of the first body 3 is attributed to the second body 3′ as its forbidden zone 90′. This can e.g. be the case where the first person 3 is considered sterile and the second person 3′ is considered non-sterile, so that the two should not get close to each other.
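The mutual attribution of FIG. 14 can be sketched as follows, assuming tracked positions are points and each forbidden zone is modeled as a spherical environment of a given radius around the respective reference object; both the spherical zone model and the names are illustrative assumptions.

```python
# Illustrative sketch of FIG. 14: each body is the other's reference
# object, and the forbidden zone is modeled as a sphere of radius r
# around the reference object's tracked position (an assumption).
import math

def in_forbidden_zone(target_pos, reference_pos, radius):
    """True if the target point lies inside the spherical environment
    of the given radius around the reference position."""
    return math.dist(target_pos, reference_pos) < radius

# With the first body 3 at (0, 0, 0), the second body 3' at
# (0.5, 0, 0), and a zone radius of 1 m, each body lies inside the
# other's forbidden zone, i.e. a violation occurs for both.
```

Because the attribution is symmetric, a single proximity test suffices for both violations; an asymmetric setup (e.g. only the sterile person carrying a forbidden zone) would require evaluating each target object against its own attributed zone separately.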

    [0236] FIG. 15 shows a processing unit 6 as it can be used in the surveillance system for calculations, in particular in an artificial intelligence unit 60 of the surveillance system. The depicted processing unit 6 comprises a processor (CPU) 62 and a volatile memory 63 (e.g. RAM) and/or a non-volatile memory 64 (e.g. a hard disk), wherein the processor 62 communicates with the memory modules 63, 64 using one or more data buses 65. Preferably, the non-volatile memory 64 comprises a data structure 66 and/or the processing unit 6 is connected to a data structure.

    [0237] In some embodiments, the surveillance system is configured for outputting a signal that indicates the location in which a violation occurs. Such an example is shown in FIGS. 16a and 16b, wherein the depicted surveillance system comprises multiple loudspeakers 5, each being arranged at a different location of the operation room 1. In case a violation occurs, an acoustic signal is outputted by the loudspeaker(s) closest to the location of the violation, thereby allowing the personnel to identify the cause of the violation quickly.
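Selecting the loudspeaker closest to the violation can be sketched as a nearest-neighbour lookup; the loudspeaker positions, identifiers, and the Euclidean metric are illustrative assumptions.

```python
# Illustrative sketch: choosing the loudspeaker (cf. output unit 5)
# closest to the location of a violation. Positions and identifiers
# are assumptions for illustration.
import math

def closest_loudspeaker(violation_pos, loudspeakers):
    """Return the id of the loudspeaker nearest to the violation,
    given a mapping of loudspeaker id -> position."""
    return min(loudspeakers,
               key=lambda lid: math.dist(loudspeakers[lid], violation_pos))

speakers = {"north": (0.0, 5.0), "south": (0.0, -5.0)}
# A violation at (1.0, 4.0) would be signalled by the "north" speaker.
```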

    [0238] FIG. 17 shows a flow diagram of a proposed method for monitoring—and preferably also for maintaining—the sterility of objects in an operation room, comprising the steps of:
    [0239] registering target objects and reference objects;
    [0240] attributing to each target object a (sub-)set of reference objects;
    [0241] attributing to each target object a forbidden zone and/or an allowed zone based on a space occupied by the (sub-)set of reference objects attributed to the respective target object;
    [0242] tracking the target objects and the reference objects;
    [0243] determining, using data of the tracking, if a violation has occurred, the violation comprising that at least a part of a target object
    [0244] has entered the forbidden zone attributed to that target object, and/or
    [0245] has left the allowed zone attributed to that target object.
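The violation-determining step can be sketched as follows, under the simplifying assumptions that a target object is tracked as a set of points and that each zone is a union of axis-aligned boxes; all names and the box-based zone model are illustrative, not the claimed implementation.

```python
# Minimal sketch of the violation check of FIG. 17, assuming tracked
# target parts are points and zones are unions of axis-aligned boxes
# given as (lower_corner, upper_corner) pairs. All names are assumptions.

def violates(target_parts, forbidden_boxes, allowed_boxes):
    """A violation occurs if at least a part of the target object has
    entered a forbidden zone, and/or has left its allowed zone."""
    def inside(point, box):
        lo, hi = box
        return all(l <= c <= h for c, l, h in zip(point, lo, hi))

    for part in target_parts:
        if any(inside(part, b) for b in forbidden_boxes):
            return True  # part entered the forbidden zone
        if allowed_boxes and not any(inside(part, b) for b in allowed_boxes):
            return True  # part left the allowed zone
    return False
```

In a running system this check would be evaluated against the positions delivered by the tracking system 2 for every tracked target object, each against its own attributed zones.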

    [0246] The method optionally comprises outputting a signal in case a violation has occurred.

    [0247] FIG. 18 shows a flow diagram for a method comprising the steps of:
    [0248] registering a master user,
    [0249] recognizing a command given by the master user, and
    [0250] executing the command given by the master user.

    [0251] As shown in FIG. 18, the method can optionally comprise receiving a request to change at least one attribute attributed to an object, and signalling the request to a master user.
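The workflow of FIG. 18, including the optional request signalling, might be sketched as follows; the class, the queue-based request handling, and all names are illustrative assumptions.

```python
# Illustrative sketch of FIG. 18: a request to change an attribute of
# an object is signalled to a registered master user, who approves or
# rejects it via a recognized command. All names are assumptions.

class Surveillance:
    def __init__(self):
        self.master_users = set()
        self.attributes = {}   # object id -> {attribute: value}
        self.pending = []      # queued (object id, attribute, value) requests

    def register_master_user(self, user):
        self.master_users.add(user)

    def request_change(self, obj, attribute, value):
        # The request would be signalled to the master user here,
        # e.g. via an acoustic or optical output unit (5).
        self.pending.append((obj, attribute, value))

    def execute_command(self, user, approve: bool):
        """Execute an approval/rejection command of a master user."""
        if user not in self.master_users or not self.pending:
            return
        obj, attribute, value = self.pending.pop(0)
        if approve:
            self.attributes.setdefault(obj, {})[attribute] = value
```

Gating every attribute change on a master-user command keeps a human in the loop: a rejected request leaves the object's previously attributed zones unchanged.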

    [0252] Of course, the order of executing the steps of the proposed methods can vary in any technically useful manner, including parallel execution.

    TABLE-US-00001
    LIST OF REFERENCE SIGNS
    1 operation room
    11 operation table
    12 instrument table
    2 tracking system
    21 object recognition unit
    210 person recognition unit
    211 body part recognition unit
    22 shape recognition unit
    23 voice recognition unit
    24 face recognition unit
    25 facing direction recognition unit
    26 skeleton tracking unit
    27 object perimeter tracking unit
    270 body perimeter tracking unit
    28 depth sensor
    29 camera
    3 target object
    30 person
    38 surgeon
    39 master user
    4 reference object
    40 person
    5 output unit
    6 processing unit
    60 artificial intelligence unit
    61 command recognition unit
    611 voice command recognition unit
    612 gesture command recognition unit
    62 CPU
    63 volatile memory
    64 non-volatile memory
    65 bus
    66 data structure
    7 marker
    8 display
    90 forbidden zone
    91 allowed zone