Surveillance System and Method for Monitoring the Sterility of Objects in an Operation Room
20220398848 · 2022-12-15
Inventors
CPC classification
G06T7/246 (PHYSICS)
A61B34/20 (HUMAN NECESSITIES)
G06V20/52 (PHYSICS)
G06V40/10 (PHYSICS)
G16H40/20 (PHYSICS)
International classification
G06V20/52 (PHYSICS)
A61B34/20 (HUMAN NECESSITIES)
G06T7/246 (PHYSICS)
G06V40/10 (PHYSICS)
Abstract
A method for monitoring and maintaining the sterility of objects in an operation room (1) is proposed, comprising: • registering target objects (3) and reference objects (4); • attributing to each target object (3) a set of reference objects (4); • attributing to each target object (3) a forbidden zone (90) and/or an allowed zone (91) based on a space occupied by the reference objects (4) attributed to this target object (3); • tracking the target objects (3) and the reference objects (4); • determining, using data of the tracking, if a violation has occurred, the violation comprising that at least a part of a target object (3) has entered the forbidden zone (90) attributed to that target object (3), and/or has left the allowed zone (91) attributed to that target object (3). Furthermore, a surveillance system for performing this method is proposed.
Claims
1.-15. (canceled)
16. Surveillance system for monitoring and maintaining the sterility of objects in an operation room (1), comprising a tracking system (2) designed for tracking at least one object within the operation room (1); wherein the surveillance system is configured for: registering a first set of objects, these objects being referred to as target objects (3), and a second set of objects, these objects being referred to as reference objects (4), wherein the objects (3,4) preferably are body parts, areas, and/or items, attributing to each target object (3) a subset of the second set of reference objects (4), and attributing to each target object (3) a forbidden zone (90) and/or an allowed zone (91) based on a space occupied by the subset of reference objects (4) attributed to this target object (3); wherein the tracking system (2) is configured for tracking the target objects (3) and the reference objects (4); and wherein the surveillance system is configured for determining, using data provided by the tracking system (2), if a violation has occurred, the violation comprising that at least a part of a target object (3): has entered the forbidden zone (90) attributed to that target object (3), and/or has left the allowed zone (91) attributed to that target object (3).
17. The surveillance system of claim 16, wherein the surveillance system further comprises an output unit (5) and wherein the surveillance system is configured for outputting a signal via the output unit (5) in case a violation has occurred.
18. The surveillance system of claim 17, wherein the surveillance system is configured for outputting a signal that indicates the location in which a violation has occurred.
19. The surveillance system of claim 16, wherein the surveillance system comprises at least one marker (7) that is attached to an object (3,4) and wherein the tracking system (2) is designed for measuring data concerning a position of the at least one marker (7), and/or for retrieving data from the marker (7).
20. The surveillance system of claim 16, wherein the surveillance system comprises an object recognition unit (21) designed for recognizing at least one object (3,4), wherein the at least one object (3,4) preferably is a person, a body part, an area, and/or an item.
21. The surveillance system of claim 20, wherein the surveillance system comprises a data structure (66) in which data about objects (3,4) is stored, and wherein the surveillance system is configured for automatically registering an object (3,4) about which data is stored within the data structure (66) and which is recognized by the object recognition unit (21), and/or attributing attributes to an object (3,4) about which data is stored within the data structure (66) and which is recognized by the object recognition unit (21), wherein the attribution of the attribute is based on data about this object (3,4) stored in the data structure (66).
22. The surveillance system of claim 16, wherein the surveillance system comprises a facing direction recognition unit (25) that is configured for recognizing the facing direction of a person (30,40), and wherein the forbidden zone (90) and/or an allowed zone (91) attributed to a target object (3) depends on the facing direction of the person (30,40), wherein the person (30,40) preferably comprises at least one body part that is the target object (3), and/or comprises at least one body part that is attributed to the target object (3) as a reference object (4).
23. The surveillance system of claim 16, comprising a command recognition unit (61) designed for recognizing commands, wherein the surveillance system is configured for registering master users (39), recognizing commands given by a master user (39) using the command recognition unit (61), and executing the commands given by the master user (39).
24. The surveillance system of claim 16, comprising a processing unit (6), wherein the processing unit (6) preferably is configured for processing data in connection with registering objects (3,4), attributing reference objects (4) and the respective forbidden zone (90) resp. allowed zone (91) to a target object (3), and/or determining if a violation has occurred.
25. Method for monitoring and maintaining the sterility of objects in an operation room (1), using the surveillance system of claim 16, the method comprising: registering a first set of objects, such objects being referred to as target objects (3), and a second set of objects, such objects being referred to as reference objects (4), wherein the objects (3,4) preferably are body parts, areas, and/or items; attributing to each target object (3) a subset of the second set of reference objects (4); attributing to each target object (3) a forbidden zone (90) and/or an allowed zone (91) based on a space occupied by the subset of reference objects (4) attributed to this target object (3); tracking, using the tracking system (2) of the surveillance system, the target objects (3) and the reference objects (4); determining, using data of the tracking, if a violation has occurred, the violation comprising that at least a part of a target object (3) has entered the forbidden zone (90) attributed to that target object (3), and/or has left the allowed zone (91) attributed to that target object (3).
26. The method of claim 25, comprising outputting a signal in case a violation has occurred, preferably wherein the outputted signal indicates the location in which a violation has occurred.
27. The method of claim 25, wherein the forbidden zone (90) resp. the allowed zone (91) attributed to a target object (3) satisfies exactly one of the following properties: it comprises the space occupied by the subset of reference objects (4) attributed to this target object (3), it does not comprise any part of the space occupied by the subset of reference objects (4) attributed to this target object (3).
28. The method of claim 25, wherein a first group of target objects (3) and a second group of target objects (3) are defined, wherein a common first subset of reference objects (4) is comprised in each respective subset of reference objects (4) attributed to a target object (3) of the first group, and/or a common second subset of reference objects (4) is comprised in each respective subset of reference objects (4) attributed to a target object (3) of the second group.
29. The method of claim 28, wherein a common first forbidden zone (90) resp. a common first allowed zone (91) is comprised in each respective forbidden zone (90) resp. allowed zone (91) attributed to a target object (3) of the first group, and/or a common second forbidden zone (90) resp. a common second allowed zone (91) is comprised in each respective forbidden zone (90) resp. allowed zone (91) attributed to a target object (3) of the second group.
30. The method of claim 25, wherein the forbidden zone (90) and/or an allowed zone (91) attributed to a target object (3) depends on the facing direction of a person (30,40), wherein the person (30,40) preferably comprises at least one body part that is the target object (3), and/or comprises at least one body part that is attributed to the target object (3) as a reference object (4).
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0172] Preferred embodiments of the invention are described in the following with reference to the drawings, which are solely for the purpose of illustrating the present preferred embodiments of the invention and not for the purpose of limiting the same. In the drawings,
DESCRIPTION OF PREFERRED EMBODIMENTS
[0193] A surveillance system for automatically monitoring—and preferably also for maintaining—the sterility of objects, such as body parts, areas, and/or items, in an operation room 1, is proposed, an example of which is shown in
[0194] The depicted surveillance system comprises a tracking system 2 that is designed for tracking objects in the operation room 1, preferably by at least quasi-continuously determining the position of these objects in the operation room 1. The tracking system 2 can for example comprise an AZURE KINECT DK system (by Microsoft), a DYNAMIC VISION SENSOR (by iniVation), and/or a SPECK sensor (by iniVation/aiCTX), each e.g. as available on Aug. 30, 2019. An AZURE KINECT DK system, which comprises a depth sensor 28, a camera 29, and an artificial intelligence unit 60, can e.g. be comprised in an object recognition unit 21, a body part recognition unit 211, a shape recognition unit 22, a facing direction recognition unit 25, a skeleton tracking unit 26, an object perimeter tracking unit 27, a body perimeter tracking unit 270, and/or a gesture command recognition unit 612. A DYNAMIC VISION SENSOR can e.g. be comprised in an object recognition unit 21, a body part recognition unit 211, a shape recognition unit 22, a facing direction recognition unit 25, a skeleton tracking unit 26, an object perimeter tracking unit 27, a body perimeter tracking unit 270, and/or a gesture command recognition unit 612. A SPECK sensor can e.g. be comprised in a face recognition unit 24, a person recognition unit 210, and/or a facing direction recognition unit 25.
[0195] The surveillance system is configured for registering objects 3, 4, preferably by using a processing unit 6 and/or a data structure 66. By registering an object 3, 4, that object 3, 4 becomes known to the surveillance system, so that the surveillance system can handle this object 3, 4 information-technologically, e.g. store data concerning this object 3, 4 (e.g. data that allows recognition and/or tracking of this object 3, 4 and/or attributes attributed to this object 3, 4). The objects 3, 4 preferably are body parts, areas, and/or items. Examples of body parts to be registered with the surveillance system are a person's full body, hand, arm, and/or head. Examples of items to be registered with the surveillance system are an operation table 11 and/or an instrument table 12. An example of an area to be registered with the surveillance system is an area defined by an operation table 11 that is fixed (or at least assumed to be fixed), e.g. the area on and above the operation table. A registered object 3, 4 can be a target object 3, which is being monitored for violations, and at the same time be a reference object 4, which is used for defining the forbidden zone 90 and/or an allowed zone 91 for a target object 3.
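The registration step described in this paragraph can be sketched as follows; this is a minimal illustration assuming a simple in-memory registry, and the class, field, and object names are assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class RegisteredObject:
    name: str
    kind: str                  # "body_part", "item", or "area"
    sterile: bool = False      # example of an attribute attributed to the object

# data structure (66) in which data about registered objects is stored
registry: dict = {}

def register(obj: RegisteredObject) -> None:
    """Make the object known to the surveillance system."""
    registry[obj.name] = obj

register(RegisteredObject("operation_table", "item"))
register(RegisteredObject("surgeon_hands", "body_part", sterile=True))
```

An object registered this way can later serve both as a target object and as a reference object, as noted above.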
[0196] In the following, multiple examples are described that concern target objects 3 in form of body parts, in particular full bodies; it is however understood that—where applicable and with the necessary modifications—these examples could as well be described for other target objects 3, such as items and/or areas.
[0197] The surveillance system is configured for attributing to each target object 3 a set of registered objects 4, and based thereon a forbidden zone 90 and/or an allowed zone 91. In the example of
[0198] The tracking system 2 depicted in
[0199] In addition, the tracking system 2 shown in
[0202] In case of body parts, in particular of full bodies, a skeleton tracking unit 26 can be used for estimating, and in this sense determining, the space occupied by these body parts. In case of items, a shape recognition unit 22 can be used for determining the space occupied by these items.
[0203] The depicted surveillance system is further configured for determining (e.g. by estimating), by using data provided by the tracking system 2, if a violation has occurred, namely that a target object 3
[0204] has entered the forbidden zone 90 attributed to that target object 3, and/or
[0205] has left the allowed zone 91 attributed to that target object 3.
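The violation criterion above can be sketched as follows; modelling zones as axis-aligned boxes and target objects as point sets are simplifying assumptions for illustration only:

```python
def point_in_box(p, box):
    """True if the 3-D point p lies inside the axis-aligned box (lo, hi)."""
    lo, hi = box
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def violation_occurred(target_points, forbidden=None, allowed=None):
    """A violation occurs if any part of the target object has entered its
    forbidden zone and/or has left its allowed zone."""
    for p in target_points:
        if forbidden is not None and point_in_box(p, forbidden):
            return True
        if allowed is not None and not point_in_box(p, allowed):
            return True
    return False

# hypothetical forbidden zone derived from an operation table
table_zone = ((0.0, 0.0, 0.0), (2.0, 1.0, 1.0))
```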
[0206] Such a violation is depicted in
[0210] The forbidden zone 90 resp. allowed zone 91 attributed to a target object is preferably defined based on the space occupied by the reference objects 4 that are attributed to the target object, e.g. form an environment thereof. The tracking system 2 is preferably configured for tracking the reference objects 4, which allows for adjusting a forbidden zone 90 resp. an allowed zone 91 according to a displacement of the attributed reference objects 4. Such an example is shown in
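The adjustment of a zone to displaced reference objects can be sketched as recomputing the zone from the tracked reference-object positions on every tracking update; the bounding-box model and the safety margin are assumptions for illustration:

```python
def bounding_zone(reference_points, margin=0.3):
    """Axis-aligned bounding box of the space occupied by the reference
    objects, padded by a safety margin; recomputed on every tracking update."""
    lo = tuple(min(p[i] for p in reference_points) - margin for i in range(3))
    hi = tuple(max(p[i] for p in reference_points) + margin for i in range(3))
    return lo, hi

# if the reference object (e.g. an instrument table) is displaced by 2 m
# along x, the next tracking update shifts the attributed zone accordingly
zone_before = bounding_zone([(0, 0, 0), (1, 1, 1)])
zone_after = bounding_zone([(2, 0, 0), (3, 1, 1)])
```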
[0211] In the example shown in
[0212] The forbidden zone (resp. the allowed zone) attributed to each target object is preferably based on the space occupied by the reference objects attributed thereto in that either the space occupied by the reference objects is comprised in the forbidden zone (resp. allowed zone) or that no part of the space occupied by the reference objects is comprised in the forbidden zone (resp. allowed zone). However, the forbidden zone (resp. allowed zone) can e.g. also be based on the space occupied by the reference objects attributed to a target object in that it is the intersection and/or union of such zones and/or their complementary zones.
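The composition of zones by intersection, union, and complement mentioned here can be sketched by modelling a zone as a membership predicate over points; the one-dimensional example zones are assumptions for illustration:

```python
def union(*zones):
    """Zone containing every point that lies in at least one given zone."""
    return lambda p: any(z(p) for z in zones)

def intersection(*zones):
    """Zone containing every point that lies in all given zones."""
    return lambda p: all(z(p) for z in zones)

def complement(zone):
    """Zone containing every point not in the given zone."""
    return lambda p: not zone(p)

# hypothetical example: the zone induced by the operation table united with
# the zone induced by the instrument table forms a forbidden zone; its
# complement can serve as the corresponding allowed zone
table_zone = lambda p: 0.0 <= p[0] <= 2.0
tray_zone = lambda p: 3.0 <= p[0] <= 4.0
forbidden = union(table_zone, tray_zone)
allowed = complement(forbidden)
```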
[0213] As shown in
[0214] The surveillance system can be configured for monitoring target objects 3 in form of body parts as exemplified in
[0215] As shown in
[0216] In many cases, body parts other than the full body are monitored only for persons that are considered sterile, such as surgeons. Namely, while for non-sterile persons typically the whole body is considered non-sterile, for sterile persons it is efficient to only keep certain body parts, e.g. the hands and/or the front, sterilized; and thus it can be beneficial to monitor individual body parts of such a sterile person.
[0217] Body parts of a person can be reference objects 4 attributed to target objects 3 in form of other body parts of the same person. Such an example is shown in
[0221] In particular, the allowed zone 91 attributed to the hands of the person in
[0222] In the example shown in
[0224] In the depicted example, two persons 30, 40 wear different markers 7, 7′ and from information retrieved from the markers 7, 7′, e.g. by consulting a data structure 66 in which information on the different markers 7, 7′ is stored, the surveillance system knows that of the first person 30 only the arms and hands are allowed to enter the area; and that the second person 40 is not allowed to enter the area at all. Therefore, the body of the first person 30 without the arms/hands and the full body of the second person 40 are registered as target objects 3, and the area 4 is attributed to both as their respective forbidden zone 90.
[0225] The depicted tracking system 2 comprises a depth sensor 28 that supports the surveillance system in determining if something is positioned inside the area 4 or not; and a skeleton tracking unit 26 that supports the surveillance system in estimating the position of the respective skeleton of each of the persons 30, 40. Based on data about the respective skeleton, the surveillance system estimates the space occupied by the first person's 30 body minus its arms/hands and the space occupied by the second person's 40 full body, i.e. the space occupied by the respective target objects 3. Based on these estimates, the surveillance system estimates—and in this sense determines—if any of the target objects 3 have entered their respective forbidden zone 90, namely the area 4. If that is the case, e.g. because the second person 40 has entered the area 4 as shown in
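The skeleton-based estimate in this example can be sketched as follows; the joint names and the planar area test are assumptions for illustration and do not follow any particular tracking SDK:

```python
ARM_JOINTS = {"shoulder_l", "elbow_l", "wrist_l", "hand_l",
              "shoulder_r", "elbow_r", "wrist_r", "hand_r"}

def target_joints(skeleton, excluded=frozenset()):
    """Joints belonging to the target object, i.e. the body minus any
    excluded body parts (e.g. the arms/hands of the first person)."""
    return [p for name, p in skeleton.items() if name not in excluded]

def enters_area(skeleton, area_test, excluded=frozenset()):
    """True if any tracked joint of the target object lies inside the area."""
    return any(area_test(p) for p in target_joints(skeleton, excluded))

# hypothetical area on and above the operation table
area = lambda p: 0.0 <= p[0] <= 2.0 and 0.0 <= p[1] <= 1.0

# first person: only the hand reaches into the area, and arms/hands are
# excluded from this person's target object
person_30 = {"head": (5.0, 5.0, 1.8), "hand_r": (1.0, 0.5, 1.2)}
# second person: the head (part of the full-body target object) is inside
person_40 = {"head": (1.0, 0.5, 1.8), "hand_r": (1.0, 0.5, 1.2)}
```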
[0226] In some cases, it may be beneficial to define that the set of reference objects, the forbidden zone, the allowed zone, and/or the cases in which a violation occurs depend on a further parameter, e.g. the facing direction of a person.
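A facing-direction-dependent zone can be sketched as follows; restricting the region in front of a person to a cone around the facing direction, and the 90-degree half-angle, are assumptions for illustration:

```python
import math

def in_front(point, torso, facing, max_angle_deg=90.0):
    """True if the point lies within max_angle_deg of the facing direction,
    measured from the torso position in the horizontal plane."""
    v = (point[0] - torso[0], point[1] - torso[1])
    v_norm = math.hypot(*v) or 1.0
    f_norm = math.hypot(*facing) or 1.0
    cos_angle = (v[0] * facing[0] + v[1] * facing[1]) / (v_norm * f_norm)
    return cos_angle >= math.cos(math.radians(max_angle_deg))

# a hand in front of the person may lie in the allowed zone ...
front_ok = in_front((1.0, 0.2), torso=(0.0, 0.0), facing=(1.0, 0.0))
# ... whereas a hand behind the person's back would not
back_ok = in_front((-1.0, 0.2), torso=(0.0, 0.0), facing=(1.0, 0.0))
```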
[0227] An example of such a case is depicted in
[0229] The surveillance system is preferably configured for registering one or more master users 39 and for executing processes based on commands given by these master users 39. As shown in
[0230] In the depicted example, a human target object 3 requests that an attribute, e.g. its sterility status, is changed, which possibly leads to the attribution of a new set of reference objects, a new forbidden zone, and/or a new allowed zone to at least one of its body parts. The surveillance system signals the request to the master user 39, e.g. via an acoustic output unit 5 or via an optical output unit 5 (e.g. a display 8). The master user 39 can react by giving a command, e.g., as depicted here, a gesture command, thereby approving or rejecting the attribution of the new attribute to the target object 3. A display 8 for use with the surveillance system can be designed as a monitor (as shown in
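The request/approval flow described in this paragraph can be sketched as follows; the command interface, the queue semantics, and all names are assumptions for illustration:

```python
master_users = {"master_user_39"}
pending = []  # queued attribute-change requests: (object, attribute, value)

def request_change(obj, attribute, value):
    """A target object requests an attribute change, e.g. of its sterility
    status; the system would signal this request via the output unit (5)."""
    pending.append((obj, attribute, value))

def handle_command(user, approve, attributes):
    """Execute an approve/reject command, but only if given by a master user."""
    if user not in master_users or not pending:
        return False
    obj, attribute, value = pending.pop(0)
    if approve:
        attributes.setdefault(obj, {})[attribute] = value
    return True

attrs = {}
request_change("target_object_3", "sterile", True)
handle_command("master_user_39", approve=True, attributes=attrs)
```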
[0231] As shown in
[0232] In addition or alternatively, the surveillance system can use markers (not shown in
[0233] In addition or alternatively, the surveillance system can be designed for registering objects based on user input, e.g. inputted via a mouse, a keyboard, a touchpad, a touchscreen, and/or gesture commands. Similarly, attributes can be attributed via user input.
[0234] The object recognition unit 21 can in particular be designed as a person recognition unit 210, which can allow for recognizing individual persons. The person recognition unit 210 can e.g. use biometrical data of a person, a password, and/or a marker for recognizing a person. Preferably, the surveillance system is configured for automatically registering the recognized person resp. a body part thereof, e.g. as a target object, as a reference object, and/or as a master user. A person recognition unit 210 can for example comprise a voice recognition unit 23 and/or a face recognition unit 24.
[0235] In the example of
[0237] In some embodiments, the surveillance system is configured for outputting a signal that indicates the location in which a violation occurs. Such an example is shown in
[0246] The method optionally comprises outputting a signal in case a violation has occurred.
[0251] As shown in
[0252] Of course, the order of executing the steps of the proposed methods can vary in any technically useful manner, including parallel execution.
LIST OF REFERENCE SIGNS
1 operation room
11 operation table
12 instrument table
2 tracking system
21 object recognition unit
210 person recognition unit
211 body part recognition unit
22 shape recognition unit
23 voice recognition unit
24 face recognition unit
25 facing direction recognition unit
26 skeleton tracking unit
27 object perimeter tracking unit
270 body perimeter tracking unit
28 depth sensor
29 camera
3 target object
30 person
38 surgeon
39 master user
4 reference object
40 person
5 output unit
6 processing unit
60 artificial intelligence unit
61 command recognition unit
611 voice command recognition unit
612 gesture command recognition unit
62 CPU
63 volatile memory
64 non-volatile memory
65 bus
66 data structure
7 marker
8 display
90 forbidden zone
91 allowed zone