COMPUTER-IMPLEMENTED METHOD OF HANDLING AN EMERGENCY INCIDENT, COMMUNICATION NETWORK, AND EMERGENCY PROCESSING UNIT
20220245768 · 2022-08-04
Inventors
- Nikolaos Drys (Athens, GR)
- Aristeidis Giachalis (Athens, GR)
- Konstantinos Katsimingos (Athens, GR)
- Dimitrios Talasoglou (Nea Makri, GR)
- Aikaterini Skouta (Argyroupoli, GR)
- Anastasios Talampekos (Athens, GR)
CPC classification
- G06V20/35 (Physics)
Abstract
A computer-implemented method of handling an emergency incident can include receiving information on an emergency incident that includes at least one image of the emergency incident, applying a Convolutional Neural Network (CNN) object recognition and classification process for identifying and marking objects on the at least one image that are related to the emergency incident and that may cause at least one secondary hazardous situation, and processing the data relating to the identified and marked objects by applying a deep learning algorithm to the data in a Relational Network (RN) architecture, wherein the image, on the basis of the identified and marked objects, is correlated to a set of recognized objects in a database for classifying the emergency incident. A communication network, communication apparatus, and an emergency processing unit are also provided. Embodiments of such machines and systems can be configured to implement embodiments of the method.
Claims
1. A computer-implemented method of handling an emergency incident reported to a Public Safety Answering Point (PSAP), the method comprising: receiving information on an emergency incident, the information comprising at least one image of the emergency incident; applying a Convolutional Neural Network (CNN) object recognition and classification process for identifying and marking objects of the at least one image that are related to the emergency incident and that are identified as being associated with a primary hazardous situation and/or at least one secondary hazardous situation; processing the data relating to the identified and marked objects by applying a deep learning algorithm to the data in a Relational Network (RN) architecture to correlate the identified and marked objects to a set of recognized objects in a database that are defined for classifying the emergency incident.
2. The method of claim 1, wherein the RN has a Long-Short-Term-Memory (LSTM) architecture.
3. The method of claim 1, comprising: applying image processing to the at least one image using image correction software run by an emergency processing unit that is communicatively connectable to the PSAP.
4. The method of claim 1, wherein the CNN is applied based on convolution and max pooling.
5. The method of claim 1, wherein the at least one image includes an image of the primary hazardous situation, the method comprising: correlating objects identified from the at least one image as being associated with the primary hazardous situation with other objects identified from the at least one image that, in combination with the objects associated with the primary hazardous situation, are associated with the at least one secondary hazardous situation.
6. The method of claim 5, wherein the primary hazardous situation is a fire and the at least one secondary hazardous situation includes inflammable materials.
7. The method of claim 5, comprising: outputting an alert in response to verification that the at least one secondary hazardous situation may occur.
8. The method of claim 7, comprising: providing a suggestion for handling the primary hazardous situation and a suggestion for handling the at least one secondary hazardous situation.
9. A communication apparatus for handling emergency incidents, the communication apparatus comprising: an emergency processing unit communicatively connectable to an Emergency Service Network (ESINET) via which information on emergency incidents are exchangeable with a Public Safety Answering Point (PSAP), the emergency processing unit comprising a processor connected to a non-transitory computer readable medium having code stored thereon, the code defining a method that is performed by the emergency processing unit when the processor runs the code, the method comprising: applying a Convolutional Neural Network (CNN) object recognition and classification process for identifying and marking objects of at least one image received via the ESINET that are related to an emergency incident and that are identified as being associated with a primary hazardous situation of the emergency incident and/or at least one secondary hazardous situation; processing the data relating to the identified and marked objects by applying a deep learning algorithm to the data in a Relational Network (RN) architecture to correlate the identified and marked objects to a set of recognized objects in a database that are defined for classifying the emergency incident.
10. The communication apparatus of claim 9, comprising: the ESINET and the PSAP.
11. The communication apparatus of claim 9, wherein the emergency processing unit comprises a deep learning unit and an image correction unit.
12. The communication apparatus of claim 9, wherein the emergency processing unit comprises a deep learning unit.
13. The communication apparatus of claim 9, wherein the emergency processing unit comprises an image correction unit.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The invention and embodiments thereof will be described below in further detail in connection with the drawings. It should be appreciated that like reference numbers can identify similar components.
[0027] Reference numerals used in the drawings include the following:
[0028] 1, 1′ image
[0029] 2, 2′ object on the image
[0030] 3 deep learning unit
[0031] 4 communication network
[0032] 5 image correction unit
[0033] 6 Emergency Service Network (ESINET)
[0034] 7 PSAP
[0035] 8 emergency processing unit
DETAILED DESCRIPTION
[0037] As can be seen from the example image of
[0039] As illustrated here, embodiments of the present technology are able to understand entities in an image, given a set of images that contain objects that the system is programmed to search for. The lower four images of
[0041] Images 1, 1′ that are taken of emergency incidents via an emergency caller's smartphone, tablet, camera, or other type of electronic device can be transmitted to a PSAP 7 via the Emergency Service IP Network (ESINET) 6. Prior to being handled by a recipient, the emergency incident depicted on an image is processed in an emergency processing unit 8 that performs image correction as described with respect to
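The image correction performed by the emergency processing unit 8 is not specified in detail in this excerpt. As one minimal, purely illustrative sketch (the `contrast_stretch` helper and the nested-list pixel representation are assumptions, not the patented method), a linear contrast stretch could normalize a poorly exposed caller image before object recognition:

```python
def contrast_stretch(pixels, lo=0, hi=255):
    """Linearly rescale grayscale pixel values so they span the full [lo, hi] range.

    pixels: a 2-D grid (list of rows) of integer intensities.
    """
    flat = [p for row in pixels for p in row]
    p_min, p_max = min(flat), max(flat)
    if p_max == p_min:                      # flat image: nothing to stretch
        return [row[:] for row in pixels]
    scale = (hi - lo) / (p_max - p_min)
    return [[round(lo + (p - p_min) * scale) for p in row] for row in pixels]
```

For example, a dim image with intensities in [50, 200] would be remapped to cover [0, 255], which can make object boundaries easier for a downstream CNN to detect.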
[0042] After the object recognition and classification step described with respect to
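The Relation Network architecture referenced here reasons over pairs of detected objects. The skeleton below is a minimal sketch of that pairwise scheme; the object format, the `near_fire_fuel` relation function, the 50-unit radius, and the labels are illustrative assumptions, not values from the specification:

```python
from itertools import combinations
from math import hypot

def relation_network(objects, g, f):
    """Minimal RN skeleton: score every object pair with g, aggregate, classify with f."""
    pair_scores = [g(a, b) for a, b in combinations(objects, 2)]
    return f(sum(pair_scores))

# Hypothetical relation function: a fire object close to a fuel object
# signals a possible secondary hazardous situation.
def near_fire_fuel(a, b, radius=50.0):
    labels = {a["label"], b["label"]}
    dist = hypot(a["x"] - b["x"], a["y"] - b["y"])
    return 1.0 if labels == {"fire", "fuel drum"} and dist < radius else 0.0

def classify(total):
    return "secondary hazard possible" if total > 0 else "no secondary hazard detected"
```

In a trained RN, `g` and `f` would be learned neural modules rather than hand-written rules; the fixed structure, namely scoring all object pairs and aggregating the scores, is what the sketch is meant to show.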
[0043] This pool of objects may be enlarged, whereby only objects that are relevant for the emergency classification should be included. Accordingly, an equivalent RN can be composed and provided with an emergency perspective and criteria to attain an understanding of hazardous relations between the objects (such as the objects 2 of image 1 or the objects 2′ of image 1′ in
[0044] Also, the learning process can continue with the method of backpropagation, as the emergency dispatchers will provide sentences describing what happened in the scene. This process provides a machine understanding of what is crucial in the emergency scene shown on a received image and what is not; accordingly, the recipient of the emergency call, as a first responder, will be alerted to the one or more secondary hazardous situations in an emergency incident shown on an image.
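The continued learning from dispatcher feedback can be pictured with a toy supervised update rule. A single perceptron stands in here for the real deep network, and the binary feature encoding of dispatcher sentences is an assumption made only to keep the sketch self-contained:

```python
def predict(w, b, x):
    """Binary decision from a linear score, e.g. 1 = secondary hazard present."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (feature_vector, label); the label encodes the dispatcher's verdict."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = y - predict(w, b, x)   # error signal, as at the output layer in backpropagation
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b
```

Each dispatcher-labeled incident nudges the weights toward reproducing the human judgment, which is the same feedback principle backpropagation applies layer by layer in the full network.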
[0045] At the PSAP 7, the recipient of the emergency call will receive the corrected image or images of an emergency incident as well as the original images sent by the caller reporting the emergency incident via the emergency processing unit 8 or other element of the ESINET. Further, the recipient will receive, from the emergency processing unit 8, an object identification of the objects 2 or 2′ that have been identified on the respective images 1 or 1′, together with alerts for possible secondary hazardous situations that have been verified on the respective images 1 or 1′. The information received, namely, the information on the emergency incident itself as a primary hazardous situation and the information on possible further secondary hazardous situations, may then be handled accordingly by the recipient at the PSAP 7, for example, by informing the police and/or fire department about the emergency incident.
[0047] Another set of objects, "small rocks", "large rocks", and "many objects close to each other", has been combined into "debris", which, in combination with the objects mentioned above, leads to the conclusion that a human may be trapped inside debris.
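This grouping of low-level detections into composite concepts, and of composites into hazard conclusions, can be sketched as a two-stage rule lookup. The "human limb" cue object and the exact contents of both rule tables are assumptions for illustration; in the described system these associations would be learned rather than hand-coded:

```python
# Stage 1: low-level detections that together form a composite concept.
COMPOSITES = {
    "debris": {"small rocks", "large rocks", "many objects close to each other"},
}

# Stage 2: object sets (including composites) that imply a hazard conclusion.
CONCLUSIONS = {
    "human may be trapped inside debris": {"debris", "human limb"},
}

def infer_hazards(detected):
    """Merge detections into composites, then return every matching conclusion."""
    detected = set(detected)
    for name, parts in COMPOSITES.items():
        if parts <= detected:
            detected.add(name)
    return [c for c, req in CONCLUSIONS.items() if req <= detected]
```

The two stages mirror the text: rocks and clustered objects first become "debris", and only "debris" together with a human cue triggers the trapped-person conclusion.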
[0048] A further example shown in
[0049] Thus, by the method of handling an emergency incident according to the embodiments described above, "secondary" details in an image that are not immediately related to the entire emergency scene or incident, and that would otherwise be neglected or overlooked by a person, may be identified and correlated to hazardous situations that may occur. For example, as already outlined above, if there is an obvious fire in the emergency scene, the emergency agent would accordingly send firefighters to handle the situation. However, if there is a label on an object that indicates, for example, flammable material, or a school sign or the like that would be overlooked by the person handling the emergency, the method can also identify secondary hazardous situations caused by such objects and will provide a corresponding alert to the call recipient, who is thus enabled to manage the entire situation appropriately, thereby reducing the risks.
[0050] Finally, it is noted that it is also possible to train the algorithm early in the process to provide some suggestions for what type of precautionary actions should be taken in dangerous situations. For example, after alerting that there may be a human buried under debris of a landslide, a suggestion may be output to send life rescuers capable of extracting a person from such debris to the emergency location.
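A trained suggestion component could reduce, at inference time, to a mapping from verified hazards to precautionary actions. The table entries and the fallback text below are illustrative assumptions only:

```python
# Hypothetical lookup from verified hazards to suggested precautionary actions.
SUGGESTIONS = {
    "human may be trapped inside debris": "send rescuers capable of extracting a person from debris",
    "inflammable materials near fire": "warn firefighters of a possible explosion risk",
}

def suggest_actions(hazards):
    """Return one suggestion per verified hazard, with a fallback for unknown cases."""
    return [SUGGESTIONS.get(h, "no preset suggestion; escalate to the dispatcher")
            for h in hazards]
```

In the described system, the mapping would be learned alongside the hazard classifier, so new hazard types acquire suggestions as dispatchers label more incidents.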
[0051] It should be appreciated that different embodiments of the method, communication system, and communication apparatus can be developed to meet different sets of design criteria. For example, the particular type of network connection, server configuration, or PSAP configuration for a device for use in embodiments of the method can be adapted to account for different sets of design criteria. As yet another example, it is contemplated that a particular feature described, either individually or as part of an embodiment, can be combined with other individually described features, or parts of other embodiments. The elements and acts of the various embodiments described herein can therefore be combined to provide further embodiments. Thus, while certain exemplary embodiments of a telecommunication apparatus, telecommunication device, terminal device, a network, a server, a PSAP, an ESINET, an emergency processing unit, a communication system, and methods of making and using the same have been shown and described above, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.