COMPUTER ENHANCED SAFETY SYSTEM
20230143767 · 2023-05-11
Inventors
CPC classification
F16P3/142
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G06F3/017
PHYSICS
G06F3/0346
PHYSICS
International classification
F16P3/14
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
A device for assisting with the safe use of a machine. The device has a user interface for displaying an augmented reality scene, a camera for capturing, in real time, video of an area of interest for inclusion in the augmented reality scene, a positioning module for determining the position of the device relative to the machine, a shape recognition module for recognizing an actual location near the machine in the video, a database of virtual safe areas around the machine which are displayable in the augmented reality scene, and a matching module for matching the virtual safe areas to the corresponding actual component, to identify to a user, in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe. A library of signs, arrows and the like, which are displayable and represent hazards, prohibitions and mandatory actions to the user, is also available.
Claims
1. A device for assisting with the safe use of a machine, the device comprising: a user interface for displaying an augmented reality scene; a camera for capturing, in real time, video of an area of interest for inclusion in an augmented reality scene; a positioning module for determining a position of the device relative to the machine; a shape recognition module for recognizing an actual location near the machine in the video; a database of virtual safe areas around the machine which are displayable in the augmented reality scene; and a matching module for matching the virtual safe areas to a corresponding actual component, to identify to a user, in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe.
2. The device as claimed in claim 1, wherein sensors determine an active status of the machine and the safety of a zone is determined depending on whether the zone is in or near an area where the machine is inactive or active.
3. The device as claimed in claim 1, wherein a reference marker is provided at a predetermined location on the machine to orient the device with respect to the machine, and wherein the machine is paired with the device using a near field communication system that detects if the device is in the vicinity of the machine and then instructs the user to look at the machine or scan the reference marker before proceeding.
4. The device as claimed in claim 1, wherein the device is a handheld device.
5. The device as claimed in claim 4, wherein the handheld device is a tablet computer or smartphone.
6. The device as claimed in claim 1, wherein the device is an augmented reality headset.
7. The device as claimed in claim 1, wherein the user interface is a graphical user interface and/or an audio output.
8. The device as claimed in claim 4, wherein the user interface is a graphical user interface that is controlled by physical interaction with the handheld device.
9. The device as claimed in claim 8, wherein the physical interaction includes gestures and/or voice and/or a keyboard and/or mouse which interact with objects in an augmented reality environment.
10. The device as claimed in claim 8, wherein the graphical user interface includes an augmented reality experience or a gaming object and animation combined with instruction windows.
11. The device as claimed in claim 1, wherein the positioning module includes a GPS location device.
12. The device as claimed in claim 1, wherein the positioning module includes a local network device which determines the position of the device with respect to nodes in a local network.
13. The device as claimed in claim 12, wherein the positioning module includes a combination of GPS and the local network.
14. The device as claimed in claim 1, wherein the positioning module uses a grid/mesh network of gaming objects positioned a set distance away from the machine.
15. The device as claimed in claim 1, wherein the positioning module defines an area relative to the machine and detects when the device has moved into/out of an area corresponding to a virtual safe area.
16. The device as claimed in claim 1, wherein the positioning module provides updated device position information to allow the augmented reality scene to reflect a change of location of the device with respect to the machine.
17. The device as claimed in claim 1, wherein the matching module indicates matching by an alert on the user interface.
18. The device as claimed in claim 17, wherein the alert is a flashing virtual area.
19. The device as claimed in claim 17, wherein the alert is a change in color of the virtual area.
20. The device as claimed in claim 17, wherein the alert is a sound.
21. The device as claimed in claim 1, wherein a reference marker is provided at a predetermined location on the machine to orient the device with respect to the machine.
22. The device as claimed in claim 21, wherein the reference marker is a barcode.
23. The device as claimed in claim 21, wherein the reference marker is a QR code.
24. The device as claimed in claim 21, wherein the reference marker is scanned and the augmented reality scene is mapped out on the user interface with respect to that reference point.
25. The device as claimed in claim 1, wherein a plurality of devices are used in conjunction with a single machine.
26. The device as claimed in claim 25, wherein at least one of the plurality of devices is provided with location information on the other devices.
27. The device as claimed in claim 1, wherein multiple machines are controlled from a single device.
28. A computer implemented method for assisting with the safe use of a work machine, the method comprising the steps of: capturing, in real time, video of an area of interest for inclusion in an augmented reality scene on a user interface of a device; determining a position of the device relative to the machine; recognizing an actual area near the machine in the video; accessing a database of virtual safe areas which are displayable in the augmented reality scene; and matching the virtual safe areas to a corresponding actual area to show a user, in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe.
29. The method as claimed in claim 28 wherein sensors determine an active status of the machine and a safety of a zone is determined depending on whether the zone is in or near an area where the machine is inactive or active.
30. The method as claimed in claim 28, wherein a reference marker is provided at a predetermined location on the machine to orient the device with respect to the machine, and wherein the machine is paired with the device using a near field communication system that detects if the device is in the vicinity of the machine and then instructs the user to look at the machine or scan the reference marker before proceeding.
31. The method as claimed in claim 28, wherein the user interface is a graphical user interface which is controlled by gestures which interact with objects in an augmented reality environment.
32. The method as claimed in claim 28, wherein the step of determining the position of the device relative to a machine uses GPS.
33. The method as claimed in claim 28, wherein the step of determining the position of the device relative to a machine uses the position of the device with respect to nodes in a local network.
34. The method as claimed in claim 28, wherein the step of determining the position of the device relative to a machine combines GPS and a local network.
35. The method as claimed in claim 28, wherein the step of determining the position of the device relative to a machine uses a grid/mesh network of invisible gaming objects positioned a set distance away from the machine.
36. The method as claimed in claim 28, wherein the step of determining the position of the device relative to the machine defines an area relative to the machine and detects when the device has moved out of the area.
37. The method as claimed in claim 36, wherein an alert is provided to the device if it moves out of the area.
38. The method as claimed in claim 28, wherein the step of determining the position of the device relative to the machine detects and plots the position of the device with respect to the machine.
39. The method as claimed in claim 28, wherein, in the step of determining the position of the device relative to a machine, the device position information is updated to allow the augmented reality scene to reflect a change of location of the device with respect to the machine.
40. The method as claimed in claim 28, wherein a camera captures video of the area of interest for inclusion in the augmented reality scene, and wherein a virtual component is overlaid with an actual component image received from the camera.
41. The method as claimed in claim 28, wherein matching is indicated by an alert on the user interface.
42. The method as claimed in claim 41, wherein the alert is a flashing virtual area.
43. The method as claimed in claim 41, wherein the alert is a change in color of the virtual area.
44. The method as claimed in claim 41, wherein the alert is a sound.
45. The method as claimed in claim 31, wherein the graphical user interface includes an augmented reality experience, and wherein a reference marker is provided at a predetermined location on the machine to orient the augmented reality experience with respect to the machine.
46. The method as claimed in claim 45, wherein the reference marker is scanned and the augmented reality scene is mapped out in the device with respect to the reference point.
47. The method as claimed in claim 28, wherein the user is compelled to acknowledge that they have seen/read/listened to content before the user may proceed onto a subsequent piece of content.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0097] The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:
DETAILED DESCRIPTION OF THE DRAWINGS
[0105] The present invention provides a device with an augmented reality interface which assists a user in identifying safe areas around a work machine. The device of the present invention may be implemented as an AR headset, or as a handheld device such as a rugged tablet computer.
[0106] The user's device can be connected directly to the machine or a remote service tablet via a wireless network connection such as Wi-Fi™, Bluetooth™ or a mesh network, or alternatively connected to the machine's telematics API through an internet connection (GSM, 3G, 4G, 5G, etc.).
[0107] The machine or remote service tablet may act as its own independent server to store the augmented reality content required to be displayed on the headwear. This independent server removes the need for an internet connection, so the user can avail themselves of the experiences in offline, remote geographic locations. Based on whether the machine is running or not, digital content is displayed on or around the machine to notify the user of “unseen” dangers. These notifications can be tailored to the location of the user relative to the machine.
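By way of illustration, the selection logic of this paragraph might be sketched as follows. The function names, the danger radius and the notification fields are illustrative assumptions, not part of the specification:

```python
# Illustrative sketch: tailoring "unseen danger" notifications to the machine's
# running status and the user's location relative to the machine.
from dataclasses import dataclass
from math import hypot

@dataclass
class Notification:
    message: str
    severity: str  # e.g. "warning" or "danger"

def select_notifications(machine_running: bool,
                         user_xy: tuple,
                         machine_xy: tuple,
                         danger_radius: float = 5.0) -> list:
    """Return the safety prompts to display for the current status and position."""
    distance = hypot(user_xy[0] - machine_xy[0], user_xy[1] - machine_xy[1])
    notifications = []
    if machine_running:
        notifications.append(Notification("Machine is running", "warning"))
        if distance < danger_radius:
            notifications.append(Notification("You are inside a hazard zone", "danger"))
    return notifications
```

When the machine is idle no prompts are raised; when it is running, the prompts escalate as the device approaches the machine.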
[0108] The AR environment includes Environmental Health & Safety (EHS) prompts where applicable to inform/remind the user of all the associated hazards/dangers around the perimeter of the machine and upon or under it. It is known to list EHS warnings in the operators' manual and operating procedures documentation. However, by integrating them into an AR system in which this information is combined with real time video of the machine, an extra level of context and understanding is provided. In at least one embodiment, the device of the present invention compels a user to acknowledge that they have seen/read/listened to content before the user may proceed onto a subsequent piece of content. This feature also confirms that the user has accepted that they have understood the instruction.
[0109] The AR environment creates zones around the physical machine, which can be highlighted in different colours. These virtual components map on to the actual components of the machine to show which zones are safe around the machine. Arrows, warning symbols and audio prompts can also be used to provide safety guidance to a user around the machine and notify the user of any dangers as they move around the machine. The handheld device solution has arrows, voice commands and other symbols incorporated with real time video of the machine. The AR headset allows the user to look straight at the machine when viewing the AR content.
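A minimal sketch of the zone-highlighting rule described here, under the assumption (not stated in the specification) that safe zones are drawn green and unsafe zones red, and that all zones are treated as safe while the machine is inactive:

```python
# Illustrative sketch: choose the overlay colour for a virtual zone in the AR
# scene. Colour choices and the inactive-machine rule are assumptions.
def zone_colour(zone_is_safe: bool, machine_active: bool) -> str:
    """Return a display colour for a virtual zone around the machine."""
    if not machine_active:
        return "green"  # machine inactive: the zone is treated as safe
    return "green" if zone_is_safe else "red"
```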
[0110] Positional awareness of the device may be provided by a grid/mesh network of invisible gaming objects positioned a set distance away from the machine. If the user collides with one of these gaming objects, then a notification can be displayed on the device and a signal sent to the machine. The machine, upon receipt of this information, checks its status to determine whether it is in operation, e.g. tracking or crushing/screening. Based upon that status, the machine can then trigger another event, such as stopping the machine from moving (if tracking), stopping a belt and crusher/screen (if crushing/screening), or allowing the user to proceed towards the machine if it is not operating.
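The status check and triggered events of this paragraph can be sketched as a simple dispatch. The status strings and action names are illustrative assumptions; the three outcomes follow the description:

```python
# Illustrative sketch: decide the event the machine triggers when the user's
# device collides with an invisible boundary object.
def on_boundary_collision(machine_status: str) -> str:
    """Map the machine's current status to a safety action."""
    if machine_status == "tracking":
        return "stop_tracks"               # stop the machine from moving
    if machine_status in ("crushing", "screening"):
        return "stop_belt_and_crusher"     # stop the belt and crusher/screen
    return "allow_approach"                # machine not operating: user may proceed
```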
[0111] In addition to the above, the user will also have the ability to remotely stop the machine at any point directly from the headwear/handheld device if the user is within the immediate vicinity of the machine. This drastically cuts down the time needed to reach the nearest E-stop on the machine, especially if the user is away from the machine when the alert is raised and would otherwise have to put themselves in potential danger by going up to the machine to stop it.
[0112] Multiple users may share the same experience. This means that if more than one user is in the vicinity of the machine, each user can be notified of the others' whereabouts/location, for example if a second user/device is at the other side of the machine and not in the first user's line of sight.

[0113] This also covers the case where the other user/device is in a machine that is moving into the area the first user occupies. The user in danger can then be notified of the immediate risk through an alarm or visual cue and told to exit that area to avoid coming to harm.
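The mutual-notification behaviour of the last two paragraphs could be sketched as a pairwise distance check. The alert distance and all names are illustrative assumptions:

```python
# Illustrative sketch: find pairs of users near the machine who should be
# notified of each other's presence, even when out of line of sight.
from math import hypot

def proximity_alerts(positions: dict, alert_distance: float = 10.0) -> list:
    """Return (user, other) pairs closer together than alert_distance.

    positions maps a user/device id to an (x, y) location.
    """
    users = list(positions)
    alerts = []
    for i, a in enumerate(users):
        for b in users[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            if hypot(ax - bx, ay - by) < alert_distance:
                alerts.append((a, b))
    return alerts
```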
[0115] The device uses positioning module 67 which determines the position of the device relative to the machine and which may use a reference marker such as a QR code as a reference marker to orient the device relative to the machine. Shape recognition module 69 is used to recognise the actual component of the machine in the video. The database of safe areas 73 provides a graphical representation displayable in the augmented reality scene. Matching module 71 matches a location in a safe area to the corresponding actual area around or on the work machine.
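How the database of safe areas 73 and matching module 71 might cooperate can be illustrated as follows. The class and method names are assumptions for illustration, not taken from the specification:

```python
# Illustrative sketch: the matching module looks up a location label produced by
# the shape recognition module in the database of safe areas, and reports
# whether the corresponding actual area is safe or unsafe.
class SafeAreaDatabase:
    def __init__(self, safe_areas: dict):
        self._areas = safe_areas  # maps an area label to True (safe) / False (unsafe)

    def lookup(self, label: str):
        return self._areas.get(label)  # None if the label is unknown

def match_area(recognised_label: str, db: SafeAreaDatabase) -> str:
    """Matching module: map a recognised location to a safe/unsafe overlay state."""
    safe = db.lookup(recognised_label)
    if safe is None:
        return "unknown"
    return "safe" if safe else "unsafe"
```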
[0116] The method comprises the steps of: capturing, in real time, video of an area of interest for inclusion in an augmented reality scene 101; determining a position of the device relative to the machine 102;
[0117] recognising an actual component of the machine in the video 103;
[0118] accessing a database of virtual safe areas which are displayable in the augmented reality scene 104; and matching the virtual component to the corresponding actual component to show a user, in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe 105.
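The steps above can be sketched end to end as follows. The capture, positioning and recognition steps are stubbed out since they depend on hardware; only the control flow is shown, and all names are illustrative assumptions. Areas missing from the database are treated as unsafe by default in this sketch:

```python
# Illustrative sketch of steps 101-105: capture, position, recognise, look up,
# and match, reporting safe/unsafe areas to the user.
def run_safety_method(capture_frame, locate_device, recognise_area,
                      safe_area_db: dict) -> dict:
    frame = capture_frame()                  # step 101: capture real-time video
    position = locate_device()               # step 102: device position vs machine
    label = recognise_area(frame, position)  # step 103: recognise actual area
    safe = safe_area_db.get(label)           # step 104: safe-area database lookup
    # step 105: match and report; unknown areas are treated as unsafe by default
    return {"area": label, "status": "safe" if safe else "unsafe"}
```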
[0124] The description of the invention including that which describes examples of the invention with reference to the drawings may comprise a computer apparatus and/or processes performed in a computer apparatus. However, the invention also extends to computer programs, particularly computer programs stored on or in a carrier adapted to bring the invention into practice. The program may be in the form of source code, object code, or a code intermediate source and object code, such as in partially compiled form or in any other form suitable for use in the implementation of the method according to the invention. The carrier may comprise a storage medium such as ROM, e.g. CD ROM, or magnetic recording medium, e.g. a memory stick or hard disk. The carrier may be an electrical or optical signal which may be transmitted via an electrical or an optical cable or by radio or other means.
[0125] The description of the invention, including that which describes examples of the invention, describes the use of augmented reality (AR) systems and apparatus.
[0126] AR overlays virtual objects onto the real-world environment. AR devices such as the Microsoft HoloLens and various enterprise-level “smart glasses” are transparent, letting the user see everything in front of them as if wearing a weak pair of sunglasses. The technology is designed for completely free movement while projecting images over whatever the user looks at. The term mixed reality refers to overlaying and anchoring virtual objects to the real world.
[0127] In the specification the terms “comprise, comprises, comprised and comprising” or any variation thereof and the terms “include, includes, included and including” or any variation thereof are considered to be totally interchangeable and they should all be afforded the widest possible interpretation and vice versa.
[0128] The invention is not limited to the embodiments hereinbefore described but may be varied in both construction and detail.