SYSTEM FOR VISUALIZING AN OBJECT TO A REMOTE USER FOR REMOTE ASSISTANCE APPLICATIONS
20210168353 · 2021-06-03
Inventors
- Gianluca Palumbo (Rome RM, IT)
- Sverre Dokken (Limassol, CY)
- Jorgen Grindevoll (Sabaneta Antioquia, CO)
- Jens Hjelmstad (Lillestrom, NO)
- Alexis MICHAEL (Ayia Fyla, CY)
CPC classification
H04N13/111
ELECTRICITY
B63B79/30
PERFORMING OPERATIONS; TRANSPORTING
B63B79/10
PERFORMING OPERATIONS; TRANSPORTING
H04N13/388
ELECTRICITY
H04L67/75
ELECTRICITY
H04L67/131
ELECTRICITY
H04N2213/008
ELECTRICITY
H04L67/12
ELECTRICITY
International classification
H04N13/388
ELECTRICITY
B63B79/10
PERFORMING OPERATIONS; TRANSPORTING
B63B79/30
PERFORMING OPERATIONS; TRANSPORTING
G05D1/00
PHYSICS
H04N13/111
ELECTRICITY
Abstract
A system for visualizing an object to a remote user includes a digitization apparatus and a visualization apparatus. The digitization apparatus includes a sensor, a first processor, and a first communication interface. The sensor is configured to sense the object within a three-dimensional space region to obtain sensor data. The first processor is configured to determine volumetric data based upon the sensor data. The first communication interface is configured to transmit the volumetric data. The visualization apparatus includes a second communication interface, a second processor, and a display. The second communication interface is configured to receive the volumetric data. The second processor is configured to determine a three-dimensional representation of the object based upon the volumetric data. The display is configured to visualize the three-dimensional representation of the object to the remote user.
Claims
1. A system for visualizing an object to a remote user, the system comprising: a digitization apparatus comprising a sensor, a first processor, and a first communication interface, wherein the sensor is configured to sense the object within a three-dimensional space region to obtain sensor data, the sensor data representing a location of the object relative to the sensor and a shape of the object relative to the sensor, wherein the first processor is configured to determine volumetric data based upon the sensor data, the volumetric data forming a three-dimensional volumetric representation of the object, and wherein the first communication interface is configured to transmit the volumetric data over a communication network; and a visualization apparatus comprising a second communication interface, a second processor, and a display, wherein the second communication interface is configured to receive the volumetric data over the communication network, wherein the second processor is configured to determine a three-dimensional representation of the object based upon the volumetric data, and wherein the display is configured to visualize the three-dimensional representation of the object to the remote user.
2. The system of claim 1, wherein the digitization apparatus comprises a plurality of sensors comprising the sensor and a further sensor, wherein the further sensor is configured to sense the object within the three-dimensional space region to obtain further sensor data, the further sensor data representing a further location of the object relative to the further sensor and a further shape of the object relative to the further sensor, and wherein the first processor is configured to determine the volumetric data further based upon the further sensor data.
3. The system of claim 2, wherein the first processor of the digitization apparatus is configured to fuse the respective sensor data of the respective sensors of the plurality of sensors.
4. The system of claim 1, wherein the sensor comprises one or more of: a depth sensor, a radar sensor, a lidar sensor, a ladar sensor, an ultrasonic sensor, or a stereographic camera.
5. The system of claim 1, wherein the volumetric data comprises volumetric point cloud data, wherein the volumetric point cloud data represents a plurality of points on a surface of the object.
6. The system of claim 1, wherein the first communication interface of the digitization apparatus and the second communication interface of the visualization apparatus are configured to establish a communication link for communicating the volumetric data, and wherein the first processor of the digitization apparatus is configured to determine a latency of the communication link to obtain a latency indicator, and to adapt a quality of the three-dimensional volumetric representation of the object based upon the latency indicator.
7. The system of claim 1, wherein the second processor of the visualization apparatus is configured to perform a three-dimensional rendering based upon the volumetric data.
8. The system of claim 1, wherein the display is a component of one or more of: virtual reality (VR) glasses, a VR headset, augmented reality (AR) glasses, a computer system, a smartphone, or a tablet.
9. The system of claim 1, wherein the digitization apparatus further comprises a microphone configured to capture an acoustic sound signal originating from the three-dimensional space region, wherein the first processor is configured to determine sound data based upon the acoustic sound signal, and wherein the first communication interface is configured to transmit the sound data over the communication network; wherein the visualization apparatus further comprises a loudspeaker, wherein the second communication interface is configured to receive the sound data over the communication network, wherein the second processor is configured to determine the acoustic sound signal based upon the sound data, and wherein the loudspeaker is configured to emit the acoustic sound signal towards the remote user.
10. The system of claim 1, wherein the visualization apparatus further comprises a microphone configured to capture a reverse acoustic sound signal, in particular a reverse acoustic sound signal originating from the remote user, wherein the second processor is configured to determine reverse sound data based upon the reverse acoustic sound signal, wherein the second communication interface is configured to transmit the reverse sound data over the communication network; wherein the digitization apparatus further comprises a loudspeaker, wherein the first communication interface is configured to receive the reverse sound data over the communication network, wherein the first processor is configured to determine the reverse acoustic sound signal based upon the reverse sound data, and wherein the loudspeaker is configured to emit the reverse acoustic sound signal.
11. The system of claim 1, wherein the second processor of the visualization apparatus is configured to determine an object type of the object based upon the three-dimensional representation of the object to obtain an object type indicator, and to retrieve object information associated with the object from a database based upon the object type indicator, and wherein the display of the visualization apparatus is configured to visualize the object information to the remote user.
12. The system of claim 1, wherein the first communication interface of the digitization apparatus and the second communication interface of the visualization apparatus are configured to communicate over the communication network according to one or more of the following communication standards: a satellite based communication standard from the group comprising: Inmarsat BGAN communication standard, Iridium Certus communication standard, or the Globalstar communication standard, or a cellular mobile communication standard from the group comprising: 5G communication standard, 4G communication standard, 3G communication standard, or WiMAX communication standard.
13. The system of claim 1, wherein the system is configured to enable remote assistance in maintenance, repair, or troubleshooting onboard a maritime vessel.
14. A method of operating a system for visualizing an object to a remote user, the system comprising a digitization apparatus and a visualization apparatus, the method comprising: sensing, by a sensor of the digitization apparatus, the object within a three-dimensional space region to obtain sensor data, the sensor data representing a location of the object relative to the sensor and a shape of the object relative to the sensor; determining, by a first processor of the digitization apparatus, volumetric data based on the sensor data, the volumetric data forming a three-dimensional volumetric representation of the object; transmitting, by a first communication interface of the digitization apparatus, the volumetric data over a communication network; receiving, by a second communication interface of the visualization apparatus, the volumetric data over the communication network; determining, by a second processor of the visualization apparatus, a three-dimensional representation of the object based upon the volumetric data; and visualizing, by a display of the visualization apparatus, the three-dimensional representation of the object to the remote user.
15. The method of claim 14, wherein the digitization apparatus comprises a plurality of sensors comprising the sensor and a further sensor, and further comprising: sensing, by the further sensor, the object within the three-dimensional space region to obtain further sensor data, the further sensor data representing a further location of the object relative to the further sensor and a further shape of the object relative to the further sensor; wherein determining, by the first processor of the digitization apparatus, the volumetric data is further based upon the further sensor data.
16. The method of claim 15, further comprising: fusing, by the first processor of the digitization apparatus, the respective sensor data of the respective sensors of the plurality of sensors.
17. The method of claim 14, wherein the volumetric data comprises volumetric point cloud data, wherein the volumetric point cloud data represents a plurality of points on a surface of the object.
18. The method of claim 14, further comprising: establishing a communication link between the first communication interface of the digitization apparatus and the second communication interface of the visualization apparatus for communicating the volumetric data; determining, by the first processor of the digitization apparatus, a latency of the communication link to obtain a latency indicator; and adapting, by the first processor of the digitization apparatus, a quality of the three-dimensional volumetric representation of the object based upon the latency indicator.
19. The method of claim 14, further comprising: performing, by the second processor of the visualization apparatus, a three-dimensional rendering based upon the volumetric data.
20. A computer program product comprising a non-transitory computer-readable medium storing program code, wherein the program code is executable by one or more processors of a system to: obtain sensor data of an object within a three-dimensional space region at a digitization apparatus, the sensor data representing a location of the object relative to a sensor of the digitization apparatus and a shape of the object relative to the sensor; determine volumetric data based on the sensor data at the digitization apparatus, the volumetric data forming a three-dimensional volumetric representation of the object; transmit, from the digitization apparatus via a first communication interface, the volumetric data over a communication network; receive, at a visualization apparatus via a second communication interface, the volumetric data over the communication network; determine, at the visualization apparatus, a three-dimensional representation of the object based upon the volumetric data; and visualize, using a display of the visualization apparatus, the three-dimensional representation of the object to a remote user.
Description
BRIEF DESCRIPTION OF THE FIGURES
[0046] Further implementations of the principles of the present disclosure are described with respect to the following figures, in which:
[0047] FIG. 1 shows a system 100 for visualizing an object to a remote user; and
[0048] FIG. 2 shows a method 200 of operating a system for visualizing an object to a remote user.
DETAILED DESCRIPTION OF THE FIGURES
[0049] FIG. 1 shows a system 100 for visualizing an object to a remote user.
[0050] The system 100 comprises a digitization apparatus 101 comprising a sensor 101a, a processor 101b, and a communication interface 101c. The sensor 101a is configured to sense the object within a three-dimensional space region to obtain sensor data, the sensor data representing a location of the object relative to the sensor and a shape of the object relative to the sensor. The processor 101b is configured to determine volumetric data based upon the sensor data, the volumetric data forming a three-dimensional volumetric representation of the object. The volumetric data may e.g. comprise volumetric point cloud data, wherein the volumetric point cloud data represents a plurality of points on the surface of the object. The communication interface 101c is configured to transmit the volumetric data over a communication network.
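As one concrete illustration of how sensor data from a depth sensor might be turned into volumetric point cloud data, the following Python sketch back-projects a depth image through a pinhole camera model. The intrinsics (fx, fy, cx, cy) and the image size are hypothetical assumptions for illustration, not values from the disclosure.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3-D point cloud
    using a pinhole camera model. Returns an (N, 3) array of points
    in the sensor's coordinate frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Example: a 4x4 depth image of a flat surface 1 m away,
# with assumed intrinsics
depth = np.ones((4, 4))
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

Each resulting point encodes the location and, collectively, the shape of the object relative to the sensor, matching the role the sensor data plays in the system above.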
[0051] As indicated in the figure by dashed lines, the digitization apparatus 101 may comprise one or more further sensors, i.e. a plurality of sensors. If the digitization apparatus 101 comprises a plurality of sensors, the processor 101b of the digitization apparatus 101 may be configured to fuse the respective sensor data of the respective sensors of the plurality of sensors. By fusion of the respective sensor data, the quality of the three-dimensional representation of the object may be improved.
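The fusion of respective sensor data could, assuming the sensors' relative poses are known from a prior calibration step (an assumption not detailed in the disclosure), be sketched as transforming each sensor's point cloud into a common frame and concatenating:

```python
import numpy as np

def fuse_point_clouds(clouds, extrinsics):
    """Fuse per-sensor point clouds into a single cloud in a common
    frame. `clouds` is a list of (N_i, 3) arrays; `extrinsics` is a
    list of 4x4 homogeneous transforms mapping each sensor frame to
    the common frame."""
    fused = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # (N, 4)
        fused.append((homo @ T.T)[:, :3])                # apply transform
    return np.vstack(fused)

# Two sensors: one at the origin, one translated 1 m along x
T0 = np.eye(4)
T1 = np.eye(4); T1[0, 3] = 1.0
a = np.zeros((2, 3))
b = np.zeros((2, 3))
merged = fuse_point_clouds([a, b], [T0, T1])
print(merged.shape)  # (4, 3)
print(merged[2])     # [1. 0. 0.]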
[0052] A further sensor may e.g. be configured to sense the object within the three-dimensional space region to obtain further sensor data, the further sensor data representing a further location of the object relative to the further sensor and a further shape of the object relative to the further sensor. The processor 101b may be configured to determine the volumetric data further based upon the further sensor data. In this case, additional three-dimensional information associated with the object may be provided by the further sensor. Such a further sensor capable of providing additional three-dimensional information associated with the object may e.g. be a depth sensor, a radar sensor, a lidar sensor, a ladar sensor, an ultrasonic sensor, or a stereographic camera.
[0053] Additionally, or alternatively, a further sensor may e.g. be configured to sense the object within the three-dimensional space region to obtain further sensor data, the further sensor data representing a texture and/or a color of the object. The processor 101b may be configured to determine the volumetric data further based upon the further sensor data. In this case, additional texture information and/or color information associated with the object may be provided by the further sensor. Such a further sensor capable of providing additional texture information and/or color information associated with the object may e.g. be a visible light camera or an infrared light camera.
[0054] For communicating the volumetric data over the communication network, any one or a combination of the following communication standards may be applied: a satellite-based communication standard, in particular the Inmarsat BGAN communication standard, the Iridium Certus communication standard, and/or the Globalstar communication standard, and/or a cellular mobile communication standard, in particular the 5G communication standard, the 4G communication standard, the 3G communication standard, and/or the WiMAX communication standard. For an improved communication from onboard a maritime vessel, the communication interface 101c of the digitization apparatus 101 may particularly be connectable to a communication relay onboard the maritime vessel. The connection between the communication interface 101c of the digitization apparatus 101 and the communication relay may e.g. be realized by an Ethernet connection.
[0055] The system 100 further comprises a visualization apparatus 103 comprising a communication interface 103a, a processor 103b, and a display 103c. The communication interface 103a is configured to receive the volumetric data over the communication network. The processor 103b is configured to determine the three-dimensional representation of the object based upon the volumetric data. The display 103c is configured to visualize the three-dimensional representation of the object to the remote user. The display 103c may e.g. be part of one of the following devices: virtual reality (VR) glasses or headset, augmented reality (AR) glasses, a computer system, a smartphone, or a tablet. The three-dimensional representation of the object may be rotated, panned, and/or zoomed by the remote user on the display 103c.
[0056] For determining the three-dimensional representation of the object based upon the volumetric data in a particularly efficient manner, the processor 103b of the visualization apparatus 103 may be configured to perform a three-dimensional rendering based upon the volumetric data, e.g. using a three-dimensional rendering application programming interface, API. Such an API may specifically be designed for visualizing a specific type of object.
[0057] Optionally, the processor 103b of the visualization apparatus 103 is configured to determine an object type of the object based upon the three-dimensional representation of the object to obtain an object type indicator, and to retrieve object information associated with the object from a database based upon the object type indicator. The display 103c of the visualization apparatus 103 may be configured to visualize the object information to the remote user. The visualization of the object information may be performed as an overlay to the three-dimensional representation of the object. The object information may e.g. represent blueprints, technical schemas, or other graphical information associated with the object.
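The optional object-information lookup might be sketched as below. The object types, file names, and flat-dictionary database layout are purely hypothetical placeholders; the disclosure does not specify how the database is organized.

```python
# Hypothetical object-information database: an object type indicator
# maps to overlay material (blueprints, technical schemas, etc.).
OBJECT_DB = {
    "pump": {"blueprint": "pump_rev3.pdf", "schema": "hydraulics_sheet_2"},
    "valve": {"blueprint": "valve_rev1.pdf", "schema": "hydraulics_sheet_5"},
}

def lookup_object_info(object_type_indicator: str) -> dict:
    """Retrieve overlay information for a recognized object type,
    or an empty dict if the type is unknown."""
    return OBJECT_DB.get(object_type_indicator, {})

info = lookup_object_info("pump")
print(info["blueprint"])  # pump_rev3.pdf
```

In a real deployment the object type indicator would come from a classifier operating on the three-dimensional representation, and the retrieved material would be rendered as an overlay on the display.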
[0058] Since the system 100 is particularly suited for remote assistance in maintenance, repair, and/or troubleshooting, the system 100 may additionally be equipped with audio connection capabilities. In particular, an audio connection from the digitization apparatus 101 to the visualization apparatus 103 and/or a reverse audio connection from the visualization apparatus 103 to the digitization apparatus 101 may be realized.
[0059] In case of the audio connection from the digitization apparatus 101 to the visualization apparatus 103, the digitization apparatus 101 may further comprise a microphone being configured to capture an acoustic sound signal, in particular an acoustic sound signal originating from the three-dimensional space region. The processor 101b may be configured to determine sound data based upon the acoustic sound signal. The communication interface 101c may be configured to transmit the sound data over the communication network. The visualization apparatus 103 may further comprise a loudspeaker. The communication interface 103a may be configured to receive the sound data over the communication network. The processor 103b may be configured to determine the acoustic sound signal based upon the sound data. The loudspeaker may be configured to emit the acoustic sound signal towards the remote user. Thereby, the remote user may obtain further information, which may e.g. be provided by another user located at the object.
[0060] In case of the reverse audio connection from the visualization apparatus 103 to the digitization apparatus 101, the visualization apparatus 103 may further comprise a microphone being configured to capture a reverse acoustic sound signal, in particular a reverse acoustic sound signal originating from the remote user. The processor 103b may be configured to determine reverse sound data based upon the reverse acoustic sound signal. The communication interface 103a may be configured to transmit the reverse sound data over the communication network. The digitization apparatus 101 may further comprise a loudspeaker. The communication interface 101c may be configured to receive the reverse sound data over the communication network. The processor 101b may be configured to determine the reverse acoustic sound signal based upon the reverse sound data. The loudspeaker may be configured to emit the reverse acoustic sound signal. Thereby, the remote user may provide spoken handling instructions for maintenance, repair, and/or troubleshooting, which may e.g. be executed by another user located at the object.
[0061] For operation of the system 100 in real-time, a small latency of the communication link between the communication interface 101c of the digitization apparatus 101 and the communication interface 103a of the visualization apparatus 103 may be desirable. For this purpose, the processor 101b of the digitization apparatus 101 may be configured to determine a latency of the communication link to obtain a latency indicator, and to adapt a quality of the three-dimensional volumetric representation of the object based upon the latency indicator. Thereby, a reduction of the volumetric data to be communicated between the digitization apparatus 101 and the visualization apparatus 103 may be achieved.
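One possible realization of the latency-based quality adaptation is uniform decimation of the point cloud once the measured latency exceeds a budget; fewer points mean less volumetric data on the link. The latency budget, minimum keep ratio, and decimation scheme below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def adapt_quality(points, latency_ms, target_ms=150.0, min_keep=0.1):
    """Downsample a point cloud in proportion to how far the measured
    link latency (the latency indicator) exceeds a target budget."""
    if latency_ms <= target_ms:
        return points  # link is fast enough; keep full quality
    keep = max(min_keep, target_ms / latency_ms)
    step = max(1, int(round(1.0 / keep)))
    return points[::step]  # uniform decimation

cloud = np.random.rand(10000, 3)
print(len(adapt_quality(cloud, latency_ms=100)))  # 10000 (under budget)
print(len(adapt_quality(cloud, latency_ms=300)))  # 5000 (every 2nd point)
```

More sophisticated variants could use voxel-grid or curvature-aware downsampling so that geometric detail is preserved where the surface varies most.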
[0062] FIG. 2 shows a method 200 of operating a system for visualizing an object to a remote user.
[0063] The system comprises a digitization apparatus and a visualization apparatus. The digitization apparatus comprises a sensor, a processor, and a communication interface. The visualization apparatus comprises a communication interface, a processor, and a display.
[0064] The method 200 comprises sensing 201, by the sensor of the digitization apparatus, the object within a three-dimensional space region to obtain sensor data, the sensor data representing a location of the object relative to the sensor and a shape of the object relative to the sensor, determining 203, by the processor of the digitization apparatus, volumetric data based upon the sensor data, the volumetric data forming a three-dimensional volumetric representation of the object, transmitting 205, by the communication interface of the digitization apparatus, the volumetric data over a communication network, receiving 207, by the communication interface of the visualization apparatus, the volumetric data over the communication network, determining 209, by the processor of the visualization apparatus, the three-dimensional representation of the object based upon the volumetric data, and visualizing 211, by the display of the visualization apparatus, the three-dimensional representation of the object to the remote user.
[0065] In summary, the concept allows for an efficient visualization of the object to the remote user, in particular for remote assistance in maintenance, repair, and/or troubleshooting onboard a maritime vessel. In addition, the concept may allow for providing medical assistance onboard a maritime vessel. Various further aspects of the concept are summarized in the following:
[0066] The concept may allow for a digitalization and visualization of an object within a three-dimensional space region in real-time. The digitization may be performed by accurate local measurements of the object using specific sensors, such as a depth sensor, a visible light camera, and/or an infrared light camera. By using specific computer vision algorithms, a three-dimensional representation of the object may be determined. In particular, respective sensor data from a plurality of sensors may be combined (“fused”), considering the different perspectives of the respective sensors. The volumetric data may then be reduced to a small amount of geometry, color, and/or other measurement data.
[0067] The volumetric data may comprise volumetric point cloud data. Such volumetric point cloud data may represent three-dimensional information, potentially with a custom multi-sample, multi-dimensional representation of the measures of the respective sensors. Specific internal data structures may be used based on multiple numeric data measures for each point. The volumetric point cloud data may specifically be suited to be used for rendering.
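A per-point record combining multiple numeric measures per point could, for example, be modeled as a NumPy structured array. The field names, types, and layout below are assumptions for illustration only; the disclosure refers only to "specific internal data structures".

```python
import numpy as np

# One possible layout for a multi-sample, multi-dimensional point
# record (hypothetical fields):
point_dtype = np.dtype([
    ("xyz", np.float32, 3),     # position in the common frame
    ("rgb", np.uint8, 3),       # color sample from a camera
    ("intensity", np.float32),  # e.g. a lidar return intensity
    ("sensor_id", np.uint8),    # which sensor produced the sample
])

cloud = np.zeros(4, dtype=point_dtype)
cloud[0] = ((0.1, 0.2, 1.5), (200, 180, 90), 0.8, 1)
print(cloud["xyz"].shape)     # (4, 3)
print(cloud[0]["sensor_id"])  # 1
```

Keeping all measures per point in one contiguous record is convenient for rendering, since position and color can be uploaded together as vertex attributes.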
[0068] For visualizing the three-dimensional representation of the object to the user, a display as part of virtual reality (VR) glasses or headset, augmented reality (AR) glasses, a computer system, such as a notebook or laptop, a smartphone, or a tablet, may be used. In this regard, VR/AR engines and/or a three-dimensional rendering application programming interface, API, may be used. Optionally, object information may additionally be overlaid on the three-dimensional representation of the object, e.g. including specific graphical elements, such as blueprints, technical schemas, or other graphical data. The display may particularly visualize the three-dimensional representation of the object along with other graphical elements and video/audio streams at the same time within a virtual space, e.g. overlapping the physical world. Thereby, a stereographic visualization of synthetic imagery generated in real-time may be provided, seamlessly blending the three-dimensional information with the physical world.
[0069] In particular when using virtual reality (VR) glasses or headset, or augmented reality (AR) glasses, two slightly different sets of images, with the perspective adapted for each eye, may be used for projecting the three-dimensional representation into a two-dimensional frame, e.g. at 60 Hz to 120 Hz. A rate of 240 Hz is also possible and reduces frame tearing in real-time rendering with fast-moving fields of view (such as when the user quickly rotates the head). Thereby, an interactive visualization according to the head point of view of the remote user may be achieved in three-dimensional space.
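The per-eye perspective adaptation can be sketched as deriving two view matrices from the tracked head pose, each eye offset by half the interpupillary distance (IPD) along the head's x-axis. The 0.064 m default IPD and the pose convention (columns of the 4x4 matrix as the head frame's axes) are assumptions for this sketch.

```python
import numpy as np

def eye_views(head_pose, ipd=0.064):
    """Derive left/right eye view matrices from a 4x4 head pose by
    offsetting each eye half the IPD along the head's x-axis. The
    view matrix is the inverse of the eye's pose."""
    views = []
    for dx in (-ipd / 2.0, ipd / 2.0):  # left eye, then right eye
        eye = head_pose.copy()
        # translate along the head frame's x-axis (first column)
        eye[:3, 3] += head_pose[:3, 0] * dx
        views.append(np.linalg.inv(eye))
    return views  # [left_view, right_view]

left, right = eye_views(np.eye(4))
print(left[0, 3], right[0, 3])  # approximately 0.032 and -0.032
```

Rendering the same point cloud once per eye with these two view matrices, at 60-240 Hz, yields the stereographic, head-tracked visualization described above.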
[0070] Furthermore, communication between a local user at the object and the remote user may be supported using a streaming of audio signals. For this purpose, microphones and loudspeakers, e.g. headphones, potentially as part of a VR/AR headset or glasses, may be used. This may further enhance the remote assistance capability. In addition, a streaming of video signals may be employed by the system.
[0071] For communication between the digitization apparatus and the visualization apparatus, one or a combination of different communication standards may be used, in particular a satellite-based communication standard and/or a cellular mobile communication standard. For satellite-based communication, a very small aperture terminal, VSAT, may be employed. In particular, communications over multiple network infrastructures may be used, wherein the amount of data may be adapted according to available communication resources.
[0072] The communications may e.g. be performed between (a) different rooms onboard a maritime vessel, (b) from the maritime vessel to land-based headquarters or other supporting land-based locations, or (c) from the maritime vessel to another maritime vessel, in ports, along the coast, or in open seas. Thereby, available connections onboard the maritime vessel may be leveraged. Furthermore, network communication application programming interfaces, APIs, may be used. The digitization apparatus may particularly be interfaced to a communication relay onboard the maritime vessel over a ship-internal network, e.g. using cables or WiFi.
[0073] The digitization apparatus may be configured to sense the object within the three-dimensional space region, e.g. within a room, using the different sensors, such as depth sensors, visible light cameras, or infrared light cameras. Specific computer vision algorithms may be applied. The three-dimensional representation of the object may be rendered in real-time using the three-dimensional rendering API. The different sensors may be connected to the processor of the digitization apparatus over wireline or wireless connections, such as USB-C, Thunderbolt 3, WiFi, or Bluetooth. For this purpose, specific network communication application programming interfaces, APIs, may be used.
[0074] In general, the concept provides the flexibility to arrange the sensors as required, or to permanently install the sensors at specific locations. Furthermore, the concept provides the flexibility to support virtual reality (VR) glasses or headset, or augmented reality (AR) glasses, available from different manufacturers. Also, the concept supports the use of a smartphone or a tablet providing three-dimensional rendering capabilities, potentially in conjunction with high-resolution cameras.
[0075] The use of virtual reality (VR) glasses or a headset, augmented reality (AR) glasses, a smartphone, or a tablet may serve as the main and/or a supporting means of communication between the user onboard the maritime vessel and the remote user providing assistance, allowing one or multiple users with the relevant expertise to collaborate from different remote locations.
[0076] The visualization apparatus may also highlight and provide support in identifying elements that need to be inspected or repaired by the remote user. In this regard, graphical step-by-step handling instructions may be displayed, e.g. on how to repair a malfunctioning component of a technical system. In particular, different types of graphical elements including overlays of machine schematics, vessel schematics, or any other kind of schematic to support the remote user may be visualized. Furthermore, external web pages, e.g. floating in front of the remote user, may be visualized to the remote user. In this regard, specific web browsers for virtual reality (VR) operating systems, OSs, may be used.
[0077] Parts of the system, in particular the digitization apparatus and/or the visualization apparatus, may each be bundled as a kit comprising the respective components for easy deployment. The kit may, however, also be customized. An exemplary kit may e.g. comprise VR/AR glasses or headset, a plurality of sensors including a depth sensor, an infrared light camera, a visible light camera along with multiple stands, suitable cables, and a suitcase. For further ease of implementation, a common shared code base with a number of custom parts tied to specific classes of devices may be used. Thus, a suite of compatible applications may be provided running on VR/AR glasses or headset, a computer system, a smartphone, and/or a tablet.
[0078] The concept allows for remote assistance in any given circumstances, but it is of particular importance in high-risk or emergency situations. Using real-time remote visualization of the object within the three-dimensional space region, professionals or experts may provide assistance without having to be physically present. Furthermore, handling instructions may be communicated to onsite or onboard staff on how to solve the issues. Moreover, real-time responses from different experts may be made available remotely. Thereby, tele-presence of the remote user may be supported, and a non-skilled user onboard the maritime vessel may be assisted.
REFERENCE SIGNS
[0079] 100 System
[0080] 101 Digitization apparatus
[0081] 101a Sensor
[0082] 101b Processor
[0083] 101c Communication interface
[0084] 103 Visualization apparatus
[0085] 103a Communication interface
[0086] 103b Processor
[0087] 103c Display
[0088] 200 Method
[0089] 201 Sensing
[0090] 203 Determining
[0091] 205 Transmitting
[0092] 207 Receiving
[0093] 209 Determining
[0094] 211 Visualizing