G05D1/2249

ENHANCEMENTS TO BEYOND-VISUAL-LINE-OF-SIGHT (BVLOS) OPERATION OF REMOTE-CONTROLLED APPARATUSES
20240377822 · 2024-11-14

Beyond-visual-line-of-sight (BVLOS) control of drones is enhanced using wireless communication via a cellular network. In an example system, a drone and a remote control station are configured to communicate via a 5G network. Drone control is further enhanced with an extended reality (XR) display. Video stream data captured by the drone is replaced, supplemented, and/or overlaid with XR visual data in an XR environment. The XR visual data corresponds to a perspective from a virtual location in the XR environment that mirrors the real-world location of the drone. This relaxes the bandwidth and latency constraints on network communication of video stream data from the drone for BVLOS control. Drone control is further enhanced with voice command capability: a user vocally utters a coarse-resolution or abstract command, and the drone control station translates the utterance into a sequence of fine-resolution drone commands that implement it.
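The voice-command translation described in the abstract can be sketched as a lookup from coarse utterances to fine-resolution command sequences. This is a minimal illustration; the command vocabulary, step sizes, and drone command names are assumptions, not taken from the patent.

```python
def translate_utterance(utterance: str) -> list[str]:
    """Expand a coarse voice command into fine-resolution drone commands."""
    templates = {
        # 24 yaw steps of 15 degrees approximate one full orbit
        "orbit": ["yaw 15"] * 24,
        "return home": ["ascend 10", "goto home", "land"],
        "land": ["hover", "descend_to 0"],
    }
    for phrase, commands in templates.items():
        if phrase in utterance.lower():
            return commands
    return ["hover"]  # safe default for an unrecognized utterance
```

A real system would replace the substring match with a speech-recognition and intent-classification front end; the point is only that one abstract command fans out into many low-level commands.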

CONTROL METHOD FOR MOVABLE PLATFORM, HEAD-MOUNTED DEVICE, SYSTEM, AND STORAGE MEDIUM
20240370031 · 2024-11-07

A method for controlling a movable platform includes displaying a first image and/or a second image on a display device of a head-mounted device, where the first image is captured by a first photographing device on the head-mounted device and the second image is captured by a second photographing device on the movable platform; and, when the display device switches from displaying the second image to displaying the first image, sending a safety operation instruction to the movable platform so that the movable platform performs a corresponding safety operation. This ensures the safety of the movable platform during interactions between the head-mounted device and the movable platform, particularly when the content displayed by the head-mounted device changes. The disclosure also provides a terminal device and a head-mounted device.

ROBOT CLEANER, METHOD OF OPERATING THE SAME, AND AUGMENTED REALITY SYSTEM
20180055312 · 2018-03-01

A robot cleaner includes a display unit, a camera unit configured to capture an image of a cleaning region when cleaning starts, a dust sensor configured to output a sensed signal corresponding to an amount of dust sucked in from the cleaning region, and a control unit configured to calculate the amount of dust based on the sensed signal, to start an augmented reality (AR) mode when cleaning ends, to generate an AR image corresponding to the amount of dust, and to control the display unit to superimpose the AR image on the captured image.
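The control unit's pipeline, signal to dust amount to AR image, can be sketched as follows. The integration gain and the overlay thresholds and names are illustrative assumptions, not values from the patent.

```python
def dust_amount(sensor_samples: list[float], gain: float = 0.5) -> float:
    """Integrate dust-sensor signal samples into a dust-amount estimate."""
    return gain * sum(sensor_samples)

def ar_overlay(amount: float) -> str:
    """Pick an AR image variant that scales with the calculated amount."""
    if amount < 10.0:
        return "light_dust_overlay"
    if amount < 50.0:
        return "medium_dust_overlay"
    return "heavy_dust_overlay"
```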

ROBOT CLEANER AND A SYSTEM INCLUDING THE SAME
20180055326 · 2018-03-01

A robot cleaner includes a camera, a display unit configured to display a cleaning region captured by the camera, a communication unit configured to communicate with a mobile terminal and receive an image of the cleaning region captured by the mobile terminal, and a control unit configured to search for a cleaning region based on a traveling path of the robot cleaner, to predict an estimated traveling path of the robot cleaner based on the cleaning region and the received image, to generate an augmented reality (AR) image of the estimated traveling path, and to display the AR image superimposed on the received image.

REMOTE SUPPORT APPARATUS
20240402717 · 2024-12-05

A remote support apparatus remotely supports travel of a vehicle and communicates with the vehicle wirelessly. The apparatus wirelessly acquires an image of the view in the travelling direction of the vehicle, taken by an imaging device of the vehicle, and displays the acquired image on a display device of the remote support apparatus. The apparatus also displays a set steering angle line in the displayed image: the line along which an outer portion of the vehicle is predicted to move when the vehicle turns at a set steering angle.
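The set steering angle line can be illustrated with a kinematic bicycle model: at a fixed steering angle the rear axle follows a circle, and the outer front corner sweeps a larger concentric arc. The model choice, the left-turn assumption, and all parameters are illustrative, not the patent's method.

```python
import math

def steering_line(wheelbase_m: float, width_m: float, steer_deg: float,
                  arc_deg: float = 90.0, steps: int = 10):
    """Predict the path swept by the vehicle's outer front corner while
    turning left at a fixed steering angle (kinematic bicycle model).

    Returns (x, y) points in the vehicle frame: x forward, y to the left.
    """
    # Rear-axle turning radius for the set steering angle.
    r = wheelbase_m / math.tan(math.radians(steer_deg))
    cx, cy = 0.0, r  # centre of the turning circle
    # Vector from the turn centre to the outer (right-side) front corner.
    vx, vy = wheelbase_m, -width_m / 2.0 - r
    pts = []
    for i in range(steps + 1):
        a = math.radians(arc_deg) * i / steps  # rotate counter-clockwise
        pts.append((cx + vx * math.cos(a) - vy * math.sin(a),
                    cy + vx * math.sin(a) + vy * math.cos(a)))
    return pts
```

Projecting these vehicle-frame points into the camera image would give the overlay the abstract describes.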

ADAPTIVE MAPPING TO NAVIGATE AUTONOMOUS VEHICLES RESPONSIVE TO PHYSICAL ENVIRONMENT CHANGES
20170248963 · 2017-08-31

Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical and electronic hardware, computer software and systems, and wired and wireless network communications to provide map data for autonomous vehicles. In particular, a method may include accessing subsets of multiple types of sensor data, aligning subsets of sensor data relative to a global coordinate system based on the multiple types of sensor data to form aligned sensor data, and generating datasets of three-dimensional map data. The method further includes detecting a change in data relative to at least two datasets of the three-dimensional map data and applying the change in data to form updated three-dimensional map data. The change in data may be representative of a state change of an environment at which the sensor data is sensed. The state change of the environment may be related to the presence or absence of an object located therein.
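The change-detection and update steps can be sketched as a diff over two voxelized map datasets. The dict-of-cells representation and the cell states are assumptions for illustration; the patent does not specify a data structure.

```python
def map_changes(old_map: dict, new_map: dict) -> dict:
    """Detect changes between two 3-D map datasets keyed by (x, y, z) cell.

    Returns {cell: (state_before, state_after)} for every changed cell;
    None stands for an absent cell (object appeared or vanished).
    """
    changes = {}
    for cell in old_map.keys() | new_map.keys():
        before, after = old_map.get(cell), new_map.get(cell)
        if before != after:
            changes[cell] = (before, after)
    return changes

def apply_changes(base: dict, changes: dict) -> dict:
    """Apply detected changes to form an updated map dataset."""
    updated = dict(base)
    for cell, (_, after) in changes.items():
        if after is None:
            updated.pop(cell, None)   # object removed from the environment
        else:
            updated[cell] = after     # object added or state changed
    return updated
```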

ROBOTIC VEHICLE ACTIVE SAFETY SYSTEMS AND METHODS

Systems and methods implemented in algorithms, software, firmware, logic, or circuitry may be configured to process data and sensory input to determine whether an object external to an autonomous vehicle (e.g., another vehicle, a pedestrian, road debris, a bicyclist, etc.) may be a potential collision threat to the autonomous vehicle. The autonomous vehicle may be configured to implement active safety measures to avoid the potential collision and/or mitigate the impact of an actual collision to passengers in the autonomous vehicle and/or to the autonomous vehicle itself. Interior safety systems, exterior safety systems, a drive system or some combination of those systems may be activated to implement active safety measures in the autonomous vehicle.
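The escalation from collision avoidance to impact mitigation can be sketched with a time-to-collision heuristic. The thresholds and measure names (seat-belt tensioners, exterior bladder, etc.) are illustrative assumptions about how interior, exterior, and drive systems might be staged.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until contact at the current closing speed (inf if opening)."""
    return gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

def select_safety_measures(ttc_s: float) -> list[str]:
    """Escalate from avoidance to impact mitigation as TTC shrinks."""
    if ttc_s < 1.0:    # collision likely unavoidable: mitigate impact
        return ["tension_seat_belts", "deploy_exterior_bladder"]
    if ttc_s < 3.0:    # potential collision threat: try to avoid it
        return ["emergency_brake", "evasive_steering", "sound_warning"]
    return []          # no active measures needed
```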

Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
09612123 · 2017-04-04

Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical and electronic hardware, computer software and systems, and wired and wireless network communications to provide map data for autonomous vehicles. In particular, a method may include accessing subsets of multiple types of sensor data, aligning subsets of sensor data relative to a global coordinate system based on the multiple types of sensor data to form aligned sensor data, and generating datasets of three-dimensional map data. The method further includes detecting a change in data relative to at least two datasets of the three-dimensional map data and applying the change in data to form updated three-dimensional map data. The change in data may be representative of a state change of an environment at which the sensor data is sensed. The state change of the environment may be related to the presence or absence of an object located therein.

Remote assistance method, remote assistance system, and non-transitory computer-readable storage medium

According to the remote assistance method of the present disclosure, first, a positional relationship, at a future time beyond the current time, between a vehicle having an autonomous traveling function and an object present around the vehicle is displayed spatially on a display device; the positional relationship is predicted based on a path plan for autonomous traveling created by the vehicle and on information about the object. Next, assistance content input by a remote operator is transmitted to the vehicle. Then, remote assistance corresponding to the assistance content is executed in the vehicle once the vehicle, after receiving the assistance content, confirms that the positional relationship between the vehicle and the object displayed on the display device at the time the assistance content was input has been realized.
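The confirmation step, execute only once the situation shown to the operator has actually materialized, can be sketched as a tolerance check. Reducing the "positional relationship" to a single gap distance and the 0.5 m tolerance are simplifying assumptions for illustration.

```python
def confirm_and_execute(predicted_gap_m: float, observed_gap_m: float,
                        tolerance_m: float = 0.5) -> bool:
    """Return True (execute the received assistance) only when the
    vehicle confirms that the predicted vehicle-object relationship
    displayed to the operator has been realized on the road."""
    return abs(observed_gap_m - predicted_gap_m) <= tolerance_m
```

This guards against the operator's decision being applied to a traffic situation that never actually occurred.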