Patent classifications
G05D1/2249
INFORMATION GENERATION METHOD, INFORMATION GENERATION DEVICE, AND RECORDING MEDIUM
An information generation method is performed by an information generation device which generates information for a learning model that infers whether a mobile object is movable in a predetermined region. The information generation method includes: obtaining at least (i) first information and (ii) second information when the mobile object moves in a first region, the first information being obtained from a sensor provided in the mobile object, the second information relating to movement of the mobile object; inferring whether the mobile object is movable in the first region according to the second information; and generating fourth information for a learning model, the fourth information associating the first information, the second information, and third information with one another, the third information indicating an inference result which is obtained in the inferring.
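The association described in this abstract — pairing sensor data (first information) and movement data (second information) with an inferred movability result (third information) into a training record (fourth information) — can be sketched roughly as follows. All names, fields, and the toy inference rule are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of the abstract's data flow; field names and the
# movability rule are invented for illustration only.
from dataclasses import dataclass


@dataclass
class TrainingRecord:
    sensor_data: dict      # first information (from the mobile object's sensor)
    movement_data: dict    # second information (movement of the mobile object)
    movable: bool          # third information (the inference result)


def infer_movable(movement_data: dict) -> bool:
    """Toy rule: the region is inferred movable if the object kept moving."""
    return movement_data.get("speed", 0.0) > 0.1 and not movement_data.get("stuck", False)


def generate_record(sensor_data: dict, movement_data: dict) -> TrainingRecord:
    """Associate first, second, and third information into fourth information."""
    return TrainingRecord(sensor_data, movement_data, infer_movable(movement_data))


record = generate_record({"lidar_min_range_m": 1.8}, {"speed": 0.6, "stuck": False})
print(record.movable)  # True
```

The point of the structure is that each record carries its own label, so the learning model can later be trained to predict movability from sensor data alone.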
SYSTEMS AND METHODS TO ACCOUNT FOR LATENCY ASSOCIATED WITH REMOTE DRIVING APPLICATIONS
Systems and methods to account for latency associated with remote driving applications may include a vehicle having an imaging device and a teleoperator station in communication with each other via a network. Imaging data that is captured by the imaging device may be transmitted to the teleoperator station for presentation to a teleoperator. In order to account for latency in the transmission, receipt, processing, and presentation of the imaging data, one or more visualizations of the vehicle, with various visual characteristics, may be rendered within or overlaid onto the imaging data, in order to facilitate safe and reliable remote operation of the vehicle by the teleoperator at the teleoperator station.
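One common way to render such a latency-compensating visualization is to dead-reckon where the vehicle is *now*, given that the displayed frame is some measured latency old, and overlay the predicted pose onto the stale image. The constant-velocity model below is a minimal sketch of that idea, not the patent's actual method; every name and number is an assumption.

```python
# Hedged sketch: predict the vehicle's current position from a frame that is
# `latency_s` seconds old, assuming constant speed and heading.
import math


def predict_pose(x, y, heading_rad, speed_mps, latency_s):
    """Constant-velocity prediction of the vehicle position after latency_s seconds."""
    return (x + speed_mps * latency_s * math.cos(heading_rad),
            y + speed_mps * latency_s * math.sin(heading_rad))


# Vehicle at the origin heading east at 10 m/s; the displayed frame is 0.3 s old.
px, py = predict_pose(0.0, 0.0, 0.0, 10.0, 0.3)
print(round(px, 2), round(py, 2))  # 3.0 0.0
```

The predicted pose would then be drawn within or over the imaging data (for example as a ghosted vehicle outline) so the teleoperator steers relative to where the vehicle actually is.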
ARTIFICIAL INTELLIGENCE BASED SYSTEM AND METHOD FOR MANAGING HETEROGENEOUS NETWORK-AGNOSTIC SWARM OF ROBOTS
An AI based system and method for managing a heterogeneous, network-agnostic swarm of robots is disclosed. The method includes receiving a set of commands from a human machine interface associated with one or more electronic devices, determining one or more robotic capabilities associated with one or more autonomous robots, and capturing one or more positional parameters by using one or more sensors. The method includes broadcasting the one or more robotic capabilities and the one or more positional parameters to each of the one or more autonomous robots and determining one or more situational parameters associated with the one or more autonomous robots. Furthermore, the method includes detecting one or more targets and allocating one or more tasks and the detected one or more targets among the one or more autonomous robots.
SYSTEMS AND METHODS TO INCREASE ENVIRONMENT AWARENESS ASSOCIATED WITH REMOTE DRIVING APPLICATIONS
Systems and methods to increase environment awareness in remote driving applications may include a vehicle having an imaging device and a teleoperator station in communication with each other via a network. For example, audio data may be received from the vehicle and processed to identify known sounds associated with unseen objects in the environment. In addition, imaging data may be received from the vehicle and processed to identify known but unheard objects in the environment. Based on the identified sounds and/or objects, visualizations of the objects may be generated and presented to a teleoperator, and sounds associated with the objects may be amplified, synthesized, and/or emitted to the teleoperator to increase environment awareness.
SYSTEMS AND METHODS TO ENSURE SAFE DRIVING BEHAVIORS ASSOCIATED WITH REMOTE DRIVING APPLICATIONS
Systems and methods to ensure safe driving behaviors in remote driving applications may include a vehicle having an imaging device and a teleoperator station in communication with each other via a network. For example, a safety tunnel having various safety tunnel parameters may be generated based on location data, map data, vehicle data, and/or sensor data. Remote operation of the vehicle may be monitored with respect to the safety tunnel parameters, and various visual, audio, and/or haptic alerts or feedback may be presented or emitted for the teleoperator to encourage or enforce vehicle operation within the safety tunnel parameters. Further, various autonomous remote operation programs or control routines may be initiated or instructed to ensure safe driving behaviors of the vehicle based on the safety tunnel parameters.
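The monitoring step described above — checking remote operation against safety tunnel parameters and escalating to feedback or autonomous control — can be sketched as a simple threshold check. The parameters, thresholds, and alert levels below are illustrative assumptions only.

```python
# Hypothetical sketch of monitoring remote operation against safety tunnel
# parameters; all thresholds are invented for illustration.
def within_tunnel(lateral_offset_m, speed_mps, max_offset_m=1.5, max_speed_mps=20.0):
    """Return an alert level for the current operation relative to the tunnel."""
    if abs(lateral_offset_m) > max_offset_m or speed_mps > max_speed_mps:
        return "violation"   # could trigger an autonomous control routine
    if abs(lateral_offset_m) > 0.8 * max_offset_m or speed_mps > 0.9 * max_speed_mps:
        return "warning"     # could trigger visual/audio/haptic feedback
    return "ok"


print(within_tunnel(0.5, 10.0))   # ok
print(within_tunnel(2.0, 10.0))   # violation
```

A graded response like this matches the abstract's escalation from feedback that *encourages* in-tunnel operation to control routines that *enforce* it.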
INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD
The present invention acquires surrounding information on the surrounding environment of a vehicle; sets a virtual viewpoint position ahead of the vehicle in its traveling direction; generates, using the acquired surrounding information, virtual surrounding information indicating the surrounding environment at the set virtual viewpoint position; transmits a first control signal for controlling an external device located at a location away from the vehicle to display the generated virtual surrounding information; receives driving operation information on a driving operation of the vehicle output from the external device; generates driving assistance information for assisting driving of the vehicle based on the received driving operation information; and generates a second control signal for controlling an output device installed in the vehicle to output the driving assistance information.
REMOTE OPERATION CONTROL METHOD, REMOTE OPERATION SYSTEM, AND MOVING BODY
A remote operation control method for controlling a remote operation of a moving body is provided. A video captured by a camera mounted on the moving body is transmitted to a remote operator terminal on a side of a remote operator remotely operating the moving body. The remote operation control method includes: setting an upper limit speed of the moving body during the remote operation to be lower as a quality of the video transmitted from the moving body to the remote operator terminal becomes lower or as an encoding and decoding time of the video becomes longer; and limiting a speed of the moving body during the remote operation to the upper limit speed or less regardless of an operation amount input by the remote operator.
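The speed-limiting rule in this abstract — a cap that falls as video quality falls or codec time grows, applied regardless of operator input — can be sketched as below. The functional form, scores, and thresholds are assumptions for illustration; the patent does not specify them.

```python
# Hedged sketch of the abstract's rule: lower the speed cap as video quality
# drops or as encode/decode time grows, then clamp the operator's command.
def speed_cap(base_limit_mps, quality, codec_delay_s,
              min_quality=0.2, max_delay_s=0.5):
    """Cap shrinks monotonically with worse quality and longer codec delay."""
    q = max(min(quality, 1.0), min_quality)          # video quality score in 0..1
    d = max(1.0 - codec_delay_s / max_delay_s, 0.0)  # delay penalty in 0..1
    return base_limit_mps * q * d


def limited_speed(commanded_mps, cap_mps):
    """Clamp the remote operator's command to the cap, whatever the input amount."""
    return min(commanded_mps, cap_mps)


cap = speed_cap(15.0, quality=0.5, codec_delay_s=0.25)  # 15 * 0.5 * 0.5 = 3.75
print(limited_speed(10.0, cap))  # 3.75
```

Any monotonically decreasing function of quality and delay would satisfy the abstract's requirement; a product of normalized scores is just one simple choice.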
SYSTEM, APPARATUS, AND METHOD FOR PROVIDING AUGMENTED REALITY ASSISTANCE TO WAYFINDING AND PRECISION LANDING CONTROLS OF AN UNMANNED AERIAL VEHICLE TO DIFFERENTLY ORIENTED INSPECTION TARGETS
A method for controlling an unmanned aerial vehicle using a control apparatus, comprises: executing a navigation process by: obtaining a live video moving image from a navigation camera device of the UAV; and generating a navigation display interface for display on a display device of the control apparatus, the navigation display interface comprising a plurality of navigation augmented reality display elements related to a determined waypoint superimposed over the live video moving image; and when the UAV reaches the determined waypoint, executing a precision landing process by: generating a precision landing display interface for display on the display device, the precision landing display interface comprising a plurality of precision landing AR display elements related to a landing target associated with the determined waypoint superimposed over the live video moving image obtained from a precision landing camera device of the UAV.
TARGET ACQUISITION SYSTEM FOR AN UNMANNED AIR VEHICLE
The application relates to a target acquisition system (100) for an unmanned aircraft (106) according to an embodiment. The system comprises goggles (110) and the unmanned aircraft. The unmanned aircraft, equipped with a camera (124) and a measuring unit (230), is configured to transmit location data (DS, KA, DE) related to a location (MS) of a target (102) to the goggles. The goggles are configured to form an augmented reality user interface (LK) by means of at least one goggle lens (212) for controlling the unmanned aircraft. The goggles, equipped with an orientation detector (213), are configured to present to a wearer (108) of the goggles the location of the target as an augmented reality target object (MB) in the user interface, based on the received target location data and an orientation (SA) of the goggles as detected by the orientation detector.
REMOTE AGRICULTURAL VEHICLE INTERFACE SYSTEM AND METHODS FOR SAME
A remote agricultural vehicle interface system includes an agricultural vehicle capability input configured to receive one or more vehicle characteristics of an agricultural vehicle and a remote vehicle interface generator that generates a remote vehicle interface for the agricultural vehicle based on the vehicle characteristics. The remote vehicle interface includes one or more remote outputs and one or more remote inputs for the agricultural vehicle. A remote access evaluator includes one or more of an electronic device input, a vehicle to device connection input, or a vehicle to device range input configured to receive one or more range characteristics. An interface refinement tool is configured to refine the remote vehicle interface based on one or more of electronic device characteristics, connection characteristics, or range characteristics. A remote vehicle interface output is configured to communicate the remote vehicle interface refined with the interface refinement tool to a candidate electronic device.