G05D1/0038

DISPLAY DEVICE

A display device 20 includes: a hinge 91; a first monitor device 21 including a first front surface 22; and a second monitor device 31 including a second front surface 32. Planes parallel to a virtual second front surface, offset toward a rear side by predetermined distances at a first angle a to a third angle c between the first front surface 22 and the virtual second front surface, are defined as first to third virtual planes E to G. A rotation axis X1 of the hinge 91 is disposed on any one of these three virtual planes, or inside a space formed by the three virtual planes, as viewed in a direction parallel to the rotation axis X1, and the second front surface 32 is disposed at a position away from the rotation axis X1 on a front side of the second monitor device 31 by a predetermined distance.

AUTONOMOUS VEHICLE, CONTROL SYSTEM FOR REMOTELY CONTROLLING THE SAME, AND METHOD THEREOF

An autonomous vehicle may include a processor configured to transmit vehicle data for remote control of the autonomous vehicle to a control system when the remote control of the autonomous vehicle is required, and when receiving a remote control command for the remote control from the control system, to generate and follow a path based on the received remote control command.
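The abstract above describes generating and following a path from a received remote control command. A minimal sketch of the path-generation step, assuming a hypothetical command payload of waypoints and simple linear interpolation (the abstract does not specify either):

```python
from dataclasses import dataclass


@dataclass
class RemoteControlCommand:
    # Hypothetical payload: waypoints sent by the control system.
    waypoints: list  # [(x, y), ...]


def generate_path(current_pos, command, step=1.0):
    """Illustrative only: linearly interpolate a path from the vehicle's
    current position through the commanded waypoints at a fixed spacing."""
    path = [current_pos]
    for wp in command.waypoints:
        x0, y0 = path[-1]
        x1, y1 = wp
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        n = max(1, int(dist // step))
        for i in range(1, n + 1):
            path.append((x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n))
    return path


cmd = RemoteControlCommand(waypoints=[(4.0, 0.0)])
path = generate_path((0.0, 0.0), cmd)
```

The vehicle would then follow `path` with its normal trajectory-tracking controller.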

REMOTE PARKING CONTROL FOR VEHICLES COUPLED IN A TOWED RECHARGING ARRANGEMENT

Electrified vehicles are coupled together by a towing device for in-flight energy transfer. A remote-controlled parking system collects images from vehicle-mounted cameras to produce a 360° overhead live streaming image that is displayed on a smartphone linked to a vehicle controller. A user interface (UI) on the smartphone accepts a first touch from the user on the streaming image showing the vehicles at a starting position in order to specify a maneuver endpoint. The controller calculates a sequence of steering actions to create a path to the endpoint. The UI displays the calculated path as an overlay on the live image. The UI generates an activation signal in response to a second touch input on the touchscreen, and forwards the activation signal to the vehicle controller during the second touch input to move the vehicles according to the corresponding actuator commands only while the user maintains the second touch input.
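The hold-to-move interlock described above (actuator commands forwarded only while the second touch is maintained) can be sketched as follows; the class and method names are illustrative, not from the patent:

```python
class HoldToMoveController:
    """Sketch of a dead-man interlock: actuator commands reach the vehicle
    controller only while the activation touch is being held."""

    def __init__(self):
        self.touch_held = False
        self.commands_sent = []

    def on_touch_down(self):
        self.touch_held = True

    def on_touch_up(self):
        self.touch_held = False

    def tick(self, actuator_command):
        # Forward the command only during the second touch input.
        if self.touch_held:
            self.commands_sent.append(actuator_command)
            return actuator_command
        return None  # vehicles hold position when the touch is released


ctrl = HoldToMoveController()
ctrl.tick("steer_left")    # dropped: no touch yet
ctrl.on_touch_down()
ctrl.tick("steer_left")    # forwarded
ctrl.on_touch_up()
ctrl.tick("steer_right")   # dropped again
```

Releasing the touch immediately stops forwarding, which is the safety property the abstract emphasizes.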

AUTONOMOUS VEHICLE, CONTROL SYSTEM FOR REMOTELY CONTROLLING THE SAME, AND METHOD THEREOF
20220413494 · 2022-12-29

An autonomous vehicle may include an autonomous driving control apparatus including a processor configured to request remote control of the autonomous vehicle from a control system when the remote control is required, to determine, when receiving a remote control path and a remote control command from the control system, whether the remote control path and the remote control command match, and to perform the remote control command depending on a result of the determination.
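The match check between the received path and the received command could take many forms; one hypothetical version (the patent does not define the matching criterion) accepts a command only if its target lies near the separately received path:

```python
def commands_match_path(path, command, tol=0.5):
    """Illustrative consistency check: the command's target must lie
    within `tol` of some point on the remote control path."""
    tx, ty = command["target"]
    return any(abs(px - tx) <= tol and abs(py - ty) <= tol
               for px, py in path)


path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
ok = commands_match_path(path, {"target": (2.0, 0.1)})
bad = commands_match_path(path, {"target": (5.0, 5.0)})
```

A mismatch would cause the vehicle to reject the command rather than execute it.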

OBSTACLE CLIMBING SURVEILLANCE ROBOT AND ENERGY-ABSORBING FRAME THEREFOR
20220413493 · 2022-12-29

A surveillance robot is adapted with a light-weight body formed with light-weight foam, wheel motors arranged within the light-weight foam and connected to wheels extending out from the body and drivable by the wheel motors, a sensor system at least partially arranged within the light-weight foam for picking up any of image, audio and environmental data, an electronic controller arranged within the light-weight foam, connected to the sensor system and wheel motors, and including a memory and a set of computer instructions that provide for surveillance robot operation, and a transceiver section connected to the electronic controller and including an antenna for transmitting and receiving commands, the image data, the audio data and/or the environmental data to or from the electronic controller. The light-weight foam substantially surrounds, supports and protects the wheel motors, sensor system, electronic controller and transceiver from mechanical shock as the robot traverses obstacles.

Selecting Antenna Patterns On Unmanned Aerial Vehicles
20220416860 · 2022-12-29

Described herein are unmanned aerial vehicles (UAVs) and systems and methods for dynamically selecting directional antennas onboard the UAV for wireless transmissions. For example, an embodiment pertains to a UAV that comprises a flight control system in remote communication with a remote receiver via directional antennas onboard the UAV. The flight control system is operatively coupled with a propulsion system to control the flight of the UAV. While in-flight, the flight control system is configured to determine an orientation and position of the UAV. It is further configured to select a subset of directional antennas to transmit from based on the determined orientation and position, among other factors. The flight control system then directs a transmitter to send wireless communications using the selected directional antennas.
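The selection of a directional-antenna subset from orientation and position can be sketched as below; the antenna layout, beamwidth, and selection rule are assumptions for illustration, not details from the patent:

```python
# Hypothetical layout: boresight azimuth (degrees) of each directional
# antenna in the UAV body frame.
ANTENNAS = {"fwd": 0.0, "right": 90.0, "aft": 180.0, "left": 270.0}


def select_antennas(uav_heading_deg, bearing_to_receiver_deg, beamwidth=120.0):
    """Pick antennas whose boresight, rotated by the UAV's heading, lies
    within half a beamwidth of the bearing to the remote receiver."""
    chosen = []
    for name, boresight in ANTENNAS.items():
        world_angle = (boresight + uav_heading_deg) % 360.0
        # Smallest signed angular difference, folded into [-180, 180].
        diff = abs((world_angle - bearing_to_receiver_deg + 180.0) % 360.0 - 180.0)
        if diff <= beamwidth / 2.0:
            chosen.append(name)
    return chosen


subset = select_antennas(uav_heading_deg=0.0, bearing_to_receiver_deg=45.0)
```

The flight control system would then direct the transmitter to use only the antennas in `subset`.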

Mobile body, location estimation device, and computer program

A device includes an external sensor that scans an environment so as to periodically output scan data, a storage to store an environmental map, and a location estimation device that matches the scan data against the environmental map read from the storage so as to estimate a location and an attitude of the vehicle. The location estimation device determines predicted values of a current location and a current attitude of the vehicle in accordance with a history of estimated locations and estimated attitudes of the vehicle, and performs the matching by using the predicted values.
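The prediction from the estimation history could be as simple as extrapolating the most recent motion; the constant-velocity model below is one assumption the abstract leaves open:

```python
def predict_pose(history):
    """Predict the current pose (x, y, heading) by extrapolating the motion
    between the last two estimated poses -- a constant-velocity sketch of
    the history-based prediction described in the abstract."""
    (x1, y1, th1), (x2, y2, th2) = history[-2], history[-1]
    return (2 * x2 - x1, 2 * y2 - y1, th2 + (th2 - th1))


history = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1)]
pred = predict_pose(history)
```

The predicted pose seeds the scan-to-map matching, which narrows the search space compared with matching from scratch.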

Display control device and display control method

A display control device includes an image receiving unit that receives an image captured by an imaging device included in a work machine, and a display control unit that processes the image based on a movement amount of the work machine during the reception delay time of the image, and generates a display signal.
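A minimal sketch of the delay compensation, assuming the movement amount is derived from a velocity estimate and that compensation is a simple shift of the displayed frame (real systems would warp pixels; this is only the arithmetic):

```python
def compensate_delay(image_origin, velocity, delay_s):
    """Shift the displayed image origin by the distance the work machine
    is estimated to have moved during the reception delay."""
    vx, vy = velocity
    ox, oy = image_origin
    return (ox + vx * delay_s, oy + vy * delay_s)


# Machine moving at 2 m/s with a 250 ms reception delay.
shifted = compensate_delay((0.0, 0.0), velocity=(2.0, 0.0), delay_s=0.25)
```

The shifted frame is then encoded into the display signal so the operator sees an image closer to the machine's actual current position.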

Built-in safety of control station and user interface for teleoperation
11537124 · 2022-12-27

A method and system may receive tiled video feed data sourced from one or more remotely situated vehicles. A teleoperator user interface is generated to include a concurrent display of a plurality of distinct video tiles from the tiled video feed data. A respective video tile displayed in the teleoperator user interface may include visual safety cues. A user interface segment that is displaying a respective video tile may be modified in response to teleoperator input.

SYSTEMS, COMPUTER PROGRAM PRODUCTS, AND METHODS FOR BUILDING SIMULATED WORLDS
20220404835 · 2022-12-22

Systems, computer program products, and methods for constructing models and simulations of real-world environments are described. A robot employs various sensors to collect data from its environment and provides this data to a tele-operation system. Any number of tele-artists may access the tele-operation system and use the robot sensor data to collaboratively construct a simulated scene representative of the robot's environment. The tele-artists may continue to update the simulation in real-time as the robot explores its environment and provides more sensor data. The robot may use the simulation in support of fundamental operations through its cognitive architecture, such as action planning and hypothesis generation.

An artificial intelligence controller of the robot may monitor the adaptations made to the simulation by the tele-artists in response to the sensor data in order to learn (e.g., via reinforcement learning) how to autonomously generate and update its own simulation based on its own sensor data.