G05D1/0038

Graphical management system for interactive environment monitoring

Systems and methods for monitoring an environment using a Graphical Management System (GMS) are described. The GMS may present a vector image map of the environment for interaction by a user. The user may zoom and pan on the map to generate views with varying levels of detail. Video data from a plurality of video cameras may also be displayed on the map based on the user input, the zoom level, and the location viewed on the map. Further, the user may select a timeline event and its associated sensor data, and the map may be initialized at the time and location of the event.
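The event-driven initialization in the last sentence could be sketched as follows; the class and field names (`TimelineEvent`, `MapView`, `playback_time`) are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass

@dataclass
class TimelineEvent:
    timestamp: float  # seconds since epoch (assumed representation)
    x: float          # map coordinates of the event
    y: float

@dataclass
class MapView:
    center_x: float
    center_y: float
    zoom: float
    playback_time: float

def initialize_view_at_event(event: TimelineEvent, zoom: float = 4.0) -> MapView:
    """Center the vector map on the selected event and seek playback
    to the event's timestamp, as the abstract describes."""
    return MapView(center_x=event.x, center_y=event.y,
                   zoom=zoom, playback_time=event.timestamp)
```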

Unmanned aerial vehicle with virtual un-zoomed imaging

In some examples, an unmanned aerial vehicle (UAV) may control a position of a first camera to cause the first camera to capture a first image of a target. The UAV may receive a plurality of second images from a plurality of second cameras positioned on the UAV to provide a plurality of different fields of view in a plurality of different directions around the UAV, the first camera having a longer focal length than the second cameras. The UAV may combine at least some of the plurality of second images to generate a composite image corresponding to the first image and having a wider-angle field of view than the first image. The UAV may send the first image and the composite image to a computing device.
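The combining step could be sketched, under heavy simplification, as concatenating equally sized images from cameras ordered around the UAV; a real implementation would warp and blend overlapping fields of view rather than naively joining them:

```python
def combine_side_by_side(images):
    """Combine equally sized images (lists of pixel rows) from cameras
    arranged around the UAV into one wide-angle composite by
    concatenating each row. This is a crude stand-in for the patent's
    composite-image generation, which would handle overlap and warping."""
    height = len(images[0])
    composite = []
    for r in range(height):
        row = []
        for img in images:
            row.extend(img[r])  # append this camera's pixels for row r
        composite.append(row)
    return composite
```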

Optimizing management of autonomous vehicles
11609564 · 2023-03-21

A system, device and method of managing autonomous vehicles are provided. The system may include a server, a feedback device, and a control device for controlling a vehicle. The server may include a communication interface configured to communicate with a control device of each of a plurality of vehicles and a feedback device; a memory storing instructions; and at least one processor. The at least one processor is configured to receive control data from the control device of each of the plurality of vehicles; determine an operation status of each of the plurality of vehicles based on the control data; generate a feedback interface based on the operation status of each of the plurality of vehicles; and transmit the feedback interface to the feedback device.
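The status determination and feedback-interface generation could be sketched as below; the control-data field names (`heartbeat_ok`, `fault_codes`, `speed_mps`) and thresholds are illustrative assumptions:

```python
def determine_status(control_data: dict) -> str:
    """Classify a vehicle's operation status from its control data."""
    if not control_data.get("heartbeat_ok", False):
        return "offline"
    if control_data.get("fault_codes"):
        return "fault"
    return "moving" if control_data.get("speed_mps", 0.0) > 0.1 else "idle"

def build_feedback_interface(fleet: dict) -> dict:
    """Aggregate per-vehicle statuses into a payload for the feedback
    device, flagging vehicles that need attention."""
    statuses = {vid: determine_status(cd) for vid, cd in fleet.items()}
    return {"vehicles": statuses,
            "alerts": [vid for vid, s in statuses.items()
                       if s in ("offline", "fault")]}
```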

Vehicle control systems and methods and optical tethering techniques

The present disclosure generally relates to vehicle remote parking assistance, and, more specifically, systems and methods for determining a distance between a vehicle and a mobile device during a remote parking procedure. In particular, the systems and methods include selecting a camera lens or focal length for use with a first method of determining a distance between a vehicle and a mobile device based on an image of the vehicle and a digital model of the vehicle.
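One standard way to estimate distance from an image of an object with known real-world size (here supplied by the digital model of the vehicle) is the pinhole camera relation distance = focal_length × real_width / apparent_pixel_width. A minimal sketch, assuming the focal length is expressed in pixels:

```python
def distance_from_image(focal_length_px: float,
                        real_width_m: float,
                        pixel_width: float) -> float:
    """Pinhole-model distance estimate: the vehicle's real width comes
    from its digital model, its apparent width from the captured image."""
    return focal_length_px * real_width_m / pixel_width
```

For example, a 1.8 m-wide vehicle appearing 180 px wide through a 1000 px focal length lens is about 10 m away.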

ROBOT

Measures for use in controlling a robot. At an electronic user device, data representative of an environment of the robot is received from the robot. The received data indicates a location of at least one moveable object in the environment. In response to receipt of the representative data, a representation of the environment of the robot is displayed on a graphical display of the electronic user device. Input is received from a user of the electronic user device indicating a desired location for the at least one moveable object in the environment of the robot. In response to receipt of the user input, control data is transmitted to the robot. The control data is operable to cause the robot to move the at least one object to the desired location in the environment of the robot.
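The control-data round trip described above could be sketched as follows; the message schema is an illustrative assumption, and the robot's execution is simulated by updating its environment map:

```python
def make_control_data(object_id, desired_location):
    """Build the control data the electronic user device transmits to
    the robot after the user indicates a desired object location."""
    return {"command": "move_object",
            "object": object_id,
            "target": desired_location}

def simulate_robot(environment, control_data):
    """Stand-in for the robot executing the command: update the moveable
    object's location in the environment representation."""
    updated = dict(environment)
    updated[control_data["object"]] = control_data["target"]
    return updated
```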

REMOTE ASSISTANCE SYSTEM AND PROGRAM
20230081876 · 2023-03-16

In a remote assistance apparatus, a task assigning unit prohibits assignment of a task to an operator whose operational capability is judged to be insufficient by the newest evaluation results stored in a storage unit, and assigns a task of remotely controlling a vehicle to an operator selected from among a plurality of operators whose operational capabilities are judged to be sufficient by the newest evaluation results. An evaluation acquiring unit acquires the evaluation result of the operational capability of the selected operator in response to the selected operator performing the assigned task using a remote control unit that causes the vehicle to travel under the operator's remote control. A storage control unit performs control to store the newest evaluation result acquired by the evaluation acquiring unit in the storage unit.
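The assignment logic could be sketched as a filter over stored evaluation scores; the numeric scores, the threshold, and the choice of picking the highest-rated eligible operator are illustrative assumptions:

```python
def assign_task(operator_scores, threshold):
    """Prohibit assignment to operators whose newest evaluation score is
    below the capability threshold, then assign the task to an eligible
    operator (here, the highest-rated one). Returns None when no
    operator is eligible."""
    eligible = {op: score for op, score in operator_scores.items()
                if score >= threshold}
    if not eligible:
        return None
    return max(eligible, key=eligible.get)
```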

Vehicle control system
11480957 · 2022-10-25

A vehicle control system includes: a control device mounted on a vehicle and configured to execute remote automatic moving processing; an operation terminal configured to be carried by a user and to transmit a control signal to the control device based on an input by the user; and a position determination unit mounted on the vehicle and/or the operation terminal and configured to measure a distance between the operation terminal and the vehicle. When the control device determines that the distance exceeds a first threshold based on a signal from the position determination unit, the control device stops the vehicle and transmits an output signal to the operation terminal, the output signal causing the operation terminal to output an indication that the distance exceeds the first threshold.
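The threshold check could be sketched as below; the 6 m default threshold and the return structure are illustrative assumptions, not values from the patent:

```python
def check_tether(distance_m, first_threshold_m=6.0):
    """Decide the control action when the measured operator-to-vehicle
    distance exceeds the first threshold: stop the vehicle and signal
    the operation terminal to notify the user."""
    if distance_m > first_threshold_m:
        return {"stop_vehicle": True,
                "terminal_output": "distance exceeds first threshold"}
    return {"stop_vehicle": False, "terminal_output": None}
```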

SYSTEMS AND METHODS FOR AUTONOMOUS SELECTION AND OPERATION OF COMBINATIONS OF STEALTH AND PERFORMANCE CAPABILITIES OF A MULTI-MODE UNMANNED VEHICLE
20230081946 · 2023-03-16

An unmanned vehicle including a vehicle body, a propulsion system, a maneuvering system, a vehicle control system, a rack, a sensor system, and a power supply. The vehicle control system may be used to control the unmanned vehicle in combination with the propulsion system and the maneuvering system. The rack may include a retractable mount that may move between a down position and an up position. The sensor system may include a plurality of transient object detection sensors. The plurality of transient object detection sensors may include a sensor adapted to detect an item of interest and may provide an item of interest signal to the vehicle control system. The vehicle control system may identify an item of interest classification and may provide a classification signal. The classification signal may be determined by the item of interest classification and may be utilized to avoid detection of the unmanned vehicle by the item of interest.
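The mapping from item-of-interest classification to an evasion configuration could be sketched as a simple lookup; the class names and mode settings here are purely illustrative assumptions:

```python
def stealth_mode_for(item_class):
    """Map an item-of-interest classification to stealth/performance
    settings (e.g., retracting the rack's mount). Unknown classes fall
    back to the performance configuration."""
    modes = {
        "radar": {"retract_mount": True, "profile": "low_observable"},
        "human_observer": {"retract_mount": True, "profile": "quiet"},
        "none": {"retract_mount": False, "profile": "performance"},
    }
    return modes.get(item_class, modes["none"])
```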

Image quality enhancement for autonomous vehicle remote operations

Techniques for image quality enhancement for autonomous vehicle remote operations are disclosed herein. An image processing system of an autonomous vehicle can obtain images captured by at least two different cameras and stitch the images together to create a combined image. The image processing system can apply region blurring to a portion of the combined image to create an enhanced combined image, e.g., to blur regions/objects determined to be less important (or unimportant) for the remote operations. The image processing system can encode pixel areas of the enhanced combined image using a corresponding quality setting for respective pixel areas to create encoded image files, e.g., based on complexity levels of the respective pixel areas. The image processing system can transmit the encoded image files to a remote operations system associated with the autonomous vehicle for remote operations support.
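The region blurring and complexity-to-quality mapping could be sketched as below; the mean-fill "blur" and the complexity breakpoints are simplifying assumptions, not the patent's method:

```python
def blur_region(image, top, left, bottom, right):
    """Crude region blur: replace every pixel in the rectangle with the
    region's mean intensity, de-emphasizing unimportant areas."""
    pixels = [image[r][c] for r in range(top, bottom)
              for c in range(left, right)]
    mean = sum(pixels) // len(pixels)
    for r in range(top, bottom):
        for c in range(left, right):
            image[r][c] = mean
    return image

def quality_for_area(complexity):
    """Map a pixel area's complexity estimate (0.0-1.0) to an encoder
    quality setting (0-100); the breakpoints are illustrative."""
    if complexity > 0.7:
        return 90
    if complexity > 0.3:
        return 60
    return 30
```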

Information processing apparatus, information processing method, and information processing program
11604478 · 2023-03-14

To control a flying body in accordance with the flight altitude at the past point of time when each partial area image constituting a wide area image was captured, an information processing apparatus includes: a wide area image generator that extracts a plurality of video frame images from video obtained while the flying body captures the ground area spreading below it as it moves, and combines the video frame images to generate a captured image of a wide area; an image capturing altitude acquirer that acquires, for each of the plurality of video frame images, the flight altitude at the point of time of image capturing by the flying body; and an image capturing altitude output unit that outputs a difference of the flight altitude for each video frame image.
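The per-frame altitude-difference output could be sketched as below; using the first frame's altitude as the reference is an assumption, since the abstract only says "a difference of the flight altitude":

```python
def altitude_differences(frame_altitudes, reference=None):
    """For each extracted video frame, output the difference between its
    capture-time flight altitude and a reference altitude (the first
    frame's altitude by default)."""
    if reference is None:
        reference = frame_altitudes[0]
    return [alt - reference for alt in frame_altitudes]
```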