G05D1/0038

METHODS AND SYSTEMS FOR A FLY BY VIEW IN AN ELECTRIC AIRCRAFT
20230012962 · 2023-01-19

A system for fly by view in an electric aircraft. The electric aircraft includes at least a flight component mechanically coupled to the electric aircraft, a battery assembly, and at least one video transmitter coupled to the electric aircraft, where the at least one video transmitter is configured to transmit a pilot stream to a first-person-view headset. The system also includes a first-person-view headset, where the first-person-view headset is configured to receive the pilot stream from the at least one video transmitter and display the pilot stream to a user, where displaying the pilot stream includes displaying a flight time remaining metric as a function of the remaining charge of the battery, and displaying a flight component output metric as a function of the performance of the flight component.
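The two headset metrics in the abstract above can be sketched in a few lines: remaining flight time as remaining charge over average power draw, and flight-component output as a fraction of rated performance. All names, units, and the linear model are illustrative assumptions, not the patent's implementation.

```python
def flight_time_remaining(remaining_charge_wh: float, avg_power_w: float) -> float:
    """Estimate remaining flight time in minutes from battery charge (Wh)
    and average power draw (W). Assumes roughly constant power draw."""
    if avg_power_w <= 0:
        raise ValueError("average power draw must be positive")
    return remaining_charge_wh / avg_power_w * 60.0

def flight_component_output(actual_thrust_n: float, rated_thrust_n: float) -> float:
    """Express flight-component performance as a fraction of rated output."""
    return actual_thrust_n / rated_thrust_n
```

A headset overlay would refresh these values from telemetry and render them alongside the pilot stream.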

CONTROLLING AND MONITORING REMOTE ROBOTIC VEHICLES

A method performed by a controller device is provided. The method includes (a) connecting to a plurality of remote robotic vehicles over a wireless connection; (b) selectively controlling one of the plurality of remote robotic vehicles over the wireless connection; and (c) displaying, on a video screen of the controller device, a video stream received from the one remote robotic vehicle over the wireless connection. A corresponding method is provided to be performed by a robotic vehicle. Apparatuses, computer program products, and a system for performing the method(s) are also provided.
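Steps (a)–(c) above amount to a controller that tracks several connected vehicles but forwards commands only to the one currently selected. The sketch below is a minimal, hypothetical illustration of that selective-control pattern; the class and command strings are assumptions.

```python
class Controller:
    """Tracks connected robotic vehicles; commands go only to the selected one."""

    def __init__(self):
        self.vehicles = {}   # vehicle_id -> log of commands received
        self.selected = None

    def connect(self, vehicle_id: str) -> None:
        # Step (a): register a vehicle reachable over the wireless link.
        self.vehicles.setdefault(vehicle_id, [])

    def select(self, vehicle_id: str) -> None:
        # Step (b): choose which of the connected vehicles to control.
        if vehicle_id not in self.vehicles:
            raise KeyError(vehicle_id)
        self.selected = vehicle_id

    def send(self, command: str) -> None:
        # Only the selectively controlled vehicle receives the command.
        self.vehicles[self.selected].append(command)
```

Step (c), displaying the selected vehicle's video stream, would hang off the same `selected` handle.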

MOVING BODY CONTROL SYSTEM, MOVING BODY CONTROL METHOD, AND MOVING BODY REMOTE SUPPORT SYSTEM
20230013007 · 2023-01-19

A moving body control system controls a moving body that is a target of remote support by a remote operator. The moving body control system acquires an image captured by a camera installed on the moving body and spatially splits the image into a plurality of split images. The moving body control system sets an importance for each of the plurality of split images such that the importance of a split image with a higher need for gaze by the remote operator is higher than the importance of a split image with a lower need for gaze by the remote operator. The moving body control system encodes and transmits each split image to a remote support device on the remote operator side such that the image quality of a split image of higher importance is higher than the image quality of a split image of lower importance.
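The core of the scheme above is a monotone mapping from per-tile importance to encoder quality, so high-gaze tiles spend more bitrate. A minimal sketch, assuming importance scores in [0, 1] and a JPEG-style quality scale (both assumptions, not the disclosure's encoding):

```python
def assign_quality(importance: dict, q_min: int = 20, q_max: int = 95) -> dict:
    """Map each split image's importance score in [0, 1] to an encoder
    quality setting, linearly between q_min and q_max."""
    return {tile: round(q_min + (q_max - q_min) * score)
            for tile, score in importance.items()}
```

Each split image would then be encoded at its assigned quality and transmitted independently to the remote support device.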

IMAGE CAPTURE SYSTEM, CONTROL DEVICE, AND METHOD THEREFOR
20230221729 · 2023-07-13

A control device includes a traveling control unit and an image capture control unit. The traveling control unit controls traveling of a wheeled platform traveling along an image capture plane where a plurality of image capture target objects are arranged, spaced apart from each other in vertical and horizontal directions. The image capture control unit controls an image capture operation of a plurality of cameras installed in a direction perpendicular to a traveling direction of the wheeled platform, according to an image capture timing list set on a per camera basis, of the plurality of cameras.
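The "image capture timing list set on a per camera basis" can be pictured as a schedule keyed by camera: as the wheeled platform advances, the control unit fires whichever shots fall due in the elapsed interval. The list format below (camera name to shot times in seconds) is an illustrative assumption.

```python
def shots_due(timing_list: dict, t_prev: float, t_now: float) -> list:
    """Return (camera, time) pairs whose scheduled time falls in the
    interval (t_prev, t_now], ordered by time. timing_list maps a camera
    name to its list of capture times."""
    due = []
    for camera, times in timing_list.items():
        for t in times:
            if t_prev < t <= t_now:
                due.append((camera, t))
    return sorted(due, key=lambda pair: pair[1])
```

The traveling control unit would call this each control cycle and trigger the returned cameras.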

User interface for displaying point clouds generated by a LiDAR device on a UAV

Techniques are disclosed for real-time mapping in a movable object environment. A system for real-time mapping in a movable object environment, may include at least one movable object including a computing device, a scanning sensor electronically coupled to the computing device, and a positioning sensor electronically coupled to the computing device. The system may further include a client device in communication with the at least one movable object, the client device including a visualization application which is configured to receive point cloud data from the scanning sensor and position data from the positioning sensor, record the point cloud data and the position data to a storage location, generate a real-time visualization of the point cloud data and the position data as it is received, and display the real-time visualization using a user interface provided by the visualization application.
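The visualization application's loop above does three things with each batch: record it, register the sensor-frame points against the position fix, and extend the live view. A hedged sketch, assuming pure translation registration and tuple-based points (a real pipeline would apply the full pose, including rotation):

```python
class MappingSession:
    """Records incoming LiDAR batches and accumulates a world-frame cloud."""

    def __init__(self):
        self.storage = []       # recorded (position, points) pairs
        self.world_points = []  # accumulated points for the live view

    def on_data(self, position, points):
        # Record raw point cloud data and position data to storage.
        self.storage.append((position, points))
        px, py, pz = position
        # Registration sketch: translate sensor-frame points by the
        # vehicle position (rotation omitted for brevity).
        for x, y, z in points:
            self.world_points.append((x + px, y + py, z + pz))
```

A UI layer would redraw `world_points` as each batch arrives to give the real-time visualization.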

Methods and systems for keeping remote assistance operators alert
11698643 · 2023-07-11

Examples described herein may enable provision of remote assistance for an autonomous vehicle. An example method includes a computing system operating by default in a first mode and periodically transitioning from operation in the first mode to operation in a second mode. In the first mode, the system may receive environment data provided by the vehicle and representing one or more objects having a detection confidence below a threshold, where the detection confidence is indicative of a likelihood of correct identification of the object(s), and, responsive to the object(s) having a confidence below the threshold, provide remote assistance data comprising an instruction to control the vehicle and/or a correct identification of the object(s). In the second mode, the system may trigger user interface display of remote-assistor alertness data based on pre-stored data related to an environment in which the pre-stored data was acquired, and receive a response relating to the alertness data.
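The two operating modes above can be reduced to a small dispatch: in the default mode, low-confidence detections trigger a remote-assistance request; in the periodic second mode, the system serves an alertness check built from pre-stored data. The mode names, threshold, and return strings below are illustrative assumptions.

```python
def handle_tick(mode: str, detection_confidence: float,
                threshold: float = 0.7) -> str:
    """Decide the system's action for one cycle in the given mode."""
    if mode == "assist":
        # First mode: escalate only when identification is uncertain.
        if detection_confidence < threshold:
            return "request_remote_assistance"
        return "autonomous"
    # Second mode: display pre-stored scene data and await the
    # remote assistor's response to confirm alertness.
    return "display_alertness_check"
```

A scheduler would periodically flip `mode` to the alertness check and back, per the abstract.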

Remote assistance system for autonomous vehicle

A system for providing remote assistance to an autonomous vehicle is disclosed herein. The system includes at least one remote assistance button configured to be selectively activated to initiate remote assistance for the autonomous vehicle. Each remote assistance button corresponds to a dedicated remote assistance function for the autonomous vehicle. For example, the system can include remote assistance buttons for causing the autonomous vehicle to stop, decelerate, or pull over. The system includes a controller configured to detect activation of the remote assistance button(s) and to cause remote assistance to be provided to the autonomous vehicle in response to the activation. For example, the autonomous vehicle may perform an action corresponding to the activated button with input assistance from the controller but without the system taking over control of the autonomous vehicle.
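Since each button above corresponds to exactly one dedicated assistance function, the controller's dispatch is essentially a fixed table lookup. A minimal sketch; the button names and action strings are assumptions for illustration.

```python
# One dedicated remote-assistance function per button, as in the abstract.
BUTTON_ACTIONS = {
    "stop": "bring vehicle to a stop",
    "decelerate": "reduce vehicle speed",
    "pull_over": "pull over at the next safe spot",
}

def on_button(button: str) -> str:
    """Return the dedicated assistance action for an activated button.
    The vehicle performs the action itself; the controller only assists."""
    if button not in BUTTON_ACTIONS:
        raise KeyError(f"no dedicated function for button {button!r}")
    return BUTTON_ACTIONS[button]
```

The returned action would be sent to the vehicle as input assistance, without the system taking over full control.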

FOVEATED STITCHING
20230216981 · 2023-07-06

The present disclosure relates to a computer-implemented method for stitching images representing the surroundings of an automated vehicle into a stitched view and an image stitching system for an automated vehicle for use in said method. The method comprises the steps of: providing, by means of respective image capturing units, two images representing surroundings of the automated vehicle, wherein the two images share an overlapping region of the surroundings from different viewpoints of the respective image capturing units; determining an image transformation between the two images based on pre-calculated calibration information, or feature matching of discernable features of the surroundings visible in said two images; stitching the two images into a stitched view with a respective image seam between the two images based on said image transformation; displaying the stitched view to an operator of the automated vehicle, and receiving, by means of an operator input device, operator input data indicating the operator's viewpoint in the stitched view; determining a region of interest within the stitched view based on said operator input data; wherein the step of stitching the two images into a stitched view involves determining a set of stitching solutions between the two images and selecting a stitching solution that results in a stitched view with a stitching seam that is displaced a distance away from a point in the region of interest, in a direction towards the outside of the region of interest.
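The seam-selection rule at the end of the abstract can be illustrated in one dimension: among candidate seam positions, prefer one that lies outside the operator's region of interest. Treating the seam as a single coordinate and the tie-breaking criteria below are simplifying assumptions, not the disclosure's actual solver.

```python
def select_seam(candidates, roi_start: float, roi_end: float) -> float:
    """Pick a candidate seam position displaced outside the region of
    interest [roi_start, roi_end]; prefer the outside candidate nearest
    the ROI centre, falling back to the candidate farthest from the
    centre if every candidate lies inside the ROI."""
    roi_centre = (roi_start + roi_end) / 2.0
    outside = [c for c in candidates if c < roi_start or c > roi_end]
    if outside:
        return min(outside, key=lambda c: abs(c - roi_centre))
    return max(candidates, key=lambda c: abs(c - roi_centre))
```

In the full method, each candidate would correspond to one stitching solution from the determined set, and the chosen solution keeps the visible seam out of the operator's foveated region.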

TELEPRESENCE ROBOTS HAVING COGNITIVE NAVIGATION CAPABILITY

The embodiments of the present disclosure address the unresolved problem of cognitive navigation strategies for a telepresence robotic system: instructing a robot remotely over a network to go to a point in an indoor space, to go to an area, or to go to an object. Moreover, human-robot interaction for giving and understanding such instructions has not been integrated into a common telepresence framework. The embodiments herein provide a telepresence robotic system empowered with smart navigation based on in situ intelligent visual semantic mapping of the live scene captured by a robot. The disclosure further presents an edge-centric software architecture of a teledrive comprising a speech-recognition-based HRI, a navigation module, and a real-time WebRTC-based communication framework that holds the entire telepresence robotic system together. Additionally, the disclosure provides robot-independent API calls via a ROS device driver, making the offering hardware independent and capable of running on any robot.
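The hardware-independence claim above rests on routing high-level navigation calls through a driver interface, so any robot with a matching driver can be swapped in. A hedged sketch of that layering; the class names and return strings are assumptions, not the disclosure's actual API.

```python
class RobotDriver:
    """Abstract driver; a concrete driver wraps one robot's stack
    (e.g. ROS topics/services for that platform)."""
    def goto(self, target: str) -> str:
        raise NotImplementedError

class SimDriver(RobotDriver):
    """Toy driver standing in for a real robot backend."""
    def goto(self, target: str) -> str:
        return f"navigating to {target}"

class Teledrive:
    """Robot-independent front end: all API calls route through a driver."""
    def __init__(self, driver: RobotDriver):
        self.driver = driver

    def go_to(self, target: str) -> str:
        # 'Go to a point / an area / an object' all reduce to driver calls.
        return self.driver.goto(target)
```

Replacing `SimDriver` with a driver for a different robot leaves the teledrive API untouched, which is the hardware-independence the abstract describes.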