DELIVERY SERVICE SYSTEM AND METHOD USING AUTONOMOUS VEHICLES

The present invention relates to a technical idea for providing a delivery service on regular and irregular roads using autonomous vehicles. More specifically, on a regular road, a lead vehicle and at least one droid vehicle are coupled to each other and provide a delivery service based on autonomous driving; on an irregular road, the coupling between the lead vehicle and the droid vehicle is automatically released, and the lead vehicle remotely controls the driving of the droid vehicle so that the droid vehicle provides a delivery service in the last-mile delivery section corresponding to the irregular road. According to one embodiment of the present invention, a system for providing a delivery service using autonomous vehicles may provide a delivery service both on an irregular road where entry of normal vehicles is not allowed and on a regular road where entry of small, low-speed vehicles is not allowed, and may include: a droid vehicle for providing a delivery service using limited autonomous driving performance in the last-mile delivery section corresponding to the irregular road; and a lead vehicle for providing a delivery service based on autonomous driving on the regular road, transporting the droid vehicle by being coupled to it on the regular road, and remotely controlling the driving of the droid vehicle after being separated from it in the last-mile delivery section.
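The mode switch described above (coupled autonomous driving on the regular road, remote control by the lead vehicle after separation on the irregular road) can be sketched as a small state machine. `DroidVehicle`, `Mode`, and the `road_type` strings are illustrative names, not terminology from the patent.

```python
from enum import Enum, auto

class Mode(Enum):
    COUPLED_AUTONOMOUS = auto()   # lead + droid travel together on the regular road
    REMOTE_CONTROLLED = auto()    # coupling released; lead vehicle drives the droid remotely

class DroidVehicle:
    """Hypothetical droid that switches operating modes by road type."""
    def __init__(self):
        self.mode = Mode.COUPLED_AUTONOMOUS

    def update(self, road_type: str) -> Mode:
        # On an irregular (last-mile) road, release the coupling and hand
        # driving over to the lead vehicle's remote controller.
        if road_type == "irregular":
            self.mode = Mode.REMOTE_CONTROLLED
        else:
            self.mode = Mode.COUPLED_AUTONOMOUS
        return self.mode
```

A real system would gate the transition on the physical decoupling mechanism and the remote-control link being established, not just on road classification.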

MOBILE OBJECT CONTROL DEVICE, MOBILE OBJECT CONTROL METHOD, TRAINING DEVICE, TRAINING METHOD, GENERATION DEVICE, AND STORAGE MEDIUM
20240071090 · 2024-02-29

Provided is a mobile object control device including a storage medium storing a computer-readable command and a processor connected to the storage medium, the processor executing the computer-readable command to: acquire a photographed image, which is obtained by photographing surroundings of a mobile object by a camera mounted on the mobile object, and an input instruction sentence, which is input by a user of the mobile object; detect a stop position of the mobile object corresponding to the input instruction sentence in the photographed image by inputting at least the photographed image and the input instruction sentence into a trained model including a pre-trained visual-language model, the trained model being trained so as to receive input of at least an image and an instruction sentence to output a stop position of the mobile object corresponding to the instruction sentence in the image; and cause the mobile object to travel to the stop position.
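The control flow above — camera image plus instruction sentence in, stop position out — can be sketched with the trained model abstracted as a callable. `StopPlanner` and `dummy_model` are hypothetical stand-ins; the patent's trained model is a pre-trained visual-language model, not reproduced here.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class StopPlanner:
    """Wraps a trained (image, instruction) -> stop-position model.
    The callable stands in for the patent's visual-language model."""
    model: Callable[[bytes, str], Tuple[int, int]]

    def plan_stop(self, photographed_image: bytes, instruction: str) -> Tuple[int, int]:
        # Feed the camera image and the user's instruction sentence to the
        # trained model; the result is the stop position in image coordinates,
        # which downstream control converts into a travel target.
        return self.model(photographed_image, instruction)

def dummy_model(image: bytes, instruction: str) -> Tuple[int, int]:
    # Stand-in for a trained model: always detects a fixed pixel position.
    return (320, 240)
```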

CONTROL APPARATUS FOR AIRCRAFT AND CONTROL METHOD THEREFOR
20240062135 · 2024-02-22

The present invention relates to a control method for a control apparatus for an aircraft, the control apparatus including a communication unit and a processor. The control method includes receiving an address of a delivery location to which a delivery product is to be delivered, and controlling an aircraft by means of the communication unit to deliver the delivery product to the delivery location. The controlling of the aircraft by means of the communication unit includes selecting, on the basis of the address, any one of a plurality of delivery criteria, each of which defines an area into which the aircraft is to unload the delivery product; establishing a delivery area and a flight lane to the delivery area on the basis of the selected delivery criterion; controlling the aircraft to fly along the flight lane; and controlling the aircraft to unload the delivery product into the delivery area when the aircraft reaches the delivery area.
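The address-based selection step can be sketched as a lookup: the patent only says the criterion is selected "on the basis of the address", so the substring-matching rule and the criterion names below are assumptions for illustration.

```python
def select_delivery_criterion(address: str, criteria: dict) -> str:
    """Pick the delivery criterion whose address pattern matches.
    (Hypothetical matching rule; the patent does not specify one.)"""
    for pattern, criterion in criteria.items():
        if pattern in address:
            return criterion
    return criteria.get("default", "open_yard")

# Illustrative criteria: each defines the kind of area the aircraft
# unloads into, from which a delivery area and flight lane are derived.
criteria = {
    "Apt": "balcony_drop",      # apartment addresses -> balcony delivery area
    "Suite": "rooftop_pad",     # office suites -> rooftop pad
    "default": "open_yard",     # everything else -> yard drop zone
}
```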

Rotorcraft autorotation control through electrical braking

A method of operating an electrically powered rotorcraft of the type having a fuselage and a set of N rotors driven by a set of electric motors and coupled to the fuselage, where N ≥ 4, under a failure condition preventing ordinary operation of the rotorcraft. The method includes entering a failsafe mode of operation wherein autorotation of at least four of the rotors is enabled. The method also includes using electrical braking associated with a selected group of the rotors to control pitch, roll and yaw of the rotorcraft.
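Using braking on a selected group of rotors to command attitude resembles a motor mixer with braking levels instead of thrust. The mixer matrix, rotor layout, and spin directions below are an illustrative quadrotor convention, not taken from the patent.

```python
def brake_commands(pitch_cmd: float, roll_cmd: float, yaw_cmd: float) -> list:
    """Map pitch/roll/yaw corrections to per-rotor electrical-braking levels
    for a quadrotor in autorotation (illustrative mixer).
    Rotor order: front-left, front-right, rear-left, rear-right.
    Yaw sign follows assumed spin direction (FL/RR clockwise, FR/RL counter)."""
    mix = [
        (+1, +1, +1),  # front-left
        (+1, -1, -1),  # front-right
        (-1, +1, -1),  # rear-left
        (-1, -1, +1),  # rear-right
    ]
    cmds = []
    for p, r, y in mix:
        raw = p * pitch_cmd + r * roll_cmd + y * yaw_cmd
        # Braking a rotor slows it and changes its drag/lift in autorotation;
        # commands are clamped to a normalized [0, 1] braking range.
        cmds.append(max(0.0, min(1.0, 0.5 + raw)))
    return cmds
```

For example, a pure nose-down pitch correction brakes the front pair harder than the rear pair, while leaving roll and yaw balanced.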

Multi-part navigation process by an unmanned aerial vehicle for navigation

Embodiments described herein may relate to an unmanned aerial vehicle (UAV) navigating to a target in order to provide medical support. An illustrative method involves a UAV (a) determining an approximate target location associated with a target, (b) using a first navigation process to navigate the UAV to the approximate target location, where the first navigation process generates flight-control signals based on the approximate target location, (c) making a determination that the UAV is located at the approximate target location, and (d) in response to the determination that the UAV is located at the approximate target location, using a second navigation process to navigate the UAV to the target, wherein the second navigation process generates flight-control signals based on real-time localization of the target.
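The two-part process above — coarse navigation to an approximate location, then fine navigation driven by real-time localization of the target — can be sketched as a simple 2-D simulation. `locate_target` stands in for the patent's real-time localization (e.g. a vision tracker); the step model and tolerance are assumptions.

```python
import math

def step_toward(pos, goal, speed=1.0):
    """Move one unit step from pos toward goal (toy flight-control signal)."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    d = math.hypot(dx, dy)
    if d <= speed:
        return goal
    return (pos[0] + speed * dx / d, pos[1] + speed * dy / d)

def navigate(start, approx_target, locate_target, tol=0.5):
    """Coarse phase: fly to the approximate target location.
    Fine phase: once there, steer by real-time localization of the target."""
    pos, phase = start, "coarse"
    for _ in range(1000):  # safety bound on iterations
        goal = approx_target if phase == "coarse" else locate_target()
        pos = step_toward(pos, goal)
        if math.hypot(pos[0] - goal[0], pos[1] - goal[1]) <= tol:
            if phase == "coarse":
                phase = "fine"    # switch to the second navigation process
            else:
                return pos        # arrived at the localized target
    return pos
```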

AUTONOMOUS DETECT AND AVOID FROM SPEECH RECOGNITION AND ANALYSIS
20240201696 · 2024-06-20

A technique for detecting and avoiding obstacles by an unmanned aerial vehicle (UAV) includes: querying a knowledge graph having information related to a dynamic obstacle that may be in proximity to the UAV when traveling along a planned route; comparing the location of the dynamic obstacle to the UAV to detect conflicts; and in response to detecting a conflict, performing an action to avoid conflict with the dynamic obstacle. The knowledge graph can be updated by receiving a VHF radio signal containing the information related to the dynamic obstacle in the audible speech format; translating the audible speech format to a text format using speech recognition; analyzing the text format for relevant information related to the dynamic obstacle; comparing the relevant information related to the dynamic obstacle of the text format to the knowledge graph to detect changes; and updating the knowledge graph.
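The update path — speech-recognized transcript in, changed knowledge-graph entries out — can be sketched with the graph as a dict keyed by callsign. The one-line message grammar below is a hypothetical simplification; real VHF traffic parsing would need far more robust language analysis.

```python
def update_knowledge_graph(graph: dict, transcript: str) -> dict:
    """Parse a speech-recognized VHF transcript for a dynamic-obstacle report
    and merge any change into the knowledge graph (illustrative grammar).
    Expected form: '<callsign> at <lat> <lon> altitude <ft>'."""
    tokens = transcript.split()
    if len(tokens) >= 6 and tokens[1] == "at" and tokens[4] == "altitude":
        callsign = tokens[0]
        entry = {"lat": float(tokens[2]), "lon": float(tokens[3]),
                 "alt_ft": int(tokens[5])}
        # Compare against the existing entry to detect a change before updating.
        if graph.get(callsign) != entry:
            graph[callsign] = entry
    return graph
```

A conflict check would then compare each entry's position against the UAV's planned route and trigger an avoidance action when they intersect.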

Method for controlling hand-over in drone network

The present invention relates to a method for controlling hand-over in a drone network. According to the present invention, a method for controlling hand-over in a drone network that is established by a plurality of drones constituting a formation, and controlled by a ground control station (GCS) that controls the location, configuration, and mobility of each of the plurality of drones, includes: a phase in which the GCS predicts, based on previously stored control information, a drone that is to be newly deployed or transferred from another formation and allocates network connection information to the predicted drone; a phase in which the GCS generates a virtual routing table including the drone thus predicted to be deployed or transferred; a phase in which the GCS, upon actually deploying or transferring the predicted drone, changes the virtual routing table into an actual routing table; and a phase in which the GCS, when the drone thus deployed or transferred transmits a control message of the formation routing protocol, calibrates and optimizes the routing table.
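The virtual-to-actual routing-table promotion can be sketched as two dictionaries on the GCS: entries are pre-allocated for predicted drones and promoted on confirmed deployment. Class and method names are illustrative, and the per-drone next-hop representation is an assumption.

```python
class GroundControlStation:
    """GCS that pre-allocates a routing entry for a drone predicted to join
    the formation, then promotes it when the drone actually arrives."""
    def __init__(self):
        self.actual = {}    # drone_id -> next_hop, routes in effect
        self.virtual = {}   # drone_id -> next_hop, pre-computed for predicted drones

    def predict_join(self, drone_id: str, next_hop: str):
        # Allocate network connection information ahead of deployment/transfer.
        self.virtual[drone_id] = next_hop

    def confirm_join(self, drone_id: str):
        # On actual deployment or transfer, the virtual entry becomes the
        # actual route, so the hand-over needs no fresh route computation.
        if drone_id in self.virtual:
            self.actual[drone_id] = self.virtual.pop(drone_id)
```

The final calibration phase (triggered by the drone's own routing-protocol control message) would then refine `actual` with live link metrics.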

UNMANNED DEVICE CONTROL METHOD AND APPARATUS, STORAGE MEDIUM, AND ELECTRONIC DEVICE
20240281005 · 2024-08-22

An unmanned device control method and apparatus, a storage medium, and an electronic device are provided. An unmanned device is controlled to move according to a preplanned target path; current environment information of the unmanned device is obtained; according to the current environment information, the target subpath on which the unmanned device is located is determined, from among the target subpaths included in the target path, as a designated subpath; a control strategy is then determined according to the scenario type corresponding to the designated subpath, and the determined control strategy is used to control the unmanned device.
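The subpath-to-strategy step can be sketched as two lookups: locate the subpath containing the current position, then map its scenario type to a control strategy. The scenario names, strategies, and arc-length subpath representation are illustrative assumptions.

```python
def pick_strategy(position: float, subpaths: list) -> str:
    """Find the subpath containing the current position and return the
    control strategy for that subpath's scenario type (illustrative mapping)."""
    strategies = {
        "open_road": "cruise",
        "crosswalk": "slow_and_yield",
        "narrow_alley": "creep_with_sonar",
    }
    for start, end, scenario in subpaths:
        # position is arc length traveled along the target path
        if start <= position < end:
            return strategies.get(scenario, "stop")
    return "stop"  # off the planned path: fail safe

# A target path split into scenario-typed subpaths (start, end, scenario).
subpaths = [(0, 50, "open_road"), (50, 60, "crosswalk"), (60, 120, "narrow_alley")]
```

In practice the designated subpath would be chosen from sensed environment features rather than from odometry alone, which is why the method conditions the lookup on current environment information.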

Assisted Navigation System for Assisted Automation of Mobile Robots

The assisted navigation system is intended to enable an assisted operation mode in ground mobile robots. The system is designed to achieve autonomous relocation of a robot from one location to another within a sidewalk, minimizing the need for constant human intervention. The system includes a camera for collecting the visual data needed to assess the terrain and potential obstacles; a collection of sensors to detect potential obstacles during assisted operations; a communication module to receive inputs from a remote operator that activate the system; a localization module for teleoperations; a local server module to store the information gathered; a processor configured to operate the robot in an assisted mode of operation, based on input from the communication interface, in which the robot performs a task without human intervention; and a communication interface coupled to the processor and configured to communicate control values to the systems of the mobile robot.

SYSTEM AND METHOD FOR DATA HARVESTING FROM ROBOTIC OPERATIONS FOR CONTINUOUS LEARNING OF AUTONOMOUS ROBOTIC MODELS

A system and method involve detecting a trigger event during operation of an autonomous ground vehicle traveling between two physical locations; generating event sequence data from primary sensor data, secondary sensor data, spatiotemporal data, and telemetry data through operation of a reporter; communicating the event sequence data to cloud storage and raw data to a streaming database; transforming the raw data into normalized data stored in a relational database through operation of a normalizer; operating a curation system to identify true trigger events from the normalized data and extract training data by way of a discriminator; operating a machine learning model within an active learning pipeline to generate a model update from aggregate training data generated from the training data by an aggregator; and reconfiguring the navigational control system with the model update communicated from the active learning pipeline to the autonomous ground vehicle.
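The curation chain — normalize raw trigger events, discriminate true triggers from spurious ones, aggregate the survivors into training data — can be sketched end to end. The record schema, score threshold, and component names are illustrative assumptions, not the patent's actual normalizer, discriminator, or aggregator.

```python
def harvest(raw_events: list) -> dict:
    """Minimal sketch of the data-harvesting flow for one batch of events."""
    def normalize(e):
        # Normalizer: raw streaming record -> relational-style record.
        return {"kind": e["kind"].lower(), "score": float(e["score"])}

    def is_true_trigger(e):
        # Discriminator: keep only true trigger events (threshold is assumed).
        return e["score"] >= 0.8

    normalized = [normalize(e) for e in raw_events]
    training = [e for e in normalized if is_true_trigger(e)]
    # Aggregator: bundle extracted training data into one model-update batch.
    return {"count": len(training), "examples": training}
```

The resulting batch is what the active learning pipeline would train on before the model update is pushed back to the vehicle's navigational control system.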