SPACE COUPLING SYSTEM AND SPACE COUPLING METHOD
A space coupling system for coupling real space with virtual space is provided. A real position in the real space and a virtual position in the virtual space are associated with each other. A first real position is the real position of a real person in the real space or the real position of a screen configured to move following the real person. A first virtual position is the virtual position associated with the first real position and changing in conjunction with the first real position. The space coupling system acquires the first real position and the first virtual position, locates an access point accessible by a virtual person at the first virtual position in the virtual space, and displays or projects information regarding the virtual person or an image of the virtual space around the access point on the screen in the real space.
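The association between a real position and a virtual position that moves in conjunction with it can be sketched as follows. This is an illustrative assumption, not the patent's method: the mapping is modeled as a fixed linear offset, and the names `SpaceCoupler`, `Position`, and `locate_access_point` are invented for the sketch.

```python
# Minimal sketch of the real<->virtual position association described above.
# The fixed-offset mapping and all names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Position:
    x: float
    y: float

class SpaceCoupler:
    """Associates a real position with a virtual position via a fixed offset."""

    def __init__(self, offset: Position):
        self.offset = offset  # assumed mapping: virtual = real + offset

    def to_virtual(self, real: Position) -> Position:
        return Position(real.x + self.offset.x, real.y + self.offset.y)

    def locate_access_point(self, real: Position) -> Position:
        # The access point is placed at the virtual position associated
        # with the (moving) first real position.
        return self.to_virtual(real)

coupler = SpaceCoupler(offset=Position(100.0, 50.0))
person = Position(3.0, 4.0)           # first real position
access_point = coupler.locate_access_point(person)
print(access_point)                   # virtual position tracking the person
```

As the person (or the screen following the person) moves, re-calling `locate_access_point` keeps the access point at the virtual position that changes in conjunction with the first real position.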
VISION-BASED APPROACH AND LANDING SYSTEM
A vision-based navigation method that does not depend on GPS signals for landing Advanced Air Mobility (AAM) aircraft is provided. The Vision-Based Approach and Landing System (VALS) uses images captured from a camera for AAM approach and landing in GPS-denied environments, offering a potential Alternative Position, Navigation, and Timing (APNT) solution. VALS utilizes a computer vision algorithm called Coplanar Pose from Orthography and Scaling with Iterations (COPOSIT) to estimate the position and orientation of the camera based on coplanar features. VALS also includes an extended Kalman filter that uses IMU measurements in a prediction step and the COPOSIT estimation results in a correction step. Combining IMU with vision creates a sensor fusion navigation solution for GPS-denied environments.
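The predict/correct structure of such a filter can be sketched in one dimension. This is a minimal sketch, not the VALS filter: the 1-D position/velocity state, the noise values, and the `FusionFilter` name are assumptions, and a plain vision position estimate stands in for the COPOSIT output.

```python
# Hedged sketch of the IMU/vision fusion loop: a Kalman filter predicts
# with IMU acceleration and corrects with a vision-based position estimate
# (standing in for the COPOSIT result). State and noise values are assumed.

import numpy as np

class FusionFilter:
    def __init__(self, pos: float, vel: float):
        self.x = np.array([pos, vel])        # state: [position, velocity]
        self.P = np.eye(2)                   # state covariance
        self.Q = np.diag([0.01, 0.1])        # process noise (assumed)
        self.R = np.array([[0.5]])           # vision measurement noise (assumed)

    def predict(self, accel: float, dt: float):
        # Prediction step driven by an IMU acceleration measurement.
        F = np.array([[1.0, dt], [0.0, 1.0]])
        self.x = F @ self.x + np.array([0.5 * accel * dt**2, accel * dt])
        self.P = F @ self.P @ F.T + self.Q

    def correct(self, vision_pos: float):
        # Correction step using the vision-based position estimate.
        H = np.array([[1.0, 0.0]])
        y = vision_pos - H @ self.x                 # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

f = FusionFilter(pos=0.0, vel=1.0)
f.predict(accel=0.0, dt=0.1)     # IMU-driven prediction
f.correct(vision_pos=0.12)       # vision (COPOSIT-style) correction
print(f.x[0])                    # corrected position, between 0.1 and 0.12
```

The corrected estimate lands between the predicted position and the vision measurement, weighted by the assumed noise covariances.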
AERIAL VEHICLE, IMAGE PROCESSING METHOD AND DEVICE, MOVABLE PLATFORM
An image processing method may be applied to a movable platform, and the movable platform may comprise a first vision sensor and a second vision sensor. The method may include: obtaining a first localized image of the first vision sensor within an overlapping visual range; obtaining a second localized image of the second vision sensor within the overlapping visual range; acquiring an image captured by the first vision sensor at a first moment and an image captured at a second moment, the first vision sensor being positioned differently in space at the first moment than at the second moment; and determining a relative positional relationship between the movable platform and an object in the space where the movable platform is located, based on the first localized image, the second localized image, and the images captured at the first and second moments.
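The two cues the method combines can be illustrated with a simple pinhole-camera depth calculation. This is a sketch under stated assumptions, not the patent's algorithm: the pinhole model, the averaging step, and all numbers are invented for illustration.

```python
# Illustrative sketch of the two cues combined above: disparity between the
# two vision sensors (overlapping visual range), and parallax between two
# moments of the first sensor. Pinhole model and numbers are assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a feature seen from two viewpoints: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Stereo cue: the same feature in the first and second localized images.
z_stereo = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=14.0)

# Temporal cue: the same sensor at two moments separated by a known baseline
# (the sensor being positioned differently in space between the moments).
z_temporal = depth_from_disparity(focal_px=700.0, baseline_m=0.30, disparity_px=35.0)

# A simple fusion: average the two estimates of the object's depth.
z_fused = 0.5 * (z_stereo + z_temporal)
print(z_stereo, z_temporal, z_fused)  # 6.0 6.0 6.0
```

In practice the two cues would be fused with proper uncertainty weighting rather than a plain average; the point here is only that each viewpoint pair independently constrains the relative positional relationship.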
INFORMATION PROCESSING APPARATUS, PROGRAM, SYSTEM, AND INFORMATION PROCESSING METHOD
An information processing apparatus is provided, including: a prediction result obtainment unit which obtains a result of prediction of weather in the stratosphere; and a flight path determination unit which determines a flight path of a flight vehicle based on the result of prediction of weather in the stratosphere such that the flight vehicle flies through an area in the stratosphere that has been predicted to satisfy a predetermined stratospheric path condition, wherein the flight vehicle functions as a stratospheric platform, forms a wireless communication area by emitting a beam, and provides a wireless communication service to a user terminal in the wireless communication area.
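The area-selection step can be sketched as a filter over a weather prediction. This is a minimal sketch, assuming a wind-speed ceiling as the stratospheric path condition; the condition, the data, and the `candidate_areas` name are illustrative assumptions.

```python
# Minimal sketch of flight path determination from a weather prediction:
# keep only areas predicted to satisfy a path condition (here an assumed
# wind-speed ceiling) and route through the calmest one.

PREDICTED_WIND = {            # area id -> predicted wind speed (m/s), assumed data
    "area_a": 42.0,
    "area_b": 11.5,
    "area_c": 18.0,
}
MAX_WIND = 20.0               # assumed stratospheric path condition

def candidate_areas(prediction: dict, limit: float) -> list:
    """Areas predicted to satisfy the path condition, calmest first."""
    ok = {a: w for a, w in prediction.items() if w <= limit}
    return sorted(ok, key=ok.get)

path = candidate_areas(PREDICTED_WIND, MAX_WIND)
print(path)   # ['area_b', 'area_c'] -> route the platform through area_b
```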
FLIGHT DIRECTOR ASSIST SYSTEM FOR AIRCRAFT
An aircraft includes one or more line replaceable units (LRUs) including one or more processors. The aircraft also includes one or more actuators coupled to one or more flight control surfaces. The actuator(s) are communicatively coupled to at least one of the LRU(s) to receive control signals. The aircraft also includes one or more sensors coupled to at least one of the LRU(s) and configured to generate sensor data indicative of a trajectory of the aircraft. While in a manual flight mode, the processor(s) are configured to generate trajectory guidance data based on one or more trajectory setpoints. The processor(s) are also configured to, while in the manual flight mode, determine an error metric indicating deviation between the trajectory of the aircraft and the trajectory guidance data and send a control signal based on the error metric to the one or more actuators.
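The error-metric step can be sketched as follows. This is an illustrative assumption, not the patent's control law: a single altitude channel and a proportional mapping stand in for the full trajectory comparison, and all names and gains are invented.

```python
# Sketch of the manual-mode assist: compare the sensed trajectory with
# guidance derived from setpoints, and derive an actuator control signal
# from the deviation. Proportional law and numbers are assumptions.

def error_metric(sensed_altitude_ft: float, guidance_altitude_ft: float) -> float:
    """Deviation between the aircraft trajectory and the guidance data."""
    return guidance_altitude_ft - sensed_altitude_ft

def control_signal(error_ft: float, gain: float = 0.01) -> float:
    """Assumed proportional mapping from the error metric to a command."""
    return gain * error_ft

err = error_metric(sensed_altitude_ft=9_800.0, guidance_altitude_ft=10_000.0)
cmd = control_signal(err)
print(err, cmd)   # 200.0 2.0
```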
COMMUNICATING BETWEEN HETEROGENEOUS ENDPOINTS OF AN ELECTRONIC SYSTEM VIA GENERALIZED DATA MESSAGING
A technique for communicating between multiple endpoints of an electronic system includes providing a common interface component for each endpoint. Each common interface component is configured to translate between endpoint-specific messages of a respective endpoint and generalized messages that are not specific to any endpoint. Using this arrangement, any two endpoints can communicate via generalized messages, by translating endpoint-specific messages of a sender into generalized messages and by translating generalized messages into endpoint-specific messages of a receiver.
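The arrangement can be sketched with two toy endpoints. This is a minimal sketch, assuming invented message formats and names (`CommonInterface`, a string-based endpoint A, a dict-based endpoint B); the patent's actual formats are not specified here.

```python
# Hedged sketch of the common-interface idea: each endpoint gets a
# translator between its own message format and a generalized message,
# so any two endpoints can exchange generalized messages.

class CommonInterface:
    """Translates between endpoint-specific and generalized messages."""

    def __init__(self, name, to_generalized, from_generalized):
        self.name = name
        self.to_generalized = to_generalized
        self.from_generalized = from_generalized

# Endpoint A speaks comma-separated strings; endpoint B speaks dicts.
iface_a = CommonInterface(
    "A",
    to_generalized=lambda s: dict(zip(("topic", "value"), s.split(","))),
    from_generalized=lambda g: f"{g['topic']},{g['value']}",
)
iface_b = CommonInterface(
    "B",
    to_generalized=lambda d: {"topic": d["t"], "value": d["v"]},
    from_generalized=lambda g: {"t": g["topic"], "v": g["value"]},
)

def send(sender, receiver, endpoint_msg):
    # Sender-specific -> generalized -> receiver-specific.
    return receiver.from_generalized(sender.to_generalized(endpoint_msg))

received = send(iface_a, iface_b, "temp,21")
print(received)   # {'t': 'temp', 'v': '21'}
```

Because both directions pass through the generalized form, adding an Nth endpoint requires one new translator rather than N-1 pairwise ones.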
AIR MOBILITY DEVICE AND A METHOD FOR INSPECTING A VEHICLE USING THE AIR MOBILITY DEVICE
An air mobility device includes at least one communication circuit, at least one camera, a microphone, a driving device, a memory, and a processor connected with the at least one communication circuit, the at least one camera, the microphone, the driving device, and the memory. The processor moves the device to a position adjacent to a vehicle using the driving device. The processor also establishes communication with the vehicle using the at least one communication circuit. The processor also transmits to the vehicle, through the communication, a request signal requesting the vehicle to execute a function. The processor also inspects the function of the vehicle using at least one of the at least one camera or the microphone.
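The request-then-verify flow can be sketched as follows. This is an illustrative assumption, not the patent's protocol: the `FakeVehicle`, the "horn" function, and the microphone check are invented stand-ins.

```python
# Illustrative sketch of the inspection flow: the air mobility device
# requests the vehicle to execute a function, then verifies the result
# with a sensor observation. Names and the check rule are assumptions.

def inspect(vehicle, function_name: str, observe) -> bool:
    vehicle.execute(function_name)     # request sent via the comm link
    return observe(function_name)      # verify via camera or microphone

class FakeVehicle:
    def __init__(self):
        self.executed = []
    def execute(self, fn):
        self.executed.append(fn)

def mic_observation(fn: str) -> bool:
    # Assumed check: the horn function should be audible on the microphone.
    return fn == "horn"

v = FakeVehicle()
ok = inspect(v, "horn", mic_observation)
print(ok)   # True
```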
Devices, systems and methods for navigating a mobile platform
Aspects of embodiments relate to systems and methods for navigating a mobile platform using an imaging device on the platform, from a point of origin towards a target located in a scene, and without requiring a Global Navigation Satellite System (GNSS), by employing the following steps: acquiring, by the imaging device, an image of the scene comprising the target; determining, based on analysis of the image, a direction vector pointing from the mobile platform to the target; advancing the mobile platform in accordance with the direction vector to a new position; and generating, by a distance sensing device, an output as a result of attempting to determine, with the distance sensing device, a distance between the mobile platform and the target. The mobile platform is advanced towards the target until the output produced by the distance sensing device is descriptive of a distance which meets a low-distance criterion.
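The acquire-direct-advance-check loop can be sketched in 2-D. This is a minimal sketch under stated assumptions: the direction vector is computed geometrically here as a stand-in for image analysis, the distance sensor is simulated, and the step size and low-distance threshold are invented.

```python
# Sketch of the GNSS-free loop described above: estimate a direction
# vector toward the target, advance along it, and stop when the sensed
# distance meets the low-distance criterion. Geometry and thresholds
# are illustrative assumptions.

import math

def direction_vector(platform, target):
    dx, dy = target[0] - platform[0], target[1] - platform[1]
    norm = math.hypot(dx, dy)
    return (dx / norm, dy / norm)

def navigate(start, target, step=1.0, low_distance=0.5, max_steps=100):
    pos = list(start)
    for _ in range(max_steps):
        d = math.hypot(target[0] - pos[0], target[1] - pos[1])  # distance sensor
        if d <= low_distance:                  # low-distance criterion met
            return tuple(pos)
        ux, uy = direction_vector(pos, target) # stand-in for image analysis
        pos[0] += step * ux                    # advance along the vector
        pos[1] += step * uy
    return tuple(pos)

final = navigate(start=(0.0, 0.0), target=(3.0, 4.0))
print(final)   # ends adjacent to the target
```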
CONTROL METHOD FOR MOVABLE PLATFORM, HEAD-MOUNTED DEVICE, SYSTEM, AND STORAGE MEDIUM
A method for controlling a movable platform includes: displaying a first image and/or a second image on a display device of a head-mounted device, where the first image is captured by a first photographing device on the head-mounted device and the second image is captured by a second photographing device on the movable platform; when switching from displaying the second image to displaying the first image on the display device, sending a safety operation instruction to the movable platform to make the movable platform perform a corresponding safety operation. This disclosure ensures the safety of the movable platform during interactions between the head-mounted device and the movable platform, particularly when the content displayed by the head-mounted device changes. The disclosure also provides a terminal device and a head-mounted device.
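The switch-triggered safety rule can be sketched as follows. This is an illustrative assumption, not the disclosure's implementation: the "hover" command, the string image labels, and the `Controller` name are invented for the sketch.

```python
# Minimal sketch of the safety rule described above: when the display
# switches from the movable platform's camera (second image) to the
# head-mounted device's camera (first image), a safety operation
# instruction (assumed here to be "hover") is sent to the platform.

class Controller:
    def __init__(self, platform_commands):
        self.platform_commands = platform_commands
        self.showing = "second"            # second image: platform's camera

    def display(self, image: str):
        if self.showing == "second" and image == "first":
            self.platform_commands.append("hover")  # assumed safety operation
        self.showing = image

commands = []                              # stands in for the command link
c = Controller(commands)
c.display("first")                         # switch: second -> first
print(commands)   # ['hover']
```

The instruction fires only on the second-to-first transition, matching the described case where the content shown by the head-mounted device changes away from the platform's view.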
IMPROVEMENTS IN IMAGE ACQUISITION PLANNING SYSTEMS AND METHODS USED TO GENERATE INFORMATION FOR STRUCTURES OF INTEREST
The present disclosure relates to improvements in systems and methods for acquiring images via imaging devices, where such imaging devices can, in some implementations, be configured with an unmanned aerial vehicle or other vehicle types, as well as being hand-held. Images are acquired from the imaging devices according to capture plans, whereby useful information types about a structure of interest (or objects, items, etc.) can be derived from a structure image acquisition event. Images acquired from capture plans can be evaluated to generate improvements in capture plans for use in subsequent structure image acquisition events. Capture plans provided herein generate accurate information as to all or part of the structure of interest, where accuracy is in relation to the real-life structure incorporated in the acquired images.
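The evaluate-and-improve feedback loop can be sketched as follows. This is a sketch under stated assumptions, not the disclosure's method: the usable-fraction metric and the add-viewpoints adjustment rule are invented for illustration.

```python
# Illustrative sketch of the capture-plan feedback loop: images from one
# structure image acquisition event are evaluated, and the capture plan is
# adjusted for the next event. Metric and adjustment rule are assumptions.

def evaluate(images: list) -> float:
    """Assumed quality metric: fraction of acquired images that are usable."""
    return sum(1 for img in images if img["usable"]) / len(images)

def improve_plan(plan: dict, score: float) -> dict:
    # Assumed rule: if coverage was poor, add more viewpoints next time.
    if score < 0.8:
        return {**plan, "viewpoints": plan["viewpoints"] + 4}
    return plan

plan = {"viewpoints": 12}
images = [{"usable": True}] * 7 + [{"usable": False}] * 3   # 70% usable
new_plan = improve_plan(plan, evaluate(images))
print(new_plan)   # {'viewpoints': 16}
```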