G05D1/245

SYSTEMS AND METHODS FOR AIRCRAFT LANDING GUIDANCE DURING GNSS DENIED ENVIRONMENT

A system comprises a GNSS sensor onboard an aerial vehicle; a monitor warning system (MWS) that determines whether the vehicle is in a GNSS-denied environment; and a flight management system that includes a landing guidance module and a database having location coordinates of landing sites. Onboard vision sensors and a radar velocity system (RVS) communicate with the guidance module. When the MWS determines that the vehicle is in a GNSS-denied environment, the guidance module calculates an optimal flight path by receiving image data from the vision sensors; receiving position, velocity, and altitude data from the RVS; receiving location coordinates of a landing site; processing the image data and the position, velocity, and altitude data to determine a location of the vehicle and provide 3D imaging of a route to the landing site; and calculating a flight path angle to the landing site using vehicle and landing-site coordinates.
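The final step above, computing a flight path angle from vehicle and landing-site coordinates, can be sketched as follows. This is a minimal illustration, not the patent's method: the function name, the equirectangular ground-distance approximation, and all parameters are assumptions for the sake of the example.

```python
import math

def flight_path_angle(vehicle_lat, vehicle_lon, vehicle_alt_m,
                      site_lat, site_lon, site_alt_m):
    """Approximate flight path angle (degrees) from vehicle to landing site.

    Uses an equirectangular approximation for ground distance, which is
    adequate at short final-approach ranges. Positive angle = descending.
    """
    R = 6371000.0  # mean Earth radius, meters
    dlat = math.radians(site_lat - vehicle_lat)
    dlon = math.radians(site_lon - vehicle_lon)
    mean_lat = math.radians((site_lat + vehicle_lat) / 2.0)
    ground = R * math.hypot(dlat, dlon * math.cos(mean_lat))  # ground distance, m
    descent = vehicle_alt_m - site_alt_m                      # height to lose, m
    return math.degrees(math.atan2(descent, ground))
```

For instance, a vehicle roughly 5.7 km out at 300 m above the landing site gets an angle near the conventional 3-degree glide slope.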

AUTOMATIC WORKING SYSTEM, SELF-MOVING DEVICE, AND METHODS FOR CONTROLLING SAME
20240077885 · 2024-03-07

A self-moving device, including: a moving module, a task execution module, and a control module. The control module is electrically connected to the moving module and the task execution module; it controls the moving module to actuate the self-moving device to move, and controls the task execution module to execute a working task. The self-moving device further includes a satellite navigation apparatus, electrically connected to the control module and configured to receive a satellite signal and output current location information of the self-moving device. The control module determines whether the quality of the location information output by the satellite navigation apparatus at the current location satisfies a preset condition; if it does not, the control module controls the moving module to actuate the self-moving device to change its moving manner, so that the quality of the location information output by the satellite navigation apparatus at the location after the movement satisfies the preset condition.
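The quality check described above can be sketched as a simple gate on fix-quality metrics. The abstract does not say which metrics or thresholds constitute the "preset condition"; the HDOP and satellite-count criteria below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    lat: float
    lon: float
    hdop: float           # horizontal dilution of precision (lower is better)
    num_satellites: int

# Illustrative preset condition, not from the patent:
MAX_HDOP = 2.0
MIN_SATS = 6

def fix_quality_ok(fix: Fix) -> bool:
    """True if the satellite fix meets the assumed preset condition."""
    return fix.num_satellites >= MIN_SATS and fix.hdop <= MAX_HDOP

def next_action(fix: Fix) -> str:
    """Return 'work' if the fix is usable, else 'relocate' to seek better signal."""
    return "work" if fix_quality_ok(fix) else "relocate"
```

A device under open sky with nine satellites would keep working; one under a tree canopy with a poor fix would be commanded to move.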

MOBILE BODY, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM
20240069576 · 2024-02-29

Provided are a mobile body, an information processing method, and a computer program. A mobile body of the present disclosure includes: an imaging unit configured to capture an image of the environment around the mobile body; an estimation unit configured to estimate the position of the mobile body on the basis of the captured image; a calculation unit configured to calculate the position of the mobile body on the basis of a control command for controlling its movement; and a wind information calculation unit configured to calculate information regarding wind acting on the mobile body on the basis of a first position, the position estimated by the estimation unit, and a second position, the position calculated by the calculation unit.
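The core idea above, attributing the discrepancy between the image-estimated position and the command-predicted position to wind, can be sketched in a few lines. This is an assumed simplification (2D, constant wind over the interval), not the disclosed implementation.

```python
def estimate_wind(first_pos, second_pos, dt):
    """Estimate wind velocity (m/s) acting on the mobile body.

    first_pos:  (x, y) position estimated from camera images
    second_pos: (x, y) position predicted from control commands alone
    dt:         time interval (s) over which the discrepancy accumulated

    The displacement between the two positions is attributed entirely
    to wind drift, giving an average wind velocity over the interval.
    """
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return (dx / dt, dy / dt)
```

For example, if the body ends up 2 m further along x than its commands alone predict after 2 s, the estimated wind is 1 m/s along x.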

Intersection node-assisted high-definition mapping

A computer-implemented method for controlling a vehicle includes receiving, via a processor, from two or more IX control devices disposed at two or more stationary positions having known latitudes, longitudes, and orientations, first sensory data identifying the position and dimensions of a feature in a mapped region. The processor generates a plurality of IX nodes based on the first sensory data received from the IX control devices, and receives a LiDAR point cloud that includes LiDAR data and other vehicle sensory data, such as Inertial Measurement Unit (IMU) data, received from an autonomous vehicle (AV) driving in the mapped region. The LiDAR point cloud includes a simultaneous localization and mapping (SLAM) map having second dimension information and second position information associated with the feature in the mapped region. The processor generates, without GPS and/or real-time kinematics information, an optimized High-Definition (HD) map having absolute accuracy, using batch optimization and map smoothing.

Mobile Robot Positioning Method and System Based on Wireless Ranging Sensors, and Chip
20240061442 · 2024-02-22

The present disclosure provides a mobile robot positioning method and system based on wireless ranging sensors, and a chip. The positioning method controls a mobile robot to traverse two target positions in succession and acquires, at each traversed position, the distance between the mobile robot and a single fixed positioning base station, rather than calculating distances between the robot at one position and several different base stations; this reduces the trouble of arranging a plurality of base stations in the positioning area.
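Geometrically, two ranges to one fixed base station taken from two known robot positions constrain the base station to the intersection of two circles. The sketch below shows that intersection step only; it is a standard circle-intersection computation assumed for illustration, not the patent's algorithm, and the remaining two-fold ambiguity would need a further measurement or prior to resolve.

```python
import math

def base_station_candidates(p1, r1, p2, r2):
    """Intersect two range circles to find candidate base-station positions.

    p1, p2: known robot positions (x, y) at the two traversed points
    r1, r2: measured distances to the fixed base station at each point
    Returns the two intersection points (one if tangent, none if inconsistent).
    """
    d = math.dist(p1, p2)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # circles do not intersect; measurements inconsistent
    a = (r1**2 - r2**2 + d**2) / (2 * d)       # distance from p1 along baseline
    h = math.sqrt(max(r1**2 - a**2, 0.0))      # offset perpendicular to baseline
    ex, ey = (p2[0] - p1[0]) / d, (p2[1] - p1[1]) / d  # unit vector p1 -> p2
    mx, my = p1[0] + a * ex, p1[1] + a * ey    # foot of the perpendicular
    cands = [(mx - h * ey, my + h * ex), (mx + h * ey, my - h * ex)]
    return cands if h > 0 else cands[:1]
```

For robot positions (0, 0) and (6, 0) with both ranges equal to 5, the candidates are (3, 4) and (3, -4).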

ENHANCED OBSERVABILITY UNINHABITED AERIAL VEHICLES AND METHODS OF USE
20240059411 · 2024-02-22

Aerial vehicles, their structures, and methods of locomotion are described. An aerial vehicle may include a fuselage having an x-axis, a plurality of flexible structures emanating from the fuselage that take the form of a feather, wing, and/or tentacle, at least one motor, and at least one propeller driven by the one or more motors. Each flexible structure may extend from the fuselage in any direction and may be used to enhance the observability of the aircraft by moving and/or oscillating within a frequency band, and at a magnitude, that is more easily observed by and catches the human eye.

System and method for proximate vehicle intention prediction for autonomous vehicles

A system and method for proximate vehicle intention prediction for autonomous vehicles are disclosed. A particular embodiment is configured to: receive perception data associated with a host vehicle; extract features from the perception data to detect a proximate vehicle in the vicinity of the host vehicle; generate a trajectory of the detected proximate vehicle based on the perception data; use a trained intention prediction model to generate a predicted intention of the detected proximate vehicle based on the perception data and the trajectory of the detected proximate vehicle; use the predicted intention of the detected proximate vehicle to generate a predicted trajectory of the detected proximate vehicle; and output the predicted intention and predicted trajectory for the detected proximate vehicle to another subsystem.

Camera-based commissioning

Lighting control systems may be commissioned for programming and/or control with the aid of a mobile device. Design software may be used to create a floor plan of how the lighting control system may be designed. The design software may generate floor plan identifiers for each lighting fixture, or group of lighting fixtures. During commissioning of the lighting control system, the mobile device may be used to help identify the lighting devices that have been installed in the physical space. The mobile device may receive a communication from each lighting control device that indicates a unique identifier of the lighting control device. The unique identifier may be communicated by visible light communication (VLC) or RF communication. The unique identifier may be associated with the floor plan identifier for communication of digital messages to lighting fixtures installed in the locations indicated in the floor plan identifier.

Systems and methods for traction detection and control in a self-driving vehicle

Methods and systems are provided for traction detection and control of a self-driving vehicle. The self-driving vehicle has drive motors that drive drive-wheels according to a drive-motor speed. Traction detection and control can be obtained by measuring the vehicle speed with a sensor such as a LiDAR or video camera, and measuring the wheel speed of the drive wheels with a sensor such as a rotary encoder. The difference between the measured vehicle speed and the measured wheel speeds can be used to determine if a loss of traction has occurred in any of the wheels. If a loss of traction is detected, then a recovery strategy can be selected from a list of recovery strategies in order to reduce the effects of the loss of traction.
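The comparison described above can be sketched directly: convert each encoder reading to a linear wheel speed and flag wheels whose speed diverges from the externally measured vehicle speed. The 15% slip threshold and all function names are illustrative assumptions, not values from the patent.

```python
import math

SLIP_THRESHOLD = 0.15  # illustrative fractional slip at which traction loss is flagged

def wheel_linear_speed(encoder_rpm, wheel_radius_m):
    """Convert a rotary-encoder reading (RPM) to linear wheel speed (m/s)."""
    return encoder_rpm * 2.0 * math.pi / 60.0 * wheel_radius_m

def detect_slip(vehicle_speed, wheel_speeds):
    """Flag wheels whose speed diverges from the LiDAR/camera vehicle speed.

    vehicle_speed: speed measured by an external sensor (LiDAR, camera), m/s
    wheel_speeds:  per-wheel linear speeds from the encoders, m/s
    Returns one boolean per wheel: True means a loss of traction is detected.
    """
    flags = []
    for ws in wheel_speeds:
        ref = max(abs(vehicle_speed), 0.1)  # avoid division by ~0 at rest
        flags.append(abs(ws - vehicle_speed) / ref > SLIP_THRESHOLD)
    return flags
```

A drive wheel spinning 50% faster than the vehicle is actually moving would be flagged, after which one of the recovery strategies could be applied to that wheel.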