SYSTEMS AND METHODS FOR AIRCRAFT LANDING GUIDANCE DURING GNSS DENIED ENVIRONMENT

A system comprises a GNSS sensor onboard an aerial vehicle; a monitor warning system (MWS) that determines whether the vehicle is in a GNSS-denied environment; and a flight management system that includes a landing guidance module and a database of landing-site location coordinates. Onboard vision sensors and a radar velocity system (RVS) communicate with the guidance module. When the MWS determines that the vehicle is in a GNSS-denied environment, the guidance module calculates an optimal flight path by: receiving image data from the vision sensors; receiving position, velocity, and altitude data from the RVS; receiving location coordinates of a landing site; processing the image data and the position, velocity, and altitude data to determine the location of the vehicle and provide 3D imaging of a route to the landing site; and calculating a flight path angle to the landing site using the vehicle and landing-site coordinates.
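The final step, computing a flight path angle from vehicle and landing-site coordinates, can be sketched as below. The abstract does not specify a coordinate frame; a shared local east-north-up frame in metres is assumed here, and the function name is illustrative.

```python
import math

def flight_path_angle(vehicle_enu, site_enu):
    """Flight path angle in degrees from the vehicle to the landing site.

    Inputs are (east, north, up) positions in metres in a common local
    frame (an assumption; the disclosure does not fix a frame).
    A negative angle indicates a descent toward the site.
    """
    de = site_enu[0] - vehicle_enu[0]
    dn = site_enu[1] - vehicle_enu[1]
    du = site_enu[2] - vehicle_enu[2]
    horizontal = math.hypot(de, dn)          # ground-track distance
    return math.degrees(math.atan2(du, horizontal))
```

For example, a vehicle at 1000 m altitude with 5000 m of ground distance to the site would fly a path angle of about -11.3 degrees.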

Remote monitoring system and an autonomous running vehicle and remote monitoring method

An autonomous running vehicle transmits a camera image of its surroundings, photographed by an onboard camera, to a remote monitoring center. Obstacles are detected on the basis of information obtained from autonomous sensors, including the camera, and when an obstacle is detected the vehicle automatically stops. When the vehicle automatically stops, the remote monitoring center determines, on the basis of the received camera video, whether the vehicle is permitted to restart running. When it determines that the vehicle can restart, the center transmits a departure signal to the vehicle, and on receiving the departure signal the vehicle restarts running.
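The stop/restart handshake described above can be sketched as a minimal state machine. Class, method, and decision-rule names here are illustrative assumptions, not taken from the disclosure.

```python
class AutonomousVehicle:
    """Minimal sketch of the obstacle-stop / remote-restart handshake."""

    def __init__(self):
        self.stopped = False

    def on_obstacle_detected(self):
        # The vehicle stops automatically when an onboard sensor
        # (e.g. the camera) reports an obstacle.
        self.stopped = True

    def on_departure_signal(self):
        # Running resumes only after the remote center's departure signal.
        if self.stopped:
            self.stopped = False


def remote_center_decision(camera_view_is_clear: bool) -> bool:
    # Stand-in for the center's review of the transmitted camera video;
    # the real decision criteria are not specified in the abstract.
    return camera_view_is_clear
```

A typical exchange: the vehicle stops on detection, the center reviews the video, and only a positive decision triggers the departure signal that clears the stopped state.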

Photographing device and method

A flying drone for shelf label checking includes a flying mechanism, a camera, and a camera interface configured to transmit and receive data to and from the camera. A flight control interface is configured to transmit and receive data to and from the flying mechanism. A processor is configured to acquire a first image of an object from a first distance with the camera, then extract an object region for the object from the first image. The processor then sets a flight path based on the object region and controls the flying mechanism to fly the camera along the flight path to a second distance that is closer to the object than the first distance. A second image of the object is then acquired from the second distance with the camera.
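The two-stage acquisition — extract an object region from a far image, then fly closer for a second shot — can be sketched as follows. The bounding-box representation, the aim-point rule, and the fixed distance ratio are illustrative assumptions.

```python
def plan_approach(object_region, first_distance, distance_ratio=0.5):
    """Plan the second, closer shot from a region found in the first image.

    object_region: (x, y, w, h) bounding box in first-image pixels.
    Returns an aim point (region centre) and a second distance that is
    strictly closer than the first, per the described flight path.
    """
    x, y, w, h = object_region
    aim_point = (x + w / 2, y + h / 2)            # centre the camera on the object
    second_distance = first_distance * distance_ratio
    return aim_point, second_distance
```

For a shelf label found at box (10, 20, 100, 50) from 4 m away, this yields an aim point of (60, 45) and a second shot from 2 m.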

Movement control system, movement control method, and non-transitory computer readable medium

According to one embodiment, a movement control system includes a receiver configured to receive start information for deciding a start order from which execution of an operation plan is started, the operation plan including a plurality of orders for controlling a movable object; and an operation plan executor configured to start the execution of the operation plan from the start order decided based on the start information.
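The core behaviour — resuming an operation plan from a designated start order rather than from the beginning — can be sketched in a few lines. Representing the plan as a list of order names and the start information as one of those names is an assumption for illustration.

```python
def execute_plan(plan, start_order):
    """Execute an operation plan from the order decided by start information.

    plan: ordered list of orders for controlling a movable object
    start_order: the order from which execution starts (skipping earlier ones)
    Returns the orders actually executed, as a stand-in for issuing them.
    """
    start_index = plan.index(start_order)   # decide where execution begins
    executed = []
    for order in plan[start_index:]:
        executed.append(order)              # placeholder for dispatching the order
    return executed
```

Starting the plan ["lift", "move", "rotate", "drop"] at "rotate" executes only the last two orders, which is the point of receiving start information instead of always replaying the full plan.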

Detecting road conditions based on braking event data received from vehicles
11866020 · 2024-01-09

Data is received regarding vehicle braking events, each event occurring on one of a plurality of vehicles, and each event associated with a location. A determination is made that the braking events correspond to a pattern. Based on determining that the braking events correspond to the pattern, a first location is identified. In response to identifying the first location, at least one action is performed.
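One simple way to realize "braking events correspond to a pattern" is spatial clustering of event locations. The grid-bucketing approach, cell size, and event threshold below are illustrative assumptions, not the claimed pattern test.

```python
from collections import Counter

def braking_hotspot(events, cell_size=0.001, min_events=3):
    """Identify a first location where braking events cluster.

    events: iterable of (lat, lon) pairs, one per braking event.
    Events are bucketed into a lat/lon grid; a cell holding at least
    min_events events is treated as the pattern's location.
    Returns the cell centre, or None if no cell meets the threshold.
    """
    cells = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in events
    )
    (cell, count), = cells.most_common(1) or [((None, None), 0)]
    if count >= min_events:
        return (cell[0] * cell_size, cell[1] * cell_size)
    return None
```

Once a hotspot is identified, the "at least one action" could be anything from alerting following vehicles to flagging the road segment for inspection.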

Domestic robotic system
11865708 · 2024-01-09

A domestic robotic system includes a moveable robot having an image obtaining device for obtaining images of the exterior environment of the robot, and a processor programmed to detect a predetermined pattern within the obtained images. The processor and image obtaining device form at least part of a first navigation system for the robot which can determine a first estimate of at least one of the position and orientation of the robot. A second navigation system for the robot determines an alternative estimate of the at least one of the position and orientation of the robot. Calibration of the second navigation system can be performed using the first navigation system.
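Calibrating the second navigation system against the first can be sketched as estimating a systematic offset between their position estimates. Modelling the error as a constant 2D bias is an illustrative simplification; the disclosure does not specify the calibration model.

```python
def calibrate_offset(first_estimates, second_estimates):
    """Average the disagreement between the vision-based (first) and
    alternative (second) navigation systems over paired fixes.

    Each input is a list of (x, y) position estimates taken at the
    same instants. Returns the (dx, dy) bias to add to second-system fixes.
    """
    n = len(first_estimates)
    dx = sum(a[0] - b[0] for a, b in zip(first_estimates, second_estimates)) / n
    dy = sum(a[1] - b[1] for a, b in zip(first_estimates, second_estimates)) / n
    return (dx, dy)

def corrected(second_fix, offset):
    # Apply the calibration to a later second-system position estimate.
    return (second_fix[0] + offset[0], second_fix[1] + offset[1])
```

After calibration, the robot can keep navigating on the second system (e.g. odometry) between sightings of the predetermined pattern, with the bias removed.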

V2X information elements for maneuver and path planning

Techniques disclosed herein provide for enhanced V2X communications by defining Information Elements (IEs) for V2X messaging between V2X entities. For a transmitting vehicle that sends a V2X message to a receiving vehicle, these IEs are indicative of: a vehicle model type of a vehicle detected by the transmitting vehicle; a pitch rate of the transmitting vehicle, a detected vehicle, or a detected object; a roll rate of the transmitting vehicle, a detected vehicle, or a detected object; a yaw rate of a detected vehicle or a detected object; a pitch rate confidence; a roll rate confidence; an indication of whether a rear brake light of a detected vehicle is on; an indication of whether a turn signal of a detected vehicle is on; or any combination thereof. With this information, the receiving vehicle can make more intelligent maneuvers than are available through traditional V2X messaging.
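The IEs listed above can be gathered into a simple container on the receiving side. Field names, types, and the example gap-keeping rule below are illustrative assumptions, not the standardized V2X encoding.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class V2XManeuverIEs:
    """Optional information elements carried in a V2X message."""
    detected_vehicle_model_type: Optional[str] = None
    pitch_rate_deg_s: Optional[float] = None
    roll_rate_deg_s: Optional[float] = None
    yaw_rate_deg_s: Optional[float] = None
    pitch_rate_confidence: Optional[float] = None
    roll_rate_confidence: Optional[float] = None
    rear_brake_light_on: Optional[bool] = None
    turn_signal_on: Optional[bool] = None

def should_increase_gap(ies: V2XManeuverIEs) -> bool:
    # One possible receiver-side maneuver decision: back off when the
    # detected vehicle ahead is braking or signalling a turn.
    return bool(ies.rear_brake_light_on or ies.turn_signal_on)
```

Every field is optional, matching the "any combination thereof" language: a sender populates only the elements it can observe.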

Sharing sensor data between multiple controllers to support vehicle operations

This disclosure presents an assisted driving vehicle system, including autonomous, semi-autonomous, and technology assisted vehicles, that can share sensor data among two or more controllers. A sensor can have one communication channel to a controller, thereby saving cabling and circuitry costs. The data from the sensor can be sent from one controller to another controller to enable redundancy and backup in case of a system failure. Sensor data from more than one sensor can be aggregated at one controller prior to the aggregated sensor data being communicated to another controller thereby saving bandwidth and reducing transmission times. The sharing of sensor data can be enabled through the use of a sensor data distributor, such as a converter, repeater, or a serializer/deserializer set located as part of the controller and communicatively coupled to another such device in another controller using a data interface communication channel.
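The aggregate-then-forward pattern described above can be sketched as two controllers sharing sensor data over a single inter-controller channel. Class and sensor names are illustrative; the dictionary merge stands in for the serializer/deserializer link.

```python
class Controller:
    """Sketch of a controller that aggregates sensor data and forwards
    it to a peer for redundancy."""

    def __init__(self, name):
        self.name = name
        self.received = {}

    def ingest(self, sensor_id, reading):
        # Each sensor uses one communication channel to one controller.
        self.received[sensor_id] = reading

    def forward_aggregate(self, peer):
        # Aggregate locally held sensor data and send it once over the
        # inter-controller data interface, giving the peer a backup copy.
        peer.received.update(self.received)

# Usage: the primary controller owns both sensors; the backup gets a copy.
primary = Controller("primary")
backup = Controller("backup")
primary.ingest("radar_front", 42.0)
primary.ingest("camera_front", "frame-001")
primary.forward_aggregate(backup)
```

Sending one aggregated batch instead of per-sensor messages is what saves bandwidth; the copy at the backup controller is what enables failover.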

Detecting Road Conditions Based on Braking Event Data Received from Vehicles
20190382029 · 2019-12-19

Data is received regarding vehicle braking events, each event occurring on one of a plurality of vehicles, and each event associated with a location. A determination is made that the braking events correspond to a pattern. Based on determining that the braking events correspond to the pattern, a first location is identified. In response to identifying the first location, at least one action is performed.