G01S15/86

System for automated exploration by an autonomous mobile device using markers based on image features

An autonomous mobile device (AMD) uses sensors to explore a physical space and determine the locations of obstacles. Simultaneous localization and mapping (SLAM) techniques are used by the AMD to designate as keyframes some images and their associated descriptors of features in the space. Each keyframe indicates a location and orientation of the AMD relative to those features. Anchors are specified relative to keyframes. A marker is specified relative to one or more anchors. Because markers are associated with features in the physical space, they maintain their association with the physical space through various processes such as SLAM loop closures. Markers may specify locations in the physical space, such as navigation waypoints, navigation destinations (for example, a goal pose for exploring an unexplored area), or observation targets that facilitate exploration. Markers may also specify blocklisted locations to be avoided during exploration.
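As a rough illustration of the anchoring chain described above, the sketch below derives a marker's world position through its anchor and keyframe, so a loop-closure correction to the keyframe automatically carries the marker with it. All class names and offsets are invented for the example, not taken from the disclosure.

```python
# Hypothetical sketch: a marker's world pose is derived through its anchor
# and keyframe, so a SLAM loop closure that moves the keyframe moves the
# marker with it.
from dataclasses import dataclass

@dataclass
class Keyframe:
    x: float      # world position estimated by SLAM
    y: float

@dataclass
class Anchor:
    keyframe: Keyframe
    dx: float     # offset relative to the keyframe
    dy: float

@dataclass
class Marker:
    anchor: Anchor
    dx: float     # offset relative to the anchor
    dy: float

    def world_position(self):
        kf = self.anchor.keyframe
        return (kf.x + self.anchor.dx + self.dx,
                kf.y + self.anchor.dy + self.dy)

kf = Keyframe(10.0, 5.0)
waypoint = Marker(Anchor(kf, 1.0, 0.0), 0.5, 0.5)
print(waypoint.world_position())   # (11.5, 5.5)

# A loop closure corrects the keyframe estimate; the marker follows.
kf.x, kf.y = 9.2, 5.4
print(waypoint.world_position())
```

Because the marker never stores an absolute coordinate, any global correction to keyframe poses leaves marker-to-feature relationships intact, which is the property the abstract highlights.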

MOBILE MACHINERY SITUATIONAL AWARENESS APPARATUS
20220382285 · 2022-12-01

An apparatus for mobile machinery includes a sensor module configured for detachable fitment to mobile machinery, and a control module configured to interface with a control system of the mobile machinery. The sensor module includes an orthogonal sensor arrangement for sensing obstacles in three-dimensional space, an orthogonal indicator arrangement configured to provide a gradient proximity indication of an obstacle, and a wireless transceiver arranged in signal communication with the sensor and indicator arrangements. The control module includes a control wireless transceiver for communicating with the wireless transceiver of the sensor module, and a processor in signal communication with the control wireless transceiver, the processor configured to program, via the wireless transceiver of each sensor module, the sensor and indicator arrangements with predetermined thresholds of obstacle proximity. If a sensed obstacle proximity exceeds a maximum threshold, the processor overrides the control system to facilitate situational awareness of an operator.
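The gradient indication and override behavior can be sketched as a simple threshold ladder. The distances, function names, and linear gradient below are assumptions for illustration, not the patented implementation.

```python
# Illustrative sketch: gradient proximity indication with a hard override
# threshold. Thresholds and the linear ramp are invented values.
WARN_DISTANCE_M = 3.0    # start of the gradient indication
STOP_DISTANCE_M = 0.5    # maximum-proximity threshold triggering override

def proximity_level(distance_m: float) -> float:
    """Return 0.0 (clear) .. 1.0 (at or inside the stop threshold)."""
    if distance_m >= WARN_DISTANCE_M:
        return 0.0
    if distance_m <= STOP_DISTANCE_M:
        return 1.0
    return (WARN_DISTANCE_M - distance_m) / (WARN_DISTANCE_M - STOP_DISTANCE_M)

def control_action(distance_m: float) -> str:
    level = proximity_level(distance_m)
    if level >= 1.0:
        return "override"               # processor takes over the controls
    if level > 0.0:
        return f"indicate:{level:.2f}"  # gradient indication to the operator
    return "normal"

print(control_action(5.0))   # normal
print(control_action(1.75))  # indicate:0.50
print(control_action(0.3))   # override
```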

SENSOR LAYOUT OF VEHICLES
20220381899 · 2022-12-01

The present disclosure relates to a vehicle. The vehicle includes a first set of cameras, including a first subset of cameras facing the front of the vehicle; a second set of cameras, with focal lengths less than those of the first set, the second set including second and third subsets of cameras, the second subset facing the front of the vehicle and the third subset facing a side front and/or a side of the vehicle; and a third set of cameras, with focal lengths less than those of the second set, the third set including fourth and fifth subsets of cameras, the fourth subset facing the front of the vehicle and the fifth subset facing the side front and/or side of the vehicle.
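The layering of the three camera sets (strictly decreasing focal length, progressively wider facing) can be captured in a small configuration sketch. The focal-length values here are invented; the abstract fixes only their ordering.

```python
# Illustrative configuration only: the abstract specifies the ordering
# focal(set 1) > focal(set 2) > focal(set 3), not concrete values.
camera_sets = [
    {"set": 1, "focal_mm": 50, "subsets": ["front"]},
    {"set": 2, "focal_mm": 25, "subsets": ["front", "side-front/side"]},
    {"set": 3, "focal_mm": 12, "subsets": ["front", "side-front/side"]},
]

# Longer focal length -> narrower field of view but greater range, so the
# first set covers far-ahead while the third covers the wide near field.
focals = [s["focal_mm"] for s in camera_sets]
assert focals == sorted(focals, reverse=True)
print(focals)   # [50, 25, 12]
```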

UNMANNED AERIAL VEHICLE WITH UNDERWATER SONAR SCANNING CAPABILITY

An unmanned aerial system includes an unmanned aerial vehicle having a body and a primary propulsion system coupled to the body. The primary propulsion system includes at least one propeller and at least one motor coupled to the at least one propeller. The unmanned aerial system also includes a pair of landing gears coupled to the body of the unmanned aerial vehicle. Each landing gear of the pair of landing gears includes a buoyant elongated float. The unmanned aerial system also includes a SONAR device coupled to the unmanned aerial vehicle.

Vehicle guidance device, method, and computer program product
11511805 · 2022-11-29

According to one embodiment, a vehicle guidance device is to be installed in a vehicle to provide path guidance to the vehicle. The device includes a plurality of receivers that receive a ranging information signal via an ultrasonic-wave ranging sensor, the ranging information signal including an ultrasonic-wave ranging signal on which path guidance information is superimposed, the ultrasonic-wave ranging signal being for measuring a distance to an object; an information extractor that extracts, for each of the receivers, the path guidance information from the ranging information signal; and a path guide that provides the path guidance on the basis of a distance corresponding to the ultrasonic-wave ranging signal and the path guidance information.
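A minimal sketch of the receiver side, assuming the distance comes from round-trip time of flight and the superimposed guidance information is a short bit payload; both assumptions go beyond what the abstract states.

```python
# Illustrative sketch: one received ranging frame yields both a distance
# (from time of flight) and the guidance payload superimposed on it.
SPEED_OF_SOUND_M_S = 343.0   # assumed speed of sound in air

def decode_ranging_frame(time_of_flight_s, payload_bits):
    distance_m = SPEED_OF_SOUND_M_S * time_of_flight_s / 2.0  # round trip
    guidance = int("".join(str(b) for b in payload_bits), 2)  # e.g. path ID
    return distance_m, guidance

dist, path_id = decode_ranging_frame(0.02, [1, 0, 1])
print(round(dist, 2), path_id)   # 3.43 5
```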

System and method associated with user authentication based on an acoustic-based echo-signature

A system associated with predicting authentication of a device user based on a joint features representation related to an echo-signature associated with a device is disclosed. The system performs operations that include emitting acoustic signals in response to a request for processing of a profile associated with the device. The system receives a set of echo acoustic signals that are tailored based on reflection of the acoustic signals from unique contours of one or more depth portions associated with the user relative to a discrete epoch. One or more region segments associated with the echo acoustic signals are extracted in order to train a classification model. A classification model is generated based on the extracted region segments. A joint features representation is generated based on the classification model. A vector-based classification model is used in the prediction of the joint features representation. The system determines whether the joint features representation is associated with the echo-signature based on the prediction of the joint features representation. A corresponding method and computer-readable device are also disclosed.
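A minimal sketch of the final prediction step, assuming the joint features representation is a fixed-length vector and the vector-based classification reduces to a cosine-similarity check against an enrolled template; both are illustrative simplifications of the disclosed models.

```python
# Illustrative sketch: echo-derived feature vector matched against an
# enrolled template. Vectors, threshold, and names are invented.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def authenticate(joint_features, enrolled_template, threshold=0.9):
    """Accept the user if the echo-derived features match the template."""
    return cosine_similarity(joint_features, enrolled_template) >= threshold

enrolled = [0.9, 0.1, 0.4, 0.2]         # template from enrollment echoes
probe_ok = [0.88, 0.12, 0.41, 0.19]     # echo features from the same user
probe_bad = [0.1, 0.9, 0.2, 0.4]        # echo features from an impostor

print(authenticate(probe_ok, enrolled))   # True
print(authenticate(probe_bad, enrolled))  # False
```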

SONAR STEERING SYSTEMS AND ASSOCIATED METHODS
20220373662 · 2022-11-24

Sonar steering systems offering improved functionality and ease of use for an operator (e.g., an angler) are provided. A sonar steering system is configured to automatically adjust the directional coverage volume of the sonar system in a hands-free manner to allow the operator to focus on other tasks. Some such sonar steering systems are configured to adjust the directional coverage volume of the sonar transducers to maintain a target such as an area of interest (AOI) within the sonar display despite movement of the watercraft relative to the target. Accordingly, the coverage volume may be automatically adjusted to maintain the aim of the sonar transducers at a target that is moving through the water, such as a school of fish.
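Hands-free aiming of this kind reduces to recomputing a steering angle as the watercraft moves relative to the target. The flat 2-D geometry and function below are an illustrative sketch, not the patented method.

```python
# Illustrative sketch: keep the transducer aimed at a target despite boat
# movement by recomputing the steering angle each update.
import math

def steering_angle_deg(boat_pos, boat_heading_deg, target_pos):
    """Angle the transducer must steer, relative to the boat's heading."""
    dx = target_pos[0] - boat_pos[0]
    dy = target_pos[1] - boat_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))     # world-frame bearing
    relative = bearing - boat_heading_deg
    return (relative + 180.0) % 360.0 - 180.0      # normalize to [-180, 180)

target = (50.0, 50.0)   # e.g. a tracked school of fish (area of interest)
print(steering_angle_deg((0.0, 0.0), 45.0, target))   # 0.0 (dead ahead)
print(steering_angle_deg((0.0, 20.0), 45.0, target))  # roughly -14 degrees
```

Running this in a loop with fresh boat position and heading each cycle gives the automatic, hands-free adjustment the abstract describes.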

Aligning measured signal data with SLAM localization data and uses thereof
11506500 · 2022-11-22

A method includes retrieving a map of a 3D geometry of an environment, the map including a plurality of non-spatial attribute values, each corresponding to one of a plurality of non-spatial attributes and indicative of a plurality of non-spatial sensor readings acquired throughout the environment; receiving a plurality of sensor readings from a device within the environment, wherein each of the sensor readings corresponds to at least one of the non-spatial attributes; and matching the plurality of received sensor readings to at least one location in the map to produce a determined sensor location.
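The matching step can be sketched as a nearest-match search over map locations, assuming each location stores a vector of non-spatial attribute values; the attribute names and grid layout below are invented for the example.

```python
# Illustrative sketch: each map cell stores non-spatial attribute values
# (e.g. Wi-Fi RSSI, magnetic field strength); the device's readings are
# matched to the cell with the smallest attribute distance.
map_cells = {
    (0, 0): {"wifi_rssi": -40.0, "mag_field": 48.0},
    (0, 1): {"wifi_rssi": -60.0, "mag_field": 52.0},
    (1, 0): {"wifi_rssi": -75.0, "mag_field": 45.0},
}

def match_location(readings):
    """Return the map cell whose stored attributes best match the readings."""
    def distance(cell_attrs):
        return sum((cell_attrs[k] - v) ** 2 for k, v in readings.items())
    return min(map_cells, key=lambda cell: distance(map_cells[cell]))

device_readings = {"wifi_rssi": -58.0, "mag_field": 51.0}
print(match_location(device_readings))   # (0, 1)
```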

AUTONOMOUS VEHICLE THAT COMPRISES ULTRASONIC SENSORS
20220365210 · 2022-11-17

An autonomous vehicle includes a first ultrasonic sensor and a second ultrasonic sensor that is included in a daisy chain with the first ultrasonic sensor, wherein the first ultrasonic sensor is electrically connected to the second ultrasonic sensor in the daisy chain by way of a twisted pair wire. The autonomous vehicle further includes an electronic control unit (ECU) for the first ultrasonic sensor and the second ultrasonic sensor, the ECU being included in the daisy chain with the first ultrasonic sensor and the second ultrasonic sensor, wherein the ECU is electrically connected to the first ultrasonic sensor and the second ultrasonic sensor by way of the twisted pair wire, and further wherein the ECU is in bidirectional communication with the first ultrasonic sensor and the second ultrasonic sensor by way of differential signaling over the twisted pair wire.
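The daisy-chain topology can be modeled as a shared bus on which the ECU addresses each sensor individually; the message format and command names below are invented for illustration and do not reflect the actual signaling in the disclosure.

```python
# Toy model of the daisy-chain topology: the ECU and both sensors share
# one bus, and the ECU addresses each sensor individually.
class UltrasonicSensor:
    def __init__(self, address):
        self.address = address

    def handle(self, message):
        # Respond only to frames addressed to this sensor.
        if message["to"] != self.address:
            return None
        if message["cmd"] == "ping":
            return {"from": self.address, "distance_cm": 123.0}
        return {"from": self.address, "error": "unknown command"}

class ECU:
    def __init__(self, sensors):
        self.sensors = sensors    # all devices on the twisted-pair bus

    def query(self, address):
        message = {"to": address, "cmd": "ping"}
        for sensor in self.sensors:      # frame travels along the chain
            reply = sensor.handle(message)
            if reply is not None:
                return reply
        return None

ecu = ECU([UltrasonicSensor(1), UltrasonicSensor(2)])
print(ecu.query(2))   # {'from': 2, 'distance_cm': 123.0}
```

Sharing one twisted pair with differential signaling, as the abstract describes, reduces wiring while still letting the ECU reach each sensor bidirectionally.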