Patent classifications
G01S17/00
Training Algorithm For Collision Avoidance Using Auditory Data
A machine learning model is trained by defining a scenario including models of vehicles and a typical driving environment. A model of a subject vehicle is added to the scenario and sensor locations are defined on the subject vehicle. A perception of the scenario by sensors at the sensor locations is simulated. The scenario further includes a model of a parked vehicle with its engine running. The location of the parked vehicle and the simulated outputs of the sensors perceiving the scenario are input to a machine learning algorithm that trains a model to detect the location of the parked vehicle based on the sensor outputs. A vehicle controller then incorporates the machine learning model and estimates the presence and/or location of a parked vehicle with its engine running based on actual sensor outputs input to the machine learning model.
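The training pipeline described above can be sketched in miniature: simulate sensor outputs for a parked vehicle at known locations, collect (sensor outputs, location) pairs, and fit a model that later estimates location from actual outputs. Everything here is a hypothetical stand-in — the four acoustic sensor positions, the intensity falloff model, and the nearest-neighbour "model" are illustrative assumptions, not the patent's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: four acoustic sensors at fixed locations on the
# subject vehicle (positions in metres, vehicle frame).
SENSOR_POSITIONS = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 4.0], [2.0, 4.0]])

def simulate_sensor_outputs(parked_xy):
    """Simulated perception of the scenario: each sensor reports an
    engine-noise intensity that falls off with distance, plus noise."""
    d = np.linalg.norm(SENSOR_POSITIONS - parked_xy, axis=1)
    return 1.0 / (1.0 + d) + rng.normal(0.0, 0.001, size=len(d))

# Training set: parked-vehicle locations paired with the simulated
# outputs of the sensors perceiving each scenario.
locations = rng.uniform([-5.0, -5.0], [8.0, 10.0], size=(2000, 2))
features = np.array([simulate_sensor_outputs(xy) for xy in locations])

def estimate_location(sensor_outputs):
    """Nearest-neighbour stand-in for the trained model: return the
    training location whose simulated outputs best match the input."""
    i = np.argmin(np.linalg.norm(features - sensor_outputs, axis=1))
    return locations[i]
```

At deployment, the same `estimate_location` call would receive actual sensor outputs instead of simulated ones, which is the substitution the abstract's vehicle controller performs.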
Bistatic lidar architecture for vehicle deployments
A lidar system having a lidar transmitter and lidar receiver that are in a bistatic arrangement with each other can be deployed in a climate-controlled compartment of a vehicle to reduce the exposure of the lidar system to harsher elements so it can operate in more advantageous environments with regard to factors such as temperature, moisture, etc. In an example embodiment, the bistatic lidar system can be connected to or incorporated within a rear view mirror assembly of a vehicle.
SENSOR INTEGRATION INTO EXISTING VEHICLE STRUCTURES
One or more vehicle sensors can be integrated into existing vehicle housings or structures to provide a streamlined appearance, potentially improved sensing capabilities, and a reduction in the use of extra structures to hold the one or more vehicle sensors. In one or more arrangements, one or more vehicle sensors can be located within a body of the vehicle and can be spaced from the vehicle emblem in a longitudinal direction of the vehicle. The one or more sensors can be configured and/or operatively positioned to sense at least a portion of an external environment of the vehicle through the vehicle emblem. In one or more arrangements, one or more sensors can be located within a front grille of the vehicle. The one or more sensors are configured to sense at least a portion of an external environment of the vehicle through the front grille.
LABELING SYSTEM FOR A VEHICLE APPENDAGE
The present technology is directed to identifying and labeling a vehicle appendage. More specifically, the present technology is generally related to receiving one or more lidar points associated with a vehicle having an appendage and outputting a label to classify the vehicle and the appendage. In some examples, a first portion of the one or more lidar points associated with the vehicle having the appendage represents the vehicle and a second portion of the one or more lidar points associated with the vehicle having the appendage represents the appendage. The present disclosure can further train a perception model to output the label to classify the vehicle and the appendage.
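The partition into a vehicle portion and an appendage portion can be illustrated with a toy geometric rule. This is a sketch only: in the abstract, a trained perception model produces the split and the label; here a hypothetical body-length threshold along the vehicle's heading axis stands in for that model.

```python
def label_vehicle_with_appendage(points, body_length):
    """Split lidar points (x-coordinate measured along the vehicle's
    heading) into a vehicle portion and an appendage portion, then
    emit a combined classification label.

    `body_length` is a hypothetical threshold standing in for the
    output of a trained perception model.
    """
    vehicle_points = [p for p in points if p[0] <= body_length]
    appendage_points = [p for p in points if p[0] > body_length]
    label = "vehicle+appendage" if appendage_points else "vehicle"
    return label, vehicle_points, appendage_points
```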
Apparatus and methods for safe navigation of robotic devices
Apparatus and methods for navigation of a robotic device configured to operate in an environment comprising objects and/or persons. Locations of objects and/or persons may change prior to and/or during operation of the robot. In one embodiment, a bistatic sensor comprises a transmitter and a receiver. The receiver may be spatially displaced from the transmitter. The transmitter may project a pattern on a surface in the direction of robot movement. In one variant, the pattern comprises an encoded portion and an information portion. The information portion may be used to communicate information related to robot movement to one or more persons. The encoded portion may be used to determine the presence of one or more objects in the path of the robot. The receiver may sample a reflected pattern and compare it with the transmitted pattern. Based on a similarity measure breaching a threshold, an indication of object presence may be produced.
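The compare-and-threshold step can be sketched as follows, treating the encoded portion of the pattern as a bit sequence. The bit encoding, the sample count, and the 0.9 threshold are illustrative assumptions, not values from the disclosure.

```python
import random

rng = random.Random(1)
SIMILARITY_THRESHOLD = 0.9  # hypothetical threshold value

# Encoded portion of the projected pattern, as a sequence of bits.
transmitted = [rng.randint(0, 1) for _ in range(64)]

def similarity(sent, received):
    """Similarity measure: fraction of samples in which the reflected
    pattern matches the transmitted pattern."""
    return sum(s == r for s, r in zip(sent, received)) / len(sent)

def object_present(sent, received, threshold=SIMILARITY_THRESHOLD):
    # An object in the robot's path distorts the reflected pattern,
    # driving the similarity measure below the threshold and
    # producing an object-present indication.
    return similarity(sent, received) < threshold
```

A reflection from a flat, empty surface returns the pattern intact (similarity 1.0, no indication), while an obstacle corrupting even a quarter of the samples drops the measure below the threshold.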
Techniques for beam pattern adjustments in a LIDAR system
A system and method include receiving a first beam pattern from an optical source that comprises a plurality of optical beams transmitted towards a target, causing different spaces to form between adjacent optical beams. The system and method include measuring a vertical angle between at least two of the optical beams along a first axis and calculating a second beam pattern based on the vertical angle and a pivot point that causes the optical beams to be transmitted towards the target with substantially uniform spacing. The system and method include adjusting, at the pivot point, one or more components to form the second beam pattern to adjust the plurality of different spaces to the substantially uniform spacing for transmission towards the target. The system and method include receiving return optical beams from the target to produce a plurality of points that form a point cloud.
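The calculation of the second beam pattern can be illustrated numerically: given measured (non-uniform) vertical beam angles, compute replacement angles with uniform spacing over the same field of view. The angle values below are made up for illustration, and the uniform-resampling rule is a simplifying assumption about how the second pattern is derived.

```python
def uniform_beam_angles(measured_angles_deg):
    """Given the measured vertical angles of a first beam pattern
    (degrees, about a pivot point), return a second beam pattern with
    substantially uniform spacing spanning the same field of view."""
    lo, hi = min(measured_angles_deg), max(measured_angles_deg)
    n = len(measured_angles_deg)
    step = (hi - lo) / (n - 1)  # uniform angular spacing
    return [lo + i * step for i in range(n)]

# Hypothetical first pattern: eight beams with irregular spacing.
measured = [-15.0, -9.5, -5.0, -1.0, 2.0, 6.5, 10.0, 15.0]
adjusted = uniform_beam_angles(measured)
```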
ROBOT CLEANER
A robot cleaner of the present disclosure comprises a main body configured to travel in a cleaning zone and to suction a foreign substance on a floor in the cleaning zone, an image sensor provided on the main body and configured to obtain an image of a predetermined area at a front side of the main body, a first light source provided on the main body and configured to emit a first pattern of light to a first sub-area of the predetermined area and a second light source provided on the main body at a position below the first light source and configured to emit a second pattern of light to a second sub-area of the predetermined area, the first sub-area being located lower than the second sub-area.
AUTOMATIC WATER FAUCET DEVICE
An automatic water faucet device 1 for automatically discharging water when an object to be detected is detected has: a sensor 14 that detects the object; a first water discharge part 12 that performs foamy water discharge; a second water discharge part 13 that performs spray water discharge; and a controller 40 that performs control for switching between the foamy water discharge from the first water discharge part 12 and the spray water discharge from the second water discharge part 13, wherein the controller 40 performs the foamy water discharge from the first water discharge part 12 while the sensor 14 detects the object, and when the sensor 14 no longer detects the object, the controller 40 stops this foamy water discharge, and thereafter performs spray water discharge from the second water discharge part 13 for a predetermined period.
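The controller 40's switching behaviour is a small state machine, sketched below as a per-tick update. The tick granularity and the three-tick spray period are illustrative assumptions standing in for the "predetermined period" in the abstract.

```python
from dataclasses import dataclass

SPRAY_PERIOD = 3  # hypothetical predetermined period, in control ticks

@dataclass
class FaucetController:
    """Minimal sketch of controller 40's discharge-switching logic."""
    spray_remaining: int = 0
    was_detecting: bool = False

    def update(self, object_detected: bool) -> str:
        """Return which discharge is active during this control tick."""
        if object_detected:
            # While the sensor detects the object: foamy discharge
            # from the first water discharge part.
            self.was_detecting = True
            self.spray_remaining = 0
            return "foamy"
        if self.was_detecting:
            # Detection just ended: stop foamy discharge and start
            # the timed spray discharge from the second part.
            self.was_detecting = False
            self.spray_remaining = SPRAY_PERIOD
        if self.spray_remaining > 0:
            self.spray_remaining -= 1
            return "spray"
        return "off"
```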
OPTICAL SENSOR DEVICE AND METHOD FOR OPERATING A TIME-OF-FLIGHT SENSOR
An optical sensor device, which may be a time-of-flight sensor, comprises a pixel array having a plurality of pixels. Moreover, the optical sensor device comprises a read-out node configured to provide photo-generated charge carriers from a first pixel and a second pixel for read-out and a first transfer gate configured to enable a read-out of the first pixel using the read-out node and a second transfer gate to disable a read-out of the second pixel during read-out of the first pixel.
FLIGHT CONTROL METHOD AND UNMANNED AERIAL VEHICLE
A method for controlling an aerial vehicle includes determining a direction in which the aerial vehicle is traveling; determining, with reference to a table, an altitude range which corresponds to the determined direction and within which the aerial vehicle is caused to fly, the table indicating correspondences between directions in which the aerial vehicle is traveling and altitude ranges within which the aerial vehicle is to fly; obtaining, from an altimeter, a first altitude, which is a current altitude, at which the aerial vehicle is flying; determining whether the first altitude is included in the determined altitude range; and if it is determined that the first altitude is not included in the determined altitude range, changing an altitude at which the aerial vehicle is caused to fly from the first altitude to a second altitude included in the determined altitude range.
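The table lookup and range check above can be sketched directly. The direction names, altitude ranges, and the clamp-to-nearest-bound rule for choosing the second altitude are illustrative assumptions; the abstract only requires that the second altitude lie within the determined range.

```python
# Hypothetical table: correspondences between travel directions and
# the altitude ranges (metres) within which the vehicle is to fly.
ALTITUDE_TABLE = {
    "east": (100.0, 120.0),
    "west": (130.0, 150.0),
    "north": (160.0, 180.0),
    "south": (190.0, 210.0),
}

def target_altitude(direction: str, current_altitude: float) -> float:
    """If the first (current) altitude lies outside the range for the
    determined direction, return a second altitude inside that range;
    otherwise keep the current altitude."""
    low, high = ALTITUDE_TABLE[direction]
    if low <= current_altitude <= high:
        return current_altitude
    # Choose the nearest bound of the determined range as the
    # second altitude (one possible in-range choice).
    return min(max(current_altitude, low), high)
```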