Patent classifications
G05D1/0268
Methods and systems for simultaneous localization and calibration
Examples relate to simultaneous localization and calibration. An example implementation may involve receiving sensor data indicative of markers detected by a sensor on a vehicle located at vehicle poses within an environment, and determining a pose graph representing the vehicle poses and the markers. For instance, the pose graph may include edges associated with a cost function representing a distance measurement between matching marker detections at different vehicle poses. The distance measurement may incorporate the different vehicle poses and a sensor pose on the vehicle. The implementation may further involve determining a sensor pose transform representing the sensor pose on the vehicle that optimizes the cost function associated with the edges in the pose graph, and providing the sensor pose transform. In further examples, motion model parameters of the vehicle may be optimized as part of the graph-based system in addition to, or instead of, sensor calibration.
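The core idea of the abstract — finding the sensor pose on the vehicle that minimizes the distance between matching marker detections made from different vehicle poses — can be sketched for the translation-only 2-D case. This is an illustrative least-squares formulation, not the patent's implementation; `calibrate_sensor_offset` and its match format are hypothetical names, and the residual is linear in the unknown sensor offset `t`, so 2x2 normal equations suffice:

```python
import math

def rot(theta):
    """2x2 rotation matrix for vehicle heading theta."""
    c, s = math.cos(theta), math.sin(theta)
    return ((c, -s), (s, c))

def apply(R, v):
    return (R[0][0] * v[0] + R[0][1] * v[1],
            R[1][0] * v[0] + R[1][1] * v[1])

def calibrate_sensor_offset(matches):
    """Solve for the sensor translation t on the vehicle minimizing
    sum || (p_i + R_i (t + d_i)) - (p_j + R_j (t + d_j)) ||^2 over
    matched marker detections, where (p, theta) are vehicle poses and
    d are detections in the sensor frame.  Each match is
    ((p_i, theta_i, d_i), (p_j, theta_j, d_j))."""
    A = [[0.0, 0.0], [0.0, 0.0]]
    b = [0.0, 0.0]
    for (p_i, th_i, d_i), (p_j, th_j, d_j) in matches:
        Ri, Rj = rot(th_i), rot(th_j)
        # M = R_i - R_j is the coefficient of t in the residual
        M = [[Ri[r][c] - Rj[r][c] for c in range(2)] for r in range(2)]
        wi, wj = apply(Ri, d_i), apply(Rj, d_j)
        c0 = (p_i[0] - p_j[0]) + wi[0] - wj[0]  # constant part of residual
        c1 = (p_i[1] - p_j[1]) + wi[1] - wj[1]
        for r in range(2):
            for k in range(2):
                A[r][k] += M[0][r] * M[0][k] + M[1][r] * M[1][k]  # M^T M
            b[r] -= M[0][r] * c0 + M[1][r] * c1                   # -M^T c
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return ((A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det)
```

A full pose-graph system would also optimize the vehicle poses and a sensor rotation; here the poses are held fixed to isolate the calibration step the abstract describes.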
System and method for collaborative sensor calibration
The present teaching relates to a method, system, medium, and implementations for sensor calibration. An ego vehicle determines whether a sensor deployed on the ego vehicle to facilitate autonomous driving needs to be calibrated and, if so, sends a request for assistance in collaborative calibration of the sensor, together with a first position of the ego vehicle or a first configuration of the sensor with respect to the ego vehicle. When a response to the request is received, an assisting vehicle is directed to travel near the ego vehicle to facilitate calibration of the sensor by collaborating with the moving ego vehicle, and the ego vehicle coordinates with the assisting vehicle to enable the sensor to acquire information about a target present on the assisting vehicle for the collaborative calibration of the sensor.
PERFORMING AUTONOMOUS PATH NAVIGATION USING DEEP NEURAL NETWORKS
A method, computer readable medium, and system are disclosed for performing autonomous path navigation using deep neural networks. The method includes the steps of receiving image data at a deep neural network (DNN), determining, by the DNN, both an orientation of a vehicle with respect to a path and a lateral position of the vehicle with respect to the path, utilizing the image data, and controlling a location of the vehicle, utilizing the orientation of the vehicle with respect to the path and the lateral position of the vehicle with respect to the path.
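The abstract's control step maps the DNN's two outputs (orientation relative to the path and lateral position relative to the path) to a vehicle command. A minimal sketch of one plausible mapping, assuming a proportional steering law; the gains and the sign convention are illustrative, not from the patent:

```python
def steering_command(heading_error, lateral_offset,
                     k_heading=1.0, k_lateral=0.5):
    """Combine the DNN's path-relative orientation (radians) and lateral
    offset (metres) into one steering value.  Hypothetical convention:
    positive inputs mean the vehicle drifts left, so the command steers
    right (negative)."""
    return -(k_heading * heading_error + k_lateral * lateral_offset)
```

In practice the two outputs would feed a full controller (e.g. with rate limits and speed scheduling); the point here is only that both path-relative quantities contribute to the correction.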
GUIDED OPERATION BY ROBOTIC PROCESSES
A method is disclosed. The method is implemented by a robot engine that is stored as program code on a memory of a system. The program code is executed by a processor of the system, which enables robotic process automations of the robot engine. The processor is communicatively coupled to the memory within the system. The method includes initiating a guided operation of a platform presented by the system and monitoring the guided operation to observe an interaction with the platform or to receive a direct input by the robot engine. The method also includes executing a backend operation with respect to the interaction or the direct input.
SYSTEMS AND METHODS FOR AUTONOMOUS NAVIGATION ON SIDEWALKS IN VARIOUS CONDITIONS
The disclosure provides a method for determining a location of an unmanned ground vehicle (UGV). The method includes receiving LIDAR data with a computer system, the LIDAR data being received from at least one LIDAR sensor mounted to the UGV, receiving Global Navigation Satellite System (GNSS) data with the computer system, the GNSS data being received from at least one GNSS sensor mounted to the UGV, and computing location data with the computer system, the location data being computed by fusing the LIDAR data and the GNSS data to determine a location of the UGV.
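The fusion step the abstract names — combining a LIDAR-derived position with a GNSS-derived position — can be sketched as inverse-variance weighting, a common simplification of the Kalman update. The function name and the per-source scalar variances are assumptions for illustration:

```python
def fuse(lidar_pos, lidar_var, gnss_pos, gnss_var):
    """Inverse-variance weighted fusion of two 2-D position estimates,
    applied per axis.  The lower-variance (more trusted) source pulls
    the fused location toward itself."""
    w_l, w_g = 1.0 / lidar_var, 1.0 / gnss_var
    return tuple((w_l * l + w_g * g) / (w_l + w_g)
                 for l, g in zip(lidar_pos, gnss_pos))
```

A production system would typically run a full filter (EKF or similar) with motion prediction between measurements; this shows only the measurement-combination idea.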
Redundant pose generation system
Techniques for performing multiple simultaneous pose generation for an autonomous vehicle. For instance, a system that navigates the autonomous vehicle can include at least a first component that determines first poses for the autonomous vehicle using at least a first portion of sensor data captured by one or more sensors and a second component that determines second poses for the autonomous vehicle using at least a second portion of the sensor data. The first component may have more computational resources than the second component and determine poses at a different frequency than the second component. The system may generate trajectories for the autonomous vehicle using the first poses when the first component is operating correctly. Additionally, the system may generate trajectories for the autonomous vehicle using the second poses when the first component is not operating correctly.
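The failover logic described — use the first component's poses while it operates correctly, otherwise fall back to the second — can be sketched as a freshness check, since the abstract notes the two components run at different frequencies. The class and timeout value are hypothetical:

```python
class RedundantPoseSelector:
    """Prefer the primary (higher-resource) pose source; fall back to the
    secondary when the primary stops producing fresh poses."""

    def __init__(self, primary_timeout=0.1):
        self.primary_timeout = primary_timeout  # seconds; illustrative
        self.last_primary = None    # (timestamp, pose)
        self.last_secondary = None

    def update_primary(self, t, pose):
        self.last_primary = (t, pose)

    def update_secondary(self, t, pose):
        self.last_secondary = (t, pose)

    def select(self, now):
        """Return the primary pose if recent enough, else the secondary."""
        if self.last_primary and now - self.last_primary[0] <= self.primary_timeout:
            return self.last_primary[1]
        return self.last_secondary[1] if self.last_secondary else None
```

Trajectory generation would then consume whichever pose `select` returns, which is the behavior the abstract attributes to the navigation system.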
Travel Route Determination System
A travel route determination system including: a route generating unit generating planned travel routes including work routes along which a work vehicle performs autonomous travel; a control unit capable of causing the work vehicle to perform autonomous travel along the planned travel routes; an information obtaining unit obtaining position information and orientation information on the work vehicle; and a determination unit determining an autonomous travel candidate route at which the work vehicle can start autonomous travel, before the work vehicle starts autonomous travel. The determination unit sets a candidate determination region based on the position information and orientation information on the work vehicle, and the determination unit determines, among the work routes, a work route in the candidate determination region as the autonomous travel candidate route.
Method of Operating A Printing Robot In Shadows
A mobile printing robot system for printing a construction layout. The method addresses a problem that occurs when there is a loss of a tracking lock to an absolute positioning device, such as a total station. In a construction site, a line of sight between a mobile robot and an absolute positioning device may be lost when the mobile robot moves into a shadowed region behind an obstacle, such as a column. The mobile printing robot may use its local sensor data to continue to print until a maximum estimated position error is reached. The mobile robot may also provide position updates to the absolute positioning device to aid it in regaining a tracking lock when the mobile robot emerges from the shadowed region.
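The decision to keep printing on local (dead-reckoned) sensing inside a shadow can be sketched as a drift budget: odometry error grows roughly with distance travelled, and printing stops once the estimated error would exceed the layout tolerance. The linear drift model and both constants are illustrative assumptions, not values from the patent:

```python
def should_continue_printing(distance_in_shadow, drift_rate=0.005,
                             max_position_error=0.01):
    """True while the dead-reckoned position error stays within tolerance.
    distance_in_shadow: metres travelled since the tracking lock was lost.
    drift_rate: assumed odometry error growth (m of error per m travelled).
    max_position_error: assumed printing tolerance in metres."""
    return distance_in_shadow * drift_rate <= max_position_error
```

A real system would likely propagate a full uncertainty estimate rather than a linear bound, but the stop condition has the same shape.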
Method, system and apparatus for mobile automation apparatus localization
A method of mobile automation apparatus localization in a navigation controller includes: controlling a depth sensor to capture a plurality of depth measurements corresponding to an area containing a navigational structure; selecting a primary subset of the depth measurements; selecting, from the primary subset, a corner candidate subset of the depth measurements; generating, from the corner candidate subset, a corner edge corresponding to the navigational structure; selecting an aisle subset of the depth measurements from the primary subset, according to the corner edge; selecting, from the aisle subset, a local minimum depth measurement for each of a plurality of sampling planes extending from the depth sensor; generating a shelf plane from the local minimum depth measurements; and updating a localization of the mobile automation apparatus based on the corner edge and the shelf plane.
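Two of the claimed steps — taking the local minimum depth measurement per sampling plane and generating a shelf plane from those minima — reduce, in a 2-D cross-section, to picking the nearest return in each plane and fitting a line through the picks. This sketch assumes a simplified data layout (depths grouped per sampling-plane coordinate `y`) and a least-squares line fit; neither is from the patent:

```python
def shelf_line(samples):
    """Fit a 2-D line x = a*y + b through the nearest (minimum-depth)
    point of each sampling plane.  samples: {plane_y: [depth, ...]}.
    Returns (a, b); a near-zero slope means a shelf face parallel to
    the sampling axis."""
    pts = [(min(depths), y) for y, depths in samples.items()]
    n = len(pts)
    sy = sum(y for _, y in pts)
    sx = sum(x for x, _ in pts)
    syy = sum(y * y for _, y in pts)
    sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * syy - sy * sy)
    b = (sx - a * sy) / n
    return a, b
```

The corner-edge step would constrain the same fit at the aisle end; localization then compares the fitted plane and edge against the map.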
LOCALIZATION SYSTEM FOR A DRIVERLESS VEHICLE
A localization system for a driverless vehicle configured to drive in a driverless manner from a starting point to a destination point includes a set of sensors configured to localize the vehicle indoors and at least one capture device configured to localize the vehicle outdoors. The localization system is configured to provide at least one orientation point at a transition area between outdoors and indoors or vice versa. The localization system is also configured to switch from a localization method for localizing the vehicle indoors to a localization method for localizing the vehicle outdoors or vice versa in response to capturing, by the set of sensors and/or the capture device, the at least one orientation point.
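The switching behavior is a small state machine: detecting an orientation point at a transition area toggles between the indoor and outdoor localization methods. A minimal sketch under that reading (class and string labels are hypothetical):

```python
class LocalizationSwitcher:
    """Toggle between 'indoor' and 'outdoor' localization whenever an
    orientation point at a transition area is captured."""

    def __init__(self, mode="outdoor"):
        self.mode = mode

    def on_capture(self, detection):
        """Switch methods only on an orientation-point capture; all other
        detections leave the active method unchanged."""
        if detection == "orientation_point":
            self.mode = "indoor" if self.mode == "outdoor" else "outdoor"
        return self.mode
```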