Patent classifications
G05D1/0257
SENSOR SYSTEMS FOR SYNCING OPERATIONAL DATA FOR HEAVY EQUIPMENT
Sensor systems for communications between heavy equipment machines during tree felling operations. A system includes a first heavy equipment comprising a first winch and a second heavy equipment comprising a second winch. The system includes a first cable attached to the first winch and a fulcrum roller and a second cable attached to the second winch and the fulcrum roller. The system is such that the first heavy equipment communicates with the second heavy equipment by way of long-range radio signals.
Vehicle control apparatus, vehicle, vehicle control method, and storage medium
A vehicle control apparatus controls movement of a vehicle in a lateral direction intersecting a direction in which the vehicle travels based on a movement trajectory of a preceding vehicle. The vehicle control apparatus includes a detection unit configured to detect a surrounding environment of the vehicle, and a preceding vehicle which travels ahead in the same lane in which the vehicle travels, a determination unit configured to determine whether or not the preceding vehicle straddles lanes or approaches within a set distance predetermined for the lanes based on lateral movement information of the preceding vehicle detected by the detection unit, and a control unit configured to control lateral movement of the vehicle based on a determination result of the determination unit and detection information of the detection unit.
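The straddle-or-approach determination above reduces to a one-dimensional lateral check. A minimal sketch, assuming the detection unit already supplies the preceding vehicle's lateral offset from the lane center (all names and thresholds here are illustrative, not from the patent):

```python
def preceding_vehicle_cuts_lane(lateral_offset, vehicle_half_width,
                                lane_half_width, margin):
    """Return True if the preceding vehicle straddles a lane boundary or
    comes within `margin` of it (1-D lateral geometry)."""
    outer_edge = abs(lateral_offset) + vehicle_half_width
    distance_to_boundary = lane_half_width - outer_edge
    # Straddling makes the distance negative; either case triggers
    # the lateral control response.
    return distance_to_boundary <= margin
```

The control unit would then combine this flag with the detection information, e.g. suppressing trajectory-following of a preceding vehicle that is drifting out of the lane.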
Method and apparatus for performing grid-based localization of a mobile body
A method of localizing a mobile body (MB) in a known environment, includes the following steps: a) defining an occupancy grid (G) modeling the environment; b) defining a set of position grids (Π) each position grid being associated to a heading of the mobile body; c) receiving a time series of measurements (z.sub.1, z.sub.2, . . . ) from a distance sensor carried by the mobile body; and d) upon receiving a measurement of the time series, updating the pose probabilities of the position grids as a function of present values of the occupancy probabilities and of the received measurement; wherein step d) is carried out by applying an inverse sensor model to the received measurement, while considering the distance sensor co-located with a detected obstacle and by applying Bayesian fusion to update the pose probabilities of the position grids.
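Step d) above can be sketched as a single Bayesian fusion update over a per-heading position grid. The sketch below assumes one 2-D position grid per discretized heading and a point-range sensor; the inverse sensor model is reduced to "the likelihood of a pose equals the occupancy probability of the cell the measurement points at" (a simplification of the patent's formulation):

```python
import numpy as np

def bayes_update(position_grids, occupancy, z, headings):
    """Reweight every pose hypothesis (heading, x, y) by the occupancy
    probability of the cell at range z along that heading (the inverse
    sensor model co-locates the return with a detected obstacle), then
    renormalize (Bayesian fusion)."""
    H, N, M = position_grids.shape
    updated = np.zeros_like(position_grids)
    for h, theta in enumerate(headings):
        dx = int(round(z * np.cos(theta)))
        dy = int(round(z * np.sin(theta)))
        for x in range(N):
            for y in range(M):
                ox, oy = x + dx, y + dy
                if 0 <= ox < N and 0 <= oy < M:
                    updated[h, x, y] = position_grids[h, x, y] * occupancy[ox, oy]
    total = updated.sum()
    return updated / total if total > 0 else position_grids
```

Applying this update once per measurement of the time series concentrates probability mass on poses consistent with the observed ranges.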
OBSTACLE TO PATH ASSIGNMENT FOR AUTONOMOUS SYSTEMS AND APPLICATIONS
In various examples, one or more output channels of a deep neural network (DNN) may be used to determine assignments of obstacles to paths. To increase the accuracy of the DNN, the input to the DNN may include an input image, one or more representations of path locations, and/or one or more representations of obstacle locations. The system may thus repurpose previously computed information—e.g., obstacle locations, path locations, etc.—from other operations of the system, and use them to generate more detailed inputs for the DNN to increase accuracy of the obstacle to path assignments. Once the output channels are computed using the DNN, computed bounding shapes for the objects may be compared to the outputs to determine the path assignments for each object.
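The two ideas above (repurposed rasters as extra DNN input channels, and bounding-shape comparison against the output channels) can be sketched as follows; the channel layout and the "strongest mean response inside the box" rule are assumptions for illustration, not the patented method:

```python
import numpy as np

def build_dnn_input(image, path_mask, obstacle_mask):
    """Stack previously computed path/obstacle rasters as extra input
    channels alongside the camera image."""
    return np.concatenate(
        [image, path_mask[..., None], obstacle_mask[..., None]], axis=-1)

def assign_obstacles_to_paths(output_channels, boxes):
    """Compare each obstacle's bounding box with the per-path output maps:
    the path whose channel responds most strongly inside the box wins."""
    assignments = []
    for x0, y0, x1, y1 in boxes:
        region = output_channels[y0:y1, x0:x1, :]
        assignments.append(int(region.mean(axis=(0, 1)).argmax()))
    return assignments
```

Reusing already-computed obstacle and path locations this way adds detail to the DNN input without extra perception cost.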
Sensor array for an autonomously operated utility vehicle and method for surround-view image acquisition
A sensor apparatus for an autonomously operated commercial vehicle to allow panoramic capture of surroundings of the commercial vehicle, including: radar units mountable in front corner areas of the vehicle; downwardly directed cameras having a fisheye objective, mountable on front upper corner areas of the vehicle; at least one rearwardly directed sensor mounted on a section of the vehicle to allow rearward image capture; and an evaluation module to evaluate image data from the radar units, the downwardly directed cameras and the at least one rearwardly directed sensor to achieve the panoramic capture of the surroundings of the vehicle; in which the radar units and the at least one rearwardly directed sensor capture all points in a surrounding area of the vehicle, and wherein the downwardly directed cameras capture all points in the surrounding area of the vehicle. Also described are a related commercial vehicle, method and computer readable medium.
Model for excluding vehicle from sensor field of view
The technology relates to developing a highly accurate understanding of a vehicle's sensor fields of view in relation to the vehicle itself. A training phase is employed to gather sensor data in various situations and scenarios, and a modeling phase takes such information and identifies self-returns and other signals that should either be excluded from analysis during real-time driving or accounted for to avoid false positives. The result is a sensor field of view model for a particular vehicle, which can be extended to other similar makes and models of that vehicle. This approach enables a vehicle to determine when sensor data is of the vehicle or something else. As a result, the detailed modeling allows the on-board computing system to make driving decisions and take other actions based on accurate sensor information.
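The training-then-modeling split above can be sketched as a simple statistical mask: returns that are short-range and nearly constant across many training scans are attributed to the vehicle body. The thresholds and the per-cell range-image representation are illustrative assumptions:

```python
import numpy as np

def build_self_return_mask(training_ranges, max_self_range=3.0, var_tol=0.01):
    """Training phase: cells whose return range is short and nearly constant
    across many scans are attributed to the vehicle body itself."""
    r = np.stack(training_ranges)
    return (r.mean(axis=0) < max_self_range) & (r.var(axis=0) < var_tol)

def filter_scan(scan, self_mask):
    """Real-time phase: drop returns the model attributes to the vehicle."""
    out = scan.copy()
    out[self_mask] = np.nan  # excluded from downstream analysis
    return out
```

Once built for one vehicle, such a mask could be shared across similar makes and models, as the abstract notes.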
Multi-domain neighborhood embedding and weighting of sampled data
This document describes “Multi-domain Neighborhood Embedding and Weighting” (MNEW) for use in processing point cloud data, including sparsely populated data obtained from a lidar, a camera, a radar, or combination thereof. MNEW is a process based on a dilation architecture that captures pointwise and global features of the point cloud data involving multi-scale local semantics adopted from a hierarchical encoder-decoder structure. Neighborhood information is embedded in both static geometric and dynamic feature domains. A geometric distance, feature similarity, and local sparsity can be computed and transformed into adaptive weighting factors that are reapplied to the point cloud data. This enables an automotive system to obtain outstanding performance with sparse and dense point cloud data. Processing point cloud data via the MNEW techniques promotes greater adoption of sensor-based autonomous driving and perception-based systems.
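The three quantities named above (geometric distance, feature similarity, local sparsity) can be combined into per-point weights as in the sketch below; the exact combination, exponents, and k-nearest-neighbor construction are an assumed form, not the patented MNEW formulation:

```python
import numpy as np

def mnew_weights(points, features, k=3, alpha=1.0, beta=1.0, gamma=1.0):
    """Per-point adaptive weights from mean neighborhood distance (geometry),
    feature similarity, and relative local sparsity."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    nn = np.argsort(d, axis=1)[:, 1:k + 1]          # k nearest neighbors
    idx = np.arange(n)[:, None]
    geo = d[idx, nn].mean(axis=1)                   # geometric distance
    f = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    sim = np.exp(-f[idx, nn].mean(axis=1))          # feature similarity
    sparsity = geo / geo.mean()                     # local sparsity
    w = np.exp(-alpha * geo) * sim ** beta * sparsity ** gamma
    return w / w.sum()
```

Reapplying such weights to the point cloud lets dense and sparse regions contribute on comparable terms, which is the behavior the abstract claims for both lidar and radar data.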
Systems and methods for a scenario tagger for autonomous vehicles
Systems and methods are directed to determining autonomous vehicle scenarios based on autonomous vehicle operation data. In one example, a computer-implemented method for determining operating scenarios for an autonomous vehicle includes obtaining, by a computing system comprising one or more computing devices, log data representing autonomous vehicle operations. The method further includes extracting, by the computing system, a plurality of attributes from the log data. The method further includes determining, by the computing system, one or more scenarios based on a combination of the attributes, wherein each scenario includes multiple scenario variations and each scenario variation comprises multiple features. The method further includes providing, by the computing system, the one or more scenarios for generating autonomous vehicle operation analytics.
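The extract-attributes-then-combine step can be sketched as below; the attribute names, thresholds, and tag format are illustrative placeholders, not the patented scenario taxonomy:

```python
def tag_scenarios(log_records):
    """Map attribute combinations extracted from operation logs to
    scenario tags."""
    tags = []
    for rec in log_records:
        attrs = {
            "speed_band": "high" if rec["speed_mps"] > 15 else "low",
            "actor": rec.get("nearest_actor", "none"),
            "maneuver": rec.get("maneuver", "lane_keep"),
        }
        # Each distinct attribute combination names one scenario variation.
        tags.append("/".join(f"{k}={v}" for k, v in sorted(attrs.items())))
    return tags
```

Aggregating the resulting tags over a log corpus yields the per-scenario counts from which operation analytics can be generated.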
Control and Navigation Device for an Autonomously Moving System and Autonomously Moving System
The invention relates to a control and navigation device for an autonomously moving system, comprising: a sensor device configured to acquire sensor data, which for this purpose includes a LiDAR sensor installation configured for 360-degree acquisition, a fisheye camera installation configured for 360-degree acquisition, and a radar sensor installation configured for 360-degree acquisition; a data processing installation with an AI-based software application configured to determine control signals for navigating an autonomously moving system by processing the sensor data; and a data communication interface connected to the data processing installation and configured to provide the control signals for transmission to a controller of the autonomously moving system. The sensor device, the data processing installation, and the data communication interface are arranged on an assembly component configured to detachably mount them together as a common module on the autonomously moving system. An autonomously moving system is furthermore provided.
Method and apparatus for controlling an autonomous vehicle
A method for operating an autonomous vehicle includes controlling, by one or more computing devices, the autonomous vehicle; receiving, by the one or more computing devices, sensor data from the vehicle corresponding to moving objects in a vicinity of the vehicle; receiving, by the one or more computing devices, road condition data; and determining, by the one or more computing devices, undesirable locations related to the moving objects. The undesirable locations related to the moving objects are based at least in part on the road condition data. The step of controlling the vehicle includes avoiding the undesirable locations.
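One way to realize "undesirable locations based on road condition data" is a keep-out radius around each moving object that grows on low-friction roads. The formula, field names, and 0.5 s headway factor below are illustrative assumptions, not the patented method:

```python
import math

def is_undesirable(point, moving_objects, friction):
    """A candidate location is undesirable if it falls inside the keep-out
    radius of any moving object; slippery roads (low friction) enlarge
    every radius."""
    for obj in moving_objects:
        radius = obj["base_radius"] + 0.5 * obj["speed"] / max(friction, 0.1)
        if math.dist(point, obj["position"]) < radius:
            return True
    return False
```

The trajectory planner would then reject or re-plan any path segment for which this check is true, which implements the "avoiding the undesirable locations" step.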