G05D1/249

Autonomous ground vehicle for solar module installation

An autonomous solar module installation platform can be used for solar module installation onto a solar tracker. The autonomous solar module installation platform can include an autonomous ground vehicle and a robotic arm for the solar module installation onto the solar tracker. The autonomous ground vehicle can autonomously drive itself to the solar tracker using a global positioning system and align itself with the solar tracker using at least a vision system in order to place one or more solar modules onto the solar tracker.
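The two-stage guidance described above (coarse GPS drive, then vision-based alignment with the tracker) can be sketched as follows. This is a minimal illustration, not the patented implementation; the handoff radius, the pixel-to-meter scale, and both function names are assumptions.

```python
def choose_guidance_mode(dist_to_tracker_m, handoff_radius_m=3.0):
    """Coarse GPS waypoint-following while far from the tracker; hand off
    to vision-based alignment once within the (assumed) handoff radius."""
    return "vision" if dist_to_tracker_m <= handoff_radius_m else "gps"

def alignment_error_m(rail_px, image_center_px, px_per_meter):
    """Lateral offset of the vehicle from the tracker rail, estimated from
    the rail's detected pixel column in the camera image."""
    return (rail_px - image_center_px) / px_per_meter
```

A controller would feed `alignment_error_m` into steering until the offset is small enough to place a module.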

Controlling host vehicle based on detected door opening events

Systems and methods are provided for vehicle navigation. In one implementation, a system may comprise at least one processor. The processor may be programmed to receive an image associated with the environment of the host vehicle. The processor may analyze the image to identify a side of a parked vehicle; a first structural feature of the parked vehicle in a forward region of the side of the parked vehicle or a second structural feature of the parked vehicle in a rear region of the side of the parked vehicle; and a door feature of the parked vehicle in a vicinity of the first or the second structural features. The processor may then determine, based on a subsequent image, a change of an image characteristic of the door feature of the parked vehicle and alter a navigational path of the host vehicle based on the change of the image characteristic.
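The detection logic above (watch a door-feature image patch across frames, and widen the path when its appearance changes) can be illustrated with a toy sketch. The mean-intensity comparison, the threshold, and the extra-clearance value are all assumptions standing in for whatever image characteristic the system actually tracks.

```python
def _mean(patch):
    """Mean intensity of a flat list of pixel values (toy image patch)."""
    return sum(patch) / len(patch)

def door_opening_detected(patch_t0, patch_t1, threshold=25.0):
    """Flag a possible door-opening event when the door-feature patch's
    mean intensity changes by more than the (assumed) threshold."""
    return abs(_mean(patch_t1) - _mean(patch_t0)) > threshold

def plan_lateral_offset(base_offset_m, door_event, extra_clearance_m=0.5):
    """Widen the host vehicle's lateral clearance past the parked vehicle
    when a door-feature change has been detected."""
    return base_offset_m + (extra_clearance_m if door_event else 0.0)
```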

Autonomous vehicles and methods of zone driving

Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from zones, including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with free drive paths, as in a warehouse facility with loading and unloading zones in which payloads are picked up and placed down, or zones that stage or serve as entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
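The roadgraph-with-zones structure described above can be sketched as a default rule set plus per-zone sub-rules. The class names, fields, and speed values below are illustrative assumptions, not the patent's data model.

```python
from dataclasses import dataclass, field

@dataclass
class Zone:
    """A sub-roadgraph region with its own zone-specific driving rules."""
    name: str
    speed_limit_mps: float
    free_drive: bool  # free drive paths (e.g. warehouse floor) vs fixed lanes

@dataclass
class Roadgraph:
    """On-street roadgraph plus named zones, each overriding the defaults."""
    default_speed_mps: float = 13.4  # assumed on-street default (~30 mph)
    zones: dict = field(default_factory=dict)

    def speed_limit_for(self, zone_name):
        """Zone-specific limit when inside a zone, else the on-street default."""
        zone = self.zones.get(zone_name)
        return zone.speed_limit_mps if zone else self.default_speed_mps
```

A mission planner would consult `speed_limit_for` (and similar rule lookups) as the vehicle transitions between on-street driving and, say, a loading zone.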

Mobile robot and method of controlling a mobile robot illumination system

A mobile robot including a vision system, the vision system including a camera and an illumination system, the illumination system including a plurality of light sources arranged to provide a level of illumination to an area surrounding the mobile robot, and a control system for controlling the illumination system. The control system adjusts the level of illumination provided by the plurality of light sources based on an image captured by the camera, an exposure time of the camera at the time the image was captured, and robot rotation information.
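One plausible reading of the control loop above is: if the camera needed a longer exposure than some target, the scene is too dark and LED power should rise (and vice versa), while holding the level steady during rotation to avoid flicker. The gain, target exposure, and rotation-hold policy below are assumptions, not the patented control law.

```python
def adjust_illumination(current_power, exposure_s, target_exposure_s,
                        rotating, gain=0.5, max_power=1.0):
    """Proportional illumination adjustment from camera exposure time.

    current_power is the normalized LED drive level in [0, max_power].
    While the robot is rotating, hold the current level (assumed policy).
    """
    if rotating:
        return current_power
    # Positive error: exposure longer than target, i.e. scene too dark.
    error = (exposure_s - target_exposure_s) / target_exposure_s
    return min(max_power, max(0.0, current_power + gain * error))
```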

Sun-aware routing and controls of an autonomous vehicle

Sun-aware routing and controls of an autonomous vehicle is described herein. A location and an orientation of a sensor system is identified within an environment of the autonomous vehicle to determine whether the sun causes a threshold level of perception degradation to the sensor system incident to generating a sensor signal indicative of a traffic light. The determination is based upon perception degradation data that can be precomputed for locations and orientations within the environment for dates and times of day. The perception degradation data is based upon the location of at least one traffic light and positions of the sun relative to the locations and the orientations within the environment. A mechanical system of the autonomous vehicle is controlled to execute a maneuver that reduces perception degradation to the sensor system when the perception degradation is determined to exceed the threshold level of perception degradation.
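The precomputed lookup described above can be sketched as a table keyed by pose and time, consulted before the vehicle commits to a stopping position at a traffic light. The key structure, threshold value, and maneuver names are illustrative assumptions.

```python
def degradation_exceeds_threshold(table, location, heading_deg, date, hour,
                                  threshold=0.6):
    """Look up precomputed sun-caused perception degradation (0..1) for a
    sensor pose and time of day; unseen keys default to no degradation."""
    return table.get((location, heading_deg, date, hour), 0.0) > threshold

def maneuver_for(exceeds):
    """Pick a maneuver that reduces degradation when the threshold is hit
    (the maneuver names here are placeholders)."""
    return "reposition_before_stop_line" if exceeds else "proceed_normally"
```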

Travel control device and travel control method

A travel control device is configured to make an autonomous driving vehicle travel in such a way that the autonomous driving vehicle arrives at a specified position specified by a user who intends to board the autonomous driving vehicle. When the autonomous driving vehicle has reached a predetermined range from the specified position, the travel control device transmits to the user terminal, via a communication circuit, an information sending request notifying the user that the autonomous driving vehicle has reached a vicinity of the specified position and requesting sending of position identifying information for identifying a position at which the user intends to board the autonomous driving vehicle. When receiving the position identifying information via the communication circuit, the travel control device changes a position of the autonomous driving vehicle, based on the position identifying information, in such a way that the autonomous driving vehicle comes close to the user.
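The two decision points above (notify the user when within range, then retarget once the user terminal replies) can be reduced to a small sketch. The 50 m notify radius and the function names are assumptions for illustration.

```python
def within_notify_range(vehicle_pos, specified_pos, notify_radius_m=50.0):
    """True when the vehicle is within the predetermined range of the
    user-specified boarding position (positions as (x, y) in meters)."""
    dx = vehicle_pos[0] - specified_pos[0]
    dy = vehicle_pos[1] - specified_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= notify_radius_m

def updated_stop_position(specified_pos, user_reported_pos):
    """Once the user terminal sends position identifying information,
    retarget to it; otherwise keep the originally specified position."""
    return user_reported_pos if user_reported_pos is not None else specified_pos
```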

Techniques to compensate for movement of sensors in a vehicle

Techniques are described for compensating for movements of sensors. A method includes receiving two sets of sensor data from two sets of sensors, where a first set of sensors is located on a roof of a cab of a semi-trailer truck and a second set of sensors is located on a hood of the semi-trailer truck. The method also receives from a height sensor a measured value indicative of a height of a rear portion of the cab of the semi-trailer truck relative to a chassis of the semi-trailer truck, determines two correction values, one for each of the two sets of sensor data, and compensates for the movement of the two sets of sensors by generating two sets of compensated sensor data. The two sets of compensated sensor data are generated by adjusting the two sets of sensor data based on the two correction values.
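A plausible small-angle version of the correction above: the height sensor reveals how far the rear of the cab has risen or fallen on its suspension, which implies a cab pitch; roof- and hood-mounted sensor sets then each get their own correction. The small-angle model, lever-arm parameter, and function names are assumptions.

```python
def cab_pitch_rad(height_measured_m, height_nominal_m, lever_arm_m):
    """Small-angle cab pitch inferred from the rear-of-cab height sensor:
    the cab rocking on its mounts moves the rear by roughly
    (lever arm) * pitch, so pitch = delta_height / lever_arm."""
    return (height_measured_m - height_nominal_m) / lever_arm_m

def correct_elevation(measured_elev_rad, correction_rad):
    """Apply one of the two per-sensor-set corrections to an observed
    elevation angle from a roof- or hood-mounted sensor."""
    return measured_elev_rad - correction_rad
```

Because the roof and hood sensor sets sit at different distances from the cab's pivot, the same measured height change yields two different correction values, matching the abstract's "two correction values, one for each of the two sets of sensor data."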

Performing image based actions on a moving vehicle

A method implemented by a treatment system disposed on a vehicle, the treatment system having one or more processors, a storage, and a treatment mechanism, includes capturing a first image of a region of an agricultural environment, detecting, by implementing a first machine learning (ML) algorithm on a first portion of the first image, a presence of at least a portion of a first object in the first image, determining whether the first object detected is a treatment candidate, determining, upon determining that the first object is a treatment candidate, a first three dimensional (3D) location of at least a portion of the first object in the agricultural environment, and applying a treatment to at least the portion of the first object by activating the treatment mechanism to interact with the first object.
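The detect → filter-to-candidate → locate-in-3D → treat sequence above is naturally a pipeline over pluggable stages. The sketch below wires the stages together with stand-in callables; all of the interface names are assumptions.

```python
def treat_objects(image_regions, detect, is_candidate, locate_3d, activate):
    """Pipeline from the abstract: detect objects in each image region,
    keep only treatment candidates, resolve each to a 3D field location,
    and fire the treatment mechanism. Every callable is an assumed
    interface (e.g. `detect` stands in for the ML detector)."""
    treated = []
    for region in image_regions:
        for obj in detect(region):
            if is_candidate(obj):
                treated.append(activate(locate_3d(obj)))
    return treated
```

In practice `detect` would run the ML model on an image crop and `activate` would command the sprayer or other treatment mechanism; here stubs make the flow testable.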

Systems and methods for unmanned vehicles having self-calibrating sensors and actuators

Systems and methods of unmanned vehicles having self-calibrating sensors and actuators are provided. The unmanned vehicle comprises a communication interface and a processor for controlling a propulsion system of the vehicle and receiving sensor data from one or more sensors of the vehicle. The processor is configured to operate in a guided calibration mode by controlling the propulsion system according to commands received from an external guided control system, while processing the sensor data to determine a degree of certainty on a calibration of the sensor data and a position of the vehicle. The processor determines that the degree of certainty is above a threshold value associated with safe operation of the propulsion system in an autonomous calibration mode, and subsequently switches operation of the propulsion system to the autonomous calibration mode based on the determination that the degree of certainty is above the threshold value.
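The mode handoff above reduces to a threshold check: remain under external guided control until calibration certainty clears the safety threshold, then switch to autonomous calibration. The threshold value and mode labels below are illustrative assumptions.

```python
def calibration_mode(certainty, threshold=0.95, current="guided"):
    """Stay in the guided calibration mode until the calibration certainty
    exceeds the safety threshold, then hand off to autonomous calibration.
    Once autonomous, the mode is retained (assumed policy)."""
    if current == "guided" and certainty > threshold:
        return "autonomous"
    return current
```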

Real-time HDR video for vehicle control

The invention provides an autonomous vehicle with a video camera that merges images taken at different light levels by replacing saturated parts of an image with corresponding parts of a lower-light image to stream a video with a dynamic range that extends to include very low-light and very intensely lit parts of a scene. The high dynamic range (HDR) camera streams the HDR video to an HDR system in real time as the vehicle operates. As pixel values are provided by the camera's image sensors, those values are streamed directly through a pipeline processing operation and on to the HDR system without any requirement to wait and collect entire images, or frames, before using the video information.
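The per-pixel merge described above can be sketched directly: wherever the long-exposure pixel is saturated, substitute the short-exposure pixel scaled by the exposure ratio so both images share a radiometric scale. This is a minimal single-channel sketch, not the patented pipeline; in the streaming design the merge runs per pixel as values arrive, which the element-wise comprehension stands in for.

```python
def merge_hdr(bright_px, dark_px, exposure_ratio, sat_level=255):
    """Merge two exposures of the same scanline into an HDR scanline.

    bright_px: pixels from the longer (brighter) exposure.
    dark_px:   corresponding pixels from the shorter (darker) exposure.
    exposure_ratio: bright exposure time / dark exposure time, used to
    rescale dark pixels onto the bright image's radiometric scale.
    """
    return [d * exposure_ratio if b >= sat_level else b
            for b, d in zip(bright_px, dark_px)]
```

For example, a pixel saturated at 255 in the long exposure but reading 40 in a 4x-shorter exposure is recovered as 160 on the extended scale.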