Patent classifications
G05D1/249
Method for operating a system with two automatically moving floor processing devices as well as system for implementing such a method
A method for operating a system with a first automatically moving floor processing device and a second automatically moving floor processing device in which the first floor processing device detects environmental features in an environment of the first floor processing device. The first floor processing device, or a shared computing device allocated to both floor processing devices, generates a first area map based on the detected environmental features; the first floor processing device also detects the second floor processing device, and the position of the second floor processing device is thereupon stored within the generated first area map. The second floor processing device receives information about its current position within the first area map, and controls a second floor processing activity as soon as the first floor processing device has detected the second floor processing device.
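The workflow in this abstract can be sketched in code. This is a minimal illustration only; all names (`SharedMap`, `SecondDevice`, the coordinate format) are assumptions for the sketch, not taken from the patent.

```python
class SharedMap:
    """First area map built from the first device's detected environmental features."""

    def __init__(self):
        self.features = []        # environmental features detected by the first device
        self.peer_positions = {}  # device id -> (x, y) position stored in the map

    def add_feature(self, feature):
        self.features.append(feature)

    def store_peer(self, device_id, position):
        # Called when the first device has detected the second device.
        self.peer_positions[device_id] = position


class SecondDevice:
    """Second floor processing device; activates once its position is in the map."""

    def __init__(self, device_id, shared_map):
        self.device_id = device_id
        self.map = shared_map
        self.active = False

    def receive_position_update(self):
        # The second device starts its processing activity only after the
        # first device has detected it and stored its position in the map.
        if self.device_id in self.map.peer_positions:
            self.position = self.map.peer_positions[self.device_id]
            self.active = True
        return self.active
```

In use, the second device stays idle until the first device records it, e.g. `m.store_peer("dev2", (3, 4))`, after which `receive_position_update()` returns `True`.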
Agricultural analysis robotic systems and methods thereof
A method, non-transitory computer readable medium, and system that manage agricultural analysis in dynamic environments includes detecting a location of one or more agricultural objects of interest in image data of an environment captured by a sensor device during active navigation of the environment. An orientation and position of the sensor device with respect to the image data is determined. Each of the one or more agricultural objects of interest is analyzed based on the image data, the detected location of the one or more agricultural objects of interest, and the determined orientation and position of the sensor device to determine one or more characteristics about the one or more agricultural objects of interest. At least one action is initiated based on the determined one or more characteristics about the one or more agricultural objects of interest.
Object recognition apparatus, vehicle, and object recognition method
An object recognition apparatus mounted in a vehicle including a sensor is provided. The apparatus includes a detection unit configured to detect an object present in a same lane as that of the vehicle by using information from the sensor, an acquisition unit configured to acquire information concerning lights which the object turns on by using the information from the sensor, and a determining unit configured to determine a type of the object based on the information concerning the lights. A condition in which the determining unit determines that the type of the object is a two-wheeled vehicle includes a case in which the object includes not less than two lights arrayed in a vertical direction with respect to the ground surface.
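The determination condition in this abstract (at least two lights arrayed vertically with respect to the ground surface) can be sketched as a simple geometric test. The lateral tolerance value is an illustrative assumption, not from the patent.

```python
def is_two_wheeled(lights, x_tolerance=0.3):
    """Return True if at least two lights are arrayed vertically:
    roughly the same lateral position (x), different heights (y).

    `lights` is a list of (x, y) positions; y is measured vertically
    from the ground surface. The 0.3 tolerance is an assumption.
    """
    for i in range(len(lights)):
        for j in range(i + 1, len(lights)):
            xi, yi = lights[i]
            xj, yj = lights[j]
            # Vertically arrayed: near-identical x, distinct heights.
            if abs(xi - xj) <= x_tolerance and yi != yj:
                return True
    return False
```

A motorcycle's headlight above its fog light would satisfy the condition, while a car's two side-by-side taillights would not.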
Systems and methods for autonomous vehicle operation
Disclosed herein are systems and methods for autonomous vehicle operation, in which a processor is configured to receive sensor data collected by a first sensor of a first autonomous vehicle during navigation of the first autonomous vehicle through a particular location and prior to a control signal subsequently generated by a controller of the first autonomous vehicle; determine based on the sensor data an event that triggered the control signal. A communication device coupled to the processor is configured to transmit to a second autonomous vehicle an instruction, based on the determined event, to adjust sensor data collected by a second sensor of the second autonomous vehicle during navigation of the second autonomous vehicle in the particular location.
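The two-step flow in this abstract (determine the event that triggered the control signal, then instruct a second vehicle to adjust its sensing at that location) can be sketched as follows. The event-attribution rule and the event-to-adjustment mapping are hypothetical placeholders for whatever logic an implementation would actually use.

```python
from dataclasses import dataclass


@dataclass
class Instruction:
    location: str
    adjustment: str


def determine_event(sensor_data, control_signal):
    """Attribute the control signal to the most salient sensor reading
    recorded before the signal was generated (illustrative rule)."""
    prior = [d for d in sensor_data if d["t"] < control_signal["t"]]
    if not prior:
        return None
    return max(prior, key=lambda d: d["salience"])["label"]


def instruct_second_vehicle(event, location):
    """Map the triggering event to a sensor adjustment for a second
    vehicle navigating the same location (mapping is an assumption)."""
    adjustments = {
        "pedestrian": "raise_camera_frame_rate",
        "pothole": "increase_lidar_density",
    }
    return Instruction(location, adjustments.get(event, "no_change"))
```

For example, if a high-salience pedestrian reading preceded a braking signal, the second vehicle approaching the same location would be told to raise its camera frame rate.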
Systems and methods for cooperatively managing mixed traffic at an intersection
Systems and methods for cooperatively managing mixed traffic at an intersection are disclosed herein. One embodiment determines, at an autonomous sensor-rich vehicle, that one or more other vehicles are following in the same lane, the one or more other vehicles including at least one legacy vehicle; communicates with the one or more other vehicles to form a platoon; receives Signal Phase and Timing (SPaT) information from a roadside unit associated with an intersection toward which the platoon is traveling; calculates at the autonomous sensor-rich vehicle, based at least in part on the SPaT information and location information for the platoon, a speed profile and a trajectory for the autonomous sensor-rich vehicle that minimize a delay of the platoon in traversing the intersection while accounting for fuel consumption; and executes the speed profile and the trajectory to indirectly control the one or more other vehicles while the platoon traverses the intersection.
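The core of the SPaT-based calculation can be illustrated with a deliberately simplified sketch: choose a constant speed that brings the platoon to the stop line within the green window. The real system optimizes a full speed profile and trajectory for delay and fuel; the speed limits here are illustrative assumptions.

```python
def platoon_speed(distance_m, green_start_s, green_end_s,
                  v_min=5.0, v_max=20.0):
    """Pick a constant speed (m/s) so the platoon reaches the stop line
    inside the green phase [green_start_s, green_end_s].

    A stand-in for the delay-minimizing, fuel-aware optimization the
    abstract describes; v_min/v_max bounds are assumptions.
    """
    # Earliest feasible arrival: either limited by top speed or by
    # the start of the green phase, whichever comes later.
    earliest_arrival = max(distance_m / v_max, green_start_s)
    if earliest_arrival > green_end_s:
        return None  # this green window cannot be reached at legal speeds
    v = distance_m / earliest_arrival
    return max(v_min, min(v_max, v))
```

Arriving as early as the green phase allows minimizes platoon delay; holding a constant speed (rather than stopping and restarting) is what saves fuel in this simplified picture.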
Visual identifiers for docking and zoning an autonomous mower
A lawn vehicle network includes a charging station having a visual identifier, a lawn vehicle having a battery, a blade system, a drive system whose output effects lawn vehicle forward movement, a processor board connected to both systems, the processor board capable of processing image data and sending commands to both systems, and a vision assembly connected to the processor board and able to transmit image data to the processor board, and the processor board, having received the image data, able to, if the image data represent a first object, maintain the drive system's output at the time of that determination, if the image data represent a second object, change the drive system's output at the time of that determination, and if the image data represent the visual identifier, maintain the drive system's output or send a shutoff command to the vision assembly at the time of that determination.
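The processor board's three-way decision rule in this abstract can be sketched as a dispatch on the classified image data. The class names and the concrete commands (reversing the drive output, the shutoff command) are illustrative assumptions about what "maintain" and "change" might mean.

```python
def processor_board_decision(image_class, current_output):
    """Map classified image data to drive-system / vision-assembly commands.

    `image_class` is one of "first_object", "second_object", or
    "visual_identifier" (the charging station's marker). Names and
    command values are assumptions for this sketch.
    """
    if image_class == "first_object":
        # Maintain the drive system's output at the time of determination.
        return {"drive_output": current_output}
    if image_class == "second_object":
        # Change the drive system's output (e.g. reverse away from an obstacle).
        return {"drive_output": -current_output}
    if image_class == "visual_identifier":
        # Docking: maintain output and shut off the vision assembly.
        return {"drive_output": current_output, "vision_command": "shutoff"}
    raise ValueError(f"unknown image class: {image_class}")
```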
Lane line reconstruction using future scenes and trajectory
A vehicle capable of autonomous driving includes a lane detection system. The lane detection system is trained to predict lane lines using training images. The training images are automatically processed by a training module of the lane detection system in order to create ground truth data. The ground truth data is used to train the lane detection system to predict lane lines that are occluded in real-time images of roadways. The lane detection system predicts lane lines of a roadway in a real-time image even though the lane lines may be indiscernible due to objects on the roadway or due to the position of the lane lines being at the horizon.
Vehicular control system with handover procedure for driver of controlled vehicle
A vehicular control system includes a forward-viewing camera, a forward-sensing sensor and an in-cabin-sensing sensor. With the system controlling driving of the vehicle, the system determines a triggering event that triggers handing over driving of the vehicle to a driver of the vehicle before the vehicle encounters an event point associated with the triggering event. The vehicular control system (i) determines a total action time available before the vehicle encounters the event point, (ii) estimates a driver takeover time for the driver to take over control of the vehicle and (iii) estimates a handling time for the driver to control the vehicle to avoid encountering the event point. Responsive to the vehicular control system determining that the estimated driver takeover time is less than the difference between the determined total action time and the estimated handling time, control of the vehicle is handed over to the driver of the vehicle.
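The handover condition in this abstract reduces to a single timing inequality: hand over only if the estimated takeover time is less than the difference between the total action time and the estimated handling time. A minimal sketch (parameter names are illustrative):

```python
def should_hand_over(total_action_time_s, takeover_time_s, handling_time_s):
    """Return True if control can safely be handed to the driver.

    The system hands over when the estimated driver takeover time is
    less than the time budget left after reserving the handling time:
        takeover < total_action - handling
    """
    return takeover_time_s < (total_action_time_s - handling_time_s)
```

For instance, with 10 s until the event point, a 5 s handling estimate leaves a 5 s budget: a 3 s takeover estimate permits handover, a 6 s estimate does not.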