G05D1/227

Mobile robot having collision avoidance system for crossing a road from a pedestrian pathway

A collision avoidance method and system for a mobile robot crossing a road. When a mobile robot approaches a road, it senses road conditions via at least one first sensor, and initiates road crossing if the road conditions are deemed suitable for crossing. As it crosses the road, the mobile robot senses, via at least one second sensor, a change in the road conditions indicating the presence of at least one hazardous moving object. In response to determining that at least one hazardous object is present, the mobile robot initiates a collision avoidance maneuver. A mobile robot configured to avoid collisions while crossing a road includes: at least one first sensor configured to sense road conditions, at least one second sensor configured to sense road conditions, and a processing component configured to carry out one or more collision avoidance maneuvers.
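The two-stage logic described above can be sketched as a pair of checks: a first-sensor gate that decides whether crossing may begin, and a second-sensor policy that selects a maneuver mid-crossing. The thresholds, field names, and maneuver labels below are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class RoadConditions:
    gap_seconds: float        # time gap to the nearest approaching vehicle
    hazardous_objects: int    # moving objects on a collision course

# Hypothetical threshold; the abstract does not specify concrete values.
MIN_SAFE_GAP_S = 8.0

def should_start_crossing(conditions: RoadConditions) -> bool:
    """First-sensor check: initiate crossing only if the road is clear."""
    return (conditions.hazardous_objects == 0
            and conditions.gap_seconds >= MIN_SAFE_GAP_S)

def maneuver_while_crossing(conditions: RoadConditions) -> str:
    """Second-sensor check: pick a collision avoidance maneuver mid-crossing."""
    if conditions.hazardous_objects == 0:
        return "continue"
    # Illustrative policy: retreat if the gap is closing fast, else hurry across.
    return "reverse_to_curb" if conditions.gap_seconds < 2.0 else "accelerate_across"
```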

Determining drivable free-space for autonomous vehicles

In various examples, sensor data may be received that represents a field of view of a sensor of a vehicle located in a physical environment. The sensor data may be applied to a machine learning model that computes both a set of boundary points that correspond to a boundary dividing drivable free-space from non-drivable space in the physical environment and class labels for boundary points of the set of boundary points that correspond to the boundary. Locations within the physical environment may be determined from the set of boundary points represented by the sensor data, and the vehicle may be controlled through the physical environment within the drivable free-space using the locations and the class labels.
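One way to picture the boundary output is as one boundary point per image column, each with a class label; free-space lies on one side of the boundary. The sketch below assumes that image-space convention, plus hypothetical class names and margins, to show how a candidate location could be tested against the boundary.

```python
# Sketch: map per-column boundary points (pixel rows) predicted by a model to
# a drivable/non-drivable decision for a candidate image location.
# The image geometry, class names, and margins here are assumptions.

def drivable_below_boundary(boundary_rows, class_labels, column, candidate_row):
    """boundary_rows[i] is the boundary pixel row for image column i, and
    class_labels[i] is the class of the boundary at that column (e.g. 'curb',
    'vehicle', 'pedestrian'). Free-space is assumed to lie below the boundary
    row in image coordinates (larger row index = closer to the vehicle)."""
    boundary = boundary_rows[column]
    label = class_labels[column]
    # Keep a larger margin for dynamic boundary classes (illustrative values).
    margin = 20 if label in ("vehicle", "pedestrian") else 0
    return candidate_row > boundary + margin
```

The class labels are what make the margin adaptive: a curb boundary can be approached more closely than a pedestrian boundary.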

Performing 3D reconstruction via an unmanned aerial vehicle

In some examples, an unmanned aerial vehicle (UAV) may include one or more processors configured to capture, with one or more image sensors, and while the UAV is in flight, a plurality of images of a target. The one or more processors may compare a first image of the plurality of images with a second image of the plurality of images to determine a difference between a current frame of reference position for the UAV and an estimate of an actual frame of reference position for the UAV. In addition, the one or more processors may determine, based at least on the difference, and while the UAV is in flight, an update to a three-dimensional model of the target.
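The core update step reduces to: compute the difference between the current frame-of-reference position and the estimated actual position, then apply that correction to the reconstructed model. The sketch below simplifies poses to 3-vector positions (a full pipeline would use full SE(3) transforms); all names are illustrative.

```python
import numpy as np

def update_model(model_points: np.ndarray,
                 current_pose: np.ndarray,
                 estimated_pose: np.ndarray) -> np.ndarray:
    """Shift reconstructed 3D points by the difference between the UAV's
    current frame-of-reference position and the estimate of its actual
    position, as derived from comparing two captured images."""
    difference = estimated_pose - current_pose
    return model_points + difference  # broadcast the correction over all points
```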

Polyline contour representations for autonomous vehicles
11938926 · 2024-03-26

Aspects of the disclosure relate to controlling a vehicle having an autonomous driving mode or an autonomous vehicle. For instance, a polygon representative of the shape and location of a first object may be received. A polyline contour representation of a portion of a polygon representative of the shape and location of a second object may be received. The polyline contour representation may be in half-plane coordinates and may include a plurality of vertices and line segments. Coordinates of the polygon representative of the first object may be converted to the half-plane coordinate system. A collision location between the polyline contour representation and the polygon representative of the first object may be determined using the converted coordinates. The autonomous vehicle may be controlled in the autonomous driving mode to avoid a collision based on the collision location.
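Once both shapes are expressed in a shared coordinate frame, finding the collision location reduces to segment intersection between the polyline's segments and the polygon's edges. The sketch below assumes the conversion into the half-plane coordinate system has already happened and uses the standard parametric line-intersection formula; it is an illustration of the geometric test, not the patented method itself.

```python
# Illustrative collision test between a polyline contour and a polygon,
# with both already expressed in a shared (e.g. half-plane) coordinate frame.

def _seg_intersect(p1, p2, q1, q2):
    """Return the intersection point of segments p1-p2 and q1-q2, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, q1, q2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel or degenerate segments
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def collision_location(polyline, polygon):
    """polyline: list of (x, y) vertices of the contour representation;
    polygon: (x, y) vertices of the first object's polygon, in order.
    Returns the first intersection point found, or None if no collision."""
    edges = list(zip(polygon, polygon[1:] + polygon[:1]))  # close the polygon
    for a, b in zip(polyline, polyline[1:]):
        for c, d in edges:
            hit = _seg_intersect(a, b, c, d)
            if hit is not None:
                return hit
    return None
```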

Systems and methods for controlling actuators based on load characteristics and passenger comfort
11938953 · 2024-03-26

Among other things, we describe techniques for operation of a vehicle based on measured load characteristics and/or passenger comfort. One or more sensors of the vehicle can measure passenger data and/or load data of the vehicle. The passenger data and/or load data of the vehicle can be used by the vehicle to determine how to navigate within the surrounding environment.

Systems and methods for self-driving vehicle collision prevention

Systems and methods for self-driving collision prevention are presented. The system comprises a self-driving vehicle safety system, having one or more sensors in communication with a control system. The control system is configured to determine safety fields and instruct the sensors to scan a region corresponding to the safety fields. The control system determines exclusion regions, and omits the exclusion regions from the safety field. The safety system may also include capability reduction parameters that can be used to constrain the drive system of the vehicle, for example, by restricting turning radius and speed in accordance with the safety fields.
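A minimal sketch of the two ideas in this abstract: represent the safety field as a set of scan cells with exclusion regions subtracted, and derive a capability-reduction speed cap from the remaining field. The cell-based representation, threshold, and cap values are assumptions for illustration.

```python
# Sketch: safety field minus exclusion regions, plus a capability reduction
# rule that caps speed when the scannable field shrinks. Values are illustrative.

def effective_safety_field(field_cells: set, exclusion_cells: set) -> set:
    """Cells the sensors are instructed to scan: the field minus exclusions."""
    return field_cells - exclusion_cells

def speed_limit(field: set, base_limit: float = 2.0) -> float:
    """Capability reduction: constrain drive speed for a small scannable field."""
    if len(field) < 4:          # hypothetical threshold in cells
        return base_limit * 0.5
    return base_limit
```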

Zone passage control in worksite

A method for a zone passage control system for an underground worksite having a plurality of operation zones for autonomously operating mobile vehicle operations includes the steps of associating a first passage control unit with a first zone and a second zone, detecting state parameter information of the first zone and the second zone, merging the first zone and the second zone into a fusion zone on the basis of the state parameter information of the first zone and the second zone, and adapting the zone passage control system to allow a first autonomously operating mobile vehicle to pass the first passage control unit in the fusion zone without interrupting operation of a second autonomously operating mobile vehicle in the first zone and/or the second zone.
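The merge-and-pass logic can be sketched as a rule over zone state: two zones fuse when their state parameters do not conflict, and a vehicle may then pass the shared passage control unit inside the fusion zone. The zone representation and the reservation-based merge rule below are illustrative assumptions.

```python
# Sketch of the fusion-zone decision. Zones carry a "reserved_by" state
# parameter naming the vehicle (if any) currently operating in them.

def merge_zones(zone_a: dict, zone_b: dict):
    """Fuse two zones unless they are reserved by different vehicles."""
    if (zone_a["reserved_by"] and zone_b["reserved_by"]
            and zone_a["reserved_by"] != zone_b["reserved_by"]):
        return None  # conflicting reservations: keep the zones separate
    return {
        "name": f'{zone_a["name"]}+{zone_b["name"]}',
        "reserved_by": zone_a["reserved_by"] or zone_b["reserved_by"],
    }

def may_pass(fusion_zone, vehicle: str) -> bool:
    """A vehicle may pass the passage control unit within its fusion zone."""
    return (fusion_zone is not None
            and fusion_zone["reserved_by"] in (None, vehicle))
```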

System and method for sharing data collected from street sensors
11941976 · 2024-03-26

An environmental safety system may comprise a plurality of first sensors each located at a predetermined physical location of a traffic intersection and with a predetermined orientation. The system may have a memory storing executable instructions. The system may have one or more processors in communication with the plurality of first sensors and the memory. The one or more processors may be programmed by the executable instructions. The system may receive first sensor data captured at a time point and by the plurality of first sensors. The system may determine values of one or more parameters of an object of interest within a threshold distance of the traffic intersection using the first sensor data. The system may generate an information object comprising the values of the one or more parameters of the object of interest, the time point, and a signature of the information object. The system may transmit, via a communication network, the information object.
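The information object described above bundles parameter values, a time point, and a signature before transmission. The sketch below shows one plausible shape for that object using an HMAC over the serialized payload; the key, field names, and choice of HMAC-SHA256 are assumptions, since the abstract only requires "a signature of the information object".

```python
import hashlib
import hmac
import json

SHARED_KEY = b"intersection-demo-key"  # hypothetical pre-shared key

def make_information_object(parameters: dict, time_point: float) -> dict:
    """Package object-of-interest parameters, the capture time point, and a
    signature computed over the canonically serialized payload."""
    payload = {"parameters": parameters, "time_point": time_point}
    body = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return payload

def verify(info: dict) -> bool:
    """Recompute the signature on the receiving side and compare."""
    body = json.dumps(
        {"parameters": info["parameters"], "time_point": info["time_point"]},
        sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, info["signature"])
```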

Apparatus and methods for autonomously controlling vehicles at a specified location
11940815 · 2024-03-26

Aspects relate to apparatus and methods for autonomously controlling vehicles at a specified location. Apparatus includes a processor configured to receive a map of a location, communicate with a plurality of vehicles at the location, and communicate with a monitor device. Communicating with the plurality of vehicles includes receiving status data from the vehicles and transmitting a waypath to the vehicles.

Systems and methods for adjusting UAV trajectory

A system includes a flight controller. The flight controller is configured to, in response to a first user interface receiving a first user input, generate a first control signal. The first control signal is configured to control an unmanned aerial vehicle (UAV) to effect an autonomous flight with a first flight parameter and a second flight parameter. In response to a second user interface, different from the first user interface, receiving a second user input, the flight controller is further configured to modify the first flight parameter to obtain a modified first flight parameter, generate a second control signal based on the modified first flight parameter and the second flight parameter, and control the UAV to operate based on the second control signal.
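The interaction described above can be sketched as a controller holding two flight parameters: the first user input sets both and generates a control signal, while the second input (from a different interface) modifies only the first parameter, carrying the second over into the new signal. The parameter names (speed, altitude) and the dict-shaped control signal are illustrative assumptions.

```python
# Sketch of the two-interface parameter flow for an autonomous UAV flight.

class FlightController:
    def __init__(self):
        self.speed = None     # first flight parameter (illustrative)
        self.altitude = None  # second flight parameter (illustrative)

    def first_input(self, speed: float, altitude: float) -> dict:
        """First user interface: set both parameters, emit a control signal."""
        self.speed, self.altitude = speed, altitude
        return self._control_signal()

    def second_input(self, new_speed: float) -> dict:
        """Second user interface: modify only the first parameter; the
        second parameter is carried over into the new control signal."""
        self.speed = new_speed
        return self._control_signal()

    def _control_signal(self) -> dict:
        return {"speed": self.speed, "altitude": self.altitude}
```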