G06V20/56

Occlusion Constraints for Resolving Tracks from Multiple Types of Sensors
20230046396 · 2023-02-16 ·

This document describes techniques for using occlusion constraints to resolve tracks from multiple types of sensors. In aspects, an occlusion constraint is applied to an association between a radar track and a vision track to indicate a probability of occlusion. In other aspects, techniques are described for a vehicle to refrain from evaluating occluded radar tracks and vision tracks collected by a perception system. The probability of occlusion is used to deemphasize pairs of radar tracks and vision tracks that have a high likelihood of occlusion and are therefore not useful for tracking. The disclosed techniques may provide improved perception data that more closely represents multiple complex data sets, enabling a vehicle to prevent a collision with an occluded object as it operates in an environment.
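The deemphasis step described above might be sketched as follows. The function names, the score field, and the 0.8 cutoff are illustrative assumptions, not the patent's implementation:

```python
def apply_occlusion_constraint(pairs, occlusion_threshold=0.8):
    """Deemphasize radar/vision track pairs with a high occlusion probability.

    `pairs` is a list of dicts, each with an association `score` in [0, 1]
    and a `p_occlusion` estimate in [0, 1]. Pairs above the threshold are
    skipped entirely; the rest are down-weighted by their probability of
    being visible.
    """
    resolved = []
    for pair in pairs:
        if pair["p_occlusion"] >= occlusion_threshold:
            continue  # refrain from evaluating likely-occluded pairs
        weighted = dict(pair)
        weighted["score"] = pair["score"] * (1.0 - pair["p_occlusion"])
        resolved.append(weighted)
    return resolved
```

For example, a pair with score 0.9 and occlusion probability 0.1 would be kept with a weighted score of 0.81, while a pair with occlusion probability 0.95 would be dropped.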

METHOD AND APPARATUS FOR PROCESSING IMAGE

The present disclosure provides a method and apparatus for processing an image. A specific implementation includes: acquiring a top view of a road; identifying a position of a lane line from the top view; cutting the top view into at least two areas and determining, according to the position of the lane line in each area, a width of the lane in each area and an average width of the lane in the top view; calculating a first perspective correction matrix by optimizing a first loss function, the first loss function representing a difference between the width of the lane in each area and the average width of the lane in the top view; and performing a lateral correction on the top view through the first perspective correction matrix to obtain a first corrected image.
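The first loss function could be sketched like this. The single shear parameter `k`, the linear width model, and the grid search are all simplifying assumptions standing in for the patent's full perspective-correction matrix and optimizer:

```python
def lane_width_loss(widths, k, centers):
    """Squared difference between each area's corrected lane width and the mean.

    widths[i] is the lane width measured in area i; centers[i] is the
    normalized row of that area. Candidate correction k rescales widths
    linearly with row position.
    """
    corrected = [w * (1.0 + k * y) for w, y in zip(widths, centers)]
    mean_w = sum(corrected) / len(corrected)
    return sum((w - mean_w) ** 2 for w in corrected)

def fit_correction(widths, centers, k_range=(-0.5, 0.5), steps=1001):
    """Grid-search the k minimizing the loss (stand-in for a real optimizer)."""
    lo, hi = k_range
    best_k, best_loss = lo, float("inf")
    for i in range(steps):
        k = lo + (hi - lo) * i / (steps - 1)
        loss = lane_width_loss(widths, k, centers)
        if loss < best_loss:
            best_k, best_loss = k, loss
    return best_k
```

With lane widths that grow toward the top of the view, the fitted k is negative, flattening the widths toward the average, which mirrors the abstract's lateral correction objective.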

METHOD FOR AUTOMATING AN AGRICULTURAL WORK TASK
20230050661 · 2023-02-16 ·

A method for automating an agricultural work task which is performed by a tillage device on an agricultural tractor includes modifying via a control unit at least one process control variable representing a working or operating parameter of the tillage device using feedback data which represent a field state of a field surface before or after tillage, generating via an imaging sensor a ground image of the field surface, and evaluating via a data processing unit the ground image to determine at least some of the feedback data. The data processing unit evaluates the ground image such that the feedback data are determined from the ground image depending on the result of monitoring the field surface for airborne dust that visually covers it.
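The dust-dependent gating could be sketched as below. The `dust_score` input, the 0.3 limit, and the callback interface are hypothetical; the abstract only states that feedback determination depends on the dust-monitoring result:

```python
def evaluate_ground_image(dust_score, feedback_fn, image, dust_limit=0.3):
    """Return feedback data only when airborne dust does not obscure the surface.

    `dust_score` in [0, 1] is a hypothetical output of the dust monitoring;
    when it exceeds `dust_limit` the ground image is treated as unreliable
    and no feedback data are produced from it.
    """
    if dust_score > dust_limit:
        return None  # withhold feedback; the field surface is visually covered
    return feedback_fn(image)
```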

AIRCRAFT DOOR CAMERA SYSTEM FOR DOCKING ALIGNMENT MONITORING
20230052176 · 2023-02-16 ·

A camera with a field of view toward an external environment of an aircraft is disposed within an aircraft door such that a ground surface is within the field of view of the camera during taxiing of the aircraft. A display device is disposed within an interior of the aircraft. A processor is operatively coupled to the camera and to the display device. The processor analyzes image data captured by the camera for docking guidance by identifying, within the captured image data, a region on the ground surface corresponding to an alignment fiducial indicating a parking location for the aircraft, determining, based on the region of the captured image data corresponding to the alignment fiducial indicating the parking location, a relative location of the aircraft with respect to the alignment fiducial, and outputting an indication of the relative location of the aircraft to the alignment fiducial.
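The relative-location step might look like the following sketch, which reduces the detected fiducial region to a normalized centroid offset from the image center; the pixel-to-ground projection a real system would apply is omitted, and all names are illustrative:

```python
def relative_location(fiducial_pixels, image_width, image_height):
    """Estimate the aircraft's offset relative to an alignment fiducial.

    `fiducial_pixels` is a list of (x, y) image coordinates classified as
    the fiducial region. The result is the fiducial centroid's displacement
    from the image center, normalized to [-1, 1] per axis (negative lateral
    means the fiducial lies left of the camera's centerline).
    """
    if not fiducial_pixels:
        raise ValueError("fiducial not detected in captured image data")
    cx = sum(x for x, _ in fiducial_pixels) / len(fiducial_pixels)
    cy = sum(y for _, y in fiducial_pixels) / len(fiducial_pixels)
    lateral = (cx - image_width / 2) / (image_width / 2)
    longitudinal = (cy - image_height / 2) / (image_height / 2)
    return lateral, longitudinal
```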

SYSTEMS AND METHODS FOR DETERMINING ROAD TRAVERSABILITY USING REAL TIME DATA AND A TRAINED MODEL

Embodiments of the disclosed systems and methods provide for determination of roadway traversability by an autonomous vehicle using real-time data and a trained traversability-determination machine learning model. Consistent with aspects of the disclosed embodiments, the model may be trained using annotated bird's-eye view perspective data obtained using vehicle vision sensor systems (e.g., LiDAR and/or camera systems). During operation of a vehicle, vision sensor data may be used to construct bird's-eye view perspective data, which may be provided to the trained model. The model may label and/or otherwise annotate the vision sensor data based on relationships identified in the model training process to identify associated road boundary and/or lane information. Local vehicle control systems may compute control actions and issue commands to associated vehicle control systems to ensure the vehicle travels within a desired path.
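The bird's-eye view construction step could be sketched minimally as a rasterization of ground-plane sensor points into an occupancy grid; the cell size, extent, and binary occupancy encoding are assumptions standing in for whatever representation the trained model actually consumes:

```python
def to_birds_eye_grid(points, cell_size=0.5, extent=20.0):
    """Rasterize (x, y) sensor points into a square bird's-eye occupancy grid.

    The grid spans [-extent, extent) meters on each axis with cells of
    `cell_size` meters; points outside the extent are ignored.
    """
    n = int(2 * extent / cell_size)
    grid = [[0] * n for _ in range(n)]
    for x, y in points:
        col = int((x + extent) / cell_size)
        row = int((y + extent) / cell_size)
        if 0 <= row < n and 0 <= col < n:
            grid[row][col] = 1
    return grid
```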

METHOD FOR PREDICTING AN EGO-LANE FOR A VEHICLE
20230052594 · 2023-02-16 ·

A method for predicting an ego-lane for a vehicle. The method includes: receiving at least one image captured by at least one camera sensor of the vehicle, which depicts a lane that may be used by the vehicle; ascertaining a center line of the lane, which extends through a center of the lane, by implementing a trained neural network on the captured image, the neural network being trained via regression to ascertain a center line of a lane, which extends in a center of the lane, based on captured images of the lane; outputting a plurality of parameters, which describe the center line of the lane, via the neural network; generating the center line based on the parameters of the center line; identifying the center line of the lane as the ego-lane of the vehicle; and providing the ego-lane.
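The generation step, reconstructing the center line from the regressed parameters, might be sketched as follows. The polynomial parameterization is an assumption; the abstract only says the network outputs parameters describing the center line:

```python
def center_line_from_parameters(coeffs, y_samples):
    """Generate ego-lane center-line points from regressed parameters.

    Assumes the network outputs polynomial coefficients [c0, c1, c2, ...]
    describing the lateral offset x(y) = c0 + c1*y + c2*y**2 + ... at
    longitudinal distances `y_samples`; the sampled (x, y) points form
    the center line identified as the ego-lane.
    """
    points = []
    for y in y_samples:
        x = sum(c * y ** i for i, c in enumerate(coeffs))
        points.append((x, y))
    return points
```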

METHOD AND CONTROL UNIT FOR OPERATING A TRANSVERSE STABILIZATION SYSTEM OF A VEHICLE
20230052366 · 2023-02-16 ·

A method for operating a transverse stabilization system of a vehicle. A steering direction of the vehicle and a setpoint direction of the vehicle are read in, with a transverse stabilization target for the transverse stabilization system being determined using the steering direction and the setpoint direction.
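One minimal way to combine the two read-in directions into a target is a proportional blend, sketched below. The blending rule and gain are purely illustrative assumptions; the abstract states only that both directions are used to determine the transverse stabilization target:

```python
def transverse_stabilization_target(steering_direction, setpoint_direction, gain=0.5):
    """Blend steering and setpoint directions (radians) into a target direction.

    A proportional step from the current steering direction toward the
    setpoint direction; gain=0.5 moves halfway per update.
    """
    error = setpoint_direction - steering_direction
    return steering_direction + gain * error
```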

PROJECTION ON A VEHICLE WINDOW

A system includes a camera aimed externally to a vehicle, a window of the vehicle, a projector positioned to project on the window, and a computer communicatively coupled to the camera and the projector. The computer is programmed to, upon receiving data from the camera indicating a first person outside the vehicle, instruct the projector to project an image on the window depicting a second person inside the vehicle.

CALCULATING A DISTANCE BETWEEN A VEHICLE AND OBJECTS

A method for calculating a distance between a vehicle camera and an object, the method may include: (a) obtaining an image that was acquired by the vehicle camera of a vehicle; the image captures the horizon, the object, and road lane boundaries; (b) determining an initial row-location horizon estimate and a row-location contact point estimate, the contact point being between the object and a road on which the vehicle is positioned; (c) determining a vehicle camera roll angle correction that, once applied, will cause the lane boundaries to be parallel to each other in the real world; (d) calculating a new row-location horizon estimate, wherein the calculating comprises updating the row-location horizon estimate based on the vehicle camera roll angle correction; and (e) calculating the distance between the vehicle camera and the object based on a difference between the new row-location horizon estimate and the row-location contact point estimate.
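Step (e) follows standard flat-road pinhole geometry, which can be sketched as below; the specific formula, focal length, and camera height are standard assumptions rather than details taken from the abstract:

```python
def distance_from_horizon(row_horizon, row_contact, focal_px, camera_height_m):
    """Ground distance to a contact point from the horizon/contact row gap.

    For a forward-looking camera at height `camera_height_m` over a flat
    road, distance = focal_px * camera_height_m / (row_contact - row_horizon),
    where rows are image-row coordinates (larger row = lower in the image).
    A roll-corrected horizon row, as in step (d), tightens the gap estimate.
    """
    gap = row_contact - row_horizon
    if gap <= 0:
        raise ValueError("contact point must lie below the horizon in the image")
    return focal_px * camera_height_m / gap
```

For example, with a 1000-pixel focal length, a 1.5 m camera height, and a 100-row gap, the estimated distance is 15 m.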