Patent classifications
B60R2300/108
VEHICLE TRAILER ANGLE DETECTION SYSTEM USING ULTRASONIC SENSORS
A trailer angle detection system for a vehicle includes a camera disposed at a rear portion of the vehicle and viewing rearward of the vehicle. A plurality of ultrasonic sensors is disposed at the rear portion of the vehicle and senses rearward of the vehicle. A control includes at least one processor operable to process image data captured by the camera. Responsive to processing of image data captured by the camera, the control detects a trailer rearward of the vehicle and in the field of view of the camera. The at least one processor is operable to process sensor data captured by the ultrasonic sensors to determine a distance to portions of the trailer rearward of the vehicle. Responsive to processing of captured image data and processing of captured sensor data, the trailer angle detection system is operable to determine an angle of the trailer relative to the vehicle.
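One way the angle determination described above could work is pure geometry: if two laterally spaced ultrasonic sensors each measure the range to the trailer's roughly planar front face, the difference in the two ranges over the sensor baseline gives the trailer yaw. A minimal sketch, assuming two sensors at a known spacing and a flat sensed face (the function name, baseline value, and planar-face assumption are illustrative, not from the patent):

```python
import math

def trailer_angle_deg(d_left: float, d_right: float, baseline: float) -> float:
    """Estimate trailer yaw angle from two ultrasonic range readings.

    d_left, d_right: distances (m) from the left/right rear sensors to the
    trailer's front face; baseline: lateral spacing (m) between the sensors.
    Assumes the sensed face is approximately planar.
    """
    return math.degrees(math.atan2(d_right - d_left, baseline))

# Equal ranges mean the trailer is aligned with the vehicle.
print(trailer_angle_deg(1.0, 1.0, 1.2))  # 0.0
```

In the actual system the camera's image data would complement this, e.g. by confirming that the ranged surface really belongs to the trailer.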
USER INTERFACE, MEANS OF MOVEMENT, AND METHODS FOR RECOGNIZING A USER'S HAND
A hand of a user may be detected in free space, where a plurality of surface points are determined and include a center area surface point and at least two surface points of the plurality of surface points located on a periphery of the surface of the hand. A curve extending through the plurality of surface points may be determined based on a position of a curvature. The plurality of surface points are processed to determine whether the plurality of surface points of the detected hand are arranged in a substantially concave area relative to the sensor and/or a substantially convex area relative to the sensor. The detected hand may be identified as a palm or back of the hand based on the processing of the plurality of surface points.
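The concave/convex decision above can be illustrated with a 1D cross-section: fit a curve through the sampled surface points and use the sign of its curvature to decide whether the center of the hand sits farther from the sensor than the rim (concave, palm-like) or nearer (convex, back-like). A minimal sketch under those assumptions; the helper below is hypothetical, and a real system would process a full 3D point cloud:

```python
import numpy as np

def classify_hand_side(xs, depths):
    """Classify a hand cross-section as 'palm' or 'back'.

    xs: lateral coordinates of surface points across the hand;
    depths: distance of each point from the sensor.
    """
    # Fit a parabola depth = a*x^2 + b*x + c through the sampled points.
    a, b, c = np.polyfit(xs, depths, 2)
    # a < 0: depth peaks at the hand's center (center farther from the
    # sensor), i.e. the surface is concave toward the sensor -> palm.
    return "palm" if a < 0 else "back"

# Rim points near the sensor, center point farther away -> concave -> palm.
print(classify_hand_side([-1.0, 0.0, 1.0], [0.40, 0.45, 0.40]))  # palm
```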
DEVICE, SYSTEM, AND METHOD FOR VISUAL LANGUAGE FOR AUTONOMOUS VEHICLE
The autonomous vehicle visual language system displays a plan sentence to an operator of an autonomous vehicle. The plan sentence includes visual syntax, the visual syntax being images displayed in predetermined configurations to convey specific information associated with any maneuvers the autonomous vehicle plans to execute. The visual syntax allows for a structured layering of information such that the plan sentence can display more complex information. The operator of the autonomous vehicle can process individual warnings and information more quickly, even when the operator has never seen a particular warning or element of information before. Additionally, the system allows for personalization of a driver model to more closely reflect the operator's driving style, which is then implemented in the visual language displayed as the plan sentence.
Continuous occlusion models for road scene understanding
Systems and methods are disclosed for road scene understanding of vehicles in traffic by capturing images of traffic with a camera coupled to a vehicle; generating a continuous occlusion model for traffic participants to enhance point track association accuracy without distinguishing between moving and static objects; applying the continuous occlusion model to handle visibility constraints in object tracks; and combining point track association and soft object track modeling to improve 3D localization accuracy.
Asphalt mat thermal profile verification method and system
A method and system of verifying asphalt mat temperature values and location data generated by a paving machine includes identifying a location and temperature value of a threshold thermal characteristic of thermal data generated by the paving machine, and displaying the paver-generated location and temperature value of the threshold thermal characteristic on a display of the paving machine. The method further includes determining a location and temperature value of the threshold thermal characteristic by one or more external systems, and comparing the externally-determined location and temperature value to the paver-generated location and temperature value.
Multi-Purpose Camera Device For Use On A Vehicle
An illustrative example camera device includes a sensor that is configured to detect radiation. A first portion of the sensor has a first field of vision and is used for a first imaging function. A distortion correcting prism directs radiation outside the first field of vision toward the sensor. A lens element between the distortion correcting prism and the sensor includes a surface at an oblique angle relative to a sensor axis. The lens element directs radiation from the distortion correcting prism toward a second portion of the sensor that has a second field of vision and is used for a second imaging function. The sensor provides a first output for the first imaging function based on radiation detected at the first portion of the sensor. The sensor provides a second output for the second imaging function based on radiation detected at the second portion.
CONTROL DEVICE, OPERATION METHOD FOR CONTROL DEVICE, AND STORAGE MEDIUM
A control device that controls imaging with fisheye cameras disposed on front and rear portions and right and left side portions of a vehicle, the control device comprising: a detection unit configured to detect an orientation of the vehicle; and a control unit configured to control a conversion center position for converting a fisheye image of each of the fisheye cameras into a planar image based on the orientation of the vehicle.
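The fisheye-to-planar conversion controlled above can be sketched with a standard equidistant fisheye model: each viewing direction in the desired planar view maps to a fisheye pixel at radius r = f·θ from the optical center, and shifting the pan/tilt of the conversion center (e.g. from the detected vehicle orientation) selects which part of the fisheye frame is flattened. A minimal illustrative sketch; the function name, focal length, and equidistant model are assumptions, not taken from the patent:

```python
import math

def planar_to_fisheye(pan_deg, tilt_deg, f_pix, cx, cy):
    """Map a viewing direction (pan/tilt, degrees) of the planar view to a
    pixel in an equidistant fisheye image with focal length f_pix (pixels)
    and optical center (cx, cy)."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # Unit ray for the viewing direction (camera looks along +z).
    x = math.cos(tilt) * math.sin(pan)
    y = math.sin(tilt)
    z = math.cos(tilt) * math.cos(pan)
    theta = math.acos(z)        # angle from the optical axis
    r = f_pix * theta           # equidistant projection: r = f * theta
    phi = math.atan2(y, x)
    return (cx + r * math.cos(phi), cy + r * math.sin(phi))

# The optical axis itself maps to the fisheye image center.
print(planar_to_fisheye(0.0, 0.0, 300.0, 640.0, 480.0))  # (640.0, 480.0)
```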
VEHICLE DRIVING ASSISTANCE APPARATUS
The present disclosure provides a vehicle driving assistance apparatus, comprising: an infrared image acquiring module for acquiring information of an infrared image of an object ahead of a vehicle; and a display module connected with the infrared image acquiring module, for displaying a corresponding image according to the information of the infrared image acquired by the infrared image acquiring module. The disclosure enables the driver to see the road conditions ahead by means of the vehicle driving assistance apparatus even when facing glaring light from ahead while driving at night, thus reducing the occurrence of traffic accidents.
Measurement of a Dimension on a Surface
The invention relates to a device, to a vehicle, and to a method for measuring a dimension between at least two points on surfaces. The device comprises an image-generating apparatus configured to scan the surroundings of the vehicle, and a display apparatus configured to display a representation of the surroundings of the vehicle. The device also includes an input apparatus configured to define at least two points as measuring points between which a dimension is to be determined in the displayed representation, a surroundings sensor configured to sense a distance and a direction of each of the measuring points with respect to the vehicle, and an evaluation apparatus configured to determine the dimension based on the sensed distances and directions of the measuring points, wherein the evaluation apparatus is further configured to output the determined dimension.
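Since each measuring point is sensed as a distance and a direction relative to the vehicle, the evaluation step reduces to converting both measurements into vehicle-frame coordinates and taking the Euclidean distance between them. A minimal 2D sketch under that assumption (the patented device may also use elevation for full 3D measurement; the function below is illustrative):

```python
import math

def dimension_between(d1, bearing1_deg, d2, bearing2_deg):
    """Distance between two measuring points, each sensed as a range (m)
    and a bearing (degrees) relative to the vehicle."""
    p1 = (d1 * math.cos(math.radians(bearing1_deg)),
          d1 * math.sin(math.radians(bearing1_deg)))
    p2 = (d2 * math.cos(math.radians(bearing2_deg)),
          d2 * math.sin(math.radians(bearing2_deg)))
    # Euclidean distance between the two vehicle-frame points.
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Two points each 5 m away but 90 degrees apart: sqrt(50) ~ 7.07 m.
print(round(dimension_between(5.0, 0.0, 5.0, 90.0), 2))  # 7.07
```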
WORK VEHICLE DEBRIS ACCUMULATION CONTROL SYSTEMS
A debris accumulation control system is provided for usage within a work vehicle including an operator station and a work vehicle compartment. In embodiments, the work vehicle debris accumulation control system includes a display device located in the operator station of the work vehicle, a three dimensional (3D) imaging device having a field of view (FOV) encompassing a debris-gathering region of the work vehicle compartment, and a controller operably coupled to the display device and to the 3D imaging device. The controller is configured to: (i) utilize 3D imaging data provided by the 3D imaging device to estimate a debris accumulation risk level within the work vehicle compartment; and (ii) generate a first visual alert on the display device when the debris accumulation risk level surpasses a first predetermined threshold.