Rotating LIDAR with co-aligned imager

Example implementations are provided for an arrangement of co-aligned rotating sensors. One example device includes a light detection and ranging (LIDAR) transmitter that emits light pulses toward a scene according to a pointing direction of the device. The device also includes a LIDAR receiver that detects reflections of the emitted light pulses from the scene. The device also includes an image sensor that captures an image of the scene based on at least external light originating from one or more external light sources. The device also includes a platform that supports the LIDAR transmitter, the LIDAR receiver, and the image sensor in a particular relative arrangement. The device also includes an actuator that rotates the platform about an axis to adjust the pointing direction of the device.

Vehicular vision system with accelerated determination of another vehicle

A vehicular sensing system includes a data processor disposed at a vehicle, a camera disposed at an in-cabin side of the vehicle windshield, and a range sensor disposed at the vehicle and having a field of sensing that overlaps a portion of the field of view of the camera. During a driving maneuver of the vehicle, the system detects another vehicle that is located within the range sensor's field of sensing and outside of the camera's field of view and determines movement of the range sensor-detected other vehicle relative to the vehicle. When the range sensor-detected other vehicle enters the field of view of the camera, the system, at least in part via processing at the data processor of image data captured by the camera, determines that the range sensor-detected other vehicle is an object of interest. A system output is provided for use by a driving assist system.
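The handoff described above — track a vehicle with the range sensor while it is outside the camera's view, then confirm it once it enters the camera's field of view — can be sketched as follows. This is a hypothetical illustration, not the patented implementation; the field-of-view half-angle, the constant-velocity motion model, and the function names are all assumptions.

```python
import math

# Hypothetical forward-camera horizontal field of view (half-angle, degrees).
CAMERA_HALF_FOV_DEG = 25.0

def in_camera_fov(x_m: float, y_m: float) -> bool:
    """True if a point (vehicle frame: x forward, y left, metres)
    lies inside the camera's horizontal field of view."""
    azimuth = math.degrees(math.atan2(y_m, x_m))
    return x_m > 0 and abs(azimuth) <= CAMERA_HALF_FOV_DEG

def track_until_visible(track, dt=0.1, steps=100):
    """Propagate a range-sensor track (x, y, vx, vy, vehicle-relative,
    constant velocity assumed) until its predicted position enters the
    camera's field of view; return (time, position) of the handoff,
    or None if it never becomes visible within the horizon."""
    x, y, vx, vy = track
    for step in range(steps):
        if in_camera_fov(x, y):
            return step * dt, (x, y)
        x += vx * dt
        y += vy * dt
    return None
```

Once the handoff point is reached, image data from that bearing would be processed to confirm the object of interest, so the classifier only runs where the range sensor predicts a target.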

VEHICULAR CONTROL SYSTEM WITH ROAD CURVATURE DETERMINATION
20230072196 · 2023-03-09 ·

A vehicular control system includes a forward viewing camera disposed at an in-cabin side of a windshield of a vehicle and viewing forward of the vehicle. Road curvature of a road along which the vehicle is traveling is determined responsive at least in part to processing of image data captured by the camera. Responsive at least in part to processing of captured image data, speed of the vehicle is controlled by an adaptive cruise control system of the vehicle. Upon approach of the vehicle to a curve in the road along which the vehicle is traveling, speed of the vehicle is reduced by the adaptive cruise control system to a reduced speed for traveling around the curve in the road. Speed of the vehicle is increased by the adaptive cruise control system when the vehicle is traveling along a straighter section of road after the curve in the road.
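The speed-reduction behaviour described above amounts to capping cruise speed by the curvature of the road ahead. A minimal sketch, assuming a standard lateral-acceleration comfort limit (the 2 m/s² value and the function name are assumptions, not from the patent):

```python
import math

def curve_speed_limit(curvature_per_m: float,
                      a_lat_max: float = 2.0,
                      v_set: float = 30.0) -> float:
    """Speed (m/s) keeping lateral acceleration a = v^2 * curvature
    below a_lat_max on a curve of the given curvature (1/m).
    An effectively straight road returns the cruise set speed."""
    if curvature_per_m <= 1e-6:
        return v_set
    return min(v_set, math.sqrt(a_lat_max / curvature_per_m))
```

As the camera-derived curvature falls back toward zero after the curve, the limit returns to the set speed, matching the abstract's behaviour of speeding back up on the straighter section.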

Actively Modifying a Field of View of an Autonomous Vehicle in view of Constraints
20230075786 · 2023-03-09 ·

Methods and devices for actively modifying a field of view of an autonomous vehicle in view of constraints are disclosed. In one embodiment, an example method is disclosed that includes causing a sensor in an autonomous vehicle to sense information about an environment in a first field of view, where a portion of the environment is obscured in the first field of view. The example method further includes determining a desired field of view in which the portion of the environment is not obscured and, based on the desired field of view and a set of constraints for the vehicle, determining a second field of view in which the portion of the environment is less obscured than in the first field of view. The example method further includes modifying a position of the vehicle, thereby causing the sensor to sense information in the second field of view.
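The selection step — choose, among constraint-satisfying candidate positions, the one whose field of view is least obscured — can be sketched as a simple feasibility-filtered search. The candidate representation and scoring callbacks are hypothetical, not from the disclosure:

```python
def choose_pose(candidates, occluded_fraction, is_feasible):
    """Pick the feasible candidate pose whose sensed field of view is
    least obscured. candidates[0] is the current pose and serves as the
    fallback; occluded_fraction(pose) scores occlusion in [0, 1];
    is_feasible(pose) encodes the vehicle's constraint set."""
    best = candidates[0]
    best_score = occluded_fraction(best)
    for pose in candidates[1:]:
        if not is_feasible(pose):
            continue
        score = occluded_fraction(pose)
        if score < best_score:
            best, best_score = pose, score
    return best
```

Note the chosen pose need not be fully unobscured (the "desired" field of view); it is only required to be less obscured than the current one while respecting the constraints, as in the abstract.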

Periphery monitoring device

A periphery monitoring device includes: an acquisition section acquiring a steering angle of a vehicle; an image acquisition section acquiring a captured image from an image capturing section that captures an image of a periphery of the vehicle; a detection section acquiring detection information of an object around the vehicle; and a control section causing a display section to display a synthesized image including a vehicle image showing the vehicle and a periphery image showing the periphery of the vehicle based on the captured image. When the object is detected on a course of the vehicle traveling at the steering angle by a predetermined distance, the control section causes a virtual vehicle image to be displayed in the synthesized image, superimposed on the course to the object, with the position of the vehicle as a reference.
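The predicted course at a given steering angle is commonly derived from a kinematic bicycle model; a point a chosen arc length ahead is where a virtual vehicle image of this kind could be superimposed. A sketch under that assumption (the model choice and names are illustrative, not from the patent):

```python
import math

def turning_radius_m(wheelbase_m: float, steering_angle_rad: float) -> float:
    """Kinematic bicycle-model turning radius for a given steering angle."""
    return wheelbase_m / math.tan(steering_angle_rad)

def course_point(wheelbase_m: float, steering_angle_rad: float,
                 arc_len_m: float):
    """(x forward, y left) on the predicted course after travelling
    arc_len_m, with the vehicle's rear axle as the reference origin."""
    if abs(steering_angle_rad) < 1e-6:   # straight ahead
        return (arc_len_m, 0.0)
    r = turning_radius_m(wheelbase_m, steering_angle_rad)
    theta = arc_len_m / r                # swept arc angle
    return (r * math.sin(theta), r * (1 - math.cos(theta)))
```

Evaluating `course_point` at increasing arc lengths up to the detected object's distance yields the course on which the virtual vehicle image is drawn.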

System and method for visibility enhancement
11648876 · 2023-05-16 ·

A system for visibility enhancement in a motor vehicle assistant system warns the driver of hazardous situations due to at least one object located within a critical range defined relative to the motor vehicle. The system includes at least a first sensor means comprising a camera, installed in a rear view equipment of the motor vehicle, adapted to record at least one image, and an image processing means adapted to receive a first input signal from the first sensor means containing the at least one image and a second input signal containing at least one position profile of the at least one object located within the critical range, and to manipulate the at least one image to generate a contrast-manipulated image. A corresponding method of visibility enhancement is also described.

IMAGE GENERATING DEVICE AND IMAGE GENERATING METHOD
20170368993 · 2017-12-28 ·

An ECU captures an image of an imaging region around an own vehicle and acquires image data, the imaging region being configured by a plurality of imaging areas. The ECU determines whether an object is present in each imaging area, on the basis of detection results for objects present around the own vehicle. The ECU selects, on the basis of the determination results, a target area to be displayed in an easy-to-see state from among the plurality of imaging areas. The ECU reduces and corrects the image data of each imaging area such that an image of the target area is displayed in the easy-to-see state compared to the images of the other imaging areas, to generate display image data.

TIME OF FLIGHT CAMERAS USING PASSIVE IMAGE SENSORS AND EXISTING LIGHT SOURCES

A vehicle may perform time of flight (ToF) calculations or determinations using passive sensors (e.g., passive image sensors) with existing onboard vehicle light sources to generate depth data and a depth image of a scene surrounding the vehicle. For example, the vehicle may use an automotive camera (e.g., a passive sensor that does not emit light) and may pair the camera with one or more light sources on the vehicle to construct a ToF camera. To accomplish this construction of a ToF camera using existing automotive cameras and existing light sources, a controller of the vehicle synchronizes the passive automotive cameras to the onboard light sources. In some examples, the vehicle can enable a high dynamic range (HDR) for the ToF cameras by varying properties of the light emitted from the light sources.
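At its core, a ToF depth measurement converts the round-trip delay between light emission and its arrival at the sensor into distance. A minimal sketch of that conversion (the function name and the per-pixel loop are illustrative assumptions, not the patent's method):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(emit_time_s: float, return_time_s: float) -> float:
    """One-way depth from a round-trip time of flight: the light travels
    to the surface and back, so halve the path length."""
    return C * (return_time_s - emit_time_s) / 2.0

def depth_image(emit_time_s, return_times_s):
    """Per-pixel depths from per-pixel return times (camera synchronized
    to the onboard light source so emit_time_s is known)."""
    return [[tof_depth_m(emit_time_s, t) for t in row]
            for row in return_times_s]
```

The synchronization mentioned in the abstract is what makes `emit_time_s` known to the camera side; without it, a passive sensor has no time reference for the round trip.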

VIDEO RECORDING APPARATUS AND CONTROLLING METHOD THEREOF
20230202406 · 2023-06-29 ·

The present disclosure relates to a video recording apparatus and a controlling method thereof. According to an embodiment of the present disclosure, a video recording apparatus may include a motion detection sensor, a controller, and an application processor. The motion detection sensor may sense motion of an external object outside of a vehicle. The controller may be activated if the motion detection sensor senses motion of the external object and may monitor the external object through the motion detection sensor during a predetermined time period. The application processor may be activated based on a monitoring result of the external object so as to perform a video recording function.

Light-diffusing optical elements having cladding with scattering centers
09851500 · 2017-12-26 ·

A light-diffusing optical element with efficient coupling to light sources with high numerical aperture. The light-diffusing optical element includes a higher index core surrounded by a lower index cladding. The cladding includes scattering centers that scatter evanescent light entering the cladding from the core. The scattered light exits the element to provide broad-area illumination along the element. Scattering centers include dopants, nanoparticles and/or internal voids. The core may also include scattering centers. The core is glass and the cladding may be glass or a polymer. The element features high numerical aperture and high scattering efficiency.
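The numerical aperture the abstract emphasizes follows directly from the core and cladding refractive indices of a step-index waveguide. A small illustrative calculation (the example index values are assumptions, not taken from the patent):

```python
import math

def numerical_aperture(n_core: float, n_clad: float) -> float:
    """NA of a step-index optical element: NA = sqrt(n_core^2 - n_clad^2).
    A larger core/cladding index contrast gives a larger acceptance cone,
    i.e. more efficient coupling to high-NA light sources."""
    return math.sqrt(n_core ** 2 - n_clad ** 2)

def acceptance_half_angle_deg(n_core: float, n_clad: float) -> float:
    """Half-angle of the acceptance cone in air (n_outside = 1)."""
    return math.degrees(math.asin(numerical_aperture(n_core, n_clad)))
```

For instance, a hypothetical glass core at n = 1.46 with a polymer cladding at n = 1.40 gives a substantially larger NA than a conventional 1.46/1.44 silica pair, consistent with the element's goal of efficient coupling to high-NA sources.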