G01S17/02

Optical sensing in MEMS package for LiDAR system

Embodiments of the disclosure provide systems and methods for incorporating an optical sensing system in a MEMS package for real-time sensing of the angular position of a MEMS mirror. The system may include an optical source configured to emit an optical signal to a backside of the MEMS mirror. The system may also include an optical detector configured to receive a returning optical signal reflected by the backside of the MEMS mirror. The system may further include at least one controller. The at least one controller may be configured to determine a scanning angle of the MEMS mirror based on a position on the optical detector where the returning optical signal is received.
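The abstract does not give the geometry, but the usual relation in such backside-sensing schemes is that a mirror tilt of θ deflects the reflected probe beam by 2θ, shifting the spot on the detector. A minimal sketch of that recovery, with hypothetical names and an assumed flat detector at distance L from the mirror:

```python
import math

def mirror_angle_from_spot(spot_offset_mm: float, detector_distance_mm: float) -> float:
    """Estimate the MEMS mirror tilt (degrees) from where the returning
    optical signal lands on the detector.

    Assumption: a mirror tilt of theta deflects the reflected beam by
    2*theta, so the spot on a detector at distance L shifts by about
    L * tan(2*theta); invert that to recover theta.
    """
    two_theta = math.atan2(spot_offset_mm, detector_distance_mm)
    return math.degrees(two_theta) / 2.0
```

For example, a spot offset equal to the detector distance corresponds to 2θ = 45°, i.e. a mirror tilt of 22.5°.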

OPTICAL MEASURING DEVICE, ASSEMBLING DEVICE OF MOUNTING SUBSTRATE, AND ASSEMBLING METHOD FOR MOUNTING SUBSTRATE
20220342161 · 2022-10-27

An optical measuring device includes: a laser light source that emits first light having a first wavelength; an image capturing unit that emits second light having a second wavelength different from the first wavelength; a separating unit that receives the first light and the second light to direct the first light and the second light toward an object to be measured, and receives reflected light from the object to be measured to separate the reflected light into first reflected light based on the first light and second reflected light based on the second light; a light receiving element that receives the first reflected light separated by the separating unit; and a calculating unit that calculates a yawing angle and a pitching angle of the object to be measured based on a light receiving result of the light receiving element, in which the image capturing unit captures an image of the object to be measured by receiving the second reflected light separated by the separating unit, and the calculating unit calculates a rolling angle of the object to be measured based on an image capturing result acquired by the image capturing unit.
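The split described here is the classic autocollimator arrangement: yaw and pitch come from the displacement of the laser return on the light receiving element, while roll comes from rotation seen in the camera image. A hedged sketch with hypothetical function names, assuming a position-sensitive detector behind a lens of focal length f and two tracked image features for roll:

```python
import math

def yaw_pitch_from_psd(dx_mm: float, dy_mm: float, focal_length_mm: float):
    # Autocollimation assumption: a tilt of theta deflects the reflected
    # beam by 2*theta, so the spot shifts by about f * tan(2*theta).
    yaw = math.degrees(math.atan2(dx_mm, focal_length_mm)) / 2.0
    pitch = math.degrees(math.atan2(dy_mm, focal_length_mm)) / 2.0
    return yaw, pitch

def roll_from_image(feature_a, feature_b):
    # Roll is estimated from the rotation of the line joining two tracked
    # features of the object in the captured image (pixel coordinates).
    (xa, ya), (xb, yb) = feature_a, feature_b
    return math.degrees(math.atan2(yb - ya, xb - xa))
```

This is only one plausible realisation of the calculating unit; the patent itself does not fix the formulas.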

MONOSTATIC LIDAR TRANSCEIVER SYSTEM

A LiDAR system includes a light source and an arrayed micro-optic configured to receive light from the light source so as to produce and project a two-dimensional array of light spots on a scene. The LiDAR system also includes receiver optics having an array of optical detection sites configured to establish a one-to-one correspondence between light spots in the two-dimensional array and optical detection sites in the receiver optics. The LiDAR system further includes a birefringent prism and a lens. The LiDAR system may also include a mask placed in the light path between the birefringent prism and the receiver optics. Alternatively, the LiDAR system may include a controller programmed to activate or deactivate each optical detection site.

Apparatus for acquiring 3-dimensional maps of a scene

An active sensor for performing active measurements of a scene is presented. The active sensor includes at least one transmitter configured to emit light pulses toward at least one target object in the scene, wherein the at least one target object is recognized in an image acquired by a passive sensor; at least one receiver configured to detect light pulses reflected from the at least one target object; a controller configured to control an energy level, a direction, and a timing of each light pulse emitted by the transmitter, wherein the controller is further configured to control at least the direction for detecting each of the reflected light pulses; and a distance measurement circuit configured to measure a distance to each of the at least one target object based on the emitted light pulses and the detected light pulses.
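The distance measurement circuit described at the end is standard pulsed time-of-flight: the pulse travels to the target and back, so the one-way range is c·Δt/2. A minimal sketch of that arithmetic:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_m(emit_time_s: float, detect_time_s: float) -> float:
    """Range from round-trip time of flight: the emitted pulse travels to
    the target and back, so the one-way distance is c * dt / 2."""
    dt = detect_time_s - emit_time_s
    return C * dt / 2.0
```

A 1 µs round trip, for instance, corresponds to roughly 150 m of range.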

SENSOR DEVICE
20230131002 · 2023-04-27

Some of the electromagnetic waves emitted from an emission unit (110) and reflected by a movable reflection unit (120) are reflected or scattered by a target object, such as an object existing outside a sensor device (10). Others of the electromagnetic waves emitted by the emission unit (110) and reflected by the movable reflection unit (120) are reflected or scattered by a structure (200) positioned closer to the movable reflection unit (120) than the target object is. A detection unit (122) detects deflection angles of the movable reflection unit (120) in a first direction (X) and a second direction (Y). An amendment unit (150) amends a detection result of the detection unit (122) based on a result of reception, by a receiving unit (130), of the electromagnetic waves reflected or scattered by the structure (200).
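The amendment unit's role is easiest to see as a per-frame bias correction: since the structure (200) sits at a known angular position relative to the reflection unit, the angle at which its echo is received yields an estimate of the detector's current error. A hedged sketch of one such correction, with hypothetical names (the patent does not specify the amendment formula):

```python
def amend_deflection(measured_deg: float,
                     ref_return_deg: float,
                     ref_expected_deg: float) -> float:
    """Amend a deflection-angle reading using the echo from a fixed
    reference structure at a known angle.

    The difference between where the structure's echo is actually received
    and where it should appear is taken as the detection bias, which is
    then subtracted from the measured deflection angle.
    """
    bias = ref_return_deg - ref_expected_deg
    return measured_deg - bias
```

In practice the same idea would be applied independently to the X and Y deflection angles.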

Sensor data segmentation
11475573 · 2022-10-18

A system may include one or more processors configured to receive a plurality of images representing an environment. The images may include image data generated by an image capture device. The processors may also be configured to transmit the image data to an image segmentation network configured to segment the images. The processors may also be configured to receive sensor data associated with the environment including sensor data generated by a sensor of a type different than an image capture device. The processors may be configured to associate the sensor data with segmented images to create a training dataset. The processors may be configured to transmit the training dataset to a machine learning network configured to run a sensor data segmentation model, and train the sensor data segmentation model using the training dataset, such that the sensor data segmentation model is configured to segment sensor data.
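The key step, associating non-image sensor data with segmented images to create a training dataset, typically amounts to projecting each sensor return (e.g. a lidar point) into the segmented image and copying the per-pixel class label onto it. A minimal sketch under an assumed pinhole camera model with intrinsic matrix K; the function name and conventions are illustrative, not from the patent:

```python
import numpy as np

def label_lidar_points(points_xyz, seg_mask, K):
    """Project lidar points (camera-frame x, y, z) into a segmented image
    and copy each point's per-pixel class label, producing (point, label)
    training pairs. Points behind the camera or outside the image are
    dropped."""
    h, w = seg_mask.shape
    labeled = []
    for p in points_xyz:
        x, y, z = p
        if z <= 0:  # behind the camera plane
            continue
        u = int(K[0, 0] * x / z + K[0, 2])  # pinhole projection, column
        v = int(K[1, 1] * y / z + K[1, 2])  # pinhole projection, row
        if 0 <= u < w and 0 <= v < h:
            labeled.append((p, int(seg_mask[v, u])))
    return labeled
```

The resulting pairs are what the abstract calls the training dataset for the sensor data segmentation model.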

Sensor assembly with cleaning

A sensor assembly includes a housing including a first chamber and a second chamber fluidly connected to the first chamber, a first sensor disposed in the second chamber and including a first sensor window facing outward from the second chamber, and a second sensor outside and fixed relative to the first chamber and the second chamber. The second sensor includes a second sensor window. The housing includes an intake from an exterior environment to the first chamber, a first outlet from the second chamber to the exterior environment, and a second outlet from the second chamber to the exterior environment. The first outlet is positioned to direct air across the first sensor window, and the second outlet is positioned to direct air across the second sensor window.

OPTICAL SCANNER

An optical scanner (100) comprises: a mirror (200) pivotally mounted about a first pivot axis and partially transparent to light radiation from its front face (210) towards its rear face (220); and a light source (300) intended to emit light radiation incident on the front face (210) of the mirror (200).
The scanner is characterised in that the rear face (220) comprises a structure formed by at least one essentially planar facet inclined with respect to the front face (210), such that light radiation incident on the front face (210) and transmitted by the at least one facet (220i) undergoes a deflection with respect to the angle of incidence of said light radiation on the front face (210).
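The deflection added by such an inclined rear facet can be estimated with the thin-prism approximation: a ray crossing a thin prism of apex angle α is deviated by roughly (n − 1)·α. A sketch of that estimate, assuming small angles and a substrate of refractive index n (values illustrative, not from the patent):

```python
def facet_deflection_deg(n: float, facet_angle_deg: float) -> float:
    """Thin-prism estimate of the extra deflection introduced by a planar
    rear facet inclined at `facet_angle_deg` to the front face: deviation
    is approximately (n - 1) * alpha for small apex angles, where n is the
    refractive index of the partially transparent mirror substrate."""
    return (n - 1.0) * facet_angle_deg
```

For a glass-like substrate (n ≈ 1.5), a 10° facet would thus steer the transmitted beam by about 5°.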

Sensor Steering for Multi-Directional Long-Range Perception
20230161047 · 2023-05-25

The present disclosure relates to systems, vehicles, and methods for adjusting a pointing direction and/or a scanning region of a lidar. An example method includes determining a plurality of points of interest within an environment of a vehicle. The method also includes assigning, to each point of interest of the plurality of points of interest, a respective priority score. The method additionally includes partitioning at least a portion of the environment of the vehicle into a plurality of sectors. Each sector of the plurality of sectors includes at least one point of interest. For each sector of the plurality of sectors, the method includes adjusting a scanning region of a lidar unit based on the respective sector and causing the lidar unit to scan the respective sector.