G01S7/539

Intelligent ultrasonic system and rear collision warning apparatus for vehicle

An intelligent ultrasonic system may include: a camera sensor unit configured to take an image of a road ahead of a driving vehicle; an ultrasonic signal input unit configured to receive an ultrasonic signal sensed through one or more ultrasonic sensors mounted on the vehicle; a feature extraction unit configured to extract a feature of the received ultrasonic signal; a data collection unit configured to collect one or more data related to a surrounding situation of the road on which the vehicle is driven; and a control unit configured to divide the surrounding situation into two or more classes based on the one or more data collected through the data collection unit, and change or reset an existing parameter to a parameter corresponding to any one class of the classes when the surrounding situation corresponds to, or changes to, that class.
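The class-based parameter switching performed by the control unit can be sketched as follows. The class names and parameter values here are illustrative assumptions, not taken from the patent:

```python
# Each surrounding-situation class maps to its own ultrasonic parameters
# (assumed example values).
PARAMETER_SETS = {
    "dry_asphalt": {"gain": 1.0, "detection_threshold": 0.30},
    "wet_road":    {"gain": 1.4, "detection_threshold": 0.45},
    "gravel":      {"gain": 1.2, "detection_threshold": 0.55},
}

class ControlUnit:
    """Switches ultrasonic parameters when the surrounding-situation class changes."""

    def __init__(self, initial_class="dry_asphalt"):
        self.current_class = initial_class
        self.parameters = dict(PARAMETER_SETS[initial_class])

    def update(self, observed_class):
        # Change or reset the existing parameters only when the surrounding
        # situation corresponds to, or changes to, a different class.
        if observed_class != self.current_class:
            self.current_class = observed_class
            self.parameters = dict(PARAMETER_SETS[observed_class])
        return self.parameters
```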

Escape system for a sunken car and ultrasonic component
11454614 · 2022-09-27 ·

The present disclosure describes an escape system for a sunken car and an ultrasonic component. The ultrasonic component has a case and an ultrasonic module, and the escape system includes at least the ultrasonic component and a main board. The escape system and the ultrasonic component use the properties of ultrasound to recognize the type and thickness of an obstacle accumulated on the ultrasonic component and to determine whether a warning message prompting removal of the obstacle should be sent, so as to maintain the sensitivity of the ultrasonic component and to prevent it from mistakenly judging that the car is sunk.
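One way the thickness check could work is to infer the accumulated layer's thickness from the round-trip time of an internal echo. This is an illustrative sketch only, not the patented algorithm; the material sound speeds and the thickness limit are assumed example values:

```python
SOUND_SPEED = {"mud": 1500.0, "ice": 3200.0}  # m/s, assumed material speeds
MAX_THICKNESS_M = 0.005                        # warn above 5 mm (assumption)

def obstacle_thickness(echo_round_trip_s, material):
    """Thickness = sound speed * one-way travel time through the layer."""
    return SOUND_SPEED[material] * echo_round_trip_s / 2.0

def should_warn(echo_round_trip_s, material):
    """True when the accumulated layer is thick enough to degrade sensing."""
    return obstacle_thickness(echo_round_trip_s, material) > MAX_THICKNESS_M
```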

IN-VEHICLE OBJECT DETERMINING APPARATUS
20170322299 · 2017-11-09 ·

An in-vehicle object determining apparatus cooperates with an obstacle sensor unit, which detects an obstacle at a first time. An estimated detected state is calculated as the detected state of the obstacle that the obstacle sensor unit is expected to detect at a second time, after a lapse of a predetermined time period from the first time, assuming that the obstacle is stationary, based on (i) the vehicle-relative position of the obstacle detected at the first time, (ii) the sensor position of the obstacle sensor unit, and (iii) the vehicle position change during the period from the first time to the second time. The obstacle is determined to be a moving object based on a discrepancy between the estimated detected state and the real detected state actually detected by the obstacle sensor unit at the second time.
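The geometric core of this determination can be sketched in two dimensions. The simplifications here are assumptions: planar vehicle motion given as a translation plus yaw, and a discrepancy tolerance chosen for illustration:

```python
import math

def predicted_relative_position(obstacle_xy_t1, dx, dy, dtheta):
    """Where a *stationary* obstacle should appear at the second time, in the
    vehicle frame, after the vehicle moves (dx, dy) and yaws by dtheta."""
    # Shift into the new vehicle origin, then rotate by -dtheta.
    x = obstacle_xy_t1[0] - dx
    y = obstacle_xy_t1[1] - dy
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return (c * x - s * y, s * x + c * y)

def is_moving(obstacle_xy_t1, obstacle_xy_t2, dx, dy, dtheta, tol=0.3):
    """Moving if the estimated and actually detected states disagree."""
    px, py = predicted_relative_position(obstacle_xy_t1, dx, dy, dtheta)
    return math.hypot(obstacle_xy_t2[0] - px, obstacle_xy_t2[1] - py) > tol
```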

SYSTEM AND METHOD FOR DETECTING INTERACTION WITH LIVING OBJECT

A living object detection system may detect the interaction between an observer and a target object. The system may include an object sensor device, an object classification data unit, and an object detection module. The object sensor device may be attachable to the observer and include an ultrasonic sensor for sensing distance and a passive infrared sensor for sensing temperature. The object classification data unit may store predetermined object classifiers that identify an object as a living object or a non-living object. The object detection module may determine whether the target object is a living object or a non-living object based on the object classifiers stored in the object classification data unit and on a physical feature set of the target object, where the physical feature set may include distance and temperature parameters.
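A minimal sketch of the classification step follows. The patent leaves the concrete classifier form open, so the thresholds on the distance and temperature features here are assumed values for illustration:

```python
def classify_object(distance_m, temperature_c,
                    max_range_m=2.0, temp_band_c=(28.0, 42.0)):
    """Return 'living' when the target is inside the sensing range and its
    passive-infrared temperature falls in a warm-body band, else 'non-living'."""
    in_range = distance_m <= max_range_m
    warm = temp_band_c[0] <= temperature_c <= temp_band_c[1]
    return "living" if in_range and warm else "non-living"
```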

Method for Tracking a Target Acoustic Source

A method of processing an acoustic image includes the steps of acquiring acoustic signals generated by acoustic sources in a predetermined region of space, generating a multispectral 3D acoustic image that includes a collection of 2D acoustic images, performing a frequency integration of the multispectral acoustic image for generating a 2D acoustic map locating at least one target acoustic source of interest and modeling the signal spectrum associated with the target acoustic source, generating a classification map obtained by comparing the signal spectrum of each signal associated with each pixel of the multispectral acoustic image and the model of the signal spectrum associated with the target acoustic source to distinguish the spectrum of the signal associated with the target acoustic source from the signal spectra associated with the remaining acoustic sources, and merging the classification map and the acoustic map to obtain a merged map.
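The pipeline above (frequency integration, per-pixel spectral classification, merging) can be sketched numerically with NumPy. The spectral similarity measure (cosine similarity against the target model) and the 0.9 decision threshold are illustrative assumptions:

```python
import numpy as np

def process_acoustic_image(multispectral, target_spectrum, thresh=0.9):
    """multispectral: (F, H, W) stack of 2-D acoustic images, one per band.
    target_spectrum: (F,) model spectrum of the target acoustic source."""
    # Frequency integration -> 2-D acoustic map.
    acoustic_map = multispectral.sum(axis=0)

    # Compare each pixel's spectrum with the target model -> classification map.
    spectra = multispectral.reshape(multispectral.shape[0], -1)  # (F, H*W)
    num = (spectra * target_spectrum[:, None]).sum(axis=0)
    den = (np.linalg.norm(spectra, axis=0)
           * np.linalg.norm(target_spectrum) + 1e-12)
    classification = (num / den >= thresh).reshape(acoustic_map.shape)

    # Merge the classification map and the acoustic map.
    return acoustic_map * classification
```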

ULTRASOUND/RADAR FOR EYE TRACKING
20170261610 · 2017-09-14 ·

An eye tracking unit includes one or more transmitters that transmit a signal (e.g., a radar signal or an ultrasonic sound) at an eye, one or more receivers that receive a reflection of the signal generated by interaction of the signal with the eye, and an eye orientation estimation module that estimates an orientation of the eye based on the reflected signal received by the one or more receivers and based on a model of the eye. The eye tracking unit may be part of a head-mounted display (HMD) that includes a display element configured to display content to a user wearing the HMD. The model of the eye may be trained by displaying a visual indicator on the display element and detecting a reflected signal corresponding to the eye looking at the visual indicator.
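The train-then-estimate loop can be reduced to a very simple sketch in which the "model of the eye" is a nearest-neighbor table from reflected-signal features to the known gaze direction of the displayed indicator. The feature representation and the distance metric are assumptions for illustration, not the patent's model:

```python
import math

class EyeOrientationModel:
    def __init__(self):
        self._samples = []  # (reflected feature vector, gaze direction) pairs

    def train(self, reflected_feature, indicator_direction):
        """Record the reflection observed while the user looks at a known
        visual indicator shown on the display element."""
        self._samples.append((reflected_feature, indicator_direction))

    def estimate(self, reflected_feature):
        """Estimate gaze as the direction of the closest trained reflection."""
        def dist(sample):
            return math.dist(sample[0], reflected_feature)
        return min(self._samples, key=dist)[1]
```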

AMBIGUITY MITIGATION BASED ON COMMON FIELD OF VIEW OF RADAR SYSTEMS
20220236409 · 2022-07-28 ·

A method includes obtaining an initial point cloud for each of two or more radar systems that share a common field of view. Each initial point cloud results from processing reflected energy at each of the two or more radar systems. Each point of the initial point cloud indicates one or more hypotheses for a range, a Doppler, and a direction of arrival (DOA) to an object that resulted in the reflected energy. A point cloud, obtained from the initial point cloud, has the same number of hypotheses for the range, the Doppler, and the DOA. Ambiguity in the common field of view is resolved based on the point clouds to obtain resolved and unresolved points in the common field of view. A radar image obtained from each of the two or more radar systems is used to control an aspect of operation of a vehicle.
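One way to picture the ambiguity resolution is as an intersection of hypothesis sets across the radars sharing the field of view: a point is "resolved" when exactly one hypothesis survives in all of them. The tuple representation and exact-equality matching are assumptions made for this sketch:

```python
def resolve_point(hypotheses_per_radar):
    """hypotheses_per_radar: one set of (range, doppler, doa) tuples per radar,
    all referring to the same point in the common field of view.
    Returns (resolved_hypothesis or None, is_resolved)."""
    common = set.intersection(*hypotheses_per_radar)
    if len(common) == 1:
        return next(iter(common)), True   # resolved point
    return None, False                     # unresolved: 0 or >1 survivors
```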
