VEHICULAR CABIN MONITORING SYSTEM
20240233511 · 2024-07-11
Inventors
- Mohammad Hossein Golbon Haghighi (Waltham, MA, US)
- Ashwin Alapakkam Kannan (Quincy, MA, US)
- Ashesh Goswami (Somerville, MA, US)
CPC classification
G08B25/00
PHYSICS
B60R21/01534
PERFORMING OPERATIONS; TRANSPORTING
International classification
G06V20/59
PHYSICS
B60R21/015
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A vehicular cabin monitoring system includes a radar sensor disposed at a vehicle so as to sense an interior of the vehicle. The radar sensor includes at least one transmitter that transmits radio signals, and a plurality of receivers that receive radio signals. The vehicular cabin monitoring system, responsive to processing by a processor of radar data captured by the radar sensor and using a neural network, determines occupancy of a seat of the vehicle. The determined occupancy of the seat of the vehicle is one of (i) the seat of the vehicle is not occupied, (ii) the seat of the vehicle is occupied by an adult, or (iii) the seat of the vehicle is occupied by a child. The vehicular cabin monitoring system generates an alert responsive to determining that the seat of the vehicle is occupied by a child and that no seat of the vehicle is occupied by an adult.
Claims
1. A vehicular cabin monitoring system, the vehicular cabin monitoring system comprising: a radar sensor disposed at an interior cabin of a vehicle equipped with the vehicular cabin monitoring system, the radar sensor sensing at least a portion of the interior cabin of the vehicle; wherein the radar sensor comprises (i) at least one transmitter that transmits radio signals and (ii) a plurality of receivers that receive radio signals; an electronic control unit (ECU) comprising electronic circuitry and associated software; wherein the electronic circuitry of the ECU comprises a processor operable to process radar data captured by the radar sensor; wherein radar data captured by the radar sensor is transferred to the ECU; wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor and using a neural network trained with radar sensor data, determines occupancy of a seat of the vehicle, and wherein the determined occupancy of the seat of the vehicle comprises one selected from the group consisting of (i) the seat of the vehicle is not occupied, (ii) the seat of the vehicle is occupied by an adult and (iii) the seat of the vehicle is occupied by a child; wherein the vehicular cabin monitoring system generates an alert at least in part responsive to (i) determination that the seat of the vehicle is occupied by a child and (ii) determination that no seat of the vehicle is occupied by an adult; and wherein the vehicular cabin monitoring system generates the alert at least in part responsive to determination that (i) the seat of the vehicle is occupied by the child and (ii) no seat is occupied by an adult for a threshold period of time.
2. The vehicular cabin monitoring system of claim 1, wherein the threshold period of time comprises a threshold period of time after the vehicle is turned off.
3. The vehicular cabin monitoring system of claim 1, wherein the neural network comprises a quantized neural network.
4. The vehicular cabin monitoring system of claim 1, wherein the neural network comprises a convolutional neural network (CNN).
5. The vehicular cabin monitoring system of claim 1, wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, determines a plurality of zones of radar data captured by the radar sensor, and wherein each zone of the plurality of zones corresponds to a different seat of the vehicle.
6. The vehicular cabin monitoring system of claim 1, wherein the radar sensor is mounted at a headliner in the cabin of the vehicle.
7. The vehicular cabin monitoring system of claim 6, wherein the radar sensor is disposed between a first row of seats of the vehicle and a second row of seats of the vehicle.
8. The vehicular cabin monitoring system of claim 1, wherein the vehicular cabin monitoring system transmits the alert to at least one from the group consisting of (i) an owner of the vehicle and (ii) emergency services.
9. The vehicular cabin monitoring system of claim 1, wherein the alert comprises at least one selected from the group consisting of (i) flashing a light of the vehicle and (ii) honking a horn of the vehicle.
10. The vehicular cabin monitoring system of claim 1, wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, applies a ground clutter filter to the radar data captured by the radar sensor.
11. The vehicular cabin monitoring system of claim 1, wherein the radar sensor operates at a frequency that is different than a frequency at which an advanced driver assist system (ADAS) of the vehicle operates.
12. The vehicular cabin monitoring system of claim 1, wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, generates a heat map image representative of the seat of the vehicle.
13. The vehicular cabin monitoring system of claim 12, wherein the heat map image represents ranges and angles to one or more objects at the seat of the vehicle.
14. The vehicular cabin monitoring system of claim 1, wherein the vehicular cabin monitoring system, at least in part responsive to (i) determination that a first seat of the vehicle is occupied by a child, (ii) determination that a second seat of the vehicle is occupied by an adult and (iii) determination that the vehicle is off, generates a second alert indicating the child is present in the vehicle.
15. A vehicular cabin monitoring system, the vehicular cabin monitoring system comprising: a radar sensor disposed at an interior cabin of a vehicle equipped with the vehicular cabin monitoring system, the radar sensor sensing at least a portion of the interior cabin of the vehicle; wherein the radar sensor comprises (i) at least one transmitter that transmits radio signals and (ii) a plurality of receivers that receive radio signals; an electronic control unit (ECU) comprising electronic circuitry and associated software; wherein the electronic circuitry of the ECU comprises a processor operable to process radar data captured by the radar sensor; wherein radar data captured by the radar sensor is transferred to the ECU; wherein the radar sensor operates at a frequency that is different than a frequency at which an advanced driver assist system (ADAS) of the vehicle operates; wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, determines occupancy of a seat of the vehicle, and wherein the determined occupancy of the seat of the vehicle comprises one selected from the group consisting of (i) the seat of the vehicle is not occupied, (ii) the seat of the vehicle is occupied by an adult and (iii) the seat of the vehicle is occupied by a child; wherein the vehicular cabin monitoring system generates an alert at least in part responsive to (i) determination that the seat of the vehicle is occupied by a child and (ii) determination that no seat of the vehicle is occupied by an adult; and wherein the vehicular cabin monitoring system generates the alert at least in part responsive to determination that (i) the seat of the vehicle is occupied by the child and (ii) no seat is occupied by an adult for a threshold period of time.
16. The vehicular cabin monitoring system of claim 15, wherein the threshold period of time comprises a threshold period of time after the vehicle is turned off.
17. The vehicular cabin monitoring system of claim 15, wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, determines a plurality of zones of radar data captured by the radar sensor, and wherein each zone of the plurality of zones corresponds to a different seat of the vehicle.
18. The vehicular cabin monitoring system of claim 15, wherein the radar sensor is mounted at a headliner in the cabin of the vehicle.
19. The vehicular cabin monitoring system of claim 18, wherein the radar sensor is disposed between a first row of seats of the vehicle and a second row of seats of the vehicle.
20. A vehicular cabin monitoring system, the vehicular cabin monitoring system comprising: a radar sensor disposed at an interior cabin of a vehicle equipped with the vehicular cabin monitoring system, the radar sensor sensing at least a portion of the interior cabin of the vehicle; wherein the radar sensor comprises (i) at least one transmitter that transmits radio signals and (ii) a plurality of receivers that receive radio signals; an electronic control unit (ECU) comprising electronic circuitry and associated software; wherein the electronic circuitry of the ECU comprises a processor operable to process radar data captured by the radar sensor; wherein radar data captured by the radar sensor is transferred to the ECU; wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, determines a plurality of zones of radar data captured by the radar sensor, and wherein each zone of the plurality of zones corresponds to a different seat of the vehicle; wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, determines occupancy of each zone of the vehicle, and wherein the determined occupancy of the zone of the vehicle comprises one selected from the group consisting of (i) the zone of the vehicle is not occupied, (ii) the zone of the vehicle is occupied by an adult and (iii) the zone of the vehicle is occupied by a child; wherein the vehicular cabin monitoring system generates an alert at least in part responsive to (i) determination that at least one zone of the vehicle is occupied by a child and (ii) determination that no zone of the vehicle is occupied by an adult; and wherein the vehicular cabin monitoring system generates the alert at least in part responsive to determination that (i) the at least one zone of the vehicle is occupied by the child and (ii) no zone is occupied by an adult for a threshold period of time, and wherein the threshold period of time comprises a threshold period of time after the vehicle is turned off.
21. The vehicular cabin monitoring system of claim 20, wherein the vehicular cabin monitoring system transmits the alert to at least one from the group consisting of (i) an owner of the vehicle and (ii) emergency services.
22. The vehicular cabin monitoring system of claim 21, wherein the alert comprises at least one selected from the group consisting of (i) flashing a light of the vehicle and (ii) honking a horn of the vehicle.
23. The vehicular cabin monitoring system of claim 22, wherein the vehicular cabin monitoring system, via processing by the processor of radar data captured by the radar sensor, applies a ground clutter filter to the radar data captured by the radar sensor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0016] A vehicle sensing system and/or object detection system and/or alert system operates to capture sensing data of the interior of the vehicle and may process the captured data to detect objects within the vehicle, such as occupants within the vehicle. The system includes a processor that is operable to receive sensing data from one or more sensors and provide an output, such as an alert or control of a vehicle system.
[0017] Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 (
[0018] Radar sensing systems such as mm-wave multiple-input multiple-output (MIMO) frequency-modulated continuous wave (FMCW) radar sensing systems can readily estimate the location and velocity of targets within the field of view (FoV) of the radar sensor or sensors of the system. However, target classification for some applications remains a challenge for these radar systems. Implementations herein include using a radar sensor disposed at a vehicle to detect and classify children seated in the cabin of the equipped vehicle.
[0019] Children are especially vulnerable to heat stroke when left in a hot vehicle, even if the windows are slightly open. Heat stroke can occur when the body cannot regulate its temperature and can be deadly. A child's body temperature rises three to five times faster than an adult's. When a child is in a hot vehicle, their temperature can rise quickly, and they can die within minutes when their body temperature reaches 107 degrees Fahrenheit. Hundreds of children have died of heat stroke from being trapped in a hot vehicle.
[0020] Implementations herein include a low-cost mmWave MIMO FMCW radar sensor that detects a child left alone in a vehicle and alerts the owner, emergency services, or nearby adults to prevent heat stroke fatalities. While a child may be identified using a camera, cameras are an intrusive approach that may, for example, make occupants uncomfortable. Instead, the sensing system implements a non-intrusive approach to detect the presence or absence of a child or children using non-intrusive FMCW radar sensors and, for example, one or more neural networks (e.g., a convolutional neural network (CNN)). The radar sensor may operate at any frequency (e.g., 60 GHz) that avoids disruption or interference with collision avoidance radar sensors or advanced driver assistance system (ADAS) radars (or any other driver assistance system that implements radar sensors), which typically operate at 77 GHz.
[0021] By combining the strengths of convolutional neural networks (CNNs) and mmWave in-cabin radar sensor signals, the sensing system includes the ability to detect and recognize a child's presence in vehicles. The system may be highly accurate (e.g., approximately 99 percent accurate) in determining whether the car (or a particular seat) is empty, is occupied by a child, or is occupied by an adult.
[0022] This problem may be framed as a classification task where, given an input, the sensing system determines whether the input belongs to one of three classes: (i) empty (i.e., the seat is unoccupied), (ii) a child present in the seat, or (iii) an adult present in the seat. The system may first collect radar data to train and evaluate a classification algorithm. As shown in
[0023] Any suitable radar sensor operating around, for example, 60 GHz may be used. For example, the radar sensor may include an mmWave multiple-input multiple-output (MIMO) frequency modulated continuous wave (FMCW) radar to collect training and testing datasets for radar classification. The radar sensor may be equipped with any number of transmit and receive antennas with any operating frequency range. For example, the radar sensor is equipped with four transmit and three receive antennas, and the operating frequency range is 60-64 GHz with a bandwidth of 4 GHz. The radar sensor may have a wide field of view to ensure that each seat of the vehicle is within the field of view. For example, the radar's field of view may be 60 degrees on either side of the boresight angle, allowing the radar sensor to detect objects within a 120 degree range. Depending on mounting location (e.g., between the first and second rows of seats), the radar sensor may be mounted with a tilt (e.g., a 15 degree tilt) toward the rear of the vehicle because a mounting angle parallel to the floor may cause undesirable reflections (and by tilting toward the rear, the field of sensing may include a third row of seats, if applicable). Optionally, a ground clutter filter (e.g., during a post-processing step) may be applied to sensor data captured by the radar sensor to remove any unwanted reflections from returns without removing reflections from vehicle occupants.
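The ground clutter filtering described above can be illustrated with a common static-clutter-removal technique: subtracting the per-range-bin mean across chirps, which suppresses zero-Doppler returns from stationary surfaces while preserving returns from occupants. This is a minimal sketch of one such filter; the source does not specify the particular filter used, so the approach and all array shapes here are illustrative assumptions.

```python
import numpy as np

def remove_static_clutter(frames):
    """Suppress zero-Doppler (static) returns by subtracting the
    per-range-bin mean across chirps (slow time).

    frames: complex array of shape (num_chirps, num_range_bins),
            e.g. the output of a per-chirp range FFT.
    """
    # Stationary surfaces (floor, seats, trim) contribute a component
    # that is constant across chirps; occupants (breathing, motion)
    # produce a time-varying component that survives the subtraction.
    return frames - frames.mean(axis=0, keepdims=True)

# Toy example: a static reflector everywhere plus a slowly varying
# "occupant" return in range bin 40 (all values are illustrative).
chirps, bins = 64, 128
static = np.ones((chirps, bins), dtype=complex)          # clutter
occupant = np.zeros((chirps, bins), dtype=complex)
occupant[:, 40] = np.exp(1j * 0.3 * np.arange(chirps))   # moving target
filtered = remove_static_clutter(static + occupant)
```

After filtering, clutter-only bins are driven to zero while the occupant's bin retains energy, which is the behavior the post-processing step above is intended to achieve.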
[0024] In some examples, a range fast Fourier transform (FFT) is applied to the radar received signals to estimate the distance of the objects from the radar sensor, and/or a beamforming algorithm may be used to estimate the azimuth angle of arrival of all targets in the field of view of the radar sensor.
[0025] Range estimation for radar sensors refers to the process of determining the distance of an object from the radar sensor. Range estimation is an important function of radar systems, as it allows the radar to determine the position and movements of objects. Radar sensors and systems such as FMCW radars transmit a continuous waveform that is frequency-modulated over time, and the range to the object is determined by measuring the time delay between the transmitted and received signals. The time delay is equal to the round-trip time of the signal, which is the time taken for the signal to travel to the object and back to the radar. The distance R to the reflecting object can be determined as:

R = (c·Δt)/2   (1)

[0026] In Equation (1), c is the speed of light (e.g., in m/s) and Δt is the delay time (e.g., in seconds).
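Equation (1) can be sketched directly in code. For FMCW radar specifically, the delay is usually obtained from the beat frequency and the chirp slope, so a second helper shows that form as well; the bandwidth and chirp-time values below are hypothetical examples, not parameters from the source.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(delta_t):
    """Equation (1): convert round-trip delay to one-way range."""
    return C * delta_t / 2.0

def range_from_beat(f_beat, bandwidth, chirp_time):
    """FMCW form: the beat frequency f_beat relates to the delay via
    the chirp slope S = bandwidth / chirp_time, so delta_t = f_beat / S.
    """
    slope = bandwidth / chirp_time
    return range_from_delay(f_beat / slope)

# A reflector 1.5 m away produces a ~10 ns round-trip delay:
delay = 2 * 1.5 / C
print(range_from_delay(delay))  # ~1.5 m
```

With a 4 GHz sweep (matching the 60-64 GHz band mentioned above) over a hypothetical chirp duration, both helpers recover the same range from either the measured delay or the measured beat frequency.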
[0027] Radar angle of arrival (AoA) estimation is the process of determining the direction from which a radio frequency (RF) signal arrives at the radar receiver. This information can be used to locate the position of the target, track the movement of moving objects, and perform other tasks in radar systems. A MIMO radar sensor uses multiple transmit and receive antennas to improve the accuracy of AoA estimation. The signals received at different antennas are used to estimate the AoA using techniques such as maximum likelihood estimation.
[0028]
[0029] Based on the measurements and algorithms of Equation (1) and Equation (2), the angle and range of an object relative to the radar sensor may be estimated. Using this information, a range/angle heat map image may be generated with measurements to convert radar data to images for deep learning framework and train the machine learning model (e.g., the CNN model) for classification. Training samples may be generated to train the model by generating heat map images of seats from a variety of different vehicles that are either (i) empty, (ii) occupied by a child (which may include a car seat or the like), or (iii) occupied by an adult. Each heat map image may be labeled with the ground truth classification (e.g., via a human annotator), and the model may be trained using the generated heat map images to update parameters of the model (e.g., weights and activations).
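The range/angle heat map generation described above can be sketched with two FFTs: a range FFT along fast time (per Equation (1)) and an angle FFT across the virtual array. The data-cube shapes and target parameters below are illustrative assumptions, not values from the source.

```python
import numpy as np

def range_angle_heatmap(cube, n_angle=64):
    """Build a range/angle heat map from one chirp of a radar data cube.

    cube: complex array (num_virtual_antennas, num_samples_per_chirp),
          beat-signal samples per virtual antenna.
    Returns a magnitude image of shape (num_range_bins, n_angle).
    """
    # Range FFT along fast time: each bin corresponds to a distance.
    rng = np.fft.fft(cube, axis=1)                   # (ant, range)
    # Angle FFT across the uniform virtual array, zero-padded to
    # n_angle bins; fftshift centers boresight in the image.
    img = np.fft.fftshift(np.fft.fft(rng, n=n_angle, axis=0), axes=0)
    return np.abs(img).T                             # (range, angle)

# Toy cube: 12 virtual antennas, 128 samples, one target at range
# bin 30 and azimuth 10 degrees (illustrative values).
ant, samp = 12, 128
f_beat_bin, theta = 30, np.deg2rad(10)
t = np.arange(samp)
a = np.exp(-2j * np.pi * 0.5 * np.arange(ant) * np.sin(theta))
cube = a[:, None] * np.exp(2j * np.pi * f_beat_bin * t / samp)[None, :]
hm = range_angle_heatmap(cube)
print(hm.shape)  # (128, 64)
```

Images of this kind, labeled empty/child/adult, would then serve as the training inputs for the classification model described above.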
[0030]
[0031] Optionally, each image that includes two zones is cropped at the horizontal center to obtain zonal images. These zonal images are used as inputs for the deep learning model training and inference. Each zonal image may be cropped to a specific size dependent upon the specifications of the sensor and/or the model. For example, each image is cropped to a size of 267×535 pixels.
[0032] By framing the problem as a classification task, the model (e.g., a ResNet-18 model) may be trained on zonal images. The model may output a 3-class probability score on the likelihood of the input belonging to classes (i) empty, (ii) a child, or (iii) an adult. The set of zonal images may be split into a training set and a test set at any ratio (e.g., a 70:30 ratio). Because adjacent frames are very similar in characteristics, the split may be performed similarly to a time-series signal to avoid data leakage. Despite a time-series like split, the data may be treated as independent and identically distributed for model training and testing purposes. An exemplary train/test image count is illustrated in
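The time-series-like split described above can be sketched as a simple chronological cut: the first portion of consecutive frames trains the model and the remainder tests it, so near-duplicate adjacent frames cannot land in both sets. The 70:30 ratio comes from the source; the function and array names are illustrative.

```python
import numpy as np

def chronological_split(frames, labels, train_frac=0.7):
    """Split consecutive radar frames like a time series: the first
    train_frac of frames train, the remainder test. This avoids
    leakage from near-identical adjacent frames appearing in both sets.
    """
    cut = int(len(frames) * train_frac)
    return (frames[:cut], labels[:cut]), (frames[cut:], labels[cut:])

frames = np.arange(100)          # stand-ins for zonal heat map images
labels = np.zeros(100, dtype=int)
(train_x, train_y), (test_x, test_y) = chronological_split(frames, labels)
print(len(train_x), len(test_x))  # 70 30
```

Despite the chronological split, the samples may then be treated as independent and identically distributed for training and testing, as noted above.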
[0033] The model may include any number of layers and nodes. For example, when the model is a ResNet-18 model, the model is an 18-layer deep learning model which contains close to 11 million parameters. Pretrained weights may be used to initialize the model for training. Due to the complexity of the problem, the model's convolution layer weights may be frozen. In this example, only batch-normalization and fully connected layer weights are trained. The model may be trained for any number (e.g., 5) of epochs using, for example, an Adam optimizer and a learning rate of 1e-2. Exemplary results of test data using a non-quantized model with a 3-class confusion matrix is illustrated in
[0034] Due to the hardware resource constraints on most embedded devices in automotive applications, it is often important that deployed models are compact in size and do not consume too much power. Optionally, the weights of the model may be quantized to, for example, 8-bit integers (e.g., quantized from 32-bit floats to 8-bit integers). This involves performing post-training quantization (PTQ) on the single precision float32 model using the training dataset for calibration. Exemplary results of test data using a quantized model with a 3-class confusion matrix is illustrated in
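The post-training quantization step can be sketched with PyTorch's eager-mode static quantization API: insert quant/dequant stubs, calibrate observers on representative data, then convert weights and activations to int8. A tiny stand-in network is used here so the sketch stays self-contained; the same prepare/calibrate/convert procedure would apply to the trained classifier.

```python
import torch
import torch.nn as nn
import torch.ao.quantization as tq

# A tiny stand-in network; all names and shapes are illustrative.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = tq.QuantStub()      # float -> int8 at the input
        self.conv = nn.Conv2d(3, 8, 3)
        self.relu = nn.ReLU()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(8, 3)
        self.dequant = tq.DeQuantStub()  # int8 -> float at the output

    def forward(self, x):
        x = self.pool(self.relu(self.conv(self.quant(x))))
        x = self.fc(torch.flatten(x, 1))
        return self.dequant(x)

model = TinyNet().eval()
model.qconfig = tq.get_default_qconfig('fbgemm')  # int8 weights/activations
prepared = tq.prepare(model)

# Calibration: run representative (training) data through the model so
# the observers can record activation ranges.
with torch.no_grad():
    for _ in range(8):
        prepared(torch.randn(2, 3, 32, 32))

quantized = tq.convert(prepared)
out = quantized(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 3])
```

The converted model stores 8-bit integer weights, shrinking the float32 model roughly fourfold, which addresses the embedded memory and power constraints noted above.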
[0035] Referring now to
[0036] Thus, implementations herein include a deep learning-based classification algorithm (e.g., using a CNN) to classify occupancy of seats or zones in a vehicle as being empty, as including an adult, or as including a child, using an in-cabin radar sensor of a vehicle. The system makes use of cost-effective and/or non-intrusive sensors. For example, the system includes a low-cost radar sensor, such as an mmWave MIMO FMCW radar sensor with the fusion of a neural network, for highly accurate classification to effectively detect a child left alone in a car (i.e., no adult is present in the vehicle) and alert the owner or the emergency services. The system may only generate the alert when certain conditions are met. For example, the system may only generate the alert when the vehicle is off, when the windows are closed more than a threshold amount, when the system determines that an adult has not been present in the vehicle for at least a threshold period of time, when the external temperature is above an upper temperature threshold or below a lower temperature threshold, etc.
[0037] Optionally, the system may generate an alert when the vehicle is turned off, before the driver or adult occupant has exited the vehicle. The alert may include a notification sent to a mobile device of an owner or user or driver of the vehicle, to emergency services, or to any other appropriate entity. In some examples, the system may, in response to detecting a child within the vehicle, activate visual and/or audible alerts (e.g., flash lights, sound the horn, etc.) to attract the attention of those nearby the vehicle (and the system may delay generating this type of alert until determination of the temperature increasing in the cabin of the vehicle or after a period of time has elapsed since the driver or adult occupant has exited the vehicle). In addition to or as an alternative to the alert, the system may control a number of functions of the vehicle (e.g., roll down the windows, start a climate control system of the vehicle to regulate the temperature, etc.). The radar sensor provides a less intrusive solution compared to a camera solution. Because collision avoidance radar sensors or advanced driver assistance system (ADAS) radars generally operate at 77 GHz, the system may operate at a different frequency (e.g., 60 GHz) so as to not be affected by interference from other radar sensors.
[0038] Implementations herein may make use of any machine learning model architecture. For example, the system may implement one or more CNNs, typically used for image-based applications, in a radar application. Optionally, the system uses range-angle heat maps from an mmWave radar as inputs to the models to detect the presence of children in vehicle cabins. This allows the system to formulate the child-detection procedure as a classification task. The system may achieve near-perfect accuracy (
[0039] The system may utilize sensors, such as radar sensors or imaging radar sensors or lidar sensors or the like, to detect presence of and/or range to objects and/or other vehicles and/or pedestrians. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 10,866,306; 9,954,955; 9,869,762; 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 7,053,357; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or U.S. Publication Nos. US-2019-0339382; US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
[0040] The radar sensors of the sensing system each comprise a plurality of transmitters that transmit radio signals via a plurality of transmit antennas (e.g., at least two transmit antennas, such as three or more transmit antennas), a plurality of receivers that receive radio signals via the plurality of receive antennas (e.g., at least two receive antennas, such as three or four or more receive antennas), with the received radio signals being transmitted radio signals that are reflected from an object present in the field of sensing of the respective radar sensor. For example, the sensors may use three transmit antennas and four receive antennas. The system includes an ECU or control that includes a data processor for processing sensor data captured by the radar sensors. The ECU or sensing system may be part of a driving assist system of the vehicle, with the driving assist system controlling at least one function or feature of the vehicle (such as to provide autonomous driving control of the vehicle) responsive to processing of the data captured by the radar sensors.
[0041] Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.