B60R21/01542

Automotive analytics technology to provide synergistic collision safety

Systems, apparatuses and methods may provide for technology that conducts a real-time analysis of interior sensor data, exterior sensor data and environmental data associated with a vehicle. Additionally, the technology may determine whether a hazard condition exists based on the real-time analysis, wherein the hazard condition includes a deviation of a current behavior waveform from a reference behavior waveform by a predetermined amount. In one example, a safety measure is triggered with respect to the vehicle if the hazard condition exists. The safety measure may be selected based on a reaction time constraint associated with the hazard condition.
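The deviation check and reaction-time-based selection described above can be sketched as follows. This is an illustrative assumption, not the patented method: the deviation metric (mean absolute difference), the thresholds, and the measure names are all hypothetical.

```python
def hazard_exists(current, reference, max_deviation):
    """Flag a hazard when the mean absolute deviation between the
    current and reference behavior waveforms exceeds the threshold."""
    deviation = sum(abs(c - r) for c, r in zip(current, reference)) / len(current)
    return deviation > max_deviation

def select_safety_measure(reaction_time_s):
    """Pick a safety measure consistent with the time left to react
    (thresholds are assumed for illustration)."""
    if reaction_time_s < 0.5:
        return "deploy_restraints"   # no time for driver intervention
    elif reaction_time_s < 2.0:
        return "automatic_braking"
    return "driver_alert"            # ample time: warn first
```

A flat reference waveform with a strongly deviating current waveform would trip the hazard, and a sub-second reaction window would select the most immediate measure.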

APPARATUS, SYSTEMS AND METHODS FOR CLASSIFYING DIGITAL IMAGES
20220343659 · 2022-10-27

The present disclosure is directed to apparatuses, systems and methods for automatically classifying images of occupants inside a vehicle. More particularly, the present disclosure is directed to apparatuses, systems and methods for automatically classifying images of occupants inside a vehicle by comparing current image feature data to previously classified image features.
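The comparison of current image feature data against previously classified image features can be illustrated as a nearest-neighbor lookup. This sketch is an assumption for illustration only; the disclosure does not specify the distance metric or data layout used here.

```python
import math

def classify(current_features, labeled_features):
    """Adopt the label of the previously classified feature vector
    closest to the current image's feature vector.

    labeled_features: list of (feature_vector, label) pairs."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, label = min(labeled_features, key=lambda fl: dist(current_features, fl[0]))
    return label
```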

APPARATUSES, SYSTEMS AND METHODS FOR CLASSIFYING DIGITAL IMAGES
20220292851 · 2022-09-15

The present disclosure is directed to apparatuses, systems and methods for automatically classifying digital images of occupants inside a vehicle. More particularly, the present disclosure is directed to apparatuses, systems and methods for automatically classifying digital images of occupants inside a vehicle by comparing current image data to previously classified image data.

3D time of flight active reflecting sensing systems and methods

The system and method provide for identification of dynamic objects in an enclosed space and of the presence of a component in a primary location. The system uses an active electro-optical 3D sensor, such as a three-dimensional time-of-flight camera, to identify the presence or absence of a reflected pulse in order to determine, for example, proper placement of a seat belt, or to identify a change in the characteristics of a reflected pulse in order to determine a change in location, and thus possible movement, of a living creature in a vehicle.
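The two detection modes described above can be sketched with simple per-frame checks. The amplitude threshold, depth-change threshold, and frame representation are illustrative assumptions, not taken from the disclosure.

```python
def pulse_detected(amplitude, threshold=0.2):
    """A reflected pulse above an assumed noise floor implies the
    component (e.g. a seat belt) is present at the primary location."""
    return amplitude >= threshold

def movement_detected(prev_depth, curr_depth, min_change=0.05):
    """A sufficient change in time-of-flight depth between frames
    suggests movement of a living creature in the cabin."""
    return any(abs(p - c) >= min_change for p, c in zip(prev_depth, curr_depth))
```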

SEAT

Disclosed is a seat including: sensors, which include a first cushion sensor provided at a seat cushion in a position corresponding to the buttocks of an occupant, a second cushion sensor provided at the seat cushion and located farther frontward than the first cushion sensor, a first back sensor provided at a lower position of a seat back, and a second back sensor provided at the seat back above the first back sensor; and a controller connected to the sensors so as to acquire pressure values from the respective sensors. The controller is configured to identify the motion of the occupant based on the outputs of at least two of the first cushion sensor, the second cushion sensor, the first back sensor, and the second back sensor.
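One way the controller could combine at least two sensor outputs to identify motion is sketched below. The decision rules and motion labels are hypothetical assumptions; the disclosure only states that at least two of the four sensors are used.

```python
def identify_motion(first_cushion, second_cushion, first_back, second_back):
    """Identify occupant motion from the change in pressure at each of
    the four seat sensors (positive = pressure increasing)."""
    if second_cushion > 0 and first_back < 0:
        return "leaning forward"  # weight shifts to front of cushion, off the seat back
    if second_back > 0 and second_cushion < 0:
        return "leaning back"     # weight shifts onto the upper seat back
    return "stationary"
```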

Method and system for localizing an occupant within a vehicle

A computer-implemented method for localizing an occupant within a vehicle includes identifying whether at least one occupant is present within the vehicle based on a signal of an ultrasonic sensor, and determining information regarding an inclination of the vehicle based on at least one signal of an inclination sensor. If the presence of at least one occupant is identified, it is determined, based on the inclination information, whether an occupant is present in one of a plurality of predefined sections of the vehicle.
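The localization step can be illustrated by mapping the vehicle's pitch and roll to one of four assumed sections (the section layout and angle thresholds are hypothetical, not from the disclosure).

```python
def localize_occupant(occupant_present, pitch_deg, roll_deg, tol=0.05):
    """Map the occupant-induced inclination to a (row, side) section,
    assuming an occupant's weight tilts the vehicle slightly toward
    the occupied section. Returns None when no occupant is detected."""
    if not occupant_present:
        return None
    row = "front" if pitch_deg < -tol else "rear" if pitch_deg > tol else "unknown"
    side = "left" if roll_deg < -tol else "right" if roll_deg > tol else "unknown"
    return (row, side)
```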

Processing device, processing method, and program for at least one of controlling a traveling state of a movable object depending on use information of a first training device and restricting use states of one or more training devices depending on the traveling state of the movable object

A processing device includes a control unit configured to identify a first training device, among one or more training devices equipped in a movable object to provide a training opportunity to a passenger, that is used by the passenger, and to execute at least one of: controlling a traveling state of the movable object depending on the content of a training using the first training device, and restricting the use states of the one or more training devices depending on the traveling state of the movable object.
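The two control actions described above can be sketched as mutual constraints between training and travel. The device names, speed caps, and thresholds below are illustrative assumptions only.

```python
def control_traveling_state(active_training, current_speed_kmh):
    """Cap the movable object's speed while a given training device
    is in use (caps are assumed for illustration)."""
    speed_caps = {"treadmill": 40, "stationary_bike": 60}
    cap = speed_caps.get(active_training, 120)
    return min(current_speed_kmh, cap)

def restrict_device_use(current_speed_kmh, max_safe_speed_kmh=80):
    """Restrict the training devices when the object travels too fast."""
    return "locked" if current_speed_kmh > max_safe_speed_kmh else "available"
```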

ENHANCED SENSOR OPERATION

A two-dimensional image of a vehicle occupant in a vehicle is collected. The collected two-dimensional image is input to a machine learning program trained to output one or more reference points of the vehicle occupant, each reference point being a landmark of the vehicle occupant. One or more reference points of the vehicle occupant in the two-dimensional image are output from the machine learning program. A location of the vehicle occupant in an interior of the vehicle is determined based on the one or more reference points. A vehicle component is actuated based on the determined location. For each of the one or more reference points, a similarity measure is determined between the reference point and a three-dimensional reference point, the similarity measure based on a distance between the reference point and the three-dimensional reference point.
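A distance-based similarity measure of the kind described can be sketched as below. The inverse-distance form is an assumption for illustration, as is the simplification that the image-derived reference point has already been lifted into the same 3D coordinate frame as the three-dimensional reference point.

```python
import math

def similarity(point_a, point_b):
    """Distance-based similarity between two reference points:
    1.0 when the points coincide, approaching 0 as they move apart."""
    d = math.dist(point_a, point_b)
    return 1.0 / (1.0 + d)
```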