B60W2540/229

SYSTEMS AND METHODS FOR PREDICTING DRIVER VISUAL IMPAIRMENT WITH ARTIFICIAL INTELLIGENCE

Systems and methods are provided for predictive assessment of driver perception abilities based on driving behavior personalized to the driver, in connection with, though not limited to, autonomous and semi-autonomous vehicles. In accordance with one embodiment, a method comprises: receiving first vehicle operating data and associated first gaze data of a driver operating a vehicle; training a model for the driver based on the first vehicle operating data and the first gaze data, the model indicating driving behavior of the driver; receiving second vehicle operating data and associated second gaze data of the driver; and determining that an ability of the driver to perceive hazards is impaired based on applying the model to the second vehicle operating data and associated second gaze data.
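The per-driver behavior model in this abstract could be sketched, under simple assumptions, as a statistical baseline over joined operating/gaze features, with strong deviation flagged as impairment. The z-score approach, feature layout, and threshold below are illustrative stand-ins, not the patent's model:

```python
from statistics import mean, stdev

def train_driver_model(operating_data, gaze_data):
    # Fit a per-driver baseline: mean and std of each joint feature column.
    # Each element of operating_data/gaze_data is one sample's feature list.
    samples = [op + gz for op, gz in zip(operating_data, gaze_data)]
    return [(mean(col), stdev(col)) for col in zip(*samples)]

def is_perception_impaired(model, operating_data, gaze_data, z_thresh=3.0):
    # Flag impairment when any new sample deviates strongly from the baseline.
    samples = [op + gz for op, gz in zip(operating_data, gaze_data)]
    for s in samples:
        z = max(abs(x - mu) / (sd or 1e-9) for x, (mu, sd) in zip(s, model))
        if z > z_thresh:
            return True
    return False
```

A learned model (e.g., a neural network over driving maneuvers and gaze patterns) would replace the mean/std baseline in practice; the two-phase structure (train on first data, apply to second data) mirrors the claim.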

Methods and devices for triggering vehicular actions based on passenger actions

Autonomous driving system methods and devices which trigger vehicular actions based on the monitoring of one or more occupants of a vehicle are presented. The methods, and corresponding devices, may include identifying a plurality of features in a plurality of subsets of image data depicting the one or more occupants; tracking changes over time of the plurality of features over the plurality of subsets of image data; determining a state, from a plurality of states, of the one or more occupants based on the tracked changes; and triggering a vehicular action based on the determined state.
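The identify-track-determine-trigger pipeline above can be sketched minimally. The feature name (`eyelid_opening`), the two-state set, and the thresholds are assumptions for illustration only:

```python
def track_changes(feature_frames):
    # Frame-to-frame deltas of each tracked feature across image-data subsets.
    return [
        {k: frame[k] - prev[k] for k in frame}
        for prev, frame in zip(feature_frames, feature_frames[1:])
    ]

def determine_state(changes, key="eyelid_opening", closing_thresh=0.01):
    # Classify the occupant as 'drowsy' when the eyelid feature trends closed;
    # otherwise 'alert'. Threshold and state set are illustrative.
    avg = sum(c[key] for c in changes) / len(changes)
    return "drowsy" if avg < -closing_thresh else "alert"

def trigger_action(state):
    # Map the determined state to a vehicular action.
    return {"drowsy": "issue_alert", "alert": "none"}[state]
```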

IN-CABIN RADAR APPARATUS
20230236302 · 2023-07-27

In an in-cabin radar apparatus, transmitting antennas are disposed at one side in a direction parallel to a control circuit and arranged in a line in a vertical direction, and receiving antennas are disposed at one side in a direction perpendicular to the control circuit and arranged in a line in a horizontal direction. Each transmission-side feed line may be perpendicularly connected to one of the transmitting antennas, and each receiving-side feed line may be perpendicularly connected to one of the receiving antennas. Each of the distance between the transmitting antennas and the distance between the receiving antennas may be implemented to be less than or equal to half of the transmitting and receiving wavelength.
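The half-wavelength spacing bound is easy to make concrete. The 60 GHz operating frequency below is an assumption (common for in-cabin radar, but not stated in the abstract):

```python
C = 299_792_458.0  # speed of light, m/s

def max_antenna_spacing(freq_hz):
    # Half-wavelength upper bound on element spacing, as the abstract requires:
    # spacing <= lambda / 2 = c / (2 * f).
    return C / freq_hz / 2.0

# For an assumed 60 GHz in-cabin radar, lambda/2 is about 2.5 mm:
spacing_mm = max_antenna_spacing(60e9) * 1000
```

Keeping element spacing at or below λ/2 is the standard condition for avoiding grating lobes (angle ambiguity) in a uniform antenna array, which is presumably why the abstract imposes it.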

Occupant behavior determining apparatus, occupant behavior determining method, and storage medium
11565707 · 2023-01-31

An occupant behavior determining apparatus includes a surveillance camera that captures an image of an occupant of a vehicle to acquire an image; a face recognizing section that recognizes a face of the occupant based on the image; a posture recognizing section that recognizes a posture of the occupant based on the image; and a behavior determining section that determines a behavior of the occupant in a vehicle cabin based on a recognition result of the face recognizing section and a recognition result of the posture recognizing section.

DRIVING ASSISTANCE DEVICE AND DRIVING ASSIST METHOD

A driving assistance device includes: an environmental information acquiring unit (11) to acquire environmental information on an environment around a mobile object; an action information acquiring unit (12) to acquire action information on an action of a driver of the mobile object; a calculation unit (13) to obtain control information for performing automated driving control of the mobile object on the basis of the environmental information acquired by the environmental information acquiring unit (11) and a machine learning model (18) that uses the environmental information as an input and outputs the control information; a contribution information determining unit (14) to determine contribution information having a high degree of contribution to the control information on the basis of the environmental information and the control information; a cognitive information calculating unit (15) to calculate cognitive information indicating a cognitive region of the driver in the environment around the mobile object on the basis of the action information and the environmental information; a specification unit (16) to specify unrecognized contribution information estimated not to be recognized by the driver on the basis of the contribution information and the cognitive information; and an information output control unit (17) to output driving assistance information necessary for driving assistance on the basis of the unrecognized contribution information specified by the specification unit (16).
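The core of the specification unit (16) is a set difference: objects that drive the control decision minus objects the driver is estimated to have perceived. A minimal sketch, with object labels and the warning format as assumptions:

```python
def specify_unrecognized(contributing_objects, cognized_region):
    # Objects with high contribution to the control information that fall
    # outside the driver's estimated cognitive (perceived) region.
    return [obj for obj in contributing_objects if obj not in cognized_region]

def assistance_output(unrecognized):
    # Emit driving assistance information only for unrecognized contributors.
    return [f"warn: {obj}" for obj in unrecognized]
```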

METHOD FOR VISUALLY TRACKING GAZE POINT OF HUMAN EYE, VEHICLE EARLY WARNING METHOD AND DEVICE
20230025540 · 2023-01-26

A method for visually tracking a gaze point of a human eye includes: periodically obtaining position coordinates of a human eye of a driver of a host vehicle and coordinates of a gaze point of a sightline of the human eye on an inner side of a current projection screen; obtaining, in combination with a refractive index and a curvature of the current projection screen, coordinates of a gaze point of the sightline on an outer side of the current projection screen and a corresponding refracted light path formed by outward refraction of the sightline; and obtaining, in combination with a preset normal viewing distance of the human eye, a final gaze point of the sightline on the refracted light path and coordinates of the final gaze point.
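The last step, extending the refracted light path by the preset viewing distance, is plain vector arithmetic. This sketch assumes the refraction through the screen has already been solved, yielding an exit point and a refracted direction:

```python
def final_gaze_point(exit_point, refracted_dir, viewing_distance):
    # Extend the refracted sightline from the outer-screen gaze point by the
    # preset normal viewing distance to locate the final gaze point.
    norm = sum(d * d for d in refracted_dir) ** 0.5
    return tuple(p + d / norm * viewing_distance
                 for p, d in zip(exit_point, refracted_dir))
```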

USER INTERFACE FOR ALLOCATION OF NON-MONITORING PERIODS DURING AUTOMATED CONTROL OF A DEVICE

A system for user interaction with an automated device includes a control system configured to operate the device during an operating mode corresponding to a first state in which the control system automatically controls the device operation, and the operating mode prescribes that a user monitor the device operation during automated control. The control system is configured to allocate a time period for the device to transition to a temporary state in which automated control is maintained and the user is permitted to stop monitoring and perform a task unrelated to device operation. The system includes a user interaction system including a visual display configured to present trajectory information, an indication as to whether an area is conducive to putting the device in the temporary state, and time period allocation information, the user interaction system including an interface engageable by the user to manage scheduling of allocated time period(s).

SYSTEM AND METHOD FOR CLASSIFYING A TYPE OF INTERACTION BETWEEN A HUMAN USER AND A MOBILE COMMUNICATION DEVICE IN A VOLUME BASED ON SENSOR FUSION

A system and method for classifying a type of interaction between a human user and a mobile communication device within a defined volume, based on multiple sensors. The method may include: determining a position of the mobile communication device relative to a frame of reference of the defined volume, based on: angle of arrival, time of flight, or received intensity of radio frequency (RF) signals transmitted by the mobile communication device and received by a phone location unit located within the defined volume configured to wirelessly communicate with the mobile communication device; obtaining at least one sensor measurement related to the mobile communication device from various non-RF sensors; repeating the obtaining, to yield a time series of sensor readings; and using a computer processor to classify the type of interaction into one of many predefined types of interactions, based on the position and the time series of sensor readings.
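The final classification step, combining the RF-derived position with the time series of non-RF sensor readings, could be sketched as a toy rule-based classifier. The zone labels, reading format, and interaction types below are all illustrative assumptions, not the patent's classifier:

```python
def classify_interaction(position, sensor_series):
    # position: device location label inside the defined volume (e.g., cabin);
    # sensor_series: list of (accel_magnitude, screen_on) readings over time.
    moving = any(a > 0.5 for a, _ in sensor_series)
    screen = any(s for _, s in sensor_series)
    if position == "driver_zone" and screen and moving:
        return "handheld_use_by_driver"
    if screen:
        return "viewing_only"
    return "idle"
```

A production system would replace these hand-written rules with a trained classifier over the fused position and sensor time series.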

DROWSY DRIVING DETECTION METHOD AND SYSTEM THEREOF, AND COMPUTER DEVICE
20230230397 · 2023-07-20

A drowsy driving detection method comprises: acquiring a side face image of a currently seated driver collected by a camera module; performing face recognition on the side face image to obtain side face feature parameters, and determining, according to the side face feature parameters, whether an ID file corresponding to the currently seated driver exists in a driver ID library; and if yes, periodically acquiring a side face image of the driver in the current period collected by the camera module, obtaining eye movement feature parameters of the driver in the current period according to the side face image of the current period, and determining whether the driver is driving while drowsy according to a comparison result between the eye movement feature parameters of the current period and the normal eye movement feature parameters of the driver.
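The control flow of this abstract, look up the driver's ID file from side-face features and, if found, compare current eye-movement parameters with the stored normal ones, can be sketched as follows. The feature key, the `eye_closure_ratio` parameter, and the threshold are illustrative assumptions:

```python
def is_drowsy(current_params, baseline_params, ratio_thresh=1.5):
    # Compare current-period eye-movement features against the driver's
    # stored normal features; the single ratio test is illustrative.
    ratio = current_params["eye_closure_ratio"] / baseline_params["eye_closure_ratio"]
    return ratio > ratio_thresh

def detect(driver_id_library, face_features, current_params):
    # Return a drowsiness verdict only when an ID file exists for the driver
    # (face_features is assumed hashable, e.g., a quantized feature tuple).
    profile = driver_id_library.get(face_features)
    if profile is None:
        return None  # no ID file: no personal baseline to compare against
    return is_drowsy(current_params, profile["normal_eye_params"])
```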

METHOD AND APPARATUS FOR SETTING A DRIVING MODE OF A VEHICLE

A method and an apparatus for setting a driving mode of a vehicle are disclosed. The method performed by an in-vehicle apparatus for setting a driving mode includes: determining a state of an occupant, determining a type of road on which the vehicle is traveling, and setting the driving mode of the vehicle based on the state of the occupant and the type of the road.
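Setting the mode from the pair (occupant state, road type) amounts to a lookup over the two determinations. The specific states, road types, and mode names in this table are illustrative assumptions:

```python
def set_driving_mode(occupant_state, road_type):
    # Illustrative (state, road) -> mode table; fall back to a default mode
    # for combinations the table does not cover.
    table = {
        ("sleeping", "highway"): "comfort_autonomous",
        ("sleeping", "city"): "cautious_autonomous",
        ("alert", "highway"): "sport",
    }
    return table.get((occupant_state, road_type), "standard")
```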