B60W2540/229

INWARD/OUTWARD VEHICLE MONITORING FOR REMOTE REPORTING AND IN-CAB WARNING ENHANCEMENTS

Systems and methods are provided for intelligent driving monitoring systems, advanced driver assistance systems, and autonomous driving systems, and for providing alerts to the driver of a vehicle based on anomalies detected between driver behavior and the environment captured by an outward-facing camera. Various aspects of the driver, which may include the direction of sight, point of focus, posture, and gaze, are determined by image processing of the driver's upper visible body as captured by a driver-facing camera in the vehicle. Aspects of the environment around the vehicle captured by the vehicle's multiple cameras are used to correlate driver behavior and actions with what is happening outside the vehicle, in order to detect and warn of anomalies, prevent accidents, provide feedback to the driver, and in general provide a safer driving experience.
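
The gaze-versus-environment correlation described above can be reduced to a minimal sketch. The angle-based comparison below is an illustrative assumption, not the patent's method: it flags an anomaly whenever the driver's estimated gaze bearing does not cover the bearing of a hazard seen by the outward-facing camera.

```python
def gaze_anomaly(gaze_deg, hazard_deg, tolerance_deg=30.0):
    """Flag an anomaly when the driver's gaze direction does not cover the
    bearing of a hazard detected by the outward-facing camera.

    Angles are in degrees measured from the vehicle's forward axis; the
    tolerance (an illustrative attention cone) accounts for peripheral vision.
    """
    # Smallest signed angular difference, handling wrap-around at 360 degrees.
    diff = abs((gaze_deg - hazard_deg + 180.0) % 360.0 - 180.0)
    return diff > tolerance_deg
```

A warning pipeline would run this per detected hazard and raise an in-cab alert only when the anomaly persists across several frames, to avoid flickering alerts from momentary glances.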

VEHICLE DRIVING SUPPORT APPARATUS

A vehicle driving support apparatus includes a forward environment recognizing device configured to recognize a traveling environment forward of a vehicle, a control device configured to perform adaptive cruise control and active lane keep centering control based on the recognized traveling environment, an electric power steering device configured to control a turning angle of wheels in a ganged manner in accordance with a steering angle received from a steering handle, and a driver monitoring system configured to detect changes in biological information of a driver who drives the vehicle. When the driver monitoring system detects a drop in alertness of the driver, the control device is configured to perform the adaptive cruise control and the active lane keep centering control and execute a steering handle idle mode that stops the electric power steering device from controlling the turning angle in the ganged manner in accordance with the steering angle.
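
The mode logic of this apparatus can be sketched as a small state policy. The threshold and the flag names below are illustrative assumptions; the point is the combination the abstract describes: on a drop in alertness, keep longitudinal and lateral assistance active while decoupling the electric power steering from the handle angle.

```python
from dataclasses import dataclass

@dataclass
class ControlState:
    adaptive_cruise: bool = False
    lane_keep_centering: bool = False
    steering_ganged_to_handle: bool = True  # EPS follows the steering handle angle

def on_alertness_update(alert_level, threshold=0.5):
    """Hypothetical policy: when monitored alertness drops below the threshold,
    engage adaptive cruise control and active lane keep centering, and enter the
    steering handle idle mode (EPS stops tracking the handle angle)."""
    state = ControlState()
    if alert_level < threshold:
        state.adaptive_cruise = True
        state.lane_keep_centering = True
        state.steering_ganged_to_handle = False  # steering handle idle mode
    return state
```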

Affective-cognitive load based digital assistant

Embodiments of the present disclosure set forth a computer-implemented method comprising receiving, from at least one sensor, sensor data associated with an environment; computing, based on the sensor data, a cognitive load associated with a user within the environment; computing, based on the sensor data, an affective load associated with an emotional state of the user; determining, based on both the cognitive load and the affective load, an affective-cognitive load; determining, based on the affective-cognitive load, a user readiness state associated with the user; and causing one or more actions to occur based on the user readiness state.
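
The chain of computations in the claim can be sketched in a few lines. The weighted sum and the state thresholds below are illustrative assumptions; the disclosure does not specify how the two loads are combined, only that both feed the readiness determination.

```python
def readiness_state(cognitive_load, affective_load,
                    w_cog=0.6, w_aff=0.4,
                    ready_max=0.4, caution_max=0.7):
    """Combine cognitive and affective load (both assumed normalized to [0, 1])
    into an affective-cognitive load via an illustrative weighted sum, then map
    it to a coarse user readiness state."""
    acl = w_cog * cognitive_load + w_aff * affective_load
    if acl <= ready_max:
        return "ready", acl
    if acl <= caution_max:
        return "caution", acl
    return "not_ready", acl
```

A digital assistant would then pick an action per state, e.g. deferring non-urgent notifications while the user is "not_ready".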

Travel controller and method for travel control

A travel controller detects, from an image representing the face of a driver of a vehicle, an act of the driver checking the surroundings of the vehicle, records the time at which the act of checking is detected, and suggests to the driver a lane change for the vehicle to move from its travel lane to an adjoining lane. When the act of checking is detected in a precheck period before the suggestion time at which the lane change is suggested, the travel controller makes the vehicle execute the lane change regardless of whether the act of checking is also detected in a post-suggestion check period after the suggestion time.
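
The two check windows can be expressed as a small predicate. The window lengths are illustrative assumptions; the logic follows the abstract: a surroundings check inside the precheck window suffices on its own, otherwise a check is required after the suggestion.

```python
def should_execute_lane_change(check_times, suggestion_time,
                               precheck_window=5.0, post_window=3.0):
    """Decide whether to execute a suggested lane change.

    `check_times` are timestamps (seconds on a shared clock) at which the
    driver was seen checking the surroundings; `suggestion_time` is when the
    lane change was suggested. A check within the precheck window before the
    suggestion is sufficient by itself; otherwise a check is needed in the
    post-suggestion window. Window lengths are illustrative.
    """
    prechecked = any(suggestion_time - precheck_window <= t < suggestion_time
                     for t in check_times)
    if prechecked:
        return True  # execute regardless of any post-suggestion check
    return any(suggestion_time <= t <= suggestion_time + post_window
               for t in check_times)
```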

PRESENTATION CONTROL DEVICE AND AUTOMATED DRIVING CONTROL SYSTEM
20230020471 · 2023-01-19 ·

A presentation control device is configured to control information presentation to a driver of a vehicle. A determination unit is configured to distinguish between a monitoring interruption mode in which the driver is permitted to interrupt surroundings monitoring and a monitoring-required mode in which the driver is prohibited from interrupting the surroundings monitoring, and, within the monitoring-required mode, between a hands-free permitted state and a hands-free prohibited state. A presentation control unit is configured to perform the information presentation related to a mode transition from the monitoring interruption mode to the monitoring-required mode when the mode transition occurs. The presentation control unit is configured to perform the information presentation at an earlier timing when the monitoring interruption mode is transitioned to the hands-free prohibited state than when it is transitioned to the hands-free permitted state.
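
The timing rule reduces to a simple lookup. The concrete lead times below are illustrative assumptions; the abstract only requires that the presentation start earlier when the target state prohibits hands-free driving.

```python
def presentation_lead_time(target_state,
                           prohibited_lead_s=10.0, permitted_lead_s=5.0):
    """Return how many seconds before the transition out of the monitoring
    interruption mode the information presentation should begin. The lead is
    longer (earlier presentation) for a transition into the hands-free
    prohibited state, since the driver must also retake the wheel. Lead
    values are illustrative."""
    if target_state == "hands_free_prohibited":
        return prohibited_lead_s
    return permitted_lead_s
```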

SYSTEM FOR TESTING A DRIVER ASSISTANCE SYSTEM OF A VEHICLE
20230219584 · 2023-07-13 ·

The invention relates to a system for testing a driver assistance system of a vehicle, where the driver assistance system has at least one interior sensor and is designed to process sensor signals of the at least one interior sensor for monitoring a driver of the vehicle. The system comprises: simulation means for simulating at least one physical property of the driver which characterizes a physiological condition of the driver, in particular the driver's attentiveness, activity, fatigue, mood, state of health, and/or drug influence, and which is able to be detected by the at least one interior sensor, such that the interior sensor can generate sensor signals as a function of the at least one simulated physical property; and an interface which interacts with the driver assistance system such that sensor signals are provided to the driver assistance system as a function of the at least one simulated physical property. The invention further relates to a corresponding method.
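
The test setup can be sketched as three small pieces: a simulated driver property, an interface that converts it into the sensor signal, and the system under test consuming that signal. The eyelid-opening property, its scale, and the threshold are illustrative assumptions.

```python
class SimulatedDriver:
    """Simulates one physical property of the driver, here eyelid opening as
    a fatigue proxy (1.0 fully open .. 0.0 closed; scale is illustrative)."""
    def __init__(self, eyelid_opening=1.0):
        self.eyelid_opening = eyelid_opening

class InteriorSensorInterface:
    """Interface that turns the simulated property into the sensor signal
    format the driver assistance system expects."""
    def __init__(self, driver):
        self.driver = driver
    def read(self):
        return {"eyelid_opening": self.driver.eyelid_opening}

def assistance_system_fatigued(signal, threshold=0.3):
    """Stand-in for the driver assistance system under test: it flags
    fatigue when the reported eyelid opening falls below the threshold."""
    return signal["eyelid_opening"] < threshold
```

A test bench would sweep the simulated property across its range and assert that the assistance system's fatigue verdict flips at the expected point.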

TRAVELING VIDEO DISPLAY METHOD AND TRAVELING VIDEO DISPLAY SYSTEM

A method according to the present disclosure comprises acquiring a plurality of sets of traveling video recorded by a plurality of cameras, equipped in a remote driving vehicle, that capture a plurality of directions; acquiring information regarding at least one of a driving operation by a remote driving operator, a traveling state of the remote driving vehicle, and motion of the remote driving operator; specifying an attention direction or an attention scope of the remote driving operator for a situation around the remote driving vehicle based on the acquired information; displaying at least one of the plurality of sets of traveling video reflecting the attention direction or the attention scope on a display device; and performing display based on the attention direction or the attention scope.
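
The feed-selection step can be sketched as a bearing match between the operator's attention direction and each camera's field of view. The camera layout and field of view below are illustrative assumptions.

```python
def select_feeds(attention_deg, camera_bearings, fov_deg=90.0):
    """Pick the camera feeds whose field of view covers the remote operator's
    attention direction.

    `attention_deg` and the bearings in `camera_bearings` (name -> degrees)
    are measured from vehicle-forward; the field of view is illustrative.
    """
    half = fov_deg / 2.0
    selected = []
    for name, bearing in camera_bearings.items():
        # Smallest angular distance between attention and camera axis.
        diff = abs((attention_deg - bearing + 180.0) % 360.0 - 180.0)
        if diff <= half:
            selected.append(name)
    return selected
```

With a four-camera rig (front 0, right 90, rear 180, left 270 degrees), an attention direction of 10 degrees selects only the front feed, while 100 degrees selects only the right feed.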

DASH CAM WITH ARTIFICIAL INTELLIGENCE SAFETY EVENT DETECTION

A vehicle dash cam may be configured to execute one or more neural networks (and/or other artificial intelligence), such as based on input from one or more of the cameras and/or other sensors associated with the dash cam, to intelligently detect safety events in real-time. Detection of a safety event may trigger an in-cab alert to make the driver aware of the safety risk. The dash cam may include logic for determining which asset data to transmit to a backend server in response to detection of a safety event, as well as which asset data to transmit to the backend server in response to analysis of sensor data that did not trigger a safety event. The asset data transmitted to the backend server may be further analyzed to determine if further alerts should be provided to the driver and/or to a safety manager.
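
The asset-selection logic can be sketched as a tiered policy. The tiers and the severity cutoff are illustrative assumptions; the abstract specifies only that the dash cam chooses which asset data to transmit both for detected safety events and for analyzed sensor data that did not trigger one.

```python
def select_asset_data(safety_event, severity=0.0):
    """Choose which asset data to transmit to the backend server.

    Routine analysis (no safety event) uploads sensor metadata only; a
    detected safety event adds video, with the full-resolution clip reserved
    for high-severity events. Tier names and the cutoff are illustrative.
    """
    if not safety_event:
        return ["sensor_metadata"]
    assets = ["sensor_metadata", "low_res_video"]
    if severity >= 0.8:
        assets.append("full_res_video")
    return assets
```

The backend can then run heavier models on the uploaded assets to decide whether additional alerts should go to the driver or a safety manager.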

Method and device for evaluating a degree of fatigue of a vehicle occupant in a vehicle

A method evaluates a degree of fatigue of a vehicle occupant in a vehicle. A number of first fatigue indicators is provided which are determined according to computation rules from a plurality of first sensor values and each represent a degree of fatigue of the vehicle occupant. The first sensor values represent measured values of the vehicle and/or measured values relating to a current journey. A first metadata record is associated with each of the number of first fatigue indicators, wherein the first metadata records represent information about the characteristics of the sensors. The first sensor values are processed in the respective first fatigue indicators. A number of second fatigue indicators is provided which are determined according to computation rules from one or more second sensor values and each represent a degree of fatigue of the vehicle occupant. The second sensor values represent physiological and/or physical parameters of the vehicle occupant. A second metadata record is associated with each of the number of second fatigue indicators. The second metadata records represent information about the characteristics of the sensors. The second sensor values are processed in the respective second fatigue indicators. An overall fatigue indicator is determined which represents the degree of fatigue of the vehicle occupant by weighting the number of first fatigue indicators and the number of second fatigue indicators. The fatigue indicators are weighted according to the information about the characteristics of the sensors contained in the first metadata records and the second metadata records.
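
The final weighting step can be sketched as a weighted mean over all indicators, with each weight derived from that indicator's sensor metadata. The weighted-mean form is an illustrative assumption; the patent requires only that the weights come from the metadata records.

```python
def overall_fatigue(indicators):
    """Fuse fatigue indicators into one overall fatigue value.

    `indicators` is a list of (value, weight) pairs covering both the
    vehicle-based (first) and physiological (second) indicators, with values
    in [0, 1] and weights derived from sensor-characteristic metadata, e.g.
    reflecting sensor reliability. The weighted mean is illustrative.
    """
    total_weight = sum(w for _, w in indicators)
    if total_weight == 0:
        raise ValueError("at least one indicator must have positive weight")
    return sum(v * w for v, w in indicators) / total_weight
```

For example, a highly reliable steering-pattern indicator reading 0.8 with weight 2.0 combined with a noisier heart-rate indicator reading 0.2 with weight 1.0 yields an overall fatigue of 0.6.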

INFORMATION PROCESSING DEVICE, MOBILE DEVICE, INFORMATION PROCESSING SYSTEM, AND METHOD
20230211810 · 2023-07-06 ·

A configuration is implemented to calculate a manual driving recoverable time required for a driver who is executing automatic driving in order to achieve a requested recovery ratio (RRR) for each road section, and to issue a manual driving recovery request notification on the basis of the calculated time. A data processing unit is included which calculates a manual driving recoverable time required for a driver who is executing automatic driving in order to achieve a predefined requested recovery ratio (RRR) from automatic driving to manual driving, and determines the notification timing of a manual driving recovery request notification on the basis of the calculated time. The data processing unit acquires the requested recovery ratio (RRR) for each road section, set as ancillary information of a local dynamic map (LDM), and calculates the manual driving recoverable time for each road section scheduled to be traveled, using learning data for each driver.
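
The scheduling of the recovery request can be sketched as follows. Representing sections as (entry time, requires-manual) pairs and adding a fixed safety margin are illustrative assumptions; the per-driver recoverable time stands in for the value the patent derives from learning data to satisfy the section's RRR.

```python
def recovery_notification_time(sections, recoverable_time_s, margin_s=2.0):
    """Determine when to issue the manual driving recovery request.

    `sections` lists upcoming road sections as (entry_time_s, requires_manual)
    pairs on a shared clock; `recoverable_time_s` is the driver's learned
    manual driving recoverable time. The notification is scheduled so that
    the recoverable time plus an illustrative safety margin elapse before the
    first section requiring manual driving; returns None if none does.
    """
    manual_entries = [t for t, requires_manual in sections if requires_manual]
    if not manual_entries:
        return None
    return min(manual_entries) - (recoverable_time_s + margin_s)
```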