Patent classifications
B60W2540/225
INFORMATION PROCESSING APPARATUS, VEHICLE, AND INFORMATION PROCESSING METHOD
An information processing apparatus, a vehicle, and an information processing method, each capable of improving the visibility of an on-board meter panel. The information processing apparatus includes an information acquirer that acquires traveling information on the traveling of a vehicle including an on-board meter panel, and a display pattern changer that changes a display pattern of the on-board meter panel in accordance with a change in the traveling information.
SYSTEMS AND METHODS FOR PREDICTING DRIVER VISUAL IMPAIRMENT WITH ARTIFICIAL INTELLIGENCE
Systems and methods are provided for predictive assessment of driver perception abilities based on driving behavior personalized to the driver, in connection with, but not necessarily limited to, autonomous and semi-autonomous vehicles. In accordance with one embodiment, a method comprises receiving first vehicle operating data and associated first gaze data of a driver operating a vehicle; training a model for the driver based on the first vehicle operating data and the first gaze data, the model indicating driving behavior of the driver; receiving second vehicle operating data and associated second gaze data of the driver; and determining that an ability of the driver to perceive hazards is impaired based on applying the model to the second vehicle operating data and associated second gaze data.
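The abstract does not disclose a concrete model, but the train-then-compare flow it describes can be sketched minimally as a per-driver statistical baseline over gaze deviation, with an assumed z-score rule for flagging impairment (the feature, threshold, and function names are illustrative, not the patent's method):

```python
from statistics import mean, stdev

def train_driver_model(operating_data, gaze_data):
    """Build a per-driver baseline from paired operating/gaze samples:
    mean and spread of the gaze deviation from the expected focus point.
    (Illustrative scalar feature; the patent does not specify one.)"""
    deviations = [abs(g - o) for o, g in zip(operating_data, gaze_data)]
    return {"mean": mean(deviations), "stdev": stdev(deviations)}

def is_impaired(model, operating_data, gaze_data, z_threshold=3.0):
    """Flag impaired hazard perception when new deviations fall far
    outside the driver's own baseline (hypothetical z-score rule)."""
    deviations = [abs(g - o) for o, g in zip(operating_data, gaze_data)]
    z = (mean(deviations) - model["mean"]) / (model["stdev"] or 1e-9)
    return z > z_threshold
```

The key point the sketch preserves is that the model is trained per driver, so "impaired" means a departure from that driver's own behavior rather than from a population norm.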
Occupant behavior determining apparatus, occupant behavior determining method, and storage medium
An occupant behavior determining apparatus includes a surveillance camera that captures an image of an occupant of a vehicle; a face recognizing section that recognizes a face of the occupant based on the image; a posture recognizing section that recognizes a posture of the occupant based on the image; and a behavior determining section that determines a behavior of the occupant in a vehicle cabin based on a recognition result of the face recognizing section and a recognition result of the posture recognizing section.
DRIVING ASSISTANCE DEVICE AND DRIVING ASSIST METHOD
A driving assistance device includes: an environmental information acquiring unit (11) that acquires environmental information on an environment around a mobile object; an action information acquiring unit (12) that acquires action information on an action of a driver of the mobile object; a calculation unit (13) that obtains control information for performing automated driving control of the mobile object on the basis of the environmental information acquired by the environmental information acquiring unit (11) and a machine learning model (18) that takes the environmental information as an input and outputs the control information; a contribution information determining unit (14) that determines contribution information having a high degree of contribution to the control information on the basis of the environmental information and the control information; a cognitive information calculating unit (15) that calculates cognitive information indicating a cognitive region of the driver in the environment around the mobile object on the basis of the action information and the environmental information; a specification unit (16) that specifies unrecognized contribution information estimated not to be recognized by the driver on the basis of the contribution information and the cognitive information; and an information output control unit (17) that outputs driving assistance information necessary for driving assistance on the basis of the unrecognized contribution information specified by the specification unit (16).
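The selection step performed by the specification unit (16) — items that contribute strongly to the control decision but fall outside the driver's cognitive region — reduces, at its simplest, to a thresholded set difference. A minimal sketch under that assumption (item identifiers and the contribution threshold are illustrative):

```python
def specify_unrecognized(contributions, cognitive_region, threshold=0.5):
    """Return environment items with a high degree of contribution to
    the control output that the driver is estimated not to have
    recognized. `contributions` maps item id -> contribution score;
    `cognitive_region` is the set of item ids the driver perceived."""
    high = {item for item, score in contributions.items() if score >= threshold}
    return high - cognitive_region
```

Driving assistance information would then be generated only for the returned items, which is what keeps the output limited to hazards the driver has likely missed.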
METHOD FOR VISUALLY TRACKING GAZE POINT OF HUMAN EYE, VEHICLE EARLY WARNING METHOD AND DEVICE
A method for visually tracking a gaze point of a human eye includes: periodically obtaining position coordinates of a human eye of a driver of a host vehicle and coordinates of a gaze point of a sightline of the human eye on an inner side of a current projection screen; obtaining, in combination with a refractive index and a curvature of the current projection screen, coordinates of a gaze point of the sightline on an outer side of the current projection screen and a corresponding refracted light path formed by outward refraction of the sightline; and obtaining, in combination with a preset normal viewing distance of the human eye, a final gaze point of the sightline on the refracted light path and coordinates of the final gaze point.
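The geometric core of this method is Snell's-law refraction of the sightline at the screen, followed by extending the refracted ray by the preset viewing distance. A flat-surface simplification (the patent uses the screen's curvature; function names and the flat-normal assumption are illustrative):

```python
import math

def refract(direction, normal, n1, n2):
    """Refract a unit sightline vector through a surface with unit
    normal facing the incoming ray, per Snell's law. Returns None on
    total internal reflection."""
    cos_i = -sum(d * n for d, n in zip(direction, normal))
    ratio = n1 / n2
    sin_t2 = ratio * ratio * (1.0 - cos_i * cos_i)
    if sin_t2 > 1.0:
        return None  # total internal reflection: no outward ray
    cos_t = math.sqrt(1.0 - sin_t2)
    return tuple(ratio * d + (ratio * cos_i - cos_t) * n
                 for d, n in zip(direction, normal))

def final_gaze_point(exit_point, refracted_dir, viewing_distance):
    """Extend the refracted ray from the outer-surface gaze point by
    the preset normal viewing distance to get the final gaze point."""
    return tuple(p + viewing_distance * d
                 for p, d in zip(exit_point, refracted_dir))
```

For the curved screen of the abstract, the surface normal would vary with the inner-side gaze point rather than being fixed, but the per-ray computation is the same.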
USER INTERFACE FOR ALLOCATION OF NON-MONITORING PERIODS DURING AUTOMATED CONTROL OF A DEVICE
A system for user interaction with an automated device includes a control system configured to operate the device during an operating mode corresponding to a first state in which the control system automatically controls the device operation, and the operating mode prescribes that a user monitor the device operation during automated control. The control system is configured to allocate a time period for the device to transition to a temporary state in which automated control is maintained and the user is permitted to stop monitoring and perform a task unrelated to device operation. The system includes a user interaction system including a visual display configured to present trajectory information, an indication as to whether an area is conducive to putting the device in the temporary state, and time period allocation information, the user interaction system including an interface engageable by the user to manage scheduling of allocated time period(s).
INWARD/OUTWARD VEHICLE MONITORING FOR REMOTE REPORTING AND IN-CAB WARNING ENHANCEMENTS
Systems and methods are provided for intelligent driving monitoring systems, advanced driver assistance systems, and autonomous driving systems, and for providing alerts to the driver of a vehicle based on anomalies detected between driver behavior and the environment captured by the outward-facing camera. Various aspects of the driver, which may include the driver's direction of sight, point of focus, posture, and gaze, are determined by image processing of the upper visible body of the driver, captured by a driver-facing camera in the vehicle. Other aspects of the environment around the vehicle, captured by the multiple cameras in the vehicle, are used to correlate driver behavior and actions with what is happening outside, in order to detect and warn of anomalies, prevent accidents, provide feedback to the driver, and in general provide a safer driving experience.
SYSTEMS AND METHODS TO INCREASE DRIVER AWARENESS OF EXTERIOR OCCURRENCES
Systems and methods are provided for increasing driver awareness of external audible and visual alerts. The external alerts may be detected by external sensors of a vehicle. Systems may use internal sensors to determine whether a driver of the vehicle appears aware of the external alert or whether the external alert is detectable inside the vehicle. If the driver does not appear aware of the external alert, or the external alert is not detectable inside the vehicle, the vehicle may broadcast an internal alert to the driver. The internal alert may be one or more of a visual alert, an audible alert, or a tactile alert.
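The decision logic this abstract describes — broadcast an internal alert when the driver does not appear aware of the external alert, or when the alert is not detectable inside the cabin — can be sketched as a small guard function; the modality-selection policy here is an assumption, since the abstract only lists the three alert types:

```python
def choose_internal_alert(driver_aware, detectable_inside,
                          modalities=("visual", "audible", "tactile")):
    """Return the internal alert modalities to broadcast. An internal
    alert is needed when the driver does not appear aware of the
    external alert OR the external alert is not detectable inside the
    vehicle; otherwise no internal alert is issued."""
    if driver_aware and detectable_inside:
        return []  # driver already aware and alert audible/visible inside
    return list(modalities)
```

A real system would likely pick modalities based on what the internal sensors say the driver is attending to, rather than firing all three.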
Method for controlling autonomous driving vehicle
Disclosed herein is a method for controlling an autonomous driving vehicle. The vehicle control method includes detecting an eye level of an occupant adjacent to a window through a first camera which captures an image of an inside of the vehicle, setting an area of the window corresponding to the eye level of the occupant as a first area and setting the remaining area of the window as a second area, and adjusting light transmittance of the window such that the light transmittance of the first area is lower than the light transmittance of the second area.
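The area-splitting step can be sketched as partitioning the window height around the detected eye level; the band height and the two transmittance values below are illustrative assumptions, not values from the patent:

```python
def window_transmittance(window_top, window_bottom, eye_level,
                         band_half_height=0.15,
                         low_t=0.3, high_t=0.8):
    """Split a window (heights in metres) into a first area around the
    occupant's eye level, given a lower light transmittance, and the
    remaining second area, given a higher transmittance. Returns each
    area as (bottom, top, transmittance)."""
    first_top = min(window_top, eye_level + band_half_height)
    first_bottom = max(window_bottom, eye_level - band_half_height)
    return {
        "first_area": (first_bottom, first_top, low_t),
        "second_areas": [(window_bottom, first_bottom, high_t),
                         (first_top, window_top, high_t)],
    }
```

Clamping against the window edges handles occupants whose eye level sits near the top or bottom of the glass.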
Information providing device, movable body, and method of providing information
An information providing device includes an external environment recognition unit that recognizes an external environmental situation of a movable body, an information providing control unit that provides a notification of recommended stopping information based on a current indication of traffic signals recognized by the external environment recognition unit, and a determination unit which determines, in a case where a first intersection and a second intersection are positioned in a travel direction of the movable body, whether or not a degree of proximity of the first intersection and the second intersection satisfies a predetermined condition. In a case where the determination unit determines that the predetermined condition is satisfied, the information providing control unit prevents the recommended stopping information based on a current indication of the traffic signal belonging to the second intersection from being provided.
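The suppression rule — withhold the second intersection's recommended-stopping information when the two intersections are too close — can be sketched as a simple distance-gap check; the abstract does not define the "predetermined condition", so the distance threshold and data layout here are assumptions:

```python
def stopping_info_to_provide(intersections, proximity_threshold=30.0):
    """Given intersections ordered by distance ahead (metres), each with
    its current signal indication, return the signal indications whose
    recommended stopping information should be provided. The second
    intersection's information is withheld when the gap to the first
    is below the threshold (assumed form of the proximity condition)."""
    if len(intersections) < 2:
        return [i["signal"] for i in intersections]
    first, second = intersections[0], intersections[1]
    if second["distance"] - first["distance"] < proximity_threshold:
        return [first["signal"]]  # suppress second intersection's info
    return [first["signal"], second["signal"]]
```

The point of the suppression is to avoid presenting stopping guidance for two signals at once when they are so close that the driver could confuse which signal the guidance refers to.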