Patent classifications
B60W2040/089
Method and system for adaptive motion planning based on passenger reaction to vehicle motion in autonomous driving vehicles
The present teaching relates to a method, system, medium, and implementation of automatic motion planning for an autonomous driving vehicle. Information is obtained with respect to a current location of the autonomous driving vehicle operating with a current vehicle motion, wherein the information is to be used to estimate the operational capability of the autonomous driving vehicle with respect to the current location. Sensor data are obtained via one or more sensors in one or more media types. A reaction of a passenger present in the vehicle with respect to the current vehicle motion is estimated based on the sensor data. Motion planning is then performed based on the information with respect to the current location and the reaction of the passenger to the current vehicle motion.
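The claimed flow — scale the planned motion by both the location's operational capability and an estimated passenger reaction — might be sketched as follows. All names, data shapes, and scaling factors are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class MotionPlan:
    max_speed: float        # m/s
    max_lateral_acc: float  # m/s^2

def estimate_passenger_reaction(sensor_frames: list) -> float:
    """Fuse multi-modal sensor frames (visual, audio, ...) into a single
    discomfort score in [0, 1]; 0 = comfortable, 1 = highly uncomfortable."""
    if not sensor_frames:
        return 0.0
    scores = [f.get("discomfort", 0.0) for f in sensor_frames]
    return sum(scores) / len(scores)

def plan_motion(base_plan: MotionPlan,
                road_capability: float,
                discomfort: float) -> MotionPlan:
    """Scale a baseline plan down by the location's operational capability
    and by the estimated passenger reaction to the current motion."""
    factor = road_capability * (1.0 - 0.5 * discomfort)
    return MotionPlan(base_plan.max_speed * factor,
                      base_plan.max_lateral_acc * factor)

plan = plan_motion(MotionPlan(20.0, 3.0), road_capability=0.9, discomfort=0.4)
```

A discomfort of 0.4 on a 0.9-capability road segment halves neither quantity but trims both by the combined factor 0.9 × 0.8 = 0.72.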
Vehicular cabin monitoring system
A vehicular cabin monitoring system includes a radar assembly disposed in a cabin of a vehicle and operable to capture radar data. The radar assembly is housed in an interior rearview mirror assembly of the vehicle and includes at least one radar transmit antenna that is operable to transmit radar waves and at least one radar receive antenna that is operable to receive radar waves. A control includes a data processor for processing radar data captured by the radar assembly. The control, via processing at the data processor of radar data captured by the radar assembly, detects movement of a body part of an occupant present in the cabin of the vehicle. The control, responsive to detecting movement of the body part of the occupant in the cabin of the vehicle, generates a control command associated with at least one operation of the vehicle.
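The last step of the abstract — mapping a detected body-part movement to a control command — could look like the hypothetical table below; the radar processing itself is abstracted away, and every gesture and command name is an assumption for illustration:

```python
from typing import Optional

def gesture_to_command(gesture: str) -> Optional[str]:
    """Map a radar-detected in-cabin body-part movement to a vehicle
    control command; return None for unrecognized movements."""
    commands = {
        "hand_wave": "toggle_dome_light",
        "hand_swipe_left": "previous_track",
        "hand_swipe_right": "next_track",
    }
    return commands.get(gesture)
```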
Method for controlling moving body based on collaboration between the moving body and human, and apparatus for controlling the moving body thereof
The present disclosure relates to technology that controls a remote moving body based on collaboration between the moving body and a human. A method for controlling a moving body includes: acquiring, from a user, a first biosignal indicating an intention to start operation of the moving body; operating the moving body; determining a surrounding situation of the moving body, which autonomously controls the driving; providing the user with surrounding information of the moving body to induce path setting; acquiring, from the user, a second biosignal evoked by recognition of the surrounding information; setting a driving direction of the moving body; commanding the moving body to automatically perform a driving operation in the set driving direction; and acquiring, from the user, a third biosignal responsive to recognition of a driving error and correcting the driving direction of the moving body to induce driving path resetting.
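The three-biosignal collaboration loop described above might be sketched as a dispatcher over a stream of already-decoded biosignals; the biosignal decoding itself is stubbed out, and all class and method names are illustrative assumptions:

```python
class MovingBody:
    """Stub that records issued commands, standing in for the real actuator."""
    def __init__(self):
        self.log = []
    def start(self):
        self.log.append("start")
    def set_direction(self, d):
        self.log.append(f"dir:{d}")
    def drive(self):
        self.log.append("drive")
    def correct_direction(self, d):
        self.log.append(f"correct:{d}")

def collaborate(biosignals, body):
    """Dispatch each decoded biosignal to the corresponding control step."""
    for kind, payload in biosignals:
        if kind == "start_intent":        # first biosignal: start operation
            body.start()
        elif kind == "direction_choice":  # second: evoked by surroundings
            body.set_direction(payload)
            body.drive()
        elif kind == "error_detected":    # third: driving-error response
            body.correct_direction(payload)

body = MovingBody()
collaborate([("start_intent", None),
             ("direction_choice", "left"),
             ("error_detected", "right")], body)
```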
DRIVER MONITORING DEVICE, DRIVER MONITORING METHOD, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM
A driver monitoring device includes: a processor; and a memory. The memory stores instructions that, when executed by the processor, cause the driver monitoring device to perform operations including: detecting an unbalanced posture of a driver based on an image of the driver’s seat; determining whether the unbalanced posture is a habitual unbalanced posture of the driver; calculating an opening degree of at least one eye of the driver based on the image in a case in which it is determined that the unbalanced posture is other than the habitual unbalanced posture; determining whether it is a timing to give a notification in a case in which the vehicle is traveling and the opening degree is equal to or larger than a first threshold; and giving the notification to the driver in a case in which it is determined that it is the timing to give the notification.
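The abstract's steps form a clean decision pipeline, sketched below. The threshold value, parameter names, and boolean inputs (which in the device come from image processing) are all assumptions for illustration:

```python
def monitor_driver(posture_unbalanced: bool,
                   posture_is_habitual: bool,
                   eye_opening: float,
                   vehicle_traveling: bool,
                   notify_timing_ok: bool,
                   first_threshold: float = 0.7) -> bool:
    """Return True if a notification should be given to the driver."""
    if not posture_unbalanced:
        return False              # no unbalanced posture detected
    if posture_is_habitual:
        return False              # habitual posture: no further checking
    # Eye-opening degree is evaluated only for non-habitual unbalanced postures
    if vehicle_traveling and eye_opening >= first_threshold:
        return notify_timing_ok   # notify only at an appropriate timing
    return False
```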
System and Method For Utilizing a Health Monitoring Device to Detect a Health Emergency to Activate an Autopilot Feature
A system including a software application for detecting the health status of a person operating a vehicle, aircraft or machine, and for evaluating whether said operator has become physically incapable of operating the vehicle and for thereafter initiating the auto-control technology present in the vehicle, aircraft or machine. The system includes a health monitoring device worn by an operator, a sensor located in the health monitoring device capable of sending a signal, a storage for maintaining vital health statistics, a health monitoring system capable of comparing vital health statistics and a control system capable of receiving signals and activating the auto-control technology.
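A minimal sketch of the comparison-and-activation logic, assuming heart rate as the monitored vital and a stored baseline range; the field names and thresholds are hypothetical, not from the patent:

```python
def operator_incapacitated(heart_rate: float, baseline: dict) -> bool:
    """Flag incapacity when the live heart rate leaves the stored healthy range."""
    return not (baseline["hr_min"] <= heart_rate <= baseline["hr_max"])

def control_loop(heart_rate: float, baseline: dict, autopilot: list) -> None:
    """One monitoring cycle: compare vitals, and signal auto-control on emergency."""
    if operator_incapacitated(heart_rate, baseline):
        autopilot.append("ACTIVATE")   # stand-in for the auto-control signal

events = []
control_loop(heart_rate=31.0, baseline={"hr_min": 45, "hr_max": 160}, autopilot=events)
```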
CABIN MONITORING AND SITUATION UNDERSTANDING PERCEIVING METHOD AND SYSTEM THEREOF
A cabin monitoring and situation understanding perceiving method is proposed. A cabin interior image capturing step is performed to capture a cabin interior image. A generative adversarial network model creating step is performed to create a generative adversarial network model according to the cabin interior image. An image adjusting step is performed to adjust the cabin interior image to generate an approximate image. A cabin interior monitoring step is performed to process the approximate image to generate a facial recognizing result and a human pose estimating result. A cabin exterior image and voice capturing step is performed to capture a cabin exterior image and voice information. A situation understanding perceiving step is performed to process at least one of the approximate image, the cabin exterior image, and the voice information according to a situation understanding model to perceive a situation understanding result.
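The ordering of the steps can be sketched as a stubbed pipeline; every function here is a placeholder for the patent's GAN, recognition, and situation-understanding models, and all names are assumptions:

```python
def adjust_image(img):
    """Placeholder for the GAN-based adjustment yielding an approximate image."""
    return {"approx_of": img}

def recognize_face(approx):
    """Placeholder for the facial recognizing step."""
    return "face_id_stub"

def estimate_pose(approx):
    """Placeholder for the human pose estimating step."""
    return "pose_stub"

def understand(approx, exterior_img, voice):
    """Placeholder for the situation understanding model over all inputs."""
    return {"inputs_present": [bool(approx), bool(exterior_img), bool(voice)]}

def cabin_pipeline(interior_img, exterior_img, voice):
    """Run the described steps in order and return the three results."""
    approx = adjust_image(interior_img)
    return (recognize_face(approx),
            estimate_pose(approx),
            understand(approx, exterior_img, voice))

face, pose, situation = cabin_pipeline("interior", "exterior", "voice")
```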
AUTONOMOUS VEHICLE CONTROL TRANSITIONING
Signals are received from a plurality of sources, representing operating characteristics of a vehicle and an environment surrounding the vehicle. A plurality of operational factors are developed based on the signals. The vehicle is controlled according to one of at least three levels of control, including an autonomous, a semi-autonomous, and a manual level of control, based on the operational factors.
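Selecting among the three levels of control from operational factors might look like the sketch below; the factor names, aggregation by minimum, and cutoff values are illustrative assumptions only:

```python
def select_control_level(factors: dict) -> str:
    """Map operational factors (each in [0, 1]) to one of three control levels."""
    # Aggregate confidence that autonomous operation is currently viable;
    # the weakest factor dominates (conservative min-aggregation).
    confidence = min(factors.get("sensor_health", 1.0),
                     factors.get("road_readability", 1.0),
                     factors.get("localization", 1.0))
    if confidence >= 0.8:
        return "autonomous"
    if confidence >= 0.5:
        return "semi-autonomous"
    return "manual"
```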
INFORMATION PROVIDING DEVICE AND INFORMATION PROVIDING METHOD
A confusion degree determining unit determines a degree of confusion of an occupant by using occupant state information acquired by an occupant state acquiring unit. A recognition degree determining unit determines a degree of recognition of the occupant, with respect to surrounding conditions and automatic control of a vehicle, by using surrounding condition information and control information acquired by a host vehicle status acquiring unit and the occupant state information acquired by the occupant state acquiring unit. An information generation unit generates information to be provided to the occupant by using the surrounding condition information and control information acquired by the host vehicle status acquiring unit, the degree of confusion determined by the confusion degree determining unit, and the degree of recognition determined by the recognition degree determining unit.
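A hypothetical illustration of how the two determined degrees could jointly select what to provide to the occupant; the numeric cutoffs and message strings are assumptions, not the patent's logic:

```python
def generate_information(confusion: float, recognition: float) -> str:
    """Both inputs in [0, 1]; higher confusion or lower recognition of the
    surroundings/automatic control calls for richer provided information."""
    if confusion > 0.6 or recognition < 0.4:
        return "detailed explanation of surroundings and control action"
    if confusion > 0.3:
        return "brief notice of the current control action"
    return "no additional information"
```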
Information Presenting Apparatus and Information Presenting Method
An information presenting apparatus is used in an autonomous vehicle capable of switching between autonomous driving control and manual driving control. The information presenting apparatus determines a response action for checking that the driver is ready to take over when the autonomous driving control is switched to the manual driving control, performs control for requesting the driver to perform the response action determined, and detects the response action performed by the driver.
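The request-and-detect loop for checking takeover readiness might be sketched as follows; the callback shapes, prompt text, and timeout are assumptions for illustration:

```python
import time

def request_takeover(perform_action, detect_action, timeout_s: float = 4.0) -> bool:
    """Request a response action from the driver and return True only if the
    action is detected before the deadline."""
    deadline = time.monotonic() + timeout_s
    perform_action("Please grip the steering wheel")  # request to the driver
    while time.monotonic() < deadline:
        if detect_action():                           # e.g. torque sensor on wheel
            return True
        time.sleep(0.05)
    return False   # not ready: keep autonomous control or perform a safe stop
```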
Emotive advisory system and method
Information about a device may be emotively conveyed to a user of the device. Input indicative of an operating state of the device may be received. The input may be transformed into data representing a simulated emotional state. Data representing an avatar that expresses the simulated emotional state may be generated and displayed. A query from the user regarding the simulated emotional state expressed by the avatar may be received. The query may be responded to.