Patent classifications
B60Q9/00
Context-sensitive adjustment of off-road glance time
Methods and systems involve obtaining information from one or more sources to determine a real-time context. A method includes determining a situational awareness score indicating the level of vigilance required by the real-time context, obtaining images of the driver's eyes to detect the driver's gaze pattern, determining a relevant attention score indicating the driver's level of engagement with the real-time context based on a match between the gaze pattern and the real-time context, and obtaining images of the driver to detect the driver's behavior. A driver readiness score, indicating the driver's level of readiness to resume driving the vehicle, is determined based on the driver's behavior. An off-road glance time is then obtained using the situational awareness score, the relevant attention score, and the driver readiness score.
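A minimal sketch of how the three scores could be combined into a glance-time allowance. The function name, the [0, 1] score normalisation, and the weighting constants are all illustrative assumptions, not taken from the patent: higher required vigilance shortens the allowed off-road glance time, while higher attention and readiness lengthen it.

```python
def off_road_glance_time(situational_awareness: float,
                         relevant_attention: float,
                         driver_readiness: float,
                         base_time_s: float = 2.0) -> float:
    """Combine the three scores into an allowable off-road glance time.

    Each score is assumed to be normalised to [0, 1]. A higher
    situational awareness score (more vigilance required) shortens the
    allowance; higher attention and readiness lengthen it.
    """
    for score in (situational_awareness, relevant_attention, driver_readiness):
        if not 0.0 <= score <= 1.0:
            raise ValueError("scores must lie in [0, 1]")
    # Illustrative linear weighting: vigilance counts half, the two
    # driver-state scores a quarter each.
    scale = ((1.0 - situational_awareness) * 0.5
             + relevant_attention * 0.25
             + driver_readiness * 0.25)
    return base_time_s * scale
```

With these example weights, a fully engaged and ready driver in a demanding context (all scores 1.0) gets half the base allowance, while the same driver in an undemanding context gets the full base allowance.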
Gaze target detector
A gaze target detector includes a line-of-sight detector, a relative speed data acquiring unit, a relative position data acquiring unit, a curvature calculator, a threshold adjuster, and a gaze determination unit. The line-of-sight detector detects the line of sight of an occupant in a vehicle. The relative speed data acquiring unit acquires the relative speed between the vehicle and a gaze target at which the occupant is gazing. The relative position data acquiring unit acquires the relative position between the vehicle and the gaze target. The curvature calculator calculates the curvature of the vehicle's traveling track. The threshold adjuster adjusts a threshold, used to determine the gaze target, based on at least one of the relative speed, the relative position, or the curvature. The gaze determination unit determines the gaze target as a gaze event based on the adjusted threshold.
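A sketch of one plausible threshold-adjustment rule, assuming the threshold is a minimum dwell time and that fast-moving, nearby targets and tight curves should need shorter dwells to register as gaze events. The function names and all constants are hypothetical.

```python
def adjust_dwell_threshold(base_threshold_s: float,
                           rel_speed_mps: float,
                           rel_distance_m: float,
                           curvature_per_m: float) -> float:
    """Shorten the dwell-time threshold for fast, close targets and
    tight curves so that brief glances at them still count as gaze
    events. Each factor lies in (0, 1]; the constants are illustrative.
    """
    speed_factor = 1.0 / (1.0 + 0.05 * abs(rel_speed_mps))
    distance_factor = min(1.0, max(rel_distance_m, 1.0) / 50.0)
    curve_factor = 1.0 / (1.0 + 20.0 * abs(curvature_per_m))
    return base_threshold_s * speed_factor * distance_factor * curve_factor

def is_gaze_event(dwell_s: float, threshold_s: float) -> bool:
    """The gaze determination step: dwell met the adjusted threshold."""
    return dwell_s >= threshold_s
```

For example, a target closing at 20 m/s only 10 m away would be detected with a much shorter dwell than a stationary target 100 m ahead on a straight road.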
Vehicular control system with traffic lane detection
A vehicular control system includes a forward viewing camera disposed at an in-cabin side of a windshield of a vehicle and viewing forward of the vehicle. Road curvature of the road along which the vehicle is traveling is determined responsive at least in part to processing, by an image processor, of image data captured by the camera. Responsive at least in part to processing of captured image data, a traffic lane of the road along which the vehicle is traveling is determined. Upon approach of the vehicle to a curve in the road, the speed of the vehicle is reduced to a reduced speed for traveling around the curve at least in part responsive to at least one selected from the group consisting of (a) processing of image data captured by the forward viewing camera and (b) data relevant to a current geographical location of the vehicle.
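The curve-speed reduction can be grounded in a standard physical relation not spelled out in the abstract: lateral acceleration on a curve is v² × curvature, so capping it at a comfort limit a_lat gives v ≤ √(a_lat / curvature). A sketch under that assumption (the 2 m/s² default is illustrative):

```python
import math

def curve_speed_limit_mps(curvature_per_m: float,
                          max_lat_accel_mps2: float = 2.0) -> float:
    """Highest speed keeping lateral acceleration v^2 * curvature
    within the comfort limit: v <= sqrt(a_lat / curvature)."""
    if curvature_per_m <= 0.0:
        return float("inf")  # effectively straight: no curve-based limit
    return math.sqrt(max_lat_accel_mps2 / curvature_per_m)

def speed_for_curve(current_mps: float, curvature_per_m: float) -> float:
    """Reduce speed on approach only when the curve demands it."""
    return min(current_mps, curve_speed_limit_mps(curvature_per_m))
```

For a 100 m radius curve (curvature 0.01 per metre) this limits the vehicle to about 14 m/s (≈50 km/h), while a straight road imposes no curve-based reduction.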
Vehicle steering system
A steering system for a vehicle, including: a pair of wheel steering devices that respectively steer right and left wheels independently of each other; and a controller configured to control the pair of wheel steering devices, wherein the controller is configured to: determine a standard steering amount of each of the right and left wheels in accordance with a steering request; execute opposite-phase shift steering in which steering amounts of the respective right and left wheels are shifted in mutually opposite directions by respective shift amounts with respect to the standard steering amounts determined respectively for the right and left wheels; and estimate a friction coefficient of a road surface on which the vehicle is running based on steering forces respectively applied to the right and left wheels in the opposite-phase shift steering.
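A heavily simplified sketch of the friction-estimation idea: on a low-grip surface the tyres develop less steering force for the same opposite-phase shift, so the measured forces, normalised by a calibration value expected on high-grip asphalt, approximate the friction coefficient. The function name, the averaging, and the nominal-force constant are all assumptions for illustration, not the patent's actual estimator.

```python
def estimate_road_friction(left_force_n: float,
                           right_force_n: float,
                           nominal_force_n: float = 400.0) -> float:
    """Crude friction-coefficient estimate from the steering forces
    measured during an opposite-phase shift.

    nominal_force_n is an assumed calibration constant: the force the
    same shift would produce on a high-grip (mu ~ 1) surface. The
    result is clamped to [0, 1].
    """
    average_force = 0.5 * (abs(left_force_n) + abs(right_force_n))
    return max(0.0, min(1.0, average_force / nominal_force_n))
```

Because the left and right shifts are in opposite phase, their lateral effects on the vehicle largely cancel, which is what lets the probe run without disturbing the driver's intended path.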
Dynamic lighting and sensor adjustment for occupant monitoring
Systems and methods to proactively adapt image sensor settings of an occupant monitoring system to accommodate ambient lighting changes are provided. The system may track current ambient light, track the driver, and predict upcoming changing lighting conditions. Sensors are used to predict when the ambient light will change and preemptively adjust to prevent dark or washout images. The sensor adjustment may be timed so that the adjustment is made just in time or concurrently with the change in ambient light conditions. The system may predict when the light on the occupant's face will change and proactively readjust the settings of the sensor to accommodate the changed lighting conditions.
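A sketch of the "just in time" scheduling idea: given a predicted illuminance and the time until the change, pick a target sensor setting and fire the adjustment just before the light changes. The lux-to-exposure mapping, the gain cutoff, and the latency constant are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorSetting:
    exposure_ms: float
    gain: float

def schedule_adjustment(predicted_lux: float,
                        time_to_change_s: float,
                        adjust_latency_s: float = 0.1):
    """Return (seconds until the new setting should apply, setting).

    Brighter predicted scenes get shorter exposure and lower gain
    (illustrative log mapping). The adjustment is timed to land just as
    the lighting changes, preventing dark or washed-out frames.
    """
    target = SensorSetting(
        exposure_ms=max(0.5, 33.0 / max(1.0, math.log10(predicted_lux + 10.0))),
        gain=1.0 if predicted_lux > 1000.0 else 4.0,
    )
    # Never schedule in the past: clamp at "apply immediately".
    apply_in_s = max(0.0, time_to_change_s - adjust_latency_s)
    return apply_in_s, target
```

Entering a tunnel exit predicted one second ahead, for instance, the system would commit the bright-scene setting 0.9 s from now so the sensor is already adapted when sunlight hits the occupant's face.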
Electronic shift operation apparatus and control method thereof
In an electronic shift operation apparatus, a haptic actuator is controlled according to the kind of shift-stage signal (P, R, N, or D) generated when a shift button is operated, so that a different type of haptic signal is produced for each stage. When the R-stage button is operated, the haptic actuator is additionally controlled according to the distance to an object behind the vehicle, generating an extra haptic signal.
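A sketch of the two-level haptic logic: a per-stage base pattern, plus an extra warning pattern appended only for R when a rear object is close. All pulse counts, durations, and distance cutoffs are invented for illustration.

```python
# Base (pulse count, duration_ms) per shift stage; values illustrative.
HAPTIC_PATTERNS = {"P": (1, 50), "R": (2, 80), "N": (1, 30), "D": (1, 60)}

def haptic_for_shift(stage, rear_distance_m=None):
    """Return the list of (pulses, duration_ms) haptic signals to play.

    Every stage gets its own base pattern so the driver can distinguish
    P/R/N/D by feel alone. Selecting R with a nearby rear object
    appends a warning pattern whose intensity grows as the object
    gets closer.
    """
    signals = [HAPTIC_PATTERNS[stage]]
    if stage == "R" and rear_distance_m is not None and rear_distance_m < 2.0:
        extra_pulses = 3 if rear_distance_m < 1.0 else 2
        signals.append((extra_pulses, 40))
    return signals
```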
Device for classifying data
A device is configured to classify data. Its operation involves providing (210) data samples, including one or more of image, radar, acoustic, and/or lidar data, to a processing unit. The data samples include at least one test sample, together with positive samples and negative samples. Each positive sample has been determined to contain data relating to at least one object to be detected, including at least one pedestrian, car, vehicle, truck, or bicycle; each negative sample has been determined not to contain such data. These determinations regarding the positive and negative samples are provided as input data, validated by a human operator, and/or produced by the device itself through a learning algorithm. A first plurality of groups is generated (220) by the processing unit implementing an artificial neural network, wherein at least some of the groups are assigned a weighting factor. Each group is populated (230) by the processing unit with at least one distinct negative sample, based on a different feature of sample-data similarity for each group; this involves processing the negative samples to determine a number of different similarity features so that each group is populated with negative samples that share, or substantially share, that or a similar feature. At least one of the groups contains at least two negative samples. It is then determined (240) by the processing unit whether the at least one test sample contains data relating to the at least one object, based on the plurality of positive samples and the first plurality of groups. The artificial neural network implements the learning algorithm.
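The grouping step (230) can be sketched as bucketing negative samples by a similarity feature, with an optional per-group weighting factor as in step (220). The function, the string-valued features, and the weight handling are illustrative assumptions; the patent leaves the feature extraction to the neural network.

```python
from collections import defaultdict

def group_negative_samples(negatives, feature_fn, group_weights=None):
    """Bucket negative samples by a similarity feature.

    feature_fn maps a sample to a hashable feature value; samples that
    share (or substantially share) the feature land in the same group.
    Each group carries a weighting factor (default 1.0) for use in the
    later classification step.
    """
    buckets = defaultdict(list)
    for sample in negatives:
        buckets[feature_fn(sample)].append(sample)
    group_weights = group_weights or {}
    return {feature: {"samples": samples,
                      "weight": group_weights.get(feature, 1.0)}
            for feature, samples in buckets.items()}
```

For instance, hard negatives that all contain strong shadows could share one group with an elevated weight, so the network is penalised more for confusing them with pedestrians.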