B60Q9/00

Localized traffic data collection

A system and method for collecting, processing, storing, or transmitting traffic data. A localized data collection module may retrieve, receive, or intercept traffic data through or from hardware installed in a traffic control cabinet adjacent an intersection or other roadway feature of interest. Data which may have previously been confined to a closed-loop traffic control system may be remotely accessible for traffic operations control or monitoring via a network-connected server and/or cloud architecture.
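The data flow the abstract describes can be sketched roughly as follows: a module local to the traffic control cabinet intercepts detector readings that would otherwise stay inside the closed loop, buffers them, and hands batches to a remote server. All class and field names here are illustrative assumptions, not the patented design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectorReading:
    # Hypothetical shape of one reading captured from cabinet hardware.
    detector_id: str
    timestamp: float
    vehicle_count: int

@dataclass
class LocalCollectionModule:
    buffer: List[DetectorReading] = field(default_factory=list)

    def intercept(self, reading: DetectorReading) -> None:
        """Capture a reading that would otherwise stay in the closed loop."""
        self.buffer.append(reading)

    def flush_to_remote(self) -> List[DetectorReading]:
        """Hand buffered data off (e.g. to a network-connected server) and clear it."""
        batch, self.buffer = self.buffer, []
        return batch
```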

Illumination-based object tracking within a vehicle

Systems and methods for providing illumination-based object tracking information within a host vehicle. In some embodiments, the system may comprise a remote object detection module and an illumination display pattern within the vehicle comprising one or more light sources defining a pattern. The illumination display pattern may be configured to dynamically change in accordance with one or more objects being detected and/or tracked by the remote object detection module to convey visible information to vehicle occupants regarding such object(s).
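One plausible way the display pattern could "dynamically change" with a tracked object is to map the object's bearing onto a strip of light sources, so the lit segment follows the object across the cabin. The linear mapping and all names below are assumptions for illustration, not the claimed method.

```python
def lit_led_index(bearing_deg: float, n_leds: int = 16,
                  fov_deg: float = 180.0) -> int:
    """Return which LED in a strip should light for a tracked object's bearing.

    bearing_deg: 0 = straight ahead, negative = left, positive = right.
    """
    # Clamp the bearing into the strip's assumed field of view.
    half = fov_deg / 2.0
    b = max(-half, min(half, bearing_deg))
    # Linearly map [-half, +half] onto LED indices [0, n_leds - 1].
    frac = (b + half) / fov_deg
    return min(n_leds - 1, int(frac * n_leds))
```

A real system would drive this per video frame as the remote object detection module updates each object's bearing.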

Implementing and optimizing safety interventions
11514485 · 2022-11-29

A network system provides interventions to providers to reduce the likelihood that its users will experience safety incidents. The providers provide service, such as transportation, to the users. Providers who are safe and have positive interpersonal behavior may be perceived by users as high-quality providers, while other providers may be more prone to cause safety incidents. A machine learning model is trained using features derived from service received by users of the network system. Randomized experiments and trained models predict the effectiveness of various interventions on a provider based on characteristics of the provider and the feedback received for the provider. As interventions are sent to providers, the change in feedback can indicate whether the intervention was effective. By providing messages proactively, the network system may prevent future safety incidents from occurring.
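The selection step described above can be sketched as scoring each candidate intervention against a provider's features and picking the one with the highest predicted effect. The dot-product scorer below is a deliberate stand-in for the trained model in the abstract; the feature and intervention names are hypothetical.

```python
from typing import Dict

def predicted_effect(features: Dict[str, float],
                     weights: Dict[str, float]) -> float:
    """Stand-in for a trained model's predicted incident-rate reduction."""
    return sum(features.get(k, 0.0) * w for k, w in weights.items())

def choose_intervention(features: Dict[str, float],
                        interventions: Dict[str, Dict[str, float]]) -> str:
    """Return the candidate intervention with the highest predicted effect."""
    return max(interventions,
               key=lambda name: predicted_effect(features, interventions[name]))
```

Per the abstract, the post-intervention change in user feedback would then feed back into retraining, closing the loop on effectiveness estimates.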

Movable carrier auxiliary system

A movable carrier auxiliary system includes at least one optical image capturing system disposed on a movable carrier, at least one warning module, and at least one displaying device. The optical image capturing system includes an image capturing module and an operation module, and has at least one lens group including at least two lenses having refractive power. The image capturing module captures and produces an environmental image surrounding the movable carrier. The operation module, electrically connected to the image capturing module, detects at least one lane marking in the environmental image to generate a detecting signal. The warning module, electrically connected to the operation module, receives the detecting signal to determine whether a moving direction of the movable carrier deviates from a lane, and generates a warning signal when the moving direction deviates from the lane. The displaying device, electrically connected to the warning module, displays the warning signal.
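The warning module's decision can be sketched under one simplifying assumption: that the operation module's detecting signal reduces to the carrier's lateral offset from the lane centre. The threshold values and function name are illustrative only.

```python
def lane_departure_warning(lateral_offset_m: float,
                           lane_half_width_m: float = 1.75,
                           margin_m: float = 0.3) -> bool:
    """True when the carrier has drifted to within `margin_m` of a lane marking.

    lateral_offset_m: signed distance from the lane centre (negative = left).
    """
    return abs(lateral_offset_m) >= lane_half_width_m - margin_m
```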

Drowsiness detection system
11514688 · 2022-11-29

A machine-implemented method for automated detection of drowsiness. The method includes receiving, onto processing hardware, a series of images of an operator's face from an imaging device directed at the face; detecting, on the processor, facial landmarks of the operator from the series of images to determine the operator's level of talking, level of yawning, and PERCLOS (percentage of eyelid closure); detecting, on the processor, the facial pose of the operator from the series of images to determine the operator's level of gaze fixation; calculating, on the processor, the operator's level of drowsiness by ensembling the levels of talking, yawning, PERCLOS, and gaze fixation; and generating an alarm when the calculated level of drowsiness exceeds a predefined value.

Radar system for internal and external environmental detection

Examples disclosed herein relate to radar systems that coordinate detection of objects external to a vehicle with detection of distractions within it. A method of environmental detection with a radar system includes detecting an object in an external environment of a vehicle with the radar system positioned on the vehicle. The method includes determining a distraction metric from measurements of user activity obtained within the vehicle with the radar system. The method includes adjusting one or more detection parameters of the radar system based at least on the detected object and the distraction metric. Other examples disclosed herein relate to a radar sensing unit for a vehicle that includes an internal distraction sensor, an external object detection sensor, a coordination sensor, and a central controller for internal and external environmental detection.
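The parameter-adjustment step can be sketched as: when the in-cabin distraction metric rises, widen the external radar's scan sector and raise its frame rate so threats are flagged earlier. The parameter names and scaling rule below are assumptions, not the disclosed method.

```python
def adjust_radar_params(distraction: float,
                        base_fov_deg: float = 60.0,
                        base_rate_hz: float = 10.0) -> dict:
    """Adjust external detection parameters from a distraction metric in [0, 1]."""
    d = max(0.0, min(1.0, distraction))  # clamp the metric
    return {
        "fov_deg": base_fov_deg * (1.0 + 0.5 * d),  # up to 50% wider sector
        "rate_hz": base_rate_hz * (1.0 + d),        # up to double the scan rate
    }
```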

Vehicle and method of controlling the same
11511731 · 2022-11-29

A method of controlling a vehicle includes: recognizing a forward vehicle by processing image data captured by an image sensor disposed at the vehicle so as to have a field of view of the outside of the vehicle; obtaining a distance to the forward vehicle by processing detection data captured by a radar disposed at the vehicle so as to have a detection area outside the vehicle; when the distance to the forward vehicle is equal to or less than a reference distance, obtaining the amount of change in vertical movement of the forward vehicle in the image data and obtaining a height of an obstacle on the road surface corresponding to that amount; when the distance to the forward vehicle exceeds the reference distance, obtaining the height of the obstacle on the road surface from the image data; identifying a driving speed of the vehicle; identifying a reference height corresponding to the driving speed of the vehicle; and outputting deceleration guide information when the height of the obstacle on the road surface is greater than or equal to the reference height.
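The final decision step can be sketched under the assumption that the speed-dependent reference height is a simple lookup: faster travel tolerates lower obstacles, so the reference height shrinks as speed grows. The specific heights and speed bands below are illustrative values, not ones from the patent.

```python
def reference_height_m(speed_kph: float) -> float:
    """Hypothetical mapping from driving speed to the reference obstacle height."""
    if speed_kph < 30.0:
        return 0.10   # low speed: only tall obstacles warrant guidance
    if speed_kph < 60.0:
        return 0.06
    return 0.03       # high speed: even small bumps warrant guidance

def should_guide_deceleration(obstacle_height_m: float,
                              speed_kph: float) -> bool:
    """Output deceleration guidance when the obstacle meets the reference height."""
    return obstacle_height_m >= reference_height_m(speed_kph)
```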