Patent classifications
B60K2360/119
Display Unit with Changing Symbols
A display unit includes a layer with a plurality of sub-regions which are arranged adjacently to one another along an illuminated edge of the layer. A corresponding number of symbols are arranged in the sub-regions. The display unit additionally has a plurality of light sources. These light sources are arranged adjacently to one another along the illuminated edge of the layer. Each light source is configured to emit light into the respective sub-region of the layer via the illuminated edge of the layer. The display unit additionally has a limiting element which is arranged between the illuminated edge of the layer and the light sources and which is configured to limit the light emitted from the light sources for each sub-region such that the symbol in the respective sub-region is illuminated selectively by the light sources.
Vehicle user interface apparatus and vehicle
A vehicle user interface apparatus for a vehicle including a light emitting unit configured to emit light; a touch sensor configured to sense a touch; a processor configured to control the light emitting unit to emit the light in response to an event, activate the touch sensor when the light is emitted, and provide a signal to control a vehicle device corresponding to a touch input received through the activated touch sensor; and a cover covering the light emitting unit, the touch sensor, and the processor, the cover configured to transmit the light into a passenger compartment with a specific shape when the light is emitted.
User interface and method for assisting a user in the operation of an operator control unit
User interfaces and processes to support a user during the operation of a touch-sensitive control unit. The presence of an input means, such as a finger of the user, is detected in a predefined first area in front of a button displayed on the control unit. In response to the detection, a timer with a predetermined duration is started. When the timer elapses, a secondary function is executed in the control unit for the button over which the input means was detected.
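The dwell-timer behavior this abstract describes (start a timer on presence, cancel on departure, fire a secondary function on expiry) can be sketched as follows; all class, method, and parameter names are hypothetical, not taken from the patent:

```python
class DwellButton:
    """Illustrative sketch of the timer-based secondary-function trigger.

    Timestamps are passed in explicitly (seconds) so the logic is
    deterministic and testable; a real control unit would use a clock.
    """

    def __init__(self, dwell_seconds=0.8):  # hypothetical "predetermined time segment"
        self.dwell_seconds = dwell_seconds
        self._enter_time = None

    def on_presence_detected(self, now):
        # Input means (e.g. a finger) enters the predefined first area: start the timer.
        if self._enter_time is None:
            self._enter_time = now

    def on_presence_lost(self, now):
        # Input means leaves before the timer elapses: cancel the timer.
        self._enter_time = None

    def poll(self, now):
        """Return True exactly once when the timer has elapsed, i.e. when
        the secondary function for this button should be executed."""
        if self._enter_time is not None and now - self._enter_time >= self.dwell_seconds:
            self._enter_time = None  # fire only once per dwell
            return True
        return False
```

A usage sequence: presence at t=0.0 with a 0.5 s dwell does not fire at t=0.3 but does fire at t=0.6, and a presence that is lost before expiry never fires.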
Systems and methods for communicating intent of an autonomous vehicle
The present disclosure provides systems and methods to communicate intent of an autonomous vehicle. In particular, the systems and methods of the present disclosure can receive, from an autonomy computing system of an autonomous vehicle, data indicating an intent of the autonomous vehicle to perform a driving maneuver. It can be determined that the intent of the autonomous vehicle should be communicated to a passenger of the autonomous vehicle. Responsive to determining that the intent of the autonomous vehicle should be communicated to the passenger of the autonomous vehicle, a graphical interface indicating the intent of the autonomous vehicle can be generated and provided for display for viewing by the passenger.
PROPAGATION OF APPLICATION CONTEXT BETWEEN A MOBILE DEVICE AND A VEHICLE INFORMATION SYSTEM
A method, a device, and a vehicle information system are provided for persisting application context from a mobile device to the vehicle information system. An operating context is determined for at least one application executing on the mobile device. A user interface view for display by the vehicle information system is generated and provided to the vehicle information system. The user interface view comprises at least one application user interface element associated with the at least one application, and the application user interface element comprises an application entry point defined by the operating context for the at least one application.
IN-VEHICLE INFORMATION PROCESSING SYSTEM
Provided is an in-vehicle information processing system that can be operated intuitively. An in-vehicle information processing system (10) includes a display (11), a touch operation interface (12), an imaging unit (13), and a controller (14). The display (11) includes at least one screen. The touch operation interface (12) detects contact by an operation hand of an operator. The imaging unit (13) captures an image of the touch operation interface (12) and at least a portion of the operation hand. The controller (14) associates position coordinates in an operation region (R2) on the screen with position coordinates in a predetermined region (R1) of the touch operation interface (12) and displays at least a portion of the operation hand overlaid on the screen on the basis of the image captured by the imaging unit (13).
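The coordinate association between the touch interface's predetermined region (R1) and the screen's operation region (R2) amounts to a linear mapping between two rectangles. A minimal sketch, with illustrative function and parameter names (the patent specifies no concrete geometry):

```python
def map_touch_to_screen(x, y, touch_region, screen_region):
    """Linearly map a contact point in the touch interface's region (R1)
    to position coordinates in the screen's operation region (R2).

    Regions are (left, top, width, height) tuples in their own units.
    """
    tx, ty, tw, th = touch_region
    sx, sy, sw, sh = screen_region
    u = (x - tx) / tw   # normalized horizontal position in R1
    v = (y - ty) / th   # normalized vertical position in R1
    return (sx + u * sw, sy + v * sh)
```

For example, the center of a 100×50 touch region maps to the center of an 800×400 operation region.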
MANAGEMENT METHOD AND APPARATUS FOR VEHICLE SAFETY, AND COMPUTER STORAGE MEDIUM
A management method and apparatus for vehicle safety, and a computer storage medium. The method includes: acquiring at least one of meteorological data and environmental data, wherein the environmental data is collected by an on-board device of a vehicle; determining a weather warning type based on at least one of the meteorological data and the environmental data; determining whether the weather warning type meets a predetermined warning condition; determining vehicle insurance data matched with the vehicle, based on at least one of the meteorological data and the environmental data as well as driving data of the vehicle; and displaying an identifier associated with the vehicle insurance data on at least one of a mobile device and an on-board display screen of the vehicle, wherein the mobile device is associated with the vehicle when a predetermined operation is detected on the mobile device.
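The first two method steps, deriving a warning type from the acquired data and checking it against a predetermined condition, could be sketched as below. The thresholds and type names are hypothetical; the abstract does not specify any concrete values:

```python
def classify_weather_warning(rainfall_mm_per_h, visibility_m):
    """Derive a warning type from meteorological/environmental readings.

    Thresholds are illustrative placeholders, not from the patent.
    """
    if rainfall_mm_per_h >= 50:
        return "rainstorm"
    if visibility_m < 200:
        return "fog"
    return "none"


def meets_warning_condition(warning_type, warned_types=("rainstorm", "fog")):
    """Check whether the derived type meets the predetermined warning condition."""
    return warning_type in warned_types
```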
COMPONENT FOR VEHICLE INTERIOR
A component for a vehicle interior may comprise a light guide to allow light transmission from a light source, and a cover to cover the light guide. The light guide may comprise an icon or indicator and the cover may comprise a depression or indentation in an inner surface of the cover. The icon/indicator of the light guide may comprise a light-transmissive resin material configured to fit within the depression/indentation in the cover. The icon/indicator may be presented when illuminated and hidden by the cover when not illuminated. The cover may comprise a polymer material coupled to the light guide. The cover may present a user interface. A surface of the cover may comprise an information region and/or a decorative region. The component may comprise a steering wheel, console, floor console, center console, instrument panel, door panel, dashboard, display, armrest, or cockpit.
Apparatus for recognizing gesture in vehicle and method thereof
An apparatus is configured to efficiently recognize gestures by interworking with an infotainment system. The apparatus obtains information about a user's gesture, connects with the infotainment system of a vehicle to identify the state of the infotainment system, branches the set of gestures allowed for recognition based on the state of the infotainment system, and performs gesture recognition based on the sensed gesture information and the result of that branching. The apparatus addresses the overload and reliability-degradation problems caused by continuous image recognition.
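The branching of allowed gestures on infotainment state can be illustrated with a simple lookup: only gestures permitted in the current state are considered, so the recognizer never evaluates gestures that are irrelevant to that state. The state names and gesture labels below are invented for illustration:

```python
# Hypothetical mapping from infotainment state to the gestures
# allowed for recognition in that state.
ALLOWED_GESTURES = {
    "media": {"swipe_left", "swipe_right", "palm"},
    "navigation": {"pinch", "rotate", "palm"},
    "idle": {"palm"},
}


def recognize(sensed_gesture, infotainment_state):
    """Branch the allowed gesture set on the system state, then accept
    only gestures in that set; anything else is rejected (returns None),
    which reduces the recognition load."""
    allowed = ALLOWED_GESTURES.get(infotainment_state, set())
    return sensed_gesture if sensed_gesture in allowed else None
```

For example, a swipe is accepted while media is active but rejected in the idle state.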
Display and voice output control system of vehicle
A display and voice output control system of a vehicle on which automated driving is performed includes a display device, a voice output device, and a controller. The controller controls notification of a recommended behavior by the display device and the voice output device. In the notification control, it is judged whether or not a notification condition of the recommended behavior is satisfied. If it is judged that the notification condition is satisfied, it is judged whether a notification relaxation condition of the recommended behavior is satisfied based on a status of the driver and a driving status of the vehicle. If it is judged that the notification relaxation condition is satisfied, relaxation notification processing is executed. Otherwise, normal notification processing is executed. In the relaxation notification processing, the output of the recommended behavior by the voice output device is set to an inactive state.
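The judgment flow in this abstract reduces to a small decision function: no notification unless the notification condition holds, and voice output suppressed when the relaxation condition also holds. A minimal sketch with hypothetical names:

```python
def plan_notification(condition_met, relaxation_met):
    """Return (use_display, use_voice) per the judgment flow:

    - notification condition not satisfied -> no notification at all;
    - relaxation condition satisfied       -> relaxation processing
      (display only, voice output set inactive);
    - otherwise                            -> normal processing
      (display and voice).
    """
    if not condition_met:
        return (False, False)
    return (True, not relaxation_met)
```

For instance, when both conditions hold, the recommended behavior is shown on the display while the voice output stays inactive.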