Patent classifications
B60K35/654
Improved Visualization with an AR HUD
The disclosure relates to a contact-analogous head-up display, in particular an augmented reality head-up display, which places information in direct contact with the environment. In contrast to conventional head-up displays, the items of information appear as part of the environment. The subject matter of the disclosure is a dynamic improvement of the visibility of concealed AR elements in situations in which several elements are displayed in parallel.
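One plausible reading of "dynamic improvement of the visibility of concealed AR elements" is boosting the rendering opacity of occluded elements whenever several elements are shown in parallel. The sketch below illustrates only that reading; `ArElement`, the opacity boost, and the threshold are invented names and values, not details from the disclosure:

```python
# Hedged sketch of visibility boosting for occluded AR HUD elements.
# All names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ArElement:
    name: str
    occluded: bool   # element lies behind real-world geometry
    opacity: float   # current render opacity, 0.0-1.0

def adjust_visibility(elements, boost=0.3, parallel_threshold=2):
    """When several elements are displayed in parallel, raise the opacity
    of occluded ones so they remain discernible against the scene."""
    if len(elements) < parallel_threshold:
        return elements          # single element: no adjustment needed
    for e in elements:
        if e.occluded:
            e.opacity = min(1.0, e.opacity + boost)
    return elements
```

In a real renderer the boost would likely be continuous (e.g., driven by occlusion depth) rather than a fixed increment.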
LIGHT EMISSION DEVICE, CONTROL DEVICE AND MOBILE ENTITY
A light emission device is installed in a vehicle that can change the degree of involvement of a driver in surroundings monitoring during driving. A light emitter is configured to be located in a gaze-stable field of view of the driver when an information display device is located in an effective field of view of the driver. The information display device is disposed on a side of a steering device in the mobile entity. The light emitter is configured to emit light in a manner according to the degree of involvement of the driver in the surroundings monitoring during driving.
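The mapping from involvement degree to emission manner could be as simple as a lookup over involvement bands. The sketch below assumes a scalar involvement value and invents the colors, blink rates, and thresholds; none of these appear in the abstract:

```python
def emission_pattern(involvement):
    """Map the driver's degree of involvement in surroundings monitoring
    (0.0 = fully automated, 1.0 = fully manual) to a light-emission mode.
    Bands, colors, and blink rates are illustrative assumptions."""
    if involvement >= 0.7:
        return {"color": "green", "blink_hz": 0.0}   # driver monitors: steady green
    if involvement >= 0.3:
        return {"color": "amber", "blink_hz": 1.0}   # shared monitoring: slow blink
    return {"color": "blue", "blink_hz": 0.0}        # system monitors: steady blue
```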
Vehicle travel control method and travel control device
A travel control method for a vehicle for executing autonomous travel control includes, prior to executing the autonomous travel control, presenting a driver with travel control information as to whether or not to accept execution of the autonomous travel control and detecting, in response to a presentation of the travel control information, an acceptance input made by the driver indicating that the driver accepts the execution of the autonomous travel control. The travel control method also includes setting an input form of a present acceptance input by the driver to an input form different from that of a previous acceptance input.
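Requiring each acceptance input to take a different form than the previous one can be sketched as a simple rotation over available input forms. The specific forms and the cycling policy below are illustrative assumptions, not the patent's claimed mechanism:

```python
# Illustrative pool of acceptance-input forms; the names are invented.
INPUT_FORMS = ["touch_button", "voice_confirmation", "steering_switch"]

def next_input_form(previous=None):
    """Pick an acceptance-input form different from the previous one by
    cycling through the available forms (one possible policy)."""
    if previous not in INPUT_FORMS:
        return INPUT_FORMS[0]
    i = INPUT_FORMS.index(previous)
    return INPUT_FORMS[(i + 1) % len(INPUT_FORMS)]
```

Varying the form guards against the driver accepting by habit without attending to the presented information.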
Navigation device, navigation system, and route guidance method
A navigation device includes a map display unit configured to display a map on a first display device and to display a guidance route and a current position on the map, and a scale determination unit configured to determine a scale of the map to be displayed on the first display device. The scale determination unit lowers the scale as the current position approaches an intersection at which a turn is required, and enlarges the scale as the current position passes through the intersection.
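The scale behavior can be sketched as a function of distance to the upcoming turn: the scale denominator shrinks (more detail) on approach and grows again after the turn. The linear ramp, the distance threshold, and the scale values below are illustrative assumptions:

```python
def map_scale(distance_m, approach_start=1000.0, near=5000.0, far=50000.0):
    """Return a map scale denominator (smaller = more zoomed in) as a
    function of distance to the turn intersection. distance_m > 0 means
    approaching; distance_m < 0 means already past the turn, so the map
    zooms back out symmetrically. All numbers are illustrative."""
    d = abs(distance_m)
    if d >= approach_start:
        return far
    # Linear interpolation: most detailed scale exactly at the intersection.
    return near + (far - near) * (d / approach_start)
```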
Image processing apparatus, moving apparatus, method, and program
A configuration is achieved in which images output to a display unit are switched and displayed in accordance with the behavior of a driver, such as movements of the driver's head. Driver information indicating the behavior of the driver of a moving apparatus and images captured by a plurality of cameras that image a situation around the moving apparatus from different viewpoints are input. The images output to the display unit are switched in accordance with the driver information. The cameras are, for example, a plurality of rear cameras installed in the rear of the moving apparatus. For example, a direction of the face or line of sight of the driver is detected. An image in a direction corresponding to the detected direction of the face or line of sight of the driver is selected as an output image and displayed on the display unit. Alternatively, an image in a direction indicated by a gesture of the driver is selected and displayed on the display unit.
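Selecting a rear-camera feed from the detected face or line-of-sight direction reduces to a direction-to-camera mapping. The sketch below assumes head yaw in degrees and three rear cameras; the dead band and camera names are invented:

```python
# Map detected head/gaze direction to one of several rear-camera feeds.
# Camera identifiers and the +/-15 degree dead band are illustrative.
REAR_CAMERAS = {
    "left": "cam_rear_left",
    "center": "cam_rear_center",
    "right": "cam_rear_right",
}

def select_camera(yaw_deg):
    """Select the rear-camera feed matching the driver's head yaw
    (negative = looking left, positive = looking right)."""
    if yaw_deg < -15:
        return REAR_CAMERAS["left"]
    if yaw_deg > 15:
        return REAR_CAMERAS["right"]
    return REAR_CAMERAS["center"]
```

A gesture-driven variant would feed a recognized gesture direction into the same mapping instead of the yaw angle.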
Dynamic information protection for display devices
A system for controlling visual protection for a display unit includes a camera which is configured to generate a video signal; a video signal processing unit which is configured, based on the video signal of the camera, to determine a particular position of each of one or more persons relative to the display unit and to determine a particular viewing direction of each of the one or more persons; a database which is configured to generate a prediction for a behaviour of each of the one or more persons based on the particular position and viewing direction of each of the one or more persons detected by the video signal processing unit, wherein the prediction includes one or more predicted viewing directions of the particular person; and a display control unit which is configured to control the visual protection for the display unit based on the prediction.
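The display control step can be sketched as a check of predicted viewing directions against the screen's direction: if any person is predicted to look toward the screen, protection is engaged. The angular tolerance and the "blur"/"clear" modes below are illustrative assumptions:

```python
def protection_level(predicted_directions, screen_direction, tolerance_deg=20.0):
    """Decide the visual protection mode for the display.

    predicted_directions: one list of predicted viewing directions (degrees)
    per observed person. A predicted direction within `tolerance_deg` of the
    screen's direction triggers protection. Values are illustrative."""
    for person_dirs in predicted_directions:
        for d in person_dirs:
            if abs(d - screen_direction) <= tolerance_deg:
                return "blur"    # someone is predicted to look at the screen
    return "clear"
```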
Contextual autonomous vehicle support through written interaction
An autonomous vehicle includes a sensor system that outputs a sensor signal indicative of a condition of the autonomous vehicle. The vehicle also includes a user interface device with a display. A computing system determines, based upon a profile of the passenger, that support is to be provided textually to the passenger when the support is provided. The computing system further detects occurrence of an event that has been identified as potentially causing discomfort to the passenger. The computing system then sets, as the support message to be presented to the passenger, a predefined support message that was defined, prior to occurrence of the event, in an account corresponding to the event maintained in a database. The computing system additionally causes the display to present the support message textually, wherein the textual support message solicits feedback from the passenger of the autonomous vehicle.
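The selection step amounts to a profile-gated lookup of a predefined message keyed by the detected event. The event names, message text, and profile field below are invented placeholders:

```python
# Predefined support messages keyed by discomfort-causing event;
# both keys and texts are illustrative, not from the patent.
SUPPORT_MESSAGES = {
    "hard_brake": "The vehicle braked to keep a safe distance. Are you okay?",
    "reroute": "The route changed due to traffic. Is the new route acceptable?",
}

def support_message(event, profile):
    """Return the predefined textual support message for an event if the
    passenger's profile requests text support; otherwise None."""
    if profile.get("support_mode") != "text":
        return None              # passenger prefers another support modality
    return SUPPORT_MESSAGES.get(event)
```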
Method for improving ergonomics of a vehicle cockpit
A method for improving ergonomics of a vehicle obtains a first information defining an initial cockpit configuration. Further, information on a cockpit user's shape and information on the seat and steering wheel position which the user typically uses while driving are obtained. This information is fed into a bio-mechanical simulation, which carries out the simulation based upon the information defining the initial cockpit configuration, the user's shape, and the user's seat and steering wheel position. In the bio-mechanical simulation, an ergonomic quality criterion is calculated for reaching movements during driving. Based upon the simulation result, i.e., the quality criterion, the cockpit configuration is changed. The bio-mechanical simulation and the changing of the cockpit configuration in the optimization process are then repeated until a predetermined stop condition is fulfilled. The cockpit configuration achieved at that point in time is output as the final cockpit configuration.
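The simulate-then-update loop with a stop condition can be sketched generically. Here `simulate` stands in for the bio-mechanical simulation returning a scalar quality criterion and `update` for the configuration change; the quality target and iteration cap are invented stop conditions:

```python
def optimize_cockpit(initial_config, simulate, update, max_iters=100, target=0.95):
    """Repeat bio-mechanical simulation and configuration update until a
    stop condition (quality target reached or iteration cap) is fulfilled.

    simulate(config) -> ergonomic quality criterion in [0, 1] (stand-in)
    update(config, quality) -> changed configuration (stand-in)"""
    config = initial_config
    for _ in range(max_iters):
        quality = simulate(config)
        if quality >= target:
            break                # predetermined stop condition fulfilled
        config = update(config, quality)
    return config                # output as the final cockpit configuration
```

Any real implementation would replace the stand-ins with the actual simulation and an optimizer-driven update rule.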
USER INTERFACES, COMPUTER PROGRAM PRODUCT, SIGNAL SEQUENCE, TRANSPORTATION VEHICLE AND METHOD FOR DISPLAYING INFORMATION ON A DISPLAY DEVICE
User interfaces, a computer program product, a signal sequence, a transportation vehicle, and a method for displaying information on a display device of a transportation vehicle. The method includes determining information to be displayed, determining that a user sees the display device and, in response thereto, displaying a message representing the information on the display device. In response to the display of the message on the display device, the method determines that the user looks away from the display device and, in response thereto, hides the message.
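The look-at/look-away behavior is a small state machine: a pending message becomes visible when gaze reaches the display and is hidden once gaze leaves it again. The class and method names below are invented for illustration:

```python
class MessageDisplay:
    """Show a pending message when the user looks at the display; hide it
    once the user looks away again (illustrative sketch of the method)."""

    def __init__(self):
        self.visible = False
        self.pending = None

    def set_message(self, text):
        """Queue information to be displayed on next gaze contact."""
        self.pending = text

    def on_gaze(self, looking_at_display):
        """Update visibility from a gaze-tracking event; return visibility."""
        if looking_at_display and self.pending is not None:
            self.visible = True          # user sees the display: show message
        elif not looking_at_display and self.visible:
            self.visible = False         # user looked away: hide message
            self.pending = None          # message is considered delivered
        return self.visible
```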
VEHICLE INCLUDING COMMUNICATION SYSTEM FOR DISABLED PERSON AND CONTROL METHOD OF COMMUNICATION SYSTEM FOR DISABLED PERSON
A vehicle including a communication system for a disabled person may include: a motion capture unit that recognizes sign language of a passenger of the vehicle; an input unit that receives a voice or characters from the passenger; a database that stores sign language information; a control unit that is communicatively coupled to the motion capture unit and the input unit, the control unit being configured to translate the sign language recognized by the motion capture unit using the sign language information stored in the database, and to convert a voice received through the input unit into characters; and an output unit that is communicatively coupled to the control unit, the output unit being configured to output the sign language translated by the control unit as a voice or characters, and to output the voice converted by the control unit as characters.
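The two output paths (sign language to text/voice, and voice to characters) can be sketched as lookups against the stored sign-language information plus a speech-to-text stand-in. The gesture tuples and dictionary entries below are crude placeholders, not a real sign-language model:

```python
# Stand-in for the stored sign-language information; entries are invented.
SIGN_DICTIONARY = {
    ("open_palm", "forward"): "hello",
    ("fist", "down"): "stop",
}

def translate_signs(gesture_sequence):
    """Translate a captured gesture sequence into text via the stored sign
    dictionary; unrecognized gestures are marked '[?]'."""
    return " ".join(SIGN_DICTIONARY.get(g, "[?]") for g in gesture_sequence)

def voice_to_captions(voice_text):
    """Stand-in for converting a received voice into characters for display.
    A real system would run speech recognition here."""
    return voice_text.strip().capitalize()
```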