B60K2360/1464

IMAGE DISPLAY DEVICE AND IMAGE DISPLAY METHOD
20190155024 · 2019-05-23

An image display device includes an acquisition unit that acquires two or more types of information; a detection unit that detects at least one of an acceleration, an orientation, and an angular velocity of the image display device; an image change section that determines a state of the image display device on the basis of the detection result, switches which of the two or more acquired types of information is selected in accordance with the determined state, and generates display data from the selected information; and a display unit that displays the display data.
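The switching logic in this abstract can be sketched as follows. The concrete states ("stationary"/"moving"), the gravity-based threshold, and the two information sources are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch: infer the device state from accelerometer magnitude,
# then let that state select which information source feeds the display.

def determine_state(acceleration_magnitude):
    """Classify the device as 'stationary' or 'moving' from |a| in m/s^2
    (near 9.8 m/s^2 means only gravity is acting -- an assumed criterion)."""
    return "stationary" if abs(acceleration_magnitude - 9.8) < 0.5 else "moving"

def generate_display_data(state, info_sources):
    """Select the information source matching the current state and
    wrap it as display data."""
    selected = info_sources["detail"] if state == "stationary" else info_sources["summary"]
    return f"DISPLAY:{selected}"

sources = {"detail": "full navigation map", "summary": "next turn only"}
print(generate_display_data(determine_state(9.9), sources))   # DISPLAY:full navigation map
print(generate_display_data(determine_state(12.3), sources))  # DISPLAY:next turn only
```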

METHOD AND DEVICE TO ENABLE A FAST STOP OF AN AUTONOMOUSLY DRIVING VEHICLE

The autonomously driving vehicle is fitted with a GPS unit to which a library of points of interest is assigned, and comprises a human-machine interface that is connected to the GPS unit and that comprises at least one of the following input means for commanding the vehicle to stop as quickly as possible: a speech-input unit connected to the interface, a gesture-recognition module connected to the interface, and a touch-sensitive screen connected to the interface.

TRANSMISSION SHIFTING SYSTEM
20190136964 · 2019-05-09 ·

A vehicle includes a transmission, a hologram generator, and a controller. The transmission is configured to shift between a plurality of gears. The hologram generator is configured to project a holographic transmission gear selector. The controller is programmed to, responsive to an operator hand traversing a predetermined path relative to the holographic transmission gear selector, shift the transmission between the plurality of gears.
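The "predetermined path" condition in this abstract can be sketched as an ordered sequence of gate points that the tracked hand must pass through before the shift is committed. The gate coordinates, tolerance, and gear labels are assumptions for illustration:

```python
# Illustrative sketch: sampled hand positions (relative to the holographic
# selector) are matched against an ordered list of gate points; only a full
# in-order traversal triggers the gear change.

def traverses_path(hand_samples, path_gates, tolerance=0.05):
    """Return True if the hand passes through every gate point in order."""
    gate_index = 0
    for x, y in hand_samples:
        gx, gy = path_gates[gate_index]
        if abs(x - gx) <= tolerance and abs(y - gy) <= tolerance:
            gate_index += 1
            if gate_index == len(path_gates):
                return True
    return False

def shift(current_gear, hand_samples, path_gates, target_gear):
    """Shift only when the predetermined path was fully traversed."""
    return target_gear if traverses_path(hand_samples, path_gates) else current_gear

# Hypothetical straight path from "P" down to "D" on the hologram
gates = [(0.0, 0.0), (0.0, -0.1), (0.0, -0.2)]
samples = [(0.01, 0.0), (0.0, -0.1), (0.01, -0.21)]
print(shift("P", samples, gates, "D"))  # D
```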

System and method for remote virtual reality control of movable vehicle partitions
10249088 · 2019-04-02

A method for remote virtual reality control of movable vehicle partitions includes displaying a graphic model of at least a portion of a vehicle on an output device. The output device is located remotely from the vehicle and the vehicle has one or more movable vehicle partitions. The method includes processing images received from one or more imaging devices, the images including gestures relative to the graphic model. The method includes identifying one or more vehicle commands based on the gestures relative to the graphic model. The vehicle commands define control of the one or more movable vehicle partitions. The method further includes executing the vehicle commands at the vehicle.

Gesture and Facial Expressions Control for a Vehicle
20190092169 · 2019-03-28

The present approach relates to a vehicle having a plurality of devices and a human-machine interface (HMI) for gesture- and/or facial-expression-based actuation of a function of a vehicle device. The HMI comprises a camera for recording a specific occupant of the vehicle and a control unit connected to the camera.

INPUT DEVICE AND CONTROL METHOD OF THE SAME

The present disclosure relates to an input device for a vehicle and a control method of the same, and more particularly, to an input device implemented with a plurality of sensor electrodes and a control method of that input device. The input device may include: first sensor electrodes arranged in a first preset channel area with a first preset density; second sensor electrodes arranged in a second preset channel area with a second preset density, the second density being less than the first; and a controller configured to determine a capacitive reference value of the input device based on first sensor value information collected from the first and second sensor electrodes.
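The reference-value determination can be sketched as pooling untouched-state readings from both electrode groups into one capacitive baseline, against which later samples are compared. Treating the baseline as a plain mean, and the touch threshold, are assumptions; the patent only states that a reference value is determined from the collected sensor values:

```python
# Minimal sketch: combine raw counts from the dense (first) and sparse
# (second) electrode groups into a single capacitive baseline.

def capacitive_reference(first_values, second_values):
    """Baseline = mean of all untouched-state sensor readings (assumed model)."""
    all_values = list(first_values) + list(second_values)
    return sum(all_values) / len(all_values)

def is_touch(sample, reference, threshold=15):
    """A sample sufficiently above the baseline counts as a touch."""
    return sample - reference > threshold

ref = capacitive_reference([100, 102, 98, 101], [97, 99])
print(ref)                 # 99.5
print(is_touch(120, ref))  # True
```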

VEHICLE DISPLAY SYSTEM, VEHICLE SYSTEM, AND VEHICLE

This display system, which is provided in a vehicle, comprises: a road surface drawing device configured to emit a light pattern (L1) toward a road surface outside the vehicle; an HUD that is located inside the vehicle and configured to display HUD information to an occupant of the vehicle such that the HUD information is overlaid on real space outside the vehicle; and a display control unit configured to control the road surface drawing device. The display control unit is configured to control the emission of the light pattern (L1) in accordance with an input operation by the occupant in an HUD display area (D1) where the HUD information can be displayed.

Equipment control device, equipment control method, and non-transitory recording medium

An equipment control device includes a receiver that receives sensing result information including the position, shape, and movement of a predetermined object and the position of an eye point of a person, and a controller that, when the sensing result information indicates that the eye point, equipment placed at a predetermined position, and the object are aligned, and that the object is in a predetermined shape associated in advance with the equipment, determines command information that causes an equipment operating device to operate the equipment in accordance with the movement of the object in the predetermined shape.
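The gating condition in this abstract (eye point, object, and equipment aligned, plus a shape pre-registered for that equipment) can be sketched geometrically. The 2D collinearity test, tolerance, shape labels, and equipment name are illustrative assumptions:

```python
# Sketch: issue a command only when the eye point, the pointing object, and
# the equipment are roughly collinear AND the object's shape matches the
# shape registered in advance for that equipment.

def roughly_aligned(eye, obj, equipment, tolerance=0.1):
    """2D collinearity check: the cross product of (eye->equipment) and
    (eye->object) is near zero when the three points line up."""
    ex, ey = eye
    qx, qy = equipment
    ox, oy = obj
    cross = (qx - ex) * (oy - ey) - (qy - ey) * (ox - ex)
    return abs(cross) <= tolerance

def command_for(eye, obj, shape, equipment_pos, registered_shape, movement):
    """Return command information, or None when the gating conditions fail."""
    if roughly_aligned(eye, obj, equipment_pos) and shape == registered_shape:
        return {"target": "air_conditioner", "action": movement}  # assumed target
    return None

cmd = command_for((0, 0), (1, 1), "point", (2, 2), "point", "swipe_up")
print(cmd)  # {'target': 'air_conditioner', 'action': 'swipe_up'}
```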

USER INTERFACE APPARATUS FOR VEHICLE, AND VEHICLE

A user interface apparatus configured to be installed in a vehicle includes a touch screen; a gesture detecting unit configured to detect a gesture of a user and convert the gesture into an electrical signal; and at least one processor configured to determine, in response to the detected gesture being applied from a driver seat of the vehicle to the touch screen, whether criteria for switching to an autonomous driving state are satisfied, and, in response to the criteria being satisfied, to switch the state of the vehicle to the autonomous driving state.
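The criteria check can be sketched as a simple gate on the gesture's origin and target. The specific criteria (driver-seat origin, touch-screen target, autonomy availability) are assumptions; the abstract only states that some criteria are evaluated:

```python
# Hypothetical sketch of the switching criteria for the autonomous state.

def criteria_satisfied(gesture_origin, gesture_target, autonomy_available):
    """Assumed criteria: gesture from the driver seat onto the touch screen,
    with the autonomous mode currently available."""
    return (gesture_origin == "driver_seat"
            and gesture_target == "touch_screen"
            and autonomy_available)

def handle_gesture(state, gesture_origin, gesture_target, autonomy_available=True):
    """Switch to the autonomous state only when the criteria are satisfied."""
    if criteria_satisfied(gesture_origin, gesture_target, autonomy_available):
        return "autonomous"
    return state

print(handle_gesture("manual", "driver_seat", "touch_screen"))    # autonomous
print(handle_gesture("manual", "passenger_seat", "touch_screen")) # manual
```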

GESTURE AND MOTION BASED CONTROL OF USER INTERFACES
20190073040 · 2019-03-07

Embodiments of an apparatus are disclosed that include a digital camera to capture video or a sequence of still images of a user's hand. One or more processors are coupled to the digital camera to process the video or the sequence of still images to produce a digital representation of the user's hand, to determine the gesture and motion of the user's hand from the digital representation, and to correlate the gesture or the gesture/motion combination to a user interface command. A user interface controller is coupled to receive the user interface command from the one or more processors, and a display is coupled to the user interface controller. The user interface controller causes a set of one or more user interface controls to appear on the display. The user interface command selects or adjusts one or more of the displayed user interface controls, and motion of the selected user interface control tracks the motion of the user's hand in substantially real time. Other embodiments are disclosed and claimed.
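The gesture-to-command correlation and the real-time tracking of a selected control can be sketched as follows. The gesture table, the 1D control model, and the unit gain are illustrative assumptions:

```python
# Sketch: map a recognized gesture to a UI command, then apply successive
# hand-motion deltas to the selected control so it follows the hand.

def correlate(gesture):
    """Hypothetical lookup table from recognized gesture to UI command."""
    table = {"pinch": "select", "swipe": "adjust", "fist": "dismiss"}
    return table.get(gesture)

def track(control_value, hand_positions, gain=1.0):
    """Apply each frame-to-frame hand delta to the control's value
    (a 1D stand-in for a slider that follows the hand)."""
    for prev, cur in zip(hand_positions, hand_positions[1:]):
        control_value += gain * (cur - prev)
    return control_value

print(correlate("pinch"))           # select
print(track(0.5, [0.0, 0.1, 0.3]))  # approximately 0.8
```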