Touch screen, a vehicle having the same, and a method of controlling the vehicle

A vehicle includes: a plurality of electronic devices, each configured to adjust an angle of a part thereof, disposed at different positions, and configured to perform the same function; a touch screen having a plurality of touch areas for receiving operation commands of the plurality of electronic devices, respectively, and configured to display a plurality of display areas respectively corresponding to the plurality of touch areas; and a controller configured to recognize the touch areas and touch gestures based on a touch signal received by the touch screen, to identify the electronic device corresponding to the recognized touch area, to identify operation information corresponding to the recognized touch gesture, and to control the identified electronic device based on the identified operation information. The plurality of electronic devices includes at least one of a plurality of vents, a plurality of lighting devices, and a plurality of sound output devices.
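
The controller described above resolves a touch signal in two lookups: touch area to electronic device, and gesture to operation. A minimal sketch of that dispatch logic, with all area, device, and gesture names being illustrative assumptions rather than terms from the patent:

```python
# Hypothetical mappings: touch areas resolve to devices (vents, lights, speakers),
# gestures resolve to operation information. Names are assumed for illustration.
AREA_TO_DEVICE = {"area_1": "left_vent", "area_2": "right_vent", "area_3": "dome_light"}
GESTURE_TO_OPERATION = {"swipe_up": ("angle", +5), "swipe_down": ("angle", -5)}

def handle_touch(area: str, gesture: str):
    """Resolve a touch signal to (device, operation), as the controller would."""
    device = AREA_TO_DEVICE.get(area)
    operation = GESTURE_TO_OPERATION.get(gesture)
    if device is None or operation is None:
        return None  # unrecognized area or gesture: no device is controlled
    return device, operation
```

Because each area corresponds to one device performing a shared function, the same gesture vocabulary can drive every device; only the area lookup differs.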

Virtual human-machine interface system and corresponding virtual human-machine interface method for a vehicle

A virtual human-machine interface system for a vehicle including at least one projection surface disposed within the vehicle and a corresponding virtual human-machine interface method for a vehicle are provided. The virtual human-machine interface system comprises at least one micro-mirror projection device for projecting an image on the at least one projection surface, at least one sensor for detecting commands given by a user by determining the position of a part of the user's body within the vehicle, and a control unit for controlling the human-machine interface system. The position of the image projected by the at least one projection device within the vehicle may be modified by a specific command given by the user.

METHOD FOR OPERATING A HUMAN-MACHINE INTERFACE, AND HUMAN-MACHINE INTERFACE
20210349592 · 2021-11-11

A method for operating a human-machine interface for a vehicle includes a vehicle component, a control unit and a touch-sensitive surface provided in the vehicle component. The method includes recognizing a touch at an arbitrary contact point on the touch-sensitive surface and assigning to the contact point a button, by means of which an input is possible, wherein a function is assigned to the button. A corresponding human-machine interface is also provided.
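
The key idea is that the button is not fixed in advance: it is anchored wherever the user happens to touch. A minimal sketch under that reading, with the `Button` type and function names assumed for illustration:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Button:
    """A button created at an arbitrary contact point, with an assigned function."""
    x: float
    y: float
    function: Callable[[], str]

def on_touch(x: float, y: float, function: Callable[[], str]) -> Button:
    # The button is anchored at whatever point the user touched; subsequent
    # input at that point triggers the assigned function.
    return Button(x, y, function)
```

A usage example: `on_touch(0.42, 0.77, lambda: "volume_up")` yields a button whose function fires on later presses at (0.42, 0.77).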

Systems and methods for locking an input area associated with detected touch location in a force-based touch display

In a touch screen environment, a computer calculates an effective position and updated effective positions of simultaneous or sequential touch events by calculating average coordinates of the touch events using force measurements. The average coordinates correspond to computerized maps of the user interface and z coordinates correspond to an average force at the x and y locations. The effective positions are used to determine if the user's touches move across multiple virtual input areas having priority and non-priority relationships. By expanding a virtual input area of the map for those areas having a priority label relative to a different non-priority virtual input area, the computer effectuates appropriate functions depending on where the most recent effective position lies.
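
The effective position described above is a force-weighted average of the touch coordinates, with z taken as the average force. A minimal sketch of that calculation, assuming touches arrive as (x, y, force) triples (the data layout is an assumption, not specified in the abstract):

```python
def effective_position(touches):
    """Force-weighted average position of simultaneous touch events.

    touches: list of (x, y, force) triples.
    Returns (x, y, z): x and y are averaged with each touch weighted by its
    measured force; z is the average force across the touches.
    """
    total_force = sum(f for _, _, f in touches)
    x = sum(x * f for x, _, f in touches) / total_force
    y = sum(y * f for _, y, f in touches) / total_force
    z = total_force / len(touches)
    return x, y, z
```

For example, a light touch at (0, 0) with force 1 and a harder touch at (10, 0) with force 3 give an effective position of (7.5, 0.0) with z = 2.0, pulled toward the harder press.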

Vehicle and steering unit
11433937 · 2022-09-06

A vehicle and a steering unit that can improve safety are provided. The vehicle includes a steering wheel and a touchpad disposed on the steering wheel. The vehicle enables control of a controlled apparatus by a gesture on the touchpad upon detecting a predetermined pressure on the touchpad, and provides a tactile sensation from the touchpad in accordance with the gesture.
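
The safety mechanism here is gating: gestures on the steering-wheel touchpad are ignored until a deliberate hard press is detected. A minimal sketch of that gating, with the threshold value and class/method names assumed for illustration:

```python
PRESSURE_THRESHOLD = 2.0  # assumed value; the patent only says "predetermined pressure"

class SteeringTouchpad:
    """Touchpad that enables gesture control only after a deliberate press."""

    def __init__(self):
        self.gesture_control_enabled = False

    def on_pressure(self, pressure: float):
        # A press at or above the predetermined pressure arms gesture control,
        # reducing accidental activation while gripping the wheel.
        if pressure >= PRESSURE_THRESHOLD:
            self.gesture_control_enabled = True

    def on_gesture(self, gesture: str):
        if not self.gesture_control_enabled:
            return None  # ignored: no enabling press has occurred yet
        self.haptic_feedback(gesture)
        return f"execute:{gesture}"

    def haptic_feedback(self, gesture: str):
        pass  # tactile sensation matching the gesture (hardware-dependent)
```

The same touchpad both receives the gesture and returns a tactile sensation, so the driver gets confirmation without looking away from the road.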

Vehicle control device mounted on vehicle and method for controlling the vehicle

A vehicle control device for controlling a vehicle including first and second display units disposed at different positions therein can include: a communication unit configured to communicate with the first and second display units; and a controller configured to, in response to occurrence of a preset condition, select at least one of the first display unit and the second display unit, and either display a first execution screen of an application on the first display unit or a second execution screen of the application on the second display unit according to the selection, or change the first execution screen displayed on the first display unit or the second execution screen displayed on the second display unit according to the selection.
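
The controller is essentially a routing policy: a preset condition determines which display(s) show which execution screen of the application. A minimal sketch, where the condition names, screen names, and routing rules are all illustrative assumptions:

```python
def select_displays(condition: str) -> dict:
    """Map a preset condition to {display: execution_screen}.

    Conditions and the routing policy are assumed for illustration; the
    patent only specifies that a selection is made per preset condition.
    """
    if condition == "navigation_started":
        return {"first": "map_screen"}
    if condition == "passenger_request":
        return {"second": "media_screen"}
    if condition == "split_mode":
        return {"first": "map_screen", "second": "media_screen"}
    return {}  # no recognized condition: leave both displays unchanged
```

Changing an already-displayed screen is the same operation applied to a display that is currently showing something else.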

METHOD FOR PROCESSING TOUCH INSTRUCTION, ELECTRONIC DEVICE AND STORAGE MEDIUM
20210333963 · 2021-10-28 ·

A method for processing a touch instruction, an electronic device and a storage medium, related to the fields of Internet of Things, intelligent transportation, and the like, are provided. The method includes: detecting the type of a received touch event; in a case where the type of the touch event is a predetermined type, acquiring the number of fingers executing each touch event of the predetermined type; and determining a touch instruction from the change in the number of fingers between two adjacent touch events of the predetermined type. According to this solution, the touch instruction can be determined by utilizing the change in the number of fingers across two adjacent touch events conforming to the predetermined type.
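
The instruction is derived from how the finger count changes between two adjacent qualifying touch events. A minimal sketch of that comparison; the specific instruction names are assumptions, since the abstract does not say which instructions the changes map to:

```python
def touch_instruction(prev_fingers: int, curr_fingers: int) -> str:
    """Derive a touch instruction from the change in finger count between
    two adjacent touch events of the predetermined type.

    The instruction names below are illustrative; only the use of the
    finger-count change is taken from the abstract.
    """
    if curr_fingers > prev_fingers:
        return "finger_count_increased"
    if curr_fingers < prev_fingers:
        return "finger_count_decreased"
    return "finger_count_unchanged"
```

For example, two fingers followed by three fingers yields the "increased" instruction, regardless of where on the screen the touches landed.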

VEHICLE MIDDLEWARE
20210234767 · 2021-07-29 ·

The present disclosure describes a vehicle implementing one or more processing modules. These modules are configured to connect and interface with the various buses in the vehicle, where the buses connect the various components of the vehicle to facilitate information transfer among them. Each processing module is further modularized with the ability to add and replace other functional modules now or in the future. These functional modules can themselves act as distinct vehicle components. Each processing module may hand off processing to other modules depending on its health, its processing load, or third-party control. Thus, the plurality of processing modules implements a middleware point of control for the vehicle, with redundancy in processing and with safety and security awareness in its applications.
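
The hand-off policy amounts to choosing a successor module by health and load. A minimal sketch of one such policy, assuming modules are described by health and load fields (the field names and the lowest-load rule are illustrative assumptions):

```python
def pick_handoff_target(modules):
    """Choose the healthy module with the lowest processing load.

    A module whose health degrades would hand off its processing to the
    module returned here; returns None if no healthy module remains.
    Selection by lowest load is an assumed policy for illustration.
    """
    healthy = [m for m in modules if m["healthy"]]
    return min(healthy, key=lambda m: m["load"], default=None)

modules = [
    {"id": "A", "healthy": False, "load": 0.1},  # failing: cannot take work
    {"id": "B", "healthy": True,  "load": 0.6},
    {"id": "C", "healthy": True,  "load": 0.3},
]
# pick_handoff_target(modules)["id"] → "C"
```

Redundancy falls out of the policy: as long as one healthy module exists, there is always a valid hand-off target.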

Input device for vehicle and input method

An input device for a vehicle performs input for operating a user interface (UI) displayed by a display disposed in the vehicle. The input device includes: a touch sensor that receives a touch input from a user; and a controller that, when the touch input on the touch sensor is a touch with a plurality of fingers or a plurality of taps, selects the region corresponding to the number of fingers or the number of taps from among a plurality of regions displayed by the display.
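
The selection rule is an index lookup: the count of fingers or taps picks one of the displayed regions. A minimal sketch, assuming a 1-indexed correspondence and that out-of-range counts are ignored (both assumptions; the abstract does not specify bounds handling):

```python
def select_region(touch_count: int, regions):
    """Select the region corresponding to the count of fingers or taps.

    regions: the regions currently displayed, in order. A count of N picks
    the Nth region (1-indexed); counts outside the range select nothing.
    """
    if 1 <= touch_count <= len(regions):
        return regions[touch_count - 1]
    return None
```

For example, with regions `["nav", "audio", "climate"]` on screen, a two-finger touch (or a double tap) selects `"audio"` without the user having to aim at it, which suits in-vehicle use where precise pointing is hard.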