B60K2360/1464

Channel selection interface for a vehicle

Disclosed herein is, inter alia, a channel selection interface that includes a plurality of sectors, each sector including a number of tunable service identifiers divided among the plurality of sectors, and a plurality of channel markers that separate the plurality of sectors. The channel selection interface provides an even allocation of channels for a gesture recognition interface.
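
The abstract does not specify how channels are divided, but the "even allocation" of tunable service identifiers across sectors might be sketched as follows (function and parameter names are hypothetical, not from the patent):

```python
def allocate_channels(channel_ids, num_sectors):
    """Divide a list of tunable service identifiers as evenly as
    possible across a number of interface sectors (illustrative only)."""
    if num_sectors <= 0:
        raise ValueError("need at least one sector")
    base, extra = divmod(len(channel_ids), num_sectors)
    sectors, start = [], 0
    for i in range(num_sectors):
        # The first `extra` sectors receive one additional channel.
        size = base + (1 if i < extra else 0)
        sectors.append(channel_ids[start:start + size])
        start += size
    return sectors

# Example: 10 channels over 4 sectors -> sector sizes 3, 3, 2, 2
sectors = allocate_channels(list(range(10)), 4)
```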

AUTOMATIC VEHICLE DIAGNOSTIC DETECTION AND COMMUNICATION
20190279447 · 2019-09-12

Methods and systems for vehicle diagnostic detection and communication are disclosed. Specifically, a method to monitor the health of vehicle systems and subsystems and diagnose detected anomalies is provided. In the event an anomaly or unhealthy state is detected within a vehicle, subsystem, or component, the system may take a number of actions. In one embodiment, the actions comprise notifying a maintenance provider, a regulatory monitor, and an original equipment manufacturer. The system may also maintain a vehicle fleet-wide performance database to enable identification and analysis of systemic fleet-wide data.

GEAR SELECTION SYSTEM AND METHOD
20190255949 · 2019-08-22

A gear selection system for a vehicle, and a method of selecting a gear in a vehicle, are disclosed. The gear selection system comprises a display screen for displaying a gear selection Graphical User Interface, a user input device for receiving user input in the form of a first user gesture, and a control apparatus configured to control the display screen to display the gear selection GUI in response to the first user gesture being received. Also provided is a gear selection system comprising a touch-sensitive user input device configured for receiving user input in the form of user touch gestures, a display screen for displaying a gear selection Graphical User Interface, and a control apparatus configured to control the display screen to display the gear selection GUI. The gear selection system is configured to generate a control signal indicative of a selected gear in response to a drag gesture being received on the touch-sensitive user input device.
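
The abstract describes generating a gear-selection control signal from a drag gesture but leaves the mapping unspecified. A minimal sketch, assuming a vertical drag on a normalized touch surface steps through a P-R-N-D sequence (thresholds and gear labels are invented for illustration):

```python
# Hypothetical mapping from a drag gesture to a selected gear.
GEARS = ["P", "R", "N", "D"]

def gear_from_drag(start_y, end_y, surface_height, current_index=0):
    """Translate a vertical drag into a gear-selection control signal.

    A downward drag of at least a quarter of the surface height moves
    one gear forward in the P-R-N-D sequence; an upward drag of the
    same magnitude moves one gear back. Shorter drags are ignored.
    """
    delta = end_y - start_y
    threshold = surface_height / 4
    if delta >= threshold:
        current_index = min(current_index + 1, len(GEARS) - 1)
    elif delta <= -threshold:
        current_index = max(current_index - 1, 0)
    return GEARS[current_index]
```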

Autonomous vehicle

An autonomous vehicle that moves automatically without any driver manipulation is disclosed. The autonomous vehicle includes a vehicle body including a deck, and an action-inducing device for inducing an action of a person who is a user of the autonomous vehicle or a non-user around the autonomous vehicle. The action-inducing device includes an image-capturing device for capturing an image of the person and an information presentation device for presenting action-inducing information, thereby inducing the person to take the action.

Apparatus for recognizing gesture in vehicle and method thereof
11983328 · 2024-05-14

An apparatus is configured to efficiently recognize a gesture by interworking with an infotainment system. The apparatus obtains information about a gesture of a user, connects with the infotainment system of a vehicle to identify a state of the infotainment system, determines which gestures are allowed for recognition based on the state of the infotainment system, and performs gesture recognition based on the information about the sensed gesture and the result of that determination. The apparatus addresses the overload and reliability-degradation problems that arise from continuous image recognition.
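
Restricting the gesture vocabulary by infotainment state, as this abstract describes, could be sketched as a simple lookup gate in front of the recognizer; the state names and gesture labels below are assumptions, not from the patent:

```python
# Illustrative sketch: only gestures allowed in the current
# infotainment state are passed on, reducing recognizer load.
ALLOWED = {
    "media_playing": {"swipe_left", "swipe_right", "palm_stop"},
    "navigation":    {"pinch", "spread", "swipe_up"},
    "idle":          {"wake_wave"},
}

def recognize(sensed_gesture, infotainment_state):
    """Return the gesture only if it is allowed in the current state."""
    allowed = ALLOWED.get(infotainment_state, set())
    return sensed_gesture if sensed_gesture in allowed else None
```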

GESTURE DEVICE FOR TURNING ON A ROAD VEHICLE AND RELATIVE ROAD VEHICLE

A road vehicle comprising an ignition system for the powertrain system, the ignition system comprising a luminous device integral with the passenger compartment of the road vehicle and comprising an optical sensor configured to detect at least a first gesture and a second gesture performed by the driver to switch the road vehicle from an off configuration to an on configuration and vice versa. The luminous device comprises a display device configured to at least partially assume a first colour and a second colour, wherein the display device switches from the first colour to the second colour when the movable part switches from the off configuration to the on configuration, and vice versa when the movable part switches from the on configuration to the off configuration. Principal figure: 3.

Ultrasonic haptic control system

An ultrasonic haptic control system for a motor vehicle is provided. The system includes a steering wheel assembly comprising a ring-shaped member configured to be held by an occupant of the vehicle, and rotated about a central axis in order to steer the vehicle; and a plurality of ultrasound emitters associated with the steering wheel. The ultrasound emitters are configured to focus ultrasound waves within a central region about which the ring-shaped member circumferentially extends and/or radially outside of and adjacent to the ring-shaped member. The focused ultrasound waves are configured to form one or more haptic control surfaces. A method of operating a haptic control system is also provided.

METHOD AND DEVICE FOR COUPLING ELECTRONIC APPLIANCES TO A MOTOR VEHICLE
20190239049 · 2019-08-01

A system consisting of a vehicle having an infotainment system, a mobile data processing device, and a motion sensing device. A first communication interface is set up to transmit control commands I_S from the motion sensing device to the mobile data processing device; the mobile data processing device is set up to translate the control commands I_S into instructions I_A for the infotainment system; and a second communication interface is set up to transmit the instructions I_A from the mobile data processing device to the infotainment system.
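
The translation step from control commands I_S to infotainment instructions I_A is not detailed in the abstract; a minimal sketch is a lookup table on the mobile device (the command and instruction names here are invented for illustration):

```python
# Hypothetical translation table from motion-sensor control commands
# (I_S) to infotainment instructions (I_A).
COMMAND_MAP = {
    "tilt_left":  "previous_track",
    "tilt_right": "next_track",
    "shake":      "pause",
}

def translate(control_command):
    """Translate one control command I_S into an instruction I_A,
    raising on commands the mapping does not know."""
    try:
        return COMMAND_MAP[control_command]
    except KeyError:
        raise ValueError(f"unknown control command: {control_command!r}")
```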

VEHICLE SYSTEMS AND METHODS FOR DETERMINING A TARGET BASED ON A VIRTUAL EYE POSITION AND A POINTING DIRECTION

Vehicle systems and methods for determining a target position are disclosed. A vehicle includes a user detection system configured to output a gesture signal in response to a hand of a user performing at least one gesture to indicate a final target position. The vehicle also includes a user gaze monitoring system configured to output an eye location signal that indicates an actual eye position of the user. The vehicle also includes one or more processors and one or more non-transitory memory modules communicatively coupled to the processors. The processors store machine-readable instructions that, when executed, cause the one or more processors to determine a first point and a second point located on the hand of the user based at least in part on the gesture signal from the user detection system. The first point and the second point define a pointing axis of the hand of the user.
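
The geometry implied by the abstract, a pointing axis through two points on the hand, extended to meet a target surface, can be sketched as follows. This is not the patented method, merely a standard ray-plane construction; the plane orientation is an assumption:

```python
# Sketch: derive a pointing axis from two detected hand points
# (e.g. knuckle and fingertip) and extend it to a vertical plane.
def pointing_axis(p1, p2):
    """Return the origin and unit direction of the axis through p1 and p2."""
    d = tuple(b - a for a, b in zip(p1, p2))
    norm = sum(c * c for c in d) ** 0.5
    if norm == 0:
        raise ValueError("points coincide; pointing axis undefined")
    return p1, tuple(c / norm for c in d)

def intersect_plane_x(origin, direction, plane_x):
    """Point where the axis crosses the plane x = plane_x, or None
    if the axis is parallel to that plane."""
    if direction[0] == 0:
        return None
    t = (plane_x - origin[0]) / direction[0]
    return tuple(o + t * c for o, c in zip(origin, direction))
```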

VEHICLE SYSTEMS AND METHODS FOR DETERMINING A TARGET BASED ON SELECTING A VIRTUAL EYE POSITION OR A POINTING DIRECTION

Vehicle systems and methods for determining a final target position based on either a first target position or a second target position are disclosed. A vehicle includes a user detection system configured to output a gesture signal in response to a hand of a user performing at least one gesture to indicate a final target position. The vehicle also includes a user gaze monitoring system configured to output an eye location signal that indicates an actual eye position of the user. The vehicle also includes one or more processors and one or more non-transitory memory modules communicatively coupled to the processors. The processors store machine-readable instructions that, when executed, cause the one or more processors to select either the first target position or the second target position as the final target position based on a first accuracy and a second accuracy.
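
The selection logic described, choosing between two candidate targets by comparing their accuracies, reduces to a simple comparison. A hedged sketch (the tie-breaking rule and parameter names are assumptions, not from the patent):

```python
def select_final_target(first_target, first_accuracy,
                        second_target, second_accuracy):
    """Return the candidate target position with the higher accuracy
    score, preferring the first candidate on ties (assumed rule)."""
    if first_accuracy >= second_accuracy:
        return first_target
    return second_target
```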