B60K35/26

METHOD AND APPARATUS FOR UNIFIED PERSONAL CLIMATE MANAGEMENT

A system includes a processor configured to determine that a first user will transition from a first climate-controllable environment to a second climate-controllable environment within a threshold time. The processor is also configured to compare first and second environment temperatures. The processor is further configured to detect whether a second-user control device is in communication with a second-environment climate control and set the second-environment climate control to a desired temperature, based on the first environment temperature, responsive to an absence of the second-user control device.
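The decision logic described in this abstract (act only when the transition is imminent and no second-user control device is present) can be sketched as follows; all identifiers and the threshold semantics are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the described climate hand-off logic.
def set_second_environment(first_temp, second_temp, transition_eta_s,
                           threshold_s, second_user_device_present):
    """Return the setpoint for the second environment, or None if no
    action should be taken."""
    if transition_eta_s > threshold_s:
        return None  # transition not within the threshold time; do nothing
    if second_user_device_present:
        return None  # defer to the second user's own control device
    # Absent a second-user control device, carry the first environment's
    # temperature over as the desired setpoint.
    return first_temp
```

The comparison of first and second environment temperatures would feed into how the setpoint is chosen; here it is simplified to carrying over the first environment's temperature.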

Motor vehicle combined instrument having a Helmholtz resonator as acoustic element

A motor vehicle combined instrument panel has a front housing directed towards a vehicle occupant and a plate which is provided on the housing and carries a loudspeaker.

CONTROL DEVICE, INPUT SYSTEM AND CONTROL METHOD
20180335845 · 2018-11-22

There is provided a control device. The control device includes an operation detector configured to detect a user's operation on an operation surface of a panel, and a driving unit configured to vibrate the panel by driving a vibration element attached to the panel. When the operation is detected by the operation detector, the driving unit combines a sound signal for sound to be generated from the panel by its vibration with a drive signal for generating, in the panel, haptic vibration that provides the user with a haptic sensation, and outputs the combined signal to the vibration element, thereby generating both the haptic vibration and the sound in the panel.
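The core idea of driving one vibration element with a combined signal can be sketched as a simple superposition of a low-frequency haptic waveform and an audible sound waveform; the function names, frequencies, and amplitudes below are assumptions for illustration, not values from the patent.

```python
import math

# Illustrative sketch: combine a haptic drive signal with an audible sound
# signal into one waveform for a single vibration element.
def combined_signal(t, haptic_hz=150.0, sound_hz=1000.0,
                    haptic_amp=1.0, sound_amp=0.3):
    haptic = haptic_amp * math.sin(2 * math.pi * haptic_hz * t)
    sound = sound_amp * math.sin(2 * math.pi * sound_hz * t)
    return haptic + sound  # one drive signal excites both effects
```

Summing the two signals lets a single actuator produce the tactile vibration and the panel-radiated sound at the same time.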

SUPPORT TO HANDLE AN OBJECT WITHIN A PASSENGER INTERIOR OF A VEHICLE

The disclosure relates to a system to support handling of an object that is located within a passenger interior of a motor vehicle and is not connected to the motor vehicle. The system has a sensor unit to detect the distance of the object from a storage area available within the passenger interior, and an evaluation unit that receives and processes the sensor signals generated by the sensor unit. A signaling unit, controlled by the evaluation unit, emits optic, acoustic and/or haptic signals within the passenger interior. The evaluation unit determines whether a hand of the driver moves towards the storage area, and activates the signaling unit if the hand moves towards the storage area and remains, for a predetermined period of time, within a storage area of predetermined size.
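The dwell-time condition (hand inside the storage area for a predetermined period before signaling) can be sketched as a small state tracker; the class, its bounding-box representation of the storage area, and the threshold handling are all illustrative assumptions, not structures specified in the patent.

```python
class StorageAreaMonitor:
    """Tracks how long the driver's hand has remained inside the storage
    area and indicates when the signaling unit should be activated."""

    def __init__(self, area_min, area_max, hold_threshold_s):
        self.area_min, self.area_max = area_min, area_max
        self.hold_threshold_s = hold_threshold_s
        self.entered_at = None  # time the hand entered the area, if inside

    def hand_in_area(self, hand_xyz):
        return all(lo <= x <= hi for x, lo, hi
                   in zip(hand_xyz, self.area_min, self.area_max))

    def update(self, hand_xyz, now_s):
        """Return True when the signaling unit should be activated."""
        if not self.hand_in_area(hand_xyz):
            self.entered_at = None  # hand left the area; reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = now_s
        return now_s - self.entered_at >= self.hold_threshold_s
```

The timer resets whenever the hand leaves the area, so only a continuous dwell of the predetermined duration triggers the signal.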

MOBILE APPLICATION INTERFACE DEVICE FOR VEHICLE NAVIGATION ASSISTANCE

An interface device connects to a mobile application running on a mobile device to provide vehicle navigation assistance. The interface device includes a microphone, a location determining component, a display, a plurality of indicator lights, a short-range communications transceiver, and a controller. The controller is configured to: receive a spoken instruction or query from a user via the microphone; send data associated with the spoken instruction or query to the mobile application running on the mobile device via the short-range communications transceiver; receive navigation data based on the spoken instruction or query from the mobile application running on the mobile device via the short-range communications transceiver; determine a current position based on one or more signals received via the location determining component; and provide at least one of a symbolic output or a textual output via the display based on the navigation data and the current position.
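The controller's step sequence (record speech, relay it to the mobile application, receive navigation data, determine position, drive the display) can be sketched as a single handler; the device interfaces below are stand-in stubs invented for illustration, not the patent's actual API.

```python
# Illustrative sketch of the controller flow described above.
def handle_voice_query(microphone, transceiver, locator, display):
    query = microphone.record()            # spoken instruction or query
    transceiver.send({"query": query})     # to the mobile application
    nav = transceiver.receive()            # navigation data sent back
    position = locator.current_position()  # from location-determining signals
    display.show(symbol=nav.get("symbol"), text=nav.get("text"),
                 position=position)        # symbolic and/or textual output
```

Each argument stands in for one hardware component of the interface device (microphone, short-range transceiver, location-determining component, display).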

Interactive safety system for vehicles

An interactive vehicle safety system with capabilities to improve peripheral vision, provide warnings, and improve reaction time for vehicle operators. For example, the interactive vehicle safety system may portray objects that are blocked by any of the structural pillars and/or mirrors of a vehicle (such as a truck, van, or train). The system may comprise one or more image-capturing devices (such as cameras or laser scanners), distance and object sensors (such as ultrasonic, LIDAR, radar, photoelectric, and infrared sensors), real-time image processing of an object, and one or more display systems (such as LCD or LED displays). The interactive vehicle safety system may give the driver a seamless 360-degree front panoramic view.

SOUND EFFECT GENERATION DEVICE FOR VEHICLES
20180277091 · 2018-09-27

A vehicle sound effect generation apparatus generates an engine sound effect based on a vibration sound database including a fundamental wave sound having a fundamental frequency component and a plurality of adjustment wave sounds having frequency components other than the fundamental frequency component. The apparatus includes a running state detecting unit that detects a running state of a vehicle, a sound effect generation unit that synthesizes the fundamental wave sound with the adjustment wave sounds based on the running state of the vehicle, and a visual guidance direction setting unit that sets a visual guidance direction toward which a vehicle occupant's gaze is to be directed. The sound effect generation unit increases the output characteristics of the sound effect in the visual guidance direction relative to those of the sound effect in directions other than the visual guidance direction.
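The synthesis step (fundamental wave plus running-state-weighted adjustment waves) can be sketched as below; the mapping from engine speed to frequency and the harmonic weighting scheme are assumptions made for illustration, not details from the patent.

```python
import math

# Illustrative sketch: synthesize one engine-sound sample from a
# fundamental wave plus adjustment waves weighted by running state.
def engine_sound_sample(t, rpm, adjustment_harmonics=(2, 3)):
    fundamental_hz = rpm / 60.0 * 2.0  # assumed firing-frequency mapping
    sample = math.sin(2 * math.pi * fundamental_hz * t)
    for n in adjustment_harmonics:
        # Louder adjustment waves at higher engine speed (assumed weighting).
        weight = min(1.0, rpm / 6000.0) / n
        sample += weight * math.sin(2 * math.pi * n * fundamental_hz * t)
    return sample
```

Directional emphasis toward the visual guidance direction would then be applied per output channel, e.g. by scaling this sample differently for each loudspeaker.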

VEHICLE-TO-HUMAN COMMUNICATION IN AN AUTONOMOUS VEHICLE OPERATION
20180276986 · 2018-09-27

A device and method for autonomous vehicle-to-human communication are disclosed. Upon detecting a human traffic participant proximal to a traffic yield condition on the vehicle's planned route, the device generates a message for broadcast to the human traffic participant and senses whether the human traffic participant acknowledges receipt of the message. When it senses that the human traffic participant acknowledges receipt, the device generates a vehicle acknowledgment message for broadcast to the human traffic participant.

Automated Motor Vehicle and Method for Controlling the Automated Motor Vehicle
20240317135 · 2024-09-26

An automated motor vehicle has a sensor device which continuously determines the viewing direction of a person sitting in the driver's seat, and a control device connected to the sensor device. The control device determines the frequency of changes in the viewing direction on the basis of the viewing direction determined by the sensor device, compares the frequency with a predetermined limit value, and, if the frequency is greater than the predetermined limit value, outputs a control signal to an information output device connected to the control device. On receiving the control signal, the information output device outputs a predetermined item of information to the person sitting in the driver's seat.
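The frequency check described in this abstract (rate of viewing-direction changes compared against a limit) can be sketched as a sliding-window count; the window length, the timestamp representation, and all identifiers are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of the gaze-monitoring logic described above.
def should_output_information(gaze_change_timestamps, window_s, limit_per_s):
    """Return True when the rate of viewing-direction changes within the
    most recent window exceeds the predetermined limit value."""
    if not gaze_change_timestamps:
        return False
    now = gaze_change_timestamps[-1]
    recent = [t for t in gaze_change_timestamps if now - t <= window_s]
    frequency = len(recent) / window_s  # changes per second in the window
    return frequency > limit_per_s
```

Six direction changes within a five-second window, for example, yields 1.2 changes per second and would exceed a limit of 1.0, triggering the information output.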