VEHICLE INTERFACE DEVICE
20180005528 · 2018-01-04
Inventors
- Jean-Jacques Loeillet (Coventry, Warwickshire, GB)
- Andrew Chatwin (Coventry, Warwickshire, GB)
- Andy Wells (Coventry, Warwickshire, GB)
- Alan Trevana (Coventry, Warwickshire, GB)
- Frazer McKimm (Coventry, Warwickshire, GB)
- Conor Duff (Coventry, Warwickshire, GB)
- Guido Astorri (Coventry, Warwickshire, GB)
CPC classification
G08G1/165
PHYSICS
B60Q9/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
The present disclosure relates to a vehicle interface device configured to output a haptic signal to indicate a potential hazard. The apparatus includes at least one haptic generator configured to generate a haptic signal; and a processor for controlling the haptic generator. In dependence on object data relating to an identified object representing a potential hazard, the processor is configured to determine an angular position of the identified object relative to the vehicle. A control signal is generated to cause the haptic generator to output a haptic signal for providing an indication of the determined relative position of the identified object. The control signal is modified to progressively change the generated haptic signal to represent changes in the relative angular position of the identified object. The present disclosure also relates to a vehicle incorporating a vehicle interface device.
Claims
1. A vehicle interface device for generating a haptic indication of a potential hazard, the vehicle interface device comprising: at least one haptic generator configured to generate a haptic signal; and a processor for controlling the at least one haptic generator; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the haptic generator to output a haptic signal for providing an indication of the angular position of the identified object relative to the vehicle; and modify the control signal to progressively change the generated haptic signal to represent changes in the angular position of the identified object relative to the vehicle.
2. The vehicle interface device of claim 1, wherein the processor is configured to receive the object data from a sensor disposed on the vehicle.
3. The vehicle interface device of claim 1, wherein the at least one haptic generator comprises a plurality of said haptic generators, wherein the processor is configured to control activation of the haptic generators to represent changes in the angular position of the identified object relative to the vehicle.
4. The vehicle interface device of claim 1, wherein the at least one haptic generator comprises a vibration generator.
5. The vehicle interface device of claim 1, wherein the at least one haptic generator comprises: an ultrasonic transducer configured to generate an ultrasonic signal, optionally wherein the ultrasonic transducer is configured to control an output direction of the ultrasonic signal to represent changes in the angular position of the identified object relative to the vehicle; or an air vent configured to generate the haptic signal as a jet of air, optionally wherein the air vent comprises an adjustable nozzle for controlling a direction of the jet of air to represent changes in the angular position of the identified object relative to the vehicle.
6-8. (canceled)
9. The vehicle interface device of claim 1, wherein the at least one haptic generator is disposed within a seat in an occupant compartment of the vehicle, optionally wherein the vehicle interface device further comprises a sensor configured to determine occupant contact with the seat, and wherein the processor is configured to control activation of the at least one haptic generator in dependence on occupant contact with the seat.
10. (canceled)
11. The vehicle interface device of claim 1, wherein the processor is configured to: determine a trajectory of the identified object in dependence on the object data and to modify the haptic signal in dependence on the determined trajectory; and/or determine a time to collision in dependence on the object data and modify the haptic signal in dependence on the determined time to collision; and/or determine a nature of the identified object in dependence on the object data and modify the haptic signal in dependence on the determined nature.
12-13. (canceled)
14. The vehicle interface device of claim 1, wherein the processor is configured to modify the haptic signal by changing one or more of the following parameters: amplitude, frequency, magnitude, haptic pattern, and pattern form.
15. The vehicle interface device of claim 1, further comprising: a display configured to extend around at least a portion of a perimeter of an occupant compartment in a vehicle; and wherein the processor is configured to control the display and: generate a control signal to cause the display to display a visual indicator at a display position in the display corresponding to the angular position of the identified object relative to the vehicle; and modify the control signal to progressively change the display position of the visual indicator in the display at least substantially to match changes in the angular position of the identified object relative to the vehicle.
16. The vehicle interface device of claim 1, comprising: at least one electroacoustic transducer configured to generate an audible signal; and wherein the processor controls the at least one electroacoustic transducer and is configured to: generate a control signal to cause the at least one electroacoustic transducer to generate an audio object corresponding to the angular position of the identified object relative to the vehicle; and modify the control signal to progressively change a perceived spatial location of the audio object to represent changes in the angular position of the identified object relative to the vehicle.
17. A vehicle comprising the vehicle interface device of claim 1.
18. A method of generating a haptic indication of a potential hazard, the method comprising: in dependence on object data relating to an identified object representing a potential hazard, determining an angular position of the identified object relative to a vehicle; generating a haptic signal for providing an indication of the position of the identified object relative to the vehicle; and progressively changing the generated haptic signal to represent changes in the angular position of the identified object relative to the vehicle.
19. The method of claim 18, further comprising receiving the object data from a sensor disposed on the vehicle.
20. The method of claim 18, further comprising controlling activation of a plurality of said haptic generators to represent changes in the angular position of the identified object relative to the vehicle.
21. The method of claim 18, wherein the haptic signal is generated via at least one haptic generator that comprises: a vibration generator; an ultrasonic transducer for generating an ultrasonic signal as the haptic signal; or an air vent configured to generate the haptic signal in the form of a jet of air.
22-25. (canceled)
26. The method of claim 18, wherein the haptic signal is generated via at least one haptic generator that is disposed within a seat in an occupant compartment of the vehicle, optionally determining occupant contact with the seat, and controlling activation of the at least one haptic generator in dependence on occupant contact with the seat.
27. (canceled)
28. The method of claim 18, further comprising: determining a trajectory of the identified object in dependence on the object data and modifying the haptic signal in dependence on the determined trajectory; and/or determining a time to collision in dependence on the object data and modifying the haptic signal in dependence on the determined time to collision; and/or determining a nature of the identified object in dependence on the object data and modifying the haptic signal in dependence on the determined nature.
29-30. (canceled)
31. The method of claim 18, further comprising modifying the haptic signal by changing one or more of the following parameters: amplitude, frequency, magnitude, haptic pattern, and pattern form.
32. The method of claim 18, further comprising generating a visual indication of a potential hazard, the method comprising: displaying a visual indicator at a display position corresponding to the angular position of the identified object relative to the vehicle; and progressively changing the display position of the visual indicator at least substantially to match changes in the angular position of the identified object relative to the vehicle.
33. The method of claim 18, further comprising generating an audible indication of a potential hazard, comprising: generating an audible signal for providing an indication of the angular position of the object relative to the vehicle; and progressively modifying the generated audible signal to represent changes in the angular position of the identified object relative to the vehicle.
34-36. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0204] An embodiment of the present invention will now be described, by way of example only, with reference to the accompanying figures.
DETAILED DESCRIPTION
[0221] A vehicle interface device 1 in accordance with an embodiment of the present invention will now be described. The vehicle interface device 1 functions as a human machine interface (HMI) for a vehicle 2, shown schematically in the figures.
[0222] The vehicle interface device 1 is operable to generate an alert to notify a driver of the vehicle 2 that a potential hazard has been identified. The alert comprises a directional component to notify the driver of the angular position of the potential hazard in relation to the vehicle 2. The alert in the present embodiment comprises three modalities:
[0223] 1. Vision—patterns using colours, pulse and motion are displayed in the visual structure of the occupant compartment;
[0224] 2. Sound—directional object-based sound associated with the nature of the identified object; and
[0225] 3. Haptic—directional and applied via the seat (or the steering wheel), for example in the form of vibration or contact.
[0226] The potential hazard typically takes the form of an object 6 identified by the vehicle interface device 1. The identified object 6 can be either stationary or moving. The vehicle interface device 1 is configured to provide the driver with an indication of the position of the identified object 6 in relation to the vehicle 2. The vehicle interface device 1 can be configured to differentiate between different types of objects to determine the nature of the potential hazard, for example to determine if the potential hazard is a pedestrian (potentially differentiating between an adult and a child), a cyclist, a vehicle, a truck, an animal or an inanimate object. An image processing algorithm can, for example, be applied to image data to determine the nature of the identified object. The form of the alert can be modified in dependence on the determined nature of the potential hazard. The vehicle interface device 1 could identify more than one potential hazard at any time and the techniques described herein could be performed simultaneously for the plurality of identified hazards. Alternatively, the vehicle interface device 1 could be configured to prioritise one of the identified hazards over the others, for example in dependence on the nature of the potential hazards identified by the vehicle interface device 1. The vehicle interface device 1 could be configured to output an alert only relating to the potential hazard identified as having the highest priority.
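Purely as an illustrative aid (not part of the disclosure), a minimal Python sketch of one possible prioritisation scheme is given below; the labels, weights and data layout are assumptions, since the text does not specify how priority is assigned among several identified hazards.

```python
# Illustrative only: one possible scheme for prioritising several
# identified hazards by their determined nature. The weight values and
# labels are hypothetical; the disclosure does not specify a scoring rule.
HAZARD_WEIGHTS = {
    "child_pedestrian": 5,
    "adult_pedestrian": 4,
    "cyclist": 4,
    "vehicle": 3,
    "animal": 2,
    "inanimate_object": 1,
}

def highest_priority_hazard(hazards):
    """Return the hazard dict with the highest weight for its 'nature' key;
    unknown natures default to the lowest priority."""
    return max(hazards, key=lambda h: HAZARD_WEIGHTS.get(h["nature"], 0))
```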
[0228] The ADAS 9 is coupled to sensor means for monitoring a region surrounding the vehicle 2.
[0229] The first and second signal processors identify object(s) 6 proximal to the vehicle 2 within the operating zones R1-4 and output the positional data D1 in the form of x, y coordinates defining the position of the identified object 6 relative to a virtual reference point on the vehicle 2. It will be understood that the sensor means can comprise different types of sensors, such as ultrasonic sensors and/or capacitive sensors. Moreover, the sensor means could be remote from the vehicle 2, for example in another vehicle which is in communication with the vehicle 2 (vehicle-to-vehicle (V2V) communication).
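By way of illustration only, the following Python sketch shows how a relative angular position could be derived from the x, y coordinates carried in the positional data D1; the function name and the axis convention are assumptions and do not form part of the disclosure.

```python
import math

def relative_angular_position(x, y):
    """Bearing (in degrees) of an identified object 6 from the vehicle's
    virtual reference point, computed from the x, y coordinates in the
    positional data D1. Convention (an assumption): x positive to the
    right, y positive ahead; 0 deg = ahead, 90 deg = right, 180 deg = behind."""
    return math.degrees(math.atan2(x, y)) % 360.0
```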
[0231] The positional data D1, the time to collision data D2, and the nature of object data D3 each relate to the identified object 6 identified as a potential hazard. These different data sets are referred to herein as object data. The object data can be output from the sensor means disposed on the vehicle 2. Alternatively, or in addition, the processor 7 could be configured to receive the object data from an external source, such as infrastructure (infrastructure-to-vehicle (I2V)) or another vehicle.
[0232] The driver monitoring system 10 comprises a driver monitoring camera (not shown). An image processing unit receives image data from the driver monitoring camera and assesses a driver distraction level and a driver tiredness (fatigue) level. The image processing unit can, for example, implement an image-processing algorithm to determine a driver alertness level, for example based on head pose and/or gaze direction. The driver monitoring system 10 can also monitor the driver workload, for example with reference to the vehicle speed and/or steering angle. A driver capability can also be determined by the driver monitoring system 10 to provide an estimate of an expected reaction time by the driver at any given time. The driver monitoring system 10 monitors the current driver workload, driver distraction, driver tiredness (fatigue) and driver capability. The driver monitoring system 10 can comprise a driver-facing camera to monitor driver behaviour, for example based on face recognition algorithms. The driver monitoring system 10 can also monitor driver inputs, including steering angle and/or pedal angles. The driver monitoring system 10 outputs the second input signal S.sub.IN2 which includes driver monitoring data D4 comprising an estimated driver reaction time. The driver monitoring system 10 can also output data generated by the image processing unit defining the head pose and/or the gaze direction of the driver. The second input signal S.sub.IN2 could optionally also comprise information relating to the driver's vision capabilities, for example short or long distance vision and/or colour perception. The driver capability can be determined as T = T0 + ΔT, where T0 is the reaction time for the specific situation (for example obtained from a look-up table) and ΔT is additional time calculated based on driver monitoring of workload and/or distraction and/or tiredness; alternatively, T0 can be multiplied by a predefined percentage of reaction rate (for example, a fatigued individual may take x% longer to react, where x is a predefined number greater than zero). The driver monitoring system 10 can optionally also utilise auditory and/or vision information relating to a particular driver. The auditory information can define a driver's auditory capabilities; and the vision information can define a driver's vision capabilities, for example indicating the driver's long/short sighted ability and/or colour perception ability. The auditory and/or vision information could be measured or could be input by the driver.
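A minimal sketch of the reaction-time estimate described above is given below, purely for illustration; combining the additive and percentage corrections in a single function is an assumption, as the text presents them as alternatives.

```python
def estimated_reaction_time(t0, delta_t=0.0, fatigue_pct=0.0):
    """Estimated driver reaction time T = T0 + delta T, optionally scaled by
    a predefined fatigue percentage x (a fatigued driver takes x% longer).

    t0          -- reaction time for the specific situation, e.g. from a
                   look-up table (seconds)
    delta_t     -- additional time from monitored workload, distraction
                   and/or tiredness (seconds)
    fatigue_pct -- predefined percentage x; applying it together with
                   delta_t is an assumption for the sketch
    """
    return t0 * (1.0 + fatigue_pct / 100.0) + delta_t
```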
[0233] The third input signal S.sub.IN3 can be output from the user identification system 11 to identify the driver of the vehicle. The processor 7 additionally receives vehicle dynamics data from the vehicle information system 12. The fourth input signal S.sub.IN4 comprises vehicle speed data D5, but can include other vehicle dynamics parameters, such as the steering angle. The processor 7 can also receive driver data D6 which, as described herein, can be used to estimate one or more physical characteristics of the driver. The processor 7 can also receive driver head position data D7 indicating the position of the driver's head. The processor 7 can also receive driver clothing data D8 characterising the clothing worn by the driver, for example the thickness of a garment and/or the number of layers. The driver head position data D7 and the driver clothing data D8 can be generated in dependence on image processing of image data generated by a driver-facing camera (not shown). The outputs from the display device 13, the audio device 14 and the haptic device 15 can be modified in dependence on the estimated physical characteristics. For example, the haptic output generated by the haptic device 15 can be controlled in dependence on a pressure zone on the driver seat estimated in dependence on the measured weight of the driver.
[0234] The processor 7 applies a control algorithm to the input signals S.sub.IN1-4 to generate the output signals S.sub.OUT1-3 for controlling operation of the display device 13, the audio device 14 and the haptic device 15 respectively. The processor 7 thereby functions as a HMI controller for the vehicle 2. The configuration of the display device 13, the audio device 14 and the haptic device 15 will now be described in more detail.
[0235] The display device 13 is configured to output a visual indicator to notify a vehicle occupant of a potential hazard. The display device 13 is configured to extend substantially around the interior perimeter of the occupant compartment 3.
[0236] In the present embodiment, the display device 13 comprises a matrix of light emitting elements 24 arranged to form a substantially continuous optical track or band around the interior perimeter of the occupant compartment 3. The light emitting elements 24 each comprise one or more light emitting diodes (LEDs).
[0239] The display colour relates to the colour of the visual pattern P and can be changed to indicate a determined risk level associated with the potential hazard. By way of example, the visual pattern P can be displayed in yellow when the determined risk level posed by the identified object 6 is relatively low; or red when the determined risk level posed by the potential hazard is relatively high. The light emitting elements 24 can display a green colour to indicate that the vehicle interface device 1 is in operation but no potential hazards have been identified. The display colour signal S2 is generated as a function of the determined time to collision (ttc) and the estimated driver reaction time. The display form signal S3 is generated as a function of the determined nature of the identified object 6. The display form signal S3 controls the display form (shape) of the visual pattern P to represent different types of objects 6. For example, a first visual pattern P can be displayed to represent a cyclist, and a second visual pattern P can be displayed to represent another vehicle. The size of the visual pattern P can be controlled in dependence on the display form signal S3. The different visual patterns P can be predefined or generated dynamically, for example derived from the object data.
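The colour selection described above can be summarised in a short sketch, given here for illustration only; the two-second margin threshold and the function name are assumptions, as the disclosure does not give numeric values.

```python
def display_colour(ttc, reaction_time, margin_s=2.0):
    """Select the colour of the visual pattern P from the determined time
    to collision (ttc) and the estimated driver reaction time, both in
    seconds. The margin threshold is an illustrative assumption."""
    if ttc is None:
        return "green"   # device operating, no potential hazard identified
    if ttc - reaction_time < margin_s:
        return "red"     # relatively high determined risk level
    return "yellow"      # relatively low determined risk level
```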
[0240] The processor 7 is configured to control the display position signal S1 such that changes in the display position of the visual pattern P are substantially continuous to provide a spatially uninterrupted indication of changes in the relative angular position of the identified object 6. To indicate changes in the relative angular position of the identified object 6, the visual pattern P travels progressively within the display device 13 to provide a scrolling effect providing an uninterrupted (seamless) representation of changes in the relative angular position of the identified object 6. The display position of the visual pattern P can change in a horizontal direction to indicate changes in the relative angular position of the identified object 6. The size and/or illumination level of the visual pattern P could also be controlled, for example to indicate a determined range to the identified object 6 and/or a determined size of the identified object 6. Alternatively, or in addition, the display position of the visual pattern P can change in a vertical direction to indicate that the identified object 6 is travelling towards and/or away from the vehicle 2. For example, the visual pattern P can travel upwardly within those vertical portions of the display device 13 disposed on the A-pillar 23 (and optionally also the B-pillar and/or the C-pillar and/or the D-pillar) to indicate that the identified object 6 is travelling towards the vehicle 2.
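To illustrate the seamless scrolling effect, the sketch below maps a bearing onto an element of the continuous optical band; the element count and the linear 0-360 degree mapping are assumptions rather than disclosed values.

```python
def pattern_position(bearing_deg, num_elements=240):
    """Map the relative angular position of the identified object 6 onto an
    index in the continuous band of light emitting elements 24, so the
    visual pattern P scrolls seamlessly as the bearing changes. The element
    count and the linear 0-360 degree mapping are illustrative assumptions."""
    return int(bearing_deg % 360.0 / 360.0 * num_elements) % num_elements
```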
[0241] The processor 7 can also be configured to control the size and/or position and/or illumination level of the visual pattern P depending on the field of vision of the driver. The field of vision of the driver is illustrated in the figures.
[0242] The processor 7 can take into account additional control factors. For example, the processor 7 can use the driver size as a further input to determine the display position of the visual pattern P. For example, if the driver is small (necessitating a forward seating position), the processor 7 can translate the display position of the visual pattern P towards the front of the occupant compartment to improve visibility of the visual pattern P. The driver size can be determined by processing the image data received from the driver monitoring camera. Alternatively, the driver size can be estimated based on the position of the driver seat and/or a measured weight of the driver.
[0243] The audio device 14 is an object-based audio system configured to generate a multi-dimensional audio alert in dependence on the second output signal S.sub.OUT2 generated by the processor 7. The audio alert conveys positional information and/or movement information relating to the identified object 6. The audio device 14 is configured to output an acoustic pattern which is audible within the occupant compartment 3. In the present embodiment, the audio device 14 comprises a rendering station 28 configured to generate an object-based audio output which can combine different sound elements with metadata to form an audio object 29 (or a plurality of audio objects 29). The audio object 29 is an acoustic event perceived in space that may or may not occupy the same location as a loudspeaker. The audio object 29 has physical parameters that are manipulated to provide a change in the perceived location of the audio object 29 representing changes to the state of the identified (physical) object 6. This is different from the “phantom centre” experienced when a listener sits between two stereo loudspeakers because the centre image cannot be manipulated as a result of external factors.
[0244] The metadata utilised by the rendering station 28 is generated in dependence on the determined position of the identified object 6, for example the determined relative angular position (heading) and/or range of the identified object 6. The rendering station 28 can control the perceived spatial location of the audio object 29 in three-dimensions. The perceived spatial location of the audio object 29 conveys information relating to the position of the identified object 6 in relation to the vehicle 2. By way of example, the perceived spatial location of the audio object 29 can provide an indication of the relative angular position of the identified object 6. Moreover, the perceived spatial location of the audio object 29 can be changed to represent changes in the relative angular position of the identified object 6. One or more characteristics of the audio object 29 can also be controlled to convey information relating to the identified object 6. For example, a sound effect transmitted in said audio object 29 can be selected to indicate the nature of the identified object 6. The amplitude of the audio object 29 can be controlled to indicate a range to the identified object 6.
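As an illustrative sketch only, the metadata assembly described above might look as follows; the field names and the inverse-range gain law are assumptions, not part of the disclosure.

```python
def audio_object_metadata(bearing_deg, range_m, sound_effect):
    """Assemble rendering metadata for an audio object 29: the perceived
    azimuth tracks the identified object's relative angular position and
    the gain rises as the range falls. The gain law and field names are
    illustrative assumptions."""
    return {
        "effect": sound_effect,              # selected from the object's nature
        "azimuth_deg": bearing_deg % 360.0,  # perceived spatial location
        "gain": 1.0 / max(1.0, range_m),     # louder when the object is closer
    }
```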
[0246] The rendering station 28 could, for example, be configured to change the spatial location of the audio object 29 in dependence on the determined angular position of the identified object 6 relative to the vehicle 2. The spatial relationship between the vehicle 2 and the identified object 6 can be used to define the perceived spatial location of the audio object 29. In particular, the perceived angular position of the audio object 29 can correspond to the angular position of the identified object 6 in relation to the vehicle 2. The audio device 14 could optionally implement a sound shower such that the audio object 29 can be heard only in the driver area, thereby reducing disturbance to other occupants.
[0247] The perceived vertical location of the audio object 29 can be varied to convey additional information, for example relating to the size or nature of the identified object 6. The perceived vertical location of the audio object 29 could be relatively low to indicate that the identified object 6 is relatively small (for example to indicate that a child has been identified); and relatively high to indicate that the identified object 6 is relatively large (for example to indicate that an adult or a cyclist has been identified). Equally, the perceived vertical location of the audio object 29 could be adjusted to indicate range (distance), a relatively high perceived vertical location representing a relatively large range to the identified object 6 and a relatively low perceived vertical location representing a relatively small range to the identified object 6.
[0249] The haptic device 15 is configured to generate a haptic alert in dependence on the third output signal S.sub.OUT3 generated by the processor 7. The haptic alert is configured to convey positional information and/or movement information relating to the identified object 6. The haptic device 15 is associated with a driver seat 32 disposed in the occupant compartment 3.
[0250] In the present embodiment the haptic effect generating device 37 comprises an array of vibration generators 38 that can be controlled independently of each other. The vibration generators 38 can, for example, each comprise an electric actuator (such as a piezoelectric actuator), an eccentric rotating element, or a vibratory transducer. In the illustrated arrangement, the haptic effect generating device 37 comprises nine (9) vibration generators 38. The haptic device 15 comprises a haptic control unit 39 configured to control operation of said vibration generators 38 in dependence on the third output signal S.sub.OUT3. Specifically, the haptic control unit 39 is configured to output haptic control signals S.sub.H to control each vibration generator 38 independently. It will be understood that fewer than, or more than, nine (9) vibration generators 38 can be incorporated into the haptic effect generating device 37. Alternatively, or in addition, the haptic effect generating device 37 could comprise one or more of the following: an ultrasonic transducer (for example haptic touchless technology), an electric actuator (such as a piezoelectric actuator) and a vibratory transducer.
[0251] The haptic effect generating device 37 is controlled in dependence on the third output signal S.sub.OUT3 selectively to energize one or more of the vibration generators 38 to generate a haptic pattern. The haptic pattern is controlled to convey information to the driver of the vehicle 2 relating to the identified object 6, for example to indicate a relative angular position and/or relative angular movement of the detected object 6.
[0252] In another embodiment one or more of the vibration generators 38 are arranged so that they are positioned and/or grouped within different portions of the haptic effect generating device 37.
[0253] The amplitude and/or frequency of the haptic pattern can be controlled to convey additional information, such as a hazard level (criticality) posed by the identified object 6. For example, the amplitude of the haptic pattern could be increased if the processor 7 determines that the identified object 6 is a particular hazard. The hazard level can, for example, be calculated based on the determined time to collision (ttc) and the reaction time of the driver. The amplitude and/or frequency of the haptic pattern could be modified to indicate the form of the identified object 6.
[0255] The processor 7 can also be configured to control operation of the haptic effect generating device 37 in dependence on a determined contact between the driver and the seat squab 34. Using weight percentiles, a contact pattern between the driver and the seat cushion 33 and the seat squab 34 can be estimated. By way of example, a first contact pattern 40A for a 5th percentile is shown in the figures.
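For illustration only, the gating of the vibration generators by an estimated contact pattern could be sketched as follows; the (row, column) representation and the example patterns are hypothetical.

```python
# Illustrative only: gating the vibration generators 38 by an estimated
# driver/seat contact pattern. Patterns are sets of (row, column) positions
# in the 3x3 generator array; the example patterns are hypothetical.
CONTACT_PATTERNS = {
    5:  {(0, 1), (1, 1), (2, 1)},                      # 5th percentile: narrow zone
    95: {(r, c) for r in range(3) for c in range(3)},  # 95th percentile: full zone
}

def active_generators(requested, weight_percentile):
    """Energise only those requested generators that fall inside the contact
    pattern estimated for the driver's weight percentile."""
    return requested & CONTACT_PATTERNS.get(weight_percentile, set())
```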
[0256] The haptic effect generating device 37 could utilise an array of ultrasonic transducers in place of (or in addition to) the vibration generators 38. The ultrasonic transducers could be incorporated into the seat cushion 33 and/or the seat squab 34 and/or the head rest 35. In this arrangement, one or more of said ultrasonic transducers can be activated to generate the haptic pattern which is sensed by the driver. The haptic pattern can be controlled by selectively activating one or more of said ultrasonic transducers. In use, the ultrasonic transducers could be configured to generate an ultrasonic signal that is transmitted through the air and is felt by the driver. Thus, the ultrasonic transducers are operable to transmit the haptic pattern when the driver is not in direct contact with the driver seat 32.
[0257] In an alternative arrangement, the haptic pattern could be generated by controlling an airflow incident on the driver of the vehicle 2. The haptic effect generating device 37 could utilise one or more air vents to control the airflow to generate the haptic pattern. The one or more air vents could be incorporated into the driver seat 32, for example into a head rest; and/or into a door of the vehicle 2; and/or into a B-pillar of the vehicle 2. The one or more air vents could be selectively opened/closed to control airflow incident on the driver, for example on the back of the driver's head, neck or shoulders. The resulting haptic pattern can be used to notify the driver of the relative angular position and/or relative movement of the identified object 6. The extent to which each air vent is opened could be controlled to control the strength of the incident airflow. Alternatively, or in addition, the haptic effect generating device 37 could comprise an adjustable nozzle (not shown) which can be controlled to change the direction of the incident airflow. An operating speed of a fan unit for generating the airflow could be controlled. The incident airflow could be pulsed. The pulsed airflow could be controlled to convey additional information, such as the nature of the identified object 6 and/or a hazard level. For example, the frequency of the pulses could be increased to signal a reduction in the range to the identified object 6.
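The range-dependent pulsing described above is sketched below for illustration; the frequency bounds and maximum range are assumptions, as no numeric values are disclosed.

```python
def pulse_frequency_hz(range_m, f_min=0.5, f_max=5.0, max_range_m=30.0):
    """Pulse frequency for an air-jet haptic pattern: the pulses speed up
    as the range to the identified object 6 decreases. The frequency bounds
    and maximum range are illustrative assumptions."""
    closeness = 1.0 - min(range_m, max_range_m) / max_range_m
    return f_min + closeness * (f_max - f_min)
```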
[0258] The operation of the vehicle interface device 1 will now be described with reference to a first example illustrated in the figures.
[0259] The processor 7 outputs the third output signal S.sub.OUT3 to control operation of the haptic device 15. The haptic device 15 operates throughout the sequence to provide an additional communication means. In particular, the vibration generators 38 in the central column Y2 are activated initially when the cyclist is detected behind the vehicle 2. The intensity of the vibrations is adjusted based on the measured range to the cyclist. As the cyclist approaches on the right hand side of the vehicle 2, the vibration generators 38 in the right column Y3 generate vibrations which progressively increase in magnitude while those generated by the vibration generators 38 in the central column Y2 progressively decrease. When the cyclist is alongside the vehicle 2, only those vibration generators 38 in the right hand column Y3 are active. The magnitude of the vibrations decreases as the cyclist moves further away from the vehicle 2. The vibration generators 38 thereby generate a haptic pattern which also conveys relative angular position and movement information to the driver.
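The column crossfade in this first example can be illustrated with a short sketch; the linear blend and the bearing convention are assumptions, chosen only to match the qualitative behaviour described above.

```python
def column_intensities(bearing_deg):
    """Crossfade vibration intensity between the central column Y2 and the
    right-hand column Y3 as a hazard moves from directly behind the vehicle
    (180 deg) to alongside on the right (90 deg). The linear crossfade and
    the bearing convention (0 = ahead, 90 = right) are assumptions."""
    if not 90.0 <= bearing_deg <= 180.0:
        return {"Y2": 0.0, "Y3": 0.0}     # outside the rear-right sector
    blend = (180.0 - bearing_deg) / 90.0  # 0 when behind, 1 when alongside
    return {"Y2": 1.0 - blend, "Y3": blend}
```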
[0260] The processor 7 can be configured to control output of the first, second and third output signals S.sub.OUT1-3 to control activation of the display device 13, the audio device 14 and the haptic device 15 to convey different information. The display device 13, the audio device 14 and the haptic device 15 can be activated independently of each other to convey information relating to different identified hazards, for example in dependence on a determined priority of a plurality of potential hazards or in dependence on an identified region in which the hazard is identified. The processor 7 can control an activation sequence of the display device 13, the audio device 14 and the haptic device 15, for example depending on a personal preference setting or depending on a determined urgency. By way of example, the haptic device 15 can be activated when there is an imminent hazard to provide direct feedback to the driver.
[0261] The operation of the vehicle interface device 1 will now be described with reference to a second example illustrated in the figures.
[0262] The visual pattern P thereby sweeps along the right lateral panel 21R to provide a continuous indication of the relative angular position of the cyclist.
[0263] It will be appreciated that various changes and modifications can be made to the vehicle interface device without departing from the scope of the present invention. The vehicle interface device has been described herein with reference to implementing three modalities, namely visual, audio and haptic feedback. It will be appreciated that the vehicle interface device could be implemented with only one of said modalities or two of said modalities.
[0264] The illumination level (or intensity) of the illuminating elements can be controlled individually within the visual pattern P to indicate a measured distance between the vehicle and the identified object 6. The measured distance could be the shortest distance between the vehicle 2 and the identified object 6, for example measured normal to an exterior of the vehicle 2; or could be the distance measured relative to a reference point in the vehicle 2. By varying the illumination level within the visual pattern P, a sense of depth or perspective can be conveyed.
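A minimal sketch of this distance-to-brightness mapping follows, for illustration only; the linear law and the range cap are assumptions.

```python
def illumination_level(distance_m, max_distance_m=30.0):
    """Per-element illumination level (0 to 1) within the visual pattern P:
    brighter when the measured distance to the identified object 6 is
    smaller, conveying a sense of depth. The linear law and the range cap
    are illustrative assumptions."""
    return max(0.0, 1.0 - min(distance_m, max_distance_m) / max_distance_m)
```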
[0265] The user can set preferences for operation of the display device 13, the audio device 14 and the haptic device 15.
[0266] Further aspects of the present invention are set out in the following numbered paragraphs:
[0267] 1. A vehicle interface device for generating a haptic indication of a potential hazard, the vehicle interface device comprising: [0268] at least one haptic generator configured to generate a haptic signal; and [0269] a processor for controlling said haptic generator; [0270] wherein the processor is configured to: [0271] in dependence on object data relating to an object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; [0272] generate a control signal to cause the haptic generator to output a haptic signal for providing an indication of the determined relative position of the identified object; and [0273] modify the control signal to progressively change the generated haptic signal to represent changes in the relative angular position of the identified object.
[0274] 2. A vehicle interface device as described in paragraph 1, wherein the processor is configured to receive said object data from sensor means disposed on the vehicle.
[0275] 3. A vehicle interface device as described in paragraph 1 comprising a plurality of said haptic generators, wherein the processor is configured to control activation of said haptic generators to represent changes in the relative angular position to the identified object from the vehicle.
[0276] 4. A vehicle interface device as described in paragraph 1, wherein the at least one haptic generator comprises a vibration generator.
[0277] 5. A vehicle interface device as described in paragraph 1, wherein the at least one haptic generator comprises an ultrasonic transducer for generating an ultrasonic signal.
[0278] 6. A vehicle interface device as described in paragraph 5, wherein the ultrasonic transducer is configured to control the output direction of said ultrasonic signal to represent changes in the relative angular position to the identified object from the vehicle.
[0279] 7. A vehicle interface device as described in paragraph 1, wherein the at least one haptic generator comprises an air vent for generating the haptic signal in the form of a jet of air.
[0280] 8. A vehicle interface device as described in paragraph 7, wherein the air vent comprises an adjustable nozzle for controlling the direction of the jet of air to represent changes in the relative angular position to the identified object from the vehicle.
[0281] 9. A vehicle interface device as described in paragraph 1, wherein the at least one haptic generator is disposed within a seat in the occupant compartment.
[0282] 10. A vehicle interface device as described in paragraph 9 comprising means for determining occupant contact with the seat; wherein the processor is configured to control activation of said at least one haptic generator in dependence on the determined occupant contact with the seat.
[0283] 11. A vehicle interface device as described in paragraph 1, wherein the processor is configured to determine a trajectory of the identified object in dependence on the identified object data; and to modify the haptic signal in dependence on the determined trajectory.
[0284] 12. A vehicle interface device as described in paragraph 1, wherein the processor is configured to determine a time to collision in dependence on the identified object data; and to modify the haptic signal in dependence on the determined time to collision.
[0285] 13. A vehicle interface device as described in paragraph 1, wherein the processor is configured to determine a nature of the identified object in dependence on the identified object data; and to modify the haptic signal in dependence on the determined nature.
[0286] 14. A vehicle interface device as described in paragraph 1, wherein the processor is configured to modify the haptic signal by changing one or more of the following parameters: amplitude, frequency, magnitude, haptic pattern; and pattern form.
[0287] 15. A vehicle interface device as described in paragraph 1, wherein the vehicle interface device is also suitable for generating a visual indication of a potential hazard, the vehicle interface device comprising: [0288] display means configured to extend around at least a portion of a perimeter of an occupant compartment in a vehicle; and [0289] a processor for controlling said display means; [0290] wherein the processor is configured to: [0291] in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; [0292] generate a control signal to cause the display means to display a visual indicator at a display position in said display means corresponding to the determined relative angular position of the identified object; and [0293] modify the control signal to progressively change the display position of the visual indicator within the display means at least substantially to match changes in the relative angular position of the identified object.
[0294] 16. A vehicle interface device as described in paragraph 1, wherein the vehicle interface device is also suitable for generating an audible indication of a potential hazard, the vehicle interface device comprising: [0295] at least one electroacoustic transducer configured to generate an audible signal; and [0296] a processor for controlling said at least one electroacoustic transducer; [0297] wherein the processor is configured to: [0298] in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the object relative to the vehicle; [0299] generate a control signal to cause the at least one electroacoustic transducer to generate an audio object; and [0300] modify the control signal to progressively change a perceived spatial location of the audio object to represent changes in the relative angular position of the identified object.
[0301] 17. A vehicle comprising a vehicle interface device as described in paragraph 1.
[0302] 18. A method of generating a haptic indication of a potential hazard, the method comprising: [0303] in dependence on object data relating to an identified object representing a potential hazard, determining an angular position of the identified object relative to a vehicle; [0304] generating a haptic signal for providing an indication of the determined relative position of the identified object; and [0305] progressively changing the generated haptic signal to represent changes in the relative angular position of the identified object.
[0306] 19. A method as described in paragraph 18 comprising receiving said object data from sensor means disposed on the vehicle.
[0307] 20. A method as described in paragraph 18 or paragraph 19 comprising controlling activation of a plurality of said haptic generators to represent changes in the relative angular position to the identified object from the vehicle.
[0308] 21. A method as described in any one of paragraphs 18, 19 or 20, wherein the at least one haptic generator comprises a vibration generator.
[0309] 22. A method as described in any one of paragraphs 18, 19 or 20, wherein the at least one haptic generator comprises an ultrasonic transducer for generating an ultrasonic signal.
[0310] 23. A method as described in paragraph 22, wherein the ultrasonic transducer is configured to control the output direction of said ultrasonic signal to represent changes in the relative angular position to the identified object from the vehicle.
[0311] 24. A method as described in any one of paragraphs 18, 19 or 20, wherein the at least one haptic generator comprises an air vent for generating the haptic signal in the form of a jet of air.
[0312] 25. A method as described in paragraph 24, wherein the air vent comprises an adjustable nozzle for controlling the direction of the jet of air to represent changes in the relative angular position to the identified object from the vehicle.
[0313] 26. A method as described in any one of paragraphs 18 to 25, wherein the at least one haptic generator is disposed within a seat in the occupant compartment.
[0314] 27. A method as described in paragraph 26 comprising determining occupant contact with the seat; and controlling activation of said at least one haptic generator in dependence on the determined occupant contact with the seat.
[0315] 28. A method as described in any one of paragraphs 18 to 27 comprising determining a trajectory of the identified object in dependence on the identified object data; and modifying the haptic signal in dependence on the determined trajectory.
[0316] 29. A method as described in any one of paragraphs 18 to 28 comprising determining a time to collision in dependence on the identified object data; and modifying the haptic signal in dependence on the determined time to collision.
[0317] 30. A method as described in any one of paragraphs 18 to 29 comprising determining a nature of the identified object in dependence on the identified object data; and modifying the haptic signal in dependence on the determined nature.
[0318] 31. A method as described in any one of paragraphs 18 to 30 comprising modifying the haptic signal by changing one or more of the following parameters: amplitude, frequency, magnitude, haptic pattern; and pattern form.
[0319] 32. A method as described in any one of paragraphs 18 to 31 comprising generating a visual indication of a potential hazard, the method comprising: [0320] in dependence on object data relating to an identified object representing a potential hazard, determining an angular position of the identified object relative to a vehicle; [0321] displaying a visual indicator at a display position corresponding to the determined relative angular position of the identified object; and [0322] progressively changing the display position of the visual indicator at least substantially to match changes in the relative angular position of the identified object.
[0323] 34. A method as described in any one of paragraphs 18 to 32 comprising generating an audible indication of a potential hazard, the method comprising: [0324] in dependence on object data relating to an identified object representing a potential hazard, determining an angular position of the object relative to a vehicle; [0325] generating an audible signal for providing an indication of the determined relative angular position of the object; and [0326] progressively modifying the generated audible signal to represent changes in the relative angular position of the identified object.