TRAFFIC SITUATION-DEPENDENT CONTROL OF A VIRTUAL REALITY DEVICE
20230174090 · 2023-06-08
Assignee
Inventors
CPC classification
H04N21/41422
ELECTRICITY
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
H04N21/44008
ELECTRICITY
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
H04N21/4131
ELECTRICITY
International classification
Abstract
A method of controlling a VR device, in particular VR goggles, of a user in a vehicle interior of a vehicle via a communication link includes receiving and evaluating vehicle data, or receiving already-evaluated vehicle data, via the communication link. At least one traffic event may be classified based on the evaluated vehicle data to generate control commands for controlling the VR device, by way of which the user of the VR device may be alerted to a response of the vehicle to the traffic event, which may be unpredictable to the user. The examples further relate to a vehicle arrangement.
Claims
1-8. (canceled)
9. A method of controlling a VR device of a user in a vehicle interior of a vehicle via a communication link, the method comprising: obtaining evaluated vehicle data via the communication link; classifying at least one traffic event on the basis of the evaluated vehicle data; and generating control commands to control the VR device to alert the user of the VR device to a response of the vehicle to the at least one traffic event which is unpredictable to the user.
10. The method as claimed in claim 9, wherein the control commands are generated to output a message in a display of the VR device, to activate a transparency function of the VR device, to output an acoustic or graphical notification, to pause or terminate an application of the VR device, and/or to suspend a noise canceling function.
12. The method as claimed in claim 9, wherein the user of the VR device is alerted to a response of the vehicle to the at least one traffic event in the form of an activation of an emergency braking assistant, an evasive maneuver of the vehicle, an arrival at or approach to a destination, a triggered fatigue detection, or an object detected by a vehicle sensor system.
13. The method as claimed in claim 9, further comprising generating control commands to control the VR device to signal a requirement of a vehicle passenger to communicate with the user of the VR device.
14. The method as claimed in claim 13, wherein the requirement of the vehicle passenger to communicate with the user of the VR device is initiated by an on-board input by the vehicle passenger.
15. The method as claimed in claim 9, wherein the control commands to control the VR device are generated by an on-board control unit and/or by a processor unit of the VR device.
16. The method as claimed in claim 15, wherein the obtained evaluated vehicle data are transmitted between the on-board control unit and the VR device via a communication link from among communication links including a Bluetooth link or a WLAN link.
17. A vehicle, comprising: an on-board control unit; and a VR device configured to carry out a process of controlling the VR device via a communication link with the on-board control unit of the vehicle, the process including: obtaining evaluated vehicle data via the communication link; classifying at least one traffic event on the basis of the evaluated vehicle data; and generating control commands to control the VR device to alert a user of the VR device to a response of the vehicle to the at least one traffic event which is unpredictable to the user.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] These and other aspects and advantages will become more apparent and more readily appreciated from the following description of examples, taken in conjunction with the accompanying drawings, where like reference numerals refer to like elements throughout:
[0028] A FIGURE shows a vehicle arrangement according to an example.
DETAILED DESCRIPTION
[0029] The FIGURE shows a vehicle arrangement 1 according to an example. The vehicle arrangement 1 comprises a vehicle 2 which can be an automatically operable or manually controlled vehicle. As an example, two vehicle passengers 6, 8 are located in a vehicle interior 4 of the vehicle 2.
[0030] A first vehicle passenger 6 is a driver of the vehicle 2. A second vehicle passenger 8 is wearing a virtual reality headset or VR device 10. The second vehicle passenger 8 is therefore a user 8 of the VR device 10. The first vehicle passenger 6 is a non-VR user.
[0031] Between an on-board control unit 12 and the VR device 10, a data-carrying communication link 14 is established, which is based on a Bluetooth transmission standard, for example. Vehicle data can be received and evaluated, or already-evaluated vehicle data can be received, via the communication link 14. The vehicle data can be obtained, for example, from a vehicle sensor system 16. In addition, route-guidance information can be taken into account in the form of vehicle data.
[0032] The vehicle sensor system 16 can comprise, for example, camera sensors, LIDAR sensors, radar sensors and the like.
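The reception of evaluated vehicle data via the communication link 14 can be sketched as follows. This is an illustrative sketch only: the message format, field names, and the function `parse_vehicle_data` are assumptions for illustration and are not specified in the disclosure.

```python
# Illustrative only: message format and names are assumptions, not from the disclosure.
import json
from dataclasses import dataclass

@dataclass
class EvaluatedVehicleData:
    """Evaluated vehicle data as it might arrive over the communication link 14."""
    source: str        # e.g. "camera", "lidar", "radar", "navigation"
    event_hint: str    # e.g. "obstacle_ahead", "destination_near"
    distance_m: float  # distance to the relevant object or destination

def parse_vehicle_data(raw: bytes) -> EvaluatedVehicleData:
    """Decode one JSON-encoded message received via a Bluetooth or WLAN link."""
    msg = json.loads(raw.decode("utf-8"))
    return EvaluatedVehicleData(
        source=msg["source"],
        event_hint=msg["event_hint"],
        distance_m=float(msg["distance_m"]),
    )
```

In such a sketch, the on-board control unit 12 or the processor unit 11 of the VR device 10 would consume these messages as input to the subsequent classification.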
[0033] Based on the evaluated vehicle data, the on-board control unit 12 or a processor unit 11 of the VR device 10 can be used to classify a traffic situation or a traffic event. Based on the traffic event, a response of the vehicle 2 to the traffic event can be estimated.
[0034] For example, critical driving situations can be classified as possible traffic events. In a critical driving situation, the vehicle 2 undershoots a safe distance, so that an evasive maneuver or an emergency braking maneuver is initiated. Such driving maneuvers can be initiated automatically by the on-board control unit 12 or an emergency braking assistance function, or by the driver 6. The corresponding driving situation can often be determined in advance using the vehicle sensor system 16.
[0035] After classification of the traffic event, control commands for controlling the VR device 10 are generated in order to alert the user 8 of the VR device 10 to a response of the vehicle 2 or the driver 6 to the traffic event that is unpredictable to the user 8, and to avoid startling the user 8. To do this, a warning message can be displayed in the VR display of the VR device 10 by the generated control commands.
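The classification of a critical driving situation and the subsequent generation of control commands described above can be sketched as follows. The threshold value, the event labels, and the command names are illustrative assumptions; the disclosure does not prescribe a particular classification rule or command set.

```python
# Illustrative sketch; threshold, event labels, and command names are assumptions.
def classify_traffic_event(distance_m: float, safe_distance_m: float = 30.0) -> str:
    """Classify a traffic event from evaluated vehicle data (cf. paragraphs [0033]-[0034])."""
    if distance_m < safe_distance_m:
        # Safe distance undershot: an emergency braking or evasive maneuver may follow.
        return "emergency_braking"
    return "none"

def generate_control_commands(event: str) -> list[str]:
    """Map a classified traffic event to VR-device control commands (cf. paragraph [0035])."""
    if event == "emergency_braking":
        return ["show_warning_message", "pause_application", "suspend_noise_canceling"]
    return []
```

A usage example: a distance reading of 10 m against a 30 m safe distance would classify as a critical event and trigger the warning-related commands.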
[0036] Depending on the configuration of the VR device 10, a See-Through function can be activated by the generated control commands. To position the VR device 10 in space, some VR devices 10 have camera systems which can be used to detect and evaluate the surroundings. A See-Through function displays the video stream of such a camera system instead of the VR display, giving the user 8 a view of the real surroundings despite wearing the VR device 10. The See-Through function can therefore be activated and deactivated according to the traffic situation in order to prepare the user 8 for possible responses of the vehicle 2.
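The traffic-situation-dependent switching of the See-Through function can be sketched as follows. The class `VRDevice` and the set of event labels are hypothetical stand-ins for illustration; real headsets expose such a function through vendor-specific APIs.

```python
# Illustrative sketch; class and event names are assumptions, not a real headset API.
class VRDevice:
    """Minimal stand-in for the VR device 10 with a See-Through function."""
    def __init__(self) -> None:
        self.see_through_active = False

    def set_see_through(self, active: bool) -> None:
        # Switches the display between the VR scene and the camera video stream.
        self.see_through_active = active

def update_see_through(device: VRDevice, event: str) -> None:
    """Activate See-Through for critical traffic events, deactivate it otherwise."""
    device.set_see_through(event in {"emergency_braking", "evasive_maneuver"})
```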
[0037] In another exemplary traffic event, the vehicle 2 can reach a destination. This can be registered based on vehicle data in the form of navigation data. When approaching the destination, a corresponding warning can be displayed in the VR device 10, so that the user 8 can prepare to end the VR usage.
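The destination-approach case above can be sketched in the same style. The distance threshold and the command string are illustrative assumptions; in practice the remaining distance would come from the navigation data mentioned in the paragraph.

```python
# Illustrative sketch; the threshold and command string are assumptions.
def check_destination_approach(remaining_km: float, threshold_km: float = 1.0) -> list[str]:
    """Generate a notification command when the vehicle nears the destination."""
    if remaining_km <= threshold_km:
        # Give the user time to prepare to end the VR session.
        return ["show_message:approaching_destination"]
    return []
```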
LIST OF REFERENCE SIGNS:
[0038] 1 vehicle arrangement
[0039] 2 vehicle
[0040] 4 vehicle interior
[0041] 6 first vehicle passenger/driver
[0042] 8 second vehicle passenger/user of the VR device
[0043] 10 VR device/virtual-reality headset
[0044] 11 processor unit of the VR device
[0045] 12 on-board control unit
[0046] 14 communication link
[0047] 16 vehicle sensor system
[0048] A description has been provided with particular reference to examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims, which may include the phrase “at least one of A, B and C” as an alternative expression that refers to one or more of A, B or C, contrary to the holding in Superguide v. DIRECTV, 358 F.3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).