System and Method for the Interactive Control of Vehicle Functions

20230339321 · 2023-10-26

    Abstract

    A system comprises a vehicle and a smart device. The smart device is configured to establish a communicative connection with the vehicle. The vehicle is configured to determine a trigger event in the vehicle comprising an evaluation of sensor data in the vehicle. Upon the trigger event being determined by the vehicle, the vehicle is configured to communicate a trigger data set to the smart device. The smart device is configured, after receiving the trigger data set, to interact with the vehicle according to the trigger data set. The interacting of the smart device with the vehicle can comprise controlling at least one vehicle function of the vehicle.

    Claims

    1. A system for interaction control of vehicle functions, comprising: a vehicle; and a smart device configured to establish a communicative connection with the vehicle; wherein the vehicle is configured to: determine a trigger event in the vehicle comprising an evaluation of sensor data in the vehicle; and communicate a trigger data set to the smart device upon the trigger event being determined; and wherein the smart device is configured to, after receiving the trigger data set, interact with the vehicle according to the trigger data set.

    2. The system according to claim 1, wherein the smart device is configured to interact with the vehicle by controlling at least one vehicle function of the vehicle.

    3. The system according to claim 1, wherein the vehicle comprises a vehicle communication unit and the smart device comprises a smart device communication unit, and wherein the communicative connection between vehicle and smart device comprises a Bluetooth Low Energy (BLE) connection.

    4. The system according to claim 1, wherein the vehicle is further configured to determine the trigger event comprising the evaluation of the sensor data in the vehicle further comprising determining a spatial relation between the smart device and the vehicle.

    5. The system according to claim 1, wherein the sensor data in the vehicle comprise at least one of the following: a current geographical position of the vehicle; data concerning a Point of Interest, POI, according to a current geographical position of the vehicle; data with respect to a geographical position that is important for the user of the vehicle; a current time of day or a current time stamp; a current position and/or a current movement vector of the smart device relative to the vehicle; and/or a current state of a vehicle function or a current vehicle sensor value.

    6. The system according to claim 1, wherein the sensor data in the vehicle comprises a current state of a vehicle function or a current vehicle sensor value, wherein the vehicle function or the current vehicle sensor value comprises data with respect to at least one of the following vehicle functions or vehicle sensor values: light switched on or off; doors right/left front/back or trunk or hood opened or closed; current tire pressure of tires front/back right/left; charging flap and/or gas cap opened/closed; requisite need for maintenance in the vehicle; and/or state of charge or fuel tank level of the vehicle.

    7. A method for interactive control of vehicle functions, the method comprising: establishing a communicative connection between a smart device and a vehicle; determining, by the vehicle, a trigger event in the vehicle comprising evaluating sensor data in the vehicle; communicating, in response to determining the trigger event, a trigger data set to the smart device; and interacting, by the smart device, with the vehicle according to the trigger data set.

    8. The method according to claim 7, wherein interacting by the smart device with the vehicle comprises controlling at least one vehicle function of the vehicle.

    9. The method according to claim 7, wherein establishing the communicative connection between the vehicle and the smart device comprises establishing a Bluetooth Low Energy (BLE) connection.

    10. The method according to claim 7, wherein evaluating the sensor data in the vehicle comprises determining a spatial relation between the smart device and the vehicle.

    11. The method according to claim 7, wherein evaluating the sensor data in the vehicle further comprises evaluating at least one of: a current geographical position of the vehicle; data concerning a Point of Interest, POI, according to a current geographical position of the vehicle; data with respect to a geographical position that is important for the user of the vehicle; a current time of day or a current time stamp; a current position and/or a current movement vector of the smart device relative to the vehicle; and/or a current state of a vehicle function or a current vehicle sensor value.

    12. The method according to claim 7, wherein evaluating the sensor data in the vehicle further comprises evaluating a current state of a vehicle function or a current vehicle sensor value, wherein the vehicle function or the current vehicle sensor value can comprise data with respect to at least one of the following vehicle functions or vehicle sensor values: light switched on or off; doors right/left front/back or trunk or hood opened or closed; current tire pressure of tires front/back right/left; charging flap and/or gas cap opened/closed; requisite need for maintenance in the vehicle; and/or state of charge or fuel tank level of the vehicle.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0084] FIG. 1 schematically shows a system for the interactive control of vehicle functions;

    [0085] FIG. 2 shows one exemplary method for the interactive control of vehicle functions.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0086] FIG. 1 schematically shows a system 100 for the interactive control of vehicle functions of a vehicle 110.

    [0087] The system 100 comprises a vehicle 110 and at least one smart device 120 A . . . 120 N. The smart device 120 A . . . 120 N can comprise a sensor unit 124 configured to acquire sensor data. In particular, the sensor data can comprise movement data of a bearer of the smart device 120 A . . . 120 N. For acquiring the movement data, the sensor unit 124 can in this case acquire sensor data from one or more of the following sensors:

    [0088] an acceleration sensor or accelerometer, which ascertains an acceleration by measuring an inertia force acting on a mass or test mass, with the result that it can determine the acceleration, an increase or decrease in speed and/or a direction of movement of the smart device (120 A . . . 120 N); and/or

    [0089] a position determining sensor or a position determining unit for acquiring or determining the geographical position or current position data with the aid of a navigation satellite system. The navigation satellite system can be any conventional and future global navigation satellite system (GNSS) for position determination and navigation by reception of signals from navigation satellites and/or pseudolites. This can involve for example the Global Positioning System (GPS), GLObal NAvigation Satellite System (GLONASS), Galileo positioning system, and/or BeiDou Navigation Satellite System. In the example of GPS, the position determining sensor or the position determining unit can comprise a GPS module configured to determine current GPS position data of the smart device 120 A . . . 120 N; and/or

    [0090] a gyro sensor, which is an acceleration or position sensor, configured to sense tiny accelerations, rotational movements and/or changes in position of a mass or test mass. Data of the gyro sensor can be combined with position data of a navigation module, wherein changes in direction, for example, can be ascertained very accurately through the combination of gyro sensor data and position data; and/or

    [0091] a magnetic field sensor configured to sense a current orientation or direction of movement of the smart device 120 A . . . 120 N; and/or

    [0092] a proximity sensor or approach sensor for activating or deactivating the display of the smart device 120 A . . . 120 N; and/or

    [0093] at least one further sensor configured to acquire movement data of the smart device 120 A . . . 120 N.

    [0094] The smart device 120 A . . . 120 N is configured to establish and/or set up a communicative connection with the vehicle 110.

    [0095] The vehicle 110 and the smart device 120 A . . . 120 N can each comprise a communication unit 116, 126. The vehicle 110 and the smart device 120 A . . . 120 N can be configured to set up a Bluetooth Low Energy (BLE) connection to one another by means of the respective communication unit 116, 126 in a manner known from the prior art.

    [0096] By way of example, the smart device 120 A . . . 120 N can already be configured, in a manner known from the prior art, as a digital key or a digital vehicle key for the vehicle 110. By means of BLE technology, the bearer of the smart device 120 A . . . 120 N can thus set up a communicative connection with the vehicle 110, or interact with the vehicle 110, simply by approaching the vehicle 110, without any initial input or initial connection between the vehicle 110 and the smart device 120 A . . . 120 N being required, while security of the communication between the vehicle 110 and the smart device 120 A . . . 120 N is simultaneously ensured by the digital key security requirements. The digital key security requirements, or security requirements of digital vehicle keys, are known from the prior art, for example in accordance with the standard of the Car Connectivity Consortium®.

    [0097] The vehicle 110 is configured to determine a trigger event with respect to the vehicle. In this case, determining the trigger event comprises the evaluation of sensor data in the vehicle 110. The acquiring and evaluating of sensor data in the vehicle 110 are effected in a manner known from the prior art. The vehicle 110 can comprise a computing unit 112 configured to evaluate sensor data in the vehicle and—on the basis of predefinable or predefined criteria—to determine a trigger event.

    [0098] The sensor data in the vehicle 110 which are evaluated for determining the trigger event in the vehicle can comprise:

    [0099] a current geographical position of the vehicle 110; and/or

    [0100] data concerning a Point of Interest, POI, according to a current geographical position of the vehicle 110; and/or

    [0101] data with respect to a geographical position that is important for the user of the vehicle 110, e.g. a home address, a work address, etc.; and/or

    [0102] a current time of day or a current time stamp; and/or

    [0103] a current position and/or a current movement vector of the smart device 120 A . . . 120 N relative to the vehicle 110; and/or

    [0104] smart device sensor data that were acquired by the sensor unit 124 of the smart device 120 A . . . 120 N and transmitted to the vehicle 110 via the communicative connection, wherein the smart device sensor data can be processed by the smart device 120 A . . . 120 N before being transmitted to the vehicle 110;

    [0105] a current state of a vehicle function or a current vehicle sensor value, wherein the vehicle function or the vehicle sensor value can comprise data with respect to the following vehicle functions or vehicle sensor values:

    [0106] light switched on or off; and/or

    [0107] doors right/left front/back or trunk or hood opened or closed; and/or

    [0108] current tire pressure of tires front/back right/left; and/or

    [0109] charging flap and/or gas cap opened/closed; and/or

    [0110] requisite need for maintenance in the vehicle; and/or

    [0111] state of charge or fuel tank level of the vehicle; and/or

    [0112] any further suitable current state of a vehicle function.
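    The trigger-event determination over the sensor data listed above can be sketched, purely illustratively, as follows; the data structures, field names and the example threshold of 2.0 bar are assumptions for the sketch and are not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class SensorSnapshot:
    gps: tuple            # current geographical position (lat, lon)
    poi: Optional[str]    # Point of Interest at the current position, if any
    timestamp: float      # current time stamp
    device_zone: str      # position of the smart device relative to the vehicle
    vehicle_state: dict   # e.g. {"tire_pressure_fl": 2.1, "light_on": False}

@dataclass
class TriggerEvent:
    name: str
    condition: Callable[[SensorSnapshot], bool]  # predefinable criterion

    def check(self, snapshot: SensorSnapshot) -> bool:
        return self.condition(snapshot)

# Example criterion: low tire pressure while the vehicle is at a gas station.
low_pressure_at_station = TriggerEvent(
    name="tire_pressure_check",
    condition=lambda s: s.poi == "gas_station"
    and any(v < 2.0 for k, v in s.vehicle_state.items()
            if k.startswith("tire_pressure")),
)

snap = SensorSnapshot(
    gps=(48.1, 11.6), poi="gas_station", timestamp=0.0,
    device_zone="driver_side",
    vehicle_state={"tire_pressure_fl": 1.7, "tire_pressure_rr": 2.3},
)
print(low_pressure_at_station.check(snap))  # True: trigger event determined
```

    The computing unit 112 would evaluate such predefinable criteria against each fresh sensor snapshot.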

    [0113] Preferably, the evaluating of sensor data in the vehicle for the purpose of determining the trigger event comprises determining a spatial relation between the smart device and the vehicle.

    [0114] The vehicle 110 or the computing unit 112 of the vehicle 110 can be configured to determine a spatial relation between the smart device 120 A . . . 120 N and the vehicle 110. In particular, the determined spatial relation between the smart device 120 A . . . 120 N and the vehicle 110 can be taken into account by the computing unit 112 of the vehicle 110 when determining the trigger event or can be a (partial) prerequisite for determining the trigger event.

    [0115] In order to determine the spatial relation between the smart device 120 A . . . 120 N and the vehicle 110, communication between the smart device 120 A . . . 120 N and the vehicle 110 can be effected using ultra-wideband (UWB) technology. This involves short-range radio communication that uses extremely large frequency ranges with a bandwidth of at least 500 MHz or of at least 20% of the arithmetic mean of the lower and upper limit frequencies of the frequency band used. Advantageously, a highly precise determination of the position of the smart device 120 A . . . 120 N with respect to the vehicle 110 can be achieved through the use of UWB technology. In this case, the data can be transmitted from the smart device 120 A . . . 120 N to the vehicle 110 locally via a suitable radio interface, or via the abovementioned communicative connection 130, e.g. Bluetooth Low Energy (BLE), between the smart device 120 A . . . 120 N and the vehicle 110. The spatial relation can result from the highly precise determination of the position of the smart device 120 A . . . 120 N, or of the bearer of the smart device 120 A . . . 120 N, with respect to the vehicle 110. For this purpose, the vehicle 110 can comprise UWB anchors 118 A . . . 118 D. Depending on the system design, the determination of the position of the smart device 120 A . . . 120 N with respect to the vehicle 110 can be implemented in zones. In the region of the exterior of the vehicle 110, for example, a rear zone, a front zone and side zones are conceivable. Furthermore, or as an alternative thereto, the determination of the spatial relation can also comprise the identification of at least one movement vector of the smart device 120 A . . . 120 N relative to the vehicle 110, whereby a movement of the user of the smart device 120 A . . . 120 N relative to the vehicle 110 can be determined. Furthermore, or as an alternative thereto, it is also possible to determine the spatial relation by means of a precise position of the smart device 120 A . . . 120 N with respect to the vehicle 110, for example 1 meter (m) in front of the driver's door. Furthermore, or as an alternative thereto, the spatial relation can be determined in the vehicle interior as a precise position in the vehicle interior. In another example, the vehicle interior can also be subdivided into zones, e.g. driver's seat, passenger seat, right back seat region, etc.
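    The zone-based position determination from UWB anchor ranges can be sketched, under assumed anchor positions, zone thresholds and a simple grid search (the patent does not prescribe a localization algorithm):

```python
import math

# Assumed anchor layout in a vehicle-fixed frame, meters
# (x: rear -> front, y: left -> right); cf. UWB anchors 118 A . . . 118 D.
ANCHORS = {
    "front_left": (2.0, -0.9), "front_right": (2.0, 0.9),
    "rear_left": (-2.0, -0.9), "rear_right": (-2.0, 0.9),
}

def locate(ranges):
    """Grid search minimizing squared range error over a 10 m x 10 m area."""
    best, best_err = None, float("inf")
    for xi in range(-20, 21):          # x from -5.0 to 5.0 m in 0.25 m steps
        for yi in range(-20, 21):
            x, y = xi * 0.25, yi * 0.25
            err = sum((math.hypot(x - ax, y - ay) - ranges[name]) ** 2
                      for name, (ax, ay) in ANCHORS.items())
            if err < best_err:
                best, best_err = (x, y), err
    return best

def zone(pos):
    """Map a position to an assumed exterior zone (front/rear/left/right)."""
    x, y = pos
    if x > 2.5:
        return "front"
    if x < -2.5:
        return "rear"
    return "left" if y < 0 else "right"

# Simulated bearer standing next to the driver's side (x=0.5, y=-2.0):
true_pos = (0.5, -2.0)
ranges = {n: math.hypot(true_pos[0] - ax, true_pos[1] - ay)
          for n, (ax, ay) in ANCHORS.items()}
print(zone(locate(ranges)))  # left
```

    A production system would use proper UWB two-way ranging and least-squares multilateration; the grid search merely illustrates how anchor ranges resolve to a zone.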

    [0116] Upon or as a consequence of the trigger event being determined, the vehicle 110 is configured to communicate a trigger data set to the smart device 120 A . . . 120 N—via the communicative connection 130. In this case, a predefined or predefinable trigger data set can be assigned to each trigger event.

    [0117] After receiving the trigger data set from the vehicle 110, the smart device 120 A . . . 120 N is configured to interact with the vehicle 110 according to the trigger data set. In this case, a predetermined or predeterminable interaction between the vehicle 110 and the smart device 120 A . . . 120 N can be effected in respect of each trigger data set.
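    The assignment of a predefined or predefinable trigger data set to each trigger event can be sketched as a simple vehicle-side mapping; the event names and payload layout are illustrative assumptions:

```python
# Each trigger event name maps to a builder for its assigned trigger data set.
TRIGGER_DATA_SETS = {
    "tire_pressure_check": lambda ctx: {
        "type": "tire_pressure_check",
        "tires": ctx["low_tires"],
    },
    "user_approaching": lambda ctx: {
        "type": "start_presentation",
        "profile": ctx["user_profile"],
    },
}

def communicate_trigger(event_name, ctx, send):
    """Build the data set assigned to the event and send it to the smart device."""
    payload = TRIGGER_DATA_SETS[event_name](ctx)
    send(payload)  # e.g. over the communicative connection 130 (BLE)
    return payload

sent = []
communicate_trigger("user_approaching", {"user_profile": "driver_A"}, sent.append)
print(sent[0]["type"])  # start_presentation
```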

    [0118] As a result of the trigger data set being communicated from the vehicle 110 to the smart device 120 A . . . 120 N via the communicative connection 130, the vehicle 110 can call up a suitable, predeterminable or predetermined application or app in the smart device 120 A . . . 120 N, which comprises a corresponding functionality. In another example, the functionality described below can already be integrated in the operating system of the smart device 120 A . . . 120 N and can be correspondingly called up or activated by the communication of the trigger data set from the vehicle 110 to the smart device 120 A . . . 120 N.

    [0119] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 after receiving the trigger data set from the vehicle comprises at least one instance of feedback from the smart device 120 A . . . 120 N to the vehicle 110 indicating that an action according to the trigger data set has taken place. Furthermore, in the context of this document, the interacting of the smart device 120 A . . . 120 N with the vehicle 110 after receiving the trigger data set can comprise:

    [0120] a reaction of the smart device 120 A . . . 120 N according to the processed content of the received trigger data set; and/or

    [0121] a control and/or interaction of the smart device 120 A . . . 120 N by functions and/or devices connected to the smart device 120 A . . . 120 N, wherein the devices can comprise for example smart home devices known to the smart device, as explained in greater detail further below; and/or

    [0122] an interaction of the smart device 120 A . . . 120 N with the bearer of the smart device 120 A . . . 120 N in a manner known from the prior art, wherein the interacting with the bearer of the smart device 120 A . . . 120 N comprises for example an output of the smart device 120 A . . . 120 N, e.g. notification to the user via user interfaces of the smart device 120 A . . . 120 N known from the prior art, for example a visual and/or acoustic and/or haptic etc. notification and/or an input by the bearer of the smart device 120 A . . . 120 N that is required for the further procedure, for example via touch input, voice input, etc.; and/or

    [0123] a communication of further predeterminable or predetermined data for further interaction with the vehicle 110 between smart device 120 A . . . 120 N and vehicle 110.
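    On the smart device side, the dispatch of a received trigger data set to an interaction application, including the mandatory feedback to the vehicle, can be sketched as follows; the handler registry and message fields are assumptions for the sketch:

```python
HANDLERS = {}

def interaction_app(trigger_type):
    """Register an interaction application for one trigger data set type."""
    def register(fn):
        HANDLERS[trigger_type] = fn
        return fn
    return register

@interaction_app("tire_pressure_check")
def handle_tire_pressure(payload, send_to_vehicle):
    # Notify the bearer, then feed back to the vehicle that the action took place.
    tires = ", ".join(payload["tires"])
    print(f"Notification: check tire pressure ({tires})")
    send_to_vehicle({"ack": "tire_pressure_check", "status": "notified"})

def on_trigger_data_set(payload, send_to_vehicle):
    """Called when a trigger data set arrives over the communicative connection."""
    handler = HANDLERS.get(payload["type"])
    if handler is None:
        send_to_vehicle({"ack": payload["type"], "status": "unsupported"})
        return
    handler(payload, send_to_vehicle)

feedback = []
on_trigger_data_set(
    {"type": "tire_pressure_check", "tires": ["front left", "back right"]},
    feedback.append,
)
print(feedback[0]["status"])  # notified
```

    Equivalently, the handler could live in the operating system of the smart device rather than in a dedicated app, as paragraph [0118] notes.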

    [0124] This makes it possible to realize an interaction between vehicle 110 and smart device 120 A . . . 120 N in a very flexible manner, without intervention by the user of the vehicle and/or smart device necessarily being required. In particular, a computing unit 122 of the smart device 120 A . . . 120 N can process the trigger data set. Interacting of the smart device 120 A . . . 120 N with the vehicle 110 as explained above can be carried out on the basis of the processed data of the trigger data set.

    [0125] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 can comprise controlling at least one vehicle function of the vehicle 110.

    [0126] For this purpose, the vehicle 110 can comprise a control unit 114 configured to control or regulate a predefinable or predefined vehicle function according to the trigger event and/or the interaction between smart device 120 A . . . 120 N and vehicle 110, as explained further below with reference to a plurality of exemplary embodiment variants.

    [0127] Advantageously, a particularly agile and flexible interaction between the smart device 120 A . . . 120 N, the vehicle 110 and, optionally, further devices connected to the smart device 120 A . . . 120 N and/or the bearer of the smart device 120 A . . . 120 N can thus be effected. The agility and flexibility in the control of the vehicle functions are thus increased, while security in the interaction between smart device 120 A . . . 120 N and vehicle 110, and thus in the control of the vehicle functions, is ensured at the same time through the use of digital key standards.

    Exemplary embodiments are explained below.

    Example 1: Interaction and Control of Vehicle Functions—Tire Pressure

    [0128] A vehicle 110 comprises suitable sensors from the prior art which acquire sensor values with regard to the tire pressure of each of the tires of the vehicle. A smart device 120 A . . . 120 N is assigned to the user of the vehicle 110 and is coupled to the vehicle 110 by means of BLE.

    [0129] A trigger event is predefined or predefinable as follows:

    [0130] at least one tire pressure sensor determines at least one tire pressure that is not suitable for at least one tire of the vehicle 110; and

    [0131] a current GPS position in combination with POI data reveals that the vehicle 110 is situated at a gas station or service station or a suitable location for checking a tire pressure; and

    [0132] the vehicle 110 determines by way of BLE that the user or driver of the vehicle 110, who is the bearer of the smart device 120 A . . . 120 N, is moving out of the vehicle 110.

    [0133] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (tire pressure sensor, position sensor, POI map data) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0134] data with respect to the position of the tire(s) comprising a tire pressure that is not suitable for the tire(s) of the vehicle 110;

    [0135] data with respect to a target tire pressure for the tire(s) determined above.
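    The construction of the Example 1 trigger data set (affected tire positions plus target pressures) can be sketched as follows; the pressure values in bar, the target table and the tolerance are illustrative assumptions:

```python
# Assumed target pressures per tire position, in bar.
TARGET_PRESSURE = {"front_left": 2.4, "front_right": 2.4,
                   "back_left": 2.6, "back_right": 2.6}
TOLERANCE = 0.15  # bar; deviation beyond this counts as "not suitable"

def tire_trigger_data_set(measured):
    """Return the trigger data set: unsuitable tire positions with their targets."""
    affected = {
        tire: TARGET_PRESSURE[tire]
        for tire, actual in measured.items()
        if abs(actual - TARGET_PRESSURE[tire]) > TOLERANCE
    }
    return {"type": "tire_pressure_check", "targets": affected} if affected else None

measured = {"front_left": 1.9, "front_right": 2.35,
            "back_left": 2.6, "back_right": 2.2}
data_set = tire_trigger_data_set(measured)
print(sorted(data_set["targets"]))  # ['back_right', 'front_left']
```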

    [0136] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0137] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the tire pressure can be checked or adjusted at the current geographical location (gas station, service station, etc.).

    [0138] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) that the user will check the tire pressure;

    [0139] the smart device 120 A . . . 120 N indicates the tire(s) for which the tire pressure needs to be checked (e.g. front left and back right);

    [0140] the vehicle 110 determines—for example by means of UWB—the location at the vehicle 110 or the tire of the vehicle 110 at which the user of the vehicle 110 or the bearer of the smart device 120 A . . . 120 N is situated (spatial relation), or determines, on the basis of a movement vector of the user of the smart device 120 A . . . 120 N, the tire of the vehicle 110 to which the user is expected to move, and sends a message comprising relevant data (actual tire pressure and target tire pressure of the tire) to the smart device 120 A . . . 120 N, for example by means of BLE;

    [0141] for this tire, the smart device 120 A . . . 120 N outputs the actual tire pressure determined by the tire pressure sensor and also the target tire pressure to the user of the vehicle 110;

    [0142] the user of the vehicle 110 confirms by way of input via the smart device 120 A . . . 120 N that the user has checked and possibly correctly adjusted the tire pressure.

    [0143] In another example, the tire inflater or the tire air pressure device can be a smart device. In this example, the smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0144] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the tire pressure can be checked or adjusted at the current geographical location by means of the smart device tire air pressure device (availability of smart device tire air pressure device at gas station, service station, etc.).

    [0145] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) that the user will check the tire pressure;

    [0146] the smart device 120 A . . . 120 N indicates the tire(s) for which the tire pressure needs to be checked (e.g. front left and back right);

    [0147] the vehicle 110 determines—for example by means of UWB—the location at the vehicle 110 or the tire of the vehicle 110 at which the user of the vehicle 110 or the bearer of the smart device 120 A . . . 120 N is situated (spatial relation), or determines, on the basis of a movement vector of the user of the smart device 120 A . . . 120 N, the tire of the vehicle 110 to which the user is expected to move, and sends a message comprising relevant data (actual tire pressure and target tire pressure of the tire) to the smart device 120 A . . . 120 N, and also the target tire pressure of the tire directly or indirectly by means of the smart device 120 A . . . 120 N to the smart device tire air pressure device for example by means of BLE, WiFi or Thread;

    [0148] the smart device tire air pressure device sets the received target tire pressure at the tire and reports back the status;

    [0149] the user of the vehicle 110 confirms—optionally—by way of input via the smart device 120 A . . . 120 N that the user has checked the tire pressure set by the smart device tire air pressure device.

    [0150] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (checking the tire pressure) of the vehicle 110:

    [0151] the smart device 120 A . . . 120 N communicates a corresponding control message—for example via BLE—to the vehicle 110;

    [0152] the vehicle 110—after receiving the control message—controls the tire pressure sensor in such a way that the latter checks the tire pressure newly set by the user of the vehicle 110, and sends a checking message (actual tire pressure now corresponds to target tire pressure, or actual tire pressure still does not correspond to the target tire pressure) to the smart device 120 A . . . 120 N;

    [0153] the smart device 120 A . . . 120 N outputs (acoustically and/or visually) the checking message to the user of the vehicle 110.
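    The check-and-repeat loop of paragraphs [0150] to [0153] can be sketched as follows; the simulated sensor readings, tolerance and message strings are illustrative assumptions:

```python
def check_tire(read_pressure, target, tolerance=0.1):
    """Vehicle-side check triggered by the smart device's control message."""
    actual = read_pressure()
    if abs(actual - target) <= tolerance:
        return "actual tire pressure now corresponds to target tire pressure"
    return "actual tire pressure still does not correspond to the target tire pressure"

# Simulated user: the first adjustment overshoots, the second one succeeds.
readings = iter([2.75, 2.42])
target = 2.4

messages = []
while True:
    msg = check_tire(lambda: next(readings), target)
    messages.append(msg)          # checking message: vehicle -> smart device
    if "now corresponds" in msg:  # smart device outputs it; repeat while off
        break

print(len(messages))  # 2
```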

    [0154] If the user of the vehicle 110 and of the smart device 120 A . . . 120 N is situated at the tire, the user can also start a tire check via the smart device 120 A . . . 120 N. In this case, the smart device 120 A . . . 120 N can record an image of the tire by means of an integrated camera. By means of known image processing algorithms and suitable machine learning algorithms, it is then possible, on the basis of textual identifications, QR codes, etc. on the tire, to determine whether the recorded tire is approved for the vehicle 110, corresponds to the safety specifications, is mounted correctly with regard to the direction of travel, etc. This determination can take place at the smart device 120 A . . . 120 N, at the vehicle 110 after communication of the recorded image, or at a backend.

    [0155] The abovementioned steps can be repeated for possible further tires and/or a renewed difference between actual tire pressure and target tire pressure of the tire.

    Example 2: Interactive Setting of Vehicle Functions

    [0156] A user of the vehicle 110 carries a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled to the latter (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110).

    [0157] Predefined or predefinable trigger event:

    [0158] identifying a preset presentation or greeting of the user of the vehicle 110 upon the user approaching the vehicle 110 while carrying the smart device 120 A . . . 120 N (presetting of a vehicle function in combination with the user profile of the smart device 120 A . . . 120 N). Starting from when the smart device 120 A . . . 120 N is within a specific distance of the vehicle 110, a presentation can comprise: switching on the low-beam light and/or switching on interior lighting of the vehicle 110 and/or outputting a sound presentation via vehicle loudspeakers and/or folding out the exterior mirrors of the vehicle 110, etc.

    [0159] Recognizing that the vehicle 110 is being approached by the user carrying the smart device 120 A . . . 120 N (e.g. by means of UWB, spatial relation).
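    The distance-gated presentation described in the trigger event above can be sketched as follows; the action names and the 5 m threshold are illustrative assumptions:

```python
# Assumed greeting actions from the user profile, cf. paragraph [0158].
PRESENTATION_ACTIONS = ["low_beam_on", "interior_light_on",
                        "sound_greeting", "fold_out_mirrors"]
GREETING_DISTANCE_M = 5.0  # assumed "specific distance" from the vehicle

def presentation_actions(distance_m, profile_enabled=True):
    """Return the actions to trigger for the current smart-device distance."""
    if profile_enabled and distance_m <= GREETING_DISTANCE_M:
        return PRESENTATION_ACTIONS
    return []

print(presentation_actions(4.2))   # all four greeting actions
print(presentation_actions(12.0))  # [] - user still too far away
```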

    [0160] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (user profile data, UWB sensor system) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0161] data with respect to starting the presentation according to the user profile.

    [0162] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0163] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the presentation will be started according to the user profile and/or that the presentation will be started the next time the user of the vehicle 110 is predicted to depart (for example on the basis of a learning user profile).

    [0164] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) that the user does not want a presentation according to the user profile for a next approach to the vehicle 110 and/or for a predetermined or predeterminable period of time;

    [0165] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (brief deactivation of the presentation according to the user profile) of the vehicle 110:

    [0166] the smart device 120 A . . . 120 N communicates a corresponding control message—for example via BLE—to the vehicle 110;

    [0167] the vehicle 110—after receiving the control message—controls the vehicle functions according to the presentation in such a way that the vehicle 110 deactivates the presentation for the next approach to the vehicle 110 by the user and/or for the predefined or predefinable period of time, and communicates a success message (presentation is deactivated for the next approach to the vehicle 110 or for the predefined or predefinable period of time) to the smart device 120 A . . . 120 N;

    [0168] the smart device 120 A . . . 120 N outputs (acoustically and/or visually) the success message to the user of the vehicle 110.

    [0169] Furthermore, by means of suitable machine learning algorithms, for example with the aid of models created by machine learning methods—e.g. by means of supervised learning or unsupervised learning—the vehicle 110 can learn at what times and/or at what geographical positions etc. the presentation is not necessary or has been deactivated by the user of the vehicle 110, and can automatically adopt this setting for the future.
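    In place of a full machine-learned model, the learning of contexts in which the presentation is unnecessary can be sketched with a simple frequency rule; the context keys (hour of day, coarse position) and the threshold are assumptions for the sketch:

```python
from collections import Counter

deactivations = Counter()  # counts user deactivations per context
SUPPRESS_AFTER = 3         # assumed threshold for automatic suppression

def context(hour, position):
    # Context in which the deactivation was observed: time of day + location.
    return (hour, position)

def record_deactivation(hour, position):
    deactivations[context(hour, position)] += 1

def presentation_enabled(hour, position):
    """Suppress the greeting once the user has repeatedly deactivated it here."""
    return deactivations[context(hour, position)] < SUPPRESS_AFTER

# User deactivates the greeting three evenings in a row in the home driveway:
for _ in range(3):
    record_deactivation(22, "home_driveway")

print(presentation_enabled(22, "home_driveway"))  # False: learned suppression
print(presentation_enabled(9, "office_garage"))   # True
```

    A supervised or unsupervised model, as the paragraph above suggests, would generalize across similar times and positions instead of matching exact contexts.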

    [0170] A deactivation of the presentation may be desired by the user of the vehicle 110, for example, if:

    [0171] the vehicle 110 is standing in a driveway of a house and the user—on account of working on the house and/or in the garden—frequently passes by the vehicle 110 without wanting to use the latter;

    [0172] the presentation is intended not to occur on account of a relatively late time of day (light/noise nuisance);

    [0173] the presentation is intended not to occur on account of an activity at a specific geographical position (e.g. observation of wildlife in a forest);

    [0174] etc.

    Example 3: Personalization of Vehicle Functions

    [0175] At least two users of the vehicle 110 are each carrying a smart device 120 A . . . 120 N; the smart devices 120 A . . . 120 N are known to the vehicle 110 and are coupled thereto (e.g. both smart devices 120 A . . . 120 N as digital keys of the vehicle 110).

    [0176] Predefined or predefinable trigger event:

    [0177] the at least two users approach the vehicle 110 in each case with the smart device 120 A . . . 120 N (digital key technology).

    [0178] Each user or smart device 120 A . . . 120 N is assigned a user profile, wherein the user profile comprises vehicle settings such as e.g. seat setting, mirror setting, etc., which are automatically set by the vehicle 110.

    [0179] The vehicle 110 cannot unambiguously recognize which user takes a seat at which position in the vehicle 110; even by localizing the smart devices 120 A . . . 120 N relative to the vehicle 110 or evaluating the movement trajectories of the approach to the vehicle 110, it cannot determine which user or which associated smart device 120 A . . . 120 N is moving to which position in the vehicle 110.

    [0180] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (interior sensor system, position sensor, POI map data) and communicates a corresponding trigger data set to the smart devices 120 A . . . 120 N of the at least two users. The trigger data set can comprise:

    [0181] data with respect to the recognition of at least two smart devices 120 A . . . 120 N.

    [0182] Each smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0183] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the presence of at least two users has been recognized.

    [0184] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) the position in the vehicle 110 at which the user takes a seat (driver, passenger, back seat right, back seat middle, back seat left, etc.).

    [0185] The interacting of the smart devices 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (vehicle settings according to the user profile of the respective smart device 120 A . . . 120 N) of the vehicle 110:

    [0186] each smart device 120 A . . . 120 N communicates a corresponding control message—for example via BLE—to the vehicle 110;

    [0187] the vehicle 110—after receiving the control message—controls the vehicle functions in such a way that the latter are set according to the position in the vehicle 110, and communicates a success message to the respective smart device 120 A . . . 120 N;

    [0188] the smart device 120 A . . . 120 N outputs (acoustically and/or visually) the success message to its respective bearer.
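
The personalization flow of paragraphs [0178] to [0187] can be illustrated by mapping each seat position indicated by a user to the profile coupled to that user's smart device. A minimal sketch, assuming invented device identifiers and profile fields:

```python
# Hypothetical profiles coupled to each known smart device (digital key);
# the field names "seat" and "mirror" stand in for the vehicle settings.
USER_PROFILES = {
    "device_A": {"seat": "low", "mirror": "wide"},
    "device_B": {"seat": "high", "mirror": "narrow"},
}

def apply_profiles(position_inputs):
    """Map each position indicated via the smart device input to the settings
    of the profile assigned to that device."""
    settings_by_position = {}
    for device_id, position in position_inputs.items():
        profile = USER_PROFILES.get(device_id)
        if profile is not None:
            settings_by_position[position] = profile
    return settings_by_position

# Each user indicates a seat position (e.g. by voice or touch input).
settings = apply_profiles({"device_A": "driver", "device_B": "passenger"})
```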

    Example 4: Interactive Control of Anti-Theft Warning Systems in Conjunction with Presence Recognition of Person and/or Animal in the Vehicle 110

    [0189] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110).

    [0190] Predefined or predefinable trigger event:

    [0191] a current GPS position in combination with POI data reveals that the vehicle 110 is situated at a gas station or service station.

    [0192] The user of the vehicle 110 leaves the vehicle 110 (determination for example by way of UWB).

    [0193] A person, e.g. the passenger, or an animal remains in the vehicle 110 (determination for example by way of interior camera and/or seat occupancy mat with a corresponding sensor system, etc.).

    [0194] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (interior sensor system, position sensor, POI map data) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0195] data with respect to the recognition of a remaining person and/or a remaining animal in the vehicle;

    [0196] data with respect to the vehicle function of the anti-theft warning system, which is activated by the vehicle 110 being left by the vehicle user (and thus attendant locking of the vehicle 110).

    [0197] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0198] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that there is still a person or an animal in the vehicle 110 and/or the anti-theft warning system is activated as a result of the vehicle 110 being left and there is thus the risk of the anti-theft warning system being activated by movement in the vehicle interior.

    [0199] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) that the anti-theft warning system is intended to be deactivated for a predetermined or predeterminable period of time (e.g. for 5 minutes, for 10 minutes, until the return of the user and thus until the next time the vehicle 110 is unlocked), wherein the predetermined or predeterminable period of time can be fixedly predefined or can be selected by the user of the smart device 120 A . . . 120 N or of the vehicle 110 by way of input;

    [0200] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (brief deactivation of the anti-theft warning system) of the vehicle 110:

    [0201] the smart device 120 A . . . 120 N communicates a corresponding control message—for example via BLE—to the vehicle 110;

    [0202] the vehicle 110—after receiving the control message—controls the anti-theft warning system in such a way that the vehicle 110 deactivates the anti-theft warning system for the predefined or predefinable period of time and communicates a success message (anti-theft warning system is deactivated for the predefined or predefinable period of time) to the smart device 120 A . . . 120 N;

    [0203] the smart device 120 A . . . 120 N outputs (acoustically and/or visually) the success message to the user of the vehicle 110.
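
The timed deactivation in paragraphs [0199] to [0202] amounts to suppressing the alarm until a deadline. A minimal sketch, assuming an invented class name and a monotonic clock in seconds; the clock can be injected for testing:

```python
import time

class AntiTheftSystem:
    """Hypothetical stub: the alarm is armed unless a suppression deadline is active."""
    def __init__(self):
        self._suppressed_until = 0.0

    def deactivate_for(self, seconds, now=None):
        # Suppress the alarm for the predefined or predefinable period.
        now = time.monotonic() if now is None else now
        self._suppressed_until = now + seconds
        return f"anti-theft warning system deactivated for {seconds:.0f} s"

    def is_armed(self, now=None):
        now = time.monotonic() if now is None else now
        return now >= self._suppressed_until

atws = AntiTheftSystem()
msg = atws.deactivate_for(300, now=0.0)   # e.g. 5 minutes, as in paragraph [0199]
```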

    Example 5: Linking Smart Home Functions to Vehicle 110

    [0204] There currently exist IP-based connectivity standards (e.g. Matter) for the automation of home functions (smart home devices). All smart home devices which support this standard are thereby enabled to communicate. In particular, the smart device 120 A . . . 120 N can thus control smart home devices by means of a smart device application by way of the IP-based connectivity standards.

    [0205] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110). A smart device application is loaded on the smart device 120 A . . . 120 N and can be executed on the latter, which application can control smart home devices of an associated smart home by way of an IP-based connectivity standard. A list of the smart home devices which are controlled by the smart device 120 A . . . 120 N is stored on the smart device 120 A . . . 120 N and is synchronized with the vehicle. The synchronization can be effected by way of a backend, for example. In this case, a corresponding device identification with parameters such as e.g. device type, device designation, GPS position of the device (e.g. garage with smart home garage opener) is stored for each smart home device.
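
The synchronized device list described in paragraph [0205] can be modeled as a list of records, each carrying the device identification and its parameters. A sketch with invented identifiers and illustrative coordinates:

```python
# Hypothetical entries of the synchronized smart home device list; device_id,
# designations, and GPS positions are illustrative values only.
smart_home_devices = [
    {
        "device_id": "gd-0001",
        "device_type": "garage_door",
        "designation": "Garage",
        "gps_position": (48.137, 11.575),
    },
    {
        "device_id": "li-0002",
        "device_type": "light",
        "designation": "Driveway lighting",
        "gps_position": (48.137, 11.576),
    },
]

def devices_near(devices, position, max_deg=0.01):
    """Very rough proximity filter on raw latitude/longitude differences
    (sketch only; a real system would use proper geodesic distance)."""
    lat, lon = position
    return [d for d in devices
            if abs(d["gps_position"][0] - lat) <= max_deg
            and abs(d["gps_position"][1] - lon) <= max_deg]

nearby = devices_near(smart_home_devices, (48.137, 11.575))
```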

    [0206] Predefined or predefinable trigger event:

    [0207] a current GPS position and/or a journey route input in the vehicle 110 reveals that the vehicle 110 is approaching a position at which a smart home device which can be controlled by the smart device 120 A . . . 120 N is situated. The smart home device is known to the vehicle 110. This can arise for example from the abovementioned synchronization of the list with the vehicle 110.

    [0208] The vehicle 110 recognizes that the list of the smart home devices at the current GPS position includes a garage with a smart home garage door.

    [0209] The vehicle 110 signals and/or displays to the user, for example via a suitable output unit, the smart home devices available at the position determined.

    [0210] The user of the vehicle 110 and of the smart device 120 A . . . 120 N can input by means of an input (e.g. voice input and/or touch input) in the vehicle 110 the intended operational control of the available smart home device(s) or the intended querying of a status. By way of example, the user of the vehicle sees that the garage with the smart home garage door is closed (status query), and can input the intended opening thereof (operational control). The vehicle communicates the user's input (status query and/or operational control) to the smart device 120 A . . . 120 N. The smart device 120 A . . . 120 N controls the smart home garage door according to the communicated input from the user.

    [0211] The vehicle 110 communicates a corresponding trigger data set calculated from the communicated input from the user to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0212] data with respect to the recognition of an approach to the garage with the smart home garage door;

    [0213] smart home device type, designation and identification number, which can be uniquely assigned to the smart home device list present in the smart device;

    [0214] a control command, e.g. garage opened/closed, luminaire brightness 0-100%, etc.

    [0215] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0216] the trigger data set triggers an interaction between smart device 120 A . . . 120 N and vehicle 110;

    [0217] the smart device 120 A . . . 120 N controls the smart home garage door according to the control message in an IP-based manner (e.g. Matter) by initiating the opening of the garage door. The interaction can comprise a feedback message from the smart device 120 A . . . 120 N to the vehicle 110 regarding successful performance of the action, for example that the garage is open. The feedback message can be output in the vehicle 110 to the user of the vehicle 110 via a suitable output unit.

    [0218] Advantageously, it is thus possible to realize—in an analogous manner—any desired smart home functions as extended vehicle functions by way of the smart device 120 A . . . 120 N.

    [0219] In particular, it is thus possible to carry out—analogously to the procedure above—the control of a multiplicity of smart home devices, for example the control of smart home lighting in the garage or in the driveway of the house of the user of the vehicle 110. Furthermore, on account of the bidirectional communication between smart device 120 A . . . 120 N and vehicle 110 (as an example of the interaction between smart device 120 A . . . 120 N and vehicle 110), it is possible to indicate statuses of smart home devices in the vehicle 110, e.g. “garage is open”.

    Example 6: Control of Arbitrary Vehicle Functions or Vehicle Settings in the Vehicle 110

    [0220] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N which couples to the vehicle 110. The smart device 120 A . . . 120 N comprises a digital key for the use of the vehicle 110. The vehicle 110 recognizes that the user is an occasional user of the vehicle 110 (e.g. car sharing use, taxi journey, etc.).

    [0221] Predefined or predefinable trigger event:

    [0222] the vehicle 110, with the aid of the digital key, recognizes that the user of the smart device 120 A . . . 120 N is an occasional user or one-off user of the vehicle 110 and where this user is situated in the vehicle 110.

    [0223] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (interior sensor system, UWB sensor system, digital key standard) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0224] data with respect to possibilities of vehicle settings in the vehicle 110 or at the position in the vehicle 110 (spatial relation). These can comprise for example setting possibilities at the seat, travel data or navigation data etc.

    [0225] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0226] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with advice about possible vehicle settings.

    [0227] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) the intention to carry out specific vehicle settings.

    [0228] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (setting the vehicle functions) of the vehicle 110:

    [0229] the smart device 120 A . . . 120 N communicates a corresponding control message—for example via BLE—to the vehicle 110;

    [0230] the vehicle 110—after receiving the control message—controls the vehicle settings according to the control message and communicates a success message (vehicle settings have been implemented) to the smart device 120 A . . . 120 N;

    [0231] the smart device 120 A . . . 120 N outputs (acoustically and/or visually) the success message to the user of the vehicle 110.

    Example 7: Transfer of Telephony and/or Entertainment Functions

    [0232] At least one user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110).

    [0233] Predefined or predefinable trigger event:

    [0234] the vehicle 110 recognizes that the user of the vehicle 110 and user of the smart device 120 A . . . 120 N is getting into the vehicle 110 or leaving the vehicle 110 (e.g. UWB sensor system, vehicle interior sensors).

    [0235] The vehicle 110 recognizes an ongoing telephony and/or entertainment function on the smart device 120 A . . . 120 N.

    [0236] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (interior sensor system, UWB sensor system) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0237] data with respect to the recognition of the user getting into or out of the vehicle 110.

    [0238] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0239] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N.

    [0240] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (e.g. transferring the audio stream of a telephony function implemented via the smart device 120 A . . . 120 N from the vehicle 110 to the smart device 120 A . . . 120 N, and/or transferring the telephony function (e.g. in the case of Voice over IP telephony) and/or entertainment function of the smart device 120 A . . . 120 N into the vehicle 110 or from the vehicle 110 to the smart device 120 A . . . 120 N) of the vehicle 110:

    [0241] the vehicle 110 recognizes that the user is getting out of the vehicle 110. After the actuation of the door contact of the vehicle 110 has been recognized, the ongoing telephony function (Voice over IP) or the ongoing audio stream of a telephone call conducted via the smart device 120 A . . . 120 N is transferred from the vehicle 110 to the smart device 120 A . . . 120 N in a manner known from the prior art; or

    [0242] the vehicle 110 recognizes that the user is getting into the vehicle 110. After the actuation of the door contact has been recognized, an ongoing entertainment function (e.g. video streaming) is transferred from the smart device 120 A . . . 120 N to the corresponding output unit at the position in the vehicle 110 at which the user takes a seat (spatial relation), in a manner known from the prior art, and continues to be played. In the case of video streaming, the video can continue to run on the smart device 120 A . . . 120 N while the audio and video output is effected via suitable output units in the vehicle 110. As an alternative thereto, the smart device 120 A . . . 120 N can communicate to the vehicle 110—if the latter has an integrated SIM card—a URL and a current time or time stamp of the video, such that the video is streamed via the vehicle 110.
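
The handoff logic of paragraphs [0241] and [0242] can be sketched as a small state transition on a session record: a door-contact event re-homes the ongoing session while preserving its playback position. All names and the event strings are hypothetical:

```python
# Hypothetical sketch: a session is "located" either in the vehicle or on the
# smart device; door-contact events move it between the two.
def transfer_session(session, event):
    """Return the session re-homed according to the recognized door-contact event."""
    if event == "user_exits" and session["location"] == "vehicle":
        session = dict(session, location="smart_device")
    elif event == "user_enters" and session["location"] == "smart_device":
        session = dict(session, location="vehicle")
    return session

# Ongoing VoIP call moves to the smart device when the user gets out ...
call = {"kind": "voip_call", "location": "vehicle", "position_s": 0}
call = transfer_session(call, "user_exits")

# ... and an ongoing video stream moves into the vehicle when the user gets in,
# keeping the current time stamp of the video.
stream = {"kind": "video_stream", "location": "smart_device", "position_s": 754}
stream = transfer_session(stream, "user_enters")
```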

    Example 8: Warning about Forgotten Objects in the Vehicle 110

    [0243] It is known from the prior art to attach a locating device or tracker to an object, which makes it possible to find the object by way of radio technology, e.g. UWB or BLE, for example Apple AirTag.

    [0244] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110). A locating device is attached at least to one object, for example an umbrella, and is known to the vehicle 110 and/or smart device 120 A . . . 120 N.

    [0245] Predefined or predefinable trigger event:

    [0246] the vehicle 110 recognizes that the user of the vehicle 110 is leaving the vehicle 110.

    [0247] the vehicle 110 recognizes that the umbrella has been left behind in the vehicle 110.

    [0248] the vehicle 110 recognizes that it will rain. Alternatively, by means of corresponding machine learning algorithms, the vehicle 110 can recognize that the user of the vehicle 110 always takes the umbrella with them.

    [0249] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (BLE sensor system, UWB sensor system, integrated weather application in the vehicle) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0250] data with respect to the object left behind in the vehicle 110;

    [0251] data with respect to the weather situation (it is supposed to rain) or data with respect to past habits of the user (the umbrella is always taken along).

    [0252] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0253] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the umbrella has been left behind in the vehicle and it is supposed to rain or the umbrella is always taken along.

    [0254] The example mentioned above can be extended or supplemented in any desired way—for example with the aid of suitable machine learning algorithms. By way of example, the vehicle 110 and/or the smart device 120 A . . . 120 N can “learn” that the user of the vehicle 110 does not want a notification upon leaving the umbrella behind in the vehicle 110—despite predicted rainy weather—if the vehicle is parked in its own garage, since in this case of use the user of the vehicle 110 will always leave the umbrella behind in the vehicle. In addition or as an alternative thereto, the smart device 120 A . . . 120 N can also warn against forgetting the umbrella if the latter is not situated in the vehicle, e.g. if it has been forgotten after a visit to a restaurant.
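
The trigger condition of paragraphs [0246] to [0248], together with the learned exception from paragraph [0254], reduces to a small predicate: warn only if a tagged object stays behind while the user leaves, rain is forecast or the user habitually takes the object along, and no learned exception (such as parking in the user's own garage) applies. A sketch with invented parameter names:

```python
# Hypothetical predicate combining the trigger sub-conditions of Example 8.
def should_warn(object_in_vehicle, user_leaving, rain_forecast,
                habitually_taken, parked_in_own_garage=False):
    if not (object_in_vehicle and user_leaving):
        return False
    if parked_in_own_garage:
        # Learned exception: the user always leaves the umbrella in the vehicle here.
        return False
    return rain_forecast or habitually_taken

warn = should_warn(object_in_vehicle=True, user_leaving=True,
                   rain_forecast=True, habitually_taken=False)
```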

    Example 9: Reminder to Carry Out Vehicle-Related Actions

    [0255] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110).

    [0256] Predefined or predefinable trigger event:

    [0257] the vehicle 110 recognizes that the user of the vehicle 110 is leaving the vehicle 110.

    [0258] the vehicle 110 recognizes that a vehicle-related action must be carried out;

    [0259] the vehicle 110 determines from a geographical position in combination with POI map data that the vehicle-related action can be carried out at the current geographical position.

    [0260] The vehicle 110 is configured to determine the abovementioned trigger event by means of the evaluation of the corresponding sensor data (BLE sensor system, UWB sensor system, integrated vehicle sensors, GPS sensor) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0261] data with respect to the vehicle-related action to be carried out, e.g. windshield washer fluid reservoir in the vehicle empty, state of charge of an electrical energy store or fuel tank level low, optionally in regard to the distance of the next predicted journey, etc.

    [0262] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. In particular, the interaction can comprise the following steps:

    [0263] the trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the vehicle-related action determined can be carried out.

    Example 10: Activation of Social Network Applications

    [0264] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110).

    [0265] Predefined or predefinable trigger event:

    [0266] the vehicle 110 recognizes by means of suitable machine learning algorithms and/or on the basis of database entries, for example, that the vehicle 110 is situated at a popular geographical position or a special geographical position.

    [0267] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (BLE sensor system, UWB sensor system, integrated vehicle sensors, GPS sensor) and communicates a corresponding trigger data set to the smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0268] data with respect to the popular or special geographical position.

    [0269] The smart device 120 A . . . 120 N receives the trigger data set and is configured to interact with the vehicle 110 according to the trigger data set. The trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with the advice that the vehicle is situated at a popular or special geographical position.

    [0270] The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) that the user wants to record the popular or special geographical position or that a recorded data set is already available in the temporary memory.

    [0271] The interacting of the smart device 120 A . . . 120 N with the vehicle 110 subsequently comprises controlling a vehicle function (recording the popular or special geographical position) of the vehicle 110:

    [0272] the smart device 120 A . . . 120 N communicates a corresponding control message—for example via BLE—to the vehicle 110;

    [0273] the vehicle 110—after receiving the control message—controls exterior cameras of the vehicle 110 in such a way that the popular or special geographical position is photographed or recorded. The vehicle 110 communicates a success message (comprising the photographed or recorded geographical position) to the smart device 120 A . . . 120 N;

    [0274] the smart device 120 A . . . 120 N outputs (acoustically and/or visually) the success message to the user of the vehicle 110, such that the user of the smart device 120 A . . . 120 N can upload the photograph or the recording via a social media application.

    Example 11: Complying with Parking or Waiting Regulations

    [0275] A user of the vehicle 110 is carrying a smart device 120 A . . . 120 N, which is known to the vehicle 110 and is coupled thereto (e.g. smart device 120 A . . . 120 N as digital key of the vehicle 110).

    [0276] Predefined or predefinable trigger event:

    [0277] the vehicle 110 recognizes that a parking or waiting process is under way; and

    [0278] the vehicle 110 recognizes that parking/waiting regulations are in force.

    [0279] In this case, it is possible to predefine for example that a vehicle 110 is waiting if the user of the vehicle 110 remains in the vehicle 110 and does not wait for longer than 3 minutes, and a vehicle 110 is parked if the user of the vehicle 110 leaves the vehicle 110 or waits for longer than 3 minutes. Different parking or waiting regulations may be in force for the parking or waiting of the vehicle 110.
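
The predefinition in paragraph [0279] is a simple classification rule: a stop counts as waiting while the user remains in the vehicle for at most 3 minutes, and as parking once the user has left the vehicle or the stop exceeds 3 minutes. A sketch with invented names:

```python
# Hypothetical classifier for the 3-minute waiting/parking rule of paragraph [0279].
WAITING_LIMIT_S = 3 * 60  # 3 minutes, expressed in seconds

def classify_stop(user_in_vehicle, stop_duration_s):
    """Return "waiting" or "parked" according to the predefined rule."""
    if not user_in_vehicle or stop_duration_s > WAITING_LIMIT_S:
        return "parked"
    return "waiting"
```

Which parking or waiting regulations the trigger data set then references would depend on this classification together with the position-specific database entries mentioned in paragraph [0280].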

    [0280] The vehicle 110 is configured to determine the abovementioned trigger event by means of evaluation of the corresponding sensor data (BLE sensor system, UWB sensor system, GPS position or geographical position of the vehicle, navigation data of the vehicle, database entries concerning parking regulations at the current GPS position, evaluation of traffic signs captured by vehicle cameras) and communicates a corresponding trigger data set to a smart device 120 A . . . 120 N. The trigger data set can comprise:

    [0281] data with respect to waiting/parking regulations currently in force at the current geographical position (e.g. waiting/parking only for a limited time, ticket required for authorizing parking, etc.).

    [0282] The smart device 120 A . . . 120 N receives the trigger data set. The trigger data set triggers an interaction application, which is loaded and executed on the smart device 120 A . . . 120 N. The interaction application triggers a notification via the smart device 120 A . . . 120 N in a manner known from the prior art and can comprise for example a haptic and/or visual and/or acoustic message to the user of the smart device 120 A . . . 120 N with advice about parking regulations currently in force. The user of the smart device 120 A . . . 120 N can indicate by means of an input (e.g. voice input and/or touch input) that the user no longer wants advice in future for the current geographical position. This input is communicated from the smart device 120 A . . . 120 N to the vehicle 110 and correspondingly stored in a storage unit.

    [0283] The examples mentioned above are exemplary embodiments and serve for elucidation. The trigger events and also the interactions between smart device 120 A . . . 120 N and vehicle 110 can be arbitrarily combined and extended.

    [0284] FIG. 2 shows a method 200 for the interactive control of vehicle functions of a vehicle 110, which method can be carried out by a system 100 as described with reference to FIG. 1.

    [0285] The method 200 comprises:

    [0286] establishing 210 a communicative connection between a smart device 120 A . . . 120 N and a vehicle 110;

    [0287] determining 220, by means of the vehicle 110, a trigger event in the vehicle 110, wherein determining the trigger event comprises an evaluation of sensor data in the vehicle 110;

    [0288] communicating 230, upon the trigger event being determined, a trigger data set to the smart device 120 A . . . 120 N;

    [0289] interacting 240 of the smart device 120 A . . . 120 N with the vehicle 110 according to the trigger data set.

    [0290] The interacting 240 of the smart device 120 A . . . 120 N with the vehicle 110 can comprise controlling at least one vehicle function of the vehicle 110.
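
Steps 210 to 240 of method 200 can be sketched end to end as follows. All class names, the distance threshold, and the event and function strings are invented for illustration; the communicative connection and the BLE transport are replaced by direct method calls.

```python
# Hypothetical end-to-end sketch of method 200: establish a connection (210),
# determine a trigger event from sensor data (220), communicate a trigger data
# set (230), and let the smart device interact by controlling a function (240).
class SmartDevice:
    def __init__(self):
        self.connected = False
        self.received = None

    def interact(self, vehicle):
        # Step 240: react to the trigger data set by controlling a vehicle function.
        if self.received and self.received.get("event") == "user_approaching":
            return vehicle.control("presentation", "start")
        return None

class Vehicle:
    def __init__(self):
        self.functions = {}

    def connect(self, device):                        # step 210
        device.connected = True

    def determine_trigger_event(self, sensor_data):   # step 220
        # Spatial relation: smart device close to the vehicle (threshold invented).
        if sensor_data.get("distance_m", float("inf")) < 10:
            return {"event": "user_approaching"}
        return None

    def communicate(self, trigger_data, device):      # step 230
        device.received = trigger_data

    def control(self, function, action):
        self.functions[function] = action
        return f"{function}: {action}"

device, vehicle = SmartDevice(), Vehicle()
vehicle.connect(device)
event = vehicle.determine_trigger_event({"distance_m": 4.2})
if event:
    vehicle.communicate(event, device)
result = device.interact(vehicle)
```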

    [0291] The vehicle 110 and the smart device 120 A . . . 120 N can each comprise a communication unit 116, 126, wherein the communicative connection between vehicle 110 and smart device 120 A . . . 120 N comprises a Bluetooth Low Energy, BLE, connection.

    [0292] The evaluating of sensor data in the vehicle 110 when determining the trigger event can comprise determining a spatial relation between the smart device 120 A . . . 120 N and the vehicle 110.

    [0293] The sensor data in the vehicle 110 which are evaluated for determining the trigger event in the vehicle can comprise:

    [0294] a current geographical position of the vehicle 110; and/or

    [0295] data concerning a Point of Interest, POI, according to a current geographical position of the vehicle 110; and/or

    [0296] data with respect to a geographical position that is important for the user of the vehicle 110; and/or

    [0297] a current time of day or a current time stamp; and/or

    [0298] a current position and/or a current movement vector of the smart device 120 A . . . 120 N relative to the vehicle 110; and/or

    [0299] a current state of a vehicle function or a current vehicle sensor value, wherein the vehicle function or the vehicle sensor value can comprise data with respect to the following vehicle functions or vehicle sensor values:

    [0300] light switched on or off; and/or

    [0301] doors right/left front/back or trunk or hood opened or closed; and/or

    [0302] current tire pressure of tires front/back right/left; and/or

    [0303] charging flap and/or gas cap opened/closed; and/or

    [0304] requisite need for maintenance in the vehicle; and/or

    [0305] state of charge or fuel tank level of the vehicle; and/or

    [0306] any further suitable current state of a vehicle function or vehicle sensor value.
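
The sensor data enumerated above can be gathered into a single snapshot record on which trigger predicates operate. A sketch; all field names and values are illustrative, not part of the disclosure:

```python
# Hypothetical snapshot of the evaluated sensor data (paragraphs [0293]-[0306]).
sensor_snapshot = {
    "gps_position": (48.137, 11.575),
    "poi": "gas_station",
    "timestamp": "2023-10-26T14:05:00",
    "smart_device_relative": {"distance_m": 3.4, "movement_vector": (-0.8, 0.1)},
    "vehicle_state": {
        "light_on": False,
        "doors_closed": True,
        "tire_pressure_bar": {"fl": 2.4, "fr": 2.4, "rl": 2.6, "rr": 2.6},
        "charging_flap_closed": True,
        "maintenance_due": False,
        "state_of_charge_pct": 62,
    },
}

def approaching(snapshot, threshold_m=10.0):
    """A trivial example of a trigger predicate over the snapshot:
    the smart device is within a threshold distance of the vehicle."""
    return snapshot["smart_device_relative"]["distance_m"] <= threshold_m
```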

    [0307] The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.