Method and Control Unit for Controlling a Camera

20230145925 · 2023-05-11

    Abstract

    A control unit determines subject data in relation to a subject, and controls a vehicle camera in accordance with the subject data so as to capture an image of the subject using the camera.

    Claims

    1-11. (canceled)

    12. A system, comprising: a camera of a vehicle; and a control unit configured to: determine subject data in relation to a subject, and control the camera in accordance with the subject data so as to capture an image of the subject using the camera.

    13. The system of claim 12, wherein the subject data indicate a shutter release time and/or a shutter release position at which the camera is to be triggered in order to capture the image of the subject.

    14. The system of claim 12, wherein the control unit is configured to receive the subject data from another road user, which is arranged in front of the vehicle in the direction of travel of the vehicle.

    15. The system of claim 12, wherein the control unit is configured to receive the subject data from an infrastructure unit of a road network on which the vehicle is driven.

    16. The system of claim 12, wherein the subject data indicate how the vehicle should move to allow the image of the subject to be captured, with respect to one or more of: a traffic lane in which the vehicle should be located, a vehicle speed of the vehicle, a trajectory of the vehicle, and an orientation of the vehicle relative to the subject; and wherein the control unit is configured to cause the vehicle to move as indicated by the subject data.

    17. The system of claim 16, wherein the control unit is configured to: issue an instruction to a driver of the vehicle to cause the vehicle to move as indicated by the subject data; and/or intervene automatically in the longitudinal and/or lateral guidance of the vehicle in order to cause the vehicle to move as indicated by the subject data.

    18. The system of claim 12, wherein the control unit is configured to: determine position data relating to a position of the vehicle, and control the camera also in accordance with the position data, in combination with a digital map relating to the road network on which the vehicle is driving, in order to capture the image of the subject using the camera.

    19. The system of claim 12, wherein the control unit is configured to: determine vehicle data relating to a condition of the vehicle, wherein the vehicle data includes one or more of: information relating to a tire pressure of at least one tire of the vehicle, information relating to the wear condition of one of the tires of the vehicle, information relating to a loading condition of the vehicle, and information relating to a vehicle speed of the vehicle; and control the camera in accordance with the vehicle data in order to capture the image of the subject using the camera.

    20. The system of claim 12, wherein the control unit is configured to: determine environmental data relating to the environment of the vehicle, wherein the environmental data are acquired, in particular, by one or more environment sensors of the vehicle; and determine the subject data on the basis of the environmental data.

    21. The system of claim 12, wherein the control unit is configured to: send a request for the provision of subject data for the subject to a vehicle-external unit; and receive the subject data from the vehicle-external unit in response to the request.

    22. A method for controlling a camera of a vehicle, the method comprising: determining subject data in relation to a subject; and controlling the camera in accordance with the subject data to capture an image of the subject using the camera.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0025] FIG. 1 shows exemplary components of a vehicle;

    [0026] FIG. 2 shows an exemplary driving situation with a subject to be captured; and

    [0027] FIG. 3 shows an exemplary method for controlling a camera in a vehicle.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0028] As explained at the beginning, this document is concerned with increasing the quality of photographs taken with a camera in a vehicle. In this context, FIG. 1 shows exemplary components of a vehicle 100. The vehicle 100 can comprise one or more environment sensors 102 (e.g., an environment camera, a radar sensor, a lidar sensor, etc.) that are configured to acquire sensor data (also referred to in this document as environment data) in relation to the environment of the vehicle 100. In addition, the vehicle 100 can comprise a position sensor 103, which is configured to acquire sensor data (also referred to in this document as position data) in relation to the position of the vehicle 100. In addition, the vehicle 100 can comprise a communication unit 106 which is configured to exchange communication data with a vehicle-external unit (e.g. with an infrastructure unit of the road network on which the vehicle 100 is being driven and/or with another road user) via a (wireless) communication link.

    [0029] In addition, at least one camera 104 is arranged in the vehicle 100, which is designed to capture photographs relating to the environment of the vehicle 100. For example, the camera 104 can be held by an occupant of the vehicle 100. Alternatively, the camera 104 can be mounted on a bracket in the vehicle 100 (not shown).

    [0030] A control unit 101 of the vehicle 100 can be configured to determine subject data relating to a subject in the environment of the vehicle 100 that is to be captured with the camera 104, based on the environment data, based on the position data (e.g. in combination with a digital map relating to the road network being used by the vehicle 100), and/or based on received communication data. In particular, the subject data can indicate or comprise the direction and/or orientation and/or position of the camera 104 for capturing the subject; and/or the shutter release time and/or the shutter release position at which the camera 104 must be triggered in order to capture the subject.
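    As an illustration of how a control unit such as control unit 101 might derive a shutter release time from such subject data, the following Python sketch shows a hypothetical subject data record and a constant-speed trigger-time calculation. All names, fields, and units are assumptions for illustration only, not part of the disclosed system.

```python
from dataclasses import dataclass


@dataclass
class SubjectData:
    """Illustrative subject data record (field names are assumptions)."""
    subject_position: tuple   # (x, y) map coordinate of the subject
    release_position: tuple   # (x, y) at which the shutter should be triggered
    camera_heading_deg: float # orientation of the camera toward the subject


def shutter_release_time(distance_to_release_m: float, speed_mps: float) -> float:
    """Seconds until the vehicle reaches the shutter release position,
    assuming constant speed along the current trajectory."""
    if speed_mps <= 0:
        raise ValueError("vehicle must be moving toward the release position")
    return distance_to_release_m / speed_mps
```

    In practice the trigger-time calculation would also account for the vehicle's trajectory and acceleration rather than assuming constant speed.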

    [0031] The control unit 101 can also be configured to control the camera 104 in accordance with the subject data (e.g. by sending a control instruction to the camera 104 via a wireless or wired communication link) in order to take a photograph of the subject.

    [0032] Alternatively or additionally, the control unit 101 can be configured to control an action of the user of the camera 104 based on the subject data. For example, a voice output, a haptic signal, and/or an optical signal can be used to prompt the user to hold the camera 104 as specified by the subject data for taking the photograph (in particular at the position and/or orientation indicated by the subject data).

    [0033] FIG. 2 shows an example driving situation in which the vehicle 100 is following another road user 200 (e.g. a vehicle driving in front) on a road (indicated by the arrow). The user of the vehicle 100 may have informed the control unit 101 of the vehicle 100 via a user interface that the camera 104 should be used to take a photograph of a specific subject 203 (e.g. of a specific point of interest) while driving the vehicle 100, wherein the subject 203 may be in front of the vehicle 100 in the direction of travel of the vehicle 100. Alternatively or additionally, the subject 203 can be arranged next to or behind the vehicle 100.

    [0034] The control unit 101 can use the communication unit 106 and/or a (wireless) communication link 205 to instruct a vehicle-external unit 200, 202, e.g. the road user 200 driving in front and/or an infrastructure unit 202, to provide subject data relating to the subject 203 to be captured. For example, the vehicle-external unit 200, 202 can be instructed to determine the exact position of the subject 203 to be captured and to send it to the control unit 101 as subject data. The camera 104 in the vehicle 100 can then be operated precisely and reliably on the basis of the received and/or determined subject data in order to take a high-quality photograph of the subject 203 (and, if appropriate, of the user).
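    The request/response exchange with a vehicle-external unit, as described above, could for example be serialized as in the following sketch. The message schema and field names are purely illustrative assumptions; the document does not specify a wire format.

```python
import json


def build_subject_data_request(subject_id: str, vehicle_position: tuple) -> str:
    """Serialize a request asking a vehicle-external unit for subject data.
    The message schema is purely illustrative."""
    return json.dumps({
        "type": "SUBJECT_DATA_REQUEST",
        "subject_id": subject_id,
        "vehicle_position": list(vehicle_position),
    })


def parse_subject_data_response(payload: str) -> dict:
    """Decode the subject data returned in response to the request."""
    msg = json.loads(payload)
    if msg.get("type") != "SUBJECT_DATA_RESPONSE":
        raise ValueError("unexpected message type")
    return msg["subject_data"]
```

    A real deployment would use a standardized V2X message format rather than ad hoc JSON.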

    [0035] The subject data can be communicated to a server by another road user 200, wherein the server is designed to store the subject data for a plurality of different subjects 203. The control unit 101 can then download the subject data for a selection of one or more subjects 203 from the server if required. The control unit 101 can be configured to convert the subject data for a subject 203 provided in a general form to the specific situation of the vehicle 100 (in particular with regard to the exact shutter release time and/or the exact shutter release position and/or with regard to the orientation of the camera 104).

    [0036] Thus, a database of subject data for subjects 203 of interest can be provided. This further increases the convenience for users of cameras 104.

    [0037] A system is thus described for the optimized, automatic triggering of a hand-held or on-board camera 104 in a moving vehicle 100 for photographing a specific subject 203. The subject 203 should be photographed in an optimized way (e.g. without an interfering obstacle and/or with favorable light conditions (in particular not toward the sun)) at the current driving speed of the vehicle 100.

    [0038] For this purpose, an automatic pre-calculation of the earliest possible or the exact time of triggering the camera 104 can be performed. The trigger for releasing the shutter of the camera 104 can be received (as subject data), e.g. via Car-to-Car or Car-to-X communication. Using Car-to-X communication, a fixed coordinate of the subject 203 in a high-resolution digital map can be sent to the camera 104 via the vehicle 100, via the infrastructure 202, via an app, and/or “over the air”. The camera 104 and/or the control unit 101 of the vehicle 100 can have access to the digital map.

    [0039] A Car-to-Car message can be used to receive a signal (i.e. subject data) from a vehicle 200 in front. The vehicle 200 in front can be designed to determine the position of the subject 203 using one or more environment sensors. In particular, the optimum shutter release position and/or the optimum shutter release time can be determined by the vehicle 200 traveling in front. Current weather conditions and/or light conditions can be taken into account. The information relating to the shutter release time and/or the shutter release position can then be sent as subject data to the control unit 101 of the following vehicle 100 and/or directly to the camera 104.

    [0040] If necessary, multiple different sources of information (e.g. a vehicle 200 traveling in front and/or an infrastructure unit 202) can be combined in order to determine the shutter release time and/or shutter release position particularly accurately. The camera 104 can then be triggered at the determined shutter release position or at the determined shutter release time in order to take an optimized photograph of the subject 203. It may be advantageous for taking the photograph that the vehicle 100 in which the camera 104 is located adjusts its traffic lane, its trajectory within the lane, and/or its speed relative to the traffic situation and/or a given speed limit, in order to further improve the quality of the photograph. This can be indicated as part of the determined subject data and taken into account by the control unit 101. In particular, the driver of the vehicle 100 can be prompted (e.g. by issuing an instruction) to change the driving state of the vehicle 100 for taking the photograph.
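    One simple way to combine shutter release position estimates from multiple sources, as described above, is confidence-weighted averaging. The following sketch is an illustrative assumption, not the specific fusion scheme used by the described system.

```python
def fuse_release_positions(estimates):
    """Combine shutter release position estimates from several sources
    (e.g. a vehicle ahead and an infrastructure unit) by confidence-weighted
    averaging. `estimates` is a list of ((x, y), weight) pairs."""
    total = sum(w for _, w in estimates)
    x = sum(p[0] * w for p, w in estimates) / total
    y = sum(p[1] * w for p, w in estimates) / total
    return (x, y)
```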

    [0041] In the context of the described system or method, the following data and/or information can be used:

    [0042] data relating to one or more vehicle components, e.g. tire pressure monitoring system or air pressure, tire wear and/or current friction values, engine power and torque on the axles, etc.;

    [0043] GPS data (traffic), Car-to-Car (V2V) data and/or Car-to-X (V2X) data in order to query a queue of vehicles ahead, which may not be visible to the environment camera 102 of the vehicle 100 (e.g. when in a traffic queue);

    [0044] map data and/or traffic lane geometry;

    [0045] topography (gradient);

    [0046] weather and/or climate;

    [0047] braking power;

    [0048] loading condition (weight, type of load); and/or

    [0049] maximum longitudinal and/or lateral acceleration forces.

    [0050] FIG. 3 shows a flowchart of an exemplary (possibly computer-implemented) method 300 for controlling a camera 104 (e.g. a compact camera or an SLR camera) that is carried in a (motor) vehicle 100 (by a user or occupant). For example, the camera 104 can be held by an occupant of the vehicle 100. Alternatively or additionally, the camera 104 can be mounted on a bracket in the vehicle 100.

    [0051] The method 300 comprises determining 301 subject data in relation to a subject 203 to be captured, which may be located in front of the vehicle 100 in the direction of travel of the vehicle 100. The subject data can be provided by a vehicle-external unit 200, 202 (in particular sent to the camera 104 and/or the vehicle 100, and/or received by the camera 104 and/or the vehicle 100, via a wireless communication link 205).

    [0052] The method 300 also comprises controlling 302 the camera 104 in accordance with the subject data, in order to take a photograph of the subject 203 using the camera 104. In particular, in accordance with the subject data (e.g. in accordance with a shutter release time or a shutter release position indicated in the subject data), the shutter of the camera 104 can be automatically activated to take a photograph of the subject 203.
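    The two steps of the method (determining subject data, then triggering the camera at the shutter release position) can be sketched as follows. The callback and field names are illustrative assumptions; the disclosed method does not prescribe a particular software interface.

```python
def control_camera(determine_subject_data, trigger_shutter, current_position,
                   tolerance_m=1.0):
    """Sketch of the two-step method: determine subject data (step 301),
    then trigger the shutter once the vehicle is within `tolerance_m` of
    the shutter release position (step 302). Returns True if triggered."""
    subject_data = determine_subject_data()          # step 301
    release_pos = subject_data["release_position"]
    pos = current_position()
    dist = ((pos[0] - release_pos[0]) ** 2 + (pos[1] - release_pos[1]) ** 2) ** 0.5
    if dist <= tolerance_m:
        trigger_shutter()                            # step 302
        return True
    return False
```

    In a real control unit this check would run cyclically as position updates arrive, rather than as a single call.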

    [0053] The measures described in this document enable an occupant of a vehicle 100 to take optimized photographs with a camera 104, even when the vehicle 100 is moving. This can increase the convenience and satisfaction of the occupant.

    [0054] The present invention is not limited to the exemplary embodiments shown. In particular, it is important to note that the description and the figures are intended only as examples to illustrate the principle of the proposed methods, devices and systems.