Method, apparatus, computer program and computer program product for transmitting image data
10576891 · 2020-03-03
Assignee
Inventors
CPC classification
H04N23/66
ELECTRICITY
H04N7/181
ELECTRICITY
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/40
PERFORMING OPERATIONS; TRANSPORTING
International classification
H04N7/18
ELECTRICITY
Abstract
In a method for transmitting image data from a plurality of cameras in a vehicle, a set of functions is provided, which set of functions comprises a plurality of functions each having a requirement for a respective image setting of at least one of the cameras. If a function is activated, a respective image setting is set for at least one of the cameras on the basis of at least one of the function and a respective predefined desired data rate during image data transmission in order to provide the image data from the respective camera for the respective function.
Claims
1. A method for transmitting image data from a camera in a vehicle, the method comprising the acts of: providing a set of vehicle functions, each vehicle function performed by one or more vehicle systems of the vehicle using the camera, the camera having a plurality of image settings, wherein the set of vehicle functions comprises: a first vehicle function associated with a first image setting, and a second vehicle function associated with a second image setting, wherein the respective performance of each vehicle function requires the camera set to the respective image setting; activating the first and/or the second vehicle function from among the set of vehicle functions; setting the camera to an image setting corresponding to the activated first and/or second vehicle function, based on the activated first and/or second vehicle function and a predefined desired data rate for transmitting image data, from the camera, in the course of performing the activated first and/or second vehicle function, wherein setting the camera to the image setting is in accordance with a multi-part policy that delineates: one or more image setting requirements associated with each vehicle function, dynamic data rate changes associated with each of the one or more image setting requirements, whether the one or more image setting requirements are to be set for each image, or for each nth image of the camera, prioritization of image setting requirements according to active vehicle functions, and one or more predefined maximum limits for resource utilization by active vehicle functions, wherein a current data rate of the camera is determined, and the image settings are gradually adapted on the basis of the current data rate and on the basis of the predefined desired data rate, starting from a predefined starting data rate which is lower than the predefined desired data rate, and wherein the vehicle functions are respectively weighted, and, if more than one vehicle function is activated at 
the same time, the camera is set to the image setting on the basis of the respective weighting.
2. The method according to claim 1, wherein the image setting of the camera is set on the basis of a dynamically available data rate.
3. The method according to claim 1, wherein the image setting of the camera is set on the basis of a dynamically available computing capacity.
4. The method of claim 1, wherein the respective requirement of the vehicle functions for their respective image setting is dynamic.
5. The method according to claim 1, wherein a current data rate of the camera is determined, and the image settings are adapted on the basis of the current data rate and on the basis of the predefined desired data rate.
6. An apparatus for transmitting image data comprising: a network node in a vehicle; a camera in the vehicle and coupled to the network node, the camera having a plurality of image settings; and a computing unit coupled to the network node, wherein the computing unit is configured to: provide a set of vehicle functions, each vehicle function performed by one or more vehicle systems of the vehicle using the camera, wherein the set of vehicle functions comprises: a first vehicle function associated with a first image setting, and a second vehicle function associated with a second image setting, wherein the respective performance of each vehicle function requires the camera set to the respective image setting, and in response to the first and/or second vehicle function being activated, set the camera to an image setting corresponding to the activated first and/or second vehicle function, based on the activated first and/or second vehicle function and a predefined desired data rate for transmitting image data, from the camera, in the course of performing the activated first and/or second vehicle function, wherein setting the camera to the image setting is in accordance with a multi-part policy that delineates: one or more image setting requirements associated with each vehicle function, dynamic data rate changes associated with each of the one or more image setting requirements, whether the one or more image setting requirements are to be set for each image, or for each nth image of the camera, prioritization of image setting requirements according to active vehicle functions, and one or more predefined maximum limits for resource utilization by active vehicle functions, wherein a current data rate of the camera is determined, and the image settings are gradually adapted on the basis of the current data rate and on the basis of the predefined desired data rate, starting from a predefined starting data rate which is lower than the predefined desired data rate, and 
wherein the vehicle functions are respectively weighted, and, if more than one vehicle function is activated at the same time, the camera is set to the image setting on the basis of the respective weighting.
7. The apparatus according to claim 6, wherein the image setting of the camera is set on the basis of a dynamically available data rate.
8. The apparatus according to claim 6, wherein the image setting of the camera is set on the basis of a dynamically available computing capacity.
9. A vehicle comprising: a network node; a camera coupled to the network node and configured to transmit image data, the camera having a plurality of image settings; and a computing unit coupled to the network node, wherein the computing unit is configured to: provide a set of vehicle functions, each vehicle function performed by one or more vehicle systems of the vehicle using the camera, wherein the set of vehicle functions comprises: a first vehicle function associated with a first image setting, and a second vehicle function associated with a second image setting, wherein the respective performance of each vehicle function requires the camera set to the respective image setting, and in response to the first and/or second vehicle function being activated, set the camera to an image setting corresponding to the activated first and/or second vehicle function, based on the activated first and/or second vehicle function and a predefined desired data rate for transmitting image data, from the camera, in the course of performing the activated first and/or second vehicle function, wherein setting the camera to the image setting is in accordance with a multi-part policy that delineates: one or more image setting requirements associated with each vehicle function, dynamic data rate changes associated with each of the one or more image setting requirements, whether the one or more image setting requirements are to be set for each image, or for each nth image of the camera, prioritization of image setting requirements according to active vehicle functions, and one or more predefined maximum limits for resource utilization by active vehicle functions, wherein a current data rate of the camera is determined, and the image settings are gradually adapted on the basis of the current data rate and on the basis of the predefined desired data rate, starting from a predefined starting data rate which is lower than the predefined desired data rate, and wherein the vehicle functions 
are respectively weighted, and, if more than one vehicle function is activated at the same time, the camera is set to the image setting on the basis of the respective weighting.
10. The vehicle according to claim 9, in which the respective image setting of the camera is set on the basis of a dynamically available data rate.
11. The vehicle according to claim 9, in which the respective image setting of the camera is set on the basis of a dynamically available computing capacity.
12. A non-transitory computer readable medium having computer-executable code embodied therein for transmitting image data from a camera in a vehicle, the camera having a plurality of image settings, the non-transitory computer readable medium having: processor executable program code to provide a set of vehicle functions, each vehicle function performed by one or more vehicle systems of the vehicle using the camera, wherein the set of vehicle functions comprises: a first vehicle function associated with a first image setting, and a second vehicle function associated with a second image setting, wherein the respective performance of each vehicle function requires the camera set to the respective image setting, and processor executable program code to, in response to the first and/or second vehicle function being activated, set the camera to an image setting corresponding to the activated first and/or second vehicle function, based on the activated first and/or second vehicle function and a predefined desired data rate for transmitting image data, from the camera, in the course of performing the activated first and/or second vehicle function, wherein setting the camera to the image setting is in accordance with a multi-part policy that delineates: one or more image setting requirements associated with each vehicle function, dynamic data rate changes associated with each of the one or more image setting requirements, whether the one or more image setting requirements are to be set for each image, or for each nth image of the camera, prioritization of image setting requirements according to active vehicle functions, and one or more predefined maximum limits for resource utilization by active vehicle functions, wherein a current data rate of the camera is determined, and the image settings are gradually adapted on the basis of the current data rate and on the basis of the predefined desired data rate, starting from a predefined starting data rate which is lower than the 
predefined desired data rate, and wherein the vehicle functions are respectively weighted, and, if more than one vehicle function is activated at the same time, the camera is set to the image setting on the basis of the respective weighting.
13. The method according to claim 1, wherein the set of vehicle functions comprises at least one of: lane detection, a display of a view predefined by a customer in a parking function, an object detection, an obstacle detection and a traffic sign detection.
14. The method according to claim 1, wherein the respective requirement of the vehicle functions for their respective image setting changes on the basis of a vehicle state.
15. The method according to claim 1, wherein the image settings impose image quality parameters on at least a portion of images captured by the camera.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
(5) The network node H comprises, for example, a hub which is designed to connect network nodes in a star shape, in particular to connect the cameras KAM_1-KAM_4 and computing units to one another, for example by means of Ethernet, in particular according to IEEE 802.3. Ethernet is a technology which specifies software (for example protocols) and hardware (for example cables, distributors, network cards) for wired data networks.
(5) The vehicle 1 additionally has a control apparatus SV which is connected to the network node H. The control apparatus SV has, for example, a data and program memory and a computing unit. The data and program memory and/or the computing unit can be formed in one structural unit and/or may be distributed among two or more structural units.
(6) The control apparatus SV is designed to transmit control commands to the respective camera KAM_1-KAM_4 using the connection via the network node H to the cameras KAM_1-KAM_4 or using a direct connection to the cameras KAM_1-KAM_4 in order to set image settings of the respective camera KAM_1-KAM_4, as described in more detail below.
(7) The control apparatus SV may also be referred to as an apparatus for transmitting image data.
(8) In particular, the data and program memory of the control apparatus SV stores a program which is explained in more detail below using the flowchart.
(9) The program is started in a step S1 in which variables are initialized, for example.
(10) A set of functions FN_M is provided in a step S3. The set of functions FN_M comprises a plurality of functions FN each having a requirement for a respective image setting BE of at least one of the cameras KAM_1-KAM_4. The set of functions FN_M is stored, for example, in the data and program memory of the control apparatus SV.
(11) The respective requirement of the plurality of functions FN for the respective image setting BE is dynamic, for example. The respective requirement of the plurality of functions FN for the respective image setting BE changes on the basis of a vehicle state, for example. The vehicle state comprises, for example, a steering lock and/or forward travel of the vehicle and/or reverse travel of the vehicle and/or indicating of the vehicle.
(12) Such functions FN comprise, for example, a lane detection function and/or a function for displaying a camera image on an image display unit in the vehicle 1, for example for a view predefined by a customer in a parking function. Alternatively or additionally, the functions FN comprise a function for detecting objects and/or obstacles and/or traffic signs and/or a function for detailed object recognition if another sensor, for example a radar or ultrasonic sensor, has detected an object in a particular image area, for example.
(13) The image setting BE comprises, for example, a setting of the images per second and/or a setting of the color depth and/or a setting of the type of compression and/or compression rate and/or a setting of the resolution. Alternatively or additionally, the image setting BE also comprises, for example, a setting regarding whether only every nth image is intended to be transmitted in a high quality. In the case of four cameras KAM_1-KAM_4, for example, the cameras may each in turn transmit a high-quality image while the three remaining cameras transmit a low-quality image. Alternatively or additionally, the image setting BE comprises, for example, whether only a partial image area is intended to be recorded with high quality since some functions FN possibly require only partial image areas. For example, a lane detection function possibly requires only a lower half of the image.
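The description does not prescribe a data structure for the image setting BE; the following Python sketch merely illustrates how the parameters listed above (frame rate, color depth, compression, resolution, nth-image quality, partial image area) could be grouped. All names and example values are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ImageSetting:
    """Illustrative representation of an image setting BE."""
    frames_per_second: int
    color_depth_bits: int
    compression: str              # type of compression, e.g. an assumed "h264"
    compression_rate: float       # target compression ratio
    resolution: Tuple[int, int]
    high_quality_every_nth: int = 1    # 1 = every image in high quality
    roi: Optional[Tuple[int, int, int, int]] = None  # partial area (x, y, w, h)

# Example: a lane detection function that only needs the lower half of the image.
lane_setting = ImageSetting(
    frames_per_second=30,
    color_depth_bits=8,
    compression="h264",
    compression_rate=0.1,
    resolution=(1280, 800),
    roi=(0, 400, 1280, 400),   # lower half of a 1280x800 image
)
```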
(14) In a step S5, a check is carried out in order to determine whether a function FN is activated. If a function FN has been activated, the program is continued in a step S7.
(15) A predefined desired data rate D_G is provided in step S7. The predefined desired data rate D_G is distinguished by the fact that it makes it possible to transmit image data which are of such high quality that the requirement of the activated function FN can be met. Furthermore, the predefined desired data rate D_G is distinguished, in particular, by the fact that it is as low as possible in this case. For this purpose, the association between each requirement and its likely data rate change is stored, in particular, for all functions FN in the set of functions FN_M. This is stored, for example, in a database stored in the data and program memory of the control apparatus SV. Furthermore, a respective assignment of functions FN to sources may be stored in the database since respective sources possibly impose different requirements on the image setting BE of the respective camera KAM_1-KAM_4.
(16) Since the cameras KAM_1-KAM_4 are connected to a universal bus or universal network, for example, data which do not come from the respective camera KAM_1-KAM_4 and dynamically restrict an available data rate D_DYN may possibly also be transmitted via the bus and/or the network. For example, the available data rate D_DYN can be dynamically restricted for a desired audio transmission. Therefore, the dynamically available data rate D_DYN is alternatively or additionally determined and provided in step S7. The dynamically available data rate D_DYN results, in particular, from subtracting a restrictive data rate from a predefined maximum data rate which is technically determined, for example, and is stored in the data and program memory of the control apparatus SV, for example.
(17) Since a computing unit which processes the transmitted image data, for example the computing unit of the control apparatus SV and/or a further computing unit, can also be universally used, a respectively available computing capacity R_DYN can dynamically change. Therefore, the dynamically available computing capacity R_DYN is alternatively or additionally determined and provided in step S7. The dynamically available computing capacity R_DYN results, in particular, from subtracting a restrictive computing capacity which is needed to calculate non-image data, from a predefined maximum computing capacity which is technically determined, for example, and is stored in the data and program memory of the control apparatus SV, for example.
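The subtractions described for D_DYN and R_DYN can be sketched as follows; the function name and the numeric figures (link bandwidth, audio load, CPU headroom) are illustrative assumptions only.

```python
def dynamically_available(maximum: float, restrictive: float) -> float:
    """D_DYN or R_DYN: subtract the restrictive load (e.g. an audio
    transmission, or non-image computation) from the predefined maximum.
    Clamped at zero so the result is never negative."""
    return max(maximum - restrictive, 0.0)

# Illustrative figures: a 1000 Mbit/s Ethernet link with a 200 Mbit/s
# audio transmission, and a CPU whose non-image load is 35 %.
d_dyn = dynamically_available(1000.0, 200.0)   # 800.0 Mbit/s left for image data
r_dyn = dynamically_available(100.0, 35.0)     # 65.0 % capacity left for images
```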
(18) In a step S9, an image setting BE is set for at least one of the cameras KAM_1-KAM_4, to be precise on the basis of the activated function FN and/or on the basis of the respective predefined desired data rate D_G and/or on the basis of the dynamically available data rate D_DYN and/or on the basis of the dynamically available computing capacity R_DYN.
(19) The image setting BE of the respective camera KAM_1-KAM_4 is set, in particular, in such a manner that a data rate which results from the transmission of the image data from the respective camera KAM_1-KAM_4 is as low as possible and the image quality of the image data is simultaneously distinguished by the fact that the respective activated function FN can be accomplished.
(20) The image setting BE of the respective camera KAM_1-KAM_4 is alternatively or additionally set on the basis of the respective requirement of the activated function FN, for example.
(21) In a step S11, a respective current data rate D_AKT of the respective camera KAM_1-KAM_4 is determined.
(22) In a step S13, the image setting BE of the respective camera KAM_1-KAM_4 is adapted on the basis of the respective current data rate D_AKT and on the basis of the predefined desired data rate D_G and/or the dynamically available data rate D_DYN and/or the dynamically available computing capacity R_DYN.
(23) The image setting BE of the respective camera KAM_1-KAM_4 is adapted in step S13, in particular gradually, starting from a predefined starting data rate which is lower than the predefined desired data rate D_G.
(24) The program is then continued in step S11. Steps S11 to S13 are carried out, in particular, as long as the respective function FN is active. The program is then ended.
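The gradual adaptation of steps S11 to S13, starting from a starting data rate below D_G, might look like the following sketch. The step size and all numbers are assumed for illustration; the disclosure only specifies that the desired rate is approached gradually from below.

```python
def adapt_data_rate(current: float, desired: float, start: float, step: float) -> float:
    """One iteration of steps S11/S13: ramp the data rate up from the
    predefined starting rate toward the desired rate D_G without overshooting."""
    if current < start:
        return start                     # jump to the predefined starting rate
    return min(current + step, desired)  # approach D_G gradually from below

# Approach D_G = 100 from a starting rate of 40 in steps of 20.
rate = 0.0
history = []
for _ in range(5):                       # repeated while the function FN is active
    rate = adapt_data_rate(rate, desired=100.0, start=40.0, step=20.0)
    history.append(rate)
# history -> [40.0, 60.0, 80.0, 100.0, 100.0]
```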
(25) A respective weighting is assigned to the functions FN in the set of functions FN_M, in particular.
(26) Alternatively or additionally, a plurality of functions FN can be activated at the same time, for example. If this is the case, the image setting BE of the respective camera KAM_1-KAM_4 is additionally set in step S9 and/or in step S13, in particular, on the basis of the respective weighting, with the result that functions FN with a high weighting are taken into account earlier than functions FN with a low weighting.
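Taking higher-weighted functions into account earlier reduces, in the simplest reading, to ordering the simultaneously active functions by weight. A minimal sketch, with function names and weights chosen purely for illustration:

```python
def order_by_weight(active):
    """Order simultaneously active functions FN so that functions with a
    high weighting are taken into account before those with a low weighting."""
    return sorted(active, key=lambda fn: fn[1], reverse=True)

# Illustrative (name, weight) pairs for three simultaneously active functions.
active_functions = [
    ("display_parking_view", 1),
    ("lane_detection", 3),
    ("traffic_sign_detection", 2),
]
ordered = [name for name, _ in order_by_weight(active_functions)]
# ordered -> ["lane_detection", "traffic_sign_detection", "display_parking_view"]
```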
(27) Alternatively or additionally, a respective image setting of a plurality of the cameras KAM_1 -KAM_4 is set in step S9 and/or in step S13. If some of the cameras KAM_1-KAM_4 or all cameras KAM_1-KAM_4 are connected to the same bus and/or to the same network, this makes it possible, for example, to reduce a total data rate produced by some of the cameras KAM_1-KAM_4 or by all cameras KAM_1-KAM_4.
(28) Alternatively, the program can be ended after step S9.
(29) The procedure explained may contribute to the image data from the respective camera KAM_1-KAM_4 being transmitted in a very efficient manner since the respective active function FN can be accomplished using the image data, but no additional, unnecessary data are transmitted at the same time.
(30) In the above-described method or control sequence, one or more elements of a multipart policy can be implemented with the following properties:
(31) Part 1 of the policy:
(32) This part of the policy contains which requirements are imposed on which image data source by which function requiring image data or by which data sink. A plurality of elements may be provided in this case, for example: requirements with regard to the image rate (recorded or transmitted images per second), requirements with regard to compression artifacts in the overall image, requirements for color depth, requirements with regard to compression artifacts in sections of the image (for example in the section of a lower half of the image in which the road surface can be seen, or in the section of a right-hand half of the image in which traffic signs are typically presented), requirements with regard to the image resolution, requirements for the maximum data rate.
(33) Part 2 of the policy:
(34) This part of the policy records which requirement is associated with which likely data rate change.
(35) Part 3 of the policy:
(36) This part of the policy records whether the respective predefined or set quality requirements or quality parameters are imposed on each individual image of the images required by a data sink, for example by a detection algorithm for pedestrians, at a predefined image rate of 30 images per second, for example, or whether it is sufficient for the data sink to obtain only every nth image of the 30 images per second from a camera in a particularly high quality, for example in a predefined high resolution, and to obtain the remaining images in poorer quality, for example in a predefined lower resolution. In the case of four cameras of a parking system, for example, the cameras may each in turn emit a high-quality image which therefore requires more bandwidth and computing power in a central control unit, while the respective other three cameras each emit a lower-quality image at precisely that time, which is therefore less of a burden in terms of the data rate to be transmitted and/or the processing load.
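The round-robin scheme of the four parking cameras can be sketched as a simple schedule; the function name and the string quality labels are illustrative assumptions.

```python
def quality_schedule(num_cameras: int, frame_index: int):
    """For a given frame index, return per-camera quality labels so that
    exactly one camera emits a high-quality image while the remaining
    cameras emit lower-quality images, rotating each frame."""
    hq = frame_index % num_cameras
    return ["high" if i == hq else "low" for i in range(num_cameras)]

# Four parking-system cameras take turns emitting the high-quality image.
schedule = [quality_schedule(4, t) for t in range(4)]
```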
(37) Part 4 of the policy:
(38) According to this part of the policy, if a plurality of functions or data sinks may be active at the same time and may therefore impose requirements on the central control unit at the same time, provision is made for the requirements to be prioritized with regard to which of the requirements is more important than others. Such prioritization can be set and stored in the central control unit using parameter values.
(39) Part 5 of the policy:
(40) In this part of the policy, maximum or upper limits are predefined for the respective utilization of resources and must be taken into account overall. In this case, it is possible to predefine, for example, a maximum bandwidth which must not be exceeded by individual cameras, groups of cameras and/or all cameras overall.
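The five policy parts described above could be collected in a single structure such as the following sketch. Every key, function name, and numeric value here is an illustrative assumption; the disclosure defines only the five categories of information, not their encoding.

```python
policy = {
    # Part 1: requirements imposed on each image data source by each function.
    "requirements": {
        "lane_detection": {"camera": "KAM_1", "fps": 30,
                           "roi": "lower_half", "resolution": (1280, 800)},
        "traffic_sign_detection": {"camera": "KAM_1", "fps": 15,
                                   "roi": "right_half", "resolution": (1280, 800)},
    },
    # Part 2: likely data rate change per requirement (Mbit/s, illustrative).
    "rate_change": {"fps": 2.0, "resolution": 1.5, "roi": -3.0},
    # Part 3: quality imposed on every image (1) or only every nth image.
    "nth_image": {"lane_detection": 1, "traffic_sign_detection": 3},
    # Part 4: prioritization of requirements among active functions.
    "priority": {"lane_detection": 3, "traffic_sign_detection": 2},
    # Part 5: maximum limits for resource utilization.
    "limits": {"per_camera_mbit": 400.0, "total_mbit": 800.0},
}
```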
(41) On the basis of the respective specifically defined policy, a process of selecting the detailed image settings (for example with regard to resolution and/or number of colors) and/or compression settings (compression rate or image quality) for the individual cameras can then be carried out. The selection process may in turn be carried out in a plurality of steps:
(42) First step of the selection process (AS1):
(43) In this step, functions or control units requiring image data report their respective requirement (see part 1 of the policy) for a parameter of the policy to the central control unit, for example because a function, for example lane detection, is activated, because a particular view is selected for a parking function in a graphical user interface, because an object, for example an obstacle and/or a traffic sign, has been detected, because another sensor, for example a radar or ultrasonic sensor, has detected objects in a particular image area.
(44) Second step of the selection process (AS2):
(45) In this step, further control components of the vehicle can report variable resource restrictions (see part 5 of the policy). For example, an Ethernet connection may be or have been restricted in terms of the possible bandwidth of the video or image transmission on account of a temporary audio transmission.
(46) Third step of the selection process (AS3):
(47) In this step, the presumably best constellation of the requirements is selected. The new data rate is gradually approached from below. In this case, a simplification can be achieved using prefabricated scenarios with fixed parameter sets which have been defined once as suitable.
(48) Fourth step of the selection process (AS4):
(49) This step is carried out repeatedly and, in particular, continuously or within predefined repetition intervals. In this case, the selection of the set control parameters is readjusted, possibly regularly or repeatedly, in the case of changing image contents. This takes into account, in particular, the fact that changing image contents generally result in different compression results and therefore in changing bandwidths and compression artifacts. The central policy can therefore be permanently readjusted if necessary. The transmission of data can therefore be optimized even when the requirements imposed on the image quality temporarily remain constant.
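Steps AS1 through AS3 can be sketched as a priority-ordered allocation under the reported resource limit; AS4 would simply rerun this selection in a loop as image contents change. The greedy allocation strategy, the function names, and all bandwidth figures are illustrative assumptions, not the disclosed method itself.

```python
def select_settings(requested_mbit, limit_mbit, priority):
    """AS3 sketch: grant bandwidth requests in priority order (Part 4)
    until the reported budget (Part 5, possibly restricted per AS2) runs out."""
    granted = {}
    budget = limit_mbit
    for fn in sorted(requested_mbit, key=lambda f: priority.get(f, 0), reverse=True):
        grant = min(requested_mbit[fn], budget)
        granted[fn] = grant
        budget -= grant
    return granted

# AS1: functions report requirements; AS2: an audio transmission has
# temporarily lowered the available bandwidth to 600 Mbit/s (illustrative).
requests = {"lane_detection": 300.0, "parking_view": 400.0, "object_detection": 250.0}
priority = {"lane_detection": 3, "object_detection": 2, "parking_view": 1}
granted = select_settings(requests, limit_mbit=600.0, priority=priority)
# granted -> lane_detection and object_detection in full, parking_view reduced to 50.0
```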
LIST OF REFERENCE SYMBOLS
(50)
1 Vehicle
BE Image setting
D_AKT Current data rate
D_DYN Dynamically available data rate
D_G Desired data rate
FN Function
FN_M Set of functions
H Network node
KAM_1 to KAM_4 First to fourth camera
R_DYN Dynamically available computing capacity
SV Control apparatus
(51) The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.