TRACKING OF MULTIPLE OBJECTS USING NEURAL NETWORKS, LOCAL MEMORIES, AND A SHARED MEMORY
20220309681 · 2022-09-29
CPC classification
G06T7/246 (PHYSICS)
G06N3/0442 (PHYSICS)
Abstract
A method for tracking and/or characterizing multiple objects in a sequence of images. The method includes: assigning a neural network to each object to be tracked; providing a memory that is shared by all neural networks; providing a local memory for each neural network, respectively; supplying images from the sequence, and/or details of these images, to each neural network; during the processing of each image and/or image detail by one of the neural networks, generating an address vector from at least one processing product of this neural network; based on this address vector, writing at least one further processing product of the neural network into the shared memory and/or into the local memory, and/or reading out data from this shared memory and/or local memory and further processing the data by the neural network.
Claims
1-14. (canceled)
15. A method for tracking and/or characterizing multiple objects in a sequence of images, comprising the following steps: assigning a respective neural network to each object of the multiple objects to be tracked; providing a memory that is shared by all of the respective neural networks; providing a respective local memory for each respective neural network; supplying images from the sequence, and/or details of these images, to each of the respective neural networks; during processing of each image and/or image detail by a neural network of the respective neural networks, generating an address vector from at least one processing product of the neural network; based on the address vector, writing at least one further processing product of the neural network into the shared memory and/or into the local memory, and/or reading out data from the shared memory and/or local memory and further processing the read out data by the neural network; and delivering, as an output, by each respective neural network, positions of the assigned object in the images or image details supplied to the respective neural network, and/or information concerning behavior or other sought properties of the assigned object.
16. The method as recited in claim 15, wherein the shared memory and/or at least one local memory of the neural network is configured to map an address vector of address components, via differentiable operations, onto one or multiple memory locations, and to read data from the memory locations or write data into the memory locations.
17. The method as recited in claim 15, wherein the processing product from which the address vector is generated represents visual features that are recognized in the assigned object to be tracked.
18. The method as recited in claim 15, wherein the shared memory and/or at least one local memory is configured as an associative memory in which data are storable in association with processing products of the respective neural networks.
19. The method as recited in claim 18, wherein the associative memory is pre-populated with identifications of objects and/or with data that characterize behavior or other sought properties of objects.
20. The method as recited in claim 15, wherein during the further processing, the neural network combines the data read out from the shared memory and/or from at least one local memory with at least one processing product of the neural network.
21. The method as recited in claim 15, wherein at least one first and one second neural network of the respective neural networks contain mutually corresponding sequences of layers in which particular neurons or other processing units are organized, wherein: the first neural network writes a processing product from a first layer into one or multiple memory locations of the shared memory, and the second neural network further processes data, read from the one or multiple memory locations, in a second layer that follows the first layer in the sequence.
22. The method as recited in claim 15, wherein the sequence of images includes images of a traffic situation that has been recorded using at least one sensor that is carried along by a vehicle.
23. The method as recited in claim 22, wherein the outputs of the respective neural networks are combined into an overall assessment of the traffic situation.
24. The method as recited in claim 22, wherein an activation signal for the vehicle is generated from the outputs of the respective neural networks and/or from an overall assessment of the traffic situation generated from the outputs of the respective neural networks, and the vehicle is activated using the activation signal.
25. The method as recited in claim 15, wherein the sequence of images includes images that have been recorded during a visual observation of a monitored area.
26. A non-transitory machine-readable data medium on which is stored a computer program for tracking and/or characterizing multiple objects in a sequence of images, the computer program, when executed by one or multiple computers, causing the one or multiple computers to perform the following steps: assigning a respective neural network to each object of the multiple objects to be tracked; providing a memory that is shared by all of the respective neural networks; providing a respective local memory for each respective neural network; supplying images from the sequence, and/or details of these images, to each of the respective neural networks; during processing of each image and/or image detail by a neural network of the respective neural networks, generating an address vector from at least one processing product of the neural network; based on the address vector, writing at least one further processing product of the neural network into the shared memory and/or into the local memory, and/or reading out data from the shared memory and/or local memory and further processing the read out data by the neural network; and delivering, as an output, by each respective neural network, positions of the assigned object in the images or image details supplied to the respective neural network, and/or information concerning behavior or other sought properties of the assigned object.
27. One or multiple computers configured to track and/or characterize multiple objects in a sequence of images, the one or multiple computers configured to: assign a respective neural network to each object of the multiple objects to be tracked; provide a memory that is shared by all of the respective neural networks; provide a respective local memory for each respective neural network; supply images from the sequence, and/or details of these images, to each of the respective neural networks; during processing of each image and/or image detail by a neural network of the respective neural networks, generate an address vector from at least one processing product of the neural network; based on the address vector, write at least one further processing product of the neural network into the shared memory and/or into the local memory, and/or read out data from the shared memory and/or local memory and further process the read out data by the neural network; and deliver, as an output, by each respective neural network, positions of the assigned object in the images or image details supplied to the respective neural network, and/or information concerning behavior or other sought properties of the assigned object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0039]
[0040]
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0041]
[0042] Images 1 have been recorded using at least one sensor 51 that is carried along by a vehicle 50.
[0043] A neural network 3a through 3c is assigned to each object 2a through 2c to be tracked, in step 110. As explained above, objects 2a through 2c to be tracked may, for example, be discerned in first image 1 of the sequence without analyzing the particular type of each object 2a through 2c.
[0044] A memory 4, including memory locations 4b, that is shared by all neural networks 3a through 3c is provided in step 120. According to block 121, this memory 4 may already be pre-populated with identifications of objects 2a through 2c and/or with data that characterize the behavior or other sought properties of objects 2a through 2c.
[0045] A local memory 9a through 9c is provided for each neural network 3a through 3c, respectively, in step 125.
[0046] Images 1 from the sequence and/or details of these images are supplied to each neural network 3a through 3c in step 130. During the processing of each image 1 and/or image detail by one of neural networks 3a through 3c, an address vector 4a is generated from at least one processing product 5a through 5c of this neural network 3a through 3c in step 140. Due to the structural design of memory 4, this address vector 4a is mapped onto memory locations 4b via differentiable operations, and may be utilized in two ways.
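The description leaves the concrete form of the differentiable mapping open. One common realization is soft content-based attention, as used in memory-augmented neural networks: the address vector is compared against every memory location, a softmax converts the similarity scores into weights, and reads and writes act on all locations in proportion to those weights. The following is a minimal sketch under that assumption (the dot-product similarity and the function names are illustrative choices, not taken from the description):

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of similarity scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def soft_read(memory, address_vector):
    # Map address vector 4a onto ALL memory locations 4b with differentiable
    # weights (dot-product similarity + softmax), then return the weighted
    # sum of the stored vectors. Every operation is differentiable, so
    # gradients can flow back into the network that produced the address.
    scores = [sum(a * m for a, m in zip(address_vector, row)) for row in memory]
    weights = softmax(scores)
    width = len(memory[0])
    return [sum(w * row[j] for w, row in zip(weights, memory))
            for j in range(width)]

def soft_write(memory, address_vector, data):
    # Blend the new data into every location in proportion to its weight,
    # so writing is differentiable as well; returns the updated memory.
    scores = [sum(a * m for a, m in zip(address_vector, row)) for row in memory]
    weights = softmax(scores)
    return [[(1 - w) * m_j + w * d_j for m_j, d_j in zip(row, data)]
            for w, row in zip(weights, memory)]
```

Because the weights concentrate on the best-matching locations, an address vector that closely matches one stored key effectively selects that single location, while remaining trainable end to end.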
[0047] At least one further processing product 6a through 6c of neural network 3a through 3c is written into shared memory 4 and/or into local memory 9a through 9c, based on address vector 4a, in step 150. According to block 151, this processing product 6a through 6c may originate from a first layer of a first neural network 3a that is organized in layers.
[0048] Alternatively or also in combination therewith, data 4c are read out from shared memory 4 and/or from local memory 9a through 9c in step 160, and these data are further processed by neural network 3a through 3c in step 170. According to block 171, this further processing may in particular involve, for example, combining data 4c with at least one processing product of this neural network 3a through 3c. According to block 172, the further processing may be carried out in a second neural network 3b in a second layer that follows the first layer, from which data 4c have been taken according to block 151.
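Blocks 151, 171, and 172 can be illustrated with a toy sketch in which a first network 3a writes a first-layer processing product into shared memory 4, and a second network 3b combines data 4c read from there with its own features before processing them in the following layer. Both layer functions are hypothetical stand-ins, the combination by concatenation is only one possible choice, and the differentiable addressing of step 140 is replaced by a plain dictionary key for brevity:

```python
# Shared memory 4, reduced to a plain dictionary for this sketch.
shared_memory = {}

def first_layer_3a(x):
    # Hypothetical first layer of network 3a (toy operation: scale inputs).
    return [2.0 * v for v in x]

def second_layer_3b(features):
    # Hypothetical second layer of network 3b (toy operation: sum features).
    return sum(features)

# Block 151: network 3a writes its first-layer processing product 6a.
shared_memory["layer1_product_3a"] = first_layer_3a([1.0, 2.0])

# Blocks 171/172: network 3b reads data 4c, combines them with its own
# processing product by concatenation, and further processes the result
# in the layer that follows the first layer in the sequence.
own_features_3b = [0.5, 0.5]
data_4c = shared_memory["layer1_product_3a"]
combined = own_features_3b + data_4c
output = second_layer_3b(combined)
```

Because the two networks contain mutually corresponding sequences of layers, the product written from the first layer of network 3a has a representation that the second layer of network 3b can consume directly.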
[0049] As output 7a through 7c, each neural network 3a through 3c delivers positions of particular assigned object 2a through 2c in images 1 or image details supplied to it, and/or information concerning the behavior or other sought properties of particular assigned object 2a through 2c, in step 180.
[0050] Outputs 7a through 7c of neural networks 3a through 3c may be combined into an overall assessment 8 of the traffic situation in step 190.
[0051] An activation signal 191a for vehicle 50 may be generated from outputs 7a through 7c of neural networks 3a through 3c, and/or from overall assessment 8 of the traffic situation generated therefrom, in step 191. Vehicle 50 may be activated using this activation signal 191a in step 192.
[0052]
[0053] In particular, in order to be able to take note of the previous history and earlier predictions, each neural network 3a through 3c includes a local memory 9a through 9c, respectively, which in each case includes memory locations 4b and to which only the respective neural network 3a through 3c has access. In addition, a shared memory 4 is also provided. Based on processing products 5a through 5c of neural networks 3a through 3c, address vectors 4a may be formed, via which further processing products 6a through 6c may subsequently be stored in shared memory 4 and/or data 4c may be retrieved from shared memory 4. Access may be made to local memories 9a through 9c of individual neural networks 3a through 3c in exactly the same way. This is not depicted in
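The memory layout described above, one private local memory 9a through 9c per network plus one shared memory 4 that all networks access, can be sketched as follows. The per-object "network" computation is a hypothetical toy stand-in; only the memory wiring follows the description:

```python
class ObjectTracker:
    """One neural network 3a through 3c per tracked object. Each instance
    owns a private local memory 9a through 9c; all instances reference the
    same shared memory 4."""

    def __init__(self, shared_memory):
        self.shared = shared_memory  # memory 4, common to all trackers
        self.local = []              # memory 9x: private history/predictions

    def step(self, image_patch):
        # Toy processing product 5x: mean of the supplied image detail.
        feature = sum(image_patch) / len(image_patch)
        self.local.append(feature)   # note the previous history locally
        self.shared.append(feature)  # exchange information via memory 4
        # Toy output 7x: own feature plus context pooled from memory 4.
        return feature + sum(self.shared) / len(self.shared)

shared = []                                           # memory 4
trackers = [ObjectTracker(shared) for _ in range(3)]  # networks 3a..3c
```

In this arrangement each tracker's history stays private to its local memory, while anything written to the shared memory becomes visible to every other tracker on its next access.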
[0054] In the example shown in