H04N19/10

Systems and methods for determining processing completeness within a distributed media item processing environment
10911792 · 2021-02-02

Systems, methods, and non-transitory computer readable media are configured to receive a media item. The media item can be split into a plurality of segments. The plurality of segments can be subjected to a plurality of distributed prepublication processing stages. One or more stage progress reports can be received. Each of the one or more stage progress reports can indicate an extent complete of one of the plurality of distributed prepublication processing stages for one of the plurality of segments. An overall extent complete can be calculated with respect to the media item based on the one or more stage progress reports.
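The aggregation described above, combining per-segment, per-stage progress reports into one overall completion figure, can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; the function name, report schema, and the treatment of missing reports as 0% are all assumptions.

```python
from typing import Dict, List, Tuple

def overall_extent_complete(
    progress_reports: Dict[Tuple[int, str], float],
    num_segments: int,
    stages: List[str],
) -> float:
    """Aggregate per-(segment, stage) progress reports into one overall
    completion fraction for the media item.

    Each report value is in [0.0, 1.0]. (Segment, stage) pairs with no
    report yet are assumed to be 0% complete.
    """
    total_units = num_segments * len(stages)
    done = sum(
        progress_reports.get((seg, stage), 0.0)
        for seg in range(num_segments)
        for stage in stages
    )
    return done / total_units

# Example: 2 segments, 2 stages; three reports received so far.
reports = {
    (0, "transcode"): 1.0,
    (0, "thumbnail"): 0.5,
    (1, "transcode"): 0.5,
}
print(overall_extent_complete(reports, 2, ["transcode", "thumbnail"]))  # 0.5
```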

Warp processing for image capture

Systems and methods are disclosed for image signal processing. For example, methods may include receiving, by an image signal processor, one or more input image signals from one or more image sensors; determining a mapping based on the input image signal(s), wherein the mapping includes records that associate image portions of an output image with corresponding image portions of the input image signal(s); sorting the records of the mapping according to an order of the corresponding image portions of the input image signal(s); applying, by the image signal processor, image processing to image portions of the input image signal(s) to determine image portions of one or more processed images in the order; and determining, by the image signal processor, the image portions of the output image based at least in part on the mapping and the corresponding image portions of the processed image(s) in the order.
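The sorted-mapping idea above can be illustrated with a toy sketch: each record associates an output-image portion with the input portion it warps from, and sorting the records by input-side position lets processing consume input portions in readout order. All names and the tile-level granularity here are illustrative assumptions, not the patented design.

```python
def build_sorted_mapping(warp):
    """warp: dict mapping output tile (row, col) -> source input tile (row, col).

    Returns mapping records sorted by the input-side tile position, so a
    processor can walk the input in raster (readout) order.
    """
    records = list(warp.items())
    records.sort(key=lambda rec: rec[1])
    return records

def render(records, process, fetch_input_tile):
    """Process input tiles in sorted order and scatter results to output tiles."""
    output = {}
    for out_tile, in_tile in records:
        output[out_tile] = process(fetch_input_tile(in_tile))
    return output

# Toy warp: vertical flip of a 2x1 tile grid.
warp = {(0, 0): (1, 0), (1, 0): (0, 0)}
records = build_sorted_mapping(warp)
print([in_tile for _, in_tile in records])  # [(0, 0), (1, 0)] — input raster order
```

Sorting by input position means each input tile can be processed once, as it arrives from the sensor, rather than in the scattered order the output raster would demand.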

VIDEO IMAGE PROCESSING METHOD AND DEVICE
20210021857 · 2021-01-21

A video image processing method includes determining a current image block and, in response to a size of the current image block meeting a preset condition, determining a temporal candidate motion vector of the current image block according to at least one of a temporal motion vector prediction (TMVP) operation or an advanced/alternative temporal motion vector prediction (ATMVP) operation.
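The size gate described above can be sketched as a simple guard: temporal candidates are derived only when the block dimensions meet the preset condition. The threshold value, the preference for ATMVP over TMVP, and all names are hypothetical; the actual condition is codec-specific and not stated in the abstract.

```python
MIN_DIM = 8  # hypothetical threshold; the real preset condition is codec-specific

def temporal_candidate(block_w, block_h, derive_tmvp, derive_atmvp):
    """Return a temporal candidate motion vector only when the block size
    meets the preset condition; try ATMVP first, then fall back to TMVP."""
    if block_w < MIN_DIM or block_h < MIN_DIM:
        return None  # condition not met: no temporal candidate is derived
    return derive_atmvp() or derive_tmvp()
```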

VIDEO IMAGE PROCESSING METHOD AND DEVICE
20210021858 · 2021-01-21

A video image processing method includes determining a current image block and, in response to a size of the current image block meeting a preset condition, determining a temporal candidate motion vector of the current image block according to at least one of a temporal motion vector prediction (TMVP) operation or an advanced/alternative temporal motion vector prediction (ATMVP) operation.

Methods and apparatuses for encoding and decoding video using periodic buffer description

A method of encoding video includes: writing a plurality of predetermined buffer descriptions into a sequence parameter set of a coded video bitstream; writing a plurality of updating parameters into a slice header of the coded video bitstream for selecting and modifying one buffer description out of the plurality of buffer descriptions; and encoding a slice into the coded video bitstream using the slice header and the modified buffer description.
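The select-then-modify scheme above can be sketched at the decoder side: the sequence parameter set carries a list of predefined buffer descriptions, and the slice header carries an index that selects one plus updating parameters that override individual entries. The dictionary schema and field names here are invented for illustration.

```python
import copy

def buffer_description_for_slice(sps_buffer_descriptions, slice_header):
    """Select one predefined buffer description by index, then apply the
    slice header's updating parameters as per-entry overrides
    (hypothetical schema)."""
    bd = copy.deepcopy(sps_buffer_descriptions[slice_header["bd_index"]])
    for pos, ref_pic in slice_header.get("updates", {}).items():
        bd[pos] = ref_pic
    return bd

# SPS holds two periodic buffer descriptions; the slice header picks the
# second and replaces its first entry with a long-term reference.
sps = [[("poc", -1), ("poc", -2)], [("poc", -1), ("poc", -4)]]
hdr = {"bd_index": 1, "updates": {0: ("ltrp", 0)}}
print(buffer_description_for_slice(sps, hdr))  # [('ltrp', 0), ('poc', -4)]
```

Signalling only an index plus sparse overrides in each slice header, instead of a full buffer description, is what makes the periodic scheme compact.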

Methods and apparatuses for encoding and decoding digital light field images

A method for encoding a raw lenselet image includes a receiving phase, wherein at least a portion of a raw lenselet image is received, the image including a plurality of macro-pixels, each macro-pixel having pixels corresponding to a specific view angle for the same point of a scene, and an output phase, wherein a bitstream having at least a portion of an encoded lenselet image is outputted. The method has an image transform phase, wherein the pixels of said raw lenselet image are spatially displaced in a transformed multi-color image having a larger number of columns and rows with respect to the received raw lenselet image, wherein dummy pixels having undefined value are inserted into the raw lenselet image and wherein the displacement is performed so as to put the estimated center location of each macro-pixel onto integer pixel locations. Moreover, the method includes a sub-view generation phase, wherein a sequence of sub-views is generated, said sub-views having pixels of the same angular coordinates extracted from different macro-pixels of the transformed raw lenselet image. Finally, the method has a graph coding phase, wherein a bitstream is generated by encoding a graph representation of at least one of the sub-views of the sequence according to a predefined graph signal processing technique.
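The sub-view generation phase described above, collecting pixels of the same angular coordinate from every macro-pixel, can be sketched with a reshape once the macro-pixels have been aligned to an integer grid. This assumes square m×m macro-pixels and a single-channel image for simplicity; the function name is illustrative.

```python
import numpy as np

def extract_subviews(img, m):
    """Split a centred, integer-aligned lenselet image whose macro-pixels
    are m x m blocks into sub-views: sub-view (u, v) collects the pixel at
    angular offset (u, v) from every macro-pixel."""
    H, W = img.shape
    assert H % m == 0 and W % m == 0, "image must tile exactly into macro-pixels"
    # (macro rows, u, macro cols, v) -> (u, v, macro rows, macro cols)
    return img.reshape(H // m, m, W // m, m).transpose(1, 3, 0, 2)

img = np.arange(16).reshape(4, 4)   # 2x2 grid of 2x2 macro-pixels
views = extract_subviews(img, 2)
print(views[0, 0])  # top-left pixel of each macro-pixel: [[0 2] [8 10]]
```

Each sub-view is an ordinary image seen from one angular direction, which is what makes it amenable to the graph coding phase that follows.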

Signalling of filtering information

A video decoder is configured to, for a group of video blocks of the video data, determine that a number of merged groups for a plurality of classes is equal to one merged group; receive a first flag indicating that filter coefficient information for at least one merged group is not coded in the video data; receive, for the one merged group, a second flag, wherein a first value for the second flag indicates that filter coefficient information mapped to the one merged group is coded in the video data, and wherein a second value for the second flag indicates that the filter coefficient information mapped to the one merged group is all zero values; determine that the second flag is equal to the second value; and determine one or more filters from a set of filters using the all-zero values.
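The two-flag parsing described above can be sketched as a short decoder-side decision. The reader callbacks and names are hypothetical; the point is that when both flags take their "uncoded" values, the decoder synthesizes all-zero coefficients without parsing any from the bitstream.

```python
def decode_filter_coefficients(read_flag, read_coeffs, num_taps):
    """Hypothetical sketch of the two-flag scheme: a first flag signals that
    some merged group's coefficients are not coded; a second flag for the one
    merged group then distinguishes 'coded in the bitstream' from 'all zeros'."""
    some_group_uncoded = read_flag()          # first flag
    if not some_group_uncoded:
        return read_coeffs(num_taps)          # everything coded normally
    group_is_coded = read_flag()              # second flag for the one merged group
    if group_is_coded:
        return read_coeffs(num_taps)
    return [0] * num_taps                     # all-zero coefficients, nothing parsed

bits = iter([1, 0])  # first flag set; second flag takes the "all zeros" value
coeffs = decode_filter_coefficients(lambda: next(bits), lambda n: list(range(1, n + 1)), 5)
print(coeffs)  # [0, 0, 0, 0, 0]
```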