H04N19/553

Image processing apparatus, image processing method, and image processing program
10038913 · 2018-07-31

One embodiment discloses an image processing apparatus which generates an interpolation frame from consecutive first and second frames, including: a motion estimation section which assigns, to the interpolation frame, motion vectors from the first frame to the second frame; a first degree-of-difference calculation section which calculates a first degree of difference in terms of pixel values; a second degree-of-difference calculation section which calculates a second degree of difference in terms of vectors; and an interpolation frame generation section which generates the interpolation frame by determining the motion vector that should be assigned to a pixel of attention on the basis of combined weights obtained by combining the first degrees of difference and the second degrees of difference.
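The selection step in this abstract, combining a pixel-value difference with a vector difference into a single weight per candidate, can be sketched as follows. The normalization and the blending factor `alpha` are illustrative assumptions, not the patent's formulation.

```python
import numpy as np

def select_motion_vector(candidates, pixel_diffs, vector_diffs, alpha=0.5):
    """Pick the candidate motion vector with the lowest combined cost.

    candidates   : candidate (dx, dy) vectors for the pixel of attention
    pixel_diffs  : first degrees of difference (pixel-value mismatch per candidate)
    vector_diffs : second degrees of difference (mismatch against neighboring vectors)
    alpha        : hypothetical blending factor between the two difference terms
    """
    pixel_diffs = np.asarray(pixel_diffs, dtype=float)
    vector_diffs = np.asarray(vector_diffs, dtype=float)

    def norm(x):
        # Scale each term to [0, 1] so neither dominates purely by magnitude.
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    combined = alpha * norm(pixel_diffs) + (1.0 - alpha) * norm(vector_diffs)
    return candidates[int(np.argmin(combined))]
```

A candidate that is mediocre on pixel match but consistent with neighboring vectors can win over one that matches pixels well but diverges from the local vector field.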

Method and system for video frame interpolation based on optical flow method
20180176574 · 2018-06-21

A method and system for video frame interpolation based on an optical flow method are disclosed. The process includes calculating bidirectional motion vectors between two adjacent frames in the frame sequence of an input video by using the optical flow method, judging the reliabilities of the bidirectional motion vectors between the two adjacent frames, and addressing the jaggedness and noise problems of the optical flow method; marking occlusion and exposure regions in the two adjacent frames and updating unreliable motion vectors; mapping, according to the marking information about the occlusion and exposure regions and the bidirectional motion vector field, the front and back frames to the interpolation position to obtain a forward interpolated frame and a backward interpolated frame; synthesizing the forward interpolated frame and the backward interpolated frame into an interpolated frame; and repairing hole points in the interpolated frame to obtain the final interpolated frame. Because the optical flow method operates on pixels rather than blocks, the disclosed method and system are more accurate and avoid blocking artifacts and related problems.
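The synthesis and hole-repair stages described above can be sketched as below, with NaN marking hole points. This is a simplified stand-in (plain averaging and 4-neighbor repair), not the patent's actual synthesis rule.

```python
import numpy as np

def synthesize_interpolated(fwd, bwd):
    """Merge a forward- and a backward-interpolated frame (NaN = hole point),
    then repair remaining holes from valid 4-neighbors."""
    fwd = np.asarray(fwd, dtype=float)
    bwd = np.asarray(bwd, dtype=float)
    # Average where both frames contribute; fall back to the valid one otherwise.
    out = np.where(np.isnan(fwd), bwd,
                   np.where(np.isnan(bwd), fwd, 0.5 * (fwd + bwd)))
    # Repair remaining hole points by averaging the valid 4-neighbors.
    for y, x in np.argwhere(np.isnan(out)):
        neigh = []
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < out.shape[0] and 0 <= nx < out.shape[1] \
                    and not np.isnan(out[ny, nx]):
                neigh.append(out[ny, nx])
        if neigh:
            out[y, x] = sum(neigh) / len(neigh)
    return out
```

In practice the merge weights would depend on the occlusion/exposure marking, so that occluded pixels are taken from only one direction.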

VIDEO FRAME RATE CONVERSION USING STREAMED METADATA
20180132009 · 2018-05-10

A video server generates metadata representative of interpolation parameters for portions of a first frame representative of a scene in a stream of frames including the first frame. The interpolation parameters are used to generate at least one interpolated frame representative of the scene subsequent to the first frame and prior to a second frame in the stream of frames. The video server incorporates the metadata into the stream and transmits the stream including the multiplexed metadata. A video client receives the stream of frames, including the first frame and the metadata. The video client generates one or more interpolated frames representative of the scene subsequent to the first frame and prior to a second frame in the stream of frames based on the first frame and the metadata. The video client displays the first frame, the one or more interpolated frames, and the second frame.
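A minimal sketch of the client side, assuming a hypothetical metadata format in which the interpolation parameters reduce to a per-pixel blend weight between the bracketing frames. The real metadata would carry richer parameters (e.g., per-region motion), so this only illustrates the division of labor between server and client.

```python
import numpy as np

def interpolate_from_metadata(frame_a, frame_b, weights):
    """Client-side sketch: build an interpolated frame from the two bracketing
    frames and a weight map carried as stream metadata (hypothetical format).
    weight 0 -> copy of frame_a, weight 1 -> copy of frame_b."""
    w = np.asarray(weights, dtype=float)
    return (1.0 - w) * np.asarray(frame_a, float) + w * np.asarray(frame_b, float)
```

The point of the scheme is that the server, which has more compute, decides the parameters once, while each client applies them cheaply.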

SYSTEMS AND METHODS FOR ADAPTIVE SELECTION OF WEIGHTS FOR VIDEO CODING

Techniques and systems are provided for processing video data. For example, a current block of a picture of the video data can be obtained for processing by an encoding device or a decoding device. A pre-defined set of weights for template matching based motion compensation is also obtained. A plurality of metrics associated with one or more spatially neighboring samples of the current block and one or more spatially neighboring samples of at least one reference frame are determined. A set of weights is selected from the pre-defined set of weights to use for the template matching based motion compensation. The set of weights is determined based on the plurality of metrics. The template matching based motion compensation is performed for the current block using the selected set of weights.
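One way the selection step can work is to score each pre-defined weight set by how well it predicts the current block's neighboring samples from the reference neighbors, then keep the best scorer. The absolute-error metric and the (weight, offset) form of each set are illustrative assumptions; the actual metrics are defined by the codec.

```python
def select_weight_set(cur_template, ref_template, weight_sets):
    """Choose, from a pre-defined list of (weight, offset) pairs, the one whose
    prediction  cur ~ weight * ref + offset  over the spatially neighboring
    samples (templates) has the lowest sum of absolute errors."""
    def cost(ws):
        w, off = ws
        return sum(abs(c - (w * r + off))
                   for c, r in zip(cur_template, ref_template))
    return min(weight_sets, key=cost)
```

Because both encoder and decoder can run this on already-reconstructed neighbors, the chosen set need not be signaled explicitly.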

SYSTEMS AND METHODS OF ADAPTIVELY DETERMINING TEMPLATE SIZE FOR ILLUMINATION COMPENSATION

Techniques and systems are provided for processing video data. For example, a current block of a picture of the video data can be obtained for processing by an encoding device or a decoding device. A parameter of the current block can be determined. Based on the determined parameter of the current block, at least one or more of a number of rows of samples or a number of columns of samples in a template of the current block, and at least one or more of a number of rows of samples or a number of columns of samples in a template of a reference picture, can be determined. Motion compensation for the current block can be performed. For example, one or more local illumination compensation parameters can be derived for the current block using the template of the current block and the template of the reference picture.
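A sketch of the adaptive sizing idea, taking block dimensions as the determining parameter. The thresholds and the 1/2/4-row tiers are illustrative assumptions, not the rules claimed here; the point is only that template size follows from a block parameter rather than being fixed.

```python
def template_dimensions(block_width, block_height):
    """Derive (rows, cols) of the template from the block size: larger blocks
    get thicker templates so the illumination-compensation fit sees more
    samples. Thresholds are hypothetical."""
    rows = 1 if block_height <= 8 else 2 if block_height <= 32 else 4
    cols = 1 if block_width <= 8 else 2 if block_width <= 32 else 4
    return rows, cols
```

The same dimensions would be applied to the co-located template in the reference picture so the two sample sets stay aligned.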

SYSTEMS AND METHODS OF PERFORMING IMPROVED LOCAL ILLUMINATION COMPENSATION

Techniques and systems are provided for processing video data. For example, video data can be obtained for processing by an encoding device or a decoding device. Bi-predictive motion compensation can then be performed for a current block of a picture of the video data. Performing the bi-predictive motion compensation includes deriving one or more local illumination compensation parameters for the current block using a template of the current block, a first template of a first reference picture, and a second template of a second reference picture. The templates can include neighboring samples of the current block, the first reference picture, and the second reference picture. The first template of the first reference picture and the second template of the second reference picture can be used simultaneously to derive the one or more local illumination compensation parameters.
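Local illumination compensation is commonly modeled as a linear fit cur ≈ a·ref + b over template samples. A sketch of deriving one (a, b) pair from both reference templates simultaneously, by stacking the two sample sets into a single least-squares fit, is below; the stacking strategy is an illustrative assumption, not necessarily the derivation claimed here.

```python
def derive_lic_params(cur_template, ref_template_0, ref_template_1):
    """Least-squares fit of scale/offset (a, b) with cur ~ a * ref + b, using
    samples from both reference templates at once (each paired with the
    current template)."""
    refs = list(ref_template_0) + list(ref_template_1)
    curs = list(cur_template) + list(cur_template)
    n = len(refs)
    sx = sum(refs)
    sy = sum(curs)
    sxx = sum(r * r for r in refs)
    sxy = sum(r * c for r, c in zip(refs, curs))
    denom = n * sxx - sx * sx
    if denom == 0:
        # Degenerate reference samples: fall back to a pure offset model.
        return 1.0, (sy - sx) / n
    a = (n * sxy - sx * sy) / denom
    b = (sy - a * sx) / n
    return a, b
```

Using both templates in one fit, rather than fitting each reference separately, gives a single parameter pair that is consistent across the two prediction directions.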

VIDEO COMPRESSION WITH ADAPTIVE VIEW-DEPENDENT LIGHTING REMOVAL
20180097867 · 2018-04-05 ·

A video stream of a scene for a virtual reality or augmented reality experience may be captured by one or more image capture devices. Data from the video stream may be retrieved, including base vantage data with base vantage color data depicting the scene from a base vantage location, and target vantage data with target vantage color data depicting the scene from a target vantage location. The base vantage data may be reprojected to the target vantage location to obtain reprojected target vantage data. The reprojected target vantage data may be compared with the target vantage data to obtain residual data. The residual data may be compressed by removing a subset of the residual data that is likely to be less viewer-discernible than the remainder of the residual data. A compressed video stream may be stored, including the base vantage data and the compressed residual data.
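The residual step can be sketched as below. Zeroing small-magnitude residuals is a crude stand-in for the perceptual "less viewer-discernible" criterion; a real system would use a view-dependent model, and the threshold here is purely illustrative.

```python
import numpy as np

def compress_residual(target, reprojected, threshold=2.0):
    """Compute the residual between target-vantage data and the base-vantage
    data reprojected to the target vantage, then drop the subset approximated
    as imperceptible: residuals below a magnitude threshold are zeroed."""
    residual = np.asarray(target, float) - np.asarray(reprojected, float)
    residual[np.abs(residual) < threshold] = 0.0
    return residual
```

The stored stream then only needs the base vantage plus the sparse surviving residuals, from which the target vantage can be reconstructed.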

Method for detecting occlusion areas
09917988 · 2018-03-13

A method and apparatus for occlusion area detection based on the block difference associated with a motion vector and a predicted block difference are disclosed. For each current block of a frame, motion estimation is performed based on a temporally previous frame and a temporally subsequent frame. Based on the derived motion vector, two reference blocks of the current block are located in the temporally neighboring frames, and the block difference between these two reference blocks is calculated for the current block. By comparing the block difference with a predicted block difference of the current block, the current block is classified as an occlusion block or a non-occlusion block. The predicted block difference is updated by averaging the block differences of neighboring blocks in a non-motion-boundary area.
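The core comparison can be sketched as follows, using SAD as the block difference and a multiplicative threshold; the `factor` value is an illustrative assumption, not the patent's decision rule.

```python
def is_occlusion_block(ref_block_prev, ref_block_next, predicted_diff, factor=2.0):
    """Flag a block as occluded when the SAD between its two motion-compensated
    reference blocks (from the previous and subsequent frames) is much larger
    than the predicted block difference learned from non-boundary neighbors."""
    block_diff = sum(abs(a - b) for a, b in zip(ref_block_prev, ref_block_next))
    return block_diff > factor * predicted_diff
```

The intuition: where motion estimation is valid, the two reference blocks should match and the difference stays near its predicted level; in occluded regions one of the references shows different content, so the difference spikes.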
