Patent classifications
H04N19/62
Encoding and decoding based on blending of sequences of samples along time
Computer processor hardware receives image data specifying element settings for each image of multiple original images in a sequence. The computer processor hardware analyzes the element settings across the multiple original images. The computer processor hardware then utilizes the element settings of the multiple original images in the sequence to produce first encoded image data specifying a set of common image element settings, the set of common image element settings being a baseline to substantially reproduce each of the original images in the sequence.
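The blending of element settings across a sequence can be illustrated with a per-element average producing a single baseline image (a minimal sketch; the abstract does not specify the actual blending operator, so the mean here is an assumption):

```python
def blend_baseline(images):
    """Blend a sequence of same-sized images into one baseline image by
    averaging each element's setting across the sequence. The mean is a
    stand-in; the patent's blending operator is unspecified."""
    n = len(images)
    rows, cols = len(images[0]), len(images[0][0])
    return [[sum(img[r][c] for img in images) / n for c in range(cols)]
            for r in range(rows)]

# Three hypothetical 2x2 frames of the same scene.
frames = [
    [[10, 20], [30, 40]],
    [[12, 18], [30, 44]],
    [[14, 22], [30, 36]],
]
baseline = blend_baseline(frames)  # [[12.0, 20.0], [30.0, 40.0]]
```

Each original frame could then be reproduced from the baseline plus a small per-frame residual.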
Multi-processor support for array imagers
Using the techniques discussed herein, a set of images is captured by one or more array imagers (106). Each array imager includes multiple imagers configured in various manners. Each array imager captures multiple images of substantially a same scene at substantially a same time. The images captured by each array imager are encoded by multiple processors (112, 114). Each processor can encode sets of images captured by a different array imager, or each processor can encode different sets of images captured by the same array imager. The encoding of the images is performed using various image-compression techniques so that the information that results from the encoding is smaller, in terms of storage size, than the uncompressed images.
IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD
An image processing device calculates distance information to a subject using a plurality of images whose focusing positions differ from one another, and includes a frequency converter, an amplitude extractor, and a distance information calculator. The frequency converter transforms the plurality of images into the frequency domain. The amplitude extractor extracts the amplitude component, out of the phase component and the amplitude component of the coefficients obtained by the frequency transform. The distance information calculator calculates the distance information by using lens blur data together with only the extracted amplitude component, not the phase component.
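The amplitude-only step can be sketched with a tiny 1-D DFT: defocus blur attenuates high-frequency amplitudes, and that attenuation is the distance cue the calculator combines with lens blur data (the signals below are hypothetical, and a real device would transform 2-D image blocks):

```python
import cmath

def dft_amplitudes(signal):
    """Frequency-transform a 1-D signal (naive DFT) and keep only the
    amplitude component |X_k|, discarding the phase component."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(signal)))
            for k in range(n)]

# Two differently focused captures of the same scene detail (hypothetical):
sharp   = [0.0, 1.0, 0.0, -1.0]   # in-focus: strong high-frequency content
blurred = [0.0, 0.5, 0.0, -0.5]   # defocused: same content, attenuated

amp_sharp = dft_amplitudes(sharp)    # dominant bin amplitude = 2.0
amp_blur  = dft_amplitudes(blurred)  # dominant bin amplitude = 1.0
```

The ratio of corresponding amplitudes (here 0.5) indexes into the lens blur data to yield a distance estimate, without ever using the phase.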
Hybrid transform scheme for video coding
Blocks of a frame are encoded according to a hybrid symmetrical discrete sine transform scheme. For a residual block resulting from inter prediction, the residual block is respectively transformed using a discrete cosine transform (DCT) and a symmetrical discrete sine transform (SDST). A first rate-distortion value for encoding the residual block using the DCT and a second rate-distortion value for encoding the residual block using the SDST are generated. For a residual block generated by intra prediction, the residual block is respectively transformed using at least one transform mode, each of which is not the SDST. Multiple inter prediction and intra prediction modes may be considered to encode the current block. The transform mode and the prediction mode resulting in a lowest rate-distortion value for encoding the current block are selected, and the current block is encoded into an encoded bitstream using the selected modes.
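The per-block transform choice can be sketched in 1-D: transform the same residual both ways, score each with a toy rate-distortion cost, and keep the cheaper transform (DST-II stands in for the patent's SDST, and the cost model is an illustrative assumption, not the codec's real RD computation):

```python
import math

def dct2(x):
    """DCT-II of a 1-D residual (stand-in for the 2-D block DCT)."""
    n = len(x)
    return [sum(v * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, v in enumerate(x)) for k in range(n)]

def sdst(x):
    """DST-II, used here as a stand-in for the symmetrical DST."""
    n = len(x)
    return [sum(v * math.sin(math.pi * (k + 1) * (2 * i + 1) / (2 * n))
                for i, v in enumerate(x)) for k in range(n)]

def rd_cost(coeffs, lam=1.0, q=4.0):
    """Toy rate-distortion cost: rate ~ nonzero quantized coefficients,
    distortion = quantization error energy, combined as D + lambda * R."""
    quant = [round(c / q) for c in coeffs]
    rate = sum(1 for c in quant if c != 0)
    dist = sum((c - qv * q) ** 2 for c, qv in zip(coeffs, quant))
    return dist + lam * rate

residual = [3.0, 1.0, -1.0, -3.0]   # hypothetical inter-prediction residual
costs = {"DCT": rd_cost(dct2(residual)), "SDST": rd_cost(sdst(residual))}
best = min(costs, key=costs.get)    # transform with the lowest RD value
```

An intra residual would skip the SDST branch and compare only the non-SDST transform modes, per the abstract.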
MOTION COMPENSATION AND MOTION ESTIMATION LEVERAGING A CONTINUOUS COORDINATE SYSTEM
Computer processor hardware receives settings information for a first image. The first image includes a set of multiple display elements. The computer processor hardware receives motion compensation information for a given display element in a second image to be created based at least in part on the first image. The motion compensation information indicates a coordinate location within a particular display element in the first image to which the given display element pertains. The computer processor hardware utilizes the coordinate location as a basis from which to select a grouping of multiple display elements in the first image. The computer processor hardware then generates a setting for the given display element in the second image based on settings of the multiple display elements in the grouping.
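One plausible realization of "coordinate location within a particular display element" is a fractional coordinate resolved by bilinear interpolation over the 2x2 grouping around it (a sketch under that assumption; the patent's actual grouping and kernel are unspecified):

```python
def bilinear_sample(image, x, y):
    """Sample a reference image at a continuous (x, y) coordinate by
    blending the 2x2 grouping of elements surrounding it. Bilinear
    weights are an assumption; the patent's kernel is unspecified."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    p00 = image[y0][x0]          # top-left neighbor
    p10 = image[y0][x0 + 1]      # top-right neighbor
    p01 = image[y0 + 1][x0]      # bottom-left neighbor
    p11 = image[y0 + 1][x0 + 1]  # bottom-right neighbor
    top = p00 * (1 - fx) + p10 * fx
    bot = p01 * (1 - fx) + p11 * fx
    return top * (1 - fy) + bot * fy

ref = [[0, 10],
       [20, 30]]
# Motion compensation points the new display element at (0.5, 0.5),
# i.e. inside a particular element rather than at an integer center.
value = bilinear_sample(ref, 0.5, 0.5)   # 15.0
```

The second image's element setting is thus generated from several reference elements, not copied from a single one.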
IMAGE PROCESSING DEVICE AND PROCESSING METHOD THEREOF
There are provided an image processing device and a processing method thereof. The image processing method includes obtaining an interference signal using a sample beam and a reference beam, transforming the interference signal by using a numerical signal processing method or an intensity mixing method to generate a transformed interference signal, and obtaining a three-dimensional (3D) phase image by using the interference signal and the transformed interference signal.
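Phase recovery from interference signals can be illustrated with the classic four-step phase-shifting algorithm, used here as a stand-in for the patent's unspecified numerical signal processing (the amplitudes and sample phase below are hypothetical):

```python
import math

def interference(a, b, phi, delta):
    """Intensity of a sample beam interfering with a reference beam
    shifted by delta: I = a + b*cos(phi + delta)."""
    return a + b * math.cos(phi + delta)

def recover_phase(i1, i2, i3, i4):
    """Four-step phase-shifting recovery: intensities captured at
    reference shifts 0, pi/2, pi, 3*pi/2 give the sample phase via
    atan2, since i4 - i2 = 2b*sin(phi) and i1 - i3 = 2b*cos(phi)."""
    return math.atan2(i4 - i2, i1 - i3)

phi_true = 0.7   # hypothetical sample phase at one pixel
shots = [interference(2.0, 1.0, phi_true, d)
         for d in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
phi_est = recover_phase(*shots)   # recovers 0.7
```

Repeating the recovery per pixel and per depth plane would build up the 3D phase image the abstract describes.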