H04N5/20

Evaluation device, evaluation method, and camera system
11483487 · 2022-10-25

A user can easily and appropriately evaluate the luminance of an HDR video signal. A luminance evaluation value is obtained by processing the HDR video signal and is displayed on a display unit. For example, the HDR video signal is a linear HDR video signal and/or an HDR video signal obtained by applying gradation compression with a log-curve characteristic to the linear HDR video signal. For example, the luminance evaluation value includes an average picture level, a highlight share ratio, a highlight average picture level, and a product of the highlight share ratio and the highlight average picture level.
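The evaluation values named in the abstract can be sketched for one frame of linear luminance samples as below; the highlight threshold and the normalization (1.0 = SDR reference white) are illustrative assumptions, not values from the patent.

```python
import numpy as np

def luminance_evaluation(luma, highlight_threshold=1.0):
    """Compute the abstract's evaluation values for one frame.

    luma: 2-D array of linear luminance, assumed normalized so that
          1.0 corresponds to SDR reference white (an assumption).
    """
    apl = float(np.mean(luma))                 # average picture level
    highlight = luma > highlight_threshold     # pixels above the threshold
    share = float(np.mean(highlight))          # highlight share ratio
    # highlight average picture level (0 when no pixel exceeds the threshold)
    h_apl = float(luma[highlight].mean()) if share > 0 else 0.0
    return {
        "apl": apl,
        "highlight_share_ratio": share,
        "highlight_apl": h_apl,
        "share_times_apl": share * h_apl,      # the product value
    }
```

The product value usefully summarizes both how much of the picture is in highlights and how bright those highlights are, in a single number.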

Image data encoding/decoding method and apparatus

Disclosed is an image data encoding/decoding method and apparatus. A method for decoding a 360-degree image comprises the steps of: receiving a bitstream obtained by encoding a 360-degree image; generating a prediction image by making reference to syntax information obtained from the received bitstream; combining the generated prediction image with a residual image obtained by dequantizing and inverse-transforming the bitstream, so as to obtain a decoded image; and reconstructing the decoded image into a 360-degree image according to a projection format.
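The combining step of the decoding method can be sketched for a single block. The 2×2 orthonormal Hadamard basis below is a stand-in for the codec's actual inverse transform, and the flat quantization step is likewise an assumption; the projection-format reconstruction of the 360-degree image is a separate remapping stage not shown here.

```python
import numpy as np

# Orthonormal 2x2 Hadamard basis as a stand-in for the codec's transform
# (the real transform, quantizer, and predictor are codec-specific).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

def decode_block(pred_block, quant_coeffs, qstep):
    """Sketch of the abstract's decoding steps for one block:
    dequantize -> inverse-transform -> add the prediction image."""
    coeffs = quant_coeffs * qstep      # dequantization
    residual = H.T @ coeffs @ H        # inverse transform (H is orthonormal)
    return pred_block + residual       # decoded samples
```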

Time-of-flight camera having improved dynamic range and method of generating a depth map

A time-of-flight camera for generating a depth map indicating distance(s) to target(s) includes a processor and multi-pixel time-of-flight image sensor. The camera: (a) determines, at each pixel, a plurality of phase-correlation values associated with at least one exposure duration and at least one phase offset; (b) determines for each pixel an accumulated correlation in response to the plurality of phase-correlation values; and (c) generates the depth map in response to a plurality of the accumulated correlations. A set of accumulated correlations may be determined in response to a plurality of sets of phase-correlation values such that each accumulated correlation is associated with one unique phase offset in response to each set of phase-correlation values being associated with one exposure duration, the depth map being generated in response to a plurality of sets of accumulated correlations. A computer-implemented method of generating a depth map using a time-of-flight image sensor is provided.
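Steps (b) and (c) can be sketched for a single pixel. The four phase offsets (0°, 90°, 180°, 270°) and the standard four-phase arctangent depth formula are common in continuous-wave ToF practice but are assumptions here; the patent only specifies that per-offset correlations are accumulated across exposure durations.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def depth_from_correlations(corr_sets, f_mod):
    """One depth-map pixel from accumulated phase correlations.

    corr_sets: list of [c0, c90, c180, c270], one list per exposure
               duration (offsets assumed to be 0/90/180/270 degrees).
    f_mod:     modulation frequency in Hz.
    """
    # (b) accumulate correlations per unique phase offset across exposures
    acc = [sum(s[i] for s in corr_sets) for i in range(4)]
    # (c) recover phase, then distance, with the four-phase formula
    phase = math.atan2(acc[1] - acc[3], acc[0] - acc[2]) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)
```

Accumulating across short and long exposures before the phase computation is what extends the usable dynamic range: bright targets dominate the short exposures while dim targets still contribute through the long ones.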

VIDEO SIGNAL PROCESSING DEVICE AND METHOD THEREOF
20230062367 · 2023-03-02

A video signal processing device and a corresponding method are provided. The video signal processing device includes a storage and a processor. The storage stores a plurality of brightness mapping relationships. If none of the set-brightness values corresponding to the stored mapping relationships matches the target brightness, the processor selects, from the stored relationships, a first mapping relationship and a second mapping relationship whose set-brightness values are closest to the target brightness. A target mapping relationship corresponding to the target brightness is then obtained by interpolating between the first and second mapping relationships according to their respective set-brightness values. The processor converts nonlinear brightness information of a first video signal into linear brightness information of a second video signal according to the target mapping relationship.
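The selection-and-interpolation step can be sketched as below, with each mapping relationship represented as a lookup table keyed by its set brightness. Linear interpolation between the two bracketing tables is an assumption; the abstract only states that an interpolation according to the two relationships and their set-brightness values is used.

```python
def target_mapping(luts, target):
    """Derive the mapping for a target brightness from stored mappings.

    luts: dict mapping set_brightness -> list of mapping values (a LUT).
    """
    if target in luts:
        return luts[target]                       # exact match, no interpolation
    lo = max(b for b in luts if b < target)       # first set brightness
    hi = min(b for b in luts if b > target)       # second set brightness
    w = (target - lo) / (hi - lo)                 # interpolation weight
    return [(1 - w) * a + w * b for a, b in zip(luts[lo], luts[hi])]
```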

TRANSMISSION APPARATUS, TRANSMISSION METHOD, RECEPTION APPARATUS, AND RECEPTION METHOD
20230163849 · 2023-05-25

Both a conventional receiver and an HDR-compatible receiver can favorably perform electro-optical conversion processing on transmission video data obtained using a high dynamic range opto-electronic transfer characteristic. High dynamic range opto-electronic conversion is performed on high dynamic range video data to obtain the transmission video data. Encoding processing is performed on this transmission video data to obtain a video stream, and a container of a predetermined format including the video stream is transmitted. Metadata information indicating a standard dynamic range opto-electronic transfer characteristic is inserted into a layer of the video stream, and metadata information indicating a high dynamic range opto-electronic transfer characteristic is inserted into at least one of the layer of the video stream and a layer of the container.
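The receiver-side choice implied by this dual signaling can be sketched as below. The field names and the preference order (stream-layer metadata before container-layer metadata) are illustrative assumptions; the point is that an HDR-capable receiver uses the HDR characteristic from either layer, while a conventional receiver falls back to the SDR characteristic always present in the stream layer.

```python
def select_transfer_characteristic(stream_meta, container_meta, hdr_capable):
    """Pick the transfer characteristic a receiver should apply.

    stream_meta / container_meta: dicts of metadata from the video-stream
    layer and the container layer (field names are assumptions).
    """
    if hdr_capable:
        # HDR metadata may sit in the stream layer, the container layer, or both
        hdr = stream_meta.get("hdr_transfer") or container_meta.get("hdr_transfer")
        if hdr:
            return hdr                       # HDR receiver: use the HDR characteristic
    return stream_meta["sdr_transfer"]       # conventional receiver path
```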

IMAGE ADJUSTMENT APPARATUS, IMAGE ADJUSTMENT METHOD AND PROGRAM

An image adjustment apparatus includes an input unit configured to acquire a grayscale image, i.e., an image that represents contrasting density using contrast of luminance, and a luminance transform unit configured to transform the luminances of the pixels of the grayscale image using a luminance transform function whose value varies depending on the coordinates in the grayscale image, on the basis of a camera response function, to generate a transformed grayscale image and output it to a predetermined device. In the transformed grayscale image, the global brightness is adaptively adjusted and the luminance is transformed in such a manner that local details are enhanced.
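A coordinate-dependent luminance transform of this kind can be sketched as below. The specific adaptation rule (a per-pixel gain driven by a box-blurred local mean, passed through a simple gamma curve standing in for the camera response function) is entirely an assumption; the abstract only requires that the transform value vary with the coordinates in the image.

```python
import numpy as np

def adaptive_transform(gray, crf=lambda x: x ** (1 / 2.2), k=0.5):
    """Spatially varying luminance transform through a camera response function.

    gray: 2-D array of luminances in [0, 1]. The gamma-2.2 crf and the
    gain rule are illustrative assumptions, not the patented function.
    """
    # crude local mean via a 3x3 box blur as the spatial adaptation signal
    pad = np.pad(gray, 1, mode="edge")
    local = sum(pad[i:i + gray.shape[0], j:j + gray.shape[1]]
                for i in range(3) for j in range(3)) / 9.0
    gain = 1.0 + k * (0.5 - local)   # brighten dark regions, dim bright ones
    return np.clip(crf(np.clip(gray * gain, 0.0, 1.0)), 0.0, 1.0)
```

Because the gain tracks a blurred local mean rather than the pixel itself, neighboring pixels in the same region receive nearly the same gain, so local contrast (detail) is preserved while regional brightness is equalized.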