Patent classifications
H04N9/68
ELECTRONIC APPARATUS AND OPERATING METHOD OF THE ELECTRONIC APPARATUS
An electronic apparatus includes: an image sensor for acquiring pixel values of first pixels sensed during a first exposure time and second pixels sensed during a second exposure time longer than the first exposure time; and a controller for outputting an output image acquired based on pixel values of the first pixels and a corrected saturated pixel value obtained by correcting a pixel value of a saturated pixel having a pixel value exceeding a threshold value among the second pixels, using a pixel value of at least one first pixel having a distance closest to a position of the saturated pixel among the first pixels.
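The correction described above can be sketched in Python; this is a minimal, hedged illustration on a 1-D line of interleaved pixels (the function name, arguments, and pixel layout are illustrative assumptions, not the patent's actual implementation):

```python
import numpy as np

def correct_saturated(values, is_short, exposure_ratio, threshold):
    """Illustrative sketch: replace saturated long-exposure pixel values
    with the nearest short-exposure pixel value scaled by the exposure
    ratio, as the abstract describes.

    values         : pixel values along one sensor line
    is_short       : boolean mask, True where a pixel used the short exposure
    exposure_ratio : long exposure time / short exposure time
    threshold      : saturation threshold for long-exposure pixels
    """
    out = values.astype(np.float64).copy()
    short_pos = np.flatnonzero(is_short)
    for i in np.flatnonzero(~is_short):
        if values[i] > threshold:
            # closest short-exposure ("first") pixel to the saturated pixel
            j = short_pos[np.argmin(np.abs(short_pos - i))]
            out[i] = values[j] * exposure_ratio
    return out
```

Scaling by the exposure ratio estimates what the saturated long-exposure pixel would have read had it not clipped, since a linear sensor's response is proportional to exposure time.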
HDR color processing for saturated colors
To mitigate some problems of the pixel color mapping used in SLHDR-type HDR video decoding, a high dynamic range video encoding circuit (300) is taught, configured to encode a high dynamic range image (IM_HDR) of a first maximum pixel luminance (PB_C1), together with a second image (Im_LWRDR) of lower dynamic range and correspondingly lower second maximum pixel luminance (PB_C2), the second image being functionally encoded as a luma mapping function (400) for decoders to apply to pixel lumas (Y_PQ) of the high dynamic range image to obtain corresponding pixel lumas (PO) of the second image, the encoder comprising a data formatter (304) configured to output to a video communication medium (399) the high dynamic range image and metadata (MET) encoding the luma mapping function (400), the functional encoding of the second image being based also on a color lookup table (CL(Y_PQ)) which encodes a multiplier constant (B) for all possible values of the pixel lumas of the high dynamic range image, and the formatter being configured to output this color lookup table in the metadata, characterized in that the high dynamic range video encoding circuit comprises: a gain determination circuit (302) configured to determine a luma gain value (G_PQ) which quantifies a ratio of an output image luma for a luma position equal to a correct normalized luminance position divided by an output luma for the luma of the pixel of the high dynamic range image, wherein the high dynamic range video encoding circuit comprises a color lookup table determination circuit (303) configured to determine the color lookup table (CL(Y_PQ)) based on values of the luma gain value for various lumas of pixels present in the high dynamic range image. Similarly, we teach how the same principles can be embodied in an SLHDR-type video decoder.
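A hedged sketch of how a per-luma multiplier lookup table could be derived from gain values, in the spirit of the abstract; the function name, binning scheme, and example luma mapping are illustrative assumptions only, not the patented circuit:

```python
import numpy as np

def build_color_lut(luma_map, hdr_lumas, correct_lumas, n_bins=256):
    """Illustrative sketch: derive a per-luma multiplier LUT from gain
    values, where each gain is the mapped luma at the 'correct' normalized
    luminance position divided by the mapped luma of the pixel itself.

    luma_map      : callable mapping normalized HDR luma -> SDR luma
    hdr_lumas     : normalized pixel lumas present in the HDR image
    correct_lumas : 'correct' normalized luminance positions per pixel
    """
    gains = luma_map(correct_lumas) / np.maximum(luma_map(hdr_lumas), 1e-6)
    lut = np.ones(n_bins)
    bins = np.clip((hdr_lumas * (n_bins - 1)).astype(int), 0, n_bins - 1)
    # average the gains observed in each luma bin; empty bins stay at 1.0
    for b in range(n_bins):
        sel = bins == b
        if sel.any():
            lut[b] = gains[sel].mean()
    return lut
```

A decoder would then look up the multiplier constant for each pixel's luma and apply it in the color-processing chain.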
Signal reshaping for high dynamic range signals
In a method to improve backwards compatibility when decoding high-dynamic range images coded in a wide color gamut (WCG) space which may not be compatible with legacy color spaces, hue and/or saturation values of images in an image database are computed for both a legacy color space (say, YCbCr-gamma) and a preferred WCG color space (say, IPT-PQ). Based on a cost function, a reshaped color space is computed so that the distance between the hue values in the legacy color space and rotated hue values in the preferred color space is minimized. HDR images are coded in the reshaped color space. Legacy devices can still decode standard dynamic range images assuming they are coded in the legacy color space, while updated devices can use color reshaping information to decode HDR images in the preferred color space at full dynamic range.
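The hue-alignment step can be illustrated with a brute-force search for the rotation angle that minimizes a hue-distance cost; a hedged sketch only (the grid search, angle wrapping, and squared-error cost are illustrative stand-ins, not the patent's actual cost function):

```python
import numpy as np

def best_rotation(legacy_hues, wcg_hues):
    """Illustrative sketch: find the chroma-plane rotation angle (degrees)
    minimizing the mean squared distance between hue angles in a legacy
    color space and rotated hue angles in the preferred WCG space."""
    candidates = np.arange(-180.0, 180.0, 0.5)

    def cost(theta):
        # wrap angle differences into (-180, 180] before squaring
        d = (wcg_hues + theta - legacy_hues + 180.0) % 360.0 - 180.0
        return np.mean(d ** 2)

    costs = [cost(t) for t in candidates]
    return candidates[int(np.argmin(costs))]
```

The resulting angle would be signaled as part of the color reshaping information so updated decoders can undo the rotation.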
Method for displaying a video stream of a scene
A method for displaying a video stream of a scene captured by a monitoring camera on a display of a remote client device comprises receiving, at the client device, the video stream and information indicating an on/off status for an IR-illuminator illuminating the scene; setting a display setting of the display differently based on the on/off status for the IR-illuminator; and displaying the video stream on the display using the display setting. The client device comprises a display, a receiver which receives a video stream from the monitoring camera, and a control circuit. The control circuit executes an IR-illuminator status function which extracts information indicating the on/off status for the IR-illuminator illuminating the scene, a display setting function which sets a display setting of the display differently based on the on/off status for the IR-illuminator, and a displaying function which displays the video stream on the display using the setting set by the display setting function.
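The control flow amounts to selecting display settings from the received IR status; a minimal illustrative sketch (the specific settings and their values are assumptions, not the patent's):

```python
def display_setting_for_ir(ir_on, day_brightness=0.8, night_brightness=0.4):
    """Illustrative sketch: pick display settings from the IR-illuminator
    on/off status signalled alongside the video stream."""
    if ir_on:
        # IR footage is near-monochrome: dim the display and drop saturation
        # so noise and false color are less distracting at night
        return {"brightness": night_brightness, "saturation": 0.0}
    return {"brightness": day_brightness, "saturation": 1.0}
```

The client would call this each time the status information changes and push the result to the display driver before rendering the stream.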
CONTENT CREATIVE INTENTION PRESERVATION UNDER VARIOUS AMBIENT COLOR TEMPERATURES
One embodiment provides a method comprising receiving an input content, and receiving ambient contextual data indicative of one or more ambient lighting conditions of an environment including a display device. The input content has corresponding metadata that at least partially represents a creative intent indicative of how the input content is intended to be viewed. The method further comprises adaptively correcting the input content based on the ambient contextual data to preserve the creative intent, and providing the corrected input content to the display device for presentation. The adaptively correcting comprises applying automatic white balancing to the input content to correct color tone of the input content.
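The automatic white balancing step could be sketched as a simple von Kries-style per-channel scaling driven by the ambient correlated color temperature; a hedged illustration only (the linear gain model and clipping range are assumptions, not the patent's correction):

```python
import numpy as np

def ambient_white_balance(rgb, ambient_cct, reference_cct=6500.0):
    """Illustrative sketch: shift an image's color tone toward the ambient
    color temperature so the content still reads as neutral under the
    room's lighting. rgb is an Nx3 array of linear values in [0, 1]."""
    # relative deviation of ambient CCT from the mastering reference,
    # clamped so the correction stays subtle
    shift = np.clip((ambient_cct - reference_cct) / reference_cct, -0.3, 0.3)
    # cooler ambient light (higher CCT) -> warm the content, and vice versa
    gains = np.array([1.0 + shift, 1.0, 1.0 - shift])
    return np.clip(rgb * gains, 0.0, 1.0)
```

At the reference color temperature the gains are all 1.0 and the content passes through unchanged, which is consistent with preserving the creative intent when no ambient correction is needed.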
Display Management for High Dynamic Range Video
A display management processor receives an input image with enhanced dynamic range to be displayed on a target display which has a different dynamic range than a reference display. The input image is first transformed into a perceptually-quantized (PQ) color space, preferably the IPT-PQ color space. A color volume mapping function, which includes an adaptive tone-mapping function and an adaptive gamut mapping function, generates a mapped image. A detail-preservation step is applied to the intensity component of the mapped image to generate a final mapped image with a filtered tone-mapped intensity image. The final mapped image is then translated back to the display's preferred color space. Examples of the adaptive tone mapping and gamut mapping functions are provided.
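The abstract does not specify the adaptive tone-mapping function itself; as an illustrative stand-in, here is a Reinhard-style curve compressing a source peak luminance to a target display peak (an assumption for illustration, not the patent's actual mapping):

```python
def tone_map(intensity, src_peak, dst_peak):
    """Illustrative stand-in for an adaptive tone-mapping function: an
    extended-Reinhard curve mapping linear intensity (in nits) from a
    source peak luminance down to a target display peak."""
    lw = src_peak / dst_peak          # source peak in display-relative units
    l = intensity / dst_peak
    # extended Reinhard: maps 0 -> 0 and src_peak -> dst_peak, rolling off
    # highlights progressively harder as the peak mismatch grows
    ld = l * (1.0 + l / (lw * lw)) / (1.0 + l)
    return ld * dst_peak
```

For example, mapping a 4000-nit source onto a 100-nit display leaves blacks untouched, compresses mid-tones moderately, and lands the source peak exactly at the display peak; the detail-preservation step in the abstract would then restore high-frequency texture lost to this compression.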