H04N13/133

Method for image processing of image data for varying image quality levels on a two-dimensional display wall

A captured scene of a live action scene may be processed while a display wall is positioned to be part of the live action scene. To perform the processing, image data of the live action scene having a live actor and the display wall displaying a first rendering of a precursor image is received. Further, precursor metadata for the precursor image displayed on the display wall and display wall metadata for the display wall are determined. An image matte is accessed, where the image matte indicates a first portion associated with the live actor and a second portion associated with the precursor image on the display wall. Image quality levels for display wall portions of the display wall in the image data are determined, and pixels associated with the display wall in the image data are adjusted to the image quality levels.
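The matte-driven adjustment described above can be sketched as follows. This is a minimal illustration, not the patented method: the matte is assumed to be a boolean mask that is True over display-wall pixels, and "lowering the image quality level" is simulated by quantizing those pixels to fewer gray levels (a stand-in for whatever codec-style degradation the actual system applies). The function name and the quality parameterization are assumptions.

```python
import numpy as np

def adjust_wall_quality(image, matte, quality):
    """Reduce detail in display-wall pixels while leaving the live-actor
    pixels untouched.

    image   : 2-D uint8 array (single channel for simplicity)
    matte   : boolean mask, True where the frame shows the display wall
    quality : value in (0, 1]; 1.0 keeps the wall region unchanged
    """
    levels = max(2, int(256 * quality))           # fewer levels = lower quality
    step = 256 / levels
    degraded = (np.floor(image / step) * step).astype(image.dtype)
    out = image.copy()
    out[matte] = degraded[matte]                  # only wall pixels are altered
    return out
```

Actor pixels pass through untouched, so any downstream compositing still sees full-quality foreground data.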

Signal processing device and image display apparatus including the same

Disclosed are a signal processing device and an image display apparatus including the same. In the signal processing device and the image display apparatus according to the present disclosure, a High Dynamic Range (HDR) processor receives an image signal and adjusts a luminance of the image signal, and a reduction unit is configured to amplify the adjusted luminance of the image signal and to increase a grayscale resolution of the image signal to generate an enhanced image signal, wherein the enhanced image signal provides increased luminance and grayscale resolution while maintaining high dynamic range within the displayed HDR image. Accordingly, expression of high grayscale of a received image may be improved.
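The two-stage pipeline described above (luminance adjustment, then amplification into a wider grayscale range) can be sketched as below. The gamma-style tone curve and the 8-bit-to-10-bit mapping are illustrative assumptions, not the patent's actual transfer functions.

```python
import numpy as np

def enhance_signal(y8):
    """Sketch of the described pipeline on an 8-bit luminance channel.

    Stage 1 (HDR processor): adjust luminance with an assumed tone curve.
    Stage 2 (reduction unit): amplify the adjusted luminance into a
    10-bit range, increasing the grayscale resolution from 256 to 1024
    representable levels.
    """
    y = y8.astype(np.float64) / 255.0
    adjusted = y ** 0.8                            # luminance adjustment (assumed curve)
    y10 = np.clip(adjusted * 1023.0, 0.0, 1023.0)  # amplify into 10-bit grayscale
    return np.round(y10).astype(np.uint16)
```

Widening the output range before display is what preserves the high-grayscale detail: intermediate 8-bit values map to distinct 10-bit codes instead of being collapsed.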

System and method for realtime LED viewing angle correction
11496726 · 2022-11-08

A video display device includes LED pixels, a memory, and a processor. The processor receives video data that includes video pixels that correspond to the LED pixels. For at least some of the video pixels, the processor calculates a viewing angle for the LED pixel based on (i) a 3D location and optical axis vector for the LED pixel and (ii) a 3D location of a viewer of the LED pixel. The processor calculates a gain factor for the LED pixel based on the viewing angle and a relationship between pixel intensity and pixel viewing angle for the LED pixel. The processor calculates a compensated brightness for the LED pixel based on the gain factor and a brightness of the video pixel. The processor causes the LED pixel to emit light having the compensated brightness.
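The per-pixel correction above can be sketched as follows. The cosine falloff model standing in for the "relationship between pixel intensity and pixel viewing angle" is an assumption; a real panel would use a measured intensity-vs-angle curve, and the function name is hypothetical.

```python
import numpy as np

def compensated_brightness(pixel_pos, optical_axis, viewer_pos, brightness,
                           falloff=lambda theta: np.cos(theta)):
    """Compute a viewing-angle-compensated brightness for one LED pixel.

    The viewing angle is the angle between the pixel's optical axis and
    the pixel-to-viewer vector; the gain divides out the intensity loss
    at that angle so the viewer perceives the intended level.
    """
    to_viewer = np.asarray(viewer_pos, float) - np.asarray(pixel_pos, float)
    to_viewer /= np.linalg.norm(to_viewer)
    axis = np.asarray(optical_axis, float)
    axis /= np.linalg.norm(axis)
    theta = np.arccos(np.clip(np.dot(axis, to_viewer), -1.0, 1.0))
    gain = 1.0 / max(falloff(theta), 1e-6)   # avoid divide-by-zero at grazing angles
    return min(brightness * gain, 1.0)       # clamp to the panel's maximum output
```

An on-axis viewer gets a gain of 1.0 (no change); as the viewer moves off-axis the gain grows to counteract the panel's intensity falloff, up to the clamp.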

SHIFT-AND-MATCH FUSION OF COLOR AND MONO IMAGES
20170318273 · 2017-11-02

In general, techniques are described that facilitate processing of color image data using both mono image data and color image data. A device comprising a monochrome camera, a color camera, and a processor may be configured to perform the techniques. The monochrome camera may be configured to capture monochrome image data of a scene. The color camera may be configured to capture color image data of the scene. A processor may be configured to match features of the color image data to features of the monochrome image data, and compute a finite number of shift values based on the matched features of the color image data and the monochrome image data. The processor may further be configured to shift the color image data based on the finite number of shift values to generate enhanced color image data.

Method and apparatus for delivering and controlling multi-feed data
09804392 · 2017-10-31 ·

Left and right visual feeds are configured to form contiguous non-stereo left, stereo central, and non-stereo right display regions. Viewed together, an appearance of full-width stereo three-dimensionality may be achieved. The left and right display regions have the brightness of the left or right feeds respectively, while the central display region has the combined brightness of the left and right feeds. The parts of the left and right feeds that cooperate to form the stereo central display region may be scaled in brightness, so that the central display area is of uniform brightness with the left and right display areas, or smoothly varying, continuous, varying in a controlled manner, etc., rather than appearing as a sharply-defined area twice as bright as the left and right display areas. Scaling profiles may be uniform step-downs, linearly decreasing, quadratically decreasing, otherwise curved, some combination thereof, etc.
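The linearly decreasing scaling profile mentioned above can be sketched as a pair of complementary ramps across the shared central region. This is one of the several profiles the abstract lists (uniform step-down, quadratic, curved); the function name and parameterization are assumptions.

```python
import numpy as np

def overlap_ramps(width):
    """Weights for the left and right feeds across the stereo central
    region, `width` pixels wide. The left feed fades out left-to-right
    while the right feed fades in, so the summed brightness stays at 1.0
    instead of doubling where the feeds overlap.
    """
    t = np.linspace(0.0, 1.0, width)
    left_w = 1.0 - t        # left feed: full brightness at its own edge, zero at the far edge
    right_w = t             # right feed: the mirror-image ramp
    return left_w, right_w
```

Because the two weights always sum to 1.0, the central region matches the brightness of the non-stereo side regions, avoiding the sharply defined double-bright band the abstract describes.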

Apparatus and method for providing 3-dimensional around view through a user interface module included in a vehicle

A three-dimensional around view providing apparatus for providing a 3D around view through a user interface module included in a vehicle may include a plurality of image pickup units mounted in the vehicle, a depth estimator configured to receive a plurality of images from the plurality of image pickup units and to acquire a plurality of depth maps corresponding to the plurality of images, and a controller configured to minimize a depth difference between a first boundary region of a first depth map and a second boundary region of a second depth map. At least one among an autonomous vehicle, a user terminal, and a server according to embodiments of the present disclosure may be associated or integrated with an artificial intelligence (AI) module, a drone (unmanned aerial vehicle (UAV)), a robot, an augmented reality (AR) device, a virtual reality (VR) device, a 5th-generation (5G) service related device, and the like.
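One way the controller could minimize the depth difference between adjacent boundary regions is sketched below, assuming the shared boundary is the last column of one depth map and the first column of the next. Offsetting the second map by the mean boundary disagreement is an illustrative simplification; a real system would likely apply a smoother, spatially varying correction.

```python
import numpy as np

def align_boundary(depth_a, depth_b):
    """Shift depth map B so its first-column boundary agrees, on
    average, with the last-column boundary of depth map A, reducing the
    visible seam when the maps are stitched into an around view.
    """
    delta = (depth_a[:, -1] - depth_b[:, 0]).mean()  # mean boundary disagreement
    return depth_b + delta                           # constant-offset correction
```

After the correction, the mean depth across the seam matches, which removes the constant-offset component of the boundary discontinuity.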
