Patent classifications
G09G5/026
COLOR CALIBRATION MODULE AND ELECTRONIC DEVICE INCLUDING SAME
A color calibration module applied to an electronic device is provided. The electronic device includes a screen. The screen includes an edge. The color calibration module includes a base and a body. The base is detachably disposed on the edge. The body includes a first end and a second end. The first end is rotatably connected to the base through a rotating shaft, and the second end includes a color calibration detecting head. An electronic device including the color calibration module is further provided.
Frame Replay with Selectable Taps
Systems, methods, and devices are provided to selectively perform a frame replay at various stages of an image processing pipeline for an electronic display. Image processing circuitry may include first compensation circuitry that compensates for a first compensation factor relating to a first physical parameter of an electronic display and second compensation circuitry that compensates for a second compensation factor relating to a second physical parameter of the electronic display. A first tap point, which enables an image frame to be stored and reused, may be located between the first compensation circuitry and the second compensation circuitry, while a second tap point may be located after the second compensation circuitry.
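The tap-point selection above can be sketched as follows. This is a minimal illustration with hypothetical stage functions, not the patented circuitry: a frame captured at tap 1 (between the two compensation stages) must still pass through the second stage on replay, while a frame captured at tap 2 can be reused directly.

```python
def stage1(frame):
    # Stand-in for the first compensation stage (first physical parameter).
    return [p + 1 for p in frame]

def stage2(frame):
    # Stand-in for the second compensation stage (second physical parameter).
    return [p * 2 for p in frame]

class Pipeline:
    def __init__(self, tap):
        assert tap in (1, 2)
        self.tap = tap        # which tap point stores the replay frame
        self.stored = None    # frame captured at the selected tap

    def process(self, frame):
        out = stage1(frame)
        if self.tap == 1:
            self.stored = list(out)   # tap 1: after stage 1, before stage 2
        out = stage2(out)
        if self.tap == 2:
            self.stored = list(out)   # tap 2: after stage 2
        return out

    def replay(self):
        # Reuse the stored frame; rerun only the stages downstream of the tap.
        if self.tap == 1:
            return stage2(self.stored)
        return list(self.stored)

pipe = Pipeline(tap=1)
first = pipe.process([1, 2, 3])   # full pipeline
again = pipe.replay()             # stage 2 only, from the tap-1 copy
assert again == first
```

Choosing the earlier tap trades replay cost (stage 2 runs again) for flexibility: compensation factors applied after the tap can change between the original frame and the replay.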
DYNAMIC FRAME RATE CONVERSION IN ACTIVE MATRIX DISPLAY
The present invention provides a motion-content-based dynamic frame rate conversion method for displaying a video on a display device, comprising: detecting motion content of the video and generating a control signal for controlling a display color depth based on the motion detection result. The video is displayed with a lower color depth at a higher frame rate than the standard configuration of the display device if the motion detection result indicates that the video contains an appreciable amount of motion content, and with a higher color depth at a lower frame rate than the standard configuration if the motion detection result indicates that the video is relatively static. The present invention thus enables the display device to dynamically convert its display output format according to the motion content of the video, further optimizing display quality.
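The rate/depth trade-off described above can be sketched with a toy motion metric. The mean-absolute-difference metric, the threshold, and the factor-of-two rate change are all assumptions for illustration, not taken from the source.

```python
def mean_abs_diff(prev, curr):
    # Crude motion metric: mean absolute pixel difference between frames.
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def select_format(prev, curr, standard=(60, 8), motion_threshold=10.0):
    """Return (frame_rate_hz, color_depth_bits) for the next display mode.

    standard is the device's standard (rate, depth) configuration;
    the threshold and step sizes are hypothetical parameters.
    """
    rate, depth = standard
    if mean_abs_diff(prev, curr) > motion_threshold:
        return (rate * 2, depth - 2)   # high motion: faster, shallower
    return (rate // 2, depth + 2)      # static: slower, deeper

assert select_format([0, 0, 0], [50, 50, 50]) == (120, 6)  # high motion
assert select_format([0, 0, 0], [1, 0, 1]) == (30, 10)     # near-static
```

In practice the motion detector would operate on full frames or motion vectors from the decoder, and the control signal would drive the panel's timing controller rather than return a tuple.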
DISPLAY DEVICE
According to an aspect, a display device includes a pixel including a first sub-pixel configured to emit light having a peak in a spectrum of red, a second sub-pixel configured to emit light having a peak in a spectrum of green, and a third sub-pixel configured to emit light having a peak in a spectrum of blue. The first sub-pixel, the second sub-pixel, and the third sub-pixel are inorganic light-emitting diodes. A light emission intensity of the second sub-pixel is increased at a predetermined ratio with respect to a light emission intensity of the first sub-pixel when the first sub-pixel emits light at a light emission intensity within a low-luminance range equal to or lower than a predetermined level of luminance.
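The low-luminance adjustment above can be sketched as a simple rule. The threshold and boost ratio below are hypothetical placeholders; the abstract only says the ratio and luminance level are "predetermined".

```python
LOW_LUMINANCE_LEVEL = 0.1   # assumed threshold (normalized intensity)
GREEN_BOOST_RATIO = 1.2     # assumed predetermined ratio

def green_intensity(red_intensity, nominal_green):
    # Boost the green sub-pixel relative to red only when red is driven
    # within the low-luminance range.
    if red_intensity <= LOW_LUMINANCE_LEVEL:
        return nominal_green * GREEN_BOOST_RATIO
    return nominal_green

assert green_intensity(0.05, 0.5) == 0.6   # low-luminance: boosted
assert green_intensity(0.5, 0.5) == 0.5    # normal range: unchanged
```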
Control display method and electronic device
This application provides a control display method and an electronic device. The method includes: determining a display position of a control on a background picture; and then determining a display scene of the background picture at the display position of the control, where the display scene of the background picture at the display position of the control is determined based on display parameters of the background picture at the display position of the control; determining display parameters of the control based on the display scene of the background picture at the display position of the control, so that a contrast between the background picture displayed at the display position of the control and the control displayed based on the display parameters meets a first preset condition; and displaying the control based on the determined display parameters of the control. This application is applicable to control display of the electronic device.
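The "first preset condition" on contrast is not specified in the abstract; as one plausible sketch, the condition can be modeled as a WCAG-style relative-luminance contrast ratio, with the control's display parameters reduced to a foreground color chosen against the background at the control's position. All of this is an assumption for illustration.

```python
def srgb_to_linear(c):
    # sRGB 8-bit channel -> linear-light value in [0, 1].
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    hi, lo = sorted((relative_luminance(rgb1), relative_luminance(rgb2)),
                    reverse=True)
    return (hi + 0.05) / (lo + 0.05)

def choose_control_color(background_rgb, minimum=4.5):
    # Try white, then black; keep the first candidate whose contrast with
    # the background meets the assumed preset condition.
    for candidate in ((255, 255, 255), (0, 0, 0)):
        if contrast_ratio(background_rgb, candidate) >= minimum:
            return candidate
    return (0, 0, 0)  # fall back to black

assert choose_control_color((20, 20, 20)) == (255, 255, 255)    # dark scene
assert choose_control_color((240, 240, 240)) == (0, 0, 0)       # light scene
```

A real implementation would derive the "display scene" from the sampled background parameters (brightness, dominant hue, texture) rather than a single pixel, but the selection logic follows the same shape.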
Image processing method, drive device, display panel and wearable device
An image processing method, a drive device, a display panel, and a wearable device are disclosed. The image processing method includes: determining an adjacent display pixel adjacent to each grayscale transition region in the row direction or in the column direction in the display image region according to a position of the grayscale transition region; determining a transition pixel in the grayscale transition region; acquiring a first pixel grayscale, in which the first pixel grayscale is a grayscale of the adjacent display pixel; acquiring a second pixel grayscale; adjusting a third pixel grayscale of the transition pixel according to the first pixel grayscale, the second pixel grayscale and the transition pixel, in which the third pixel grayscale is between the first pixel grayscale and the second pixel grayscale; and transmitting the third pixel grayscale to the display panel for display.
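The adjustment step can be sketched as a bounded blend: the third grayscale is derived from the first and second grayscales and clamped to lie between them, softening a hard edge. The blend weight below is a hypothetical parameter; the abstract does not specify how the third grayscale is computed, only that it falls between the other two.

```python
def adjust_transition(first_gray, second_gray, weight=0.5):
    """Return a third grayscale lying between first_gray and second_gray.

    first_gray:  grayscale of the display pixel adjacent to the region
    second_gray: grayscale associated with the transition region
    weight:      assumed blend factor, not taken from the source
    """
    lo, hi = sorted((first_gray, second_gray))
    third = round(first_gray + weight * (second_gray - first_gray))
    return max(lo, min(hi, third))   # clamp into [first, second]

assert adjust_transition(0, 255) == 128
assert adjust_transition(200, 100, weight=0.25) == 175
```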
Artificial reality system using superframes to communicate surface data
This disclosure describes efficient communication of surface texture data between system on a chip (SOC) integrated circuits. An example system includes a first integrated circuit and a second integrated circuit communicatively coupled to the first integrated circuit by a video communication interface. The first integrated circuit generates a superframe in a video frame of the video communication interface for transmission to the second integrated circuit. The superframe includes multiple subframe payloads that carry surface texture data to be updated in the frame, and corresponding subframe headers that include parameters of the subframe payloads. The second integrated circuit includes a direct memory access (DMA) controller. The DMA controller, upon receipt of the superframe, writes the surface texture data within each of the subframe payloads directly to an allocated location in memory based on the parameters included in the corresponding one of the subframe headers.
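The header-directed DMA write can be sketched with an invented wire format. The two-field header (destination offset, payload length) is an assumption; the source only says headers carry "parameters of the subframe payloads".

```python
import struct

HEADER = struct.Struct("<II")   # assumed header: offset, length (little-endian)

def pack_superframe(subframes):
    # subframes: list of (offset, payload_bytes) pairs packed back to back.
    out = bytearray()
    for offset, payload in subframes:
        out += HEADER.pack(offset, len(payload)) + payload
    return bytes(out)

def dma_write(superframe, memory):
    # Walk the superframe header by header, copying each payload straight
    # to its allocated offset, as the DMA controller would.
    pos = 0
    while pos < len(superframe):
        offset, length = HEADER.unpack_from(superframe, pos)
        pos += HEADER.size
        memory[offset:offset + length] = superframe[pos:pos + length]
        pos += length

memory = bytearray(32)
frame = pack_superframe([(0, b"tex0"), (16, b"tex1")])
dma_write(frame, memory)
assert memory[0:4] == b"tex0" and memory[16:20] == b"tex1"
```

Packing several small texture updates into one video frame amortizes the per-frame overhead of the video interface, which is the efficiency claim the abstract makes.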
Edge illumination architecture for display device
A display driver includes first interface circuitry, a graphic memory, image processing circuitry, and drive circuitry. The first interface circuitry is configured to receive an edge illumination command from a controller external to the display driver. The graphic memory is configured to store image data. The image processing circuitry is configured to render an edge-illuminated image by overlaying an edge illumination graphic on a first image corresponding to the image data in response to the edge illumination command. The edge illumination graphic extends along an edge of a display region of a display panel. The drive circuitry is configured to drive the display panel based on the edge-illuminated image.
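The overlay step can be sketched on a tiny framebuffer. The band width, band placement (top edge), and color below are assumptions for illustration; in the driver this compositing happens in the image processing circuitry over the stored image data.

```python
def overlay_edge(image, color, band=2):
    # image: list of rows, each a list of (r, g, b) tuples.
    # Paint a band of `color` along the top edge, leaving the rest intact.
    out = [row[:] for row in image]
    for y in range(min(band, len(out))):
        out[y] = [color] * len(out[y])
    return out

black = [[(0, 0, 0)] * 4 for _ in range(4)]
lit = overlay_edge(black, (0, 128, 255))
assert lit[0] == [(0, 128, 255)] * 4     # edge band overlaid
assert lit[3] == [(0, 0, 0)] * 4         # interior untouched
```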
Method and apparatus for alpha blending images from different color formats
In some examples, an apparatus obtains source layer pixels, such as those of a content image, and first destination layer pixels, such as those of a destination image. The first destination layer pixels have associated alpha values. The apparatus obtains information that indicates a first blending color format for the alpha values. The first blending color format is different from a first destination layer color format for the first destination layer pixels and an output color format for a display. The apparatus converts the source and/or first destination layer pixels to the first blending color format. The apparatus generates first alpha blended pixels based on alpha blending the source layer pixels with the first destination layer pixels using the associated alpha values. The apparatus provides, for display on the display, the first alpha blended pixels.
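The convert-blend-convert flow can be sketched with one concrete choice of formats: a linear-light blending format and a gamma-2.2 encoding standing in for the layer and output formats. These format choices are assumptions for illustration; the point is that blending happens in a format distinct from both the destination layer format and the output format.

```python
GAMMA = 2.2

def to_linear(c):
    # Gamma-encoded channel [0, 255] -> linear-light value [0, 1].
    return (c / 255.0) ** GAMMA

def to_gamma(c):
    # Linear-light value [0, 1] -> gamma-encoded channel [0, 255].
    return round((c ** (1.0 / GAMMA)) * 255.0)

def blend_pixel(src_rgb, dst_rgb, alpha):
    # Convert both layers to the blending format (linear light),
    # alpha-blend there, then convert to the output format.
    return tuple(
        to_gamma(alpha * to_linear(s) + (1.0 - alpha) * to_linear(d))
        for s, d in zip(src_rgb, dst_rgb)
    )

assert blend_pixel((255, 0, 0), (0, 0, 255), 1.0) == (255, 0, 0)  # source only
assert blend_pixel((255, 0, 0), (0, 0, 255), 0.0) == (0, 0, 255)  # dest only
```

Blending in linear light avoids the darkening artifacts of blending gamma-encoded values directly, which is a common motivation for a dedicated blending color format.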
Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays
A dual-mode augmented/virtual reality near-eye wearable display for use with curved lens elements. The lenses are provided with one or more transparent waveguide elements disposed within the thickness of the lenses. The waveguide elements are configured to couple display images directly from image sources, such as emissive display imagers, to an exit aperture or a plurality of exit aperture sub-regions within a viewer's field of view. In a preferred embodiment, a plurality of image sources are disposed on the peripheral surface of the lenses, whereby each image source has a dedicated input image aperture and exit aperture sub-region that are each "piecewise flat" and have matched areas and angles of divergence, whereby a viewer is presented with the output of the plurality of image sources within the viewer's field of view.