H04N9/79

Image processor, image processing system including image processor, system-on-chip including image processing system, and method of operating image processing system

An image processing system comprises a first image processing device configured to process a frame of image data comprising a plurality of pixels, each having corresponding pixel values. Each pixel value includes a first set of bits and a second set of bits that may be accessed and/or processed separately or simultaneously. The first set of bits may correspond to the more significant bits of each pixel value and the second set to the less significant bits. In some examples, the number of bits in each set may correspond to the width of the data bus in use and/or to features of a peripheral device connected to the image processor, such as a display.
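The bit-splitting described above can be sketched as follows. This is a hypothetical illustration only; the abstract does not specify bit widths, so an 8-bit bus and 16-bit pixel values are assumed, and the function names are invented.

```python
BUS_WIDTH = 8  # assumed bus width in bits; the abstract leaves this device-specific

def split_pixel(value: int, low_bits: int = BUS_WIDTH):
    """Split a pixel value into its more-significant and less-significant bit sets."""
    mask = (1 << low_bits) - 1
    return value >> low_bits, value & mask

def join_pixel(msb: int, lsb: int, low_bits: int = BUS_WIDTH) -> int:
    """Recombine the two bit sets into the original pixel value."""
    return (msb << low_bits) | lsb

# A 16-bit pixel value accessed as two 8-bit sets, each fitting the assumed bus.
msb, lsb = split_pixel(0xABCD)
assert (msb, lsb) == (0xAB, 0xCD)
assert join_pixel(msb, lsb) == 0xABCD
```

With this split, the two sets can be transferred or processed independently (e.g. sending only the more significant set to a display with lower bit depth) or together.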

Synchronizing playback by media playback devices

Example systems, apparatus, and methods receive audio information including a plurality of frames from a source device, wherein each frame of the plurality of frames includes one or more audio samples and a time stamp indicating when to play the one or more audio samples of the respective frame. In an example, the time stamp is updated for each of the plurality of frames using a time differential value determined between clock information received from the source device and clock information associated with the device. The updated time stamp is stored for each of the plurality of frames, and the audio information is output based on the plurality of frames and associated updated time stamps. A number of samples per frame to be output is adjusted based on a comparison between the updated time stamp for the frame and a predicted time value for play back of the frame.
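The time-stamp rebasing and sample-count adjustment described above can be sketched as follows. This is an illustrative model under assumed names and units; the abstract does not define the clock format, tolerance, or adjustment step.

```python
def update_timestamps(frames, source_clock, local_clock):
    """Rebase each frame's time stamp into the local clock domain using
    the differential between the source clock and the local clock."""
    diff = local_clock - source_clock
    return [{**f, "timestamp": f["timestamp"] + diff} for f in frames]

def adjust_samples(frame, predicted_time, nominal_samples, tolerance=1):
    """Output one sample more (or fewer) per frame when the updated time
    stamp drifts later (or earlier) than the predicted playback time."""
    drift = frame["timestamp"] - predicted_time
    if drift > tolerance:
        return nominal_samples + 1
    if drift < -tolerance:
        return nominal_samples - 1
    return nominal_samples

# Local clock reads 3 ticks ahead of the source clock: stamps shift by +3.
frames = update_timestamps([{"timestamp": 0}, {"timestamp": 10}], 100, 103)
assert [f["timestamp"] for f in frames] == [3, 13]
```

Adjusting by whole samples per frame, rather than resetting the clock, keeps playback drift correction gradual and inaudible.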

Synchronizing the storing of streaming video
11627354 · 2023-04-11

A method and device for communicating video for synchronization between a portable wearable camera and a wireless hub device are described. The portable wearable camera may capture first video data. Also, the portable wearable camera may transmit the first video data to the wireless hub device via a first wireless connection, and may capture second video data. When the first wireless connection between the wearable camera and the wireless hub device is unable to support full resolution video playback, the portable wearable camera may store the second video data. Further, the portable wearable camera may receive a request from the wireless hub device for the second video data via a second wireless connection, and may capture third video data. Further, the portable wearable camera may transmit, to the wireless hub device, the third video data via the first wireless connection and the second video data via the second wireless connection.
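The camera-side logic described above can be sketched as follows. The class and method names are illustrative; the abstract does not describe an API, only the behavior of streaming over the first connection, storing while the link cannot support full-resolution playback, and delivering the backlog over a second connection on request.

```python
class WearableCamera:
    """Hypothetical model of the store-and-forward behavior described above."""

    def __init__(self):
        self.stored = []          # segments held back while the link is degraded
        self.sent_primary = []    # segments streamed over the first connection
        self.sent_secondary = []  # segments delivered over the second connection

    def capture(self, segment, link_supports_full_res: bool):
        if link_supports_full_res:
            self.sent_primary.append(segment)   # stream live over the first connection
        else:
            self.stored.append(segment)         # store until the hub requests it

    def handle_hub_request(self):
        # The hub requests the backlog over a second connection, while the
        # first connection keeps carrying newly captured video.
        self.sent_secondary.extend(self.stored)
        self.stored.clear()

cam = WearableCamera()
cam.capture("first", link_supports_full_res=True)
cam.capture("second", link_supports_full_res=False)  # link degraded: stored
cam.capture("third", link_supports_full_res=True)
cam.handle_hub_request()
assert cam.sent_secondary == ["second"]
```

The two connections thus carry live and backlogged video concurrently, which is the synchronization mechanism the abstract describes.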

IMAGING DEVICE
20230104653 · 2023-04-06

An imaging device includes an imaging sensor that outputs an imaging signal representing a sequence of frame images of a photographic subject. A buffer memory temporarily stores data of the sequence of frame images from the imaging signal. A release switch is actuated by a user to output an image-taking signal. A controller, upon receipt of the image-taking signal from the release switch: (i) generates moving image data from at least some of the plurality of frame images stored in the buffer memory, (ii) generates at least one piece of still image data based on at least one frame image of the plurality of frame images stored in the buffer memory, and (iii) associates the moving image data with the still image data and records the moving image data and the still image data in a recording medium.
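The buffer-and-release flow described above can be sketched as follows. The buffer depth, data shapes, and the choice of the newest frame for the still are assumptions made purely for illustration.

```python
from collections import deque

BUFFER_DEPTH = 4  # assumed number of frames the buffer holds

buffer = deque(maxlen=BUFFER_DEPTH)  # oldest frame dropped when full
recording_medium = []

def on_new_frame(frame):
    """Temporarily store each frame image from the imaging signal."""
    buffer.append(frame)

def on_release_switch():
    """On the image-taking signal: (i) build moving image data from the
    buffered frames, (ii) build still image data from one buffered frame,
    (iii) record the two together, associated, on the medium."""
    frames = list(buffer)
    moving_image = {"type": "movie", "frames": frames}
    still_image = {"type": "still", "frame": frames[-1]}  # assume newest frame
    recording_medium.append({"movie": moving_image, "still": still_image})

for f in ["f1", "f2", "f3", "f4", "f5"]:
    on_new_frame(f)
on_release_switch()
assert recording_medium[0]["movie"]["frames"] == ["f2", "f3", "f4", "f5"]
```

Because the buffer holds frames captured before the release switch is pressed, the recorded clip can include the moments leading up to the shot.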

VISUAL EXPERIENCE MODULATION BASED ON STROBOSCOPIC EFFECT

An approach for modifying a viewing experience in real time by removing or reinforcing the stroboscopic effect in the associated images is disclosed. The approach includes identifying video clips, detecting environmental parameters, and calculating display settings. The approach also analyzes the display settings using recommendations from a GAN, outputs the display settings on an AR display, and receives feedback from the user.

Display system, display method, and display apparatus

A display system includes a conversion apparatus that converts video luminance, including a luminance value in a first luminance range, and a display apparatus connected thereto that displays the video. The conversion apparatus includes a first acquisition unit, a first luminance converter, a second luminance converter, a quantization converter, and an output unit that outputs a third luminance signal to the display apparatus. The display apparatus includes: a second acquisition unit that acquires the third luminance signal and setting information indicating display settings recommended for the display apparatus when displaying the video; a display setting unit that configures the display apparatus using the setting information; a third luminance converter that converts a third code value indicated by the third luminance signal into a second luminance value compatible with a second luminance range, using the setting information; and a display controller that displays the video on the display apparatus based on the second luminance value.
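The display-side step of mapping the third code value into the second luminance range can be sketched as follows. A simple linear transfer is assumed purely for illustration; real HDR systems use standardized transfer functions (such as PQ) that the abstract does not specify, and the setting-information fields are invented names.

```python
def code_to_luminance(code: int, setting: dict) -> float:
    """Map a quantized code value into the display's second luminance range,
    using the setting information (assumed linear transfer for illustration)."""
    bits = setting["code_bits"]
    lo, hi = setting["luminance_range"]  # second luminance range, in cd/m^2
    max_code = (1 << bits) - 1
    return lo + (code / max_code) * (hi - lo)

# Assumed setting information: 10-bit code values, display range 0-500 cd/m^2.
setting_info = {"code_bits": 10, "luminance_range": (0.0, 500.0)}
assert code_to_luminance(0, setting_info) == 0.0
assert code_to_luminance(1023, setting_info) == 500.0
```

Carrying the setting information alongside the luminance signal lets the display apparatus dequantize code values correctly for its own luminance range rather than assuming the conversion apparatus's range.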

Multifunction multimedia device

A method for interpreting messages, user-defined alert conditions, voice commands and performing an action in response is described. A method for annotating media content is described. A method for presenting additional content associated with media content identified based on a fingerprint is described. A method for identifying that an advertisement portion of media content is being played based on a fingerprint derived from the media content is described. A method of one media device recording particular media content automatically in response to another media device recording the particular media content is described. A method of concurrently playing media content on multiple devices is described. A method of publishing information associated with recording of media content is described. A method of deriving fingerprints by media devices that meet an idleness criteria is described. A method of loading, modifying, and displaying a high definition frame from a frame buffer is described. A method of recording or playing media content identified based on fingerprints is described.