H04N9/89

DETECTING ERRORS IN THE TIMING BETWEEN SUBTITLES AND SHOT CHANGES

In various embodiments, a subtitle timing application detects timing errors between subtitles and shot changes. In operation, the subtitle timing application determines that a temporal edge associated with a subtitle does not satisfy a timing guideline based on a shot change. The shot change occurs within a sequence of frames of an audiovisual program. The subtitle timing application then determines a new temporal edge that satisfies the timing guideline relative to the shot change. Subsequently, the subtitle timing application causes a modification to a temporal location of the subtitle within the sequence of frames based on the new temporal edge. Advantageously, the modification to the subtitle improves a quality of a viewing experience for a viewer. Notably, by automatically detecting timing errors, the subtitle timing application facilitates proper and efficient re-scheduling of subtitles that are not optimally timed with shot changes.
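A minimal sketch of the check and repair described above is given below. The 12-frame minimum gap (about 0.5 s at 24 fps) and the snap-to-cut repair are assumed guideline values for illustration; the patent does not commit to specific numbers.

```python
def snap_edge_to_guideline(edge_frame, shot_change_frame, min_gap=12):
    """Return a temporal edge that satisfies an assumed timing guideline:
    a subtitle edge must either land exactly on a shot change or stay at
    least min_gap frames away from it. Edges that violate the guideline
    are snapped to the shot change (min_gap=12 is an assumed value)."""
    gap = abs(edge_frame - shot_change_frame)
    if gap == 0 or gap >= min_gap:
        return edge_frame          # already satisfies the guideline
    return shot_change_frame       # new temporal edge: snap to the cut
```

For example, an out-time at frame 100 with a cut at frame 105 violates the guideline and is moved to 105, while an out-time 20 frames from the nearest cut is left unchanged.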

ELECTRONIC DEVICE FOR CAPTURING MOVING IMAGE AND OPERATION METHOD THEREOF
20240214533 · 2024-06-27

An electronic device obtains first video data by using a camera, obtains first reference data depending on a first schedule during a first time duration by using a microphone, receives first audio data corresponding to the first video data from an external electronic device via a wireless communication circuit, and changes the first schedule to a second schedule based on a comparison result between the first reference data and a portion of the first audio data. The electronic device further obtains second reference data depending on the second schedule during a second time duration subsequent to the first time duration by using the microphone, corrects a delay of the first audio data based on the second reference data, and creates a moving image file based on the first video data and the corrected first audio data.
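One common way to realize the delay correction described above is cross-correlation between the microphone reference and the wirelessly received audio. The sketch below illustrates that idea only; the sample-level alignment and function names are my own assumptions, not the patented schedule-adjustment method.

```python
import numpy as np

def estimate_delay(reference, received):
    """Estimate how many samples `received` lags `reference` by locating
    the peak of their full cross-correlation (positive = received late)."""
    corr = np.correlate(received, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

def correct_delay(received, lag):
    """Realign the received audio: drop leading samples when it is late,
    or pad with silence when it is early."""
    if lag > 0:
        return received[lag:]
    return np.concatenate([np.zeros(-lag), received])
```

With a reference captured locally and the external device's audio delayed by 10 samples, `estimate_delay` recovers the lag and `correct_delay` restores alignment.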

SYNERGISTIC TEMPORAL ANTI-ALIASING AND COARSE PIXEL SHADING TECHNOLOGY

Systems, apparatuses and methods may provide for technology that determines a frame rate of video content, sets a blend amount parameter based on the frame rate, and temporally anti-aliases the video content based on the blend amount parameter. Additionally, the technology may detect a coarse pixel (CP) shading condition with respect to one or more frames in the video content and select, in response to the CP shading condition, a per frame jitter pattern that jitters across pixels, wherein the video content is temporally anti-aliased based on the per frame jitter pattern. The CP shading condition may also cause the technology to apply a gradient to a plurality of color planes on a per color plane basis and discard pixel level samples associated with a CP if all mip data corresponding to the CP is transparent or shadowed out.
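The first step above, deriving a blend amount from the frame rate and applying it as a temporal blend, can be sketched as follows. The linear scaling rule and the constants are illustrative assumptions (the abstract does not specify them), and the coarse-pixel and jitter handling are omitted.

```python
def blend_amount(frame_rate, base_rate=60.0, base_blend=0.9):
    """History weight for temporal anti-aliasing, reduced at low frame
    rates where stale history stays on screen longer and ghosting is
    more visible. base_rate and base_blend are assumed tuning values."""
    return base_blend * min(1.0, frame_rate / base_rate)

def temporal_aa(history, current, blend):
    """Blend each current-frame sample with its history sample:
    out = blend * history + (1 - blend) * current."""
    return [blend * h + (1.0 - blend) * c for h, c in zip(history, current)]
```

At 60 fps the full assumed history weight of 0.9 is used; at 30 fps it is halved, trading some temporal stability for less ghosting.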

Scenario generation system, scenario generation method and scenario generation program

A scenario generation system, a scenario generation method, and a scenario generation program are provided. A scenario generation system used for video playback synchronized with musical piece playback includes a situation estimating portion for estimating a situation expressed by the musical piece, a video specifying portion for specifying at least one video suited for the estimated situation in the video constituted by scenes each having a time-series order, and a scenario generating portion for generating a scenario associating the scenes constituting the specified video with each section of the musical piece. As a result, a scenario can be generated by the scenes each having the time-series order, and the synchronized video with a natural impression can be reproduced corresponding to the musical piece playback on the basis of the scenario.
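The final step, associating ordered scenes with sections of the musical piece while preserving the scenes' time-series order, can be sketched as a monotone assignment. The proportional rule below is an illustrative assumption, not the patent's concrete algorithm.

```python
def generate_scenario(n_sections, scenes):
    """Map each music section to one scene so that scene order follows
    section order: a scene may span several consecutive sections, but
    the sequence never jumps backward in the scenes' time-series order."""
    n = len(scenes)
    return [scenes[min(i * n // n_sections, n - 1)] for i in range(n_sections)]
```

For a four-section piece and two ordered scenes, the first two sections reuse the first scene and the last two reuse the second, keeping the video's natural progression.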

Clip scheduling with conflict alert

An example method involves: accessing a first list including ordered clip identifiers C1 … Cn; accessing a second list including ordered player identifiers P1 … Px; determining that an identifier Cm of the identifiers C1 … Cn is restricted to being assigned an identifier Pz from the identifiers P1 … Px; making a first determination that an identifier Cp is the next one of the identifiers C1 … Cn after the identifier Cm to have a player-identifier assignment restriction; responsive to making the first determination, (i) determining that the identifier Cp is restricted to being assigned an identifier Py from the identifiers P1 … Px, and (ii) matching with each identifier Cm+1 … Cp in reverse order a respective one of the identifiers P1 … Px selected in a reverse-ordered and looping fashion starting with the identifier Py; making a second determination that the identifier Cm+1 has been matched with the player identifier Pz; and responsive to making the second determination, causing an alert to be output.
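The reverse-ordered, looping back-fill and the conflict check can be sketched directly from the claim language. The 1-based indexing and the function name are my own framing, not terms from the patent.

```python
def check_assignment_conflict(m, z, p, y, x):
    """Back-fill players for clips Cm+1 .. Cp in reverse order, looping
    downward through P1 .. Px starting from Py; report a conflict when
    Cm+1 lands on Pz, the player already committed to the preceding
    restricted clip Cm (all indices are 1-based)."""
    assignment = {}
    player = y
    for clip in range(p, m, -1):                   # Cp, Cp-1, ..., Cm+1
        assignment[clip] = player
        player = player - 1 if player > 1 else x   # reverse, looping P1 -> Px
    return assignment, assignment[m + 1] == z
```

For example, with x=3 players, Cm=C2 restricted to P1 and Cp=C5 restricted to P3, the back-fill yields C5→P3, C4→P2, C3→P1; clip C3 (= Cm+1) collides with P1, which must also play the adjacent clip C2, so an alert is warranted.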

Apparatus and method for playback of audio-visual recordings

An imaging system comprising a panoramic visual image display, an associated directional sound playback device, and an associated motion reproduction device is disclosed. The imaging system conveys visual, sound and motion information related to a particular viewing direction to provide a realistic experience for the viewer. The imaging system can also comprise a panoramic visual image recording device capable of recording panoramic images, an associated directional sound capturing device capable of recording sound, and an associated directional motion capturing device capable of recording motion. Recorded panoramic images, sound and motion can be synchronously recorded to a common time code for simultaneous playback.