G11B27/323

Systems and methods for media production and editing

The various embodiments disclosed herein relate to systems and methods for generating a derived media clip corresponding to a live event. In particular, the system comprises a processor configured to receive a plurality of content streams corresponding to the live event, each content stream corresponding to a content source. The processor is further configured to generate an annotated timeline for one or more of the plurality of content streams and to receive a first user input requesting the derived media clip. The processor is then configured to generate the derived media clip based on the first user input and the annotated timeline.
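As a rough illustration of the clip-derivation step, the sketch below (Python, with hypothetical `Annotation` and `derive_clip` names not taken from the patent) selects the annotated segments of one stream's timeline that match a user's request and pads them for context:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    tag: str      # assumed annotation label, e.g. "goal" or "replay"
    start: float  # seconds from the start of the content stream
    end: float

def derive_clip(annotations, requested_tag, pad=2.0):
    """Return (start, end) spans to cut from the source stream: every
    annotated segment matching the request, padded for context."""
    spans = [(max(0.0, a.start - pad), a.end + pad)
             for a in annotations if a.tag == requested_tag]
    return sorted(spans)

# A user's first input requests every "goal" moment from one stream's timeline.
timeline = [Annotation("goal", 120.0, 130.0), Annotation("foul", 300.0, 305.0)]
```

In a full system the returned spans would drive cutting the actual stream; here they stand in for the derived clip itself.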

Signal processing device, audio-video display device and processing method

A signal processing device is disclosed, which includes a plurality of channel receivers, a plurality of time code processors in one-to-one correspondence with the channel receivers, a timing generator, a signal processor and a transmitter. Each channel receiver is configured to parse an audio-video signal which has a data format defined by the SDI protocol and includes a time code characterizing time information. Each time code processor is configured to extract the time code from the parsed audio-video signal obtained by the corresponding channel receiver, and to form first frame image data including a frame time code. The signal processor is configured to form an absolute frame output image based on the multiple channels of first frame image data, the frame time codes therein, and an internal clock signal generated by the timing generator. The transmitter is configured to transmit the absolute frame output image for display.
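The absolute-frame assembly can be pictured as selecting, from each channel's buffered frames, the one whose embedded time code matches the code generated by the internal clock. A minimal sketch under that assumption (frames buffered per channel in dictionaries keyed by SMPTE-style time code strings; the function name is illustrative, not from the disclosure):

```python
def align_channels(channel_frames, target_code):
    """For each channel, pick the buffered frame whose embedded time code
    matches the internally generated target code; None marks a channel
    with no frame for that code (e.g. a dropped frame)."""
    return [frames.get(target_code) for frames in channel_frames]

# Frames buffered per channel, keyed by SMPTE-style time code strings.
ch1 = {"01:00:00:01": "A-frame-1", "01:00:00:02": "A-frame-2"}
ch2 = {"01:00:00:01": "B-frame-1", "01:00:00:02": "B-frame-2"}
```

The selected frames would then be composited into the single output image driven by the internal clock tick.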

METHOD AND SYSTEM FOR SYNCHRONIZING PLAYBACK OF TWO RECORDED VIDEOS OF THE SAME SURGICAL PROCEDURE
20220217305 · 2022-07-07 ·

This disclosure provides techniques for synchronizing the playback of two recorded videos of the same surgical procedure. In one aspect, a process for generating a composite video from two recorded videos of a surgical procedure is disclosed. This process begins by receiving first and second surgical videos of the same surgical procedure. The process then performs phase segmentation on each of the first and second surgical videos to segment them into a first set of video segments and a second set of video segments, respectively, corresponding to a sequence of predefined phases. Next, the process time-aligns each video segment of a given predefined phase in the first video with the corresponding video segment of that phase in the second video. The process then displays the time-aligned first and second surgical videos for comparative viewing.
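One simple way to realize the per-phase time alignment is to compute a playback-speed factor for each phase so that both videos reach each phase boundary together. A hedged sketch under that assumption (the durations and function name are illustrative, not from the disclosure):

```python
def phase_speed_factors(durations_a, durations_b):
    """Given per-phase durations (seconds) of two videos that follow the same
    sequence of predefined phases, return the playback-speed factor to apply
    to the second video so each of its phases ends in step with the first."""
    return [b / a for a, b in zip(durations_a, durations_b)]

phases_video1 = [60.0, 120.0, 30.0]  # hypothetical phase durations, video 1
phases_video2 = [90.0, 100.0, 30.0]  # same phases as segmented in video 2
```

A factor of 1.5 for the first phase means the second video's opening phase plays at 1.5x so both videos cross into the next phase simultaneously.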

Commercial information generating device, commercial information generating method, and program

Information related to CMs included in a broadcast program can be automatically added. A CM information generation device 100 includes: a CM section detection unit 120 that detects one or more CM sections within a broadcast program by comparing the volume of the broadcast program with a volume threshold; a CM detection list generation unit 150 that generates a CM detection list describing company names of companies that have advertised detected CMs, which are CMs in the CM sections detected by the CM section detection unit 120, by cross-referencing the detected CMs with CM masters that have been associated with company names of advertisers in advance; a company name list generation unit 170 that generates a company name list describing company names that are specified by a sponsorship credit display indicating sponsors of the broadcast program; and a CM information generation unit 180 that generates CM information related to the detected CMs by comparing the CM detection list with the company name list.
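The volume-threshold detection of CM sections can be sketched as finding sustained low-volume runs, which in broadcast material typically bound commercial breaks. A simplified illustration (the function and its parameters are assumptions, not the patent's actual algorithm):

```python
def detect_silences(volume, threshold, min_len=3):
    """Return (start, end) index spans where the volume stays below the
    threshold for at least min_len samples; such low-volume runs serve
    as candidate CM section boundaries."""
    spans, start = [], None
    for i, v in enumerate(volume):
        if v < threshold:
            if start is None:
                start = i          # a low-volume run begins here
        else:
            if start is not None and i - start >= min_len:
                spans.append((start, i))
            start = None           # run too short, or no run in progress
    if start is not None and len(volume) - start >= min_len:
        spans.append((start, len(volume)))
    return spans
```

Downstream, the detected sections would be cross-referenced against the CM masters, and the resulting CM detection list compared with the sponsorship-credit company name list.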

EARLY WARNING IN MULTIMEDIA CONTENT
20220224954 · 2022-07-14 ·

Provided herein are embodiments of systems, devices and methods for an early warning system, for example an in-movie early warning system. The early warning system may be implemented in unedited films or movies.

Editing and tracking changes in visual effects
11380364 · 2022-07-05 ·

A method for determining edits of a subject video reel comprises the steps of opening an original EDL, reading every line of the original EDL, identifying the event names representing each shot, and identifying a source file. Each event includes at least a camera time code for the shot length and a location time code indicating the location of the shot in the source file. The method locates events and picks up the in and out camera time codes from the shot names, notes the shot names and camera times for shots found to have common in and out times, identifies every VFX shot, and stores the VFX names. Next, the software compares the camera times for the shots in the first temporary file with the camera times for the same shots in the second temporary file, and prepares a result EDL file listing exclusively the VFX shots in which changes were found, detailing the changes.
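The comparison step can be pictured as diffing the in/out camera time codes of the same shots between the two temporary files. A minimal sketch, with a dictionary standing in for each temporary file (the names are illustrative, not from the patent):

```python
def changed_vfx_shots(original, revised):
    """Diff two snapshots of VFX shot timings. Each argument maps a shot
    name to its (in, out) camera time code pair; returns only the shots
    whose times changed, with old and new values."""
    return {name: (times, revised[name])
            for name, times in original.items()
            if name in revised and revised[name] != times}

# Hypothetical shot timings before and after a re-edit.
before = {"VFX_010": ("01:00:05:00", "01:00:09:12"),
          "VFX_020": ("02:10:00:00", "02:10:04:00")}
after = {"VFX_010": ("01:00:05:00", "01:00:10:00"),
         "VFX_020": ("02:10:00:00", "02:10:04:00")}
```

Only the shots returned by the diff would be written into the result EDL.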

Method and system for synchronizing procedure videos for comparative learning

Embodiments described herein provide various examples of synchronizing the playback of a recorded video of a surgical procedure with a live video feed of a user performing the surgical procedure. In one aspect, a system can simultaneously receive a recorded video of a surgical procedure and a live video feed of a user performing the surgical procedure in a training session. More specifically, the recorded video is shown to the user as a training reference, and the surgical procedure includes a set of surgical tasks. The system next simultaneously monitors the playback of a current surgical task in the set of surgical tasks in the recorded video and the live video feed depicting the user performing the current surgical task. Next, the system detects that the end of the current surgical task has been reached during the playback of the recorded video. In response to determining that the user has not completed the current surgical task in the live video feed, the system pauses the playback of the recorded video while waiting for the user to complete the current surgical task.
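The pause-at-boundary behavior reduces to a small decision at each task boundary: hold the recorded reference whenever it has finished the current task but the trainee has not. A toy sketch of that rule (the function name and return values are assumptions):

```python
def playback_action(recorded_task_done, live_task_done):
    """At a task boundary, pause the recorded reference video while the
    trainee is still performing the current task; otherwise keep playing."""
    return "pause" if recorded_task_done and not live_task_done else "play"
```

A real player would re-evaluate this as the task-detection model updates, resuming playback once the live feed shows the task complete.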

EDITING AND TRACKING CHANGES IN VISUAL EFFECTS
20210304798 · 2021-09-30 ·

A method for determining edits of a subject video reel comprises the steps of opening an original EDL, reading every line of the original EDL, identifying the event names representing each shot, and identifying a source file. Each event includes at least a camera time code for the shot length and a location time code indicating the location of the shot in the source file. The method locates events and picks up the in and out camera time codes from the shot names, notes the shot names and camera times for shots found to have common in and out times, identifies every VFX shot, and stores the VFX names. Next, the software compares the camera times for the shots in the first temporary file with the camera times for the same shots in the second temporary file, and prepares a result EDL file listing exclusively the VFX shots in which changes were found, detailing the changes.