Patent classifications
H04N21/43072
SYNCHRONIZED RECORDING OF AUDIO AND VIDEO WITH WIRELESSLY CONNECTED VIDEO AND AUDIO RECORDING DEVICES
A method of synchronizing video and audio when recording with a video recording device and an audio recording device configured for wireless data communication with each other. The method includes activating a recording at the video recording device; sending an audio recording command from the video recording device to the audio recording device; storing a recorded video data stream in a memory of the video recording device; receiving an audio data stream from the audio recording device at the video recording device and storing it in the memory of the video recording device; determining a delay of the stored audio data stream relative to the stored video data stream; and joining the stored audio data stream and the stored video data stream, taking the determined delay into account, in order to provide a recording data stream with synchronized video and audio.
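The delay-determination and joining steps above can be sketched as follows. This is a minimal illustration, not the patented method: it assumes the delay can be estimated by cross-correlating a reference signal (e.g. the camera's own scratch audio) against the received audio, and that joining simply drops the leading delayed samples. The function names are hypothetical.

```python
def estimate_delay(reference, delayed, max_lag):
    """Estimate the lag (in samples) of `delayed` relative to `reference`
    by maximizing a naive cross-correlation over candidate lags."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(0, max_lag + 1):
        n = min(len(reference), len(delayed) - lag)
        score = sum(reference[i] * delayed[i + lag] for i in range(n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def join_streams(video_frames, audio_samples, delay):
    """Drop the first `delay` audio samples so the stored audio stream
    lines up with the stored video stream, then pair them off."""
    return list(zip(video_frames, audio_samples[delay:]))
```

In practice the correlation would run on resampled audio envelopes rather than raw samples, but the shape of the computation is the same.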
Methods and apparatuses for combining and distributing user enhanced video/audio content
Methods and apparatuses are provided, which may be implemented to combine and distribute user enhanced video and/or audio content.
Video-based competition platform
A video-based competition platform supports video-based competitions between possibly geographically distributed competitors. The video-based competition platform enables users of electronic communication devices to create, compete, view, and vote in video-based competitions. In at least some embodiments, a video-based competition is presented to a user with two or more video clips played in conjunction. The video clips may be synchronized to a time base and/or common audio clip.
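Synchronizing the clips to a common time base, as described above, amounts to computing a per-clip seek offset. A minimal sketch, assuming each clip carries a recorded start time on a shared clock (the function name is hypothetical):

```python
def sync_to_common_base(clip_starts):
    """Given each clip's start time on a shared clock, return how far
    into each clip playback should begin so that all clips are aligned
    to the latest-starting clip."""
    base = max(clip_starts.values())
    return {clip: base - start for clip, start in clip_starts.items()}
```

A player would then seek each clip by its returned offset before starting them together.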
ACTION SYNCHRONIZATION FOR TARGET OBJECT
A method for synchronizing an action of a target object with source audio is provided. Facial parameter conversion is performed on an audio parameter of the source audio at different time periods to obtain source parameter information of the source audio at the respective time periods. Parameter extraction is performed on a target video that includes the target object to obtain target parameter information of the target video. Image reconstruction is performed on the target object in the target video based on the source parameter information of the source audio and the target parameter information of the target video, to obtain a reconstructed image. Further, a synthetic video is generated based on the reconstructed image, the synthetic video including the target object, and the action of the target object being synchronized with the source audio.
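The pipeline described above can be outlined in code. This is a heavily simplified sketch, not the patented technique: it assumes, purely for illustration, that the "facial parameter" derived from each audio time period is a single mouth-openness value taken from the window's peak amplitude, and that reconstruction merges that value into the target video's per-period parameters. All names are hypothetical.

```python
def audio_to_facial_params(audio_windows):
    """Stand-in for facial parameter conversion: map each audio window
    (one time period) to a mouth-openness value in [0, 1], normalized
    by the loudest peak across all windows."""
    peak = max(max(abs(s) for s in w) for w in audio_windows) or 1
    return [max(abs(s) for s in w) / peak for w in audio_windows]

def reconstruct(target_params, source_params):
    """Blend target-video parameters with source-audio parameters to
    drive image reconstruction, one merged dict per time period."""
    return [dict(t, mouth=m) for t, m in zip(target_params, source_params)]
```

A real system would use a learned audio-to-expression model and a renderer in place of these two functions, but the data flow, source parameters merged with target parameters period by period, matches the abstract.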
TRANSMISSION APPARATUS, TRANSMISSION METHOD, RECEPTION APPARATUS, AND RECEPTION METHOD
The objective is to simplify subtitle display processing in a variable-speed reproduction mode on the receiving side.
A video stream formed with a video packet having coded image data in a payload is generated. A subtitle stream formed with a subtitle packet having subtitle information in a payload is generated. A multiplexed stream including the video stream and the subtitle stream is generated and transmitted. In generating the multiplexed stream, the subtitle packet is arranged at a random access position.
BROADCAST MANAGEMENT SYSTEM
A broadcast management system creates, manages, and streams a broadcast of an event from videos captured from multiple cameras. A video capture system comprising multiple cameras captures videos of the event and transmits the videos to a broadcast management server. The broadcast management server generates a website or other graphical interface that simultaneously displays the captured videos in a time-synchronized manner. A broadcast manager user creates a broadcast by selecting which video to output to the broadcast at any given time. A broadcast map is stored for each broadcast that includes all of the broadcast decisions made by the broadcast manager user such that the broadcast can be recreated at a later time by applying the broadcast map to the raw videos. Using a viewer client, viewers can browse or search for broadcasts and select a broadcast for viewing.
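Recreating a broadcast from the stored broadcast map can be sketched as replaying the switching decisions against a timeline. A minimal illustration, assuming (hypothetically) that the map is a time-sorted list of (switch_time, camera_id) decisions and that output is sampled at whole-second ticks:

```python
def apply_broadcast_map(broadcast_map, duration):
    """Recreate a broadcast from stored switching decisions.
    broadcast_map: time-sorted list of (switch_time, camera_id).
    Returns the camera selected at each whole-second tick."""
    timeline = []
    current = None
    decisions = iter(broadcast_map)
    next_decision = next(decisions, None)
    for t in range(duration):
        # Apply every decision whose switch time has been reached.
        while next_decision and next_decision[0] <= t:
            current = next_decision[1]
            next_decision = next(decisions, None)
        timeline.append(current)
    return timeline
```

Feeding the resulting camera IDs into the stored raw videos reproduces the broadcast exactly, which is the point of storing the map rather than the rendered output.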
MANAGING INTERACTIVE SUBTITLE DATA
Embodiments of the present application relate to a method, apparatus, and system for processing subtitle data. The method includes dividing subtitle data into multiple subtitle groups according to subtitle display time information related to a played object, wherein a subtitle group comprises at least one subtitle data entry, and wherein a subtitle data entry comprises subtitle content, a subtitle display time in relation to the played object, and a speed of subtitle motion; selecting a piece of subtitle data from a subtitle group according to the display time information of the played object; and causing the selected piece of subtitle data to be displayed on a track such that it does not overlap with or pass another piece of subtitle data displayed on the same track.
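The no-overlap, no-pass track condition above can be expressed geometrically for scrolling subtitles. A sketch under assumed conventions (not the patented method): subtitles enter at the right edge and scroll left, each described by a dict with `start` time, `speed`, and `width`; a new entry fits a track if the previous entry's tail has cleared the right edge when it appears, and the new entry cannot catch the previous one before it leaves the screen.

```python
def can_place(entry, last, screen_width):
    """True if `entry` can scroll on the same track as `last` without
    overlapping it at entry time or passing it while both are visible."""
    dt = entry["start"] - last["start"]
    # 1) `last`'s tail has cleared the right edge when `entry` appears.
    if last["speed"] * dt < last["width"]:
        return False
    # 2) `entry`'s head must not catch `last`'s tail before `last`
    #    has fully left the screen at the left edge.
    last_exit = last["start"] + (screen_width + last["width"]) / last["speed"]
    entry_head_then = screen_width - entry["speed"] * (last_exit - entry["start"])
    return entry_head_then >= 0

def assign_track(entry, tracks, screen_width):
    """Return the index of the first track that can take `entry`, where
    `tracks` holds each track's most recent entry (or None if empty)."""
    for i, last in enumerate(tracks):
        if last is None or can_place(entry, last, screen_width):
            return i
    return None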
AUDIO TIME SYNCHRONIZATION USING PRIORITIZED SCHEDULE
A method is provided for synchronizing playback of audio and/or video by a plurality of separate devices in a computer network, e.g. a Wi-Fi network. Each separate device is programmed to select a synchronization mechanism according to a predetermined prioritized list of at least two different synchronization mechanisms, and to use the selected mechanism for synchronizing audio and/or video playback. For example, a clock based on the audio codec clock can be given a higher priority than the system clock, which provides poorer precision. A session leader serves to provide the synchronization to the other separate devices in a session; however, a group of two or more separate devices within the session may agree to select a synchronization mechanism providing higher precision than the one provided by the session leader, e.g. to allow high-precision timing between separate left and right loudspeakers in a stereo setup. A group leader can be elected to provide synchronization to a group of devices using a higher synchronization precision than the mechanism provided by the overall session leader; e.g. a dedicated synchronization channel separate from the audio/video streaming channel may be selected.
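The prioritized-list selection described above reduces to picking the highest-priority mechanism that all devices in a group support. A minimal sketch; the mechanism names in `PRIORITY` are illustrative assumptions, not taken from the patent:

```python
# Highest priority first; names are hypothetical examples.
PRIORITY = ["codec_clock", "ntp", "system_clock"]

def select_mechanism(available, priority=PRIORITY):
    """Pick the highest-priority synchronization mechanism supported by
    every device. `available` is one list of mechanism names per device."""
    common = set.intersection(*(set(d) for d in available))
    for mech in priority:
        if mech in common:
            return mech
    return None
```

A stereo pair would run this over just its two members, which is how a group can end up on a higher-precision mechanism than the session at large.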
Transmission method, reception method, transmission device, and reception device
According to one aspect of the present disclosure, a transmission method for transmitting content using a broadcast channel and a communication channel includes transmitting playback control information and service information using at least the broadcast wave, the service information being used for playing back the content when first content data is transmitted using the broadcast channel and second content data is transmitted using the communication channel. The service information includes attribute information indicating that the first content data and the second content data constitute the content, and location information that indicates a location for acquiring meta-information on playback control of the second content data. The playback control information includes an index of the relationship between the first content data and the second content data.
AUDIOVISUAL COLLABORATION SYSTEM AND METHOD WITH LATENCY MANAGEMENT FOR WIDE-AREA BROADCAST AND SOCIAL MEDIA-TYPE USER INTERFACE MECHANICS
Techniques have been developed to facilitate the livestreaming of group audiovisual performances. Audiovisual performances including vocal music are captured and coordinated with performances of other users in ways that can create compelling user and listener experiences. For example, in some cases or embodiments, duets with a host performer may be supported in a sing-with-the-artist style audiovisual livestream in which aspiring vocalists request or queue particular songs for a live radio show entertainment format. The developed techniques provide a communications latency-tolerant mechanism for synchronizing vocal performances captured at geographically-separated devices (e.g., at globally-distributed, but network-connected mobile phones or tablets or at audiovisual capture devices geographically separated from a live studio).