Patent classifications
G11B27/028
APPARATUS FOR CONTROLLING LIGHTING BEHAVIOR OF A PLURALITY OF LIGHTING ELEMENTS AND A METHOD THEREFOR
There is provided an apparatus configured to control a plurality of light emitting elements so that the light emitting elements display a lighting behavior based on audio output which is based on an audio file. The apparatus can include a control portion and a choreography portion. The control portion is operable to generate an instruction sequence which can be used to control the lighting behavior of the light emitting elements; the instruction sequence can include a plurality of lighting instructions. The choreography portion is operable to associate at least one lighting instruction from the plurality of lighting instructions with at least one portion of the audio file.
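The abstract's core idea — a choreography portion that associates lighting instructions with portions of an audio file — can be sketched as a simple cue table keyed by playback time. This is a minimal illustration, not the patent's implementation; all class and field names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class LightingInstruction:
    element_id: int   # which light emitting element to drive (illustrative)
    color: str        # e.g. "#FF0000"
    intensity: float  # 0.0 .. 1.0

@dataclass
class Choreography:
    """Associates lighting instructions with time portions of an audio file."""
    cues: list = field(default_factory=list)  # entries: (start_sec, end_sec, instruction)

    def associate(self, start_sec, end_sec, instruction):
        # Tie one lighting instruction to one portion of the audio timeline.
        self.cues.append((start_sec, end_sec, instruction))

    def instructions_at(self, t_sec):
        # Instructions active at playback time t_sec, for the control portion to emit.
        return [ins for (s, e, ins) in self.cues if s <= t_sec < e]

choreo = Choreography()
choreo.associate(0.0, 2.5, LightingInstruction(0, "#FF0000", 1.0))
choreo.associate(2.0, 4.0, LightingInstruction(1, "#00FF00", 0.5))
active = choreo.instructions_at(2.2)  # both cues overlap at t = 2.2
```

During playback, a driver would poll `instructions_at` with the current audio position and forward the active instructions to the light emitting elements.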
ASSISTIVE MIXING SYSTEM AND METHOD OF ASSEMBLING A SYNCHRONISED SPATIAL SOUND STAGE
Intensifying contextually relevant sound events for time-delayed broadcast uses multiple directional microphones that capture sound events from a specific location within regions of a stadium. The events are contextually relevant to the environment, such as a referee's whistle sound. A processor executes signal processing of the captured events on each channel to produce audio samples, each having a signal profile. The profiles are automatically compared to reference feature templates stored in a database, which correspond to pre-identified sound events of contextual relevance. The signal processing accentuates characteristic traits in the events, reflective of contextually relevant events that should be included in a final composite audio output for transmission. If the comparison of samples to the feature templates suggests a high probability of a correspondence, then buffered audio for that channel is introduced into the final audio mix.
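The matching step described above — comparing each channel's signal profile against stored feature templates and admitting the channel's buffered audio when the match probability is high — can be sketched with a simple similarity threshold. The feature vectors, channel names, and cosine-similarity metric below are illustrative assumptions, not details from the patent.

```python
import math

def cosine_similarity(a, b):
    # One plausible way to score how well a signal profile matches a template.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def select_channels(channel_profiles, templates, threshold=0.9):
    """Return channels whose profile matches any reference template closely
    enough that their buffered audio should enter the final mix."""
    selected = []
    for ch_id, profile in channel_profiles.items():
        best = max(cosine_similarity(profile, t) for t in templates)
        if best >= threshold:
            selected.append(ch_id)
    return selected

whistle_template = [0.9, 0.1, 0.8, 0.2]  # pre-identified event profile (illustrative)
profiles = {
    "mic_north": [0.88, 0.12, 0.79, 0.22],  # close to the whistle profile
    "mic_south": [0.10, 0.90, 0.20, 0.80],  # e.g. crowd noise
}
matched = select_channels(profiles, [whistle_template])
```

A real system would derive the profiles from spectral features of the buffered audio; the thresholded comparison shown is the gating decision the abstract describes.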
Motion stills experience
The technology disclosed herein includes a user interface for viewing and combining media items into a video. An example method includes presenting a user interface that displays media items in a first portion of the user interface; receiving user input in the first portion that comprises a selection of a first media item; upon receiving the user input, adding the first media item to a set of selected media items and updating the user interface to comprise a control element and a second portion, wherein the first and second portions are concurrently displayed and are each scrollable along a different axis, and the second portion displays image content of the set and the control element enables a user to initiate the creation of the video based on the set of selected media items; and creating the video based on video content of the set of selected media items.
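The selection flow in this abstract — toggling items into a selected set, revealing the second portion and control element once the set is non-empty, then creating the video from the set — can be modeled as a small state object. The class and method names below are hypothetical; the real claims concern the UI layout, not any particular code structure.

```python
class SelectionState:
    """Tracks the ordered set of selected media items driving the UI."""

    def __init__(self):
        self.selected = []              # selection order is preserved
        self.show_second_portion = False  # second portion + create control hidden initially

    def toggle(self, media_id):
        # Selecting an item adds it; selecting again removes it.
        if media_id in self.selected:
            self.selected.remove(media_id)
        else:
            self.selected.append(media_id)
        # The second portion and its control element appear only
        # while at least one media item is selected.
        self.show_second_portion = bool(self.selected)

    def create_video(self, clips):
        """Concatenate video content of the selected items, in selection order."""
        return [clips[m] for m in self.selected]

state = SelectionState()
state.toggle("a")
state.toggle("b")
video = state.create_video({"a": "clip_a", "b": "clip_b"})
```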
System and Method for Media Synchronization and Collaboration
An improved system and method for media synchronization and collaboration involves a data storage, a plurality of media recording devices used by a plurality of users to independently record an event from multiple locations, thereby producing a plurality of recorded media data corresponding to a plurality of views of the event, and a media player comprising a processor and a graphical user interface. Each of the plurality of media recording devices conveys to the data storage media data and metadata corresponding to its respective view of the event, where the metadata includes time samples in accordance with a common time reference. The media player uses the metadata to synchronize and play the plurality of views of the event. The graphical user interface can be used to select views of the plurality of views of the event to be played during periods of time as part of an overall timeline of a composite video that consists of a sequence of selected views. The media player is configured to create a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event. The multiple view video data package may include a composite video produced using the plurality of media files and metadata files. The media player can be configured to post the multiple view video data package to an internet media-sharing website, thereby enabling users of the internet media-sharing website to at least one of comment on the composite video, rate the composite video, or download the plurality of media files and the plurality of metadata files.
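The synchronization step above relies on each device's metadata carrying time samples against a common time reference. A minimal sketch of that alignment: compute each clip's offset from the earliest start, then map a composite-timeline segment into each selected view's local time. The field names and data shapes are illustrative assumptions.

```python
def align_clips(clips):
    """clips: list of dicts with 'start' = recording start time in seconds
    on the common time reference. Returns each clip's offset from the
    earliest start, so a player can seek all views to the same moment."""
    t0 = min(c["start"] for c in clips)
    return [c["start"] - t0 for c in clips]

def to_local_time(clip, common_t):
    # Convert a common-reference timestamp into a clip's local playback position.
    return common_t - clip["start"]

clips = [
    {"start": 10.0},  # view 0 began recording first
    {"start": 12.5},  # view 1 started 2.5 s later
    {"start": 11.0},  # view 2 started 1.0 s later
]
offsets = align_clips(clips)
# The same moment (common time 15.0 s) in each view's local timeline:
local_positions = [to_local_time(c, 15.0) for c in clips]
```

Selecting a view for a segment of the composite timeline then reduces to seeking that view to `to_local_time(clip, segment_start)`.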
System and Method for Automated Video Editing
A system and method for automated video editing. A reference media is selected and analyzed. At least one video may be acquired and thereby synced to the reference audio media. Once synced, audio analysis is used to assemble an edited video. The audio analysis can incorporate additional information, including user inputs, video analysis, and metadata. The system and method for automated video editing may be applied to collaborative creation, simulated stop motion animation, and real-time implementations.
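One common form of "audio analysis used to assemble an edited video" is cutting between synced videos at detected beats. The sketch below assumes beat times have already been extracted from the reference audio and alternates the available videos across the resulting segments; this is an illustrative reading of the abstract, not the patented method.

```python
def cut_points_from_beats(beat_times, min_shot_len=1.0):
    """Pick cut points from detected beat times, enforcing a minimum
    shot length so the edit does not flicker between sources."""
    cuts, last = [], 0.0
    for t in beat_times:
        if t - last >= min_shot_len:
            cuts.append(t)
            last = t
    return cuts

def assemble_edit(video_ids, cut_times, total_len):
    """Round-robin the synced videos across the segments defined by the cuts.
    Returns (video_id, segment_start, segment_end) tuples on the shared timeline."""
    bounds = [0.0] + cut_times + [total_len]
    return [(video_ids[i % len(video_ids)], bounds[i], bounds[i + 1])
            for i in range(len(bounds) - 1)]

beats = [0.5, 1.6, 2.0, 3.1]        # e.g. from beat tracking on the reference audio
cuts = cut_points_from_beats(beats)  # drops beats that would make too-short shots
timeline = assemble_edit(["cam_a", "cam_b"], cuts, total_len=4.0)
```

Because every acquired video was first synced to the reference audio, the segment boundaries apply directly to each source without further alignment.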