Patent classifications
G10H2210/071
Method of producing light animation with rhythm of music
A method of producing a light animation synchronized with the rhythm of music is disclosed. An electronic device performs a Fourier transform on a sound signal of music produced by at least one musical instrument, so as to obtain a rhythm diagram of the sound signal. A rhythm change point is then extracted from the rhythm diagram: whenever the intensity of the rhythm diagram changes from increasing to decreasing, the time point of that change is taken as a rhythm change point, and the electronic device transmits a lighting control signal to a light emitting device. After receiving the lighting control signal, the light emitting device emits light based on the lighting control signal, and the successive emissions form the light animation, thereby improving the audience's overall appreciation of the musical performance.
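The change-point extraction described above can be sketched as follows: compute a short-time spectral intensity envelope and mark every point where the intensity turns from increasing to decreasing. Function names, frame sizes, and the simple local-maximum rule are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def rhythm_change_points(signal, sr, frame=1024, hop=512):
    """Return times (s) where short-time intensity turns from rising to falling.

    The magnitude spectrum of each frame stands in for the "rhythm diagram";
    its per-frame sum is the intensity envelope. Every local maximum of that
    envelope is reported as a rhythm change point.
    """
    n_frames = 1 + (len(signal) - frame) // hop
    window = np.hanning(frame)
    intensity = np.empty(n_frames)
    for i in range(n_frames):
        chunk = signal[i * hop : i * hop + frame] * window
        intensity[i] = np.abs(np.fft.rfft(chunk)).sum()
    # increase -> decrease: intensity[i-1] < intensity[i] > intensity[i+1]
    peaks = [i for i in range(1, n_frames - 1)
             if intensity[i - 1] < intensity[i] > intensity[i + 1]]
    return [p * hop / sr for p in peaks]
```

Each returned time would trigger one lighting control signal in the disclosed scheme.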
SYSTEMS, DEVICES, AND METHODS FOR MUSICAL CATALOG AMPLIFICATION SERVICES
Musical catalog amplification services that leverage or deploy a computer-based musical composition system are described. The computer-based musical composition system employs algorithms and, optionally, artificial intelligence to generate new music based on analyses of existing music. The new music may be wholly distinctive from, or may include musical variations of, the existing music. Rights in the new music generated by the computer-based musical composition system are granted to the rights holder(s) of the existing music. In this way, the musical catalog(s) of the rights holder(s) is/are amplified to include additional music assets. The computer-based musical composition system may be tuned so that the new music sounds more like, or less like, the existing music of the rights holder(s). Revenues generated from the new music are shared between the musical catalog amplification service provider and the rights holder(s).
Intelligent system for matching audio with video
An intelligent system for matching audio with video of the present invention provides a video analysis module targeting color tone, storyboard pace, video dialogue, length and category, the director's special requirements, actors' expressions, movement, weather, scenes, buildings, spatial and temporal attributes, and objects, together with a music analysis module targeting recorded music form, sectional turns, style, melody, and emotional tension. An AI matching module then matches the video characteristics from the video analysis module with the musical characteristics from the music analysis module, so as to quickly complete a creative composition-selection function for matching audio with a video.
Band-pass filtering adaptive response method and system for music lamp strip
The present invention provides a band-pass filtering adaptive response method and system for a music lamp strip. The method comprises the following steps: Step 1: obtaining sound data acquired by a microphone in real time, and sequentially filtering the obtained sound data through a low-pass filter; Step 2: classifying the filtered sound data with a volume classifier, so as to map the continuous changes of the sound into a number of discrete classes; Step 3: determining a BPM of the sound data according to the classification result of the volume classifier; Step 4: determining an acquisition frequency of an MCU according to the determined BPM; Step 5: acquiring the classification result of the volume classifier with the MCU at the determined acquisition frequency; Step 6: controlling the color change and/or brightness change of the LED lamps on the music lamp strip according to the classification result acquired by the MCU.
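Steps 2 through 4 can be sketched in a few lines: discretize the volume, count beats from the classified stream, and derive the MCU polling period from the BPM. The thresholds, the beat-counting rule, and the samples-per-beat factor are illustrative assumptions; the patent does not specify them.

```python
def classify_volume(rms, thresholds=(0.02, 0.1, 0.3)):
    """Step 2: map a continuous RMS level to one of a few discrete classes."""
    return sum(rms >= t for t in thresholds)

def estimate_bpm(classes, frames_per_second):
    """Step 3: count rising transitions into the loudest class as beats
    and convert the beat count to beats per minute."""
    beats = sum(1 for a, b in zip(classes, classes[1:])
                if b > a and b == max(classes))
    duration_min = len(classes) / frames_per_second / 60
    return beats / duration_min if duration_min else 0.0

def mcu_poll_interval(bpm, samples_per_beat=4):
    """Step 4: derive the MCU acquisition period, polling a few times per beat."""
    return 60.0 / (bpm * samples_per_beat) if bpm else None
```

The MCU would then read the classifier output every `mcu_poll_interval` seconds and drive the LED colors (Steps 5 and 6).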
Mobile App riteTune to provide music instrument players instant feedback on note pitch and rhythm accuracy based on sheet music
A tool is needed for music instrument learners to get feedback on the correctness of their performance of a particular piece of music. The invention disclosed here is such a tool, providing music instrument players instant feedback on note pitch and rhythm accuracy based on sheet music. This is accomplished through audio signal processing, sheet music image processing, and conversion of both the analogue images and the audio signals into a standard digital music representation, so that a comparison can be made and feedback can be presented to the player. An advanced feature will allow users to save the data to the cloud and retrieve it later to compare progress. It will also allow users to participate in online competitions with other players of the same piece of music.
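Once both the sheet music and the performance have been converted to a common digital representation, the comparison step reduces to aligning note lists. A minimal sketch, assuming notes are already available as (MIDI pitch, onset time) pairs; the tolerance, names, and note-by-note pairing are illustrative, not from the disclosure.

```python
def compare_performance(reference, played, onset_tol=0.1):
    """Compare a played note list with the sheet-music reference.

    Each note is (midi_pitch, onset_seconds). Pitch detection and
    optical music recognition are assumed to have produced these lists.
    Returns overall pitch accuracy, rhythm accuracy, and per-note results.
    """
    results = []
    for (ref_pitch, ref_onset), (pitch, onset) in zip(reference, played):
        results.append({
            "pitch_ok": pitch == ref_pitch,
            "rhythm_ok": abs(onset - ref_onset) <= onset_tol,
        })
    n = len(results) or 1
    pitch_acc = sum(r["pitch_ok"] for r in results) / n
    rhythm_acc = sum(r["rhythm_ok"] for r in results) / n
    return pitch_acc, rhythm_acc, results
```

The per-note results could drive the instant on-screen feedback, while the aggregate scores are what would be saved to the cloud for progress comparison.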
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM
An information processing apparatus according to the present disclosure includes: a storage unit that stores a plurality of pieces of music feature information in which a plurality of types of feature amounts extracted from music information is associated with predetermined identification information, the music feature information being used as learning data in composition processing using machine learning; a reception unit that receives instruction information transmitted from a terminal apparatus; an extraction unit that extracts the music feature information from the storage unit according to the instruction information; and an output unit that outputs presentation information of the music feature information extracted by the extraction unit.
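The apparatus above is essentially a keyed feature store with instruction-driven extraction. A minimal sketch, with illustrative class, method, and field names (the abstract names only the functional units, not an interface):

```python
class MusicFeatureStore:
    """Feature amounts keyed by identification info (storage unit),
    looked up per an instruction (reception and extraction units) and
    returned as presentation info (output unit)."""

    def __init__(self):
        self._store = {}

    def add(self, ident, features):
        # store one piece of music feature information under its identifier
        self._store[ident] = features

    def extract(self, instruction):
        # instruction: {"ident": ..., "types": [...]} -- select only the
        # requested feature types; omit "types" to get everything
        feats = self._store.get(instruction["ident"], {})
        wanted = instruction.get("types")
        return {k: v for k, v in feats.items() if not wanted or k in wanted}
```

In the disclosed system, the extracted dictionary would be the learning data presented for composition processing by machine learning.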
System and method for generating musical score
A method for generating a musical score based on user performance during playing a keyboard instrument may include detecting a status change of a plurality of execution devices of the keyboard instrument. The method may include generating a first signal according to the detected status change. The method may include generating a second signal indicating a plurality of timestamps. The method may include determining a tune of the musical score based on the first signal. The method may include determining a rhythm of the musical score based on the second signal. The method may further include generating the musical score based on the tune and the rhythm of the musical score.
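The two signals described above (key status changes for the tune, timestamps for the rhythm) can be combined into score notes by pairing key-down/key-up events and quantizing their times to a beat grid. A sketch under assumed names; the grid resolution and event format are illustrative.

```python
def build_score(events, bpm=120, grid=0.25):
    """Turn key status changes into (pitch, start_beat, length_beats) notes.

    `events` is a list of (timestamp_s, key, is_pressed) tuples. Which keys
    changed state gives the tune; quantizing the timestamps to a grid of
    `grid` beats gives the rhythm.
    """
    beat = 60.0 / bpm
    down = {}
    notes = []
    for ts, key, pressed in sorted(events):
        if pressed:
            down[key] = ts
        elif key in down:
            start = round(down.pop(key) / beat / grid) * grid
            end = round(ts / beat / grid) * grid
            notes.append((key, start, max(end - start, grid)))
    return sorted(notes, key=lambda n: n[1])
```

A downstream step would render the resulting note list as engraved notation.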
APPARATUS FOR OUTPUTTING AN AUDIO SIGNAL IN A VEHICLE CABIN
Apparatus for outputting an audio signal in a vehicle cabin comprising: at least one audio outputting device configured to output an audio signal comprising at least one audio signal component containing a human voice, particularly a singer's voice, and/or a musical instrument in a vehicle cabin; at least one audio processing device configured to process at least one audio signal output by the at least one audio outputting device so as to suppress the audio signal component in a suppression mode; at least one audio receiving device configured to receive an acoustic human voice signal of at least one person located in the vehicle cabin whilst the audio outputting device outputs the audio signal in the vehicle cabin; and a control device configured to control operation of the audio processing device based on at least one acoustic human voice signal received by the audio receiving device.
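One common, simple way to realize a suppression mode like the one above is center-channel cancellation: lead vocals are usually mixed to the center of a stereo signal, so subtracting one channel from the other removes them while side-panned instruments survive. This is only an illustrative sketch of such a suppression stage, not the patented processing.

```python
import numpy as np

def suppress_center(stereo):
    """Suppression-mode sketch: cancel the center-panned (vocal) component.

    `stereo` is an (n, 2) float array. The left-minus-right difference
    contains only the side-panned material; it is re-panned to both
    channels at half gain so overall loudness stays comparable.
    """
    side = stereo[:, 0] - stereo[:, 1]
    return np.stack([side, -side], axis=1) * 0.5
```

The control device would switch this stage in or out depending on whether a passenger's voice is detected by the audio receiving device.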
Controller for real-time visual display of music
A controller for real-time visual display of music includes a music analysis module and a display control module. The music analysis module receives an audio input, determines human-perceived musical structures and human-felt affect and emotion as a function of the audio input, and outputs a signal corresponding to the determined structure, affect, and emotion. The display control module is operatively coupled to the music analysis module, receives that signal, and controls a visual display as a function thereof to express the determined musical structure, affect, and emotion in a visual manner.
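The display control module's mapping can be sketched as a function from the analysis module's output to display parameters. The abstract does not specify a mapping, so the valence/arousal inputs and the hue/brightness/pulse choices below are assumptions for illustration only.

```python
def display_params(tempo_bpm, valence, arousal):
    """Map tempo plus a valence/arousal emotion estimate (each in [-1, 1])
    to display parameters: warm hues for positive valence, brightness for
    arousal, and one pulse per beat for the musical structure."""
    hue = 240.0 * (1.0 - (valence + 1.0) / 2.0)      # +1 -> 0 (red), -1 -> 240 (blue)
    brightness = 0.3 + 0.7 * (arousal + 1.0) / 2.0   # calm stays dim, excited is bright
    pulse_hz = tempo_bpm / 60.0                      # one pulse per beat
    return {"hue": hue, "brightness": brightness, "pulse_hz": pulse_hz}
```

A renderer would update the visual display from these parameters on every audio analysis frame.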