Patent classifications
G10H2210/141
REAL-TIME JAMMING ASSISTANCE FOR GROUPS OF MUSICIANS
Real-time jamming assistance is provided automatically for musicians. A real-time audio signal of music played by at least one person is received. The beat of the played music is tracked from the real-time audio signal, and the time of the next beat is accordingly predicted. At least one of chords, notes, and drum sounds is recognized from the real-time audio signal, and repetitions in the played music are accordingly detected. A next development in the played music is predicted, based on the detected repetitions, including at least one of the chords, notes, and drum sounds that will be played next, with respective timing based on the predicted time of the next beat. A real-time output is produced based on the predicted next development in the played music.
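The next-beat prediction step can be illustrated with a minimal sketch. It assumes onset times (in seconds) have already been extracted from the real-time audio signal, and it approximates beat tracking by averaging recent inter-onset intervals; the function name and interface are illustrative, not taken from the patent.

```python
def predict_next_beat(onset_times):
    """Estimate the time of the next beat from recent onset times.

    A simplification of full beat tracking: tempo is taken as the
    mean inter-onset interval of the observed onsets.
    """
    if len(onset_times) < 2:
        raise ValueError("need at least two onsets to estimate tempo")
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    period = sum(intervals) / len(intervals)  # mean inter-onset interval
    return onset_times[-1] + period

# Steady 120 BPM playing (0.5 s between beats):
# predict_next_beat([0.0, 0.5, 1.0, 1.5]) -> 2.0
```

A production beat tracker would also weight recent intervals more heavily and reject outlier onsets, but the prediction principle is the same.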
Pseudo-live music and sound
A method and apparatus for the creation and playback of music and/or sound, so that sound sequences are generated that vary from one playback to another. In one embodiment, during composition creation, artist(s) may define how the composition may vary from playback to playback using visually interactive display(s). The artist's definition may be embedded into a composition data set. During playback, a composition data set may be processed by a playback device and/or a playback program, so that each time the composition is played back a unique version may be generated. Variability during playback may include: the variable selection of alternative sound segment(s); variable editing of sound segment(s) during playback processing; variable placement of sound segment(s) during playback processing; the spawning of group(s) of alternative sound segments from initiating sound segment(s); and the combining and/or mixing of alternative sound segments in one or more sound channels. MIDI-like variable compositions, and the variable use of sound segments comprised of a timed sequence of MIDI-like commands, are also disclosed.
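The "variable selection of alternative sound segments" idea can be sketched as follows. The composition data structure (a list of slots, each holding alternative segment file names) and the function name are assumptions for illustration; the patent's actual data set format is not specified here.

```python
import random

def render_playback(composition, seed=None):
    """Pick one alternative per slot, so each playback can differ.

    `composition` is a list of slots; each slot is a list of
    alternative sound segments (here, file names). A slot with a
    single entry always plays that entry.
    """
    rng = random.Random(seed)
    return [rng.choice(alternatives) for alternatives in composition]

# A toy composition: two variable slots around a fixed verse.
composition = [
    ["intro_a.wav", "intro_b.wav"],
    ["verse.wav"],
    ["outro_a.wav", "outro_b.wav", "outro_c.wav"],
]
```

Calling `render_playback(composition)` repeatedly yields different segment sequences, which is the core of playback-to-playback variability described above.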
Methods, Devices and Computer Program Products for Interactive Musical Improvisation Guidance
A method and a device provide a user with the ability to freely improvise or play around with a selection of different chords while being given visual guidance that assists the improvisation of melody, along with accompaniment consistent with the selection of chords. Sound files consistent with a user selection of a chord and/or a dynamic level are selected from an audio library and played, while the user is given visual cues on a user-interface keyboard assisting the user to select notes that are consistent with the chord selected for the accompaniment.
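The visual-cue step amounts to computing which notes are consistent with the selected chord. A minimal sketch, assuming a simple interval table for a few chord qualities (the table contents and names are illustrative, not from the patent):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Semitone offsets from the root for a few common chord qualities.
CHORD_INTERVALS = {"maj": [0, 4, 7], "min": [0, 3, 7], "7": [0, 4, 7, 10]}

def chord_tones(root, quality):
    """Return the note names a UI keyboard would highlight for a chord."""
    r = NOTE_NAMES.index(root)
    return [NOTE_NAMES[(r + i) % 12] for i in CHORD_INTERVALS[quality]]

# chord_tones("C", "maj") -> ["C", "E", "G"]
```

A fuller implementation would also highlight scale tones compatible with the chord, not just chord tones, but the mapping from chord selection to highlighted keys works the same way.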
System and method of generating music from electrical activity data
The Plant Choir system comprises a software program and hardware that measures electrical activity of a person, plant, or animal and translates those readings into music on a computing device. The system gathers electrical activity data using electrodermal activity (EDA) measurement devices. The EDA readings of the individual subjects are translated via the software into musical melodies in real time. The individual subject melodies are combined to create interesting harmonies similar to a choir. The music is rendered using a MIDI (Musical Instrument Digital Interface) programming interface of the computer operating system. The software allows the user to select program options and set music and program parameters. Variations in the EDA signal are interpreted as music. Each subject connected to the system is assigned a musical voice, and the voices are combined to create multi-part harmonies similar to a choir.
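The core translation step maps an EDA reading onto a playable pitch. A minimal sketch, assuming a reading range in microsiemens and a C major scale of MIDI note numbers; the range, scale, and function name are illustrative assumptions, not the patent's actual mapping:

```python
def eda_to_midi_note(eda_value, eda_min=0.0, eda_max=10.0,
                     scale=(60, 62, 64, 65, 67, 69, 71, 72)):
    """Map an EDA reading onto a MIDI note of a fixed scale.

    The reading is normalized into [0, 1] over the expected range
    (clamping out-of-range values), then quantized to a scale degree.
    Notes 60..72 are C4..C5 in a C major scale.
    """
    span = max(eda_max - eda_min, 1e-9)          # guard against zero range
    frac = min(max((eda_value - eda_min) / span, 0.0), 1.0)
    index = min(int(frac * len(scale)), len(scale) - 1)
    return scale[index]
```

Running this per subject yields one melodic voice each; playing the resulting notes simultaneously gives the choir-like multi-part harmony described above.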
DEVICE, SYSTEM AND METHOD FOR GENERATING AN ACCOMPANIMENT OF INPUT MUSIC DATA
A device for automatically generating a real time accompaniment of input music data includes a music input that receives music data. A music analyzer analyzes received music data to obtain a music data description including one or more characteristics of the analyzed music data. A query generator generates a query to a music database including music patterns and associated metadata including one or more characteristics of the music patterns, the query being generated from the music data description and from an accompaniment description describing preferences of the real time accompaniment and/or music rules describing general rules of music. A query interface queries the music database using a generated query and receives a music pattern selected from the music database by use of the query. A music output outputs the received music pattern.
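The query-generator and query-interface steps can be sketched as combining the analyzed music description with accompaniment preferences into a metadata filter over the pattern database. The dictionary-based query format and function names are assumptions for illustration:

```python
def build_query(description, preferences):
    """Combine analyzed characteristics with accompaniment preferences
    into a simple metadata query (field -> required value)."""
    query = dict(description)   # e.g. {"key": "A min", "tempo": 92}
    query.update(preferences)   # e.g. {"style": "jazz"}
    return query

def select_pattern(database, query):
    """Return the first music pattern whose metadata satisfies
    every field of the query, or None if nothing matches."""
    for pattern in database:
        if all(pattern["meta"].get(k) == v for k, v in query.items()):
            return pattern
    return None
```

A real system would rank candidates by partial match and apply music rules (e.g. key compatibility) rather than require exact equality, but the query flow is the same.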
Techniques of coordinating sensory event timelines of multiple devices
Embodiments described herein relate to techniques of coordinating sensory event timelines of multiple devices. The devices may use the sensory event timelines to output sensory events such as audio segments. The devices may take turns determining the sensory events to be output by the devices using their sensory event timelines. The techniques coordinate transitions of the devices between a first mode, in which a device is allowed to determine sensory events to be output, and a second mode, in which a device outputs sensory events determined by another device.
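The two-mode turn-taking can be sketched as a leader/follower rotation. The class, mode names, and fixed round-robin handoff are illustrative assumptions; the patent's actual transition protocol is not specified here.

```python
class Device:
    """A device with a sensory event timeline and a coordination mode."""
    LEADER = "leader"       # first mode: determines events to output
    FOLLOWER = "follower"   # second mode: outputs events set by another

    def __init__(self, name):
        self.name = name
        self.mode = Device.FOLLOWER
        self.timeline = []  # scheduled (time, event) pairs

def rotate_leader(devices, current_index):
    """Hand leadership to the next device in a fixed rotation,
    demoting the current leader to follower. Returns the new index."""
    devices[current_index].mode = Device.FOLLOWER
    nxt = (current_index + 1) % len(devices)
    devices[nxt].mode = Device.LEADER
    return nxt
```

While a device is the leader, it appends events to every device's timeline; on each handoff, `rotate_leader` transitions it to the second mode and promotes the next device to the first.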