Patent classifications
G10H2220/455
SOUND PRODUCTION CONTROL APPARATUS, SOUND PRODUCTION CONTROL METHOD, AND STORAGE MEDIUM
A sound production control apparatus that controls a sound production mode on the basis of a player's motion, even during periods when no playing operation is performed. An information obtaining unit 30 obtains detection information by detecting the player's motion. A sound processing unit 36 produces sound on the basis of detection information obtained from motion that generates a sound trigger, and controls the sound production mode on the basis of detection information obtained from motion that generates no sound trigger.
SYSTEM AND METHOD FOR PROVIDING ELECTRONIC MUSICAL SCORES
A method of distributing digital musical scores for performance by a plurality of musicians. The method includes: receiving a musical score request from a user device; obtaining, via a musical score source, a set of musical score components of a musical part for a piece of music associated with the musical score request; obtaining a transmission delay value associated with a communication path; providing, from the musical score source and using a processor, at least one of the set of musical score components to the user device with a first distribution timing based on the obtained transmission delay value; and providing a set of corresponding musical score components of a different musical part for the piece of music to a different user device based on a different transmission delay value associated with that device, so as to enable coordinated playback by the users without a conductor.
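The distribution timing described above can be sketched as a simple send-time calculation: a device on a slower path is served earlier so that all parts arrive together. The function and variable names below are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch: schedule score delivery so parts arrive in sync.
# Names (schedule_send, target_start) are illustrative, not from the patent.

def schedule_send(target_start: float, transmission_delay: float) -> float:
    """Return the time at which a score component should be sent so it
    arrives at the user device at target_start (shared-clock seconds)."""
    return target_start - transmission_delay

# Two musicians with different network delays receive their parts timed
# so that coordinated playback needs no conductor.
start = 100.0                                # common playback start time
send_violin = schedule_send(start, 0.250)    # 250 ms path delay
send_cello = schedule_send(start, 0.040)     # 40 ms path delay
assert send_violin < send_cello              # the slower path is served first
```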
SYSTEMS AND METHODS FOR CAPTURING AND INTERPRETING AUDIO
A device is provided for capturing vibrations produced by an object, such as the drum head of a drum kit or another musical instrument. The device comprises a detectable element (for example, a ferromagnetic element such as a metal shim) and a sensor spaced apart from, and located relative to, the musical instrument. The detectable element is located between the sensor and the musical instrument. When the instrument vibrates, the sensor remains stationary while the instrument vibrates the detectable element relative to the sensor.
Network-based processing and distribution of multimedia content of a live musical performance
Methods, systems, and computer program products for network-based processing and distribution of multimedia content of a live performance are disclosed. In some implementations, recording devices can be configured to record a multimedia event (e.g., a musical performance). The recording devices can provide the recordings to a server while the event is ongoing. The server automatically synchronizes, mixes and masters the recordings. The server performs the automatic mixing and mastering using reference audio data previously captured during a rehearsal. The server streams the mastered recording to multiple end users through the Internet or other public or private network. The streaming can be live streaming.
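The synchronization step — aligning live recordings against reference audio captured at rehearsal — can be sketched as a brute-force cross-correlation that finds the sample offset of best agreement. The function names and toy signals below are assumptions for illustration; the patent does not disclose a specific alignment algorithm.

```python
# Hedged sketch of the synchronization step: estimate the offset between
# a live recording and reference audio by brute-force cross-correlation.
# Names and signals are illustrative, not from the patent.

def best_offset(reference, recording, max_lag):
    """Return the lag (in samples) that maximizes the correlation
    between the reference audio and the recording."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, r in enumerate(reference):
            j = i + lag
            if 0 <= j < len(recording):
                score += r * recording[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

reference = [0.0, 0.0, 1.0, 0.5, 0.0]
recording = [1.0, 0.5, 0.0, 0.0, 0.0]   # same pulse, two samples earlier
print(best_offset(reference, recording, 3))  # → -2
```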
Gesture-controlled virtual reality systems and methods of controlling the same
Gesture-controlled virtual reality systems and methods of controlling the same are disclosed herein. An example apparatus includes an on-body sensor to output first signals associated with at least one of movement of a body part of a user or a position of the body part relative to a virtual object and an off-body sensor to output second signals associated with at least one of the movement or the position relative to the virtual object. The apparatus also includes at least one processor to generate gesture data based on at least one of the first or second signals, generate position data based on at least one of the first or second signals, determine an intended action of the user relative to the virtual object based on the position data and the gesture data, and generate an output of the virtual object in response to the intended action.
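One plausible reading of the apparatus above is a fusion step (combining on-body and off-body estimates) followed by an intent rule over the fused position and the recognized gesture. The fusion rule, thresholds, and labels below are assumptions for illustration only.

```python
# Hedged sketch: fuse two sensor estimates, then classify intent from
# gesture plus proximity to a virtual object. All names, the averaging
# rule, and the grab radius are assumptions, not from the patent.

def fuse(first_signal, second_signal):
    """Average the two sensor estimates when both are available;
    otherwise fall back to whichever sensor reported."""
    readings = [s for s in (first_signal, second_signal) if s is not None]
    return sum(readings) / len(readings)

def intended_action(gesture, position, virtual_object_pos, grab_radius=0.1):
    """Classify intent: a 'grab' gesture near the object means pick it up."""
    distance = abs(position - virtual_object_pos)
    if gesture == "grab" and distance <= grab_radius:
        return "pick_up"
    return "none"

pos = fuse(0.48, 0.52)                    # on-body and off-body estimates
print(intended_action("grab", pos, 0.5))  # → pick_up
```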
COMPUTER VISION AND MAPPING FOR AUDIO APPLICATIONS
Systems, devices, media, and methods are presented for playing audio sounds, such as music, on a portable electronic device using a digital color image of a note matrix on a map. A computer vision engine, in an example implementation, includes a mapping module, a color detection module, and a music playback module. A camera captures a color image of the map, including a marker and a note matrix. Based on the color image, the computer vision engine detects a token color value associated with each field. Each token color value is associated with a sound sample from a specific musical instrument. A global state map is stored in memory, including the token color value and location of each field in the note matrix. The music playback module, for each column in order, plays the notes associated with one or more of the rows, using the corresponding sound sample, according to the global state map.
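The column-by-column playback over the global state map can be sketched as a nested loop: columns advance in time, and every row holding a token in the current column triggers the sample bound to its color. The state-map keys, color names, and sample filenames below are assumptions, not the patent's actual data structures.

```python
# Illustrative sketch of column-by-column playback from a global state
# map. Keys, colors, and sample names are assumptions for illustration.

# global state map: (column, row) -> token color value detected in image
state_map = {
    (0, 0): "red", (0, 2): "blue",
    (1, 1): "red",
}
# each token color is bound to a sound sample of a specific instrument
samples = {"red": "piano_C4.wav", "blue": "drum_kick.wav"}

def play_map(state_map, samples, num_columns, num_rows):
    """Play each column in order; within a column, trigger every row
    that holds a token, using the sample for its color."""
    triggered = []
    for col in range(num_columns):       # time advances column by column
        for row in range(num_rows):      # rows in a column sound together
            color = state_map.get((col, row))
            if color is not None:
                triggered.append((col, samples[color]))
    return triggered

print(play_map(state_map, samples, 2, 3))
# → [(0, 'piano_C4.wav'), (0, 'drum_kick.wav'), (1, 'piano_C4.wav')]
```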
SYSTEM FOR CONVERTING IMAGES INTO SOUND SPECTRUM
Disclosed is a system for the real-time acquisition, analysis, and conversion of the visual spectrum of shapes, images, colors, and signs into a sound spectrum. The system is usable in various communicative contexts, such as interactive video games, computer science, neuroscience and neuroimaging in the medical field, the visual and social arts, and pedagogy. It is characterized in that it includes a hardware component for the analog optical acquisition of still or dynamic images present on a transparent flat surface of the hardware component, and a software component for processing the acquired images and converting their visual spectrum into a sound spectrum.
LEARNING PROGRESSION FOR INTELLIGENCE BASED MUSIC GENERATION AND CREATION
An artificial intelligence (AI) method includes generating a first musical interaction behavioral model. The first musical interaction behavioral model causes an interactive electronic device to perform a first set of musical operations and a first set of motional operations. The AI method further includes receiving user inputs provided in response to the performance of the first set of musical operations and the first set of motional operations, and determining a user learning progression level based on the user inputs. In response to determining that the user learning progression level is above a threshold, the AI method includes generating a second musical interaction behavioral model. The second musical interaction behavioral model causes the interactive electronic device to perform a second set of musical operations and a second set of motional operations. The AI method further includes performing the second set of musical operations and the second set of motional operations.
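The threshold decision at the core of this method can be sketched as a score over the user inputs that gates the switch from the first behavioral model to the second. The scoring rule, threshold value, and model encoding below are illustrative assumptions, not disclosed in the abstract.

```python
# Minimal sketch of the progression decision: advance to the next
# behavioral model once the learner's level crosses a threshold.
# The scoring rule, THRESHOLD, and integer model IDs are assumptions.

THRESHOLD = 0.8

def assess(user_inputs):
    """Map user responses (1 = correct, 0 = incorrect) to a level in [0, 1]."""
    return sum(user_inputs) / len(user_inputs)

def next_model(current_model, user_inputs):
    """Return the next behavioral model ID: advance only when the
    learning progression level is above the threshold."""
    level = assess(user_inputs)
    if level > THRESHOLD:
        return current_model + 1   # generate the second behavioral model
    return current_model           # keep performing the first

print(next_model(1, [1, 1, 1, 1, 0]))  # level 0.8, not above threshold → 1
print(next_model(1, [1, 1, 1, 1, 1]))  # level 1.0 → advance to model 2
```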
INFORMATION PROCESSING SYSTEM AND COMPUTER SYSTEM IMPLEMENTED METHOD OF PROCESSING INFORMATION
An information processing system includes an image obtaining circuit and a display control circuit. The image obtaining circuit is configured to obtain observation images of a first keyboard of a first keyboard instrument. The display control circuit is configured to display, on a display device, the observation images and reference images. The reference images include moving images of at least one hand and one or more fingers of a reference performer who is playing a second keyboard of a second keyboard instrument. The at least one hand and the one or more fingers of the reference performer are displayed overlapping the first keyboard included in the observation images.
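The overlapping display of the reference performer's hands on the observed keyboard can be sketched as an alpha-blend of the two image streams. The grayscale pixel model and blending weight below are illustrative assumptions; the abstract does not specify how the compositing is performed.

```python
# Hedged sketch of the overlay step: alpha-blend a reference performer's
# hand image over the observed keyboard image. The grayscale pixel rows
# and the alpha weight are assumptions, not from the patent.

def overlay(observation, reference, alpha=0.5):
    """Alpha-blend reference pixels over observation pixels (grayscale)."""
    return [round((1 - alpha) * o + alpha * r)
            for o, r in zip(observation, reference)]

observed_row = [200, 200, 200, 200]   # bright keyboard pixels
reference_row = [0, 0, 120, 120]      # darker hand-silhouette pixels
print(overlay(observed_row, reference_row))  # → [100, 100, 160, 160]
```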