Patent classifications
G10H2230/015
Playback, recording, and analysis of music scales via software configuration
Playback, recording, and analysis of music scales via software configuration. In an embodiment, a graphical user interface is generated with staff and keyboard canvases, visually representing a music staff and keyboard, respectively, along with a scale input, one or more parameter inputs, and a play input. In response to selection of a scale, the staff canvas is updated to visually represent the notes in the scale. In response to selection of a musical parameter, the staff canvas and/or keyboard canvas are updated to reflect that parameter. In response to selection of the play input, a soundtrack of the scale is output while the note being played is highlighted on the staff canvas and the key associated with that note is highlighted on the keyboard canvas.
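As a minimal sketch of the scale logic such an interface would need (all names here are hypothetical, not from the patent): given a root note and a scale type, derive the note names to draw on the staff canvas and highlight on the keyboard canvas.

```python
# Hypothetical sketch: derive the notes of a scale from a root note and an
# interval pattern, as a GUI like the one described might do internally.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Interval patterns in semitones (major and natural minor shown).
SCALE_INTERVALS = {
    "major": [2, 2, 1, 2, 2, 2, 1],
    "natural_minor": [2, 1, 2, 2, 1, 2, 2],
}

def scale_notes(root: str, scale_type: str) -> list[str]:
    """Return the note names of the scale, root through octave."""
    index = NOTE_NAMES.index(root)
    notes = [root]
    for step in SCALE_INTERVALS[scale_type]:
        index = (index + step) % len(NOTE_NAMES)
        notes.append(NOTE_NAMES[index])
    return notes

print(scale_notes("C", "major"))
# ['C', 'D', 'E', 'F', 'G', 'A', 'B', 'C']
```

A playback routine could then walk this list, sounding each note while highlighting the matching staff position and keyboard key.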
COMPUTER VISION AND MAPPING FOR AUDIO APPLICATIONS
Systems, devices, media, and methods are presented for playing audio sounds, such as music, on a portable electronic device using a digital color image of a note matrix on a map. A computer vision engine, in an example implementation, includes a mapping module, a color detection module, and a music playback module. A camera captures a color image of the map, including a marker and a note matrix. Based on the color image, the computer vision engine detects a token color value associated with each field. Each token color value is associated with a sound sample from a specific musical instrument. A global state map is stored in memory, including the token color value and location of each field in the note matrix. The music playback module, for each column in order, plays the notes associated with one or more of the rows, using the corresponding sound sample, according to the global state map.
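The column-by-column playback can be sketched as follows (a simplified illustration; the data layout and names are assumptions, not taken from the patent): a global state map records the detected token color at each (row, column) field, each color is bound to an instrument sample, and the playback module walks the columns in order, triggering every occupied row in that column.

```python
# Hypothetical sketch of the described playback: the global state map holds
# the token color at each occupied (row, column) field of the note matrix.
GLOBAL_STATE_MAP = {
    (0, 0): "red", (2, 0): "blue",   # column 0: rows 0 and 2 occupied
    (1, 1): "red",                   # column 1: row 1 occupied
}
COLOR_TO_SAMPLE = {"red": "piano_c4.wav", "blue": "drum_kick.wav"}
NUM_ROWS, NUM_COLS = 4, 2

def playback_order(state_map):
    """Yield (column, row, sample) triples in left-to-right column order."""
    for col in range(NUM_COLS):
        for row in range(NUM_ROWS):
            color = state_map.get((row, col))
            if color is not None:
                yield col, row, COLOR_TO_SAMPLE[color]

for col, row, sample in playback_order(GLOBAL_STATE_MAP):
    print(f"column {col}, row {row}: play {sample}")
```

In an actual implementation the color values would come from the color detection module rather than being hard-coded, and each sample would be sent to an audio output instead of printed.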
Wireless Switching System for Musical Instruments and Related Methods
A wirelessly programmed and controlled switching system for use with stringed musical instruments (e.g., guitars) enables a user to seamlessly change pickup coil settings without having to adjust the physical connections between the switch and the pickups.
AUDIOVISUAL COLLABORATION METHOD WITH LATENCY MANAGEMENT FOR WIDE-AREA BROADCAST
Techniques have been developed to facilitate the livestreaming of group audiovisual performances. Audiovisual performances including vocal music are captured and coordinated with performances of other users in ways that can create compelling user and listener experiences. For example, in some cases or embodiments, duets with a host performer may be supported in a sing-with-the-artist style audiovisual livestream in which aspiring vocalists request or queue particular songs for a live radio show entertainment format. The developed techniques provide a communications latency-tolerant mechanism for synchronizing vocal performances captured at geographically-separated devices (e.g., at globally-distributed, but network-connected mobile phones or tablets or at audiovisual capture devices geographically separated from a live studio).
MUSICAL INSTRUMENT TUNER, MUSICAL PERFORMANCE SUPPORT DEVICE AND MUSICAL INSTRUMENT MANAGEMENT DEVICE
The musical instrument tuner includes a sensor device attached to a musical instrument and an operation device able to perform wireless communication with the sensor device. The sensor device includes an acceleration sensor having at least two detection axes, frequency detection means for detecting, as a detected frequency, the frequency of a vibration of musical sound generated through operation of the musical instrument based on an output from the acceleration sensor, and sensor-side communication means for transmitting transmission information, including information regarding the detected frequency, to the operation device. The operation device includes operation-side communication means for receiving the transmission information transmitted from the sensor device, display means, and control means for generating tuning information of the musical instrument and causing the display means to display the tuning information based on the transmission information received from the sensor device.
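The tuning computation on the operation device side can be sketched as follows (a generic equal-temperament calculation, not the patent's specific method): given the detected frequency, find the nearest equal-tempered note relative to A4 = 440 Hz and the deviation from it in cents.

```python
import math

# Hypothetical sketch: convert a detected frequency into tuning information
# (nearest note name, octave, and cents offset), assuming A4 = 440 Hz.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def tuning_info(detected_hz: float, a4_hz: float = 440.0):
    """Return (note name, octave, cents offset) for a detected frequency."""
    # Distance from A4 in semitones, as a real number.
    semitones = 12 * math.log2(detected_hz / a4_hz)
    nearest = round(semitones)
    cents = 100 * (semitones - nearest)
    # A4 is MIDI note 69; derive note name and octave from the MIDI number.
    midi = 69 + nearest
    return NOTE_NAMES[midi % 12], midi // 12 - 1, cents

name, octave, cents = tuning_info(442.0)
print(f"{name}{octave} {cents:+.1f} cents")  # A4, slightly sharp
```

The display means would then render this note name and cents offset, e.g., as a needle or bar indicator.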
REAL-TIME MUSIC GENERATION ENGINE FOR INTERACTIVE SYSTEMS
A real-time music generation engine for an interactive system includes a Musical Rule Set (MRS) unit configured to combine predefined composer input with a real-time control signal into a music signal; a Constructor Automaton (CA) configured to generate a fluid piece of music, via musical handlers, based on rule definitions defined by the predefined composer input within the MRS unit; and a Performance Cluster Unit (PCU) configured to convert the fluid piece of music from the CA into a corresponding music control signal for real-time playback by the interactive system.
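A highly simplified sketch of this MRS-to-CA-to-PCU flow might look as follows (all names and rule contents are hypothetical stand-ins, not the patented design): composer-defined rules constrain the note choices, a real-time control signal such as a game intensity value steers them, and the result is converted into playback events.

```python
import random

# Hypothetical stand-in for a Musical Rule Set: allowed MIDI pitches per
# intensity band, as a composer might predefine.
COMPOSER_RULES = {
    "calm":    [60, 62, 64, 67],      # C major subset
    "intense": [60, 63, 65, 66, 70],  # C minor blues subset
}

def construct_phrase(intensity: float, length: int, rng: random.Random):
    """Constructor-Automaton stand-in: pick rule-allowed pitches,
    steered by the real-time control signal (intensity in [0, 1])."""
    pool = COMPOSER_RULES["intense" if intensity > 0.5 else "calm"]
    return [rng.choice(pool) for _ in range(length)]

def to_control_events(phrase, velocity: int = 90):
    """Performance-Cluster-Unit stand-in: wrap pitches as playback events."""
    return [{"pitch": p, "velocity": velocity, "beat": i}
            for i, p in enumerate(phrase)]

rng = random.Random(0)
for event in to_control_events(construct_phrase(0.8, 4, rng)):
    print(event)
```

A real engine would run this loop continuously, re-reading the control signal each phrase so the music tracks the interactive system's state.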
Method and device for processing, playing and/or visualizing audio data, preferably based on AI, in particular decomposing and recombining of audio data in real-time
The present invention relates to a method for processing and playing audio data, comprising the steps of receiving mixed input data and playing recombined output data. The invention further relates to a device for processing and playing audio data, preferably DJ equipment, comprising an audio input unit for receiving a mixed input signal, a recombination unit, and a playing unit for playing recombined output data. In addition, the present invention relates to a method and a device for representing audio data, e.g., on a display.
OUTDOOR MUSICAL INSTRUMENTS HAVING SMARTPHONE-INTERACTIVE FEATURES AND SMARTPHONE STAND
An outdoor musical instrument, such as may be installed in a playground or other public outdoor recreational area, includes a machine-readable code. The code is readable by a mobile computing device, such as a smartphone or tablet, and, when read, furnishes the device with one or more interactive functions that relate to the musical instrument and provide enhanced play opportunities. The musical instrument may also include a stand on which a user can place the device after reading the code, in order to utilize the interactive functions while playing the instrument.