Patent classification: G10H2220/241
Device for Detecting the Grip Pattern When Playing a Bowed Instrument, and Bowed Instrument Comprising Such a Device
A device for detecting the grip pattern when playing a bowed instrument, having a sensor film arrangeable on the fingerboard for detecting the grip pattern. The sensor film is formed from at least one resistance layer, a conductive layer and a spacer layer arranged in between, and the device has an evaluation circuit by which the resistance changes of the sensor film caused by the grip pattern can be detected. The resistance layer is divided into a number of resistance tracks corresponding to the number of strings of the bowed instrument. The width of each resistance track increases from one end of the sensor film to the other, the upper layer of the sensor film is formed by the conductive layer, and the sensor film has a curvature corresponding to the arch of the fingerboard.
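Because each track's width increases linearly along the fingerboard, the resistance measured from one end to the finger contact point uniquely encodes the contact position. The sketch below models this with a linearly widening track; the constants (`RHO`, `L`, `W0`, `W1`) and function names are illustrative assumptions, not values from the patent.

```python
import math

# Hypothetical model of one resistance track whose width grows linearly
# from W0 at one end to W1 at the other (all values are assumed).
RHO = 100.0            # resistivity per unit length for unit width (assumed)
L = 0.30               # track length in metres (assumed)
W0, W1 = 0.002, 0.008  # track width at each end in metres (assumed)

def resistance_at(x):
    """Resistance from the narrow end to a finger contact at position x.

    Width varies linearly, so resistance per unit length is RHO / w(s);
    integrating gives a logarithmic profile in which every position maps
    to a unique resistance -- the quantity the evaluation circuit reads.
    """
    k = (W1 - W0) / L
    return (RHO / k) * math.log((W0 + k * x) / W0)

def position_from(r):
    """Invert a resistance reading back to a contact position."""
    k = (W1 - W0) / L
    return (W0 * (math.exp(r * k / RHO) - 1.0)) / k
```

The round trip `position_from(resistance_at(x))` recovers `x` to floating-point precision, which is the property the varying track width is there to provide.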
Ergonomic electronic musical instrument with pseudo-strings
An ergonomic, portable, electronic, string-like instrument that utilizes a string-like interface. The string-like interface is tactile for sightless playability and capable of advanced input such as force and pressure sensitivity. The string-like interface functions to select a note, trigger a selected note, select and play a note on the instrument or an external peripheral. The instrument is played using the techniques of multiple stringed instruments and the ergonomics allow the user to hold and handle the device consistent with playing techniques familiar to musicians of multiple instruments. It is internally or externally powered and connects directly to industry-standard musical hardware such as MIDI devices, amplifiers and multi-track recorders.
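The force- and pressure-sensitive string-like interface ultimately drives industry-standard MIDI hardware. One plausible piece of that path is scaling a force reading into a MIDI note-on velocity; the sensor range and function names below are assumptions for illustration only.

```python
def force_to_velocity(force, f_min=0.1, f_max=5.0):
    """Clamp a force reading (assumed range in newtons) into MIDI velocity 1..127."""
    f = min(max(force, f_min), f_max)
    return 1 + round((f - f_min) / (f_max - f_min) * 126)

def note_on_message(channel, note, velocity):
    """Raw 3-byte MIDI note-on: status 0x90 | channel, then note and velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])
```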
Separate isolated and resonance samples for a virtual instrument
A virtual instrument can manage separate static and dynamic samples for various notes that can be played by the virtual instrument. In some cases, the static samples correspond to resonance sounds recorded for an instrument and are the same for every note. However, the dynamic samples may correspond to isolated sounds that are recorded for each variation of a note that can be played. In response to a user's selection of a note on a user interface of the virtual instrument, the virtual instrument may determine a rule for layering the various static and dynamic samples for playback.
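A minimal sketch of such a layering rule, assuming the samples are plain arrays of audio frames: the dynamic (isolated) layer is looked up per note and variation, while the static (resonance) layer is shared by every note. The sample data, gains, and names are illustrative assumptions.

```python
# One shared resonance (static) buffer and per-note/per-variation
# isolated (dynamic) buffers, represented as short lists of floats
# standing in for audio frames (assumed data).
STATIC_RESONANCE = [0.05, 0.04, 0.03]   # identical for every note
DYNAMIC_SAMPLES = {
    ("C4", "soft"): [0.2, 0.1, 0.0],
    ("C4", "hard"): [0.9, 0.5, 0.2],
}

def layer(note, variation, dynamic_gain=1.0, static_gain=0.5):
    """Mix the note's isolated sample with the shared resonance sample.

    A minimal layering rule: the dynamic layer is selected per note
    variation, the static layer is common to all notes.
    """
    dyn = DYNAMIC_SAMPLES[(note, variation)]
    return [d * dynamic_gain + s * static_gain
            for d, s in zip(dyn, STATIC_RESONANCE)]
```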
Polyphonic multi-dimensional controller with sensor having force-sensing potentiometers
A polyphonic multi-dimensional controller (PMC) provides for independently expressing multiple concurrently sounding musical notes. The PMC includes rows and columns of force-sensing potentiometers (FSPs) that define an array of single-touch zones (STZs). Using a z-axis switch configuration, touches are detected and the forces associated with the touches are measured. For STZs for which a touch is detected, a fine x position and a fine y position are determined respectively using an x-axis switch configuration and a y-axis switch configuration. By repeatedly scanning the STZs, the x-axis position, the y-axis position, and the z-axis force can be tracked and translated into 3-axis note expression data. The PMC is multi-touch so that the 3-axis note expression data can be polyphonic.
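One scan pass over the zone array can be sketched as follows: a touch is detected where the z-axis force exceeds a threshold, and the fine x/y values are then read for that zone, so several concurrent touches yield polyphonic 3-axis expression data. The data layout, threshold, and names are assumptions, not the patent's actual signal format.

```python
FORCE_THRESHOLD = 0.05  # assumed normalized force threshold

def scan(zones):
    """Return 3-axis expression data for every touched single-touch zone.

    zones: dict mapping (row, col) -> (raw_x, raw_y, force), with all
    values assumed normalized to 0..1. Repeated calls would track each
    touch's x, y, and z over time.
    """
    events = []
    for (row, col), (raw_x, raw_y, force) in zones.items():
        if force > FORCE_THRESHOLD:  # z-axis touch detection
            events.append({"zone": (row, col),
                           "x": raw_x, "y": raw_y, "z": force})
    return events
```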
Storage medium, tone generation assigning apparatus and tone generation assigning method
When the harmony function is in the on state, a generator determines a chord based on one or more notes of depressed keys in a chord key area, automatically generates one or more additional notes having pitches which harmonize with the pitch of the note of the depressed key in a performance key area according to the determined chord, and inputs the "note of the depressed key in the performance key area + additional notes" to an assignment controller. The assignment controller assigns plural parts (timbres) to the inputted notes, and sounds the notes in the timbres of the plural parts according to the assignment. That is, the tone generation assignment process is performed not only for notes played by the user but also for the automatically generated additional notes.
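The chord-detect, harmonize, and assign steps can be sketched as below. The chord table, the round-robin assignment rule, and the part names are illustrative assumptions; the patent does not prescribe these particulars.

```python
# Interval patterns above the root for two chord qualities (standard
# music theory; the lookup itself is an assumed implementation detail).
CHORD_INTERVALS = {"major": (0, 4, 7), "minor": (0, 3, 7)}

def detect_chord(chord_area_notes):
    """Classify the chord from the intervals of the held chord-area keys."""
    root = min(chord_area_notes)
    intervals = tuple(sorted((n - root) % 12 for n in chord_area_notes))
    for name, pattern in CHORD_INTERVALS.items():
        if intervals == pattern:
            return root, name
    return root, "major"  # assumed fallback

def harmonize(chord_area_notes, performance_note,
              parts=("piano", "strings", "brass")):
    """Generate additional chord-tone notes above the performance note,
    then assign each resulting note to a part (round-robin, one
    possible assignment rule)."""
    _, quality = detect_chord(chord_area_notes)
    extra = [performance_note + i for i in CHORD_INTERVALS[quality][1:]]
    notes = [performance_note] + extra
    return [(parts[i % len(parts)], n) for i, n in enumerate(notes)]
```

For example, holding C-E-G (MIDI 60, 64, 67) in the chord key area while playing C5 (72) in the performance key area yields C5 plus the major-chord tones above it, each assigned to a different part.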
System, method and computer program product for generating musical notes via a user interface touch pad
A music generating system enables a user to generate independent and sequential musical melodies, having reduced latency periods, without playing an electronic percussion instrument. The music generating system includes an electronic percussion instrument, a MIDI controller coupled to the electronic percussion instrument, a unidirectional USB communication link coupled to the MIDI controller, a MIDI converter coupled to the unidirectional USB communication link, a bidirectional USB-MIDI communication link coupled to the MIDI converter, and a rhythm drum machine in communication with the bidirectional USB-MIDI communication link. The MIDI converter is configured to independently and sequentially receive and learn a first audio control signal and a second audio control signal in a non-overlapping pattern, and thereby to independently and sequentially generate and transmit to the rhythm drum machine a first musical melody and a second musical melody corresponding to the first audio control signal and the second audio control signal, respectively.
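The converter's learn-then-generate behaviour can be sketched as a simple registry: control signals are learned one at a time (non-overlapping), and each later triggers the melody registered for it. The class and method names are assumed for illustration.

```python
class MidiConverter:
    """Sketch of a converter that learns control signals sequentially
    and later emits the melody registered for each signal."""

    def __init__(self):
        self._melodies = {}

    def learn(self, control_signal_id, melody):
        # Signals are learned independently and sequentially; each
        # maps to its own melody (a list of MIDI note numbers here).
        self._melodies[control_signal_id] = list(melody)

    def generate(self, control_signal_id):
        # Return the melody for a previously learned control signal,
        # as would be transmitted onward to the rhythm drum machine.
        return list(self._melodies[control_signal_id])
```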
Control methods for musical performance
A method for generating music is provided, the method comprising: receiving, on a capacitive touch-sensitive interface such as a keyboard, multi-finger gesture inputs having a first component and a second component, wherein the second component has a temporal evolution such as speed; determining the onset of an audio signal, such as a tone, based on the first component; analyzing the temporal evolution of the second component to determine MIDI or Open Sound Control (OSC) instructions; and modifying the audio signal based on the instructions, in particular by decoupling the temporal relationship between specific gesture inputs (e.g., at key onset, during a note, and upon key release), thus mapping gesture and motion inputs to obtain previously unachievable musical effects with music synthesizers.
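The two-component mapping can be sketched as follows: the first component's position selects and triggers the note, while the average speed of the second component is turned into a MIDI control-change message that modifies the sounding note. The scaling constants, ranges, and function names are assumptions.

```python
def onset_note(first_component_pos, base_note=48):
    """Map an assumed 0..1 touch position onto a semitone above base_note."""
    return base_note + int(first_component_pos * 24)

def speed_to_cc(samples, cc_number=1):
    """Turn (time_s, position) samples of the gesture's second component
    into one MIDI control-change tuple (status, controller, value).

    Average speed over the sampled span is scaled into the 0..127
    controller range (modulation wheel, CC 1, by default).
    """
    (t0, p0), (t1, p1) = samples[0], samples[-1]
    speed = abs(p1 - p0) / (t1 - t0)
    value = min(127, int(speed * 127))
    return (0xB0, cc_number, value)
```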
Playback, recording, and analysis of music scales via software configuration
Playback, recording, and analysis of music scales via software configuration. In an embodiment, a graphical user interface is generated with staff and keyboard canvases, visually representing a music staff and keyboard, respectively, a scale input, parameter input(s), and a play input. In response to selection of a scale, the staff canvas is updated to visually represent the notes in the scale. In response to selection of a musical parameter, the staff canvas and/or keyboard canvas are updated to reflect the musical parameter. In response to selection of the play input, a soundtrack of the scale is output, while simultaneously highlighting the note being played on the staff canvas and the key associated with that note on the keyboard canvas.
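The playback step can be sketched as building the scale's notes and then walking them in order, emitting one highlight event per note (standing in for the simultaneous staff and keyboard canvas updates). The interval patterns are standard music theory; the function names are assumed.

```python
# Whole/half-step patterns for two common scales (standard theory).
SCALE_STEPS = {"major": (2, 2, 1, 2, 2, 2, 1),
               "natural_minor": (2, 1, 2, 2, 1, 2, 2)}

def scale_notes(root_midi, kind="major"):
    """Build the MIDI note numbers of one octave of the scale."""
    notes = [root_midi]
    for step in SCALE_STEPS[kind]:
        notes.append(notes[-1] + step)
    return notes

def play_with_highlight(root_midi, kind="major"):
    """Yield (note, 'highlight') events in playback order, standing in
    for updating the staff and keyboard canvases as each note sounds."""
    return [(n, "highlight") for n in scale_notes(root_midi, kind)]
```

For example, a C major scale from middle C (MIDI 60) produces the eight notes 60 through 72, each paired with a highlight event.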