Patent classifications
G10H2210/525
MULTIDIMENSIONAL GESTURES FOR MUSIC CREATION APPLICATIONS
A graphical user interface for music creation applications, such as score notation applications and digital audio workstations, includes multi-dimensional gestures. To enter a sound event into a musical project, a user selects and drags a desired sound event in one or more dimensions with an input device. The relative position or rate of movement along a given dimension defines the value of the sound event parameter allocated to that dimension. The sound event is entered into the project when the selection is released. The user inputs the gesture using a pointing device such as a mouse, a stylus with a touch screen, or a finger on a touch screen. Stylus dimensions mapped to sound event parameters may include horizontal and vertical stylus tip positions, vertical and horizontal tilt of the stylus, and stylus tip pressure. Sound event parameters controlled by the gestures may include diatonic pitch, chromatic inflection, and duration.
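The dimension-to-parameter allocation this abstract describes can be sketched as a simple mapping function. All names, scalings, and thresholds below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the abstract's gesture mapping: each input
# dimension (horizontal travel, vertical travel, tilt, pressure) is
# allocated to one sound event parameter.

DIATONIC = ["C", "D", "E", "F", "G", "A", "B"]

def sound_event_from_gesture(x_steps, y_px, tilt_deg, pressure):
    """Translate a drag gesture into a sound event.

    x_steps   -- horizontal drag distance in diatonic steps (pitch)
    y_px      -- vertical travel in pixels (duration)
    tilt_deg  -- stylus tilt, mapped to chromatic inflection (-1, 0, +1)
    pressure  -- tip pressure in [0, 1], mapped to a velocity value
    """
    pitch = DIATONIC[x_steps % 7]
    # Small tilts count as neutral; larger tilts sharpen or flatten.
    inflection = 0 if abs(tilt_deg) < 15 else (1 if tilt_deg > 0 else -1)
    return {
        "pitch": pitch,
        "inflection": inflection,        # -1 flat, 0 natural, +1 sharp
        "duration": max(1, y_px // 20),  # quantize vertical travel to grid units
        "velocity": int(round(pressure * 127)),
    }
```

On release of the selection, an application would commit the returned event into the project.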
AUTOMATIC MUSIC PLAYING CONTROL DEVICE, ELECTRONIC MUSICAL INSTRUMENT, METHOD OF PLAYING AUTOMATIC MUSIC PLAYING DEVICE, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM
Provided is an automatic music playing control device that issues music-playing instructions and implements natural playback capable of expressing the timing and voicing of live performance of a musical instrument by a player. The automatic music playing control device includes at least one processor. The processor selects, from among a plurality of voicing patterns based on a scale decided according to the tune and chords of a music piece, a voicing pattern corresponding to a combination of a probabilistically selected number of sounds to be emitted and a voicing type decided according to range, and instructs a sound source to emit a chord voiced based on the selected voicing pattern.
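The selection step described in this abstract can be sketched as follows. The pattern table, weights, and register threshold are assumptions for illustration, not the patent's own values:

```python
import random

# Sketch: the number of chord tones is drawn probabilistically, a voicing
# type is decided from the register (range), and the pair indexes into a
# table of voicing patterns derived from the scale.

VOICING_PATTERNS = {
    # (number of tones, voicing type) -> semitone intervals above the root
    (3, "close"): [0, 4, 7],
    (3, "open"):  [0, 7, 16],
    (4, "close"): [0, 4, 7, 11],
    (4, "open"):  [0, 7, 11, 16],
}

def select_voicing(root, register, rng=random):
    """Pick a voicing pattern and return the chord tones to emit."""
    # Probabilistically decide how many sounds to emit (triad vs. tetrad).
    n_tones = rng.choices([3, 4], weights=[0.6, 0.4])[0]
    # Decide the voicing type from the register: low registers spread out.
    voicing_type = "open" if register < 48 else "close"
    pattern = VOICING_PATTERNS[(n_tones, voicing_type)]
    return [root + iv for iv in pattern]
```

A sound source would then be instructed to emit the returned notes as one chord.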
COMPUTER-BASED SYSTEMS, DEVICES, AND METHODS FOR GENERATING MUSICAL COMPOSITIONS THAT ARE SYNCHRONIZED TO VIDEO
Computer-based systems, devices, and methods for generating musical compositions that are purposefully synchronized with video are described. A video timeline is defined with various time-markers that demarcate specific events in the video. A music timeline is generated based on the video timeline. The music timeline preserves the various time-markers from the video timeline. A computer-based musical composition system generates a musical composition based on the music timeline. The musical composition includes various musical events that align, synchronize, or coincide with the time-markers such that when the video and musical composition are played together the musical events align, synchronize, or coincide with the demarcated events in the video.
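A minimal sketch of deriving a music timeline from a video timeline while preserving its time-markers, as the abstract describes. The field names and the bar/beat annotation are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Marker:
    """A time-marker demarcating a specific event in the video."""
    time_s: float
    label: str

def music_timeline_from_video(video_markers, tempo_bpm=120, beats_per_bar=4):
    """Copy each video marker onto the music timeline, annotating it with
    the bar/beat position a composition system would target so that
    musical events coincide with the demarcated video events."""
    sec_per_beat = 60.0 / tempo_bpm
    timeline = []
    for m in video_markers:
        beat = m.time_s / sec_per_beat
        timeline.append({
            "time_s": m.time_s,  # marker position is preserved exactly
            "label": m.label,
            "bar": int(beat // beats_per_bar) + 1,
            "beat": beat % beats_per_bar + 1,
        })
    return timeline
```

Because every marker's absolute time survives the transfer, music generated against this timeline stays synchronized when played alongside the video.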
Playback, recording, and analysis of music scales via software configuration
Playback, recording, and analysis of music scales via software configuration. In an embodiment, a graphical user interface is generated with staff and keyboard canvases, visually representing a music staff and keyboard, respectively, along with a scale input, parameter input(s), and a play input. In response to selection of a scale, the staff canvas is updated to visually represent the notes in the scale. In response to selection of a musical parameter, the staff canvas and/or keyboard canvas are updated to reflect the musical parameter. In response to selection of the play input, a soundtrack of the scale is output while simultaneously highlighting the note being played on the staff canvas and the key associated with that note on the keyboard canvas.
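The scale-derivation step behind such an interface can be sketched as below: given a tonic and scale type, compute the notes the staff canvas would display and the keys the keyboard canvas would highlight. The interval tables are standard music theory; the function names are assumptions:

```python
# Sketch: derive the note names of a selected scale so the staff and
# keyboard canvases can be updated to represent and highlight them.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

SCALE_INTERVALS = {
    "major":         [0, 2, 4, 5, 7, 9, 11],
    "natural_minor": [0, 2, 3, 5, 7, 8, 10],
}

def scale_notes(tonic, scale_type):
    """Return the note names of the scale, in playback order."""
    start = NOTE_NAMES.index(tonic)
    return [NOTE_NAMES[(start + iv) % 12]
            for iv in SCALE_INTERVALS[scale_type]]
```

During playback, iterating over this list drives both the audio output and the synchronized highlighting on the two canvases.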
Electronic wind instrument and key operation detection method
An electronic wind instrument and key operation detection method are provided. The electronic wind instrument includes an instrument body and a plurality of keys which have an operation surface operated by a player's finger and are provided on an external surface of the instrument body. Among the plurality of keys, at least two keys disposed to sandwich or surround a predetermined region comprise restriction parts formed on the operation surfaces. The restriction parts restrict escape of the player's finger from between the at least two keys having the restriction parts formed thereon.
Dynamic music modification
A method for electronic music generation comprising: electronically applying one or more functions that change one or more compositional elements of a musical input in a first tonality or other musical representation to generate a musical output in a second tonality or other musical representation; and recording data corresponding to the musical output in a recording medium, or rendering such musical transformations to a reproductive medium such as an amplifier and speakers or headphones.
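One concrete instance of such a tonality-changing function is a remapping from a major key to its parallel minor. The sketch below is an assumed example of the broader idea, not the patent's own algorithm; pitches are MIDI note numbers:

```python
# Sketch: a function that changes a compositional element (scale degrees)
# of a musical input in a first tonality (major) to produce an output in
# a second tonality (parallel natural minor).

MAJOR = [0, 2, 4, 5, 7, 9, 11]
MINOR = [0, 2, 3, 5, 7, 8, 10]

def to_parallel_minor(melody, tonic=60):
    """Remap each scale degree of a major-key melody onto natural minor."""
    out = []
    for note in melody:
        degree_chromatic = (note - tonic) % 12
        octave = (note - tonic) // 12
        if degree_chromatic in MAJOR:
            degree = MAJOR.index(degree_chromatic)
            out.append(tonic + octave * 12 + MINOR[degree])
        else:
            out.append(note)  # non-diatonic notes pass through unchanged
    return out
```

The transformed output could then be written to a file (the recording medium) or streamed to an audio device (the reproductive medium).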
ELECTRONIC MUSICAL INSTRUMENT AND KEY OPERATION DETECTION METHOD
An electronic musical instrument includes an instrument body and a plurality of keys, each of which has an operation surface operated by a player's finger and is provided on an external surface of the instrument body. Among the plurality of keys, at least two keys are disposed to be adjacent to each other, and the operation surfaces of the at least two keys are configured to be inclined to descend toward the space between the at least two keys when viewed from a left-right direction of the electronic musical instrument.
2D USER INTERFACE FOR A MUSICAL INSTRUMENT FOR PLAYING COMBINED SEQUENCES OF CHORDS AND TUNES, AND COMPUTER-READABLE STORAGE MEDIUM
The invention relates to a user interface for a musical instrument, in particular an electronic or virtual musical instrument, for playing combined sequences of chords and tunes. The interface comprises a key matrix (1) having a plurality of activatable zones 111, 121, . . . ; 112, 122, . . . ; 11n, 12n, . . . arranged in columns and rows. Each row of activatable zones 111, 121, . . . ; 112, 122, . . . ; 11n, 12n, . . . forms a region 101, 102, . . . , and each region 101, 102, . . . is associated with a basic chord that preferably is a chord of a scale, preferably of a diatonic scale, the chord being specific to the scale. Each zone 111, 121, . . . ; 112, 122, . . . ; 11n, 12n, . . . is associated with a musical tone of the tune, which is preferably a musical tone of the tune of the scale. The user interface is designed to produce, when a zone 111, 121, . . . ; 112, 122, . . . ; 11n, 12n, . . . in a region 101, 102, . . . is activated, a musical tone-producing command in accordance with the activated zone 111, 121, . . . ; 112, 122, . . . ; 11n, 12n, . . . and region 101, 102, . . . . The musical tone-producing command comprises at least one basic chord note command of a pitch that is contained in the basic chord associated with the activated region 101, 102, . . . and a tune note command whose pitch corresponds to the musical tone of the tune of the activated zone 111, 121, . . . ; 112, 122, . . . ; 11n, 12n, . . . . The invention also relates to a musical instrument, a method for producing combined sequences of chords and tunes, and a computer-readable storage medium.
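The row-to-chord and zone-to-tune-note mapping can be sketched as below. The MIDI numbers and the C-major harmonization are assumptions for illustration; the patent itself does not fix a key or note encoding:

```python
# Sketch: each row (region) of the key matrix carries a diatonic basic
# chord, each zone within the row carries a tune note, and activating a
# zone emits chord-note commands plus a tune-note command.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # C D E F G A B (MIDI)

def diatonic_triad(degree):
    """Root-position triad built on a scale degree of C major."""
    return [C_MAJOR[(degree + i) % 7] + 12 * ((degree + i) // 7)
            for i in (0, 2, 4)]

def zone_activated(row, col):
    """Commands produced when zone (row, col) of the matrix is pressed."""
    chord_notes = diatonic_triad(row)  # row (region) selects the basic chord
    tune_note = C_MAJOR[col % 7] + 12  # column (zone) selects the tune note
    return {"chord": chord_notes, "tune": tune_note}
```

Pressing successive zones along one row thus holds a single accompaniment chord while stepping through a melody, matching the combined chord-and-tune playing the interface is designed for.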