Patent classifications
G10H2220/391
Improvised guitar simulation
The present disclosure is directed at methods, apparatus and systems for implementing an improvised guitar playing feature on a rhythm-action game. The improvised guitar playing feature allows players to manipulate a guitar controller to produce a pleasing, musical-sounding improvised play even if the players have little experience or skill at improvising music. This feature uses quantized 8th- and 16th-note musical phrases, or “licks”, strung together to form authentic, melodic, and rhythmically musical and impressive guitar lines, regardless of the player's ability. The improvised guitar playing feature can also display cues directing the player to improvise in a certain manner, while still providing players a degree of musical freedom in selecting how to play. In some embodiments, the present disclosure is also directed at scoring mechanisms for evaluating improvised guitar play.
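The lick-stringing idea above can be sketched in a few lines: pre-quantized phrases whose durations are expressed in sixteenth-note slots are chosen at random and concatenated so the resulting line always lands on the bar line. The licks, pitches, and 4/4 grid below are illustrative assumptions, not taken from the patent.

```python
import random

# Each "lick" is a short phrase of (MIDI pitch, duration) pairs, with
# durations in sixteenth-note units so every phrase stays on the grid.
# These phrases are illustrative only.
LICKS = [
    [(64, 2), (67, 2), (69, 4)],           # two 8ths + a quarter
    [(69, 1), (67, 1), (64, 2), (62, 4)],  # two 16ths, an 8th, a quarter
    [(60, 4), (62, 2), (64, 2)],
]

def improvise(measures: int, rng: random.Random) -> list:
    """String randomly chosen licks together until `measures` 4/4 bars
    (16 sixteenth-note slots each) are filled, truncating the final lick
    so the line ends exactly on the bar line."""
    slots_needed = measures * 16
    line, filled = [], 0
    while filled < slots_needed:
        for pitch, dur in rng.choice(LICKS):
            dur = min(dur, slots_needed - filled)
            if dur == 0:
                break
            line.append((pitch, dur))
            filled += dur
    return line

line = improvise(2, random.Random(42))
```

Because every lick is already quantized, the concatenation never produces off-grid rhythms, which is what keeps unskilled input sounding musical.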
Isolation mount for a percussion instrument
A percussion instrument is adapted with a foam arrangement directly or indirectly in communication with its percussion surface. The foam arrangement reduces acoustic impact sounds when the instrument is struck, helps isolate vibrations from nearby percussion surfaces, and reduces or removes sound generation when air is released from the damper. To achieve these results, an open-cell foam layer is directly or indirectly secured to the percussion surface and configured with a closed-cell foam layer positioned in a lateral side-by-side arrangement to create a spring and damper system. The open-cell foam may have one or more holes that extend entirely through its body, and one or more of those holes may contain closed-cell foam to provide additional spring-like functionality. The side-by-side dual-layer arrangement enables the closed- and open-cell foam layers to operate in tandem: the closed-cell layer operates as a spring, and the open-cell layer operates as a damper.
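The tandem spring/damper behavior described above is the classic mass-spring-damper system. A minimal numerical sketch, with purely illustrative constants (the patent gives no material values): the closed-cell foam contributes the stiffness `k`, the open-cell foam the damping `c`, and increasing `c` shrinks how far, and for how long, the struck surface moves.

```python
def simulate_strike(k: float, c: float, m: float, v0: float,
                    dt: float = 1e-4, steps: int = 20000):
    """Integrate m*x'' = -k*x - c*x' with semi-implicit Euler for a
    percussion surface of mass m struck with initial velocity v0.
    k models the closed-cell foam (spring), c the open-cell foam (damper).
    Returns (final displacement, peak displacement)."""
    x, v = 0.0, v0
    peak = 0.0
    for _ in range(steps):
        v += (-k * x - c * v) / m * dt
        x += v * dt
        peak = max(peak, abs(x))
    return x, peak

# Illustrative values only: heavier damping -> smaller, faster-decaying motion.
x_damped, peak_damped = simulate_strike(k=2000.0, c=30.0, m=0.5, v0=1.0)
x_light, peak_light = simulate_strike(k=2000.0, c=1.0, m=0.5, v0=1.0)
```

The comparison shows the isolation effect the abstract claims: with the damper present, both the peak excursion and the residual vibration after the strike are reduced.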
SYSTEM, APPARATUS, AND METHOD THEREOF FOR GENERATING SOUNDS
An electronic musical instrument for generating a plurality of audio signals can include an attitude measuring unit that is configured to measure one or more changes in an attitude of the musical instrument with respect to a reference frame in a multi-dimensional space. The musical instrument can also include a processor that is configured to generate one or more signals based on the measured changes in the attitude of the musical instrument. The musical instrument can also include an audio synthesizer that is configured to generate the plurality of audio signals based on the generated signals.
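The three-stage pipeline above (attitude measurement, signal generation, audio synthesis) can be sketched as follows. The mapping choices, axis names, and constants are assumptions for illustration; the abstract does not specify how attitude changes map to sound.

```python
import math

def attitude_to_control(prev, curr):
    """Processor stage: turn a change in attitude (roll, pitch, yaw in
    radians, relative to a reference frame) into per-axis control signals.
    Here the signal is simply the per-axis change; illustrative only."""
    return tuple(c - p for p, c in zip(prev, curr))

def synthesize(controls, n=64, sr=8000):
    """Synthesizer stage: map the pitch-axis change to a frequency offset
    around 440 Hz and the roll-axis change to amplitude. Returns n samples."""
    d_roll, d_pitch, _d_yaw = controls
    freq = 440.0 * (1.0 + d_pitch)       # tilting forward/back bends pitch
    amp = min(1.0, abs(d_roll) * 10.0)   # rolling the instrument sets loudness
    return [amp * math.sin(2 * math.pi * freq * t / sr) for t in range(n)]

samples = synthesize(attitude_to_control((0.0, 0.0, 0.0), (0.05, 0.1, 0.0)))
```

A real instrument would run this loop continuously, feeding each new attitude sample through the same two stages.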
Pace-aware music player
An electronic device may comprise audio processing circuitry, pace tracking circuitry, and positioning circuitry. The pace tracking circuitry may be operable to select songs to be processed for playback, and/or control time stretching applied to such songs, by the audio processing circuitry based on position data generated by the positioning circuitry, a desired tempo, and whether the songs are stored locally or network-accessible. The position data may indicate the pace of a runner during a preceding, determined time interval. The pace tracking circuitry may control the song selection and/or time stretching based on runner profile data stored in memory of the music device. The profile data may include the runner's distance-per-stride data. The electronic device may include one or more sensors operable to function as a pedometer. The pace tracking circuitry may update the distance-per-stride data based on the position data and based on data output by the one or more sensors.
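The pace-to-playback chain above can be sketched numerically: position fixes give speed, the profile's distance-per-stride converts speed to cadence, and the cadence-to-song-tempo ratio becomes the time-stretch factor. The stride length, clamp limit, and tempo values are illustrative assumptions.

```python
import math

def speed_mps(positions, interval_s):
    """Average speed (m/s) over (x, y) position fixes taken interval_s apart,
    standing in for the positioning circuitry's output."""
    dist = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    return dist / (interval_s * (len(positions) - 1))

def cadence_spm(speed, distance_per_stride):
    """Strides per minute, using the runner-profile stride length."""
    return speed / distance_per_stride * 60.0

def stretch_ratio(song_bpm, cadence, max_stretch=0.15):
    """Time-stretch factor nudging the song's tempo toward the runner's
    cadence, clamped so the song is never warped more than +/-15%
    (an illustrative limit, not from the patent)."""
    ratio = cadence / song_bpm
    return max(1.0 - max_stretch, min(1.0 + max_stretch, ratio))

# 5 fixes, 10 s apart, 3 m/s pace; a 1.5 m stride gives 120 strides/min.
fixes = [(0, 0), (30, 0), (60, 0), (90, 0), (120, 0)]
ratio = stretch_ratio(song_bpm=110.0,
                      cadence=cadence_spm(speed_mps(fixes, 10.0), 1.5))
```

Song selection would use the same cadence figure, preferring songs whose native tempo already sits inside the clamp window.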
Ergonomic electronic musical instrument with pseudo-strings
An ergonomic, portable electronic instrument that utilizes a string-like interface. The string-like interface is tactile for sightless playability and capable of advanced input such as force and pressure sensitivity. The string-like interface functions to select a note, trigger a selected note, or select and play a note on the instrument or an external peripheral. The instrument is played using the techniques of multiple stringed instruments, and its ergonomics allow the user to hold and handle the device consistent with playing techniques familiar to musicians of multiple instruments. It is internally or externally powered and connects directly to industry-standard musical hardware such as MIDI devices, amplifiers, and multi-track recorders.
Virtual instrument playing scheme
Technologies are generally described for a virtual instrument playing system. In some examples, a virtual instrument playing system may include: a sensor data receiving unit configured to receive first sensor data of a first user and second sensor data of the first user; a sound event prediction unit configured to detect a sound event of the first user and to predict a sound generation timing corresponding to the sound event based at least in part on the first sensor data; an instrument identification unit configured to identify, from one or more virtual instruments, a virtual instrument corresponding to the sound event based at least in part on the second sensor data; a sound data generation unit configured to generate sound data of the first user for the identified virtual instrument based at least in part on the sound generation timing; and a video data generation unit configured to generate video data of the first user for the identified virtual instrument based at least in part on the second sensor data.
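The prediction and identification units above can be sketched simply: extrapolate the first sensor stream (e.g. hand height) to predict when the strike will land, and use the second stream (e.g. horizontal hand position) to pick which virtual instrument is being played. The sensor semantics, the zero-height "drum surface", and the instrument zones are illustrative assumptions.

```python
def predict_hit_time(heights, dt):
    """Sound event prediction: given recent hand heights sampled dt seconds
    apart, take the current downward velocity and extrapolate when the hand
    reaches height 0 (the assumed virtual drum surface). Returns the delay
    in seconds from the last sample, or None if the hand is not descending."""
    v = (heights[-1] - heights[-2]) / dt
    if v >= 0:
        return None
    return -heights[-1] / v

def identify_instrument(x):
    """Instrument identification from the second sensor stream: a toy
    mapping where horizontal hand position selects the virtual drum."""
    return "snare" if x < 0.0 else "cymbal"
```

Predicting the timing ahead of the actual strike is what lets the system schedule sound generation with no perceptible latency; the same second-sensor data drives the video rendering of the chosen instrument.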
EMULATING A VIRTUAL INSTRUMENT FROM A CONTINUOUS MOVEMENT VIA A MIDI PROTOCOL
The present invention relates to methods and systems for creating a sound effect from a continuous movement, in particular by means of detecting a continuous movement through a force sensor in a device. A method is shown for creating a sound effect from a continuous movement. The method comprises a step of providing a first device, whereby the device is adapted to detect continuous movement and a no-movement state. The method further comprises the step of defining at least one first parameter of movement, in particular a first axis of movement of said continuous movement. A further step comprises assigning at least one first MIDI channel to the first axis of movement. A base-line value is defined for the no-movement state, and along that first axis of movement a range of values relative to said base-line value is defined. This range of values is reflective of a continuous movement along that first axis of movement. A sound effect is then output relative to the detected continuous movement. One aspect or additional embodiment of the present invention comprises the step of defining at least one first parameter of movement, whereby said first parameter of movement is an angular range in one axis X, Y, Z of an orientation in space of the first device (99.1) adapted to detect continuous movement (A.1) and a no-movement state.
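The baseline-plus-range scheme above maps naturally onto a MIDI control-change message: a sensor reading is normalized against the no-movement baseline and its defined range, then scaled into the 0-127 controller range. The channel and controller numbers, and the symmetric range, are assumptions for illustration.

```python
def movement_to_cc(value, baseline, value_range, channel=0, controller=1):
    """Map a force-sensor reading, relative to its no-movement baseline and
    its defined range, onto a 0..127 MIDI value, and build the 3-byte
    control-change message (status, controller number, value).
    channel/controller choices here are illustrative."""
    norm = (value - baseline) / value_range      # nominally -1.0 .. 1.0
    cc = max(0, min(127, int(round((norm + 1.0) * 63.5))))
    status = 0xB0 | (channel & 0x0F)             # control change, per channel
    return bytes([status, controller, cc])

# At the baseline (no movement), the controller sits at mid-scale.
msg = movement_to_cc(value=0.0, baseline=0.0, value_range=1.0)
```

One such mapping per axis, each on its own MIDI channel as the abstract describes, lets a synthesizer treat each axis of the continuous movement as an independent controller.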
INFORMATION PROCESSING METHOD, IMAGE PROCESSING APPARATUS, AND PROGRAM
[Object] To propose an information processing method, image processing apparatus, and program which are capable of exciting the emotions of a viewer more effectively. [Solution] An information processing method including: analyzing a beat of input music; extracting a plurality of unit images from an input image; and generating, by a processor, editing information for switching the extracted unit images depending on the analyzed beat.
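The beat-synchronized switching can be sketched as follows: compute beat timestamps from the analyzed tempo, then emit editing information pairing each beat with the unit image to show from that moment. The fixed tempo and the round-robin image order are illustrative assumptions; the abstract does not specify the switching scheme.

```python
def beat_times(bpm, duration_s):
    """Beat timestamps (seconds) for a fixed-tempo track."""
    period = 60.0 / bpm
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 6))
        t += period
    return times

def edit_plan(unit_images, bpm, duration_s):
    """Editing information: (timestamp, unit image) pairs, switching to the
    next extracted image on every analyzed beat, cycling round-robin."""
    return [(t, unit_images[i % len(unit_images)])
            for i, t in enumerate(beat_times(bpm, duration_s))]

plan = edit_plan(["img0", "img1", "img2"], bpm=120, duration_s=2.0)
```

Cutting exactly on the beat is the mechanism by which the editing is meant to heighten the viewer's emotional response.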
Control methods for musical performance
A method for generating music is provided, the method comprising: receiving, on a capacitive touch-sensitive interface such as a keyboard, multi-finger gesture inputs having a first component and a second component, wherein the second component has a temporal evolution such as speed; determining the onset of an audio signal, such as a tone, based on the first component; analyzing the temporal evolution of the second component to determine MIDI or Open Sound Control (OSC) instructions; and modifying the audio signal based on the instructions, in particular by decoupling the temporal relationship between specific gesture inputs (e.g. at key onset, during a note, and upon key release), thus mapping gesture and motion inputs to obtain previously unachievable musical effects with music synthesizers.
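The two-component decoupling above can be sketched as: the first sample of a key gesture fixes the note onset (velocity), while the speed of the later samples, evaluated independently of the onset, becomes a continuous modulation value. The sample format, velocity scaling, and mm/s-to-CC mapping are assumptions for illustration.

```python
def process_gesture(events):
    """events: list of (time_s, position) samples for one key gesture.
    The first component (initial position) sets the note-onset velocity;
    the second component's temporal evolution (speed over the remaining
    samples) is turned into a modulation value, decoupled from the onset.
    Returns (onset_velocity, mod_value), both in MIDI 0..127 range."""
    _t0, p0 = events[0]
    onset_velocity = max(1, min(127, int(p0 * 127)))
    (ta, pa), (tb, pb) = events[1], events[-1]
    speed = abs(pb - pa) / (tb - ta) if tb > ta else 0.0
    mod = max(0, min(127, int(speed)))  # illustrative: one unit per mm/s
    return onset_velocity, mod
```

Because the modulation is computed from the gesture's evolution rather than its onset, a player can shape a note during and after the keypress, which is the decoupling the claim describes.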
Detachable controller device for musical instruments
A device that can be temporarily attached to a musical instrument and easily detached without permanent modification to the instrument. The device comprises a set of controls attached to circuitry that sends digital data to a computer or other hardware for use in music synthesis, manipulation, or production.