Patent classifications
G10H2220/201
Systems and methods for calibrating a musical device
The present disclosure relates to a method and system for calibrating a musical device. In some embodiments, a method for calibrating a musical device includes: energizing an actuator to actuate a key using a force corresponding to a first intensity level; obtaining, from a sensor, a first sensor signal representing motion information of the key corresponding to application of the force; and calibrating the musical device based on the first sensor signal.
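The calibration sequence this abstract describes (energize the actuator at a known intensity, read the key-motion sensor, derive a correction) could be sketched as below. This is an illustrative sketch only; the function names and the simulated hardware are assumptions, not details from the patent.

```python
def calibrate_key(drive_actuator, read_sensor, intensity=0.5):
    """Return a gain that maps commanded intensity to observed key velocity."""
    drive_actuator(intensity)          # energize actuator at the first intensity level
    observed_velocity = read_sensor()  # first sensor signal: key motion information
    if observed_velocity <= 0:
        raise RuntimeError("key did not move; check actuator or sensor")
    return intensity / observed_velocity

# Simulated hardware for demonstration
_state = {"drive": 0.0}
def fake_actuator(level): _state["drive"] = level
def fake_sensor(): return _state["drive"] * 0.8  # key responds at 80% efficiency

gain = calibrate_key(fake_actuator, fake_sensor)
print(gain)  # 1.25
```

The resulting gain could then be applied to subsequent actuation commands so that commanded intensity and measured key motion agree.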
AUDIO SIGNAL PROCESSING DEVICE
An adjustment of an audio signal based on an orientation variation amount is carried out according to a surrounding audio condition. An audio signal processing device includes an audio signal analysis unit, an orientation variation analysis unit, and an audio signal adjustment unit. The audio signal analysis unit acquires an audio signal and sets a target value for an audio adjustment on the basis of the audio signal. The orientation variation analysis unit acquires orientation information and generates an orientation variation amount on the basis of the orientation information. The audio signal adjustment unit adjusts the audio signal toward the target value according to the orientation variation amount.
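The adjustment described — moving the signal toward a target value at a rate governed by the orientation variation amount — could be sketched as follows. The linear scaling and the 90-degree normalization are assumptions for illustration, not from the abstract.

```python
def adjust_gain(current_gain, target_gain, orientation_delta_deg, max_delta_deg=90.0):
    """Step the audio gain toward the target value, with the step size
    scaled by the orientation variation amount (bigger head turn, bigger step)."""
    step = min(abs(orientation_delta_deg) / max_delta_deg, 1.0)
    return current_gain + (target_gain - current_gain) * step

# A 45-degree orientation change moves the gain halfway to the target.
g = adjust_gain(current_gain=1.0, target_gain=0.2, orientation_delta_deg=45.0)
print(round(g, 3))  # 0.6
```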
METHOD AND DEVICE FOR PROCESSING MUSIC FILE, TERMINAL AND STORAGE MEDIUM
Provided are a method and device for processing a music file, a terminal, and a storage medium. The method comprises: in response to a received sound effect adjustment instruction, acquiring the music file whose adjustment is indicated by the sound effect adjustment instruction; carrying out vocal and accompaniment separation on the music file to obtain vocal data and accompaniment data; carrying out first sound effect processing on the vocal data to obtain target vocal data, and carrying out second sound effect processing on the accompaniment data to obtain target accompaniment data; and synthesizing the target vocal data and the target accompaniment data to obtain a target music file.
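A skeleton of the pipeline this abstract describes (separation, per-stem effect processing, resynthesis) might look like the sketch below, where the separator and effects are toy stand-ins for whatever the real device uses.

```python
def process_music(samples, separate, vocal_fx, accomp_fx):
    """Split a track into stems, apply per-stem sound effects, and remix."""
    vocals, accompaniment = separate(samples)      # vocal/accompaniment separation
    target_vocals = vocal_fx(vocals)               # first sound effect processing
    target_accomp = accomp_fx(accompaniment)       # second sound effect processing
    return [v + a for v, a in zip(target_vocals, target_accomp)]  # synthesis

# Toy stand-ins: "separation" halves each sample; the vocal effect doubles it.
split = lambda s: ([x * 0.5 for x in s], [x * 0.5 for x in s])
out = process_music([1.0, -2.0], split, lambda v: [x * 2 for x in v], lambda a: a)
print(out)  # [1.5, -3.0]
```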
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM
A mechanism that enables the real-time provision of content according to a user's body movement is provided. An information processing apparatus includes a reproduction control unit (43) that controls reproduction of content on the basis of a result of prediction of a timing of a predetermined state in traveling movement of a user, which is predicted on the basis of sensor information regarding the traveling movement.
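The timing prediction at the core of this abstract can be illustrated with a simple constant-cadence extrapolation; the real apparatus presumably uses richer sensor models, so this is only a sketch under that simplifying assumption.

```python
def predict_next_step(step_times):
    """Predict the next footfall time from recent step timestamps,
    assuming a roughly constant cadence in the traveling movement."""
    if len(step_times) < 2:
        raise ValueError("need at least two steps to estimate cadence")
    intervals = [b - a for a, b in zip(step_times, step_times[1:])]
    cadence = sum(intervals) / len(intervals)
    return step_times[-1] + cadence

# Reproduction of content (e.g. a beat) can then be scheduled for this time.
t = predict_next_step([0.0, 0.52, 1.01, 1.50])
print(round(t, 2))  # 2.0
```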
IN-EAR WIRELESS AUDIO MONITOR SYSTEM WITH INTEGRATED INTERFACE FOR CONTROLLING DEVICES
An in-ear wireless audio monitor system with an integrated interface for controlling devices includes an in-ear monitor device in communication with a communication module. The in-ear monitor device provides audio, tactile, and other information to a wearer, and transmits information from sensors located in or on the device to the communication module, which effectuates control of external devices over a two-way MIDI link. Thus, a performer using the device may control an external lighting, audio, or other device through head gestures, or through other movements or sounds such as those made with the teeth or tongue.
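The gesture-to-control path (a head gesture recognized on the device, control effected over MIDI) could be sketched as below. The gesture names and controller numbers are assumptions; the 0xB0 status byte is the standard MIDI Control Change message.

```python
def gesture_to_midi(gesture, channel=0):
    """Map a recognized head gesture to a 3-byte MIDI Control Change message."""
    cc_map = {"nod": (20, 127), "shake": (21, 127), "tilt_left": (22, 0)}
    controller, value = cc_map[gesture]
    status = 0xB0 | (channel & 0x0F)  # Control Change on the given channel
    return bytes([status, controller, value])

# A nod might, for example, bring up a lighting cue at full value.
msg = gesture_to_midi("nod")
print(msg.hex())  # b0147f
```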
REAL-TIME MUSIC GENERATION ENGINE FOR INTERACTIVE SYSTEMS
A real-time music generation engine for an interactive system includes a Musical Rule Set (MRS) unit configured to combine predefined composer input with a real-time control signal into a music signal; a Constructor Automaton (CA) configured to generate, by way of musical handlers, a fluid piece of music based on rule definitions provided by the predefined composer input within the MRS unit; and a Performance Cluster Unit (PCU) configured to convert the fluid piece of music from the CA into a corresponding music control signal for real-time playback by the interactive system.
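The three-stage architecture (MRS combining composer rules with a live control signal, CA expanding them into a fluid phrase, PCU converting the phrase into playback control events) might be sketched as below; all data shapes and values are illustrative assumptions.

```python
def mrs(composer_rules, control_signal):
    """Musical Rule Set: merge predefined composer input with a real-time control value."""
    return {**composer_rules, "intensity": control_signal}

def ca(rules):
    """Constructor Automaton: generate a fluid phrase whose length tracks intensity."""
    length = max(1, int(rules["intensity"] * len(rules["scale"])))
    return rules["scale"][:length]

def pcu(phrase, velocity=96):
    """Performance Cluster Unit: convert notes into real-time control events."""
    return [(note, velocity) for note in phrase]

events = pcu(ca(mrs({"scale": [60, 62, 64, 65, 67]}, control_signal=0.5)))
print(events)  # [(60, 96), (62, 96)]
```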
LEARNING PROGRESSION FOR INTELLIGENCE BASED MUSIC GENERATION AND CREATION
An artificial intelligence (AI) method includes generating a first musical interaction behavioral model. The first musical interaction behavioral model causes an interactive electronic device to perform a first set of musical operations and a first set of motional operations. The AI method further includes receiving user inputs provided in response to the performance of the first set of musical operations and the first set of motional operations and determining a user learning progression level based on the user inputs. In response to determining that the user learning progression level is above a threshold, the AI method includes generating a second musical interaction behavioral model. The second musical interaction behavioral model causes the interactive electronic device to perform a second set of musical operations and a second set of motional operations. The AI method further includes performing the second set of musical operations and the second set of motional operations.
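The threshold logic in this abstract — score the user's responses, and switch to the second behavioral model once the progression level exceeds a threshold — can be sketched minimally; the progression metric (mean score) and the threshold value here are assumptions.

```python
def select_behavioral_model(user_scores, threshold=0.7,
                            models=("first_model", "second_model")):
    """Choose the next musical interaction behavioral model from user inputs.
    Switches to the second model when the learning progression level
    (here, the mean response score) is above the threshold."""
    progression = sum(user_scores) / len(user_scores)
    return models[1] if progression > threshold else models[0]

# Scores above the threshold trigger the second (more advanced) model.
print(select_behavioral_model([0.8, 0.9, 0.75]))  # second_model
```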
Systems and methods for music simulation via motion sensing
The present disclosure relates to systems, methods, and devices for music simulation. The methods may include determining one or more simulation actions based on data acquired by at least one sensor. The methods may further include determining, based on at least one of the one or more simulation actions and a mapping relationship between simulation actions and corresponding musical instruments, a simulation musical instrument that matches the one or more simulation actions. The methods may further include determining, based on the one or more simulation actions, one or more first features associated with the simulation musical instrument. The methods may further include playing music based on the one or more first features.
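The matching step — picking the instrument whose mapped actions best fit the observed simulation actions — could be sketched as a simple overlap score; the action labels and the mapping itself are hypothetical.

```python
def match_instrument(observed_actions, mapping):
    """Select the simulated instrument whose characteristic actions overlap
    most with the observed simulation actions."""
    def overlap(instrument):
        return len(set(observed_actions) & set(mapping[instrument]))
    return max(mapping, key=overlap)

# Hypothetical mapping relationship between simulation actions and instruments.
mapping = {
    "drums": ["strike", "bounce"],
    "guitar": ["strum", "fret"],
}
print(match_instrument(["strike", "bounce"], mapping))  # drums
```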
GESTURE-CONTROLLED VIRTUAL REALITY SYSTEMS AND METHODS OF CONTROLLING THE SAME
Gesture-controlled virtual reality systems and methods of controlling the same are disclosed herein. An example apparatus includes an on-body sensor to output first signals associated with at least one of movement of a body part of a user or a position of the body part relative to a virtual object and an off-body sensor to output second signals associated with at least one of the movement or the position relative to the virtual object. The apparatus also includes at least one processor to generate gesture data based on at least one of the first or second signals, generate position data based on at least one of the first or second signals, determine an intended action of the user relative to the virtual object based on the position data and the gesture data, and generate an output of the virtual object in response to the intended action.
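The final inference step — combining gesture data with position data to determine the intended action toward a virtual object — might be sketched as below; the gesture labels, actions, and grab radius are assumptions for illustration.

```python
def intended_action(gesture, distance_to_object, grab_radius=0.1):
    """Combine gesture data (from on-/off-body sensors) with position data
    to infer the user's intended action toward a virtual object."""
    near = distance_to_object <= grab_radius
    if gesture == "grip" and near:
        return "grab_object"
    if gesture == "swipe" and near:
        return "push_object"
    return "no_action"

print(intended_action("grip", 0.05))  # grab_object
print(intended_action("grip", 0.50))  # no_action
```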
ELECTRONIC PERCUSSION INSTRUMENT
An electronic percussion instrument includes a base and a drumhead mounted resiliently on the base so as to be movable relative to the base. A plurality of sensors to detect movement of the drumhead relative to the base are spaced about the drumhead at selected positions to measure displacement of the drumhead relative to the base at said selected positions. At least some of said sensors are adjacent the rim of the drumhead and are responsive to movement of the rim of the drumhead towards and away from the base. The individual sensors respond to the movement at their location and the individual responses of the plurality of sensors are simultaneously measured and compared to determine the location of a hit on the drumhead, the force of the hit, and/or the duration of the hit.
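The hit-localization idea — comparing simultaneous responses of sensors spaced about the drumhead — can be illustrated with a response-weighted centroid of the sensor positions; this is a sketch of one plausible approach, not the patent's actual method.

```python
def locate_hit(sensor_positions, responses):
    """Estimate the hit location as the response-weighted centroid of the
    sensor positions: sensors nearer the hit see larger displacements."""
    total = sum(responses)
    x = sum(p[0] * r for p, r in zip(sensor_positions, responses)) / total
    y = sum(p[1] * r for p, r in zip(sensor_positions, responses)) / total
    return (x, y)

# Three rim sensors; the strongest response is at the first sensor,
# so the estimated hit lands toward it.
positions = [(1.0, 0.0), (-0.5, 0.87), (-0.5, -0.87)]
x, y = locate_hit(positions, responses=[0.8, 0.1, 0.1])
print(round(x, 2), round(y, 2))  # 0.7 0.0
```

The peak response magnitude could similarly stand in for hit force, and the time the responses stay above a floor for hit duration.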