G10H2220/326

Fingerless Digital Musical Effects Controller Glove
20190251937 · 2019-08-15 ·

The Fingerless Digital Musical Effects Controller Glove is a glove-mounted wearable device that includes a flex sensor positioned to record the movement of the wearer's wrist, an accelerometer, and a flexible pressure sensor or switch on the palm of the glove. The device transmits data wirelessly to digital audio software running on an external personal computer, laptop, or smartphone, allowing users to control various parameters of the resulting sound or performance. The glove is configured to be perceived by the external computer as a MIDI or audio device, making it compatible with existing digital audio software. It is lightweight, flexible, and fingerless, leaving the user's fingers unencumbered and thus allowing simultaneous use of a musical instrument or other musical equipment.
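
Since the glove presents itself as a MIDI device, a wrist-flex reading would typically be scaled into a standard 3-byte MIDI Control Change message. The sketch below is illustrative only: the 10-bit sensor range, MIDI channel, and controller number are assumptions, not details from the abstract.

```python
def sensor_to_midi_cc(value, cc_number, channel=0, max_raw=1023):
    """Scale a raw sensor reading (0..max_raw) into a 3-byte MIDI
    Control Change message: [status, controller, value]."""
    cc_value = min(127, value * 127 // max_raw)   # clamp to MIDI's 7-bit range
    status = 0xB0 | (channel & 0x0F)              # 0xB0 = Control Change status
    return bytes([status, cc_number & 0x7F, cc_value])

# Example: a mid-range wrist flex reading mapped to CC 1 (modulation wheel)
msg = sensor_to_midi_cc(512, cc_number=1)
```

In a real device these bytes would be sent over a wireless MIDI transport (e.g., BLE-MIDI) to the host's digital audio software.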

Wearable electronic musical instrument
10255894 · 2019-04-09 ·

Disclosed is a wearable electronic music system including a plurality of first finger mountable implements, each having at least one surface engageable element operatively connected thereto. The system further includes a second finger mountable implement with a state altering element being disposable in a first state or a second state. A sound producing subsystem operatively communicates with the plurality of first finger mountable implements and the second finger mountable implement. In operation, when the state altering element is disposed in the first state, the sound producing subsystem causes a first output to be produced when at least one of said plurality of first finger mountable implements contacts the solid surface, and when the state altering element is disposed in the second state, the sound producing subsystem causes a second output to be produced as the at least one of said plurality of first finger mountable implements contacts the solid surface.
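
The two-state behaviour above amounts to the thumb-mounted state-altering element selecting which output a finger tap produces. A minimal sketch, with bank names and contents invented for illustration:

```python
# Hypothetical sound banks selected by the state altering element;
# the names are illustrative, not from the patent.
SOUND_BANKS = {1: "drum_hit", 2: "cymbal_crash"}

def on_surface_contact(state):
    """Return the output produced when a finger implement touches a
    solid surface, given the current state of the state altering element."""
    return SOUND_BANKS[state]

on_surface_contact(1)  # first state -> first output
on_surface_contact(2)  # second state -> second output
```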

Music learning apparatus and music learning method using tactile sensation

A tactile music learning apparatus converts sound data of a user's voice corresponding to original music into first tactile data including tactile information, generates a synchronized tactile pattern by synchronizing the first tactile data with second tactile data including tactile information corresponding to sound data of the original music, and transfers the synchronized tactile pattern to a tactile reproducing apparatus to allow the tactile reproducing apparatus to reproduce the synchronized tactile pattern.

Enhancing stringed instrument learning with a wearable device

Enhanced stringed instrument learning is provided by a wearable device. A control unit comprising a processor and a memory device is integrated with a frame, of the wearable device, formed to fit over a hand of a user while playing a stringed instrument. A finger assembly is provided for each finger of the hand and is coupled to the frame to immobilize a finger of the hand while playing the stringed instrument. A sensor array is coupled to each finger assembly and to the control unit to determine at least a positioning of the fingers on the stringed instrument. The sensor array also sends user performance data to the control unit. The control unit analyzes the user performance data and outputs feedback to the user based on the performance data analysis.

Methods and Devices for Creating Control Signals
20190011987 · 2019-01-10 ·

An interface comprising a hand-operated input device with a series of activation points activated by the digits (fingers and/or thumb) of a user; a sensor component measuring a current motion, orientation, and/or position of the input device; and an output component, interconnected to the activation points and the sensor component, for serially outputting the currently active activation points and the current motion, orientation, and/or position of the input device.

EXTENDED REALITY CONTROLLER AND VISUALIZER
20190005733 · 2019-01-03 ·

A method comprises: capturing images of a movable object in a scene and tracking movement of the object in the scene based on the images, to produce movement parameters that define the movement; generating for display an extended reality (XR) visualization of the physical object in the scene and changing the XR visualization responsive to changing ones of the movement parameters, such that the XR visualization visually reflects the tracked movement; displaying the XR visualization; and converting the movement parameters to control messages configured to control one or more of sound and light, and transmitting the control messages.
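
The final step, converting movement parameters into sound and light control messages, can be sketched as below. The message keys, parameter names, and mapping ranges are assumptions for illustration; the abstract does not specify a schema.

```python
def movement_to_control(params, sound_range=(20.0, 2000.0)):
    """Map normalized movement parameters (0.0-1.0) to hypothetical
    sound and light control messages."""
    low, high = sound_range
    return {
        "sound/frequency_hz": low + params["speed"] * (high - low),
        "light/intensity": round(params["height"] * 255),  # 8-bit dimmer value
    }

msgs = movement_to_control({"speed": 0.5, "height": 1.0})
```

In practice such messages might be transmitted as OSC or DMX packets to the audio and lighting systems.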

SMART DETECTING AND FEEDBACK SYSTEM FOR SMART PIANO

A smart musical instrument system may include a musical instrument, a first sensor, a second sensor, a processor device, and a reminder device. The first sensor may be configured to obtain first performance data reflecting operations of the musical instrument. The second sensor may be configured to receive second performance data associated with the hand posture of a user of the smart instrument system. The processor device may be in communication with the first sensor and the second sensor, and may be configured to generate feedback based on at least one of the first performance data or the second performance data. The reminder device may be configured to deliver the feedback to the user.
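
One way to read the feedback step: the processor compares the key-press stream from the first sensor against the expected score and combines it with the posture signal from the second sensor. The sketch below is a guess at that logic; the note format, posture flag, and feedback strings are all invented.

```python
def generate_feedback(key_events, expected_keys, hand_ok):
    """Combine instrument data (played keys vs. expected keys) with a
    hand-posture flag into a list of feedback messages."""
    wrong = [k for k in key_events if k not in expected_keys]
    notes = []
    if wrong:
        notes.append(f"wrong notes: {wrong}")
    if not hand_ok:
        notes.append("adjust hand posture")
    return notes or ["good"]
```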

NOTATION FOR GESTURE-BASED COMPOSITION
20180315406 · 2018-11-01 ·

Various systems and methods for air gesture-based composition and instruction systems are described herein. A composition system for composing gesture-based performances may receive an indication of an air gesture performed by a user; reference a mapping of air gestures to air gesture notations to identify an air gesture notation corresponding to the air gesture; and store an indication of the air gesture notation in a memory of the computerized composition system. Another system used for instruction may present a plurality of air gesture notations in a musical arrangement; receive an indication of an air gesture performed by a user; reference a mapping of air gestures to air gesture notations to identify an air gesture notation corresponding to the air gesture; and guide the user through the musical arrangement by sequentially highlighting the air gesture notations in the musical arrangement based on the mapping of air gestures to air gesture notations.
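
The core of the composition system is a lookup from recognized gestures to notation symbols, with each matched symbol appended to stored memory. A minimal sketch; the gesture names and notation glyphs are illustrative, not from the patent.

```python
# Hypothetical mapping of recognized air gestures to notation symbols.
GESTURE_NOTATION = {
    "sweep_left": "\u2190",   # left-arrow glyph
    "sweep_right": "\u2192",  # right-arrow glyph
    "tap": "\u25CF",          # filled-circle glyph
}

composition = []  # stored notation sequence, standing in for device memory

def record_gesture(gesture):
    """Look up the notation for a recognized gesture and store it;
    return None for unrecognized gestures."""
    notation = GESTURE_NOTATION.get(gesture)
    if notation is not None:
        composition.append(notation)
    return notation

record_gesture("tap")
record_gesture("sweep_right")
```

The instruction system described above would walk the same mapping in reverse, highlighting each stored symbol as the user performs the matching gesture.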

FINGER COMPUTER DISPLAY AND CONTROLLER DEVICE
20180292917 · 2018-10-11 ·

A processor connected to one or more displays shaped to affix to a fingernail for displaying an image.