G10H2220/091

INFORMATION PROCESSING SYSTEM AND COMPUTER SYSTEM IMPLEMENTED METHOD OF PROCESSING INFORMATION
20230326357 · 2023-10-12

An information processing system includes an image obtaining circuit and a display control circuit. The image obtaining circuit is configured to obtain observation images of a first keyboard of a first keyboard instrument. The display control circuit is configured to display, on a display device, the observation images and reference images. The reference images include moving images of at least one hand and one or more fingers of a reference performer who is playing a second keyboard of a second keyboard instrument. The at least one hand and the one or more fingers of the reference performer are displayed overlapping the first keyboard included in the observation images.
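The overlay described above can be sketched as a simple per-pixel blend: the reference performer's hand pixels are composited over the learner's observation frame. The abstract does not specify a blending method; the mask-and-alpha approach and all array shapes below are assumptions for illustration.

```python
import numpy as np

def overlay_reference(observation: np.ndarray, reference: np.ndarray,
                      alpha: float = 0.5) -> np.ndarray:
    """Blend a reference performer's hand frame over an observation
    frame of the learner's keyboard (illustrative sketch only)."""
    # Per-pixel mask: treat non-black reference pixels as the hand region.
    mask = reference.sum(axis=-1, keepdims=True) > 0
    blended = np.where(mask,
                       alpha * reference + (1 - alpha) * observation,
                       observation)
    return blended.astype(observation.dtype)

obs = np.full((4, 4, 3), 200, dtype=np.uint8)   # bright "keyboard" image
ref = np.zeros((4, 4, 3), dtype=np.uint8)
ref[1:3, 1:3] = 100                             # "hand" pixels
out = overlay_reference(obs, ref)
```

In a real system the blend would run per video frame, with the reference frames spatially registered to the first keyboard before compositing.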

Context based tab autoscrolling
11749238 · 2023-09-05

In general terms, the present disclosure proposes a computer-implemented method for context-based scrolling of tablature. The method comprises: receiving, at a computer comprising at least one processor, tablature from a device at which the tablature is stored or generated; determining a structure of the tablature using at least one feature recognition algorithm; determining one or more parameters of a display environment; building an abstract syntax tree of the tablature comprising an array of structural elements of the tablature; determining a scrolling time period in which the portion of the tablature displayed in a given region of a viewport of the display environment is to be replaced by a next portion of the tablature; and scrolling the tablature in the display environment according to at least the abstract syntax tree and the scrolling time period, using the at least one processor.
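The scrolling-time-period step can be sketched as follows: given a flat array of structural elements (a stand-in for the abstract syntax tree) and a tempo, compute when each viewport-sized portion should be replaced by the next. The `Element` type, the beat-based timing model, and all parameter names are assumptions, not the patent's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Element:
    kind: str    # e.g. "measure" or "section" in the tablature structure
    beats: int   # duration of the element in beats

def scroll_schedule(elements, tempo_bpm, lines_per_viewport):
    """Return (start_time_seconds, portion) pairs: each portion fills the
    viewport region and is replaced when its playing time elapses."""
    seconds_per_beat = 60.0 / tempo_bpm
    schedule, t = [], 0.0
    for i in range(0, len(elements), lines_per_viewport):
        portion = elements[i:i + lines_per_viewport]
        schedule.append((t, portion))
        t += sum(e.beats for e in portion) * seconds_per_beat
    return schedule
```

At 120 BPM with two 4-beat measures per viewport, each portion stays on screen for four seconds before the next replaces it.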

Lane- and rhythm-based melody generation system
11640815 · 2023-05-02

To generate a melody, one or more machine-readable constraints are accepted from a user through a user interface. The constraints include rhythm constraints and pitch constraints. A sequence of musical elements is generated based on the constraints, each of the musical elements specifying, in machine-readable data, a musical pitch or silence and a duration of the musical pitch or silence. The pitch constraints prescribe pitches in the sequence of musical elements and the rhythm constraints prescribe rhythm of the sequence of musical elements. The sequence of musical elements is rendered in human-perceivable form as a melody.
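The constraint-driven generation above can be illustrated minimally: rhythm constraints fix each element's duration, and pitch constraints list the allowed pitches (or silence) at each position. The data layout and random selection below are assumptions for illustration, not the patented method.

```python
import random

def generate_melody(rhythm, pitch_choices, seed=0):
    """Generate a sequence of (pitch, duration) musical elements.

    rhythm        -- duration of each element (rhythm constraint)
    pitch_choices -- allowed MIDI pitches per position, None = silence
                     (pitch constraint)
    """
    rng = random.Random(seed)
    return [(rng.choice(choices), dur)
            for dur, choices in zip(rhythm, pitch_choices)]

melody = generate_melody(
    rhythm=[0.5, 0.5, 1.0],
    pitch_choices=[[60, 62], [64], [None]],  # last element is a rest
)
```

Rendering the resulting (pitch, duration) sequence as audio or notation would then produce the human-perceivable melody.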

Virtual tutorials for musical instruments with finger tracking in augmented reality
11798429 · 2023-10-24

Systems, devices, media, and methods are described for presenting a tutorial in augmented reality on the display of a smart eyewear device. The system includes a marker registration utility for setting a marker on a musical instrument, a localization utility for locating the eyewear device relative to the marker location and the instrument, a virtual object rendering utility for presenting a series of virtual tutorial objects on the display near one or more actuators on the instrument, and a hand tracking utility for tracking the performer's finger locations in real time during playback of a song file. A high-definition video camera captures sequences of frames of video data. The series of virtual tutorial objects, in one example, includes graphical elements presented on a virtual scroll that appears to move toward the instrument at a speed correlated with the song tempo. The hand tracking utility calculates a set of expected fingertip coordinates based on a detected hand shape and a library of hand poses and landmarks.
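The expected-fingertip calculation can be sketched as a nearest-pose lookup against a library of hand poses and landmarks. The tiny two-pose library, the summed-distance metric, and the normalized coordinates below are all invented for illustration; real systems track many landmarks per pose.

```python
import math

# Stand-in pose library: pose name -> five fingertip (x, y) landmarks.
POSE_LIBRARY = {
    "c_major": [(0.1, 0.5), (0.3, 0.4), (0.5, 0.4), (0.7, 0.4), (0.9, 0.5)],
    "fist":    [(0.4, 0.8), (0.5, 0.8), (0.5, 0.8), (0.5, 0.8), (0.6, 0.8)],
}

def expected_fingertips(detected_shape):
    """Match a detected hand shape to the closest library pose and
    return that pose's expected fingertip coordinates."""
    def dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b))
    best = min(POSE_LIBRARY,
               key=lambda name: dist(POSE_LIBRARY[name], detected_shape))
    return best, POSE_LIBRARY[best]
```

The eyewear device could then compare these expected coordinates against the fingertips actually detected in each video frame to score the performance.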

Wireless multi-string tuner for stringed instruments and associated method of use
11562721 · 2023-01-24

A stringed instrument tuner that senses the vibration of all the strings of the instrument independently and simultaneously via ultraviolet reflective light sensors that are immune to interference from ambient alternating-current lighting. The pitches of the strings are then measured continuously in real-time and transmitted wirelessly to a receiver that simultaneously graphically displays how far out-of-tune all of the strings are so that the musician can instantly see which strings need tuning and tune them quickly. The receiver may be a smartphone, smartwatch, smart glasses, computer, self-tuning system, or a dedicated wearable receiver-display unit.
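The "how far out-of-tune" display can be expressed as a per-string deviation in cents (1200 cents per octave), computed for all strings simultaneously. The standard-tuning guitar targets below are an example; the patent's actual display format and signal path are not specified in the abstract.

```python
import math

def cents_off(measured_hz, target_hz):
    """Deviation of a measured pitch from its target, in cents.
    Positive = sharp, negative = flat."""
    return 1200.0 * math.log2(measured_hz / target_hz)

def tuning_display(measured, targets):
    """Per-string deviations for every string at once, mirroring the
    simultaneous multi-string display described in the abstract."""
    return {name: round(cents_off(m, t), 1)
            for (name, t), m in zip(targets.items(), measured)}

# Standard guitar tuning targets (Hz), used here purely as an example.
GUITAR = {"E2": 82.41, "A2": 110.00, "D3": 146.83,
          "G3": 196.00, "B3": 246.94, "E4": 329.63}
```

A string measured at 880 Hz against a 440 Hz target reads +1200 cents, i.e. a full octave sharp.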

SYNCHRONIZED AUDIOVISUAL WORK
20220277661 · 2022-09-01

The teachings described herein are generally directed to a system, method, and apparatus for separating and mixing tracks within music. The system can have a video that is synchronized with variations in the musical tempo through a variable timing reference track designed for, and provided to, a user of the prerecorded, preselected performance, wherein designing the variable timing reference track includes creating a tempo map having variable tempos, rhythms, and beats using notes from the preselected performance.
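A tempo map with variable tempos can be expanded into a timing reference track by assigning a timestamp to every beat. The (start_beat, bpm) segment format below is an assumption for illustration; the patent's actual track format is not given in the abstract.

```python
def beat_times(tempo_map, total_beats):
    """Expand a tempo map (list of (start_beat, bpm) segments, sorted by
    start_beat) into one timestamp per beat -- a simple variable timing
    reference track."""
    times, t, seg = [], 0.0, 0
    for beat in range(total_beats):
        # Advance to the tempo segment that governs this beat.
        while seg + 1 < len(tempo_map) and tempo_map[seg + 1][0] <= beat:
            seg += 1
        times.append(t)
        t += 60.0 / tempo_map[seg][1]
    return times
```

With 120 BPM for the first two beats and 60 BPM thereafter, the beat timestamps are 0.0, 0.5, 1.0, 2.0 seconds, which a video player could follow to stay synchronized with the performance.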

Platforms, media, and methods providing a first play streaming media station

Disclosed herein are media, systems, and methods of creating and publishing a first play station, wherein a media creator is allowed to select a plurality of audio media, sequence the selected audio media to create an ordered combination, and request a first play station. Each audio media in the ordered combination is validated to create a first play station, and the media creator is allowed to play the first play station. After the first play station is published, a call from a third-party player application prompts the published first play station. At least one alteration to the ordered combination of validated audio media can be generated, wherein the at least one alteration is selected from the group consisting of: an alternate validated audio media, an alternate sequence of the ordered combination of validated audio media, or a combination thereof.
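The validation and alteration steps can be sketched minimally. Representing the catalog of playable media as a set of IDs, and the "alternate validated audio media" alteration as a substitution map, are assumptions for illustration only.

```python
def validate_station(sequence, catalog):
    """Split the creator's ordered combination into validated media and
    media that failed validation against the catalog."""
    valid = [m for m in sequence if m in catalog]
    invalid = [m for m in sequence if m not in catalog]
    return valid, invalid

def alter_station(sequence, catalog, substitutes):
    """One alteration from the abstract: swap in an alternate validated
    audio media wherever a valid substitute exists."""
    return [substitutes.get(m, m) if substitutes.get(m, m) in catalog else m
            for m in sequence]
```

An alternate sequence (the other alteration named in the abstract) would simply reorder the validated list rather than substitute entries.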

Technologies for tracking and analyzing musical activity

Techniques are described herein for tracking and analyzing musical activity (e.g., musical performances) captured by a music controller device. Data indicative of a musical performance by an individual is received. Metadata characterizing the musical data is generated. One or more analytics of the musical data is generated based, at least in part, on the metadata.
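The metadata-then-analytics pipeline can be sketched with a toy event format. Representing performance data as (note, velocity, duration) tuples, and the specific metadata fields and analytics chosen, are assumptions for illustration.

```python
from collections import Counter

def generate_metadata(events):
    """Characterize raw performance data -- here a list of
    (note, velocity, duration) tuples -- as summary metadata."""
    return {
        "note_count": len(events),
        "avg_velocity": sum(v for _, v, _ in events) / len(events),
        "total_duration": sum(d for _, _, d in events),
    }

def generate_analytics(events, metadata):
    """Derive analytics from the performance data and its metadata,
    e.g. the most played note and the playing rate."""
    most_played = Counter(n for n, _, _ in events).most_common(1)[0][0]
    return {
        "most_played": most_played,
        "notes_per_sec": metadata["note_count"] / metadata["total_duration"],
    }
```

A controller device would stream such events during a performance; the analytics layer then summarizes practice sessions over time.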

Environment Awareness System for Experiencing an Environment Through Music

An environment awareness system includes a memory and first and second modules. The memory is configured to store environmental data, one or more music composition templates, and one or more maps, where the environmental data is indicative of at least one of a state, condition, or change in an environment in which the environment awareness system is located. The first module is configured to receive and store the environmental data in the memory. The second module is configured to: based on the one or more music composition templates and the one or more maps, convert the environmental data to a music signal including modifying variables in the one or more music composition templates based on the environmental data; and based on the music signal, play out a musical composition via an audio system to audibly indicate the at least one of the state, condition, or change in the environment.
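The second module's conversion step can be sketched as modifying variables in a composition template based on environmental data. Every mapping below (temperature selects a scale degree, wind speed shortens notes) is invented for illustration; the patent does not specify these rules.

```python
def render_environment(env, template, scale):
    """Convert environmental data to a music signal by modifying the
    template's variables: pitch from temperature, duration from wind."""
    degree = min(int(env["temperature_c"] // 5), len(scale) - 1)
    duration = max(0.25, 2.0 - env["wind_mps"] * 0.1)  # windier = shorter
    return {"pitch": template["root"] + scale[degree],
            "duration": round(duration, 2),
            "velocity": template["base_velocity"]}

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # major-scale semitone offsets
signal = render_environment({"temperature_c": 21, "wind_mps": 5},
                            {"root": 60, "base_velocity": 80}, MAJOR)
```

Playing such signals through an audio system lets a listener hear a change in the environment (say, rising wind) as a change in the music.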

Artificially intelligent music instruction methods and systems
11288975 · 2022-03-29

Apparatus and associated methods relate to comparing a musical score model to a captured performance of the musical score, calculating a degree of similarity between the musical score model and the captured performance based on the comparison, and automatically evaluating the captured performance based on the degree of similarity and musical score degree of difficulty determined as a function of the musical score entropy. In an illustrative example, a musician may be learning to play the musical score. The musical score may be modelled, for example, based on pitch, volume, and rhythm, permitting comparison to the captured performance of the musical score. In some examples, the musical score degree of difficulty may be adapted based on the captured performance evaluation. Some embodiments may generate musical scores based on the captured performance evaluation. Various examples may advantageously provide corrective instruction based on the degree of difficulty and the captured performance evaluation.
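The evaluation pipeline above can be sketched in three parts: a similarity measure between the score model and the captured performance, an entropy-based difficulty, and an evaluation that weights one by the other. The abstract ties difficulty to score entropy, but the exact formulas below (Shannon entropy over the pitch distribution, match-fraction similarity, and the weighting) are assumptions for illustration.

```python
import math
from collections import Counter

def score_entropy(pitches):
    """Shannon entropy of the pitch distribution, used here as a proxy
    for the musical score's degree of difficulty."""
    counts, n = Counter(pitches), len(pitches)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def similarity(model, performance):
    """Fraction of score-model events the captured performance matched."""
    hits = sum(1 for m, p in zip(model, performance) if m == p)
    return hits / len(model)

def evaluate(model, performance):
    """Weight similarity by difficulty: harder (higher-entropy) scores
    earn more credit for the same accuracy (illustrative weighting)."""
    return similarity(model, performance) * (1 + score_entropy(model))
```

A corrective-instruction step could then target the positions where `model[i] != performance[i]`, scaled by the score's difficulty.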