G10H2210/341

AUTOMATED MUSIC COMPOSITION AND GENERATION SYSTEMS, ENGINES AND METHODS EMPLOYING PARAMETER MAPPING CONFIGURATIONS TO ENABLE AUTOMATED MUSIC COMPOSITION AND GENERATION
20200168195 · 2020-05-28

An automated music composition and generation system includes a graphical user interface (GUI) based system user interface for enabling system users to review and select one or more musical experience descriptors as well as time and/or space parameters; and an automated music composition and generation engine, operably connected to the GUI-based system user interface, for receiving, storing and processing the musical experience descriptors and time and/or space parameters, and composing and generating digital pieces of music, each containing a set of musical notes arranged and performed in the digital piece of composed music. A system network and methods are provided for designing and developing parameter mapping configurations (PMCs) used in the automated music composition and generation engine so as to enable the automated music composition and generation engine to automatically compose and generate music in response to musical experience descriptors and time and/or space parameters provided as input to the system.
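The core idea of a parameter mapping configuration is a lookup from user-facing experience descriptors to concrete musical parameters. A minimal sketch, in which all descriptor names and parameter values are illustrative assumptions rather than data from the patent:

```python
# Hypothetical parameter mapping configuration (PMC): pairs of
# (emotion, style) descriptors map to concrete musical parameters.
PMC = {
    ("happy", "pop"):   {"tempo_bpm": 120, "key": "C", "mode": "major"},
    ("sad", "ambient"): {"tempo_bpm": 70,  "key": "A", "mode": "minor"},
}

DEFAULT_PARAMS = {"tempo_bpm": 100, "key": "C", "mode": "major"}


def map_descriptors(emotion: str, style: str) -> dict:
    """Return the musical parameters configured for the given descriptors,
    falling back to neutral defaults for unmapped descriptor pairs."""
    return PMC.get((emotion, style), DEFAULT_PARAMS)


params = map_descriptors("happy", "pop")   # {"tempo_bpm": 120, ...}
```

In a full engine the mapped values would seed every downstream subsystem (rhythm, harmony, instrumentation); here the mapping is reduced to a single dictionary lookup for illustration.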

METHOD OF SCORING DIGITAL MEDIA OBJECTS USING MUSICAL EXPERIENCE DESCRIPTORS TO INDICATE WHAT, WHERE AND WHEN MUSICAL EVENTS SHOULD APPEAR IN PIECES OF DIGITAL MUSIC AUTOMATICALLY COMPOSED AND GENERATED BY AN AUTOMATED MUSIC COMPOSITION AND GENERATION SYSTEM
20200168196 · 2020-05-28

An automated music composition and generation system having a system user interface operably connected to an automated music composition and generation engine, and supporting a method of scoring a selected media object with one or more pieces of digital music. The method uses the system user interface to select one or more musical experience descriptors and then apply the selected musical experience descriptors to the selected digital media object to indicate what, when and how particular musical events should occur in the one or more pieces of digital music to be automatically composed and generated by the automated music composition and generation engine. The generated piece of digital music is then used in musically scoring the selected digital media object.

METHOD OF COMPOSING A PIECE OF DIGITAL MUSIC USING MUSICAL EXPERIENCE DESCRIPTORS TO INDICATE WHAT, WHEN AND HOW MUSICAL EVENTS SHOULD APPEAR IN THE PIECE OF DIGITAL MUSIC AUTOMATICALLY COMPOSED AND GENERATED BY AN AUTOMATED MUSIC COMPOSITION AND GENERATION SYSTEM
20200168197 · 2020-05-28

An automated music composition and generation system having a system user interface operably connected to an automated music composition and generation engine, and supporting a method of composing a piece of digital music using musical experience descriptors to indicate what, when and how particular musical events should occur in the piece of digital music to be automatically composed and generated. The method uses the system user interface to select one or more musical experience descriptors and apply them along a timeline representation of the piece of digital music to be automatically composed and generated by the automated music composition and generation engine.

Computer-implemented method of digital music composition
11875763 · 2024-01-16

A computer-implemented method of digital music composition that creates a digital multi-genre musical composition track by downloading a host digital music track of a first genre and two or more separate donor multi-genre musical tracks, and then selectively modulating the instruments and rhythmic patterns of the donor musical tracks. The manipulation includes manipulating at least one of the intensity, frequency, sound, beat, and rhythm of the rhythmic pattern. The manipulated donor musical tracks are then integrated into the host musical track to create a combined digital multi-genre musical composition track, which can be downloaded, saved in a file, and replayed as needed.
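The modulate-then-integrate step described above can be sketched on simple beat-intensity lists. Everything here (the representation of a rhythmic pattern as per-beat intensities in [0, 1], the 0.5 mixing weight, the rotation-based beat shift) is an illustrative assumption, not the patent's actual signal processing:

```python
def modulate_pattern(pattern, intensity_scale=1.0, shift=0):
    """Scale note intensities and rotate the beat positions of a
    donor rhythmic pattern (a list of per-beat intensities in [0, 1])."""
    n = len(pattern)
    return [min(1.0, pattern[(i - shift) % n] * intensity_scale)
            for i in range(n)]


def integrate(host, donors):
    """Layer modulated donor patterns onto the host track, beat by beat,
    clamping the combined intensity at 1.0."""
    combined = list(host)
    for donor in donors:
        combined = [min(1.0, h + 0.5 * d) for h, d in zip(combined, donor)]
    return combined


host = [1.0, 0.0, 0.5, 0.0]
donor = modulate_pattern([0.8, 0.0, 0.8, 0.0], intensity_scale=0.5, shift=1)
track = integrate(host, [donor])   # donor off-beats now fill the host's rests
```

The shift parameter moves the donor's accents onto the host's silent beats, which is one simple way the claimed beat/rhythm manipulation could interleave two genres.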

AUTO-GENERATED ACCOMPANIMENT FROM SINGING A MELODY
20200074966 · 2020-03-05

A method for processing a voice signal by an electronic system to create a song is disclosed. The method comprises the steps, in the electronic system, of acquiring an input singing voice recording (11); estimating a musical key (15b) and a tempo (15a) from the singing voice recording (11); defining a tuning control (16) and a timing control (17) able to align the singing voice recording (11) with the estimated musical key (15b) and tempo (15a); and applying the tuning control (16) and the timing control (17) to the singing voice recording (11) so that an aligned voice recording (20) is obtained. Next, the method comprises the step of generating a music accompaniment (23) as a function of the estimated musical key (15b) and tempo (15a) and an arrangement database (22), and mixing the aligned voice recording (20) with the music accompaniment (23) to obtain the song (12). A system, a server and a device are also disclosed.
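Two of the steps above, tempo estimation and the timing control, can be sketched on a list of note-onset times. This is a toy stand-in for the patent's actual estimators (median inter-onset interval for tempo, grid snapping for timing); the onset values are invented for illustration:

```python
def estimate_tempo(onset_times):
    """Estimate tempo in BPM from the median interval between note onsets
    of the singing voice recording."""
    intervals = sorted(b - a for a, b in zip(onset_times, onset_times[1:]))
    median = intervals[len(intervals) // 2]
    return 60.0 / median


def quantize(onset_times, tempo_bpm):
    """Timing control: snap each onset to the nearest beat of the
    metrical grid implied by the estimated tempo."""
    beat = 60.0 / tempo_bpm
    return [round(t / beat) * beat for t in onset_times]


onsets = [0.0, 0.52, 1.01, 1.49, 2.0]      # seconds; roughly 0.5 s apart
tempo = estimate_tempo(onsets)             # close to 120 BPM
aligned = quantize(onsets, tempo)
```

A key estimator would proceed analogously (e.g., matching a pitch histogram against key profiles), and the aligned onsets would then drive accompaniment generation at the estimated tempo.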

METHOD AND APPARATUS FOR MUSIC GENERATION

A method and apparatus for music generation may include the steps of receiving an input of any length; recognizing the pitches and rhythm of the input; generating a first segment of a full piece of music; generating the remaining segments to complete the full piece; generating connecting notes, chords and beats between the segments and handling anacrusis; and generating instrument accompaniment for the full piece. A music generating system realizing these steps of music generation is also provided.

AUTOMATED MUSIC COMPOSITION AND GENERATION SYSTEM EMPLOYING VIRTUAL MUSICAL INSTRUMENT LIBRARIES FOR PRODUCING NOTES CONTAINED IN THE DIGITAL PIECES OF AUTOMATICALLY COMPOSED MUSIC
20240062736 · 2024-02-22

An automated music composition and generation system including a system user interface for enabling system users to review and select one or more musical experience descriptors, as well as time and/or space parameters; and an automated music composition and generation engine, operably connected to the system user interface, for receiving, storing and processing musical experience descriptors and time and/or space parameters selected by the system user, so as to automatically compose and generate one or more digital pieces of music in response to the musical experience descriptors and time and/or space parameters selected by the system user. The automated music composition and generation engine includes: a digital piece creation subsystem for creating and delivering the digital piece of music to the system user interface; and a digital audio sample producing subsystem supported by virtual musical instrument libraries.

Automated music composition and generation system for spotting digital media objects and event markers using emotion-type, style-type, timing-type and accent-type musical experience descriptors that characterize the digital music to be automatically composed and generated by the system
10467998 · 2019-11-05

An autonomous music composition and performance system employing an automated music composition and generation engine configured to receive musical signals from a set of real or synthetic musical instruments being played by a group of human musicians. The system buffers and analyzes musical signals from the set of real or synthetic musical instruments, composes and generates music in real-time that augments the music being played by the band of musicians, and/or records, analyzes and composes music for subsequent playback, review and consideration by the human musicians.

AUTOMATED MUSIC COMPOSITION AND GENERATION SYSTEM EMPLOYING AN INSTRUMENT SELECTOR FOR AUTOMATICALLY SELECTING VIRTUAL INSTRUMENTS FROM A LIBRARY OF VIRTUAL INSTRUMENTS TO PERFORM THE NOTES OF THE COMPOSED PIECE OF DIGITAL MUSIC
20190304418 · 2019-10-03

An automated music composition and generation system for automatically composing and generating digital pieces of music using an automated music composition and generation engine driven by a set of emotion-type and style-type musical experience descriptors and time and/or space parameters provided by a system user. The automated music composition and generation engine includes an instrument subsystem supporting a library of virtual instruments, wherein each virtual instrument is capable of performing one or more notes of at least a portion of the composed piece of music, in response to the emotion-type and/or style-type musical experience descriptors; an instrument selector subsystem for automatically selecting one or more virtual instruments from the library, so that each selected virtual instrument performs one or more notes of at least a portion of the composed piece of music; and a digital piece creation subsystem for creating the digital piece of composed music by assembling the notes produced from the virtual instruments selected from the library.
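The instrument selector subsystem can be sketched as tag-based filtering over a virtual-instrument library. The library entries, tag names and descriptor values below are all invented for illustration; the patent does not specify this data model:

```python
# Hypothetical virtual-instrument library: each entry carries the
# style-type and emotion-type descriptors it is suited to perform.
LIBRARY = [
    {"name": "grand_piano",  "styles": {"classical", "pop"}, "emotions": {"happy", "sad"}},
    {"name": "analog_synth", "styles": {"electronic"},       "emotions": {"happy"}},
    {"name": "cello",        "styles": {"classical"},        "emotions": {"sad"}},
]


def select_instruments(emotion, style):
    """Instrument selector: keep the virtual instruments whose tags
    match both the emotion-type and style-type descriptors."""
    return [inst["name"] for inst in LIBRARY
            if style in inst["styles"] and emotion in inst["emotions"]]


chosen = select_instruments("sad", "classical")   # piano and cello qualify
```

The digital piece creation subsystem would then assign the composed notes across the chosen instruments and render them from the instruments' sample libraries.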

Performance apparatus, performance method, recording medium, and electronic musical instrument
10424279 · 2019-09-24

The present disclosure provides a performance apparatus capable of changing to a performance pattern that makes a musically natural transition. In response to depression of a transition button, a CPU 13 determines which stage the currently selected performance pattern of each of tracks Tr (1) to (4) belongs to, and obtains the number of performance patterns in each stage. The CPU 13 takes the stage having the maximum number of performance patterns as the current stage, and changes the performance pattern of each of the tracks Tr (1) to (4) to the stage subsequent to that current stage. The performance pattern can therefore be changed into one that makes a musically natural transition. As a result, even a beginner with little music knowledge can set an appropriate performance pattern matching the progression of the song.
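The stage-transition logic above reduces to a majority vote followed by an increment. A minimal sketch, in which the four-track setup and the cap at the last stage (rather than, say, wrapping around) are assumptions for illustration:

```python
from collections import Counter

def advance_stage(track_stages, num_stages):
    """On a transition-button press: take the stage to which the most
    tracks' selected performance patterns belong as the current stage,
    then move every track's pattern to the subsequent stage
    (clamped at the final stage)."""
    counts = Counter(track_stages)
    current = max(counts, key=counts.get)     # stage with the most patterns
    nxt = min(current + 1, num_stages - 1)    # subsequent stage, capped
    return [nxt] * len(track_stages)


# Tracks Tr(1)..Tr(4): three patterns sit in stage 2, one in stage 1,
# so stage 2 is the current stage and all tracks advance to stage 3.
stages = advance_stage([2, 2, 1, 2], num_stages=4)
```

Moving all tracks together to the next stage is what keeps the transition "musically natural": no track is left playing a pattern from an earlier build-up stage.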