Method and system for generating musical notations for musical score
11756515 · 2023-09-12
Inventors
CPC classification
G10H2220/121
PHYSICS
G10H2210/066
PHYSICS
G10H2210/095
PHYSICS
G10H2210/086
PHYSICS
International classification
Abstract
Disclosed is a computer-implemented method and system for generating musical notations for a musical score. The method comprises: receiving, via a first user interface, a musical note event of the musical score; processing, via a processing arrangement, the received musical note event of the musical score to determine one or more relevant music profile definitions therefor; defining, via the processing arrangement, one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor; and generating, via the processing arrangement, at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith.
Claims
1. A computer-implemented method for generating musical notations for a musical score, the method comprising:
receiving, via a first user interface, a musical note event of the musical score;
processing, via a processing arrangement, the received musical note event of the musical score to determine one or more relevant music profile definitions therefor;
defining, via the processing arrangement, one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor; and
generating, via the processing arrangement, at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith,
wherein, in case two or more relevant music profile definitions are determined for the received musical note event of the musical score, the method further comprises:
determining, via the processing arrangement, correspondingly, two or more parameters to be associated with the received musical note event of the musical score based on the two or more relevant music profile definitions therefor;
generating, via the processing arrangement, correspondingly, two or more notation outputs for the received musical note event of the musical score based on the determined two or more parameters associated therewith;
receiving, via the first user interface, selection of one of the generated two or more notation outputs for the received musical note event of the musical score; and
generating, via the processing arrangement, a notation output for the received musical note event of the musical score based on the selected one of the generated two or more notation outputs therefor;
wherein the method further comprises:
receiving, via a second user interface, a command to implement the selected one of the generated two or more notation outputs for the received musical note event of the musical score for defining a notation output for the entirety of the musical score;
determining, via the processing arrangement, one or more parameters to be associated with musical note events of the musical score complementary to one or more parameters corresponding to the selected one of the generated two or more notation outputs for the received musical note event of the musical score; and
generating, via the processing arrangement, the notation output for the entirety of the musical score based on the determined one or more parameters to be associated with musical note events of the musical score.
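For illustration only, the flow of claim 1 can be sketched as follows. All names here (`determine_profiles`, `define_parameters`, the dictionary fields, and the two example profiles) are hypothetical and not taken from the patent; the sketch only mirrors the claimed steps: match a note event against music profile definitions, derive one parameter set per relevant profile, and emit one candidate notation output per parameter set for the user to choose from.

```python
# Hypothetical sketch of the claimed pipeline (names are illustrative).

def determine_profiles(note_event, profile_definitions):
    """Return every music profile definition whose predicate matches the event."""
    return [p for p in profile_definitions if p["matches"](note_event)]

def define_parameters(note_event, profile):
    """Derive notation parameters for the event under one profile."""
    return {"duration": note_event["duration"],
            "articulation": profile["default_articulation"]}

def generate_notation_outputs(note_event, profile_definitions):
    """One candidate notation output per relevant profile definition."""
    profiles = determine_profiles(note_event, profile_definitions)
    return [define_parameters(note_event, p) for p in profiles]

# Example: a staccato-leaning "Baroque" profile and a legato "Romantic" one.
profiles = [
    {"name": "Baroque", "matches": lambda e: e["duration"] < 0.5,
     "default_articulation": "staccato"},
    {"name": "Romantic", "matches": lambda e: e["duration"] >= 0.25,
     "default_articulation": "legato"},
]
event = {"pitch": 60, "duration": 0.3}
outputs = generate_notation_outputs(event, profiles)
# Both profiles match this event, so two candidate outputs are produced,
# corresponding to the two-or-more-profiles branch of claim 1.
```

When two or more outputs result, the claim has the user select one, and that selection can then be propagated to the rest of the score.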
2. The method according to claim 1, further comprising receiving, via the second user interface, at least one user-defined music profile definition for the musical score, wherein the one or more relevant music profile definitions for the received musical note event of the musical score are determined based on the received at least one user-defined music profile definition for the musical score.
3. The method according to claim 1, wherein the one or more relevant music profile definitions comprise at least one of: a genre of the received musical note event, an instrument of the received musical note event, a composer of the received musical note event, a period profile of the received musical note event, a custom profile of the received musical note event.
4. The method according to claim 1, wherein the one or more parameters comprise at least one of: an arrangement context providing information about the musical note event including at least one of a duration for the musical note event, a timestamp for the musical note event and a voice layer index for the musical note event, a pitch context providing information about a pitch for the musical note event including at least one of a pitch class for the musical note event, an octave for the musical note event and a pitch curve for the musical note event, and an expression context providing information about one or more articulations for the musical note event including at least one of an articulation map for the musical note event, a dynamic type for the musical note event and an expression curve for the musical note event.
5. The method according to claim 4, wherein, in the arrangement context, the duration for the musical note event indicates a time duration of the musical note event, the timestamp for the musical note event indicates an absolute position of the musical note event, and the voice layer index for the musical note event provides a value from a range of indexes indicating a placement of the musical note event in a voice layer, or a rest in the voice layer.
6. The method according to claim 4, wherein, in the pitch context, the pitch class for the musical note event indicates a value from a range including C, C #, D, D #, E, F, F #, G, G #, A, A #, B for the musical note event, the octave for the musical note event indicates an integer number representing an octave of the musical note event, and the pitch curve for the musical note event indicates a container of points representing a change of the pitch of the musical note event over duration thereof.
7. The method according to claim 4, wherein, in the expression context, the articulation map for the musical note event provides a relative position as a percentage indicating an absolute position of the musical note event, the dynamic type for the musical note event indicates a type of dynamic applied over the duration of the musical note event, and the expression curve for the musical note event indicates a container of points representing values of an action force associated with the musical note event.
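A minimal sketch of the three parameter contexts of claims 4 through 7, assuming plain data containers; the class and field names are illustrative choices, not identifiers from the patent:

```python
from dataclasses import dataclass, field

# The twelve pitch classes enumerated in claim 6.
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

@dataclass
class ArrangementContext:
    duration: float         # time duration of the note event (claim 5)
    timestamp: float        # absolute position of the note event
    voice_layer_index: int  # placement in a voice layer, or a rest

@dataclass
class PitchContext:
    pitch_class: str        # one of PITCH_CLASSES (claim 6)
    octave: int             # integer octave number
    # container of points: pitch change over the event's duration
    pitch_curve: list = field(default_factory=list)

@dataclass
class ExpressionContext:
    # relative position (percentage) -> articulation (claim 7)
    articulation_map: dict = field(default_factory=dict)
    dynamic_type: str = "mf"  # dynamic applied over the event's duration
    # container of points: action-force values over time
    expression_curve: list = field(default_factory=list)

note_pitch = PitchContext(pitch_class="C#", octave=4)
```

A notation generator could then combine one instance of each context to fully parameterize a single note event.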
8. The method according to claim 1, wherein the received musical note event is a Musical Instrument Digital Interface (MIDI) note event comprising each of MIDI messages received in a time interval between a MIDI NoteOn and a MIDI NoteOff message in a single MIDI channel.
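The claim-8 definition of a note event, i.e. every MIDI message arriving on one channel between a NoteOn and its NoteOff, can be sketched as a grouping pass over a message stream. The dictionary message format and function name below are assumptions for illustration, not the patent's representation:

```python
# Hypothetical grouping of MIDI messages into note events per claim 8:
# all messages on a channel between note_on and note_off form one event.

def group_note_events(messages):
    """Group (type, channel, ...) message dicts into per-channel note events."""
    open_events = {}   # channel -> messages of the currently open event
    note_events = []
    for msg in messages:
        mtype, channel = msg["type"], msg["channel"]
        if mtype == "note_on":
            open_events[channel] = [msg]
        elif mtype == "note_off" and channel in open_events:
            event = open_events.pop(channel)
            event.append(msg)
            note_events.append(event)
        elif channel in open_events:
            # e.g. pitch-bend or aftertouch inside the NoteOn..NoteOff interval
            open_events[channel].append(msg)
    return note_events

stream = [
    {"type": "note_on", "channel": 0, "note": 60},
    {"type": "pitch_bend", "channel": 0, "value": 512},
    {"type": "note_off", "channel": 0, "note": 60},
]
events = group_note_events(stream)
# One note event is produced, carrying all three messages of channel 0.
```

Messages captured inside the interval (here the pitch-bend) are exactly what the pitch-curve and expression-curve containers of claims 6 and 7 would be derived from.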
9. A system for generating musical notations for a musical score, the system comprising:
a first user interface configured to receive a musical note event of the musical score; and
a processing arrangement configured to:
process the received musical note event of the musical score to determine one or more relevant music profile definitions therefor;
define one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor; and
generate at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith,
wherein, in case two or more relevant music profile definitions are determined for the received musical note event of the musical score, the processing arrangement is further configured to:
determine, correspondingly, two or more parameters to be associated with the received musical note event of the musical score based on the two or more relevant music profile definitions therefor;
generate, correspondingly, two or more notation outputs for the received musical note event of the musical score based on the determined two or more parameters associated therewith;
receive, via a second user interface, selection of one of the generated two or more notation outputs for the received musical note event of the musical score; and
generate a notation output for the received musical note event of the musical score based on the selected one of the generated two or more notation outputs therefor;
wherein the processing arrangement is further configured to:
receive, via the second user interface, a command to implement the selected one of the generated two or more notation outputs for the received musical note event of the musical score for defining a notation output for the entirety of the musical score;
determine one or more parameters to be associated with musical note events of the musical score complementary to one or more parameters corresponding to the selected one of the generated two or more notation outputs for the received musical note event of the musical score; and
generate the notation output for the entirety of the musical score based on the determined one or more parameters to be associated with musical note events of the musical score.
10. The system according to claim 9, further comprising the second user interface configured to receive at least one user-defined music profile definition for the musical score, wherein the one or more relevant music profile definitions for the received musical note event of the musical score are determined, via the processing arrangement, based on the received at least one user-defined music profile definition for the musical score.
11. The system according to claim 9, wherein the one or more relevant music profile definitions comprise at least one of: a genre of the received musical note event, an instrument of the received musical note event, a composer of the received musical note event, a period profile of the received musical note event, a custom profile of the received musical note event.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) One or more embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
DETAILED DESCRIPTION OF THE DRAWINGS
(9)-(15) [Paragraphs referring to the drawings; truncated in the source text.]
(16) Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “have”, “is” used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.