Method and system for generating musical notations for musical score

11756515 · 2023-09-12


    Abstract

    Disclosed is a computer-implemented method and system for generating musical notations for a musical score. The method comprises: receiving, via a first user interface, a musical note event of the musical score; processing, via a processing arrangement, the received musical note event of the musical score to determine one or more relevant music profile definitions therefor; defining, via the processing arrangement, one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor; and generating, via the processing arrangement, at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith.

    Claims

    1. A computer-implemented method for generating musical notations for a musical score, the method comprising: receiving, via a first user interface, a musical note event of the musical score; processing, via a processing arrangement, the received musical note event of the musical score to determine one or more relevant music profile definitions therefor; defining, via the processing arrangement, one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor; and generating, via the processing arrangement, at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith, wherein, in case two or more relevant music profile definitions are determined for the received musical note event of the musical score, the method further comprises: determining, via the processing arrangement, correspondingly, two or more parameters to be associated with the received musical note event of the musical score based on the two or more relevant music profile definitions therefor; generating, via the processing arrangement, correspondingly, two or more notation outputs for the received musical note event of the musical score based on the determined two or more parameters associated therewith; receiving, via the user interface, selection of one of the generated two or more notation outputs for the received musical note event of the musical score; and generating, via the processing arrangement, a notation output for the received musical note event of the musical score based on the selected one of the generated two or more notation outputs therefor; wherein the method further comprises: receiving, via a second user interface, a command to implement the selected one of the generated two or more notation outputs for the received musical note event of the musical score for defining a notation output for the entirety of the musical score; determining, via the processing arrangement, one or more parameters to be associated with musical note events of the musical score complementary to one or more parameters corresponding to the selected one of the generated two or more notation outputs for the received musical note event of the musical score; and generating, via the processing arrangement, the notation output for the entirety of the musical score based on the determined one or more parameters to be associated with musical note events of the musical score.

    2. The method according to claim 1, further comprising receiving, via the second user interface, at least one user-defined music profile definition for the musical score, wherein the one or more relevant music profile definitions for the received musical note event of the musical score are determined based on the received at least one user-defined music profile definition for the musical score.

    3. The method according to claim 1, wherein the one or more relevant music profile definitions comprise at least one of: a genre of the received musical note event, an instrument of the received musical note event, a composer of the received musical note event, a period profile of the received musical note event, a custom profile of the received musical note event.

    4. The method according to claim 1, wherein the one or more parameters comprise at least one of: an arrangement context providing information about the musical note event including at least one of a duration for the musical note event, a timestamp for the musical note event and a voice layer index for the musical note event, a pitch context providing information about a pitch for the musical note event including at least one of a pitch class for the musical note event, an octave for the musical note event and a pitch curve for the musical note event, and an expression context providing information about one or more articulations for the musical note event including at least one of an articulation map for the musical note event, a dynamic type for the musical note event and an expression curve for the musical note event.

    5. The method according to claim 4, wherein, in the arrangement context, the duration for the musical note event indicates a time duration of the musical note event, the timestamp for the musical note event indicates an absolute position of the musical note event, and the voice layer index for the musical note event provides a value from a range of indexes indicating a placement of the musical note event in a voice layer, or a rest in the voice layer.

    6. The method according to claim 4, wherein, in the pitch context, the pitch class for the musical note event indicates a value from a range including C, C#, D, D#, E, F, F#, G, G#, A, A#, B for the musical note event, the octave for the musical note event indicates an integer number representing an octave of the musical note event, and the pitch curve for the musical note event indicates a container of points representing a change of the pitch of the musical note event over duration thereof.

    7. The method according to claim 4, wherein, in the expression context, the articulation map for the musical note event provides a relative position, as a percentage, of one or more articulations over the duration of the musical note event, the dynamic type for the musical note event indicates a type of dynamic applied over the duration of the musical note event, and the expression curve for the musical note event indicates a container of points representing values of an action force associated with the musical note event.

    8. The method according to claim 1, wherein the received musical note event is a Musical Instrument Digital Interface (MIDI) note event comprising each of the MIDI messages received in a time interval between a MIDI NoteOn message and a MIDI NoteOff message in a single MIDI channel.

    9. A system for generating musical notations for a musical score, the system comprising: a first user interface configured to receive a musical note event of the musical score; and a processing arrangement configured to: process the received musical note event of the musical score to determine one or more relevant music profile definitions therefor; define one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor; and generate at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith, wherein, in case two or more relevant music profile definitions are determined for the received musical note event of the musical score, the processing arrangement is further configured to: determine, correspondingly, two or more parameters to be associated with the received musical note event of the musical score based on the two or more relevant music profile definitions therefor; generate, correspondingly, two or more notation outputs for the received musical note event of the musical score based on the determined two or more parameters associated therewith; receive, via the second user interface, selection of one of the generated two or more notation outputs for the received musical note event of the musical score; and generate a notation output for the received musical note event of the musical score based on the selected one of the generated two or more notation outputs therefor; wherein the processing arrangement is further configured to: receive, via the second user interface, a command to implement the selected one of the generated two or more notation outputs for the received musical note event of the musical score for defining a notation output for the entirety of the musical score; determine one or more parameters to be associated with musical note events of the musical score complementary to one or more parameters corresponding to the selected one of the generated two or more notation outputs for the received musical note event of the musical score; and generate the notation output for the entirety of the musical score based on the determined one or more parameters to be associated with musical note events of the musical score.

    10. The system according to claim 9, further comprising the second user interface configured to receive at least one user-defined music profile definition for the musical score, wherein the one or more relevant music profile definitions for the received musical note event of the musical score are determined, via the processing arrangement, based on the received at least one user-defined music profile definition for the musical score.

    11. The system according to claim 9, wherein the one or more relevant music profile definitions comprise at least one of: a genre of the received musical note event, an instrument of the received musical note event, a composer of the received musical note event, a period profile of the received musical note event, a custom profile of the received musical note event.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    (1) One or more embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:

    (2) FIG. 1 is an illustration of a flowchart listing steps involved in a computer-implemented method for generating musical notations for a musical score, in accordance with an embodiment of the present disclosure;

    (3) FIG. 2 is an illustration of a block diagram of a system for generating musical notations for a musical score, in accordance with another embodiment of the present disclosure;

    (4) FIG. 3 is an illustration of an exemplary depiction of a musical note event being represented using one or more parameters thereof, in accordance with an embodiment of the present disclosure;

    (5) FIG. 4 is an exemplary depiction of a musical note event being translated into an arrangement context, in accordance with an embodiment of the present disclosure;

    (6) FIG. 5 is an exemplary depiction of a musical note event being translated into a pitch context, in accordance with an embodiment of the present disclosure;

    (7) FIG. 6A is an exemplary illustration of a first user interface, in accordance with an embodiment of the present disclosure;

    (8) FIG. 6B is an exemplary illustration of a second user interface, in accordance with an embodiment of the present disclosure.

    DETAILED DESCRIPTION OF THE DRAWINGS

    (9) Referring to FIG. 1, illustrated is a flowchart listing steps involved in a computer-implemented method 100 for generating musical notations for a musical score, in accordance with an embodiment of the present disclosure. As shown, the method 100 comprises steps 102, 104, 106, and 108. At a step 102, the method 100 comprises receiving, via a first user interface, a musical note event of the musical score. The musical note event(s) are entered by the user via the first user interface, which is configured to allow the user to enter the musical note events of the musical score to be translated or notated by the method 100. At a step 104, the method 100 further comprises processing, via a processing arrangement, the received musical note event of the musical score to determine one or more relevant music profile definitions therefor. The processing arrangement is configured to determine the one or more relevant music profile definitions via processing of the received musical note event of the musical score. At a step 106, the method 100 further comprises defining, via the processing arrangement, one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor. The processing arrangement is configured to define the one or more parameters to be associated with the musical note event based on at least the determined one or more relevant music profile definitions for enabling further processing thereof. At a step 108, the method 100 further comprises generating, via the processing arrangement, at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith. The processing arrangement is further configured to generate the at least one notation output for the received musical note event based on the defined one or more parameters for generating the musical notations for the musical score.
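
    The following minimal sketch summarizes steps 102 to 108 as a single pass over one note event. It is illustrative only: all class and function names (NoteEvent, determine_profiles, define_parameters, generate_output) are assumptions made for exposition and are not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class NoteEvent:
    pitch: int        # e.g. a MIDI note number (assumed representation)
    onset_ms: int
    duration_ms: int

@dataclass
class ProfileDefinition:
    name: str         # e.g. a genre, instrument, composer, or period profile

def determine_profiles(event: NoteEvent) -> List[ProfileDefinition]:
    # Step 104: match the received event against stored music profile
    # definitions; a real system would consult genre/instrument/composer data.
    return [ProfileDefinition("Romantic")]

def define_parameters(event: NoteEvent, profiles: List[ProfileDefinition]) -> dict:
    # Step 106: derive the parameters to be associated with the event,
    # here reduced to an arrangement-style timestamp and duration.
    return {"timestamp_ms": event.onset_ms, "duration_ms": event.duration_ms}

def generate_output(event: NoteEvent, params: dict) -> str:
    # Step 108: render a notation output from the defined parameters.
    return f"note {event.pitch} at {params['timestamp_ms']} ms for {params['duration_ms']} ms"

# Step 102 is stubbed here: the event would arrive via the first user interface.
event = NoteEvent(pitch=76, onset_ms=0, duration_ms=500)
print(generate_output(event, define_parameters(event, determine_profiles(event))))
```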

    (10) Referring to FIG. 2, illustrated is a block diagram of a system 200 for generating musical notations for a musical score, in accordance with another embodiment of the present disclosure. As shown, the system 200 comprises a first user interface 202 configured to receive a musical note event of the musical score, a second user interface 204 configured to receive at least one user-defined music profile definition for the musical score, and a processing arrangement 206 configured to process the received musical note event of the musical score to determine one or more relevant music profile definitions therefor. The processing arrangement 206 is further configured to define one or more parameters to be associated with the received musical note event of the musical score based, at least in part, on the determined one or more relevant music profile definitions therefor, and to generate at least one notation output for the received musical note event of the musical score based on the defined one or more parameters associated therewith.
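
    One possible way to express the division of responsibilities in FIG. 2 is as a set of component interfaces. The Protocol and method names below are hypothetical and only mirror the roles of elements 202, 204, and 206; they are a sketch, not the patented implementation.

```python
from typing import Any, List, Protocol

class FirstUserInterface(Protocol):        # element 202
    def receive_note_event(self) -> Any: ...

class SecondUserInterface(Protocol):       # element 204
    def receive_profile_definition(self) -> Any: ...

class ProcessingArrangement(Protocol):     # element 206
    def determine_profiles(self, event: Any, user_profile: Any) -> List[Any]: ...
    def define_parameters(self, event: Any, profiles: List[Any]) -> dict: ...
    def generate_output(self, event: Any, params: dict) -> str: ...

def run(first_ui: FirstUserInterface,
        second_ui: SecondUserInterface,
        processor: ProcessingArrangement) -> str:
    # The system 200 pipes the two user-interface inputs into the
    # processing arrangement and returns the notation output.
    event = first_ui.receive_note_event()
    user_profile = second_ui.receive_profile_definition()
    profiles = processor.determine_profiles(event, user_profile)
    params = processor.define_parameters(event, profiles)
    return processor.generate_output(event, params)
```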

    (11) Referring to FIG. 3, illustrated is an exemplary depiction of a musical note event represented using one or more parameters 300 thereof, in accordance with one or more embodiments of the present disclosure. As shown, the exemplary musical note event is depicted using the one or more parameters 300 added by the user via the second user interface 204, i.e., the musical note event is translated using the one or more parameters 300 for further processing and analysis thereof. Herein, the one or more parameters 300 comprise at least an arrangement context 302, wherein the arrangement context 302 comprises a timestamp 302A, a duration 302B, and a voice layer index 302C. Further, the one or more parameters 300 comprise a pitch context 304, wherein the pitch context 304 comprises a pitch class 304A, an octave 304B, and a pitch curve 304C. Furthermore, the one or more parameters 300 comprise an expression context 306, wherein the expression context 306 comprises an articulation map 306A, a dynamic type 306B, and an expression curve 306C. Collectively, the arrangement context 302, the pitch context 304, and the expression context 306 enable the method 100 or the system 200 to generate accurate and effective notations.
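
    A compact way to hold these parameters is a set of record types, one per context. The sketch below assumes that curves and maps are stored as lists of (relative position, value) points; the field names mirror the reference numerals 302A to 306C but are otherwise hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]  # (relative position in the note, value) -- assumed encoding

@dataclass
class ArrangementContext:                    # 302
    timestamp_ms: int                        # 302A: absolute position of the event
    duration_ms: int                         # 302B: time duration of the event
    voice_layer_index: int                   # 302C: voice layer placement (or rest)

@dataclass
class PitchContext:                          # 304
    pitch_class: str                         # 304A: one of C, C#, D, ..., B
    octave: int                              # 304B: integer octave number
    pitch_curve: List[Point] = field(default_factory=list)  # 304C: pitch change over the duration

@dataclass
class ExpressionContext:                     # 306
    articulation_map: List[Point] = field(default_factory=list)  # 306A: articulation positions as percentages
    dynamic_type: str = ""                   # 306B: dynamic applied over the duration
    expression_curve: List[Point] = field(default_factory=list)  # 306C: action-force values

@dataclass
class NoteEventParameters:                   # the one or more parameters 300
    arrangement: ArrangementContext
    pitch: PitchContext
    expression: ExpressionContext
```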

    (12) Referring to FIG. 4, illustrated is an exemplary depiction of a musical note event 400 being translated into the arrangement context 302, in accordance with an embodiment of the present disclosure. As shown, the musical note event 400 comprises a stave with five distinct events or notes that are required to be translated into corresponding arrangement contexts, i.e., the five distinct events of the musical note event 400 are represented by respective arrangement contexts 402A to 402E. The first musical note event is represented as a first arrangement context 402A comprising a timestamp=0 ms, a duration=500 ms, and a voice layer index=0. The second musical note event is represented as a second arrangement context 402B comprising a timestamp=500 ms, a duration=500 ms, and a voice layer index=0. The third musical note event is represented as a third arrangement context 402C comprising a timestamp=1000 ms, a duration=250 ms, and a voice layer index=0. The fourth musical note event is represented as a fourth arrangement context 402D comprising a timestamp=1250 ms, a duration=250 ms, and a voice layer index=0. The fifth musical note event is represented as a fifth arrangement context 402E comprising a timestamp=1500 ms, a duration=500 ms, and a voice layer index=0.
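
    Since the five events in FIG. 4 are contiguous in a single voice layer, each timestamp is simply the running sum of the preceding durations, which is how the values 0, 500, 1000, 1250, and 1500 ms arise. A minimal sketch of that accumulation follows; the 500 ms duration of the fifth event is an illustrative assumption.

```python
# Durations of the five events 402A-402E in FIG. 4 (the last one assumed).
durations_ms = [500, 500, 250, 250, 500]

timestamp_ms = 0
contexts = []
for duration_ms in durations_ms:
    contexts.append({"timestamp_ms": timestamp_ms,
                     "duration_ms": duration_ms,
                     "voice_layer_index": 0})
    timestamp_ms += duration_ms  # the next event starts where this one ends

for context in contexts:
    print(context)
# timestamps: 0, 500, 1000, 1250, 1500 ms
```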

    (13) Referring to FIG. 5, illustrated is an exemplary depiction of a musical note event 500 being translated into the pitch context 304, in accordance with an embodiment of the present disclosure. As shown, the musical note event 500 comprises two distinct events or notes that are required to be translated into corresponding pitch contexts, i.e., the two distinct events of the musical note event 500 are represented by respective pitch contexts 504A and 504B. The first musical note event is represented by the first pitch context 504A, wherein the first pitch context 504A comprises the pitch class=E, the octave=5, and the pitch curve 506A. The second musical note event is represented by the second pitch context 504B, wherein the second pitch context 504B comprises the pitch class=C, the octave=5, and the pitch curve 506B.
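
    The pitch class and octave of FIG. 5 follow directly from a MIDI note number under the common convention that middle C (MIDI 60) is C4. The helper below is a hypothetical illustration of that mapping, under which MIDI 76 yields E5 and MIDI 72 yields C5, matching the two pitch contexts above.

```python
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_pitch_context(note_number: int) -> tuple:
    """Map a MIDI note number to (pitch class, octave), with MIDI 60 as C4."""
    pitch_class = PITCH_CLASSES[note_number % 12]  # position within the chromatic scale
    octave = note_number // 12 - 1                 # octave number under the C4 convention
    return pitch_class, octave

print(midi_to_pitch_context(76))  # ('E', 5) -- first pitch context 504A
print(midi_to_pitch_context(72))  # ('C', 5) -- second pitch context 504B
```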

    (14) Referring to FIG. 6A, illustrated is an exemplary illustration of the first user interface 202, in accordance with an embodiment of the present disclosure. As shown, the first user interface 202 comprises two lists, i.e., a first list 202A for different musical styles and a second list 202B for different composers (of various eras or styles). Herein, the first user interface 202 is configured to receive a musical note event of the musical score for processing, via the processing arrangement 206, to determine one or more relevant music profile definitions therefor. As shown, the user selects “Romantic” as the style from the first list 202A and “Frederic Chopin” as the composer from the second list 202B of the first user interface 202. Correspondingly, based on the selected one or more relevant music profile definitions, the one or more parameters 300 associated therewith for generating the musical notations of the musical score may be determined.

    (15) Referring to FIG. 6B, illustrated is a second user interface 204, in accordance with one or more embodiments of the present disclosure. As shown, the second user interface 204 is configured to enable the user to select a particular performance style (or definition) for the received musical note event, wherein the selected performance style may be applied either to the received musical note event or to the entire musical score generated via the method 100 or the system 200. Alternatively stated, after recording of a musical performance, wherein the at least one user-defined profile matches with one or more relevant profiles (which are adjusted for the style and instrument in question), the user is asked to specify the manner in which certain aspects of the musical performance are to be interpreted. Herein, the second user interface 204 is configured to receive at least one user-defined music profile definition for the musical score, wherein the one or more relevant music profile definitions for the received musical note event of the musical score are determined, via the processing arrangement 206, based on the received at least one user-defined music profile definition for the musical score. Thus, the user may select or input the user-defined music profile definition based on which the one or more relevant music profile definitions may be determined for further generating the musical notations for the musical score.

    (16) Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “have”, “is” used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.