Apparatus and method for producing and streaming music generated from plants

10909956 · 2021-02-02

    Abstract

    A method for producing and streaming music generated from plants. Plant microfluctuations are converted to MIDI notes and corresponding continuous-controller (CC) messages, which are mapped to a unique signal chain of virtual instruments and effects to produce musical notes output through the speakers of the apparatus or through a linked portable electronic device.

    Claims

    1. A method and apparatus for generating music from microfluctuations in a plant comprising: a MIDI plant device for measuring plant microfluctuations; and said measured plant microfluctuations converted into MIDI note messages; and said measured plant microfluctuations converted into continuous control messages; and said MIDI notes and continuous control messages sent to a portable electronic device; and software in said portable electronic device processes MIDI notes by applying: timing; and scale; and transposition; and arpeggiation; and said software uses virtual instruments to output musical tones; and said software uses synthesized musical effects based on said continuous control messages with said virtual instruments to apply musical effects to said musical tones; wherein measured plant microfluctuations are converted into musical tones in a scale played on virtual instruments, with musical effects.

    2. The apparatus of claim 1 further comprising: said plant device for measuring changes in plant microfluctuations and converting them into MIDI note and control messages over time; and said software processes the MIDI notes into sound and assigns musical effects to said tones that are modulated by said control messages; wherein plant microfluctuation changes over time determine the output of musical effects.

    3. A method for generating music from measured plant microfluctuations, employing the apparatus of claim 1, the method comprising: measuring plant microfluctuations; and converting said measured plant microfluctuations to MIDI notes; and converting said measured plant microfluctuations to continuous control messages; and processing MIDI notes into sounds through virtual instruments; and assigning musical effects to said sounds that are modulated by continuous control messages; wherein music is created by software musical instruments playing MIDI notes and musical effects as derived from plant microfluctuations.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    (1) FIG. 1 is an illustration of an example embodiment;

    (2) FIG. 2 illustrates a second embodiment;

    (3) FIG. 3 illustrates in detail the signal-processing functions of FIGS. 1 and 2;

    (4) FIG. 4 shows an example graphical user interface of an example embodiment.

    (5) These embodiments are understood to be non-exclusive and interchangeable.

    DESCRIPTION

    (6) In FIG. 1, example embodiment 100: MIDI plant device 110 applies a 3.3-volt signal via electrodes 112 to the leaves of a plant 114. Microfluctuations 116 from the plant are sent through the same electrodes to the MIDI plant device 110. MIDI plant device 110 graphs these fluctuations as waves or data patterns and translates these data patterns into MIDI note and control messages 118.

    (7) Open-source firmware in the MIDI plant device 110 dictates the creation of notes when variations in electrical conductivity are sensed in the plant. Each note produced is proportional to the difference in conductivity between a baseline and a measured change event. When conductivity goes up, notes go up in a scale, and when conductivity goes down, notes go down in a scale. In addition to generating MIDI note values from the waves of plant microfluctuations, derivatives of plant waves are applied to create MIDI control values. Because this output is controlled by a derivative, the generated control values correspond to larger shifts in a plant's electric activity, adding dimension to the ongoing microfluctuations that drive the creation of notes as described above.
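
    A minimal sketch of this mapping is shown below, assuming a hypothetical conductivity reading and a pentatonic scale; the sensitivity and scaling factors are illustrative assumptions, not values from the embodiment, and the mido Python library is used only to construct the resulting MIDI messages.

        import mido

        PENTATONIC = [0, 2, 4, 7, 9]   # scale degrees as semitone offsets (assumed scale)
        BASE_NOTE = 60                 # baseline pitch, middle C (assumed)

        def delta_to_note(baseline, reading):
            """Map a conductivity change event to a note in the scale: positive
            deltas step up the scale, negative deltas step down."""
            steps = int((reading - baseline) * 10)        # sensitivity factor (assumed)
            octave, degree = divmod(steps, len(PENTATONIC))
            return max(0, min(127, BASE_NOTE + 12 * octave + PENTATONIC[degree]))

        def derivative_to_cc(prev, curr, dt):
            """Map the rate of change (derivative) of conductivity to a 0-127 CC
            value, so only larger shifts in electric activity register."""
            rate = abs(curr - prev) / dt
            return min(127, int(rate * 100))              # scaling factor (assumed)

        note_msg = mido.Message('note_on', note=delta_to_note(1.00, 1.23), velocity=90)
        cc_msg = mido.Message('control_change', control=1,
                              value=derivative_to_cc(1.00, 1.23, 0.1))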

    (8) That information is received by a portable electronic device 120 via a wired connection, Bluetooth, or Wi-Fi. Ongoing microfluctuations continue to drive the creation of notes; software 122 on the device continually analyzes the MIDI information, and the process described above loops to continually produce musical tones.
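
    A minimal sketch of this receive-and-process loop, assuming the MIDI plant device appears as a standard MIDI input port; the port name and the two handler functions are hypothetical stand-ins for the software's instrument and effects stages:

        import mido

        def play_note(msg):                    # stub: hand off to virtual instruments
            print('note', msg.note, msg.velocity)

        def update_effect(control, value):     # stub: modulate an audio effect
            print('cc', control, value)

        with mido.open_input('MIDI Plant Device') as port:   # hypothetical port name
            for msg in port:                   # blocks, yielding messages as they arrive
                if msg.type == 'note_on':
                    play_note(msg)
                elif msg.type == 'control_change':
                    update_effect(msg.control, msg.value)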

    (9) The software 122 controls what virtual instruments are played, as well as the texture of those instruments as controlled by effects. Resulting musical tones are delivered through the device's speakers or headphones 128.

    (10) Referring to FIG. 2, in example embodiment 200, MIDI plant device 210 applies a 3.3-volt signal via electrodes 212 to the leaves of a plant 214. Microfluctuations 216 from the plant are sent through the same electrodes to the MIDI plant device 210. MIDI plant device 210 graphs these fluctuations as waves or data patterns and translates these data patterns into MIDI note and control messages 218. Open-source firmware in the MIDI plant device dictates the creation of notes when variations in electrical conductivity are sensed in the plant. Each note produced is proportional to the difference in conductivity between a baseline and a measured change event. When conductivity goes up, notes go up in a scale, and when conductivity goes down, notes go down in a scale.

    (11) In addition to generating MIDI note values from the waves of plant microfluctuations, derivatives of plant waves are applied to create MIDI control values. Because this output is controlled by a derivative, the generated control values correspond to larger shifts in a plant's electric activity, adding dimension to the ongoing microfluctuations that drive the creation of notes as described above.

    (12) That information 232 is received by a portable electronic device 220. Software 222 on the device analyzes the MIDI information and employs a specific algorithm to apply MIDI processing, virtual instruments, and audio effects. Ongoing microfluctuations continue to drive the creation of notes; software 222 on the device continually analyzes the MIDI information, and the process described above loops to continually produce musical tones.

    (13) The software 222 controls which virtual instruments are played, as well as the texture of those instruments as controlled by effects.

    (14) Resulting MIDI note values may be sent via an Internet connection to the embodiment's server 230. Users connect to the server to send their own MIDI information or to stream other users' MIDI information: the embodiment's software, loaded onto each user's device, connects to the server 230 and streams MIDI information 232 to or from it.
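
    As a sketch, the upstream half of this exchange could look like the following; the server address and the raw-bytes wire format are assumptions, since the embodiment does not specify a transport protocol:

        import socket
        import mido

        SERVER = ('plantmusic.example.com', 5004)   # hypothetical server address

        with socket.create_connection(SERVER) as sock, \
             mido.open_input('MIDI Plant Device') as port:   # hypothetical port name
            for msg in port:
                sock.sendall(bytes(msg.bytes()))    # stream raw MIDI bytes per message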

    (15) On receiving MIDI from the server, software 236 on the user's device 234 analyzes the MIDI information and employs a specific algorithm to apply virtual instruments and audio effects to produce musical tones 238. As ongoing microfluctuations drive the ongoing creation of tones, the software 236 controls which virtual instruments are played, as well as the texture of those instruments as controlled by effects.

    (16) Resulting musical tones 238 are delivered through the device's speakers or headphones 240.

    (17) FIG. 3, 300 illustrates in detail the software method's MIDI and sound-engine processes. A MIDI processor 322 applies the functions of clock, scaler, arpeggiator, note wrapping, and transposition to MIDI 320 derived from plant data 318. A master clock determines tempo in beats per minute or in samples per second. A MIDI bus takes the MIDI note and continuous-controller (CC) messages from the algorithm and busses them to multiple MIDI channels. MIDI instruments process the MIDI thus generated 324 into audio 326. Audio effects 328, which include reverb, delay, bit-crushing, filters, resonators, gain, and equalizers, are added and modulated by the CC messages derived from the plant data. The resulting output is sent in the form of musical tones 330 to a device or speaker 332.
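
    A minimal sketch of the MIDI-processor stage (scaler, transposition, note wrapping, and arpeggiation); the scale, note range, arpeggio pattern, and tempo are illustrative assumptions:

        MAJOR = [0, 2, 4, 5, 7, 9, 11]      # scale degrees as semitone offsets (assumed)

        def scale_quantize(note, scale=MAJOR, root=60):
            """Snap a note to the nearest pitch in the scale (the 'scaler')."""
            degree = (note - root) % 12
            nearest = min(scale, key=lambda d: abs(d - degree))
            return note - degree + nearest

        def transpose(note, semitones=0):
            """Shift a note by a fixed number of semitones."""
            return note + semitones

        def wrap(note, low=48, high=84):
            """Fold out-of-range notes back into the playable range ('note wrapping')."""
            while note < low:
                note += 12
            while note > high:
                note -= 12
            return note

        def arpeggiate(note, pattern=(0, 4, 7)):
            """Expand one note into a simple arpeggio over the given intervals."""
            return [note + i for i in pattern]

        seconds_per_beat = 60.0 / 96        # master clock at 96 BPM (assumed tempo)
        processed = [wrap(transpose(scale_quantize(n), 2)) for n in (61, 66, 73)]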

    (18) FIG. 4, 400 shows an example graphical user interface (GUI) of the embodiment. The GUI allows the user to adjust and customize the sound. It provides buttons for saving and loading preset audio configurations 470 and allows the user to edit parameters of the audio effects 476; edit parameters of the MIDI processor 472; or add audio for sample-based instruments 474.
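
    A minimal sketch of the preset save/load behavior behind buttons 470; the JSON schema, parameter names, and sample file are assumptions for illustration:

        import json

        preset = {
            "midi_processor": {"tempo_bpm": 96, "scale": "pentatonic", "transpose": 0},
            "effects": {"reverb": 0.4, "delay": 0.25, "filter_cutoff": 0.7},
            "samples": ["kalimba.wav"],      # hypothetical user-added sample
        }

        def save_preset(path, data):
            """Write an audio configuration to disk as JSON."""
            with open(path, "w") as f:
                json.dump(data, f, indent=2)

        def load_preset(path):
            """Read a previously saved audio configuration from disk."""
            with open(path) as f:
                return json.load(f)

        save_preset("garden.json", preset)
        assert load_preset("garden.json") == preset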