Method for producing and streaming music generated from biofeedback
10636400 · 2020-04-28
Inventors
CPC classification
A61B5/6801
HUMAN NECESSITIES
G10H2220/101
PHYSICS
A61B5/7264
HUMAN NECESSITIES
G10H2220/371
PHYSICS
A61B5/02438
HUMAN NECESSITIES
G10H1/0025
PHYSICS
G10H2220/106
PHYSICS
G16H50/70
PHYSICS
International classification
A61B5/00
HUMAN NECESSITIES
Abstract
A method for producing and streaming music generated from biofeedback. The method employs machine-readable instructions that transform signals received from a wearable biofeedback device into musical tones, which are played on a portable electronic device and/or shared with others. The varying musical tones produced by the method can be modulated by the user to reach a targeted emotional state.
Claims
1. A method for generating music from biofeedback, comprising:
    receiving physiological signals from a biofeedback device worn by a person;
    transforming said received signals into MIDI note values by applying an algorithm comprising: measuring a first interbeat interval; measuring a second interbeat interval; determining the difference between said first and second interbeat intervals; applying a MOD function to said difference; and assigning the result of said MOD function to a MIDI note value;
    translating said received signals into MIDI continuous-control values by applying an algorithm comprising: calculating a point value by dividing 128 by a target number; measuring a current value from said received signals; and multiplying said point value by said current value to obtain a MIDI continuous-control value;
    determining tempo in beats per minute by a master clock;
    sending at least one MIDI note value through a first MIDI channel;
    sending at least one MIDI continuous-control value through a second MIDI channel;
    applying MIDI effects to each of said first and second MIDI channels;
    applying MIDI instruments to said applied MIDI effects of said first and second MIDI channels;
    mixing audio produced by said MIDI instruments; and
    outputting said mixed audio through a speaker.
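The two claim-1 algorithms can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the assumption that interbeat intervals arrive in milliseconds, and the base note of middle C (60) are all hypothetical choices.

```python
def ibi_to_midi_note(first_ibi_ms, second_ibi_ms, mod=12, base_note=60):
    """Map the difference between two interbeat intervals to a MIDI note.

    The absolute difference is reduced with a MOD function, then offset
    from an assumed base note (middle C = 60 here).
    """
    difference = abs(second_ibi_ms - first_ibi_ms)
    return base_note + (difference % mod)

def value_to_midi_cc(current_value, target):
    """Scale a measured value into the 0-127 MIDI continuous-control range.

    point_value = 128 / target, as recited in the claim; the result is
    clamped to the legal controller range.
    """
    point_value = 128.0 / target
    return min(127, int(point_value * current_value))

print(ibi_to_midi_note(820, 825))   # IBI difference of 5 ms -> note 65
print(value_to_midi_cc(72, 120))    # heart rate 72 bpm against a target of 120 -> 76
```

Because the MOD function bounds the note offset and the point-value scaling bounds the controller value, arbitrary physiological input always maps into legal MIDI ranges.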
2. The method of claim 1 further comprising: sending MIDI note values and MIDI continuous-control values to a MIDI processor; and controlling pitch, timing and timbre qualities of digital instruments assigned to said MIDI note values and MIDI continuous-control values.
3. The method of claim 1 further comprising: applying a MOD-12 function to said difference between said first and second interbeat intervals; and assigning a MIDI note value from a 12-note scale to the result of said MOD-12 function; wherein notes selected from a 12-note scale are sent through said first MIDI channel.
4. The method of claim 1 further comprising: applying a MOD-36 function to said difference between said first and second interbeat intervals; and assigning at least one MIDI note value from a 36-note scale to the result of said MOD-36 function; wherein notes selected from a 36-note scale are sent through said first MIDI channel.
5. The method of claim 1 further comprising: applying a MOD-8 function to said difference between said first and second interbeat intervals; and assigning at least one MIDI note value from an 8-note scale to the result of said MOD-8 function; wherein notes selected from an 8-note scale are sent through said first MIDI channel.
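Claims 3-5 differ only in the MOD divisor, which selects how many notes of a stored scale the interbeat-interval difference can address. A hedged sketch, in which the specific scales (chromatic, three-octave chromatic, C-major octave) are illustrative assumptions rather than scales recited in the patent:

```python
CHROMATIC_12 = list(range(60, 72))              # 12-note scale (claim 3)
THREE_OCTAVE_36 = list(range(48, 84))           # 36-note scale (claim 4)
C_MAJOR_8 = [60, 62, 64, 65, 67, 69, 71, 72]    # 8-note scale (claim 5)

def difference_to_scale_note(ibi_difference, scale):
    """Apply MOD-len(scale) to the IBI difference and index into the scale."""
    return scale[ibi_difference % len(scale)]

print(difference_to_scale_note(14, CHROMATIC_12))  # 14 % 12 = 2 -> note 62
print(difference_to_scale_note(14, C_MAJOR_8))     # 14 % 8 = 6 -> note 71
```

The same physiological difference thus yields different melodic material depending on which scale variant is active.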
6. The method of claim 1 further comprising: creating an audio master from said MIDI instruments applied to said MIDI effects; and mixing said audio master; wherein the mixed audio master is output through a speaker.
7. The method of claim 1 further comprising: depicting parameters of audio effects, MIDI processor settings, and sample-based instruments in a graphical user interface; and allowing a user to adjust audio effects and MIDI processor settings, to choose sample-based instruments, and to mix the sound that is output to a speaker.
8. The method of claim 1 further comprising: applying said MIDI effects to each of said first and second MIDI channels by: scaling MIDI note values; transposing MIDI note values; and arpeggiating MIDI note values.
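The three claim-8 MIDI effects can be sketched as simple note-value transforms. This is an illustrative composition only; the C-major pitch-class set and the major-arpeggio pattern are assumptions, not details from the patent.

```python
def scale_to_key(note, allowed=(0, 2, 4, 5, 7, 9, 11)):
    """Snap a note down to the nearest pitch class in the allowed set (C major here)."""
    octave, pitch_class = divmod(note, 12)
    snapped = max(pc for pc in allowed if pc <= pitch_class)
    return octave * 12 + snapped

def transpose(note, semitones):
    """Shift a note by a fixed number of semitones, clamped to the 0-127 range."""
    return max(0, min(127, note + semitones))

def arpeggiate(note, pattern=(0, 4, 7, 12)):
    """Expand one held note into a broken-chord sequence (major arpeggio here)."""
    return [transpose(note, offset) for offset in pattern]

# Chain the effects: transpose, snap to key, then arpeggiate.
print(arpeggiate(scale_to_key(transpose(61, 2))))  # [62, 66, 69, 74]
```

Chaining the effects in this order guarantees the arpeggiated output stays within the chosen key even when the raw biofeedback-derived note does not.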
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The references below are made to assist those of skill in the art in making and using the disclosed method.
(7) These embodiments are understood to be non-exclusive and interchangeable.
DESCRIPTION
(10) The MIDI information, along with MIDI processors, MIDI instruments, and audio effects, is then loaded onto any number of other users' devices 224. The embodiment's software, which any number of users have loaded onto their devices, connects to the server 220, streams the MIDI information 219, and processes it through the MIDI processor. The resulting MIDI is used to control the MIDI instruments. The resulting audio, in the form of musical tones 218, is then amplified through these devices or through speakers 220 paired with the devices.
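The server-to-devices streaming step in paragraph (10) can be modeled in-process as a simple fan-out: the server relays each MIDI message to every subscribed device, and each device applies its own local MIDI-processor settings before synthesis. All class and method names below are hypothetical illustrations, not the patent's implementation.

```python
class StreamingServer:
    """Fans each incoming MIDI message out to every connected device."""
    def __init__(self):
        self.subscribers = []

    def connect(self, device):
        self.subscribers.append(device)

    def stream(self, midi_message):
        for device in self.subscribers:
            device.receive(midi_message)

class UserDevice:
    """Applies a local MIDI-processor setting (here, transposition) to each message."""
    def __init__(self, transpose_semitones=0):
        self.transpose_semitones = transpose_semitones
        self.played = []

    def receive(self, midi_message):
        status, note, velocity = midi_message
        self.played.append((status, note + self.transpose_semitones, velocity))

server = StreamingServer()
listener = UserDevice(transpose_semitones=12)
server.connect(listener)
server.stream((0x90, 60, 100))      # note-on, middle C, velocity 100
print(listener.played)              # [(144, 72, 100)]
```

Because processing happens on each receiving device, every listener can hear the same biofeedback stream rendered through different instruments and effects.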