Apparatus and product of manufacture for generating a probability value for an event
11803753 · 2023-10-31
CPC classification
A61B5/374
HUMAN NECESSITIES
G16H50/20
PHYSICS
A61B5/7275
HUMAN NECESSITIES
Abstract
A method and system for generating a probability value for an event. The system includes a source for generating a plurality of digital input signals, a processor connected to the source to receive the plurality of digital input signals from the source, and a display connected to the processor for displaying a final output. Preferably, the method further includes validating the probability value.
Claims
1. A non-transitory computer-readable medium that stores a program that causes a processor to perform functions to generate a probability value for an event by executing the following steps: receiving a plurality of input signals at a first processor, the plurality of input signals generated from a source; submitting the plurality of input signals to a multilayer perceptron operating on the first processor to generate a plurality of raw scores; sorting at the first processor the plurality of raw scores by similar values; calibrating at the first processor a first half of the plurality of raw scores with similar values to generate a probability value that an event will occur; validating at the first processor the probability value that the event has occurred with actual data; validating at the first processor a second half of the plurality of raw scores with similar values at the multilayer perceptron to determine if the probability value is correct; and generating at the first processor a graph of the probability value versus time on a graphical display; wherein the plurality of input signals comprises at least one of a value for a fraudulent credit card transaction, a value for a monthly salary income for a loan applicant, a value for monthly rental income for the loan applicant, a value of collateral for a loan, a value for a monthly car payment for the loan applicant, or a value of a number of years employed for the loan applicant.
2. The non-transitory computer-readable medium according to claim 1 wherein the multilayer perceptron comprises a plurality of inputs, a plurality of hidden nodes and a single output.
3. A system for generating a probability value for an event, the system comprising: a source for generating a plurality of digital input signals; a processor connected to the source to receive the plurality of digital input signals from the source; and a graphical display connected to the processor for displaying a final output; wherein the plurality of digital input signals is submitted to a multilayer perceptron at the processor to generate a plurality of raw scores; wherein the processor is configured to sort the plurality of raw scores by similar values; wherein the processor is configured to calibrate a first half of the plurality of raw scores to generate a probability value that an event has occurred; wherein the processor is configured to validate the probability value that an event has occurred with actual data of an event; wherein the processor is configured to validate a second half of the plurality of raw scores with similar values from the multilayer perceptron to determine if the probability value is correct; wherein the processor is configured to generate a display of the probability value versus time on the graphical display; wherein the plurality of digital input signals comprises at least one of a value for a fraudulent credit card transaction, a value for a monthly salary income for a loan applicant, a value for monthly rental income for the loan applicant, a value of collateral for a loan, a value for a monthly car payment for the loan applicant, or a value of a number of years employed for the loan applicant.
4. The system according to claim 3 wherein the multilayer perceptron consists essentially of a plurality of inputs, a plurality of hidden nodes and a single output.
5. A non-transitory computer-readable medium that stores a program that causes a processor to perform functions to determine a probability value for an event by executing the following steps: generating a plurality of training set inputs from a machine comprising a source, a processor and a user-interface; submitting the plurality of training set inputs to a recognition algorithm to generate a raw score; calibrating the raw score to generate a probability value that an event will occur; validating a set to test; and generating probability values against data submitted for analysis.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
(1)-(16) [Brief descriptions of the several drawing views; figure text not reproduced.]
DETAILED DESCRIPTION OF THE INVENTION
(17) As shown in
(18) A general method 200 for generating a probability value is illustrated in the flow chart of
(19)
(20) In another example, the digital input signals from the source 70 are a value for a fraudulent credit card transaction, a value for a monthly salary income for a loan applicant, a value for monthly rental income for a loan applicant, a value of a collateral for a loan, a value for a monthly car payment for a loan applicant, or a value of a number of years employed for a loan applicant.
(21) Artificial neural networks (ANN) have been used to solve tasks in numerous fields that are hard to solve using ordinary rule-based programming. An ANN can learn and adapt through learning algorithms. ANN types and architectures vary, differing mainly in the learning method.
(22) The basic phases of an example algorithm 300 are shown in
(23) A multilayer perceptron (MLP) is a feed forward ANN.
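As an illustrative sketch (not part of the claimed subject matter), the forward pass of a feed-forward MLP with a plurality of inputs, a plurality of hidden nodes, and a single output can be written in Python with NumPy. The layer sizes, sigmoid hidden units, and small random initialization are assumptions for illustration only:

```python
import numpy as np

def mlp_forward(x, W_hidden, w_out, b_hidden=0.0, b_out=0.0):
    """Forward pass of a feed-forward MLP: inputs -> hidden nodes -> single raw-score output."""
    z = 1.0 / (1.0 + np.exp(-(W_hidden @ x + b_hidden)))  # sigmoid hidden units
    return float(w_out @ z + b_out)                       # single output (raw score)

rng = np.random.default_rng(0)
W_hidden = rng.uniform(-0.01, 0.01, size=(3, 4))  # 4 inputs, 3 hidden nodes (assumed sizes)
w_out = rng.uniform(-0.01, 0.01, size=3)
raw_score = mlp_forward(rng.normal(size=4), W_hidden, w_out)
```

The single returned value plays the role of one raw score in the claimed pipeline; a plurality of raw scores would come from applying the same forward pass to many input vectors.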
(24)
(25) Returning to
(26) Next, the other half of the raw score values is validated with the calibrated algorithm using the probability value. If the actual data for this second half of raw score values demonstrates that the probability value is correct, then the calibrated algorithm has been validated. However, if the validation fails, the process is repeated.
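The sort, calibrate-on-one-half, validate-on-the-other-half loop described above can be sketched as follows. The quantile-binning calibrator, the interleaved half split (so both halves span the score range), and the 0.1 agreement threshold are illustrative assumptions, not the patent's stated procedure:

```python
import numpy as np

def calibrate(scores, outcomes, n_bins=10):
    """Map raw scores to probabilities: the fraction of actual events per score bin."""
    edges = np.quantile(scores, np.linspace(0, 1, n_bins + 1))
    probs = {}
    for b in range(n_bins):
        in_bin = (scores >= edges[b]) & (scores <= edges[b + 1])
        probs[b] = outcomes[in_bin].mean() if in_bin.any() else 0.0
    return edges, probs

def apply_calibration(scores, edges, probs):
    """Look up the calibrated probability for each raw score."""
    bins = np.clip(np.searchsorted(edges, scores, side="right") - 1, 0, len(probs) - 1)
    return np.array([probs[b] for b in bins])

rng = np.random.default_rng(1)
scores = np.sort(rng.uniform(size=200))        # sorted raw scores: similar values adjacent
outcomes = rng.uniform(size=200) < scores      # synthetic "actual data" for events
first, second = slice(0, None, 2), slice(1, None, 2)  # one half calibrates, one validates
edges, probs = calibrate(scores[first], outcomes[first])
p_valid = apply_calibration(scores[second], edges, probs)
validated = abs(p_valid.mean() - outcomes[second].mean()) < 0.1  # else repeat the process
```

If `validated` is false, the calibration step would be repeated, mirroring the description above.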
(27) In classification, the task is to classify a variable y=x.sub.0, called the class variable or output, given a set of variables x=x.sub.1 . . . x.sub.n, called attribute variables or inputs. A classifier h: x→y is a function that maps an instance of x to a value of y. Let U={x.sub.1, . . . , x.sub.n}, n≥1, be a set of variables. The classifier is learned from a dataset d consisting of samples over (x, y), and the learning task consists of finding an appropriate Bayesian network given the dataset d over U.
(28) In an example for a loan application, there are two classes: low-risk and high-risk applicants. To find out whether an applicant may default on the loan, a probability P(Y|X) is calculated, where X is the input, such as salary income, and Y is 0 or 1 to indicate low-risk or high-risk, respectively. For a given X=x, if P(Y=1|X=x)=0.9, the probability is 90 percent that the applicant is high-risk.
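The loan-risk decision above reduces to thresholding P(Y=1|X=x). A minimal sketch, where the 0.5 cutoff is an assumed decision rule rather than one stated in the patent:

```python
def classify_loan_risk(p_high_risk, threshold=0.5):
    """Label an applicant from P(Y=1 | X=x); the 0.5 cutoff is an assumption."""
    return "high-risk" if p_high_risk >= threshold else "low-risk"

# For the example above, P(Y=1 | X=x) = 0.9, i.e. a 90 percent chance of high risk.
label = classify_loan_risk(0.9)  # -> "high-risk"
```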
(29) A perceptron models a biological neuron as a mathematical function,

(30) y=Σ.sub.j=1.sup.d w.sub.jx.sub.j+w.sub.0

(31) where the weighted sum, y, of the input values x.sub.j∈ℝ, j=1, . . . , d, is calculated. The weights are w.sub.j∈ℝ, and w.sub.0 is a bias weight (corresponding to the j=0 terms, with x.sub.0=1, in the training algorithms below).
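The perceptron's weighted sum can be computed directly; the sample inputs and weights here are arbitrary illustration values:

```python
import numpy as np

def perceptron_output(x, w, w0=0.0):
    """Weighted sum y = sum_j w_j * x_j + w_0, the perceptron model of a neuron."""
    return float(np.dot(w, x) + w0)

# Example: y = 0.5*1.0 + (-0.25)*2.0 + 0.1 = 0.1
y = perceptron_output(np.array([1.0, 2.0]), np.array([0.5, -0.25]), w0=0.1)
```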
(32) The following is a Perceptron Training Algorithm for training a MLP with K outputs.
(33)
For i = 1, . . . , K
    For j = 0, . . . , d
        w.sub.ij ← rand(−0.01, 0.01)
Repeat
    For all (x.sup.t, r.sup.t) ∈ X in random order
        For i = 1, . . . , K
            o.sub.i ← 0
            For j = 0, . . . , d
                o.sub.i ← o.sub.i + w.sub.ijx.sup.t.sub.j
        For i = 1, . . . , K
            y.sub.i ← exp(o.sub.i)/Σ.sub.k exp(o.sub.k)
        For i = 1, . . . , K
            For j = 0, . . . , d
                w.sub.ij ← w.sub.ij + η(r.sup.t.sub.i − y.sub.i)x.sup.t.sub.j
(34) Until convergence
(35) Where η is the learning factor.
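The training loop above can be rendered in Python with NumPy. The fixed epoch count standing in for "until convergence," the learning factor value, and the tiny two-class dataset are assumptions for illustration:

```python
import numpy as np

def train_perceptron(X, R, eta=0.1, epochs=50, seed=0):
    """Online perceptron training with a softmax output layer and K outputs.

    X: (N, d) inputs; R: (N, K) one-hot targets. A bias input x_0 = 1 is appended,
    so the weight matrix W covers j = 0, ..., d as in the algorithm above."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    K = R.shape[1]
    Xb = np.hstack([np.ones((N, 1)), X])           # x_0 = 1 carries the bias w_i0
    W = rng.uniform(-0.01, 0.01, size=(K, d + 1))  # w_ij <- rand(-0.01, 0.01)
    for _ in range(epochs):                        # stands in for "until convergence"
        for t in rng.permutation(N):               # all (x^t, r^t) in random order
            o = W @ Xb[t]                          # o_i = sum_j w_ij x_j^t
            y = np.exp(o - o.max()); y /= y.sum()  # softmax: y_i = exp(o_i)/sum_k exp(o_k)
            W += eta * np.outer(R[t] - y, Xb[t])   # w_ij <- w_ij + eta (r_i^t - y_i) x_j^t
    return W

# Tiny linearly separable example (assumed data): class is the first feature
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
R = np.eye(2)[[0, 0, 1, 1]]
W = train_perceptron(X, R)
pred = np.argmax(W @ np.hstack([np.ones((4, 1)), X]).T, axis=0)
```

Subtracting `o.max()` before exponentiating is a standard numerical-stability step; it leaves the softmax values unchanged.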
(36) The following is a Backpropagation Algorithm for training a MLP with K outputs.
(37) Initialize all v.sub.ih and w.sub.hj to rand(−0.01,0.01)
(38)
Repeat
    For all (x.sup.t, r.sup.t) ∈ X in random order
        For h = 1, . . . , H
            z.sub.h ← sigmoid(w.sup.T.sub.hx.sup.t)
        For i = 1, . . . , K
            y.sub.i = v.sup.T.sub.iz
        For i = 1, . . . , K
            Δv.sub.i = η(r.sup.t.sub.i − y.sup.t.sub.i)z
        For h = 1, . . . , H
            Δw.sub.h = η(Σ.sub.i(r.sup.t.sub.i − y.sup.t.sub.i)v.sub.ih)z.sub.h(1 − z.sub.h)x.sup.t
        For i = 1, . . . , K
            v.sub.i ← v.sub.i + Δv.sub.i
        For h = 1, . . . , H
            w.sub.h ← w.sub.h + Δw.sub.h
(39) Until convergence.
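The backpropagation algorithm above can likewise be sketched in Python with NumPy. The hidden-layer size, learning factor, fixed epoch count standing in for "until convergence," and the XOR dataset are assumptions for illustration:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def train_backprop(X, R, H=4, eta=0.3, epochs=2000, seed=0):
    """Backpropagation for an MLP with H sigmoid hidden units and K linear outputs.

    X: (N, d) inputs; R: (N, K) targets. Bias terms use x_0 = z_0 = 1."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    K = R.shape[1]
    Xb = np.hstack([np.ones((N, 1)), X])
    Wh = rng.uniform(-0.01, 0.01, size=(H, d + 1))   # hidden weights w_h
    V = rng.uniform(-0.01, 0.01, size=(K, H + 1))    # output weights v_i
    for _ in range(epochs):                          # stands in for "until convergence"
        for t in rng.permutation(N):                 # all (x^t, r^t) in random order
            z = np.concatenate([[1.0], sigmoid(Wh @ Xb[t])])  # z_h = sigmoid(w_h^T x^t)
            y = V @ z                                          # y_i = v_i^T z
            err = R[t] - y
            dV = eta * np.outer(err, z)                        # delta v_i
            dWh = eta * np.outer((err @ V[:, 1:]) * z[1:] * (1 - z[1:]), Xb[t])  # delta w_h
            V += dV                                            # apply both updates
            Wh += dWh
    return Wh, V

# XOR, the classic non-linearly-separable test case (assumed data)
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
R = np.array([[0.0], [1.0], [1.0], [0.0]])
Wh, V = train_backprop(X, R)
Z = np.hstack([np.ones((4, 1)), sigmoid(np.hstack([np.ones((4, 1)), X]) @ Wh.T)])
mse = float(((Z @ V.T - R) ** 2).mean())
```

Both weight updates are computed before either is applied, matching the pseudocode's use of the pre-update v.sub.ih in the hidden-layer delta.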
(40)
(41) The EEG is optimized for automated artifact filtering. The EEG recordings are then processed using neural network algorithms to generate a processed EEG recording which is analyzed for display.
(42) An additional description of analyzing EEG recordings is set forth in Wilson et al., U.S. patent application Ser. No. 13/620,855, filed on Sep. 15, 2012, for a Method And System For Analyzing An EEG Recording, which is hereby incorporated by reference in its entirety.
(43) A patient has a plurality of electrodes attached to the patient's head, with wires from the electrodes connected to an amplifier for amplifying the signal to a processor, which is used to analyze the signals from the electrodes and create an EEG recording. The brain produces different signals at different points on a patient's head. Multiple electrodes are positioned on a patient's head. The CZ site is in the center. The number of electrodes determines the number of channels for an EEG. A greater number of channels produces a more detailed representation of a patient's brain activity. Preferably, each amplifier 42 of an EEG machine component 40 corresponds to two electrodes 35 attached to a head of the patient 15. The output from an EEG machine component 40 is the difference in electrical activity detected by the two electrodes. The placement of each electrode is critical for an EEG report, since the closer the electrode pairs are to each other, the smaller the difference in the brainwaves recorded by the EEG machine component 40.
(44) Algorithms for removing artifact from EEG typically use Blind Source Separation (BSS) algorithms like CCA (canonical correlation analysis) and ICA (Independent Component Analysis) to transform the signals from a set of channels into a set of component waves or “sources.”
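The transform-remove-reassemble structure of BSS-based artifact removal can be sketched generically. The precomputed unmixing matrix and the synthetic recording are assumptions for illustration; estimating the unmixing matrix itself (via CCA or ICA) is outside this sketch:

```python
import numpy as np

def remove_artifact_sources(channels, unmix, is_artifact):
    """Project channels into source space, zero artifact sources, reassemble channels.

    channels: (n_channels, n_samples) recording; unmix: (n_sources, n_channels)
    unmixing matrix (e.g., from CCA or ICA); is_artifact: boolean mask over sources."""
    sources = unmix @ channels                 # channels -> component "sources"
    sources[np.asarray(is_artifact)] = 0.0     # drop sources judged as artifact
    return np.linalg.inv(unmix) @ sources      # reassemble the channel set

rng = np.random.default_rng(2)
eeg = rng.normal(size=(4, 512))                 # 4 channels, 512 samples (synthetic)
unmix = np.linalg.inv(rng.normal(size=(4, 4)))  # assumed, precomputed unmixing matrix
cleaned = remove_artifact_sources(eeg, unmix, [False, True, False, False])
```

With an empty artifact mask the round trip reproduces the original channels, which is a useful sanity check when wiring in a real BSS front end.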
(45) In one example, an algorithm called BSS-CCA is used to remove the effects of muscle activity from the EEG. Using the algorithm on the recorded montage will frequently not produce optimal results; in this case it is generally better to use a montage where the reference electrode is one of the vertex electrodes, such as CZ in the international 10-20 standard. In this algorithm, the recorded montage is first transformed into a CZ reference montage prior to artifact removal. If the signal at CZ indicates that it is not the best choice, the algorithm goes down a list of possible reference electrodes to find one that is suitable.
(46) An additional description of analyzing EEG recordings is set forth in Wilson et al., U.S. patent application Ser. No. 13/684,469, filed on Nov. 23, 2012, for a User Interface For Artifact Removal In An EEG, which is hereby incorporated by reference in its entirety. An additional description of analyzing EEG recordings is set forth in Wilson et al., U.S. patent application Ser. No. 13/684,556, filed on Nov. 25, 2012, for a Method And System For Detecting And Removing EEG Artifacts, which is hereby incorporated by reference in its entirety.
(47)
(48) Also shown in
(49) Rhythmicity spectrograms allow one to see the evolution of seizures in a single image. The rhythmicity spectrogram measures the amount of rhythmicity present at each frequency in an EEG record.
(50) The seizure probability trend shows a calculated probability of seizure activity over time. It shows the duration of detected seizures and also suggests areas of the record that may fall below the seizure detection cutoff but are still of interest for review. Displayed along with other trends, the seizure probability trend provides a comprehensive view of quantitative changes in an EEG.
(51) As shown in
(52) A patient has a plurality of electrodes attached to the patient's head with wires from the electrodes connected to an amplifier for amplifying the signal to a processor, which is used to analyze the signals from the electrodes and create an EEG recording. The brain produces different signals at different points on a patient's head. Multiple electrodes are positioned on a patient's head as shown in
(53) Algorithms for removing artifact from EEG typically use Blind Source Separation (BSS) algorithms like CCA (canonical correlation analysis) and ICA (Independent Component Analysis) to transform the signals from a set of channels into a set of component waves or “sources.” The sources that are judged as containing artifact are removed and the rest of the sources are reassembled into the channel set.
(54)
(55)
(56) From the foregoing, it is believed that those skilled in the pertinent art will recognize the meritorious advancement of this invention and will readily understand that, while the present invention has been described in association with a preferred embodiment thereof and other embodiments illustrated in the accompanying drawings, numerous changes, modifications and substitutions of equivalents may be made therein without departing from the spirit and scope of this invention, which is intended to be unlimited by the foregoing except as may appear in the following appended claims. Therefore, the embodiments of the invention in which an exclusive property or privilege is claimed are defined in the following appended claims.