CREATION DEVICE, CREATION METHOD, AND PROGRAM
20210232861 · 2021-07-29
Assignee
Inventors
CPC classification
G06F18/214 (PHYSICS)
G06F16/00 (PHYSICS)
G06F16/28 (PHYSICS)
International classification
Abstract
A learning section (13) learns a classification criterion of a classifier at each time point using labeled learning data collected until a past prescribed time point and unlabeled learning data collected on and after the prescribed time point and learns a time-series change of the classification criterion. A classifier creation section (14) predicts a classification criterion of the classifier at an arbitrary time point including a future time point and certainty expressing the reliability of the classification criterion using the learned classification criterion and the time-series change. Thus, the classifier that outputs a label expressing an attribute of input data is created.
Claims
1. A creation device for creating a classifier that outputs a label expressing an attribute of input data, the creation device comprising: a classifier learning section that learns a classification criterion of the classifier at each time point using labeled data collected until a past prescribed time point and unlabeled data collected on and after the prescribed time point as learning data; a time-series change learning section that learns a time-series change of the classification criterion; and a prediction section that predicts a classification criterion of the classifier at an arbitrary time point including a future time point and reliability of the classification criterion using the learned classification criterion and the time-series change.
2. The creation device according to claim 1, wherein the data is data in which a discrete time interval is nonuniform.
3. The creation device according to claim 1, wherein the time-series change learning section learns the time-series change in parallel with the learning of the classification criterion by the classifier learning section.
4. The creation device according to claim 1, wherein the time-series change learning section learns the time-series change after the learning of the classification criterion by the classifier learning section.
5. A creation method performed by a creation device for creating a classifier that outputs a label expressing an attribute of input data, the creation method comprising: a classifier learning step of learning a classification criterion of the classifier at each time point using labeled data collected until a past prescribed time point and unlabeled data collected on and after the prescribed time point as learning data; a time-series change learning step of learning a time-series change of the classification criterion; and a prediction step of predicting a classification criterion of the classifier at an arbitrary time point including a future time point and reliability of the classification criterion using the learned classification criterion and the time-series change.
6. A non-transitory computer readable medium storing a creation program which causes a computer to perform: a classifier learning step of learning a classification criterion of a classifier at each time point using labeled data collected until a past prescribed time point and unlabeled data collected on and after the prescribed time point as learning data; a time-series change learning step of learning a time-series change of the classification criterion; and a prediction step of predicting a classification criterion of the classifier at an arbitrary time point including a future time point and reliability of the classification criterion using the learned classification criterion and the time-series change.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0015]
[0016]
[0017]
[0018]
[0019]
[0020]
[0021]
DESCRIPTION OF EMBODIMENTS
First Embodiment
[0022] Hereinafter, an embodiment of the present invention will be illustrated in detail with reference to the drawings. Note that the present invention is not limited to the embodiment. Further, the same portions will be denoted by the same reference signs in the description of the drawings.
[0023] [Configuration of Creation Device]
[0024] First, the schematic configuration of a creation device according to the present embodiment will be described with reference to
[0025] Note that as shown in
[0026] [Creation Unit]
[0027] The creation unit 10 has a learning data input section 11, a data conversion section 12, a learning section 13, a classifier creation section 14, and a classifier storage section 15.
[0028] The learning data input section 11 is realized by an input device such as a keyboard and a mouse and inputs various instruction information to a control unit in response to an input operation by an operator. In the present embodiment, the learning data input section 11 receives labeled learning data and unlabeled learning data that are to be used in creation processing.
[0029] Here, the labeled learning data represents learning data that is assigned a label expressing the attribute of the data. For example, when learning data is text, a label such as politics, economy, and sports expressing the content of the text is assigned. Further, the unlabeled learning data represents learning data that is not assigned a label.
[0030] Further, the labeled learning data and the unlabeled learning data are assigned time information. For example, when learning data is text, the time information represents a date and time or the like at which the text was published. In the present embodiment, a plurality of labeled learning data and a plurality of unlabeled learning data that are assigned past different time information up to the present are received.
[0031] Note that the labeled learning data may be input from an external server device or the like to the creation unit 10 via a communication control unit (not shown) realized by a NIC (Network Interface Card) or the like.
[0032] The control unit is realized by a CPU (Central Processing Unit) or the like that performs a processing program and functions as the data conversion section 12, the learning section 13, and the classifier creation section 14.
[0033] The data conversion section 12 converts received labeled learning data into the data of a combination of a collection time, a feature vector, and a numeric value label as preparation for processing by the learning section 13 that will be described later. Further, the data conversion section 12 converts unlabeled learning data into the data of a combination of a collection time and a feature vector. The labeled learning data and the unlabeled learning data in the following processing by the creation unit 10 represent data after being converted by the data conversion section 12.
[0034] Here, the numeric value label is obtained by converting the label assigned to labeled learning data into a numeric value. Further, the collection time is time information that shows the time at which learning data was collected. Further, the feature vector is obtained by writing received learning data as an n-dimensional numeric vector. Learning data is converted by a general-purpose method in machine learning; for example, when learning data is text, it is converted using morphological analysis, n-grams, or delimiters.
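As an illustrative sketch only (the function names and the character n-gram featurization below are assumptions for illustration, not part of the embodiment), the conversion into combinations of collection time, feature vector, and numeric value label may be written as:

```python
from collections import Counter

def to_feature_vector(text, vocabulary, n=2):
    # Featurize text as counts of character n-grams over a fixed vocabulary
    # (a stand-in for morphological analysis or delimiter-based tokenization).
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [float(grams[g]) for g in vocabulary]

def convert_labeled(record, vocabulary, label_to_numeric):
    # Labeled learning data -> (collection time, feature vector, numeric label).
    return (record["time"],
            to_feature_vector(record["text"], vocabulary),
            label_to_numeric[record["label"]])

def convert_unlabeled(record, vocabulary):
    # Unlabeled learning data -> (collection time, feature vector).
    return (record["time"], to_feature_vector(record["text"], vocabulary))
```

In practice the vocabulary would be built from the whole learning corpus so that every item maps into the same D-dimensional feature space.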
[0035] The learning section 13 functions as a classifier learning section and learns the classification criterion of a classifier at each time point using labeled data that was collected until a past prescribed time point and unlabeled data that was collected on and after the prescribed time point as learning data. Further, the learning section 13 functions as a time-series change learning section and learns the time-series change of the classification criterion. In the present embodiment, the learning section 13 performs the learning of a classification criterion as the classifier learning section and the learning of a time-series change as the time-series change learning section in parallel.
[0036] Specifically, the learning section 13 simultaneously performs the learning of a classification criterion and the learning of the time-series change of the classification criterion of a classifier using labeled learning data assigned collection times t_1 to t_L and unlabeled learning data assigned collection times t_{L+1} to t_{L+U}. In the present embodiment, logistic regression is applied as the model of the classifier, with the assumption that an event in which a certain label is assigned by the classifier occurs according to a prescribed probability distribution. Note that the model of the classifier is not limited to logistic regression but may instead be a support vector machine, boosting, or the like.
[0037] Further, in the present embodiment, a Gaussian process is applied as a time-series model expressing the time-series change of the classification criterion of a classifier. Note that the time-series model is not limited to the Gaussian process but may include a model such as a VAR model.
[0038] First, labeled learning data at time t is expressed by the following expression (1). Note that a label is composed of two discrete values of 0 and 1 in the present embodiment. However, the present embodiment is also applicable to a case in which there are three or more labels or a case in which a label is composed of continuous values.
[Formula 1]
D_t^L := {x_n^t, y_n^t}_{n=1}^{N_t}  (1)
[0039] where
[0040] x_n^t represents the D-dimensional feature vector of the n-th data,
[0041] y_n^t ∈ {0, 1} represents the label of the n-th data, and
[0042] t^L := (t_1, . . . , t_L) represents the times at which labeled learning data was collected.
[0043] Further, the whole labeled learning data is expressed by the following expression (2).
[Formula 2]
D^L = {D_t^L}_{t=t_1}^{t_L}  (2)
[0044] Further, unlabeled learning data at the time t is expressed by the following expression (3).
[Formula 3]
D_t^U := {x_m^t}_{m=1}^{M_t}  (3)
[0045] where
[0046] t^U := (t_{L+1}, . . . , t_{L+U}) represents the times at which the unlabeled learning data was collected.
[0047] Further, the whole unlabeled learning data is expressed by the following expression (4).
[Formula 4]
D^U = {D_t^U}_{t=t_{L+1}}^{t_{L+U}}  (4)
[0048] In this case, the probability that the label y.sub.n.sup.t of the feature vector x.sub.n.sup.t is 1 in a classifier to which logistic regression is applied is expressed by the following expression (5).
[Formula 5]
p(y_n^t = 1 | x_n^t, w_t) = σ(w_t^T x_n^t) = (1 + e^{−w_t^T x_n^t})^{−1}  (5)
[0049] where
[0050] w_t ∈ R^D represents the parameter of the classifier (a D-dimensional vector),
[0051] σ represents a sigmoid function, and
[0052] T represents transposition.
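Expression (5) can be illustrated by a minimal sketch (an illustration only, not the embodiment's implementation):

```python
import math

def label_probability(w_t, x):
    # p(y = 1 | x, w_t) = sigmoid(w_t^T x), as in expression (5).
    z = sum(w * xv for w, xv in zip(w_t, x))
    return 1.0 / (1.0 + math.exp(-z))
```

A probability of 0.5 corresponds to a point exactly on the decision boundary w_t^T x = 0.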
[0053] It is assumed that the d-th component w_td of the classifier parameter at time t is described by the following expression (6) using a nonlinear function f_d, where d ranges from 1 to D.
[Formula 6]
w_td = f_d(t) + ε_d  (6)
[0054] where
[0055] f_d represents a nonlinear function taking the time t as input, and
[0056] ε_d represents Gaussian noise.
[0057] Further, the prior distribution of the nonlinear function f.sub.d is based on a Gaussian process. That is, it is assumed that the value of the nonlinear function f.sub.d at each time point of the time t of t.sub.1 to t.sub.L+U shown in the following expression (7) is generated by a Gaussian distribution shown in the following expression (8).
[Formula 7]
f_d = (f_d(t_1), . . . , f_d(t_T))  (7)
[Formula 8]
p(f_d) = N(f_d | 0, K_d)  (8)
[0058] where
[0059] N(μ, Σ) represents a Gaussian distribution with mean μ and covariance matrix Σ, and
[0060] K_d represents a covariance matrix whose components are given by a kernel function k_d.
[0061] Here, each component of the covariance matrix is expressed by the following expression (9).
[Formula 9]
[K_d]_{tt'} := k_d(t, t')  (9)
[0062] The above k_d can be defined by an arbitrary kernel function; in the present embodiment, it is defined by the kernel function shown in the following expression (10).
[0063] where
[0064] α_d, β_d, γ_d, and ζ_d represent parameters (real numbers) characterizing the dynamics.
[0065] In this case, the probability distribution of the parameter (d-th component) of the classifier at the times t_1 to t_{L+U}, shown in the following expression (11), is expressed by the following expression (12).
[Formula 12]
p(w_{·d}) = ∫ p(w_{·d} | f_d) p(f_d) df_d = N(w_{·d} | 0, C_d)  (12)
[0066] where
[0067] C_d represents a covariance matrix in which each component is defined by a kernel function c_d.
[0068] The components of the covariance matrix are defined by the kernel function c_d shown in the following expression (13).
[Formula 13]
c_d(t, t') := k_d(t, t') + δ_{tt'} η_d^2  (13)
[0069] where
[0070] η_d represents a parameter (a real number), and
[0071] δ_{tt'} represents a function that returns 1 when t equals t' and returns 0 otherwise.
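The construction of K_d per expression (9) and C_d per expression (13) can be sketched as follows. Note that the squared-exponential kernel used here is only a placeholder assumption, since the embodiment's kernel of expression (10), with parameters α_d, β_d, γ_d, and ζ_d, is not reproduced in this text:

```python
import math

def rbf_kernel(t, t_prime, alpha=1.0, beta=1.0):
    # Placeholder stationary kernel; a stand-in for the embodiment's
    # kernel function of expression (10), which is not reproduced here.
    return alpha * math.exp(-beta * (t - t_prime) ** 2 / 2.0)

def covariance_matrices(times, eta=0.1):
    # K_d per expression (9): [K_d]_{tt'} = k_d(t, t').
    # C_d per expression (13): c_d(t, t') = k_d(t, t') + delta_{tt'} eta^2.
    K = [[rbf_kernel(t, u) for u in times] for t in times]
    C = [[K[i][j] + (eta ** 2 if i == j else 0.0)
          for j in range(len(times))] for i in range(len(times))]
    return K, C
```

Because the kernel takes raw time stamps as input, the collection times need not be equally spaced, which is consistent with the nonuniform discrete time intervals mentioned in claim 2.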
[0072] In this case, a joint probability model for learning the classification criterion W of the classifier, shown in the following expression (14), and the parameter θ expressing the time-series change (dynamics) of the classification criterion, shown in the following expression (15), is defined by the following expression (16).
[Formula 14]
W := (w_{t_1}, . . . , w_{t_{L+U}})  (14)
[Formula 15]
θ := (α_1, . . . , α_D, β_1, . . . , β_D, γ_1, . . . , γ_D, ζ_1, . . . , ζ_D, η_1, . . . , η_D)  (15)
[0073] Next, on the basis of the probability model defined by the above expression (16), the probability that the classifier with classification criterion W (hereinafter also referred to as the classifier W) is obtained given the labeled learning data, together with the dynamics parameter θ, is estimated using a so-called variational Bayes method, in which a posterior distribution is approximated from the provided data. In the variational Bayes method, the function shown in the following expression (17) is maximized to obtain the desired distribution of W, namely q(W), and the dynamics parameter θ.
[0074] where
[0075] q(W) represents the approximate distribution of the probability p(W | D^L) that the classifier W is obtained given the labeled learning data D^L.
[0076] However, the function shown in the above expression (17) does not depend on the unlabeled learning data. Therefore, in order to actually exploit the unlabeled learning data, the entropy minimization principle shown in the following expression (18) is applied in the present embodiment so that the decision boundary of the classifier is encouraged to pass through a region of low data density.
[0077] where
[0078] time t ∈ t^U.
[0079] By minimizing R_t in the above expression (18) with respect to w_t, w_t is learned such that the decision boundary passes through a region of low data density in the unlabeled learning data at time t. That is, the optimization problem of the present embodiment is the one shown in the following expression (19).
[0080] where
R = Σ_t R_t,
[0081] ρ represents a positive constant, and
M = Σ_t M_t.
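Since expression (18) itself is not reproduced in this text, the following sketch assumes the standard conditional-entropy form of the entropy minimization principle, evaluated on the unlabeled data at time t (an assumption for illustration, not the embodiment's exact formula):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def entropy_regularizer(w_t, unlabeled_x):
    # Conditional entropy of the classifier's predictions on the unlabeled
    # data at time t: low when every prediction is confident, i.e. when the
    # decision boundary avoids regions of high data density.
    total = 0.0
    for x in unlabeled_x:
        p = sigmoid(sum(w * xv for w, xv in zip(w_t, x)))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # numeric guard near 0 and 1
        total -= p * math.log(p) + (1.0 - p) * math.log(1.0 - p)
    return total
```

The regularizer is largest (log 2 per point) when predictions sit at probability 0.5, which is exactly the case of a decision boundary cutting through dense unlabeled data.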
[0082] In order to find the solution of the optimization problem, it is assumed that q(W) can be factorized as shown in the following expression (20).
[0083] Further, it is assumed that q(w.sub.t) is expressed by the function form of a Gaussian distribution as shown in the following expression (21).
[Formula 21]
q(w_td) = N(w_td | μ_td, σ_td^2)  (21)
[0084] where
[0085] time t ∈ t^U.
[0086] In this case, it is found that q(W) is expressed by the function form of a Gaussian distribution shown in the following expression (22).
[Formula 22]
q(w_td) = N(w_td | μ_td, λ_td^{−1})  (22)
[0087] where
[0088] q(w_t) for t ∈ t^L.
[0089] Here, μ_td and λ_td are estimated using the update expression shown in the following expression (23).
[0090] where
[0091] ξ_n^t represents an approximation parameter corresponding to each data point, and
[0092] σ represents a sigmoid function.
[0093] The distribution q(w_t) at time t can be obtained by maximizing the objective function shown in the following expression (24), which is obtained by approximating the regularization term R(w) using the reparameterization trick. The maximization can be carried out numerically using, for example, a quasi-Newton method.
[0094] where
w_t^{(j)} := μ_t + σ_t ⊙ ε_t^{(j)},  ε_t^{(j)} ~ N(0, I), and
[0095] J represents the number of samples.
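The sampling step above can be sketched as follows (a minimal illustration of the reparameterization trick; the objective being averaged is left abstract, and the function names are hypothetical):

```python
import random

def sample_w(mu, sigma, J, seed=0):
    # Draw J samples w^(j) = mu + sigma (.) eps^(j), eps ~ N(0, I),
    # componentwise, as in the reparameterized form above.
    rng = random.Random(seed)
    return [[m + s * rng.gauss(0.0, 1.0) for m, s in zip(mu, sigma)]
            for _ in range(J)]

def monte_carlo_average(f, mu, sigma, J=100, seed=0):
    # Approximate the expectation of f under q(w_t) by an average over
    # the J reparameterized samples; gradients with respect to mu and
    # sigma then flow through the deterministic transform.
    samples = sample_w(mu, sigma, J, seed)
    return sum(f(w) for w in samples) / J
```

Writing the sample as a deterministic function of (μ_t, σ_t) and noise ε is what makes the approximated objective differentiable with respect to the variational parameters.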
[0096] Further, the dynamics parameter θ is updated using a quasi-Newton method, which uses the term of the lower bound L related to θ and its derivative with respect to θ, shown in the following expression (25).
[0097] where
μ_{·d} = (μ_{t_1 d}, . . . , μ_{t_{L+U} d}), and
[0098] I represents an identity matrix.
[0099] The learning section 13 can estimate the desired parameters by alternately repeating the update of q(W) and the update of θ until a prescribed convergence condition is satisfied, using the above update expressions. The prescribed convergence condition is, for example, that a preset number of updates has been exceeded, or that the change in a parameter has fallen below a certain value.
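The alternating update scheme can be sketched generically as follows (the update functions are placeholders, not the embodiment's update expressions (23)–(25)):

```python
def alternate_updates(update_q, update_theta, q0, theta0,
                      max_iters=100, tol=1e-6, distance=None):
    # Alternately update q(W) and theta until a prescribed convergence
    # condition is met: the iteration cap is reached or the change in
    # theta falls below tol.
    if distance is None:
        distance = lambda a, b: abs(a - b)
    q, theta = q0, theta0
    for _ in range(max_iters):
        q = update_q(q, theta)
        new_theta = update_theta(q, theta)
        if distance(new_theta, theta) <= tol:
            theta = new_theta
            break
        theta = new_theta
    return q, theta
```

Either stopping rule alone would also match the convergence conditions named above (update count exceeded, or parameter change below a threshold).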
[0100] The classifier creation section 14 functions as a prediction section that predicts the classification criterion of a classifier at an arbitrary time point, including a future time point, and the reliability of that classification criterion. Specifically, the classifier creation section 14 derives a prediction of the classification criterion of the classifier at a future time t*, together with a certainty expressing the reliability of the predicted classification criterion, using the classification criterion of the classifier and the time-series change of the classification criterion that have been learned by the learning section 13.
[0101] When logistic regression is applied as the model of the classifier and a Gaussian process is applied as the time-series model expressing the time-series change of the classification criterion, the probability distribution with which the classifier W is obtained at a time t* greater than t_{L+U} is expressed by the following expression (26). Note that q(w_{t*}) need only be applied when t* is less than or equal to t_{L+U}.
[0102] where
k_d := (k_d(t*, t_1), . . . , k_d(t*, t_T)),
[0103] m_{t*d} represents the parameter (d-th component) of the classifier, and
[0104] the reciprocal of σ_{t*d}^2 represents the certainty of the parameter (d-th component) of the classifier.
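Since expression (26) is not fully reproduced in this text, the following sketch assumes the standard Gaussian-process predictive equations, with mean k_*^T C^{-1} μ_{·d} and variance k(t*, t*) − k_*^T C^{-1} k_* for each component d (an assumption for illustration; the helper names are hypothetical):

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting (small systems only).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(k_star, k_star_star, C, mu):
    # Predictive mean m = k_*^T C^{-1} mu and
    # variance s2 = k(t*, t*) - k_*^T C^{-1} k_* at the query time t*.
    a = solve(C, mu)
    b = solve(C, k_star)
    mean = sum(ks * ai for ks, ai in zip(k_star, a))
    var = k_star_star - sum(ks * bi for ks, bi in zip(k_star, b))
    return mean, var
```

The predicted mean plays the role of the parameter m_{t*d}, and the reciprocal of the predicted variance plays the role of the certainty: the further t* lies from the observed collection times, the larger the variance and the lower the certainty.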
[0105] Thus, the classifier creation section 14 can obtain a classifier with a predicted classification criterion at an arbitrary time, together with the certainty of the prediction. The classifier creation section 14 stores the predicted classification criterion of the classifier and the certainty in the classifier storage section 15.
[0106] The classifier storage section 15 is realized by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk, and stores the created classification criterion of a classifier at a future time and the certainty. The storage form is not particularly limited; examples include a database form such as MySQL or PostgreSQL, a table form, and a text form.
[0107] [Classification Unit]
[0108] The classification unit 20 has a data input section 21, a data conversion section 22, a classification section 23, and a classification result output section 24 and performs classification processing in which data is classified using a classifier that has been created by the creation unit 10 and a label is output as described above.
[0109] The data input section 21 is realized by an input device such as a keyboard and a mouse and inputs various instruction information to a control unit or receives data to be classified in response to an input operation by an operator. Here, the received data to be classified is assigned time information at a certain time point. The data input section 21 may be the same hardware as that of the learning data input section 11.
[0110] The control unit is realized by a CPU or the like that performs a processing program and has the data conversion section 22 and the classification section 23.
[0111] The data conversion section 22 converts data to be classified that has been received by the data input section 21 into a combination of collection time and a feature vector like the data conversion section 12 of the creation unit 10. Here, since the data to be classified is assigned time information at a certain time point, the collection time and the time information are the same.
[0112] The classification section 23 refers to the classifier storage section 15 and performs the classification processing of the data using the classifier for the time matching the collection time of the data to be classified, together with the certainty of that classifier. For example, when logistic regression is applied as the model of the classifier and a Gaussian process is applied as the time-series model expressing the time-series change of the classification criterion as described above, the probability that the label y of the data x is 1 is obtained by the following expression (27). The classification section 23 sets the label to 1 when the obtained probability is equal to or greater than a prescribed threshold and to 0 when the obtained probability is smaller than the threshold.
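The exact form of expression (27) is not reproduced in this text; the sketch below assumes the widely used logistic-Gaussian (probit-style) approximation for averaging the logistic output over a Gaussian posterior on the parameter, and the function name is hypothetical:

```python
import math

def classify_with_certainty(x, mu, var, threshold=0.5):
    # Approximate p(y = 1 | x) when w ~ N(mu, diag(var)) using the standard
    # logistic-Gaussian approximation: sigmoid(kappa * mu^T x) with
    # kappa = 1 / sqrt(1 + pi * x^T diag(var) x / 8).
    m = sum(md * xd for md, xd in zip(mu, x))
    s2 = sum(vd * xd * xd for vd, xd in zip(var, x))
    kappa = 1.0 / math.sqrt(1.0 + math.pi * s2 / 8.0)
    p = 1.0 / (1.0 + math.exp(-kappa * m))
    return (1 if p >= threshold else 0), p
```

Low certainty (large variance) pulls the probability toward 0.5, so confident label assignments require both a large mean score and high certainty of the predicted classification criterion.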
[0113] The classification result output section 24 is realized by a display device such as a liquid crystal display, a printing device such as a printer, an information communication device, or the like and outputs the result of classification processing to an operator. For example, the classification result output section 24 outputs a label with respect to input data or outputs data obtained by assigning a label to input data.
[0114] [Creation Processing]
[0115] Next, the creation processing by the creation unit 10 of the creation device 1 will be described with reference to
[0116] First, the learning data input section 11 receives labeled learning data and unlabeled learning data that are assigned time information (step S1). Next, the data conversion section 12 converts the received labeled learning data into the data of a combination of collection time, a feature vector, and a numeric value label. Further, the data conversion section 12 converts the received unlabeled learning data into the data of a combination of collection time and a feature vector (step S2).
[0117] Then, the learning section 13 learns the classification criterion of a classifier up to time t and a time-series model expressing the time-series change of the classification criterion (step S3). For example, the parameter w_t of a logistic regression model and the parameter θ of a Gaussian process are estimated simultaneously.
[0118] Next, the classifier creation section 14 predicts the classification criterion of the classifier at an arbitrary time t, together with its certainty, to create the classifier (step S4). For example, for a classifier to which a logistic regression model and a Gaussian process are applied, the parameter w_t of the classifier at the arbitrary time t and its certainty are found.
[0119] Finally, the classifier creation section 14 stores the created classification criterion of the classifier and the certainty in the classifier storage section 15 (step S5).
[0120] [Classification Processing]
[0121] Next, the classification processing by the classification unit 20 of the creation device 1 will be described with reference to
[0122] First, the data input section 21 receives data to be classified at time t (step S6), and the data conversion section 22 converts the received data into the data of a combination of collection time and a feature vector (step S7).
[0123] Next, the classification section 23 refers to the classifier storage section 15 and performs the classification processing of the data using the classifier at the collection time of the received data, together with its certainty (step S8). Then, the classification result output section 24 outputs the classification result, that is, the label of the classified data (step S9).
[0124] As described above, in the creation device 1 of the present embodiment, the learning section 13 learns the classification criterion of a classifier at each time point and the time-series change of the classification criterion using labeled learning data that was collected until a past prescribed time point and unlabeled learning data that was collected on and after the prescribed time point, and the classifier creation section 14 predicts the classification criterion of the classifier at an arbitrary time point including a future time point and the reliability of the classification criterion using the learned classification criterion and the time-series change.
[0125] That is, as illustrated by example in
[0126] In the example shown in
[0127] Thus, according to the creation processing of the creation unit 10 in the creation device 1 of the present embodiment, the time development of a classification criterion learned only from labeled learning data can be corrected using unlabeled learning data that was collected on and after the collection time point of the labeled learning data. Further, a future classification criterion is predicted together with certainty using labeled learning data and unlabeled learning data that is low in collection cost. Accordingly, the selective use of a classifier with consideration given to the certainty of a predicted classification criterion makes it possible to prevent a decrease in the classification accuracy of the classifier and perform classification with high accuracy. As described above, according to the creation processing of the creation device 1, a classifier maintaining its classification accuracy can be created using unlabeled learning data with consideration given to the time development of a classification criterion.
[0128] Further, particularly when the classification criterion of a classifier and the time-series change of the classification criterion are learned simultaneously, more robust learning can be performed than when they are learned separately, even in a case in which, for example, the amount of labeled learning data is small.
[0129] Note that the creation processing of the present invention is not limited to a classification problem in which a label is composed of discrete values but may also cover a regression problem in which a label is composed of real values. Thus, the future classification criteria of various classifiers can be predicted.
[0130] Further, the past collection times of labeled learning data and unlabeled learning data need not be spaced at a constant discrete time interval. For example, when a Gaussian process is applied as the time-series model expressing the time-series change of the classification criterion of a classifier as in the above embodiment, the classifier can be created even if the discrete time interval is nonuniform.
Second Embodiment
[0131] The learning section 13 of the above first embodiment may be separated into a classifier learning section 13a and a time-series model learning section 13b.
[0132] Note that in the present embodiment, logistic regression is applied as the model of a classifier and a Gaussian process is applied as a time-series model expressing the time-series change of the classification criterion of the classifier like the above first embodiment. Note that the time-series model is not limited to the Gaussian process but may include a model such as a VAR model.
[0133]
[0134] In the processing of step S31, the classifier learning section 13a learns the classification criterion of a classifier at arbitrary time t using labeled learning data at collection time t of t.sub.1 to t.sub.L and unlabeled learning data at collection time t of t.sub.L+1 to t.sub.L+U. For example, a parameter w.sub.t at time t of a logistic regression model is found.
[0135] In the processing of step S32, the time-series model learning section 13b learns a time-series model expressing the time-series change of the classification criterion using the classification criterion of the classifier until the time t that has been obtained by the classifier learning section 13a. For example, a parameter θ of a Gaussian process is found.
[0136] As described above, the classification criterion of a classifier and the time-series change of the classification criterion are learned separately in the creation device 1 of the present embodiment. Thus, even when, for example, the amounts of labeled learning data and unlabeled learning data are large, it is possible to lighten the processing load on each functional section and complete the processing in a shorter time than when the classification criterion of a classifier and the time-series change of the classification criterion are learned simultaneously.
[0137] [Program]
[0138] A program in which the processing performed by the creation device 1 according to the above embodiment is described in a language executable by a computer can be generated. As one embodiment, the creation device 1 can be implemented by installing a creation program for performing the above creation processing in a desired computer as packaged software or online software. For example, an information processing device can function as the creation device 1 by executing the above creation program. Here, the information processing device includes desktop and notebook personal computers. Besides, the information processing device includes mobile communication terminals such as mobile phones and PHS (Personal Handyphone System) terminals, and slate terminals such as PDAs (Personal Digital Assistants). Further, assuming that a terminal device used by a user is a client, the creation device 1 can be implemented as a server device that provides the client with a service related to the above creation processing. For example, the creation device 1 is implemented as a server device that receives labeled learning data as input and provides a creation processing service that outputs a classifier. In this case, the creation device 1 may be implemented as a web server, or may be implemented as a cloud that provides a service related to the above creation processing by outsourcing. Hereinafter, an example of a computer that executes a creation program to realize the same function as that of the creation device 1 will be described.
[0139]
[0140] The memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM 1012. The ROM 1011 stores, for example, a boot program such as a BIOS (Basic Input Output System). The hard disk drive interface 1030 is connected to the hard disk drive 1031. The disk drive interface 1040 is connected to a disk drive 1041. For example, a detachable storage medium such as a magnetic disk and an optical disk is inserted into the disk drive 1041. For example, a mouse 1051 and a keyboard 1052 are connected to the serial port interface 1050. For example, a display 1061 is connected to the video adapter 1060.
[0141] Here, the hard disk drive 1031 stores, for example, an OS 1091, an application program 1092, a program module 1093, and program data 1094. The respective pieces of information described in the above embodiment are stored in, for example, the hard disk drive 1031 or the memory 1010.
[0142] Further, the creation program is stored in the hard disk drive 1031 as, for example, the program module 1093, in which instructions to be executed by the computer 1000 are described. Specifically, the program module 1093, in which the respective processing performed by the creation device 1 described in the above embodiment is described, is stored in the hard disk drive 1031.
[0143] Further, data used for information processing based on the creation program is stored in, for example, the hard disk drive 1031 as the program data 1094. Then, the CPU 1020 reads the program module 1093 or the program data 1094 stored in the hard disk drive 1031 into the RAM 1012 as necessary to perform the respective procedures described above.
[0144] Note that the program module 1093 or the program data 1094 according to the creation program may be stored in, for example, a detachable recording medium rather than in the hard disk drive 1031 and read by the CPU 1020 via the disk drive 1041 or the like. Alternatively, the program module 1093 or the program data 1094 according to the creation program may be stored in another computer connected via a network such as a LAN (Local Area Network) or a WAN (Wide Area Network) and read by the CPU 1020 via the network interface 1070.
[0145] The embodiment made by the present inventor and to which the present invention is applied has been described above. However, the present invention is not limited by the descriptions and drawings constituting a part of the disclosure of the present invention according to the present embodiment. That is, other embodiments, examples, operation technologies, and the like made on the basis of the present embodiment by persons skilled in the art are all included in the scope of the present invention.
REFERENCE SIGNS LIST
[0146] 1 Creation device
[0147] 10 Creation unit
[0148] 11 Learning data input section
[0149] 12 Data conversion section
[0150] 13 Learning section
[0151] 13a Classifier learning section
[0152] 13b Time-series model learning section
[0153] 14 Classifier creation section
[0154] 15 Classifier storage section
[0155] 20 Classification unit
[0156] 21 Data input section
[0157] 22 Data conversion section
[0158] 23 Classification section
[0159] 24 Classification result output section