COUNTERFACTUAL INFERENCE MANAGEMENT DEVICE, COUNTERFACTUAL INFERENCE MANAGEMENT METHOD, AND COUNTERFACTUAL INFERENCE MANAGEMENT COMPUTER PROGRAM PRODUCT
20230214695 · 2023-07-06
Inventors
CPC classification
G06F18/214 (PHYSICS)
G06F18/2321 (PHYSICS)
G06V10/454 (PHYSICS)
International classification
Abstract
Aspects relate to a counterfactual inference management technique that provides increased flexibility, allowing users to select an appropriate counterfactual inference, and that offers scalability for handling tabular data and image data in a single configuration. A counterfactual inference management device comprises a classifier unit trained to determine whether a set of input data that includes a set of data features achieves a predetermined target, and a counterfactual inference unit for generating a set of transformed data in which a subset of the set of data features is modified to counterfactual features. The classifier unit processes the set of transformed data to determine whether it achieves the predetermined target and calculates a counterfactual loss. The counterfactual inference unit is trained to reduce the counterfactual loss and generate a set of transformed data including counterfactual features that achieve the predetermined target.
Claims
1. A counterfactual inference management device comprising: a classifier unit trained to determine whether a set of input data that includes a set of data features achieves a predetermined target; and a counterfactual inference unit for generating, by processing the set of input data, a set of transformed data in which a subset of the set of data features are modified to counterfactual features with respect to the set of input data; wherein: the classifier unit processes the set of transformed data generated by the counterfactual inference unit to determine whether the set of transformed data achieves the predetermined target, and calculates a counterfactual loss value associated with a subset of the set of transformed data that does not achieve the predetermined target; and the counterfactual inference unit is trained to reduce the counterfactual loss value and generate a second set of transformed data including counterfactual features that achieve the predetermined target.
2. The counterfactual inference management device according to claim 1, wherein: the set of input data includes: a set of tabular data having a set of data features including a set of numerical features or a set of categorical features; or a set of image data having a set of data features including a set of image features.
3. The counterfactual inference management device according to claim 2, further comprising: a pre-processing unit configured to: determine whether the set of input data includes the set of tabular data or the set of image data; perform, in a case that the set of input data is determined to include the set of tabular data, a tabular data processing operation on the set of input data; and perform, in a case that the set of input data is determined to include the set of image data, an image processing operation on the set of input data.
4. The counterfactual inference management device according to claim 3, wherein: the pre-processing unit performs, as the tabular data processing operation, a normalization technique with respect to the set of numerical features and a one-hot encoding with respect to the set of categorical features.
5. The counterfactual inference management device according to claim 4, wherein: the pre-processing unit performs, as the tabular data processing operation, a masking operation to prevent modification to a subset of the set of numerical features or a subset of the set of categorical features.
6. The counterfactual inference management device according to claim 5, wherein generating the set of transformed data includes: generating, in a case that the set of input data is determined to include the set of tabular data, a latent space representation of the set of input data that is dimensionally reduced with respect to the set of tabular data by processing the set of tabular data with an encoder model; generating the set of transformed data in which a subset of the set of data features are modified to counterfactual features with respect to the set of tabular data by processing the latent space representation with a decoder model; and generating, by performing a statistical analysis technique on the set of transformed data, a set of cluster results that indicates a correlation between data features of the set of tabular data.
7. The counterfactual inference management device according to claim 3, wherein: the pre-processing unit performs, as the image processing operation, one or more of a pixel brightness transformation and a geometric transformation.
8. The counterfactual inference management device according to claim 7, wherein generating the set of transformed data includes: generating, in a case that the set of input data is determined to include the set of image data, a latent space representation of the set of image data that is dimensionally reduced with respect to the set of image data by processing the set of image data with an encoder model having a convolutional layer; and generating the set of transformed data in which a subset of the set of data features are modified to counterfactual features with respect to the set of image data by processing the latent space representation with a decoder model having a convolutional layer.
9. The counterfactual inference management device according to claim 1, further comprising: a user interface unit configured to: present a counterfactual inference result that includes the second set of transformed data, a second set of data features characterizing the second set of transformed data, and a target achievement indicator that indicates whether the second set of transformed data achieves the predetermined target; and receive a user input to modify a subset of the second set of data features of the second set of transformed data to a set of user-selected counterfactual features.
10. The counterfactual inference management device according to claim 9, further comprising: a feedback unit configured to use the counterfactual inference result together with the set of user-selected counterfactual features as training data to train the classifier unit.
11. A counterfactual inference management method comprising: training a classifier unit to determine whether a set of input data that includes a set of data features achieves a predetermined target; generating, by processing the set of input data with a counterfactual inference unit, a set of transformed data in which a subset of the set of data features are modified to counterfactual features with respect to the set of input data; processing, using the classifier unit, the set of transformed data generated by the counterfactual inference unit to determine whether the set of transformed data achieves the predetermined target, and calculating a counterfactual loss value associated with a subset of the set of transformed data that does not achieve the predetermined target; and training the counterfactual inference unit to reduce the counterfactual loss value and generate a second set of transformed data including counterfactual features that achieve the predetermined target.
12. The counterfactual inference management method according to claim 11, further comprising: receiving a set of test data; determining whether the set of test data includes a set of tabular data or a set of image data; performing, in a case that the set of test data is determined to include the set of tabular data, a tabular data processing operation on the set of test data; performing, in a case that the set of test data is determined to include the set of image data, an image processing operation on the set of test data; generating, by processing the set of test data with the counterfactual inference unit, a third set of transformed data including counterfactual features that achieve the predetermined target; presenting, to a user via a graphical user interface, a counterfactual inference result that includes the third set of transformed data, a third set of data features characterizing the third set of transformed data, and a target achievement indicator that indicates whether the third set of transformed data achieves the predetermined target; receiving, via the graphical user interface, a user input to modify a subset of the third set of data features of the third set of transformed data to a set of user-selected counterfactual features; and training the classifier unit using the counterfactual inference result together with the set of user-selected counterfactual features.
13. A counterfactual inference management computer program product comprising a computer readable storage medium having program instructions embodied therewith, wherein the computer readable storage medium is not a transitory signal per se, the program instructions executable by a processor to cause the processor to perform a method comprising: training a classifier unit to determine whether a set of input data that includes a set of data features achieves a predetermined target; generating, by processing the set of input data with a counterfactual inference unit, a set of transformed data in which a subset of the set of data features are modified to counterfactual features with respect to the set of input data; processing, using the classifier unit, the set of transformed data generated by the counterfactual inference unit to determine whether the set of transformed data achieves the predetermined target, and calculating a counterfactual loss value associated with a subset of the set of transformed data that does not achieve the predetermined target; and training the counterfactual inference unit to reduce the counterfactual loss value and generate a second set of transformed data including counterfactual features that achieve the predetermined target.
14. The counterfactual inference management computer program product according to claim 13, further comprising: receiving a set of test data; determining whether the set of test data includes a set of tabular data or a set of image data; performing, in a case that the set of test data is determined to include the set of tabular data, a tabular data processing operation on the set of test data; performing, in a case that the set of test data is determined to include the set of image data, an image processing operation on the set of test data; generating, by processing the set of test data with the counterfactual inference unit, a third set of transformed data including counterfactual features that achieve the predetermined target; presenting, to a user via a graphical user interface, a counterfactual inference result that includes the third set of transformed data, a third set of data features characterizing the third set of transformed data, and a target achievement indicator that indicates whether the third set of transformed data achieves the predetermined target; receiving, via the graphical user interface, a user input to modify a subset of the third set of data features of the third set of transformed data to a set of user-selected counterfactual features; and training the classifier unit using the counterfactual inference result together with the set of user-selected counterfactual features.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0025] Hereinafter, embodiments of the present invention will be described with reference to the Figures. It should be noted that the embodiments described herein are not intended to limit the invention according to the claims, and it is to be understood that each of the elements and combinations thereof described with respect to the embodiments is not strictly necessary to implement the aspects of the present invention.
[0026] Various aspects are disclosed in the following description and related drawings. Alternate aspects may be devised without departing from the scope of the disclosure. Additionally, well-known elements of the disclosure will not be described in detail or will be omitted so as not to obscure the relevant details of the disclosure.
[0027] The words “exemplary” and/or “example” are used herein to mean “serving as an example, instance, or illustration.” Any aspect described herein as “exemplary” and/or “example” is not necessarily to be construed as preferred or advantageous over other aspects. Likewise, the term “aspects of the disclosure” does not require that all aspects of the disclosure include the discussed feature, advantage or mode of operation.
[0028] Further, many aspects are described in terms of sequences of actions to be performed by, for example, elements of a computing device. It will be recognized that various actions described herein can be performed by specific circuits (e.g., an application specific integrated circuit (ASIC)), by program instructions being executed by one or more processors, or by a combination of both. Additionally, the sequence of actions described herein can be considered to be embodied entirely within any form of computer readable storage medium having stored therein a corresponding set of computer instructions that upon execution would cause an associated processor to perform the functionality described herein. Thus, the various aspects of the disclosure may be embodied in a number of different forms, all of which have been contemplated to be within the scope of the claimed subject matter.
[0029] Turning now to the Figures,
[0030] The computer system 100 may contain one or more general-purpose programmable central processing units (CPUs) 102A and 102B, herein generically referred to as the processor 102. In embodiments, the computer system 100 may contain multiple processors; however, in certain embodiments, the computer system 100 may alternatively be a single CPU system. Each processor 102 executes instructions stored in the memory 104 and may include one or more levels of on-board cache.
[0031] In embodiments, the memory 104 may include a random-access semiconductor memory, storage device, or storage medium (either volatile or non-volatile) for storing or encoding data and programs. In certain embodiments, the memory 104 represents the entire virtual memory of the computer system 100, and may also include the virtual memory of other computer systems coupled to the computer system 100 or connected via a network. The memory 104 can be conceptually viewed as a single monolithic entity, but in other embodiments the memory 104 is a more complex arrangement, such as a hierarchy of caches and other memory devices. For example, memory may exist in multiple levels of caches, and these caches may be further divided by function, so that one cache holds instructions while another holds non-instruction data, which is used by the processor or processors. Memory may be further distributed and associated with different CPUs or sets of CPUs, as is known in any of various so-called nonuniform memory access (NUMA) computer architectures.
[0032] The memory 104 may store all or a portion of the various programs, modules and data structures for processing data transfers as discussed herein. For instance, the memory 104 can store a counterfactual inference management application 130. In embodiments, the counterfactual inference management application 130 may include instructions or statements that execute on the processor 102 or instructions or statements that are interpreted by instructions or statements that execute on the processor 102 to carry out the functions as further described below.
[0033] In certain embodiments, the counterfactual inference management application 130 is implemented in hardware via semiconductor devices, chips, logical gates, circuits, circuit cards, and/or other physical hardware devices in lieu of, or in addition to, a processor-based system. In embodiments, the counterfactual inference management application 130 may include data in addition to instructions or statements. In certain embodiments, a camera, sensor, or other data input device (not shown) may be provided in direct communication with the bus interface unit 109, the processor 102, or other hardware of the computer system 100. In such a configuration, the need for the processor 102 to access the memory 104 and the counterfactual inference management application 130 may be reduced.
[0034] The computer system 100 may include a bus interface unit 109 to handle communications among the processor 102, the memory 104, a display system 124, and the I/O bus interface unit 110. The I/O bus interface unit 110 may be coupled with the I/O bus 108 for transferring data to and from the various I/O units. The I/O bus interface unit 110 communicates with multiple I/O interface units 112, 113, 114, and 115, which are also known as I/O processors (IOPs) or I/O adapters (IOAs), through the I/O bus 108. The display system 124 may include a display controller, a display memory, or both. The display controller may provide video, audio, or both types of data to a display device 126. Further, the computer system 100 may include one or more sensors or other devices configured to collect and provide data to the processor 102.
[0035] As examples, the computer system 100 may include biometric sensors (e.g., to collect heart rate data, stress level data), environmental sensors (e.g., to collect humidity data, temperature data, pressure data), motion sensors (e.g., to collect acceleration data, movement data), or the like. Other types of sensors are also possible. The display memory may be a dedicated memory for buffering video data. The display system 124 may be coupled with a display device 126, such as a standalone display screen, computer monitor, television, or a tablet or handheld device display.
[0036] In one embodiment, the display device 126 may include one or more speakers for rendering audio. Alternatively, one or more speakers for rendering audio may be coupled with an I/O interface unit. In alternate embodiments, one or more of the functions provided by the display system 124 may be on board an integrated circuit that also includes the processor 102. In addition, one or more of the functions provided by the bus interface unit 109 may be on board an integrated circuit that also includes the processor 102.
[0037] The I/O interface units support communication with a variety of storage and I/O devices. For example, the terminal interface unit 112 supports the attachment of one or more user I/O devices 116, which may include user output devices (such as a video display device, speaker, and/or television set) and user input devices (such as a keyboard, mouse, keypad, touchpad, trackball, buttons, light pen, or other pointing device). A user may manipulate the user input devices using a user interface in order to provide input data and commands to the user I/O device 116 and the computer system 100, and may receive output data via the user output devices. For example, a user interface may be presented via the user I/O device 116, such as displayed on a display device, played via a speaker, or printed via a printer.
[0038] The storage interface 113 supports the attachment of one or more disk drives or direct access storage devices 117 (which are typically rotating magnetic disk drive storage devices, although they could alternatively be other storage devices, including arrays of disk drives configured to appear as a single large storage device to a host computer, or solid-state drives, such as flash memory). In some embodiments, the storage device 117 may be implemented via any type of secondary storage device. The contents of the memory 104, or any portion thereof, may be stored to and retrieved from the storage device 117 as needed. The I/O device interface 114 provides an interface to any of various other I/O devices or devices of other types, such as printers or fax machines. The network interface 115 provides one or more communication paths from the computer system 100 to other digital devices and computer systems; these communication paths may include, for example, one or more networks 130.
[0039] Although the computer system 100 shown in
[0040] In various embodiments, the computer system 100 is a multi-user mainframe computer system, a single-user system, or a server computer or similar device that has little or no direct user interface, but receives requests from other computer systems (clients). In other embodiments, the computer system 100 may be implemented as a desktop computer, portable computer, laptop or notebook computer, tablet computer, pocket computer, telephone, smart phone, or any other suitable type of electronic device.
[0041] Next, an example logical configuration of a counterfactual inference management device according to the embodiments of the present disclosure will be described with reference to
[0042]
[0043] In embodiments, the counterfactual inference management device 200 may be configured to operate in three phases: a classifier training phase for training the classifier unit 210, a counterfactual inference unit training phase for training the counterfactual inference unit 220, and an inference phase in which the trained counterfactual inference unit 220 is used to generate a counterfactual inference result.
[0044] First, in the classifier training phase, the classifier unit 210 is trained using a set of training data 202. The classifier unit 210 is a functional unit configured to determine whether an input data set that includes a set of data features achieves a predetermined target (e.g., loan approval). More particularly, the classifier unit 210 may order or categorize data into one or more of a set of classes.
[0045] As an example, the classifier unit 210 may include an algorithm configured to classify loan applicants into classes of “approved” or “denied” based upon features that define the characteristics (e.g., income, educational background, age, marital status) of the applicant. In this example, applicants that are “approved” are considered to achieve the predetermined target, and applicants that are “denied” are considered to fail to achieve the predetermined target. The set of training data 202 may include a set of data (image data or tabular data) used for training the classifier unit 210 to perform a given classification task. For instance, with reference to the loan application example, the set of training data 202 may include data including features that define the characteristics of a number of applicants that can be used to train the classifier unit 210 to accurately classify applicants into categories of “approved” or “denied.” In general, “training” refers to adjusting the parameters (e.g., hyperparameters, weights, and neural connections) of a machine learning unit until a particular task can be performed with a predetermined accuracy. This training can be performed over a plurality of iterations until the desired accuracy is achieved.
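Purely as an illustrative sketch (not part of the original disclosure), the loan-approval classification described above can be realized with a minimal logistic-regression "classifier unit." The feature layout, training values, and function names below are hypothetical; the two numerical features stand in for normalized income and age.

```python
# Minimal sketch of a "classifier unit": logistic regression trained by
# stochastic gradient descent to label applicants approved (1) / denied (0).
# Features are assumed to be pre-normalized to [0, 1].
import math

def train_classifier(rows, labels, lr=0.5, epochs=2000):
    """Fit weights w and bias b by gradient descent on the logistic loss."""
    n_feat = len(rows[0])
    w, b = [0.0] * n_feat, 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted approval probability
            g = p - y                        # gradient of logistic loss wrt z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def achieves_target(w, b, x):
    """True if the classifier places x in the 'approved' class."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z)) >= 0.5

# Toy training data: higher (income, age) -> approved.
X = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3]]
y = [1, 1, 0, 0]
w, b = train_classifier(X, y)
```

In this framing, "achieving the predetermined target" simply means the trained classifier assigns the record to the approved class.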
[0046] As the flow of the training process of the classifier unit 210 will be described later, the details thereof will be omitted here.
[0047] In the counterfactual inference unit training phase, the counterfactual inference unit 220 is trained using a set of training data 204. The counterfactual inference unit 220 is a functional unit configured to perform encoding and decoding operations on an input data set to generate a set of transformed data that achieves the predetermined target. The counterfactual inference unit 220 may be implemented as a variational autoencoder, for instance. The set of training data 204 may include a set of data (image data or tabular data) used for training the counterfactual inference unit 220. In embodiments, the set of training data 204 may substantially correspond to the set of training data 202 used to train the classifier unit 210.
[0048] In embodiments, the counterfactual inference unit 220 may be configured to switch between separate operating modes based on the type of input data. For instance, an operating configuration trained to handle image data and an operating configuration trained to handle tabular data may be prepared in advance, and the counterfactual inference unit 220 may be configured to load the operating configuration for processing tabular data in the case that the set of input data is tabular data and load the operating configuration for processing image data in the case that the set of input data is image data. In this way, the counterfactual inference management device 200 is capable of generating counterfactual inference results with respect to both tabular data and image data.
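The mode-switching behavior described in this paragraph can be sketched as a simple dispatch over pre-trained operating configurations. This is an illustrative assumption, not the patent's implementation: the type check, the configuration registry, and the encoder/decoder names are all hypothetical stand-ins.

```python
# Sketch (hypothetical): select a pre-trained operating configuration for
# the counterfactual inference unit based on the detected input data type.

def detect_input_type(data):
    """Rough type check: lists of dicts are treated as tabular records;
    nested lists of numbers are treated as image pixel grids."""
    first = data[0]
    if isinstance(first, dict):
        return "tabular"
    if isinstance(first, (list, tuple)):
        return "image"
    raise ValueError("unsupported input data")

# Stand-ins for operating configurations prepared in advance per modality.
CONFIGURATIONS = {
    "tabular": {"encoder": "mlp_encoder", "decoder": "mlp_decoder"},
    "image":   {"encoder": "conv_encoder", "decoder": "conv_decoder"},
}

def load_operating_configuration(data):
    """Load the configuration matching the detected input type."""
    return CONFIGURATIONS[detect_input_type(data)]
```

A real device would load trained model weights here; the point of the sketch is only the single-configuration handling of both data types.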
[0049] Here, the set of transformed data (not illustrated in
[0050] As an example, a user associated with a data feature of “Educational Background: High school graduate” may modify this data feature to a counterfactual feature of “Educational Background: Bachelor’s Degree in Business Management” to explore how this affects their career options. In addition, the set of transformed data may indicate the correlation between different clusters of data features (e.g., education level and income).
[0051] The encoding and decoding operations performed on the set of input data result in a model loss 206 associated with the difference between the set of transformed data and the set of training data 204. The set of transformed data is input to the classifier unit 210 trained in the classifier training phase. The trained classifier unit 210 processes the set of transformed data generated by the counterfactual inference unit 220 to determine whether the set of transformed data achieves a predetermined target (e.g., loan approval), and calculates a counterfactual loss 208 associated with a portion of the set of transformed data that does not achieve the predetermined target. Subsequently, the counterfactual inference unit 220 is trained to reduce the model loss 206 (including the reconstruction loss) and the counterfactual loss 208. In this way, the counterfactual inference unit 220 is trained to generate sets of transformed data that include counterfactual features that achieve the predetermined target.
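The interplay of the two loss terms above can be illustrated with a simplified sketch. Note the difference from the disclosure: the patent trains the counterfactual inference unit (e.g., a variational autoencoder) against these losses, whereas this hypothetical example optimizes a single counterfactual record directly, combining a classifier-based counterfactual loss with a proximity loss that plays the role of the reconstruction term. All weights and values are invented for illustration.

```python
# Sketch (hypothetical): gradient descent on a counterfactual x' minimizing
#   lam * cross_entropy(classifier(x'), target=approved) + ||x' - x||^2,
# i.e., a counterfactual loss plus a reconstruction-style proximity loss.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def find_counterfactual(x, w, b, lam=5.0, lr=0.05, steps=500):
    xc = list(x)
    for _ in range(steps):
        z = sum(wi * xi for wi, xi in zip(w, xc)) + b
        p = sigmoid(z)
        # counterfactual-loss gradient: push p toward the target class (1)
        g_cf = [lam * (p - 1.0) * wi for wi in w]
        # proximity-loss gradient: keep x' close to the original x
        g_prox = [2.0 * (xci - xi) for xci, xi in zip(xc, x)]
        xc = [xci - lr * (gc + gp) for xci, gc, gp in zip(xc, g_cf, g_prox)]
    return xc

# Hypothetical trained classifier weights: "income" dominates approval.
w, b = [6.0, 1.0], -4.0
x_denied = [0.2, 0.5]                  # originally classified "denied"
x_cf = find_counterfactual(x_denied, w, b)
```

The resulting x_cf crosses the decision boundary (achieves the target) while staying as close to the original record as the loss weighting allows.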
[0052] As the flow of the training process of the counterfactual inference unit 220 will be described later, the details thereof will be omitted here.
[0053] In the inference phase, the trained counterfactual inference unit 220 is used to generate a counterfactual inference result with respect to a set of test data 212. The set of test data 212 may include a set of data (image data or tabular data) about which a set of counterfactual inferences are to be generated.
[0054] First, the set of test data 212 may be input to the pre-processing unit 230. The pre-processing unit 230 is a functional unit configured to perform one or more pre-processing operations on the set of test data 212 to facilitate the generation of counterfactual inferences. More particularly, in embodiments, the pre-processing unit 230 may analyze the set of test data 212 to determine whether the set of test data 212 includes a set of tabular data or a set of image data. In the case that the set of test data 212 is determined to include a set of tabular data, the pre-processing unit 230 may use a tabular data processing unit 232 to perform a tabular data processing operation (e.g., normalize a set of data features, mask a subset of data features to prevent modification by the counterfactual inference unit 220, or the like) on the set of test data 212. In the case that the set of test data 212 is determined to include a set of image data, the pre-processing unit 230 may use an image processing unit 234 to perform an image processing operation (e.g., a pixel brightness operation, a geometric transformation) on the set of test data 212.
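The pre-processing dispatch described above might be sketched as follows. This is an assumption-laden toy, not the disclosed implementation: tabular records are modeled as dicts, images as pixel grids, and the two operations shown (min-max normalization, a brightness shift) are just one example of each branch.

```python
# Sketch (hypothetical): a pre-processing unit that detects the input type
# and applies a matching operation -- min-max normalization for tabular
# records, a pixel brightness transformation for image data.

def preprocess(data):
    if isinstance(data[0], dict):             # tabular records
        return normalize_tabular(data)
    return brighten_image(data, delta=10)     # image pixel grid

def normalize_tabular(rows):
    """Min-max normalize every numerical column to [0, 1]."""
    keys = [k for k, v in rows[0].items() if isinstance(v, (int, float))]
    lo = {k: min(r[k] for r in rows) for k in keys}
    hi = {k: max(r[k] for r in rows) for k in keys}
    out = []
    for r in rows:
        r = dict(r)
        for k in keys:
            span = hi[k] - lo[k]
            r[k] = (r[k] - lo[k]) / span if span else 0.0
        out.append(r)
    return out

def brighten_image(pixels, delta):
    """Pixel brightness transformation, clamped to the 0-255 range."""
    return [[min(255, max(0, p + delta)) for p in row] for row in pixels]
```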
[0055] After the set of test data 212 has been processed by the pre-processing unit 230, it is input to the counterfactual inference unit 220 trained in the counterfactual inference unit training phase described above. The counterfactual inference unit 220 processes the pre-processed set of test data 212 to generate a set of transformed data in which a subset of the set of data features of the set of test data 212 have been modified to counterfactual features such that the set of transformed data achieves the predetermined target (e.g., loan approval). The set of transformed data is then input to the user interface unit 240.
[0056] The user interface unit 240 is a functional unit configured to display information and receive inputs from users via a graphical user interface. In embodiments, the user interface unit 240 may be configured to present a counterfactual inference result that includes the set of transformed data generated by the counterfactual inference unit 220 as well as the set of data features associated with the set of transformed data and a target achievement indicator that indicates whether the set of transformed data achieves the predetermined target. The user interface unit 240 may also indicate the correlation between different clusters of data features. The user interface unit 240 may receive a user input to further modify one or more of the data features of the set of transformed data to a user-selected counterfactual feature (e.g., a user may modify a data feature corresponding to their income or education level to observe how this affects their loan approval result).
[0057] The feedback unit 250 may collect the user input including the set of user-selected counterfactual features, and use this set of user-selected counterfactual features to generate training data to further train the classifier unit 210.
[0058] As the flow of the inference phase will be described later, the details thereof will be omitted here.
[0059] According to the counterfactual inference management device 200 described above, it is possible to consider the correlation between features in order to facilitate the elimination of infeasible counterfactual inferences, provide increased flexibility to allow users to select an appropriate counterfactual inference, and offer scalability for handling tabular data and image data in a single configuration.
[0060] Next, a counterfactual inference management method with respect to tabular data according to the embodiments of the present disclosure will be described with reference to
[0061]
[0062] It should be noted that the counterfactual inference management method 300 corresponds to the inference phase of the counterfactual inference management device 200; that is, the classifier unit and the counterfactual inference unit are assumed to already have been trained. As the training processes of the classifier unit and the counterfactual inference unit will be described later, the details thereof will be omitted here.
[0063] First, at Step S301, the pre-processing unit (for example, the pre-processing unit 230 illustrated in
[0064] The set of input data may include a set of data features. Here, the set of data features refers to a collection of properties or attributes that characterize the set of input data. In the case of tabular data, the set of data features may include a set of numerical features or a set of categorical features. As an example, in the case that the set of input data includes personal information provided by a loan applicant for use in determining their eligibility to qualify for a loan, the set of data features may include numerical features such as the age of the applicant, the income of the applicant, or the like, as well as categorical features such as the gender of the applicant, the educational level of the applicant, the occupation of the applicant, or the like.
[0065] Next, at Step S302, the pre-processing unit performs a mode selection operation to determine whether the set of input data is a set of tabular data or a set of image data. In the case that the set of input data includes a set of tabular data, the processing proceeds to Step S303. In the case that the set of input data includes a set of image data, the processing proceeds to Step S403. Note that, as the overall flow of a counterfactual inference management method 400 with respect to image data will be described with reference to
[0066] Next, at Step S303, the tabular data processing unit (that is, the tabular data processing unit 232 of the pre-processing unit 230) performs a masking operation with respect to the set of input data. Here, the masking operation refers to an operation to mask (e.g., freeze, lock, hold, maintain) one or more of the data features of the set of input data. Features that are masked are prevented from being modified by the processing of the counterfactual inference unit, as will be described later.
[0067] Next, at Step S304, the tabular data processing unit performs a tabular data processing operation with respect to the set of data features of the set of input data that were not masked in Step S303. Here, the tabular data processing operation may include an operation to facilitate processing by the counterfactual inference unit. As examples, the tabular data processing unit may utilize a normalization technique to normalize numerical features (e.g., age, income, hours worked per week) of the set of input data, or a one-hot encoding technique to represent the categorical features (e.g., occupation, educational level, marital status) of the set of input data in a vector format. Performing the tabular data processing operation with respect to the set of input data may increase the accuracy of the counterfactual inference unit.
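As a non-limiting illustration, min-max normalization of numerical features and one-hot encoding of categorical features may be sketched in Python as follows; the helper names and toy values are assumptions for illustration only:

```python
def min_max_normalize(values):
    # Rescale numerical features to the range [0, 1].
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def one_hot(value, categories):
    # Represent a categorical feature in a vector format.
    return [1 if value == c else 0 for c in categories]

ages = [25, 40, 55]
print(min_max_normalize(ages))  # [0.0, 0.5, 1.0]

occupations = ["clerk", "engineer", "teacher"]
print(one_hot("engineer", occupations))  # [0, 1, 0]
```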
[0068] At Step S305, the classifier unit is trained to determine whether an input data set that includes a set of data features achieves a predetermined target. More particularly, the classifier unit may be trained to order or categorize data into one or more of a set of classes. Here, the predetermined target refers to a particular goal or classification defined in advance. As an example, in the case of a loan application scenario, the predetermined target may be “loan approval.” As the training process of the classifier unit will be described later, the details thereof will be omitted here.
[0069] At Step S306, the counterfactual inference unit is trained to perform encoding and decoding operations on the set of input data to generate a set of transformed data that achieves the predetermined target. As the training process of the counterfactual inference unit will be described later, the details thereof will be omitted here.
[0070] As described herein, the training process of the classifier unit in Step S305 and the training process of the counterfactual inference unit in Step S306 are independent steps that can be performed in advance, and are assumed to be completed prior to the pre-processed data being input to the counterfactual inference unit (that is, the counterfactual inference unit is trained prior to receiving the pre-processed set of input data).
[0071] Next, at Step S307, the set of input data pre-processed at Step S304 is input to the counterfactual inference unit trained at Step S306, and at Step S308, the counterfactual inference unit generates a counterfactual inference result at least including a set of transformed data in which a subset of the set of data features of the set of input data have been modified to counterfactual features such that the set of transformed data achieves the predetermined target (e.g., loan approval).
[0072] As the processing performed by the counterfactual inference unit to generate the set of transformed data will be described later, the details thereof will be omitted here.
[0073] Next, at Step S309, the user interface unit (e.g., the user interface unit 240 illustrated in
[0074] According to the counterfactual inference management method 300 with respect to tabular data described above, it is possible to generate a counterfactual inference result for tabular data that provides users with increased flexibility to select an appropriate counterfactual inference, and utilize the user input to further increase the accuracy of the counterfactual inference management device.
[0075] Next, a counterfactual inference management method with respect to image data according to the embodiments of the present disclosure will be described with reference to
[0076]
[0077] It should be noted that the counterfactual inference management method 400 corresponds to the inference phase of the counterfactual inference management device 200; that is, the classifier unit and the counterfactual inference unit are assumed to already have been trained. As the training processes of the classifier unit and the counterfactual inference unit will be described later, the details thereof will be omitted here.
[0078] First, at Step S401, the pre-processing unit (for example, the pre-processing unit 230 illustrated in
[0079] The set of input data may include a set of data features. Here, the set of data features refers to a collection of properties or attributes that characterize the set of input data. In the case of image data, the set of data features may include a set of image features. As an example, in the case that the set of input data includes an image of a bedroom, the set of data features may include image features such as the lighting of the room, the angle of the image, spatial composition of the image, classes of objects present in the image, colors of the objects, or the like. However, the set of image features is not limited hereto, and other image features, such as weather or the like, may also be utilized.
[0080] Next, at Step S402, the pre-processing unit performs a mode selection operation to determine whether the set of input data is a set of tabular data or a set of image data. In the case that the set of input data includes a set of tabular data, the processing proceeds to Step S303. In the case that the set of input data includes a set of image data, the processing proceeds to Step S403. Note that, as the overall flow of a counterfactual inference management method 300 with respect to tabular data was previously described with reference to
[0081] Next, at Step S403, the image processing unit (that is, the image processing unit 234 of the pre-processing unit 230) performs an image processing operation with respect to the set of input data. Here, the image processing operation refers to an operation to facilitate processing by the counterfactual inference unit. As examples, the image processing unit may utilize a pixel brightness transformation or a geometric transformation to modify the set of image data. Performing the image processing operation with respect to the set of input data may increase the accuracy of the counterfactual inference unit.
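As a non-limiting illustration, a pixel brightness transformation and a simple geometric transformation of the kind that may be performed by the image processing unit can be sketched in Python as follows; the helper names and toy pixel values are assumptions for illustration only:

```python
def adjust_brightness(image, delta):
    # Pixel brightness transformation: shift each pixel value, clamped to [0, 255].
    return [[max(0, min(255, p + delta)) for p in row] for row in image]

def flip_horizontal(image):
    # A simple geometric transformation: mirror each pixel row.
    return [list(reversed(row)) for row in image]

img = [[10, 250], [120, 60]]
print(adjust_brightness(img, 20))  # [[30, 255], [140, 80]]
print(flip_horizontal(img))        # [[250, 10], [60, 120]]
```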
[0082] At Step S404, the classifier unit is trained to determine whether an input data set that includes a set of data features achieves a predetermined target (e.g., whether there is indoor lighting in an image of a bedroom, whether an image shows a cat). More particularly, the classifier unit may be trained to order or categorize data into one or more of a set of classes. As the training process of the classifier unit will be described later, the details thereof will be omitted here.
[0083] At Step S405, the counterfactual inference unit is trained to perform encoding and decoding operations on the set of input data to generate a set of transformed data that achieves the predetermined target. As the training process of the counterfactual inference unit will be described later, the details thereof will be omitted here.
[0084] As described herein, the training process of the classifier unit in Step S404 and the training process of the counterfactual inference unit in Step S405 are independent steps that can be performed in advance, and are assumed to be completed prior to the pre-processed data being input to the counterfactual inference unit (that is, the counterfactual inference unit is trained prior to receiving the pre-processed set of input data).
[0085] Next, at Step S406, the set of input data pre-processed at Step S403 is input to the counterfactual inference unit trained at Step S405, and at Step S407, the counterfactual inference unit generates a counterfactual inference result that at least includes a set of transformed data in which a subset of the set of data features of the set of input data have been modified to counterfactual features such that the set of transformed data achieves the predetermined target (e.g., an image of a bedroom with no indoor lighting is transformed to an image with indoor lighting).
[0086] As the processing performed by the counterfactual inference unit to generate the set of transformed data will be described later, the details thereof will be omitted here.
[0087] Next, at Step S408, the user interface unit (e.g., the user interface unit 240 illustrated in
[0088] According to the counterfactual inference management method 400 with respect to image data described above, it is possible to generate a counterfactual inference result for image data that provides users with increased flexibility to select an appropriate counterfactual inference, and utilize the user input to further increase the accuracy of the counterfactual inference management device.
[0089] In this way, by means of the pre-processing unit performing a mode selection operation to determine whether a set of input data is tabular data or image data, and subsequently using a tabular data processing unit to process tabular data features such as numerical features or categorical features, or alternatively using an image processing unit to process image data features, it is possible to generate counterfactual inferences for both tabular data and image data with a single configuration.
[0090] Next, a counterfactual inference generation method with respect to tabular data according to the embodiments of the present disclosure will be described with reference to
[0091]
[0092] Here, for convenience of explanation, an example of a counterfactual inference generation method 500 with respect to tabular data related to a loan application scenario will be described, but the present disclosure is not limited hereto, and counterfactual inference generation may be applied to a variety of use cases.
[0093] First, at Step S502, the pre-processing unit (for example, the pre-processing unit 230 illustrated in
[0094] Upon receiving the set of tabular data 501, the tabular data processing unit performs a tabular data processing operation with respect to the set of data features of the set of tabular data 501 (e.g., the set of data features for which a mask was not designated in Step S303 illustrated in
[0095] Next, at Step S503, the set of tabular data 501 pre-processed at Step S502 is input to the encoder of the counterfactual inference unit. Here, the encoder is a neural network configured to compress and dimensionally reduce the set of tabular data 501 to generate a latent space representation 504 of the set of tabular data 501.
[0096] More particularly, at Step S503A, the encoder performs down-sampling on the set of tabular data 501. This down-sampling may be performed in a down-sampling layer of the neural network used as the encoder. Performing down-sampling on the set of tabular data 501 allows the essential data features of the set of tabular data 501 to be maintained while reducing the overall dimensionality of the set of tabular data 501 and suppressing noise.
[0097] Next, at Step S503B, the encoder processes the set of tabular data 501 with a non-linear layer. The non-linear layer may utilize a non-linear activation function, such as a rectified linear unit (ReLU) activation function, to introduce non-linearities into the set of tabular data 501.
[0098] By processing the set of tabular data 501, the encoder generates a latent space representation 504 of the set of tabular data 501. Here, the latent space representation 504 is a dimensionally reduced representation of the set of tabular data 501. In embodiments, the latent space representation 504 may include a multi-dimensional vector that characterizes the primary data features of the set of tabular data 501.
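As a non-limiting illustration, the encoder processing of Steps S503A and S503B may be sketched in Python as follows; the single dense down-sampling layer, the random weights, and the toy feature vector are assumptions for illustration only, and a practical encoder would comprise multiple trained layers:

```python
import random

def relu(x):
    # Rectified linear unit activation: introduces non-linearity.
    return [max(0.0, v) for v in x]

def linear(x, weights):
    # Dense layer; down-sampling when the output dimension is smaller
    # than the input dimension.
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in weights]

random.seed(0)
in_dim, latent_dim = 6, 2
W = [[random.uniform(-1, 1) for _ in range(in_dim)] for _ in range(latent_dim)]

features = [0.4, 1.0, 0.0, 1.0, 0.2, 0.7]   # pre-processed tabular features
latent = relu(linear(features, W))           # latent space representation
print(len(latent))  # 2
```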
[0099] Next, at Step S505, the latent space representation 504 is input to the decoder of the counterfactual inference unit. Here, the decoder is a neural network configured to decode the latent space representation 504 in order to generate a reconstructed version of the set of tabular data 501.
[0100] More particularly, at Step S505A, the decoder performs up-sampling on the latent space representation 504. This up-sampling may be performed in an up-sampling layer of the neural network used as the decoder. Performing up-sampling on the latent space representation 504 decodes the latent space representation 504 to generate a set of transformed data 507 in which a subset of the set of data features are modified to counterfactual features with respect to the set of tabular data 501.
[0101] In addition, at Step S505B, the decoder uses a probabilistic learner layer to determine the correlation between the data features of the set of transformed data 507. The probabilistic learner layer may utilize a statistical analysis technique to predict data features that have a high probability of exhibiting correlation with one or more other data features of the set of transformed data 507. Here, “correlation” refers to a co-dependent relationship between two or more data features, such that a change in one data feature leads to a change in another data feature. As an example, the probabilistic learner layer may identify correlation between the data features of “age” and “education level,” as changes to an individual's education level often take time, which would result in corresponding changes to the age of the individual.
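As a non-limiting illustration, one statistical analysis technique that the probabilistic learner layer may utilize is the Pearson correlation coefficient; the following Python sketch, including the toy age and education-level data, is an assumption for illustration only:

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two data features.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ages      = [22, 30, 38, 46, 54]
edu_years = [12, 14, 16, 18, 20]   # years of education, perfectly linear here
r = pearson(ages, edu_years)
print(round(r, 3))  # 1.0 in this toy data
```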
[0102] Further, at Step S505C, the decoder applies masks to the subset of data features selected for mask application in Step S303, as described above. Here, applying masks to the subset of data features may include modifying the values of the subset of data features selected for mask application to the same value as the set of tabular data 501, thereby maintaining them at the original value. As an example, the decoder may apply masks to data features that cannot be changed by the user (e.g., race, gender) or the like.
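As a non-limiting illustration, the mask application of Step S505C, in which masked data features are reset to their original values, may be sketched in Python as follows; the dictionary representation of the data features and the helper name `apply_masks` are assumptions for illustration only:

```python
def apply_masks(original, transformed, masked_keys):
    # Reset masked (immutable) features back to their original values,
    # leaving the remaining counterfactual features unchanged.
    out = dict(transformed)
    for key in masked_keys:
        out[key] = original[key]
    return out

original    = {"age": 39, "gender": "F", "occupation": "clerk"}
transformed = {"age": 41, "gender": "M", "occupation": "engineer"}
result = apply_masks(original, transformed, {"gender"})
print(result)  # {'age': 41, 'gender': 'F', 'occupation': 'engineer'}
```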
[0103] The decoder outputs a counterfactual inference result at least including the set of transformed data 507 together with the cluster result 508 generated by the probabilistic learner layer in Step S505B.
[0104] As described herein, as the counterfactual inference unit used here has been trained to perform encoding and decoding operations on the set of tabular data 501 to generate a set of transformed data that achieves the predetermined target, the set of transformed data 507 is a set of data in which a subset of the set of data features of the set of tabular data are modified to counterfactual features to increase the likelihood of the set of transformed data achieving the predetermined target (e.g., loan approval). For instance, as illustrated in
[0105] Further, the cluster result 508 indicates those data features that are determined to have correlation to each other. As will be described later, the user may use this correlation to eliminate non-feasible counterfactual inferences, and select more feasible counterfactual inferences.
[0106] According to the counterfactual inference generation method 500 with respect to tabular data described above, it is possible to generate a counterfactual inference result for tabular data that provides users with recommendations of what attributes (e.g., data features) to change to increase their likelihood of achieving a predetermined target.
[0107] Next, a counterfactual inference generation method with respect to image data according to the embodiments of the present disclosure will be described with reference to
[0108]
[0109] Here, for convenience of explanation, an example of a counterfactual inference generation method 600 with respect to image data illustrating a bedroom will be described, but the present disclosure is not limited hereto, and counterfactual inference generation may be applied to a variety of use cases.
[0110] First, at Step S602, the pre-processing unit (for example, the pre-processing unit 230 illustrated in
[0111] Upon receiving the set of image data 601, the image processing unit (that is, the image processing unit 234 of the pre-processing unit 230) performs an image processing operation with respect to the set of image data 601. Here, the image processing operation refers to an operation to facilitate processing by the counterfactual inference unit. As examples, the image processing unit may utilize a pixel brightness transformation or a geometric transformation to modify the set of image data. Performing the image processing operation with respect to the set of input data may increase the accuracy of the counterfactual inference unit.
[0112] Next, at Step S603, the set of image data 601 pre-processed at Step S602 is input to the encoder of the counterfactual inference unit. Here, the encoder is a neural network configured to compress and dimensionally reduce the set of image data 601 to generate a latent space representation 604 of the set of image data 601.
[0113] More particularly, at Step S603A, the encoder performs down-sampling on the set of image data 601. This down-sampling may be performed in a down-sampling layer of the neural network used as the encoder. Performing down-sampling on the set of image data 601 allows the essential data features of the set of image data 601 to be maintained while reducing the overall dimensionality of the set of image data 601 and suppressing noise.
[0114] Next, at Step S603B, the encoder processes the set of image data 601 with a convolutional layer. Here, the convolutional layer is a neural network layer configured to process the set of image data to extract a feature map. The feature map is a vectorial representation of the set of image data 601. As examples, the convolutional layer may be drawn from convolutional neural network architectures such as LeNet, AlexNet, VGG-16, ResNet, Inception, or the like.
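As a non-limiting illustration, the feature extraction performed by a convolutional layer may be sketched in Python as a toy valid 2-D convolution (no padding, unit stride); the image and kernel values are assumptions for illustration only, and practical implementations use optimized libraries rather than nested loops:

```python
def conv2d(image, kernel):
    # Valid 2-D convolution: slide the kernel over the image and sum
    # the element-wise products at each position, producing a feature map.
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

image  = [[1, 2, 3],
          [4, 5, 6],
          [7, 8, 9]]
edge_k = [[1, -1]]                  # toy horizontal edge-detection kernel
print(conv2d(image, edge_k))        # [[-1, -1], [-1, -1], [-1, -1]]
```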
[0115] By processing the set of image data 601, the encoder generates a latent space representation 604 of the set of image data 601. Here, the latent space representation 604 is a dimensionally reduced representation of the set of image data 601. In embodiments, the latent space representation 604 may include a multi-dimensional vector that characterizes the primary data features of the set of image data 601.
[0116] Next, at Step S605, the latent space representation 604 is input to the decoder of the counterfactual inference unit. Here, the decoder is a neural network configured to decode the latent space representation 604 in order to generate a reconstructed version of the set of image data 601.
[0117] More particularly, at Step S605A, the decoder uses a convolutional layer to reconstruct an image from the latent space representation 604.
[0118] Subsequently, at Step S605B, the decoder performs up-sampling on the image reconstructed by the convolutional layer from the latent space representation 604. This up-sampling may be performed in an up-sampling layer of the neural network used as the decoder.
[0119] In this way, the decoder is able to generate a counterfactual inference result at least including a set of transformed data 606 in which a subset of the set of image features have been modified to counterfactual features. As described herein, as the counterfactual inference unit used here has been trained to perform encoding and decoding operations on the set of image data 601 to generate a set of transformed data that achieves a predetermined target, the set of transformed data 606 is a set of data in which a subset of the set of data features of the set of image data 601 are modified to counterfactual features to increase the likelihood of the set of transformed data achieving the predetermined target (e.g., modifying an image in which no indoor lighting is present to an image of a room in which indoor lighting is present). For instance, as illustrated in
[0120] According to the counterfactual inference generation method 600 with respect to image data described above, it is possible to generate a counterfactual inference result for image data that provides users with transformed images that achieve a predetermined target. In embodiments, these transformed images may be used as training data for other machine learning models. For instance, transformed images that illustrate rare scenarios (e.g., a bear crossing a road) may be generated from input images illustrating more common scenarios (e.g., a dog crossing a road). These transformed images may then be used to supplement machine learning in situations where data availability is an issue.
[0121] Next, an example of a training process for the classifier unit and the counterfactual inference unit with respect to tabular data will be described with reference to
[0122]
[0123] First, a training management unit 703 receives a set of training data 701. Here, the training management unit 703 may include a software module or dedicated hardware configured to implement the training process 700 with respect to the classifier unit and the counterfactual inference unit. The set of training data 701 may include a set of tabular data or a set of image data for training the classifier unit to perform a given classification task. As an example, in the case of a classification task in which a classifier unit is trained to predict individuals whose income exceeds a threshold, the set of training data may include information regarding the age, education level, gender, nationality, and occupation of a number of individuals, together with ground truth data that indicates the correct classification label results for each individual.
[0124] Upon receiving the set of training data 701, the training management unit 703 may select, from a group of base models 702, an appropriate model type for performing the desired classification task. Here, the group of base models 702 may include a collection of machine learning models, networks, or algorithms that can be trained to perform the desired classification task. In embodiments, the training management unit 703 may receive a model selection instruction together with the set of training data 701 that indicates a particular model for use. As examples, the group of base models 702 may include artificial neural networks, deep learning algorithms, learning classifiers, Bayesian networks, or the like. Upon selection of a base model from the group of base models 702, the training management unit 703 utilizes the set of training data 701 to train the selected base model to perform the desired classification task (e.g., predicting whether or not an individual's income exceeds a threshold). The training process 700 may be repeated until the base model achieves a desired accuracy level, after which the trained base model is saved as the trained classifier unit 704.
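As a non-limiting illustration, repeating a training process until a base model achieves a desired accuracy level may be sketched in Python as follows; the toy "base model" (a single income threshold tuned by random search) and all data values are assumptions for illustration only, standing in for the neural networks or other base models described above:

```python
import random

def accuracy(model_threshold, data):
    # Fraction of samples whose predicted label matches the ground truth.
    correct = sum((income > model_threshold) == label for income, label in data)
    return correct / len(data)

def train_threshold_classifier(data, target_accuracy=0.9):
    """Toy 'base model': a single income threshold, tuned by random search."""
    random.seed(0)
    best_t, best_acc = 0.0, 0.0
    for _ in range(200):                 # repeat until desired accuracy is reached
        t = random.uniform(0, 100_000)
        acc = accuracy(t, data)
        if acc > best_acc:
            best_t, best_acc = t, acc
        if best_acc >= target_accuracy:
            break
    return best_t, best_acc

# Ground-truth labels: True if income exceeds 50,000.
data = [(income, income > 50_000) for income in range(10_000, 100_000, 10_000)]
threshold, acc = train_threshold_classifier(data)
print(acc >= 0.9)  # True
```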
[0125] Next, a counterfactual inference unit 706 is trained using a set of training data 705. The set of training data 705 may include a set of tabular data or image data used for training the counterfactual inference unit 706. In embodiments, the set of training data 705 may substantially correspond to the set of training data 701 used to train the trained classifier unit 704. Here, the counterfactual inference unit 706 is trained to perform encoding and decoding operations on an input data set to generate a set of transformed data that achieves the predetermined target of the trained classifier unit 704. As an example, in the case that the trained classifier unit 704 is trained to predict individuals whose income exceeds a threshold, the counterfactual inference unit 706 is trained to generate sets of transformed data in which a subset of the set of data features of the input data set are modified to counterfactual features (e.g., modifications to data features such as occupation, hours worked per week, or the like) such that the set of transformed data is classified as corresponding to an individual whose income will exceed the threshold. As will be described herein, the training of the counterfactual inference unit 706 is associated with a model loss 708. This model loss 708 will be described below.
[0126] The set of transformed data generated by the counterfactual inference unit 706 is input to the trained classifier unit 704. The trained classifier unit 704 processes the set of transformed data generated by the counterfactual inference unit 706 to determine whether the set of transformed data achieves a predetermined target (e.g., income above a threshold), and calculates an associated counterfactual loss 707. This counterfactual loss 707 arises from those samples of the transformed data that are determined to not achieve the predetermined target.
[0127] As described above, the counterfactual inference unit 706 is trained to reduce the model loss 708. Here, as illustrated in Equation 1, the model loss 708 (L.sub.v) is represented as the sum of a plurality of loss values:

L.sub.v=L.sub.kl+L.sub.recon+L.sub.CF+L.sub.prob  (Equation 1)

[0128] Here, L.sub.kl represents the Kullback-Leibler divergence loss, L.sub.recon represents the reconstruction loss resulting from the encoder-decoder processing, L.sub.CF represents the counterfactual loss 707 arising from those samples of the transformed data that were determined to not achieve the predetermined target by the trained classifier unit 704, and L.sub.prob represents the supplementary probabilistic loss of calculating the correlation between clusters of data features.
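As a non-limiting illustration, the composition of the model loss 708 from its component loss values may be sketched in Python as follows; the concrete formulas chosen here (a Gaussian KL divergence term, a mean-squared reconstruction error, and a cross-entropy counterfactual term), together with the placeholder probabilistic loss value, are assumptions for illustration only:

```python
import math

def kl_loss(mu, logvar):
    # KL divergence between N(mu, sigma^2) and N(0, 1), summed over
    # the latent dimensions (a common variational formulation).
    return sum(-0.5 * (1 + lv - m ** 2 - math.exp(lv)) for m, lv in zip(mu, logvar))

def recon_loss(x, x_hat):
    # Mean-squared reconstruction error from the encoder-decoder processing.
    return sum((a - b) ** 2 for a, b in zip(x, x_hat)) / len(x)

def cf_loss(target_probs):
    # Cross-entropy counterfactual loss: penalizes transformed samples whose
    # classifier probability of achieving the predetermined target is low.
    return sum(-math.log(p) for p in target_probs) / len(target_probs)

mu, logvar = [0.1, -0.2], [0.0, 0.1]         # toy latent statistics
x, x_hat = [0.5, 1.0, 0.0], [0.45, 0.9, 0.1]  # input vs. reconstruction
probs = [0.8, 0.6]       # classifier outputs for the transformed samples
L_prob = 0.05            # placeholder supplementary probabilistic loss

L_v = kl_loss(mu, logvar) + recon_loss(x, x_hat) + cf_loss(probs) + L_prob
print(round(L_v, 4))
```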
[0129] By adjusting the parameters of the counterfactual inference unit 706 to minimize this model loss 708 (L.sub.v), the counterfactual inference unit 706 is trained to generate sets of transformed data that achieve the predetermined target of the trained classifier unit 704.
[0130] In addition, it should be noted that in the training process 700, the masking operation described herein is not performed to mask a subset of the set of data features of the training data; that is, the training process 700 is performed without masking any data features. By performing the training process 700 without masking any data features of the training data, local convergence of the models can be prevented.
[0131] According to the training process 700 for the classifier unit and the counterfactual inference unit as described above, the counterfactual inference unit can be trained to generate sets of transformed data in which a subset of the set of data features of the input data set are modified to counterfactual features such that the set of transformed data achieves a predetermined target. These counterfactual features can be used to provide users with insights about actions to take to increase their likelihood of achieving a predetermined target (e.g., loan approval, income above a threshold).
[0132] Next, an example of a feedback process of the counterfactual inference management device will be described with reference to
[0133]
[0134] As illustrated in
[0135] Subsequently, this counterfactual inference result 807 together with the user input may be aggregated as a set of training data 810 and input to the training management unit 703. The training management unit 703 may then select, from a group of base models 702, an appropriate model type for performing a desired classification task, and train the selected model using the set of training data 810 to generate a trained classifier unit 704.
[0136] According to the feedback process 800 of the counterfactual inference management device, the counterfactual inference result 807 and the user input received from the user can be used to train a classifier unit. This trained classifier unit can then subsequently be used to train a counterfactual inference unit as described above. In this way, by means of training a classifier unit and a counterfactual inference unit based on the counterfactual inference result 807 and the user input received from the user, it becomes possible to generate flexible counterfactual inference results that provide users a range of options to customize based on their preferences.
[0137] Next, an example of a mask selection window according to the embodiments of the present disclosure will be described with reference to
[0138]
[0139] As illustrated in
[0140] By selecting the import file button 901, a user can select the set of input data (e.g., the set of test data to be used in the counterfactual inference generation process) including the set of data features they wish to mask.
[0141] By selecting the mode selection button 902, a user can select between a tabular mode for designating a tabular data processing operation (e.g., mask selection, data normalization) and an image mode for designating an image processing operation (e.g., pixel brightness operation, geometric transformation).
[0142] By selecting the mask selection button 903, a user can select the particular data features to which they wish to assign a mask. In embodiments, it may be preferable to assign masks to those data features that cannot be freely changed by the user. As an example, as illustrated in
[0143] By means of the mask selection window 900, users can assign masks to any number of data features of the set of data features. In this way, by assigning masks to those data features that cannot be freely changed by the user, it is possible to suppress the generation of infeasible counterfactual inferences that require changes to data features that correspond to attributes that cannot be changed by the user.
[0144] Next, an example of an image import window according to the embodiments of the present disclosure will be described with reference to
[0145]
[0146] As illustrated in
[0147] By selecting the import file button 1001, a user can select the set of image data to be used in the counterfactual inference generation process.
[0148] By selecting the mode selection button 1002, a user can select between a tabular mode for designating a tabular data processing operation (e.g., mask selection, data normalization) and an image mode for designating an image processing operation (e.g., pixel brightness operation, geometric transformation).
[0149] By selecting the pixel brightness operation button 1003, a user can select one or more pixel brightness operations to perform with respect to the set of image data. The pixel brightness operation may include an operation to increase or decrease the brightness of one or more pixels of the set of image data.
[0150] By selecting the geometric transformation button 1004, a user can select one or more geometric transformation operations to perform with respect to the set of image data. As examples, the geometric transformation operations may include translations, Euclidean transformations, resizing, scaling, or other operations to adjust the geometric features of elements of the set of image data.
[0151] In the image display area 1005, a preview of the set of image data selected by the user via the import file button 1001 is displayed. As an example, as illustrated in
[0152] By means of the image import window 1000, users can select sets of image data to be used in the counterfactual inference process. Further, users can designate one or more image processing operations to perform on the set of image data. Performing one or more image processing operations on the set of image data may increase the accuracy of the counterfactual inference results generated by the counterfactual inference process.
[0153] Next, an example of a counterfactual inference result display for tabular data will be described with reference to
[0154]
[0155] As illustrated in
[0156] The feature display area 1110 is a graphical user interface element for illustrating the data features of the set of transformed data. As described herein, in the case of tabular data, the set of transformed data may include a set of numerical features and a set of categorical features. As examples, in
[0157] As described herein, in embodiments, users of the counterfactual inference result display 1100 may enter a user input to modify one or more data features of the set of transformed data. As an example, in the case that feature 1 represents “income,” a user may use the slider for feature 1 to modify his or her income value to a counterfactual value to observe how this affects the counterfactual inference result. Similarly, in the case that feature 4 represents “occupation,” a user may use the drop-down box to modify his or her occupation to observe how this affects the counterfactual inference result.
[0158] The target achievement indicator area 1120 is a graphical user interface element for indicating whether the set of transformed data achieves the predetermined target. As an example, in the case that the predetermined target is “loan approval,” the target achievement indicator area 1120 may indicate “success” or “failure” of a loan applicant to be approved for a loan. In embodiments, the target achievement indicator area 1120 may be configured to automatically update in real time in response to the modifications to the data features made by the user. Accordingly, a user can observe in real time how modifications to the data features influence the counterfactual inference result.
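The real-time re-evaluation described above can be sketched as follows. The logistic model, its weights, and the feature names "income" and "debt" are hypothetical stand-ins for a trained classifier unit; the values are illustrative only.

```python
import math

# Toy stand-in for the trained classifier unit: a logistic model over
# two hypothetical features. Weights and bias are illustrative values.
WEIGHTS = {"income": 0.8, "debt": -1.2}
BIAS = -0.5

def achieves_target(features, threshold=0.5):
    """Return True if the classifier's score meets the target threshold."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    prob = 1.0 / (1.0 + math.exp(-z))
    return prob >= threshold

# Simulate a user dragging the "income" slider: the indicator area would
# re-render as soon as the modified feature set is re-classified.
features = {"income": 0.5, "debt": 1.0}
before = "success" if achieves_target(features) else "failure"
features["income"] = 2.5           # counterfactual modification via the slider
after = "success" if achieves_target(features) else "failure"
```

Wiring such a check to a slider's change event is what gives the target achievement indicator area its real-time behaviour.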
[0159] The correlation indicator area 1130 is a graphical user interface element that indicates the correlation between particular data features of the set of transformed data. For example, as illustrated in
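One way such pairwise correlations could be computed is a standard Pearson correlation matrix over the observed feature columns; the sketch below uses hypothetical column data and NumPy's `corrcoef`, which is an assumption about the statistical technique rather than a statement of the disclosed one.

```python
import numpy as np

# Hypothetical observed feature columns; in the display, the value of
# corr[i, j] for each feature pair would be rendered in area 1130.
income = np.array([30.0, 45.0, 60.0, 75.0, 90.0])
savings = np.array([3.0, 5.0, 6.5, 7.0, 9.5])    # rises with income
age = np.array([25.0, 52.0, 31.0, 47.0, 38.0])   # unrelated to income

corr = np.corrcoef(np.vstack([income, savings, age]))
# Each entry lies in [-1, 1]; income vs. savings is strongly positive here.
```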
[0160] Additionally, the counterfactual inference result display 1100 may include a save button 1135. By selecting the save button 1135, a user may save the counterfactual inference result to a designated storage area. In embodiments, upon saving the counterfactual inference results, the counterfactual inference results may be sent to the training management unit for use as training data to train a classifier unit.
[0161] According to the counterfactual inference result display 1100 illustrated in
[0162] Next, an example of a counterfactual inference result display for image data will be described with reference to
[0163]
[0164] As illustrated in
[0165] The feature display area 1210 is a graphical user interface element for illustrating the data features of the set of transformed data. As described herein, in the case of image data, the set of transformed data may include a transformed image characterized by a set of image features. As examples, in
[0166] As described herein, in embodiments, users of the counterfactual inference result display 1200 may enter a user input to modify one or more data features of the set of transformed data. As an example, in the case that feature 1 represents “brightness,” a user may use the slider for feature 1 to modify the brightness level to a counterfactual value to observe how this affects the transformed image.
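A brightness modification of the kind driven by such a slider can be sketched as a clamped per-pixel offset on an 8-bit image. The function name and the offset convention are illustrative assumptions.

```python
import numpy as np

def set_brightness(img, offset):
    """Apply a brightness offset to an 8-bit image, clamping to [0, 255]."""
    return np.clip(img.astype(int) + offset, 0, 255).astype(np.uint8)

img = np.array([[10, 120],
                [250, 60]], dtype=np.uint8)
brighter = set_brightness(img, 40)    # slider moved up
darker = set_brightness(img, -40)     # slider moved down
```

Saturated pixels clamp at the ends of the range, which is why the transformed image display can change non-uniformly as the slider moves.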
[0167] The transformed image display area 1220 is a graphical user interface element for illustrating the transformed image generated by the counterfactual inference unit. As an example, as illustrated in
[0168] Additionally, the counterfactual inference result display 1200 may include a save button 1235. By selecting the save button 1235, a user may save the counterfactual inference result to a designated storage area. In embodiments, upon saving the counterfactual inference results, the counterfactual inference results may be sent to the training management unit for use as training data to train a classifier unit.

[0169] According to the counterfactual inference result display 1200 illustrated in
[0170] As described herein, according to the counterfactual inference management device, counterfactual inference management method, and counterfactual inference management computer program product of the present disclosure, a variety of advantageous effects can be demonstrated.
[0171] For example, as the counterfactual inference unit is trained based on the output of a trained classifier unit, the trained counterfactual inference unit is capable of generating counterfactual inference results that include sets of transformed data in which one or more data features have been modified to counterfactual data features that achieve a predetermined target of the classifier unit. These counterfactual inference results may serve as recommendations to users that provide insight into how a particular outcome would change if the input factors were hypothetically changed.
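One hedged sketch of how a counterfactual inference unit might be driven by a trained classifier's output is the coordinate search below, which nudges only user-selected mutable features until the classifier's target threshold is met. The toy classifier, its weights, the hinge-style counterfactual loss, and the search hyperparameters are all assumptions for illustration; the claimed device's actual training procedure is not restated here.

```python
import math

def classifier_score(x):
    """Stand-in trained classifier: probability that the target is achieved."""
    z = 1.5 * x[0] - 1.0 * x[1] - 0.5
    return 1.0 / (1.0 + math.exp(-z))

def counterfactual(x, mutable, steps=500, lr=0.5, threshold=0.5):
    """Modify only the mutable features until the classifier's target is
    achieved, i.e. until the counterfactual loss max(0, t - score) is zero."""
    x = list(x)
    for _ in range(steps):
        if classifier_score(x) >= threshold:
            return x                      # transformed data achieves the target
        for i in mutable:
            eps = 1e-4                    # finite-difference gradient estimate
            bumped = list(x)
            bumped[i] += eps
            grad = (classifier_score(bumped) - classifier_score(x)) / eps
            x[i] += lr * grad             # step toward achieving the target
    return x

result = counterfactual([0.0, 1.0], mutable=[0])
```

The immutable feature (index 1) is left untouched, mirroring how a user-selected subset of data features is modified to counterfactual values while the rest are held fixed.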
[0172] Additionally, as the counterfactual inference unit is trained based on user-selected counterfactual features, the trained counterfactual inference unit is capable of generating flexible counterfactual inference results that allow a user to explore a variety of hypothetical cases for achieving a predetermined target.
[0173] In addition, as the counterfactual inference unit utilizes a statistical analysis technique to predict data features that have a high probability of exhibiting correlation with one or more other data features of the set of transformed data, users can easily eliminate infeasible counterfactual inferences (e.g., a counterfactual inference that requires changing one data feature without changing another, strongly correlated data feature may not be feasible).
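A feasibility filter of this kind can be sketched as a check over a precomputed correlation matrix: if a counterfactual changes one feature but leaves a strongly correlated feature unchanged, it is flagged. The matrix values, the 0.8 threshold, and the function name are illustrative assumptions.

```python
import numpy as np

def infeasible(changed, corr, corr_threshold=0.8):
    """Return (i, j) pairs where feature i was changed but strongly
    correlated feature j was not - a likely-infeasible counterfactual."""
    flagged = []
    n = corr.shape[0]
    for i in range(n):
        for j in range(n):
            if (i != j and changed[i] and not changed[j]
                    and abs(corr[i, j]) >= corr_threshold):
                flagged.append((i, j))
    return flagged

# Hypothetical correlation matrix: features 0 and 1 move together.
corr = np.array([[1.0, 0.9, 0.1],
                 [0.9, 1.0, 0.2],
                 [0.1, 0.2, 1.0]])
solo = infeasible([True, False, False], corr)    # changed 0 but not 1: flagged
paired = infeasible([True, True, False], corr)   # changed 0 and 1 together: ok
```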
[0174] Further, as the counterfactual inference unit may be configured to switch between separate operating modes and perform specialized processing steps based on whether the input data is tabular data or image data, the counterfactual inference management device is capable of generating counterfactual inference results with respect to both tabular data and image data.
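The mode switch between tabular and image processing can be sketched as a simple dispatch on the input's structure. The routing criteria (a mapping of column names for tabular data, a pixel grid for image data) and the placeholder pipelines are assumptions made for illustration; the specialised processing steps themselves are described elsewhere in this disclosure.

```python
import numpy as np

def preprocess(data):
    """Route input data to the tabular or image pipeline based on its shape.
    The two branches stand in for the specialised processing steps."""
    if isinstance(data, dict):                       # column-name -> values
        return ("tabular", {k: list(v) for k, v in data.items()})
    arr = np.asarray(data)
    if arr.ndim >= 2:                                # pixel grid
        return ("image", arr.astype(float) / 255.0)  # e.g. normalise to [0, 1]
    raise TypeError("unsupported input data")

mode_t, cols = preprocess({"income": [30, 45], "debt": [1, 2]})
mode_i, scaled = preprocess([[0, 255], [128, 64]])
```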
[0175] The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
[0176] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
[0177] A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
[0178] Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
[0179] These computer readable program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
[0180] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0181] Embodiments according to this disclosure may be provided to end-users through a cloud-computing infrastructure. Cloud computing generally refers to the provision of scalable computing resources as a service over a network. More formally, cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Thus, cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
[0182] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
[0183] While the foregoing is directed to exemplary embodiments, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow. The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
[0184] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the various embodiments. “Set of,” “group of,” “bunch of,” etc. are intended to include one or more. It will be further understood that the terms “includes” and/or “including,” when used in this specification, specify the presence of the stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. In the previous detailed description of exemplary embodiments of the various embodiments, reference was made to the accompanying drawings (where like numbers represent like elements), which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the various embodiments may be practiced. These embodiments were described in sufficient detail to enable those skilled in the art to practice the embodiments, but other embodiments may be used, and logical, mechanical, electrical, and other changes may be made without departing from the scope of the various embodiments. In the previous description, numerous specific details were set forth to provide a thorough understanding of the various embodiments. However, the various embodiments may be practiced without these specific details. In other instances, well-known circuits, structures, and techniques have not been shown in detail in order not to obscure embodiments.
[REFERENCE SIGNS LIST]
[0185] 100... Computer system
[0186] 102... Processor
[0187] 104... Memory
[0188] 106... Memory bus
[0189] 108... I/O bus
[0190] 109... Bus IF
[0191] 110... I/O Bus IF
[0192] 112... Terminal interface
[0193] 113... Storage interface
[0194] 114... I/O device interface
[0195] 115... Network interface
[0196] 116... User I/O device
[0197] 117... Storage device
[0198] 124... Display system
[0199] 126... Display
[0200] 130... Network
[0201] 150... Counterfactual inference management application
[0202] 200... Counterfactual inference management device
[0203] 210... Classifier unit
[0204] 220... Counterfactual inference unit
[0205] 230... Pre-processing unit
[0206] 232... Tabular data processing unit
[0207] 234... Image processing unit
[0208] 240... User interface unit
[0209] 250... Feedback unit