G06N3/088

NEUROSYMBOLIC DATA IMPUTATION USING AUTOENCODER AND EMBEDDINGS
20230048764 · 2023-02-16 ·

Methods, systems and apparatus, including computer programs encoded on a computer storage medium, for training a neurosymbolic data imputation system on training data inputs from a data domain to impute missing data in a data input from the data domain. In one aspect, a method includes, for each training data input, adding random noise to missing fields of the training data input;

generating an embedding data input for the training data input using concept embeddings from the domain; processing the noisy data input and the embedding data input through a correlation network to obtain correlation data; applying attention to the noisy training data input and the correlation data to generate a combined data input; processing, by an autoencoder, the combined data input to obtain a decoded data output; computing a difference between the decoded data output and the training data input; and updating parameters of the data imputation system using the difference.
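The training step enumerated above can be sketched as a single function. This is a minimal illustration, not the patented implementation: the `correlate`, `attend`, and `autoencoder` arguments stand in for the learned correlation network, attention mechanism, and autoencoder, and the mean-of-embeddings and squared-error choices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def imputation_training_step(x, missing_mask, concept_embeddings,
                             correlate, attend, autoencoder):
    """One hedged sketch of the training step: noise, embed, correlate,
    attend, reconstruct, and score against the clean input."""
    # 1. Add random noise to the missing fields only
    noisy = np.where(missing_mask, rng.normal(size=x.shape), x)
    # 2. Build the embedding data input from the domain concept embeddings
    emb = concept_embeddings.mean(axis=0)
    # 3. Correlation network over the noisy input and the embedding input
    corr = correlate(noisy, emb)
    # 4. Attention combines the noisy input with the correlation data
    combined = attend(noisy, corr)
    # 5. Autoencoder decodes; the difference to the clean input is the loss
    decoded = autoencoder(combined)
    return np.mean((decoded - x) ** 2)
```

In a real system the returned difference would drive a gradient update of the imputation system's parameters; here it is simply returned.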

SYSTEM AND METHOD FOR UNSUPERVISED LEARNING OF SEGMENTATION TASKS
20230050573 · 2023-02-16 ·

Apparatuses and methods are provided for training a feature extraction model by determining a loss function for use in unsupervised image segmentation. A method includes determining a clustering loss from an image; determining a weakly supervised contrastive loss of the image using cluster pseudo labels based on the clustering loss; and determining the loss function based on the clustering loss and the weakly supervised contrastive loss.
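The two-part objective can be sketched as follows. This is an illustrative reading of the abstract only: the k-means-style clustering distance and the NT-Xent-style contrastive form are assumptions, as are the variable names.

```python
import numpy as np

def combined_segmentation_loss(features, centroids, temperature=0.1):
    """Hedged sketch: a clustering loss plus a weakly supervised
    contrastive loss that uses the cluster assignments as pseudo labels."""
    # Clustering loss: distance of each feature to its nearest centroid
    d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=-1)
    pseudo = d.argmin(axis=1)                      # cluster pseudo labels
    cluster_loss = d[np.arange(len(features)), pseudo].mean()

    # Weakly supervised contrastive loss over the pseudo labels
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = np.exp(f @ f.T / temperature)
    np.fill_diagonal(sim, 0.0)                     # exclude self-similarity
    same = (pseudo[:, None] == pseudo[None, :]) & ~np.eye(len(f), dtype=bool)
    pos = (sim * same).sum(axis=1)                 # same-cluster positives
    contrastive = -np.log((pos + 1e-12) / (sim.sum(axis=1) + 1e-12)).mean()

    return cluster_loss + contrastive
```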

RECORD MATCHING MODEL USING DEEP LEARNING FOR IMPROVED SCALABILITY AND ADAPTABILITY
20230046079 · 2023-02-16 ·

Systems and methods are described for linking records from different databases. A search may be performed for each record of a received record set for similar records based on having similar field values. Each record of the record set, together with its identified similar records, may be assigned to a sub-group. Pairs of records may be formed for each record of a sub-group, and comparative and identifying features may be extracted from each field of the pairs of records. Then, a trained model may be applied to these features to determine a similarity score. Cluster identifiers may be applied to records within each sub-group having similarity scores greater than a predetermined threshold. In response to a query for a requested record, all records having the same cluster identifier may be output on a graphical interface, allowing users to observe linked records for a person in the different databases.
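The pipeline can be sketched end to end. All names here are illustrative: `block_key` stands in for the similar-field-value search (blocking), `pair_features` for the per-field feature extraction, and `score` for the trained deep model.

```python
from itertools import combinations

def link_records(records, block_key, pair_features, score, threshold=0.8):
    """Hedged sketch of the linking pipeline described above."""
    # Assign records with similar field values to sub-groups (blocking)
    blocks = {}
    for r in records:
        blocks.setdefault(block_key(r), []).append(r)

    # Score record pairs within each sub-group; link pairs above threshold
    cluster_of, next_id = {}, 0
    for members in blocks.values():
        for a, b in combinations(members, 2):
            if score(pair_features(a, b)) > threshold:
                cid = cluster_of.get(id(a), cluster_of.get(id(b)))
                if cid is None:
                    cid, next_id = next_id, next_id + 1
                cluster_of[id(a)] = cluster_of[id(b)] = cid
    return cluster_of          # record id -> cluster identifier
```

Records sharing a cluster identifier can then be retrieved together in response to a query, as the abstract describes.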

MULTI-LINGUAL CODE GENERATION WITH ZERO-SHOT INFERENCE

A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-shot inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model is able to learn to predict an ordered sequence of code elements that complete a line of source code in a programming language seen or not seen during training.
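The file-level versus local context split can be illustrated with a small extractor. This is a sketch for Python source only, using the standard `ast` module; the patent's feature extraction is not limited to this form, and the split shown (signatures as file-level context, bodies as local context) is an assumed simplification.

```python
import ast

def extract_contexts(source):
    """Hedged sketch: split a source file into a file-level context
    (class/function signatures) and a local context (function bodies)."""
    tree = ast.parse(source)
    file_level, local = [], []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            # Signature line contributes to the file-level context
            header = ast.get_source_segment(source, node).splitlines()[0]
            file_level.append(header)
        if isinstance(node, ast.FunctionDef):
            # Body statements contribute to the local context
            body = [ast.get_source_segment(source, s) for s in node.body]
            local.append("\n".join(filter(None, body)))
    return file_level, local
```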

MESSAGE-BASED PROCESSING SYSTEM AND METHOD OF OPERATING THE SAME

A message based processor system (1) with a plurality of message based processor system cores (100) is proposed. Cores therein comprise a processor element controller that is configured to receive a message with an indication of a subset of processor elements in the core to which it is directed as well as an indication of a target pattern, and to update the state value of the processor elements (Ei) in the subset in accordance with a specification of the target pattern. The processor element controller (PEC) is configurable in an address computation mode selected from a cyclic set of address computation modes, and configured to maintain its computation mode or assume the next address computation mode in the cyclic set dependent on a control value of a currently applied pattern element. Therewith a target pattern can be efficiently specified.
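The mode-cycling behavior of the processor element controller can be sketched in a few lines. The mode names and the one-bit control rule below are assumptions for illustration; the abstract specifies only that the PEC either keeps its mode or advances to the next mode in a cyclic set depending on the control value.

```python
MODES = ["linear", "strided", "masked", "broadcast"]  # assumed cyclic set

class ProcessorElementController:
    """Hedged sketch of the PEC's cyclic address computation modes."""
    def __init__(self):
        self.mode_index = 0

    def apply_pattern_element(self, control_value):
        # Nonzero control value: advance to the next mode in the cyclic set;
        # zero: maintain the current address computation mode.
        if control_value:
            self.mode_index = (self.mode_index + 1) % len(MODES)
        return MODES[self.mode_index]
```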

METHOD FOR TRAINING NON-AUTOREGRESSIVE TRANSLATION MODEL
20230051373 · 2023-02-16 ·

A method for training a non-autoregressive translation (NAT) model includes: acquiring a source language text, a target language text corresponding to the source language text and a target length of the target language text; generating a target language prediction text and a prediction length by inputting the source language text into the NAT model, in which initialization parameters of the NAT model are determined based on parameters of a pre-trained translation model; and obtaining a target NAT model by training the NAT model based on the target language text, the target language prediction text, the target length and the prediction length.
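The combined objective implied by the abstract can be sketched as a token-level loss on the predicted target text plus a length-prediction loss. The cross-entropy form, the squared length penalty, and the weighting are all assumptions, as is the omission of the pretrained-parameter initialization step.

```python
import numpy as np

def nat_training_loss(pred_tokens, target_tokens, pred_len, target_len,
                      length_weight=0.5):
    """Hedged sketch of a NAT training objective: token loss + length loss.
    `pred_tokens` is a (positions x vocab) matrix of predicted probabilities."""
    # Token loss: mean negative log-probability of each reference token
    idx = np.arange(len(target_tokens))
    token_loss = -np.log(pred_tokens[idx, target_tokens] + 1e-12).mean()
    # Length loss: penalize the predicted length against the target length
    length_loss = (pred_len - target_len) ** 2
    return token_loss + length_weight * length_loss
```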

METHOD FOR LEARNING REPRESENTATIONS FROM CLOUDS OF POINTS DATA AND A CORRESPONDING SYSTEM
20230050120 · 2023-02-16 ·

A method for learning representations from clouds of points data includes encoding clouds of points data into at least one representation by creating at least one tensor representation out of the clouds of points data. The method further includes using a loss function that utilizes a noisy reconstruction for reducing overfitting.
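The noisy-reconstruction idea can be sketched as a denoising-style loss. The `encode` and `decode` placeholders stand in for the learned networks that produce and consume the tensor representation; the Gaussian noise model and squared-error loss are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_reconstruction_loss(points, encode, decode, noise_scale=0.01):
    """Hedged sketch: encode a point cloud into a tensor representation,
    perturb it, and score the reconstruction against the clean points."""
    representation = encode(points)                 # tensor representation
    noisy = representation + rng.normal(scale=noise_scale,
                                        size=representation.shape)
    reconstructed = decode(noisy)                   # reconstruct from noise
    return np.mean((reconstructed - points) ** 2)   # regularizing loss
```

Training against a noisy copy of the representation rather than the clean one is what the abstract credits with reducing overfitting.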

SYSTEM AND METHOD FOR ADDITIVE MANUFACTURING CONTROL

An additive manufacturing apparatus, a computing system, and a method for operating an additive manufacturing apparatus are provided. The method includes obtaining two or more images corresponding to respective build layers at a build plate, wherein each image comprises a plurality of data points comprising a feature and a corresponding location at the build plate; removing variation between the features of the plurality of data points; and normalizing each feature to remove location dependence in the plurality of data points.
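The two normalization steps can be sketched for a one-dimensional feature. Binning the build-plate locations and centering within each bin is an assumed implementation of "removing location dependence"; the abstract does not specify the mechanism.

```python
import numpy as np

def normalize_layer_features(values, locations, n_bins=4):
    """Hedged sketch: remove overall variation across data points, then
    remove location dependence by centering within spatial bins."""
    values = np.asarray(values, dtype=float)
    locations = np.asarray(locations, dtype=float)
    # Step 1: remove variation between features (center and scale)
    values = (values - values.mean()) / (values.std() + 1e-12)
    # Step 2: remove location dependence - subtract each spatial bin's mean
    edges = np.linspace(locations.min(), locations.max(), n_bins + 1)[1:-1]
    bins = np.digitize(locations, edges)
    for b in np.unique(bins):
        values[bins == b] -= values[bins == b].mean()
    return values
```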

NEURAL NETWORK LOOP DETECTION
20230051050 · 2023-02-16 ·

Apparatuses, systems, and techniques to detect loops in neural network graphs. In at least one embodiment, one or more loops are detected within one or more graphs corresponding to one or more neural networks.
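Loop detection on a computation graph is conventionally done by finding back edges during a depth-first traversal; the sketch below shows that standard technique on an adjacency-dict graph. The graph format and the choice of DFS are assumptions, since the abstract does not name a detection algorithm.

```python
def detect_loops(graph):
    """Hedged sketch: detect loops in a graph given as {node: [successors]}
    using a three-color depth-first search; returns each loop as a path."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}
    loops = []

    def visit(node, path):
        color[node] = GRAY                         # node is on the DFS stack
        for succ in graph.get(node, []):
            if color.get(succ, WHITE) == GRAY:     # back edge: a loop
                loops.append(path[path.index(succ):] + [succ])
            elif color.get(succ, WHITE) == WHITE:
                visit(succ, path + [succ])
        color[node] = BLACK                        # fully explored

    for n in list(graph):
        if color[n] == WHITE:
            visit(n, [n])
    return loops
```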