G06N7/08

Efficient verification of machine learning applications

An example operation may include one or more of: generating, by a training participant client comprising a training dataset, a plurality of transaction proposals that each correspond to a training iteration for machine learning model training based on stochastic gradient descent, the machine learning model training comprising a plurality of training iterations, the transaction proposals comprising a gradient calculation performed by the training participant client, a batch from the training dataset, a loss function, and an original model parameter; receiving, by one or more endorser nodes of peers of a blockchain network, the plurality of transaction proposals; and evaluating each transaction proposal.
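The per-iteration flow described above (the client computes a gradient over a batch, bundles it with the loss, the batch, and the original parameters into a proposal, and endorsers re-evaluate it) can be sketched as follows. This is a minimal illustration, not the patent's protocol: the linear model, mean-squared-error loss, proposal field names, and the `endorse` re-computation check are all assumptions.

```python
import random

def sgd_iteration_proposal(params, batch, lr=0.1):
    """Build one transaction proposal for a single SGD iteration on a
    mean-squared-error loss over a linear model (illustrative only)."""
    xs, ys = batch
    n, d = len(ys), len(params)
    # Forward pass: linear predictions and the MSE loss over the batch.
    preds = [sum(p * x for p, x in zip(params, row)) for row in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / n
    # Gradient of the MSE loss with respect to each parameter.
    grad = [2.0 * sum((p - y) * row[j] for p, y, row in zip(preds, ys, xs)) / n
            for j in range(d)]
    # The proposal bundles everything an endorser needs to re-verify
    # the calculation: original parameters, batch, loss, and gradient.
    return {
        "original_params": list(params),
        "batch": batch,
        "loss": loss,
        "gradient": grad,
        "updated_params": [p - lr * g for p, g in zip(params, grad)],
    }

def endorse(proposal):
    """An endorser node re-runs the gradient calculation from the batch
    and original parameters and checks it matches the proposal."""
    check = sgd_iteration_proposal(proposal["original_params"], proposal["batch"])
    return all(abs(a - b) < 1e-9
               for a, b in zip(check["gradient"], proposal["gradient"]))

random.seed(0)
xs = [[random.gauss(0, 1) for _ in range(3)] for _ in range(8)]
w_true = [1.0, -2.0, 0.5]
ys = [sum(w * x for w, x in zip(w_true, row)) for row in xs]
proposal = sgd_iteration_proposal([0.0, 0.0, 0.0], (xs, ys))
print(endorse(proposal))  # True
```

Because the endorser repeats exactly the same deterministic computation, any tampering with the reported gradient makes the check fail.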

Quantum-attack resistant operating system for use in a key management mechanism
11562070 · 2023-01-24

A quantum-attack resistant operating system for use in a key management mechanism, which is a full cyber-security solution for quantum transmission via optical paths, designed to detect and bypass quantum computing attacks, or to perform quantum counterattacks, during the various procedures of quantum key management. In a quantum key storage phase, the system resists key tampering, destruction, detection, and blocking attacks from other quantum systems; in a quantum key clearing phase, it prevents other quantum systems from sniffing key entanglement properties; and in a quantum key recycling phase, when facing quantum computing attacks, it not only disrupts other systems' judgement of key verification but also consumes computing resources on the attacker side. The present invention thereby provides a protection mechanism that cannot be achieved by a conventional PQC (post-quantum cryptography) solution.

Nervous system emulator engine and methods using same
11556724 · 2023-01-17

A nervous system emulator engine includes working computational models of the vertebrate nervous system to generate lifelike animal behavior in a robot. These models include functions representing several anatomical features of the vertebrate nervous system, such as spinal cord, brainstem, basal ganglia, thalamus and cortex. The emulator engine includes a hierarchy of controllers in which controllers at higher levels accomplish goals by continuously specifying desired goals for lower-level controllers. The lowest levels of the hierarchy reflect spinal cord circuits that control muscle tension and length. Moving up the hierarchy into the brainstem and midbrain/cortex, progressively more abstract perceptual variables are controlled. The nervous system emulator engine may be used to build a robot that generates the majority of animal behavior, including human behavior. The nervous system emulator engine may also be used to build working models of nervous system functions for clinical experimentation.
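The hierarchy described above, in which a higher-level controller accomplishes its goal by continuously setting the reference of the controller below it, can be sketched as a two-level cascade. This is a toy illustration with invented gains and a one-dimensional "muscle", not the patent's anatomical models; the integrating/proportional split is an assumption chosen so the cascade settles on the top-level goal.

```python
class ProportionalController:
    """Outputs gain * (reference - perception)."""
    def __init__(self, gain):
        self.gain = gain
        self.reference = 0.0

    def step(self, perception):
        return self.gain * (self.reference - perception)

class IntegratingController:
    """Accumulates gain * error, so its steady state requires zero error."""
    def __init__(self, gain):
        self.gain = gain
        self.reference = 0.0
        self.output = 0.0

    def step(self, perception):
        self.output += self.gain * (self.reference - perception)
        return self.output

# Two-level cascade: the higher level controls a more abstract variable
# (limb position) by continuously setting the goal of the lower, spinal
# level (desired muscle length), which acts on the muscle directly.
high = IntegratingController(gain=0.1)
spinal = ProportionalController(gain=0.5)

high.reference = 1.0   # top-level goal: limb position 1.0 (arbitrary units)
muscle = 0.0
limb = 0.0
for _ in range(200):
    spinal.reference = high.step(limb)  # higher level sets the lower goal
    muscle += spinal.step(muscle)       # spinal level drives the muscle
    limb = 0.5 * muscle                 # toy plant: limb follows the muscle
print(round(limb, 3))  # limb converges to the top-level goal, 1.0
```

Note that neither level commands the muscle position directly: each level only corrects the error in its own perceived variable, which is the control-hierarchy principle the abstract describes.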

STORAGE MEDIUM, MODEL GENERATION METHOD, AND INFORMATION PROCESSING APPARATUS
20230012430 · 2023-01-12

A non-transitory computer-readable storage medium stores a model generation program that causes a computer to execute a process including: generating a plurality of first coefficient matrices, each representing, by regression coefficients, a relationship between a first observation matrix having a feature value for each of a plurality of elements and a characteristic vector having a characteristic value; generating a histogram in which total regression coefficients, obtained by totaling the regression coefficients included in the plurality of first coefficient matrices for each of the plurality of elements, are arranged in element order of the first observation matrix; generating a second observation matrix including a second element obtained by combining, into one, a plurality of first elements that correspond to adjacent nonzero total regression coefficients in the histogram; and generating a second coefficient matrix representing a relationship between the second observation matrix and the characteristic vector.
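The totaling and column-merging steps described above can be sketched as follows. The coefficient values are fabricated for illustration (the abstract does not say how the plurality of first coefficient matrices is produced), and the nonzero threshold and grouping logic are assumptions.

```python
# Toy "first coefficient matrices": regression coefficients for six
# elements (columns), produced by three hypothetical fitting runs.
first_coefs = [
    [0.0, 0.9, 1.1, 0.0, 0.4, 0.0],
    [0.0, 1.1, 0.9, 0.0, 0.6, 0.0],
    [0.0, 1.0, 1.0, 0.0, 0.5, 0.0],
]

# Histogram: total regression coefficient per element, in element order.
totals = [sum(col) for col in zip(*first_coefs)]
print(totals)   # [0.0, 3.0, 3.0, 0.0, 1.5, 0.0]

# Combine runs of adjacent elements with nonzero totals into one
# second element; zero-total elements stay as singletons.
groups, run = [], []
for j, t in enumerate(totals):
    if abs(t) > 1e-9:
        run.append(j)
    else:
        if run:
            groups.append(run)
            run = []
        groups.append([j])
if run:
    groups.append(run)
print(groups)   # [[0], [1, 2], [3], [4], [5]]

# Second observation matrix: each group's columns of the first
# observation matrix X (rows are observations) summed into one column.
X = [
    [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    [0.5, 1.5, 2.5, 3.5, 4.5, 5.5],
]
X2 = [[sum(row[j] for j in g) for g in groups] for row in X]
print(X2)  # [[1.0, 5.0, 4.0, 5.0, 6.0], [0.5, 4.0, 3.5, 4.5, 5.5]]
```

A second coefficient matrix would then be fitted on `X2` against the characteristic vector, with fewer columns than the original regression.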

System and method for dynamic scheduling of distributed deep learning training jobs

A scheduling algorithm for scheduling training of deep neural network (DNN) weights on processing units identifies the next job to which a processing unit (PU) is provisionally assigned based on a doubling heuristic. The doubling heuristic uses an estimated number of training sets needed to complete training of the weights for a given job and/or a training speed function that indicates how fast the weights are converging. The scheduling algorithm addresses the problem of efficiently assigning PUs when multiple DNN weight data structures must be trained. In some embodiments, the training of the weights uses a ring-based message passing architecture. In some embodiments, training proceeds in a nested loop fashion: in inner iterations of the nested loop, PUs are scheduled and jobs are launched or re-started; in outer iterations, jobs are stopped, parameters are updated, and the inner iteration is re-entered.
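The nested-loop structure can be sketched as below. This is one possible reading, not the patented algorithm: the inner loop greedily assigns PUs by estimated remaining time (assuming speed scales with PU count and at least one PU per job), and the outer loop doubles the batch estimate of any job that fails to converge within its budget before re-assigning PUs and re-starting the jobs.

```python
import heapq

def assign_pus(estimates, speeds, num_pus):
    """Inner-loop step (illustrative): every job gets one PU, then each
    extra PU goes to the job with the largest estimated remaining time,
    assuming training speed scales linearly with assigned PUs.
    Requires num_pus >= number of jobs."""
    assigned = {n: 1 for n in estimates}
    # Max-heap (negated keys) of estimated remaining time per job.
    heap = [(-estimates[n] / speeds[n], n) for n in estimates]
    heapq.heapify(heap)
    for _ in range(num_pus - len(estimates)):
        _, n = heapq.heappop(heap)
        assigned[n] += 1
        heapq.heappush(heap, (-estimates[n] / (speeds[n] * assigned[n]), n))
    return assigned

def outer_loop(true_batches, speeds, num_pus, init_estimate=100):
    """Outer-loop sketch of the doubling heuristic: jobs that do not
    converge within their estimated batch budget get the estimate
    doubled, then PUs are re-assigned and the jobs re-started."""
    estimates = {n: init_estimate for n in true_batches}
    done, progress = set(), {n: 0 for n in true_batches}
    while len(done) < len(true_batches):
        active = {n: estimates[n] for n in true_batches if n not in done}
        # PU assignment would set wall-clock speed; this sketch only
        # tracks batch counts, so `assigned` is informational here.
        assigned = assign_pus(active, speeds, num_pus)
        for n in active:
            progress[n] += estimates[n]   # run for the budgeted batches
            if progress[n] >= true_batches[n]:
                done.add(n)               # weights converged
            else:
                estimates[n] *= 2         # doubling heuristic
    return progress

print(assign_pus({"a": 1000, "b": 400, "c": 100},
                 {"a": 10.0, "b": 10.0, "c": 10.0}, 6))
# {'a': 3, 'b': 2, 'c': 1}
result = outer_loop({"a": 250, "b": 90}, {"a": 1.0, "b": 1.0}, num_pus=4)
print(result)  # {'a': 300, 'b': 100}
```

The doubling bounds wasted work: a job's total budget never exceeds roughly four times the batches it actually needed, since the geometric estimates sum to at most twice the final (at most doubled) budget.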
