G06F18/2415

TECHNIQUES FOR PREDICTION BASED MACHINE LEARNING MODELS

Various embodiments are generally directed to techniques for prediction-based machine learning (ML) models, such as using an ML model to generate predictions based on the output of another ML model. Some embodiments are particularly directed to a secondary ML model that revises predictions generated by a primary ML model from structured input data. In many embodiments, the secondary ML model may use the primary ML model's predictions to learn metadata regarding the structured input data. In many such embodiments, that metadata may be used to revise the primary ML model's predictions. For example, the secondary ML model may combine the structure of the input data with patterns in the primary ML model's predictions to revise those predictions.
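A minimal sketch of this idea follows; the toy data, the fixed primary scorer, and all names are illustrative assumptions, not taken from the patent. A secondary logistic model takes the structured inputs together with the primary model's predictions as features and learns to revise those predictions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical primary model: a fixed linear scorer over structured input rows.
def primary_predict(X):
    w = np.array([0.8, -0.5, 0.3])
    return 1 / (1 + np.exp(-(X @ w)))

# Toy structured inputs with a systematic pattern the primary model misses.
X = rng.normal(size=(200, 3))
y = (X @ np.array([0.8, -0.5, 0.3]) + 0.6 * X[:, 0] > 0).astype(float)

p = primary_predict(X)

# Secondary model: a logistic layer trained by gradient descent on
# (input structure, primary prediction) pairs, producing revised predictions.
Z = np.column_stack([X, p])            # structure + primary output as features
v = np.zeros(Z.shape[1])
b = 0.0
for _ in range(500):
    q = 1 / (1 + np.exp(-(Z @ v + b)))
    g = q - y
    v -= 0.1 * Z.T @ g / len(y)
    b -= 0.1 * g.mean()

revised = 1 / (1 + np.exp(-(Z @ v + b)))
acc_primary = ((p > 0.5) == y).mean()
acc_revised = ((revised > 0.5) == y).mean()
```

Because the secondary model sees both the raw structure and the primary model's output pattern, it can correct the primary model's systematic bias on this toy data.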

Systems and methods for simulating traffic scenes

Example aspects of the present disclosure describe a scene generator for simulating scenes in an environment. For example, snapshots of simulated traffic scenes can be generated by sampling a joint probability distribution trained on real-world traffic scenes. In some implementations, samples of the joint probability distribution can be obtained by sampling a plurality of factorized probability distributions for a plurality of objects for sequential insertion into the scene.
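The factorized, sequential sampling described above can be sketched as follows; the distributions here are toy stand-ins for factors trained on real-world traffic scenes, and every parameter name is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical factorization: p(scene) = p(n) * prod_i p(object_i | objects_<i).
# Objects are inserted sequentially, each conditioned on prior insertions.
def sample_scene(road_length=100.0, min_gap=8.0, max_objects=6):
    n = rng.integers(1, max_objects + 1)   # how many vehicles in the snapshot
    placed = []
    for _ in range(n):
        # Conditional factor: propose positions until one respects the
        # objects already inserted (rejection sampling against overlaps).
        for _ in range(100):
            x = rng.uniform(0.0, road_length)
            if all(abs(x - obj["x"]) >= min_gap for obj in placed):
                placed.append({"x": x, "speed": rng.normal(15.0, 3.0)})
                break
    return placed

scene = sample_scene()
```

Each returned list is one snapshot of a simulated scene; repeated calls yield independent samples from the joint distribution implied by the factors.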

Univariate density estimation method

A method for use with a computing device. The method may include receiving a data set including a plurality of univariate data points and determining a target kernel bandwidth for a kernel density estimator (KDE). Determining the target kernel bandwidth may include computing a plurality of sample KDEs and selecting the target kernel bandwidth based on the sample KDEs. The method may further include computing the KDE for the data set using the target kernel bandwidth. For one or more tail regions of the data set, the method may further include computing one or more respective tail extensions. The method may further include computing and outputting a renormalized piecewise density estimator that, in each tail region, equals a renormalization of the respective tail extension for that tail region, and, outside the one or more tail regions, equals a renormalization of the KDE.
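The pipeline above can be sketched end to end in a few steps; the candidate bandwidths, the leave-one-out selection rule, the 5th/95th-percentile tail cutoffs, and the exponential tail form are all illustrative choices, not the patented specifics:

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, size=300)   # univariate data points

# Gaussian KDE evaluated at points x.
def gauss_kde(x, pts, h):
    u = (x[:, None] - pts[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(pts) * h * np.sqrt(2 * np.pi))

# Bandwidth selection: score sample KDEs by leave-one-out log-likelihood and
# pick the best-scoring candidate as the target kernel bandwidth.
def loo_score(pts, h):
    u = (pts[:, None] - pts[None, :]) / h
    k = np.exp(-0.5 * u**2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(k, 0.0)
    return np.log(k.sum(axis=1) / (len(pts) - 1)).sum()

candidates = [0.1, 0.2, 0.4, 0.8]
h = max(candidates, key=lambda c: loo_score(data, c))

# KDE on a grid, with tail regions beyond the 5th/95th percentiles.
lo, hi = np.quantile(data, [0.05, 0.95])
grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 2000)
dx = grid[1] - grid[0]
dens = gauss_kde(grid, data, h)

# Tail extensions: exponential tails matched to the KDE value at each cutoff,
# with rates chosen so each tail carries the KDE's own tail mass.
f_lo, f_hi = gauss_kde(np.array([lo, hi]), data, h)
mass_lo = dens[grid < lo].sum() * dx
mass_hi = dens[grid > hi].sum() * dx
lam_lo, lam_hi = f_lo / mass_lo, f_hi / mass_hi

piece = dens.copy()
left, right = grid < lo, grid > hi
piece[left] = f_lo * np.exp(-lam_lo * (lo - grid[left]))
piece[right] = f_hi * np.exp(-lam_hi * (grid[right] - hi))
piece /= piece.sum() * dx   # renormalized piecewise density estimator
```

The final array equals a renormalized exponential extension inside each tail region and a renormalized KDE elsewhere, and integrates to one on the grid.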

Automated honeypot creation within a network

Systems and methods for managing Application Programming Interfaces (APIs) are disclosed. Systems may involve automatically generating a honeypot. For example, the system may include one or more memory units storing instructions and one or more processors configured to execute the instructions to perform operations. The operations may include receiving, from a client device, a call to an API node and classifying the call as unauthorized. The operations may include sending the call to a node-imitating model associated with the API node and receiving, from the node-imitating model, synthetic node output data. The operations may include sending a notification based on the synthetic node output data to the client device.
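The request flow above can be sketched as follows; the credential check, the field names, and the fabricated response shape are all hypothetical stand-ins for a trained node-imitating model:

```python
import json
import random

AUTHORIZED_KEYS = {"key-123"}   # hypothetical credential store

def classify_call(call):
    return "authorized" if call.get("api_key") in AUTHORIZED_KEYS else "unauthorized"

# Node-imitating model: returns synthetic data shaped like the real node's
# output, so an unauthorized caller cannot tell it reached a honeypot.
def node_imitating_model(call, rng):
    return {
        "account_id": call.get("account_id", "unknown"),
        "balance": round(rng.uniform(10.0, 5000.0), 2),   # fabricated value
        "status": "ok",
    }

def handle_call(call, rng=random.Random(0)):
    if classify_call(call) == "unauthorized":
        synthetic = node_imitating_model(call, rng)
        return {"notification": json.dumps(synthetic)}   # sent to the client
    return {"notification": "routed to real node"}

resp = handle_call({"api_key": "bad-key", "account_id": "a1"})
```

An unauthorized call never touches the real node; it receives a plausible synthetic payload instead, while authorized calls pass through.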

CLASSIFICATION MODEL TRAINING METHOD, SYSTEM, ELECTRONIC DEVICE AND STORAGE MEDIUM
20230038579 · 2023-02-09 ·

Provided are a classification model training method, system, electronic device, and storage medium. The method includes: determining sampling rates of first-class samples and second-class samples in a data set, and setting the samples whose sampling rate is less than a preset value as target samples (S101); determining data distribution feature information of the target samples based on Euclidean distances between all the samples in the data set (S102), wherein the data distribution feature information describes the number of same-class samples among a sample's nearest neighbors, and nearest neighbor samples are any two samples whose Euclidean distance is less than a preset distance; generating new samples corresponding to the target samples based on the data distribution feature information (S103); and training the classification model using the first-class samples, the second-class samples, and the new samples (S104).
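Steps S102 and S103 can be sketched as below; the toy two-class data, the preset distance of 1.0, and the interpolation rule for generating new samples are illustrative assumptions in the spirit of SMOTE-style oversampling, not the patented method:

```python
import numpy as np

rng = np.random.default_rng(3)

# Imbalanced toy set: class 0 is the majority, class 1 the minority (target).
X0 = rng.normal(0.0, 1.0, size=(50, 2))
X1 = rng.normal(2.5, 0.5, size=(8, 2))

# S102: for each target sample, count same-class samples among its nearest
# neighbors, i.e. samples at Euclidean distance below a preset value.
def same_class_neighbor_counts(targets, same_class, preset_dist=1.0):
    counts = []
    for t in targets:
        near = (np.linalg.norm(same_class - t, axis=1) < preset_dist).sum() - 1
        counts.append(max(near, 0))     # exclude the sample itself
    return np.array(counts)

counts = same_class_neighbor_counts(X1, X1)

# S103: generate new samples by interpolating each target toward a random
# same-class sample; isolated targets (no same-class neighbors) get extra copies.
new_samples = []
for t, c in zip(X1, counts):
    for _ in range(1 + int(c == 0)):
        mate = X1[rng.integers(len(X1))]
        new_samples.append(t + rng.uniform(0.0, 1.0) * (mate - t))
new_samples = np.array(new_samples)
```

The combined set `X0`, `X1`, and `new_samples` would then feed classifier training (S104).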

METHOD, ELECTRONIC DEVICE AND STORAGE MEDIUM FOR REMOTE DAMAGE ASSESSMENT OF VEHICLE
20230038645 · 2023-02-09 ·

A method for remote damage assessment of a vehicle is provided. The present disclosure relates to the technical field of artificial intelligence, in particular to the technical field of image and text recognition. An implementation solution is: performing data collection on a target vehicle to determine damage information of the target vehicle; obtaining call content of an insurance claim call for the target vehicle, and extracting accident-related information from the call content, wherein the accident-related information includes named entities in the call content and relationships between the named entities; and determining a first fraud probability for the target vehicle based at least on the damage information and the accident-related information.
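The final scoring step can be sketched as a simple combination of the two information sources; the feature names and hand-set weights below are purely illustrative stand-ins for a trained model:

```python
import math

# Hypothetical features: damage info extracted from vehicle images, and
# accident-related info (named entities and their relations) from the call.
def first_fraud_probability(damage, accident):
    score = (
        1.5 * damage["severity"]              # heavier damage, higher risk
        + 2.0 * accident["parties_related"]   # claimant linked to other driver
        - 1.0 * accident["details_consistent"]  # consistent story lowers risk
    )
    return 1 / (1 + math.exp(-score))         # logistic squash to [0, 1]

p = first_fraud_probability(
    {"severity": 0.8},
    {"parties_related": 1, "details_consistent": 0},
)
```

A heavily damaged vehicle whose claim call reveals a relationship between the parties scores much higher than a light, consistent claim.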

Scene-based automatic white balance

A method and apparatus may be used for performing a scene-based automatic white balance correction. The method may include obtaining an input image. The method may include obtaining a raw image thumbnail. The method may include obtaining an augmented image thumbnail. The method may include computing a histogram from an image thumbnail. The method may include determining a scene classification. The method may include learning a filter. The filter may be learned from one or several different instances of the raw image thumbnail, the augmented image thumbnail, the scene classification, or any combination thereof. The method may include applying the filter to the histogram to determine white balance correction coefficients and obtain a processed image.
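The histogram-and-filter step can be sketched as below; the log-chromaticity histogram, the uniform "learned" filter weights, and the toy warm illuminant are illustrative assumptions (a real system would train per-scene filter weights):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy raw thumbnail under a warm illuminant (channel order: R, G, B).
thumb = rng.uniform(0.2, 0.8, size=(16, 16, 3)) * np.array([1.4, 1.0, 0.7])

# Histogram over log-chromaticity (log R/G, log B/G).
r = np.log(thumb[..., 0] / thumb[..., 1]).ravel()
b = np.log(thumb[..., 2] / thumb[..., 1]).ravel()
hist, r_edges, b_edges = np.histogram2d(r, b, bins=8)
hist /= hist.sum()

# "Learned" filter: per-bin weights; uniform here, standing in for weights
# learned per scene classification.
filt = np.ones_like(hist) / hist.size

# Applying the filter to the histogram estimates the illuminant; the white
# balance correction coefficients are its inverse.
r_centers = (r_edges[:-1] + r_edges[1:]) / 2
b_centers = (b_edges[:-1] + b_edges[1:]) / 2
w = hist * filt
w /= w.sum()
est_r = np.exp((w.sum(axis=1) * r_centers).sum())
est_b = np.exp((w.sum(axis=0) * b_centers).sum())
gains = np.array([1.0 / est_r, 1.0, 1.0 / est_b])
processed = thumb * gains
```

After applying the gains, the channel ratios of the processed image sit much closer to neutral than those of the warm input.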

PARTICLE FILTERING AND NAVIGATION SYSTEM USING MEASUREMENT CORRELATION

Disclosed is a box-regularized particle filtering process which includes an Epanechnikov kernel smoothing step. For this purpose, the process uses a special method for generating random numbers that follow an Epanechnikov probability density function. The process can be performed autonomously in a navigation system using measurement correlation, in particular on board an aircraft such as an airplane, a flying drone, or any self-propelled aerial carrier.
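The patent's specific generator is not reproduced here, but one well-known way to draw numbers following the Epanechnikov density f(x) = (3/4)(1 − x²) on [−1, 1] is to take the median of three independent uniform draws:

```python
import random

# Median-of-three-uniforms sampler: if U1, U2, U3 ~ Uniform(-1, 1) are
# independent, their median follows the Epanechnikov density on [-1, 1].
def epanechnikov_sample(rng):
    u1, u2, u3 = (rng.uniform(-1.0, 1.0) for _ in range(3))
    return sorted([u1, u2, u3])[1]

rng = random.Random(0)
samples = [epanechnikov_sample(rng) for _ in range(20000)]
```

The resulting samples have mean 0 and variance 1/5, matching the Epanechnikov kernel used in the smoothing step.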

Scoring events using noise-contrastive estimation for anomaly detection
11593639 · 2023-02-28 ·

Techniques for monitoring a computing environment for anomalous activity are presented. An example method includes receiving a request to invoke an action within the computing environment. An anomaly score is generated for the received request by applying a probabilistic model to properties of the request. The anomaly score generally indicates a likelihood that the properties of the request correspond to historical activity within the computing environment for a user associated with the request. The probabilistic model is generally trained using historical activity within the computing environment for a plurality of users, the historical activity including information identifying an action performed in the computing environment and contextual information about a historical request. Based on the generated anomaly score, one or more actions are taken to process the request, such that execution of requests having anomaly scores indicative of unexpected activity may be blocked pending confirmation.
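The scoring-and-blocking flow can be sketched as follows; a smoothed count model stands in for the patent's noise-contrastive-estimation-trained probabilistic model, and the users, actions, and threshold are illustrative:

```python
from collections import Counter

# Historical activity per user: counts of (action, context) events.
history = {
    "alice": Counter({("read_bucket", "day"): 40, ("delete_bucket", "day"): 2}),
}

def anomaly_score(user, action, context, alpha=1.0, vocab=100):
    counts = history.get(user, Counter())
    total = sum(counts.values())
    # Additive smoothing keeps unseen (action, context) pairs at low but
    # nonzero probability; the anomaly score is one minus that probability.
    p = (counts[(action, context)] + alpha) / (total + alpha * vocab)
    return 1.0 - p

def process_request(user, action, context, block_threshold=0.99):
    score = anomaly_score(user, action, context)
    return "blocked_pending_confirmation" if score > block_threshold else "execute"
```

A request matching a user's routine scores low and executes; a never-seen action in an unusual context scores near one and is held for confirmation.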

PATTERN RECOGNITION DEVICE, PATTERN RECOGNITION METHOD, AND COMPUTER PROGRAM PRODUCT
20180005087 · 2018-01-04 ·

According to an embodiment, a pattern recognition device is configured to divide an input signal into a plurality of elements, convert the divided elements into feature vectors having the same dimensionality to generate a set of feature vectors, and evaluate the set of feature vectors using a recognition dictionary including models corresponding to respective classes, to output a recognition result representing a class or a set of classes to which the input signal belongs. The models each include sub-models each corresponding to one of possible division patterns in which a signal to be classified into a class corresponding to the model can be divided into a plurality of elements. A label expressing a model including a sub-model conforming to the set of feature vectors, or a set of labels expressing a set of models including sub-models conforming to the set of feature vectors is output as the recognition result.
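A toy version of this scheme is sketched below; the signals, the "segment mean" feature conversion, the division patterns, and the prototype sub-models are all illustrative assumptions:

```python
import numpy as np

# Recognition dictionary: each class model holds sub-models, one per possible
# division pattern; each sub-model stores a prototype feature vector for the
# elements produced by that division (here, per-segment means).
DICTIONARY = {
    "rising": [((2, 2), np.array([0.1, 0.9])), ((1, 3), np.array([0.0, 0.8]))],
    "flat":   [((2, 2), np.array([0.5, 0.5]))],
}

def divide(signal, pattern):
    # Divide the input signal into elements and convert each element into a
    # feature of the same dimensionality (its mean), giving a feature vector.
    out, i = [], 0
    for n in pattern:
        out.append(float(np.mean(signal[i:i + n])))
        i += n
    return np.array(out)

def recognize(signal):
    # Evaluate the feature vectors against every sub-model; the label of the
    # best-conforming sub-model's class is the recognition result.
    best_label, best_dist = None, float("inf")
    for label, sub_models in DICTIONARY.items():
        for pattern, prototype in sub_models:
            if sum(pattern) != len(signal):
                continue
            d = float(np.linalg.norm(divide(signal, pattern) - prototype))
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label

label = recognize([0.0, 0.2, 0.8, 1.0])
```

Because each class carries one sub-model per admissible division pattern, a signal is matched under whichever division best explains it.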