Patent classifications
G06F16/24545
Dynamic query optimization with pilot runs
In one embodiment, a computer-implemented method includes selecting one or more sub-expressions of a query during compile time. One or more pilot runs are performed by one or more computer processors. The one or more pilot runs include a pilot run associated with each of one or more of the selected sub-expressions, and each pilot run includes at least partial execution of the associated selected sub-expression. The pilot runs are performed during execution time. Statistics are collected on the one or more pilot runs during performance of the one or more pilot runs. The query is optimized based at least in part on the statistics collected during the one or more pilot runs, where the optimization includes basing cardinality and cost estimates on the statistics collected during the pilot runs.
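The pilot-run idea above can be sketched minimally in Python. All names here are illustrative assumptions, not from the patent: a "pilot run" partially executes a sub-expression over a bounded sample, and the observed selectivity then drives the cardinality estimate instead of a compile-time heuristic.

```python
# Hypothetical sketch: partially execute a selected sub-expression (a pilot
# run), collect statistics during the run, and base the cardinality estimate
# on those statistics.

def pilot_run(sub_expression_rows, sample_limit=100):
    """Partially execute a sub-expression: scan at most `sample_limit` rows."""
    seen = 0
    matched = 0
    for row in sub_expression_rows:
        seen += 1
        if row["qualifies"]:
            matched += 1
        if seen >= sample_limit:
            break
    # Selectivity observed during the pilot run.
    selectivity = matched / seen if seen else 0.0
    return {"rows_seen": seen, "selectivity": selectivity}

def estimate_cardinality(total_rows, pilot_stats):
    """Base the cardinality estimate on pilot-run statistics."""
    return int(total_rows * pilot_stats["selectivity"])

# Synthetic sub-expression where every 4th row qualifies.
rows = [{"qualifies": i % 4 == 0} for i in range(1000)]
stats = pilot_run(rows, sample_limit=100)
estimate = estimate_cardinality(len(rows), stats)
```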
Search engine optimizer
A search engine optimizer that works independently of, and in parallel with, a browser and search engine supercomputer to gather, analyze, and distill input information interactively. The optimizer reorganizes the input and provides an optimized version as an output. The optimized version of the input (i.e., the output) is sent to the search engine, which responds to the end user with search results. The optimizer recognizes each request as a pattern and stores the pattern in an advanced Glyph format. This permits the optimizer to identify the left-side and right-side checkmate combination required to achieve certitude.
PERFORMANCE OF SQL EXECUTION SEQUENCE IN PRODUCTION DATABASE INSTANCE
A method, computer program product, and computer system for improving performance of a SQL execution sequence of SQL statements. The SQL execution sequence is recorded in an event log. Original results of executing the SQL statements, and an original CPU cost of executing them in accordance with an original access path, are recorded in a logical log. A new access path is generated from analysis of the event log and the logical log. The SQL statements are executed in accordance with the new access path, producing new results and a new CPU cost. In response to a determination that the new results replicate the original results and that the new CPU cost is less than the original CPU cost, the original access path is replaced with the new access path.
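The replacement decision described above reduces to two checks against the logical log. A minimal Python sketch, with assumed names and log structure:

```python
# Illustrative sketch: adopt the new access path only if its results
# replicate the originals AND its CPU cost is lower.

def maybe_replace_access_path(logical_log, new_results, new_cpu_cost):
    """logical_log: {'results': ..., 'cpu_cost': ...} recorded for the
    original access path."""
    original_results = logical_log["results"]
    original_cost = logical_log["cpu_cost"]
    if new_results == original_results and new_cpu_cost < original_cost:
        return "new"        # replace the original access path
    return "original"       # keep the original access path

log = {"results": [(1, "a"), (2, "b")], "cpu_cost": 120.0}
choice = maybe_replace_access_path(log, [(1, "a"), (2, "b")], 80.0)
```

Note that a cheaper path with differing results is rejected, as is an equivalent path with a higher cost.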
Learning-based workload resource optimization for database management systems
A DBMS training subsystem trains a DBMS workload-manager model with training data identifying resources used to execute previous DBMS data-access requests. The subsystem integrates each request's high-level features and compile-time operations into a vector and clusters similar vectors into templates. The requests are divided into workloads each represented by a training histogram that describes the distribution of templates associated with the workload and identifies the total amounts and types of resources consumed when executing the entire workload. The resulting knowledge is used to train the model to predict production resource requirements by: i) organizing production queries into candidate workloads; ii) deriving for each candidate a histogram similar in form and function to the training histograms; iii) using the newly derived histograms to predict each candidate's resource requirements; iv) selecting the candidate with the greatest resource requirements capable of being satisfied with available resources; and v) executing the selected workload.
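Steps i)–v) above can be sketched in a few lines of Python. Everything here is an assumption for illustration: clustering is replaced by crude quantization into template ids, and per-template costs stand in for the learned model's predictions.

```python
# Hypothetical sketch: summarize each candidate workload as a histogram of
# query templates, predict its resource requirement from the histogram, and
# select the costliest workload that still fits the available resources.

from collections import Counter

def to_template(vector, bucket=10):
    """Stand-in for clustering: quantize a feature vector into a template id."""
    return tuple(v // bucket for v in vector)

def workload_histogram(query_vectors):
    """Distribution of templates associated with the workload."""
    return Counter(to_template(v) for v in query_vectors)

def predict_cost(histogram, cost_per_template):
    return sum(cost_per_template.get(t, 1) * n for t, n in histogram.items())

def select_workload(candidates, cost_per_template, available):
    feasible = []
    for name, vectors in candidates.items():
        cost = predict_cost(workload_histogram(vectors), cost_per_template)
        if cost <= available:
            feasible.append((cost, name))
    # Greatest resource requirement that can still be satisfied.
    return max(feasible)[1] if feasible else None

candidates = {
    "w1": [(3, 12), (5, 11), (41, 2)],
    "w2": [(3, 12), (3, 14)],
}
costs = {(0, 1): 5, (4, 0): 20}   # assumed per-template resource costs
chosen = select_workload(candidates, costs, available=40)
```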
Database statistical histogram forecasting
A method and system for forecasting a histogram in a database system is provided. The method includes determining that database table statistics and historical statistical histograms associated with specified subject matter have been previously retrieved. The database table statistics and historical statistical histograms are retrieved and determined to be frequency-based histograms. Historical target values associated with the historical statistical histograms are identified, and new target values associated with the historical target values are identified. A value identifying a number of occurrences for identified target values, comprising the new target values and the historical target values, is forecast, and database table histograms comprising the identified target values are stored.
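One way the forecasting step could look, as an assumed simplification: given the historical occurrence counts of a target value across frequency-based histograms, extrapolate the next count by a linear trend. The trend model is my illustration, not the patent's method.

```python
# Minimal sketch: forecast the next occurrence count of a histogram target
# value from its historical counts via simple linear extrapolation.

def forecast_occurrences(history):
    """Forecast the next count from a series of historical counts."""
    if len(history) < 2:
        return history[-1] if history else 0
    # Average step between consecutive observations, extrapolated one period.
    steps = [b - a for a, b in zip(history, history[1:])]
    trend = sum(steps) / len(steps)
    return max(0, round(history[-1] + trend))

# Occurrences of a hypothetical target value 'NY' over three past periods.
ny_counts = [100, 120, 140]
forecast = forecast_occurrences(ny_counts)
```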
SCALING QUERY PROCESSING RESOURCES FOR EFFICIENT UTILIZATION AND PERFORMANCE
Scaling of query processing resources for efficient utilization and performance is implemented for a database service. A query is received via a network endpoint associated with a database managed by a database service. Respective response times predicted for the query using different query processing configurations available to perform the query are determined. Those query processing configurations with response times that exceed a variability threshold determined for the query may be excluded. A remaining query processing configuration may then be selected to perform the query.
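The exclude-then-select logic above fits in one small function. A hedged Python sketch, where predicted response times, per-configuration costs, and the tie-breaking rule (cheapest remaining configuration) are all assumptions:

```python
# Hypothetical sketch: exclude query processing configurations whose
# predicted response time exceeds the query's variability threshold, then
# select one of the remaining configurations.

def select_configuration(predictions, variability_threshold):
    """predictions: {config_name: (predicted_response_time, cost)}"""
    remaining = {
        name: (time, cost)
        for name, (time, cost) in predictions.items()
        if time <= variability_threshold
    }
    if not remaining:
        return None
    # Among acceptable configurations, prefer the lowest-cost one.
    return min(remaining, key=lambda n: remaining[n][1])

preds = {"small": (9.0, 1), "medium": (4.0, 4), "large": (2.0, 10)}
config = select_configuration(preds, variability_threshold=5.0)
```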
Query plans for analytic SQL constructs
A system and method for managing data storage and data access when querying data in a distributed system without buffering the results of intermediate operations in disk storage.
Query Optimizer Advisor
Methods for optimization in query plans are performed by computing systems via a query optimizer advisor. A query optimizer advisor (QO-Advisor) is configured to steer a query plan optimizer towards more efficient plan choices by providing rule hints to improve navigation of the search space for each query in formulation of its query plan. The QO-Advisor receives historical information of a distributed data processing system as an input, and then generates a set of rule hint pairs based on the historical information. The QO-Advisor provides the set of rule hint pairs to a query plan optimizer, which then optimizes a query plan of an incoming query through application of a rule hint pair in the set. This application is based at least on a characteristic of the incoming query matching a portion of the rule hint pair.
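The matching step described above can be sketched as a lookup from query characteristics to rule hints. The pair format and hint strings below are assumptions for illustration, not the QO-Advisor's actual representation:

```python
# Illustrative sketch: match an incoming query's characteristics against
# historically derived (characteristic, rule hint) pairs and return the
# hints to apply during plan optimization.

def choose_rule_hints(query_characteristics, rule_hint_pairs):
    """Return hints whose characteristic matches a portion of the query."""
    return [hint for characteristic, hint in rule_hint_pairs
            if characteristic in query_characteristics]

# Hypothetical hint pairs generated from historical system information.
pairs = [
    ("large_join", "disable_rule:BroadcastJoin"),
    ("skewed_key", "enable_rule:SkewedJoinRewrite"),
]
hints = choose_rule_hints({"large_join", "order_by"}, pairs)
```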
METHOD AND QUERY SUGGESTION SERVER FOR PROVIDING ALTERNATE QUERY SUGGESTIONS FOR TIME BOUND RESULTS
The present disclosure relates to a method of providing alternate query suggestions for time-bound results. The first step comprises receiving, by a query suggestion server, a query comprising one or more dimensions and a target time for executing the query from a user device associated with a user. The second step comprises determining, in real time, by the query suggestion server, the execution time for the received query. The third step comprises identifying one or more alternate query suggestions upon determining that the execution time for the received query exceeds the target time. The last step comprises providing the one or more alternate query suggestions to the user device for modifying the query.
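The four steps above can be sketched as follows. The suggestion strategy (dropping one dimension at a time) and the time-estimation callback are my assumptions for illustration:

```python
# Minimal sketch: if the estimated execution time exceeds the user's target
# time, return alternate query suggestions; otherwise return none.

def suggest_alternates(query, target_time, estimate_time):
    elapsed = estimate_time(query)
    if elapsed <= target_time:
        return []  # the query already meets the time bound
    # Hypothetical strategy: drop one dimension at a time as alternates.
    dims = query["dimensions"]
    return [{"dimensions": [d for d in dims if d != drop]} for drop in dims]

query = {"dimensions": ["region", "product", "day"]}
alternates = suggest_alternates(
    query, target_time=5.0,
    estimate_time=lambda q: 2.0 * len(q["dimensions"]))
```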
Dynamically switching between execution paths for user-defined functions
Dynamically switching between a plurality of execution paths to execute a function, such as a user-defined function. The plurality of execution paths include an execution path that uses caching and another execution path that uses inlining. A user-defined function is executed at least once using a first execution path. Then, for a later execution of the function, the execution path is automatically switched to a second execution path.
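A toy Python sketch of the switching idea, with assumed mechanics: the first execution of the user-defined function goes through a caching path, and later executions are automatically switched to a second path (direct invocation standing in for inlining).

```python
# Hypothetical sketch: a UDF wrapper that executes once via a caching path,
# then automatically switches to a second execution path.

class SwitchingUDF:
    def __init__(self, fn):
        self.fn = fn
        self.cache = {}
        self.calls = 0
        self.path_used = None

    def __call__(self, arg):
        self.calls += 1
        if self.calls == 1:
            # First execution path: compute and cache the result.
            self.path_used = "caching"
            self.cache[arg] = self.fn(arg)
            return self.cache[arg]
        # Later executions automatically switch to the second path
        # (direct invocation here stands in for inlining).
        self.path_used = "inlining"
        return self.cache.get(arg, self.fn(arg))

udf = SwitchingUDF(lambda x: x * x)
first = udf(3)    # executed via the caching path
second = udf(3)   # executed via the switched path
```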