ARTIFICIAL INTELLIGENCE-SUPPORTED SETUP AND EXECUTION OF BACKORDER PROCESSING
20250356308 · 2025-11-20
Assignee
Inventors
CPC classification
International classification
Abstract
A computer-implemented method for improved backorder processing (BOP) in an enterprise resource planning system is disclosed. The method can receive one or more user prompts from a user interface and create a BOP segment using a large language model. The BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts. A filter is defined by an attribute, an operator, and one or more attribute values. The method can create a BOP variant using the large language model. The BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts. The method can further execute the BOP variant using the large language model, including batch processing the subset of the plurality of order requirements using the confirmation scheme.
Claims
1. An enterprise resource planning (ERP) system for improved backorder processing (BOP), the ERP system comprising: memory; one or more hardware processors coupled to the memory; and one or more computer readable storage media storing instructions that, when loaded into the memory, cause the one or more hardware processors to perform operations comprising: receiving one or more user prompts from a user interface; creating, in runtime, a BOP segment using a large language model, wherein the BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts, wherein a filter is defined by an attribute, an operator, and one or more attribute values; creating, in runtime, a BOP variant using the large language model, wherein the BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts; and executing the BOP variant using the large language model, wherein the executing comprises batch processing the subset of the plurality of order requirements using the confirmation scheme.
2. The ERP system of claim 1, wherein the operations further comprise prompting, in runtime, the large language model with the one or more user prompts and a system prompt, wherein the system prompt defines a cardinality relationship between the BOP segment and the BOP variant.
3. The ERP system of claim 2, wherein the operations further comprise prompting, in runtime, the large language model with a context prompt, wherein the context prompt defines a plurality of confirmation schemes, one of which is defined for the BOP segment by the BOP variant.
4. The ERP system of claim 3, wherein the context prompt comprises prototype definitions of functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant, respectively.
5. The ERP system of claim 4, wherein the system prompt is configured to instruct the large language model to sequentially invoke the functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant in the backend.
6. The ERP system of claim 4, wherein the system prompt is configured to instruct the large language model to identify missing input values of the functions and request the missing input values from the user interface.
7. The ERP system of claim 3, wherein the context prompt comprises a list of attributes, from which attributes are selected to define the one or more filters, wherein the operations further comprise retrieving, in runtime, the list of attributes from database tables corresponding to the BOP segment.
8. The ERP system of claim 3, wherein the context prompt comprises a list of operators, from which operators are selected to define the one or more filters.
9. The ERP system of claim 3, wherein the context prompt comprises one or more structured objects, wherein a structured object comprises one or more example user prompts and corresponding output of the large language model generated in response to the one or more example user prompts.
10. The ERP system of claim 3, wherein the context prompt comprises one or more configuration parameters of the large language model.
11. A computer-implemented method for improved backorder processing (BOP) in an enterprise resource planning (ERP) system, the method comprising: receiving one or more user prompts from a user interface; creating, in runtime, a BOP segment using a large language model, wherein the BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts, wherein a filter is defined by an attribute, an operator, and one or more attribute values; creating, in runtime, a BOP variant using the large language model, wherein the BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts; and executing the BOP variant using the large language model, wherein the executing comprises batch processing the subset of the plurality of order requirements using the confirmation scheme.
12. The computer-implemented method of claim 11, further comprising prompting, in runtime, the large language model with the one or more user prompts and a system prompt, wherein the system prompt defines a cardinality relationship between the BOP segment and the BOP variant.
13. The computer-implemented method of claim 12, further comprising prompting, in runtime, the large language model with a context prompt, wherein the context prompt defines a plurality of confirmation schemes, one of which is defined for the BOP segment by the BOP variant.
14. The computer-implemented method of claim 13, wherein the context prompt comprises prototype definitions of functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant, respectively.
15. The computer-implemented method of claim 14, wherein the system prompt is configured to instruct the large language model to sequentially invoke the functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant in the backend.
16. The computer-implemented method of claim 13, wherein the context prompt comprises a list of attributes, from which attributes are selected to define the one or more filters, and wherein the method further comprises retrieving, in runtime, the list of attributes from database tables corresponding to the BOP segment.
17. The computer-implemented method of claim 13, wherein the context prompt comprises a list of operators, from which operators are selected to define the one or more filters.
18. The computer-implemented method of claim 13, wherein the context prompt comprises one or more structured objects, wherein a structured object comprises one or more example user prompts and corresponding output of the large language model generated in response to the one or more example user prompts.
19. The computer-implemented method of claim 13, wherein the context prompt comprises one or more configuration parameters of the large language model.
20. One or more non-transitory computer-readable media having encoded thereon computer-executable instructions causing one or more processors to perform a method for improved backorder processing (BOP) in an enterprise resource planning (ERP) system, the method comprising: receiving one or more user prompts from a user interface; creating, in runtime, a BOP segment using a large language model, wherein the BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts, wherein a filter is defined by an attribute, an operator, and one or more attribute values; creating, in runtime, a BOP variant using the large language model, wherein the BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts; and executing the BOP variant using the large language model, wherein the executing comprises batch processing the subset of the plurality of order requirements using the confirmation scheme.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
Overview of ATP and BOP
[0021] ATP is a software feature implemented in various supply chain management, manufacturing, and fulfillment systems. It is designed to respond to customer order inquiries based on resource availability, promising specific delivery terms such as available quantities and delivery due dates of a requested product. ATP supports order promising and fulfillment, manages demand, and aligns it with production plans. Enabled by information technology, ATP functions are typically integrated into ERP software packages. In some scenarios, ATP can anticipate future demand for specific products, computing quantities and availability dates. In some circumstances, ATP can dynamically allocate resources in response to actual customer orders.
[0022] One of the advanced features of ATP is BOP. BOP involves the bulk handling of orders in a batch mode, where order confirmations are adjusted to reflect business priorities and changes in the demand/supply dynamics of the order fulfillment process. This feature allows for the re-prioritization and resolution of demand and supply situations. For instance, in advanced ATP (aATP) in S/4HANA (provided by SAP SE, Walldorf, Germany), BOP is used to check material availability when the demand or supply situation in the order fulfillment process has changed, and it is necessary to check if previously calculated confirmations for requirements in business documents are still realistic. A sales order might be canceled, freeing up stock quantities, or an important customer might increase the requested quantity for a material, thereby wanting to consume stock currently confirmed for other sales orders. Not reacting to the changed availability situation can result in confirmed quantities exceeding available quantities, leading to availability checks for over-confirmed materials failing, leaving the system unable to release materials for delivery creation.
[0023] However, the implementation and execution of BOP in existing ERP systems can pose challenges. Users often find themselves navigating through multiple applications (e.g., to configure BOP segments, create BOP variants, schedule a BOP run, etc.), each of which allows the user to configure a specific step prior to running BOP. Each application requires users to manually translate their intentions into selection criteria and confirmation strategies. This process can be time-consuming and necessitates a deep understanding of the various technical aspects of BOP, including the purpose of different confirmation strategies, how to assign segments to these strategies, etc. As a result, BOP is often under-utilized and/or set up incorrectly, leading to inefficiencies in the order fulfillment process.
[0024] The technologies described herein overcome many of the challenges described above by leveraging the power of artificial intelligence (AI). This AI-powered solution simplifies the complex process of BOP in ERP systems. It automates the setup and execution of BOP, thereby eliminating the need for users to navigate through multiple applications and manually configure each step. This high-level automation not only saves time but also reduces the chances of errors, leading to a more efficient and effective order fulfillment process.
Example ERP System with AI-Powered BOP
[0026] The ERP system 100 includes an ATP module 130, which receives input from three primary blocks: demand 110, supply 120, and configuration 150. The demand 110 includes requirements (which can also be referred to as order requirements) for products or services, such as sales orders 112 and stock transfer orders (STOs) 114. As described herein, a requirement refers to a requested quantity of a certain material on a certain date (e.g., a line item in a sales order). Sales orders 112 can be generated by customer requests for specific products, while STOs 114 can be generated by internal requests, e.g., to relocate inventory from one location to another. The supply 120 represents the resources available to meet the demand 110. For example, the supply 120 can include warehouse stock 122 (e.g., the inventory currently available in warehouses), stock in transit 124 (e.g., inventory being relocated), production orders 126 (e.g., requests to manufacture more inventory), etc. The configuration 150 includes parameters that guide the operation of the ATP module 130. Example configuration parameters include supply chain constraints (such as limits on the amount of inventory that can be moved at once, etc.), business priorities (such as which orders should be fulfilled first, etc.), profitability considerations (such as the cost of producing or moving inventory, etc.), among others.
[0027] The ATP module 130 is configured to balance the demand 110 with the supply 120 while considering various parameters in the configuration 150. The ATP module 130 processes these inputs to generate order confirmations 160, which are commitments to fulfill specific orders in the demand 110. In addition to processing inputs and generating order confirmations, the ATP module 130 interfaces with a database module 170, which manages the storage and retrieval of data relevant to the ATP module's operation. Such data can include historical and real-time information about demand, supply, and configurations, as well as the status of order confirmations. By accessing this data, the ATP module 130 can make informed decisions about how to balance demand and supply, prioritize orders, and manage inventory, thereby optimizing the order fulfillment process.
[0028] As shown in
[0029] The BOP engine 140 can include several functional units, each of which can have a specific role in the BOP workflow. For example, the BOP engine 140 can include a segment creator 142, a sorting operator 144, a variant creator 146, and an executor 148. Each of these functional units can exist as a separate application, complete with its own unique application programming interface (API).
[0030] The segment creator 142 can be invoked to generate BOP segments, which include groups of similar backorders. As described herein, a BOP segment can include a saved selection of settings or filters that determines which requirements are selected and the sequence in which the requirements are prioritized. This grouping allows the BOP engine 140 to handle similar orders together, thereby improving efficiency.
[0031] The sorting operator 144 can be employed to organize backorders within each BOP segment based on certain criteria, such as order importance or delivery deadline (e.g., to ensure that the most critical orders are handled first). In some cases, the sorting operation can be optional in the BOP workflow.
[0032] The variant creator 146 can be called to create a BOP variant, which is a saved selection of settings that determines which requirements are included in a BOP run and how they are checked. The BOP variant represents a strategic plan for fulfilling the backorders within each BOP segment. This plan not only takes into account the available supply and the configuration parameters of the ATP module 130, but also assigns a confirmation strategy (which can also be referred to as confirmation scheme) for each BOP segment. As described further below, a confirmation strategy defines how a requirement (or all requirements in a BOP segment) is handled in a BOP run.
[0033] The executor 148 is configured to execute the plan or the BOP variant (also referred to as a BOP run) either immediately or at a scheduled time. As described herein, the BOP run refers to an executed instance of the background functionality that checks the availability of multiple requirements at the same time, in a defined sequence and taking any filters into consideration, as defined in the BOP variant. A BOP run can be simulated or scheduled for immediate execution, or it can be scheduled as a one-time or regular background job (e.g., at midnight each day, at noon each Sunday, etc.) in a batch mode.
[0034] Although not shown, the BOP engine 140 can include some additional functional units, such as a monitor configured to display results of BOP runs and/or confirmation details, etc. Together, these functional units enable the BOP engine 140 to effectively manage backorders, optimizing the order fulfillment process of the ATP module 130.
[0035] The BOP engine 140 can leverage AI to facilitate automatic setup and execution of BOP runs. As shown in
[0036] The integration of user prompts 104, system prompt 136, and context prompt 138 enables the LLM 180 to formulate a comprehensive plan for the setup and execution of BOP runs. This interactive and intelligent setup and execution of BOP runs greatly enhances the efficiency and effectiveness of the BOP engine 140. Moreover, such an AI-powered BOP engine 140 eliminates the need for specialized expertise from the user 102, as the LLM 180 can intelligently interpret natural language user prompts 104 and translate them into actionable commands.
[0037] In practice, the systems shown herein, such as the ERP system 100, can vary in complexity, with additional functionality, more complex components, and the like. For example, there can be additional functionality within the BOP engine 140. Additional components can be included to implement security, redundancy, load balancing, report design, data logging, and the like.
[0038] The described computing systems can be networked via wired or wireless network connections, including the Internet. Alternatively, systems can be connected through an intranet connection (e.g., in a corporate environment, government environment, or the like).
[0039] The ERP system 100 and any of the other systems described herein can be implemented in conjunction with any of the hardware components described herein, such as the computing systems described below (e.g., processing units, memory, and the like). In any of the examples herein, attributes, prompts, values, and the like can be stored in one or more computer-readable storage media or computer-readable storage devices. The technologies described herein can be generic to the specifics of operating systems or hardware and can be applied in any variety of environments to take advantage of the described features.
Example Overall Method for Implementing AI-Powered BOP
[0041] At step 210, the method can receive one or more user prompts (e.g., user prompts 104) from a user interface (e.g., the UI 102). The user prompts can be written in natural language. As described herein, a natural language user prompt is a user-generated text-based input that employs human language to request information or actions from a software or system.
[0042] In some examples, one user prompt (or a few user prompts) can contain sufficient information to define a BOP run, which can be automatically set up and executed in the backend of the system (also referred to as a fast-track approach). In other examples, an interactive process (also referred to as a guided approach) may be necessary, where multiple user prompts are required to gather all the necessary information to set up and execute a BOP run. This guided approach involves a back-and-forth dialogue between the user and the system. The user provides information or makes requests through the user interface, and the system responds with requests for additional information or clarification as needed. This iterative process continues until all the necessary information has been collected so that the BOP run can be set up and executed accordingly. The guided approach allows for a more tailored and precise setup of BOP runs, accommodating complex scenarios that may not be fully captured in a single user prompt. It also provides users with the opportunity to refine their requirements throughout the setup process, enhancing the flexibility and adaptability of the system.
[0043] The received user prompts can be sent to an LLM (e.g., the LLM 180), which can generate corresponding responses. In some examples, the LLM receives not only the user prompts, but also a system prompt (e.g., the system prompt 136) and a context prompt (e.g., the context prompt 138). The system prompt can be configured to instruct the LLM on how to set up and execute BOP runs, including making a sequence of function or API calls in the backend. The context prompt can be configured to provide contextual information about the BOP, including prototype definitions of various functions or APIs that are required to set up and execute BOP runs.
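The prompt assembly described above can be sketched in the style of LLM function calling. All names, schema fields, and the example prompt below are hypothetical illustrations, not the actual interfaces of any particular ERP system or LLM vendor.

```python
# Hypothetical sketch of the prompts supplied to the LLM. The system prompt
# encodes the invocation order and cardinality; the context prompt carries
# prototype definitions of the backend functions.
system_prompt = (
    "You set up and execute backorder processing (BOP). "
    "Invoke create_bop_segment, then create_bop_variant, then "
    "execute_bop_variant, in that order. One variant may reference "
    "multiple segments. Ask the user for any missing input values."
)

# Prototype definitions of the backend functions (illustrative schema).
context_functions = [
    {
        "name": "create_bop_segment",
        "description": "Create a BOP segment from a list of filters.",
        "parameters": {"filters": "list of {attribute, operator, values}"},
    },
    {
        "name": "create_bop_variant",
        "description": "Assign a confirmation strategy to each segment.",
        "parameters": {"assignments": "list of {segment_id, strategy}"},
    },
    {
        "name": "execute_bop_variant",
        "description": "Run the variant immediately, scheduled, or simulated.",
        "parameters": {"variant_id": "string",
                       "mode": "immediate|scheduled|simulate"},
    },
]

# Example natural-language user prompt (hypothetical).
user_prompt = ("Reconfirm all open orders for material M-100 in plant 1010, "
               "high-priority customers first, using the Win strategy.")
```

In this pattern the LLM never touches the database directly; it only emits structured calls against the prototypes, which the backend validates and executes.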
[0044] At step 220, the method can create, in runtime, a BOP segment using the LLM. For example, based on the user prompts, system prompt, and context prompt, the LLM can invoke a function or API call to create the BOP segment (e.g., by the segment creator 142) in the backend. The BOP segment selects a subset of a plurality of requirements (e.g., backorders that need to be fulfilled) using one or more filters determined based on the one or more user prompts. As described herein, each filter can be defined by an attribute, an operator, and one or more attribute values. In some examples, multiple BOP segments can be generated by the LLM, each having its own set of filters. Thus, each BOP segment can include a group of requirements or backorders which satisfy conditions specified by the filters of the corresponding BOP segment.
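A filter defined by an attribute, an operator, and attribute values can be modeled as a small data structure. The following is a minimal sketch under assumed attribute and operator names (e.g., "EQ", "IN"); the actual filter vocabulary is system-specific.

```python
from dataclasses import dataclass, field

@dataclass
class Filter:
    attribute: str   # e.g., "material", "plant" (illustrative attribute names)
    operator: str    # e.g., "EQ" (equals), "IN" (member of list)
    values: list

@dataclass
class BopSegment:
    name: str
    filters: list = field(default_factory=list)

    def matches(self, requirement: dict) -> bool:
        """A requirement is selected only if it satisfies every filter."""
        for f in self.filters:
            v = requirement.get(f.attribute)
            if f.operator == "EQ" and v != f.values[0]:
                return False
            if f.operator == "IN" and v not in f.values:
                return False
        return True

# A segment selecting the subset of requirements for one material
# in either of two plants (hypothetical values).
segment = BopSegment("urgent_m100", [
    Filter("material", "EQ", ["M-100"]),
    Filter("plant", "IN", ["1010", "1020"]),
])
requirements = [
    {"material": "M-100", "plant": "1010", "qty": 5},
    {"material": "M-200", "plant": "1010", "qty": 2},
]
subset = [r for r in requirements if segment.matches(r)]  # keeps only M-100
```

Multiple segments, each with its own filter list, can then be created side by side, mirroring the multi-segment case described above.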
[0045] At step 230, the method can create, in runtime, a BOP variant using the LLM. For example, based on the user prompts, system prompt, and context prompt, the LLM can invoke a function or API call to create the BOP variant (e.g., by the variant creator 146) in the backend. The BOP variant can define a confirmation strategy for the BOP segment based on the one or more user prompts. The confirmation strategy defined for the BOP segment applies to all requirements included in the BOP segment. In other words, all backorders within the BOP segment will be processed using the same confirmation strategy during execution of the BOP variant. If there are multiple BOP segments, the BOP variant can assign a corresponding confirmation strategy for each BOP segment. In some examples, the confirmation strategy assigned to each BOP segment can be selected from a predefined list of enumerated confirmation strategies, which can be included in the context prompt. Example confirmation strategies are described more fully below.
[0046] At step 240, the method can execute the BOP variant using the LLM. This execution can be scheduled or performed immediately, and it can be an actual BOP run or a simulated BOP run for analysis purposes. For example, based on the user prompts, system prompt, and context prompt, the LLM can invoke a function or API call to execute (or schedule to execute, or simulate executing) the BOP variant (e.g., by the executor 148) in the backend. The execution involves batch processing the subset of the plurality of requirements (included in the BOP segment) using the confirmation strategy (assigned to the BOP segment). If the BOP variant has multiple BOP segments, the execution will batch process requirements included in each BOP segment using the specific confirmation strategy assigned to the BOP segment.
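The batch execution step can be illustrated with a minimal in-memory sketch. The segment predicates, the simplified "WIN"/"LOSE" semantics, and the single shared quantity pool below are assumptions for illustration only, not the behavior of any production BOP engine.

```python
# Minimal sketch of executing a BOP variant in batch mode: each segment's
# requirements are processed with the segment's assigned strategy.
def execute_bop_variant(segments, assignments, requirements, available_qty):
    results = []
    for seg_name, select in segments.items():
        strategy = assignments[seg_name]
        for req in filter(select, requirements):
            if strategy == "WIN":
                # Win (simplified): fully confirm, or raise an exception.
                if available_qty >= req["qty"]:
                    available_qty -= req["qty"]
                    results.append((req["id"], req["qty"]))
                else:
                    raise RuntimeError(f"Cannot fully confirm {req['id']}")
            elif strategy == "LOSE":
                # Lose (simplified): release the confirmation entirely.
                results.append((req["id"], 0))
    return results, available_qty

segments = {"m100": lambda r: r["material"] == "M-100"}
assignments = {"m100": "WIN"}
reqs = [{"id": "SO-1", "material": "M-100", "qty": 4},
        {"id": "SO-2", "material": "M-100", "qty": 3}]
confirmed, remaining = execute_bop_variant(segments, assignments, reqs,
                                           available_qty=10)
# confirmed == [("SO-1", 4), ("SO-2", 3)], remaining == 3
```

The ordering of requirements within a segment is where the optional sorting step described earlier would take effect before this batch loop runs.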
[0047] Although not shown in
[0048] The method 200 and any of the other methods described herein can be performed by computer-executable instructions (e.g., causing a computing system to perform the method) stored in one or more computer-readable media (e.g., storage or other tangible media) or stored in one or more computer-readable storage devices. Such methods can be performed in software, firmware, hardware, or combinations thereof. Such methods can be performed at least in part by a computing system (e.g., one or more computing devices).
[0049] The illustrated actions can be described from alternative perspectives while still implementing the technologies. For example, send can also be described as receive from a different perspective.
Example Confirmation Strategies
[0050] As described above, the BOP variant can assign a confirmation strategy (selected from a list of predefined confirmation strategies) for each created BOP segment. Example confirmation strategies include Win, Gain, Improve, Redistribute, Fill, and Lose, as implemented in BOP of aATP in SAP S/4HANA.
[0051] For a BOP segment assigned with a Win confirmation strategy, each requirement in the BOP segment shall be fully confirmed on the requested date (or immediately, if the requested material available date is in the past). For the Win confirmation strategy, if requirements cannot be fully confirmed on time, an exception is raised. Depending on the configuration options for exception handling, the BOP run stops completely or only for the affected material-plant combinations.
[0052] For a BOP segment assigned with a Gain confirmation strategy, all requirements in the BOP segment shall at least retain their current confirmation or, if possible, improve it. For the Gain confirmation strategy, if current confirmations cannot be retained, an exception is raised. Depending on the configuration options for exception handling, the BOP run stops completely or only for the affected material-plant combinations.
[0053] For a BOP segment assigned with an Improve confirmation strategy, each requirement in the BOP segment will try to at least retain its confirmation and, if possible, improve it. However, it is also possible that some of the requirements may lose their confirmation. For the Improve confirmation strategy, if current confirmations cannot be retained, no exceptions are raised. Instead, the BOP run worsens the confirmation based on the available quantity.
[0054] For a BOP segment assigned with a Redistribute confirmation strategy, each requirement in the BOP segment can get a better, equal, or worse confirmation. For the Redistribute confirmation strategy, BOP can release quantities of requirements so that they can be used to fulfill other requirements with higher priorities. BOP can also produce a worse confirmation when requirements with a higher priority have reduced the available quantity.
[0055] For a BOP segment assigned with a Fill confirmation strategy, each requirement in the BOP segment does not gain additional quantity or get an earlier confirmation date; instead, it can get an equal or worse confirmation. To confirm requirements from Win and Gain confirmation strategies, BOP releases quantities of requirements assigned to the Fill confirmation strategy. This can result in a later confirmation or a loss in confirmed quantity.
[0056] For a BOP segment assigned with a Lose confirmation strategy, all requirements in the BOP segment will lose their current confirmation and are not confirmed. The released quantity of the Lose requirements can be used to confirm more important requirements or left as quantity available to confirm future requirements.
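The six strategies above can be summarized as an enumeration, which is also the natural form for the predefined list included in the context prompt. The one-line descriptions are paraphrases of the paragraphs above, and the segment names in the example mapping are hypothetical.

```python
from enum import Enum

# Enumerated confirmation strategies; descriptions paraphrase the
# semantics described above (simplified).
class ConfirmationStrategy(Enum):
    WIN = "fully confirm on the requested date, or raise an exception"
    GAIN = "retain or improve the current confirmation; exception otherwise"
    IMPROVE = "try to retain or improve; may worsen without an exception"
    REDISTRIBUTE = "may improve, keep, or worsen based on priority"
    FILL = "never improves; may worsen to free quantity for Win/Gain"
    LOSE = "release the current confirmation entirely"

# A BOP variant can then map each segment to exactly one strategy:
variant = {"vip_orders": ConfirmationStrategy.WIN,
           "bulk_orders": ConfirmationStrategy.FILL}
```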
Example Overview of LLMs
[0057] Generative AI is a type of AI that can create content, such as text, images, or even code, and it is used in enterprise environments for tasks like automated content generation, data analysis, and chatbot interactions to enhance productivity and efficiency. In contrast to discriminative AI models which aim to make decisions or predictions based on features of the input data, generative AI models focus on generating new data points. The LLM is a type of generative AI that can understand and generate human-like text. In generative AI, such as LLMs, a prompt serves as an input or instruction that informs the AI of the desired content, context, or task, allowing users to guide the AI to produce tailored responses, explanations, or creative content based on the provided prompt.
[0058] In any of the examples herein, an LLM can take the form of an AI model that is designed to understand and generate human language. Such models typically leverage deep learning techniques such as transformer-based architectures to process language with a very large number (e.g., billions) of parameters. Examples include the Generative Pre-trained Transformer (GPT) developed by OpenAI, Bidirectional Encoder Representations from Transformers (BERT) by Google, RoBERTa (A Robustly Optimized BERT Pretraining Approach) developed by Facebook AI, Megatron-LM by NVIDIA, or the like. Pretrained models are available from a variety of sources.
[0059] In any of the examples herein, prompts can be provided, in runtime, to an LLM to generate responses. Prompts to an LLM are input instructions that guide model behavior. Prompts can be textual cues, questions, or statements that users provide to elicit desired responses from the LLM, acting as primers for the model's generative process. Sources of prompts can include user-generated queries, predefined templates, or system-generated suggestions. Technically, prompts are tokenized and embedded into the model's input sequence, serving as conditioning signals for subsequent text generation. Experiments with prompt variations can be performed to manipulate output, using techniques like prefixing, temperature control, top-K sampling, chain-of-thought prompting, etc. These prompts, sourced from diverse inputs and tailored strategies, enable users to influence LLM-generated content by shaping the underlying context and guiding the neural network's language generation. For example, prompts can include instructions and/or examples to encourage the LLM to provide results in a desired style and/or format.
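Two of the techniques mentioned above, temperature control and top-K sampling, can be shown concretely over a vector of next-token logits. This is a simplified sketch, not any vendor's decoding implementation.

```python
import numpy as np

# Temperature scaling and top-K sampling over next-token logits.
def sample_next_token(logits, temperature=1.0, top_k=None, rng=None):
    rng = rng or np.random.default_rng(0)
    scaled = np.asarray(logits, dtype=float) / temperature
    if top_k is not None:
        # Keep only the top_k highest-scoring tokens; mask the rest.
        cutoff = np.sort(scaled)[-top_k]
        scaled = np.where(scaled >= cutoff, scaled, -np.inf)
    # Softmax (shifted by the max for numerical stability).
    probs = np.exp(scaled - np.max(scaled))
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# With top_k=2, only the two highest-logit tokens (indices 0 and 1)
# can ever be drawn; lower temperature concentrates mass on index 0.
token = sample_next_token([2.0, 1.0, 0.1, -1.0], temperature=0.7, top_k=2)
```

Lower temperatures make the distribution sharper (more deterministic output), while smaller top-K values truncate the candidate set, both common knobs for controlling LLM output.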
Example Architecture of LLM
[0061] In the depicted example, the LLM 300 uses an autoregressive model (as implemented in OpenAI's GPT) to generate text content by predicting the next word in a sequence given the previous words. The LLM 300 can be trained to maximize the likelihood of each word in the training dataset, given its context.
[0062] As shown in
[0063] For autoregressive text generation, the LLM 300 generates text in order, and for each word it generates, it relies on the preceding words for context. During training, the target or output sequence, which the model is learning to generate, is presented to the decoder 340. However, the output is right shifted by one position compared to what the decoder 340 has generated so far. In other words, the model sees the context of the previous words and is tasked with predicting the next word. As a result, the LLM 300 can learn to generate text in a left-to-right manner, which is how language is typically constructed.
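The right shift between decoder input and training target described above can be made concrete with a toy token sequence. The tokens and special markers below are illustrative.

```python
# Right-shifted targets for autoregressive training (teacher forcing):
# at each position the model sees the previous tokens and must predict
# the next one.
tokens = ["<bos>", "the", "order", "is", "confirmed", "<eos>"]
decoder_input = tokens[:-1]   # context presented to the decoder
targets       = tokens[1:]    # same sequence shifted right by one position

# Each training pair is (context so far, next token to predict).
pairs = [(decoder_input[:i + 1], targets[i]) for i in range(len(targets))]
# e.g., pairs[0] == (["<bos>"], "the")
```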
[0064] Text inputs to the encoder 320 can be preprocessed through an input embedding unit 302. Specifically, the input embedding unit 302 can tokenize a text input into a sequence of tokens, each of which represents a word or part of a word. Each token can then be mapped to a fixed-length vector known as an input embedding, which provides a continuous representation that captures the meaning and context of the text input. Likewise, to train the LLM 300, the targets or output sequences presented to the decoder 340 can be preprocessed through an output embedding unit 322. Like the input embedding unit 302, the output embedding unit 322 can provide a continuous representation, or output embedding, for each token in the output sequences.
[0065] Generally, the vocabulary of the LLM 300 is fixed and is derived from the training data; it consists of the tokens generated during the training process. Words not in the vocabulary cannot be output. These tokens are strung together to form sentences in the text output.
[0066] In some examples, positional encodings (e.g., 304 and 324) can be performed to provide sequential order information of tokens generated by the input embedding unit 302 and output embedding unit 322, respectively. Positional encoding is needed because the transformer, unlike recurrent neural networks, processes all tokens in parallel and does not inherently capture the order of tokens. Without positional encoding, the model would treat a sentence as an unordered collection of words, losing the context provided by word order. Positional encoding can be performed by mapping each position/index in a sequence to a unique vector, which is then added to the corresponding vector of the input embedding or output embedding. By adding positional encoding to the input embedding, the model can understand the relative positions of words in a sentence. Similarly, by adding positional encoding to the output embedding, the model can maintain the order of words when generating text output.
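The position-to-vector mapping described above is commonly implemented with the sinusoidal scheme of the original transformer architecture. A minimal NumPy sketch:

```python
import numpy as np

# Sinusoidal positional encoding: each position maps to a unique vector
# of sines and cosines at different frequencies, added to the embedding.
def positional_encoding(seq_len, d_model):
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model/2)
    angle = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle)                  # even dimensions
    pe[:, 1::2] = np.cos(angle)                  # odd dimensions
    return pe

pe = positional_encoding(seq_len=50, d_model=16)
embeddings = np.random.default_rng(0).normal(size=(50, 16))
x = embeddings + pe   # the sum now carries order information
```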
[0067] Each of the encoder 320 and decoder 340 can include multiple stacked or repeated layers (denoted by Nx in
[0068] The encoder 320 and the decoder 340 are related through shared embeddings and attention mechanisms, which allow the decoder 340 to access the contextual information generated by the encoder 320, enabling the LLM 300 to generate coherent and contextually accurate responses. In other words, the output of the encoder 320 can serve as a foundation upon which the decoder network can build the generated text.
[0069] Both the encoder 320 and decoder 340 comprise multiple layers of attention and feedforward neural networks. An attention neural network can implement an attention mechanism by calculating the relevance or importance of different words or tokens within an input sequence to a given word or token in an output sequence, enabling the model to focus on contextually relevant information while generating text. In other words, the attention neural network pays attention to certain parts of a sentence that are most relevant to the task of generating text output. A feedforward neural network can process and transform the information captured by the attention mechanism, applying non-linear transformations to the contextual embeddings of tokens, enabling the model to learn complex relationships in the data and generate more contextually accurate and expressive text.
[0070] In the example depicted in
[0071] In addition, the decoder 340 also includes an inter-attention or encoder-decoder attention neural network 330, which receives input from the output of the encoder 320. The encoder-decoder attention neural network 330 allows the decoder 340 to focus on relevant parts of the input sequence (output of the encoder 320) while generating the output sequence. As described below, the output of the encoder 320 is a continuous representation or embedding of the input sequence. By feeding the output of the encoder 320 to the encoder-decoder attention neural network 330, the contextual information and relationships captured in the input sequence (by the encoder 320) can be carried to the decoder 340. This connection enables the decoder 340 to access the entire input sequence, rather than just the last hidden state. Because the decoder 340 can attend to all words in the input sequence, the input information can be aligned with the generation of output to improve the contextual accuracy of the generated text output.
[0072] In some examples, one or more of the attention neural networks (e.g., 306, 326, 330) can be configured to implement a single head attention mechanism, by which the model can capture relationships between words in an input sequence by assigning attention weights to each word based on its relevance to a target word. The term single head indicates that there is only one set of attention weights or one mechanism for capturing relationships between words in the input sequence. In some examples, one or more of the attention neural networks (e.g., 306, 326, 330) can be configured to implement a multi-head attention mechanism, by which multiple sets of attention weights, or heads, operate in parallel to capture different aspects of the input sequence. Each head learns distinct relationships and dependencies within the input sequence. These multiple attention heads can enhance the model's ability to attend to various features and patterns, enabling it to understand complex, multi-faceted contexts, thereby leading to more accurate and contextually relevant text generation. The outputs from multiple heads can be concatenated or linearly combined to produce a final attention output.
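The relevance weighting performed by a single attention head can be sketched as scaled dot-product attention; the toy query, key, and value vectors below are illustrative only:

```python
import math

def softmax(xs):
    """Normalized exponential over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Single-head scaled dot-product attention: each output row is a
    weighted sum of value vectors, weighted by query-key relevance."""
    d_k = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)  # attention weights sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

A multi-head mechanism runs several such computations in parallel on learned projections of the same inputs and concatenates the results.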
[0073] As depicted in
[0074] A linear layer 342 at the output end of the decoder 340 can transform the output embeddings into the original input space. Specifically, the output embeddings produced by the decoder 340 are forwarded to the linear layer 342, which can transform the high-dimensional output embeddings into a space where each dimension corresponds to a word in the vocabulary of the LLM 300.
[0075] The output of the linear layer 342 can be fed to a softmax layer 344, which is configured to implement a softmax function, also known as softargmax or normalized exponential function, which is a generalization of the logistic function that compresses values into a given range. Specifically, the softmax layer 344 takes the output from the linear layer 342 (also known as logits) and transforms them into probabilities. These probabilities sum up to 1, and each probability corresponds to the likelihood of a particular word being the next word in the sequence. Typically, the word with the highest probability can be selected as the next word in the generated text output.
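The linear-then-softmax step can be sketched as follows; the three-word vocabulary and the logit values are invented for illustration:

```python
import math

def softmax(logits):
    """Normalized exponential: compresses logits into probabilities that sum to 1."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical vocabulary and logits emitted by the linear layer.
vocab = ["confirm", "reject", "delay"]
logits = [2.0, 0.5, -1.0]
probs = softmax(logits)

# Greedy decoding: pick the word with the highest probability.
next_word = vocab[probs.index(max(probs))]
```

Sampling strategies other than the greedy pick shown here (e.g., temperature or top-k sampling) draw from the same probability distribution.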
[0076] Still referring to
[0077] First, the input text is tokenized, e.g., by the input embedding unit 302, into a sequence of tokens, each representing a word or part of a word. Each token is then mapped to a fixed-length vector or input embedding. Then, positional encoding 304 is added to the input embeddings to retain information regarding the order of words in the input text.
[0078] Next, the input embeddings are processed by the self-attention neural network 306 of the encoder 320 to generate a set of hidden states. As described above, multi-head attention mechanism can be used to focus on different parts of the input sequence. The output from the self-attention neural network 306 is added to its input (residual connection) and then normalized at the addition and normalization layer 308.
[0079] Then, the feedforward neural network 310 is applied to each token independently. The feedforward neural network 310 includes fully connected layers with non-linear activation functions, allowing the model to capture complex interactions between tokens. The output from the feedforward neural network 310 is added to its input (residual connection) and then normalized at the addition and normalization layer 312.
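The residual addition and normalization applied after each sub-layer can be sketched as follows; this is a minimal layer normalization without the learned scale and shift parameters a full implementation would include:

```python
import math

def layer_norm(x, eps=1e-5):
    """Normalize a vector to zero mean and unit variance."""
    mean = sum(x) / len(x)
    var = sum((xi - mean) ** 2 for xi in x) / len(x)
    return [(xi - mean) / math.sqrt(var + eps) for xi in x]

def add_and_norm(x, sublayer_out):
    """Residual connection (add the sub-layer's input to its output),
    followed by layer normalization."""
    return layer_norm([a + b for a, b in zip(x, sublayer_out)])
```

The residual path lets gradients flow around each sub-layer, and the normalization keeps activations in a stable range across the stacked layers.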
[0080] The decoder 340 uses the hidden states from the encoder 320 and its own previous output sequence to generate the next token in an autoregressive manner so that the sequential output is generated by attending to the previously generated tokens. Specifically, the output of the encoder 320 (input embeddings processed by the encoder 320) are fed to the encoder-decoder attention neural network 330 of the decoder 340, which allows the decoder 340 to attend to all words in the input sequence. As described above, the encoder-decoder attention neural network 330 can implement a multi-head attention mechanism, e.g., computing a weighted sum of all the encoded input vectors, with the most relevant vectors being attributed the highest weights.
[0081] The previous output sequence of the decoder 340 is first tokenized by the output embedding unit 322 to generate an output embedding for each token in the output sequence. Similarly, positional encoding 324 is added to the output embeddings to retain information regarding the order of words in the output sequence.
[0082] The output embeddings are processed by the self-attention neural network 326 of the decoder 340 to generate a set of hidden states. The self-attention mechanism allows each token in the text output to attend to all tokens in the input sequence as well as all previous tokens in the output sequence. The output from the self-attention neural network 326 is added to its input (residual connection) and then normalized at the addition and normalization layer 328.
[0083] The encoder-decoder attention neural network 330 receives the output embeddings processed through the self-attention neural network 326 and the addition and normalization layer 328. Additionally, the encoder-decoder attention neural network 330 also receives the output from the addition and normalization layer 312 which represents input embeddings processed by the encoder 320. By considering both processed input embeddings and output embeddings, the output of the encoder-decoder attention neural network 330 represents an output embedding which takes into account both the input sequence and the previously generated outputs. As a result, the decoder 340 can generate the output sequence that is contextually aligned with the input sequence.
[0084] The output from the encoder-decoder attention neural network 330 is added to part of its input (residual connection), i.e., the output from the addition and normalization layer 328, and then normalized at the addition and normalization layer 332. The normalized output from the addition and normalization layer 332 is then passed through the feedforward neural network 334. The output of the feedforward neural network 334 is then added to its input (residual connection) and then normalized at the addition and normalization layer 336.
[0085] The processed output embeddings output by the decoder 340 are passed through the linear layer 342, which maps the high-dimensional output embeddings back to the size of the vocabulary, that is, it transforms the output embeddings into a space where each dimension corresponds to a word in the vocabulary. The softmax layer 344 then converts output of the linear layer 342 into probabilities, each of which corresponds to the likelihood of a particular word being the next word in the sequence. Finally, the LLM 300 samples an output token from the probability distribution generated by the softmax layer 344 (e.g., selecting the token with the highest probability), and this token is added to the sequence of generated tokens for the text output.
[0086] The steps described above are repeated for each new token until an end-of-sequence token is generated or a maximum length is reached. Additionally, if the encoder 320 and/or decoder 340 have multiple stacked layers, the steps performed by the encoder 320 and decoder 340 are repeated across each layer in the encoder 320 and the decoder 340 for generation of each new token.
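The token-by-token loop described above can be sketched as follows; `model_step` stands in for one full encoder/decoder pass and is scripted here purely for illustration:

```python
def generate(model_step, prompt_ids, eos_id, max_len):
    """Autoregressive decoding: repeatedly produce the next token until an
    end-of-sequence token is generated or the maximum length is reached."""
    tokens = list(prompt_ids)
    while len(tokens) < max_len:
        next_id = model_step(tokens)  # one encoder/decoder pass -> next token
        tokens.append(next_id)
        if next_id == eos_id:
            break
    return tokens

# Toy stand-in for the model: emit token 7 twice, then the EOS token (id 1).
script = iter([7, 7, 1])
result = generate(lambda toks: next(script), [5], eos_id=1, max_len=10)
```

In a real system, `model_step` would re-run the stacked encoder and decoder layers (or reuse cached encoder states) for every new token.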
Example Overview of Input Prompts
[0087]
[0088] As shown in
[0089] If the fast-track approach is taken for the BOP, one user prompt can be configured to contain sufficient information to define a BOP run. For example, the user prompt can specify values for all parameters that are necessary to create BOP segments, create a BOP variant, and execute the BOP variant. Alternatively, if any of the required information for a BOP run (e.g., names of the BOP segments, confirmation strategies for the BOP segments, etc.) is missing, predefined default values can be used.
[0090] On the other hand, if the guided approach is used, an initial user prompt can simply be a request for confirmation of some backorders. The LLM 450 can respond to such request by asking the user to provide missing information, e.g., through additional user prompt(s). This prompt-response between the user and the LLM 450 can be iterative until all the necessary information has been collected for setting up and executing the BOP run.
[0091] The system prompt 420 can be predefined through prompt engineering. In some examples, the system prompt 420 can contain specific instructions (including the logical sequence of multiple BOP functions) that guide the LLM 450 in its interactions with the user. An example system prompt is described further below with reference to
[0092] The context prompt 430 includes various contextual information relevant to the BOP. For example, the context prompt 430 can include a list of confirmation strategies 432 (from which a specific confirmation strategy can be assigned to a BOP segment), a list of attributes 434 which can possibly be used to define filters in BOP segments, a list of operators 436 which can be paired with any of the attributes used to define those filters, and multiple prototype definitions 438 of BOP functions or APIs (e.g., functions for creating a BOP segment, creating a BOP variant, executing a BOP variant, etc.). In some examples, the context prompt 430 can also include example objects 440 which represent example chat sessions (e.g., examples of chat sessions using the fast-track approach and guided approach) between the user and the LLM 450. Additionally, the context prompt 430 can include configuration parameters 442 for the LLM (e.g., LLM version, model temperature, maximum number of tokens, etc.). Further information about the context prompt is described below with examples shown in
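A context prompt of the kind described above might be structured as the following object; all field names and values are assumptions for this sketch (the confirmation strategy names follow common SAP ATP terminology but are not taken verbatim from the disclosure):

```python
# Illustrative structure only; not the actual context prompt 430.
context_prompt = {
    "confirmation_strategies": ["Win", "Gain", "Redistribute", "Fill", "Lose"],
    "attributes": ["MaterialID", "Plant", "RequestedDeliveryDate", "CustomerID"],
    "operators": ["EQ", "NE", "GT", "LT", "BETWEEN", "IN"],
    "function_prototypes": [
        {"name": "create_bop_segment",
         "parameters": ["segment_name", "filters"]},
        {"name": "create_bop_variant",
         "parameters": ["variant_name", "segment_names", "confirmation_strategy"]},
        {"name": "execute_bop_variant",
         "parameters": ["variant_name", "simulate"]},
    ],
    "llm_config": {"model_version": "example-model", "temperature": 0.0,
                   "max_tokens": 1024},
}
```

Serializing such an object and prepending it to the conversation gives the LLM the vocabulary of strategies, attributes, operators, and callable functions it needs to set up a BOP run.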
Example System Prompt
[0093]
[0094] For example, a Role section of the system prompt 500 provides context about the user and the task at hand. It informs the LLM that its task is to assist a fulfillment manager. It specifies that the context of assistance is within SAP, particularly focusing on the Available-to-Promise (ATP) system and its Backorder Processing (BOP) component. Although BOP is an abbreviation used in various contexts, in this case, it clearly refers to ATP backorder processing in SAP. This section effectively establishes the context and the specific task of the LLM.
[0095] A Task section of the system prompt 500 outlines the primary objective (to setup a BOP run) and its associated subtasks (e.g., creating a segment, creating a variant, and simulating a variant) in a sequential manner. The task section also presents two procedural options: a guided approach and a fast-track method. The subtasks are enumerated in a logical order. This enumeration, which the LLM can process more effectively than sentence-based descriptions, also correlates with the function calls available to the LLM, as described further below.
[0096] An Information about BOP section of the system prompt 500 provides the LLM with concise and necessary details needed to create a BOP run. This section explains the dependencies and relationships between different components of a BOP run, such as segments, variants, and conditions. For instance, it explains that a BOP run always consists of one variant, which can contain multiple segments, but at least one. This illustrates a one-to-many cardinality relationship between the BOP variant and BOP segments. It also specifies that a BOP segment must include at least one criterion with at least one condition, and a condition must include an attribute, an operator, and at least one value. This information aids the LLM in ensuring the necessary conditions for a function call are met.
[0097] A Language section of the system prompt 500 sets the tone and formality of the conversation. It also limits the scope of the response and provides guidelines on how the LLM should react to instructions unrelated to BOP or ATP. The guidelines cover both the content and form of the answer to reduce the risk of hallucination by the LLM. It also instructs the LLM to add positive comments for a better user experience.
[0098] An Additional section of the system prompt 500 provides further instructions to improve the flow of the conversation. It includes guidelines on when to ask for clarifications, make suggestions, and how to handle multiple requests in a single user input. Specifically, it instructs the LLM to identify missing input values of the functions and request the user to provide the missing input values.
[0099] An Output section of the system prompt 500 emphasizes that the LLM's responses should be in natural language and text form only (e.g., it discourages the use of enumerations or JSON files in responses).
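Assembling the six sections described above into a single system prompt could look like the following sketch; the section wording is invented for illustration and is not the actual system prompt 500:

```python
# Hypothetical section contents; real wording would be far more detailed.
SECTIONS = {
    "Role": "You assist a fulfillment manager with SAP ATP backorder "
            "processing (BOP).",
    "Task": "1. Create a segment. 2. Create a variant. 3. Simulate the "
            "variant. Support a guided approach and a fast-track approach.",
    "Information about BOP": "A BOP run has exactly one variant; a variant "
            "contains at least one segment; a segment needs at least one "
            "condition (attribute, operator, at least one value).",
    "Language": "Answer formally; decline requests unrelated to BOP or ATP.",
    "Additional": "Identify missing function inputs and ask the user for "
            "them before calling a function.",
    "Output": "Respond in natural-language text only; no enumerations or "
            "JSON in responses.",
}

def build_system_prompt(sections):
    """Join the named sections into one system prompt string."""
    return "\n\n".join(f"# {name}\n{text}" for name, text in sections.items())

system_prompt = build_system_prompt(SECTIONS)
```

Keeping the sections as a dictionary makes it easy to maintain or swap individual sections through prompt engineering without touching the rest.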
[0100] Thus, the system prompt 500 can serve as a comprehensive guide for the LLM to assist a user in setting up a BOP run in SAP's ATP system. It provides clear instructions, context, and guidelines to ensure an effective and user-friendly interaction. It should be understood that the system prompt 500 depicted in
Example Prototype Definitions of BOP Functions in Context Prompt
[0101] As described above, a context prompt sent to the LLM can provide contextual information about the BOP, including how to invoke different BOP functions or APIs.
[0102] As an example,
[0103] As another example,
[0104] As yet another example,
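A prototype definition for the segment-creation function might resemble the JSON-schema style commonly used for LLM function calling; the names and fields below are hypothetical, not the actual prototype definitions 438:

```python
import json

# Hypothetical prototype definition for creating a BOP segment.
create_segment_prototype = {
    "name": "create_bop_segment",
    "description": "Create a BOP segment that filters order requirements.",
    "parameters": {
        "type": "object",
        "properties": {
            "segment_name": {"type": "string"},
            "conditions": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "attribute": {"type": "string"},
                        "operator": {"type": "string"},
                        "values": {"type": "array",
                                   "items": {"type": "string"}},
                    },
                    "required": ["attribute", "operator", "values"],
                },
            },
        },
        "required": ["segment_name", "conditions"],
    },
}

serialized = json.dumps(create_segment_prototype)
```

Supplying such schemas in the context prompt tells the LLM exactly which arguments each backend BOP function expects, so it can emit well-formed function calls.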
Example Attributes, Operators, and Confirmation Strategies in Context Prompt
[0105] As described above, the context prompt sent to the LLM can also include a list of attributes which can be used to define filters in BOP segments, a list of operators which can be paired with any of the attributes used to define those filters, and a list of confirmation strategies which can be assigned to different BOP segments.
[0106] As an example,
[0107] As described herein, the list of attributes 900 can be retrieved in runtime from database tables corresponding to the BOP segments. This dynamic retrieval process is significant as it allows for real-time customization and flexibility in defining the BOP segments based on the current state of the database tables. Importantly, this automatic runtime retrieval process is crucial due to the potentially large size of the attribute list and the diverse sources of the database tables. It ensures that the most accurate and up-to-date attribute information is used in the BOP and eliminates the potential for errors and inefficiencies that could otherwise arise from manual retrieval methods. Thus, this automatic retrieval process can significantly improve the efficiency and accuracy of BOP.
[0108] As another example,
[0109] As a further example,
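Given such lists of attributes and operators, a filter condition (an attribute, an operator, and one or more values) could be validated as in the following sketch; the allowed lists are illustrative, not the actual lists supplied in the context prompt:

```python
# Hypothetical allowed lists; in the described system these are retrieved
# in runtime from database tables and the context prompt.
ALLOWED_ATTRIBUTES = {"MaterialID", "Plant", "CustomerID"}
ALLOWED_OPERATORS = {"EQ", "NE", "IN", "BETWEEN"}

def validate_filter(f):
    """A valid filter needs a known attribute, a known operator,
    and at least one attribute value."""
    return (f.get("attribute") in ALLOWED_ATTRIBUTES
            and f.get("operator") in ALLOWED_OPERATORS
            and len(f.get("values", [])) >= 1)

ok = validate_filter({"attribute": "Plant", "operator": "EQ",
                      "values": ["0001"]})
bad = validate_filter({"attribute": "Plant", "operator": "EQ",
                       "values": []})
```

Checking filters against the supplied lists before invoking the segment-creation function is one way to catch an LLM hallucinating an attribute or operator that does not exist.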
Example Objects in Context Prompt Representing Chat Sessions
[0110] In some examples, the context prompt can further include example objects. An example object includes one or more example user prompts and corresponding output of the LLM generated in response to the one or more example user prompts. In other words, each example object represents a chat session, or back-and-forth conversation, between the user and the LLM. Incorporating example objects in the context prompt can improve the performance of the LLM because these example objects provide the LLM with a clear demonstration of expected behaviors. The example objects can also serve as a form of training data, enabling the LLM to learn and adapt over time, thereby improving its ability to generate relevant and accurate responses. Providing example objects in the context prompt can also facilitate more natural and engaging conversations, as the LLM can reference these examples to follow the flow of the conversation more effectively.
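An example object of the kind described above might be structured as follows; the messages are invented for this sketch:

```python
# Hypothetical few-shot example object representing one short chat session
# using the guided approach.
example_object = {
    "description": "Guided approach: LLM asks for the missing strategy",
    "messages": [
        {"role": "user",
         "content": "Please confirm my backorders for plant 0001."},
        {"role": "assistant",
         "content": "Which confirmation strategy should the variant use, "
                    "for example Redistribute?"},
        {"role": "user", "content": "Use Redistribute."},
        {"role": "assistant",
         "content": "Understood. I will create the segment and variant now."},
    ],
}
```

Several such objects, covering both the guided and fast-track approaches, can be serialized into the context prompt as few-shot demonstrations.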
[0111] As an example,
[0112]
Example Use Case
[0113] Two example use cases (one using the guided approach, and another using the fast-track approach) are described herein to illustrate the operation of AI-powered BOP.
[0114] In a first use case,
[0115] Next, the LLM proposes creating a BOP variant to receive the new BOP segment and explains the need for a confirmation strategy. It also provides suggestions based on the problem definition and system prompt. In this example, the user asks the LLM to explain different confirmation strategies. In response, the LLM provides the explanations, followed by another request for the user to select a confirmation strategy, as this is a required input for creating a BOP variant. After the user selects the Redistribute confirmation strategy, the LLM executes another function call to create a BOP variant (the previously user-defined prefix is reused for naming the BOP variant). The backend function call successfully creates the variant and returns a status message, which the LLM processes and communicates to the user.
[0116] Finally, the LLM proposes simulating the BOP run with the new BOP variant. After receiving the user's consent, the LLM executes a function call to simulate execution of the BOP variant and the results are returned to the user.
[0117] In a second use case,
Example Advantages
[0118] The technologies described herein offer a multitude of advantages, particularly when compared to traditional approaches. As described above, in traditional ERP systems, users often face challenges navigating through multiple applications to configure BOP segments, create BOP variants, schedule BOP runs, etc. Each application requires users to manually translate their intentions into selection criteria and confirmation strategies. This process can be laborious and requires a comprehensive understanding of the various technical aspects of BOP. Inexperienced users may also incorrectly set up BOP, resulting in undesirable confirmation outcomes and potential errors. Indeed, particularly in time-sensitive scenarios, the conventional approach of setting up many BOP runs, each having many BOP segments with various filter settings, can prove impractical and error-prone.
[0119] The AI-powered solution disclosed herein offers a practical solution for complex BOP scenarios. It streamlines the intricate BOP workflow by using a chat interface to automate the setup and execution of BOP, thereby eliminating the need for users to navigate through multiple applications and manually configure each step. This high-level automation not only conserves time but also reduces the likelihood of errors, leading to a more efficient and effective order fulfillment process.
[0120] Specifically, by harnessing the power of AI, the disclosed technologies facilitate a more user-friendly and efficient approach to setting up and executing BOP, thereby overcoming many of the challenges and inefficiencies associated with traditional methods. Notably, all BOP function calls are performed in the backend of the ERP system, encapsulated by the BOP engine, which means the technical implementation of many BOP tasks are behind the scenes. The system can intelligently prompt the user to provide any missing information to set up the BOP and provides a detailed system prompt and rich context prompt (including automatic retrieval of attributes of BOP segments from database tables) to the LLM. This allows the LLM to accurately interpret the user's intent based on user prompts and perform proper BOP tasks.
Example Computing Systems
[0121]
[0122] With reference to
[0123] A computing system 1800 can have additional features. For example, the computing system 1800 can include storage 1840, one or more input devices 1850, one or more output devices 1860, and one or more communication connections 1870, including input devices, output devices, and communication connections for interacting with a user. An interconnection mechanism (not shown) such as a bus, controller, or network can interconnect the components of the computing system 1800. Typically, operating system software (not shown) can provide an operating environment for other software executing in the computing system 1800, and coordinate activities of the components of the computing system 1800.
[0124] The tangible storage 1840 can be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium which can be used to store information in a non-transitory way and which can be accessed within the computing system 1800. The storage 1840 can store instructions for the software implementing one or more innovations described herein.
[0125] The input device(s) 1850 can be an input device such as a keyboard, mouse, pen, or trackball, a voice input device, a scanning device, touch device (e.g., touchpad, display, or the like) or another device that provides input to the computing system 1800. The output device(s) 1860 can be a display, printer, speaker, CD-writer, or another device that provides output from the computing system 1800.
[0126] The communication connection(s) 1870 can enable communication over a communication medium to another computing entity. The communication medium can convey information such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal. A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can use an electrical, optical, RF, or other carrier.
[0127] The innovations can be described in the context of computer-executable instructions, such as those included in program modules, being executed in a computing system on a target real or virtual processor (e.g., which is ultimately executed on one or more hardware processors). Generally, program modules or components can include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The functionality of the program modules can be combined or split between program modules as desired in various embodiments. Computer-executable instructions for program modules can be executed within a local or distributed computing system.
[0128] For the sake of presentation, the detailed description uses terms like "determine" and "use" to describe computer operations in a computing system. These terms are high-level descriptions for operations performed by a computer and should not be confused with acts performed by a human being. The actual computer operations corresponding to these terms vary depending on implementation.
Computer-Readable Media
[0129] Any of the computer-readable media herein can be non-transitory (e.g., volatile memory such as DRAM or SRAM, nonvolatile memory such as magnetic storage, optical storage, or the like) and/or tangible. Any of the storing actions described herein can be implemented by storing in one or more computer-readable media (e.g., computer-readable storage media or other tangible media). Any of the things (e.g., data created and used during implementation) described as stored can be stored in one or more computer-readable media (e.g., computer-readable storage media or other tangible media). Computer-readable media can be limited to implementations not consisting of a signal.
[0130] Any of the methods described herein can be implemented by computer-executable instructions in (e.g., stored on, encoded on, or the like) one or more computer-readable media (e.g., computer-readable storage media or other tangible media) or one or more computer-readable storage devices (e.g., memory, magnetic storage, optical storage, or the like). Such instructions can cause a computing device to perform the method. The technologies described herein can be implemented in a variety of programming languages.
Example Cloud Computing Environment
[0131]
[0132] The cloud computing services 1910 can be utilized by various types of computing devices (e.g., client computing devices), such as computing devices 1920, 1922, and 1924. For example, the computing devices (e.g., 1920, 1922, and 1924) can be computers (e.g., desktop or laptop computers), mobile devices (e.g., tablet computers or smart phones), or other types of computing devices. For example, the computing devices (e.g., 1920, 1922, and 1924) can utilize the cloud computing services 1910 to perform computing operations (e.g., data processing, data storage, and the like).
[0133] In practice, cloud-based, on-premises-based, or hybrid scenarios can be supported.
Example Implementations
[0134] In any of the examples herein, a software application (or application) can take the form of a single application or a suite of a plurality of applications, whether offered as a service (SaaS), in the cloud, on premises, on a desktop, mobile device, wearable, or the like.
[0135] Although the operations of some of the disclosed methods are described in a particular, sequential order for convenient presentation, such manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth herein. For example, operations described sequentially can in some cases be rearranged or performed concurrently.
[0136] As described in this application and in the claims, the singular forms "a," "an," and "the" include the plural forms unless the context clearly dictates otherwise. Additionally, the term "includes" means "comprises." Further, "and/or" means "and" or "or," as well as "and" and "or."
[0137] In any of the examples described herein, an operation performed in runtime means that the operation can be completed in real time or with negligible processing latency (e.g., the operation can be completed within 1 second, etc.).
Example Clauses
[0138] Any of the following example clauses can be implemented. [0139] Clause 1. An enterprise resource planning (ERP) system for improved backorder processing (BOP), the ERP system comprising: memory; one or more hardware processors coupled to the memory; and one or more computer readable storage media storing instructions that, when loaded into the memory, cause the one or more hardware processors to perform operations comprising: receiving one or more user prompts from a user interface; creating, in runtime, a BOP segment using a large language model, wherein the BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts, wherein a filter is defined by an attribute, an operator, and one or more attribute values; creating, in runtime, a BOP variant using the large language model, wherein the BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts; and executing the BOP variant using the large language model, wherein the executing comprises batch processing the subset of the plurality of order requirements using the confirmation scheme. [0140] Clause 2. The ERP system of clause 1, wherein the operations further comprise prompting, in runtime, the large language model with the one or more user prompts and a system prompt, wherein the system prompt defines a cardinality relationship between the BOP segment and the BOP variant. [0141] Clause 3. The ERP system of clause 2, wherein the operations further comprise prompting, in runtime, the large language model with a context prompt, wherein the context prompt defines a plurality of confirmation schemes, one of which is defined for the BOP segment by the BOP variant. [0142] Clause 4. The ERP system of clause 3, wherein the context prompt comprises prototype definitions of functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant, respectively.
[0143] Clause 5. The ERP system of clause 4, wherein the system prompt is configured to instruct the large language model to sequentially invoke the functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant in backend. [0144] Clause 6. The ERP system of any one of clauses 4-5, wherein the system prompt is configured to instruct the large language model to identify missing input values of the functions and request the missing input values from the user interface. [0145] Clause 7. The ERP system of any one of clauses 3-6, wherein the context prompt comprises a list of attributes, from which attributes are selected to define the one or more filters, wherein the operations further comprise retrieving, in runtime, the list of attributes from database tables corresponding to the BOP segment. [0146] Clause 8. The ERP system of any one of clauses 3-7, wherein the context prompt comprises a list of operators, from which operators are selected to define the one or more filters. [0147] Clause 9. The ERP system of any one of clauses 3-8, wherein the context prompt comprises one or more structured objects, wherein a structured object comprises one or more example user prompts and corresponding output of the large language model generated in response to the one or more example user prompts. [0148] Clause 10. The ERP system of any one of clauses 3-9, wherein the context prompt comprises one or more configuration parameters of the large language model. [0149] Clause 11.
A computer-implemented method for improved backorder processing (BOP) in an enterprise resource planning (ERP) system, the method comprising: receiving one or more user prompts from a user interface; creating, in runtime, a BOP segment using a large language model, wherein the BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts, wherein a filter is defined by an attribute, an operator, and one or more attribute values; creating, in runtime, a BOP variant using the large language model, wherein the BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts; and executing the BOP variant using the large language model, wherein the executing comprises batch processing the subset of the plurality of order requirements using the confirmation scheme. [0150] Clause 12. The computer-implemented method of clause 11, the method further comprising prompting, in runtime, the large language model with the one or more user prompts and a system prompt, wherein the system prompt defines a cardinality relationship between the BOP segment and the BOP variant. [0151] Clause 13. The computer-implemented method of clause 12, the method further comprising prompting, in runtime, the large language model with a context prompt, wherein the context prompt defines a plurality of confirmation schemes, one of which is defined for the BOP segment by the BOP variant. [0152] Clause 14. The computer-implemented method of clause 13, wherein the context prompt comprises prototype definitions of functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant, respectively. [0153] Clause 15.
The computer-implemented method of clause 14, wherein the system prompt is configured to instruct the large language model to sequentially invoke the functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant in the backend.

[0154] Clause 16. The computer-implemented method of any one of clauses 13-15, wherein the context prompt comprises a list of attributes, from which attributes are selected to define the one or more filters, the method further comprising retrieving, in runtime, the list of attributes from database tables corresponding to the BOP segment.

[0155] Clause 17. The computer-implemented method of any one of clauses 13-16, wherein the context prompt comprises a list of operators, from which operators are selected to define the one or more filters.

[0156] Clause 18. The computer-implemented method of any one of clauses 13-17, wherein the context prompt comprises one or more structured objects, wherein a structured object comprises one or more example user prompts and corresponding output of the large language model generated in response to the one or more example user prompts.

[0157] Clause 19. The computer-implemented method of any one of clauses 13-18, wherein the context prompt comprises one or more configuration parameters of the large language model.

[0158] Clause 20.
One or more non-transitory computer-readable media having encoded thereon computer-executable instructions causing one or more processors to perform a method for improved backorder processing (BOP) in an enterprise resource planning (ERP) system, the method comprising: receiving one or more user prompts from a user interface; creating, in runtime, a BOP segment using a large language model, wherein the BOP segment selects a subset of a plurality of order requirements using one or more filters determined based on the one or more user prompts, wherein a filter is defined by an attribute, an operator, and one or more attribute values; creating, in runtime, a BOP variant using the large language model, wherein the BOP variant defines a confirmation scheme for the BOP segment based on the one or more user prompts; and executing the BOP variant using the large language model, wherein the executing comprises batch processing the subset of the plurality of order requirements using the confirmation scheme.
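As an illustrative, non-limiting sketch (not part of the claims), the sequential function invocation described above, in which the large language model is instructed to call backend functions for creating the BOP segment, creating the BOP variant, and executing the BOP variant in order, might be realized along the following lines. All function names, signatures, attribute names, and the confirmation scheme identifier are hypothetical placeholders for whatever the context prompt's prototype definitions actually specify.

```python
# Hypothetical sketch of the three backend functions whose prototype
# definitions could appear in a context prompt, plus a sequential
# invocation of the kind the system prompt would instruct the LLM to make.

def create_bop_segment(attribute, operator, values):
    """Create a BOP segment: a filter on order requirements,
    defined by an attribute, an operator, and attribute values."""
    return {"filters": [{"attribute": attribute,
                         "operator": operator,
                         "values": values}]}

def create_bop_variant(segment, confirmation_scheme):
    """Create a BOP variant: attach a confirmation scheme to the segment."""
    return {"segment": segment, "scheme": confirmation_scheme}

def execute_bop_variant(variant, order_requirements):
    """Batch-process the subset of order requirements selected by
    the segment's filters, using the variant's confirmation scheme."""
    flt = variant["segment"]["filters"][0]
    if flt["operator"] == "EQ":
        subset = [r for r in order_requirements
                  if r.get(flt["attribute"]) in flt["values"]]
    else:
        subset = list(order_requirements)  # other operators omitted here
    return {"processed": subset, "scheme": variant["scheme"]}

# Sequential invocation, mirroring clauses 5 and 15:
orders = [{"plant": "1000", "id": 1}, {"plant": "2000", "id": 2}]
segment = create_bop_segment("plant", "EQ", ["1000"])
variant = create_bop_variant(segment, "WIN")
result = execute_bop_variant(variant, orders)
```

In this sketch the filter selects only the order requirement whose `plant` attribute equals `"1000"`, and that subset is then batch-processed under the (hypothetical) `"WIN"` confirmation scheme.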
Example Alternatives
[0159] It should be noted that the technologies described herein that leverage API metadata for intelligent user query answering are merely exemplary, and alternative approaches can be taken. For example, one alternative could be using UI metadata, which can be mapped to natural language input from the user. Automation tools such as Selenium could be used to automate navigation to a given page and UI element. A hybrid approach using both APIs and the UI could also be considered, where the source of information is chosen based on several factors. For instance, because retrieving answers via the UI can be performance intensive, APIs could be the default choice for answering queries. If the user is already on a UI page that has the context of the user's query, it might be preferable to answer the question directly via the UI element on that page. For navigation-related queries, or for rendering complex user interfaces such as graphs and pie charts within the chat interface, intelligent UI-based navigation or direct rendering of the UI element within the chat interface could be used.
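The hybrid selection between APIs and the UI described above could be sketched as a simple routing rule. This is a hypothetical illustration only; the function name and the three decision factors are assumptions standing in for whatever factors an actual implementation would weigh.

```python
# Hypothetical routing rule for the hybrid API/UI approach:
# default to APIs (less performance intensive), but use the UI when the
# user is already on a page matching the query's context, or when the
# answer requires rendering a complex UI element (e.g., a chart) in chat.

def choose_source(query_context, current_page_context, needs_rendering):
    """Return 'UI' or 'API' as the source for answering a user query."""
    if needs_rendering:
        return "UI"   # render graphs, pie charts, etc. in the chat interface
    if query_context == current_page_context:
        return "UI"   # answer via the UI element already on the page
    return "API"      # default choice: retrieve the answer via APIs
```

For example, a query about orders asked while viewing an invoices page would be routed to an API, while the same query asked from the orders page could be answered from the page's own UI element.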
[0160] The technologies from any example can be combined with the technologies described in any one or more of the other examples. In view of the many possible embodiments to which the principles of the disclosed technology can be applied, it should be recognized that the illustrated embodiments are examples of the disclosed technology and should not be taken as a limitation on the scope of the disclosed technology. Rather, the scope of the disclosed technology includes what is covered by the scope and spirit of the following claims.