H04M2203/355

AUTOMATED CHATBOT GENERATION FROM AN INTERACTIVE VOICE RESPONSE TREE
20230020613 · 2023-01-19

A method comprising: receiving an interactive voice response (IVR) tree configured to implement one or more tasks, each associated with one or more IVR node paths comprising a plurality of IVR nodes arranged in a hierarchical relationship; analyzing the IVR tree to identify one or more intent IVR nodes, each associated with one of the tasks; with respect to each of the intent IVR nodes, identifying a plurality of corresponding entity IVR nodes included within the IVR node path associated with the intent IVR node; assembling one or more task-specific chatbot skills, each comprising (i) one of the intent IVR nodes, and (ii) at least some of the plurality of corresponding entity IVR nodes, wherein each of the task-specific chatbot skills is configured to perform one of the tasks by conducting a dialog with a user; and generating a chatbot comprising at least one of the task-specific chatbot skills.
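The claimed flow can be sketched in code: walk the IVR tree, treat each intent node as the root of a task, gather the entity nodes on the path beneath it, and bundle them into a skill. This is a minimal illustration only; the dict/dataclass tree shape, the "intent"/"entity" kind labels, and the traversal are assumptions, not the patented implementation.

```python
from dataclasses import dataclass, field

@dataclass
class IVRNode:
    name: str
    kind: str                              # "intent" or "entity" (assumed labels)
    children: list = field(default_factory=list)

@dataclass
class ChatbotSkill:
    intent: IVRNode
    entities: list

def assemble_skills(root: IVRNode) -> list:
    """Walk the IVR tree; for each intent node, collect the entity nodes
    on the IVR node path beneath it into one task-specific skill."""
    skills = []

    def collect_entities(node, entities):
        for child in node.children:
            if child.kind == "entity":
                entities.append(child)
            collect_entities(child, entities)

    def walk(node):
        if node.kind == "intent":
            entities = []
            collect_entities(node, entities)
            skills.append(ChatbotSkill(intent=node, entities=entities))
        for child in node.children:
            walk(child)

    walk(root)
    return skills

# Example: a "pay bill" intent whose path prompts for two entities.
root = IVRNode("main menu", "menu", [
    IVRNode("pay bill", "intent", [
        IVRNode("account number", "entity", [
            IVRNode("payment amount", "entity"),
        ]),
    ]),
])
skills = assemble_skills(root)
```

Each resulting `ChatbotSkill` pairs one intent with the entities its dialog must elicit, which is the unit the abstract describes assembling into a chatbot.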

Delayed responses by computational assistant
11521037 · 2022-12-06

An example method includes receiving, by a computational assistant executing at one or more processors, a representation of an utterance spoken at a computing device; identifying, based on the utterance, a task to be performed by the computational assistant; responsive to determining, by the computational assistant, that complete performance of the task will take more than a threshold amount of time, outputting, for playback by one or more speakers operably connected to the computing device, synthesized voice data that informs a user of the computing device that complete performance of the task will not be immediate; and performing, by the computational assistant, the task.
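The core of the method is a duration check before task execution. A minimal sketch, assuming a 5-second threshold, a `speak` callback standing in for synthesized-voice output, and a synchronous task runner (all illustrative choices, not specifics from the patent):

```python
THRESHOLD_SECONDS = 5.0  # assumed threshold for "not immediate"

def handle_utterance(task, estimated_duration, speak):
    """If complete performance is expected to exceed the threshold,
    inform the user first, then perform the task."""
    if estimated_duration > THRESHOLD_SECONDS:
        speak("This will take a little while; I'll let you know when it's done.")
    return task()

# Example: a long-running booking task triggers the advance notice.
spoken = []
result = handle_utterance(
    task=lambda: "booked",
    estimated_duration=12.0,
    speak=spoken.append,
)
```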

Visualization of training dialogs for a conversational bot

This document relates to creating and/or updating a chatbot using a graphical user interface. For example, training dialogs for a chatbot can be displayed in a tree form on a graphical user interface. Based at least on interactions between a developer and the graphical user interface, the training dialogs can be modified in the tree form, and training dialogs can be updated based on the modifications provided on the tree form via the graphical user interface.
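Displaying training dialogs "in a tree form" implies merging dialogs that share a common prefix of turns into one branch. A hedged sketch of that merge, assuming each dialog is a flat list of turn strings (the dialog format and merge rule are assumptions for illustration):

```python
def build_tree(dialogs):
    """Merge a list of dialogs (each an ordered list of turns) into a
    nested dict keyed by turn text, so shared prefixes appear once."""
    tree = {}
    for dialog in dialogs:
        node = tree
        for turn in dialog:
            node = node.setdefault(turn, {})
    return tree

# Two flight-booking dialogs share a prefix and branch at the third turn.
dialogs = [
    ["hi", "book a flight", "where to?"],
    ["hi", "book a flight", "what date?"],
    ["hi", "check my order"],
]
tree = build_tree(dialogs)
```

A developer editing a branch of such a tree edits every training dialog that passes through it, which matches the update behavior the abstract describes.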

Development of voice and other interaction applications

Among other things, a developer of an interaction application for an enterprise can create items of content to be provided to an assistant platform for use in responses to requests of end-users. The developer can deploy the interaction application using defined items of content and an available general interaction model including intents and sample utterances having slots. The developer can deploy the interaction application without requiring the developer to formulate any of the intents, sample utterances, or slots of the general interaction model.
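The key idea is that the intents, sample utterances, and slots already exist in a general interaction model; the developer only supplies content items. A minimal sketch, in which the model shape, the intent name `CheckOrderStatus`, and the `deploy` helper are all illustrative assumptions:

```python
# Prebuilt general interaction model: intents with sample utterances
# containing slots. The developer never authors any of this.
GENERAL_MODEL = {
    "CheckOrderStatus": {
        "utterances": ["where is my order {order_id}",
                       "track order {order_id}"],
        "slots": ["order_id"],
    },
}

def deploy(content_items, model=GENERAL_MODEL):
    """Bind developer-supplied response content to the prebuilt intents,
    falling back to a default where no content item is defined."""
    return {intent: content_items.get(intent, "Sorry, no answer yet.")
            for intent in model}

# The developer's entire contribution is the content item below.
app = deploy({"CheckOrderStatus": "Your order {order_id} ships tomorrow."})
```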

DELAYED RESPONSES BY COMPUTATIONAL ASSISTANT
20230054023 · 2023-02-23

An example method includes receiving, by a computational assistant executing at one or more processors, a representation of an utterance spoken at a computing device; identifying, based on the utterance, a task to be performed by the computational assistant; responsive to determining, by the computational assistant, that complete performance of the task will take more than a threshold amount of time, outputting, for playback by one or more speakers operably connected to the computing device, synthesized voice data that informs a user of the computing device that complete performance of the task will not be immediate; and performing, by the computational assistant, the task.

Contact center customization in data communications systems

Certain aspects of the disclosure are directed to customization of a contact center, using a data communications server. According to a specific example, the data communications server includes circuitry configured and arranged to provide data communications services to a plurality of remotely-situated client entities. The data communications server further provides a graphical user interface (GUI) for each respective remotely-situated client entity, the GUI including a display of communications-based campaigns, interactive voice response (IVR) tools, and data analytics. Moreover, the data communications server provides a display on the GUI including selectable components to create for the associated remotely-situated client entity, a customization for handling incoming data communications by the data communications server. Accordingly, the data communications server may handle communications for the associated remotely-situated client entity according to the provided customization.

MODULAR EMERGENCY COMMUNICATION FLOW MANAGEMENT SYSTEM

Disclosed are systems, methods, and media capable of generating and implementing emergency flows. The emergency flow is highly customizable and can connect with various stakeholders (user, emergency contacts, corporate representatives, emergency service provider personnel, etc.). The systems, methods, and media can be triggered in various ways including user input (e.g. voice command) or by sensors (e.g. by using sound detection capabilities).
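The trigger-and-notify behavior can be sketched as a small dispatcher: any recognized trigger (user input or sensor event) starts the flow, which then contacts each configured stakeholder. Trigger names and the stakeholder list are illustrative assumptions, not details from the disclosure:

```python
def run_emergency_flow(trigger, stakeholders):
    """Start the flow for any recognized trigger and notify each
    configured stakeholder; unrecognized triggers start nothing."""
    recognized = {"voice_command", "sound_detection", "button_press"}
    if trigger not in recognized:
        return []
    return [f"notified {s} (trigger: {trigger})" for s in stakeholders]

# Example: a sensor-detected sound starts the flow and notifies three parties.
messages = run_emergency_flow(
    "sound_detection",
    ["user", "emergency contact", "emergency service provider"],
)
```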

User Experience Workflow Configuration
20230030849 · 2023-02-02

A user experience workflow may be configured based on input received for various object types selectively arranged within the user experience workflow and then bound to a destination identifier, such as a telephone number or web address. A user interface of software for configuring a user experience workflow is presented at a user device and input from that user device is used to selectively arrange objects within a user experience workflow and/or to configure objects thereof. After configurations are applied to the objects, the user experience workflow is bound to the destination identifier. An end user device which accesses the destination identifier (e.g., by calling the telephone number, visiting the web address, or using an application connecting to the web address) may then traverse the user experience workflow, including in some cases having configured content presented thereto.
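The bind-then-traverse mechanism can be sketched as a registry mapping a destination identifier (phone number or web address) to an ordered list of configured objects; accessing the identifier walks that list. The registry, object shapes, and helper names below are assumptions for illustration:

```python
# Registry from destination identifier to its bound workflow.
bindings = {}

def bind(destination, workflow):
    """Associate an ordered list of configured workflow objects with a
    destination identifier such as a telephone number or web address."""
    bindings[destination] = workflow

def traverse(destination):
    """Return the configured content presented to an end user device
    that accesses the destination identifier."""
    return [obj["content"] for obj in bindings.get(destination, [])]

# Example: bind a two-step workflow to a phone number, then traverse it.
bind("+1-555-0100", [
    {"type": "greeting", "content": "Thanks for calling."},
    {"type": "menu", "content": "Press 1 for sales, 2 for support."},
])
steps = traverse("+1-555-0100")
```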