Patent classifications
G06F3/0482
Receiving voice samples from listeners of media programs
Listeners to media programs provide feedback to creators or other entities associated with the media programs in the form of one or more spoken utterances. When a listener speaks one or more words into a microphone or other system, the words are captured and processed to determine an emotion of the listener, or to determine whether the words include any objectionable content. Data including the spoken words is captured, stored, and presented to the creator of the media program. Notifications of the utterances are provided to the creator, who may identify one of the utterances and include it in the media program, or invite the listener who provided the utterance to participate in the media program.
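The capture-and-screen pipeline described above could be sketched as follows. This is a minimal illustration, not the disclosed implementation: the denylist, field names, and `screen_utterance` function are all assumptions.

```python
# Hypothetical sketch of the described pipeline: a transcribed
# utterance is screened for objectionable content and packaged as a
# feedback record for the creator's review. The word list is a
# placeholder, not from the disclosure.
OBJECTIONABLE = {"spam", "scam"}

def screen_utterance(transcript: str) -> dict:
    """Return a feedback record flagging any objectionable words."""
    words = transcript.lower().split()
    flagged = sorted(set(words) & OBJECTIONABLE)
    return {
        "transcript": transcript,
        "objectionable": bool(flagged),
        "flagged_words": flagged,
    }

record = screen_utterance("Great show, not a scam at all")
```

A real system would work from speech-recognition output and a far richer classifier (including the emotion detection the abstract mentions), but the record shape conveys the idea of what gets stored and surfaced to the creator.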
Dynamic address-based dashboard customization
Systems and methods are provided for dynamic configuration of interactive controls available on a dashboard. Interactive controls may be dynamically configured by manipulating network resource address information for a network resource that provides a dashboard, for example using query string parameters. For example, a dashboard that displays one type, source, or summary of information can be dynamically configured to allow interactive selection and display of another type, source, or summary of information depending on values passed in the network resource address information for the dashboard network resource.
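The query-string mechanism described above can be sketched in a few lines. The parameter names (`source`, `view`) and defaults are illustrative assumptions, not from the disclosure.

```python
from urllib.parse import urlparse, parse_qs

# Assumed default dashboard configuration; query string parameters
# in the dashboard's own URL override these values.
DEFAULTS = {"source": "sales", "view": "summary"}

def dashboard_config(url: str) -> dict:
    """Merge query string parameters over default dashboard settings."""
    params = parse_qs(urlparse(url).query)
    config = dict(DEFAULTS)
    for key in config:
        if key in params:
            config[key] = params[key][0]  # first value wins
    return config
```

With this shape, `dashboard_config("https://example.com/dash?source=inventory")` yields an inventory dashboard while an unparameterized address falls back to the defaults, matching the abstract's idea of the same network resource presenting different types, sources, or summaries of information.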
Systems and methods for interacting with three-dimensional graphical user interface elements to control computer operation
Disclosed are three-dimensional (“3D”) graphical user interface (“GUI”) elements for improving user interactions with a digital environment or a device by simplifying access to different data, functionality, and operations of the digital environment or the device. A 3D GUI element may include first visual information at a first position and second visual information at a second position within the 3D space represented by the 3D GUI element. In response to first input directed to the first visual information, the 3D GUI or system may perform a first action that is mapped to the first input and the first visual information within the 3D GUI element. In response to second input directed to the second visual information, the 3D GUI or system may perform a second action that is mapped to the second input and the second visual information within the 3D GUI element.
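The mapping the abstract describes, from a (input, visual information) pair to an action, can be sketched as a lookup table. The gesture and element names here are illustrative assumptions only.

```python
# Hypothetical mapping from (input gesture, targeted visual info
# within the 3D GUI element) to an action. Keys and action names
# are illustrative, not from the disclosure.
ACTION_MAP = {
    ("tap", "front_face"): "open_messages",
    ("rotate", "front_face"): "reveal_side_face",
    ("tap", "side_face"): "open_settings",
}

def dispatch(gesture: str, target: str) -> str:
    """Return the action mapped to this input/visual-info pair."""
    return ACTION_MAP.get((gesture, target), "no_op")
```

The key point the table captures is that the same gesture produces different actions depending on which visual information within the 3D element it is directed at.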
Payload recording and comparison techniques for discovery
Persistent storage may contain an input discovery payload that contains entries representing configuration items and relationships therebetween, wherein the configuration items contain attributes defining devices, components, or applications on a network. One or more processors may be configured to: provide, for display, a graphical user interface containing a representation of the input discovery payload and a button; provide the input discovery payload to an identification and reconciliation engine (IRE) software application; receive, from the IRE software application, an output discovery payload that includes a log generated from execution of the IRE software application on the input discovery payload, wherein the log indicates, for the configuration items and the relationships in the input discovery payload, how a configuration management database (CMDB) would be updated by the IRE software application; and provide, for display, a further graphical user interface containing a further representation of the output discovery payload.
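The structure of an input discovery payload, entries for configuration items plus the relationships between them, might look like the following. The field names are assumptions for illustration, not the actual IRE/CMDB payload schema.

```python
import json

# Illustrative input discovery payload: configuration items
# (devices, components, applications) with attributes, and
# relationships referencing items by index. Field names assumed.
input_payload = {
    "items": [
        {"className": "linux_server", "values": {"name": "web-01", "ip": "10.0.0.5"}},
        {"className": "apache", "values": {"name": "httpd", "port": 80}},
    ],
    "relations": [
        {"parent": 0, "child": 1, "type": "Runs on::Runs"},
    ],
}

serialized = json.dumps(input_payload)
```

In the described workflow, a payload of this shape would be handed to the IRE, and the returned output payload would carry a log explaining, item by item and relationship by relationship, how the CMDB would be updated.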
Systems and methods for gamification of instrument inspection and maintenance
Disclosed is a gamification system for overlaying user-controlled graphical targeting elements over a real-time video feed of an instrument being inspected, and providing interactive controls for firing virtual weapons or other graphical indicators to designate and/or record the presence of contaminants, defects, and/or other issues at specific locations within or on the instrument. The system may receive and present images of the instrument under inspection in a graphical user interface (“GUI”). The system may receive user input that tags a particular region of a particular image with an issue identifier, and may generate a visualization that is presented in conjunction with the particular image in the GUI in response to receiving the input. The visualization corresponds to firing of a virtual weapon and other gaming visuals associated with tagging the particular region of the particular image with the issue identifier.
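The tagging step, user input marking a region of an image with an issue identifier and triggering a gaming visual, can be sketched as a record builder. The field names and the visualization label are illustrative assumptions.

```python
# Hypothetical record created when the user "fires" at a region of
# an inspection image to tag it with an issue identifier. Field
# names and the visual label are illustrative only.
def tag_region(image_id: str, x: int, y: int, w: int, h: int, issue: str) -> dict:
    """Record an issue at a rectangular region of an inspection image."""
    return {
        "image": image_id,
        "region": {"x": x, "y": y, "w": w, "h": h},
        "issue": issue,
        "visual": "crosshair_fire",  # gaming visual shown alongside the tag
    }
```

Each such record both documents the defect location for later review and drives the virtual-weapon visualization presented in the GUI.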
Exercise-based watch face and complications
Exercise-based watch faces and complications for use with a portable multifunction device are disclosed. The methods described herein for exercise-based watch faces and complications provide indications of time and affordances representing applications (e.g., a workout application or a weather application). In response to detecting a user input corresponding to a selection of the affordance (e.g., representing a workout application), a workout routine can optionally be started. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein, as well as electronic devices related thereto.
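The affordance behavior, tapping a complication launches its application, with a workout complication optionally starting a routine directly, can be sketched as a small dispatcher. The application and routine names are illustrative assumptions.

```python
# Hypothetical complication-tap handler: most complications open
# their application; the workout complication may instead start a
# routine immediately. Names are illustrative only.
def handle_complication_tap(kind: str, start_workout: bool = True) -> str:
    """Resolve a tap on a watch face complication to an action."""
    if kind == "workout" and start_workout:
        return "workout_started:outdoor_run"
    return f"opened:{kind}_app"
```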
Display for pump
This document discusses, among other things, an apparatus comprising a pump configured to deliver insulin, a processor, and a user interface including a bistable display. A display element of the bistable display is placed in one of two stable orientations upon application of a biasing voltage and stays in the stable orientation when the biasing voltage is removed. The processor includes a display module configured to display a non-blank reversion display screen on the bistable display when no input is received at the user interface after a specified time duration, and to recurrently change the reversion display screen until input is received at the user interface.
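The reversion behavior, falling back to a non-blank screen after an input timeout and then recurrently changing that screen until input arrives, can be sketched as a recurring sequence. The screen names and timeout are illustrative assumptions; the bistable-display property (holding an image with no power) is what makes cycling static screens cheap.

```python
import itertools

# Illustrative reversion screens and timeout; the actual screens and
# duration are not specified here.
REVERSION_SCREENS = ["insulin_remaining", "battery_level", "time_of_day"]
TIMEOUT_S = 30  # seconds of no input before reversion begins

def reversion_screens():
    """Yield reversion screens in a recurring (cycling) sequence."""
    yield from itertools.cycle(REVERSION_SCREENS)

screens = reversion_screens()
first = next(screens)  # shown once TIMEOUT_S passes with no input
```

Each screen change requires only a brief biasing voltage; between changes the bistable display holds the image with the voltage removed, consistent with the display element behavior described above.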