G06F3/04812

SMALL WINDOW EXIT METHOD, ELECTRONIC DEVICE AND STORAGE MEDIUM

A small window exit method is applied to a terminal and includes: displaying an exit region corresponding to a small window when the small window is displayed on a screen of the terminal; and executing an exit operation on the small window when a preset operation directed at the exit region is detected.
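The abstract describes a hit-test against a dedicated exit region. A minimal sketch of that flow, with hypothetical `Rect` and `SmallWindow` types (the patent does not specify the gesture or region geometry):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class SmallWindow:
    """Hypothetical floating small window with an associated exit region."""
    def __init__(self, bounds: Rect, exit_region: Rect):
        self.bounds = bounds
        # The exit region is shown while the small window is displayed.
        self.exit_region = exit_region
        self.visible = True

    def on_gesture(self, px: float, py: float) -> bool:
        # Preset operation (assumed here: a press landing in the exit region)
        # triggers the exit operation, i.e. hides the small window.
        if self.visible and self.exit_region.contains(px, py):
            self.visible = False
        return self.visible
```

A gesture inside the exit region closes the window; any other gesture leaves it visible.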

Input Session between Devices based on an Input Trigger

Techniques for establishing an input session between devices based on an input trigger are described and may be implemented to enable a first device (e.g., a mobile device) to serve as an input device for a second device. Generally, the described implementations enable multiple different input triggers to be utilized to trigger an input session between devices, such as for enabling proximity-based input (e.g., stylus input, touch input, etc.) to a first device to be provided as input to a second device.
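The trigger-then-forward structure described above can be sketched as follows; the trigger kinds and event format are illustrative assumptions, not the patent's protocol:

```python
class InputSession:
    """Hypothetical session that forwards input events captured on a source
    device (e.g., a phone taking stylus/touch input) to a target device."""

    # Several different triggers may start a session (proximity of a stylus,
    # a touch, a pairing gesture, ...) -- this set is an assumption.
    TRIGGERS = {"stylus_proximity", "touch", "gesture"}

    def __init__(self, source: str, target: str):
        self.source, self.target = source, target
        self.active = False

    def trigger(self, kind: str) -> bool:
        # Any recognized input trigger activates the session.
        if kind in self.TRIGGERS:
            self.active = True
        return self.active

    def forward(self, event: dict):
        # While active, input to the source device is routed to the target.
        if not self.active:
            return None
        return {"to": self.target, "from": self.source, "event": event}
```

Events arriving before a trigger fires are dropped; after the trigger, they are wrapped and routed to the second device.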

Tab visibility
11556227 · 2023-01-17

According to one general aspect, a computing device may include an application configured to create a tab in a context of a window, and a window manager configured to register the tab with a first UI element registry. The window manager may be configured to receive, over a network, at least a portion of a second UI element registry from a secondary window manager of a secondary computing device. The portion of the second UI element registry may identify a remote tab previously registered with the secondary window manager. The window manager may be configured to cause a display to provide a graphical arrangement of the tab and the remote tab.
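A minimal sketch of the registry merge the abstract describes, using an assumed dict-based registry keyed by tab id (the patent does not specify the registry's data structure or wire format):

```python
class WindowManager:
    """Hypothetical window manager keeping a UI-element registry of tabs."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.registry = {}  # tab_id -> metadata

    def register_tab(self, tab_id: str, title: str) -> None:
        # An application creates a tab; the window manager registers it.
        self.registry[tab_id] = {"title": title, "device": self.device_id}

    def receive_remote(self, portion: dict) -> None:
        # Merge a portion of a secondary device's registry received over
        # the network; it identifies tabs registered on that device.
        self.registry.update(portion)

    def arrangement(self) -> list:
        # Graphical arrangement combining local and remote tabs
        # (ordering local-first is an illustrative choice).
        local = [t for t, m in self.registry.items() if m["device"] == self.device_id]
        remote = [t for t, m in self.registry.items() if m["device"] != self.device_id]
        return local + remote
```

After the merge, a single arrangement interleaves tabs from both devices for display.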

RELOCATION OF CONTENT ITEM TO MOTION PICTURE SEQUENCES AT MULTIPLE DEVICES
20230008575 · 2023-01-12

An electronic presentation system comprising: one or more computer processors operatively coupled to one or more computer memories storing a set of instructions for configuring the one or more computer processors to perform operations comprising: causing display of a motion picture image sequence (MPIS), captured at a first computing device, within display screens at multiple computing devices; causing display of first multiple content items, on the display screens of the multiple computing devices, separate from the MPIS; and based at least in part upon receiving, at the presentation system, first relocation information from the first computing device indicating a first content item, causing relocation of display of the first content item to within the MPIS displayed at the multiple computing devices.
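The relocation operation, mirrored across every device's view, can be sketched like this; the per-device state layout and item names are assumptions for illustration:

```python
class PresentationSystem:
    """Hypothetical presentation state mirrored to multiple devices."""

    def __init__(self, devices: list):
        # Each device shows the MPIS plus content items displayed outside it.
        self.state = {
            d: {"mpis_items": [], "side_items": ["item1", "item2"]}
            for d in devices
        }

    def relocate(self, item: str) -> None:
        # Relocation information names a content item; move its display
        # into the MPIS on every participating device.
        for view in self.state.values():
            if item in view["side_items"]:
                view["side_items"].remove(item)
                view["mpis_items"].append(item)
```

One relocation message from the first device updates the arrangement shown at all devices.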

User-defined groups of graphical objects

In an example, a computer-implemented method to group graphical objects includes displaying, on a display device, a graphical diagram with multiple graphical objects that represent data of a data source. The method includes receiving input to define one or more groups. The method includes, in response to the input, generating one or more containers, each of the one or more containers representing a different one of the one or more groups; and graphically depicting membership of the graphical objects in the one or more groups by relative arrangement of the graphical objects and the one or more containers according to group membership of each of the graphical objects.
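The container-generation step can be sketched as a simple partition of objects by group membership; the mapping-based input format is an assumption, since the abstract does not specify how group-defining input is encoded:

```python
def build_containers(objects: list, membership: dict):
    """Hypothetical grouping: membership maps each object to a group
    name, or omits it for ungrouped objects. Returns one container per
    group plus the objects left outside any container."""
    containers = {}
    ungrouped = []
    for obj in objects:
        group = membership.get(obj)
        if group is None:
            ungrouped.append(obj)
        else:
            # Each distinct group gets its own container.
            containers.setdefault(group, []).append(obj)
    return containers, ungrouped
```

A renderer could then depict membership by drawing each container around its members, arranging ungrouped objects outside all containers.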

System and method for tracking changes between a current state and a last state seen by a user

A system and method for tracking differences between a last state seen by a user and a current state is provided. A user views a graphical user interface (GUI) window that displays one or more states. Should the user's focus shift from the GUI and return after one or more states displayed therein have been modified, the system highlights the change between the current state and the user's last seen state.
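The highlight-on-return behavior amounts to diffing the current state against a snapshot taken when focus last left. A minimal sketch, with an assumed flat key/value state:

```python
class ChangeTracker:
    """Highlights fields that changed since the user last saw the GUI."""

    def __init__(self, state: dict):
        self.current = dict(state)
        self.last_seen = dict(state)  # snapshot of what the user has seen

    def update(self, **changes) -> None:
        # State keeps changing while the user's focus is elsewhere.
        self.current.update(changes)

    def on_focus_returned(self) -> dict:
        # Diff current state against the last state the user saw.
        highlights = {k: v for k, v in self.current.items()
                      if self.last_seen.get(k) != v}
        # The user has now seen the current state.
        self.last_seen = dict(self.current)
        return highlights
```

The returned dict drives the highlighting; a second focus return with no intervening changes highlights nothing.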

ADAPTIVE SMOOTHING BASED ON USER FOCUS ON A TARGET OBJECT

Techniques described herein dynamically adapt an amount of smoothing that is applied to signals of a device (e.g., positions and/or orientations of an input mechanism, positions and/or orientations of an output mechanism) based on a determined distance between an object and the device, or based on a determined distance between the object and another device (e.g., a head-mounted device). The object can comprise one of a virtual object presented on a display of the head-mounted device or a real-world object within a view of the user. The object can be considered a “target” object based on a determination that a user is focusing on, or targeting, the object. For example, the head-mounted device or other devices can sense data associated with an eye gaze of a user and can determine, based on the sensed data, that the user is looking at the target object.
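One plausible realization of distance-adaptive smoothing is an exponential filter whose coefficient scales with the distance to the target object; the linear mapping and all constants below are illustrative assumptions, not the patent's formula:

```python
class AdaptiveSmoother:
    """Exponential smoothing whose strength adapts to the distance between
    the device and the target object: distant targets get heavier smoothing
    to suppress jitter, near ones lighter smoothing for responsiveness."""

    def __init__(self, min_alpha=0.2, max_alpha=0.9, max_distance=5.0):
        self.min_alpha, self.max_alpha = min_alpha, max_alpha
        self.max_distance = max_distance
        self.value = None

    def update(self, sample: float, distance: float) -> float:
        # alpha -> max_alpha when close (track quickly),
        # alpha -> min_alpha when far (smooth heavily).
        t = min(distance / self.max_distance, 1.0)
        alpha = self.max_alpha - t * (self.max_alpha - self.min_alpha)
        self.value = sample if self.value is None else (
            alpha * sample + (1 - alpha) * self.value)
        return self.value
```

The same filter could be applied per axis to the positions and orientations of an input or output mechanism, with `distance` supplied by the gaze-targeting logic.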