Patent classifications
G06F2203/04801
Systems and methods for controlling cursor behavior
Systems, methods, and non-transitory computer readable media containing instructions for causing at least one processor to perform operations to enable cursor control in an extended reality space are provided. In one implementation, the processor is configured to perform operations comprising receiving from an image sensor first image data reflecting a first region of focus of a user of a wearable extended reality appliance; causing a first presentation of a virtual cursor in the first region of focus; receiving from the image sensor second image data reflecting a second region of focus of the user outside the user's initial field of view in the extended reality space; receiving input data indicative of a desire of the user to interact with the virtual cursor; and causing a second presentation of the virtual cursor in the second region of focus in response to the input data.
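A rough sketch of the claimed flow (the `Region` and `VirtualCursor` names, 2-D coordinates, and circular focus regions are illustrative assumptions, not from the patent): the cursor stays in place until input indicates a desire to interact, at which point it is re-presented in the current region of focus.

```python
# Illustrative sketch only; names, coordinates, and region shape are assumptions.
from dataclasses import dataclass

@dataclass
class Region:
    x: float
    y: float
    radius: float

    def contains(self, px: float, py: float) -> bool:
        # Circular focus region is an assumption for simplicity.
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.radius ** 2

class VirtualCursor:
    def __init__(self, x: float = 0.0, y: float = 0.0):
        self.x, self.y = x, y

    def update(self, focus: Region, wants_interaction: bool):
        # Re-present the cursor at the focus centre only when the user
        # signals intent to interact; otherwise leave it where it is.
        if wants_interaction and not focus.contains(self.x, self.y):
            self.x, self.y = focus.x, focus.y
        return self.x, self.y
```

A gaze shift alone would thus not drag the cursor; only the combination of a new focus region and an interaction signal moves it.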
Image identification system
Embodiments may relate to a graphical user interface (GUI). The GUI may include a first portion that displays an image related to images of a location. The GUI may also include a second portion that displays an image related to detection and ranging information of the location. The two images may be linked such that an interaction with an object in one portion of the GUI causes changes in the other portion of the GUI. Other embodiments may be described or claimed.
Unrestricted cursor positioning in multi-display environment
A method for controlling a mouse pointer on at least two displays is provided. A virtual display layout defines the mutual relative positioning of display areas relating to the at least two displays. The method comprises creating a virtual display area according to the layout and tracking a position of the mouse pointer within it. When the mouse pointer is positioned within any of the display areas, the mouse pointer is displayed. When the mouse pointer is positioned outside every display area but within the virtual display area, a first marker is displayed on a side border of one display, in the direction in which the mouse pointer is positioned within the virtual display area, and a second marker is displayed on at least one side border of another display, likewise in the direction of the mouse pointer's position within the virtual display area.
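The core decision (pointer visible vs. border markers) can be sketched as follows; the rectangle representation, coordinate system, and return format are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch of the described pointer logic; layout is assumed.
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def pointer_state(displays, px, py):
    """Return ('visible', index) if the pointer is inside a display area,
    else ('markers', sides), where sides lists (display index, border)
    pairs pointing toward the off-display pointer position."""
    for i, d in enumerate(displays):
        if d.contains(px, py):
            return ('visible', i)
    sides = []
    for i, d in enumerate(displays):
        if px < d.x:
            sides.append((i, 'left'))
        elif px >= d.x + d.w:
            sides.append((i, 'right'))
        elif py < d.y:
            sides.append((i, 'top'))
        else:
            sides.append((i, 'bottom'))
    return ('markers', sides)
```

With two side-by-side 100x100 displays, a pointer at (50, 150) sits below the first display and to the left of the second, so each display shows a marker on the corresponding border.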
Systems and methods for virtual whiteboards
Methods, systems, apparatuses, and non-transitory computer-readable media are provided for tying virtual whiteboards to physical spaces. In one implementation, the computer-readable medium includes instructions to cause a processor to: receive wirelessly an indication of a location of a first wearable extended reality appliance; perform a lookup to determine that the location of the first wearable extended reality appliance corresponds to a location of a particular virtual whiteboard; transmit to the first wearable extended reality appliance data corresponding to content of the particular virtual whiteboard; receive, during a first time period, virtual content added by a first user; receive wirelessly, during a second time period, an indication that a second wearable extended reality appliance is in the location of the particular virtual whiteboard; and transmit to the second wearable extended reality appliance data corresponding to the content and the added content of the particular virtual whiteboard.
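A minimal sketch of the location-to-whiteboard lookup, assuming an in-memory registry keyed by simple location strings (a real system would use spatial queries, wireless location reports, and persistent storage):

```python
# Illustrative sketch only; the registry and string location keys are assumptions.
class WhiteboardService:
    def __init__(self):
        self._boards = {}  # location key -> list of content items

    def on_appliance_at(self, location):
        # Lookup: does this location correspond to a virtual whiteboard?
        # Return its current content for transmission to the appliance.
        board = self._boards.setdefault(location, [])
        return list(board)

    def add_content(self, location, item):
        # Content added by one user accumulates on the board at that location.
        self._boards.setdefault(location, []).append(item)

svc = WhiteboardService()
svc.add_content("room-42", "sketch by user A")
# A second appliance arriving at the same location receives the content:
assert svc.on_appliance_at("room-42") == ["sketch by user A"]
```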
SIMULATING USER INTERACTIONS OVER SHARED CONTENT
Methods, systems, apparatuses, and computer-readable media are provided for simulating user interactions with shared content. In one implementation, the computer-readable medium includes instructions to cause a processor to: establish a communication channel for sharing content and user interactions; transmit to at least one second wearable extended reality appliance first data representing an object associated with a first wearable extended reality appliance, enabling a virtual representation of the object to be displayed through the at least one second wearable extended reality appliance; receive image data from an image sensor associated with the first wearable extended reality appliance; detect in the image data at least one user interaction comprising a human hand pointing to a specific portion of the object; and transmit to the at least one second wearable extended reality appliance second data indicating an area of the specific portion of the object.
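One way to picture the last two steps: given a detected fingertip position in image coordinates and the object's bounding box, compute which portion of the object is indicated and relay that area to the peers. The grid subdivision and coordinate conventions here are assumptions for illustration.

```python
# Illustrative sketch; fingertip detection itself (computer vision) is out of scope.
def pointed_area(fingertip, obj_box, grid=(2, 2)):
    """fingertip: (x, y) in image coords; obj_box: (x, y, w, h).
    Returns the (row, col) cell of a grid over the object, or None
    if the fingertip is not over the object."""
    fx, fy = fingertip
    x, y, w, h = obj_box
    if not (x <= fx < x + w and y <= fy < y + h):
        return None
    rows, cols = grid
    return (int((fy - y) / h * rows), int((fx - x) / w * cols))

# Pointing at the top-right quadrant of a 100x100 object:
assert pointed_area((75, 25), (0, 0, 100, 100)) == (0, 1)
```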
MODES OF CONTROL OF VIRTUAL OBJECTS IN 3D SPACE
Systems, methods, and non-transitory computer readable media for selectively controlling display of virtual objects are provided. Virtual objects may be virtually presented in an environment via a wearable extended reality appliance operable in first and second display modes. In the first display mode, positions of the virtual objects are maintained in the environment regardless of detected movements of the wearable extended reality appliance; in the second display mode, the virtual objects move in the environment in response to detected movements of the wearable extended reality appliance. Movement of the wearable extended reality appliance may be detected, and a selection of the first or second display mode may be received. Display signals configured to present the virtual objects in a manner consistent with the selected display mode may be outputted for presentation via the wearable extended reality appliance.
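The two modes are what XR practice often calls "world-locked" versus "head-locked" placement. A minimal sketch in 2-D (the vector representation and mode names are assumptions, not the patent's terms):

```python
# Illustrative sketch; real appliances work with 6-DoF poses, not 2-D offsets.
def present_position(object_pos, appliance_offset, mode):
    """object_pos: the object's position in the environment.
    appliance_offset: detected movement of the wearable appliance.
    mode: 'world_locked' keeps the object in place (first display mode);
    'head_locked' moves it with the appliance (second display mode)."""
    if mode == 'world_locked':
        return object_pos
    x, y = object_pos
    dx, dy = appliance_offset
    return (x + dx, y + dy)

# The appliance moves 0.5 units to the right:
assert present_position((1.0, 2.0), (0.5, 0.0), 'world_locked') == (1.0, 2.0)
assert present_position((1.0, 2.0), (0.5, 0.0), 'head_locked') == (1.5, 2.0)
```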
Devices, methods, and graphical user interfaces for interacting with a position indicator within displayed text via proximity-based inputs
An electronic device, while displaying a first user interface, detects an input for an input object, detects that first hover proximity criteria are met by the input object, and displays first visual feedback. While displaying the first visual feedback, the device detects a change in a current value of a hover proximity parameter of the input object and that second hover proximity criteria are met by the input object after the change. In response to detecting that the second hover proximity criteria are met, the device displays second visual feedback, distinct from the first visual feedback.
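The two-tier hover feedback can be sketched as two proximity thresholds on the hover proximity parameter; the concrete parameter (distance above the surface) and the threshold values are assumptions for illustration.

```python
# Illustrative sketch; thresholds and units are assumed, not from the patent.
FIRST_HOVER_MAX = 3.0   # assumed: first hover proximity criteria (cm)
SECOND_HOVER_MAX = 1.0  # assumed: stricter second criteria (cm)

def hover_feedback(distance):
    """Map the input object's hover distance to a feedback level:
    closer than SECOND_HOVER_MAX -> second (distinct) visual feedback,
    closer than FIRST_HOVER_MAX -> first visual feedback, else none."""
    if distance <= SECOND_HOVER_MAX:
        return 'second-feedback'
    if distance <= FIRST_HOVER_MAX:
        return 'first-feedback'
    return None
```

As the input object approaches, the feedback changes from none, to the first visual feedback, to a distinct second feedback once the closer criteria are met.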
REVOLVING ON-SCREEN VIRTUAL KEYBOARD FOR EFFICIENT USE DURING CHARACTER INPUT
A method includes displaying a user interface on a display of a client device. The user interface includes a virtual keyboard with characters arranged in rows and columns. The method also includes fixing a focus of a cursor displayed in the user interface to a first row or a first column in the virtual keyboard; responsive to receiving a first request to move the cursor in a first direction in the virtual keyboard, wrapping one of the characters around the row or the column to a last row or a last column of the virtual keyboard so that another of the characters is presented within the focus of the cursor; and responsive to receiving a second request to move the cursor in a second direction, moving the focus of the cursor to a list of media item suggestions.
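The wrap-around behavior amounts to modular indexing over a fixed row (or column). A minimal sketch, with an assumed key layout that is not taken from the patent:

```python
# Illustrative sketch; the grid contents and 'left'/'right' labels are assumptions.
KEYBOARD = [
    list("abcde"),
    list("fghij"),
    list("klmno"),
]

def move_focus(row, col, direction):
    """Move the cursor focus within its fixed row, wrapping at either end."""
    n_cols = len(KEYBOARD[row])
    if direction == 'right':
        col = (col + 1) % n_cols
    elif direction == 'left':
        col = (col - 1) % n_cols
    return row, col

assert move_focus(0, 4, 'right') == (0, 0)  # wraps past 'e' back to 'a'
assert move_focus(0, 0, 'left') == (0, 4)   # wraps past 'a' back to 'e'
```

Because the focus never leaves the row, a second, differently-directed request (e.g. downward) is free to carry it off the keyboard to the suggestion list instead.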
Display apparatus and displaying method for changing a cursor based on a user change of manipulation mode
A display apparatus and a displaying method thereof are provided. The displaying method of the display apparatus includes displaying a cursor, changing a manipulation mode of the display apparatus based on an input, and changing the cursor to a highlight or a mode guide icon corresponding to the changed manipulation mode.
Dual mode control of virtual objects in 3D space
Systems, methods, and non-transitory computer readable media containing instructions for selectively controlling display of virtual objects are provided. In one implementation, virtual objects may be virtually presented in an environment via a wearable extended reality appliance operable in first and second display modes. In the first display mode, positions of the virtual objects are maintained in the environment regardless of detected movements of the wearable extended reality appliance; in the second display mode, the virtual objects move in the environment in response to detected movements of the wearable extended reality appliance. Movement of the wearable extended reality appliance may be detected, and a selection of the first or second display mode may be received. Display signals configured to present the virtual objects in a manner consistent with the selected display mode may be outputted for presentation via the wearable extended reality appliance.