Patent classifications
G06F2203/04801
VEHICLE CONTROLLING APPARATUS AND VEHICLE HAVING THE SAME
Disclosed is a vehicle controlling apparatus for controlling a vehicle having a display. The vehicle controlling apparatus includes: a communication unit configured to communicate with the display; and a processor configured to set at least one region of the entire region of the display as a display region, based on a sight-line range that changes according to a position of a passenger who has boarded the vehicle, and configured to control the communication unit so that visual information is displayed on the display region.
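The core selection step — choosing a display region from a passenger's position — can be sketched as follows. This is a minimal illustration, not the patent's implementation: the seat labels, pixel regions, and `DISPLAY_WIDTH` value are all assumptions standing in for the sight-line range the abstract describes.

```python
# Hypothetical sketch: pick which portion of an in-vehicle display strip to
# use as the "display region" based on where the passenger is seated.
# Seat names and region boundaries are illustrative assumptions.

DISPLAY_WIDTH = 1920  # assumed width of the display strip, in pixels


def display_region_for_passenger(seat: str) -> tuple[int, int]:
    """Return the (start_x, end_x) span of the display visible to the passenger."""
    regions = {
        "front_left": (0, DISPLAY_WIDTH // 2),            # left half of the strip
        "front_right": (DISPLAY_WIDTH // 2, DISPLAY_WIDTH),  # right half
        "rear": (0, DISPLAY_WIDTH),                       # rear seats see the whole strip
    }
    # Unknown positions fall back to the full display.
    return regions.get(seat, (0, DISPLAY_WIDTH))
```

Visual information would then be rendered only within the returned span, with the mapping updated whenever the passenger's position (and hence sight-line range) changes.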
Tracking and restoring pointer positions among applications
A method for restoring a pointer position within an application in response to a user switching between applications. The method includes one or more computer processors identifying a set of applications executing on a computing device of a user. The method further includes determining a series of pointer positions within a graphical user interface (GUI) of a first application of the set of executing applications in response to the user interfacing with the first application. The method further includes determining that the user pauses interfacing with a second application and resumes accessing the first application. The method further includes determining a pointer position from among the series of pointer positions respectively associated with the GUI of the first application. The method further includes responding to determining that the user resumes accessing the first application by positioning the pointer within the GUI of the first application at the determined pointer position.
PAGE NAVIGATION METHOD AND ELECTRONIC DEVICE
An electronic apparatus and a page navigation method thereof are provided. The page navigation method includes: obtaining first navigation information from a first information source and second navigation information from a second information source; generating a first page navigation command based on the first navigation information and a second page navigation command based on the second navigation information; and executing the first page navigation command and the second page navigation command.
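The flow — two information sources, two derived commands, both executed — can be illustrated with a small sketch. The shape of the navigation information (a direction and page count) and the sources themselves are assumptions for illustration; the abstract does not specify them.

```python
# Illustrative sketch: derive a page-navigation command from each of two
# information sources (e.g. a scroll gesture and a voice input) and apply
# both commands to the current page index in order.
from typing import Callable


def make_command(info: dict) -> Callable[[int], int]:
    """Turn info like {'direction': 'next', 'pages': 2} into a navigation command."""
    step = info["pages"] if info["direction"] == "next" else -info["pages"]
    return lambda page: max(0, page + step)  # never navigate before page 0


def navigate(page: int, first_info: dict, second_info: dict) -> int:
    """Execute the first and then the second page navigation command."""
    for command in (make_command(first_info), make_command(second_info)):
        page = command(page)
    return page
```

Starting from page 3, a "next 2 pages" command followed by a "previous 1 page" command lands on page 4.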
Devices, Methods, and User Interfaces for Interacting with a Position Indicator Within Displayed Text via Proximity-Based Inputs
An electronic device that is in communication with a display generation component, and with sensor(s) to detect the location of an input object, displays a content selection object within selectable content, wherein the content selection object includes a first edge and a second edge. The device detects a first portion of an input by the input object, including detecting the input object at a first hover location that corresponds to the first edge of the content selection object. In response to detecting the first portion of the input: in accordance with a determination that the first portion of the input meets first criteria, which require that the input object meet proximity criteria with respect to the content selection object, the device changes an appearance of the first edge relative to the second edge of the content selection object to indicate that the first edge will be selected for movement when the input object meets second criteria.
FLOATING SOFT TRIGGER FOR TOUCH DISPLAYS ON ELECTRONIC DEVICE
Disclosed is a portable electronic device having a touch screen with a floating soft trigger icon for enabling various functions of the electronic device, such as bar code reading, capturing RFID data, capturing video and images, calling applications, and/or placing phone calls. The floating trigger icon is displayed on the touch screen to enable easy identification of and access to the trigger icon. The trigger icon may be selected via application of various unique control gestures to configure the electronic device. Based on the selected mode or function of the device, the trigger icon may alter its appearance to facilitate use of the device. The operation and functionality of the trigger icon may be programmed to customize operation of the device.
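The mode-dependent appearance change can be sketched with a small state object. The mode names and icon filenames below are made up for illustration; the abstract does not enumerate them.

```python
# Hedged sketch: a floating trigger whose icon changes with the device's
# currently selected function, as the abstract describes.

class FloatingTrigger:
    # Hypothetical mapping from device function to trigger appearance.
    ICONS = {"barcode": "barcode.png", "rfid": "rfid.png", "camera": "camera.png"}

    def __init__(self) -> None:
        self.mode = "barcode"  # assumed default function

    def select_mode(self, mode: str) -> None:
        """Configure the device function, e.g. via a control gesture on the icon."""
        if mode not in self.ICONS:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode

    @property
    def icon(self) -> str:
        # The trigger alters its appearance based on the selected function.
        return self.ICONS[self.mode]
```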
Gesture interaction with invisible virtual objects
Methods, systems, apparatuses, and non-transitory computer-readable media are provided for enabling gesture interaction with invisible virtual objects. In one implementation, the computer-readable medium includes instructions to cause a processor to receive image data captured by at least one image sensor of a wearable extended reality appliance in a field of view; display a plurality of virtual objects in a portion of the field of view; receive a selection of a specific physical object; receive a selection of a specific virtual object; dock the specific virtual object with the specific physical object; when the specific physical object and the specific virtual object are outside the portion of the field of view such that the specific virtual object is invisible to a user of the wearable extended reality appliance, receive a gesture input indicating interaction with the specific virtual object; and cause an output associated with the specific virtual object.
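The docking idea — a virtual object keeping a fixed position relative to a physical object, so it remains addressable by gesture even when outside the rendered field of view — can be sketched in 2D. Coordinates, names, and the rectangular field-of-view test are simplifying assumptions.

```python
# Simplified sketch: a docked virtual object resolves its position from its
# physical anchor, whether or not that position is currently visible.

class DockedObject:
    def __init__(self, name: str, offset: tuple[float, float]) -> None:
        self.name = name
        self.offset = offset  # fixed position relative to the physical anchor

    def world_position(self, anchor: tuple[float, float]) -> tuple[float, float]:
        """Compute the virtual object's position from its anchor's position."""
        return (anchor[0] + self.offset[0], anchor[1] + self.offset[1])


def in_field_of_view(pos, fov_min, fov_max) -> bool:
    """Is `pos` inside the rectangular portion of the field of view being rendered?"""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, fov_min, fov_max))
```

When the anchor moves outside the rendered portion, the object becomes invisible, yet `world_position` still resolves — which is what lets a gesture aimed at that location be matched to the docked object and produce an output.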
Simulating user interactions over shared content
Methods, systems, apparatuses, and computer-readable media are provided for simulating user interactions with shared content. In one implementation, the computer-readable medium includes instructions to cause a processor to establish a communication channel for sharing content and user interactions; transmit to at least one second wearable extended reality appliance, first data, representing an object associated with first wearable extended reality appliance, enabling a virtual representation of the object to be displayed through the at least one second wearable extended reality appliance; receive image data from an image sensor associated with the first wearable extended reality appliance; detect in the image data at least one user interaction including a human hand pointing to a specific portion of the object; and transmit to the at least one second wearable extended reality appliance second data indicating an area of the specific portion of the object.
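The final two steps — mapping a detected fingertip to the portion of the shared object it points at, and packaging that region for transmission — can be sketched as follows. The hand detection itself is assumed to have happened upstream, and the grid-cell representation of "an area of the specific portion" is an illustrative choice, not the patent's.

```python
# Rough sketch: given a fingertip location and the shared object's bounding
# box, report which grid cell of the object is being pointed at.

def pointed_region(fingertip: tuple[float, float],
                   obj_bbox: tuple[float, float, float, float],
                   grid: tuple[int, int] = (3, 3)) -> tuple[int, int]:
    """Return the (row, col) cell of `obj_bbox` that `fingertip` falls in."""
    x, y = fingertip
    left, top, right, bottom = obj_bbox
    col = max(0, min(grid[1] - 1, int((x - left) / (right - left) * grid[1])))
    row = max(0, min(grid[0] - 1, int((y - top) / (bottom - top) * grid[0])))
    return row, col
```

The resulting cell index (or an equivalent region descriptor) is what would be sent to the second appliance, which can then highlight that area on its own virtual representation of the object.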
VIRTUAL DISPLAY CHANGES BASED ON POSITIONS OF VIEWERS
Systems, methods, and non-transitory computer readable media configured for enabling content sharing between users of wearable extended reality appliances are provided. In one implementation, the computer readable medium may be configured to contain instructions to cause at least one processor to establish a link between a first wearable extended reality appliance and a second wearable extended reality appliance. The first wearable extended reality appliance may display first virtual content. The second wearable extended reality appliance may obtain a command to display the first virtual content via the second wearable extended reality appliance, and in response, this content may be transmitted and displayed via the second wearable extended reality appliance. Additionally, the first wearable extended reality appliance may receive second virtual content from the second wearable extended reality appliance, and display said second virtual content via the first wearable extended reality appliance.
IMAGE IDENTIFICATION SYSTEM
Embodiments may relate to a graphical user interface (GUI). The GUI may include a first portion that displays an image related to images of a location. The GUI may also include a second portion that displays an image related to detection and ranging information of the location. The two images may be linked such that an interaction with an object in one portion of the GUI causes changes in the other portion of the GUI. Other embodiments may be described or claimed.