G06F2203/04808

Simultaneous Use of a Capacitance-Based Track Pad
20230053717 · 2023-02-23

An apparatus may include a capacitance-based trackpad, a tracking driver and a key driver each in communication with the trackpad, a processor, and a memory holding programmed instructions. The tracking driver may be configured to receive raw track inputs from the capacitance-based trackpad, and the key driver may be configured to receive raw key inputs from it. When executed, the instructions may cause the processor to modify the raw track inputs by associating a non-confidence indicator with at least one of them to form processed track inputs, and to send the processed track inputs to both the tracking driver and the key driver.
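As a hedged sketch of the flow this abstract describes (all names and the confidence rule are illustrative assumptions, not from the patent): raw trackpad inputs are annotated with a non-confidence indicator and the same processed stream is fanned out to both drivers.

```python
# Hypothetical sketch: tag raw trackpad inputs with a non-confidence flag,
# then send the processed stream to both a tracking driver and a key driver.

def mark_non_confident(raw_inputs, is_confident):
    """Attach a non-confidence indicator to inputs that fail the check."""
    return [
        {"input": inp, "non_confident": not is_confident(inp)}
        for inp in raw_inputs
    ]

def dispatch(processed, tracking_driver, key_driver):
    # The same processed inputs go to both drivers, per the abstract.
    tracking_driver.extend(processed)
    key_driver.extend(processed)

raw = [(10, 12), (300, 5), (40, 44)]
# Toy confidence rule (an assumption): x-coordinate below 100 is confident.
processed = mark_non_confident(raw, lambda p: p[0] < 100)
tracking, keys = [], []
dispatch(processed, tracking, keys)
```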

IMAGE DISPLAY APPARATUS

There is provided an image display apparatus that enables intuitive operation even when a detection target cannot be inserted into a first space in which a three-dimensional object is visually recognized. The position of a detection target is detected, and the display position of a pointer displayed by a display unit is moved on the basis of the detected position when the detection target lies within a second space that does not overlap the first space, the first space being the space in which the three-dimensional object is displayed.
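A minimal sketch of that condition, assuming the two spaces are modeled as non-overlapping depth ranges (the ranges and coordinate scheme are illustrative assumptions): the pointer tracks the target only while the target is in the second space.

```python
# Illustrative: move the pointer only when the detection target is in the
# second space, which does not overlap the first space holding the 3D object.

FIRST_SPACE = (0.0, 0.3)    # depth range where the 3D object appears (assumed)
SECOND_SPACE = (0.3, 1.0)   # reachable space that does not overlap it (assumed)

def in_space(depth, space):
    lo, hi = space
    return lo <= depth < hi

def update_pointer(pointer_xy, target_xyz):
    x, y, depth = target_xyz
    if in_space(depth, SECOND_SPACE):
        return (x, y)        # target is in the second space: track it
    return pointer_xy        # otherwise leave the pointer where it is
```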

Terminal and method for setting menu environments in the terminal
11586340 · 2023-02-21

An apparatus and method for setting a menu environment in a mobile terminal are provided. The apparatus includes a controller for switching to an environment setting mode of a menu according to the type of gesture that has occurred on the menu.
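The controller's behavior amounts to a dispatch on gesture type. A toy sketch, where the gesture names and resulting modes are assumptions for illustration:

```python
# Hypothetical gesture-type dispatch: one gesture on a menu item switches
# to that menu's environment-setting mode; others do something else.

def select_mode(gesture_type):
    modes = {
        "long_press": "environment_setting",  # assumed trigger gesture
        "tap": "open_menu",
        "swipe": "scroll",
    }
    return modes.get(gesture_type, "ignore")
```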

Control apparatus, control method, and storage medium
11586349 · 2023-02-21

A viewpoint control unit 204 detects a user operation on a display surface for displaying a virtual-viewpoint video (S801) and controls at least one of the position and the orientation of a virtual viewpoint concerning generation of the virtual-viewpoint video in accordance with the user operation (S805, S808, S812, S814).
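A toy rendition of that control loop, under assumptions not in the abstract (one-finger drag maps to orientation, two-finger drag to position, with illustrative scale constants):

```python
# Sketch of controlling a virtual viewpoint from a user operation on the
# display surface. Operation types and mapping constants are assumptions.

def apply_user_operation(viewpoint, op):
    """viewpoint: dict with 'position' (x, y, z) and 'yaw' in degrees."""
    vp = dict(viewpoint)
    if op["type"] == "drag":             # one-finger drag: orientation
        vp["yaw"] = (vp["yaw"] + 0.1 * op["dx"]) % 360
    elif op["type"] == "two_finger":     # two-finger drag: position
        x, y, z = vp["position"]
        vp["position"] = (x + 0.01 * op["dx"], y + 0.01 * op["dy"], z)
    return vp
```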

Systems and Methods for Interacting with User Interfaces
20220365669 · 2022-11-17

A device detects, while displaying a first user interface that includes a first plurality of notifications in a list of notifications, a first user input that includes a first input. In response to detecting the first user input, in accordance with a determination that the first input includes a swipe input in a first direction and that the end of the list of notifications has been reached, the device displays a search input region. In accordance with a determination that the first input includes the swipe input in the first direction and that the end of the list has not been reached, the device instead displays a second plurality of notifications lying between the first plurality of notifications and the end of the list.
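The branch the claim describes is a simple end-of-list test. A sketch, with the swipe direction, page size, and return values all illustrative assumptions:

```python
# Hypothetical handler: a swipe past the end of the notification list
# reveals a search input region; otherwise it pages in more notifications.

def handle_swipe(direction, shown, all_notifications, page=3):
    if direction != "down":                       # assumed "first direction"
        return ("ignore", shown)
    if shown >= len(all_notifications):           # end of list reached
        return ("show_search", shown)
    nxt = min(shown + page, len(all_notifications))
    return ("show_more", nxt)                     # reveal the next batch
```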

Pointer position detection method and sensor controller
11586301 · 2023-02-21

Disclosed is a pointer position detection method for detecting, using a touch sensor that includes a plurality of sensor electrodes, both a touch position indicated by a passive pointer that does not transmit a signal and a pen position indicated by an active pen configured to transmit a downlink signal from a pen electrode disposed at a distal end of the active pen. The method is performed by a sensor controller connected to the touch sensor and includes: detecting one or more candidate pen positions based on the level of the downlink signal at each of the plurality of sensor electrodes; determining the pen position, from the one or more candidate pen positions, within an active region at least partially surrounded by a bezel region of a display device; and detecting a gesture performed by the active pen in the bezel region of the display device.
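A hedged sketch of the candidate-selection step, simplified to a 1-D electrode row with an assumed signal threshold (none of these specifics come from the patent): electrodes with a strong downlink level become candidates, and the pen position is the strongest candidate inside the active region.

```python
# Illustrative: candidate pen positions from per-electrode downlink levels,
# then selection of the pen position within the active (non-bezel) region.

def candidate_positions(levels, threshold=0.5):
    """Indices of electrodes whose downlink level exceeds the threshold."""
    return [i for i, lvl in enumerate(levels) if lvl > threshold]

def pen_position(levels, active, threshold=0.5):
    """Strongest candidate inside the active region; None means the pen is
    over the bezel (where the method instead looks for a gesture)."""
    cands = [i for i in candidate_positions(levels, threshold) if i in active]
    return max(cands, key=lambda i: levels[i]) if cands else None

levels = [0.1, 0.9, 0.6, 0.2, 0.8]   # toy per-electrode signal levels
active = set(range(1, 4))            # electrodes 1-3; 0 and 4 lie in the bezel
```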

Digital processing systems and methods for communications triggering table entries in collaborative work systems

Systems, methods, and computer-readable media for triggering table entries characterizing workflow-related communications occurring between workflow participants are disclosed. The systems and methods may involve presenting a table via a display, the table containing rows and columns defining cells, the rows and cells being configured to manage respective roles of the workflow participants; presenting on the display at least one active link for enabling workflow participants to join in a video or an audio communication; logging in memory, characteristics of the communication including identities of the workflow participants who joined in the communication; and generating an object associated with the table, the object containing the characteristics of the communication logged in memory.
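A minimal sketch of the logging-and-object step, with all field names and the in-memory representation assumed for illustration: when participants join a call via the table's link, the call's characteristics are logged and wrapped in an object associated with the table.

```python
# Hypothetical: log characteristics of a workflow call (medium, identities
# of joined participants) and attach the resulting object to the table.

def log_communication(table, medium, participants):
    record = {
        "medium": medium,                       # "video" or "audio"
        "participants": sorted(participants),   # identities of joiners
    }
    obj = {"table_id": table["id"], "characteristics": record}
    table.setdefault("objects", []).append(obj)
    return obj

table = {"id": "wf-1"}
obj = log_communication(table, "video", {"alice", "bob"})
```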

Touchscreen user interface for interacting with a virtual model

A method comprises: accessing a three-dimensional (3D) model of an object with a visualization tool that includes a touchscreen; displaying, via the touchscreen, an image of the 3D model; detecting a first pressure-based input at a first location on the touchscreen; selecting a first component of the 3D model as a result of the first pressure-based input at the first location; detecting movement of the first pressure-based input from the first location to a second location on the touchscreen; selecting an adjustable display parameter as a result of the first pressure-based input at the second location; detecting a second pressure-based input at the touchscreen; and, while the first component is selected, changing the display parameter of the first component of the 3D model as a result of the second pressure-based input.
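The sequence reads as a small state machine: press selects a component, dragging to a second location selects a display parameter, a further press adjusts it. A toy sketch (locations, parameter names, and the adjustment step are illustrative assumptions):

```python
# Hypothetical state machine for the claimed press / drag / press sequence.

class ModelViewer:
    def __init__(self):
        self.component = None    # 3D model component selected by first press
        self.parameter = None    # display parameter selected by the drag
        self.values = {}         # (component, parameter) -> adjusted value

    def press(self, location):
        if self.component is None:
            # First pressure-based input: select the component at that spot.
            self.component = f"component@{location}"
        elif self.parameter is not None:
            # Second pressure-based input: change the selected parameter
            # while the first component remains selected.
            key = (self.component, self.parameter)
            self.values[key] = self.values.get(key, 0) + 1

    def drag_to(self, location):
        # Moving the held press to a second location selects a parameter.
        self.parameter = {"top": "opacity", "side": "zoom"}.get(location)
```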

Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
11500536 · 2022-11-15

A neural network system includes an eyewear device. The eyewear device has a movement tracker, such as an accelerometer, a gyroscope, or an inertial measurement unit, for measuring acceleration and rotation. The neural network system tracks, via the movement tracker, movement of the eyewear device resulting from at least one finger contact input by a user on an input surface. The neural network system identifies a finger gesture by detecting at least one touch event based on variation of the tracked movement of the eyewear device over a time period, and adjusts an image presented on the image display of the eyewear device based on the identified finger gesture. The neural network system can also detect whether the user is wearing the eyewear device and identify an activity of the user based on the variation of the tracked movement over the time period.
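The underlying signal cue is that a finger tap on the frame perturbs the movement tracker. A sketch of that detection idea only (this is not the patented neural network; the windowing and threshold are assumptions):

```python
# Illustrative: flag touch events where the variance of tracked movement
# over a short window spikes above a threshold.

def detect_touch_events(samples, window=4, threshold=0.05):
    """Return start indices of windows whose variance exceeds the threshold."""
    events = []
    for start in range(0, len(samples) - window + 1, window):
        w = samples[start:start + window]
        mean = sum(w) / window
        var = sum((s - mean) ** 2 for s in w) / window
        if var > threshold:
            events.append(start)
    return events

quiet = [0.01, 0.02, 0.01, 0.02]     # device at rest (toy IMU readings)
tap = [0.01, 0.9, -0.8, 0.02]        # transient from a finger tap (toy)
```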

Touch input processing method and electronic device supporting the same

An electronic device includes: a housing; a sensor module disposed on an inner face of the housing and including a plurality of sensing units; and a processor positioned within the housing and electrically connected to the sensor module. Each of the plurality of sensing units is electrically connected to an adjacent sensing unit among the plurality of sensing units, and includes a central portion and a plurality of peripheral portions that are connected to a partial area of the central portion and arranged around it; each of the central portion and the plurality of peripheral portions includes a touch sensor. In addition, various other embodiments identified through the specification are possible.