G06F3/0487

Contextual assistant using mouse pointing or touch cues
11709653 · 2023-07-25 ·

A method for a contextual assistant to use mouse pointing or touch cues includes receiving audio data corresponding to a query spoken by a user, receiving, in a graphical user interface displayed on a screen, a user input indication indicating a spatial input applied at a first location on the screen, and processing the audio data to determine a transcription of the query. The method also includes performing query interpretation on the transcription to determine that the query is referring to an object displayed on the screen without uniquely identifying the object, and requesting information about the object. The method further includes disambiguating, using the user input indication indicating the spatial input applied at the first location on the screen, the query to uniquely identify the object that the query is referring to, obtaining the information about the object requested by the query, and providing a response to the query.
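The disambiguation step described above can be sketched as a hit test: resolve an ambiguous referent ("what is this?") to whichever on-screen object contains the spatial input location. This is a minimal illustration under assumed names (`ScreenObject`, `disambiguate` are hypothetical, not from the patent).

```python
from dataclasses import dataclass

@dataclass
class ScreenObject:
    """Hypothetical on-screen object with an axis-aligned bounding box."""
    name: str
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        # True if the point (px, py) falls inside this object's bounds
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

def disambiguate(objects, pointer_xy):
    """Resolve an ambiguous spoken referent to the object under the
    spatial input (mouse or touch) location; None if nothing is there."""
    px, py = pointer_xy
    hits = [o for o in objects if o.contains(px, py)]
    return hits[0] if hits else None
```

A real assistant would combine this with the query interpretation result (e.g., restrict `objects` to those matching the referring expression's type) before hit testing.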

GESTURE-BASED USER INTERFACE
20180011544 · 2018-01-11 ·

A computer-implemented method for enabling gesture-based interactions between a computer program and a user is disclosed. According to certain embodiments, the method may include initiating the computer program. The method may also include detecting that a condition has occurred. The method may also include activating a gesture-based operation mode of the computer program. The method may also include receiving gesture data generated by a sensor, the gesture data representing a gesture performed by the user. The method may further include performing a task based on the gesture data.
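The flow above (initiate program, detect condition, activate gesture mode, map gesture data to a task) can be sketched as a small state machine. Class and gesture names here are illustrative assumptions.

```python
class GestureProgram:
    """Minimal sketch: gesture input is ignored until some condition
    activates the gesture-based operation mode."""

    def __init__(self):
        self.gesture_mode = False
        # Hypothetical gesture-to-task mapping
        self.tasks = {"swipe_left": "previous_page", "swipe_right": "next_page"}

    def on_condition(self):
        # A detected condition activates the gesture-based operation mode
        self.gesture_mode = True

    def on_gesture(self, gesture: str):
        """Perform (here: return) the task for a recognized gesture,
        but only while gesture mode is active."""
        if not self.gesture_mode:
            return None
        return self.tasks.get(gesture)
```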

OPERATION INPUT DEVICE
20180011556 · 2018-01-11 ·

An operation member 30 of an operation input device 20 includes a finger sensor 54 adapted to detect each of a plurality of fingers gripping the operation member 30. A controller 36 specifies the number of gripping fingers Ngf, which is the number of fingers 300 gripping the operation member 30, based on the detection result of the finger sensor 54 and switches, in accordance with the number of gripping fingers Ngf, a type of an operation command corresponding to an operation input.
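The switching behavior can be sketched as a lookup keyed on both the physical operation and the detected finger count Ngf, so the same input gesture issues a different command type depending on grip. The mapping below is a hypothetical example, not one disclosed in the abstract.

```python
def command_for(num_gripping_fingers: int, operation: str) -> str:
    """Return the command type for an operation input, switched by the
    number of gripping fingers (Ngf). Mappings are illustrative."""
    table = {
        ("tilt", 2): "steer",        # two-finger grip: tilt steers
        ("tilt", 4): "camera_pan",   # full grip: the same tilt pans the camera
    }
    # Unmapped combinations are ignored in this sketch
    return table.get((operation, num_gripping_fingers), "ignore")
```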

IDENTIFYING PHYSICAL ACTIVITIES PERFORMED BY A USER OF A COMPUTING DEVICE BASED ON MEDIA CONSUMPTION
20230237350 · 2023-07-27 ·

A method includes identifying, based on sensor data received from a motion sensor, a physical activity performed by a user of a computing system during a time period and determining whether the user consumed media during the time period that the user performed the physical activity. The method also includes responsive to determining that the user consumed the media during the time period that the user performed the physical activity, determining, based on data indicative of the media consumed by the user, an updated physical activity performed by the user during the time period; and outputting data indicating the updated physical activity.

CONTENT-BASED TACTILE OUTPUTS

The present disclosure generally relates to content-based tactile outputs. In some embodiments, user interfaces associated with content-based tactile outputs are described. In some embodiments, user interfaces associated with end-of-content tactile outputs are described. In some embodiments, user interfaces associated with moving a user interface in response to different types of input are described. In some embodiments, user interfaces associated with adjustable item-based tactile outputs are described. In some embodiments, user interfaces associated with input velocity-based tactile outputs are described.
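A minimal sketch of selecting among the tactile output types mentioned above, based on scroll position and input velocity. The function name, output labels, and thresholds are assumptions for illustration only.

```python
def tactile_for_scroll(position: int, content_length: int,
                       velocity: float) -> str:
    """Pick a tactile output for a scroll step (illustrative heuristic).

    position:       current scroll offset within the content
    content_length: total scrollable extent
    velocity:       input velocity, arbitrary units
    """
    if position <= 0 or position >= content_length:
        # End-of-content tactile output when the boundary is reached
        return "end_of_content_thud"
    if velocity > 5.0:
        # Faster input: lighter, sparser outputs so they don't blur together
        return "light_tick"
    return "detent_click"
```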

PROVIDING INPUTS TO COMPUTING DEVICES
20230004222 · 2023-01-05 ·

Techniques for providing inputs to computing devices are described. In an example, a computing device may display an input interface based on a gaze of a user. The input interface may comprise a plurality of keys, each of which is selectable by the user to provide an input to the computing device. Based on a gaze of the user at a key, the computing device may determine that the user has selected the key and perform an action corresponding to the selected key.
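The key-selection step can be sketched with a dwell-time heuristic: a key is selected once the gaze rests on it long enough. The abstract does not specify the selection criterion, so the dwell threshold and sample format below are assumptions.

```python
def select_key_by_gaze(gaze_samples, dwell_threshold: float = 0.5):
    """Return the key the gaze dwells on for at least dwell_threshold
    seconds, or None.

    gaze_samples: time-ordered list of (timestamp_s, key_or_None) pairs,
                  where key_or_None is the key under the gaze point.
    """
    current = None
    dwell_start = None
    for t, key in gaze_samples:
        if key != current:
            # Gaze moved to a different key (or off the keys): restart timing
            current, dwell_start = key, t
        if key is not None and t - dwell_start >= dwell_threshold:
            return key
    return None
```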

CONTROL DEVICE AND DATA PROCESSING SYSTEM

The power consumption of a control device or a data processing system is reduced. Safety is enhanced. An electronic device is operated in a simple way. A control device includes an arithmetic circuit, an input unit, and a power management unit. The input unit includes a sensor element. The power management unit has a function of controlling supply and shutdown of power to the arithmetic circuit. The power management unit has a function of supplying power to the arithmetic circuit in response to a detection signal output from the sensor element. The sensor element includes one or more selected from an acceleration sensor, an angular velocity sensor, and a magnetic sensor. The arithmetic circuit includes a register. The register includes a first circuit and a second circuit. The register has a function of storing, in the second circuit, first data stored in the first circuit in a period during which the power management unit supplies power to the arithmetic circuit, and retaining the first data in a period during which the power management unit stops power supply to the arithmetic circuit. The arithmetic circuit has a function of generating second data using signal data output from the sensor element and the first data.
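The retention scheme can be modeled in software terms: a volatile first circuit loses its value when power is cut, while a second circuit keeps a saved copy across the power-down, so computation can resume from retained state when a sensor event restores power. This is a behavioral sketch with hypothetical names, not the hardware design.

```python
class RetentionRegister:
    """Behavioral model of a two-circuit register: a volatile first
    circuit and a retaining second circuit."""

    def __init__(self):
        self.volatile = None   # first circuit: lost on power-down
        self.retained = None   # second circuit: survives power-down

    def save(self):
        self.retained = self.volatile

    def restore(self):
        self.volatile = self.retained

    def power_off(self):
        self.volatile = None   # volatile contents are lost


class PowerManagedController:
    """Sketch of the power management unit gating the arithmetic circuit."""

    def __init__(self):
        self.reg = RetentionRegister()
        self.powered = False

    def on_sensor_event(self, signal: int):
        # A detection signal from the sensor element triggers power supply
        self.powered = True
        self.reg.restore()
        # The arithmetic circuit generates second data from the sensor's
        # signal data and the retained first data (here: a running sum)
        prev = self.reg.volatile or 0
        self.reg.volatile = prev + signal
        self.reg.save()

    def idle(self):
        # Power is shut down; only the second circuit retains data
        self.powered = False
        self.reg.power_off()
```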