G06F2200/1636

APPARATUS AND METHOD FOR SIGNAL PROCESSING

A signal processing apparatus includes a frequency detector configured to receive a user input including at least one of a vibration input and a user voice, vibrate in response to the received user input, and detect a frequency of the received user input, based on the vibration, and a processor configured to determine a type of the user input received by the frequency detector, based on the frequency detected by the frequency detector, and perform a function corresponding to the user input of the determined type.
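A minimal sketch of the idea in the abstract above: classify the user input type from the detected frequency. The band edges and labels below are illustrative assumptions, not values from the patent.

```python
# Hypothetical frequency-to-input-type classifier; cutoff frequencies
# (80 Hz, 4 kHz) are assumptions chosen to separate low-frequency
# vibration from the typical speech band.

def input_type(freq_hz):
    """Classify a detected frequency as a vibration input (low band),
    a user voice (speech band), or unknown."""
    if freq_hz < 80:
        return "vibration"
    if freq_hz <= 4000:
        return "voice"
    return "unknown"

print(input_type(30), input_type(220))  # prints: vibration voice
```

A processor following the abstract would then dispatch a function keyed on the returned type.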

TASK EXECUTION ORDER DETERMINATION SYSTEM AND TASK EXECUTION METHOD
20220382554 · 2022-12-01

A technique for evaluating human cognitive and motor functions through a plurality of hand-movement tasks is disclosed. A task execution method determines the execution order of a plurality of tasks that a test subject is caused to execute in order to acquire a characteristic quantity. A test-subject-group task database stores, as past data for each of a plurality of test subjects, scores given in advance and characteristic quantities obtained from a plurality of tasks. In a storage device, either (1) a differentiation precision database, for the case in which test subjects are divided into two groups by a predetermined threshold score and differentiated by the characteristic quantities, or (2) an estimation precision database, for the case in which a score is estimated from the characteristic quantity for a predetermined score value, is prepared for each of the tasks on the basis of the test-subject-group task database.
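One plausible reading of the ordering step: run the tasks whose characteristic quantities discriminate best first. The sketch below stands in for the "differentiation precision database" with an invented task-to-precision mapping; all names and numbers are assumptions.

```python
# Hypothetical stand-in for the differentiation precision database:
# each hand-movement task mapped to how precisely its characteristic
# quantity separates the two score groups (values invented).
precision_db = {
    "finger_tapping": 0.91,
    "hand_rotation":  0.78,
    "grip_tracking":  0.85,
}

def execution_order(db):
    """Order tasks so the most discriminative ones run first."""
    return sorted(db, key=db.get, reverse=True)

print(execution_order(precision_db))
# prints: ['finger_tapping', 'grip_tracking', 'hand_rotation']
```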

Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
11500536 · 2022-11-15

A neural network system includes an eyewear device. The eyewear device has a movement tracker, such as an accelerometer, gyroscope, or an inertial measurement unit for measuring acceleration and rotation. The neural network system tracks, via the movement tracker, movement of the eyewear device from at least one finger contact inputted from a user on an input surface. The neural network system identifies a finger gesture by detecting at least one detected touch event based on variation of the tracked movement of the eyewear device over a time period. The neural network system adjusts the image presented on the image display of the eyewear device based on the identified finger gesture. The neural network system can also detect whether the user is wearing the eyewear device and identify an activity of the user wearing the eyewear device based on the variation of the tracked movement over the time period.
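The abstract's "variation of the tracked movement over a time period" can be illustrated without the neural network: flag a candidate touch event when the variance of motion magnitude in a sliding window exceeds a baseline. Window size and threshold below are invented for the sketch.

```python
# Illustrative (non-patented) touch-event detector: windowed variance
# of accelerometer magnitude on the eyewear frame; a tap perturbs the
# otherwise steady gravity-dominated signal.
from statistics import pvariance

def touch_event(magnitudes, window=4, var_threshold=0.05):
    """Return start indices of windows whose variance exceeds the
    (assumed) threshold, i.e. candidate touch events."""
    hits = []
    for i in range(len(magnitudes) - window + 1):
        if pvariance(magnitudes[i:i + window]) > var_threshold:
            hits.append(i)
    return hits

still = [1.00, 1.01, 0.99, 1.00, 1.01, 1.00]   # device worn, no touch
tapped = [1.00, 1.01, 1.60, 0.70, 1.05, 1.00]  # spike from a finger tap

print(touch_event(still))   # prints: []
print(touch_event(tapped))  # prints: [0, 1, 2]
```

Counting and timing such events over the period would then distinguish single taps, double taps, and longer gestures.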

Systems and methods for providing information and performing task
11573620 · 2023-02-07

Systems, methods, and apparatus for presenting information and performing a task using an electronic device. In some aspects, a device shows content items when a gaze or a shaking act plus a gaze are detected. In some aspects, a device performs a task when a name, a code, and the task are detected in voice input. In some aspects, a user communicates with a selected vehicle via a user device.

Determining tap locations on a handheld electronic device based on inertial measurements

Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at a location on the body of the device compared with signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications. Tap locations, tap timing and tap attributes such as the magnitude of applied forces, device orientation, and the amplitude and directions of motions during and following a tap, may be used to control or modulate responses within the handheld device and/or actions within connected devices.
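As a toy version of the signature classification described above (not the patent's neural-network or numerical method), a tap can be found by thresholding accelerometer magnitude and its location guessed from the dominant axis. Axis-to-face labels and the threshold are assumptions.

```python
# Hypothetical IMU tap localizer: find the spike in a window of
# 3-axis accelerometer samples (units of g), then map the dominant
# axis and its sign to a face of the device body.

def detect_tap(samples, threshold=2.0):
    """Return the index of the first sample whose magnitude exceeds
    the (assumed) threshold, or None if no tap-like spike appears."""
    for i, (ax, ay, az) in enumerate(samples):
        if (ax * ax + ay * ay + az * az) ** 0.5 > threshold:
            return i
    return None

def classify_tap_location(sample):
    """Map the dominant acceleration axis of the spike to a face label
    (labels are illustrative, not from the patent)."""
    faces = [("right", "left"), ("top", "bottom"), ("front", "back")]
    axis = max(range(3), key=lambda k: abs(sample[k]))
    return faces[axis][0 if sample[axis] > 0 else 1]

window = [(0.0, 0.1, 1.0), (0.1, 0.0, 1.0), (3.2, 0.4, 1.1), (0.2, 0.1, 1.0)]
i = detect_tap(window)
print(i, classify_tap_location(window[i]))  # prints: 2 right
```

A real classifier would also use the gyroscope and orientation streams, as the abstract notes, since taps on opposite faces can produce similar linear accelerations.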

CONTROL BY SLIDING VIRTUAL BUTTONS
20220353611 · 2022-11-03

Audio playback equipment includes microphones, a loudspeaker, emitter means arranged to emit a detection sound signal, and at least one processor component arranged: to acquire detection audio signals produced by the microphones as a result of picking up the detection sound signal; to detect, from the detection audio signals, a run of maskings in which at least two distinct microphones are masked in succession; to analyze a detected run of maskings so as to detect a command slide made by a user on the housing across the at least two distinct microphones; and to cause at least one predetermined action to take place as a result of detecting said command slide.
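The "run of maskings" step reduces to a simple sequence check once per-microphone masking timestamps exist. The sketch below assumes such timestamps as input; the microphone identifiers and the maximum gap between maskings are invented.

```python
# Hedged sketch of slide detection from microphone maskings:
# a slide is reported when at least two distinct microphones are
# masked in succession, close enough together in time.

def detect_slide(masking_events, max_gap=0.5):
    """masking_events: list of (time_s, mic_id) sorted by time.
    Return the ordered mic ids of a detected slide, or None."""
    run, last_t = [], None
    for t, mic in masking_events:
        if last_t is not None and t - last_t > max_gap:
            run = []           # too long between maskings: new run
        if not run or mic != run[-1]:
            run.append(mic)
        last_t = t
    return run if len(run) >= 2 else None

events = [(0.00, "mic_left"), (0.18, "mic_center"), (0.35, "mic_right")]
print(detect_slide(events))  # prints: ['mic_left', 'mic_center', 'mic_right']
```

The direction of the slide falls out of the order of the run, so the same logic can map left-to-right and right-to-left slides to different actions.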

Underwater Camera Operations
20220345591 · 2022-10-27

Camera operations are controlled by motion patterns determined from the outputs of an internal motion sensor. These methods remove the need for a knob, button, touch screen, or other mechanical control device with movable components, which effectively removes common water-leakage weak points in electronic devices with cameras. Motion patterns determined from the outputs of the internal motion sensor are also used to adjust camera operation parameters such as the brightness of a supporting light source, shutter speed, aperture opening, and contrast. These methods are also applicable to land operations.
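Once a motion pattern is recognized, controlling the camera is a dispatch from pattern to command. The pattern names, commands, and step sizes below are assumptions for illustration, not the patent's.

```python
# Hypothetical pattern-to-command table for a sealed underwater camera:
# recognized motion patterns adjust settings or trigger a capture.
COMMANDS = {
    "double_tilt_left":  ("shutter_speed", -1),   # slower shutter
    "double_tilt_right": ("shutter_speed", +1),   # faster shutter
    "slow_roll":         ("light_brightness", +1),
    "shake":             ("capture", 0),
}

def dispatch(pattern, settings):
    """Apply the command for a recognized pattern to a settings dict;
    unrecognized patterns are ignored."""
    if pattern not in COMMANDS:
        return settings
    key, delta = COMMANDS[pattern]
    if key == "capture":
        settings["captures"] = settings.get("captures", 0) + 1
    else:
        settings[key] = settings.get(key, 0) + delta
    return settings

s = dispatch("shake", dispatch("double_tilt_right", {}))
print(s)  # prints: {'shutter_speed': 1, 'captures': 1}
```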

Systems and Methods for Providing Information And Performing Task
20220334633 · 2022-10-20

Systems, methods, and apparatus for presenting information and performing a task using an electronic device. In some aspects, a device shows content items when a gaze or a shaking act plus a gaze are detected. In some aspects, a device performs a task when a name, a code, and the task are detected in voice input. In some aspects, a user communicates with a selected vehicle via a user device.

Adaptive enclosure for a mobile computing device

A device includes an enclosure and logic. The enclosure includes a plurality of capacitive touch sensor arrays disposed at least on two of a top side, a bottom side, a left side, a right side, a front side, and a back side of the device. The enclosure also includes a first display on the front side of the device. The logic receives touch interaction information from the plurality of capacitive touch sensor arrays and initiates an action based at least in part on the touch interaction information.
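The logic described above amounts to routing (side, gesture) pairs from the capacitive arrays to actions. The sides, gesture names, and actions in this sketch are assumptions.

```python
# Illustrative routing table for an enclosure with capacitive arrays
# on multiple sides; the logic maps touch interaction information to
# an action (all labels invented).
ACTIONS = {
    ("left", "swipe_up"):   "volume_up",
    ("left", "swipe_down"): "volume_down",
    ("back", "double_tap"): "screenshot",
}

def on_touch(side, gesture):
    """Initiate the action for a touch interaction, or ignore it."""
    return ACTIONS.get((side, gesture), "ignore")

print(on_touch("back", "double_tap"), on_touch("front", "tap"))
# prints: screenshot ignore
```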

Method for quickly invoking small window when video is displayed in full screen, graphic user interface, and terminal
11599254 · 2023-03-07

A method for quickly invoking a small window while a video is being displayed in full screen, a graphic user interface, and a terminal are provided. While the terminal displays a video playing interface in full screen, it may display a small window in a hover box based on a user operation; the display interface of the small window may be switched, and the terminal may quickly switch between multi-window display and full-screen display based on a user operation. Throughout this process, the terminal continues to play the video.