Patent classifications
G06F2203/04108
METHOD AND DEVICE FOR NAVIGATING IN A USER INTERFACE AND APPARATUS COMPRISING SUCH NAVIGATION
A method is provided for navigating in a display screen by way of a control surface, including a step of measuring:
- a data item, termed position, relating to a position targeted on the control surface by a remote control object positioned opposite the control surface, and
- a data item, termed vertical distance, relating to the distance between the at least one remote control object and the control surface;
and a drive step that carries out, as a function of the measured vertical distance:
- a displacement, and/or
- an adjustment of a parameter relating to a displacement,
of at least one part of a zone and/or of a symbol displayed on the display screen and chosen as a function of the targeted position.
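The measuring and drive steps above can be sketched as a small function. This is a minimal illustration, not the patented method: the linear mapping from hover distance to displacement speed, the sensing range, and all names are assumptions.

```python
def navigation_update(target_pos, vertical_distance,
                      max_distance=100.0, max_speed=50.0):
    """Map a measured hover distance to a displacement-speed parameter.

    Hypothetical behavior: the closer the remote control object is to
    the control surface, the faster the targeted zone/symbol is moved.
    """
    # Clamp the measured vertical distance into the sensing range.
    d = min(max(vertical_distance, 0.0), max_distance)
    # Speed shrinks linearly as the object moves away from the surface.
    speed = max_speed * (1.0 - d / max_distance)
    x, y = target_pos
    return {"target": (x, y), "displacement_speed": speed}
```

A real device would feed `target_pos` and `vertical_distance` from the capacitive or optical measurement step and use the returned parameter to drive the displayed zone.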
A CONTACTLESS TOUCHSCREEN INTERFACE
A contactless touchscreen interface has a digital display to display digital information, and a proximity detector comprising an image sensor to detect user interaction at a virtual touch intersection plane offset a distance from the digital display and to resolve the interaction into XY offset-plane interaction coordinates with reference to the digital display. A gaze determining imaging system comprising an image sensor determines a gaze relative offset with respect to the digital display using facial image data captured by the image sensor. An interface controller comprises a parallax adjustment controller to convert the XY offset-plane interaction coordinates to XY on-screen apparent coordinates using the gaze relative offset and the distance, and an input controller to generate an input at the XY on-screen apparent coordinates accordingly.
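The parallax adjustment amounts to projecting the eye-to-touch ray onto the display plane by similar triangles. The sketch below assumes a pinhole geometry with the eye on one axis; the function and parameter names are illustrative, not taken from the patent.

```python
def parallax_correct(touch_xy, gaze_offset_xy, eye_distance, plane_offset):
    """Project a touch on the virtual offset plane onto the display.

    touch_xy       : interaction coordinates on the offset plane
    gaze_offset_xy : lateral (x, y) offset of the user's eye vs. display
    eye_distance   : eye-to-display distance along the viewing axis
    plane_offset   : distance between the virtual plane and the display

    The ray from the eye through the touch point is extended until it
    intersects the display plane (similar triangles).
    """
    t = eye_distance / (eye_distance - plane_offset)
    sx = gaze_offset_xy[0] + t * (touch_xy[0] - gaze_offset_xy[0])
    sy = gaze_offset_xy[1] + t * (touch_xy[1] - gaze_offset_xy[1])
    return (sx, sy)
```

When the eye sits directly over the touch point the correction vanishes; the further the gaze is offset, the larger the shift between the offset-plane coordinates and the on-screen apparent coordinates.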
Floating touch camera module, electronic device and touch method
The present disclosure provides a floating touch camera module, a display device and a touch method. The floating touch camera module includes: a lens with a light collection surface and a light emitting surface; an image sensor at one side of the light emitting surface of the lens, the image sensor configured to receive light rays from the lens and form sensing information; and an infrared cut filter film at one side of a light incident surface of the image sensor and configured to filter out infrared light rays. The infrared cut filter film is movable relative to the lens between a first position at which the infrared cut filter film directly faces the lens and a second position at which the infrared cut filter film is offset from the lens, thereby enabling the floating touch camera module to switch between a photographing mode and a touch mode.
SYSTEM FOR DETECTING AND VALIDATING MULTIUSER INTERACTION WITH INTERFACE ELEMENTS VIA SIGNALING THROUGH USERS' BODIES
A sensor system operates by: communicating a first ID signal at a first frequency between a first passenger restraint and a first sensor circuit of a touch screen through a body of a first user in a first occupancy area; receiving first sensed signal data from the first sensor circuit indicating a possible interaction with a first interactable element at a first touch screen location based on changes in electrical properties of an electrode of the first sensor circuit; determining the first sensed signal data indicates detection of the first frequency; when permissions data for the first occupancy area indicates occupants of the first occupancy area can interact with the first interactable element, facilitating performance of a functionality associated with the first interactable element; and when permissions data for the first occupancy area indicates occupants of the first occupancy area cannot interact with the first interactable element, forgoing performance of the functionality associated with the interaction with the first interactable element.
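The decision flow above reduces to a frequency match followed by a permission lookup. A minimal sketch, with all names and the permission-table shape as assumptions rather than claim language:

```python
def handle_interaction(sensed_frequency, expected_frequency,
                       permissions, occupancy_area, element_id):
    """Gate an interactable element behind per-seat permissions.

    permissions maps (occupancy_area, element_id) -> bool, indicating
    whether occupants of that area may use that element.
    """
    # The ID signal routed through the user's body must match the
    # frequency assigned to this occupancy area; otherwise the touch
    # cannot be attributed to that occupant.
    if sensed_frequency != expected_frequency:
        return "ignored"
    if permissions.get((occupancy_area, element_id), False):
        return "performed"
    return "forgone"
```

The same gating applies to the perimeter-sensor variant described next; only the source of the ID signal changes.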
SYSTEM FOR DETECTING AND VALIDATING MULTIUSER INTERACTION WITH INTERFACE ELEMENTS VIA PERIMETER SENSORS
A sensor system operates by: communicating a first ID signal at a first frequency between a first perimeter sensor and a first sensor circuit of a touch screen through a body of a first user in a first occupancy area; receiving first sensed signal data from the first sensor circuit indicating a possible interaction with a first interactable element at a first touch screen location based on changes in electrical properties of an electrode of the first sensor circuit; determining the first sensed signal data indicates detection of the first frequency; when permissions data for the first occupancy area indicates occupants of the first occupancy area can interact with the first interactable element, facilitating performance of a functionality associated with the first interactable element; and when permissions data for the first occupancy area indicates occupants of the first occupancy area cannot interact with the first interactable element, forgoing performance of the functionality associated with the interaction with the first interactable element.
Display device including a touch panel
A display device includes a display panel that includes a plurality of pixels, a touch panel that includes a plurality of driving lines and a plurality of sensing lines, a display driver that drives the display panel at a first display frame rate in a normal driving mode and at a second display frame rate lower than the first display frame rate in a low power driving mode, and a touch controller that drives the touch panel with a mutual capacitance sensing method in the normal driving mode and with a self capacitance sensing method in the low power driving mode.
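The mode-dependent pairing of frame rate and sensing method can be expressed as a small configuration selector. The specific rates below are illustrative placeholders, not values from the patent:

```python
def select_touch_config(low_power):
    """Pick a display frame rate and touch sensing method per power mode.

    Hypothetical values; only the pairing (normal -> mutual capacitance,
    low power -> self capacitance) reflects the scheme described above.
    """
    if low_power:
        # Self capacitance drives and senses each electrode against
        # ground: fewer scan steps, lower power, but coarser sensing.
        return {"frame_rate_hz": 30, "sensing": "self_capacitance"}
    # Mutual capacitance measures each driving/sensing-line crossing,
    # giving a full multi-touch image at a higher power cost.
    return {"frame_rate_hz": 120, "sensing": "mutual_capacitance"}
```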
ELECTRONIC DEVICE
An electronic device includes a sensor layer including a plurality of first electrodes and a plurality of second electrodes, a sensor driving circuit driving the sensor layer and operating in a first or second mode, and a main driving circuit controlling an operation of the sensor driving circuit. In the first mode, the sensor driving circuit outputs a plurality of first transmit signals to the plurality of first electrodes respectively, receives a plurality of first sensing signals from the plurality of second electrodes respectively, and outputs the plurality of first sensing signals to the main driving circuit. In the second mode, the sensor driving circuit outputs a plurality of second transmit signals to the plurality of first electrodes respectively, receives a plurality of second sensing signals from the plurality of second electrodes respectively, and provides the main driving circuit with a coordinate based on the plurality of second sensing signals.
Adaptive thresholding and noise reduction for radar data
An electronic device for gesture recognition includes a processor operably connected to a transceiver. The transceiver is configured to transmit and receive signals for measuring range and speed. The processor is configured to transmit the signals via the transceiver. In response to a determination that a triggering event has occurred, the processor is configured to track movement of an object relative to the electronic device within a region of interest, based on reflections of the signals received by the transceiver, to identify range measurements and speed measurements associated with the object. The processor is also configured to identify features from the reflected signals based on at least one of the range measurements and the speed measurements. The processor is further configured to identify a gesture based in part on the features from the reflected signals. Additionally, the processor is configured to perform an action indicated by the gesture.
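A common form of adaptive thresholding for radar returns is cell-averaging CFAR, where each range cell's threshold is derived from the noise in its neighbors. The sketch below illustrates that general technique, not the specific method claimed in the patent; all parameters are assumptions.

```python
def ca_cfar(power, guard=2, train=8, scale=3.0):
    """Cell-averaging CFAR over a 1-D power profile.

    For each range cell, the noise floor is estimated by averaging
    `train` training cells on each side (skipping `guard` guard cells
    around the cell under test); a detection is declared when the
    cell's power exceeds scale * estimated noise.
    """
    detections = []
    n = len(power)
    for i in range(n):
        cells = []
        for j in range(i - guard - train, i - guard):
            if 0 <= j < n:
                cells.append(power[j])  # leading training cells
        for j in range(i + guard + 1, i + guard + train + 1):
            if 0 <= j < n:
                cells.append(power[j])  # trailing training cells
        if not cells:
            continue
        noise = sum(cells) / len(cells)
        if power[i] > scale * noise:
            detections.append(i)
    return detections
```

Because the threshold floats with the local noise estimate, the false-alarm rate stays roughly constant even as clutter levels vary across the region of interest.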
Data processing device and driving method thereof for a flexible touchscreen device accepting input on the front, rear and sides
A data processing device includes a flexible position input portion for sensing the proximity or touch of an object such as a user's palm or finger. When a first region of the flexible position input portion is held by the user for a certain period, the supply of image signals to the first region is selectively stopped.
Three-dimensional perceptions in haptic systems
An acoustic field may be produced from a transducer array having known relative positions and orientations. In this acoustic field, one or more control points may be defined, and an amplitude may be assigned to each control point. A mid-air haptic effect for a virtual object on a human body part may be generated by moving the control point along a single closed curve comprising a plurality of curve segments. The single closed curve traverses at least one location where the human body part intersects the virtual object. Additionally, a user may interact with virtual three-dimensional content using the user's hands while a tracking system monitors the user's hands, a physics engine updates the properties of the virtual three-dimensional content, and a haptic feedback system provides haptic information to the user.
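Moving a control point along a single closed curve of segments can be sketched as arc-length sampling of a closed polyline. This stands in for the curve segments described above; in practice the segments could be arcs or splines, and all names here are illustrative.

```python
import math

def sample_closed_curve(waypoints, n_samples):
    """Sample control-point positions along a closed polyline.

    The curve is closed by returning to the first waypoint; samples
    are spaced uniformly in arc length, as a control point swept
    around the curve at constant speed would be.
    """
    pts = list(waypoints) + [waypoints[0]]  # close the curve
    seg_lengths = [math.dist(a, b) for a, b in zip(pts, pts[1:])]
    total = sum(seg_lengths)
    samples = []
    for k in range(n_samples):
        target = total * k / n_samples  # arc length to travel
        for (a, b), length in zip(zip(pts, pts[1:]), seg_lengths):
            if target <= length:
                t = target / length if length else 0.0
                samples.append((a[0] + t * (b[0] - a[0]),
                                a[1] + t * (b[1] - a[1])))
                break
            target -= length
        else:
            samples.append(pts[0])  # float-rounding fallback
    return samples
```

Sweeping the control point through these positions at a high repetition rate traces the closed curve on the skin, producing the perceived shape of the virtual object's intersection with the body part.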