Patent classifications
G06F2203/04808
Context based annotating in an electronic presentation system
A presentation system capable of detecting one or more gestures and contacts on a touch sensitive display. The presentation system can display indicia of such contacts, such as when a user writes with a fingertip, and can remove or alter such indicia responsive to other gestures and contacts. The system can accurately distinguish between types of gestures detected, such as between a writing gesture and an erasing gesture, on both large and small touch sensitive displays, thereby obviating the need for a user to make additional selective inputs to transition from one type of gesture to another. The system can determine how long to keep user annotations displayed during a presentation, based on the nature of the gesture used to make the annotations and the context in which they are made.
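One way such a system might distinguish a writing gesture from an erasing gesture without a mode switch is by the size of the touch contact. This is a hypothetical sketch; the function name and the area threshold are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: classify a touch contact as writing or erasing
# from its contact area, so a fingertip stroke and a palm wipe can be
# told apart without an explicit mode-switch input.

def classify_gesture(contact_area_mm2: float,
                     area_threshold_mm2: float = 150.0) -> str:
    """Small contacts (fingertip) are treated as writing; large
    contacts (palm or side of hand) as erasing."""
    return "write" if contact_area_mm2 < area_threshold_mm2 else "erase"

print(classify_gesture(40.0))   # fingertip-sized contact
print(classify_gesture(800.0))  # palm-sized contact
```

A real system would likely also weigh contact shape and motion, but a single area threshold captures the basic idea of inferring intent from the contact itself.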
Method and apparatus for variable impedance touch sensor array force aware interaction with handheld display devices
- John Aaron Zarraga
- Alexander Meagher Grau
- Bethany Noel Haniger
- Bradley James Bozarth
- Brogan Carl Miller
- Ilya Daniel Rosenberg
- James Frank Thomas
- Mark Joshua Rosenberg
- Peter Hans Nyboer
- Reuben Eric Martinez
- Scott Gregory Isaacson
- Stephanie Jeanne Oberg
- Timothy James Miller
- Tomer Moscovich
- Yibo Yu
The present invention relates to touch-sensor detector systems and methods incorporating an interpolated variable impedance touch sensor array, and specifically to such systems and methods for force-aware interaction with handheld display devices on one or more surfaces of the device. An exemplary embodiment includes a method for receiving a flexing gesture formed on a sensor panel of the handheld device, including determining two or more pressure inputs at the sensor panel and determining a relative pressure between the two or more pressure inputs. The method further includes correlating the relative pressure inputs to the flexing gesture, associating the flexing gesture with a UI element, and providing an input to the UI element based on the gesture and the relative pressure between the two or more pressure inputs.
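The core of the described method is turning two pressure readings into a single flexing gesture. The sketch below is a hypothetical illustration of that correlation step: the data structure, threshold, and gesture names are assumptions for clarity, not the patent's actual implementation.

```python
# Hypothetical sketch: correlate two pressure inputs on a sensor panel
# into a flexing gesture by comparing their relative pressure.

from dataclasses import dataclass

@dataclass
class PressureInput:
    x: float      # position along the panel
    force: float  # measured force (arbitrary units)

def detect_flex(a: PressureInput, b: PressureInput,
                threshold: float = 0.2):
    """Return 'flex_left' or 'flex_right' when the normalized relative
    pressure between the two inputs exceeds a threshold, else None."""
    total = a.force + b.force
    if total == 0:
        return None
    relative = (a.force - b.force) / total  # in [-1, 1]
    if relative > threshold:
        return "flex_left"
    if relative < -threshold:
        return "flex_right"
    return None
```

The resulting gesture, together with the relative pressure value itself, could then be routed to whichever UI element the gesture is associated with.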
Personalizable UI component layout on curved glass
Systems and methods are provided for generating a personalizable user interface (UI) component layout on a curved screen on a mobile electronic communication device, the curved screen having a front screen display surface and a plurality of edge screen display surfaces. In an example, the described technique entails detecting that a device user has opened an application on the mobile electronic communication device, determining whether the application opened by the user supports placement of UI components on edge screen display surfaces, determining a coverage area on the front screen display surface and the edge screen display surface areas when the application supports such placement, estimating a user hand size based on the determined coverage area, and placing one or more UI components on an edge screen display surface within the user's reach based on the estimated hand size.
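The hand-size estimation and reach-based placement steps can be sketched as simple mappings. This is a hypothetical illustration; the size bands and placement fractions are invented for the example and do not come from the patent.

```python
# Hypothetical sketch: map a measured touch coverage area to a hand-size
# class, then pick an edge position that hand can comfortably reach.

def estimate_hand_size(coverage_area_cm2: float) -> str:
    """Bucket the coverage area into a hand-size class (bands are
    illustrative assumptions)."""
    if coverage_area_cm2 < 20.0:
        return "small"
    if coverage_area_cm2 < 35.0:
        return "medium"
    return "large"

def edge_component_offset(hand_size: str, edge_length_cm: float) -> float:
    """Place the UI component lower on the edge for smaller hands so
    the thumb can reach it."""
    fraction = {"small": 0.25, "medium": 0.4, "large": 0.55}[hand_size]
    return edge_length_cm * fraction
```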
Nautical chart display device, nautical chart display method, and nautical chart display program
The purpose is to provide a nautical chart display device which enables a measurement of a distance on an electronic nautical chart, like a measuring method which is performed by using a divider. The nautical chart display device includes a display unit, an operation detector, a registration processing module, and a change processing module. The display unit has a screen and displays a nautical chart on the screen. The operation detector detects a touch operation to the screen. The registration processing module accepts two points of the touch operation on the screen, and registers a scale at which a distance between the touched points on the screen matches a distance setting on the nautical chart as an additional scale. The change processing module changes the scale of the nautical chart to the additional scale.
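The divider-style registration step reduces to one ratio: the real-world distance the user sets, divided by the on-screen span between the two touched points. A minimal sketch, with unit choices (centimetres on screen, nautical miles on the chart) assumed for illustration:

```python
# Hypothetical sketch of registering the "additional scale": given the
# on-screen distance between two touched points and the chart distance
# the user wants that span to represent, compute the scale to register.

def register_additional_scale(screen_distance_cm: float,
                              chart_distance_nm: float) -> float:
    """Return the scale (nautical miles per on-screen centimetre) at
    which the two touched points are exactly chart_distance_nm apart."""
    if screen_distance_cm <= 0:
        raise ValueError("touched points must be distinct")
    return chart_distance_nm / screen_distance_cm
```

The change processing module would then redraw the chart at this scale, so the user can step off distances with two fingers like a physical divider.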
Methods and systems facilitating adjustment of multiple variables via a content guidance application
Systems and methods are described for facilitating adjustment of multiple variables via a content guidance application. The method comprises generating, for display via a graphical user interface of a touchscreen, a first axis defining a first scale for a first adjustment characteristic. The method further comprises assigning to the first adjustment characteristic a plurality of first variables stored in memory. The method further comprises detecting, via the touchscreen, a touch input having a component along the first axis for adjusting the first adjustment characteristic. The method further comprises, in response to detecting the touch input, adjusting, in the memory, each of the plurality of first variables assigned to the first adjustment characteristic based on the touch input and the first scale.
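The adjustment step can be pictured as one touch delta fanned out to every variable assigned to the axis's characteristic. This sketch is a hypothetical simplification that applies one shared scale to all assigned variables; the patent may weight them differently.

```python
# Hypothetical sketch: a single touch movement along one axis adjusts
# every variable assigned to that axis's adjustment characteristic.

def adjust_variables(variables: dict, axis_delta: float,
                     scale: float) -> dict:
    """Apply the touch movement (axis_delta, in axis units) to all
    assigned variables, using the first scale defined by the axis."""
    return {name: value + axis_delta * scale
            for name, value in variables.items()}

state = {"brightness": 50.0, "contrast": 40.0}
state = adjust_variables(state, axis_delta=2.0, scale=5.0)
print(state)  # both variables move together from one gesture
```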
MULTITOUCH DATA FUSION
A method for performing multi-touch (MT) data fusion is disclosed in which multiple touch inputs occurring at about the same time are received to generate first touch data. Secondary sense data can then be combined with the first touch data to perform operations on an electronic device. The first touch data and the secondary sense data can be time-aligned and interpreted in a time-coherent manner. The first touch data can be refined in accordance with the secondary sense data, or alternatively, the secondary sense data can be interpreted in accordance with the first touch data. Additionally, the first touch data and the secondary sense data can be combined to create a new command.
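The time-alignment step described above can be sketched as a nearest-timestamp join between the two streams. This is a hypothetical stand-in for whatever alignment the patent actually uses; the tuple layout is assumed for illustration.

```python
# Hypothetical sketch of time-aligned MT data fusion: pair each touch
# sample with the secondary-sensor sample closest in time, so the two
# streams can be interpreted in a time-coherent manner.

def fuse(touch_samples, sensor_samples):
    """touch_samples and sensor_samples are lists of (timestamp, value)
    tuples. Returns (timestamp, touch_value, nearest_sensor_value)
    tuples, one per touch sample."""
    fused = []
    for t, touch in touch_samples:
        nearest = min(sensor_samples, key=lambda s: abs(s[0] - t))
        fused.append((t, touch, nearest[1]))
    return fused
```

With the streams paired, the touch data can be refined by the sensor data (or vice versa), or the pair can be mapped to a new command.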
Electronic device including a plurality of displays and method for operating same
Various embodiments may provide an electronic device including: a housing; a first display configured to be slidable through the housing, wherein at least a portion of the first display is exposed to the outside through the housing, and a region of the first display exposed to the outside is capable of being changed based on sliding of the first display through the housing; a second display spaced a certain distance apart from the exposed at least portion of the first display and disposed to form a flat surface with the exposed at least portion of the first display; and at least one processor disposed in the housing, wherein the at least one processor is configured to display first content on the first display and second content on the second display. Various other embodiments are possible.
DOCUMENT LAYOUT MANAGEMENT
The invention relates to methods and devices for managing document layout of an electronic document. In a particular embodiment, a method implemented by a computing device comprises: displaying in an electronic document a block containing at least one item; in response to a first finger gesture of two or more fingers defining at least two selection points, determining a space line specifying an initial position from which a space is to be managed in the electronic document; in response to a second finger gesture of two or more fingers defining a movement of the selection points along a first orientation, monitoring a current position of the space line moving over time; and performing a space management based on the current position of the space line along the first orientation, comprising creating and/or reducing a space in the electronic document.
Device, Method, and Graphical User Interface for Switching Between User Interfaces
An electronic device displays a first user interface that corresponds to a first application, and detects on a touch-sensitive surface a first gesture that includes movement of a contact in a respective direction on the touch-sensitive surface. In response to detecting the first gesture, the device, in accordance with a determination that the movement of the contact is in a first direction, replaces display of the first user interface with display of a second user interface that corresponds to a second application; and in accordance with a determination that the movement of the contact is in a second direction, distinct from the first direction, displays a first system user interface for interacting with a system-level function.
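The key behavior here is direction-dependent dispatch of the same edge gesture. A hypothetical sketch follows; the direction labels and returned action names are illustrative, not the patent's terminology.

```python
# Hypothetical sketch: the same swipe gesture either switches to a
# second application's UI or opens a system-level UI, depending on the
# direction of the contact's movement.

def handle_edge_gesture(direction: str) -> str:
    if direction == "horizontal":   # first direction: app switching
        return "replace_with_second_app_ui"
    if direction == "vertical":     # second, distinct direction
        return "show_system_ui"
    return "ignore"
```

In practice the direction would be determined from the contact's movement vector on the touch-sensitive surface before dispatching.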
MEETING INTERACTION SYSTEM
Described is an interaction system comprising an imaging device, such as a camera system, configured to image one or more users, wherein the interaction system is configured to determine one or more properties of each user. For example, the interaction system may be used to determine whether the hand of each user is raised or an orientation of each user's face.