Patent classifications
G06F3/03549
USER INTERFACE DEVICE WITH TOUCH SENSOR
The functionality of a conventional mouse is extended to provide an increased number of simultaneously adjustable user interface parameters by employing one or more user-removable modules. In an embodiment, a user interface for controlling an external device, such as a computer, includes a first user interface sensor configured with a housing. This first sensor generates a first plurality of signals responsive to movement of the housing relative to two orthogonal axes. A compartment is configured with the housing and is sized to receive the user-removable module. This user-removable module contains a second user interface sensor, which generates a second plurality of signals responsive to user manipulation. Output is provided responsive to signals generated by the first and second user interface sensors. In another embodiment, the housing of an extended functionality mouse itself serves as a module removable from a compartment provided in another physical device.
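The abstract describes merging two signal streams: base housing motion on two orthogonal axes, plus extra parameters from a docked module's second sensor. A minimal sketch of that combination, assuming a simple dictionary report format (the field names are illustrative, not the patent's actual implementation):

```python
# Sketch of combining the two sensor streams the abstract describes: the
# first sensor reports housing motion on two orthogonal axes, and the
# removable module's second sensor contributes extra simultaneously
# adjustable parameters. Names and report format are assumptions.

def combined_report(housing_motion, module_signals):
    """Merge base mouse motion with the removable module's extra parameters.

    housing_motion: (dx, dy) from the first sensor (two orthogonal axes)
    module_signals: dict of extra parameters from the second sensor, or None
                    when no module is docked in the compartment
    """
    report = {"dx": housing_motion[0], "dy": housing_motion[1]}
    if module_signals:
        # Extra parameters stay adjustable at the same time as dx/dy.
        report.update(module_signals)
    return report

print(combined_report((3, -1), {"rotation": 0.25, "pressure": 0.8}))
print(combined_report((3, -1), None))  # module removed: base mouse only
```

When the module is removed, the device degrades gracefully to ordinary two-axis mouse output.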
Method and user interface device with touch sensor for controlling applications
Methods and systems for controlling applications using user interface device with touch sensor
Gesture operation method, apparatus, device and medium
The present disclosure provides a gesture operation method, apparatus, device and medium. The method includes: obtaining depth information of a user's hand; determining space coordinates of a virtual hand corresponding to the hand in a virtual space based on the depth information; binding trackballs to the virtual hand based on the space coordinates, which includes binding a palm ball to a palm position of the virtual hand and binding at least one fingertip ball to at least one fingertip position of the virtual hand, a volume of the palm ball being greater than that of the at least one fingertip ball; and performing a corresponding operation in the virtual space based on a straight-line distance between the at least one fingertip ball and the palm ball.
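The operation trigger above reduces to a distance test between bound spheres. A minimal sketch, assuming illustrative coordinates, radii, and a hypothetical grab threshold (none of these values come from the patent):

```python
import math
from dataclasses import dataclass

# Hypothetical sketch of the trackball-binding scheme described above.
# Coordinates, radii, and the grab threshold are illustrative assumptions.

@dataclass
class Ball:
    center: tuple   # (x, y, z) position in the virtual space
    radius: float

def distance(a, b):
    """Straight-line (Euclidean) distance between two ball centers."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a.center, b.center)))

def classify_gesture(palm, fingertips, grab_threshold=0.05):
    """Trigger a 'grab' when every fingertip ball is close to the palm ball."""
    if all(distance(tip, palm) < grab_threshold for tip in fingertips):
        return "grab"
    return "open"

palm = Ball(center=(0.0, 0.0, 0.5), radius=0.04)     # palm ball is larger...
tips = [Ball(center=(0.0, 0.01, 0.5), radius=0.01)]  # ...than fingertip balls
print(classify_gesture(palm, tips))  # fingertip near palm -> "grab"
```

Binding spheres rather than raw joint points makes the distance test robust to small tracking jitter.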
ELECTRONIC APPARATUS FOR REPOSITIONING AND TRANSITIONING AMONG INPUT DEVICES
An electronic apparatus comprising at least one repositioning mechanism, wherein at least one of said repositioning mechanisms alternately repositions at least one input device for use and disuse, such that a user can transition from using at least one of said input devices to using at least another one of said input devices while both palms remain continuously engaged at their locations.
USING TOUCH SENSING TO MAKE A TRACKBALL BEHAVE LIKE A JOYSTICK
While one type of input, either a mouse input or a joystick input, may be preferred for one type of game, it may not be preferred, or even compatible, for another. Introduced herein is a game controller that employs a dedicated input capable of the absolute accuracy of a mouse or trackball input, but also capable of measuring how far off center the input is (e.g., how far off center it has moved) and of returning to center when released, as a joystick input does. The introduced game controller integrates a touch sensing trackball to enjoy the benefits of both the mouse type input and the joystick type input in a single dedicated input, giving a user the freedom to play any type of game without worrying about the compatibility of their game controller.
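The hybrid behavior above can be sketched as a small state machine: the ball supplies relative motion, the touch sensor reports whether the user is still holding it, and a virtual center gives joystick-style deflection that snaps back to zero on release. Class and method names, and the clamping range, are assumptions for illustration, not the patent's actual implementation:

```python
# Illustrative sketch of a touch-sensing trackball acting as a joystick:
# accumulate deflection while touched, spring back to center on release.

class TouchSensingTrackball:
    def __init__(self, max_deflection=100.0):
        self.offset_x = 0.0   # accumulated displacement from virtual center
        self.offset_y = 0.0
        self.max_deflection = max_deflection

    def update(self, dx, dy, touched):
        """Feed one frame of ball motion (dx, dy) and the touch state."""
        if touched:
            # Accumulate like a joystick deflection, clamped to a max range.
            self.offset_x = max(-self.max_deflection,
                                min(self.max_deflection, self.offset_x + dx))
            self.offset_y = max(-self.max_deflection,
                                min(self.max_deflection, self.offset_y + dy))
        else:
            # Released: return to center, as a spring-loaded joystick would.
            self.offset_x = 0.0
            self.offset_y = 0.0

    def joystick_axes(self):
        """Normalized off-center reading in [-1.0, 1.0] per axis."""
        return (self.offset_x / self.max_deflection,
                self.offset_y / self.max_deflection)

ball = TouchSensingTrackball()
ball.update(30.0, 0.0, touched=True)
print(ball.joystick_axes())   # (0.3, 0.0) while held off-center
ball.update(0.0, 0.0, touched=False)
print(ball.joystick_axes())   # (0.0, 0.0) after release
```

The same raw (dx, dy) stream can be forwarded unmodified for mouse-style absolute accuracy, so both input models coexist on one device.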
Multi-sensor device with an accelerometer for enabling user interaction through sound or image
A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to the interaction region is established. A pointing line is directed to substantially pass through the calibration point while a position and an orientation of the pointing device are measured. The pointing line has a predetermined relationship to the pointing device. Movement of the cursor within the interaction region is controlled using measurements of the position and the orientation of the pointing device.
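Geometrically, the point-of-aim is where the pointing line (a ray from the device's measured position along its orientation) meets the projected display plane. A simplified sketch, assuming the interaction region lies in the plane z = 0 and a hypothetical linear pixel mapping:

```python
# Geometric sketch of the point-of-aim idea: intersect the pointing line
# with the display plane, then map the hit point to cursor pixels. The
# z = 0 plane and the pixel mapping are simplifying assumptions.

def point_of_aim(device_pos, device_dir):
    """Intersect the pointing line with the display plane z = 0.

    device_pos: (x, y, z) measured position of the pointing device
    device_dir: (dx, dy, dz) direction the device points in
    Returns the (x, y) intersection, or None if the pointing line is
    parallel to the plane or the plane is behind the device.
    """
    px, py, pz = device_pos
    dx, dy, dz = device_dir
    if dz == 0:
        return None          # line never reaches the plane
    t = -pz / dz             # parameter where the line crosses z = 0
    if t < 0:
        return None          # plane is behind the device
    return (px + t * dx, py + t * dy)

def aim_to_cursor(aim_xy, region_origin=(0.0, 0.0), pixels_per_unit=100.0):
    """Map an interaction-region hit point to display-cursor pixels."""
    ax, ay = aim_xy
    ox, oy = region_origin
    return (round((ax - ox) * pixels_per_unit),
            round((ay - oy) * pixels_per_unit))

hit = point_of_aim(device_pos=(0.0, 1.0, 2.0), device_dir=(0.1, -0.2, -1.0))
print(aim_to_cursor(hit))  # (20, 60)
```

The calibration point mentioned in the abstract would fix the relationship between measured coordinates and the region (here folded into `region_origin` and `pixels_per_unit` as assumed constants).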
Pointing device
Rowing machine having a handle with a cursor control device for controlling a cursor at a graphical user interface
A handle for a rowing machine is provided. The handle has a cursor control device which accepts user input. A processor is in electronic communication with the cursor control device and an electronic display. An electronic storage device comprises executable software instructions which, when executed, configure the processor to receive user input from the cursor control device and either move the cursor on a graphical user interface (“GUI”) in response to the received user input or update the GUI based on the received user input.
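The processor's role above is a simple dispatch loop: input from the handle's cursor control device either moves the cursor or updates other GUI state. A minimal sketch, with the event format, class name, and display dimensions as illustrative assumptions:

```python
# Minimal sketch of the control loop the abstract describes: events from the
# handle's cursor control device either move the cursor or update the GUI.

class RowingHandleGUI:
    def __init__(self, width=1920, height=1080):
        self.width, self.height = width, height
        self.cursor = [width // 2, height // 2]  # start at display center
        self.stroke_count = 0                    # example non-cursor GUI state

    def handle_event(self, event):
        """Dispatch one event from the handle's cursor control device."""
        if event["type"] == "move":
            # Move the cursor, keeping it inside the display bounds.
            self.cursor[0] = max(0, min(self.width, self.cursor[0] + event["dx"]))
            self.cursor[1] = max(0, min(self.height, self.cursor[1] + event["dy"]))
        elif event["type"] == "stroke":
            # Non-cursor input updates GUI state instead of the cursor.
            self.stroke_count += 1

gui = RowingHandleGUI()
gui.handle_event({"type": "move", "dx": 40, "dy": -25})
gui.handle_event({"type": "stroke"})
print(gui.cursor, gui.stroke_count)  # [1000, 515] 1
```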