G06F3/0426

Systems and methods for providing on-screen virtual keyboards
11656723 · 2023-05-23

Systems and methods for providing a virtual keyboard are shown and described. User gestures are captured by a camera and are mapped to spatial coordinates that correspond to the keys of a virtual keyboard. The user defines the coordinate system based on his or her range of motion and also defines the spatial dimensions of the virtual keyboard. The spatial dimensions are then scaled to provide a display image of the virtual keyboard on a TV display. Facial recognition techniques and corresponding data regarding the viewer's anatomy and previously captured reference gestures are used to interpret the viewer's gestures and determine which keystrokes are intended. A character prediction technique using the trajectory of the cursor (i.e., the trajectory of entered keystrokes) is combined with language/semantic-based character prediction models to identify a next predicted character that is indicated as the user's selected character, thereby disambiguating the key selection indicated by the positioning of his or her fingers relative to the virtual keyboard keys.
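The disambiguation step described above could be sketched as fusing two scores: a geometric likelihood from the cursor position and a language-model prior. The key layout, the Gaussian scoring, and the toy bigram probabilities below are illustrative assumptions, not the patent's actual model.

```python
import math

# Hypothetical key centers for a few virtual-keyboard keys (x, y); illustration only.
KEY_CENTERS = {"a": (0.0, 0.0), "s": (1.0, 0.0), "d": (2.0, 0.0)}

def geometric_likelihood(cursor_xy, key_xy, sigma=0.5):
    """Gaussian-style score: keys nearer the cursor's trajectory endpoint score higher."""
    dx = cursor_xy[0] - key_xy[0]
    dy = cursor_xy[1] - key_xy[1]
    return math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))

# Toy language-model prior: probability of each character given the previous one.
BIGRAM_PRIOR = {("c", "a"): 0.5, ("c", "s"): 0.1, ("c", "d"): 0.4}

def predict_next_char(cursor_xy, prev_char):
    """Fuse spatial and linguistic evidence to pick the intended key."""
    scores = {
        ch: geometric_likelihood(cursor_xy, xy) * BIGRAM_PRIOR.get((prev_char, ch), 0.01)
        for ch, xy in KEY_CENTERS.items()
    }
    return max(scores, key=scores.get)
```

With a cursor landing midway between "s" and "d", the geometric scores tie and the language-model prior breaks the tie, which is the disambiguation effect the abstract describes.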

DISPLAY APPARATUS AND METHOD FOR CONTROLLING DISPLAY APPARATUS
20170371426 · 2017-12-28

When position information is outputted, in a case where the correspondence between a first interface via which an image signal is inputted and a second interface via which the position information is outputted has not been stored, it is evaluated whether a candidate of the second interface is uniquely determined. When the candidate is not uniquely determined, an output destination identification image is displayed, and an output destination selected in the output destination identification image is stored in a management table. When the correspondence between the first interface and the second interface has been stored, it is evaluated whether "no output destination" has been stored in the management table, and when "no output destination" does not apply, the position information is transmitted.
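The routing decision above can be sketched as a lookup against the management table. The table layout, the `NO_OUTPUT` sentinel, and the `"ASK_USER"` return value are hypothetical names chosen for illustration.

```python
# Hypothetical management table: first (input) interface -> second (output) interface.
# NO_OUTPUT marks an entry that explicitly disables forwarding of position information.
NO_OUTPUT = "NO_OUTPUT"

def route_position_info(table, first_iface, candidates):
    """Decide where (or whether) to send position information.

    Returns the chosen output interface, None when forwarding is disabled,
    or "ASK_USER" when the candidate cannot be uniquely determined and the
    output destination identification image should be shown.
    """
    if first_iface in table:                  # correspondence already stored
        second = table[first_iface]
        return None if second == NO_OUTPUT else second
    if len(candidates) == 1:                  # candidate uniquely determined
        table[first_iface] = candidates[0]    # remember the correspondence
        return candidates[0]
    return "ASK_USER"                         # prompt the user to select a destination
```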

Autonomous computing and telecommunications head-up displays glasses

A pair of eyeglasses has a frame and lenses mounted on the frame. A computer processor is mounted on the eyeglasses together with a plurality of cameras and a digital projection system. IR sensors and/or dual zoom cameras may also be mounted on the glasses and configured to track the movement of the user's hand.

Finger-Mounted Device With Sensors and Haptics

A finger-mounted device may include finger-mounted units. The finger-mounted units may each have a body that serves as a support structure for components such as force sensors, accelerometers, and other sensors and for haptic output devices. The body may have sidewall portions coupled by a portion that rests adjacent to a user's fingernail. The body may be formed from deformable material such as metal or may be formed from adjustable structures such as sliding body portions that are coupled to each other using magnetic attraction, springs, or other structures. The body of each finger-mounted unit may have a U-shaped cross-sectional profile that leaves the finger pad of each finger exposed when the body is coupled to a fingertip of a user's finger. Control circuitry may gather finger press input, lateral finger movement input, and finger tap input using the sensors and may provide haptic output using the haptic output device.

ELECTRONIC DEVICE FOR USING VIRTUAL INPUT DEVICE AND OPERATION METHOD IN THE ELECTRONIC DEVICE
20230196689 · 2023-06-22

An example electronic device may, by at least one processor, obtain an image corresponding to a real drawing via the camera module, detect an outline of at least one object included in the image, obtain at least one virtual button corresponding to a pattern of the outline of the at least one object, deploy the at least one virtual button in an area of the at least one object that matches at least one real object included in the real drawing, and process a user interaction corresponding to a user input according to a movement of button control in an area of the at least one virtual button.
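One way to picture the pattern-to-button step is to map a detected outline to a button type and then hit-test user input against the button's area. The vertex-count heuristic, button names, and rectangular areas below are simplifying assumptions; a real device would run full contour detection on the camera image.

```python
# Hypothetical mapping from an outline's pattern (here, its vertex count after
# polygon approximation) to a virtual-button type.
PATTERN_TO_BUTTON = {3: "triangle_button", 4: "rect_button"}

def virtual_button_for(outline_vertices):
    """Pick the virtual button whose pattern matches the detected outline."""
    return PATTERN_TO_BUTTON.get(len(outline_vertices), "unknown")

def hit_test(button_area, touch_xy):
    """Process a user interaction: is the touch inside the button's area?"""
    x0, y0, x1, y1 = button_area
    x, y = touch_xy
    return x0 <= x <= x1 and y0 <= y <= y1
```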

LIGHTING APPARATUS

A lighting apparatus with an image-projecting function that is convenient for a user can be provided. The lighting apparatus includes: an illuminating unit that emits illumination light; and a projection-type image display unit that projects an image. The projection-type image display unit is configured to display a setting menu screen from which settings relating to the image it displays can be made.

DETECTING FINGER MOVEMENTS
20170351345 · 2017-12-07

Examples relate to determining finger movements. In one example, a computing device may: receive input from at least one of: a first proximity sensor coupled to the frame at a first position and facing a first direction; or a second proximity sensor coupled to the frame at a second position and facing a second direction; determine, based on the input, that a finger action occurred, the finger action being one of: a first movement of a first finger, the first movement being detected by the first proximity sensor; or a second movement of a second finger, the second movement being detected by the second proximity sensor; generate, based on the finger action, output that includes data defining an event that corresponds to the finger action; and provide the output to another computing device.
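The sensor-to-event mapping described here could be sketched as thresholding proximity readings and emitting event records. The sensor IDs, the distance threshold, and the event schema are illustrative assumptions, not the patent's actual data format.

```python
def detect_finger_events(samples, threshold=0.3):
    """Map raw proximity readings from two sensors to finger-action events.

    samples: list of (sensor_id, distance) tuples; a reading below `threshold`
    (arbitrary units, hypothetical) counts as a detected finger movement.
    Sensor 1 watches the first finger, sensor 2 the second finger.
    """
    events = []
    for sensor_id, distance in samples:
        if distance < threshold:
            finger = "first" if sensor_id == 1 else "second"
            events.append({"event": "finger_move", "finger": finger, "distance": distance})
    return events
```

The returned event records stand in for the "data defining an event" that the device would provide to another computing device.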

METHOD OF FORMING A GRAPHENE DEVICE USING POLYMER MATERIAL AS A SUPPORT FOR A GRAPHENE FILM

The invention concerns a method of forming a graphene device, the method comprising: forming a graphene film (100) over a substrate; depositing, by gas phase deposition, a polymer material covering a surface of the graphene film (100); and removing the substrate from the graphene film (100), wherein the polymer material forms a support (102) for the graphene film (100).

INTERACTIVE REAR PROJECTION SYSTEM
20230185178 · 2023-06-15

An interactive rear projection system including a projection screen, a projection optical engine, an input device and a controller is provided. The projection optical engine emits a projection beam which is projected on the projection screen. The input device includes a sensor and a first light source both disposed on a first side of the projection screen. The controller is electrically connected to the projection optical engine, the sensor, and the first light source and controls the first light source to project a first beam toward the projection screen. When an object approaches a second side of the projection screen, the sensor senses a light spot generated by the first beam diffusely reflected by the object and recognizes a position of the object or a position where the object touches the projection screen according to the light spot.
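Recognizing the object's position from the sensed light spot could be sketched as an intensity-weighted centroid over the pixels where the diffusely reflected beam is detected. The pixel format and the centroid approach are assumptions for illustration; the patent does not specify the position-recognition algorithm.

```python
def spot_centroid(pixels):
    """Estimate the touch position as the intensity-weighted centroid of the
    light spot seen by the sensor (pixels: list of (x, y, intensity) tuples)."""
    total = sum(i for _, _, i in pixels)
    if total == 0:
        return None  # no spot detected: no object near the screen
    cx = sum(x * i for x, _, i in pixels) / total
    cy = sum(y * i for _, y, i in pixels) / total
    return (cx, cy)
```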

LIGHTING APPARATUS

A lighting apparatus with an image-projecting function that is convenient for a user is provided. It includes: an illuminating unit that emits illumination light; a projection-type image display unit that emits image-projecting emission light for projecting an image; and a sensor that emits operation-detecting light for operation detection and is capable of detecting an operation by an operation object in a range including an image projection area of the projection-type image display unit. The apparatus is configured so that the image-projecting emission light and the operation-detecting light have respectively different wavelength distribution characteristics, and so that, in the wavelength range of light used by the sensor for operation detection, the light amount of the operation-detecting light is the largest among those of the illumination light, the image-projecting emission light, and the operation-detecting light.