G06F3/042

PROJECTION DISPLAY UNIT AND FUNCTION CONTROL METHOD
20170228057 · 2017-08-10

A projection display unit and a function control method. The projection display unit includes: a projection optical system including an illumination section, a projection lens, and a light valve; and a detection optical system including an image pickup device, the image pickup device being disposed in a position optically conjugate with the light valve, wherein: the detection optical system is configured to detect whether an input operation is performed in a peripheral region of a projection region on a projection surface, and in response to detection by the detection optical system in the peripheral region, a specific function is caused to be executed.
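A minimal sketch of the peripheral-region trigger this abstract describes: a detected input position on the projection surface is classified as falling inside the projection region proper or in a peripheral band around it, and a specific function fires only for peripheral hits. The region geometry, the band width, and the coordinate convention are all assumptions for illustration.

```python
# Hypothetical geometry: projection region as (x, y, w, h) plus a
# surrounding peripheral band of fixed width, all in surface pixels.
PROJECTION = (0, 0, 800, 600)   # x, y, width, height of the projected image
MARGIN = 40                     # width of the peripheral band

def in_peripheral_region(px, py):
    """True when (px, py) lies in the band around the projection region."""
    x, y, w, h = PROJECTION
    inside_outer = (-MARGIN <= px - x <= w + MARGIN
                    and -MARGIN <= py - y <= h + MARGIN)
    inside_inner = 0 <= px - x <= w and 0 <= py - y <= h
    return inside_outer and not inside_inner

print(in_peripheral_region(-10, 300))  # True  (just left of the image)
print(in_peripheral_region(400, 300))  # False (inside the projection region)
```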

DISPLAY PANEL, MOBILE TERMINAL AND METHOD FOR CONTROLLING MOBILE TERMINAL

Embodiments of the present application provide a display panel, a mobile terminal and a method for controlling a mobile terminal, which may allow the user to perform single-hand operations on a large-size mobile terminal and improve the convenience of the mobile terminal. The display panel includes: a first substrate; a second substrate arranged opposite to the first substrate; and a single-hand operation sensing unit arranged on the first substrate or the second substrate, which is configured to sense a single-hand holding operation of a user and, when such an operation is sensed, to trigger the display panel to demagnify an operation graphic interface displayed in full screen and display the demagnified interface in a predetermined single-hand operation comfortable region positioned on the basis of the holding position. The display panel is used in single-hand operation scenarios.
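A hedged sketch of how the "comfortable region" described above might be positioned, assuming it is a scaled-down copy of the full-screen interface anchored to the bottom corner on the holding side. Screen dimensions, the scale factor, and the left/right holding signal are illustrative assumptions, not details from the abstract.

```python
# Assumed inputs: screen size in pixels, which side the panel senses the
# user holding, and a demagnification scale for the shrunken interface.
def comfortable_region(screen_w, screen_h, holding_side, scale=0.6):
    """Return (x, y, w, h) of the demagnified interface region,
    bottom-aligned for thumb reach and anchored to the holding side."""
    w, h = int(screen_w * scale), int(screen_h * scale)
    y = screen_h - h
    x = 0 if holding_side == "left" else screen_w - w
    return (x, y, w, h)

print(comfortable_region(1080, 1920, "right"))  # (432, 768, 648, 1152)
```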

TOUCH REGION PROJECTION ONTO TOUCH-SENSITIVE SURFACE
20170228092 · 2017-08-10

Examples disclosed herein relate to projecting onto a touch-sensitive surface a projection image having projected regions corresponding to target and non-target touch regions. Examples include a computing system having a touch-sensitive surface, and a camera to capture an image representing an object disposed between the camera and the touch-sensitive surface. The computing system may also include a detection engine to identify, based at least on the object represented in the image, at least one touch region of the touch-sensitive surface, and to generate a projection image including a projected region corresponding to the touch region, and a projector to project the projection image onto the touch-sensitive surface.

MISALIGNMENT DETECTION

Examples disclosed herein relate to detecting misalignment of a touch-sensitive mat. Examples include detecting corners of the touch-sensitive mat, determining a set of reference corners, performing a comparison of the detected corners of the mat with the set of reference corners, and determining a level of misalignment based on the comparison.
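The corner comparison above can be sketched as follows, under the assumption that the misalignment level is simply the mean displacement between each detected corner and its corresponding reference corner. The function and data names are illustrative, not taken from the patent.

```python
import math

def misalignment_level(detected_corners, reference_corners):
    """Mean Euclidean distance between corresponding detected and
    reference corners of the touch-sensitive mat."""
    distances = [math.dist(d, r)
                 for d, r in zip(detected_corners, reference_corners)]
    return sum(distances) / len(distances)

# Example: a mat shifted 3 px to the right of its reference position.
reference = [(0, 0), (100, 0), (100, 60), (0, 60)]
detected = [(3, 0), (103, 0), (103, 60), (3, 60)]
print(misalignment_level(detected, reference))  # 3.0
```

A real system would also have to match corners to references robustly (e.g. by nearest neighbour) before comparing them; the sketch assumes they arrive in the same order.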

METHODS AND SYSTEMS FOR PERFORMING MEDICAL PROCEDURES AND FOR ACCESSING AND/OR MANIPULATING MEDICALLY RELEVANT INFORMATION

A system permits a medical practitioner to interact with medically relevant information during a medical procedure. The system comprises: a projector for projecting a user interface menu image onto a projection surface; a three-dimensional optical imaging system for capturing three-dimensional location information for objects in a sensing volume which includes the projection surface; and a controller connected to receive the three-dimensional location information from the three-dimensional optical imaging system and configured to interpret one or more gestures made by the practitioner in a space between the three-dimensional optical imaging system and the projection surface based on a location of the gesture relative to the projected user interface menu image. The controller is connectable to a display to cause the display to render images and is configured to cause the display to render an image comprising medically relevant information, the medically relevant information based at least in part on the interpreted gesture.

DETECTION APPARATUS, DETECTION METHOD, AND SPATIAL PROJECTION APPARATUS

A detection apparatus includes at least one processing unit, a light guide optical system configured to focus projected light to a spatial image forming plane to form an image thereon, and a sensor configured to shine a laser beam over a set scanning range to detect an entry depth of a target object in a direction orthogonal to the spatial image forming plane and an entry position of the target object on the spatial image forming plane, and the processing unit executes an operation associated with the entry position when the entry depth of the target object reaches a set depth.
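A minimal sketch of the trigger logic described above: when the sensed entry depth of a target object reaches the set depth, the operation associated with the entry position on the spatial image forming plane is executed. The depth unit, region names, and bound operations are all invented for illustration.

```python
# Assumed: the sensor reports a named region on the image plane as the
# entry position, plus the entry depth past that plane.
SET_DEPTH = 10.0  # assumed unit: mm past the spatial image forming plane

# entry position -> operation associated with it
OPERATIONS = {
    "ok_button": lambda: "confirm",
    "cancel_button": lambda: "cancel",
}

def on_sensor_sample(entry_position, entry_depth):
    """Execute the bound operation once the set depth is reached."""
    if entry_depth >= SET_DEPTH and entry_position in OPERATIONS:
        return OPERATIONS[entry_position]()
    return None

print(on_sensor_sample("ok_button", 12.0))  # confirm
print(on_sensor_sample("ok_button", 4.0))   # None
```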

LIGHT SOURCE DEVICE, ELECTRONIC BLACKBOARD SYSTEM, AND METHOD OF CONTROLLING LIGHT SOURCE DEVICE
20170228103 · 2017-08-10

A light source includes an invisible light emitter that emits an invisible light beam and a visible light emitter that emits a visible light beam. A light source controller causes the light source to emit at least one light beam from among the invisible light beam and the visible light beam. A propagation controller causes an emission light beam emitted from the light source to propagate along a predetermined plane.

VIRTUAL KEYBOARD
20170228153 · 2017-08-10

Examples relate to improving typing accuracy using a virtual keyboard. One example enables detection that a key of the virtual keyboard has been pressed and identification of the finger of the user used to press the key. The key that was intended to be pressed may be determined based on the detected key, the identified finger, and a mapping of finger placement to keys of the virtual keyboard. Functionality associated with pressing the intended key may then be performed.
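The correction step can be sketched as a lookup keyed on the detected key plus the identified finger: when a finger is known to drift onto a neighbouring key, the mapping substitutes the key the user most likely intended. The specific key pairs and finger labels below are hypothetical.

```python
# Hypothetical mapping: (detected_key, finger) -> intended key.
FINGER_KEY_MAP = {
    ("j", "right_index"): "h",   # right index drifting right onto "j"
    ("j", "right_middle"): "j",
    ("g", "left_middle"): "f",   # left middle drifting right onto "g"
}

def intended_key(detected_key, finger):
    """Correct the detected key using the finger-placement mapping,
    falling back to the detected key when no correction is known."""
    return FINGER_KEY_MAP.get((detected_key, finger), detected_key)

print(intended_key("j", "right_index"))  # h
print(intended_key("q", "left_pinky"))   # q
```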

VEHICLE HUMAN INTERFACE ASSEMBLY
20170228089 · 2017-08-10

A vehicle human interface assembly has an interior trim portion of a vehicle, a flexible membrane, and a frame defining an opening. The flexible membrane extends over the opening such that the flexible membrane is in tension and is capable of being depressed by a user to form a depression in the membrane in a range of locations within the opening. One or more flexible membrane sensors are configured to determine the location at which the flexible membrane has been depressed by the user and provide a signal indicating the location at which the flexible membrane has been depressed. A controller is configured to control one or more systems of the vehicle based on the signal from the one or more flexible membrane sensors. The frame is provided in the interior trim portion such that the flexible membrane and a surrounding surface of the interior trim portion are substantially flush.