G06F3/0425

Systems and methods for non-contacting interaction with user terminals
11693557 · 2023-07-04

Systems and methods are provided to enable users to interact with user terminals having a touch screen interface without requiring the user to physically contact a surface of the touch screen interface.

IMAGE PROCESSING METHOD AND IMAGE PROCESSING DEVICE
20230004282 · 2023-01-05

An image processing method includes displaying a first screen on a display surface, the first screen including at least a part of a drawing area in which at least one object image drawn by a user is arranged; excluding, from the drawing area, an area in which at least a part of an object image included in the first screen is arranged when an operation of erasing the at least a part of the object image is received; and displaying a reduced screen on the display surface when an operation is received, the reduced screen being obtained by reducing the whole of the drawing area to the size of the display surface.
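The reduced-screen step above amounts to a uniform fit-to-screen scaling of the drawing area. A minimal sketch, assuming axis-aligned rectangular areas; the function name and argument order are hypothetical:

```python
def reduced_screen_scale(drawing_w, drawing_h, display_w, display_h):
    """Return a uniform scale factor that fits the whole drawing area
    onto the display surface, as in the reduced-screen operation."""
    if drawing_w <= 0 or drawing_h <= 0:
        raise ValueError("drawing area must have positive size")
    # Use the tighter of the two axis ratios so nothing is clipped.
    return min(display_w / drawing_w, display_h / drawing_h)
```

With a 4000×2000 drawing area and a 1920×1080 display, the horizontal ratio (0.48) is the limiting one, so the whole drawing shrinks to 48% of its size.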

IMAGE DISPLAY SYSTEM AND IMAGE DISPLAY METHOD
20230007217 · 2023-01-05

An image display system includes a projector projecting image light onto a projection surface, at least one camera picking up at least one image of the projection surface and thus acquiring at least one picked-up image, at least one microphone detecting a sound generated in an image pickup range of the camera, and a control device controlling a position or an orientation of an image displayed on the projection surface by the image light, based on the at least one picked-up image, when a target sound is detected based on the sound.

METHOD, 3D DISPLAY DEVICE AND 3D TERMINAL FOR REALIZING FLOATING TOUCH
20220413619 · 2022-12-29

A method for realizing floating touch is provided, comprising: controlling a multi-viewpoint 3D display screen to display a 3D touch object, and acquiring a floating touch position of a user relative to the multi-viewpoint 3D display screen; and generating touch trigger information when the floating touch position matches the display position of the 3D touch object. Different from 2D touch, the floating touch technology in the present disclosure can realize position matching in a 3D space to generate the touch trigger information. A 3D display device and a 3D terminal for realizing floating touch, a computer readable storage medium, a computer program product and a 3D display system are also provided.
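The 3D position matching described above can be as simple as a distance test between the detected floating touch position and the displayed position of the 3D touch object. A hedged sketch, where the coordinate frame, units, and `tolerance` value are all assumptions:

```python
import math

def matches_touch_object(touch_pos, object_pos, tolerance=0.01):
    """Generate-trigger test for floating touch: the floating touch
    position matches the 3D touch object when it lies within
    `tolerance` of the object's displayed position in 3D space."""
    dx, dy, dz = (t - o for t, o in zip(touch_pos, object_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= tolerance
```

Unlike a 2D touch test, the depth component (`dz`) participates in the match, which is the distinguishing point the abstract makes.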

Pictograms as Digitally Recognizable Tangible Controls

Concepts and technologies disclosed herein are directed to pictograms as digitally recognizable tangible controls. According to one aspect disclosed herein, a user system can include a processing component and a memory component. The memory component can include instructions of a pictogram digitization module. The user system can capture, via a camera component, an image containing a pictogram that is a digitally recognizable tangible manifestation of a digital control. The user system can determine, via the pictogram digitization module, the digital control associated with the pictogram. The user system can implement, via the pictogram digitization module, the digital control. The digital control can include a digital content, an action, or a context. The user system can create, via the pictogram digitization module, a digital interface that includes the digital control. In some embodiments, the pictogram includes a formal pictogram. In other embodiments, the pictogram includes an informal pictogram.

Communication management system, communication system, communication management device, image processing method, and non-transitory computer-readable medium

A communication management system manages a session in which a plurality of terminal apparatuses shares a stroke image. The communication management system includes circuitry configured to: manage stroke information including a plurality of pieces of stroke data representing the stroke image; receive, from a first terminal apparatus, group operation information for designating one or more pieces of stroke data, which are operation targets, from among the plurality of pieces of stroke data; and restrict, based on the group operation information, an operation regarding the one or more pieces of stroke data by a second terminal apparatus, which is different from the first terminal apparatus.
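One way to realize the restriction above is to treat the group operation as a per-stroke lock held by the designating terminal. A minimal sketch under that assumption; the class and method names are hypothetical:

```python
class StrokeSession:
    """Session state for a shared stroke image: once a terminal
    designates strokes as its operation targets, operations on those
    strokes by other terminals are restricted (here: rejected)."""

    def __init__(self):
        self.locks = {}  # stroke_id -> terminal_id holding the group

    def group(self, terminal_id, stroke_ids):
        # Record the group operation information from the first terminal.
        for sid in stroke_ids:
            self.locks[sid] = terminal_id

    def may_operate(self, terminal_id, stroke_id):
        # A second terminal may operate only on ungrouped strokes
        # or strokes it grouped itself.
        owner = self.locks.get(stroke_id)
        return owner is None or owner == terminal_id
```

A real system would also need to release the locks when the group operation ends, which the abstract does not detail.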

INPUT APPARATUS, INPUT METHOD, AND RECORDING MEDIUM RECORDING INPUT PROGRAM
20220404948 · 2022-12-22

An input apparatus includes: an operation position determination device that determines an operation position of a gesture operation by a user; a movement vector calculator that calculates a movement vector at an input position on the basis of a movement amount of the operation position when the operation position moves; an input processor that executes first input processing at the input position at the time when a first gesture operation is detected, and executes second input processing at the input position at the time when a second gesture operation is detected; and a movement vector corrector that corrects the movement vector in the case where a change from the first gesture operation to the second gesture operation is determined.
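The abstract does not specify the correction itself; one plausible reading, sketched here with hypothetical names, is that the apparent jump of the operation position at the moment the gesture changes (say, from a one-finger to a two-finger operation) is an artifact of the hand changing shape, so the movement vector for that frame is suppressed rather than applied to the input position:

```python
def movement_vector(prev_pos, cur_pos, gesture_changed):
    """Compute the movement vector at the input position from the
    change of the operation position; when a change from the first
    to the second gesture operation is determined, correct the vector
    by suppressing it for that frame."""
    if gesture_changed:
        return (0.0, 0.0)
    return (cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1])
```

Other corrections (damping, re-anchoring the reference position) would fit the claim language equally well; this is only one reading.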

Generating and rendering motion graphics effects based on recognized content in camera view finder

Systems and methods are described for providing co-presence in an augmented reality environment. The method may include receiving, within a viewing window, a multi-frame real-time visual scene captured by a camera onboard an electronic device associated with the augmented reality environment, identifying a plurality of elements of the visual scene, detecting at least one graphic indicator associated with at least one of the plurality of elements, detecting at least one boundary associated with the at least one element, and generating, in the viewing window and based on the detection of the at least one graphic indicator, Augmented Reality (AR) motion graphics within the detected boundary. In response to determining that content related to the at least one element is available, the method may include retrieving the content and visually indicating an AR tracked control on the at least one element within the viewing window.

IMAGE PROCESSING METHOD AND IMAGE PROCESSING DEVICE
20220397977 · 2022-12-15

An image processing method includes detecting trajectories of pointer contact positions on a display surface; displaying a first image on the display surface; displaying a second image having first and second portions on the display surface using the first image as a background; moving a display position of the second image on the display surface along a pointer trajectory when a starting point of the pointer trajectory is included in a display surface area on which the second image is displayed; providing a drawing effect to a portion of the first image overlapping a trajectory of the second image when the starting point is included in an area in which the first portion is displayed; and keeping the display of the portion of the first image overlapping the trajectory of the second image when the starting point is included in an area in which the second portion is displayed.
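The branch on where the drag started can be summarized in a small dispatcher. A sketch under the assumption that the two portions are axis-aligned rectangles given as (x, y, w, h) tuples; the function names and return labels are hypothetical:

```python
def contains(area, point):
    """True if `point` (x, y) lies inside rectangle `area` (x, y, w, h)."""
    ax, ay, aw, ah = area
    px, py = point
    return ax <= px < ax + aw and ay <= py < ay + ah

def drag_effect(start, first_area, second_area):
    """Decide what happens to the first (background) image along the
    second image's trajectory: a drawing effect if the drag started on
    the first portion, preservation of the background if it started on
    the second portion, and no move at all otherwise."""
    if contains(first_area, start):
        return "draw"
    if contains(second_area, start):
        return "keep"
    return None
```

This mirrors the abstract's logic: the second image acts like a tool whose behavior depends on which of its two portions the user grabbed.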

Interactive environment with virtual environment space scanning

An interactive environment image may be displayed in a virtual environment space, and interaction with the interactive environment image may be detected within a three-dimensional space that corresponds to the virtual environment space. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is displayed to provide a visual representation of an interactive environment image including one or more virtual objects, which may be spatially positioned. User interaction with the visual representation in the virtual environment space may be detected and, in response to user interaction, the interactive environment image may be changed.