Patent classifications
G06F3/0346
Prestaging, gesture-based, access control system
A prestaging, gesture-based, access control system includes a local access assembly, a mobile device, a storage medium, and a processor. The assembly includes a controller to effect actuation between access and no-access states. The mobile device is carried by a user and includes a detection system configured to detect a prestaging event inherently performed by the user toward an intent to gain access, followed by detection of a primary intentional gesture specifically performed by the user toward the intent to gain access. The storage medium and the processor are configured to receive prestaging event information and primary intentional gesture information from the detection system, and to execute an application that determines the performance of the prestaging event from the prestaging event information and then, if the prestaging event is determined to have occurred, determines the performance of the primary intentional gesture from the primary intentional gesture information.
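The two-stage determination described above can be sketched as a small state machine: the primary intentional gesture is only evaluated once the prestaging event has been detected. This is a minimal illustrative sketch; the class, score fields, and thresholds are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    prestage_score: float   # confidence that the inherent prestaging event occurred
    gesture_score: float    # confidence of the primary intentional gesture

class AccessController:
    PRESTAGE_THRESHOLD = 0.8
    GESTURE_THRESHOLD = 0.9

    def __init__(self):
        self.prestaged = False
        self.state = "no-access"

    def process(self, sample: SensorSample) -> str:
        # Stage 1: look for the prestaging event first.
        if not self.prestaged:
            if sample.prestage_score >= self.PRESTAGE_THRESHOLD:
                self.prestaged = True
            return self.state
        # Stage 2: only after prestaging, consider the intentional gesture.
        if sample.gesture_score >= self.GESTURE_THRESHOLD:
            self.state = "access"   # controller actuates to the access state
        return self.state
```

Note that a strong gesture score before any prestaging event leaves the state unchanged, which reflects the ordering the abstract requires.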
INFORMATION PROCESSING DEVICE, PROGRAM, AND METHOD
There is a demand for an information processing device capable of determining a position where a user attempts to perform an operation on a screen of the information processing device, without impairing usability. Therefore, the present disclosure proposes an information processing device including: an acceleration sensor unit that detects a tilt of the information processing device; a display unit; a gyroscope sensor unit that measures angular velocity of the information processing device; and a determination unit that, in response to detection of a first tilt of the information processing device, determines a position where a user attempts to perform an operation on the display unit on the basis of a first gyro waveform obtained from the angular velocity and either a second gyro waveform measured in advance at each operation position of the information processing device or a learning model generated by using the second gyro waveform as training data. In accordance with the determined position, the display unit further displays the screen while reducing it by a predetermined amount or moving it by a predetermined amount in a predetermined direction.
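One simple reading of the determination unit is template matching: compare the measured first gyro waveform against the second gyro waveforms pre-measured at each operation position and pick the closest. The positions, template values, and sum-of-squares distance below are illustrative assumptions only.

```python
def waveform_distance(a, b):
    # Sum-of-squares distance between two equal-length gyro waveforms.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def determine_position(first_waveform, templates):
    """templates: mapping of operation position -> pre-measured second gyro waveform."""
    return min(templates, key=lambda pos: waveform_distance(first_waveform, templates[pos]))

# Hypothetical pre-measured second gyro waveforms per operation position.
TEMPLATES = {
    "upper-left":  [0.9, 0.7, 0.2],
    "lower-right": [-0.8, -0.6, -0.1],
}

measured = [0.85, 0.65, 0.25]   # first gyro waveform derived from the angular velocity
position = determine_position(measured, TEMPLATES)
# The display unit would then reduce or move the screen toward `position`.
```

The abstract's alternative (a learning model trained on the second gyro waveforms) would replace `determine_position` with a trained classifier, but the input/output contract stays the same.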
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING TERMINAL, AND PROGRAM
An information processing device is provided which includes a first acquisition unit (214) configured to acquire a control command that is input by a first user and corresponds to presentation unit information designating a presentation unit for presenting a tactile stimulus by a tactile presentation device and to mode information designating a mode of the tactile stimulus; a generation unit (218) configured to generate a tactile control signal for presenting the tactile stimulus to the presentation unit in accordance with the control command; and a first distribution unit (222) configured to distribute the tactile control signal to the tactile presentation device worn on a body of a second user.
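The acquire-generate-distribute pipeline above can be sketched as three small functions. The mode table, field names, and the stubbed device are illustrative assumptions; the numeric unit labels (214, 218, 222) follow the abstract.

```python
# Hypothetical modes of the tactile stimulus and their drive parameters.
MODES = {
    "pulse": {"frequency_hz": 150, "duration_ms": 80},
    "buzz":  {"frequency_hz": 250, "duration_ms": 400},
}

def acquire_command(presentation_unit, mode):
    # First acquisition unit (214): package the first user's input.
    return {"unit": presentation_unit, "mode": mode}

def generate_signal(command):
    # Generation unit (218): build the tactile control signal from the command.
    params = MODES[command["mode"]]
    return {"target_unit": command["unit"], **params}

def distribute(signal, worn_device):
    # First distribution unit (222): deliver the signal to the second user's
    # worn tactile presentation device (modeled here as a simple queue).
    worn_device.append(signal)
    return worn_device
```

A usage pass would be: acquire a command naming a body location and a mode, generate the control signal, then distribute it to the worn device.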
Integration of a two-dimensional input device into a three-dimensional computing environment
A workstation enables operation of a 2D input device with a 3D interface. A cursor position engine determines the 3D position of a cursor controlled by the 2D input device as the cursor moves within a 3D scene displayed on a 3D display. The cursor position engine determines the 3D position of the cursor for a current frame of the 3D scene based on a current user viewpoint, a current mouse movement, a CD gain value, a Voronoi diagram, and an interpolation algorithm, such as the Laplacian algorithm. A CD gain engine computes CD gain optimized for the 2D input device operating with the 3D interface. The CD gain engine determines the CD gain based on specifications for the 2D input device and the 3D display. The techniques performed by the cursor position engine and the techniques performed by the CD gain engine can be performed separately or in conjunction.
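As a rough illustration of a CD (control-display) gain engine driven by device and display specifications, the sketch below derives usable gain bounds the way the HCI literature commonly does, which is not necessarily the patent's exact method: the gain must be low enough that one mouse count cannot jump past the smallest selectable target, and high enough that the full display can be crossed within the available desk space. All parameter values are hypothetical.

```python
def cd_gain_bounds(mouse_cpi, display_width_m, display_width_px,
                   smallest_target_m, operating_range_m):
    """Return (min_gain, max_gain) for a 2D device driving a given display."""
    # Metres of cursor travel represented by a single mouse count.
    count_m = 0.0254 / mouse_cpi
    # Upper bound: one count of motion must not overshoot the smallest target
    # (quantization limit set by the device resolution).
    max_gain = smallest_target_m / count_m
    # Lower bound: the whole display width must be reachable within the
    # physical operating range, without clutching.
    min_gain = display_width_m / operating_range_m
    return min_gain, max_gain

# Hypothetical specifications: 1600 CPI mouse, 60 cm wide display,
# 2 mm smallest target, 30 cm of desk space.
lo, hi = cd_gain_bounds(1600, 0.60, 2560, 0.002, 0.30)
```

An optimized gain for the 3D interface would then be chosen inside `[lo, hi]`; the pixel count would additionally matter if the smallest target were specified in pixels rather than metres.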
Systems for leveraging device orientation to modify visual characteristics of edits
Device orientation techniques and systems are described that support the ability of a computing device to modify visual characteristics of edits to digital objects appearing in a user interface of a mobile device based on the orientation of the computing device itself. A user interaction, for instance, is detected in a user interface of a device. Responsive to the detected user interaction, a multiaxial orientation of the device is determined. In some examples, the orientation is translated into tilt data, such as in the form of altitude and azimuth values. A relative amount is calculated based on the orientation. The relative amount is applied to a visual characteristic of an edit to a digital object. Once the relative amount is applied, the digital object having the relative amount applied is displayed in the user interface.
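The orientation-to-edit pipeline can be sketched in a few lines: translate a multiaxial orientation (here modeled as pitch and roll, in radians) into altitude/azimuth tilt data, normalize that into a relative amount, and apply it to a visual characteristic (here, opacity). The mapping and the 0-to-1 normalization are assumptions for illustration.

```python
import math

def to_tilt(pitch, roll):
    # Altitude: magnitude of the tilt away from flat (0 = flat, pi/2 = vertical).
    altitude = math.hypot(pitch, roll)
    # Azimuth: direction of the tilt in the device plane.
    azimuth = math.atan2(roll, pitch)
    return altitude, azimuth

def relative_amount(altitude, max_altitude=math.pi / 2):
    # Normalize the tilt into a 0..1 relative amount.
    return min(altitude / max_altitude, 1.0)

def apply_to_edit(base_opacity, amount):
    # Apply the relative amount to the edit's visual characteristic.
    return base_opacity * amount
```

In this sketch the azimuth is computed but unused; a fuller example might map it to, say, a gradient direction while altitude controls strength.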
Using an image sensor for always-on application within a mobile device
A mobile device includes an application processor and an image sensor. The application processor includes an imaging subsystem configured to process high resolution image data through a first interface and a sensor hub configured to process sensor data through a second interface. The image sensor operates in one of first and second modes. During the first mode, the image sensor is configured to capture the high resolution image data in response to a request from the imaging subsystem, and the imaging subsystem is configured to access the high resolution image data using the first interface for performing a first operation. During the second mode, the image sensor is configured to capture low resolution image data, and the sensor hub is configured to access the low resolution image data using the second interface for performing a second operation.
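A minimal sketch of the two operating modes: a request from the imaging subsystem puts the sensor in the high-resolution mode served over the first interface, while the always-on path keeps it in the low-resolution mode served to the sensor hub over the second interface. The interface labels and pixel counts are illustrative assumptions.

```python
class ImageSensor:
    def __init__(self):
        self.mode = "low_res"   # default: always-on, low-power second mode

    def request_high_res(self):
        # Request from the imaging subsystem switches to the first mode.
        self.mode = "high_res"

    def release(self):
        # Imaging subsystem finished; fall back to the always-on second mode.
        self.mode = "low_res"

    def capture(self):
        if self.mode == "high_res":
            # First mode: full-resolution frame over the first interface.
            return {"interface": "first (imaging subsystem)", "pixels": 12_000_000}
        # Second mode: small frame over the second interface to the sensor hub.
        return {"interface": "second (sensor hub)", "pixels": 76_800}
```

The design point this mirrors is that the always-on path never touches the high-bandwidth first interface, so it can run continuously at low power.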
User terminal apparatus and management method of home network thereof
An example user terminal apparatus includes communication circuitry configured to be connected to a home network comprising a plurality of devices; a display configured to display a UI screen for managing the home network; a sensor configured to sense a user manipulation of the UI screen; and processing circuitry configured to change the UI screen displayed on the display according to the user manipulation. The UI screen is one of a plurality of service pages that are changeable according to a user manipulation in a first direction, the plurality of service pages being pages for respectively providing different home network management services. At least one of the plurality of service pages comprises an area that is displayable on the display according to a user manipulation in a second direction.
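The two-axis navigation in this abstract can be sketched directly: a manipulation in the first direction switches between service pages, and a manipulation in the second direction reveals areas of the current page not yet displayed. Page names and contents below are hypothetical.

```python
class HomeNetworkUI:
    def __init__(self, pages):
        self.pages = pages   # list of service pages, each a list of displayable areas
        self.page = 0        # index of the current service page
        self.area = 0        # index of the currently displayed area within the page

    def swipe_first_direction(self):
        # Switch to the next service page (wrapping); start at its first area.
        self.page = (self.page + 1) % len(self.pages)
        self.area = 0

    def swipe_second_direction(self):
        # Reveal the next area of the current page, if one exists.
        if self.area + 1 < len(self.pages[self.page]):
            self.area += 1

    def visible(self):
        return self.pages[self.page][self.area]
```

Each service page providing a different management service maps to one entry in `pages`; the second-direction scroll models the "area that is displayable on the display" clause.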