Patent classifications
G06F1/1692
User terminal device and displaying method thereof
A user terminal device and a controlling method thereof are provided. The user terminal device includes a display configured to be divided into a first area and a second area, which is larger than the first area, with reference to a folding line, a cover disposed on a rear side of the display, a detector configured to detect a user interaction on the display and the cover, and a controller configured to, in response to the display being folded along the folding line such that the first area and the second area face each other, control the detector to detect a user interaction through an exposure area, which is an exposed part of the second area, and the cover, and, in response to the display being folded such that the two parts of the cover face each other with reference to the folding line, control the detector to detect a user interaction through the first area and the second area.
Electronic device and mode switching method thereof
An electronic device is provided, including first and second bodies, a processing module, a touch display panel, and at least one sensing unit. The second body is rotatably connected to the first body. The processing module is disposed in the first body or the second body. The touch display panel is disposed on the second body, is coupled to the processing module, and has a main display part and first and second display parts. The sensing unit is disposed in the first body or the second body and is coupled to the processing module. When the sensing unit detects that the first and second bodies are folded relative to each other, the processing module switches to a first mode. In the first mode, the first and second display parts are adapted to synchronously display or individually display a first message, a second message, or a third message.
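The fold-triggered mode switch described above can be sketched as a small state machine. This is a minimal illustrative sketch, assuming a boolean fold signal from the sensing unit; the class and mode names are hypothetical and not taken from the patent.

```python
class FoldModeController:
    """Hypothetical sketch of the fold-driven mode switch in the abstract."""

    MODE_NORMAL = "normal"  # bodies unfolded: ordinary display behavior
    MODE_FIRST = "first"    # bodies folded: first/second display parts active

    def __init__(self):
        self.mode = self.MODE_NORMAL

    def on_sensor_update(self, bodies_folded: bool) -> str:
        # When the sensing unit reports the bodies are folded relative
        # to each other, the processing module switches to the first mode.
        self.mode = self.MODE_FIRST if bodies_folded else self.MODE_NORMAL
        return self.mode
```

In the first mode, the first and second display parts would then render their messages synchronously or individually, depending on higher-level policy not modeled here.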
DISPLAY ELEMENT DISPLAY METHOD AND ELECTRONIC DEVICE
A display element display method and an electronic device are provided. The method is applied to an electronic device including a first body and a second body. The first body is bendable relative to the second body. The first body and the second body respectively correspond to different display areas of the electronic device. The method includes: The electronic device detects a status of the first body and a status of the second body (301); determines a main interaction area and a main display area based on the status of the first body and the status of the second body (302); obtains one or more display elements on a to-be-displayed user interface (303); determines a display element type, where the display element type includes a main interaction element and a main display element (304); and displays the main interaction element in the main interaction area and displays the main display element in the main display area (306). The method helps a user better operate the electronic device, and also helps the user view content displayed by the electronic device.
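The area-selection and element-routing steps (302)–(306) above can be sketched as a simple classifier. This is an assumption-laden illustration: the body statuses ("flat"/"upright"), area names, and the rule that the flat body hosts interaction are hypothetical, not from the patent.

```python
def layout_elements(first_body_status, second_body_status, elements):
    """Route display elements to areas based on body statuses (sketch).

    elements is a list of (name, kind) pairs, where kind is
    "interaction" or "display" — the two element types named in
    step (304) of the abstract.
    """
    # Step (302): pick the main interaction and main display areas.
    # Assumed rule: a flat (e.g. table-resting) body is easier to touch,
    # so it hosts the main interaction area.
    if first_body_status == "flat":
        interaction_area, display_area = "first", "second"
    else:
        interaction_area, display_area = "second", "first"

    # Steps (303)-(306): place each element by its type.
    placement = {}
    for name, kind in elements:
        placement[name] = interaction_area if kind == "interaction" else display_area
    return placement
```

A call such as `layout_elements("flat", "upright", [("keyboard", "interaction"), ("video", "display")])` would put the keyboard on the first body's area and the video on the second body's area under these assumptions.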
TOUCHPAD NAVIGATION FOR AUGMENTED REALITY DISPLAY DEVICE
Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras, and a generally vertically arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and, in response to the content capture user input, capturing a new content item corresponding to the scene.
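The carousel interaction above (horizontal swipe rotates, tap selects) can be sketched as follows. The class name, one-position-per-swipe rule, and direction convention are illustrative assumptions, not details of the patent.

```python
class EffectsCarousel:
    """Hypothetical sketch of the AR effects icon carousel."""

    def __init__(self, icons):
        self.icons = list(icons)
        self.selected = 0  # index of the icon in the selection position

    def on_horizontal_input(self, dx: float):
        # A horizontal input on the touchpad rotates the carousel by one
        # position; the sign of dx picks the direction (assumed convention).
        step = 1 if dx > 0 else -1
        self.selected = (self.selected + step) % len(self.icons)

    def on_touch_input(self):
        # A touch input selects the icon currently in the selection
        # position; the chosen AR effect would then be applied to the
        # camera scene before capture.
        return self.icons[self.selected]
```
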
DYNAMIC MARINE DISPLAY SYSTEMS AND METHODS
Techniques are disclosed for systems and methods to provide dynamic display systems for mobile structures. A dynamic marine display system includes a user interface comprising a primary display and secondary display, where the secondary display is disposed along and physically separate from an edge of the primary display, and where the secondary display comprises a touch screen display configured to render pixelated display views and receive user input as one or more user touches and/or gestures applied to a display surface of the secondary display. A logic device is configured to receive user selection of an operational mode associated with the user interface and/or the mobile structure and render a primary display view via the primary display and/or a secondary display view via the secondary display corresponding to the received user selection and/or operational mode.
Electronic device including a plurality of displays and method for operating same
Various embodiments may provide an electronic device including: a housing; a first display configured to be slidable through the housing, wherein at least a portion of the first display is exposed to an outside through the housing, and a region of the first display, exposed to the outside, is capable of being changed based on sliding of the first display through the housing; a second display spaced a certain distance apart from the exposed at least portion of the first display and disposed to form a flat surface with the exposed at least portion of the first display; and at least one processor disposed in the housing, wherein the at least one processor is configured to display first content on the first display and second content on the second display. Various other embodiments are possible.
WEARABLE ELECTRONIC DEVICE HAVING HEAT DISSIPATION STRUCTURE
According to an embodiment disclosed in the present document, an electronic device may include: a lens frame at least partially including a thermally conductive material; a pair of wearable members rotatably coupled to the lens frame; at least one lens disposed in the lens frame; and transparent conductive lines disposed on the lens. The transparent conductive lines may be connected to the thermally conductive material of the lens frame to receive heat transferred from the lens frame. Various other embodiments are possible.
Simultaneous Use of a Capacitance-Based Track Pad
An apparatus may include a capacitance-based trackpad, a tracking driver in communication with the capacitance-based trackpad, a key driver in communication with the capacitance-based trackpad, a processor, and a memory having programmed instructions that, when executed, may cause the processor to modify the raw track inputs to associate a non-confidence indicator with at least one raw track input from the track inputs to form processed track inputs, send the processed track inputs to the tracking driver, and send the processed track inputs to the key driver. The tracking driver may be configured to receive raw track inputs from the capacitance-based trackpad, and the key driver may be configured to receive raw key inputs from the capacitance-based trackpad.
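The data flow above (tag raw track inputs with a non-confidence indicator, then fan the processed inputs out to both drivers) can be sketched as two small functions. The field names and the example confidence predicate are hypothetical assumptions for illustration only.

```python
def process_track_inputs(raw_inputs, is_confident):
    """Associate a non-confidence indicator with raw track inputs (sketch).

    raw_inputs: list of dicts describing touch points.
    is_confident: predicate deciding whether a point is a deliberate
    track input (e.g. rejecting a resting palm) — an assumed policy.
    """
    processed = []
    for point in raw_inputs:
        entry = dict(point)
        # Inputs the predicate rejects carry the non-confidence indicator.
        entry["non_confident"] = not is_confident(point)
        processed.append(entry)
    return processed


def dispatch(processed, tracking_driver, key_driver):
    # The same processed track inputs are sent to both the tracking
    # driver and the key driver, as the abstract describes.
    tracking_driver(processed)
    key_driver(processed)
```

Each driver can then honor or ignore flagged points according to its own role (cursor tracking vs. virtual key detection).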
Display unit and its manufacturing method
A display unit that achieves reduced thickness and weight by eliminating the void between a touch panel and a display panel, and its manufacturing method. The entire facing surfaces of the touch panel and the display panel are directly bonded together with an adhesive layer in between. The display panel has a structure in which a driving substrate, on which organic light emitting devices are formed, and a sealing substrate are bonded together with an adhesive layer in between.
GESTURE DETECTION USING PIEZO-ELECTRIC ACTUATORS
A gesture detection system comprising a virtual button structure for mounting in an outer frame of a mobile device for detecting finger gestures by a user. First and second piezo-electric actuators are in contact with the virtual button structure and are configured to generate first and second varying electrical signals, respectively, in response to a dynamic force application to the virtual button structure. A processor is configured to execute instructions stored in memory to i) determine a magnitude and a position of the dynamic force application on the virtual button structure over time, based on the first varying electrical signal and the second varying electrical signal; ii) determine a gesture corresponding to the magnitude and the position of the dynamic force application over time; and iii) provide a response signal based on the gesture.
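Step i) above, recovering a magnitude and position from two sensor signals, can be illustrated with a simple weighted-centroid estimate. This is a sketch under a strong simplifying assumption (each actuator's amplitude falls off linearly with distance from the touch point), which is not necessarily the model the patent uses; all names are hypothetical.

```python
def estimate_force(v1: float, v2: float, x1: float = 0.0, x2: float = 1.0):
    """Estimate (magnitude, position) of a force from two piezo amplitudes.

    v1, v2: signal amplitudes from the first and second actuators.
    x1, x2: actuator positions along the virtual button (assumed layout).
    """
    total = v1 + v2
    if total == 0:
        return 0.0, None  # no force applied
    # Weighted centroid: a touch nearer an actuator drives its signal
    # higher, pulling the estimated position toward that actuator.
    position = (v1 * x1 + v2 * x2) / total
    return total, position
```

Sampling this estimate over time yields the magnitude/position trajectory that step ii) would match against gesture templates (e.g. a swipe along the frame vs. a press).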