Patent classifications
G06F2203/04805
A METHOD FOR LAYOUT AND SELECTION OF THE MENU ELEMENTS IN MAN-MACHINE INTERFACE
The present invention relates to a method, operated by a control unit, that enables the user to select any menu element and switch between menus on the main display of a machine to be interacted with, or within an application installed on such a device. With this method, the menu elements can be laid out on the display according to the level clusters of a cone-shaped function with any set of parameters, and the positioning of the menu elements can also be adjusted according to the position of the pointer (M) after the user interacts with the machine. Upon interaction with the machine, the menu element or elements corresponding to the position of the pointer (M) are magnified. Elements the pointer moves away from are reduced in size, while elements it approaches are magnified. Moreover, the position of the elements can be shifted in the direction of the pointer's movement, or in the reverse direction, depending on the position of the pointer (M).
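The size behavior described in this abstract resembles a classic fisheye menu. A minimal sketch of that idea, not the patented method itself (the function name, the linear falloff, and all parameters are illustrative assumptions):

```python
# Fisheye-style sketch: menu elements near the pointer grow, distant
# ones shrink back to their base size. Linear falloff is an assumption;
# the patent's cone-shaped function is not reproduced here.

def fisheye_sizes(centers, pointer, base=24.0, max_scale=2.0, falloff=100.0):
    """Return a display size for each menu element center (1-D layout).

    Size peaks at max_scale * base when the pointer is on an element
    and decays linearly to base once the distance reaches `falloff`.
    """
    sizes = []
    for c in centers:
        d = abs(c - pointer)
        scale = 1.0 + (max_scale - 1.0) * max(0.0, 1.0 - d / falloff)
        sizes.append(base * scale)
    return sizes
```

Re-running this on every pointer-move event produces the described effect: elements the pointer converges on are magnified while the ones it deviates from shrink.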
INTELLIGENT INTERACTION METHOD AND DEVICE, AND STORAGE MEDIUM
Provided are an intelligent interaction device and method, and a non-transitory computer readable storage medium. The method includes: displaying, on a touch screen, a current window of a multimedia file in a playing state; displaying, in response to an instruction from a user for zooming the current window, a zoomed window of the current window at a first predetermined position of the current window, the zoomed window being smaller than the current window; and displaying, in response to an annotation operation performed by the user for the zoomed window, an annotation in the zoomed window, and updating the current window by displaying the annotation in the current window.
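One way to realize the annotation mirroring this abstract describes is to store annotations in coordinates normalized to the window, so a stroke drawn in the small zoomed window maps directly back onto the full-size window. A hedged sketch under that assumption (class and method names are illustrative, not from the patent):

```python
# Sketch of annotation mirroring: points drawn in the smaller zoomed
# window are stored normalized to (0..1), so the same annotation can be
# projected back onto the full-size current window.

class AnnotatedWindow:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.annotations = []          # normalized (x, y) points

    def annotate_in_zoomed(self, x, y, zoom_w, zoom_h):
        """Record a point drawn in a zoomed copy of this window."""
        self.annotations.append((x / zoom_w, y / zoom_h))

    def render_points(self):
        """Project stored annotations back into full-window pixels."""
        return [(nx * self.width, ny * self.height)
                for nx, ny in self.annotations]
```

The normalization makes the "updating the current window by displaying the annotation" step a pure re-projection rather than a second drawing operation.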
OVERLAYING AUGMENTED REALITY (AR) CONTENT WITHIN AN AR HEADSET COUPLED TO A MAGNIFYING LOUPE
A computer-implemented method for displaying augmented reality (AR) content within an AR device coupled to one or more loupe lenses comprising: obtaining calibration parameters defining a magnified display portion within a display of the AR device, wherein the magnified display portion corresponds to boundaries encompassing the one or more loupe lenses; receiving the AR content for display within the AR device; and rendering the AR content within the display, wherein the rendering the AR content comprises: identifying a magnified portion of the AR content to be displayed within the magnified display portion, and rendering the magnified portion of the AR content within the magnified display portion.
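The calibration step above amounts to a rectangle (or other boundary) on the AR display inside which content is rendered magnified. A simplified sketch of the partitioning, assuming an axis-aligned rectangular loupe boundary (the real calibration is more general; all names here are hypothetical):

```python
# Illustrative split: content points falling inside the calibrated loupe
# boundary go to the magnified render path; everything else is rendered
# normally. An axis-aligned rectangle stands in for the real calibration.

def split_for_loupe(points, loupe_rect):
    """loupe_rect = (x0, y0, x1, y1) in display coordinates."""
    x0, y0, x1, y1 = loupe_rect
    magnified, normal = [], []
    for (x, y) in points:
        target = magnified if x0 <= x <= x1 and y0 <= y <= y1 else normal
        target.append((x, y))
    return magnified, normal
```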
PORTABLE ELECTRONIC DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR DISPLAYING ELECTRONIC DOCUMENTS AND LISTS
In a computer-implemented method, a portion of an electronic document is displayed on a touch screen display of a portable multifunction device. The displayed portion has a vertical position and a horizontal position in the electronic document. An object is detected on or near the displayed portion of the electronic document. In response to detecting the object, a vertical bar and a horizontal bar are displayed on top of the displayed portion. The vertical bar has a vertical position on top of the displayed portion that corresponds to the vertical position in the electronic document of the displayed portion. The horizontal bar has a horizontal position on top of the displayed portion that corresponds to the horizontal position in the electronic document of the displayed portion. After a predetermined condition is met, display of the vertical bar and of the horizontal bar is ceased.
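The bars described here track the displayed portion's position proportionally; the same mapping serves both the vertical and horizontal bar. A minimal sketch of that proportional mapping (function name and clamping are assumptions; the "predetermined condition" that hides the bars, e.g. a timeout, is not modeled):

```python
def bar_fraction(doc_offset, doc_size, view_size):
    """Fraction (0..1) along the view at which the indicator bar sits,
    mirroring the displayed portion's offset within the document."""
    scrollable = max(doc_size - view_size, 1)
    return min(max(doc_offset / scrollable, 0.0), 1.0)
```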
DWELL TIME RECORDING OF DIGITAL IMAGE REVIEW SESSIONS
Systems and methods describe dwell time recording of digital image review sessions. The system displays, at a user interface (UI), a portion of an image on at least one monitor, where the image is segmented into a multitude of patches. The system then receives UI events involving a change in the currently displayed patches. For each of the UI events, the system records one or more dwell times representing durations for which the current patches of the image were displayed. The system also receives a report associated with the image review session, and processes the text of the report to determine a classification label for the image. Finally, the system trains a machine learning model, using at least the recorded dwell times and the classification label for the image.
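The dwell-time bookkeeping described above can be sketched as follows: each UI event closes out the dwell interval for the patches that were on screen since the previous event. Timestamps are passed in explicitly to keep the example deterministic; all names are illustrative assumptions, not the patented system:

```python
from collections import defaultdict

class DwellRecorder:
    """Accumulates per-patch dwell times across UI events."""

    def __init__(self, start_time, visible_patches):
        self.last_time = start_time
        self.visible = set(visible_patches)
        self.dwell = defaultdict(float)    # patch id -> seconds

    def on_event(self, time, visible_patches):
        # Credit the elapsed interval to every patch that was visible.
        elapsed = time - self.last_time
        for p in self.visible:
            self.dwell[p] += elapsed
        self.last_time = time
        self.visible = set(visible_patches)
```

The resulting per-patch dwell times are exactly the kind of feature the abstract pairs with report-derived labels to train a model.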
CONTROL METHOD FOR MAGNIFYING DISPLAY SCREEN AND ASSOCIATED DISPLAY SYSTEM
A display system includes a display device. The display device is arranged to receive a video signal and a control signal from a host system, and includes a processing circuit and a display screen, wherein the processing circuit is arranged to process an original frame corresponding to the video signal according to the control signal, to generate a magnified frame, and generate a processed frame according to the magnified frame and the original frame, and the display screen is coupled to the processing circuit, and is arranged to display the processed frame.
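The core of generating a magnified frame from the original frame can be sketched with nearest-neighbor upscaling of the selected region; the processing circuit would then composite this result with the original frame to form the processed frame. A toy sketch with pixels as nested lists (the function name and zoom scheme are assumptions, not the patent's circuit design):

```python
def magnify_region(frame, rect, zoom=2):
    """Nearest-neighbor magnification of the region frame[y0:y1][x0:x1].

    `frame` is a list of rows (lists) of pixel values;
    `rect` = (x0, y0, x1, y1) in frame coordinates.
    """
    x0, y0, x1, y1 = rect
    out = []
    for y in range(y0, y1):
        row = []
        for x in range(x0, x1):
            row.extend([frame[y][x]] * zoom)   # repeat horizontally
        for _ in range(zoom):                  # repeat vertically
            out.append(list(row))
    return out
```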
ADJUSTABLE MAGNIFIER FOR VIRTUAL DESKTOP
Aspects include a computing device, method and computer readable medium for controlling a magnifier in a virtual desktop session. A disclosed computing device includes: a memory storing instructions for controlling a magnifier in a virtual desktop session; and a processor coupled to the memory and configured to execute the instructions to perform processes including: initiating the magnifier in the virtual desktop session to magnify a portion of content displayed on a client device; and in response to detecting a change in at least one of an orientation of the client device or a distance between the client device and a user: adjusting a magnification level of the magnifier on the portion of the content or shifting the magnifier to magnify a distinct portion of the content displayed on the client device.
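One plausible realization of "adjusting a magnification level" in response to a distance change is a clamped linear mapping from device-to-user distance to zoom level. A sketch under that assumption (the function name and all thresholds are illustrative, not from the patent):

```python
def magnification_for_distance(distance_cm, near=30.0, far=60.0,
                               min_level=1.0, max_level=4.0):
    """Map device-to-user distance to a magnifier level: zoom grows
    linearly between `near` and `far`, clamped at the endpoints."""
    t = (distance_cm - near) / (far - near)
    t = min(max(t, 0.0), 1.0)
    return min_level + t * (max_level - min_level)
```

Re-evaluating this whenever the client reports a new distance (or orientation-derived distance estimate) yields the described adjustment behavior.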
Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement
One or more processors of an electronic device detect a communication device electronically in communication with a content presentation companion device operating as a primary display for the electronic device and including a first ultra-wide band component. The one or more processors determine, with a second ultra-wide band tag component carried by the electronic device, a distance between the electronic device and the content presentation companion device using an ultra-wide band ranging process. The one or more processors dynamically enhance a user interface of the content presentation companion device as a function of the distance between the electronic device and the content presentation companion device. The enhancing magnifies a user interface element being presented on the content presentation companion device.
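A standard UWB ranging scheme behind abstracts like this is two-way ranging: distance is the speed of light times half the round-trip time after subtracting the responder's reply delay, and the UI scale is then derived from that distance. A hedged sketch (the ranging formula is standard; `ui_scale` and its parameters are illustrative assumptions):

```python
C = 299_792_458.0  # speed of light, m/s

def uwb_distance(t_round, t_reply):
    """Single-sided two-way ranging: time of flight is half of the
    round-trip time minus the responder's reply delay (seconds)."""
    tof = (t_round - t_reply) / 2.0
    return C * tof

def ui_scale(distance_m, base=1.0, per_meter=0.5):
    """Grow UI elements as viewing distance increases beyond 1 m
    (an illustrative mapping, not the patented enhancement)."""
    return base + per_meter * max(distance_m - 1.0, 0.0)
```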