Patent classifications
G06F3/04847
DISPLAY DATA OBTAINING METHOD, ELECTRONIC DEVICE AND STORAGE MEDIUM
A method is provided for obtaining display data. The method includes: obtaining rendering data corresponding to an application program surface; obtaining, in response to detecting an operation of adjusting a dimension of the application program surface, a transformation coefficient corresponding to the application program surface, where the transformation coefficient is the ratio of the surface's dimension after adjustment to its dimension before adjustment; and obtaining display data of a masking layer corresponding to the application program surface according to the transformation coefficient and the rendering data.
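The scaling step described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names and the use of (x, y) point lists to stand in for rendering data are assumptions made here for clarity.

```python
# Hypothetical sketch of the described method: the transformation
# coefficient is the ratio of the surface dimension after adjustment
# to the dimension before adjustment, and the masking layer's display
# data is derived by applying that coefficient to the rendering data.

def transformation_coefficient(dim_before: float, dim_after: float) -> float:
    """Scale value between the adjusted and original surface dimension."""
    return dim_after / dim_before

def mask_layer_display_data(render_points, coeff: float):
    """Scale each (x, y) rendering coordinate by the coefficient."""
    return [(x * coeff, y * coeff) for x, y in render_points]

coeff = transformation_coefficient(400.0, 200.0)  # surface shrunk to half size
print(coeff)                                      # 0.5
print(mask_layer_display_data([(100.0, 80.0)], coeff))  # [(50.0, 40.0)]
```

In this toy reading, the same coefficient that resizes the surface is reused to resize the mask, so the mask stays registered with the surface during the adjustment.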
HANDHELD TATTOO DEVICE WITH INTEGRATED BATTERY POWER SOURCE, CONTROL CIRCUITRY, AND USER INTERFACE WITH TOUCH SENSOR
A handheld tattoo device comprises an elongated body comprising a grip section and an upper section above the grip section. The grip section comprises a coupling end configured to removably couple with a needle module comprising one or more needles. The upper section comprises a needle actuator for actuating the one or more needles through a shaft extending through the grip section, and a battery power source for supplying electrical power to the needle actuator. The device also comprises a user interface on the upper section, the user interface comprising a display and a touch sensor for detecting finger gestures of a user; and control circuitry in the upper section for controlling the power supplied to the actuator by the battery power source based on user input received through the user interface.
Method for user interaction for data manipulation in a CAE/CAD system
A method serves for user interaction in a CAE/CAD system for designing physical parts, the parts being components shaped by a forming process or tools used in a forming process. The method comprises displaying to a user a graphical user interface (2) with a model display region (3) and a control region (5) for displaying widgets (7) for modifying control parameters that control operation of the CAE/CAD system, and, on the basis of user input actions in the control region (5), specifying control parameters (14) and modifying the part model accordingly.
Each control parameter (15) corresponds to a geometric feature (17) of the graphical model representation (4) displayed in the model display region. For each control parameter (15), the corresponding widget (7) and geometric feature (17) are marked by visual markers (17, 18) in the same manner, allowing them to be differentiated from those of other control parameters (15).
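The pairing mechanism in this abstract, giving each parameter's widget and geometric feature the same marker, can be sketched as a simple assignment. This is an illustrative reading only; the parameter names and colour palette below are hypothetical, not taken from the patent.

```python
def assign_markers(parameters):
    """Give each control parameter's widget and geometric feature the
    same visual marker (here a colour), distinct from the markers of
    other parameters, so widget-feature pairs can be told apart."""
    colours = ["red", "green", "blue", "orange"]
    return {p: colours[i % len(colours)] for i, p in enumerate(parameters)}

# Both the widget and the model feature for "draft_angle" would be
# drawn in red, those for "blank_thickness" in green, and so on.
markers = assign_markers(["draft_angle", "blank_thickness"])
print(markers)  # {'draft_angle': 'red', 'blank_thickness': 'green'}
```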
Interactive Graphical User Interface for Specification Rate Settings and Predictions
A computing system obtains computer model(s) configured to predict response(s) based on variable(s). The system obtains a specification defining an allowed response set for the response(s). The system receives an initial setting for bound(s). The system generates an initial design space for the variable(s) defined by the initial setting. The system displays in a graphical user interface (GUI) an initial representation of a specification rate. The specification rate indicates a portion of the initial design space predicted to generate a response within the allowed response set defined by the specification. The system receives an updated setting. The system generates an updated design space for the variable(s) defined by the updated setting. The system displays in the GUI an updated representation of an updated specification rate. The updated specification rate indicates a portion of the updated design space predicted to generate a response within the allowed response set defined by the specification.
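The specification rate described above, the portion of a design space predicted to yield in-spec responses, can be estimated by sampling the space defined by the variable bounds. The Monte Carlo approach, the toy model, and the function signature below are assumptions for illustration; the patent does not specify how the rate is computed.

```python
import random

def specification_rate(bounds, predict, in_spec, n=10_000, seed=0):
    """Estimate the fraction of the design space (defined by per-variable
    bounds) whose predicted response falls inside the allowed set."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        point = [rng.uniform(lo, hi) for lo, hi in bounds]
        if in_spec(predict(point)):
            hits += 1
    return hits / n

# Toy model: response is the sum of two variables on [0, 1];
# the specification allows responses <= 1.0 (about half the square).
rate = specification_rate(
    bounds=[(0.0, 1.0), (0.0, 1.0)],
    predict=sum,
    in_spec=lambda r: r <= 1.0,
)
print(rate)  # close to 0.5 for this toy model
```

Recomputing the rate after a bound is tightened or relaxed mirrors the initial-setting / updated-setting flow the abstract describes for the GUI.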
DIGITAL AUDIO WORKSTATION AUGMENTED WITH VR/AR FUNCTIONALITIES
Embodiments of the present technology are directed to features and functionalities of a VR/AR enabled digital audio workstation. The disclosed audio workstation can be configured to allow users to record, produce, mix, and edit audio in virtual 3D space based on detecting and manipulating human gestures in a virtual reality environment. The audio can relate to music, voice, speeches, background noise, one or more musical instruments, special effects music, electronic humming or noise from electrical/mechanical equipment, or any other type of audio.
CUSTOMIZED DISPLAY COLOR PROFILES FOR INDIVIDUAL COLOR PREFERENCE
Customization of display color profiles is described. A first interface for customizing a global color preference is generated. Feedback from a user is received via the first interface. The feedback describes a customized global color preference. A second interface for customizing a memory color preference is generated. Feedback is received from the user via the second interface. The feedback describes a customized memory color preference. A color preference profile is generated using the customized global color preference and the customized memory color preference. The color preference profile describes the user's preference for how color is presented. The color preference profile is associated with a user profile of the user. Visual content is rendered on a display of a device in accordance with the color preference profile.
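One way to picture the combination of a global preference with per-memory-color preferences is as a profile applied at render time. The structure below is a hypothetical sketch: the gain/offset representation, the field names, and the memory-color categories are assumptions, not details from the patent.

```python
from dataclasses import dataclass

@dataclass
class ColorPreferenceProfile:
    # Assumed structure: a global per-channel gain (from the first
    # interface) plus per-memory-color RGB offsets (from the second
    # interface), e.g. for skin, sky, or foliage tones.
    global_gain: tuple
    memory_shifts: dict

def apply_profile(rgb, profile, memory_color=None):
    """Render a pixel according to the stored preferences: apply the
    global gain, then any offset for the pixel's memory-color class."""
    r, g, b = [c * k for c, k in zip(rgb, profile.global_gain)]
    if memory_color in profile.memory_shifts:
        dr, dg, db = profile.memory_shifts[memory_color]
        r, g, b = r + dr, g + dg, b + db
    return (min(r, 255), min(g, 255), min(b, 255))

profile = ColorPreferenceProfile(
    global_gain=(1.0, 0.95, 1.05),
    memory_shifts={"skin": (5, 0, -5)},
)
print(apply_profile((200, 150, 120), profile, "skin"))  # (205.0, 142.5, 121.0)
```

Associating such a profile with a user account would then let the same preferences follow the user across displays, as the abstract suggests.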
AUTOMATIC CONTROL OF PHACOEMULSIFICATION NEEDLE TRAJECTORY
A system and method include inserting a needle of a phacoemulsification handpiece into an eye of a patient and vibrating the needle in a first trajectory. Matter from the eye is aspirated via an aspiration line while the needle is vibrated in the first trajectory. An indication of a vacuum level in the aspiration line is received. It is determined that the vacuum level has changed by at least a preset vacuum level change, and in response the vibration of the needle is switched to a second trajectory, different from the first trajectory.
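The switching rule in this abstract reduces to a threshold test on the vacuum change. The sketch below is illustrative only; the trajectory names ("longitudinal", "torsional") and the threshold value are assumptions made here, not terms or numbers from the patent.

```python
def select_trajectory(baseline_vacuum, current_vacuum, threshold,
                      first="longitudinal", second="torsional"):
    """Switch the needle's vibration trajectory when the aspiration-line
    vacuum has changed by at least the preset amount (e.g. a sudden
    vacuum change suggesting an occlusion or occlusion break)."""
    if abs(current_vacuum - baseline_vacuum) >= threshold:
        return second
    return first

print(select_trajectory(300, 305, threshold=50))  # longitudinal
print(select_trajectory(300, 180, threshold=50))  # torsional
```

In practice such a check would run inside the console's control loop, re-evaluating the vacuum reading continuously while aspiration is active.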