Patent classifications
G06T11/80
Methods and apparatus for providing a digital illustration system
A non-transitory processor-readable medium storing code representing instructions to be executed by a processor to receive a set of data elements associated with a user-defined content having a content type. The processor interpolates the set of data elements to produce a first set of content data based on a filter domain associated with the user-defined content. The processor further refines the first set of content data based, at least in part, on the content type to produce a second set of content data. The processor also sends a signal representing the second set of content data such that the user-defined content is displayed based on the second set of content data.
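The abstract above describes a pipeline of interpolating raw input data and then refining it by content type. A minimal sketch of that flow, assuming the data elements are 2D input samples and inventing a simple content-type rule (all names and the specific filters here are assumptions, not from the patent):

```python
# Hypothetical sketch: interpolate raw input samples to produce a first
# set of content data, then refine it based on a content type.

def interpolate(points, factor=4):
    """Linearly interpolate between successive (x, y) samples."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for i in range(factor):
            t = i / factor
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    out.append(points[-1])
    return out

def refine(points, content_type):
    """Refine interpolated data based on content type (assumed rule:
    snap to the pixel grid for raster content, pass through otherwise)."""
    if content_type == "pixel":
        return [(round(x), round(y)) for x, y in points]
    return points

samples = [(0, 0), (4, 4)]            # sparse user input
smooth = interpolate(samples)         # first set of content data
display_data = refine(smooth, "pixel")  # second set, sent for display
```

The second set of content data (`display_data`) would then be what drives the display, per the abstract.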
AUTONOMOUSLY ACTING ROBOT THAT CHANGES PUPIL IMAGE OF THE AUTONOMOUSLY ACTING ROBOT
A monitor is installed in an eye of the robot, and an eye image is displayed on the monitor. The robot extracts feature quantities of a user's eye from a captured image of the user, and those feature quantities are reflected in the eye image. For example, a feature quantity may be the size of the pupillary region and pupil image, or the form of an eyelid image; a blinking frequency or the like may also be reflected. A familiarity level is set for each user, and which user's feature quantities are reflected may be determined according to that familiarity.
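The familiarity-based selection described above can be sketched as follows, assuming a simple data model in which each user carries a familiarity score and extracted eye features (both the structure and the field names are assumptions):

```python
# Hypothetical sketch: reflect the eye features of the user the robot is
# most familiar with in the displayed eye image.

def select_features(users):
    """Return the eye features of the user with the highest familiarity."""
    best = max(users, key=lambda u: u["familiarity"])
    return best["eye_features"]

users = [
    {"familiarity": 0.3, "eye_features": {"pupil_size": 0.4, "blink_hz": 0.2}},
    {"familiarity": 0.8, "eye_features": {"pupil_size": 0.6, "blink_hz": 0.3}},
]
features = select_features(users)  # features of the more familiar user
```

The returned features would then drive the rendered pupil size, eyelid form, blinking frequency, and so on.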
SYSTEMS AND METHODS FOR COLOR PALETTE OPTIMIZATION
A method and system for color optimization in generated images are described. The method and system include receiving an image generation prompt that includes a text description of target image content and color information describing a target color palette; encoding the image generation prompt to obtain image features that represent the target image content and the target color palette; and generating an image representing the target image content with the target color palette based on the image features.
Polybezier-based digital ink strokes
An aspect of the present disclosure relates to a method including: (i) obtaining an ordered set of points in a two-dimensional space; (ii) obtaining, for each point in the ordered set of points, a width value; (iii) determining a plurality of left points and a plurality of right points in the two-dimensional space, wherein each point in the ordered set of points corresponds to a left point in the plurality of left points and a right point in the plurality of right points such that the left and right points are separated by the width value obtained for the point in the ordered set of points; and (iv) determining a left curve and a right curve defining boundaries of a virtual brushstroke in the two-dimensional space.
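Steps (i) through (iii) above amount to offsetting each stroke point perpendicular to the local stroke direction by half its width on each side. A sketch of that computation (the curve fitting of step (iv) is omitted, and the neighbor-based direction estimate is an assumption):

```python
import math

# Sketch: for each point in an ordered stroke, place a left and a right
# point along the local normal, separated by that point's width value.

def offset_points(points, widths):
    left, right = [], []
    for i, (x, y) in enumerate(points):
        # Estimate local direction from neighboring points (clamped at ends).
        x0, y0 = points[max(i - 1, 0)]
        x1, y1 = points[min(i + 1, len(points) - 1)]
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy) or 1.0
        # Unit normal, scaled to half the width on each side.
        nx, ny = -dy / length, dx / length
        h = widths[i] / 2.0
        left.append((x + nx * h, y + ny * h))
        right.append((x - nx * h, y - ny * h))
    return left, right

pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
l, r = offset_points(pts, [1.0, 1.0, 1.0])
# For this horizontal stroke, left/right points sit 0.5 above and below,
# so each left/right pair is separated by the full width value.
```

The left and right point sequences would then each be fit with a curve (e.g. a polybezier) to bound the virtual brushstroke.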
IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING PROGRAM
Provided is an image processing apparatus that performs image processing on original image data, the apparatus including a communication unit that performs communication with a terminal, a storage that stores parameter information indicating a relationship between image processing-related information transmitted from the terminal and an image processing parameter, and a processor, in which the processor acquires the original image data, performs, in a case in which an image processing request including the image processing-related information is received from the terminal via the communication unit, the image processing on the original image data using the image processing parameter corresponding to the image processing-related information based on the parameter information, and transmits image data after the image processing to the terminal via the communication unit.
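The core of the described flow is a stored mapping from the request's image processing-related information to a concrete processing parameter. A minimal sketch, with an assumed gain-adjustment parameter and invented request keywords (none of these specifics appear in the abstract):

```python
# Hypothetical sketch: stored parameter information maps request metadata
# to an image processing parameter, which is applied to the original data.

PARAMETER_INFO = {     # stored "parameter information"
    "brighten": 1.2,   # request keyword -> assumed gain parameter
    "darken": 0.8,
}

def handle_request(original_pixels, related_info):
    """Look up the parameter for the request and apply it."""
    gain = PARAMETER_INFO[related_info]
    processed = [min(255, round(p * gain)) for p in original_pixels]
    return processed   # would be transmitted back to the terminal

result = handle_request([100, 200, 250], "brighten")
```

In the apparatus, the lookup table lives in the storage unit and the processed data is sent back to the terminal via the communication unit.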
Electronic device and method for providing avatar based on emotion state of user
An electronic device may include: a camera, a display, a processor operatively coupled to the camera and the display, and a memory operatively coupled to the processor. The memory may store instructions and a plurality of avatar templates, each containing a plurality of gestures. The instructions, when executed by the processor, may control the electronic device to: obtain an image of an external object using the camera, obtain a value of at least one parameter corresponding to an emotion state based on the obtained image, select an avatar template including a first gesture from among the plurality of avatar templates based on the value of the at least one parameter, generate an avatar sticker including a second gesture different from the first gesture based on the selected avatar template and the value of the at least one parameter, and display the generated avatar sticker on at least a portion of the display.
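The template-selection step can be sketched with an assumed data model: an emotion parameter selects a template (carrying a first gesture), and the sticker is generated with a different, second gesture from the same template. Emotion labels, gesture names, and the selection rule are all invented for illustration:

```python
# Hypothetical sketch: select a template by emotion state, then generate
# a sticker using a second gesture different from the template's first.

TEMPLATES = [
    {"emotion": "happy", "gestures": ["wave", "jump"]},
    {"emotion": "sad", "gestures": ["slump", "sigh"]},
]

def make_sticker(emotion_value):
    """Pick a template matching the emotion, use its second gesture."""
    template = next(t for t in TEMPLATES if t["emotion"] == emotion_value)
    first, second = template["gestures"][0], template["gestures"][1]
    assert second != first
    return {"emotion": emotion_value, "gesture": second}

sticker = make_sticker("happy")  # uses "jump", not the template's "wave"
```

The generated sticker would then be rendered on a portion of the display.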
DIGITAL OVERPAINTING CONTROLLED BY OPACITY AND FLOW PARAMETERS
Certain embodiments involve a graphics manipulation application using brushstroke parameters that include a maximum alpha-deposition parameter and a fractional alpha-deposition parameter. For instance, the graphics manipulation application uses an alpha flow increment computed from the maximum alpha-deposition parameter and the fractional alpha-deposition parameter to compute an output canvas color. In some embodiments, if the current canvas opacity exceeds or equals the maximum alpha-deposition parameter, the current canvas opacity is selected as the output canvas opacity. Otherwise, the graphics manipulation application computes the output canvas opacity by increasing the current canvas opacity based on the alpha flow increment. The graphics manipulation application updates a canvas portion affected by a brushstroke input to include the output canvas opacity and the output canvas color.
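The opacity-update rule described above can be sketched directly. The abstract does not give the exact formula for the alpha flow increment; here it is assumed to be the fractional parameter times the maximum alpha-deposition parameter, with the result capped at the maximum:

```python
# Sketch of the described compositing rule for one brushstroke pass.
# The increment formula is an assumption, not taken from the patent.

def apply_brush_alpha(canvas_alpha, max_alpha, fractional_alpha):
    """Update the canvas opacity for a brushstroke-affected portion."""
    increment = fractional_alpha * max_alpha  # assumed alpha flow increment
    if canvas_alpha >= max_alpha:
        return canvas_alpha                   # at or above the cap: keep it
    return min(max_alpha, canvas_alpha + increment)

a = 0.0
for _ in range(3):                            # repeated strokes build up
    a = apply_brush_alpha(a, max_alpha=0.6, fractional_alpha=0.5)
# Opacity rises toward, and then stops at, the maximum alpha-deposition.
```

This captures the behavior in the abstract: opacity already at or above the maximum is left unchanged, while lower opacity is increased by the alpha flow increment; the output canvas color computation is omitted from this sketch.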