Image processing device and method
11620967 · 2023-04-04
CPC classification
G09G2360/18 (PHYSICS)
G09G2320/0686 (PHYSICS)
G09G2320/0666 (PHYSICS)
Abstract
An image processing device having a processor coupled to a memory. The processor is programmed to process two or more media formats stored in the memory, each made up of one or more digital image layers that include non-transparent pixels and may include transparent pixels. The processor is programmed to: set the non-transparent pixels in each digital image layer of the two or more media formats to a contrast state; set pixels stored in an off-screen data buffer of the memory to pixels corresponding to a predetermined color scheme; and apply an image function to each media format drawn to the off-screen data buffer so as to allow the overlapping media formats to be displayed on a display screen as see-through.
Claims
1. An image processing device, comprising: a processor coupled to a memory; and a display screen, wherein the processor is configured to process a plurality of media formats stored in the memory, each of the plurality of media formats is made up of a digital image layer that includes at least non-transparent pixels; the processor configured to execute the following: set the non-transparent pixels in each digital image layer of the plurality of media formats to a contrast state by application of one or more image filters of a plurality of image filters; set pixels stored in an off-screen data buffer of the memory to pixels corresponding to a predetermined color scheme; apply an image function to each of the plurality of media formats drawn successively to the off-screen data buffer so as to allow the plurality of overlapping media formats to be displayed on the display screen as see-through, wherein the one or more image filters manipulate the non-transparent pixels in each digital image layer of the plurality of media formats into a black and white version, and wherein the plurality of image filters comprise multiple contrast filters applied successively and include a grayscale filter, a threshold filter, an edge detection filter, a sharpen filter, a blur filter, and an assign bins filter.
2. The image processing device as set forth in claim 1, wherein the processor is further configured to simultaneously draw the off-screen buffer and another media format to the display screen by using the same pixels and by displaying the true color information of the another media format and overlapping the edge information of the pixels in the off-screen buffer as the image function applied to the true colors of the another media format.
3. The image processing device as set forth in claim 2, wherein the image function is a replacing function that blends the pixels in the off-screen data buffer with the non-transparent pixels in the digital image layer to generate new pixel values in the off-screen data buffer.
4. The image processing device as set forth in claim 3, wherein the new pixel values are generated by replacing every pixel in the digital image layer with the pixels of the another digital image while drawing the digital image layer to the off-screen buffer.
5. The image processing device as set forth in claim 1, wherein the plurality of media formats include any digitally displayable information including images, videos, animations, graphs and text.
6. The image processing device as set forth in claim 1, wherein the contrast state is a white RGB color scheme.
7. The image processing device as set forth in claim 1, wherein the contrast state is one of a plurality of RGB color schemes.
8. The image processing device as set forth in claim 1, wherein the one or more image filters comprises one or more simple contrast filters which set the non-transparent pixels to black or white by using an edge detection algorithm.
9. A non-transitory computer-readable storage medium with an executable program stored thereon, wherein the program instructs a computer to perform the following steps: processing a plurality of media formats where each of the plurality of media formats is made up of a digital image layer that includes at least non-transparent pixels; setting the non-transparent pixels in each digital image layer of the plurality of media formats to a contrast state by application of one or more image filters of a plurality of image filters; setting pixels stored in an off-screen data buffer to pixels corresponding to a predetermined color scheme; applying an image function to each of the plurality of media formats drawn successively to the off-screen data buffer so as to allow the plurality of overlapping media formats to be displayed on a display screen as see-through, wherein the one or more image filters manipulate the non-transparent pixels in each digital image layer of the plurality of media formats into a black and white version, and wherein the plurality of image filters comprise multiple contrast filters applied successively and include a grayscale filter, a threshold filter, an edge detection filter, a sharpen filter, a blur filter, and an assign bins filter.
10. The non-transitory computer-readable storage medium as set forth in claim 9, further comprising the step of: simultaneously drawing the off-screen buffer and another media format to a display screen by using the same pixels and by displaying the true color information of the another media format and overlapping the edge information of the pixels in the off-screen buffer as the image function applied to the true colors of the another media format.
11. The non-transitory computer-readable storage medium as set forth in claim 10, wherein the image function is a replacing function that replaces the pixels in the off-screen data buffer with the non-transparent pixels in the digital image layer to generate new pixel values in the off-screen data buffer.
12. The non-transitory computer-readable storage medium as set forth in claim 11, wherein the new pixel values are generated by replacing every pixel in the digital image layer with the pixels of the another digital image while drawing the digital image layer to the off-screen buffer.
13. The non-transitory computer-readable storage medium as set forth in claim 9, wherein the one or more image filters comprises one or more simple contrast filters which set the non-transparent pixels to black or white by using an edge detection algorithm.
14. An image processing method, comprising: processing, using a central processing unit, a plurality of media formats stored in a memory, each of the plurality of media formats is made up of a digital image layer that includes at least non-transparent pixels; setting the non-transparent pixels in each digital image layer of the plurality of media formats to a contrast state by application of one or more image filters of a plurality of image filters; setting pixels stored in an off-screen data buffer of the memory to pixels corresponding to a predetermined RGB color scheme; applying an image function to each of the plurality of media formats drawn successively to the off-screen data buffer so as to allow the plurality of overlapping media formats to be displayed on a display screen as see-through, wherein the one or more image filters manipulate the non-transparent pixels in each digital image layer of the plurality of media formats into a black and white version, and wherein the plurality of image filters comprise multiple contrast filters applied successively and include a grayscale filter, a threshold filter, an edge detection filter, a sharpen filter, a blur filter, and an assign bins filter.
15. The image processing method as set forth in claim 14, further comprising the step of: simultaneously drawing the off-screen buffer and another media format to the display screen by using the same pixels and by displaying the true color information of the another media format and overlapping the edge information of the pixels in the off-screen buffer as the image function applied to the true colors of the another media format.
16. The image processing method as set forth in claim 15, wherein the image function is a replacing function that replaces the pixels in the off-screen data buffer with the non-transparent pixels in the digital image layer to generate new pixel values in the off-screen data buffer.
17. The image processing method as set forth in claim 16, wherein the new pixel values are generated by replacing every pixel in the digital image layer with the pixels of the another digital image while drawing the digital image layer to the off-screen buffer.
18. The image processing method as set forth in claim 15, wherein the image function comprises one of edge detection, threshold, grayscale, sepia, blur, brightness, contrast, invert, saturate, opacity, and blending.
19. The image processing method as set forth in claim 14, wherein the contrast state is a white RGB color scheme.
20. The image processing method as set forth in claim 14, wherein the contrast state is one of a plurality of RGB color schemes including a white RGB color scheme.
21. The image processing method as set forth in claim 14, wherein the predetermined color scheme is a white RGB color scheme.
22. The image processing method as set forth in claim 14, wherein the plurality of media formats include any digitally displayable information including images, videos, animations, graphs and text.
23. The image processing method as set forth in claim 14, wherein the one or more image filters comprises one or more simple contrast filters which set the non-transparent pixels to black or white by using an edge detection algorithm.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
(44) The subject invention is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject invention. It may be evident, however, that the subject invention can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the subject invention.
(45) While the present invention may be embodied in many different forms, a number of illustrative embodiments are described herein with the understanding that the present disclosure is to be considered as providing examples of the principles of the invention and such examples are not intended to limit the invention to preferred embodiments described herein and/or illustrated herein.
(46) In an exemplary embodiment, the aforementioned problem of resource sharing and taking turns may be solved by displaying two contexts, e.g. an application (context1) and a multi-layer advertisement (context2), at the same time using the same pixels: the true color information of context2 is displayed, and the edge information of context1 is overlapped as a function that selects which layer of context2 to display. In a simple embodiment the function could be a composite blur, grayscale, edge detection, threshold (BGET) image function. A BGET image function would display the edge information of context1 as one color, i.e. white, where edges are detected, and would display all other non-transparent pixels as another color, i.e. black. A select function would display the colors of one layer of context2 where the BGET function displays black, and the colors of a second layer of context2 where the BGET function displays white. The BGET filter can first blur the image to reduce noise. In an embodiment a standard Gaussian blur with a variable standard deviation could be applied, with a different standard deviation selected to provide optimal noise reduction for each scene of a particular application; the blur/noise reduction filter could also be one of various industry-standard filters. A BGET filter could then apply a standard grayscale filter, followed by an edge detection filter. In an embodiment the edge detection filter could be a 3×3 convolve matrix of the form ((−1, −1, −1), (−1, 8, −1), (−1, −1, −1)), a Sobel filter, or one of various other industry edge detection filters. A standard edge detection image function would display the edge information of context1 as lighter grayscale values and all other information as darker grayscale values. A BGET function could then threshold the edge-detected grayscale values to set all pixel values to one of two colors, i.e. white and black: one color represents all detected edge information and the other represents all non-edge information.
(47) Conceptually a BGET image function provides two states: the positive state, i.e. black, shows layer 1 of context2 in its original RGB colors, and the negative state, i.e. white, shows layer 2 of context2 in its original RGB colors. This two-state functionality provides the contrast needed to discern edge information of context1. Several other standard image functions can be applied to provide two states, e.g. sepia, grayscale, charcoal, etc.
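The BGET pipeline and the two-state select function described above can be sketched as follows. This is a minimal illustrative implementation using NumPy arrays for images; the 3×3 Gaussian blur approximation and the threshold of 128 are assumptions for illustration, not values fixed by the description.

```python
import numpy as np

def convolve2d(img, kernel):
    """Naive same-size 2D convolution with zero padding."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="constant")
    out = np.zeros_like(img, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

def bget(rgb, threshold=128):
    """Blur, grayscale, edge detection, threshold: return a mask of
    0 (non-edge) and 255 (edge) values."""
    # Grayscale first; since blur and grayscale are both linear,
    # this is equivalent to blurring the RGB image and then graying.
    gray = rgb[..., :3] @ np.array([0.299, 0.587, 0.114])
    # 3x3 Gaussian approximation to reduce noise (illustrative kernel)
    blur_kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 16.0
    blurred = convolve2d(gray, blur_kernel)
    # 3x3 edge-detection convolve matrix from the description
    edge_kernel = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
    edges = convolve2d(blurred, edge_kernel)
    # Threshold into the two-state black/white mask
    return np.where(edges > threshold, 255, 0).astype(np.uint8)

def select(mask, layer1, layer2):
    """Show layer1 where the mask is black (non-edge),
    layer2 where it is white (edge)."""
    return np.where(mask[..., None] == 0, layer1, layer2)
```

Applied to a scene, `bget` produces the black/white edge mask of context1, and `select` uses that mask to pick between two RGB layers of context2 pixel by pixel.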
(48) Discernible contrast between corresponding pixels in the layers of an advertisement is required to visually discern edge information of an application. Standard color distance calculations, e.g. CIE76, can be applied to show where contrast is low or high. Low contrast will not display edge information clearly. Layer 1 and layer 2 could be displayed using a checkerboard, striped, or concentric-circle black and white selecting filter to visually discern contrast in the layers.
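The CIE76 check mentioned above can be sketched as follows. CIE76 defines the color difference ΔE as the Euclidean distance between two colors in CIELAB space; the inputs here are assumed to already be (L*, a*, b*) triples, and the minimum-contrast cutoff of 20 is an illustrative assumption, not part of the standard.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

def contrast_ok(lab1, lab2, min_delta=20.0):
    """Flag corresponding layer pixels whose colors are too 'close'
    to discern edge information (cutoff is an assumed value)."""
    return delta_e_cie76(lab1, lab2) >= min_delta
```

Running this over corresponding pixels of layer 1 and layer 2 identifies the low-contrast regions where edge information would not display clearly.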
(49) Various embodiments of this invention can have different sources for context1 and context2. One context could be an application and the other context could be an advertisement, or vice versa. One context could be video or animation and the other context could be displayed as an overlay which displays the video through a function. Both contexts could be applications. These are listed as examples of different sources for context1 and context2 and are not intended to be an exhaustive list.
(50) Various embodiments of this invention can use color schemes other than RGB, e.g. CMYK. This same image processing device and method can be applied to different color schemes.
(51) Various embodiments of this invention can utilize images defined as raster or vector. For purposes of explanation raster images are assumed. This same image processing device and method can be applied to vector images.
(52) Many Integrated Development Environments (IDEs) provide functionality to computer programmers for software development. Many IDEs include image editing functionalities. Typical scene rendering in an IDE includes capabilities for drawing scenes one layer at a time in which successively drawn layers overwrite previously drawn layers (painter's algorithm) and capabilities for applying image functions when drawing. Typical image functions include grayscale, sepia, blur, brightness, contrast, invert (negative), saturate, opacity, threshold, edge detection, and blending. IDEs could be expanded to provide the new functionality described in this invention.
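The layer-at-a-time rendering with an optional image function described above can be sketched as follows. RGBA NumPy layers and the `invert` helper are illustrative assumptions; the key point is that successively drawn layers overwrite previously drawn ones wherever their pixels are non-transparent (painter's algorithm).

```python
import numpy as np

def invert(rgb):
    """One of the typical image functions listed: invert (negative)."""
    return 255 - rgb

def paint(layers, width, height, image_fn=None):
    """Draw RGBA layers back to front into an RGB buffer, applying an
    optional image function to each layer while drawing."""
    buffer = np.zeros((height, width, 3), dtype=np.uint8)  # start black
    for layer in layers:  # back-to-front order
        rgb, alpha = layer[..., :3], layer[..., 3]
        if image_fn is not None:
            rgb = image_fn(rgb)
        mask = alpha > 0  # non-transparent pixels overwrite the buffer
        buffer[mask] = rgb[mask]
    return buffer
```

The same loop, pointed at an off-screen buffer instead of the display, is the basis of the modified painter's algorithm used elsewhere in this description.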
(53) Many advertising networks provide Software Development Kits (SDKs) that developers can include in their applications to allow the displaying of advertisements. Typical SDKs allow display of banner, interstitial, and in-app advertisements. SDKs could be expanded to provide the new functionality described in this invention.
(54) Applications often allow/require user interaction in the form of clicking, dragging, squeezing, swiping, etc. Advertisements often allow/require user interaction in the form of clicking through, making a selection, etc. The challenge of allowing/requiring user interaction for both application and advertisement concurrently can be overcome in various ways, e.g. by allocating a hot-spot on the display reserved for advertisement click through. Other methods might select a particular actor or entity as the click through agent for the advertisement. Another method might allow the advertiser to provide an additional actor or entity, e.g. an icon or logo, which is displayed. This additional actor or entity could be displayed in a static location or be made to move to locations on the display away from current user interaction. Another method might display a standard clickable banner ad and also display a context 2 branding ad in an application. These are listed as examples of allowing user interaction when more than one context is displayed concurrently and are not intended to be an exhaustive list.
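The hot-spot approach described above can be sketched as a simple hit test: a rectangle of the display is reserved for advertisement click-through, and input events inside it are routed to the advertisement while all others go to the application. The rectangle coordinates here are illustrative assumptions.

```python
def route_click(x, y, hotspot=(0, 0, 100, 40)):
    """Return which context handles a click at (x, y).
    hotspot is (left, top, width, height) of the reserved region."""
    hx, hy, hw, hh = hotspot
    if hx <= x < hx + hw and hy <= y < hy + hh:
        return "advertisement"  # inside the reserved hot-spot
    return "application"        # everywhere else
```

The same test generalizes to a moving actor or entity by updating the hot-spot rectangle each frame.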
(59) Each scene in many typical applications is drawn using a painter's algorithm. Scenes are usually defined in layers that are different distances from the viewer.
(60) When advertising is incorporated into an application there are typically multiple threads of execution running concurrently. One thread can be for the application and one thread can be for the advertisement.
(61) In an exemplary embodiment this invention adds a preprocessing step to the typical application thread. This preprocessing step can include a manual or automatic setting of the standard deviation of a gaussian blur filter to optimize noise reduction for improved edge detection. The preprocessing step can include a manual or automatic setting of an edge detection kernel to optimize edge detection.
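The tunable-blur part of this preprocessing step can be sketched as follows: a Gaussian kernel is built from a standard deviation that can be set manually or automatically per scene to optimize noise reduction before edge detection. The kernel size and sigma defaults are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Return a normalized size x size Gaussian blur kernel with a
    variable standard deviation (sigma)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return kernel / kernel.sum()  # weights sum to 1
```

A larger sigma spreads more weight to the outer taps, giving stronger smoothing; the preprocessing step would pick the sigma that best suppresses noise for the current scene.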
(64) When the layers in context2 have corresponding regions that are “close” in color the contrast can be too low to discern edge information of actors and entities.
(66) When the colors of the layers of context2 are “close” to the colors of the application there is a low contrast problem at the intersection of context2 and context1. It becomes difficult to discern which pixels belong to context2 and which pixels belong to context1.
(67) In some embodiments this invention can display multiple advertisements concurrently with an application. For instance, context2 could consist of 6 or more different banner advertisements that are tiled. Context1 could be displayed as filtered images of multiple advertisements.
(68) In some embodiments this invention can display advertisements moving, rotating, scaling, etc. in the display.
(69) In an embodiment, a computer-readable storage medium may be RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other hardware medium that can be used to carry or store desired program code means in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor.
(70) It would be understood that the invention described herein requires at least a controller capable of executing image processing programs and a memory to store programs executed by the controller and images processed by the controller. The controller may comprise a central processing unit (CPU), a computer, a computer unit, a data processor, a microcomputer, microelectronics device, or a microprocessor. The memory includes, but is not limited to a read/write memory, read only memory (ROM), random access memory (RAM), DRAM, SRAM etc.
(71) What has been described above includes examples of the subject invention. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the subject invention, but one of ordinary skill in the art may recognize that many further combinations and permutations of the subject invention are possible. Accordingly, the subject invention is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.
(72) While embodiments of the present disclosure have been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.
LIST OF NUMBERED ITEMS
(73)
1) Display
2) banner ad
3) application
4) interstitial ad
5) application w/in-app ad
6) actor/entity allocated to displaying ads
7) ad displayed on actor/entity
8) ad full screen
9) PacMan character
10) Checkerboard transparent pixels
11) Bounding box
31) Man actor
32) Background image
33) Tree entity
34) Comet actor
35) Painter's algorithm
36) Retrieve next ad
37) Calculate next scene
38) Preprocess step
39) Modified painter's algorithm
40) Offscreen buffer
41) Test if next ad received
42) Scaling step
51) Draw Ad to Display
56) Draw OSB to Display
74) Man actor edge information
76) Tree entity edge information
77) Comet actor edge information