IMAGE PROCESSING APPARATUS, IMAGE FORMING APPARATUS, AND NON-TRANSITORY COMPUTER READABLE MEDIUM STORING PROGRAM
20190279056 · 2019-09-12
Inventors
- Daiki TAKAZAWA (Kanagawa, JP)
- Tomoyuki Ono (Kanagawa, JP)
- Masaki NUDEJIMA (Kanagawa, JP)
- Takayuki HASHIMOTO (Kanagawa, JP)
- Suguru Oue (Kanagawa, JP)
Abstract
An image processing apparatus includes a trapping section that performs trapping on a received pixel, a digital filtering section that performs digital filtering on the pixel in parallel with the trapping using the trapping section, and a selection section that selects a result of the trapping using the trapping section and a result of the digital filtering using the digital filtering section.
Claims
1. An image processing apparatus comprising: a trapping section that performs trapping on a received pixel; a digital filtering section that performs digital filtering on the pixel in parallel with the trapping using the trapping section; and a selection section that selects a result of the trapping using the trapping section and a result of the digital filtering using the digital filtering section.
2. The image processing apparatus according to claim 1, wherein the digital filtering section performs the digital filtering on the pixel that is not processed by the trapping section.
3. The image processing apparatus according to claim 2, wherein the digital filtering section does not process the pixel that is processed by the trapping section.
4. The image processing apparatus according to claim 1, wherein the trapping section and the digital filtering section respectively perform the trapping and the digital filtering through different paths.
5. The image processing apparatus according to claim 4, wherein the trapping section performs the trapping through a trapping path.
6. The image processing apparatus according to claim 4, wherein the digital filtering section performs the digital filtering through a digital filtering path.
7. The image processing apparatus according to claim 1, further comprising: a window generation section that generates a window as an inspection frame constituted by a set of pixels which is commonly used in the trapping section and the digital filtering section.
8. The image processing apparatus according to claim 2, further comprising: a window generation section that generates a window as an inspection frame constituted by a set of pixels which is commonly used in the trapping section and the digital filtering section.
9. The image processing apparatus according to claim 3, further comprising: a window generation section that generates a window as an inspection frame constituted by a set of pixels which is commonly used in the trapping section and the digital filtering section.
10. The image processing apparatus according to claim 4, further comprising: a window generation section that generates a window as an inspection frame constituted by a set of pixels which is commonly used in the trapping section and the digital filtering section.
11. The image processing apparatus according to claim 5, further comprising: a window generation section that generates a window as an inspection frame constituted by a set of pixels which is commonly used in the trapping section and the digital filtering section.
12. The image processing apparatus according to claim 6, further comprising: a window generation section that generates a window as an inspection frame constituted by a set of pixels which is commonly used in the trapping section and the digital filtering section.
13. The image processing apparatus according to claim 7, further comprising: a storage section that stores a received image, wherein the window generation section generates the window by using pixels stored in the storage section.
14. The image processing apparatus according to claim 8, further comprising: a storage section that stores a received image, wherein the window generation section generates the window by using pixels stored in the storage section.
15. The image processing apparatus according to claim 9, further comprising: a storage section that stores a received image, wherein the window generation section generates the window by using pixels stored in the storage section.
16. The image processing apparatus according to claim 10, further comprising: a storage section that stores a received image, wherein the window generation section generates the window by using pixels stored in the storage section.
17. The image processing apparatus according to claim 11, further comprising: a storage section that stores a received image, wherein the window generation section generates the window by using pixels stored in the storage section.
18. The image processing apparatus according to claim 12, further comprising: a storage section that stores a received image, wherein the window generation section generates the window by using pixels stored in the storage section.
19. An image forming apparatus comprising: an image forming unit that forms a color image; and an image processing unit that processes the image for the image forming unit, wherein the image processing unit includes a trapping section that performs trapping on a received pixel, a digital filtering section that performs digital filtering on the pixel in parallel with the trapping using the trapping section, and a selection section that selects a result of the trapping using the trapping section and a result of the digital filtering using the digital filtering section.
20. A non-transitory computer readable medium storing a program causing a computer to perform: trapping on a received pixel; digital filtering on the pixel in parallel with the trapping; and selecting a result of the trapping and a result of the digital filtering.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Exemplary embodiments of the present invention will be described in detail based on the following figures.
DETAILED DESCRIPTION
[0020] Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the drawings.
[0021] The data reception unit 12 receives image data. The data reception unit receives the image data via a network, or receives an image read by an image reading device.
[0022] The image processing unit 14 processes the image data received by the data reception unit 12, and sends the processed image data to the image forming unit 16.
[0023] For example, the image forming unit 16 includes yellow (Y), magenta (M), cyan (C), and black (K) image forming units, and forms a color image by superimposing the Y, M, C, and K images on one another using those image forming units. The image forming unit 16 may be of a xerographic type or an inkjet type.
[0025] Received pixels are transmitted to a storage unit 22 through a control unit 20. For example, the storage unit 22 is a static random access memory (SRAM), a storage element that retains stored data without requiring periodic refresh operations, even while the data is frequently read or rewritten. The storage unit 22 stores pixels for every line. The control unit 20 is a direct memory access (DMA) controller that transfers data directly. The control unit 20 constitutes a window generation unit 24 that transmits the received data to the storage unit 22, receives pixel data from the storage unit 22, and generates a 5×5 window.
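The line-buffered window generation described above can be sketched as follows. This is a minimal software model, assuming one line of pixels arrives at a time and ignoring the border handling described in the next paragraph; the function name `windows_from_stream` is illustrative, not from the patent:

```python
from collections import deque

def windows_from_stream(lines, size=5):
    """Yield size x size windows from lines arriving one at a time,
    keeping only the last `size` lines buffered -- modeling how the
    storage unit 22 stores pixels for every line. Edge replication
    at the image borders is omitted here for brevity."""
    buf = deque(maxlen=size)  # line buffers (the role of the SRAM)
    for line in lines:
        buf.append(line)
        if len(buf) == size:
            rows = list(buf)
            # Slide the window horizontally across the buffered lines.
            for col in range(len(line) - size + 1):
                yield [row[col:col + size] for row in rows]

# A 6x6 toy image whose pixel value encodes its (row, column).
lines = [[r * 10 + c for c in range(6)] for r in range(6)]
first = next(windows_from_stream(lines))
```

Note that no window can be produced until `size` lines have been buffered, which is the source of the line-delay figures discussed later in the description.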
[0027] The window is generated on the assumption that pixels outside the received image are identical to the pixels at the corresponding edge of the received image. For example, for the initial window 28, the pixels in the two rows above the upper edge of the received image 26 are assumed to be identical to the upper-edge pixels, the pixels in the two columns to the left of the left edge are assumed to be identical to the left-edge pixels, and the four pixels diagonally outside the upper-left corner are assumed to be identical to the upper-left corner pixel of the received image 26. The same applies to the final window 28.
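The edge-replication assumption above can be sketched with NumPy's `pad` in `edge` mode. This is an illustrative model for a single-channel image; the helper name `pad_and_window` is an assumption for illustration, not from the patent:

```python
import numpy as np

def pad_and_window(image, row, col, size=5):
    """Extract a size x size window centered on (row, col),
    replicating edge pixels for positions outside the image."""
    margin = size // 2  # 2 for a 5x5 window
    # mode="edge" repeats the outermost pixels, matching the
    # assumption that pixels outside the received image are
    # identical to the pixels at the image edge.
    padded = np.pad(image, margin, mode="edge")
    r, c = row + margin, col + margin
    return padded[r - margin:r + margin + 1, c - margin:c + margin + 1]

image = np.arange(16).reshape(4, 4)
# Window centered on the upper-left pixel (0, 0): the two rows above
# and two columns to the left are filled with replicated edge pixels.
w = pad_and_window(image, 0, 0)
```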
[0028] Referring back to
[0029] A trapping unit 40 is provided at the trapping path 34. In a case where there is an image on which a red character 44 is included in a green background 42 as shown in
[0030] As stated above, since the digital filtering unit 38 and the trapping unit 40 are provided on the different paths 32 and 34, the digital filtering is performed on pixels on which the trapping is not performed. The digital filtering is not performed after the trapping; instead, the digital filtering and the trapping are performed in parallel.
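The idea of trapping, spreading one colorant slightly into an adjacent region so that plate misregistration does not leave a white gap at the boundary, can be sketched with a simplified model that marks boundary pixels between two color regions as trapping targets. The patent derives trapping targets from edge information on the 5×5 window; the helper `trap_pixels` and the region-label representation here are assumptions for illustration:

```python
import numpy as np

def trap_pixels(labels):
    """Mark boundary pixels between color regions as trapping targets.
    `labels` is a 2-D array of region ids (e.g. 0 = green background,
    1 = red character). A pixel is a trapping target if any of its
    4-neighbors belongs to a different region; at such pixels the two
    colorants would be overlapped to hide misregistration gaps."""
    trap = np.zeros_like(labels, dtype=bool)
    trap[:-1, :] |= labels[:-1, :] != labels[1:, :]   # neighbor below
    trap[1:, :]  |= labels[1:, :]  != labels[:-1, :]  # neighbor above
    trap[:, :-1] |= labels[:, :-1] != labels[:, 1:]   # neighbor right
    trap[:, 1:]  |= labels[:, 1:]  != labels[:, :-1]  # neighbor left
    return trap

labels = np.zeros((5, 5), dtype=int)
labels[2, 2] = 1          # a single "character" pixel in the background
mask = trap_pixels(labels)
```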
[0031] A selector 48 constituting a selection section selects either the pixel on which the digital filtering is performed by the digital filtering unit 38 or the pixel on which the trapping is performed by the trapping unit 40, depending on the state of the received image.
[0032] Next, a data flow in the module 18 will be described.
[0033] In step S10, one pixel is initially received from the module at the previous stage. In next step S12, the 5×5 window is generated by the window generation unit 24.
[0034] The 5×5 window generated in step S12 is commonly used in the processes of steps S14, S16, and S18.
[0035] In step S14, the edge detection is performed by the edge detection unit 36, and the pixel is output with edge information added, that is, data indicating whether or not the pixel is an edge.
[0036] The digital filtering in step S16 is performed by the digital filtering unit 38 by using the 5×5 window generated in step S12, but it also uses the edge information detected in step S14. Therefore, in step S20, before step S16, the end of the process of step S14 and the start of the digital filtering are synchronized.
[0037] In step S18, the trapping is performed by the trapping unit 40 by using the 5×5 window generated in step S12, and the pixel to which trapping information is added is output.
[0038] In step S22, the digital filtering performed in step S16 and the trapping performed in step S18 are synchronized with each other.
[0039] In next step S24, it is determined whether or not the pixel is a trapping target pixel. In a case where it is determined in step S24 that the pixel is a trapping target pixel, the process proceeds to step S26, the pixel on which the trapping is performed is output, and the process is ended. Meanwhile, in a case where it is determined in step S24 that the pixel is not a trapping target pixel, the process proceeds to step S28, the pixel on which the digital filtering is performed is output, and the process is ended.
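The per-pixel flow of steps S14 through S28 can be sketched as follows. The stage functions are hypothetical stand-ins supplied by the caller, and evaluating them sequentially here only models the exclusive selection, not the hardware parallelism of the two paths:

```python
import numpy as np

def process_pixel(window, detect_edge, digital_filter, trapping, is_trap_target):
    """Model of steps S14 to S28 on one shared 5x5 window."""
    edge = detect_edge(window)                # S14: edge detection
    filtered = digital_filter(window, edge)   # S16: filtering, uses edge info
    trapped = trapping(window)                # S18: trapping on the same window
    # S24-S28: the selector outputs the trapped pixel if the pixel is a
    # trapping target, and otherwise outputs the filtered pixel.
    return trapped if is_trap_target(window) else filtered

# Toy stage functions for illustration only.
detect_edge = lambda w: w.max() - w.min() > 0
digital_filter = lambda w, edge: w[2, 2] if edge else w.mean()
trapping = lambda w: w.max()
is_trap_target = lambda w: bool(w[2, 2])

flat = np.zeros((5, 5))
out_flat = process_pixel(flat, detect_edge, digital_filter, trapping, is_trap_target)

marked = np.zeros((5, 5))
marked[2, 2] = 1.0
out_marked = process_pixel(marked, detect_edge, digital_filter, trapping, is_trap_target)
```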
[0041] In next step S32, a first control unit 32a generates a 5×5 window by using a first window generation unit 24a in cooperation with a first storage unit 22a.
[0042] In next step S34, the edge detection is performed by the edge detection unit 36 by using the 5×5 window generated by the first window generation unit 24a in step S32, and the pixel to which edge information indicating whether or not the pixel is an edge is added is output.
[0043] In next step S36, a second control unit 32b generates a 5×5 window by using a second window generation unit 24b in cooperation with a second storage unit 22b.
[0044] In next step S38, the trapping is performed by the trapping unit 40 by using the 5×5 window generated in step S36, and the processed pixel is output.
[0045] In next step S40, a third control unit 32c generates a 5×5 window by using a third window generation unit 24c in cooperation with a third storage unit 22c.
[0046] In next step S42, the digital filtering is performed by the digital filtering unit 38 by using the 5×5 window generated in step S40, and the process is ended.
[0047] Comparing the exemplary embodiment with the comparative example, the trapping and the digital filtering are exclusively performed in parallel in the exemplary embodiment, whereas the edge detection, the trapping, and the digital filtering are performed in order in the comparative example. In the comparative example, in a case where the overlapped portion 46 shown in
[0048] In the exemplary embodiment, a window commonly used in the edge detection, the digital filtering, and the trapping is generated, whereas in the comparative example, separate windows are generated for the edge detection, the digital filtering, and the trapping. Accordingly, window generation incurs a delay equivalent to receiving two lines in the exemplary embodiment, whereas it incurs a delay equivalent to receiving six lines in the comparative example.
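The delay figures above follow from the window geometry: a centered 5×5 window cannot be produced until the two lines below the center pixel have been received, and sequential window generators add their delays while a shared window incurs the delay once. A small sketch of that arithmetic (the function name is illustrative):

```python
def window_delay_lines(window_size, num_sequential_windows):
    """Lines of delay before a centered window can be produced:
    the (window_size // 2) lines below the center pixel must be
    received first. Window generators placed in series each add
    this delay; a single shared window incurs it only once."""
    return (window_size // 2) * num_sequential_windows

shared = window_delay_lines(5, 1)      # exemplary embodiment: one shared 5x5 window
sequential = window_delay_lines(5, 3)  # comparative example: three windows in series
```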
[0049] The window is generated by one control unit 20 and one storage unit 22 in the exemplary embodiment, whereas the windows are generated by three control units 20a to 20c and three storage units 22a to 22c in the comparative example. Accordingly, the circuit area, the number and storage capacity of storage units, and the bandwidth used for communication are reduced in the exemplary embodiment. In a case where pipeline processing is performed in order to generate the windows, the comparative example needs three times the bandwidth of the exemplary embodiment.
[0050] Although the image processing apparatus has been described in the exemplary embodiment as being constituted by hardware, it may also be realized by software. In that case, the number of processing steps, rather than the circuit scale, is reduced.
[0051] The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.