Superpixel merging

11615515 · 2023-03-28

Abstract

Techniques are described for merging superpixels of an image. The image may include two superpixels, for which a similarity value is calculated. The similarity value is determined based on the link and cut values of the superpixels, the similarity value representing pixel-based similarity of the superpixels. The link value is determined based on the similarity between color values of the pixels in the superpixels, while the cut value is determined based on the edge pixels of the superpixels. Based on the calculated similarity value, the system determines whether to merge the superpixels and, if so, merges the superpixels, thereby generating another superpixel.

Claims

1. A computer-implemented method comprising: selecting an image comprising a plurality of superpixels, each superpixel in the plurality of superpixels comprising a set of pixels; calculating one or more similarity values for a smallest superpixel and corresponding one or more neighboring superpixels; wherein the corresponding one or more neighboring superpixels have one or more pixels that are neighboring pixels with one or more pixels of the smallest superpixel; from the corresponding one or more neighboring superpixels, determining a mergeable superpixel that has a highest similarity value from the one or more similarity values of the corresponding one or more neighboring superpixels; merging the mergeable superpixel with the smallest superpixel thereby generating a merged superpixel; calculating a plurality of similarity values, each similarity value of the plurality of similarity values representing a similarity of a particular superpixel and a particular neighboring superpixel; determining, from the plurality of similarity values, a greatest similarity value representing that the particular superpixel and the particular neighboring superpixel are the most similar superpixels in the plurality of superpixels of the image; determining, from the plurality of superpixels, the smallest superpixel that has the least number of pixels among the plurality of superpixels; determining that, if the particular superpixel and the particular neighboring superpixel are merged, generating a second merged superpixel, then the second merged superpixel has a greater number of pixels than the smallest superpixel of the image by at least a threshold; if it is determined that the second merged superpixel has a greater number of pixels than the smallest superpixel of the image by at least the threshold, selecting the smallest superpixel for merging.

2. The method of claim 1, wherein the image comprises a first set of pixels of a first superpixel and a second set of pixels of a second superpixel; wherein the first set of pixels of the first superpixel includes a first set of edge pixels neighboring a second set of edge pixels included in the second set of pixels of the second superpixel; and the method further comprising: determining a similarity value that represents pixel-based similarity of the first superpixel and the second superpixel, the determining comprising: determining a link value for the first superpixel and the second superpixel, wherein the link value is based on similarity between color values of the first set of pixels and color values of the second set of pixels, determining a cut value for the first superpixel and the second superpixel based on the first set of edge pixels of the first superpixel and the second set of edge pixels of the second superpixel, and calculating the similarity value based on the link value and the cut value for the first superpixel and the second superpixel; based on the similarity value, determining whether to merge the first superpixel with the second superpixel; if, based on the similarity value, it is determined to merge the first superpixel with the second superpixel, then merging the first superpixel with the second superpixel thereby generating a third superpixel that includes the first set of pixels and the second set of pixels.

3. The method of claim 2, further comprising: calculating a distance value between the color values of the first set of pixels and the color values of the second set of pixels; based on the distance value, determining the link value for the first superpixel and the second superpixel.

4. The method of claim 3, wherein the distance value is calculated based on one or more statistical functions applied on the color values of the first set of pixels and one or more statistical functions applied on the color values of the second set of pixels.

5. The method of claim 2, wherein the link value and the cut value are determined based on one or more of: size, perimeter and color intensity values of the first superpixel and based on one or more of: size, perimeter and color intensity values of the second superpixel.

6. The method of claim 2, further comprising determining the first set of edge pixels of the first superpixel based on first contour values of the first set of pixels; determining the second set of edge pixels of the second superpixel based on second contour values of the second set of pixels wherein a contour value represents a probability value that a corresponding pixel is an edge pixel.

7. The method of claim 6, further comprising: determining the first contour values of the first set of pixels and the second contour values of the second set of pixels based on applying one or more statistical models on color values of pixels in the image.

8. The method of claim 6, further comprising: calculating a plurality of weight values based on the first contour values and the second contour values; based on the plurality of weight values, determining the cut value for the first superpixel and the second superpixel.

9. The method of claim 2, further comprising: selecting a single pixel in the image as the first superpixel; merging the first superpixel with the second superpixel thereby generating the third superpixel that includes the single pixel of the first superpixel.

10. The method of claim 1, wherein the threshold is based on a particular number of a multiple by which the second merged superpixel is greater than the smallest superpixel.

11. A system comprising one or more processing units and memory, the memory storing a set of program instructions, which when executed by the one or more processing units, causes: selecting an image comprising a plurality of superpixels, each superpixel in the plurality of superpixels comprising a set of pixels; calculating one or more similarity values for a smallest superpixel and corresponding one or more neighboring superpixels; wherein the corresponding one or more neighboring superpixels have one or more pixels that are neighboring pixels with one or more pixels of the smallest superpixel; from the corresponding one or more neighboring superpixels, determining a mergeable superpixel that has a highest similarity value from the one or more similarity values of the corresponding one or more neighboring superpixels; merging the mergeable superpixel with the smallest superpixel thereby generating a merged superpixel; calculating a plurality of similarity values, each similarity value of the plurality of similarity values representing a similarity of a particular superpixel and a particular neighboring superpixel; determining, from the plurality of similarity values, a greatest similarity value representing that the particular superpixel and the particular neighboring superpixel are the most similar superpixels in the plurality of superpixels of the image; determining, from the plurality of superpixels, the smallest superpixel that has the least number of pixels among the plurality of superpixels; determining that, if the particular superpixel and the particular neighboring superpixel are merged, generating a second merged superpixel, then the second merged superpixel has a greater number of pixels than the smallest superpixel of the image by at least a threshold; if it is determined that the second merged superpixel has a greater number of pixels than the smallest superpixel of the image by at least the threshold, selecting the smallest superpixel for merging.

12. The system of claim 11, wherein the image comprises a first set of pixels of a first superpixel and a second set of pixels of a second superpixel; wherein the first set of pixels of the first superpixel includes a first set of edge pixels neighboring a second set of edge pixels included in the second set of pixels of the second superpixel; and wherein the set of program instructions comprise one or more program instructions, which, when executed by the one or more processing units, cause: determining a similarity value that represents pixel-based similarity of the first superpixel and the second superpixel, the determining comprising: determining a link value for the first superpixel and the second superpixel, wherein the link value is based on similarity between color values of the first set of pixels and color values of the second set of pixels, determining a cut value for the first superpixel and the second superpixel based on the first set of edge pixels of the first superpixel and the second set of edge pixels of the second superpixel, and calculating the similarity value based on the link value and the cut value for the first superpixel and the second superpixel; based on the similarity value, determining whether to merge the first superpixel with the second superpixel; if, based on the similarity value, it is determined to merge the first superpixel with the second superpixel, then merging the first superpixel with the second superpixel thereby generating a third superpixel that includes the first set of pixels and the second set of pixels.

13. The system of claim 12, wherein the set of program instructions comprise one or more program instructions, which, when executed by the one or more processing units, cause: calculating a distance value between the color values of the first set of pixels and the color values of the second set of pixels; based on the distance value, determining the link value for the first superpixel and the second superpixel.

14. The system of claim 13, wherein the distance value is calculated based on one or more statistical functions applied on the color values of the first set of pixels and one or more statistical functions applied on the color values of the second set of pixels.

15. The system of claim 12, wherein the link value and the cut value are determined based on one or more of: size, perimeter and color intensity values of the first superpixel and based on one or more of: size, perimeter and color intensity values of the second superpixel.

16. The system of claim 12, wherein the set of program instructions comprise one or more program instructions, which, when executed by the one or more processing units, cause: determining the first set of edge pixels of the first superpixel based on first contour values of the first set of pixels; determining the second set of edge pixels of the second superpixel based on second contour values of the second set of pixels wherein a contour value represents a probability value that a corresponding pixel is an edge pixel.

17. The system of claim 12, wherein the set of program instructions comprise one or more program instructions, which, when executed by the one or more processing units, cause: selecting a single pixel in the image as the first superpixel; merging the first superpixel with the second superpixel thereby generating the third superpixel that includes the single pixel of the first superpixel.

18. The system of claim 11, wherein the threshold is based on a particular number of a multiple by which the second merged superpixel is greater than the smallest superpixel.

19. One or more non-transitory computer-readable media storing a set of instructions, wherein the set of instructions includes instructions, which when executed by one or more hardware processors, cause: selecting an image comprising a plurality of superpixels, each superpixel in the plurality of superpixels comprising a set of pixels; calculating one or more similarity values for a smallest superpixel and corresponding one or more neighboring superpixels; wherein the corresponding one or more neighboring superpixels have one or more pixels that are neighboring pixels with one or more pixels of the smallest superpixel; from the corresponding one or more neighboring superpixels, determining a mergeable superpixel that has a highest similarity value from the one or more similarity values of the corresponding one or more neighboring superpixels; merging the mergeable superpixel with the smallest superpixel thereby generating a merged superpixel; calculating a plurality of similarity values, each similarity value of the plurality of similarity values representing a similarity of a particular superpixel and a particular neighboring superpixel; determining, from the plurality of similarity values, a greatest similarity value representing that the particular superpixel and the particular neighboring superpixel are the most similar superpixels in the plurality of superpixels of the image; determining, from the plurality of superpixels, the smallest superpixel that has the least number of pixels among the plurality of superpixels; determining that, if the particular superpixel and the particular neighboring superpixel are merged, generating a second merged superpixel, then the second merged superpixel has a greater number of pixels than the smallest superpixel of the image by at least a threshold; if it is determined that the second merged superpixel has a greater number of pixels than the smallest superpixel of the image by at least the threshold, selecting the smallest superpixel for merging.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The solution will now be described in more detail by means of exemplary embodiments and with reference to the accompanying drawings, in which:

(2) FIG. 1 is an example of segmentation into superpixels of an image after processing using a prior art superpixel algorithm, such as linear spectral clustering.

(3) FIG. 2 is an example of segmentation into superpixels of an image already segmented into superpixels in a first stage and further processed with embodiments of the method of the present invention.

(4) FIG. 3 illustrates the principle of a four-connected neighborhood.

(5) FIG. 4 illustrates the principle of an eight-connected neighborhood.

(6) FIG. 5 is a block diagram illustrating a user device, according to possible embodiments.

(7) FIG. 6 is a flowchart of an embodiment of the method for merging superpixels.

DETAILED DESCRIPTION

(8) FIG. 1 shows an example of segmentation into superpixels of an image. This segmentation may be done with any prior art superpixel algorithm for generating superpixels. Such an algorithm may be any of the algorithms mentioned in the background section, such as linear spectral clustering.

(9) FIG. 2 is an example of how an image may look after it has been segmented into fewer superpixels than in FIG. 1. The image shown in FIG. 1 may be used as input when performing the method, and FIG. 2 may be the output, as will be explained in greater detail below in conjunction with FIG. 6.

(10) In image processing and image recognition, pixel connectivity is the way in which pixels in 2-dimensional images relate to their neighbors. There are different types of connectivity, some of which will briefly be described here.

(11) FIG. 3 illustrates the principle of a four-connected neighborhood. Four-connected pixels are neighbors to every pixel that touches one of their edges. These pixels are connected horizontally and vertically. In terms of pixel coordinates, every pixel with coordinates (x±1, y) or (x, y±1) is connected to a pixel at (x, y).

(12) FIG. 4 illustrates the principle of an eight-connected neighborhood. In this case, eight-connected pixels are neighbors to every pixel that touches one of their edges or corners. These pixels are connected horizontally, vertically, and diagonally. In addition to the four-connected pixels, each pixel with coordinates (x±1, y±1) is connected to a pixel at (x, y).

(13) When talking about neighboring superpixels in the context of this application, the neighboring superpixels may be chosen using any type of connectivity; for example, in addition to the above, six-connected pixels may also be used as neighbors.
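The four- and eight-connected neighborhoods described above can be illustrated with a short helper. This is a minimal sketch; the function name `neighbors` and its signature are illustrative, not from the original:

```python
def neighbors(x, y, width, height, connectivity=4):
    # Four-connected: pixels sharing an edge, i.e. (x±1, y) and (x, y±1).
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if connectivity == 8:
        # Eight-connected adds the corner-touching pixels (x±1, y±1).
        offsets += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    # Keep only neighbors that fall inside the image bounds.
    return [(x + dx, y + dy) for dx, dy in offsets
            if 0 <= x + dx < width and 0 <= y + dy < height]
```

An interior pixel of a 3x3 image thus has four or eight neighbors, while the corner pixel (0, 0) has only two four-connected neighbors.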

(14) Turning now to FIG. 5 and FIG. 6, the method for merging superpixels will be described in more detail. FIG. 5 is a block diagram illustrating a user device 100, according to an exemplary embodiment, and is an example of a user device 100 on which the method may be performed and which is caused to perform the method. The user device 100 may be a mobile telephone, a smart phone, a tablet computer, a laptop, a personal computer or the like. The user device 100 comprises a controller 114, which in turn may comprise a processor 134 and a memory 136. The user device 100 also comprises a communication interface 116 for communication with other devices. The interface 116 may be configured for both wireless communication and communication by wire. The controller 114 may be tailor-made for this application or be an off-the-shelf processor and memory configured for this application. For example, the described processor may comprise a single Central Processing Unit, CPU, or could comprise two or more processing units. The processor may include general purpose microprocessors, instruction set processors and/or related chip sets and/or special purpose microprocessors such as Application Specific Integrated Circuits, ASICs, Field Programmable Gate Arrays, FPGAs, or Complex Programmable Logic Devices, CPLDs. Even if the ASICs, FPGAs and CPLDs as such do not include a processor and memory, they may perform the same function as the controller 114 depicted with processor and memory in FIG. 5. Thus, FIG. 5 is only used to describe the function performed by the user device 100 and should not be interpreted as the actual physical layout of the user device 100. Thus, the user device 100 function may be realized by a controller 114 in the form of processor and memory, ASICs, FPGAs and/or CPLDs, i.e. in hardware only, in software, or in any combination thereof.

(15) If carried out in software, the software may be stored in the memory 136 in the form of a computer program 138. However, the software in the form of the computer program 138 may also be carried by a computer program product 140 which is loaded into the user device 100 for execution. The computer program product may be any medium capable of carrying the computer program 138, such as a CD, DVD, flash memory, USB stick or downloadable object. Each computer program product 140 or memory 136 thus comprises a computer readable medium on which the computer program 138 is stored, e.g. in the form of computer program units. For example, the memory may be a flash memory, a Random-Access Memory, RAM, a Read-Only Memory, ROM, or an Electrically Erasable Programmable ROM, EEPROM, and the program units could in alternative embodiments also be distributed on different computer program products.

(16) The controller 114 in the user device 100 is configured to execute the method described below for example by letting the processor 134 execute instructions stored in the memory 136.

(17) Turning now to FIG. 6, the method performed by the user device for merging superpixels will be described in more detail. As mentioned above, the method merges superpixels in an image having an initial number of superpixels into a final number of superpixels. Each superpixel has an average color intensity value, as defined by any suitable color space. A color space is a specific organization of colors. A color space identifies a particular combination of color model and mapping function. A color model is an abstract mathematical model describing the way colors can be represented as tuples of numbers (e.g. triples in RGB or quadruples in CMYK). Adding a specific mapping function between a color model and a reference color space establishes within the reference color space a definite "footprint", known as a gamut, and for a given color model this defines a color space. For example, Adobe RGB and sRGB are two different absolute color spaces, both based on the RGB color model. When defining a color space, the usual reference standards are the CIELAB and CIEXYZ color spaces, which were specifically designed to encompass all colors the average human can see. As for the present invention, any color space may be used which is capable of outputting the average color intensity value for a superpixel, such as the CIELAB color space.

(18) The method starts with step S100, in which the user device 100 receives an image I = {x_l}, l = 1, ..., N, consisting of N pixels. The intensity value of each pixel is defined in some arbitrary color space, such as RGB, CIELAB or gray scale. The received image comprises information about the initial number, K, of superpixels that the image consists of. The initial number of superpixels, K, is preferably above 1000. Furthermore, the received image also comprises information about the contours of the image. The initial number of superpixels may have been obtained by using any prior art superpixel algorithm, and the contour information may be obtained by using any known contour detection algorithm.
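The average color intensity of each superpixel, used later by the Link function, can be computed directly from a label map. A hedged sketch assuming a NumPy H x W x C image and an H x W integer label map; the function name and array layout are assumptions, not from the source:

```python
import numpy as np

def superpixel_means(image, labels):
    # Map each superpixel id to the mean color vector of its pixels,
    # selecting pixels with a boolean mask per label.
    return {int(sp): image[labels == sp].mean(axis=0)
            for sp in np.unique(labels)}
```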

(19) Let C = {c_l}, l = 1, ..., N, with each c_l ∈ [0, 1], represent the contour information of the image I, given by any arbitrary contour detection algorithm. Let S = {S_i}, i = 1, ..., K, represent the initial decomposition of I into K superpixels (S_i ⊂ {1, 2, ..., N}). Here, K = N represents each pixel as a superpixel. From the received image I, pixel connectivity is determined by constructing a weighted graph of degree four, or by using any other suitable pixel connectivity method, G = (V, E, W), whose vertices V are the pixels of the image and whose edges E form a grid connecting neighboring pixels. The weight of an edge represents the similarity of the two corresponding pixels. For two pixels l and l', w_l,l' = K_1(l, l'), where K_1(l, l') is an arbitrary function depending on pixels l and l'. For example, this function can be defined as a kernel as follows:

(20) K_1(l, l') = exp(-(c_l + c_l')/σ_1)

(21) In this case, when the pixels l, l' actually belong to a boundary of an object, the contour values c_l, c_l' tend to be high (close to 1) and hence the weight w_l,l' tends to be very small. σ_1 is a constant between 0 and 1. The cut value of two superpixels S_i and S_j is then defined by
Cut(S_i, S_j) = K_3(S_i, S_j),

(22) where K_3(S_i, S_j) is an arbitrary function depending on the size, the perimeter and/or the color intensity values of superpixels S_i and S_j. For example, this function can be defined as:
K_3(S_i, S_j) = Σ_{l∈S_i} Σ_{l'∈S_j} w_l,l'
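The example kernel K_1 and the resulting cut value can be sketched in a few lines. This is a minimal illustration under the example definitions above; the data structures (a contour map and a list of neighboring cross-boundary pixel pairs) are assumptions:

```python
import math

def edge_weight(c_l, c_lp, sigma1=0.5):
    # K_1: high contour values (a likely object boundary) give a small weight.
    return math.exp(-(c_l + c_lp) / sigma1)

def cut(edge_pixel_pairs, contours, sigma1=0.5):
    # Cut(Si, Sj): sum of edge weights over neighboring pairs (l, l')
    # with l in Si and l' in Sj; `contours` maps pixel index -> contour value.
    return sum(edge_weight(contours[l], contours[lp], sigma1)
               for l, lp in edge_pixel_pairs)
```

With zero contour values each edge contributes a weight of 1, while pixel pairs lying on a strong contour contribute almost nothing.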

(23) The linkage distance of two superpixels is defined as
Link(S_i, S_j) = K_2(S_i, S_j),

(24) where K_2(S_i, S_j) is another arbitrary function depending on superpixels S_i and S_j. For example, this function can be defined by a kernel as follows:

(25) K_2(S_i, S_j) = exp(-||μ_i - μ_j||/σ_2) * |S_i| * |S_j|,

(26) where μ_i represents the average color intensity of superpixel S_i and |S_i| represents the number of pixels in S_i. Given the Cut and Link functions, a Penalized Average Linkage Cuts, PALC, similarity is defined as follows:
PALC(S_i, S_j) = Cut(S_i, S_j) * Link(S_i, S_j)

(27) As mentioned above, the arbitrary Cut and Link functions depend on the superpixels S_i and S_j, i.e. on the properties of the superpixels such as size, perimeter and/or color intensity values.
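Under the example kernels, the Link and PALC values reduce to a few lines of code. A sketch with illustrative names; the average colors μ_i, μ_j and the sizes |S_i|, |S_j| are passed in directly:

```python
import math

def link(mu_i, mu_j, size_i, size_j, sigma2=10.0):
    # K_2: kernel on the distance between the average colors,
    # scaled by the superpixel sizes |Si| * |Sj|.
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(mu_i, mu_j)))
    return math.exp(-dist / sigma2) * size_i * size_j

def palc(cut_value, link_value):
    # PALC similarity: product of the Cut and Link values.
    return cut_value * link_value
```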

(28) In step S110, the user device 100 also receives the value of the final number, k, of superpixels into which the image is to be segmented. The final number of superpixels may be in the range of 2 to K-1 superpixels. After receiving the image and the final number of superpixels, a similarity value is calculated in step S130 according to the PALC similarity function defined above, using the Cut and Link functions. The similarity value is used for determining the similarity between each superpixel and its neighboring superpixels, wherein the similarity value is based on the PALC value between each pair of neighboring superpixels.

(29) In step S140, the pair of superpixels that has the highest similarity value is selected. After this selection it may be checked, in step S150, whether the size of the selected pair of superpixels, when merged, is more than a constant, h, times larger than the smallest superpixel. h is a constant that preferably is between 10 and 100. In one embodiment the constant h is preprogrammed into the user device 100; in another embodiment a user may enter it into the user device 100. Thus, the constant h may be received by the user device 100 in an optional step S120 (shown with dashed lines in FIG. 6). If it is determined that the size of the selected pair of superpixels, when merged, is more than h times larger than the smallest superpixel, the smallest superpixel is selected in step S160 and the similarity value between pairs of the selected smallest superpixel and superpixels neighboring the selected smallest superpixel is calculated in step S170. Thereafter, in step S180, the selected pair of superpixels is merged into a larger superpixel.

(30) If it is determined in step S150 that the size of the selected pair of superpixels, when merged, is not more than h times larger than the smallest superpixel, steps S160 and S170 are omitted and the pair of selected superpixels that has the highest similarity value is merged in step S180. Thus, when the difference in size between two superpixels that are to be merged is smaller than h, the method proceeds directly from step S150 to step S180. This check regarding the size is done in order to somewhat balance the sizes of the superpixels.

(31) Thereafter, in step S190, it is determined whether the current number of superpixels in the image is greater than the received value of the final number of superpixels. If the number of superpixels is greater than the target value, the method continues with reducing the number of superpixels by repeating the steps of calculating (S130), selecting (S140), determining (S150) and merging (S180) until the current number of superpixels in the image is equal to the received value of the final number of superpixels. If the current number of superpixels instead is the same as the received value of the final number of superpixels, the target number of superpixels has been achieved and the method ends.
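Steps S130 through S190 can be sketched as a greedy loop over an adjacency structure. This is a simplified illustration, not the claimed implementation: the superpixel and adjacency representations and the pluggable `similarity` callback (e.g. the PALC value) are assumptions:

```python
def merge_superpixels(superpixels, adjacency, similarity, k, h=10):
    # superpixels: dict id -> set of pixel indices
    # adjacency:   dict id -> set of neighboring superpixel ids
    # similarity:  function(id_a, id_b) -> similarity value
    # k:           target (final) number of superpixels
    # h:           size-balance constant from step S150
    while len(superpixels) > k:
        # S130/S140: pick the most similar pair of neighboring superpixels.
        a, b = max(((i, j) for i in superpixels for j in adjacency[i] if i < j),
                   key=lambda p: similarity(*p))
        # S150: if the merged pair would exceed the smallest superpixel by
        # more than a factor h, merge the smallest one instead (S160/S170).
        smallest = min(superpixels, key=lambda i: len(superpixels[i]))
        if len(superpixels[a]) + len(superpixels[b]) > h * len(superpixels[smallest]):
            a = smallest
            b = max(adjacency[a], key=lambda j: similarity(a, j))
        # S180: merge b into a and rewire the adjacency structure.
        superpixels[a] |= superpixels.pop(b)
        for n in adjacency.pop(b):
            adjacency[n].discard(b)
            if n != a:
                adjacency[n].add(a)
                adjacency[a].add(n)
        adjacency[a].discard(a)
    return superpixels
```

Note the S150 branch: when the best pair would grow far beyond the smallest superpixel, the smallest superpixel is merged with its most similar neighbor instead, which keeps the superpixel sizes somewhat balanced.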

(32) In a first embodiment for merging superpixels in an image from an initial number of superpixels into a final number of superpixels, the embodiment being performed by a user device (100) and includes: receiving (S100) the image comprising information about the initial number of superpixels and about the contours of the image, receiving (S110) a value of the final number of superpixels into which the image is to be segmented, calculating (S130) a similarity value for the similarity between each superpixel (S_i) and its neighboring superpixels (S_j), wherein the similarity value is based on Penalized Average Linkage Cuts, PALC, where PALC(S_i, S_j) = Cut(S_i, S_j) * Link(S_i, S_j) and the Cut and Link functions are chosen depending on the properties of the superpixels (S_i, S_j), selecting (S140) the pair of superpixels that has the highest similarity value, determining (S150) that the selected pair of superpixels that has the highest similarity value, when merged, is more than a constant, h, times larger than the smallest superpixel in the image, and in response thereto: a) selecting (S160) the smallest superpixel in the image, i.e. the superpixel that consists of the fewest pixels, and b) calculating (S170) the similarity value for the similarity between pairs of the selected smallest superpixel and superpixels neighboring the selected smallest superpixel, and further in response to the determining step: a) merging (S180) the pair of superpixels that has the highest similarity value, b) determining (S190) that the current number of superpixels in the image is greater than the received value of the final number of superpixels, and in response thereto, repeating the calculating, selecting, determining and merging steps (S130, S140, S150, S180) until it is determined (S190) that the current number of superpixels in the image is equal to the received value of the final number of superpixels.

(33) In a related second embodiment, the first embodiment further includes receiving (S120) the constant, h, wherein h is within a range of 1 to N and N is an original number of pixels in the received image.

(34) In a related third embodiment, the first or second embodiments include
Cut(S_i, S_j) = Σ_{l∈S_i} Σ_{l'∈S_j} K_1(l, l'),

(35) Link(S_i, S_j) = exp(-||μ_i - μ_j||/σ_2) * |S_i| * |S_j|, and K_1(l, l') = exp(-(c_l + c_l')/σ_1),
where μ_i represents the average color intensity of S_i, c_l represents the contour value of the pixel corresponding to x_l, and wherein σ_1 and σ_2 are constants.

(36) In a related fourth embodiment, any of the first through third embodiments include the average color intensity values that are defined by a CIELAB color space.

(37) In a related fifth embodiment, any of the first through fourth embodiments include neighboring superpixels that are selected using a 4-connected neighborhood.

(38) In a sixth embodiment, a user device (100) for merging superpixels in an image from an initial number of superpixels into a final number of superpixels, wherein the user device (100) comprises a controller (114) comprising a processor (134) and a memory (136), the memory (136) comprising instructions which when executed by the processor (134) causes the user device (100) to perform any of the first through fifth embodiments.

(39) In a seventh embodiment for merging superpixels in an image from an initial number of superpixels into a final number of superpixels, the embodiment being performed by a user device (100) and includes: receiving the image comprising information about the initial number of superpixels and about the contours of the image, receiving a value of the final number of superpixels into which the image is to be segmented, calculating a similarity value for the similarity between each superpixel (S_i) and its neighboring superpixels (S_j), wherein the similarity value is based on Penalized Average Linkage Cuts, PALC, where PALC(S_i, S_j) = Cut(S_i, S_j) * Link(S_i, S_j) and the Cut and Link functions are chosen depending on the properties of the superpixels (S_i, S_j), selecting the pair of superpixels that has the highest similarity value, determining that the selected pair of superpixels that has the highest similarity value, when merged, is more than a constant, h, times larger than the smallest superpixel in the image, and in response thereto: a) selecting the smallest superpixel in the image, i.e. the superpixel that consists of the fewest pixels, b) calculating the similarity value for the similarity between pairs of the selected smallest superpixel and superpixels neighboring the selected smallest superpixel, and further in response to the determining step: a) merging the pair of superpixels that has the highest similarity value, b) determining that the current number of superpixels in the image is greater than the received value of the final number of superpixels, and in response thereto, repeating the calculating, selecting, determining and merging steps until it is determined that the current number of superpixels in the image is equal to the received value of the final number of superpixels.

(40) In a related eighth embodiment, the user device (100) according to the sixth embodiment, which is further caused to receive the constant, h, wherein h is within a range of 1 to N, and N is an original number of pixels in the received image.

(41) In a related tenth embodiment, the user device (100) according to the sixth and seventh embodiments, which is further caused to determine
Cut(S_i, S_j) = Σ_{l∈S_i} Σ_{l'∈S_j} K_1(l, l'),

(42) Link(S_i, S_j) = exp(-||μ_i - μ_j||/σ_2) * |S_i| * |S_j|, and K_1(l, l') = exp(-(c_l + c_l')/σ_1),
where μ_i represents the average color intensity of S_i, c_l represents the contour value of the pixel corresponding to x_l, and σ_1 and σ_2 are constants.

(43) In a related eleventh embodiment, a computer program (138) comprising computer program code, the computer program code being adapted, if executed on a processor (134), to implement the method according to any one of the first through fifth embodiments.

(44) In a related twelfth embodiment, a computer program product comprising a computer readable storage medium (140), the computer readable storage medium having the computer program (138) according to the ninth embodiment.

(45) In an embodiment, the present invention relates to a method of merging superpixels in an image from an initial number of superpixels into a final number of superpixels, the method being performed by a user device (100). The method, in such an embodiment, comprises receiving the image (S100) comprising the initial number of superpixels and the value, k, (S110) of the final number of superpixels into which the image is to be segmented. An example similarity value for the similarity between each superpixel and its neighboring superpixels is calculated (S130). The similarity value is based on a Penalized Average Linkage Cuts (PALC) value between each pair of neighboring superpixels. The pair of superpixels that has the highest similarity value may be selected (S140) and the selected pair of superpixels may be merged (S180). The method may be repeated until it is determined (S190) that the current number of superpixels in the image is equal to the received value of the final number of superpixels.

(46) While the solution has been described with reference to specific exemplary embodiments, the description is generally only intended to illustrate the inventive concept and should not be taken as limiting the scope of the solution.