Discrete Edge Binning Template Matching System, Method And Computer Readable Medium
20170270668 · 2017-09-21
Inventors
CPC classification
G06V10/751
PHYSICS
A61B5/0033
HUMAN NECESSITIES
A61B5/7246
HUMAN NECESSITIES
G06V10/50
PHYSICS
International classification
Abstract
According to the invention, one or more discrete edge binning (“DEB”) features of a DEB template matching system, method and/or computer readable medium may preferably comprise and/or apply an image processing algorithm, preferably for use in template matching. According to the invention, template matching may preferably involve using one or more known reference features to detect and/or localize similar features within an image.
Claims
1. A method of matching at least part of an image against one or more reference templates stored in a database, wherein at least one edge feature is embedded in the image, wherein the method comprises: a) a database providing step of providing the database, such that each of the reference templates comprises a set of reference feature parameters; b) a receiving step of receiving the image; c) a feature extraction step comprising: i) a differential edge detection substep of using a contrast invariant technique to render the image contrast invariant and to depict one or more edge pixels of the edge feature among one or more image pixels of the image; and ii) an orientation and spatial binning substep of binning the edge pixels into a predetermined number of orientation bins, and spatially binning adjacent ones of the image pixels into discrete edge binning (DEB) cells, to generate a DEB cell image depicting the edge feature; and d) a feature classification step comprising: i) a feature response substep of comparing the DEB cell image to each said set of reference feature parameters to determine how well the DEB cell image matches each of the reference templates; and ii) a match detection substep of locating a best match of the DEB cell image among the reference templates, and correlating the best match against one or more predetermined match threshold values to determine when a matching one of the reference templates is found; whereby the image is matched with the matching one of the reference templates.
2. The method according to claim 1, wherein a first one of the reference templates is provided with higher sensitivity, than a second one of the reference templates, to the edge feature embedded in the image and depicted in the DEB cell image.
3. The method according to claim 1, wherein a first one of the reference templates matches a different region of the image than a second one of the reference templates.
4. The method according to any one of claims 1 to 3 wherein, in the differential edge detection substep, the image is scaled and any artifacts in the image with spatial resolution lower than a predetermined spatial resolution threshold value are suppressed.
5. The method according to any one of claims 1 to 4 wherein, in the differential edge detection substep, a low pass filter is applied to and convolved with the image to suppress high frequencies associated with pixel noise.
6. The method according to claim 5, wherein the low pass filter is a multivariate Gaussian filter.
7. The method according to any one of claims 1 to 6 wherein, in the differential edge detection substep, the image is converted to greyscale.
8. The method according to any one of claims 1 to 7 wherein, in the differential edge detection substep, one or more derivatives of the image are differentially calculated and used to localize and geometrically define the edge feature.
9. The method according to claim 8 wherein, in the differential edge detection substep, the derivatives comprise a gradient calculated by differentiating the image in two dimensions, and a direction of the gradient is obtained and used to localize and geometrically define the edge pixels of the edge feature at a gradient maximum along the direction of the gradient.
10. The method according to one of claims 8 and 9, wherein the derivatives are used, with reference to a predetermined edge minimum threshold value, to define the edge feature.
11. The method according to any one of claims 1 to 7 wherein, in the differential edge detection substep, one or more derivatives of the image are differentially calculated; and wherein the derivatives are used to calculate an orientation for each of the edge pixels.
12. The method according to claim 11 wherein, in the orientation and spatial binning substep, the orientation for each of the edge pixels is assigned to one of the predetermined number of orientation bins most closely corresponding to the orientation.
13. The method according to claim 12 wherein, in the orientation and spatial binning substep, for each one of the discrete edge binning (DEB) cells, a sum is calculated based on the orientation bin assigned to each of the edge pixels among the image pixels spatially binned into said each one of the discrete edge binning (DEB) cells.
14. The method according to any one of claims 1 to 12 wherein, in the orientation and spatial binning substep, for each one of the discrete edge binning (DEB) cells, a sum is calculated based on the orientation for each of the edge pixels among the image pixels spatially binned into said each one of the discrete edge binning (DEB) cells.
15. The method according to any one of claims 1 to 14 wherein, in the orientation and spatial binning substep, each of the discrete edge binning (DEB) cells correlates to a substantially rectangular (M.sub.1×M.sub.2) configuration of said adjacent ones of the image pixels.
16. The method according to claim 15 wherein, in the orientation and spatial binning substep, the image is processed to generate a cell offset image containing (M.sub.1×M.sub.2) scaled images corresponding to a starting offset of said each of the discrete edge binning (DEB) cells.
17. The method according to any one of claims 1 to 16, wherein the feature extraction step further comprises a feature cropping substep of cropping the DEB cell image to normalize depiction of the edge feature in the DEB cell image.
18. The method according to any one of claims 1 to 17, wherein in the feature response substep, for each of the reference templates, a match value is calculated against the DEB cell image.
19. The method according to any one of claims 1 to 18, wherein in the feature response substep, one or more feature response maps are generated representing how well the DEB cell image matches each of the reference templates.
20. The method according to claim 19 wherein, in the match detection substep, the best match is located on the feature response maps.
21. The method according to any one of claims 1 to 20, wherein the predetermined match threshold values comprise: a predetermined correlation threshold value based on a correlation with the edge feature; and/or a predetermined distance threshold value based on a distance from a search origin for the edge feature.
22. The method according to any one of claims 1 to 21, adapted for use with a rapid diagnostic test device and/or cassette image as the image.
23. A system for matching at least part of an image, wherein at least one edge feature is embedded in the image, wherein the system comprises: a) a database which stores one or more reference templates, with each of the reference templates comprising a set of reference feature parameters; b) an image receiving element operatively receiving the image; c) one or more image processors operative to match said at least part of the image against the reference templates stored in the database, with the image processors operatively encoded to: i) use a contrast invariant technique to render the image contrast invariant and depict one or more edge pixels of the edge feature among one or more image pixels of the image; ii) bin the edge pixels into a predetermined number of orientation bins, and spatially bin adjacent ones of the image pixels into discrete edge binning (DEB) cells, to generate a DEB cell image depicting the edge feature; iii) compare the DEB cell image to each said set of reference feature parameters to determine how well the DEB cell image matches each of the reference templates; and iv) locate a best match of the DEB cell image among the reference templates, and correlate the best match against one or more predetermined match threshold values to determine when a matching one of the reference templates is found; whereby the system matches the image with the matching one of the reference templates.
24. The system according to claim 23, wherein a first one of the reference templates has higher sensitivity, than a second one of the reference templates, to the edge feature embedded in the image and depicted in the DEB cell image.
25. The system according to claim 23, wherein a first one of the reference templates matches a different region of the image than a second one of the reference templates.
26. The system according to any one of claims 23 to 25 wherein the image processors are also operatively encoded to scale the image and suppress any artifacts in the image with spatial resolution lower than a predetermined spatial resolution threshold value.
27. The system according to any one of claims 23 to 26 further comprising a low pass filter; and wherein the image processors are also operatively encoded to apply the low pass filter to, and convolve the low pass filter with, the image to suppress high frequencies associated with pixel noise.
28. The system according to claim 27, wherein the low pass filter is a multivariate Gaussian filter.
29. The system according to any one of claims 23 to 28, wherein the image processors are also operatively encoded to convert the image to greyscale.
30. The system according to any one of claims 23 to 29, wherein the image processors are also operatively encoded to differentially calculate one or more derivatives of the image and to use the derivatives to localize and geometrically define the edge feature.
31. The system according to claim 30, wherein the image processors are also operatively encoded: to calculate one or more of the derivatives as a gradient by differentiating the image in two dimensions; to obtain a direction of the gradient; and to use the direction of the gradient to localize and geometrically define the edge pixels of the edge feature at a gradient maximum along the direction of the gradient.
32. The system according to one of claims 30 and 31, wherein the image processors are also operatively encoded to use the derivatives, with reference to a predetermined edge minimum threshold value, to define the edge feature.
33. The system according to any one of claims 23 to 29, wherein the image processors are also operatively encoded: to differentially calculate one or more derivatives of the image; and to use the derivatives to calculate an orientation for each of the edge pixels.
34. The system according to claim 33, wherein the image processors are also operatively encoded to assign the orientation for each of the edge pixels to one of the predetermined number of orientation bins most closely corresponding to the orientation.
35. The system according to claim 34, wherein the image processors are also operatively encoded to, for each one of the discrete edge binning (DEB) cells, calculate a sum based on the orientation bin assigned to each of the edge pixels among the image pixels spatially binned into said each one of the discrete edge binning (DEB) cells.
36. The system according to any one of claims 23 to 34, wherein the image processors are also operatively encoded to, for each one of the discrete edge binning (DEB) cells, calculate a sum based on the orientation for each of the edge pixels among the image pixels spatially binned into said each one of the discrete edge binning (DEB) cells.
37. The system according to any one of claims 23 to 36, wherein the image processors are also operatively encoded to correlate each of the discrete edge binning (DEB) cells to a substantially rectangular (M.sub.1×M.sub.2) configuration of said adjacent ones of the image pixels.
38. The system according to claim 37, wherein the image processors are also operatively encoded to process the image to generate a cell offset image containing (M.sub.1×M.sub.2) scaled images corresponding to a starting offset of said each of the discrete edge binning (DEB) cells.
39. The system according to any one of claims 23 to 38, wherein the image processors are also operatively encoded to crop the DEB cell image to normalize depiction of the edge feature in the DEB cell image.
40. The system according to any one of claims 23 to 39, wherein the image processors are also operatively encoded to, for each of the reference templates, calculate a match value against the DEB cell image.
41. The system according to any one of claims 23 to 40, wherein the image processors are also operatively encoded to generate one or more feature response maps representing how well the DEB cell image matches each of the reference templates.
42. The system according to claim 41, wherein the image processors are also operatively encoded to locate the best match on the feature response maps.
43. The system according to any one of claims 23 to 42, wherein the predetermined match threshold values comprise: a predetermined correlation threshold value based on a correlation with the edge feature; and/or a predetermined distance threshold value based on a distance from a search origin for the edge feature.
44. The system according to any one of claims 23 to 43, adapted for use with a rapid diagnostic test device and/or cassette image as the image.
45. A computer readable medium for use with an image wherein at least one edge feature is embedded, and with a database which stores one or more reference templates that each comprise a set of reference feature parameters, the computer readable medium encoded with executable instructions to, when executed, encode one or more image processors to automatically match at least part of the image against the reference templates stored in the database by automatically performing the steps of: a) using a contrast invariant technique to render the image contrast invariant and depict one or more edge pixels of the edge feature among one or more image pixels of the image; b) binning the edge pixels into a predetermined number of orientation bins, and spatially binning adjacent ones of the image pixels into discrete edge binning (DEB) cells, to generate a DEB cell image depicting the edge feature; c) comparing the DEB cell image to each said set of reference feature parameters to determine how well the DEB cell image matches each of the reference templates; and d) locating a best match of the DEB cell image among the reference templates, and correlating the best match against one or more predetermined match threshold values to determine when a matching one of the reference templates is found; whereby the computer readable medium encodes the image processors to match the image with the matching one of the reference templates.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0066] The novel features which are believed to be characteristic of the system, method and computer readable medium according to the present invention, as to their structure, organization, use, and method of operation, together with further objectives and advantages thereof, will be better understood from the following drawings in which presently preferred embodiments of the invention will now be illustrated by way of example. It is expressly understood, however, that the drawings are for the purpose of illustration and description only, and are not intended as a definition of the limits of the invention. In the accompanying drawings:
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0081] Referring now to
[0082] One or more DEB transforms (which may preferably result in one or more DEB features) preferably forms the basis for a feature extraction front end, preferably in an image registration system, method and/or computer readable medium according to the invention.
[0083] As shown in
[0084] According to the present invention, DEB feature classification is preferably a second phase of DEB template matching performed and provided by a system, method and/or computer readable medium, for comparing a set of reference features (preferably obtained during feature extraction) to an image to be classified, preferably in order to determine whether the features match.
[0085] As shown in
[0086] For the match detection step 220, the location and correlation of best match is preferably compared to one or more internal thresholds, preferably to determine if a match is found (i.e., a match result 40) based on the comparison of the DEB features 20 with the DEB cell image 30.
[0087] The method 100 preferably provides for the achievement of contrast invariance by using a differential edge detector front end (i.e., substep 110 in
[0088] The method 100 also preferably provides for the achievement of selectable regional invariance by defining templates using a binned orientation and spatial approach (e.g., similar to those used by image registration algorithms of the prior art such as HOG and SIFT). Whereas prior art image registration algorithms may have defined feature descriptors over a small area, the present invention preferably combines feature cells into DEB templates. The DEB templates are preferably defined in such a manner that only the salient features are selected, thereby providing selectable regional invariance to areas in the image, or within the template itself, that may be highly variable.
A. DEB Feature Extraction
Differential Edge Detector
[0089] According to the present invention, the differential edge detector step 110 preferably extracts a salient edge of a cassette image 10, preferably using contrast invariant techniques to provide contrast invariance (as shown in
[0090] According to an aspect of one preferred embodiment of the invention, and as best seen in
[0092] In the sections below, one or more steps of the differential edge detector substep 110 as preferably performed by a system, method and/or computer readable medium according to one aspect of the invention are described in more detail.
I. Scale Image
[0093] As shown in
II. Low Pass Filter
[0094] According to the invention, the low pass filter substep 112 depicted in
[0095] The low pass filter substep 112 preferably reduces high frequency noise by the application of a multivariate Gaussian filter to the input image 10. According to an aspect of one preferred embodiment of the invention, a covariance between x and y may preferably be assumed to be zero and the filter is preferably described according to the equation
F(x,y)=(1/(2πσ.sub.xσ.sub.y))exp(−x.sup.2/(2σ.sub.x.sup.2)−y.sup.2/(2σ.sub.y.sup.2))
where F is the filter coefficient described above and σ.sub.x, σ.sub.y are the filter standard deviations along x and y.
[0096] The filter is preferably convolved with the input image I, preferably to produce the filtered output I.sub.f according to the equation
I.sub.f=I*F
[0097] where F is the filter coefficient, as above.
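The low pass filter substep 112 can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: because the covariance between x and y is assumed to be zero, the two-dimensional Gaussian is separable, so the image may be filtered along the rows and then along the columns. The sigma and radius values are hypothetical, as the patent leaves them unspecified.

```python
import numpy as np

# Sketch of the low pass filter substep 112: a separable (zero x-y
# covariance) multivariate Gaussian kernel F is convolved with the input
# image I to suppress high frequencies associated with pixel noise.
# Kernel sigma and radius are illustrative assumptions.
def gaussian_kernel_1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return k / k.sum()                 # normalise so brightness is preserved

def gaussian_blur(I, sigma=2.0, radius=6):
    k = gaussian_kernel_1d(sigma, radius)
    # Zero covariance makes the 2-D Gaussian separable:
    # filter the rows, then the columns.
    rows = np.apply_along_axis(np.convolve, 1, I, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

rng = np.random.default_rng(0)
I = rng.random((64, 64))      # stand-in for the input image 10
I_f = gaussian_blur(I)        # I_f = I * F
```

Filtering noise-like input in this way leaves the image size unchanged while reducing pixel-to-pixel variation, which is the stated purpose of the substep.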
III. Grayscale Conversion
[0098] As shown in
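The grayscale conversion named in the heading above can be sketched as follows. The patent does not specify the conversion weights; the common ITU-R BT.601 luma weights are an assumption made here for illustration only.

```python
import numpy as np

# Sketch of the grayscale conversion substep: the colour image is
# reduced to a single luminance channel. The BT.601 weights below are
# an assumed choice; the patent leaves the conversion unspecified.
def to_grayscale(rgb):
    """rgb: (..., 3) array with channels in R, G, B order."""
    return rgb @ np.array([0.299, 0.587, 0.114])

gray = to_grayscale(np.ones((2, 2, 3)))  # a 2x2 all-white RGB image
```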
IV. Differentiation
[0099] The differentiation substep 114 shown in
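The differentiation substep can be sketched as follows. Central differences (as computed by `np.gradient`) are an assumed discretisation; the patent does not fix a particular differencing scheme.

```python
import numpy as np

# Sketch of the differentiation substep 114: first derivatives of the
# filtered image I_f along the x and y dimensions.
I_f = np.outer(np.arange(16.0), np.ones(16))  # brightness ramp along the rows

# np.gradient returns the derivative along each axis in turn:
# axis 0 (down the rows, "y") and axis 1 (across the columns, "x").
I_y, I_x = np.gradient(I_f)
```

On the ramp image above, the derivative along the ramp direction is constant and the derivative across it is zero, as expected.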
V. Differential Edge Definition and/or Threshold
[0100] According to the invention, the edge definition and threshold substep 115 shown in
[0101] The edge definition and threshold substep 115 preferably involves the construction of an edge pixel defined as a point at which a gradient magnitude assumes a maximum in a direction of the gradient [see, for example, J. Canny, “A Computational Approach to Edge Detection”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-8, No. 6, November 1986].
[0102] According to the invention, the gradient is preferably calculated by differentiating the filtered image along the x and y dimensions according to the equations
I.sub.x=∂I.sub.f/∂x
I.sub.y=∂I.sub.f/∂y
where I.sub.f is the filtered output from above.
[0103] According to the invention, the direction of the gradient is preferably obtained according to the equation
θ=tan.sup.−1(I.sub.y/I.sub.x)
based on the results from differentiating the filtered image along the x and y dimensions above.
[0104] Each point (or edge pixel) is preferably modified with a local coordinate transform, preferably such that new coordinates (u,v) are aligned to the direction of the gradient. Coordinate v is preferably parallel to the direction of the gradient, θ, and coordinate u is preferably perpendicular to the direction of the gradient, θ. Preferably, with this coordinate transform, I.sub.v is defined as the gradient at each point, preferably in the direction of the gradient. I.sub.v preferably has a magnitude equal to √{square root over (I.sub.x.sup.2+I.sub.y.sup.2)} and/or an angle of θ. Using the definition above, it may be clear to persons having ordinary skill in the art that I(i,j) is preferably an edge pixel in accordance with the following equations
I.sub.vv=0
I.sub.vvv<0.
[0105] According to the invention, the first condition (I.sub.vv=0), above, preferably defines a condition when the gradient reaches a minimum or maximum value, preferably since a first derivative of the gradient may be equal to zero. According to the invention, the second condition (I.sub.vvv<0), above, preferably requires that the point be a maximum, preferably such that a second derivative of the gradient is negative, preferably corresponding to a concave curvature of the peak.
[0106] It may be shown that I.sub.vv and/or I.sub.vvv, according to the invention, are given by the following expressions [see, for example, T. Lindeberg, “Edge Detection and Ridge Detection with Automatic Scale Selection”, International Journal of Computer Vision 30 (2): 117-156, 1998]
I.sub.vv=I.sub.x.sup.2I.sub.xx+2I.sub.xI.sub.yI.sub.xy+I.sub.y.sup.2I.sub.yy
I.sub.vvv=I.sub.x.sup.3I.sub.xxx+3I.sub.x.sup.2I.sub.yI.sub.xxy+3I.sub.xI.sub.y.sup.2I.sub.xyy+I.sub.y.sup.3I.sub.yyy.
[0107] The foregoing definitions, as provided, preferably return all edges in a given image. Preferably, given that it may be desirable to retain only relatively strong edges, the second condition is preferably modified in accordance with the following relationship
I.sub.vvv<−T
where T is preferably a minimum threshold which preferably sets a minimum curvature of the edge point to be considered an edge.
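The edge definition and threshold substep above can be sketched as follows. This is an illustrative sketch under stated assumptions: derivatives are approximated by finite differences, the condition I.sub.vv=0 is relaxed to a small tolerance on a discrete grid, and the threshold values are hypothetical.

```python
import numpy as np

# Sketch of the edge definition and threshold substep 115: edge pixels
# satisfy I_vv ~ 0 and I_vvv < -T, with I_vv and I_vvv assembled from
# the image derivatives per the expressions above. The tolerance eps
# and threshold T are illustrative assumptions.
def edge_mask(I_f, T=1e-4, eps=1e-6):
    I_y, I_x = np.gradient(I_f)           # first derivatives (axis 0 = y)
    I_yy, I_xy = np.gradient(I_y)         # I_xy: mixed second derivative
    _, I_xx = np.gradient(I_x)
    I_xxy, I_xxx = np.gradient(I_xx)      # third derivatives
    I_yyy, I_xyy = np.gradient(I_yy)
    I_vv = I_x**2 * I_xx + 2 * I_x * I_y * I_xy + I_y**2 * I_yy
    I_vvv = (I_x**3 * I_xxx + 3 * I_x**2 * I_y * I_xxy
             + 3 * I_x * I_y**2 * I_xyy + I_y**3 * I_yyy)
    return (np.abs(I_vv) < eps) & (I_vvv < -T)

# A smooth step edge along x: the gradient maximum sits at column 8.
cols = np.arange(16.0)
I_step = np.tile(1.0 / (1.0 + np.exp(-(cols - 8.0))), (16, 1))
mask = edge_mask(I_step)
```

On the synthetic step image the mask localizes the edge at the gradient maximum along the gradient direction, which is the construction described above.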
VI. Differential Edge Detector Output
[0108] According to the invention, an output of the differential edge detection substep 110—the edge image 60—is preferably seen in
Orientation and/or Spatial Binning
[0110] According to the present invention, in the orientation and spatial binning step 120 of the DEB feature extraction method 100 shown in
[0111] This step 120 preferably involves the binning of the orientation of edges and the spatial location of edges. One purpose of this step 120 is preferably to provide higher specificity in edges, preferably by using the orientation of the edges to discriminate between shapes, and to provide tolerance to edge location, preferably by binning along the spatial dimension.
[0112] As shown in
I. Edge Orientation
[0113] According to the present invention, the edge orientation substep 121 is preferably used to estimate an orientation of each pixel in the gradient image 50 and the edge image 60.
[0114] The orientation of the edge is preferably obtained using the first derivatives derived in the differentiation substep 114. The angle and magnitude of each edge pixel are preferably obtained using one or more of the following equations
θ=tan.sup.−1(I.sub.y/I.sub.x)
|I.sub.v|=√{square root over (I.sub.x.sup.2+I.sub.y.sup.2)}
[0115] According to the present invention, it may be desirable to be concerned only with the orientation. The formulae above are preferably used to generate an orientation image I.sub.θ.
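Generation of the orientation image can be sketched as follows. One assumption is made explicit here: since an edge at θ and at θ+180 degrees is the same edge, the angle is folded into the half-open range (−90, +90] degrees, matching the binning range described below.

```python
import numpy as np

# Sketch of the edge orientation substep 121: per-pixel orientation from
# the first derivatives I_x and I_y, folded into (-90, +90] degrees
# because edge orientation has no polarity.
def orientation_image(I_x, I_y):
    theta = np.degrees(np.arctan2(I_y, I_x))
    theta = np.where(theta > 90.0, theta - 180.0, theta)
    theta = np.where(theta <= -90.0, theta + 180.0, theta)
    return theta

theta = orientation_image(np.array([1.0, 0.0, -1.0, 1.0]),
                          np.array([0.0, 1.0, 0.0, -1.0]))
```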
II. Orientation Binning
[0116] According to the present invention, for the orientation binning substep 122, the orientation for each edge pixel is preferably binned into one of N bins representing the orientation between 0 and 180 degrees. The number of bins (N) is preferably application specific. Each pixel is preferably represented by a vector of length N, preferably where the vector is zero for all elements except the bin corresponding to the orientation of that pixel.
[0117] Preferably, for the orientation binning substep 122, a range from −90 to +90 degrees is divided into N segments. Since a significant number of features may be located on the vertical or horizontal directions, the bins are preferably defined such that they are centered around 0 degrees. Preferably by doing this and ensuring N is even, the vertical and horizontal edges are preferably positioned in the center of their respective bins. Using this definition, each edge pixel from the edge image, I.sub.E, is preferably compared to the range and assigned to one of N bins. The assignment preferably occurs by defining a vector of length N for each pixel. The vector is preferably 0 for all elements, preferably except the element corresponding to the current edge orientation. That element is preferably set to unity. For example, if N=4, a pixel on a vertical line may correspond to
v=(0,1,0,0)
[0118] and, for a horizontal line, may correspond to
v=(0,0,0,1).
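The orientation binning substep can be sketched as follows. The bin layout is one plausible reading of the N=4 example above, under the assumption that the binned angle is the gradient direction (0 degrees for a pixel on a vertical line, 90 degrees for a horizontal line) and that bins are centred so that 0 degrees falls on a bin centre.

```python
import numpy as np

# Sketch of the orientation binning substep 122: the orientation of each
# edge pixel, in (-90, +90] degrees, is assigned to one of N bins and
# represented as a one-hot vector of length N. The mapping of angle to
# bin index is an assumption consistent with the N = 4 example above.
def bin_orientation(theta_deg, N):
    width = 180.0 / N
    idx = int(np.round((theta_deg + 90.0) / width) - 1) % N
    v = np.zeros(N)
    v[idx] = 1.0          # one-hot vector: zero except the assigned bin
    return v
```

With N=4 this reproduces the example above: a vertical line maps to v=(0,1,0,0) and a horizontal line to v=(0,0,0,1).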
III. Spatial Binning
[0119] According to the present invention, the spatial binning substep 123 preferably involves the binning of adjacent pixels into M×M cells, within which the orientation vectors are preferably summed.
[0120] This substep 123 is preferably performed on the output of the orientation binning substep 122, preferably to generate a cell output. Cells are preferably defined as a summed response of the pixels in an M×M area. A formula is preferably provided as follows
C(i,j)=Σ.sub.k=i.sup.i+MΣ.sub.l=j.sup.j+Mv(k,l).
[0121] According to an aspect of one preferred embodiment of the invention, a result is preferably that each cell preferably contains a histogram of the orientations of the surrounding M×M pixels.
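The spatial binning substep can be sketched as follows. For simplicity this sketch computes non-overlapping cells at offset (0,0) only; other starting offsets are the subject of the cell offset image substep described below, and the tiling layout is an illustrative assumption.

```python
import numpy as np

# Sketch of the spatial binning substep 123: the one-hot orientation
# vectors of adjacent pixels are summed over M x M regions, so each
# discrete edge binning (DEB) cell holds a histogram of the orientations
# of the surrounding M x M pixels.
def spatial_bin(v_img, M):
    """v_img: (H, W, N) array of per-pixel one-hot orientation vectors."""
    H, W, N = v_img.shape
    h, w = (H // M) * M, (W // M) * M           # crop to whole cells
    return v_img[:h, :w].reshape(h // M, M, w // M, M, N).sum(axis=(1, 3))

v_img = np.zeros((8, 8, 4))
v_img[..., 1] = 1.0            # every pixel a "vertical line" edge
cells = spatial_bin(v_img, 4)  # 2 x 2 cells of 4 x 4 pixels each
```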
IV. Cell Offset Image
[0123] According to the present invention, the cell offset image substep 124 preferably involves the processing of the resulting image from the binning substeps 122,123 to generate a cell offset image, preferably containing M×M scaled images corresponding to the starting offset of the cell. This is preferably done to ensure overlap in features when binning the pixels into M×M cells.
[0124] According to the present invention, the output of the spatial binning substep 123 is preferably further processed, preferably to generate a DEB cell offset image (I.sub.DEB) 30. This substep 124 is preferably an optimization substep, which preferably facilitates feature classification and matching. The offset image 30 is preferably created by reorganizing the cells, preferably such that the cells are located relative to the initial offset. According to one aspect of the invention, the offset is preferably the distance in pixels from the origin to the starting location of the first cells in the M×M region. For example, for cell(0,0) the offset may be (0,0), whereas for cell(1,4) the offset may be (1,4) based on 4×4 spatial binning. Preferably by reorganizing the cells, the resulting DEB image 30 is similar to the image depicted in
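The cell offset computation can be sketched as follows. The dictionary keyed by offset is an illustrative stand-in for the tiled M×M offset image shown in the patent figures; only the grouping of cells by starting offset is taken from the text.

```python
import numpy as np

# Sketch of the cell offset image substep 124: spatial binning is
# repeated for every starting offset (dy, dx) in the M x M grid, and the
# resulting scaled cell images are collected so that later matching can
# probe all cell alignments. The dict layout is an assumed stand-in for
# the tiled offset image in the patent figures.
def cell_offset_images(v_img, M):
    H, W, N = v_img.shape
    out = {}
    for dy in range(M):
        for dx in range(M):
            sub = v_img[dy:, dx:]                     # shift the cell grid
            h = (sub.shape[0] // M) * M
            w = (sub.shape[1] // M) * M
            out[(dy, dx)] = (sub[:h, :w]
                             .reshape(h // M, M, w // M, M, N)
                             .sum(axis=(1, 3)))
    return out

offsets = cell_offset_images(np.ones((8, 8, 2)), 4)
```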
Feature Cropping
[0126] According to the present invention, in the feature cropping step 130 as shown in the method 100 of
[0127] The feature cropping step 130 may preferably identify and crop selected salient features, which are preferably used for matching. This is preferably achieved by selecting a block of pixels from the DEB cell image 30, F. One or more of the following constraints is preferably used in the feature cropping step 130: (i) the starting column is preferably a multiple of the number of orientation bins (N); and/or (ii) the ending column is preferably a multiple of the number of orientation bins (N).
B. Feature Classification
[0128] As best shown in
Feature Response
[0129] According to the DEB feature classification method 200 shown in
where I(i,j) is preferably an input DEB cell image 30 of size (M,N) and T(i,j) is preferably a DEB feature 20 of size (P,Q) being evaluated. Here C(x,y) is preferably the correlation and R(x,y) is preferably the normalized response.
[0130] According to the present invention, and preferably in order to facilitate localization of the optimal match, the response image is preferably reorganized to produce a match response image M(x,y). The original response, R(x,y)—similar to the DEB cell image 30—preferably contains M×M sub-images corresponding to the response for each cell offset. This image is preferably reorganized, preferably such that a single image is visible. For example,
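The feature response computation can be sketched as follows. Since the correlation equations themselves are not reproduced in the text above, this sketch assumes the usual normalized cross-correlation: C(x,y) is the raw correlation of the template with the image patch, and R(x,y) normalizes it by the patch and template energies.

```python
import numpy as np

# Sketch of the feature response step 210 under an assumed normalised
# cross-correlation: C(x, y) correlates the DEB feature T against each
# patch of the DEB cell image I, and R(x, y) is the normalised response.
def feature_response(I, T):
    P, Q = T.shape
    t_norm = np.sqrt((T ** 2).sum())
    R = np.zeros((I.shape[0] - P + 1, I.shape[1] - Q + 1))
    for x in range(R.shape[0]):
        for y in range(R.shape[1]):
            patch = I[x:x + P, y:y + Q]
            C = (patch * T).sum()                       # correlation C(x, y)
            denom = t_norm * np.sqrt((patch ** 2).sum())
            R[x, y] = C / denom if denom > 0 else 0.0   # normalised response
    return R

T = np.array([[1.0, 2.0], [3.0, 4.0]])
I = np.zeros((5, 5))
I[2:4, 1:3] = T          # embed the template at row 2, column 1
R = feature_response(I, T)
```

Under this normalization the response peaks, at a value of 1, exactly where the template is embedded, which is the behaviour the match detection step relies on.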
Match Detection
[0131] According to the present invention, the match detection step 220, as shown in
[0132] The match detection step 220 preferably includes a search of a defined search area of an image, using the DEB template containing DEB reference features, for a response peak. Preferably, if the response peak exceeds the correlation threshold or the distance to the search origin is less than the distance threshold, then a match is preferably found, producing a match result 40. Persons having ordinary skill in the art may understand that a match result 40 for an RDT (rapid diagnostic test) may, for example, indicate the presence or absence of a specific disease state or biomarker.
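The match detection step can be sketched as follows. The threshold values are hypothetical, and this sketch requires both criteria to hold; the claims combine the correlation and distance thresholds with "and/or", so either criterion alone could equally be used.

```python
import numpy as np

# Sketch of the match detection step 220: locate the response peak in
# the search area, then accept it when it clears the correlation
# threshold and lies close enough to the search origin. The threshold
# values and the conjunction of the two tests are assumptions.
def detect_match(R, origin, corr_thresh=0.8, dist_thresh=5.0):
    peak = np.unravel_index(np.argmax(R), R.shape)
    dist = np.hypot(peak[0] - origin[0], peak[1] - origin[1])
    matched = (R[peak] >= corr_thresh) and (dist <= dist_thresh)
    return matched, peak, float(R[peak])

R = np.zeros((10, 10))
R[4, 5] = 0.9                          # a single strong response peak
matched, peak, value = detect_match(R, origin=(4, 4))
```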
CONCLUSION
[0133] The present invention is contemplated for use in association with image processing algorithms and template matching to detect and localize similar features within an image, and/or in association with recognition of RDTs based on their appearance, to afford increased functionality and/or advantageous utilities in association with same. The invention, however, is not so limited.
[0134] The foregoing description has been presented for the purpose of illustration and is not intended to be exhaustive or to limit the invention to the precise form disclosed. Other advantages, features and/or characteristics of the present invention, as well as methods of operation and/or functions of the related elements of the system, method and/or computer readable medium, and/or the combination of steps, parts and/or economies of manufacture, will become more apparent upon consideration of the accompanying drawings. Certain novel features which are believed to be characteristic of the system, method and/or computer readable medium according to the present invention, as to their organization, use, and/or method of operation, together with further objectives and/or advantages thereof, will be better understood from the accompanying drawings in which presently preferred embodiments of the invention are illustrated by way of example. It is expressly understood, however, that the drawings are for the purpose of illustration and description only, and are not intended as a definition of the limits of the invention.
[0135] Naturally, in view of the teachings and disclosures herein, persons having ordinary skill in the art may appreciate that alternate designs and/or embodiments of the invention may be possible (e.g., with substitution of one or more components, features, steps, algorithms, etc. for others, with alternate configurations of components, features, steps, algorithms, etc). Although some of the components, features, steps, algorithms, relations and/or configurations according to the invention are not specifically referenced in association with one another, they may be used, and/or adapted for use, in association therewith. All of the aforementioned, depicted and various structures, configurations, features, steps, algorithms, relationships, utilities and the like may be, but are not necessarily, incorporated into and/or achieved by the invention. Any one or more of the aforementioned structures, configurations, features, steps, algorithms, relationships, utilities and the like may be implemented in and/or by the invention, on their own, and/or without reference, regard or likewise implementation of any of the other aforementioned structures, configurations, features, steps, algorithms, relationships, utilities and the like, in various permutations and combinations, as will be readily apparent to those skilled in the art, without departing from the pith, marrow, and spirit of the disclosed invention.
[0136] This concludes the description of presently preferred embodiments of the invention. The foregoing description has been presented for the purpose of illustration and is not intended to be exhaustive or to limit the invention to the precise form disclosed. Other modifications, variations and alterations are possible in light of the above teaching and will be apparent to those skilled in the art, and may be used in the design and manufacture of other embodiments according to the present invention without departing from the spirit and scope of the invention. It is intended that the scope of the invention be limited not by this description but only by the claims forming a part hereof.