System and method for detecting woody breast condition in broilers using image analysis of carcass features
10806153 · 2020-10-20
Assignee
Inventors
- Casey Owens Hanning (Springdale, AR, US)
- Xiao Sun (Nanjing, CN)
- Juan P. Caldas-Cueva (Lima, PE)
- Andronikos Mauromostakos (Fayetteville, AR, US)
CPC classification
International classification
Abstract
This invention relates generally to a system and method of detecting woody breast using image analysis of carcass features, and more particularly to a real-time system and method of detecting woody breast in broilers using non-destructive and/or non-contact image analysis of carcass features. The system and method assess woody breast in broilers at the fillet level using image analysis of the angle or area associated with the tip of the keel bone and surrounding breast meat of broiler carcasses. The method is configured to be incorporated into and utilized by a vision grading system.
Claims
1. A method of detecting woody breast condition in a broiler carcass using image analysis, said method comprising the steps of: acquiring a digital image of said broiler carcass; detecting, by a processor, a presence of woody breast condition in said broiler carcass using automated image analysis of at least one carcass feature of said image of said broiler carcass; computing at least one measurement of a caudal region of said broiler carcass; computing a breast width of a cranial region of said carcass, wherein said breast width is M1; computing a length from a tip of a keel bone of said broiler carcass to about 1/5th of a total breast length of said broiler carcass, wherein said length is M2; and automatically grading or categorizing a severity of any detected presence of said woody breast condition in said broiler carcass.
2. The method of claim 1 wherein said step of acquiring said digital image of said broiler carcass further comprises the step of acquiring said digital image of said broiler carcass against a background having a sharp outline of said broiler carcass.
3. The method of claim 1 further comprising the step of detecting woody breast condition in said broiler carcass using non-destructive and non-contact image analysis of said carcass feature.
4. The method of claim 3 further comprising the step of detecting woody breast condition in said broiler carcass using said non-destructive and non-contact image analysis of said carcass feature controlled by an on-line process control system.
5. The method of claim 1 further comprising the step of grading or categorizing said severity as mild, moderate or severe.
6. The method of claim 5 further comprising the step of grading or categorizing said severity of said woody breast condition in said broiler carcass using an automated, real-time vision grading system.
7. The method of claim 1 further comprising the step of computing a breast width at an end of M2, wherein said breast width at said end of M2 is M3.
8. The method of claim 7 further comprising the step of computing an angle formed at a tip of a keel bone of said broiler carcass, wherein said angle is M4.
9. The method of claim 8 further comprising the step of computing an area of a triangle formed by M3 and M4, wherein said angle M4 extends to end points of M3, and wherein said area of said triangle is M5.
10. The method of claim 7 further comprising the step of computing an area of said caudal region formed above M3, wherein said area of said caudal region formed above M3 is M6.
11. The method of claim 10 further comprising the step of computing a difference of said area M6 and said area M5, wherein said difference is M7.
12. The method of claim 7 further comprising the step of computing a ratio of M3 to M1, wherein said ratio of M3 to M1 is M8.
13. The method of claim 7 further comprising the step of computing a ratio of M3 to M2, wherein said ratio of M3 to M2 is M9.
14. The method of claim 11 further comprising the step of computing a ratio of M7 to M5, wherein said ratio of M7 to M5 is M10.
15. A method for automated vision grading of woody breast condition in a broiler carcass, said method comprising the steps of: electronically processing by a processor of a digital image of a broiler carcass to compute at least one measurement of a caudal region of said broiler carcass; electronically computing a breast width of a cranial region of said carcass, wherein said breast width is M1; electronically computing a length from a tip of a keel bone of said broiler carcass to about 1/5th of a total breast length of said broiler carcass, wherein said length is M2; based on said measurement of said caudal region of said broiler carcass, electronically detecting a presence of woody breast condition in said broiler carcass using non-destructive and non-contact image analysis of said digital image; and for any detected presence of said woody breast condition, automatically grading or categorizing a severity of said woody breast condition in said broiler carcass.
16. The method of claim 15 wherein said at least one measurement of said caudal region of said broiler carcass comprises: a breast width at an end of M2, wherein said breast width at said end of M2 is M3; an angle formed at a tip of a keel bone of said broiler carcass, wherein said angle is M4; an area of a triangle formed by M3 and M4, wherein said angle M4 extends to end points of M3, and wherein said area of said triangle is M5; an area of said caudal region formed above M3, wherein said area of said caudal region formed above M3 is M6; a difference of said area M6 and said area M5, wherein said difference is M7; a ratio of M3 to M1, wherein said ratio of M3 to M1 is M8; a ratio of M7 to M5, wherein said ratio of M7 to M5 is M10; a ratio of M3 to M2, wherein said ratio of M3 to M2 is M9; or a combination thereof.
17. The method of claim 15 further comprising the step of acquiring said digital image of said broiler carcass against a background having a sharp outline of said broiler carcass.
18. The method of claim 15 wherein said step of detecting said presence of woody breast condition in said broiler carcass further comprises the step of using non-destructive and non-contact image analysis of said broiler carcass controlled by an on-line process control system.
19. The method of claim 15 further comprising the step of electronically grading or categorizing said severity as mild, moderate or severe.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) These and further aspects of the invention are described in detail in the following examples and accompanying drawings.
DETAILED DESCRIPTION OF THE INVENTION
(25) While this invention is susceptible of embodiment in many different forms, there are shown in the drawings, and will be described herein in detail, some specific embodiments of the invention. It should be understood, however, that the present disclosure is to be considered an exemplification of the principles of the invention and is not intended to limit the invention to the specific embodiments so described.
(26) This invention relates generally to a system and method for detecting woody breast condition in broilers using image analysis of carcass features, and in particular to an automated and/or real-time vision grading system and method for detecting and grading woody breast condition in broiler carcasses using non-destructive and non-contact image analysis. The method is configured to be incorporated into and utilized by an automated vision grading system.
(27) The system and method take or otherwise acquire a digital image of the broiler carcass and then detect the presence of woody breast (Pectoralis major) features based on measurements of the broiler carcass. The system and method use image analysis to extract and compute various measurements of the broiler carcass, such as a breast width at a caudal region (M3), an angle at a tip of the keel bone (M4) of the broiler carcass, an area of a triangle formed by M3 and lines generated by M4 (M5), an area of the breast formed above M3 (M6) of the broiler carcass, a difference of the areas of M6 and M5 (M7) of the broiler carcass, and/or various ratios of those measurements (i.e., M8 (M3/M1), M9 (M3/M2), and M10 (M7/M5)). Based on the measurements, the system and method then grade any detected incidents of woody breast condition in predetermined categories, such as mild, moderate or severe.
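The geometric measurements described above can be sketched in code. The following Python sketch assumes hypothetical landmark coordinates (the keel-bone tip, the two M3 end points, and an outline of the caudal region) have already been extracted from the image by a prior segmentation step; the function names and inputs are illustrative and are not part of the disclosed system.

```python
import math

def shoelace_area(pts):
    """Polygon area via the shoelace formula for (x, y) vertices in order."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def carcass_measurements(keel_tip, m3_left, m3_right, m1, m2, caudal_outline):
    """Compute M3-M10 from hypothetical landmark coordinates (units: mm).

    keel_tip          -- (x, y) of the keel-bone tip
    m3_left, m3_right -- end points of the M3 breast-width line
    m1, m2            -- cranial breast width and keel-to-1/5-breast-length distance
    caudal_outline    -- vertices of the caudal breast region bounded above by M3
    """
    m3 = math.dist(m3_left, m3_right)                 # breast width at end of M2
    # M4: angle at the keel tip between the rays to the M3 end points
    a1 = math.atan2(m3_left[1] - keel_tip[1], m3_left[0] - keel_tip[0])
    a2 = math.atan2(m3_right[1] - keel_tip[1], m3_right[0] - keel_tip[0])
    m4 = abs(math.degrees(a1 - a2)) % 360.0
    if m4 > 180.0:
        m4 = 360.0 - m4
    m5 = shoelace_area([keel_tip, m3_left, m3_right])  # triangle area
    m6 = shoelace_area(caudal_outline)                 # caudal-region area
    m7 = m6 - m5                                       # area outside the triangle
    return {"M3": m3, "M4": m4, "M5": m5, "M6": m6, "M7": m7,
            "M8": m3 / m1, "M9": m3 / m2, "M10": m7 / m5}
```

For example, a keel tip at (0, 0) with M3 end points at (-50, 50) and (50, 50) gives M3 = 100 mm, M4 = 90°, and M5 = 2,500 mm².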
EXAMPLES
(28) The system and method for detecting woody breast condition in broilers using image analysis of carcass features disclosed herein is further illustrated by the following examples, which are provided for the purpose of demonstration rather than limitation.
(29) Processing of Birds:
(30) Male broiler carcasses from three commercial strains (2 high breast yielding strains and 1 standard breast yielding strain) and five ages (6, 7, 8, 9 and 10 weeks) were evaluated. A total of 1,203 broiler carcass images were collected over a nine-month period on 13 different processing dates between October 2016 and July 2017. All birds were processed at the University of Arkansas Poultry Processing Pilot Plant according to commercial practices, wherein birds were weighed, shackled, electrically stunned (11 V, 11 mA, 11 s), manually slaughtered, bled out (1.5 min), scalded (54° C., 2 min), picked in-line using defeathering equipment, manually eviscerated, and washed.
(31) Three experiments were conducted over the project period. Experiment 1 used data (n=180) from 7- and 10-wk-old broilers from 2 strains. Experiment 2 used a subset of data (n=156) and included compression analysis. Experiment 3 incorporated all carcasses (n=1,203) used in the project for model development.
(32) Image Collection:
(33) Images of broiler carcasses were captured prior to evisceration against a black background to provide a sharp outline of the carcass conformation (see the accompanying figures).
(34) Deboning and Tactile Evaluation:
(35) Broiler carcasses were deboned by severing the humeral-scapular joint and pulling firmly downward on the wings. The deboned breast fillets were scored for degree of hardness using tactile evaluation. The categorization considered for the experiment was the following: 0 or 0.5 as normal (NOR); 1 or 1.5 as mild (MIL), and 2, 2.5 or 3 as severe (SEV).
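The tactile scoring scheme described above maps half-point hardness scores to three categories; as a minimal sketch of that lookup:

```python
def wb_category(score):
    """Map a tactile hardness score (0-3 in half-point steps) to the
    woody-breast category used in these experiments."""
    if score in (0, 0.5):
        return "NOR"  # normal
    if score in (1, 1.5):
        return "MIL"  # mild
    if score in (2, 2.5, 3):
        return "SEV"  # severe
    raise ValueError(f"unexpected hardness score: {score}")
```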
(36) Compression Analysis:
(37) Compression force (N) was determined on intact fillets (n=156; Experiment 2) in quadruplicate at predetermined locations in the cranial region of each fillet. Compression force was determined using a Texture Analyzer (Model TA.XT Plus, Texture Technologies Corp., Hamilton, Mass.) with a 5-kg load cell and a 6 mm diameter flat probe. Each sample was compressed to 20% of its height, and the maximum force (N) required to compress the area was determined.
(38) Image Analysis:
(39) After collection, 2-dimensional (2D) frontal images were processed using the ImageJ software (National Institutes of Health). Image processing functions such as vertical rotation and sharpening were applied. The parameters for carcass conformation considered for this study were M1, M2, M3, M4, M5, M6, and M7, as described in Table 1 and shown in the accompanying figures.
(40) TABLE 1. Measurements of structural information extracted from broiler carcass images.

  M1 (mm)    breast width in the cranial region
  M2 (mm)    a vertical line from the tip of the keel to 1/5th of breast length (right side)
  M3 (mm)    breast width at the end of M2
  M4 (°)     angle formed at the tip of the keel and extending to the outer points of M3
  M5 (mm²)   area of the triangle formed by M3 and lines generated by M4
  M6 (mm²)   area of the breast section formed above M3
  M7 (mm²)   M6 − M5 (difference between M6 and M5)
  M8         the ratio M3/M1
  M9         the ratio M3/M2
  M10        the ratio M7/M5
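Assuming the triangle described by M3 and M4 in Table 1 is approximately isoceles (i.e., the keel tip sits roughly midway under the M3 line), M5 follows from M3 and M4 alone in closed form. This is an illustrative derivation, not the disclosed measurement procedure:

```python
import math

def m5_from_m3_m4(m3_mm, m4_deg):
    """Closed-form M5 for an isoceles triangle with base M3 and apex angle M4.

    With the apex (keel tip) midway under the M3 line, the triangle height is
    (M3/2) / tan(M4/2), so area = M3 * height / 2 = M3^2 / (4 * tan(M4/2)).
    """
    half_apex = math.radians(m4_deg) / 2.0
    return m3_mm ** 2 / (4.0 * math.tan(half_apex))
```

Plugging in the 7-wk normal means from Table 2 (M3 = 92.75 mm, M4 = 88.81°) gives roughly 2,196 mm², within about 1% of the reported M5 of 2,212.66 mm², which is consistent with the isoceles approximation.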
Experiment 1
(41) Experiment 1 demonstrates that the image analysis method is a reliable method for identifying WB in commercial high breast yielding broiler carcasses at 7 and 10 weeks of age.
(42) Carcass measurements were evaluated to determine the effect of strain, age, and woody breast category on carcass breast shape conformation. Strain impacted most measurements (M2, M3, M4 and M7), while age impacted all measurements (P<0.05). This was expected, as broilers grow larger with age and thus have larger measurements of carcass features. Woody breast category impacted (P<0.05) all measurements with the exception of M2. There was also an interaction (P<0.05) of age and WB category (with the exception of M7), while no interaction of strain and age was noted, nor of strain and WB category (with the exception of M6). Table 2 below shows means for carcass measurements for the age by WB category interaction. Though an interaction was present, the means for WB categories generally showed the same trends at each age. The cranial width (M1) significantly increased from normal to severe, with mild being intermediate. M2 is 20% of the carcass length (breast area); it increased (P<0.05) with age but not with WB severity (P>0.05). The breast width at the caudal region (M3) also increased (P<0.05) with age and with increasing degrees of WB severity. M4 (representing the angle at the point of the keel) also increased (P<0.05) with age and WB severity. The area in the caudal region also increased with age and WB severity; this increase in area is explained by the increased width of the caudal region. Furthermore, M6 and M7 increased significantly with increasing degrees of WB severity, suggesting that the breast shape is fuller, creating a more U-shape rather than V-shape. The U-shaped breast is more characteristic of a carcass with severe woody breast (examples are shown in the accompanying figures).
(43) Correlation coefficients of WB scores and carcass measurements are shown by strain and age in Table 3 and in the accompanying figures.
(44) TABLE 2. Means of physical carcass measurements of broilers 7 and 10 wk of age by WB category.

  Age    WB     M1 (mm)   M2 (mm)  M3 (mm)   M4 (°)    M5 (mm²)    M6 (mm²)    M7 (mm²)
  7 wk   NORM   152.38d   46.69b   92.75e    88.81e    2,212.66e   2,840.81e   622.62d
  7 wk   MILD   166.41c   47.74b   103.20d   94.75d    2,521.63d   3,357.20d   835.57c
  7 wk   SEV    169.76c   46.90b   116.41bc  102.02b   2,786.96c   3,897.06c   1,110.10b
  10 wk  NORM   183.20b   52.02a   114.44c   94.67d    3,027.59b   3,925.85c   898.26c
  10 wk  MILD   183.48b   51.34a   119.86b   99.56c    3,137.98b   4,164.36b   1,026.38b
  10 wk  SEV    194.18a   52.48a   137.88a   105.00a   3,677.98a   5,018.54a   1,340.56a

  a-e: Means with no common superscript within a column differ (P < 0.05). n = 15 per mean.
(45) TABLE 3. Correlation coefficients (r, with P values below each) of woody breast score and physical carcass measurements from 2 strains of broilers 7 and 10 weeks of age.

  Strain    Age (wk)       M1       M2       M3       M4       M5       M6       M7
  Strain 1  7        r     0.7359   0.0405   0.9520   0.9124   0.8761   0.9385   0.8892
                     P     <.0001   0.7914   <.0001   <.0001   <.0001   <.0001   <.0001
  Strain 1  10       r     0.6374   0.0600   0.9170   0.8714   0.7820   0.8538   0.8130
                     P     <.0001   0.6951   <.0001   <.0001   <.0001   <.0001   <.0001
  Strain 2  7        r     0.6731   0.0083   0.9267   0.9332   0.8164   0.9116   0.9236
                     P     <.0001   0.9564   <.0001   <.0001   <.0001   <.0001   <.0001
  Strain 2  10       r     0.5225   0.2583   0.9228   0.8639   0.8435   0.8635   0.7895
                     P     0.0002   0.0866   <.0001   <.0001   <.0001   <.0001   <.0001
Experiment 2
(46) Experiment 2 was conducted to evaluate the relationship of compression force to carcass measurements. Compression force is an instrumental method to assess fillet hardness and can provide a more objective method to classify fillets into WB categories. After scoring fillets via palpation (tactile evaluation), fillets were subjected to compression analysis. Compression force significantly increased (P<0.05) with WB category as expected (Table 4). All other carcass measurements increased (P<0.05) with increasing degrees of WB severity, which supports the results obtained in Experiment 1.
(48) These data support the system and method for detecting woody breast condition in broiler carcasses using image analysis disclosed herein. The potential integration of these image measurements into an in-line vision grading technology would allow processors to identify and sort broiler carcasses by WB category.
(49) TABLE 4. Means of compression force and physical carcass measurements from broiler breast fillets exhibiting various degrees of woody breast.

  WB    Force (N)  M1 (mm)   M2 (mm)  M3 (mm)   M4 (°)    M5 (mm²)   M6 (mm²)   M7 (mm²)
  NOR   5.16c      186.97c   53.43a   121.00c   96.98c    3232.91c   4157.36c   924.45c
  MIL   8.15b      192.10b   52.74ab  130.76b   102.01b   3450.51b   4562.56b   1112.05b
  SEV   12.40a     196.08a   52.23b   141.67a   107.10a   3701.68a   5136.39a   1434.71a

  a-c: Means with no common superscript within a column differ (P < 0.05).
Experiment 3
(50) Modelling:
(51) Experiments 1 and 2 above developed the initial relationships between various carcass measurements and woody breast fillet score. Experiment 3 evaluated a larger data set to determine models for predicting woody breast based on carcass features. Images (n=1203) collected from broilers 6 to 10 weeks of age were used to develop models for predicting woody breast based on a wide range of data by using multiple ages and strains (i.e., different sizes and shape characteristics).
(52) Correlations between woody breast scores (deboned fillets) and carcass measurements followed trends similar to those in Experiments 1 and 2 (see the accompanying figures).
(53) Model Using Carcass Measurements:
(54) Image measurements M1 through M7 and carcass weight without giblets (WOG) were initially used for model development; M1, M4, M6 and M7 were significant (P<0.05) and were therefore chosen as model parameters (Table 5). From the total number of images, 842 were used for model development (training; generalized R² = 0.5891) and 361 for validation (generalized R² = 0.6039). These measurements were used to predict three categories of WB: Normal (score 0-0.5), Mild (score 1-1.5), and Severe (score 2-3).
(55) TABLE 5. Parameter estimates for the Ordinal Logistic Model based on four carcass measurements for predicting three levels of woody breast.

  Term             Estimate     Std Error   ChiSquare   Prob > ChiSq
  Intercept [NOR]  41.4345204   2.7980342   219.29      <.0001*
  Intercept [MIL]  43.9291783   2.8570817   236.41      <.0001*
  M1               0.0760441    0.0129945   34.25       <.0001*
  M4               0.3625174    0.0240556   227.11      <.0001*
  M6               0.0027432    0.0004283   41.02       <.0001*
  M7               0.0041144    0.0006483   40.28       <.0001*
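An ordinal logistic (cumulative-logit) model of the kind summarized in Table 5 scores a carcass by combining the measurements into a linear term and converting two threshold logits into three category probabilities. The sketch below is generic: fitting software differs in how it signs and centers the linear term, so the raw estimates in Table 5 reproduce the reported model only under the original software's convention, which is not stated here.

```python
import math

def cumulative_logit_probs(intercepts, beta, x):
    """Category probabilities for a cumulative-logit (ordinal logistic) model.

    intercepts -- threshold intercepts a_1 <= ... <= a_{k-1}
    beta, x    -- coefficient and feature vectors (here: M1, M4, M6, M7)

    Uses P(Y <= j) = 1 / (1 + exp(-(a_j + x.beta))); category probabilities
    are the differences of successive cumulative probabilities.
    """
    xb = sum(b * xi for b, xi in zip(beta, x))
    cum = [1.0 / (1.0 + math.exp(-(a + xb))) for a in intercepts]
    cum = [0.0] + cum + [1.0]
    return [cum[i + 1] - cum[i] for i in range(len(cum) - 1)]
```

With two intercepts the function returns three probabilities (e.g., Normal, Mild, Severe) that sum to one.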
(56) The overall misclassification rate for the model was 32.6%; however, this figure pools all categories, and when misclassifications are evaluated for each category, the rate is lower. For example, in the training model, less than 1% of normal fillets would have been misclassified as severe, whereas approximately 22% of normal fillets would be misclassified as mild (Table 6, Training). For the severe category, less than 2% would be misclassified as normal and 31% as mild. The misclassification rate is therefore actually low for predicting the normal and severe categories. Mild fillets were misclassified most often (46%), into either the normal or severe categories. This could be expected, as these fillets are intermediate in many shape features as well as meat quality attributes when compared to the other categories. When the model was validated, the sensitivity results were similar (Table 6, Validation). A prediction profiler for the model is shown in the accompanying figures.
(57) TABLE 6. Classification of fillets into WB categories (counts of actual vs. predicted; off-diagonal entries are misclassifications).

  Training             Predicted
  Actual        NOR    MIL    SEV
  NOR           283    80     3
  MIL           78     141    43
  SEV           4      67     143

  Validation           Predicted
  Actual        NOR    MIL    SEV
  NOR           122    38     4
  MIL           27     49     24
  SEV           3      30     64
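The rates discussed above can be recomputed directly from the Table 6 counts; the same sketch applies unchanged to the Table 7 counts for the ratio-based model:

```python
def misclassification_rates(matrix, labels):
    """Overall and per-class misclassification rates from a confusion matrix.

    matrix[i][j] is the count of fillets of actual class labels[i] that the
    model predicted as labels[j]; diagonal entries are correct calls.
    """
    total = sum(sum(row) for row in matrix)
    correct = sum(matrix[i][i] for i in range(len(labels)))
    per_class = {labels[i]: 1.0 - matrix[i][i] / sum(matrix[i])
                 for i in range(len(labels))}
    return 1.0 - correct / total, per_class

# Training counts from Table 6 (rows: actual NOR, MIL, SEV)
training = [[283, 80, 3],
            [78, 141, 43],
            [4, 67, 143]]
overall, per_class = misclassification_rates(training, ["NOR", "MIL", "SEV"])
```

With these counts the overall rate is 275/842 ≈ 32.7% (consistent with the reported 32.6%), normal-to-severe misclassification is 3/366 (under 1%), severe-to-normal is 4/214 (under 2%), and mild fillets are misclassified at 121/262 ≈ 46%, matching the discussion above.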
(58) Model Using Ratios of Carcass Measurements:
(59) Ratios between certain carcass features were calculated: M8, M9 and M10. Measurements M8 (the ratio of the width of the caudal region to the width of the cranial region, M3/M1) and M9 (the ratio of the width at the caudal region to the 20% fillet length, M3/M2) were chosen for the model based on P values. From the total number of images, 828 were used for model development (training; generalized R² = 0.5729) and 375 for validation (generalized R² = 0.5879). These measurements were used to predict three categories of WB. The sensitivity (true positive) rates for both the mild (0.89, good) and severe (0.90, excellent) categories of WB were similar to those of the previous model using carcass measurements. The overall misclassification rate, 33.7%, was also similar to that of the previous model. However, the percentage of severe fillets misclassified into the normal category was higher, at approximately 6% in the training model (Table 7).
(60) TABLE 7. Classification of fillets into WB categories (counts of actual vs. predicted; off-diagonal entries are misclassifications).

  Training             Predicted
  Actual        NOR    MIL    SEV
  NOR           299    61     8
  MIL           83     116    51
  SEV           13     63     134

  Validation           Predicted
  Actual        NOR    MIL    SEV
  NOR           128    28     6
  MIL           46     50     16
  SEV           3      26     72
(61) Statistical Analysis:
(62) In Experiment 1, a subset of data (n=180) was used to determine whether there are conformation changes that can be used to identify broiler carcasses with WB characteristics from two commercial high breast yielding strains at 7 and 10 wk using image analysis. In this study, 15 replications (carcasses) per treatment were used. ANOVA of broiler carcass dimensions across WB categories was also conducted, and means were separated using Tukey's HSD at a significance level of P<0.05 using JMP (SAS Institute Inc., Cary, N.C.).
(63) In Experiment 2, another subset of data (n=156) was used to evaluate the relationships of broiler (8 wk) carcass conformation, palpation scores and instrumental compression force. Correlation coefficients were estimated for WB severity scores, compression force and image measurements. ANOVA of broiler carcass dimensions across WB categories was also conducted and means were separated using Tukey's HSD at a significance of P<0.05.
(64) In Experiment 3, all carcass image data (n=1,203) were used to develop predictive models using Ordinal Logistic analysis. Models were based on 3 categories of WB (Normal, Mild, and Moderate/Severe); additionally, models were developed based on 2 categories, with a score of 2 and above as the threshold for WB presence or absence. Correlation coefficients were also estimated between WB score and the carcass measurements, along with carcass without giblets (WOG) weight.
(65) An exemplary embodiment of the disclosed system and method is illustrated in the accompanying figures.
(66) As noted above, the inventive method can be utilized by an automated, real-time vision grading system in conjunction with on-line process control in a poultry processing plant. The vision grading system can include a high-speed digital camera, LED lighting and advanced recognition software for evaluating the size, shape, color and/or texture of broiler chickens. The vision grading system stores the digital images received from the camera on a computer and then processes the digital images in accordance with the method described above.
(67) The system and method described herein may be deployed in part or in whole through network infrastructures. The network infrastructure may include elements such as computing devices, servers, routers, hubs, firewalls, clients, personal computers, communication devices, routing devices and other active and passive devices, modules and/or components as known in the art. The computing and/or non-computing device(s) associated with the network infrastructure may include, apart from other components, a storage medium such as flash memory, buffer, stack, RAM, ROM and the like. The processes, methods, program codes, instructions described herein and elsewhere may be executed by one or more of the network infrastructural elements.
(68) The computer software, program codes, and/or instructions may be stored and/or accessed on machine readable media that may include: computer components, devices, and recording media that retain digital data used for computing for some interval of time; semiconductor storage known as random access memory (RAM); mass storage typically for more permanent storage, such as optical discs, forms of magnetic storage like hard disks, tapes, drums, cards and other types; processor registers, cache memory, volatile memory, non-volatile memory; optical storage such as CD, DVD; removable media such as flash memory (e.g. USB sticks or keys), floppy disks, magnetic tape, paper tape, punch cards, standalone RAM disks, Zip drives, removable mass storage, off-line, and the like; other computer memory such as dynamic memory, static memory, read/write storage, mutable storage, read only, random access, sequential access, location addressable, file addressable, content addressable, network attached storage, storage area network, bar codes, magnetic ink, and the like.
(69) The systems and/or methods described herein, and steps thereof, may be realized in hardware, software or any combination of hardware and software suitable for a particular application. The hardware may include a general-purpose computer and/or dedicated computing device or specific computing device or particular aspect or component of a specific computing device. The methods may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable device, along with internal and/or external memory. The methods may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the methods may be realized as a computer executable code capable of being executed on a machine-readable medium.
(70) The computer executable code may be created using a structured programming language such as C, an object oriented programming language such as C++ or a .NET language, a lightweight data-interchange format such as JavaScript Object Notation (JSON) exchanged over HTTP POST request/response, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or any other machine capable of executing program instructions.
(71) Thus, in one aspect, each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the methods described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
(72) It is to be understood that the terms "including", "comprising", "consisting" and grammatical variants thereof do not preclude the addition of one or more components, features, steps, or integers or groups thereof, and that the terms are to be construed as specifying components, features, steps or integers.
(73) If the specification or claims refer to an additional element, that does not preclude there being more than one of the additional element.
(74) It is to be understood that where the claims or specification refer to "a" or "an" element, such reference is not to be construed to mean that there is only one of that element.
(75) It is to be understood that where the specification states that a component, feature, structure, or characteristic may, might, can or could be included, that particular component, feature, structure, or characteristic is not required to be included.
(76) Where applicable, although state diagrams, flow diagrams or both may be used to describe embodiments, the invention is not limited to those diagrams or to the corresponding descriptions. For example, flow need not move through each illustrated box or state, or in exactly the same order as illustrated and described.
(77) Methods of the disclosure may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
(78) The term method may refer to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the art to which the invention belongs.
(79) For purposes of the disclosure, the term "at least" followed by a number is used herein to denote the start of a range beginning with that number (which may be a range having an upper limit or no upper limit, depending on the variable being defined). For example, "at least 1" means 1 or more than 1. The term "at most" followed by a number is used herein to denote the end of a range ending with that number (which may be a range having 1 or 0 as its lower limit, or a range having no lower limit, depending upon the variable being defined). For example, "at most 4" means 4 or less than 4, and "at most 40%" means 40% or less than 40%. Terms of approximation (e.g., "about", "substantially", "approximately", etc.) should be interpreted according to their ordinary and customary meanings as used in the associated art unless indicated otherwise. Absent a specific definition and absent ordinary and customary usage in the associated art, such terms should be interpreted to be ±10% of the base value.
(80) When, in this document, a range is given as "(a first number) to (a second number)" or "(a first number)-(a second number)", this means a range whose lower limit is the first number and whose upper limit is the second number. For example, "25 to 100" should be interpreted to mean a range whose lower limit is 25 and whose upper limit is 100. Additionally, it should be noted that where a range is given, every possible subrange or interval within that range is also specifically intended unless the context indicates to the contrary. For example, if the specification indicates a range of 25 to 100, such range is also intended to include subranges such as 26-100, 27-100, etc., 25-99, 25-98, etc., as well as any other possible combination of lower and upper values within the stated range, e.g., 33-47, 60-97, 41-45, 28-96, etc. Note that integer range values have been used in this paragraph for purposes of illustration only and decimal and fractional values (e.g., 46.7-91.3) should also be understood to be intended as possible subrange endpoints unless specifically excluded.
(81) It should be noted that where reference is made herein to a method comprising two or more defined steps, the defined steps can be carried out in any order or simultaneously (except where context excludes that possibility), and the method can also include one or more other steps which are carried out before any of the defined steps, between two of the defined steps, or after all of the defined steps (except where context excludes that possibility).
(82) Still further, additional aspects of the invention may be found in one or more appendices attached hereto and/or filed herewith, the disclosures of which are incorporated herein by reference as if fully set out at this point.
(83) Thus, the present invention is well adapted to carry out the objects and attain the ends and advantages mentioned above as well as those inherent therein. While the inventive concept has been described and illustrated herein by reference to certain illustrative embodiments in relation to the drawings attached hereto, various changes and further modifications, apart from those shown or suggested herein, may be made therein by those of ordinary skill in the art without departing from the spirit of the inventive concept, the scope of which is to be determined by the following claims.