METHOD FOR MAPPING THE VAULT FOR AN IMPLANTED INTRAOCULAR LENS
20210353252 · 2021-11-18
Inventors
- Daniel Z. Reinstein (London, GB)
- Linda Johnson (Boulder, CO, US)
- Johan Erik Giphart (Aurora, CO, US)
- Andrew K. Levien (Morrison, CO, US)
- Matthew G. Sassu (Highlands Ranch, CO, US)
CPC classification
A61B8/0833
HUMAN NECESSITIES
A61B8/5223
HUMAN NECESSITIES
A61B8/4461
HUMAN NECESSITIES
A61B8/5207
HUMAN NECESSITIES
International classification
A61B8/00
HUMAN NECESSITIES
Abstract
The present disclosure is directed to a system and method that detects and measures a vault of an anterior segment of an eye of a patient. The system and method locate an implanted contact lens (ICL) between a cornea and a natural lens, form a B-scan of the eye based on received ultrasound pulses, remove background pixels from the B-scan, such as by binarizing and thresholding the B-scan from a grayscale color palette to a black/white color palette, determine, from the resulting B-scan, a fiduciary location in the anterior segment of the eye, and form, using the resulting B-scan and the fiduciary location, a vault map mapping a distance between an anterior segment surface and a posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior segment surface.
Claims
1. A method for detecting and measuring a vault of an anterior segment of an eye of a patient, comprising: imaging, by an ultrasound scanning device, an anterior segment of the eye; locating, by a processor, an implanted contact lens (ICL) between a cornea and a natural lens; forming, by the processor, one or more B-scans of at least a portion of the eye based on ultrasound pulses received by the ultrasound scanning device; converting, by the processor, the one or more B-scans from a grayscale color palette to a black/white color palette; determining, by the processor, using segmentation analysis of the converted one or more B-scans, a fiduciary location in the anterior segment of the eye; and forming, by the processor and using the converted one or more B-scans and the fiduciary location, a vault map mapping a distance between an anterior segment surface and a posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior segment surface.
2. The method of claim 1, wherein the converting comprises binarizing and thresholding the one or more B-scans to remove background image information and wherein the determining comprises: locating, by the processor, a center of an iris of the eye; dividing the one or more B-scans along the iris center to form left and right portions of the one or more B-scans; and horizontally flipping one of the left and right portions.
3. The method of claim 2, further comprising: for each of the left and right portions: finding, by the processor, a region of interest representing a rough estimate of a vault; measuring, by the processor, a vault dimension along the anterior segment surface; and merging, by the processor, the measured vault dimensions in the left and right portions to form a common vault map region of interest.
4. The method of claim 3, wherein the one or more B-scans comprises multiple B-scans and each of the multiple B-scans corresponds to a region of interest and wherein, for each of the plurality of regions of interest, the forming comprises: extrapolating, by the processor, a position of each selected region of interest for each scanned meridian onto a common three-dimensional coordinate system; and aligning, by the processor, a central axis of each of the regions of interest in the plurality of regions of interest within the common three-dimensional coordinate system.
5. The method of claim 4, wherein the forming comprises: interpolating, by the processor, between calculated region of interest data for each region of interest to fill in a three-dimensional space between scanned meridians.
6. The method of claim 5, wherein the forming comprises: assigning, by the processor, color values to a range of virtual heights; and generating, by the processor, the vault map based on the assigned color values.
7. An eye imaging system, comprising: an input to receive a plurality of A-scans of an anterior capsule of an eye of a patient from an ultrasound scanning device; a processor coupled with the input; and a memory coupled with and readable by the processor and storing therein a set of instructions which, when executed by the processor causes the processor to: locate an implanted contact lens (ICL) between a cornea and a natural lens of the eye; form, from the plurality of A-scans, one or more B-scans of the anterior capsule of the eye; convert the one or more B-scans from a grayscale color palette to a black/white color palette; determine a fiduciary location in the anterior capsule of the eye; and form, using the converted one or more B-scans and fiduciary location, a vault map mapping a height between the anterior capsule and a posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior capsule.
8. The system of claim 7, wherein the converting comprises binarizing and thresholding the one or more B-scans to remove background image information and wherein the determining is performed using segmentation analysis.
9. The system of claim 8, wherein the determining comprises: locating a center of an iris of the eye; dividing the one or more B-scans along the iris center to form left and right portions of the one or more B-scans; and horizontally flipping one of the left and right portions.
10. The system of claim 9, wherein, for each of the left and right portions, the instructions cause the processor to: find a region of interest representing a rough estimate of a vault; and measure a vault dimension along the anterior capsule surface; and wherein the instructions further cause the processor to merge the measured vault dimensions in the left and right portions to form a common vault map region of interest.
11. The system of claim 10, wherein the one or more B-scans comprises multiple B-scans and each of the multiple B-scans corresponds to a region of interest and wherein, for each of the plurality of regions of interest, the forming comprises: extrapolating a position of each selected region of interest for each scanned meridian onto a common three-dimensional coordinate system; and aligning a central axis of each of the regions of interest in the plurality of regions of interest within the common three-dimensional coordinate system.
12. The system of claim 11, wherein the forming comprises: interpolating between calculated region of interest data for each region of interest to fill in a three-dimensional space between scanned meridians.
13. The system of claim 12, wherein the forming comprises: assigning color values to a range of virtual heights; and generating the vault map based on the assigned color values.
14. An eye imaging system, comprising: an input to receive a plurality of A-scans of an anterior capsule of an eye of a patient from an ultrasound scanning device; a processor coupled with the input; and a memory coupled with and readable by the processor and storing therein a set of instructions which, when executed by the processor causes the processor to: locate an implanted contact lens (ICL) between a cornea and a natural lens of the eye; form, from the plurality of A-scans, one or more B-scans of the anterior capsule of the eye; remove background pixels of the one or more B-scans to form a binary image; determine a fiduciary location in the anterior capsule of the eye; and determine, from the binary image and the fiduciary location, a vault dimension between the anterior capsule and a posterior surface of the ICL along a selected line drawn perpendicular to a local surface of the anterior capsule surface.
15. The system of claim 14, wherein the vault dimension determining comprises: forming, based on a plurality of vault dimensions, a vault map mapping a distance between the anterior capsule and the posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior capsule surface, the plurality of lines comprising the selected line.
16. The system of claim 15, wherein the determining comprises: locating a center of an iris of the eye; dividing the one or more B-scans along the iris center to form left and right portions of the one or more B-scans; and horizontally flipping one of the left and right portions.
17. The system of claim 16, wherein, for each of the left and right portions, the instructions cause the processor to: find a region of interest representing a rough estimate of a vault; and measure a vault dimension along the anterior capsule surface; and wherein the instructions further cause the processor to merge the measured vault dimensions in the left and right portions to form a common vault map region of interest.
18. The system of claim 17, wherein the one or more B-scans comprises multiple B-scans and each of the multiple B-scans corresponds to a region of interest and wherein, for each of the plurality of regions of interest, the forming comprises: extrapolating a position of each selected region of interest for each scanned meridian onto a common three-dimensional coordinate system; and aligning a central axis of each of the regions of interest in the plurality of regions of interest within the common three-dimensional coordinate system.
19. The system of claim 18, wherein the forming comprises: interpolating between calculated region of interest data for each region of interest to fill in a three-dimensional space between scanned meridians.
20. The system of claim 19, wherein the forming comprises: assigning color values to a range of virtual heights; and generating the vault map based on the assigned color values.
21. The system of claim 14, wherein the instructions cause the processor to: identify, based on a size, shape, and/or location of anatomical structures in the binary image, the anterior capsule surface and the posterior ICL surface; remove anatomical structures other than the anterior capsule surface and posterior ICL surface from the resulting binary image to form a region of interest; based on the region of interest, identify peaks in the resulting binary image, the peaks representing the anterior capsule surface and the posterior ICL surface; and measure a vault dimension along the anterior capsule surface.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0123] The present disclosure may take form in various components and arrangements of components, and in various steps and arrangements of steps. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. In the drawings, like reference numerals may refer to like or analogous components throughout the several views.
DETAILED DESCRIPTION OF THE DRAWINGS
Ultrasound Eye Scanning Apparatus
[0149] The positioning mechanism assembly 109 and scan head assembly 108 are both fully immersed in water (typically distilled water) which fills the chamber from base plate 106 to the top of the chamber on which the eyepiece 107 is attached.
[0150] A patient is seated at the scanning device 101 with one eye engaged with the disposable eyepiece 107. The patient is typically directed to look downward at one of the fixation lights during a scan sequence. The patient is fixed with respect to the scanning device 101 by a headrest system such as shown, for example, in the drawings.
[0151] An operator, using a mouse and/or a keyboard and the video monitor, for example, inputs information into the computer selecting the type of scan and scan sequences as well as the desired type of output analyses. The operator, using the mouse and/or the keyboard, the video camera located in the scanning machine, and the video screen, centers a reference marker such as, for example, a set of cross hairs displayed on the video screen on the desired component of the patient's eye, which is also displayed on the video screen. This is done by setting one of the cross hairs as the prime meridian for scanning. These steps are carried out using the positioning mechanism, which can move the scan head in the x, y, z and beta space (three translational motions plus rotation about the z-axis). The z-axis is parallel to the longitudinal axis 110. Once this is accomplished, the operator instructs the computer to proceed with the scanning sequence. From this point, the computer processor takes over the procedure, issues instructions to the scan head 108 and the scanning transducer 104, and receives positional and imaging data.
The computer processor proceeds with a sequence of operations such as, for example: (1) with the transducer carriage substantially centered on the arcuate guide track, rough focusing of the scanning transducer 104 on a selected eye component; (2) accurately centering the arcuate guide track with respect to the selected eye component; (3) accurately focusing the scanning transducer 104 on the selected feature of the selected eye component; (4) rotating the scan head assembly 108 through a substantial angle (including orthogonal) and repeating steps (1) through (3) on a second meridian; (5) rotating the scan head back to the prime meridian; (6) initiating a set of A-scans along each of the selected scan meridians and storing this information in the memory module; (7) utilizing the processor, converting the A-scans for each meridian into a set of B-scans and then processing the B-scans to form an image associated with each meridian; (8) performing the selected analyses on the A-scans, B-scans and images associated with each or all of the meridians scanned; and (9) outputting the data in a preselected format to an output device such as a printer. As can be appreciated, the patient's head must remain fixed with respect to the scanning device 101 while the above scanning operations are carried out, which, in a modern ultrasound scanning machine, can take several tens of seconds.
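Step (7) above, converting the set of A-scans for a meridian into a B-scan image, amounts to stacking the A-scan vectors side by side as image columns. The sketch below is a minimal illustration of that idea, not the scanner's actual implementation; the function name and data layout are assumptions:

```python
def form_b_scan(a_scans):
    """Assemble a B-scan image from a list of A-scans.

    Each A-scan is a 1-D list of echo amplitudes along one beam line;
    stacking them side by side as columns yields a 2-D B-scan image
    (rows = depth samples, columns = beam positions along the meridian).
    """
    if not a_scans:
        return []
    depth = len(a_scans[0])
    # rows = depth samples, columns = successive A-scan positions
    return [[scan[row] for scan in a_scans] for row in range(depth)]

# Three A-scans of four depth samples each
a_scans = [[0, 5, 9, 2], [1, 6, 8, 3], [0, 4, 9, 1]]
b_scan = form_b_scan(a_scans)
# b_scan has 4 rows (depth) and 3 columns (beam positions)
```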
[0152] An eyepiece serves to complete a continuous acoustic path for ultrasonic scanning, that path extending in water from the transducer to the surface of the patient's eye. The eyepiece 107 also separates the water in which the patient's eye is immersed (typically a saline solution) from the water in the chamber (typically distilled water) in which the transducer guide track assemblies are contained. The patient sits at the machine and looks down through the eyepiece 107 in the direction of the longitudinal axis 110. Finally, the eyepiece provides an additional steady rest for the patient and helps the patient's head to remain steady during a scan procedure.
[0155] An eyepiece serves to complete a continuous (substantially constant acoustic impedance) acoustic path for ultrasonic scanning, that path extending from the transducer to the surface of the patient's eye. The eyepiece also separates the water in which the patient's eye is immersed from the water in the chamber in which the positioner and scan head assemblies are immersed. Finally, when the patient is in position for a scan with his or her head firmly against the eye piece, the eyepiece provides a reference frame for the patient and helps the patient's head to remain steady during a scan. The eyepiece also must be able to pass optical wavelengths of light so that fixation targets can be used to focus the patient's eye in a desired focal state and alignment with respect to the eye's visual or optical axis.
[0156] An eyepiece system that satisfies these requirements typically consists of a mounting ring and a detachable eye piece. The mounting ring is attached to and is typically a permanent part of the main arc scanner assembly. The mounting ring has several attachment grooves which can accept attaching mechanisms on the eye piece. The eye piece is comprised of a base and a soft rubber or foam contoured face seal which is designed to seal against a typical human face around the eye that is to be scanned.
[0157] The eyepiece consists of a base ring 301 and a soft flexible sealing ring 303. A mounting ring (not shown) is attached to the main scanner housing as a permanent part of the main scanner assembly. The mounting ring has several attachment grooves which can accept attaching tabs molded into the eye piece base ring 301. In this embodiment, the attaching tabs are pushed down into the attachment grooves and then rotated into position, using the thumb and finger protrusions also molded into the eye piece base ring 301 to form a mechanical connection that seals the eye piece base against the mounting ring to prevent water leakage. This is also known as a bayonet type connection.
[0158] A sealed hygienic barrier or membrane 302 is formed as part of the eye piece base 301 and is typically located where the soft rubber or foam face seal 303 is connected to the eye piece base 301.
ICLs (Implantable Collamer Lenses)
[0161] An ICL is an artificial lens that is implanted in the eye. It is an implantable contact lens inserted through a small incision in the eye and placed into its position behind the iris but in front of the natural lens.
[0162] These are also known as the Implantable Collamer® Lens. For example, the Visian ICL is an FDA-approved implantable lens that works with the natural eye to correct vision. The Visian ICL procedure does not remove corneal tissue. The lens gently unfolds when implanted in the eye, rests behind the iris, and is biocompatible with body chemistry.
[0163] Implantable contact lenses are thin, pliable lenses often used as an alternative to LASIK vision correction surgery for permanent vision improvement. The lenses are implanted in the eye during ICL eye surgery and work with the eye's natural lens to improve vision.
[0164] Unlike LASIK surgery, no laser is required for ICL eye surgery and the procedure can be reversed by performing an additional surgery to remove the lens.
[0165] Two types of lenses are available for ICL eye surgery. A foldable lens is inserted through a small incision and unfolds into its place between the iris and the eye's natural lens. This procedure requires an extremely small incision that is self-healing. Another available lens, on the other hand, is inserted in front of the iris through a somewhat larger incision that must be closed with sutures which dissolve over time.
[0166] Phakic intraocular lenses (IOLs), also known as implantable contact lenses (ICLs), are surgically inserted into the eye, where they provide excellent quality of vision with predictable and stable results. The Visian ICL is a phakic intraocular lens that has received approval from the FDA for a wide range of myopic (nearsightedness) correction needs.
[0167] This technology is adapted from the proven lens technology used for cataract surgery and works by placing the ICL in front of the natural lens inside the eye. The Visian ICL is made of a material that is biocompatible and provides excellent optical performance. The implantable contact lens (ICL) is a posterior chamber phakic IOL that can be folded and injected through an approximately 3-mm self-sealing clear-corneal incision under topical anesthesia.
[0168] Perfectly fitting the ICL in the posterior chamber depends on lens design and lens sizing. These factors determine the position of the ICL in the posterior chamber, especially its vaulting—the phakic lens should neither touch the natural lens nor obstruct the normal circulation of the aqueous humor.
[0169] The concern about whether the ICL touches the crystalline lens has received much attention, because intimate contact between the artificial and the natural lenses raises the possibility of cataractogenesis.
[0170] Ultrasound biomicroscopy is helpful to identify the position of the posterior phakic IOL. Preoperatively, ultrasound can also show the existence of iridociliary cysts and other disorders that cannot be seen through slit lamp examination and would interfere with the ICL's normal position.
[0171] In a previous ultrasound study featuring a version of an ICL for the correction of high myopia, it was observed that the ICL clearly vaulted over the crystalline lens centrally; however, it touched the crystalline lens in mid-periphery in most cases, under the thickest part of the ICL, which is located at the optic-haptic junction (the vault is the distance between the posterior surface of the ICL and the anterior surface of the natural lens).
[0172] Contact between the ICL and the crystalline lens in midperiphery does not cause prompt opacification; however, it may block the normal circulation of the aqueous humor. A pool of aqueous stagnation may be responsible for the anterior subcapsular opacification seen in a few cases after uneventful implantation in the late postop period.
[0173] It is important to state that no ICL-induced cataracts have been reported with the hyperopic model, most likely because of its design, which avoids blockage of normal aqueous circulation.
[0174] Ideally, the ICL should bend forward, leaving a space between the phakic and crystalline lenses. However, ICL vaulting must not cause excessive contact with the posterior surface of the iris, which could lead to pigment dispersion, increasing the risk of developing glaucoma, especially in highly myopic eyes.
[0175] Proper ICL sizing is therefore of paramount importance, as it dictates vaulting and positioning of the ICL in the posterior chamber. Traditionally, sizing is based on the white-to-white measurement, which is an indirect and less accurate assessment of the ciliary sulcus diameter and can be misleading in some cases.
[0176] A more precise and direct way to measure the ciliary sulcus diameter to improve ICL sizing determination is needed, instead of using the recommended “golden rule,” which is to add 0.5 mm to the horizontal white-to-white distance for the myopic model and to subtract the same amount for the hyperopic one.
[0177] Small sizing miscalculations can bring about great errors in ICL positioning, and this “golden rule” has been criticized for lack of accuracy.
[0178] In adult life there is a steady axial growth of the crystalline lens of about 25 μm per year, which may be responsible for vaulting reduction over time, with all the negative consequences. That is why higher vaulting values are desirable, such as the recently recommended range of 300 μm to 600 μm.
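The figures above support a simple estimate of how long an initial vault remains within the recommended range; the following is an illustrative calculation only, assuming the stated steady growth rate:

```python
GROWTH_UM_PER_YEAR = 25    # axial growth of the crystalline lens
MIN_VAULT_UM = 300         # lower end of the recommended vault range

def years_until_minimum_vault(initial_vault_um):
    """Years until the vault erodes to the 300 um lower bound,
    assuming the lens grows steadily at 25 um per year."""
    return max(0, (initial_vault_um - MIN_VAULT_UM) / GROWTH_UM_PER_YEAR)

years = years_until_minimum_vault(600)   # starting at the upper bound
# 600 um vault -> (600 - 300) / 25 = 12 years
```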
[0179] Should cataract occur, the solution is simple because the ICL is easily explanted through the original clear-corneal incision. Routine phacoemulsification and IOL implantation are then performed, which, in reality, represent an alternative form of treatment, especially for high myopia in adults.
The Vault
[0182] The vault is defined as the space between the posterior surface of the ICL and the anterior surface of the natural lens. Perfectly fitting the ICL in the posterior chamber depends on lens design and lens sizing. These factors determine the position of the ICL in the posterior chamber, especially its vaulting. The phakic lens should neither touch the natural lens nor obstruct the normal circulation of the aqueous humor.
[0183] Roadmap for Measuring and Detecting the ICL Vault
[0185] The computer system 2400 may additionally include a computer-readable storage media reader 2425; a communications system 2430 (e.g., a modem, a network card (wireless or wired), an infra-red communication device, etc.); and working memory 2440, which may include RAM and ROM devices as described above. In some embodiments, the computer system 2400 may also include a processing acceleration unit 2435, which can include a DSP, a special-purpose processor and/or the like.
[0186] The computer-readable storage media reader 2425 can further be connected to a computer-readable storage medium, together (and, optionally, in combination with storage device(s) 2420) comprehensively representing remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing computer-readable information. The communications system 2430 may permit data to be exchanged with the network 2420 and/or any other computer described above with respect to the system 2400. Moreover, as disclosed herein, the term “storage medium” may represent one or more devices for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
[0187] The computer system 2400 may also comprise software elements, shown as being currently located within a working memory 2440, including an operating system 2445 and/or other code 2450, such as program code implementing the servers or devices described herein. It should be appreciated that alternate embodiments of a computer system 2400 may have numerous variations from that described above. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed.
[0188] The following description is based on processor executable MATLAB code, which is stored in working memory 2440 and, when executed by one or more central processing units (CPUs) or processors 2405, measures a vault for an ICL implanted in an eye of a patient and generates a vault map. While the present disclosure is discussed with reference to MATLAB, it is to be understood that any programming language may be employed.
[0189] The processor executable function MeasureICLVault.m contains the main “entry point” from the Insight application of the present disclosure (also referenced herein as MeasureICLVault( )). This processor executable function determines a best guess at a background pixel intensity and at the column of the mid-point of the eye. The processor executable function calls or invokes the processor executable code DetermineICL_ROI_Object( ) to obtain the right and left regions of interest or ROIs (an ROI is the inverted vault space, spanning from the mid-column to the end of the lens and/or ICL surface), recombines the 2 ROIs, and then calls or invokes the processor executable code CalculateTotalVaultUsingSlopeMethod( ) to measure the ICL vault from the combined ROI and grayscale image. The last function call in this processor executable function (DisplayVaultMeasurements( )) can be called for debugging purposes. The processor executable function returns the estimated midpoint as a 2-cell array representing the row and column of the estimated mid-column along the top ICL surface. It also returns 4 arrays, all indexed by the pixel column (of the lens) in the original image. The arrays include the following:
[0190] vaultLensRowArray—represents the row in each column where the lens was detected. If no lens is detected in a column, the value is set to “0”.
[0191] vaultICLRowArray, vaultICLColArray—represent the row and column of the point found along the line perpendicular to the lens intersecting the back ICL surface. The index of these arrays represents the column of the associated point on the lens. If no point is found on the ICL surface, the values are set to 0.
[0192] vaultMeasurementArray—contains the measured vault for each point on the lens, in μm.
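The overall flow described above can be sketched as follows. This is a simplified Python illustration (the disclosed code is MATLAB); the modal-value background estimate and the toy per-column gap measurement are assumptions standing in for the functions named above:

```python
def find_mid_col_and_background(img):
    # Background intensity estimated here as the modal (most common) pixel
    # value; the mid-column falls back to the center of the image, as in
    # the iris-detection fallback of FindMidColAndBackgroundPix.m.
    flat = [p for row in img for p in row]
    background = max(set(flat), key=flat.count)
    return background, len(img[0]) // 2

def measure_icl_vault(img):
    """Simplified sketch of the MeasureICLVault flow: estimate the
    background, threshold it away, then measure, per column, the gap
    (in rows) between the two remaining bright surfaces."""
    background, mid_col = find_mid_col_and_background(img)
    binary = [[1 if p > background else 0 for p in row] for row in img]
    vault = []
    for col in range(len(img[0])):
        rows = [r for r in range(len(img)) if binary[r][col]]
        vault.append(rows[-1] - rows[0] - 1 if len(rows) >= 2 else 0)
    return mid_col, vault

# Toy 6x4 image: two bright surfaces (rows 1 and 4) over a dark background
img = [[0]*4, [9]*4, [0]*4, [0]*4, [9]*4, [0]*4]
measure_icl_vault(img)  # -> (2, [2, 2, 2, 2])
```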
[0193] The processor executable function FindMidColAndBackgroundPix.m calls DetectICLBackGroundPixel( ) to determine the intensity of the background pixel and thresholds the image accordingly. The processor executable function attempts to detect an ICL front surface and the iris on each side of the eye. If two irises are detected, the processor executable function calculates the mid-column as halfway between the 2 irises. (This calculation does not need to be precise—it is only intended to definitively locate a point somewhere between the 2 irises so that, when the image is cut in half, both sides will have a sufficient area of ICL surface without an iris.) If both irises are not detected, the processor executable function uses the mid-column of the entire image as the mid-column.
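The mid-column rule just described (halfway between the two irises when both are detected, otherwise the mid-column of the entire image) reduces to a small branch. A sketch, with assumed argument names:

```python
def find_mid_column(image_width, left_iris_col=None, right_iris_col=None):
    """Mid-column halfway between the two detected irises; if either iris
    is missing, fall back to the mid-column of the whole image."""
    if left_iris_col is not None and right_iris_col is not None:
        return (left_iris_col + right_iris_col) // 2
    return image_width // 2

find_mid_column(1024, 150, 750)  # -> 450 (halfway between the irises)
find_mid_column(1024)            # -> 512 (image mid-column fallback)
```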
[0194] The processor executable function DetectICLBackgroundPixel.m determines the background intensity using 3 methods. This file/function is based on the processor executable code DetectBackgroundPixel( ) with the following differences:
[0195] in using the jpg files output from the Insight application for testing, it appeared that the first 20 rows and the first 25 columns were filled with miscellanea that needed to be ignored;
[0196] in method 2, the maximum intensity should be about 3 times the mean intensity; and
[0197] added processor executable code discounts a potential outlier of the output from the 3 methods.
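Two of the points above (skipping the first 20 rows and 25 columns of miscellanea, and requiring the maximum intensity to be about 3 times the mean) can be sketched as follows. The modal-value estimate is an assumption standing in for the disclosed 3-method approach:

```python
def estimate_background(img, skip_rows=20, skip_cols=25):
    """Estimate the background intensity from the image interior,
    ignoring the first rows/columns of header 'miscellanea'.
    Returns None when the maximum intensity is not clearly above
    the mean (max < ~3x mean), i.e. there is no strong signal."""
    interior = [p for row in img[skip_rows:] for p in row[skip_cols:]]
    mean = sum(interior) / len(interior)
    if max(interior) < 3 * mean:
        return None
    # modal value of the interior serves as the background estimate
    return max(set(interior), key=interior.count)

img = [[2] * 30 for _ in range(30)]
img[25][27] = 100            # one bright signal pixel in the interior
estimate_background(img)     # -> 2
```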
[0198] The processor executable functions MaskICLJunk.m and MaskLeftICLJunk.m attempt to mask out anything other than the iris, sclera, ICL, and lens.
[0199] The processor executable function DetermineICL_ROI_Object.m isolates/returns an object representing the inverted vault space for the “half image” (space between the lens and the back of the ICL). It can accomplish this by:
1) adjusting the threshold until separate lens/ICL surfaces are visible at the previously determined mid-column (which is the right edge of the objects in the half images); and
2) continuing to adjust the threshold until a vault object is isolated. The processor executable function calls the processor executable code FindICLLensSegment( ) to try to distinguish the ICL and lens surfaces from the cornea and cataract surfaces. If it cannot isolate the lens/ICL surfaces, it adjusts the threshold until it can (or gives up). The processor executable function returns the ROI image, the grayscale image with all but the ROI area masked out, the final threshold used when obtaining the ROI, and the left-most column of the ROI.
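Steps 1) and 2) above amount to a threshold-adjustment loop. The sketch below uses a stand-in separation test (two distinct bright runs in a single column) in place of FindICLLensSegment( ); the function name, parameters, and give-up bound are assumptions:

```python
def isolate_vault_object(gray_col, start_threshold, max_threshold, step=5):
    """Raise the threshold until a column of the half-image shows two
    separate bright runs (lens surface and ICL surface).  Returns the
    threshold that separated them, or None if it gives up."""
    def bright_runs(th):
        runs, in_run = 0, False
        for p in gray_col:
            if p >= th and not in_run:
                runs, in_run = runs + 1, True
            elif p < th:
                in_run = False
        return runs

    th = start_threshold
    while th <= max_threshold:
        if bright_runs(th) == 2:      # lens and ICL surfaces now distinct
            return th
        th += step                    # keep adjusting the threshold
    return None                       # give up

# Two bright surfaces separated by a shallow dip (values 120, 110)
isolate_vault_object([10, 200, 180, 120, 110, 190, 210, 10], 100, 255)  # -> 115
```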
[0200] The processor executable function CalculateTotalVaultUsingSlopeMethod.m uses the ICL ROI as a starting point/guide and calls DetermineGrayPeaks( ) to find the surface peaks in the grayscale image for the back ICL and lens surfaces. The processor executable function then calls the processor executable code MeasureVaultUsingPeakSurfaces( ) to calculate the vault. Finally, the processor executable function attempts to fill in holes in the lens and ICL arrays via interpolation and then recalculates the vault measurement for any holes in that array.
[0201] The processor executable function DetermineGrayPeaks.m, for each column in a grayscale image between specified start and end columns, calls the processor executable function FindLensAndICLSurfaces( ) to find the ICL and lens surfaces (peaks) using the ICL ROI as a guide. If a surface is not detected for 5 or more contiguous columns, the processor executable function resets the last known row for that surface to try to re-detect it, in case there was a gap in the surface in the image. Once the processor executable function reaches the end of the ROI, it calls ExtendIGrayPeaks( ) to try to detect the surfaces for as far as it can. If there is not a valid ROI, the processor executable function starts at startCol and calls ExtendIGrayPeaks( ) to find as much of the surfaces as it can.
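The gap-handling rule described above (resetting the last known surface row after 5 or more contiguous undetected columns so the surface can be re-detected) might look like the following sketch. Representing detections as a per-column row array, with 0 meaning no detection, is an assumption consistent with the array conventions of [0190]:

```python
GAP_LIMIT = 5  # contiguous undetected columns before resetting the trace

def trace_surface(detections):
    """Carry the last known surface row across small gaps; after GAP_LIMIT
    contiguous misses, reset so the surface can be re-detected afresh.
    `detections` holds the detected row per column, or 0 when none."""
    traced, last_row, misses = [], 0, 0
    for row in detections:
        if row:                      # surface detected in this column
            last_row, misses = row, 0
        else:
            misses += 1
            if misses >= GAP_LIMIT:  # gap too wide: forget the old surface
                last_row = 0
        traced.append(last_row)
    return traced

# Short gaps are bridged; the 5-column gap forces a reset to 0
trace_surface([10, 10, 0, 0, 11, 0, 0, 0, 0, 0, 12])
# -> [10, 10, 10, 10, 11, 11, 11, 11, 11, 0, 12]
```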
[0202] The processor executable function FindLensAndICLSurfaces( ) finds the row of the ICL surface and the row of the lens surface in a specified column of the image. If the bottom and top of the ICL ROI can be found, the processor executable function looks for the next peak above/below those rows in GrayCol (specified single column from the grayscale image).
[0203] The processor executable function ExtendIGrayPeaks( ), using the grayscale image, starts with the last known row for the given surface and traces the peak out further until it loses it.
[0204] The processor executable function FindPeaksInColumn.m finds the peaks (local maxima) in a specified column of a grayscale image between the specified starting and ending rows using the following logic: [0205] If endingRow<=0, it signifies that we are only looking for the first peak. [0206] If consecutive cells are equivalent (and a peak), it chooses the middle row. [0207] If one peak is found, the intensity then dips below a “low threshold” but does not dip below a valley threshold, and another equal peak is found within a specified range, it chooses the midpoint between those 2 peaks.
Returns an array of peak locations (peakRows), an associated array of the intensities at those peak locations, and the original column array with all but the peak locations set to zero.
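A simplified Python sketch of this peak-finding logic follows. It covers the plateau (middle-row) rule and the three return values; the double-peak/valley-threshold merge is omitted, and the parameter names and minimum intensity are illustrative assumptions, not the actual source:

```python
def find_peaks_in_column(col, start_row=0, end_row=None, min_intensity=10):
    """Find local maxima (peaks) in one grayscale column.

    Plateaus of equal values report their middle row. Returns the peak
    rows, the intensities at those rows, and the column with all but
    the peak locations set to zero."""
    if end_row is None:
        end_row = len(col)
    peak_rows = []
    r = start_row
    while r < end_row:
        v = col[r]
        if v < min_intensity:
            r += 1
            continue
        # extend over a plateau of equal values
        p = r
        while p + 1 < end_row and col[p + 1] == v:
            p += 1
        left_ok = r == 0 or col[r - 1] < v
        right_ok = p == end_row - 1 or col[p + 1] < v
        if left_ok and right_ok:
            peak_rows.append((r + p) // 2)  # middle row of the plateau
        r = p + 1
    intensities = [col[i] for i in peak_rows]
    keep = set(peak_rows)
    masked = [col[i] if i in keep else 0 for i in range(len(col))]
    return peak_rows, intensities, masked
```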
[0208] The processor executable function MeasureVaultUsingPeakSurfaces.m, for each column along the lens surface, finds the average slope of the lens at that point (averaging the slopes of lines formed by equidistant points on the lens on either side of the current lens point, with various spans), finds the point on the ICL surface along the line perpendicular to the lens point, performs a reasonableness check on the point, records the point in the output arrays, and records the distance between the lens point and the associated ICL point. Outputs are the 4 arrays described in the processor executable functions CalculateTotalVaultUsingSlopeMethod( ) and MeasureICLVault( ). It also finds a point in the general area of the iris root: not the exact location of the root, but somewhere close enough to establish a reasonable “region of interest” in the image.
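The slope-averaging step might look like the following Python sketch; the function name, the spans, and the point layout are illustrative assumptions rather than the actual MATLAB source:

```python
def average_local_slope(xs, ys, i, spans=(1, 2, 3)):
    """Average slope of the lens surface at index i.

    For each span s, take the two points s columns away on either side
    of point i and compute the slope of the line through them; average
    the results over all spans that fit within the array."""
    slopes = []
    for s in spans:
        if i - s >= 0 and i + s < len(xs):
            dx = xs[i + s] - xs[i - s]
            dy = ys[i + s] - ys[i - s]
            if dx != 0:
                slopes.append(dy / dx)
    return sum(slopes) / len(slopes) if slopes else 0.0
```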
ICL Vault Detection/Measurement
[0209] U.S. patent application Ser. No. 16/422,182 entitled “A Method for Measuring Behind the Iris after Locating the Scleral Spur” is incorporated herein by reference.
[0210] U.S. patent application Ser. No. 16/422,182 discloses a method, using ultrasonic imaging of the anterior segment of an eye, for automatically locating the scleral spur in an ultrasound image of an eye using a form of segmentation analysis, and, using the scleral spur as a fiduciary, automatically making measurements from the image, in front of and behind the iris.
[0211] The method of U.S. patent application Ser. No. 16/422,182 consists of the following principal steps which are performed automatically: [0212] 1. Acquire B-Scans. [0213] 2. Binarize B-Scans. [0214] 3. Determine the iris/lens contact distance (ILCD) and anterior chamber depth (ACD). [0215] 4. Locate Root of the Iris. [0216] 5. Locate Root of the Ciliary Sulcus. [0217] 6. Isolate the Sclera. [0218] 7. Locate the Scleral Spur. [0219] 8. Using the scleral spur as a fiduciary, make measurements including, at least, the trabecular/iris angle (TIA), the iris lens contact distance, the iris zonule distance (IZD) and the trabecular ciliary process distance (TCPD). [0220] 9. Prepare an automated report based on a B-Scan with all measurements displayed.
[0221] The method of U.S. patent application Ser. No. 16/422,182 is based on detecting a scleral spur in an ultrasound image of an eye of a patient. The method comprises providing an ultrasound device having a scan head with an arcuate guide track and a carriage movable along the arcuate guide track; an eyepiece configured to maintain the eye of the patient in a fixed location with respect to the arcuate guide track; and a transducer connected to the carriage. The method includes emitting, from the transducer, ultrasound pulses as the carriage moves along the arcuate guide track; storing the received ultrasound pulses on a non-transitory computer readable medium; forming, by at least one electronic device, a B-Scan of the eye of the patient based on the received ultrasound pulses; binarizing, by the at least one electronic device, the B-Scan from a grayscale color palette to a black/white color palette; determining, by the at least one electronic device, an average surface of a sclera of the eye; and locating, by the at least one electronic device, a bump of the average surface of the sclera that corresponds to the scleral spur.
[0222] In U.S. patent application Ser. No. 16/422,182, a system is disclosed for detecting a scleral spur in an ultrasound image of an eye of a patient, comprising an ultrasound device, having a scan head having an arcuate guide track and a carriage movable along the arcuate guide track; an eyepiece configured to maintain the eye of the patient in a fixed location with respect to the arcuate guide track; a transducer connected to the carriage, wherein ultrasound pulses are emitted into the eye of the patient and the received ultrasound pulses are stored on a non-transitory computer readable medium; wherein at least one electronic device has non-transitory readable medium and has instructions that, when executed, cause the at least one electronic device to form a B-Scan of the eye of the patient based on the received ultrasound pulses; binarize the B-Scan from a grayscale color palette to a black/white color palette; determine an average surface of a sclera of the eye; and locate a bump of the average surface of the sclera that corresponds to the scleral spur.
[0223] The method of U.S. patent application Ser. No. 16/422,182 also discloses a system for binarizing a B-Scan of an eye of a patient, comprising an ultrasound device, having a scan head having an arcuate guide track and a carriage movable along the arcuate guide track; an eyepiece configured to maintain the eye of the patient in a fixed location with respect to the arcuate guide track; a transducer connected to the carriage, wherein ultrasound pulses are emitted into the eye of the patient, and wherein the received ultrasound pulses are stored on a non-transitory computer readable medium and wherein at least one electronic device having the non-transitory readable medium and having instructions that, when executed, cause the at least one electronic device to form a B-Scan of the eye of the patient based on the received ultrasound pulses; to determine an average intensity of a grayscale color palette of the B-Scan of the eye; to binarize the B-Scan of the eye from the grayscale color palette to a black/white color palette, wherein discrete areas of the B-Scan above a predetermined intensity are binarized to white and discrete areas of the B-Scan below the predetermined intensity are binarized to black, and the predetermined intensity depends on the average intensity.
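A minimal Python sketch of the binarization just described, with the predetermined intensity derived from the average intensity of the image; the scale factor `k` is an assumed tuning parameter, not a value taken from the source:

```python
def binarize_bscan(image, k=1.5):
    """Binarize a grayscale B-scan to a black/white palette.

    The threshold depends on the average intensity (here, mean * k):
    pixels above the threshold become white (255), pixels below
    become black (0)."""
    flat = [p for row in image for p in row]
    threshold = k * sum(flat) / len(flat)
    return [[255 if p > threshold else 0 for p in row] for row in image]
```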
[0226] Various figures and embodiments will now be discussed in connection with the processor executable functions described above under Roadmap for ICL Vault.
[0227] The whole anterior segment image is first binarized—from 0 to 255 grades of grayscale to black and white.
A. Determine background pixels and filter them out.
B. Find the approximate center by starting in a middle of image and trying to detect an iris on each side. If found, the processor executable function selects a midpoint between them, otherwise it just uses the midpoint of image.
C. For each half (horizontally flip the right side image for processing):
[0228] a. Find a lens and ICL surface point at the mid column (right side of the half image).
[0230] b. Trace along the surfaces in the left direction until one of the surfaces “disappears” (is no longer visible/detectable in the image).
[0231] c. Create an object (ROI—region of interest) comprising the cavity between the 2 surfaces, bounded on the right by the edge of the half image, and on the left by the end point of the first surface to “disappear”.
D. Combine the 2 ROIs into one
E. For each column in the ROI of
1) Features of an ideal image:
[0236] a. a strong difference in pixel intensity between background pixels and real pixels (in oversaturated images, surfaces sometimes blur together);
[0237] b. continuous lines defining the lens and ICL surfaces (currently, as soon as a break in the surface is encountered, it is assumed to be the end point; or worse, a different surface is detected and interpreted as the continuation of the surface);
[0238] c. image mostly centered. Processor executable code is available to detect the center if a clear definition of the iris is obtained, but this works best when the image is mostly centered; and
[0239] d. minimal extraneous “junk” that can be confused with the lens or ICL, or that can throw off calculations of background pixel intensity.
2) Features of images causing difficulty so far:
[0240] a. Broken surfaces
Scan set 3.4 showing broken ICL/lens, resulting in mistaking the cornea for the lens.
[0242] Scan sets 2.0-2.1 showing missing ICL surface resulting in mistaking the top of the ICL for the bottom of the ICL.
[0245] b. Extraneous features in an image
[0246] Scan set 2.2 with extraneous features under the lens, which can be mistaken for a lens or an ICL:
[0254] With reference to
1. Thresholding.
[0266] Starting with an ultrasound image of an eye containing an ICL at a certain meridian (see
2. Find the center of the iris.
[0267] The center of the iris is found as described in U.S. patent application Ser. No. 16/422,182, which is incorporated herein by this reference.
[0268] To find the approximate center of the iris, the processor starts with the center column of the binarized image and looks at the pixels in that column. The processor then moves right or left until an area that is between the “left and right iris” depicted in the image can be identified. This is done by the processor looking for “thin” lines along the column in the binarized image, depicting the ICL and lens surfaces, without finding the “thicker” lines that would depict the iris. The processor then moves left and right until the thicker lines near the ICL surface are found, depicting the iris on each side. Once the inner edges of the iris are detected, the processor chooses the point mid-way between the two as the approximate center of the iris.
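The iris-center search can be sketched as follows; the run-length threshold separating “thin” lens/ICL lines from the “thicker” iris, and the helper names, are illustrative assumptions:

```python
def column_run_lengths(binary, col):
    """Lengths of contiguous white (255) runs in one column of a
    binarized image; used to tell thin lens/ICL lines from the
    thicker iris."""
    runs, current = [], 0
    for row in binary:
        if row[col] == 255:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)
    return runs

def find_iris_center(binary, thick=5):
    """Approximate iris center: from the middle column, walk left and
    right until a column contains a 'thick' run (the iris), then take
    the midpoint of the two inner iris edges."""
    width = len(binary[0])
    mid = width // 2
    left = right = None
    for c in range(mid, -1, -1):           # walk left
        if any(r >= thick for r in column_run_lengths(binary, c)):
            left = c
            break
    for c in range(mid, width):            # walk right
        if any(r >= thick for r in column_run_lengths(binary, c)):
            right = c
            break
    if left is not None and right is not None:
        return (left + right) // 2
    return mid                             # fall back to image midpoint
```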
3. Split image
[0269] Once the center of the iris is found, the image is split into a left and a right side and the right image is horizontally flipped by the processor to allow the same ROI algorithm to be applied (
4. Find ROI
[0270] In this step, the processor, through executing the algorithm, aims to identify the anterior capsule surface as well as the posterior ICL surface and reject all other anatomy present following thresholding, such as the cornea surfaces, scleral wall, and iris, as well as the eyelid (if present) and cataract reflections (if present). Factors included in the decisions to include or reject objects in the image are size, shape, and location.
[0271] Following rejection of extraneous objects, the vault between the anterior capsule and the posterior lens is identified by the processor starting at the center and continuing until one of the two surfaces is no longer visible. The ROI is returned for each half of the image (
5. Measure vault height
[0272] The vaults in both halves of the images are merged by the processor (
[0273] For example, the vault height for a given point on the anterior capsule can be found by the processor by: [0274] Finding the orientation of the anterior capsule surface at the given point; [0275] Calculating the normal to this surface orientation; [0276] Finding the intersection of the normal with the posterior surface of the ICL; and [0277] Calculating the distance between the point on the anterior capsule and its corresponding intersection point on the posterior surface of the ICL.
[0278] Finding the orientation of the anterior capsule in potentially noisy data may require the use of curve fitting or local linear fitting on capsular peak points found.
[0279] This process is repeated by the processor for all points found on the anterior capsule.
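The four steps above can be sketched in Python. Taking the nearest ICL point to the normal line is a simplification of a true line/surface intersection, and all names and the span are illustrative assumptions:

```python
import math

def vault_height(lens_pts, icl_pts, i, span=2):
    """Vault height at lens point i: estimate the local lens
    orientation, form the normal through the point, find the ICL
    point nearest that normal line, and return the distance."""
    x0, y0 = lens_pts[i]
    # 1. local orientation from neighbors 'span' columns to each side
    xa, ya = lens_pts[max(i - span, 0)]
    xb, yb = lens_pts[min(i + span, len(lens_pts) - 1)]
    norm = math.hypot(xb - xa, yb - ya)
    tx, ty = (xb - xa) / norm, (yb - ya) / norm  # unit tangent
    # 2./3. a point lies on the normal line through (x0, y0) when its
    # tangential offset (p - p0) . t is zero; pick the ICL point that
    # minimizes that offset
    def off_normal(p):
        return abs((p[0] - x0) * tx + (p[1] - y0) * ty)
    qx, qy = min(icl_pts, key=off_normal)
    # 4. distance between the lens point and its ICL intersection
    return math.hypot(qx - x0, qy - y0)
```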
Post Processing
[0280] In addition to the basic vault height calculations described above, additional post-processing is necessary to address issues with images that are not ideal.
[0281] Features of ideal image:
[0282] a. Strong difference in pixel intensity between background pixels and anatomical pixels.
[0283] b. Continuous lines defining lens and ICL surface.
[0284] c. Image mostly centered.
[0285] d. Minimal extraneous artefacts, such as eyelids, that may throw off thresholding.
[0286] Examples of features of images requiring additional processing:
a. Interrupted surfaces
[0287] When either the anterior capsule surface or the posterior ICL surface is interrupted, the processor addresses the gap in many ways. Small gaps can be interpolated. Gaps in the posterior ICL surface can cause the algorithm to initially jump to the superior ICL surface for the vault calculation. The processor can detect and correct these issues.
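Small-gap interpolation of a detected surface might look like the following sketch; `max_gap` is an assumed limit on how wide a gap is still safe to fill, and the None-based array layout is illustrative:

```python
def fill_small_gaps(surface, max_gap=5):
    """Linearly interpolate small gaps (None entries) in a detected
    surface row array. Gaps wider than max_gap, or gaps touching the
    array ends, are left alone."""
    out = list(surface)
    i, n = 0, len(out)
    while i < n:
        if out[i] is None:
            j = i
            while j < n and out[j] is None:
                j += 1                       # j = first non-gap index
            if 0 < i and j < n and (j - i) <= max_gap:
                y0, y1 = out[i - 1], out[j]  # values bounding the gap
                for k in range(i, j):
                    t = (k - i + 1) / (j - i + 1)
                    out[k] = y0 + t * (y1 - y0)
            i = j
        else:
            i += 1
    return out
```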
b. “Extraneous” features in an image
[0288] Small features such as those due to early stages of cataract in the images can exceed the threshold and interfere with the algorithm for detecting the anterior capsule. The processor can detect these artefacts for reliable vault detection.
Vault Maps
[0290] Once vault heights are determined for each of the images of a multi-meridian ultrasound scan, the vault height measurements can be combined into a vault map. A vault map represents the vault height as a function of the location on the anterior capsule presented in the form of a map.
[0292] Those skilled in the art can appreciate there are multiple ways to create maps from a set of meridians. In this disclosure, the process detailed below is followed:
1. Recombine, by the processor, the halves to form a single Vault Map ROI for the meridian.
This reverses the initial split we performed on the image. The result is a single ROI for each meridian. This step is not necessary if the ROI wasn't split initially.
2. Extrapolate, by the processor, the positions of the ROIs for each individual B-Scan onto a single 3D coordinate system. Convert, by the processor, the position of the ROI from its two-dimensional position within the image into a common three-dimensional space based on the orientation of the scan head relative to the eye as the sweep was captured. The 3D position of each point in each ROI, for each meridian, is calculated.
3. Align, by the processor, the ROIs within the 3D coordinate system around a central axis. It is necessary to account for eye movement which will naturally occur as scanning is performed. The central axis of each ROI is calculated and used to position it within the 3D coordinate space.
4. Interpolate, by the processor, between calculated ROI data to fill in the 3D space between scanned meridians.
[0293] The full three-dimensional area of the vault must now be interpolated from the 2D data aligned within the common 3D space. This is done by creating, by the processor, a blank map image and aligning, by the processor, the center of the map to the central axis of the aligned ROIs. Then determine, by the processor, the radial position of each pixel relative to center and calculate, by the processor, the vault height at each pixel location. See
[0294] Each pixel in the map will have an ROI on either side of it. If data is not present in one or both of the aligned ROIs at that pixel's radial position, the pixel will not have a height assigned to it. Interpolation is first performed within the meridians on either side of the pixel to calculate the height within the ROI at the pixel's radial distance. The weighted average of the resulting values, based on the arc length between the pixel and the meridians on either side, is calculated to find the height at the pixel's location. This process is repeated by the processor for all pixel locations we will display on our final map.
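The weighted average between the two bracketing meridians can be sketched as follows; the (angle, height) tuple layout is an assumed representation of the per-meridian data at the pixel's radial distance:

```python
def interp_between_meridians(theta, m1, m2):
    """Height at map angle theta (radians), as the weighted average of
    the heights of the two bracketing meridians m1 = (angle1, height1)
    and m2 = (angle2, height2), weighted by the angular (arc-length)
    distance from the pixel to each meridian. Assumes
    angle1 <= theta <= angle2."""
    a1, h1 = m1
    a2, h2 = m2
    d1 = theta - a1   # arc distance to the first meridian
    d2 = a2 - theta   # arc distance to the second meridian
    total = d1 + d2
    if total == 0:
        return h1
    # the closer meridian gets the larger weight
    return (h2 * d1 + h1 * d2) / total
```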
5. Assign, by the processor, color values to the height range.
[0295] Once the heights are calculated, the maximum and minimum height values are used, by the processor, to set the color scale, unless the user has created default maximum and minimum height values, which will be used instead. There are a variety of options within the software to change the color scale.
6. Display the map.
Automated Centering and Range Finding
[0298] In early practice, when preparing to set up a particular B-scan, centering the transducer on the arcuate guide track and on the center of the patient's eye was a manual process using both optical and ultrasound techniques. As disclosed in U.S. Pat. No. 8,758,252 “Innovative Components for an Ultrasonic Arc Scanning Apparatus”,
[0299] In early practice, range finding (setting the focal point of the ultrasound transducer at the desired location within the eye) was also a manual process.
[0300] In current practice, centering and range finding have been automated. The general steps for automating centering and range finding are:
[0301] For each meridian: [0302] 1. Use, by the processor, the camera to do coarse centering [0303] 2. Do, by the processor, range finding [0304] 3. Use, by the processor, linear scans to do fine centering [0305] These 3 steps take about 7 seconds.
Optical Centering
[0306] The processor takes 5 frames (about 1 second, or 200 ms per frame), finds the outer diameter of the iris by looking for a pixel gradient, fits a circle to the horizontal diameter (to avoid eyelid blocking) by least squares, averages the diameters from the 5 frames to get an average x-y center (or gets the x-y center of each frame), and looks for movement by taking the difference of the x-y centers.
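The least-squares circle fit can be sketched with the algebraic (Kasa) method; this is one standard approach to the "fits a circle … by least squares" step, not necessarily the one used in the device software:

```python
import math

def fit_circle(points):
    """Least-squares (Kasa) circle fit: solve
    x^2 + y^2 + D*x + E*y + F = 0 in the least-squares sense,
    then recover (cx, cy, radius)."""
    # normal equations A^T A v = A^T b for the 3 unknowns D, E, F
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        b = -(x * x + y * y)
        for i in range(3):
            rhs[i] += row[i] * b
            for j in range(3):
                A[i][j] += row[i] * row[j]
    # tiny Gaussian elimination with partial pivoting
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        rhs[c], rhs[p] = rhs[p], rhs[c]
        for r in range(c + 1, 3):
            f = A[r][c] / A[c][c]
            for j in range(c, 3):
                A[r][j] -= f * A[c][j]
            rhs[r] -= f * rhs[c]
    v = [0.0] * 3
    for r in (2, 1, 0):  # back substitution
        v[r] = (rhs[r] - sum(A[r][j] * v[j] for j in range(r + 1, 3))) / A[r][r]
    D, E, F = v
    cx, cy = -D / 2, -E / 2
    return cx, cy, math.sqrt(cx * cx + cy * cy - F)
```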
[0307] The processor moves the transducer to center by using the positioner, assuming the transducer is centered on the arcuate and linear tracks.
[0308] Note—on the screen, the cross hairs indicate the position of the transducer when centered. The idea is to get the center of the eye as measured to be co-incident with the cross hairs just before scanning. If the center of the cross hairs is within the diamond on the screen, then there is enough travel on the gamma and arcuate guide tracks to avoid hitting the container walls.
Range Finding
[0309] The positioner, in response to commands from the processor, sets the scan head as far back as it can. The device then does a horizontal linear scan (transducer at the center of the arcuate guide track). The processor looks for a curved surface with Rc in the range of the cornea (another least squares fit) and hence finds the z-position of the cornea. The processor advances z by, say, 4 mm and repeats until the anterior surface of the cornea is confirmed, then sets the position of the focal plane where desired for scanning. If centering, the processor sets the focal plane in the cornea (about 2.5 mm beyond the anterior surface).
Ultrasound Centering
[0310] The processor sets the transducer focal plane in the cornea and does a linear scan to find the maximum of the anterior cornea (shortest time from the A-scan). This is the center of the cornea and is used to move the positioning mechanism assembly until the center of the cornea is co-incident with the center of the cross hairs.
Automated Detection of Anterior Capsule for the Purpose of Automatically Setting Scan Depth.
[0311] For optimal image quality in ultrasound it is important to have the focal plane of the ultrasound transducer set at or near the structures of interest. One common structure of interest is the anterior capsule. For instance, measurements to correctly size an ICL are generally made between structures very close to the anterior capsule. It can be challenging to determine the optimal focal depth to image the anterior capsule when setting imaging parameters manually. Therefore, a new method was developed to automatically detect the anterior capsule using progressive scanning and ultrasound image analysis followed by the determination of the optimal focal plane depth.
[0312] The process starts with a lateral (also known as a linear) sweep with the transducer retracted to its base plane (Z=0 mm). An image analysis is performed which searches for circular surfaces in the image within a specified radius range. An analysis is performed, by the processor, to select the most likely candidate representing the epithelial surface of the cornea. If no qualifying surface is found, the transducer is moved forward a short distance (e.g., Z=2 mm) and the process is repeated with the new focal plane. Once an image is obtained with an identifiable epithelial surface, the image is further examined, by the processor, in the region posterior to the epithelial surface. A point is chosen, by the processor, at 3 mm posterior to the center of the epithelial surface, and the point among candidate surfaces with the minimal distance from the reference point is chosen, by the processor, as the target focal position. If no candidate surface within 1 mm is located, the transducer is moved forward and the process is repeated. If a capsule surface is found by the processor, the transducer is moved to that position, the lateral scan is repeated, and the same capsule analysis is performed, by the processor, again at the new focal plane, in order to adjust the focal position based on the better focused image. The transducer can now be moved to a depth (Z) such that the planned focal plane will go through the anterior capsule surface at the estimated visual axis (or a settable offset relative to that).
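The progressive scanning loop might be sketched as follows. Here `scan_at` is a hypothetical callback standing in for the lateral sweep plus image analysis at a given focal depth, returning the epithelial surface depth and candidate capsule depths (or `(None, [])` if no surface qualifies); the 3 mm reference offset, 1 mm acceptance window, and 2 mm step mirror the values given above, while `z_max` is an assumed travel limit. The final refinement pass at the found position is omitted:

```python
def find_capsule_focus(scan_at, z_start=0.0, z_step=2.0, z_max=12.0):
    """Advance the transducer in z_step increments until a lateral
    sweep yields an epithelial surface, then pick the candidate
    surface nearest the reference point 3 mm posterior to it."""
    z = z_start
    while z <= z_max:
        epi_z, candidates = scan_at(z)
        if epi_z is not None:
            ref = epi_z + 3.0  # reference point 3 mm posterior
            best = min(candidates, key=lambda c: abs(c - ref), default=None)
            if best is not None and abs(best - ref) <= 1.0:
                return best    # target focal position
        z += z_step            # no qualifying surface: move forward
    return None
```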
[0314] In one aspect of the present disclosure, a method for detecting and measuring a vault of an anterior segment of an eye of a patient can comprise: imaging, by an ultrasound scanning device, an anterior segment of the eye; locating, by a processor, an implanted contact lens (ICL) between a cornea and a natural lens; forming, by the processor, a B-scan of at least a portion of the eye based on the received ultrasound pulses; binarizing and thresholding, by the processor, the B-Scan from a grayscale color palette to a black/white color palette; determining, by the processor, using segmentation analysis of the binarized and thresholded B-scan, a fiduciary location in the anterior segment of the eye; and forming, by the processor and using the binarized and thresholded B-scan and fiduciary location, a vault map mapping a distance between an anterior segment surface and a posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior segment surface.
[0315] In the aspect, the determining can include locating, by the processor, a center of an iris of the eye; dividing the B-Scan along the iris center to form left and right portions of the B-Scan; and horizontally flipping one of the left and right portions.
[0316] The aspect can further include for each of the left and right portions: finding, by the processor, a region of interest representing a rough estimate of a vault; measuring, by the processor, a vault height along the anterior segment surface; and merging, by the processor, the measured vault heights in the left and right portions to form a common vault map region of interest.
[0317] In the aspect, the B-scan can comprise multiple B-scans and each of the B-scans corresponds to a region of interest and wherein, for each of the plurality of regions of interest, the forming can comprise: extrapolating, by the processor, a position of each selected region of interest for each scanned meridian onto a common three-dimensional coordinate system; and aligning, by the processor, a central axis of each of the regions of interest in the plurality of regions of interest within the common three-dimensional coordinate system.
[0318] In the aspect, the forming can comprise interpolating, by the processor, between calculated region of interest data for each region of interest to fill in a three-dimensional space between scanned meridians.
[0319] In the aspect, the forming can comprise: assigning, by the processor, color values to a range of virtual heights; and generating, by the processor, the vault map based on the assigned color values.
[0320] In an aspect, an eye imaging system can comprise: an input to receive a plurality of A-scans of an anterior capsule of an eye of a patient from an ultrasound scanning device; a processor coupled with the input; and a memory coupled with and readable by the processor and storing therein a set of instructions which, when executed by the processor causes the processor to: locate an implanted contact lens (ICL) between a cornea and a natural lens of the eye; form, from the plurality of A-scans, a B-scan of the anterior capsule of the eye; binarize and threshold the B-Scan from a grayscale color palette to a black/white color palette; determine, using segmentation analysis, a fiduciary location in the anterior capsule of the eye; and form, using the binarized and thresholded B-scan and fiduciary location, a vault map mapping a distance between an anterior capsule and a posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior capsule surface.
[0321] In the aspect, the determining can comprise: locating a center of an iris of the eye; dividing the B-Scan along the iris center to form left and right portions of the B-Scan; and horizontally flipping one of the left and right portions.
[0322] In the aspect, for each of the left and right portions, the instructions can cause the processor to: find a region of interest representing a rough estimate of a vault; and measure a vault height along the anterior capsule surface; wherein the instructions can further cause the processor to merge the measured vault heights in the left and right portions to form a common vault map region of interest.
[0323] In the aspect, the B-scan can comprise multiple B-scans and each of the B-scans can correspond to a region of interest and wherein, for each of the plurality of regions of interest, the forming can comprise: extrapolating a position of each selected region of interest for each scanned meridian onto a common three-dimensional coordinate system; and aligning a central axis of each of the regions of interest in the plurality of regions of interest within the common three-dimensional coordinate system.
[0324] In the aspect, the forming can comprise: interpolating between calculated region of interest data for each region of interest to fill in a three-dimensional space between scanned meridians.
[0325] In the aspect, the forming can comprise assigning color values to a range of virtual heights; and generating the vault map based on the assigned color values.
[0326] In an aspect of the disclosure, an eye imaging system can comprise: an input to receive a plurality of A-scans of an anterior capsule of an eye of a patient from an ultrasound scanning device; a processor coupled with the input; and a memory coupled with and readable by the processor and storing therein a set of instructions which, when executed by the processor causes the processor to: locate an implanted contact lens (ICL) between a cornea and a natural lens of the eye; form, from the plurality of A-scans, a B-scan of the anterior capsule of the eye; remove background pixels of the B-scan to form a binary image; determine a fiduciary location in the anterior capsule of the eye; and determine, from the binary image and the fiduciary location, a vault distance between the anterior capsule and a posterior surface of the ICL along a selected line drawn perpendicular to a local surface of the anterior capsule surface.
[0327] In an aspect of the disclosure, an eye imaging method can comprise: a processor, locating an implanted contact lens (ICL) between a cornea and a natural lens of the eye; forming, from the plurality of A-scans, a B-scan of the anterior capsule of the eye; removing background pixels of the B-scan to form a binary image; determining a fiduciary location in the anterior capsule of the eye; and determining, from the binary image and the fiduciary location, a vault distance between the anterior capsule and a posterior surface of the ICL along a selected line drawn perpendicular to a local surface of the anterior capsule surface.
[0328] In the aspects, the vault distance determining can comprise: forming, based on a plurality of vault distances, a vault map mapping a distance between the anterior capsule and the posterior surface of the ICL along a plurality of lines drawn perpendicular to a local surface of the anterior capsule surface, the plurality of lines comprising the selected lines.
[0329] In the aspects, the determining can comprise: locating a center of an iris of the eye; dividing the B-Scan along the iris center to form left and right portions of the B-Scan; and horizontally flipping one of the left and right portions.
[0330] In the aspects, for each of the left and right portions, the instructions can cause the processor to: find a region of interest representing a rough estimate of a vault; and measure a vault height along the anterior capsule surface; and wherein the instructions can further cause the processor to merge the measured vault heights in the left and right portions to form a common vault map region of interest.
[0331] In the aspects, the B-scan can comprise multiple B-scans and each of the B-scans can correspond to a region of interest and wherein, for each of the plurality of regions of interest, the forming can comprise: extrapolating a position of each selected region of interest for each scanned meridian onto a common three-dimensional coordinate system; and aligning a central axis of each of the regions of interest in the plurality of regions of interest within the common three-dimensional coordinate system.
[0332] In the aspects, the forming can comprise interpolating between calculated region of interest data for each region of interest to fill in a three-dimensional space between scanned meridians.
[0333] In the aspects, the forming can comprise: assigning color values to a range of virtual heights; and generating the vault map based on the assigned color values.
[0334] In the aspects, the instructions can cause the processor to: identify, based on a size, shape, and/or location of anatomical structures in the binary image, the anterior capsule surface and the posterior ICL surface; remove anatomical structures other than the anterior capsule surface and posterior ICL surface from the resulting binary image to form a region of interest; based on the region of interest, identify peaks in the resulting binary image, the peaks representing the anterior capsule surface and the posterior ICL surface; and measure a vault height along the anterior capsule surface.

In the foregoing description, for the purposes of illustration, methods were described in a particular order. It should be appreciated that in alternate embodiments, the methods may be performed in a different order than that described. It should also be appreciated that the methods described above may be performed by hardware components or may be embodied in sequences of machine-executable instructions, which may be used to cause a machine, such as a general-purpose or special-purpose processor or logic circuits programmed with the instructions to perform the methods. These machine-executable instructions may be stored on one or more machine readable mediums, such as CD-ROMs or other types of optical disks, floppy diskettes, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memory, or other types of machine-readable mediums suitable for storing electronic instructions. Alternatively, the methods may be performed by a combination of hardware and software.
[0335] Specific details were given in the description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits may be shown in block diagrams in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0336] Also, it is noted that the embodiments were described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
[0337] Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as storage medium. A processor(s) may perform the necessary tasks. A code segment may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
[0338] The present disclosure, in various embodiments, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various embodiments, sub-combinations, and subsets thereof. Those of skill in the art will understand how to make and use the present disclosure after understanding the present disclosure. The present disclosure, in various embodiments, includes providing devices and processes in the absence of items not depicted and/or described herein or in various embodiments hereof, including in the absence of such items as may have been used in previous devices or processes, for example for improving performance, achieving ease and/or reducing cost of implementation.
[0339] The foregoing discussion of the disclosure has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description for example, various features of the disclosure are grouped together in one or more embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
[0340] Moreover, though the description of the disclosure has included description of one or more embodiments and certain variations and modifications, other variations and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative embodiments to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.