Spectrally encoded endoscopic image process
09846940 · 2017-12-19
Assignee
Inventors
CPC classification
A61B3/10
HUMAN NECESSITIES
G06T7/80
PHYSICS
H04N23/555
ELECTRICITY
H04N23/00
ELECTRICITY
International classification
Abstract
An image processing method. At least one light ray presented by a vector in a first affine coordinate system with a tip of the probe as an origin is projected from the tip of the probe. The light ray is intercepted from a projection surface satisfying a function in the first affine coordinate. A distance between the tip of the probe and an interception point of the light ray on the projection surface is obtained based on a rotation angle of the probe, a wavelength of the light ray, and a deflection angle of the light ray from the probe. A relationship between the first coordinate and a second affine coordinate system defined with the projection surface as a reference is obtained. Image data are acquired from the light ray reflected from the target surface presented in the first affine coordinate. The image data presented in the first affine coordinate are converted into image data presented in the second affine coordinate, and the image data in the second coordinate system are resampled by interpolating or extrapolating a gray scale.
Claims
1. An image processing method, comprising: projecting at least one light ray from a probe, wherein the light ray is presented by a vector in a first affine coordinate system with a tip of the probe as an origin; intercepting the light ray with a projection surface, the projection surface satisfying a function in the first affine coordinate; obtaining a distance between the tip of the probe and an interception point of the light ray on the projection surface based on a rotation angle of the probe, a wavelength of the light ray, and a deflection angle of the light ray from the probe; obtaining a relationship between the first coordinate and a second affine coordinate system, the second coordinate system being defined with the projection surface as a reference; acquiring image data from the light ray reflected from the projection surface presented in the first affine coordinate; converting the image data presented in the first affine coordinate into image data presented in the second affine coordinate; and resampling the image data in the second coordinate system by interpolating or extrapolating a gray scale.
2. The method according to claim 1, wherein the first coordinate includes a Cartesian coordinate including an x-axis, a y-axis, and a z-axis extending along the probe.
3. The method according to claim 2, wherein the vector of the light ray is presented as:
4. The method according to claim 1, wherein the function of the projection surface satisfies: ƒ(x, y, z)=0 at the interception point between the light ray and the projection surface.
5. The method according to claim 1, further comprising determining the azimuth angle φ based on a series of rotation angles of the probe with respect to scanning time between the rotation angles in a calibration or mapping process.
6. The method according to claim 1, further comprising determining the wavelength of the light ray λ based on interpolation or extrapolation from at least two distinct wavelengths and pixel indices of two pixels configured to receive the reflected light ray corresponding to the two wavelengths.
7. The method according to claim 1, further comprising determining the deflection angle θ based on the wavelength of the light ray λ, the light incident angle on the grating, the diffraction order, the grating constant, and refractive indices at two opposing sides of the grating.
8. The method according to claim 1, wherein the second affine coordinate includes a Cartesian coordinate system.
9. The method according to claim 1, wherein the projection surface includes a target surface to be imaged and analyzed.
10. The method according to claim 1, wherein the projection surface is separate from a target surface to be imaged and analyzed.
11. The method according to claim 10, wherein the projection surface includes a conjugate plane of a hypothetical image plane of the target surface to be imaged and analyzed.
12. The method according to claim 10, further comprising interpolating or resampling a grayscale value to the image data presented in the second affine coordinate.
13. The method according to claim 1, further comprising projecting a plurality of light rays from the probe.
14. The method according to claim 1, wherein the projection surface is separate from a target surface to be imaged and analyzed.
15. The method according to claim 14, wherein the projection surface includes a conjugate plane of a hypothetical image plane of the target surface to be imaged and analyzed.
16. An image processing method, comprising: a) projecting at least one light ray of a first color from a probe, wherein the light ray is presented by a vector in a first affine coordinate system with a tip of the probe as an origin; b) intercepting the light ray with a projection surface, the projection surface satisfying a function in the first affine coordinate; c) obtaining a distance between the tip of the probe and an interception point of the light ray and the projection surface based on a rotation angle of the probe, a wavelength of the light ray, and a deflection angle of the light ray from the probe; d) obtaining a relationship between the first coordinate and a second affine coordinate system, the second coordinate system being defined with the projection surface as a reference; e) acquiring image data from the light ray reflected from the projection surface presented in the first affine coordinate; f) converting the image data presented in the first affine coordinate into image data presented in the second affine coordinate; g) repeating steps a) to f) for a light ray of second and third colors; h) overlaying the image data acquired from the light rays of the first, second, and third colors; and i) resampling the overlaid image by interpolating or extrapolating the image data presented in the second affine coordinate.
17. The method according to claim 16, wherein the first coordinate includes a Cartesian coordinate including an x-axis, a y-axis, and a z-axis extending along the probe.
18. The method according to claim 17, wherein the vector of the light ray is presented as:
19. The method according to claim 17, wherein the function of the target surface satisfies: ƒ(x, y, z)=0 at the interception point between the light ray and the projection surface.
20. The method according to claim 17, further comprising determining the azimuth angle φ based on a series of rotation angles of the probe with respect to scanning time between the rotation angles in a calibration or mapping process.
21. The method according to claim 17, further comprising determining the wavelength of the light ray λ based on interpolation or extrapolation from at least two distinct wavelengths and pixel indices of two pixels configured to receive the reflected light ray corresponding to the two wavelengths.
22. The method according to claim 17, further comprising determining the deflection angle θ based on the wavelength of the light ray λ, the light incident angle on the grating, the diffraction order, the grating constant, and refractive indices at two opposing sides of the grating.
23. The method according to claim 17, wherein the second affine coordinate includes a Cartesian coordinate.
24. The method according to claim 17, wherein the projection surface includes a target surface to be imaged and analyzed.
25. The method according to claim 17, further comprising projecting a plurality of light rays from the probe.
26. The method according to claim 17, further comprising interpolating or resampling a grayscale value to the image data presented in the second affine coordinate.
27. A spectrally encoded endoscopic apparatus, comprising: a light source; a probe having a proximal end optically coupled to the light source and a distal end; a grating attached to the distal end; and a spectrometer, wherein: the light source is configured to generate a light propagating through the grating and then projecting on a projection surface, and the spectrometer is configured to receive the light reflected from the projection surface; obtain three-dimensional information of image data carried by the reflected light presented in a first coordinate system with reference to the probe based on a rotation angle of the probe, a wavelength of the light, and a deflection angle of the light deflecting from the probe; transform the location information of the image data from the first coordinate system into a second coordinate system with reference to the projection surface; and resample the image data by interpolating or extrapolating a grayscale value.
28. The apparatus according to claim 27, further comprising a motor configured to rotate the probe.
29. The apparatus according to claim 27, further comprising a processor and/or an encoder configured to control and calibrate the motor, so as to determine the rotation angle of the probe.
30. The apparatus according to claim 27, wherein the projection surface includes a target surface to be imaged and analyzed by the spectrometer.
31. The apparatus according to claim 27, wherein the projection surface is separate from a target surface to be imaged and analyzed by the spectrometer.
32. The apparatus according to claim 31, wherein the spectrometer is further configured to generate a perspective view or a fisheye view of the image data in accordance with an angle and a distance of the projection surface with respect to the probe.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(12) The following description is of certain illustrative embodiments, although other embodiments may include alternatives, equivalents, and modifications. Additionally, the illustrative embodiments may include several novel features, and a particular feature may not be essential to practice the devices, systems, and methods described herein.
(13)
(14) Due to its small diameter of about one hundred microns, the probe 30 is flexible, with a minimum bending radius of several millimeters, and can be maneuvered to inspect hard-to-reach areas. Other colors of light and fluorescent light can also be used as the light source for the SEE apparatus. The SEE apparatus as shown in
(15) Image rectification is an image transformation process used to project one or more images onto a designated image plane. As shown in
(16)
x=r sin θ cos φ, y=r sin θ sin φ, z=r cos θ (1),
where r is the length of the light ray, θ is the deflection angle of the light ray with respect to the z-axis, and φ is the azimuth angle, measured as the angle between the x-axis and the projection of the light ray on the XY plane. In general, the target plane P can be presented by a function of x, y, and z, that is,
ƒ(x,y,z)=0 (2).
(17) From Equations (1) and (2), the length of the light ray, that is, the distance r between the tip and the interception point of the light ray with the target plane, can be solved, and thus the interception point of each light ray in three dimensions is determined. This is evident if one considers that a plane in three dimensions containing the interception point (x, y, z) satisfies:
(18)
ax+by+cz+d=0 (3),
where (a, b, c) is the surface normal of the plane as shown in
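As an illustrative sketch (not part of the claimed method), the interception of a projected light ray with the target plane can be computed numerically. The code assumes the standard spherical parameterization for Equation (1) and a plane of the form ax+by+cz+d=0; the function names are hypothetical:

```python
import math

def ray_plane_distance(theta, phi, a, b, c, d):
    """Solve for the ray length r at which the ray of Eq. (1) meets the
    plane a*x + b*y + c*z + d = 0 (assumed form of Eq. (3))."""
    # Direction cosines of the light ray from Eq. (1).
    ux = math.sin(theta) * math.cos(phi)
    uy = math.sin(theta) * math.sin(phi)
    uz = math.cos(theta)
    denom = a * ux + b * uy + c * uz
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the plane")
    return -d / denom

def interception_point(theta, phi, a, b, c, d):
    """Three-dimensional interception point (x, y, z) of the ray."""
    r = ray_plane_distance(theta, phi, a, b, c, d)
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))
```

For the plane z=10 (a=b=0, c=1, d=−10), a ray along the z-axis (θ=0) gives r=10, and any ray with deflection θ still meets the plane at z=10.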
(19) As discussed above, the probe 30 rotates about the z-axis. The azimuth angle φ can be determined by an encoder of a motor driving the probe, for example, the galvo motor 40 as shown in
(20)
where Γ is the total scanning angle, for example, 70° in one embodiment of the current invention; N is the number of pixels in the linear portion, for example, 800; and m is the step index between 1 and N.
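As a sketch only, assuming Equation (4) maps the step index linearly over the total scanning angle (an assumption, since the exact form is not reproduced here):

```python
import math

def azimuth_angle(m, total_scan_deg=70.0, n_steps=800):
    """Hypothetical linear mapping from step index m (1..N) to the
    azimuth angle phi, assuming a uniform scan over the total angle
    Gamma = 70 degrees with N = 800 steps."""
    if not 1 <= m <= n_steps:
        raise ValueError("step index out of range")
    return math.radians(total_scan_deg * m / n_steps)
```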
(21) Alternatively, it might be preferable to include an encoder on the motor to record the curve as shown in
(22) Each wavelength of the light propagating through the grating 31 is diffracted to a distinct angle towards the target plane. Equation (5) shows the relationship between the spectral distribution of the light ray projected from the probe 30 and the incident angle and the diffractive angle of the light propagating through grating 31:
n.sub.i sin θ.sub.i+n.sub.d sin θ.sub.d=lGλ (5),
where n.sub.i and n.sub.d are the refractive indices of the media through which the light propagates, including the incident side and the diffractive side of the grating 31, respectively; θ.sub.i is the incident angle of the light onto the grating 31; θ.sub.d is the diffractive angle of the light projecting from the grating 31; l is the diffraction order, G is the grating constant of the grating 31, and λ is the wavelength of the light. Further, as shown in
θ(λ)=θ.sub.i−θ.sub.d(λ) (6)
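Equations (5) and (6) can be evaluated directly; the sketch below assumes the wavelength and the grating constant G are expressed in consistent units (for example, microns and lines per micron) so that lGλ is dimensionless:

```python
import math

def deflection_angle(wavelength_um, theta_i, l=1, G=1.0, n_i=1.0, n_d=1.0):
    """Deflection angle theta(lambda) from Eqs. (5) and (6):
    n_i*sin(theta_i) + n_d*sin(theta_d) = l*G*lambda,
    theta(lambda) = theta_i - theta_d(lambda)."""
    s = (l * G * wavelength_um - n_i * math.sin(theta_i)) / n_d
    if abs(s) > 1.0:
        raise ValueError("evanescent order: no diffracted ray")
    theta_d = math.asin(s)
    return theta_i - theta_d
```

A round-trip check against Equation (5) confirms the sign convention: the recovered diffractive angle θ_i − θ(λ) satisfies n_i sin θ_i + n_d sin θ_d = lGλ.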
(23) The wavelength λ of the light at the spectrometer may be calibrated based on interpolation or extrapolation from two or more known wavelengths, that is, two or more color lights, and the pixel index P(λ) of each pixel by Equation (7):
(24)
λ=λ.sub.1+(λ.sub.2−λ.sub.1)(P(λ)−P(λ.sub.1))/(P(λ.sub.2)−P(λ.sub.1)) (7),
where λ.sub.1 and λ.sub.2 are the wavelengths of known spectra, for example, blue and red lasers. The linearity of the spectral distribution at the spectrometer is shown in
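Assuming Equation (7) is the usual two-point linear calibration implied by the stated linearity of the spectral distribution (the exact form is not reproduced here), the calibration can be sketched as:

```python
def calibrate_wavelength(p, p1, lam1, p2, lam2):
    """Two-point linear interpolation/extrapolation of wavelength from a
    pixel index p, given known wavelengths lam1 and lam2 (e.g. blue and
    red lasers) observed at pixel indices p1 and p2."""
    return lam1 + (lam2 - lam1) * (p - p1) / (p2 - p1)
```

With a blue reference of 450 nm at pixel 100 and a red reference of 650 nm at pixel 200, pixel 150 maps to 550 nm, and pixels outside the reference interval are extrapolated along the same line.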
(25) By applying the deflection angle θ and the azimuth angle φ obtained from Equations (4) and (6), the distance between the tip of the probe 30 and the interception point (x, y, z) can be obtained. With the information derived from the light reflected from the target plane P, the image data of the target plane P can be analyzed at the spectrometer 70, and the image of the target plane P can be presented in the coordinate system with the tip of the probe 30 as the origin. However, as the image data are presented in the coordinate system with the tip of the probe as the origin, distortion and deviation from the actual image presented in the coordinate system of the target plane P itself can be expected. Therefore, the image data presented in the coordinates (x, y, z) are resampled into a coordinate system (α, β, γ) of the target plane P as follows. The coordinate system having its origin on the target plane P can be any Affine coordinate system, Cartesian or non-Cartesian. When an Affine coordinate system is selected, a transformation between the coordinate system with reference to the probe and the coordinate system with reference to the target plane P can be presented as:
(26)
(α,β,γ).sup.T=C(x−x.sub.0,y−y.sub.0,z−z.sub.0).sup.T (8),
where C is the transition matrix between the coordinate system of (x, y, z) and the new coordinate system of (α, β, γ) as:
(27)
and (x.sub.0, y.sub.0, z.sub.0) is the origin of the new coordinate system. The transition matrix C can be derived from Equation (9):
(28)
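Assuming Equation (8) has the usual affine form, subtracting the new origin and multiplying by the transition matrix C, the coordinate change can be sketched with NumPy (the function name is hypothetical):

```python
import numpy as np

def to_plane_coords(points_xyz, C, origin):
    """Transform points from the probe coordinate system (x, y, z) to the
    target-plane coordinate system (alpha, beta, gamma): translate by the
    new origin (x0, y0, z0), then apply the transition matrix C."""
    pts = np.asarray(points_xyz, dtype=float)
    shifted = pts - np.asarray(origin, dtype=float)
    return (np.asarray(C, dtype=float) @ shifted.T).T
```

With C the identity, the transform reduces to a pure translation; for two Cartesian systems, C would be an orthogonal rotation matrix.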
(29) In the situation where the target plane 50 is a plane with normal (0, b, c), the surface of the target plane 50 can be presented as:
(30)
by+cz+d=0 (10).
From Equations (1) and (10), the length of the light ray r can be solved as:
(31)
r=−d/(b sin θ sin φ+c cos θ) (11).
Equation (8) can be modified as:
(32)
The relationship between these two coordinate systems can be presented by Equation (13):
(33)
where
(34)
and
(35)
According to Equation (11), the coordinate transformation between the coordinate systems of (α, β, γ) and (x, y, z) satisfies:
(36)
As shown in
(37)
In this situation, no coordinate transformation from (x, y, z) to (α, β, γ) is needed. It is possible to convert directly from the spherical coordinates [R, θ, φ] to the Cartesian coordinates (x, y, z) and resample on the plane (x, y).
(38)
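The spherical-to-Cartesian conversion and resampling step can be sketched as below; the nearest-neighbor re-gridding is a simple stand-in for the interpolation or extrapolation of the gray scale described above, and the function names are hypothetical:

```python
import numpy as np

def spherical_to_cartesian(r, theta, phi):
    """Eq. (1): convert sampled [r, theta, phi] values to (x, y, z)."""
    return (r * np.sin(theta) * np.cos(phi),
            r * np.sin(theta) * np.sin(phi),
            r * np.cos(theta))

def resample_nearest(x, y, gray, grid_x, grid_y):
    """Resample scattered gray-scale samples onto a regular (x, y) grid
    by nearest-neighbor lookup (a stand-in for the gray-scale
    interpolation/extrapolation of the resampling step)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    out = np.empty(gx.shape)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d2 = (x - gx[i, j]) ** 2 + (y - gy[i, j]) ** 2
            out[i, j] = gray[np.argmin(d2)]
    return out
```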
(39) The above computation and visualization of the image of the target plane can be done by parallel computing or GPU computing for faster processing and display.
(40) In addition to the planar surface, the target may have a surface with a more complex shape, such as a quadratic surface:
ax.sup.2+by.sup.2+cz.sup.2+2fyz+2gzx+2hxy+2px+2qy+2rz+d=0 (16)
where a, b, c, f, g, h, p, q, r, d are known parameters related to the quadratic surface. In general, the surface can be presented as:
Σ.sub.i a.sub.i x.sup.p.sup.i y.sup.q.sup.i z.sup.s.sup.i=0 (17),
where a.sub.i are known parameters related to the surface.
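Substituting Equation (1) into the quadratic surface of Equation (16) turns it into a quadratic equation in the ray length r, whose smallest positive root is the nearest interception. A sketch (function name hypothetical):

```python
import math

def ray_quadric_distance(theta, phi, coef):
    """Nearest interception of the ray of Eq. (1) with the quadric of
    Eq. (16): substituting x = r*ux, y = r*uy, z = r*uz yields
    A*r^2 + B*r + d = 0; coef holds the parameters a..d of Eq. (16)."""
    ux = math.sin(theta) * math.cos(phi)
    uy = math.sin(theta) * math.sin(phi)
    uz = math.cos(theta)
    a, b, c = coef["a"], coef["b"], coef["c"]
    f, g, h = coef["f"], coef["g"], coef["h"]
    p, q, rr = coef["p"], coef["q"], coef["r"]
    A = (a*ux*ux + b*uy*uy + c*uz*uz
         + 2*f*uy*uz + 2*g*uz*ux + 2*h*ux*uy)
    B = 2 * (p*ux + q*uy + rr*uz)
    D = coef["d"]
    if abs(A) < 1e-12:            # degenerate case: linear in r (a plane)
        return -D / B
    disc = B*B - 4*A*D
    if disc < 0:
        raise ValueError("ray misses the quadric")
    roots = [(-B + math.sqrt(disc)) / (2*A), (-B - math.sqrt(disc)) / (2*A)]
    pos = [r for r in roots if r > 0]
    if not pos:
        raise ValueError("no interception in front of the probe")
    return min(pos)
```

For a sphere of radius 2 centered at the probe tip (a=b=c=1, d=−4, all other parameters zero), every ray intercepts at r=2.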
(41) In the embodiment discussed above, it is assumed that the target plane overlaps with the surface of a target object to be imaged and analyzed. In the situation that the surface of the target object is curved or even irregularly shaped, the target plane may be separate from the surface of the target object as shown in
(42) The image rectification process for the perspective view application is similar to that described in
(43) Image registration is a process of transforming different sets of data into one coordinate system. In one embodiment, the SEE image rectification discussed above is the first step of the color image registration. Once the image for an individual color, that is, an individual channel image, is properly rectified, three or more of the channel images on the same plane are mapped and overlaid with each other. Steps S1101 to S1106 are the same as steps S1001 to S1004. However, as shown
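A minimal sketch of the overlay step, assuming the three individually rectified channel images already share one sampling grid on the same plane (the function name is hypothetical):

```python
import numpy as np

def overlay_channels(red, green, blue):
    """Overlay three rectified single-color channel images into one RGB
    image; all channels must already be registered on the same grid."""
    r, g, b = (np.asarray(c, dtype=float) for c in (red, green, blue))
    if not (r.shape == g.shape == b.shape):
        raise ValueError("channel images must share one sampling grid")
    return np.stack([r, g, b], axis=-1)
```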
(44) The above embodiments describe the transformation of two Cartesian coordinate systems ({right arrow over (x)}, {right arrow over (y)}, {right arrow over (z)}) and ({right arrow over (α)}, {right arrow over (β)}, {right arrow over (γ)}). These coordinate systems in general can be Affine coordinate systems. In geometry, an Affine transformation is a function between Affine spaces which preserves points, straight lines, and planes. Sets of parallel lines remain parallel after an Affine transformation. An Affine transformation does not necessarily preserve angles between lines or distances between points, though it does preserve ratios of distances between points lying on a straight line. For many purposes an Affine space can be thought of as Euclidean space, though the concept of Affine space is far more general. That is, all Euclidean spaces are Affine, but there are Affine spaces that are non-Euclidean. In an Affine coordinate system, each output coordinate of an Affine map is a linear function of all input coordinates. Another way to deal with Affine transformations systematically is to select a point as the origin; then any Affine transformation is equivalent to a linear transformation (of position vectors) followed by a translation. The linear transformations discussed above can be naturally extended to Affine coordinate systems and displayed in Affine coordinate systems. However, the transformation matrix C in Equation (9) will no longer be an orthogonal matrix if the coordinate systems are not Cartesian.
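The "linear transformation followed by a translation" characterization can be illustrated with a small sketch; the check that a midpoint stays a midpoint demonstrates the preserved ratios of distances along a line:

```python
import numpy as np

def affine_map(points, M, t):
    """Apply an affine map: linear transformation M (of position
    vectors) followed by a translation t."""
    pts = np.asarray(points, dtype=float)
    return pts @ np.asarray(M, dtype=float).T + np.asarray(t, dtype=float)
```

Applying a shear plus translation to three collinear points, the image points remain collinear and the middle point remains the midpoint of the other two, as an Affine transformation requires.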
(45) While the above disclosure describes certain illustrative embodiments, the invention is not limited to the above-described embodiments, and the following claims include various modifications and equivalent arrangements within their scope.