Information processing apparatus, information processing method, and storage medium
11039076 · 2021-06-15
Assignee
Inventors
CPC classification
G01J3/42
PHYSICS
G01J3/0297
PHYSICS
International classification
H04N7/18
ELECTRICITY
G01J3/42
PHYSICS
Abstract
An information processing method includes the steps of calculating information on object illumination light for illuminating an object based on spatial distribution information of illumination light and orientation information of the object, and correcting information on the reflected light so as to reduce influence of the object illumination light based on the information on the object illumination light and information on the reflected light from the object.
Claims
1. An information processing method comprising the steps of: acquiring information on illuminance on an object surface based on information on a spatial distribution of radiance of illumination light and information on an orientation of the object surface; acquiring information on light that has been reflected on the object surface while the object surface is illuminated with the illumination light; and correcting the acquired information on the light that has been reflected on the object surface so that influence of the illumination light is reduced, based on the information on the illuminance, wherein the step of correcting the acquired information on the light that has been reflected on the object surface includes the step of converting the acquired information on the light that has been reflected on the object surface in a space where the illumination light is distributed into information on reflected light which is reflected on the object surface in a space where other illumination light is distributed.
2. The information processing method according to claim 1, wherein the step of correcting the acquired information on the light that has been reflected on the object surface includes the step of acquiring a reflection characteristic of the object surface based on the information on the illuminance, the acquired information on the light that has been reflected on the object surface, and information on an orientation of the light that has been reflected on the object surface.
3. The information processing method according to claim 2, wherein the information on the orientation of the light that has been reflected on the object surface includes information on a unit vector of the light that has been reflected on the object surface based on the information on the orientation of the object surface.
4. The information processing method according to claim 1, wherein when a plurality of local objects is considered to be one object, the information on the orientation of the object surface contains information on an orientation of a surface of the one object.
5. The information processing method according to claim 1, wherein the information on the orientation of the object surface is obtained by averaging information on orientations of local surfaces of an object.
6. The information processing method according to claim 1, further comprising the step of acquiring the information on the spatial distribution by using an image including at least three wavelength bands.
7. The information processing method according to claim 6, wherein the image is an image acquired by using an image capturer which has an angle of view of 180 degrees or higher for all azimuths.
8. The information processing method according to claim 1, further comprising the step of acquiring the information on the orientation of the object surface based on three-dimensional measurement information of an object.
9. A computer-readable non-transitory storage medium storing a program that enables a computer to execute the information processing method according to claim 1.
10. An information processing method comprising the steps of: acquiring information on illuminance on an object surface based on information on a spatial distribution of radiance of illumination light and information on an orientation of the object surface; acquiring information on light that has been reflected on the object surface while the object surface is illuminated with the illumination light; and correcting the acquired information on the light that has been reflected on the object surface so that influence of the illumination light is reduced, based on the information on the illuminance, wherein the information on the orientation of the object surface contains one of a surface normal vector of the object surface, an azimuth angle and an elevation angle of the object surface, and slopes of the object surface in directions.
11. An information processing apparatus comprising: an illumination information inputter configured to input information on a spatial distribution of radiance of illumination light; an object orientation information inputter configured to input information on an orientation of an object surface; an object illumination light calculator configured to acquire information on illuminance on the object surface based on the information on the spatial distribution of the radiance of the illumination light and the information on the orientation of the object surface; a reflected light information inputter configured to input information on light that has been reflected on the object surface while the object surface is illuminated with the illumination light; and a reflected light corrector configured to correct the input information on the light that has been reflected on the object surface so that influence of the illumination light is reduced, based on the information on the illuminance, wherein the reflected light corrector converts the input information on the light that has been reflected on the object surface in a space where the illumination light is distributed into information on reflected light which is reflected on the object surface in a space where other illumination light is distributed.
12. The information processing apparatus according to claim 11, further comprising a reflected light orientation information inputter configured to input information on an orientation of the light that has been reflected on the object surface, wherein the reflected light corrector corrects the input information on the light that has been reflected on the object surface based on the information on the illuminance, the input information on the light that has been reflected on the object surface, and the information on the orientation of the light that has been reflected on the object surface.
13. The information processing apparatus according to claim 11, further comprising an illumination information acquirer configured to acquire the information on the spatial distribution of the radiance of the illumination light, wherein the illumination information inputter inputs the information on the spatial distribution of the radiance of the illumination light acquired by the illumination information acquirer.
14. The information processing apparatus according to claim 13, wherein the illumination information acquirer includes: an illumination light capturer configured to acquire an image including at least three wavelength bands; and an illumination light estimator configured to estimate the illumination light based on the image.
15. The information processing apparatus according to claim 14, wherein the illumination light capturer includes an image capturer that has an angle of view of 180 degrees or higher for all azimuths.
16. The information processing apparatus according to claim 15, wherein the image capturer includes an optical system of an equisolid angle projection.
17. The information processing apparatus according to claim 11, further comprising an object orientation information acquirer configured to acquire the information on the orientation of the object surface, wherein the object orientation information inputter inputs the information on the orientation of the object surface acquired by the object orientation information acquirer.
18. The information processing apparatus according to claim 17, wherein the object orientation information acquirer includes: a three-dimensional measurement unit configured to three-dimensionally measure an object; and an object orientation information calculator configured to calculate the information on the orientation of the object surface based on the three-dimensional measurement information output from the three-dimensional measurement unit.
19. The information processing apparatus according to claim 11, further comprising a reflected light information acquirer configured to acquire the information on the light that has been reflected on the object surface, wherein the reflected light information inputter inputs the information on the light that has been reflected on the object surface acquired by the reflected light information acquirer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
(9) A detailed description will be given of embodiments of the present invention with reference to the accompanying drawings. An information processing method according to the embodiments includes calculating or acquiring processing of information on object illumination light based on spatial distribution information of illumination light and orientation information of an object, and correcting processing of information on reflected light based on information on the object illumination light and information on the reflected light from the object. A detailed description of each processing will be given below.
(10) (Calculating Method of Object Illumination Light)
(11) Referring now to
(12) Where both the spatial distribution information of the illumination light and the orientation information of the object under the illumination light are known, the object illumination light E.sub.i(λ) for illuminating the object is calculated as in the following expression (1).
E.sub.i(λ)=∫.sub.Ω.sub.iB.sub.i(λ,ω.sub.i)cos θ.sub.idΩ.sub.i (1)
(13) In the expression (1), Ω.sub.i is the hemisphere space in which the object is placed, and dΩ.sub.i is an infinitesimal (differential) solid angle in the hemisphere space Ω.sub.i. B.sub.i(λ, ω.sub.i) is the radiance of the light that reaches the object from the infinitesimal solid angle dΩ.sub.i in the direction with the angle ω.sub.i of the illumination light entering the object, and θ.sub.i is the angle between the direction ω.sub.i of the illumination light and the surface normal of the object. That the spatial distribution information of the illumination light is known means that the radiance B.sub.i(λ, ω.sub.i) is known. That the orientation information of the object is known means that the angle θ.sub.i is known. Thus, the object illumination light E.sub.i(λ) (information on the object illumination light) can be calculated in accordance with the orientation of the object based on the spatial distribution information of the illumination light and the orientation information of the object.
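The hemispherical integral of expression (1) can be illustrated with a minimal numerical sketch. The grid resolution and the `radiance` callable (returning B.sub.i for one wavelength) are assumptions introduced here for illustration, not part of the original disclosure.

```python
import numpy as np

def object_illumination(radiance, n_theta=90, n_phi=360):
    """Numerically evaluate E_i = integral over the hemisphere of
    B_i(omega) * cos(theta_i) * dOmega on a zenith/azimuth grid."""
    theta = (np.arange(n_theta) + 0.5) * (np.pi / 2) / n_theta  # zenith angle
    phi = (np.arange(n_phi) + 0.5) * 2 * np.pi / n_phi          # azimuth angle
    d_theta = (np.pi / 2) / n_theta
    d_phi = 2 * np.pi / n_phi
    t, p = np.meshgrid(theta, phi, indexing="ij")
    d_omega = np.sin(t) * d_theta * d_phi   # solid angle of each grid cell
    return float(np.sum(radiance(t, p) * np.cos(t) * d_omega))

# Sanity check: a uniform sky of unit radiance gives E_i ≈ pi.
E = object_illumination(lambda t, p: np.ones_like(t))
```

Evaluating the same sum per wavelength (or per band) yields the spectral object illumination light used in the later corrections.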
(14) (Correcting Method of Influence by the Object Illumination Light)
(15) The reflected light R(λ) from the object illuminated by the object illumination light E.sub.i(λ) is expressed as in the expression (2).
R(λ)=∫.sub.Ω.sub.iBRDF(ω.sub.i.fwdarw.ω.sub.r)B.sub.i(λ,ω.sub.i)cos θ.sub.idΩ.sub.icos θ.sub.rdΩ.sub.r (2)
(16) In the expression (2), BRDF(ω.sub.i.fwdarw.ω.sub.r) is a bidirectional reflectance distribution function, i.e., the ratio of the radiance of the light reflected/scattered on the object in the direction with the angle ω.sub.r to that of the light entering the object from the direction with the angle ω.sub.i. In addition, dΩ.sub.r is an infinitesimal (differential) solid angle of the measured reflected/scattered light, and θ.sub.r is the angle between the direction ω.sub.r (angular information of the reflected light from the object) and the surface normal of the object.
(17) As expressed in the expression (2), the reflected light R(λ) depends on the radiance B.sub.i(λ, ω.sub.i), i.e., on the object illumination light E.sub.i(λ). Thus, in order to make a correction that reduces the influence of the object illumination light E.sub.i(λ) and to calculate the reflectance of the object itself, it is necessary to divide the reflected light R(λ) expressed by the expression (2) by the object illumination light E.sub.i(λ). Where S(λ) is a true reflectance (reflection) characteristic of the object in which the influence of the object illumination light E.sub.i(λ) is corrected, the reflectance characteristic S(λ) is calculated as in the following expression (3).
(18) S(λ)=R(λ)/{E.sub.i(λ)BRDF.sub.ideal(ω.sub.i.fwdarw.ω.sub.r)cos θ.sub.rdΩ.sub.r} (3)
(19) In the expression (3), BRDF.sub.ideal(ω.sub.i.fwdarw.ω.sub.r) is a bidirectional reflectance distribution function of an ideal object in which the angular shape of the reflected/scattering characteristic is equal to that of BRDF(ω.sub.i.fwdarw.ω.sub.r) and the reflectance is 1. Since BRDF(ω.sub.i.fwdarw.ω.sub.r) of the object is usually unknown, it is difficult to calculate BRDF.sub.ideal(ω.sub.i.fwdarw.ω.sub.r). However, where the object has a perfectly diffusing surface, BRDF.sub.ideal(ω.sub.i.fwdarw.ω.sub.r)=1/π and thus the expression (3) can be expressed as follows.
(20) S(λ)=πR(λ)/{E.sub.i(λ)cos θ.sub.rdΩ.sub.r} (4)
(21) This configuration can calculate the reflectance characteristic S(λ) (information on the true reflected light from the object) corrected so as to reduce the influence of the object illumination light E.sub.i(λ) using the object illumination light E.sub.i(λ) and the angle θ.sub.r (orientation information of the reflected light).
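The correction under the perfectly diffusing (Lambertian) assumption can be sketched as follows. The numeric values and the `d_omega_r` input are illustrative assumptions; the sketch only shows that dividing out E.sub.i(λ)·cos θ.sub.r/π recovers the reflectance.

```python
import math

def lambertian_reflectance(r, e_i, theta_r, d_omega_r=1.0):
    """Expression (4): S = pi * R / (E_i * cos(theta_r) * dOmega_r),
    valid when the object is a perfect diffuser (BRDF_ideal = 1/pi)."""
    return math.pi * r / (e_i * math.cos(theta_r) * d_omega_r)

# Forward model for a Lambertian surface of reflectance 0.5 viewed
# along the normal (theta_r = 0) under illumination E_i = 100:
r = 0.5 * 100.0 * math.cos(0.0) * 1.0 / math.pi
s = lambertian_reflectance(r, 100.0, 0.0)   # recovers s ≈ 0.5
```

Because E.sub.i(λ) already accounts for the object orientation through expression (1), the same surface yields the same S(λ) regardless of how it is tilted relative to the sky.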
(22) When a multi-band camera obtains the reflected light from the object, the reflectance characteristic (reflection characteristic) of the object can be calculated in which the influence of the object illumination light is corrected as follows. Initially, each band luminance value I.sub.k of the multi-band camera having n bands can be expressed as follows.
I.sub.k=C∫.sub.λk1.sup.λk2R(λ)L(λ)φ.sub.k(λ)dλ (5)
(23) In the expression (5), C is a proportionality constant, R(λ) is reflected light expressed by the expression (2), and L(λ) is a spectral transmittance of an optical system used for image pickup. In addition, φ.sub.k (k=1, 2, . . . , n) is a k-th band spectral sensitivity characteristic in the multi-band camera used for image pickup. λ.sub.k1 and λ.sub.k2 (k=1, 2, . . . , n) are a minimum wavelength and a maximum wavelength of the spectral sensitivity characteristic of the k-th band in the multi-band camera.
(24) The reflected light R(λ) in the expression (5) depends on the radiance B.sub.i(λ, ω.sub.i) through the expression (2). At this time, S.sub.k, the reflectance characteristic (reflection characteristic) of the object in which the influence of the illumination light is corrected, is expressed as follows.
(25) S.sub.k=πI.sub.k/{C∫.sub.λk1.sup.λk2E.sub.i(λ)L(λ)φ.sub.k(λ)dλ cos θ.sub.rdΩ.sub.r} (6)
(26) The expression (6) assumes that the object has a perfectly diffusing surface, similarly to the expression (4). Using the expression (6), the reflectance characteristic S.sub.k of the object in which the influence of the object illumination light is corrected can be calculated. By the expressions (4) and (6), the absolute reflectance of the object can be obtained based on the reflected light from the object obtained under the illumination light (E.sub.i(λ)).
(27) On the other hand, the reflected light (R(λ), I.sub.k) of the object obtained under the illumination light (E.sub.i(λ)) is converted as follows into the reflected light (R′(λ), I′.sub.k) of the object obtained under other illumination light (E′.sub.i(λ)).
(28) R′(λ)=R(λ)E′.sub.i(λ)/E.sub.i(λ) (7)
I′.sub.k=I.sub.k{∫.sub.λk1.sup.λk2E′.sub.i(λ)L(λ)φ.sub.k(λ)dλ}/{∫.sub.λk1.sup.λk2E.sub.i(λ)L(λ)φ.sub.k(λ)dλ} (8)
(29) In this case, the orientation information of the reflected light (the angle θ.sub.r) need not be used, since it cancels in the ratio.
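This relighting conversion can be sketched per band: the reflected light measured under one illumination is rescaled by the ratio of the new illumination to the original one. The band values below are illustrative assumptions.

```python
def relight(r_bands, e_bands, e_new_bands):
    """Rescale per-band reflected light measured under illumination
    e_bands to what would be measured under e_new_bands; the viewing
    geometry (theta_r) cancels and is not needed."""
    return [r * e_new / e for r, e, e_new in zip(r_bands, e_bands, e_new_bands)]

# Three bands measured under a sky weighted toward the third band,
# converted to a flat illumination of 2.0 in every band:
converted = relight([0.2, 0.4, 0.6], [1.0, 2.0, 3.0], [2.0, 2.0, 2.0])
```

A flat converted result, as here, indicates that the band-to-band variation in the measurement came entirely from the illumination rather than the object.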
(30) (Acquiring Method of Spatial Distribution Information of the Illumination Light)
(31) Next follows a description of an acquiring method of the spatial distribution information of the illumination light. Herein, a description will be given of a method for acquiring the spatial distribution information of the illumination light based on a half celestial sphere (hemispherical) image of the sky.
(32) Initially, as a precondition, natural light B(λ) is approximated as follows.
B(λ)=F(λ)T(λ)[a.sub.0+a.sub.1(λ/λ.sub.0).sup.−1+a.sub.2(λ/λ.sub.0).sup.−4] (9)
(33) In the expression (9), F(λ) is an extraterrestrial radiation spectrum, T(λ) is an atmospheric transmissivity, λ is a wavelength, and λ.sub.0 is a reference wavelength. In addition, a.sub.0 is a coefficient representing a direct light component, a.sub.1 is a coefficient representing a Rayleigh scattering component, and a.sub.2 is a coefficient representing a Mie scattering component by atmospheric aerosol particles. Since F(λ) and T(λ) can use prior observation results, the natural light B(λ) is determined by the three variables a.sub.0, a.sub.1, and a.sub.2. The natural light B(λ) changes according to the sun position, the weather condition, the atmospheric state, etc., but the changed natural light B(λ) can be expressed by changing the values of the coefficients a.sub.0, a.sub.1, and a.sub.2.
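The three-parameter sky model of expression (9) can be sketched directly. The placeholder F and T callables (constant 1.0 here) stand in for the tabulated observation results the text mentions; the coefficient values are illustrative only.

```python
def natural_light(lam, a0, a1, a2, lam0=550e-9,
                  F=lambda l: 1.0, T=lambda l: 1.0):
    """Expression (9): B(lam) = F*T*[a0 + a1*(lam/lam0)^-1 + a2*(lam/lam0)^-4].
    a0: direct component, a1: Rayleigh term, a2: Mie (aerosol) term."""
    return F(lam) * T(lam) * (a0 + a1 * (lam / lam0) ** -1
                              + a2 * (lam / lam0) ** -4)

# At the reference wavelength both power terms equal 1, so B = a0 + a1 + a2.
b = natural_light(550e-9, 1.0, 0.5, 0.25)
```

The strong λ⁻⁴ dependence of the Rayleigh-like term is what lets blue-sky versus overcast conditions be captured by changing only a.sub.0, a.sub.1, and a.sub.2.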
(34) Next follows a description of a method for acquiring the spatial distribution information of the illumination light based on the half celestial sphere image of the sky. While this embodiment exemplarily uses the three-band or RGB-band image, the present invention is not limited to this embodiment and is applicable to a multi-band image having at least three bands. The RGB values when the sky is imaged are expressed as follows.
R=C∫.sub.λR1.sup.λR2B(λ)L(λ)φ.sub.R(λ)dλ
G=C∫.sub.λG1.sup.λG2B(λ)L(λ)φ.sub.G(λ)dλ
B=C∫.sub.λB1.sup.λB2B(λ)L(λ)φ.sub.B(λ)dλ (10)
(35) In the expression (10), L(λ) is a spectral transmittance of an optical system used for image pickup, φ.sub.k (k=R, G, B) is a spectral sensitivity characteristic of a camera used for image pickup, and λ.sub.k1 and λ.sub.k2 (k=R, G, B) are a minimum wavelength and a maximum wavelength of the spectral sensitivity characteristic of the camera. In addition, C is a proportionality constant.
(36) When the expression (9) is substituted into the expression (10), the following expression (11) is obtained. I.sub.0.sup.k, I.sub.1.sup.k, and I.sub.2.sup.k in the expression (11) can be expressed as in the expression (12). In addition, b.sub.0, b.sub.1, and b.sub.2 are defined as in the expression (13).
R=b.sub.0I.sub.0.sup.R+b.sub.1I.sub.1.sup.R+b.sub.2I.sub.2.sup.R
G=b.sub.0I.sub.0.sup.G+b.sub.1I.sub.1.sup.G+b.sub.2I.sub.2.sup.G
B=b.sub.0I.sub.0.sup.B+b.sub.1I.sub.1.sup.B+b.sub.2I.sub.2.sup.B (11)
I.sub.0.sup.k=∫.sub.λk1.sup.λk2F(λ)T(λ)L(λ)φ.sub.k(λ)dλ
I.sub.1.sup.k=∫.sub.λk1.sup.λk2F(λ)T(λ)(λ/λ.sub.0).sup.−1L(λ)φ.sub.k(λ)dλ
I.sub.2.sup.k=∫.sub.λk1.sup.λk2F(λ)T(λ)(λ/λ.sub.0).sup.−4L(λ)φ.sub.k(λ)dλ (12)
b.sub.0=Ca.sub.0
b.sub.1=Ca.sub.1
b.sub.2=Ca.sub.2 (13)
(37) The values in the expression (12) can be calculated in advance. Hence, the coefficients a.sub.0, a.sub.1, and a.sub.2 can be calculated from the RGB values by the following expression (14).
(38) (a.sub.0, a.sub.1, a.sub.2).sup.T=(1/C)M.sup.−1(R, G, B).sup.T (14), where M is the 3×3 matrix whose row for each band k=R, G, B is (I.sub.0.sup.k, I.sub.1.sup.k, I.sub.2.sup.k).
(39) Thus, the illumination light for a specific area of the sky can be obtained by substituting the calculated coefficients a.sub.0, a.sub.1, and a.sub.2 into the expression (9). The coefficients a.sub.0, a.sub.1, and a.sub.2 can be calculated for each point in the image based on the half celestial sphere image of the sky. Hence, the spatial distribution information of the illumination light can be obtained based on the half celestial sphere image of the sky.
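The per-pixel inversion of expression (14) reduces to solving one 3×3 linear system per pixel. The matrix entries below are made-up placeholders for the precomputed integrals of expression (12); only the recovery mechanism is illustrated.

```python
import numpy as np

# Rows correspond to the R, G, B bands; columns to the precomputed
# integrals I_0^k, I_1^k, I_2^k (hypothetical values for illustration).
I = np.array([[1.0, 0.5, 0.2],
              [0.9, 0.7, 0.3],
              [0.8, 0.9, 0.6]])

b_true = np.array([2.0, 1.0, 0.5])  # b_m = C * a_m, expression (13)
rgb = I @ b_true                    # forward model, expression (11)

b = np.linalg.solve(I, rgb)         # inversion of expression (14)
```

With more than three bands the same system becomes overdetermined and, as the text notes below, can be solved by least squares (e.g. `np.linalg.lstsq`) instead of a direct inverse.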
(40) This embodiment uses the three-band image, but the present invention is not limited to this embodiment as long as the number of bands is equal to or larger than the number of variables in the natural light approximation model. For example, a four-band image can be used, and the variables may be calculated by the least squares method etc. instead of the expression (14). While this embodiment uses the RGB bands, the present invention is not limited to them as long as the number of rows in the matrix to be inverted as illustrated in the expression (14) is equal to or larger than the number of variables in the natural light approximation model. For example, the near infrared wavelength band IR may be used instead of the visible red wavelength band R. In addition, the natural light calculated by the approximation model in the expression (9) is not limited to the visible wavelength band, and this embodiment can utilize a range from the ultraviolet ray (300 nm) to the near infrared region (2.5 μm). An approximation model other than the expression (9) may also be used.
(41) (Acquiring Method of the Orientation Information of the Object)
(42) The orientation information of the object in this embodiment corresponds to any one of a surface normal vector of the object, an azimuth angle and an elevation angle of the object plane, or slopes of the object plane in two directions, but the present invention is not limited to this embodiment. The orientation information of the object can be manually input by the user. In addition, the surface normal vector of the object etc. can be calculated based on three-dimensional measurement information. The three-dimensional measurement information can be acquired by a stereo method, an active stereo method, a laser method, etc.
(43) When a target object includes a plurality of objects, the plurality of objects (a plurality of local objects) may be considered to be one object and the orientation information of that object may be calculated. For example, in a rice paddy field, a plurality of rice plants may be regarded as one object and the orientation information of the object may be calculated. Moreover, when the object has a surface structure, the orientation information of the object may be calculated by averaging the orientation information of the local surface structures (local orientation information).
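The averaging of local orientation information can be sketched as taking the mean of local surface normal vectors and renormalizing; the sample normals below are hypothetical.

```python
import numpy as np

def mean_normal(normals):
    """Average a set of local unit surface normals (e.g. individual
    rice plants) into one representative object normal."""
    n = np.asarray(normals, dtype=float).mean(axis=0)
    return n / np.linalg.norm(n)

# Four local facets tilted symmetrically about the vertical average
# to a straight-up normal for the object as a whole.
n = mean_normal([[0.1, 0.0, 1.0], [-0.1, 0.0, 1.0],
                 [0.0, 0.1, 1.0], [0.0, -0.1, 1.0]])
```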
(44) A description will now be given of a variety of embodiments of the present invention.
First Embodiment
(45) Referring now to
(46) At the step S11 in
(47) The top in
(48) To accurately estimate the illumination light, this embodiment may use image data having photoelectrically converted values obtained by an image sensor with a linear sensitivity characteristic to the incident light. When the sky that contains the sun is captured in fine weather, the photoelectrically converted value saturates and the output value loses linearity to the incident light; in that case, a so-called high dynamic range image, in which a plurality of images captured with different exposures are combined, may be used. When the preprocess for estimating the illumination light executes a shading correction that corrects the shading generated by the optical system and the sensor, and a distortion correction that corrects the distortion of the optical system, the illumination light can be accurately estimated. When the expression (9) is used, the illumination light cannot be estimated from the multi-band image in an area other than the sky area. In that case, a masking process that masks the area other than the sky area in the image may be provided.
(49) Images in the middle of
(50) Next, at the step S12, the information processing apparatus calculates the object illumination light E.sub.i(λ) for illuminating the object, based on the spatial distribution information of the illumination light estimated at the step S11 and the surface normal information of the object (orientation information of the object).
(51)
θ.sub.p(i,j)=2 sin.sup.−1[√{x(i,j).sup.2+y(i,j).sup.2}/(2f)] (15)
ϕ(i,j)=tan.sup.−1{y(i,j)/x(i,j)} (16)
n.sub.lay(i,j)=[sin θ.sub.p(i,j)cos ϕ(i,j), sin θ.sub.p(i,j)sin ϕ(i,j), cos θ.sub.p(i,j)] (17)
(52) In the expressions (15) to (17), f is a focal length of the optical system, and x(i, j) and y(i, j) are distances in the x direction and in the y direction on the image sensor at the pixel (i, j).
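Expressions (15) to (17) can be combined into a short sketch that maps a sensor-plane offset to the unit vector of the incoming ray, assuming the equisolid-angle projection r = 2f·sin(θ/2). The focal length and offsets below are illustrative values.

```python
import numpy as np

def ray_unit_vector(x, y, f):
    """Map a sensor offset (x, y) under an equisolid-angle fisheye of
    focal length f to the unit vector of the incident ray."""
    r = np.hypot(x, y)
    theta = 2.0 * np.arcsin(r / (2.0 * f))  # expression (15)
    phi = np.arctan2(y, x)                  # expression (16)
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])        # expression (17)

# The image center maps to the optical axis (the zenith for a
# sky-facing camera); f = 8 mm is an assumed focal length.
v = ray_unit_vector(0.0, 0.0, 8e-3)
```

Evaluating this per pixel yields the direction ω.sub.i associated with each sky-image sample, which is what the integral of expression (1) consumes.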
(53)
(54) Finally, at the step S13, the information processing apparatus corrects information on the reflected light measured under the illumination light (data obtained by the camera) so as to reduce the influence of the illumination light of the object. More specifically, the information processing apparatus corrects the information on the reflected light based on the object illumination light calculated at the step S12 (information on the object illumination light), the reflected light from the object measured under the illumination light (information on the reflected light), and the orientation information of the reflected light. The information processing apparatus corrects the information on the reflected light and calculates the reflectance characteristic of the object, for example, as expressed by the expression (4).
Second Embodiment
(55) Referring now to
(56)
(57) As described above, this embodiment previously has the spatial distribution information of the illumination light, and thus the object illumination light can be accurately calculated in accordance with the object orientation.
Third Embodiment
(58) Referring now to
(59) The information processor 10 executes, for example, the information processing method according to the first and second embodiments. The information processor 10 includes an illumination information inputter (first inputter) 11, an object orientation information inputter (second inputter) 12, an object illumination light calculator 13, a reflected light information inputter (third inputter) 14, a reflected light orientation information inputter (fourth inputter) 15, and a reflected light corrector 16. The illumination information acquirer (first acquirer) 21 includes an illumination light capturer 22 and an illumination light estimator 23. The object orientation information acquirer (second acquirer) 31 includes a three-dimensional measurement unit 32 and an object orientation information calculator 33.
(60) The illumination light capturer (image capturer) 22 captures a hemispherical image of the sky. The illumination light estimator 23 estimates the illumination light (spatial distribution information of the illumination light) using an image captured by the illumination light capturer 22, such as image data captured with a fisheye lens. The spatial distribution information of the estimated illumination light is input into the illumination information inputter 11. The three-dimensional measurement unit (measurement unit) 32 measures three-dimensional information of the object. The object orientation information calculator (second calculator) 33 calculates surface normal information (surface normal vector n.sub.obj) as orientation information of the object based on the three-dimensional information (three-dimensional measurement information) measured by the three-dimensional measurement unit 32. The surface normal information of the object calculated by the object orientation information calculator 33 is input into the object orientation information inputter 12. The object illumination light calculator (first calculator) 13 calculates the object illumination light E.sub.i(λ) based on, for example, the expression (1), using the spatial distribution information of the illumination light input into the illumination information inputter 11 and the surface normal information of the object input into the object orientation information inputter 12.
(61) The reflected light information acquirer (third acquirer) 41 is, for example, a camera, and obtains information on the reflected light from the object under the illumination light used to capture the hemispherical image (data obtained by the camera). The information on the reflected light from the object acquired by the reflected light information acquirer 41 is input into the reflected light information inputter 14. The orientation information of the reflected light is input into the reflected light orientation information inputter 15. The orientation information of the reflected light contains information on an angle θ.sub.r between the center axis of the reflected light from the object and the surface normal of the object. This embodiment acquires the orientation information of the reflected light based on the information from the angular sensor attached to the reflected light information acquirer 41.
(62) The reflected light corrector 16 corrects the information on the reflected light from the object based, for example, on the expressions (4) and (6). In other words, the reflected light corrector 16 corrects the information on the reflected light based on the information on the object illumination light calculated by the object illumination light calculator 13, the information on the reflected light from the object input into the reflected light information inputter 14, and the orientation information of the reflected light input into the reflected light orientation information inputter 15. Alternatively, the reflected light corrector 16 corrects the information on the reflected light of the object using the expressions (7) and (8), based on the information on the object illumination light calculated by the object illumination light calculator 13 and the information on the reflected light from the object input into the reflected light information inputter 14. The information on the reflected light corrected by the reflected light corrector 16 (information on the corrected reflected light) is, for example, output to an external apparatus, stored in a storage or memory in the information processing apparatus 100, or displayed on a display unit.
Fourth Embodiment
(63) Referring now to
(64) Each of the illumination light capturer 22 and the reflected light information acquirer 41 according to this embodiment includes a multi-band camera having three (RGB) bands with common spectral sensitivity characteristics. Each optical system is set so that the spectral transmittance L(λ) of the optical system in the illumination light capturer 22 and the spectral transmittance L(λ) of the optical system in the reflected light information acquirer 41 are equal to each other. Thus, in calculating the reflection characteristic in which the influence of the object illumination light is reduced or removed from the reflected light from the object using the information processing apparatus according to this embodiment, the expression (5) corresponds to the RGB values obtained by the reflected light information acquirer 41. The value of the denominator on the right side of the expression (6) is equal to the value obtained by replacing B.sub.i(λ, ω.sub.i) in the expression (1) with the values of R(i, j), G(i, j), and B(i, j) obtained by the illumination light capturer 22.
(65) Therefore, when the illumination light capturer 22 and the reflected light information acquirer 41 have the same spectral transmittance of the optical system and the same spectral sensitivity characteristic of the multi-band camera, the reflection characteristic in which the influence of the object illumination light is reduced or removed can be calculated without the illumination estimation processing. In addition, in this case, the object illumination light is calculated from actually measured values without using the approximation model expressed in the expression (9), and thus the object illumination light can advantageously be calculated more accurately. Since the approximation model expressed in the expression (9) is not used, the influence of the illumination light entering the object from an area other than the sky area can be considered. For example, the illumination light reflected on a structure can be precisely expressed by the integrated value of the multi-band image in the structure area. In this case, the masking processing illustrated in
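The fourth embodiment's model-free idea can be sketched as a per-band discrete version of expression (1): the measured sky image itself supplies the radiance, so each band of the object illumination is a cosine-weighted sum over the hemisphere. The per-pixel `sky`, `cos_theta`, and `d_omega` arrays are assumed inputs (derived from the fisheye geometry of expressions (15) to (17)).

```python
import numpy as np

def band_illumination(sky, cos_theta, d_omega):
    """One band of the object illumination: sum of measured sky
    values weighted by cos(theta_i) and per-pixel solid angle."""
    return float(np.sum(sky * cos_theta * d_omega))

# Sanity check with a uniform sky of unit value, sampled as 30
# zenith-angle rings; the result should approach pi.
theta = np.linspace(0.025, np.pi / 2 - 0.025, 30)
dth = theta[1] - theta[0]
d_omega = np.sin(theta) * dth * 2 * np.pi   # solid angle of each ring
e = band_illumination(np.ones_like(theta), np.cos(theta), d_omega)
```

Running this once per band with the R(i, j), G(i, j), and B(i, j) sky images gives the denominators of expression (6) directly, which is why no illumination estimation step is needed in this configuration.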
Other Embodiments
(66) Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
(67) Each embodiment can accurately estimate the illumination light and correct the influence of the illumination light based on the reflected light from the object, irrespective of an insolation condition, the weather, and an object orientation. As a consequence, each embodiment can provide an information processing method and an information processing apparatus, which can accurately acquire a characteristic of an object.
(68) While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
(69) This application claims the benefit of Japanese Patent Application No. 2017-047327, filed on Mar. 13, 2017, which is hereby incorporated by reference herein in its entirety.