CONTROL APPARATUS, IMAGE PICKUP APPARATUS, LENS APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM
20220286615 · 2022-09-08
Inventors
- Yuki Shinzato (Tochigi, JP)
- Tomoki Tokita (Tochigi, JP)
- Tomohiro Ino (Tochigi, JP)
- Satoshi Maetaki (Tochigi, JP)
CPC classification (Section H: Electricity)
- H04N 23/55
- H04N 23/683
- H04N 23/6842
- H04N 23/6812
- H04N 23/663
International classification
Abstract
A control apparatus includes at least one processor or circuit configured to execute a plurality of tasks including a first acquiring task configured to acquire information on an image shift sensitivity to a tilt of an imaging optical system corresponding to an image point position of the imaging optical system, which information includes an influence of a distortion of the imaging optical system, and a second acquiring task configured to acquire an image-stabilization driving amount that is used for an image stabilization by an image stabilizer configured to provide the image stabilization. The second acquiring task acquires the image-stabilization driving amount corresponding to a predetermined image point position using the information on the image shift sensitivity corresponding to the predetermined image point position.
Claims
1. A control apparatus comprising at least one processor or circuit configured to execute a plurality of tasks including: a first acquiring task configured to acquire information on an image shift sensitivity to a tilt of an imaging optical system corresponding to an image point position of the imaging optical system, which information includes an influence of a distortion of the imaging optical system; and a second acquiring task configured to acquire an image-stabilization driving amount that is used for an image stabilization by an image stabilizer configured to provide the image stabilization, wherein the second acquiring task acquires the image-stabilization driving amount corresponding to a predetermined image point position using the information on the image shift sensitivity corresponding to the predetermined image point position.
2. The control apparatus according to claim 1, wherein the information on the image shift sensitivity is acquired by using a designed value of the imaging optical system.
3. The control apparatus according to claim 1, wherein the information on the image shift sensitivity is information that can provide a moving amount of the predetermined image point position to the tilt of the imaging optical system.
4. The control apparatus according to claim 1, wherein the information on the image shift sensitivity is information determined for each position on an image plane.
5. The control apparatus according to claim 1, wherein the image-stabilization driving amount is acquired by using information on a blur and the information on the image shift sensitivity.
6. The control apparatus according to claim 1, wherein the image-stabilization driving amount is acquired by using information on a blur, information on the predetermined image point position, and the information on the image shift sensitivity.
7. The control apparatus according to claim 5, wherein the information on the blur includes information on angular velocities about a plurality of rotation axes.
8. The control apparatus according to claim 5, wherein the information on the blur includes information on accelerations in a plurality of axial directions.
9. The control apparatus according to claim 1, wherein the predetermined image point position is a position on an image plane represented by a plurality of parameters.
10. The control apparatus according to claim 1, wherein the information on the image shift sensitivity differs corresponding to a focal length of the imaging optical system.
11. The control apparatus according to claim 1, wherein the information on the image shift sensitivity differs corresponding to an object distance to be focused.
12. The control apparatus according to claim 1, wherein the image stabilizer decenters an image sensor from an optical axis of the imaging optical system.
13. The control apparatus according to claim 1, wherein the image stabilizer changes an effective pixel area in an image sensor.
14. The control apparatus according to claim 1, wherein the image stabilizer decenters at least part of the imaging optical system from an optical axis of the imaging optical system.
15. An image pickup apparatus comprising: an image sensor; and a control apparatus, wherein the control apparatus includes at least one processor or circuit configured to execute a plurality of tasks including: a first acquiring task configured to acquire information on an image shift sensitivity to a tilt of an imaging optical system corresponding to an image point position of the imaging optical system, which information includes an influence of a distortion of the imaging optical system; and a second acquiring task configured to acquire an image-stabilization driving amount that is used for an image stabilization by an image stabilizer configured to provide the image stabilization, wherein the second acquiring task acquires the image-stabilization driving amount corresponding to a predetermined image point position using the information on the image shift sensitivity corresponding to the predetermined image point position.
16. The image pickup apparatus according to claim 15, further comprising a memory configured to store the information on the image shift sensitivity.
17. A lens apparatus comprising: an imaging optical system; and a control apparatus, wherein the control apparatus includes at least one processor or circuit configured to execute a plurality of tasks including: a first acquiring task configured to acquire information on an image shift sensitivity to a tilt of the imaging optical system corresponding to an image point position of the imaging optical system, which information includes an influence of a distortion of the imaging optical system; and a second acquiring task configured to acquire an image-stabilization driving amount that is used for an image stabilization by an image stabilizer configured to provide the image stabilization, wherein the second acquiring task acquires the image-stabilization driving amount corresponding to a predetermined image point position using the information on the image shift sensitivity corresponding to the predetermined image point position.
18. The lens apparatus according to claim 17, further comprising a memory configured to store the information on the image shift sensitivity.
19. A control method configured to acquire an image-stabilization driving amount that is used for an image stabilization by an image stabilizer configured to provide the image stabilization, the control method comprising: a first acquiring step of acquiring information on an image shift sensitivity to a tilt of an imaging optical system corresponding to an image point position of the imaging optical system, which information includes an influence of a distortion of the imaging optical system; and a second acquiring step of acquiring an image-stabilization driving amount for the image stabilizer corresponding to a predetermined image point position using the information on the image shift sensitivity corresponding to the predetermined image point position.
20. A storage medium storing a program that causes a computer to execute the control method according to claim 19.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
[0027] Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the present invention. Corresponding elements in respective figures will be designated by the same reference numerals, and a duplicate description thereof will be omitted.
[0028] In the following description, in a three-dimensional orthogonal coordinate system (X-axis direction, Y-axis direction, and Z-axis direction), the X-axis direction is a long side direction of the imaging plane, the Y-axis direction is a short side direction of the imaging plane, and the Z-axis direction is an optical axis direction of the optical system.
First Embodiment
[0030] The imaging optical system 101 includes a focus optical system 1011, a magnification varying (zoom) optical system 1012, a diaphragm (aperture stop) 1013, and an OIS optical system 1014. The imaging optical system 101 forms an object image on an imaging plane of the image sensor 201 using light from the object at an in-focus position within a set angle of view. The focus optical system 1011 provides focusing. The magnification varying optical system 1012 provides a magnification variation (zooming) in order to change an imaging angle of view. The diaphragm 1013 adjusts a light amount captured from the object. The OIS optical system 1014 provides the image stabilization to an image blur that occurs during still or motion image capturing by decentering itself from the optical axis of the imaging optical system 101. Here, OIS is an image stabilization performed by moving the OIS optical system 1014.
[0031] The lens microcomputer 102 controls the OIS optical system 1014. More specifically, the lens microcomputer 102 determines an OIS driving amount of the OIS actuator 105 using an image-stabilization (IS) driving amount from the camera microcomputer 202 and a position signal from the OIS encoder 103 that detects the position of the OIS optical system 1014. The OIS driving amount is determined so as not to exceed a movable range of the OIS actuator 105. When the OIS actuator 105 receives an OIS driving amount signal from the OIS driver 104, the OIS actuator 105 moves the OIS optical system 1014 in a direction including a component of a direction orthogonal to the Z-axis direction to decenter it from the optical axis of the imaging optical system 101 and thereby provides the image stabilization. That is, the OIS actuator 105 functions as one of the image stabilizers that provide image stabilization.
[0032] The lens memory 106 stores optical design information on the imaging optical system 101. The optical design information includes information on a tilt-image shift sensitivity of the imaging optical system 101 for each image height (information on an image shift sensitivity to a tilt of the imaging optical system 101 according to an image point position of the imaging optical system 101). The information on the tilt-image shift sensitivity is information obtained by using the designed value of the imaging optical system 101 and includes the influence of the distortion of the imaging optical system 101. Use of the information on the tilt-image shift sensitivity can provide a satisfactory image stabilization at a predetermined image point position of the imaging optical system 101, when the image pickup system 1 generates a rotation blur so that the X-Y plane orthogonal to the optical axis is tilted to the optical axis. The camera memory 211 may store the optical design information on the imaging optical system 101 including information on the tilt-image shift sensitivity. Both the lens memory 106 and the camera memory 211 may store the optical design information on the imaging optical system 101 including the information on the tilt-image shift sensitivity.
[0033] The imaging optical system 101 has a distortion DIST(h) expressed by the following expression:
DIST(h)=(h−h0)/h0
h0=f tan ω
where f is the focal length of the imaging optical system 101, ω is the half angle of view, h is a distance (real image height) from the optical axis of the imaging optical system 101 to the position on the image plane where a principal ray having the half angle of view ω incident from the object plane is imaged, and h0 is the ideal image height of the central projection method.
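The definition above can be sketched numerically; the following is a minimal Python example (the function name and units are illustrative, not from the source):

```python
import math

def distortion(f, omega, h):
    """DIST(h) = (h - h0) / h0 with h0 = f * tan(omega).

    f     : focal length of the imaging optical system (e.g. mm)
    omega : half angle of view (radians)
    h     : real image height on the image plane (same unit as f)
    """
    h0 = f * math.tan(omega)  # ideal image height of the central projection method
    return (h - h0) / h0
```

A barrel-shaped distortion, where the real image height h is smaller than the ideal image height h0, yields a negative DIST(h).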
[0034] Having a distortion means that the distortion amount is nonzero at at least one image height within the imaging range. The imaging optical system having the distortion includes an imaging optical system that has a magnification varying function and a focusing function and that has a distortion in a certain magnification varying state or in-focus state.
[0035] The image sensor 201 includes a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or another image sensor. The image sensor 201 converts an object image formed on the imaging plane of the image sensor 201 by the imaging optical system 101 into an electric signal, and outputs it as an image signal. The image signal as an analog signal is converted into a digital signal by an unillustrated A/D converter and then output.
[0036] The camera microcomputer 202 controls the entire image pickup system 1. For example, the camera microcomputer 202 reads out the image signal as image data from the image sensor 201. The camera microcomputer 202 performs processing such as image processing for the image data based on the optical design information, displaying the image data on the display/operation unit 203, and saving the image data in the recording medium 204. The camera microcomputer 202 issues instructions, such as focusing, zoom magnification changing, and diaphragm adjusting of the imaging optical system 101, to the lens microcomputer 102. Some of the settings relating to the processing may be changed by an operation unit such as a display/operation unit 203 and an unillustrated button.
[0037] The camera microcomputer 202 acquires the IS driving amount (image-stabilization driving amount that is used for the image stabilization by the image stabilizer) according to a flow of
[0038] The gyro sensor 205 outputs information on an angular velocity of the image pickup system 1 as a motion detecting signal. The acceleration sensor 206 outputs information on a moving amount of the image pickup system 1 in a translational direction as a motion detecting signal. Upon receiving the motion detecting signal transmitted from each sensor, the camera microcomputer 202 transmits the IS driving amount to the lens microcomputer 102 or the IIS control unit 207 in the camera microcomputer 202 so as to provide the image stabilization to an object image against a motion of the image pickup system 1. In the image stabilization, either OIS or IIS may be performed, or both OIS and IIS may be performed with a determined share of image stabilization (such as 50% OIS and 50% IIS).
[0039] The IIS control unit 207 controls the image sensor 201. More specifically, the IIS control unit 207 determines the IIS driving amount of the IIS actuator 210 using the IS driving amount from the camera microcomputer 202 and the position signal from the IIS encoder 208 that detects the position of the image sensor 201. The IIS driving amount is determined so as not to exceed the movable range of the IIS actuator 210. When the IIS actuator 210 receives the IIS driving amount signal from the IIS driver 209, it moves the image sensor 201 in a direction including a component of a direction perpendicular to the Z-axis direction to decenter it from the optical axis of the imaging optical system 101 and provides the image stabilization. That is, the IIS actuator 210 functions as one of the image stabilizers that provide the image stabilization.
[0040] The lens apparatus 100 may include a gyro sensor 107 and an acceleration sensor 108. In this case, in the OIS, the lens microcomputer 102 determines the OIS driving amount using the IS driving amount acquired using the motion detecting signals output from these sensors and the position signal from the OIS encoder 103.
[0041] A description will now be given of processing during the image stabilization at a predetermined image point position. When the gyro sensor 205 or the acceleration sensor 206 detects a motion of the image pickup system 1, each sensor outputs the motion detecting signal (information on a blur) to the camera microcomputer 202. The camera microcomputer 202 acquires the IS driving amount using the information on the tilt-image shift sensitivity stored by the lens memory 106, IS position information on the imaging plane, and the motion detecting signals. The camera microcomputer 202 transmits the acquired IS driving amount to the lens microcomputer 102 or the IIS control unit 207.
Acquisition of Information on Tilt-Image Shift Sensitivity
[0042] In this embodiment, the tilt-image shift sensitivity is an image-point moving amount in a direction orthogonal to a predetermined rotation axis when the imaging optical system 101 is tilted to that rotation axis orthogonal to the optical axis of the imaging optical system 101 on the imaging plane.
[0044] A description will now be given of an image-point moving amount t_x0 in the +X-axis direction at a center position O on the imaging plane, which is the image center, when a rotation blur amount ω_y about the Y-axis occurs, and an image-point moving amount t_x at a predetermined image point position A.
[0045] The image-point moving amount t_x0 is expressed by the following expression (1):
t_x0 = ω_y·LS (1)
where LS is a tilt-image shift sensitivity at an image height of 0.
[0046] Assume the imaging plane (X-Y plane) in a polar coordinate system (R-Θ coordinate system) with the center position O as an origin, and let (r, θ) be the coordinates of the predetermined image point position A. That is, in this embodiment, the predetermined image point position A is a position on the imaging plane represented by a plurality of parameters. The image height on the horizontal axis in
k_LS_r(h_r) = LS_r(h_r)/LS (2)
where LS_r(h_r) is a tilt-image shift sensitivity at an image height h_r.
[0047] The image-point moving amount t_x0 is expressed by the following expressions (3) to (5) using a parallel component t_rx0 parallel to a straight line OA and a vertical component t_θx0 perpendicular to the straight line OA:
t_rx0 = t_x0·cos θ = ω_y·LS·cos θ (3)
t_θx0 = t_x0·(−sin θ) = −ω_y·LS·sin θ (4)
|t_x0| = (t_rx0² + t_θx0²)^(1/2) (5)
[0048] The parallel component t_rx0 has a positive sign in the direction separating from the center position O (R direction), and the vertical component t_θx0 has a positive sign in the direction orthogonal to the R direction toward the counterclockwise direction about the center position O (θ direction). The R direction and the θ direction are also referred to as a meridional direction and a sagittal direction, respectively.
[0049] Next, consider the image-point moving amount t_x at the predetermined image point position A. The parallel component t_rx parallel to the straight line OA is affected by the tilt-image shift sensitivity LS_r(r) at the image height r, and the vertical component t_θx perpendicular to the straight line OA is affected by the tilt-image shift sensitivity LS at the image height of 0. From the above, the image-point moving amount t_x is expressed by the following expressions (6) to (8) using the parallel component t_rx and the vertical component t_θx:
t_rx = k_LS_r(r)·t_rx0 = k_LS_r(r)·ω_y·LS·cos θ (6)
t_θx = k_LS_r(0)·t_θx0 = −ω_y·LS·sin θ (7)
|t_x| = (t_rx² + t_θx²)^(1/2) (8)
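The decomposition of expressions (6) to (8) can be sketched in Python; here `k_ls_r` stands in for the sensitivity-coefficient lookup of expression (2) and is a hypothetical callable satisfying k_ls_r(0) = 1:

```python
import math

def image_point_shift_x(omega_y, LS, r, theta, k_ls_r):
    """Image-point moving amount t_x at polar position (r, theta) when a
    rotation blur amount omega_y occurs about the Y-axis, per (6)-(8)."""
    t_rx = k_ls_r(r) * omega_y * LS * math.cos(theta)  # (6): component parallel to OA
    t_theta_x = -omega_y * LS * math.sin(theta)        # (7): perpendicular; k_ls_r(0) = 1
    t_x = math.hypot(t_rx, t_theta_x)                  # (8): magnitude
    return t_rx, t_theta_x, t_x
```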
[0050] In this way, the image-point moving amount t_x at the predetermined image point position A when the rotation blur amount ω_y occurs about the Y-axis can be calculated. Similarly, an image-point moving amount t_y at the predetermined image point position A in the polar coordinate system when the rotation blur amount ω_x occurs about the X-axis is expressed by the following expressions (9) to (11) using a parallel component t_ry parallel to the straight line OA and a vertical component t_θy perpendicular to the straight line OA:
t_ry = k_LS_r(r)·ω_x·LS·sin θ (9)
t_θy = ω_x·LS·cos θ (10)
|t_y| = (t_ry² + t_θy²)^(1/2) (11)
[0051] As described above, the image-point moving amount t at the predetermined image point position A when the rotation blur amount (ω_x, ω_y) occurs about the predetermined rotation axis orthogonal to the optical axis on the imaging plane can be expressed by the following expressions (12) to (14) using a parallel component t_r parallel to the straight line OA and a vertical component t_θ perpendicular to the straight line OA:
t_r = t_rx + t_ry = K_1(r,θ)·ω_y + K_2(r,θ)·ω_x (12)
t_θ = t_θx + t_θy = K_3(r,θ)·ω_y + K_4(r,θ)·ω_x (13)
|t| = (t_r² + t_θ²)^(1/2) (14)
[0052] Coefficients (K_1, K_2, K_3, K_4) in the expressions (12) and (13) are given as follows:
K_1(r,θ) = k_LS_r(r)·LS·cos θ
K_2(r,θ) = k_LS_r(r)·LS·sin θ
K_3(r,θ) = −LS·sin θ
K_4(r,θ) = LS·cos θ
[0053] As expressed by the expressions (12) to (14), the image-point moving amount t includes correction coefficient information (K_1, K_2, K_3, K_4) including the tilt-image shift sensitivity and the position information (r, θ) at the image point position, and a rotation blur amount (ω_x, ω_y). In this embodiment, the lens memory 106 previously stores, as the information on the tilt-image shift sensitivity, a correction coefficient table of the correction coefficient information (K_1, K_2, K_3, K_4) in a matrix format defined by the image point position illustrated in
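The correction coefficient table can be illustrated as follows. This is a sketch of one possible matrix-format layout keyed by image point position; the function names, the grid, and the dictionary layout are assumptions, not the actual storage format of the lens memory 106:

```python
import math

def correction_coefficients(r, theta, LS, k_ls_r):
    """Correction coefficient information (K1, K2, K3, K4) of (12)-(13)."""
    K1 = k_ls_r(r) * LS * math.cos(theta)
    K2 = k_ls_r(r) * LS * math.sin(theta)
    K3 = -LS * math.sin(theta)
    K4 = LS * math.cos(theta)
    return K1, K2, K3, K4

def build_coefficient_table(radii, angles, LS, k_ls_r):
    """Matrix-format table indexed by image point position (r, theta)."""
    return {(r, th): correction_coefficients(r, th, LS, k_ls_r)
            for r in radii for th in angles}
```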
[0054] The information on the tilt-image shift sensitivity may include the tilt-image shift sensitivity for each image height in order to reduce the information stored in the lens memory 106, or may provide the image-point moving amount t using position information on a predetermined image point position that is a target of the image stabilization. The position information on the image point position may be information on a polar coordinate system or information on a predetermined coordinate system (such as an orthogonal coordinate system).
Setting of Image-Stabilization Position Information on Imaging Plane
[0055] This embodiment can switch a setting mode of the image pickup system 1 to an image-center IS mode that sets a predetermined image point position (image stabilization position) that is a target of the image stabilization to the center of the imaging plane, or an IS spot setting mode that can set an image stabilization point to a predetermined image point position. In a case where the IS spot setting mode is set, the image stabilization position can be set on the display/operation unit 203. The position settable by the display/operation unit 203 may be linked with the image point position where autofocus is performed or the image point position where automatic photometry (light metering) is performed. The image point position where the autofocus is performed may be a position automatically detected by the pupil detection, person detection, or the like. The IS position information (r, θ) on the imaging plane is sent to the camera microcomputer 202, and the correction coefficient information to be used is selected from the correction coefficient table.
Motion Detecting Signal
[0056] The gyro sensor 205 detects angular velocities about a plurality of rotation axes of the image pickup system 1 and outputs information on the rotation blur amount as a motion detecting signal. In this embodiment, the gyro sensor 205 detects angular velocities about the X-axis and the Y-axis, and outputs information on the rotation blur amount (ω.sub.x, ω.sub.y). The acceleration sensor 206 detects accelerations in a plurality of axial directions of the image pickup system 1 and outputs information on the translational blur amount as a motion detecting signal. In this embodiment, the acceleration sensor 206 detects accelerations in the X-axis and Y-axis directions, and outputs information on the translational blur amount (a.sub.x, a.sub.y). The gyro sensor 205 may include a plurality of sensors, each of which detects an angular velocity about a single axis. Similarly, the acceleration sensor 206 may include a plurality of sensors, each of which detects acceleration in a single direction.
Acquisition of Image-Stabilization Driving Amount
[0057] The camera microcomputer 202 acquires an IS driving amount using the information on the tilt-image shift sensitivity, the IS position information, and the motion detecting signal. For example, in a case where an image blur at a predetermined image point position A due to the rotation blur is corrected by IIS, the image sensor 201 may be moved so as to cancel the image-point moving amount t. An IS driving amount x in the X-axis direction and an IS driving amount y in the Y-axis direction of the IIS actuator 210 are expressed by the following expressions (15) and (16):
x = t_r·cos θ − t_θ·sin θ = ω_y{sin²θ + k_LS_r(r)·cos²θ}LS + ω_x{k_LS_r(r) − 1}LS·sin θ·cos θ = K′_1(r,θ)·ω_y + K′_2(r,θ)·ω_x (15)
y = t_r·sin θ + t_θ·cos θ = ω_y{k_LS_r(r) − 1}LS·sin θ·cos θ + ω_x{k_LS_r(r)·sin²θ + cos²θ}LS = K′_3(r,θ)·ω_y + K′_4(r,θ)·ω_x (16)
[0058] Coefficients (K′_1, K′_2, K′_3, K′_4) in the expressions (15) and (16) are given as follows:
K′_1(r,θ) = {sin²θ + k_LS_r(r)·cos²θ}LS
K′_2(r,θ) = {k_LS_r(r) − 1}LS·sin θ·cos θ
K′_3(r,θ) = {k_LS_r(r) − 1}LS·sin θ·cos θ
K′_4(r,θ) = {k_LS_r(r)·sin²θ + cos²θ}LS
[0060] As expressed by the expressions (15) and (16), the IS driving amount (x, y) includes the correction coefficient information (K′_1, K′_2, K′_3, K′_4) and the rotation blur amount (ω_x, ω_y). Thus, the correction coefficient table of the correction coefficient information (K′_1, K′_2, K′_3, K′_4) in a matrix format may be stored as the information on the tilt-image shift sensitivity in the lens memory 106. By using K′_1 and the like instead of the above correction coefficient information (K_1, K_2, K_3, K_4), the IS driving amount (x, y) at the predetermined image point position A when the rotation blur amount (ω_x, ω_y) occurs can be easily obtained.
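Expressions (15) and (16) can be sketched as a single Python function; `k_ls_r` is again a hypothetical callable for the sensitivity coefficient of expression (2):

```python
import math

def is_driving_amount(omega_x, omega_y, r, theta, LS, k_ls_r):
    """IS driving amount (x, y) per expressions (15) and (16)."""
    k = k_ls_r(r)
    s, c = math.sin(theta), math.cos(theta)
    K1p = (s * s + k * c * c) * LS
    K2p = (k - 1.0) * LS * s * c       # K'3 is identical to K'2
    K4p = (k * s * s + c * c) * LS
    x = K1p * omega_y + K2p * omega_x  # (15)
    y = K2p * omega_y + K4p * omega_x  # (16), with K'3 = K'2
    return x, y
```

At the image center (r = 0, so k_LS_r(r) = 1) the expressions collapse to x = LS·ω_y and y = LS·ω_x regardless of θ, which matches expression (1).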
[0061] In the case of OIS, the OIS eccentric (decentering) sensitivity TS(h) for each image height of the OIS optical system 1014 increases as the image height becomes higher, so that the IS driving amount may be acquired based on the OIS eccentric sensitivity TS(h). Thereby, the image stabilization can be performed with high accuracy.
[0062] Regarding the image blur derived from the translation blur, the IS driving amount may be acquired using the information on the translation blur amount from the acceleration sensor 206. The IS driving amount for the translation blur may be acquired by converting the translation blur amount (a.sub.x, a.sub.y) into the rotation blur amount (ω.sub.x, ω.sub.y) using the in-focus object distance information. In a case where the rotation blur and translation blur occur at the same time, the IS driving amount may be acquired by adding the IS driving amount for the translation blur and the IS driving amount for the rotation blur. The IS driving amount for the translation blur at the predetermined image point position may be acquired by multiplying the converted rotation blur amount by the correction coefficient included in the information on the tilt-image shift sensitivity.
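The conversion of a translation blur into an equivalent rotation blur using the in-focus object distance can be sketched as follows, under a small-angle assumption; the displacement inputs (obtained by integrating the accelerations a_x, a_y twice) and the function name are illustrative:

```python
def translation_to_rotation(shift_x, shift_y, object_distance):
    """Equivalent rotation blur amounts (radians) for a translational blur
    displacement, using the in-focus object distance. Small-angle sketch:
    an object-plane shift d at distance L subtends roughly d / L radians."""
    return shift_x / object_distance, shift_y / object_distance
```

The resulting rotation blur amounts can then be multiplied by the correction coefficients above, as the paragraph describes.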
[0063] In a case where the in-focus position is close to the short distance end, the translational component of the object plane generated by the rotation blur becomes large. The IS driving amount for the image blur caused by the translational component according to the object distance may be acquired by the above method.
[0064] The tilt-image shift sensitivity changes according to the object distance and the focal length (imaging angle of view) on which the imaging optical system 101 is focused. In this embodiment, the lens memory 106 stores a plurality of correction coefficient tables that differ according to an in-focus position determined by the focus optical system 1011 and a focal length determined by the magnification varying optical system 1012. Thereby, the image stabilization can be satisfactorily provided at a predetermined image point position even during the magnification variation (zooming) or focusing.
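Selecting among the stored correction coefficient tables by lens state might look like the following sketch; the nearest-grid-point strategy and the table layout are assumptions, since the source does not specify how a table is chosen during zooming or focusing:

```python
def nearest_table(tables, focal_length, object_distance):
    """Return the key of the stored correction coefficient table whose
    (focal length, object distance) grid point is closest to the current
    lens state. `tables` maps (f, d) tuples to coefficient tables."""
    return min(tables, key=lambda fd: (fd[0] - focal_length) ** 2
                                      + (fd[1] - object_distance) ** 2)
```

Interpolating between adjacent tables instead of snapping to the nearest grid point would be a natural refinement.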
[0065] The lens apparatus 100 may be detachably attached to the image pickup apparatus 200. In this case, the information on the proper tilt-image shift sensitivity may be used for each lens apparatus 100. Thereby, even where a different lens apparatus 100 is attached to the image pickup apparatus 200 and used, the image blur at a predetermined image point position can be satisfactorily corrected.
Second Embodiment
[0066] This embodiment expands the information on the tilt-image shift sensitivity beyond that of the first embodiment. Since the configuration of the image pickup system 1 and the processing in the image stabilization in this embodiment are the same as those in the first embodiment, a detailed description thereof will be omitted.
[0067] In this embodiment, a distortion amount by which the object image of the imaging optical system 101 is deformed into a barrel shape is larger than that in the first embodiment. In an imaging optical system with a small distortion amount, an image-point moving amount at any image point position in a direction orthogonal to an image-point moving direction at the image center when the imaging optical system is tilted is almost similar to an image-point moving amount at the image center. On the other hand, the imaging optical system 101 according to this embodiment has a large distortion amount. In this case, the image-point moving amount in the direction orthogonal to the image-point moving direction at the image center when the imaging optical system 101 is tilted becomes smaller as the position is more separated from the image center. Thus, in this embodiment, the lens memory 106 stores the information on a significant tilt-image shift sensitivity for the image-point moving amount in the direction orthogonal to the image-point moving direction at the image center caused by the rotation blur.
[0069] Accordingly, this embodiment adds to the information described in the first embodiment a tilt-image shift sensitivity that includes the influence of the image-point moving amount for each image height in a direction parallel to the rotation axis when the imaging optical system 101 is tilted. The tilt-image shift sensitivity coefficient k_LS_θ(h_θ) at the image height h_θ against the tilt-image shift sensitivity LS at the image center is expressed by the following expression (17):
k_LS_θ(h_θ) = LS_θ(h_θ)/LS (17)
[0070] When a rotation blur amount (ω_x, ω_y) occurs, a parallel component t_r parallel to the straight line OA, which is a polar coordinate system component of the image-point moving amount t at a predetermined image point position A, and a vertical component t_θ perpendicular to the straight line OA are expressed by the following expressions (12a) and (13a), respectively:
t_r = t_rx + t_ry = k_LS_r(r)·k_LS_θ(0)·LS(ω_y·cos θ + ω_x·sin θ) = K_1(r,θ)·ω_y + K_2(r,θ)·ω_x (12a)
t_θ = t_θx + t_θy = k_LS_r(0)·k_LS_θ(r)·LS(−ω_y·sin θ + ω_x·cos θ) = K_3(r,θ)·ω_y + K_4(r,θ)·ω_x (13a)
[0071] Coefficients (K_1, K_2, K_3, K_4) in the expressions (12a) and (13a) are given as follows:
K_1(r,θ) = k_LS_r(r)·k_LS_θ(0)·LS·cos θ
K_2(r,θ) = k_LS_r(r)·k_LS_θ(0)·LS·sin θ
K_3(r,θ) = −k_LS_r(0)·k_LS_θ(r)·LS·sin θ
K_4(r,θ) = k_LS_r(0)·k_LS_θ(r)·LS·cos θ
[0072] An IS driving amount x in the X-axis direction and an IS driving amount y in the Y-axis direction of the IIS actuator 210 are expressed by the following expressions (15a) and (16a):
x = t_r·cos θ − t_θ·sin θ = ω_y·{k_LS_θ(r)·sin²θ + k_LS_r(r)·cos²θ}·LS + ω_x·{k_LS_r(r) − k_LS_θ(r)}·LS·sin θ·cos θ = K′_1(r,θ)·ω_y + K′_2(r,θ)·ω_x    (15a)
y = t_r·sin θ + t_θ·cos θ = ω_y·{k_LS_r(r) − k_LS_θ(r)}·LS·sin θ·cos θ + ω_x·{k_LS_r(r)·sin²θ + k_LS_θ(r)·cos²θ}·LS = K′_3(r,θ)·ω_y + K′_4(r,θ)·ω_x    (16a)
[0073] Coefficients (K′_1, K′_2, K′_3, K′_4) in expressions (15a) and (16a) are given as follows:
K′_1(r,θ) = {k_LS_θ(r)·sin²θ + k_LS_r(r)·cos²θ}·LS
K′_2(r,θ) = {k_LS_r(r) − k_LS_θ(r)}·LS·sin θ·cos θ
K′_3(r,θ) = {k_LS_r(r) − k_LS_θ(r)}·LS·sin θ·cos θ
K′_4(r,θ) = {k_LS_r(r)·sin²θ + k_LS_θ(r)·cos²θ}·LS
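The Cartesian driving amounts of expressions (15a) and (16a) are simply the polar components rotated back into the X-Y frame. A minimal sketch (hypothetical function name; coefficients again normalized to 1 at the image center):

```python
import math

def is_driving_amount(omega_x, omega_y, theta, LS, k_r, k_th):
    """IS driving amounts (x, y) per expressions (15a) and (16a)."""
    s, c = math.sin(theta), math.cos(theta)
    # Expression (15a): X-axis driving amount.
    x = (omega_y * (k_th * s * s + k_r * c * c) * LS
         + omega_x * (k_r - k_th) * LS * s * c)
    # Expression (16a): Y-axis driving amount.
    y = (omega_y * (k_r - k_th) * LS * s * c
         + omega_x * (k_r * s * s + k_th * c * c) * LS)
    return x, y
```

Equivalently, x = t_r·cos θ − t_θ·sin θ and y = t_r·sin θ + t_θ·cos θ, so the same result is obtained by rotating the polar components of expressions (12a) and (13a).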
[0074] As described above, this embodiment acquires the IS driving amount based on the tilt-image shift sensitivities for each image height in the direction parallel to the rotation axis and the direction orthogonal to the rotation axis. Thereby, even when the image pickup system 1 uses the imaging optical system 101 having a large distortion amount, the image stabilization can be satisfactorily provided at a predetermined image point position.
[0075] An optical system designed with a fisheye lens projection method (such as an equidistant projection method or an equisolid angle projection method) also has a significant tilt-image shift sensitivity characteristic for the image-point moving amount in the θ direction. Thus, the IS driving amount may be acquired based on the tilt-image shift sensitivities for each image height in the R and θ directions.
[0076] In a case where a large image-stabilization angle is guaranteed as a specification of the image stabilization mechanism, the IS driving amount may be determined based on the tilt-image shift sensitivity according to this embodiment.
EXAMPLES
[0077] Referring now to the accompanying drawings, a description will be given of examples of the imaging optical system 101 according to the present invention.
[0078]
[0079] In each sectional view, a left side is an object side and a right side is an image side. The optical system L0 according to each example includes a plurality of lens units. As used herein, a lens unit is a group of lenses that move or stand still integrally during zooming, focusing, or image stabilization. That is, in the optical system L0 according to each example, a distance between adjacent lens units changes during zooming or focusing. The lens unit may include one or more lenses. The lens unit may include a diaphragm (aperture stop).
[0080] SP denotes the diaphragm. IP denotes an image plane, on which the imaging plane of an image sensor (photoelectric conversion element), such as a CCD sensor or a CMOS sensor, is disposed. The OIS optical system is decentered from the optical axis of the optical system L0 during OIS.
[0081] The projection method of the optical system L0 according to Examples 1, 2, and 4 is a central projection method (Y = f·tan θ). The projection method of the optical system L0 according to Example 3 is an equisolid angle projection method (Y = 2·f·sin(θ/2)).
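For reference, the two projection laws can be evaluated numerically. A short sketch (hypothetical helper names), giving the ideal image height Y for a half angle of view θ and focal length f:

```python
import math

def image_height_central(f, theta):
    """Central projection: Y = f * tan(theta)."""
    return f * math.tan(theta)

def image_height_equisolid(f, theta):
    """Equisolid angle projection: Y = 2 * f * sin(theta / 2)."""
    return 2.0 * f * math.sin(theta / 2.0)
```

With the wide-angle focal length 24.72 mm and half angle of view 42° of Numerical Example 1, the central-projection ideal height is about 22.26 mm, whereas the actual image height is 21.64 mm; the difference is the barrel distortion that the tilt-image shift sensitivity coefficients account for.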
[0082]
[0083] In each spherical aberration diagram, Fno denotes the F-number, and the spherical aberration amounts are indicated for the d-line (wavelength 587.6 nm) and the g-line (wavelength 435.8 nm). In each astigmatism diagram, S denotes an astigmatism amount in the sagittal image plane, and M denotes an astigmatism amount in the meridional image plane. Each distortion diagram illustrates a distortion amount for the d-line. Each chromatic aberration diagram illustrates a lateral chromatic aberration for the g-line. ω denotes an imaging half angle of view (°).
[0084] Numerical examples 1 to 4 corresponding to Examples 1 to 4, respectively, will be illustrated below.
[0085] In the surface data in each numerical example, r denotes a radius of curvature of each optical surface, and d (mm) denotes an on-axis distance (a distance on the optical axis) between the m-th surface and the (m+1)-th surface, where m is a surface number counted from the light incident surface. nd denotes a refractive index of each optical member for the d-line, and νd denotes an Abbe number of the optical member. The Abbe number νd of a certain material is expressed as follows:
νd = (Nd − 1)/(NF − NC)
where Nd, NF, and NC are the refractive indexes for the d-line (wavelength 587.6 nm), F-line (wavelength 486.1 nm), and C-line (wavelength 656.3 nm) in the Fraunhofer lines.
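The Abbe number computation is a one-liner; for example (a sketch with illustrative catalog-style refractive indexes roughly corresponding to a common borosilicate crown glass, not values taken from the numerical examples):

```python
def abbe_number(nd, nF, nC):
    """Abbe number: vd = (Nd - 1) / (NF - NC), with refractive indexes at
    the d-line (587.6 nm), F-line (486.1 nm), and C-line (656.3 nm)."""
    return (nd - 1.0) / (nF - nC)

# Illustrative crown-glass values: nd ~ 1.5168, nF ~ 1.5224, nC ~ 1.5143
# give an Abbe number of roughly 64 (low dispersion).
```

A larger νd means weaker dispersion; the tables pair high- and low-νd glasses to correct chromatic aberration.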
[0086] In each numerical example, all values of d, a focal length (mm), an F-number, and a half angle of view (°) are values when the optical system L0 according to each example is focused on an object at infinity (infinity object). A backfocus (BF) is a distance on the optical axis from the final surface of the lens (the lens surface closest to the image plane) to the paraxial image plane in terms of an air conversion length. An overall optical length is a length obtained by adding the backfocus to a distance on the optical axis from the frontmost surface of the lens (the lens surface closest to the object) to the final surface of the lens.
[0087] In a case where the optical surface is aspherical, an asterisk * is added to the right side of the surface number. The aspherical shape is expressed as follows:
X = (h²/R)/(1 + √(1 − (1 + k)(h/R)²)) + A4·h⁴ + A6·h⁶ + A8·h⁸ + A10·h¹⁰ + A12·h¹²
where X is a displacement amount from the surface vertex in the optical axis direction, h is a height from the optical axis in the direction perpendicular to the optical axis, R is a paraxial radius of curvature, k is a conical constant, and A4, A6, A8, A10, and A12 are the aspherical coefficients of each order. "e±XX" in each aspherical coefficient means "×10^±XX".
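A sketch of the standard even-asphere sag implied by these definitions (hypothetical function; the conic base term plus the even polynomial is assumed here, as is conventional for surface data of this form):

```python
import math

def aspheric_sag(h, R, k, coeffs):
    """Displacement X from the surface vertex at height h from the axis:
    conic base term plus the even polynomial A4*h^4 ... A12*h^12.
    coeffs = (A4, A6, A8, A10, A12)."""
    c = 1.0 / R  # paraxial curvature
    base = c * h * h / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * h * h))
    poly = sum(a * h ** p for a, p in zip(coeffs, (4, 6, 8, 10, 12)))
    return base + poly
```

With k = 0 and all coefficients zero this reduces to the exact spherical sag R − √(R² − h²), which is a convenient sanity check.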
[0088] The tilt-image shift sensitivity data and the eccentric sensitivity data for the eccentricity of the OIS optical system are given in each numerical example. A method of acquiring them will be described with reference to
[0089]
[0090] The tilt-image shift sensitivity for each image height in the tilt direction (R direction) can be acquired by dividing, by the tilt angle ω_x, the image-point moving amount Δy_LSr(h_r), which is the difference of the imaging position on the image plane IP corresponding to each half angle of view between
[0091] The eccentric sensitivity of the OIS optical system for each image height in the eccentric direction (R direction) is acquired by dividing, by the eccentricity y of the OIS optical system, the image-point moving amount Δy_TSr(h_r), which is the difference of the imaging position on the image plane IP corresponding to each half angle of view between
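The finite-difference acquisition described in paragraphs [0090] and [0091] amounts to a simple quotient. A minimal sketch (hypothetical function names; the imaging positions are assumed to come from ray tracing the design data before and after the perturbation):

```python
def tilt_image_shift_sensitivity(y_before, y_after, tilt_deg):
    """LS_r(h_r) [mm/deg]: image-point moving amount divided by the tilt
    angle applied to the imaging optical system."""
    return (y_after - y_before) / tilt_deg

def eccentric_sensitivity(y_before, y_after, shift_mm):
    """TS_r(h_r) [mm/mm]: image-point moving amount divided by the
    eccentricity of the OIS optical system."""
    return (y_after - y_before) / shift_mm

def sensitivity_coefficient(sens_at_h, sens_at_center):
    """Normalized coefficient, e.g. k_LS_r(h_r) = LS_r(h_r) / LS."""
    return sens_at_h / sens_at_center
```

The same quotients, evaluated per image height, populate the sensitivity tables of the numerical examples.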
Numerical Example 1
[0092]
TABLE-US-00001
UNIT: mm
Surface Data
Surface No.        r          d        nd      νd
 1              211.125      2.10    1.80810   22.8
 2               80.660      6.03    1.77250   49.6
 3              248.854      0.15
 4               57.558      6.97    1.77250   49.6
 5              160.440   (Variable)
 6               66.217      1.40    1.88300   40.8
 7               18.113      8.41
 8             −206.710      1.20    1.61800   63.4
 9               22.688      4.36    1.85478   24.8
10               79.196      4.20
11              −35.317      1.20    1.58313   59.4
12*            −312.513      0.43
13              910.041      5.47    1.59270   35.3
14              −19.928      1.10    1.88300   40.8
15              −47.138   (Variable)
16 (Diaphragm)       ∞       0.40
17               81.194      4.45    1.83481   42.7
18              −54.244      0.15
19               41.217      7.25    1.49700   81.5
20              −32.257      1.10    2.00069   25.5
21             −293.896      2.41
22*             −71.464      1.75    1.76802   49.2
23               64.990      1.91    1.80810   22.8
24              199.742   (Variable)
25               30.855      6.56    1.59522   67.7
26              −85.643      0.35
27               38.493      1.20    1.73800   32.3
28               22.868      7.83    1.53775   74.7
29              −71.877      0.15
30*           −4310.465      1.70    1.85400   40.4
31*             109.508   (Variable)
32               53.194      0.90    1.80400   46.6
33               22.891   (Variable)
34*             −42.821      1.70    1.58313   59.4
35*           −2156.781      0.15
36              344.261      3.20    2.00100   29.1
37              −88.670   (Variable)
Image Plane          ∞
Aspheric Data
12th Surface: K = 0.00000e+000  A4 = −5.69442e−006  A6 = −2.29053e−009  A8 = −4.72363e−011  A10 = 4.65343e−013  A12 = −1.99227e−015
22nd Surface: K = 0.00000e+000  A4 = 1.87606e−006  A6 = 1.45872e−009  A8 = 2.78338e−011  A10 = −2.10980e−013  A12 = 3.98590e−016
30th Surface: K = 0.00000e+000  A4 = −2.01869e−005  A6 = 6.17344e−008  A8 = −2.64177e−010  A10 = −2.98832e−013  A12 = 2.64092e−015
31st Surface: K = 0.00000e+000  A4 = 1.63774e−006  A6 = 9.32838e−008  A8 = −2.34772e−010  A10 = −7.39973e−013  A12 = 4.51086e−015
34th Surface: K = 0.00000e+000  A4 = −2.51719e−005  A6 = 1.25180e−007  A8 = −5.32709e−010  A10 = 5.08044e−013  A12 = 7.30860e−016
35th Surface: K = 0.00000e+000  A4 = −2.60571e−005  A6 = 1.26402e−007  A8 = −6.23562e−010  A10 = 1.45147e−012  A12 = −1.39940e−015
TABLE-US-00002
VARIOUS DATA (ZOOM RATIO 2.74)
                          WIDE-ANGLE   MIDDLE   TELEPHOTO
Focal Length:               24.72       43.76     67.66
Fno:                         2.91        2.91      2.91
Half Angle of View:         42.00       25.95     17.34
Image Height:               21.64       21.64     21.64
Overall Optical Length:    144.33      158.18    172.04
BF:                         14.30       25.72     35.98
d 5:                         0.80       17.81     28.91
d15:                        16.54        8.10      2.46
d24:                        11.55        5.41      3.56
d31:                         2.38        1.11      0.91
d33:                        12.58       13.85     14.04
d37:                        14.30       25.72     35.98
TABLE-US-00003
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT AT WIDE-ANGLE END IN TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_LSr [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_r(h_r) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_r
(numerical rows not reproduced)
TABLE-US-00004
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT AT WIDE-ANGLE END IN DIRECTION ORTHOGONAL TO TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_LSθ [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_θ(h_θ) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_θ
(numerical rows not reproduced)
TABLE-US-00005
ECCENTRIC SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN ECCENTRIC DIRECTION AGAINST ECCENTRICITY OF OIS OPTICAL SYSTEM AT WIDE-ANGLE END
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_TSr [mm] after Eccentricity of 0.1 mm | Eccentric Sensitivity TS_r(h_r) [mm/mm] | Eccentric Sensitivity Coefficient k_TS_r
(numerical rows not reproduced)
TABLE-US-00006
ECCENTRIC SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN DIRECTION ORTHOGONAL TO ECCENTRIC DIRECTION AGAINST ECCENTRICITY OF OIS OPTICAL SYSTEM AT WIDE-ANGLE END
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_TSθ [mm] after Eccentricity of 0.1 mm | Eccentric Sensitivity TS_θ(h_θ) [mm/mm] | Eccentric Sensitivity Coefficient k_TS_θ
(numerical rows not reproduced)
Numerical Example 2
[0093]
TABLE-US-00007
UNIT: mm
Surface Data
Surface No.        r          d        nd      νd
 1*            3000.000      2.85    1.58313   59.4
 2*              16.526     10.57
 3*            −809.327      2.25    1.85400   40.4
 4*              91.828      5.56
 5              −53.256      1.20    1.59522   67.7
 6               68.528      0.15
 7               43.587      5.03    1.85478   24.8
 8             −485.244   (Variable)
 9               63.607      2.67    1.84666   23.9
10            −1472.964      0.15
11               52.737      1.00    1.92286   20.9
12               22.996      5.41    1.53172   48.8
13              489.976   (Variable)
14 (Diaphragm)       ∞    (Variable)
15               27.733      1.20    2.00069   25.5
16               19.641      9.29    1.53775   74.7
17              −78.882   (Variable)
18              −67.558      4.31    1.92286   20.9
19              −20.948      0.77    1.83400   37.2
20              136.126      3.52
21                   ∞    (Variable)
22               30.487     11.20    1.49700   81.6
23              −50.182      0.15
24               40.928     11.00    1.49700   81.6
25              −25.800      1.20    2.05090   26.9
26              208.835      4.54
27*             −73.669      2.10    1.85400   40.4
28*           −1000.000      0.15
29              216.036      3.40    1.92286   20.9
30             −127.538   (Variable)
Image Plane          ∞
Aspheric Data
1st Surface: K = 0.00000e+000  A4 = 8.30213e−006  A6 = −1.33976e−008  A8 = 4.25008e−011  A10 = −8.60253e−014  A12 = 1.03363e−016
2nd Surface: K = −9.81344e−001  A4 = 4.49709e−007  A6 = −2.34544e−008  A8 = −1.05516e−010  A10 = 8.07443e−013  A12 = −2.78552e−015
3rd Surface: K = 0.00000e+000  A4 = −9.01759e−006  A6 = −1.39642e−007  A8 = 1.23272e−009  A10 = −3.49283e−012  A12 = 3.62808e−015
4th Surface: K = 0.00000e+000  A4 = 6.34981e−006  A6 = −1.29871e−007  A8 = 1.67920e−009  A10 = −6.48374e−012  A12 = 1.50043e−014
27th Surface: K = 0.00000e+000  A4 = −8.04129e−005  A6 = 2.64851e−007  A8 = −1.06038e−009  A10 = 4.87911e−012  A12 = −8.56493e−015
28th Surface: K = 0.00000e+000  A4 = −6.00659e−005  A6 = 2.67376e−007  A8 = −7.05021e−010  A10 = 2.04492e−012  A12 = −2.97985e−015
TABLE-US-00008
VARIOUS DATA (ZOOM RATIO 2.20)
                          WIDE-ANGLE   MIDDLE   TELEPHOTO
Focal Length:               15.45       24.00     33.95
Fno:                         2.91        2.91      2.91
Half Angle of View:         55.41       41.57     31.88
Image Height:               21.64       21.64     21.64
Overall Optical Length:    159.58      147.48    144.99
BF:                         14.00       22.21     32.15
d 8:                        25.32        7.72      1.50
d13:                         8.24       11.30      7.40
d14:                        13.71        5.42      0.71
d17:                         1.60        9.89     14.61
d21:                         7.04        1.27     −1.05
d30:                        14.00       22.21     32.15
TABLE-US-00009
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT AT WIDE-ANGLE END IN TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_LSr [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_r(h_r) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_r
(numerical rows not reproduced)
TABLE-US-00010
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT AT WIDE-ANGLE END IN DIRECTION ORTHOGONAL TO TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_LSθ [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_θ(h_θ) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_θ
(numerical rows not reproduced)
TABLE-US-00011
ECCENTRIC SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN ECCENTRIC DIRECTION AGAINST ECCENTRICITY OF OIS OPTICAL SYSTEM AT WIDE-ANGLE END
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_TSr [mm] after Eccentricity of 0.1 mm | Eccentric Sensitivity TS_r(h_r) [mm/mm] | Eccentric Sensitivity Coefficient k_TS_r
(numerical rows not reproduced)
TABLE-US-00012
ECCENTRIC SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN DIRECTION ORTHOGONAL TO ECCENTRIC DIRECTION AGAINST ECCENTRICITY OF OIS OPTICAL SYSTEM AT WIDE-ANGLE END
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_TSθ [mm] after Eccentricity of 0.1 mm | Eccentric Sensitivity TS_θ(h_θ) [mm/mm] | Eccentric Sensitivity Coefficient k_TS_θ
(numerical rows not reproduced)
Numerical Example 3
[0094]
TABLE-US-00013
UNIT: mm
Surface Data
Surface No.        r          d        nd      νd
 1               69.371      2.50    1.76385   48.5
 2               14.745     14.38
 3              263.184      1.35    1.53775   74.7
 4               20.476      5.83
 5              −39.045      1.20    1.59282   68.6
 6               33.526      0.24
 7               26.797      6.12    1.85025   30.1
 8              −37.028   (Variable)
 9              −25.901      1.30    1.91082   35.3
10             −122.028   (Variable)
11                   ∞       1.00
12               24.286      2.50    1.48749   70.2
13              −95.780      0.20
14               20.038      1.00    1.95375   32.3
15               11.039      3.62    1.51742   52.4
16              231.303      2.50
17 (Diaphragm)       ∞       6.09
18             1764.677      0.90    1.67300   38.3
19               15.325      3.96    1.76385   48.5
20             −112.119   (Variable)
21               53.370      3.77    1.43700   95.1
22              −76.916      0.30
23             −276.683      6.34    1.43700   95.1
24              −16.095      1.20    1.88300   40.8
25              −21.730      0.94
26              −17.489      1.40    1.88300   40.8
27              −24.438   (Variable)
Image Plane          ∞
TABLE-US-00014
VARIOUS DATA (ZOOM RATIO 1.86)
                          WIDE-ANGLE   MIDDLE   TELEPHOTO
Focal Length:                8.10       11.97     15.06
Fno:                         4.10        4.10      4.10
Half Angle of View:         91.53       90.00     90.34
Image Height:               11.50       17.00     21.50
Overall Optical Length:    111.33      107.33    108.91
BF:                         13.12       24.44     31.62
d 8:                         3.88        3.84      3.41
d10:                        17.92        6.25      1.70
d20:                         7.77        4.16      3.54
d27:                        13.12       24.44     31.62
TABLE-US-00015
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT AT WIDE-ANGLE END IN TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_LSr [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_r(h_r) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_r
(numerical rows not reproduced)
TABLE-US-00016
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT AT WIDE-ANGLE END IN DIRECTION ORTHOGONAL TO TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_LSθ [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_θ(h_θ) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_θ
(numerical rows not reproduced)
Numerical Example 4
[0095]
TABLE-US-00017
UNIT: mm
Surface Data
Surface No.        r          d        nd      νd
 1               50.658      1.57    1.48749   70.2
 2               17.433      7.73
 3               82.620      1.50    1.48749   70.2
 4               22.068     13.94
 5               28.055      5.75    1.90043   37.4
 6              −26.190      1.00    1.80000   29.8
 7             −678.364      6.06
 8 (Diaphragm)       ∞       2.86
 9               74.460      1.40    1.77250   49.6
10            −3498.619      2.98
11              −20.479      1.00    1.85478   24.8
12               30.759      3.15    1.49700   81.5
13              −76.152      0.29
14              107.343      4.13    1.58313   59.4
15*             −42.035      0.15
16              108.394      4.96    1.85150   40.8
17              −35.438   (Variable)
18              −72.427      1.84    1.83481   42.7
19              −45.108     10.50
20              −23.819      1.57    1.51742   52.4
21              −53.298     11.00
Image Plane          ∞
Aspheric Data
15th Surface: K = 0.00000e+000  A4 = 2.14904e−005  A6 = −6.26885e−009  A8 = 3.11936e−010  A10 = −1.96590e−012  A12 = 3.25155e−015
TABLE-US-00018
VARIOUS DATA
Focal Length:               20.60
Fno:                         1.85
Half Angle of View:         46.42
Image Height:               18.71
Overall Optical Length:     84.88
BF:                         11.00
                          INFINITY   CLOSE
d17:                         1.50     11.92
TABLE-US-00019
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN IN-FOCUS STATE AT INFINITY IN TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_LSr [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_r(h_r) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_r
(numerical rows not reproduced)
TABLE-US-00020
TILT-IMAGE SHIFT SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN IN-FOCUS STATE AT INFINITY IN DIRECTION ORTHOGONAL TO TILT DIRECTION
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_LSθ [mm] after Blur of 0.5 deg | Tilt-Image Shift Sensitivity LS_θ(h_θ) [mm/deg] | Tilt-Image Shift Sensitivity Coefficient k_LS_θ
(numerical rows not reproduced)
TABLE-US-00021
ECCENTRIC SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN ECCENTRIC DIRECTION AGAINST ECCENTRICITY OF OIS OPTICAL SYSTEM IN IN-FOCUS STATE AT INFINITY
Columns: Half Angle of View ω [deg] | Image Height h_r [mm] | Image-point Moving Amount Δy_TSr [mm] after Eccentricity of 0.1 mm | Eccentric Sensitivity TS_r(h_r) [mm/mm] | Eccentric Sensitivity Coefficient k_TS_r
(numerical rows not reproduced)
TABLE-US-00022
ECCENTRIC SENSITIVITY DATA FOR EACH IMAGE HEIGHT IN DIRECTION ORTHOGONAL TO ECCENTRIC DIRECTION AGAINST ECCENTRICITY OF OIS OPTICAL SYSTEM IN IN-FOCUS STATE AT INFINITY
Columns: Half Angle of View ω [deg] | Image Height h_θ [mm] | Image-point Moving Amount Δy_TSθ [mm] after Eccentricity of 0.1 mm | Eccentric Sensitivity TS_θ(h_θ) [mm/mm] | Eccentric Sensitivity Coefficient k_TS_θ
(numerical rows not reproduced)
[0096] As described above, the configuration according to the present invention can easily and satisfactorily provide an image stabilization at a predetermined image point position including the center of the optical axis.
[0097] Each embodiment expresses the information on the image shift sensitivity to the tilt of the imaging optical system 101 according to the image point position as a correction coefficient table in a matrix format defined by the image point position, but the present invention is not limited to this embodiment. The information may be the tilt-image shift sensitivity LS_r(h_r) or LS_θ(h_θ), or the off-axis correction coefficient information acquired from the tilt-image shift sensitivity. That is, the information on the image shift sensitivity may be any information that can provide a moving amount of a predetermined image point position relative to the tilt of the imaging optical system 101.
[0098] Each embodiment has described the tilt-image shift sensitivity as information for each image height in the direction (R direction) orthogonal to the tilt rotation axis of the imaging optical system 101 and in the direction parallel to that rotation axis. However, the tilt-image shift sensitivity may be information determined for each image point position over the entire imaging plane for a predetermined tilt direction. In that case, it may be the tilt-image shift sensitivity directly acquired from the image-point moving amount over the entire imaging plane obtained using the design values of the imaging optical system 101.
[0099] Each numerical example acquires the image point position using the imaging position of the principal ray, but may instead acquire it using the peak position of the MTF (Modulation Transfer Function).
[0100] The camera microcomputer 202 may perform image stabilization using an electronic image stabilization function that changes the effective pixel area of the image sensor 201. That is, the camera microcomputer 202 may function as one of the image stabilizers.
Other Embodiments
[0101] Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
[0102] Each of the above embodiments can provide a control apparatus, an image pickup apparatus, a lens apparatus, a control method, and a storage medium, each of which can easily and satisfactorily provide an image stabilization to a predetermined image point position including a center of an optical axis.
[0103] While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
[0104] This application claims the benefit of Japanese Patent Application No. 2021-033221, filed on Mar. 3, 2021, which is hereby incorporated by reference herein in its entirety.