Image processing apparatus and method using tracking of gaze of user
09781410 · 2017-10-03
Assignee
Inventors
- Yun Tae Kim (Hwaseong-si, KR)
- Gee Young Sung (Daegu, KR)
- Dong Kyung Nam (Yongin-si, KR)
- Ju Yong Park (Seoul, KR)
CPC classification
H04N13/383
ELECTRICITY
H04N13/302
ELECTRICITY
H04N13/376
ELECTRICITY
International classification
Abstract
An apparatus and method for outputting view images by tracking a gaze of a user are provided. The image processing apparatus estimates movement of a gaze of a user using a camera, and determines an output order of view images according to the gaze movement.
Claims
1. An image processing method, comprising: estimating an eye position of a user in a viewing zone; determining, by way of a processor, a first sub-zone for a plurality of view images in a first group to be consecutively arranged; and determining, by the processor, a second sub-zone for a second group comprising a repeated view image, based on the eye position of the user, wherein the first group and the second group comprise different view images, and wherein the second sub-zone is disposed at a side of the eye position and at another side of the eye position and the first sub-zone is disposed at the eye position.
2. The image processing method of claim 1, wherein the second group is adjacent to the first group.
3. The image processing method of claim 1, wherein the repeated view image is repeatedly arranged within the second group.
4. The image processing method of claim 3, wherein the repeated view image repeatedly arranged corresponds to the outermost view image.
5. The image processing method of claim 1, wherein: the estimating of the eye position comprises estimating a movement angle of the eye position of the user using the eye position of the user and a focal distance; and the determining of the first sub-zone comprises varying a number of the plurality of view images of which the first sub-zone is to be adjusted, according to the movement angle.
6. The image processing method of claim 1, further comprising displaying an output image comprising the plurality of the view images according to the determined first sub-zone.
7. The image processing method of claim 6, wherein the output image comprises pixel units each comprising at least one view image.
8. A non-transitory computer readable recording medium storing a program to cause a computer to implement the method of claim 1.
9. An image processing apparatus, comprising: a processor configured to control one or more processor-executable units; an estimating unit configured to estimate an eye position of a user; and an output order determining unit configured to determine a first sub-zone for a plurality of view images in a first group to be consecutively arranged and to determine a second sub-zone for a second group comprising a repeated view image, based on the eye position of the user, wherein the first group and the second group comprise different view images, and wherein the second sub-zone is disposed at a side of the eye position and at another side of the eye position and the first sub-zone is disposed at the eye position.
10. The image processing method of claim 1, wherein the second sub-zone and a third sub-zone for the second group are located at the sides of the viewing zone.
11. The image processing method of claim 1, wherein the second sub-zone is to the left or right side of the eye position of the user's left or right eye, respectively.
12. The image processing method of claim 1, wherein the repeated view image in the second group is consecutive with a view image, in the first group of the first sub-zone, at the eye position.
13. The image processing apparatus of claim 9, wherein the second group is adjacent to the first group.
14. The image processing apparatus of claim 9, wherein the repeated view image is repeatedly arranged within the second group.
15. The image processing apparatus of claim 14, wherein the repeated view image repeatedly arranged corresponds to the outermost view image.
16. The image processing apparatus of claim 9, wherein: the estimating unit estimates a movement angle of the eye position of the user using the eye position of the user and a focal distance; and the output order determining unit varies a number of the plurality of view images of which the first sub-zone is to be adjusted, based on the movement angle.
17. The image processing apparatus of claim 9, further comprising a display configured to display an output image comprising the plurality of the view images, according to the determined first sub-zone.
18. The image processing apparatus of claim 17, wherein the output image comprises pixel units each containing at least one view image.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) These and/or other aspects and advantages will become apparent and more readily appreciated from the following description of the example embodiments, taken in conjunction with the accompanying drawings.
DETAILED DESCRIPTION
(16) Reference will now be made in detail to example embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. Example embodiments are described below to explain the present disclosure by referring to the figures.
(18) When the user moves to the right, a left eye L of the user is disposed in an orthoscopic viewing zone of a display 10 and a right eye R is disposed in a sub-viewing zone of the display 10. Accordingly, a right view image is seen by the left eye L while a left view image is seen by the right eye R, which may make the user 30 feel dizzy.
(19) For example, according to
(20) Hereinafter, a process of determining the output order of view images according to movement of a gaze of a user will be further described with reference to
(22) Referring to
(23) The movement estimation unit 210 may estimate whether the gaze of the user is moved, based on user eye position information sensed by a sensing device 100. Here, the movement estimation unit 210 may include a movement distance calculating unit 211 and a movement angle calculating unit 212.
(24) For example, the sensing device 100 may photograph the user in real time and transmit each photographed image to the movement distance calculating unit 211 in real time. The sensing device 100 may include any one of a web camera, a monocular camera, a stereo camera, a multi camera, and a camera measuring depth information. The movement distance calculating unit 211 may then calculate a position of the left eye, the right eye, or both eyes of the user from the photographed image. The user eye position may be expressed by a coordinate value corresponding to the view images being displayed on the display 230, that is, by a coordinate value of the view images corresponding to the positions gazed upon by the eyes of the user. For example, the user eye position calculated from the photographed image may be (x1, y1).
(25) When a next photographed image is received, the movement distance calculating unit 211 may calculate a next position of the left and right eyes of the user from the next photographed image. For example, the next user eye position may be (x2, y2).
(26) In addition, the movement distance calculating unit 211 may determine a direction in which the gaze of the user is moved, that is, a movement direction of the gaze, by calculating a left difference value (x2−x1) between a current left eye position and a next left eye position and a right difference value (y2−y1) between a current right eye position and a next right eye position. Here, when any one of the left difference value and the right difference value is equal to or greater than a preset error value, the movement distance calculating unit 211 may estimate that the gaze is moved. In addition, when the left and the right difference values are positive, the movement distance calculating unit 211 may estimate that the gaze is moved to the right. When the left and right difference values are negative, the movement distance calculating unit 211 may estimate that the gaze is moved to the left.
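The difference-value test described above can be sketched as follows. This is only an illustration: the function name, the threshold value, and the use of absolute values for the "equal to or greater than a preset error value" comparison are assumptions, not part of the original apparatus.

```python
# Illustrative sketch of the movement estimation described above.
# Eye positions are coordinate values (x for the left eye, y for the right eye);
# ERROR_VALUE is a hypothetical preset error threshold.
ERROR_VALUE = 2

def estimate_gaze_movement(current, following, error_value=ERROR_VALUE):
    """Return 'right', 'left', or None for (x, y) eye-position pairs."""
    x1, y1 = current
    x2, y2 = following
    left_diff = x2 - x1    # left difference value (x2 - x1)
    right_diff = y2 - y1   # right difference value (y2 - y1)
    # The gaze counts as moved only when a difference reaches the error value.
    if max(abs(left_diff), abs(right_diff)) < error_value:
        return None
    if left_diff > 0 and right_diff > 0:
        return "right"
    if left_diff < 0 and right_diff < 0:
        return "left"
    return None
```

For example, moving from (10, 10) to (15, 15) yields positive difference values and is estimated as a rightward movement.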
(27) Referring to
(28) The movement angle calculating unit 212 may calculate a movement angle of an eye of the user, corresponding to the gaze movement of the user, using the movement distance x and a focal distance fl. Here, a focal distance of the sensing device 100 may be preset as the focal distance fl. The movement angle calculating unit 212 may calculate the movement angle of the eye using Equation 1 below.
(29) θ = tan−1(x/fl)   [Equation 1]
wherein θ refers to the movement angle, x refers to the movement distance, and fl refers to the focal distance.
(30) Referring to Equation 1, the movement angle calculating unit 212 may calculate the movement angle by dividing the movement distance by the focal distance and calculating an arc tangent value of the quotient. Therefore, the output order determining unit 220 may determine the output order of the view images based on the calculated movement angle.
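Equation 1 translates directly into code. The sketch below is illustrative; the conversion to degrees is an assumption for readability, since the patent does not state the unit of the movement angle.

```python
import math

def movement_angle(movement_distance, focal_distance):
    """Equation 1: arc tangent of (movement distance / focal distance),
    returned in degrees (an assumption made here for readability)."""
    return math.degrees(math.atan(movement_distance / focal_distance))
```

For instance, a movement distance equal to the focal distance gives an angle of approximately 45 degrees.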
(31) More specifically, the output order determining unit 220 may determine the output order of the view images corresponding to the movement of the gaze. When the gaze is estimated to have moved, the output order determining unit 220 may vary the number of the view images of which the output order is to be adjusted, in accordance with the calculated movement angle.
(32) For example, referring to
(33) As the gaze of the user moves to the right as shown in
(34) Here, when a lens of the display 400 is tilted by a predetermined angle, view images neighboring on the left and the right may be simultaneously displayed by a single pixel. For example, in a display that displays 12 view images, one pixel may display 5 view images at once. Therefore, as shown in
(35) In a similar manner, as shown in
(36) In the same manner, referring to
(37) As described above, as the movement angle increases from θ to 2θ and 3θ, the number of the view images of which the output order is to be adjusted may be increased from 1 to 2 and from 2 to 3, respectively.
(38) In the above, the process of determining the output order of the view images corresponding to the movement of the gaze when the gaze is moved to the right has been described with reference to
(39) Here, among the plurality of view images, the view image corresponding to the position of the gaze may be disposed in the middle of the slide window. The same number of view images is arranged on the left and the right of the view image disposed in the middle of the slide window. Referring to
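The slide-window arrangement described above can be sketched as follows. The function name and parameters are hypothetical, and because the window size is even, "middle" is taken here to mean one of the two central slots.

```python
def slide_window(center_view, views_per_pixel=12):
    """Sketch of the slide-window arrangement: the view image at the gaze
    position sits in the middle, with consecutive view images on either
    side. With an even window size, the center view occupies one of the
    two central slots (an assumption for illustration)."""
    half = views_per_pixel // 2
    return list(range(center_view - half, center_view + half))
```

For example, with the gaze on the 18-view image, the window spans the 12- to 23-view images, with the 18-view image in the middle.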
(40) As described above, the output order determining unit 220 may determine the output order of the plurality of view images constituting the output image based on the movement angle calculated as the gaze of the user is moved. Accordingly, the display 230 may display the plurality of view images according to the determined order. Here, the output order determining unit 220 may determine the output order using a database where the output orders according to the movement angles are stored.
(41) Table 1 shows the output orders corresponding to the movement angles calculated as the gaze of the user is moved to the right. The input image in Table 1 is a 36-view image.
(42) TABLE 1

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0    13 14 15 16 17 18 19 20 21 22 23 24
 1    13 14 15 16 17 18 19 20 21 22 23 12
 2    13 14 15 16 17 18 19 20 21 22 11 12
 3    13 14 15 16 17 18 19 20 21 10 11 12
 4    13 14 15 16 17 18 19 20  9 10 11 12
 5    13 14 15 16 17 18 19  8  9 10 11 12
 6    13 14 15 16 17 18  7  8  9 10 11 12
 7    13 14 15 16 17  6  7  8  9 10 11 12
 8    13 14 15 16  5  6  7  8  9 10 11 12
 9    13 14 15  4  5  6  7  8  9 10 11 12
10    13 14  3  4  5  6  7  8  9 10 11 12
11    13  2  3  4  5  6  7  8  9 10 11 12
12     1  2  3  4  5  6  7  8  9 10 11 12
(43) Referring to Table 1, the output order determining unit 220 may determine the output order according to the movement angles calculated by the movement angle calculating unit 212. In addition, the display 230 may display a natural 3D image despite the rightward movement of the gaze, by outputting the plurality of view images according to the determined output order. Here, when the movement angle is 0 in Table 1, the user is located in the middle of the display 230. When the movement angle is 12θ, both the left and the right eyes are moved to the sub-viewing zone. Therefore, the display 230 may display the output image comprised of 1 to 12-view images in that order.
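The rightward shifts in Table 1 follow a regular pattern, which can be sketched as follows. This is an illustrative reconstruction under the 36-view input and 12-slot display of the example above; the function name and parameters are assumptions.

```python
def rightward_output_order(n, slots=12, total_views=36):
    """Reproduce the Table 1 pattern for a rightward movement angle of
    n*theta: the first (slots - n) positions keep the center views
    13, 14, ..., and the last n positions are replaced by the views
    (13 - n) .. 12 to avoid the inversion."""
    center_start = total_views // 3 + 1            # 13 for a 36-view input
    kept = [center_start + i for i in range(slots - n)]
    replaced = [center_start - n + i for i in range(n)]
    return kept + replaced
```

At a movement angle of 3θ this yields the Table 1 row 13 14 15 16 17 18 19 20 21 10 11 12, and at 12θ the row 1 to 12.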
(44) The image processing apparatus 200 may double the viewing angle as the gaze of the user is moved to the right. In this case, the output order determining unit 220 may determine the output order of the plurality of view images according to Table 2 below. The display 230 may display the plurality of view images according to the determined output order. Here, calculation of the movement angle according to the gaze movement is the same as described above regarding the movement angle calculating unit 212 and therefore will not be described again.
(45) Table 2 shows the output orders of the view images corresponding to the movement angles when the viewing angle is expanded. In Table 2, the input image is a 36-view image.
(46) TABLE 2

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0    13 14 15 16 17 18 19 20 21 22 23 24
 1    13 14 15 16 17 18 19 20 21 22 23 12
 2    13 14 15 16 17 18 19 20 21 22 11 12
 3    13 14 15 16 17 18 19 20 21 10 11 12
 4    13 14 15 16 17 18 19 20  9 10 11 12
 5    13 14 15 16 17 18 19  8  9 10 11 12
 6    13 14 15 16 17 18  7  8  9 10 11 12
 7    13 14 15 16 17  6  7  8  9 10 11 12
 8    13 14 15 16  5  6  7  8  9 10 11 12
 9    13 14 15  4  5  6  7  8  9 10 11 12
10    13 14  3  4  5  6  7  8  9 10 11 12
11    13  2  3  4  5  6  7  8  9 10 11 12
12     1  2  3  4  5  6  7  8  9 10 11 12
13     1  2  3  4  5  6  7  8  9 10 11  1
14     1  2  3  4  5  6  7  8  9 10  1  1
15     1  2  3  4  5  6  7  8  9  1  1  1
16     1  2  3  4  5  6  7  8  1  1  1  1
17     1  2  3  4  5  6  7  1  1  1  1  1
18     1  2  3  4  5  6  1  1  1  1  1  1
19     1  2  3  4  5  1  1  1  1  1  1  1
20     1  2  3  4  1  1  1  1  1  1  1  1
21     1  2  3  1  1  1  1  1  1  1  1  1
22     1  2  1  1  1  1  1  1  1  1  1  1
23     1  1  1  1  1  1  1  1  1  1  1  1
(47) According to Table 2, when the gaze of the user is moved to the right, a wide viewing zone may be supplied without generating the inversion.
(48) Table 3 shows the output orders of the view images, corresponding to the movement angles calculated according to a leftward movement of the gaze. In Table 3, the input image is a 36-view image.
(49) TABLE 3

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0    13 14 15 16 17 18 19 20 21 22 23 24
 1    25 14 15 16 17 18 19 20 21 22 23 24
 2    25 26 15 16 17 18 19 20 21 22 23 24
 3    25 26 27 16 17 18 19 20 21 22 23 24
 4    25 26 27 28 17 18 19 20 21 22 23 24
 5    25 26 27 28 29 18 19 20 21 22 23 24
 6    25 26 27 28 29 30 19 20 21 22 23 24
 7    25 26 27 28 29 30 31 20 21 22 23 24
 8    25 26 27 28 29 30 31 32 21 22 23 24
 9    25 26 27 28 29 30 31 32 33 22 23 24
10    25 26 27 28 29 30 31 32 33 34 23 24
11    25 26 27 28 29 30 31 32 33 34 35 24
12    25 26 27 28 29 30 31 32 33 34 35 36
(50) Referring to Table 3, when the movement angle is 12θ, both the left and the right eyes are disposed in the sub-viewing zone. Accordingly, the output order determining unit 220 may determine the output order such that the view images constituting the output image are output in order of the 25 to 36-view images instead of in order of the 13 to 24-view images. That is, the display 230 may display the output image rendered with the 25 to 36-view images. Therefore, the output order determining unit 220 may determine the output order such that the user first sees the 13 to 24-view images in the middle of the display 230 and further sees up to the 36-view image while moving the gaze to the left. In other words, the viewing angle may be expanded to the left while preventing the inversion.
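The leftward pattern in Table 3 admits a similar sketch. As before, this is an illustrative reconstruction under the 36-view input of the example; the names and parameters are assumptions.

```python
def leftward_output_order(n, slots=12, total_views=36):
    """Reproduce the Table 3 pattern for a leftward movement angle of
    n*theta: the first n positions switch to the views 25, 26, ...,
    while the remaining positions keep the views (13 + n) .. 24."""
    center_start = total_views // 3 + 1            # 13 for a 36-view input
    upper_start = 2 * total_views // 3 + 1         # 25 for a 36-view input
    switched = [upper_start + i for i in range(n)]
    kept = [center_start + n + i for i in range(slots - n)]
    return switched + kept
```

At 3θ this yields the Table 3 row 25 26 27 16 17 18 19 20 21 22 23 24, and at 12θ the row 25 to 36.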
(51) In addition, the movement angle calculating unit 212 may calculate the movement angle according to the leftward movement of the gaze, and the output order determining unit 220 may determine the output order of the plurality of view images by referring to Table 3. Therefore, the display 230 outputs the plurality of view images according to the determined output order, thereby displaying a natural 3D image even though the gaze of the user is moved leftward.
(52) For example, in a state where the left eye gazes at a 22-view image 421 and the right eye gazes at a 15-view image 422 among the plurality of view images 410 as shown in
(53) Similarly, as shown in
(54) Also, as shown in
(55) A process of determining the output order of the view images corresponding to the gaze movement of the user when the input image is a 36-view image and the display is capable of displaying 12 view images was previously described with reference to
(56)
(57) Referring to
(58) Here, as shown in
(59) Similarly, when the gaze is moved by about 2θ as shown in
(60) In the same manner, when the gaze is moved by about 3θ as shown in
(61)
(62) Referring to
(63) Similarly, when the gaze of the user is moved to the left by about 2θ as shown in
(64) Similarly, when the gaze of the user is moved to the left by about 3θ as shown in
(65) The process of displaying a natural 3D image by slide windows that include different view images has been described above. Hereinafter, a process of selecting two view images from the plurality of view images constituting the input image and determining the output order using the selected view images is described. In this case, deterioration of the 3D image due to crosstalk may be prevented by performing rendering using the selected two view images in the positions where 12 view images are to be displayed.
(66) For example, the output order determining unit 220 may determine the output order of a plurality of the view images according to Table 4 below. Here, the plurality of displayed view images may include only 2 view images.
(67) Table 4 shows the output order of the view images, corresponding to the movement angle calculated as the gaze is moved to the right. In Table 4, the input image may be a 36-view image and a plurality of the displayed view images may include only 2 view images out of the 36 view images.
(68) TABLE 4

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0    15 15 15 15 15 15 22 22 22 22 22 22
 1    14 14 14 14 14 21 21 21 21 21 21 14
 2    13 13 13 13 20 20 20 20 20 20 13 13
 3    12 12 12 19 19 19 19 19 19 12 12 12
 4    11 11 18 18 18 18 18 18 11 11 11 11
 5    10 17 17 17 17 17 17 10 10 10 10 10
 6    16 16 16 16 16 16  9  9  9  9  9  9
 7    15 15 15 15 15  8  8  8  8  8  8 15
 8    14 14 14 14  7  7  7  7  7  7 14 14
 9    13 13 13  6  6  6  6  6  6 13 13 13
10    12 12  5  5  5  5  5  5 12 12 12 12
11    11  4  4  4  4  4  4 11 11 11 11 11
12     3  3  3  3  3  3 10 10 10 10 10 10
13     2  2  2  2  2  9  9  9  9  9  9  2
14     1  1  1  1  8  8  8  8  8  8  1  1
15     1  1  1  7  7  7  7  7  7  1  1  1
16     1  1  6  6  6  6  6  6  1  1  1  1
17     1  5  5  5  5  5  5  1  1  1  1  1
18     4  4  4  4  4  4  1  1  1  1  1  1
19     3  3  3  3  3  1  1  1  1  1  1  3
20     2  2  2  2  1  1  1  1  1  1  2  2
21     1  1  1  1  1  1  1  1  1  1  1  1
(69) According to Table 4, when the gaze of the user is moved to the right and the calculated movement angle is about 12θ, both the left and the right eyes may be disposed in the sub-viewing zone. Accordingly, the display 230 may display an output image comprised of a 3-view image and a 10-view image. As the output order determining unit 220 determines the output order referring to
(70) For example, when the gaze of the user is moved to the right by about 2θ as shown in
(71) Table 5 shows the output order of the view images, corresponding to the movement angle calculated as the gaze is moved to the left. In Table 5, the input image may be a 36-view image and a plurality of the displayed view images may include only 2 view images out of the 36 view images.
(72) TABLE 5

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0    15 15 15 15 15 15 22 22 22 22 22 22
 1    23 16 16 16 16 16 16 23 23 23 23 23
 2    24 24 17 17 17 17 17 17 24 24 24 24
 3    25 25 25 18 18 18 18 18 18 25 25 25
 4    26 26 26 26 19 19 19 19 19 19 26 26
 5    27 27 27 27 27 20 20 20 20 20 20 27
 6    28 28 28 28 28 28 21 21 21 21 21 21
 7    22 29 29 29 29 29 29 22 22 22 22 22
 8    23 23 30 30 30 30 30 30 23 23 23 23
 9    24 24 24 31 31 31 31 31 31 24 24 24
10    25 25 25 25 32 32 32 32 32 32 25 25
11    26 26 26 26 26 33 33 33 33 33 33 26
12    27 27 27 27 27 27 34 34 34 34 34 34
13    35 28 28 28 28 28 28 35 35 35 35 35
14    36 36 29 29 29 29 29 29 36 36 36 36
15    36 36 36 30 30 30 30 30 30 36 36 36
16    36 36 36 36 31 31 31 31 31 31 36 36
17    36 36 36 36 36 32 32 32 32 32 32 36
18    36 36 36 36 36 36 33 33 33 33 33 33
19    34 36 36 36 36 36 36 34 34 34 34 34
20    35 35 36 36 36 36 36 36 35 35 35 35
21    36 36 36 36 36 36 36 36 36 36 36 36
(73) Referring to Table 5, when the gaze is moved to the left and the calculated movement angle is about 12θ, both the left and the right eyes may be disposed in the sub-viewing zone. Accordingly, the display 230 may display an output image comprised of a 27-view image and a 34-view image. As the output order determining unit 220 determines the output order referring to
(74) For example, when the gaze of the user is moved to the left by about 2θ as shown in
(75) In the same manner, when the input image is a 12-view image and the gaze is moved to the right, the output order determining unit 220 may determine the output order corresponding to the movement angle according to Table 6 below. When the gaze is moved to the left, the output order determining unit 220 may determine the output order of the view images corresponding to the movement angle as shown in Table 7 below.
(76) Table 6 shows the output order of the view images, corresponding to the movement angle calculated as the gaze is moved to the right. In Table 6, the input image is a 12-view image and the displayed view images include only 2 view images out of the 12 view images.
(77) TABLE 6

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0     3  3  3  3  3  3 10 10 10 10 10 10
 1     2  2  2  2  2  9  9  9  9  9  9  2
 2     1  1  1  1  8  8  8  8  8  8  1  1
 3     1  1  1  7  7  7  7  7  7  1  1  1
 4     1  1  6  6  6  6  6  6  1  1  1  1
 5     1  5  5  5  5  5  5  1  1  1  1  1
 6     4  4  4  4  4  4  1  1  1  1  1  1
 7     3  3  3  3  3  1  1  1  1  1  1  3
 8     2  2  2  2  1  1  1  1  1  1  2  2
 9     1  1  1  1  1  1  1  1  1  1  1  1
(78) Referring to Table 6, when the input image is the 12-view image and the movement angle becomes about 9θ, the left eye and the right eye both see a 1-view image, that is, a 2D image is seen by the user. However, although the display 230 displays an output image comprised of twelve 1-view images, the output image is seen as a 3D image by the user due to motion parallax.
(79) For example, when the gaze is moved to the right by about 2θ as shown in
(80) Table 7 shows the output order corresponding to the movement angle calculated as the gaze is moved to the left. In Table 7, the input image is a 12-view image and the displayed view images include only 2 view images out of the 12 view images.
(81) TABLE 7

Movement angle (θ)   Output orders of view images (positions 1 to 12)
 0     3  3  3  3  3  3 10 10 10 10 10 10
 1    11  4  4  4  4  4  4 11 11 11 11 11
 2    12 12  5  5  5  5  5  5 12 12 12 12
 3    12 12 12  6  6  6  6  6  6 12 12 12
 4    12 12 12 12  7  7  7  7  7  7 12 12
 5    12 12 12 12 12  8  8  8  8  8  8 12
 6    12 12 12 12 12 12  9  9  9  9  9  9
 7    10 12 12 12 12 12 12 10 10 10 10 10
 8    11 11 12 12 12 12 12 12 11 11 11 11
 9    12 12 12 12 12 12 12 12 12 12 12 12
(82) Referring to Table 7, since the input image is the 12-view image, when the movement angle is about 9θ, the left and the right eyes both see only the 12-view image, that is, a 2D image is seen by the user. However, although the display 230 displays an output image comprised of twelve 12-view images, the output image is seen as a 3D image by the user due to motion parallax.
(83) For example, when the gaze is moved to the left by about 2θ as shown in
(84) Tables 1 through 7 above may be stored in the image processing apparatus 200 in the form of a database. Accordingly, the output order determining unit 220 may determine the output order of the view images corresponding to the movement angle calculated by the movement angle calculating unit 212, by referring to the table database.
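A table database of this kind can be sketched with a plain dictionary keyed by input image and movement direction. The sketch below is purely illustrative: it stores only three rows of Table 1, and the key names are assumptions.

```python
# Hypothetical table database: each entry maps a movement-angle step to the
# stored output order (only a few Table 1 rows are shown for illustration).
TABLE_DATABASE = {
    ("36-view", "right"): {
        0: [13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24],
        3: [13, 14, 15, 16, 17, 18, 19, 20, 21, 10, 11, 12],
        12: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
    },
}

def determine_output_order(input_image, direction, angle_steps):
    """Look up the stored output order for the calculated movement angle."""
    return TABLE_DATABASE[(input_image, direction)][angle_steps]
```

For example, a rightward movement of 3θ with a 36-view input retrieves the row whose last three positions are the 10-, 11-, and 12-view images.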
(85)
(86) Referring to
(87) For example, the sensing device 100 may photograph the user in real time and transmit the photographed image to the movement distance calculating unit 211 in real time. Accordingly, the movement distance calculating unit 211 may calculate positions of the left eye and the right eye of the user from the photographed image, and calculate next positions of the left or the right eye or of both eyes from a next photographed image. Here, the left and right eye positions may be expressed as coordinate values. The movement distance calculating unit 211 may calculate difference values between current and next left eye positions and between current and next right eye positions calculated from the current and the next photographed images, thereby calculating the left difference value and the right difference value.
(88) The movement distance calculating unit 211 may estimate whether the gaze is moved using the left and the right difference values, in operation S2220.
(89) For example, when any of the left difference value and the right difference value is equal to or greater than the preset error value, the movement distance calculating unit 211 may estimate that the gaze is moved (“YES” branch of operation S2220). In addition, when the left and the right difference values are positive, the movement distance calculating unit 211 may estimate that the gaze is moved to the right. When the left and right difference values are negative, the movement distance calculating unit 211 may estimate that the gaze is moved to the left. The movement distance x may be expressed as a coordinate value comprised of the left and the right difference values.
(90) The movement angle calculating unit 212 may calculate the movement angle of the eye of the user, corresponding to the gaze movement of the user, using the movement distance x and a focal distance fl. For example, the movement angle calculating unit 212 may calculate the movement angle using Equation 1 described above.
(91) Next, when the gaze of the user is moved, the output order determining unit 220 may determine the output order of the view images corresponding to a movement direction of the gaze, in operation S2230.
(92) For example, the output order determining unit 220 may determine the output order of the view images corresponding to the movement angle calculated with reference to Tables 1 through 7. More specifically, the output order may be determined based on the input image and the movement direction of the gaze. That is, the output order of the view images corresponding to the movement angles may be determined according to whether the input image is a 12-view image or a 36-view image and whether the gaze of the user is moved to the right or the left.
(93) That is, the output order determining unit 220 may determine the output order of the view images corresponding to the left eye of the user when the gaze of the user is moved to the right. When the gaze is moved to the left, the output order determining unit 220 may adjust the output order of the view images corresponding to the left eye. Here, the output order may be adjusted such that the left slide window and the right slide window include respectively different view images consecutively arranged.
(94) Sizes of the left and the right slide windows may be preset corresponding to the number of view images that can be simultaneously displayed in one pixel. The left and right view images of the view image corresponding to the position to which the gaze is moved may be respectively disposed in the middle of the left and the right slide windows. In addition, a reference number of view images may be consecutively arranged to the left and the right of the view images disposed in the middle. Here, the reference number may be preset as a value obtained by dividing the number of view images simultaneously displayed in one pixel by 2. Thus, owing to the left and the right slide windows, a natural 3D image may be seen by the user even if the eyes of the user move minutely to the left or the right.
(95) Next, the display 230 may display a plurality of view images according to the determined output order in operation S2240. That is, the display 230 may display the output image comprised of a plurality of different view images. Here, the output image may include pixel units each including at least one view image. Therefore, even when the gaze of the user is moved to the left or the right, the left eye of the user sees only the left view images and the right eye sees only the right view images.
(96) As another example, the output order determining unit 220 may adjust the output order using only 2 view images out of the plurality of view images. In this case, the display 230 may display the output image comprised of the two view images according to the adjusted output order.
(97) When the gaze is estimated as not having been moved in operation S2220 (“NO” branch of operation S2220), the display 230 may continuously display the output image according to the order of the view images being displayed, in operation S2250.
(98) The display 230 described above may include a lens structure such as a lenticular lens or a barrier structure.
(99) The methods according to the above-described example embodiments may be recorded in non-transitory computer-readable media or processor-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The program instructions recorded on the media may be those specially designed and constructed for the purposes of the example embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
(100) Examples of non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa. The methods may be executed on a general purpose computer or processor or may be executed on a particular machine such as the image processing apparatus described herein. Any one or more of the software modules described herein may be executed by a dedicated processor unique to that unit or by a processor common to one or more of the modules.
(101) Although example embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these example embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined in the claims and their equivalents.