Image display apparatus and image display method
10670880 · 2020-06-02
CPC classification
G06F3/04815
PHYSICS
G06F3/011
PHYSICS
G02B2027/011
PHYSICS
G02B2027/0187
PHYSICS
G02B30/24
PHYSICS
H04N13/383
ELECTRICITY
H04N13/239
ELECTRICITY
G02B2027/0143
PHYSICS
International classification
G02B30/24
PHYSICS
G06T19/00
PHYSICS
H04N13/383
ELECTRICITY
G06F3/0481
PHYSICS
G09G3/00
PHYSICS
H04N13/239
ELECTRICITY
G02B27/00
PHYSICS
Abstract
A display image is superimposed and displayed on an outside image in a preferred manner. An optical system superimposes a display image displayed on a display device onto an outside image, and leads the display image to an eye of an observer. A display control unit controls the display size and the display position of the display image on the display device so that the display image is displayed in an image superimposition region (a flat region) detected from the outside image. For example, the display control unit controls the display state of the display image in accordance with the state of the image superimposition region in the outside image. Also, the display control unit performs control to selectively display the display image in a line-of-sight region or outside the line-of-sight region in accordance with the line of sight of the observer.
Claims
1. An image processing apparatus, comprising: a display control unit configured to control disparities of a left-eye image and a right-eye image, wherein a first depth position of a stereoscopic image is on a front side with respect to a second depth position of a region in an outside image of a real space, and the stereoscopic image is perceivable by an observer through the left-eye image and the right-eye image; a depth/structure estimating unit configured to: detect a flat region included in the outside image of the real space, wherein the detected flat region includes low-frequency components in a horizontal direction and a vertical direction; and determine a display position of the stereoscopic image based on the detected flat region, wherein the stereoscopic image is displayed in a superimposed manner on the detected flat region included in the outside image of the real space; an eye position estimating unit configured to detect an observer's eye position, wherein the display control unit includes: a first mode in which the stereoscopic image is displayed in a first region on a line of sight of the observer, and a second mode in which the stereoscopic image is displayed in a second region outside the first region on the line of sight of the observer; and a gyro sensor configured to detect a change in the outside image, wherein the display control unit is further configured to shift from the first mode to the second mode based on the detected change in the outside image.
2. The image processing apparatus according to claim 1, wherein the depth/structure estimating unit is further configured to: detect a depth of the outside image; and determine an image superimposition region in the outside image based on the detected depth.
3. The image processing apparatus according to claim 2, wherein the display control unit is further configured to: correct the left-eye image and the right-eye image; and change a display state of the stereoscopic image based on the correction and a state of the image superimposition region.
4. The image processing apparatus according to claim 2, wherein the display control unit is further configured to change display conditions of the stereoscopic image in a case where the detected flat region included in the outside image of the real space is undetected by the depth/structure estimating unit.
5. The image processing apparatus according to claim 1, further comprising a moving detection unit configured to determine movement of the observer, wherein the display control unit is further configured to shift from the first mode to the second mode based on the determined movement of the observer.
6. The image processing apparatus according to claim 1, further comprising: a first optical system configured to superimpose the left-eye image, displayed on a first display device, onto the outside image, and lead the left-eye image to an observer's left eye; and a second optical system configured to superimpose the right-eye image, displayed on a second display device, onto the outside image, and lead the right-eye image to an observer's right eye.
7. An image processing method, comprising: in an image processing apparatus that includes a display control unit, a depth/structure estimating unit, an eye position estimating unit, and a gyro sensor: controlling, by the display control unit, disparities of a left-eye image and a right-eye image, wherein a first depth position of a stereoscopic image is on a front side with respect to a second depth position of a region in an outside image of a real space, and the stereoscopic image is perceivable by an observer through the left-eye image and the right-eye image; detecting, by the depth/structure estimating unit, a flat region included in the outside image of the real space, wherein the detected flat region includes low-frequency components in a horizontal direction and a vertical direction; determining, by the depth/structure estimating unit, a display position of the stereoscopic image based on the detected flat region, wherein the stereoscopic image is displayed in a superimposed manner on the detected flat region included in the outside image of the real space; detecting, by the eye position estimating unit, an observer's eye position, wherein the display control unit includes: a first mode in which the stereoscopic image is displayed in a first region on a line of sight of the observer, and a second mode in which the stereoscopic image is displayed in a second region outside the first region on the line of sight of the observer; detecting, by the gyro sensor, a change in the outside image; and shifting, by the display control unit, from the first mode to the second mode based on the detected change in the outside image.
8. An image processing apparatus, comprising: a display control unit configured to control disparities of a left-eye image and a right-eye image, wherein a first depth position of a stereoscopic image is on a front side with respect to a second depth position of a region of an outside view of a real space, the stereoscopic image is perceivable by an observer through the left-eye image and the right-eye image, and the stereoscopic image is displayed in a superimposed manner on the outside view; an eye position estimating unit configured to detect an observer's eye position, wherein the display control unit includes: a first mode in which the stereoscopic image is displayed in a first region on a line of sight of the observer, and a second mode in which the stereoscopic image is displayed in a second region outside the first region on the line of sight of the observer; and a gyro sensor, wherein the display control unit is further configured to shift from the first mode to the second mode based on a result detected by the gyro sensor.
9. An image processing apparatus, comprising: a display control unit configured to control disparities of a left-eye image and a right-eye image, wherein a first depth position of a stereoscopic image is on a front side with respect to a second depth position of a region of an outside view of a real space, the stereoscopic image is perceivable by an observer through the left-eye image and the right-eye image, and the stereoscopic image is displayed in a superimposed manner on the outside view; an eye position estimating unit configured to detect an observer's eye position, wherein the display control unit includes: a first mode in which the stereoscopic image is displayed in a first region on a line of sight of the observer, and a second mode in which the stereoscopic image is displayed in a second region outside the first region on the line of sight of the observer; and a moving detection unit configured to determine movement of the observer, wherein the display control unit is further configured to shift from the first mode to the second mode based on the determined movement of the observer.
Description
BRIEF DESCRIPTION OF DRAWINGS
MODES FOR CARRYING OUT THE INVENTION
(33) The following is a mode for carrying out the invention (hereinafter referred to as the embodiment). Explanation will be made in the following order.
(34) 1. Embodiment
(35) 2. Modifications
(36) <1. Embodiment>
(37) [Example Structure of an Optically-Transmissive Head Mount Display]
(39) Each of the glass lens units 101L and 101R is formed by integrating a glass lens and a holographic optical element (HOE) sheet. This HOE sheet has a half-mirror-like function to combine outside light and display light, and a function of a concave surface or an adjustable surface to enlarge a display image.
(40) Infrared sensors 103L and 103R are attached to the glass lens units 101L and 101R, respectively. The infrared sensor 103L is provided in the center position of the glass lens unit 101L in the horizontal direction (the center position of the left-eye optical system in the horizontal direction), for example. The infrared sensor 103R is also provided in the center position of the glass lens unit 101R in the horizontal direction (the center position of the right-eye optical system in the horizontal direction), for example.
(41) Sensor outputs of the infrared sensors 103L and 103R are used for estimating eye positions by a scleral reflection method. The scleral reflection method is a method that utilizes a difference in reflectance between the cornea (the black part of the eye) and the sclera (the white part of the eye). In this case, an infrared sensor horizontally scans weak infrared rays emitted onto an eye of an observer, and detects the reflected light. Since there is a large difference between the intensity of the light reflected from the cornea (the black part of the eye) and the intensity of the light reflected from the sclera (the white part of the eye), the position of the eye of the observer can be estimated from a sensor output.
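The scleral reflection estimation described above can be sketched as follows (an illustrative Python sketch, not part of the disclosure; the function name, the 1-D intensity scan, and the default threshold are all assumptions):

```python
import numpy as np

def estimate_eye_position(scan, threshold=None):
    """Estimate the horizontal eye (cornea) position from a 1-D scan of
    reflected-infrared intensities, as in the scleral reflection method.

    The cornea (the black part of the eye) reflects far less light than
    the sclera (the white part), so the darkest contiguous run of
    samples marks the eye. `scan` is an array of reflected intensities;
    `threshold` defaults to the midpoint between the minimum and
    maximum intensity (an assumed heuristic).
    """
    scan = np.asarray(scan, dtype=float)
    if threshold is None:
        threshold = (scan.min() + scan.max()) / 2.0
    dark = scan < threshold          # samples reflected from the cornea
    if not dark.any():
        return None                  # no cornea found in this scan
    idx = np.flatnonzero(dark)
    return float(idx.mean())         # center of the dark (cornea) region

# Example: sclera is bright (200), cornea is dark (40) at samples 10..19.
scan = np.full(30, 200.0)
scan[10:20] = 40.0
print(estimate_eye_position(scan))   # center of samples 10..19 -> 14.5
```

In practice the left and right sensors would each produce such a scan, and the two estimated positions feed the line-of-sight estimation.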
(42) A gyro sensor 104 is also attached to the glass lens unit 101L. A sensor output of the gyro sensor 104 is used for determining whether there is a change in the image of the outside being observed by the observer through the glass lens units 101L and 101R, and for determining whether the observer (user) is moving.
(43) A camera 105L is also provided in the center position of the glass lens unit 101L in the horizontal direction (the center position of the left-eye optical system in the horizontal direction). The camera 105L captures an image (left-eye imagery) of the outside being observed with the left eye of the observer through the glass lens unit 101L, and outputs the captured image data. Likewise, a camera 105R is also provided in the center position of the glass lens unit 101R in the horizontal direction (the center position of the right-eye optical system in the horizontal direction). The camera 105R captures an image (right-eye imagery) of the outside being observed with the right eye of the observer through the glass lens unit 101R, and outputs the captured image data.
(44) Outputs of the cameras 105L and 105R are used for obtaining information about the depth position of an outside image on which a stereoscopic image is to be superimposed and displayed. Outputs of the cameras 105L and 105R are also used for determining whether there is a change in the image of the outside being observed by the observer through the glass lens units 101L and 101R. Outputs of the cameras 105L and 105R are used for obtaining information (such as luminance information and color information) indicating the state of an outside image on which a stereoscopic image is to be superimposed and displayed. Outputs of the cameras 105L and 105R are also used for detecting an image superimposition region, or a flat region in this embodiment, from an outside image.
(45) The HMD 100 also includes display drivers 111L and 111R, and displays 112L and 112R. Each of the displays 112L and 112R is formed with a liquid crystal display (LCD), for example. The display 112L is driven by the display driver 111L based on left-eye image data, and displays a left-eye image for making the observer perceive a stereoscopic image. The display 112R is driven by the display driver 111R based on right-eye image data, and displays a right-eye image for making the observer perceive a stereoscopic image.
(46) The HMD 100 also includes an eye position estimating unit 121, a line-of-sight estimating unit 122, a depth/structure estimating unit 123, a display control unit 124, and a to-be-displayed image generating unit 125. The eye position estimating unit 121 estimates positions of the left eye and the right eye of the observer based on sensor outputs from the infrared sensors 103L and 103R. The line-of-sight estimating unit 122 estimates a line of sight of the observer based on the result of the left-eye and right-eye position estimation performed by the eye position estimating unit 121.
(47) The depth/structure estimating unit 123 calculates a disparity map indicating the depth position of each pixel in an outside image based on captured image data from the cameras 105L and 105R.
(48) The depth/structure estimating unit 123 also calculates a disparity map that indicates the depth positions of the respective pixels in a display image (a stereoscopic image) based on left-eye and right-eye image data serving as display image data.
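Block matching is one standard way to compute such a disparity map from the two camera images; the following is a toy Python sketch (all names and parameters are illustrative assumptions, not the method claimed by the patent):

```python
import numpy as np

def disparity_map(left, right, max_disp=8, block=4):
    """Estimate a disparity map from a stereo pair by block matching:
    for each block of the left image, find the horizontal shift of the
    right image that minimizes the sum of absolute differences (SAD).

    A toy stand-in for the depth estimation performed from the outputs
    of the two cameras; `max_disp` and `block` are illustrative.
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    h, w = left.shape
    disp = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y+block, x:x+block]
            best, best_d = None, 0
            for d in range(0, min(max_disp, x) + 1):
                cand = right[y:y+block, x-d:x-d+block]
                sad = np.abs(ref - cand).sum()
                if best is None or sad < best:
                    best, best_d = sad, d
            disp[by, bx] = best_d
    return disp

# A bright bar at columns 8..11 in the left image and 6..9 in the right
# image yields a disparity of 2 pixels at that block.
left = np.zeros((8, 16))
left[:, 8:12] = 255.0
right = np.zeros((8, 16))
right[:, 6:10] = 255.0
print(disparity_map(left, right, max_disp=4)[0, 2])   # -> 2.0
```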
(49) The depth/structure estimating unit 123 also detects an image superimposition region from an outside image based on captured image data from the cameras 105L and 105R. The depth/structure estimating unit 123 detects a region containing only low-frequency components in the horizontal direction and the vertical direction (a flat region) as an image superimposition region, for example. Since this image superimposition region is a region in which a display image is to be displayed in a superimposed manner, it has a sufficient size in both the horizontal direction and the vertical direction. In this case, more than one image superimposition region may be detected from an outside image, rather than only one.
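A minimal Python sketch of flat-region detection along these lines (illustrative only; the block size and gradient threshold are assumed tuning constants):

```python
import numpy as np

def detect_flat_blocks(image, block=8, grad_thresh=4.0):
    """Mark blocks of a grayscale image that contain only low-frequency
    content in both the horizontal and vertical directions.

    A block is considered flat when the mean absolute horizontal and
    vertical pixel differences both fall below `grad_thresh` (a
    hypothetical tuning constant). Returns a boolean map with one
    entry per block; contiguous True blocks of sufficient size would
    form candidate image superimposition regions.
    """
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    gh, gw = h // block, w // block
    flat = np.zeros((gh, gw), dtype=bool)
    for by in range(gh):
        for bx in range(gw):
            b = img[by*block:(by+1)*block, bx*block:(bx+1)*block]
            dx = np.abs(np.diff(b, axis=1)).mean()   # horizontal detail
            dy = np.abs(np.diff(b, axis=0)).mean()   # vertical detail
            flat[by, bx] = dx < grad_thresh and dy < grad_thresh
    return flat

# A uniform image is flat everywhere; a high-frequency checker is not.
uniform = np.full((16, 16), 128.0)
print(detect_flat_blocks(uniform).all())   # True
```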
(50) The depth/structure estimating unit 123 also determines a display size and a display position of a display image based on the detected image superimposition region. The depth/structure estimating unit 123 performs the above-described image superimposition region detection cyclically, for each frame, for example, and accordingly determines a display size and a display position of a display image for each frame.
(51) So as to stabilize a display size and a display position of a display image, the depth/structure estimating unit 123 performs temporal smoothing on the display sizes and display positions determined for the respective frames, and determines the display size and the display position for display control.
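The temporal smoothing can be sketched, for example, as an exponential moving average (an illustrative Python sketch; the class name and weight are assumptions, and the patent does not specify a particular smoothing filter):

```python
class SmoothedLayout:
    """Temporally smooth the per-frame display size/position estimates
    with an exponential moving average, so the superimposed image does
    not jitter from frame to frame.

    `alpha` is a hypothetical weight: smaller values give heavier
    smoothing. The smoothed state is used as the display size and
    display position for display control.
    """
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None   # (x, y, width, height)

    def update(self, x, y, w, h):
        if self.state is None:
            self.state = (x, y, w, h)   # first frame: taken as-is
        else:
            a = self.alpha
            self.state = tuple(a * n + (1 - a) * o
                               for n, o in zip((x, y, w, h), self.state))
        return self.state

layout = SmoothedLayout(alpha=0.5)
layout.update(100, 100, 64, 48)
print(layout.update(110, 100, 64, 48))  # x pulled halfway to 110 -> 105.0
```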
(53) The depth/structure estimating unit 123 also determines a depth position of a display image (a stereoscopic image) based on the display size and the display position for display control that are determined in the above described manner. In this case, the depth/structure estimating unit 123 determines the depth position of a display image so that the depth position of the display image is located closer to the front side than the depth position of the region in which the display image is to be displayed in a superimposed manner in the outside image.
(54) A depth position is determined as described above, so as to avoid any inconsistency in the sense of depth when a display image (a stereoscopic image) is superimposed and displayed on an image of the outside, which is the real world.
(56) Specifically, the depth/structure estimating unit 123 determines disparities to be given to the left-eye image and the right-eye image. The depth/structure estimating unit 123 determines Ha, which is the average of disparities in the region in which a display image having a display size and a display position determined as described above is displayed in a superimposed manner in the outside image. The depth/structure estimating unit 123 also determines Hb, which is the average of disparities in the entire display image having the display size determined as described above. It should be noted that Hb can be obtained by calculating the average (see
(57) For example, a case where a display image is superimposed on an outside image and is displayed as shown in
(58) Each of the rectangular frames is drawn with two lines in
(60) The depth/structure estimating unit 123 compares the disparity average Ha related to the outside image that is determined in the above described manner with the disparity average Hb related to the display image. The depth/structure estimating unit 123 then determines whether the depth position of the display image (the stereoscopic image) is located closer to the front side than the depth position of the corresponding region in the outside image by a certain distance or more, that is, whether the difference of the disparity average Hb with respect to the disparity average Ha is equal to or larger than H0, the value that satisfies the above condition.
(61) When the disparity average difference is smaller than H0, the depth/structure estimating unit 123 adjusts one or both of the display positions of the left-eye image and the right-eye image in the horizontal direction, so as to make the disparity average Hb have a disparity average difference equal to or larger than H0.
(62) In the above description, one or both of the display positions of the left-eye image and the right-eye image in the horizontal direction are adjusted so that the disparity average difference of the disparity average Hb with respect to the disparity average Ha becomes equal to or larger than H0. Instead, as shown in
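The disparity comparison and adjustment described above can be sketched as follows (illustrative Python; the function name and the uniform test maps are assumptions, and a real implementation would operate on the disparity maps computed by the depth/structure estimating unit):

```python
import numpy as np

def adjust_disparity(outside_disp, display_disp, H0):
    """Ensure the display image is perceived in front of the outside
    region it covers: the display's disparity average Hb must exceed
    the covered outside region's disparity average Ha by at least H0.

    Both arguments are disparity maps over the superimposition region.
    Returns the extra horizontal shift (in pixels) to apply between the
    left-eye and right-eye images; 0 when no adjustment is needed.
    """
    Ha = float(np.mean(outside_disp))   # outside image, covered region
    Hb = float(np.mean(display_disp))   # display (stereoscopic) image
    if Hb - Ha >= H0:
        return 0.0                      # condition already satisfied
    return H0 - (Hb - Ha)               # shift needed to reach H0

# Outside region averages 10 px of disparity, display averages 12 px,
# and a margin of H0 = 5 px is required -> shift by 3 px.
print(adjust_disparity(np.full((4, 4), 10.0), np.full((4, 4), 12.0), 5.0))
```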
(63) The display control unit 124 controls display of the display image based on the result of the line-of-sight estimation performed by the line-of-sight estimating unit 122, a sensor output of the gyro sensor 104, information about the display size and the display position of the display image determined by the depth/structure estimating unit 123, and the like. Although not shown in the drawings, a user operation signal is also supplied to the display control unit 124.
(64) In a case where an instruction to display the display image is issued through a user operation, the display control unit 124 basically controls display of the display image so that the display image is displayed in the display size and the display position determined by the depth/structure estimating unit 123.
(65) When no flat region serving as an image superimposition region is detected by the depth/structure estimating unit 123, or when no information about a display size and a display position is supplied from the depth/structure estimating unit 123, the display control unit 124 changes display conditions.
(66) The display control unit 124 performs control so that display of the display image is stopped, for example. Alternatively, the display control unit 124 performs control to make the user select a superimposition position so that the display image is displayed in that position, for example. Alternatively, the display control unit 124 performs control so that the display image is displayed in a preset superimposition position, for example. Alternatively, the display control unit 124 performs control so that the display image is displayed in the previously displayed superimposition position, for example.
(67) The display control unit 124 also controls the display position of the display image, or switching on and off of the display of the display image, in accordance with the duration of the non-detection time. In this case, control is performed so that the display image is displayed in the previous display position until the duration reaches a first point of time, so that the display image is displayed in a preset position from the first point of time until the duration reaches a second point of time, and so that the display is stopped when the duration is past the second point of time.
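The duration-dependent control can be sketched as a simple decision function (illustrative Python; the thresholds t1 and t2 and the position values are assumptions):

```python
def display_position_for(duration, t1, t2, previous, preset):
    """Choose where (or whether) to show the display image while the
    image superimposition region remains undetected.

    Until `t1`, keep the previous display position; from `t1` to `t2`,
    fall back to a preset position; past `t2`, stop display (None).
    `t1`, `t2`, and the position values are hypothetical parameters.
    """
    if duration < t1:
        return previous     # keep showing at the last known position
    if duration < t2:
        return preset       # fall back to the preset position
    return None             # stop displaying altogether

print(display_position_for(0.5, t1=1.0, t2=3.0,
                           previous=(120, 80), preset=(0, 0)))  # (120, 80)
```

The same decision function applies to the duration of a detected change in the outside image.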
(68) The display control unit 124 also changes display conditions when a change in the outside image is detected based on a sensor output of the gyro sensor 104. The display control unit 124 performs control so that display of the display image is stopped, for example. Alternatively, the display control unit 124 performs control to make the user select a superimposition position so that the display image is displayed in that position, for example. Alternatively, the display control unit 124 performs control so that the display image is displayed in a preset superimposition position, for example. Alternatively, the display control unit 124 performs control so that the display image is displayed in the previously displayed superimposition position, for example.
(69) The display control unit 124 also controls the display position of the display image, or switching on and off of the display of the display image, in accordance with the duration of the change. In this case, control is performed so that the display image is displayed in the previous display position until the duration reaches a first point of time, so that the display image is displayed in a preset position from the first point of time until the duration reaches a second point of time, and so that the display is stopped when the duration is past the second point of time.
(70) The display control unit 124 also changes display conditions based on a line-of-sight estimation result from the line-of-sight estimating unit 122. In accordance with a mode that is set by the user (the observer), the display control unit 124 performs control in the manner described below. The user can set automatic control mode, first control mode, or second control mode.
(71) Where first control mode is set, the display control unit 124 performs control so that the display image is displayed in the region on which the line of sight concentrates or the region that matches the line of sight. Where second control mode is set, the display control unit 124 performs control so that the display image is displayed in a region outside the region on which the line of sight concentrates or a region outside the region that matches the line of sight.
(72) Where automatic control mode is set, the display control unit 124 performs control in the manner described below depending on whether the user (observer) is moving. Specifically, when the user is not moving, the display control unit 124 performs control so that the display image is displayed in the region on which the line of sight concentrates as in a case where first control mode is set. When the user is moving, the display control unit 124 performs control so that the display image is displayed in a region outside the region on which the line of sight concentrates as in a case where second control mode is set. The display control unit 124 determines whether the observer is moving based on a sensor output of the gyro sensor 104.
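The three control modes can be sketched as follows (illustrative Python; the mode names and region values are assumptions standing in for the line-of-sight region and the region outside it):

```python
def select_display_region(mode, moving, gaze_region, outside_region):
    """Pick the region in which to display the image, following the
    first/second/automatic control modes described above.

    In 'first' mode the image follows the line of sight; in 'second'
    mode it is kept outside the line-of-sight region; 'automatic' mode
    behaves like 'second' while the observer is moving (as judged from
    the gyro sensor output) and like 'first' otherwise.
    """
    if mode == "first":
        return gaze_region
    if mode == "second":
        return outside_region
    if mode == "automatic":
        return outside_region if moving else gaze_region
    raise ValueError("unknown mode: %r" % (mode,))

print(select_display_region("automatic", moving=True,
                            gaze_region="center",
                            outside_region="periphery"))  # periphery
```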
(73) Under the control of the display control unit 124, the to-be-displayed image generating unit 125 generates left-eye and right-eye image data for display so that the display image is displayed in the display size and the display position determined by the depth/structure estimating unit 123 at the time of display of the display image. In this case, a reduction process and a moving process (a geometric transformation process) are performed on left-eye and right-eye image data supplied from outside, to obtain the left-eye and right-eye image data for display. Under the control of the display control unit 124, the to-be-displayed image generating unit 125 also corrects the left-eye and right-eye image data for display so that the display state of the display image is changed in accordance with the state of the region on which the display image is to be displayed in the outside image. In this case, the correction is performed so that elements (components) of the outside image are removed from the display image to be observed by the observer.
(74) An example of image data correction to be performed by the to-be-displayed image generating unit 125 is now described. The outside image is represented by Ireal, and the display image is represented by Idisp. Each of Idisp and Ireal is divided into blocks each consisting of (N × N) pixels. The pixel at the coordinates (i, j) in the outside image Ireal is represented by Ireal(i, j), and the pixel at the coordinates (i, j) in the display image Idisp is represented by Idisp(i, j).
(75) As for the data of each pixel in a block of coordinates (s, t), the to-be-displayed image generating unit 125 performs correction as indicated in the mathematical formula (2) shown below. Here, a represents the correction coefficient, and the user (observer) can adjust the value of the correction coefficient as necessary. Also, clip(x) represents a function to perform a saturation calculation on x into a certain value range (0 to 255, for example). Although not described in detail, this pixel data correction is performed on data of the respective colors of red, green, and blue.
(76) Idisp′(i, j) = clip(Idisp(i, j) − a × Iavg(s, t)) (2), where Idisp′(i, j) is the corrected pixel data, and Iavg(s, t) is the average of Ireal(i, j) over the pixels in the block of coordinates (s, t).
(77) As for the second term in the parenthesis, a value obtained by smoothing a correction value of a surrounding block may be used as indicated in the mathematical formula (3) shown below.
(78) Idisp′(i, j) = clip(Idisp(i, j) − a × Ismooth(s, t)) (3), where Ismooth(s, t) is the value obtained by smoothing the block averages of Ireal over the block of coordinates (s, t) and its surrounding blocks.
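The block-wise correction described above can be sketched as follows (illustrative Python implementing one plausible reading of the correction, in which the scaled block average of the outside image is subtracted from each display pixel; the 0 to 255 value range follows the example in the text):

```python
import numpy as np

def correct_display_image(Idisp, Ireal, a=1.0, block=8):
    """Correct the display image so that, after optical superimposition,
    components of the outside image are cancelled for the observer.

    For every (block x block) block, the block average of the outside
    image Ireal, scaled by the correction coefficient `a`, is
    subtracted from each display pixel, and the result is clipped
    (saturated) to the 0..255 value range.
    """
    Idisp = np.asarray(Idisp, dtype=float)
    Ireal = np.asarray(Ireal, dtype=float)
    out = np.empty_like(Idisp)
    h, w = Idisp.shape
    for s in range(0, h, block):
        for t in range(0, w, block):
            avg = Ireal[s:s+block, t:t+block].mean()  # outside block average
            out[s:s+block, t:t+block] = np.clip(
                Idisp[s:s+block, t:t+block] - a * avg, 0, 255)
    return out

# An outside block averaging 50 darkens the display block by 50.
disp = np.full((8, 8), 200.0)
real = np.full((8, 8), 50.0)
print(correct_display_image(disp, real)[0, 0])   # 150.0
```

In a full implementation this correction would be applied per color (red, green, and blue), as the text notes.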
(79) Operation of the HMD 100 shown in
(80) The light from the left-eye image displayed on the display 112L is superimposed on the outside image at the glass lens unit 101L, and reaches the left eye of the observer. As a result, the left-eye image superimposed on the outside image (left-eye imagery) is observed with the left eye of the observer. Likewise, the light from the right-eye image displayed on the display 112R is superimposed on the outside image at the glass lens unit 101R, and reaches the right eye of the observer. As a result, the right-eye image superimposed on the outside image (right-eye imagery) is observed with the right eye of the observer. As the left-eye image and the right-eye image superimposed on the outside image are observed with the left eye and the right eye of the observer, respectively, the display image superimposed and displayed on the outside image is perceived as a stereoscopic (3D) image by the observer.
(81) A sensor output of the infrared sensor 103L provided in the center position of the glass lens unit 101L in the horizontal direction (the center position of the left-eye optical system in the horizontal direction) is supplied to the eye position estimating unit 121. Likewise, a sensor output of the infrared sensor 103R provided in the center position of the glass lens 101R in the horizontal direction (the center position of the right-eye optical system in the horizontal direction) is supplied to the eye position estimating unit 121.
(82) The eye position estimating unit 121 estimates positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R. The line-of-sight estimating unit 122 then estimates a line of sight of the observer based on the result of the left-eye and right-eye position estimation performed by the eye position estimating unit 121. The result of this line-of-sight estimation is supplied to the display control unit 124.
(83) A sensor output of the gyro sensor 104 attached to the glass lens unit 101L is supplied to the display control unit 124. An output (captured left-eye image data) of the camera 105L provided in the center position of the glass lens unit 101L in the horizontal direction (the center position of the left-eye optical system in the horizontal direction) is supplied to the depth/structure estimating unit 123.
(84) Likewise, an output (captured right-eye image data) of the camera 105R provided in the center position of the glass lens unit 101R in the horizontal direction (the center position of the right-eye optical system in the horizontal direction) is supplied to the depth/structure estimating unit 123. The left-eye and right-eye image data, which is the display image data, is further supplied to the depth/structure estimating unit 123.
(85) The depth/structure estimating unit 123 detects a flat region as an image superimposition region from the outside image based on the captured image data from the cameras 105L and 105R. The depth/structure estimating unit 123 then determines a display size and a display position of the display image based on the detected flat region. In this case, temporal smoothing is performed on display sizes and display positions determined for the respective frames, and the display size and the display position of the display image are stabilized.
(86) The depth/structure estimating unit 123 also calculates a disparity map indicating the depth positions of the respective pixels in the outside image based on the captured image data from the cameras 105L and 105R, and calculates a disparity map indicating the depth positions of the respective pixels in the display image (stereoscopic image) based on the left-eye and right-eye image data, which is the display image data. The depth/structure estimating unit 123 then determines the depth position of the display image (stereoscopic image) based on the display size and the display position for display control that are determined in the above described manner, and the disparity maps calculated in the above described manner. In this case, the depth position of the display image is determined so that the display image is located closer to the front side than the depth position of the region in which the display image is to be displayed in a superimposed manner in the outside image.
(87) The depth/structure estimating unit 123 determines whether the depth position of the display image (stereoscopic image) satisfies the condition that the display image is located at a certain distance or longer on the front side from the depth position of the region corresponding to the outside image. If the condition is not satisfied, the depth/structure estimating unit 123 performs disparity adjustment to adjust one or both of the display positions of the left-eye image and the right-eye image in the horizontal direction so that the condition is satisfied. The information about the display size and the display position for display control that are determined by the depth/structure estimating unit 123 in this manner is supplied to the display control unit 124.
(88) The display control unit 124 controls display of the display image based on the result of the line-of-sight estimation performed by the line-of-sight estimating unit 122, the sensor output of the gyro sensor 104, and the information about the display size and the display position determined by the depth/structure estimating unit 123. In this case, the display control unit 124 basically performs control so that the display image is displayed in the display size and the display position determined by the depth/structure estimating unit 123 at the time of display of the display image.
(89) When a flat region as an image superimposition region is not detected by the depth/structure estimating unit 123, the display control unit 124 changes manners of display. When a change in the outside image is detected based on the gyro sensor 104, the display control unit 124 also changes manners of display.
(90) The display control unit 124 also changes the manner of display based on the line-of-sight estimation result from the line-of-sight estimating unit 122. This control is performed in accordance with a mode that is set by the user (observer). The user can set an automatic control mode, a first control mode, or a second control mode, for example.
(91) Where the first control mode is set, control is performed so that the display image is displayed in the region on which the line of sight concentrates. Where the second control mode is set, control is performed so that the display image is displayed in a region outside the region on which the line of sight concentrates. Where the automatic control mode is set, the display image is displayed outside the region on which the line of sight concentrates while the observer is moving, and in the region on which the line of sight concentrates while the observer is not moving.
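The three modes amount to a small dispatch on the mode setting and the observer's motion state; the mode names and region labels below are illustrative only.

```python
def select_display_region(mode, observer_moving):
    """Map the user-set control mode (and, in automatic mode, the motion
    state) to the region in which the display image is shown."""
    if mode == "first":
        return "on_line_of_sight"
    if mode == "second":
        return "outside_line_of_sight"
    if mode == "automatic":
        # While moving, keep the line of sight clear for safety.
        return "outside_line_of_sight" if observer_moving else "on_line_of_sight"
    raise ValueError(f"unknown mode: {mode}")

print(select_display_region("automatic", observer_moving=True))
# outside_line_of_sight
```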
(92) As described above, where the second control mode is set, the display image (stereoscopic image) can be displayed in a region outside the region on which the line of sight of the observer concentrates in the outside image, so that the display image does not obstruct the observer's activity. In this case, the observer can view the display image while doing something else.
(93) The left-eye and right-eye image data, which is the display image data, is supplied to the to-be-displayed image generating unit 125. The captured image data from the cameras 105L and 105R is also supplied to this to-be-displayed image generating unit 125. Under the control of the display control unit 124, the to-be-displayed image generating unit 125 generates the left-eye and right-eye image data for displaying the display image so that the display image is displayed in the determined display size and the determined display position.
(94) In this case, a reduction process and a moving process are performed on the left-eye and right-eye image data supplied from outside, to obtain the left-eye and right-eye image data for display; that is, the display size and the display position are changed electronically. When the display of the display image is stopped, the generation of the left-eye and right-eye image data for display is also stopped.
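One plausible form of the reduction and moving process is an aspect-preserving fit of the image into the superimposition region; the coordinate convention and the centring policy below are assumptions made for this sketch.

```python
def place_image(src_w, src_h, region):
    """Scale a display image down to fit a superimposition region
    (x, y, w, h) while keeping its aspect ratio, and centre it there.
    Returns the (x, y, w, h) rectangle at which the image is drawn."""
    rx, ry, rw, rh = region
    scale = min(rw / src_w, rh / src_h)   # electronic size change (reduction)
    w, h = int(src_w * scale), int(src_h * scale)
    x = rx + (rw - w) // 2                # electronic position change (move)
    y = ry + (rh - h) // 2
    return (x, y, w, h)

print(place_image(1920, 1080, (100, 100, 960, 540)))  # (100, 100, 960, 540)
```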
(95) The to-be-displayed image generating unit 125 also corrects the left-eye and right-eye image data so that the display state of the display image is changed in accordance with the state of the region on which the display image is to be displayed in the outside image (see the mathematical formula (2)). As the image data is corrected in this manner, the visibility of the display image can be increased, regardless of the state of the outside image.
(96) [Mathematical formula (2)]
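Mathematical formula (2) is not reproduced here, so the following is only a hedged sketch of one common see-through compensation scheme: because the perceived luminance is approximately the displayed light plus the transmitted outside light, the drive value is reduced by an estimate of the background at that pixel and clipped to the displayable range. The linear additive model and parameter names are assumptions.

```python
def compensate(target, background, max_val=255):
    """Per-pixel drive value for an optical see-through display so that the
    sum of displayed and transmitted light approximates `target`.
    A linear additive light model is assumed; real displays also need
    gamma and colour handling."""
    return min(max_val, max(0, target - background))

print(compensate(200, 50))  # 150: part of the target is supplied by the scene
print(compensate(30, 80))   # 0: background already brighter than the target
```

Note that a bright background cannot be darkened by an additive display, which is one reason the apparatus prefers flat, low-contrast superimposition regions in the first place.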
(97) The left-eye image data for display generated by the to-be-displayed image generating unit 125 is supplied to the display driver 111L, and the left-eye image corresponding to this left-eye image data is displayed on the display 112L. The right-eye image data for display generated by the to-be-displayed image generating unit 125 is supplied to the display driver 111R, and the right-eye image corresponding to this right-eye image data is displayed on the display 112R.
(98) As a result, the left-eye image and the right-eye image superimposed on the outside image are observed with the left eye and the right eye of the observer, respectively, and the display image (stereoscopic image) superimposed and displayed in an appropriate position and an appropriate size on the outside image is perceived in a depth position in front of the outside image by the observer.
(99) In this case, the display image is basically displayed in an image superimposition region such as a flat region detected from the outside image, and accordingly, it becomes easier for the observer to visually recognize the display image superimposed and displayed on the outside image. An example case where the outside image is the one shown in
(100) In this case, the flat region in the wall portion shown on the upper middle side in
(101) The flowchart in
(102) In step ST3, the display control unit 124 determines whether image display is to be performed. If an image display setting operation is performed by the user, for example, the display control unit 124 determines that image display is to be performed. When image display is to be performed, the display control unit 124 in step ST4 determines the value of the mode set by the user.
(103) When the value of the set mode indicates second control mode, the display control unit 124 in step ST5 selects a region outside the line of sight in the outside image as the object to be displayed. When the value of the set mode indicates first control mode, the display control unit 124 in step ST6 selects the region that matches the line of sight in the outside image as the object to be displayed.
(104) When the value of the set mode indicates automatic mode, the display control unit 124 in step ST7 determines whether the user is currently moving. If the user is currently moving, the display control unit 124 in step ST5 selects a region outside the line of sight in the outside image as the object to be displayed. If the user is currently not moving, the display control unit 124 in step ST6 selects a region that matches the line of sight in the outside image as the object to be displayed.
(105) After carrying out the procedure of step ST5 or ST6, the display control unit 124 moves on to the procedure of step ST8. In step ST8, the display control unit 124 selects a flat region in the object to be displayed as the display position. When there is more than one flat region in the object to be displayed, the region with the largest area is selected as the display position. In this manner, the display size and the display position of the display image are determined.
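Step ST8 can be sketched as follows. The flatness test (small horizontal and vertical neighbour differences as a proxy for a region containing only low-frequency components) and the threshold value are assumptions for illustration.

```python
def is_flat(block, threshold=8):
    """Treat a 2-D block of luminance values as flat when its horizontal
    and vertical neighbour differences (high-frequency content) stay
    below a threshold."""
    h = [abs(r[i + 1] - r[i]) for r in block for i in range(len(r) - 1)]
    v = [abs(block[j + 1][i] - block[j][i])
         for j in range(len(block) - 1) for i in range(len(block[0]))]
    return max(h + v) < threshold

def pick_display_region(flat_regions):
    """Among candidate flat regions (x, y, w, h), choose the largest area,
    as in step ST8 when more than one flat region is found."""
    return max(flat_regions, key=lambda r: r[2] * r[3])

print(is_flat([[10, 11], [10, 12]]))                          # True
print(pick_display_region([(0, 0, 10, 10), (5, 5, 20, 30)]))  # (5, 5, 20, 30)
```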
(106) In step ST9, the display control unit 124 determines whether there is a change in the outside image. If there is a change in the outside image, the duration of the change is determined in step ST10. If the duration is shorter than th1, the display control unit 124 in step ST11 does not change the display position. If the duration is equal to or longer than th1 but is shorter than th2, the display control unit 124 in step ST12 changes the display position to the preset position. If the duration is equal to or longer than th2, the display control unit 124 in step ST13 stops the display of the display image.
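The thresholding in steps ST10 through ST13 amounts to a small decision table; th1, th2, and the three actions come from the text above, while the labels and time units are illustrative.

```python
def on_outside_change(duration, th1, th2):
    """Choose the display action for a detected change in the outside
    image based on how long the change has lasted (same units as th1/th2,
    with th1 < th2)."""
    if duration < th1:
        return "keep_display_position"      # step ST11
    if duration < th2:
        return "move_to_preset_position"    # step ST12
    return "stop_display"                   # step ST13

print(on_outside_change(0.2, th1=0.5, th2=2.0))  # keep_display_position
```

A brief jitter (e.g. a small head movement picked up by the gyro sensor) thus leaves the display untouched, while a sustained change eventually stops the display altogether.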
(107) After carrying out the procedure of step ST13, the display control unit 124 returns to the procedure of step ST2, and repeats the same procedures as those described above. After carrying out the procedure of step ST11 or ST12, the display control unit 124 moves on to the procedure of step ST14. In step ST14, the display control unit 124 performs temporal smoothing on the display positions (display sizes). In this manner, even an abrupt change in the display position (or display size) is rendered as a smooth transition.
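The temporal smoothing of step ST14 could be realised, for example, with an exponential moving average over per-frame positions; the smoothing factor and the choice of averaging scheme are assumptions of this sketch.

```python
def smooth_positions(positions, alpha=0.3):
    """Exponentially smooth a sequence of per-frame (x, y) display
    positions so that an abrupt jump becomes a gradual transition.
    Smaller alpha gives a more stable but slower-following position."""
    out = [positions[0]]
    for p in positions[1:]:
        prev = out[-1]
        out.append(tuple(prev[i] + alpha * (p[i] - prev[i])
                         for i in range(len(p))))
    return out

print(smooth_positions([(0, 0), (10, 10)], alpha=0.5))  # [(0, 0), (5.0, 5.0)]
```

The same filter applies unchanged to display sizes, since they are likewise determined per frame.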
(108) In step ST15, the display control unit 124 adjusts the depth position of the display image (stereoscopic image) in accordance with the depth position of the outside image. As described above, when the display image is to be displayed in a flat region in the outside image, the depth position of the display image is adjusted by generating the left-eye and right-eye image data for display in accordance with the display size and the display position provided from the depth/structure estimating unit 123. In a case where the display image is to be displayed in the preset position, the display positions of the left-eye and right-eye display images are moved and adjusted in the horizontal direction in accordance with the depth position of the outside image in the preset position.
(109) In step ST16, the display control unit 124 corrects the left-eye and right-eye image data for display in accordance with the state of the display region in the outside image. In step ST17, the display control unit 124 performs image display. Specifically, the left-eye and right-eye image data for display is supplied from the to-be-displayed image generating unit 125 to the display drivers 111L and 111R, respectively, and the left-eye image and the right-eye image are displayed on the displays 112L and 112R.
(110) In step ST18, the display control unit 124 determines whether the image has come to an end. If an image display canceling operation is performed by the user, for example, the display control unit 124 determines that the image has come to an end. If the image has come to an end, the display control unit 124 in step ST19 ends the display control process. If the image has not come to an end, the display control unit 124 returns to step ST3, and repeats the same procedures as those described above.
(111) For ease of explanation, the above described example of the procedures shown in the flowchart in
(112) Not all the components of the HMD 100 shown in
(116) As described above, in the HMD 100 shown in
(117) Also, in the HMD 100 shown in
(118) Also, in the HMD 100 shown in
(119) Also, in the HMD 100 shown in
(120) Also, in the HMD 100 shown in
(121) <2. Modifications>
(122) In the above described embodiment, the depth/structure estimating unit 123 calculates a disparity map indicating the depth position of each pixel in an outside image based on captured image data from the cameras 105L and 105R. However, as shown in
(123) Also, in the above described embodiment, the left-eye and right-eye image data for display differs from the captured left-eye and right-eye image data of the outside image obtained by the cameras 105L and 105R. However, as shown in
(124) Although an example of a binocular HMD has been described in the above embodiment, the present technique can also be applied to a monocular HMD.
(125) Being a monocular HMD, this HMD 100C has one glass lens unit 101, while the HMD 100 shown in
(126) A gyro sensor 104 is also attached to the glass lens unit 101. A sensor output of the gyro sensor 104 is used for determining whether there is a change in the image of the outside, and whether the observer (user) is moving. A sensor output of this gyro sensor 104 is sent to a display control unit 124.
(127) A camera 105 is also provided in the center position of the glass lens unit 101 in the horizontal direction (the center position of the optical system in the horizontal direction). The camera 105 functions in the same manner as the cameras 105L and 105R in the HMD 100 shown in
(128) The eye position estimating unit 121 estimates a position of an eye (the left eye or the right eye) of the observer based on the sensor output from the infrared sensor 103. The line-of-sight estimating unit 122 estimates a line of sight of the observer based on the result of the eye position estimation performed by the eye position estimating unit 121. The result of this line-of-sight estimation is supplied to the display control unit 124.
(129) The structure estimating unit 123C detects a flat region as an image superimposition region from the outside image based on the captured image data from the camera 105. The structure estimating unit 123C then determines a display size and a display position of the display image based on the detected flat region. In this case, temporal smoothing is performed on display sizes and display positions determined for the respective frames, and the display size and the display position of the display image are stabilized. The information about the display size and the display position for display control that are determined by the structure estimating unit 123C in this manner is supplied to the display control unit 124.
(130) The display control unit 124 controls display of the display image based on the result of the line-of-sight estimation performed by the line-of-sight estimating unit 122, the sensor output of the gyro sensor 104, and the information about the display size and the display position determined by the structure estimating unit 123C. In this case, the display control unit 124 basically performs control so that the display image is displayed in the display size and the display position determined by the structure estimating unit 123C at the time of display of the display image.
(131) When a flat region serving as an image superimposition region is not detected by the structure estimating unit 123C, the display control unit 124 changes the manner of display. The display control unit 124 also changes the manner of display when a change in the outside image is detected based on the sensor output of the gyro sensor 104.
(132) The display control unit 124 also changes the manner of display based on the line-of-sight estimation result from the line-of-sight estimating unit 122. This control is performed in accordance with a mode that is set by the user (observer). The user can set an automatic control mode, a first control mode, or a second control mode, for example.
(133) Where the first control mode is set, control is performed so that the display image is displayed in the region on which the line of sight concentrates. Where the second control mode is set, control is performed so that the display image is displayed in a region outside the region on which the line of sight concentrates. Where the automatic control mode is set, the display image is displayed outside the region on which the line of sight concentrates while the observer is moving, and in the region on which the line of sight concentrates while the observer is not moving.
(134) Image data is supplied to the to-be-displayed image generating unit 125. The captured image data from the camera 105 is also supplied to this to-be-displayed image generating unit 125. Under the control of the display control unit 124, the to-be-displayed image generating unit 125 generates the image data for displaying the display image so that the display image is displayed in the determined display size and the determined display position.
(135) In this case, a reduction process and a moving process are performed on image data supplied from outside, to generate the image data for display. When the display of the display image is stopped, the generation of the image data is stopped. The to-be-displayed image generating unit 125 also corrects the image data for display so that the display state of the display image is changed in accordance with the state of the region on which the display image is to be displayed in the outside image (see the mathematical formula (2)).
(136) Although the other aspects of the HMD 100C shown in
(137) In the above described embodiment, eye positions and a line of sight are estimated by using sensor outputs of the infrared sensors. However, a structure for estimating a line of sight of an observer (a user) is not limited to that structure. For example, it is also possible to use an EOG (Electro-Oculogram) method, a face recognition technology, or the like.
(138) In the above described embodiment, the present technique is applied to an optically-transmissive head mount display. However, the present technique is not limited to applications to optically-transmissive head mount displays, but can also be applied to other transmissive display apparatuses. In this case, displaying a virtual image is not necessary.
(139) The present technique may also be embodied in the structures described below.
(140) (1) An image display apparatus including:
(141) an optical system that superimposes a display image displayed on a display device onto an outside image, and leads the display image to an eye of an observer; and
(142) a display control unit that controls a display size and a display position of the display image on the display device so that the display image is displayed in an image superimposition region detected from the outside image.
(143) (2) The image display apparatus of (1), wherein the image superimposition region is detected based on captured image data obtained by forming the outside image.
(144) (3) The image display apparatus of (1) or (2), wherein the display control unit controls the display size and the display position of the display image by processing image data for displaying the display image on the display device based on information about the image superimposition region.
(145) (4) The image display apparatus of any of (1) through (3), wherein the display control unit changes a display state of the display image in accordance with a state of the image superimposition region in the outside image.
(146) (5) The image display apparatus of (4), wherein the display control unit corrects the image data for displaying the display image in accordance with the state of the image superimposition region so that elements of the outside image are removed from the display image to be observed by the observer.
(147) (6) The image display apparatus of any of (1) through (5), wherein, when the image superimposition region is not detected from the outside image, the display control unit changes manners of display of the display image.
(148) (7) The image display apparatus of any of (1) through (6), wherein the display control unit obtains the display size and the display position for the control by performing temporal smoothing on display sizes and display positions on the display device, the display sizes and the display positions being determined by the image superimposition region that is cyclically detected.
(149) (8) The image display apparatus of any of (1) through (7), wherein, when a change in the outside image is detected, the display control unit changes manners of display of the display image.
(150) (9) The image display apparatus of any of (1) through (8), wherein
(151) the optical system includes a first optical system that superimposes a left-eye image displayed on a first display device onto an outside image and leads the left-eye image to the left eye of the observer, and a second optical system that superimposes a right-eye image displayed on a second display device onto the outside image and leads the right-eye image to the right eye of the observer, and
(152) the display control unit controls disparities of the left-eye image and the right-eye image so that a depth position of a stereoscopic image to be perceived by the observer through the left-eye image and the right-eye image is located closer to the front side than a depth position of a region in which the stereoscopic image is displayed in a superimposed manner in the outside image.
(153) (10) An image display method including the steps of:
(154) superimposing a display image displayed on a display device onto an outside image, and leading the display image to an eye of an observer, the superimposing and leading the display image being performed by an optical system; and
(155) controlling a display size and a display position of the display image on the display device so that the display image is displayed in an image superimposition region detected from the outside image.
(156) (11) An image display apparatus including:
(157) an optical system that superimposes a display image displayed on a display device onto an outside image, and leads the display image to an observer; and
(158) a display control unit that has a first control mode for performing control so that the display image is displayed in a region on which a line of sight of the observer concentrates in the outside image, and a second control mode for performing control so that the display image is displayed in a region outside the region on which the line of sight of the observer concentrates in the outside image.
(159) (12) The image display apparatus of (11), wherein the display control unit performs control in the first control mode when the observer is not moving, and performs control in the second control mode when the observer is moving.
(160) (13) The image display apparatus of (11) or (12), wherein the display control unit changes a display state of the image in accordance with a state of a region on which the display image is to be superimposed in the outside image.
(161) (14) The image display apparatus of any of (11) through (13), wherein
(162) the optical system includes a first optical system that superimposes a left-eye image displayed on a first display device onto an outside image and leads the left-eye image to the left eye of the observer, and a second optical system that superimposes a right-eye image displayed on a second display device onto the outside image and leads the right-eye image to the right eye of the observer, and
(163) the display control unit controls disparities of the left-eye image and the right-eye image so that a depth position of a stereoscopic image to be perceived by the observer through the left-eye image and the right-eye image is located closer to the front side than a depth position of a region in which the stereoscopic image is displayed in a superimposed manner in the outside image.
(164) (15) An image display method including the steps of:
(165) superimposing a display image displayed on a display device onto an outside image, and leading the display image to an eye of an observer, the superimposing and leading the display image being performed by an optical system; and
(166) selectively performing control to display the display image in a region on which a line of sight of the observer concentrates in the outside image, and control to display the display image in a region outside the region on which the line of sight of the observer concentrates in the outside image.
(167) (16) An image display apparatus including:
(168) an optical system that superimposes a display image displayed on a display device onto an outside image, and leads the display image to an eye of an observer; and
(169) a display control unit that changes a display state of the display image in accordance with a state of a region on which the display image is superimposed in the outside image.
(170) (17) The image display apparatus of (16), wherein the display control unit acquires the state of the region in the outside image based on captured image data obtained by forming the outside image.
(171) (18) An image display method including the steps of:
(172) superimposing a display image displayed on a display device onto an outside image, and leading the display image to an eye of an observer, the superimposing and leading the display image being performed by an optical system; and
(173) changing a display state of the display image in accordance with a state of a region on which the display image is superimposed in the outside image.
(174) (19) An image display apparatus including:
(175) a first optical system that superimposes a left-eye image displayed on a first display device onto an outside image, and leads the left-eye image to the left eye of an observer;
(176) a second optical system that superimposes a right-eye image displayed on a second display device onto the outside image, and leads the right-eye image to the right eye of the observer; and
(177) a display control unit that controls disparities of the left-eye image and the right-eye image so that a depth position of a stereoscopic image to be perceived by the observer through the left-eye image and the right-eye image is located closer to the front side than a depth position of a region in which the stereoscopic image is displayed in a superimposed manner in the outside image.
(178) (20) An image display method including the steps of:
(179) superimposing a left-eye image displayed on a first display device onto an outside image, and leading the left-eye image to the left eye of an observer, the superimposing and leading the left-eye image being performed by a first optical system;
(180) superimposing a right-eye image displayed on a second display device onto the outside image, and leading the right-eye image to the right eye of the observer, the superimposing and leading the right-eye image being performed by a second optical system; and
(181) controlling disparities of the left-eye image and the right-eye image so that a depth position of a stereoscopic image to be perceived by the observer through the left-eye image and the right-eye image is located closer to the front side than a depth position of a region in which the stereoscopic image is displayed in a superimposed manner in the outside image.
REFERENCE SIGNS LIST
(182) 100, 100A to 100C Head mount display
101, 101L, 101R Glass lens unit
102 Connecting member
103, 103L, 103R Infrared sensor
104 Gyro sensor
105, 105L, 105R Camera
106 Distance measuring sensor
111, 111L, 111R Display driver
112, 112L, 112R Display
121 Eye position estimating unit
122 Line-of-sight estimating unit
123 Depth/structure estimating unit
123C Structure estimating unit
124 Display control unit
125 To-be-displayed image generating unit