AUGMENTED REAL IMAGE DISPLAY DEVICE FOR VEHICLE
20200150432 · 2020-05-14
Assignee
Inventors
CPC classification
B60K2360/188
PERFORMING OPERATIONS; TRANSPORTING
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
B60R11/0229
PERFORMING OPERATIONS; TRANSPORTING
B60K35/29
PERFORMING OPERATIONS; TRANSPORTING
B60K2360/186
PERFORMING OPERATIONS; TRANSPORTING
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
G09G5/00
PHYSICS
B60K35/28
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
The present invention presents information while maintaining the visibility of a foreground. An object selection unit 21 selects a specific real object 300 from within a foreground 200, a display position adjustment unit 22 controls a position of an augmented real image V so that the augmented real image V adjoins or at least partly overlaps the real object 300 selected by the object selection unit 21, and an image processing unit 23 makes an adjustment so that a color of a portion of the augmented real image V visible to a user is the same as or similar to the color of the real object 300 as acquired by a color information acquisition unit 30.
Claims
1. An augmented real image display device for a vehicle that displays an augmented real image including presentation information so that the augmented real image is superimposed on a foreground of the vehicle, the augmented real image display device for a vehicle comprising: an image display unit configured to allow a user to visually recognize the augmented real image; an object selection unit configured to select a specific real object from the foreground; a display position adjustment unit configured to control a position of the augmented real image so that the augmented real image adjoins or at least partly overlaps the real object selected by the object selection unit; a color information acquisition unit configured to acquire color information of the real object; and an image processing unit configured to make an adjustment so that a color of a portion of the augmented real image visible to the user is a same as or similar to the color of the real object.
2. The augmented real image display device for a vehicle according to claim 1, wherein the augmented real image includes an information image indicating the presentation information, and a background image surrounding at least a portion of a periphery of the information image, and the image processing unit is configured to make an adjustment so that the color of the background image visually recognized by the user is the same as or similar to the color of the real object.
3. The augmented real image display device for a vehicle according to claim 2, wherein the color information acquisition unit is configured to acquire, in the real object, the color of an information area including information recognizable by the user and the color of a non-information area not including information recognizable by the user, and the image processing unit is configured to make an adjustment so that the color of the background image visually recognized by the user is the same as or similar to the color of the non-information area and is not the same as or similar to the color of the information area in the real object.
4. The augmented real image display device for a vehicle according to claim 2, wherein the color information acquisition unit is configured to detect a background area with relatively little variation in color in the non-information area, and the display position adjustment unit is configured to control the position of the augmented real image so that at least a portion of the augmented real image projects from the real object and adjoins or at least partly overlaps the background area.
5. The augmented real image display device for a vehicle according to claim 1, wherein the image processing unit is configured to perform at least one of blur processing, translucent processing, and gradation processing to blur at least an outer edge of the augmented real image.
6. The augmented real image display device for a vehicle according to claim 1, further comprising a gaze information acquisition unit configured to detect a gazing position of the user, wherein the image processing unit is configured to, when the gazing position detected by the gaze information acquisition unit moves onto the real object, make an adjustment so that the color of the augmented real image visually recognized by the user is not the same as or similar to the color of the real object.
7. The augmented real image display device for a vehicle according to claim 1, further comprising a gaze information acquisition unit configured to detect a gazing position of the user, wherein the display position adjustment unit is configured to arrange an internal augmented real image in an internal area of the vehicle, and the image processing unit is configured to, either when the gazing position detected by the gaze information acquisition unit is in the internal area or until a predetermined time elapses from when the gazing position is moved out of the internal area, make an adjustment so that the color of the augmented real image visually recognized by the user is not the same as or similar to the color of the real object, and configured to, either when the gazing position detected by the gaze information acquisition unit moves from the internal area to another area or when a predetermined time elapses from when the gazing position is moved out of the internal area, gradually change the color of the augmented real image so that the color of the augmented real image becomes the same as or similar to the color of the real object.
8. The augmented real image display device for a vehicle according to claim 1, wherein the object selection unit is configured to select the real object satisfying a first selection condition including the real object having relevance to the presentation information indicated by the augmented real image, and to select, when determining that there is no real object satisfying the first selection condition in the foreground, the real object satisfying a second selection condition different from the first selection condition, and the image processing unit is configured to make an adjustment so that the color of the augmented real image visually recognized by the user is not the same as or similar to the color of the real object satisfying the second selection condition.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
MODE FOR CARRYING OUT THE INVENTION
[0024] Below, embodiments according to the present invention will be described with reference to the accompanying drawings. Note that the present invention is not limited to the following embodiments (including the contents of the drawings); of course, modifications (including deletion of constituent elements) can be applied to the following embodiments. Furthermore, in the following description, descriptions of known technical matters are omitted as appropriate in order to facilitate understanding of the present invention.
[0026] (Another Example of the Image Display Unit 10)
[0028] Next, the description proceeds with reference to
[0029] The display device 100 includes the image display unit 10, a display control unit 20, an object information acquisition unit (color information acquisition unit) 30, a gaze information acquisition unit 40, a position information acquisition unit 50, a direction information acquisition unit 60, and a communication interface 70. The display device 100 is communicatively connected to a cloud server (external server) 500 and a vehicle ECU 600 via the communication interface 70. The communication interface 70 may include a wired communication function such as a USB port, a serial port, a parallel port, an OBD II port, and/or any other suitable wired communication port. A data cable from the vehicle is connected to the display control unit 20 of the display device 100 via the communication interface 70. It is noted that in other embodiments, the communication interface 70 may include a wireless communication interface using the Bluetooth (registered trademark) communication protocol, the IEEE 802.11 protocol, the IEEE 802.16 protocol, a shared wireless access protocol, a wireless USB protocol, and/or any other suitable wireless technology. The display device 100 acquires image data of the augmented real image V from the cloud server 500 or the vehicle ECU 600 via the communication interface 70, and displays the augmented real image V based on the image data in the vicinity of the real object 300 determined by the display control unit 20. It is noted that part or all of the image data may be stored in a storage unit 24 of the display control unit 20 described below, and the display control unit 20 may be configured to read the image data stored in the storage unit 24 according to information obtained from the cloud server 500, the vehicle ECU 600, and the like to display the augmented real image V.
[0030] The display control unit 20 receives real object information including position information and color information of the real object 300 acquired by the object information acquisition unit 30 described below; gaze information indicating a gazing position of the user acquired by the gaze information acquisition unit 40; position information indicating the current position of the vehicle or the display device 100 acquired by the position information acquisition unit 50; direction information indicating a direction in which the vehicle or the display device 100 is directed, acquired by the direction information acquisition unit 60; and image data acquired by the communication interface 70 from the cloud server 500 and/or the vehicle ECU 600. The display control unit 20 controls the position and color of the augmented real image V displayed by the image display unit 10 so that the augmented real image V is arranged in the vicinity of a specific real object 300 existing in the foreground 200 of the vehicle and has a portion with the same color as the real object 300.
[0031] The display control unit 20 includes an object selection unit 21 that selects the specific real object 300 in the vicinity of which the augmented real image V is to be arranged, a display position adjustment unit 22 that adjusts the relative position at which the augmented real image V is displayed with respect to the specific real object 300 selected by the object selection unit 21, an image processing unit 23 that can adjust the color and brightness of the augmented real image V, and the storage unit 24 that stores the image data.
[0032] The object selection unit 21 selects the specific real object 300 in the vicinity of which the augmented real image V is to be displayed from among the real objects 300 extracted by the object information acquisition unit 30 from the foreground 200. The specific real object 300 to be selected satisfies a first selection condition assigned to each augmented real image V (image data). The first selection condition preferably includes relevance to the presentation information indicated by the augmented real image V. For example, the first selection condition assigned to an augmented real image V indicating a route to the destination is that the real object 300 be a guide sign. However, the first selection condition need not include relevance to the presentation information indicated by the augmented real image V. Further, the first selection condition need not be fixed and may be changed. Specifically, it may be changed automatically depending on a change in the environment in which the vehicle travels, the state of the user, or the like, or may be changed by an operation of the user.
[0033] Further, when the object selection unit 21 determines that there is no real object 300 satisfying the first selection condition in the foreground 200, the object selection unit 21 selects a real object 300 satisfying a second selection condition that is different from the first selection condition. In other words, the object selection unit 21 preferentially selects the real object 300 satisfying the first selection condition over the real object 300 satisfying the second selection condition. It is noted that the object selection unit 21 may not select a specific real object 300 when there is no real object 300 satisfying such a condition. In this case, the augmented real image V is displayed so that it is fixed in a predetermined area in the display area 101.
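The priority logic described in the two paragraphs above, preferring the first selection condition and falling back to the second, can be sketched as follows. This is a minimal illustration in Python; the function name and the representation of selection conditions as predicates are assumptions, not part of the disclosure:

```python
def select_real_object(objects, first_cond, second_cond):
    """Pick the real object 300 that the augmented real image V will be
    anchored to. Objects satisfying the first selection condition are
    preferred; otherwise the second condition is tried; if neither matches,
    None is returned and the image is pinned to a fixed position in the
    display area 101 instead."""
    for cond in (first_cond, second_cond):
        matches = [obj for obj in objects if cond(obj)]
        if matches:
            return matches[0]
    return None
```

A caller might pass, for example, `lambda o: o["type"] == "guide_sign"` as the first condition for a route-guidance image.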
[0034] The display position adjustment unit 22 determines a relative position at which the augmented real image V is to be displayed with respect to the specific real object 300 selected by the object selection unit 21, based on the position information of the real object 300 acquired by the object information acquisition unit 30. Further, the display position adjustment unit 22 may determine a display position of the augmented real image V so that the augmented real image V adjoins or partly overlaps a non-information area 312 (see
[0035] The image processing unit 23 adjusts the color of the augmented real image V to be displayed by the image display unit 10. The image processing unit 23 adjusts the color of the augmented real image V based on the color information indicating the color of the real object 300 acquired by the object information acquisition unit (color information acquisition unit) 30 described below, and makes an adjustment so that the color of a portion of the augmented real image V is the same as or similar to the color of the real object 300. Further, the image processing unit 23 may adjust the color of the augmented real image V based on the gaze information indicating the gazing position of the user acquired by the gaze information acquisition unit 40 (its details will be described below).
[0036] Further, the image processing unit 23 may perform a shading processing on part or all of the augmented real image V to be displayed by the image display unit 10. The shading processing includes blur processing, translucent processing, and gradation processing to blur at least an outer edge of the augmented real image V.
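The gradation processing mentioned above, blurring at least the outer edge of the augmented real image V, can be illustrated with a per-pixel alpha that fades toward the image border. This is a sketch only; the linear falloff and the `border` parameter are assumptions, since the disclosure does not specify the falloff shape:

```python
def edge_alpha(x, y, w, h, border):
    """Opacity for pixel (x, y) of a w-by-h image: fully opaque in the
    interior, fading linearly to fully transparent at the outer edge,
    softening the image boundary as gradation processing would."""
    d = min(x, y, w - 1 - x, h - 1 - y)  # distance to the nearest edge
    return min(1.0, d / border) if border > 0 else 1.0
```

Translucent processing would instead apply one uniform alpha over the whole image, and blur processing a smoothing filter; the same per-pixel blending machinery applies in each case.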
[0037] The object information acquisition unit 30 is an input interface for acquiring the position information of the real object 300 on the foreground 200. The position information is a result of analyzing, by an image analysis unit 32, a captured image of the foreground 200 captured by at least one image capturing camera (foreground image capturing unit) 31 provided on the vehicle or the image display unit 10. The acquired position information of the real object 300 is output to the display device 100.
[0038] (Color Information Acquisition Unit)
[0039] The object information acquisition unit 30 may further function as a color information acquisition unit that can acquire the color information of the real object 300. Specifically, the foreground image capturing unit 31 is preferably a color camera or an infrared camera that can detect the color of the real object 300, and the object information acquisition unit 30 may acquire the color information of the real object 300 on the foreground 200, which is a result of analyzing, by the image analysis unit 32, a color image of the foreground 200 captured by the foreground image capturing unit 31. It is noted that the color information acquisition unit may be configured to acquire, in the real object 300, the color of the information area 311 (see
[0040] Further, the object information acquisition unit 30 may acquire type information for identifying the type of the real object 300 on the foreground 200, which is a result of analyzing, by the image analysis unit 32, the captured image of the foreground 200 captured by the foreground image capturing unit 31. The types of real object 300 include, for example, a road sign, a road surface, a building, and the like, but are not limited to these as long as they exist in the foreground 200 and are identifiable. The image analysis by the image analysis unit 32 is performed by matching with a shape stored in advance in a storage unit of the image analysis unit 32. However, the image analysis may include an additional estimation based on the position of the real object 300 in the captured image or an additional estimation based on the position information of the vehicle or the display device 100, as described below. It is noted that the color of the real object 300 may be estimated according to the type of the real object 300. Accordingly, as a modification, the display control unit 20 may estimate the color of the real object 300 based on the type information acquired from the object information acquisition unit 30. Specifically, the object information acquisition unit 30 can acquire real object information (position information, color information, and type information of the real object 300) and output the information to the display control unit 20.
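The modification described above, estimating the color of the real object 300 from its type information, amounts to a lookup from type to a representative color. A minimal sketch follows; the table entries and default color are hypothetical values chosen for illustration, not taken from the disclosure:

```python
# Hypothetical mapping from real-object type to an estimated RGB color.
# The specific values are illustrative assumptions.
TYPE_COLOR_ESTIMATES = {
    "guide_sign": (0, 90, 170),    # typical blue guide sign
    "road_surface": (90, 90, 90),  # asphalt gray
}

def estimate_color(type_info, default=(128, 128, 128)):
    """Estimate a real object's color from its type when no measured
    color information is available, falling back to a neutral default."""
    return TYPE_COLOR_ESTIMATES.get(type_info, default)
```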
[0041] (Another Example of Color Information Acquisition Unit)
[0042] As another example, the communication interface 70 described below may have a function as the color information acquisition unit. For example, the cloud server 500 stores position information, shape information, color information, and the like of real objects 300 such as roads and buildings together with map information, and accordingly, the communication interface 70 can acquire the color information together with the position information of the real object 300 from the cloud server 500.
[0043] The gaze information acquisition unit 40 is an input interface that acquires gazing position information indicating a gazing position of the user, which is a result of analyzing, by an analysis unit 42, a captured image of the user's eyes captured by a user detection unit 41 including an image capturing camera that captures the user. For line-of-sight detection, the image of the user's eyes is captured by a CCD camera or the like, and the direction of the user's line of sight is detected as the gazing position by pattern-matching image processing.
[0044] The position information acquisition unit 50 acquires the position information of the vehicle or the display device 100 detected by a position detection unit 51 including a GNSS (Global Navigation Satellite System) or the like, and outputs the position information to the display control unit 20.
[0045] The direction information acquisition unit 60 acquires the direction information indicating the direction of the vehicle or the display device 100 detected by a direction detection unit 61 including a direction sensor, and outputs the direction information to the display control unit 20.
[0046] The display control unit 20 outputs the position information of the vehicle or the display device 100 acquired by the position information acquisition unit 50 and the direction information of the vehicle or the display device 100 acquired by the direction information acquisition unit 60 to the cloud server 500 and/or the vehicle ECU 600 via the communication interface 70. Subsequently, based on the received position information and direction information of the vehicle or the display device 100, the cloud server 500 and the vehicle ECU 600 output, to the display control unit 20 via the communication interface 70, the image data of the augmented real image V to be displayed by the display device 100. It is noted that as another example, the cloud server 500 and the vehicle ECU 600 may output specification data for specifying an augmented real image V to be displayed by the display device 100 to the display control unit 20 via the communication interface 70 based on the received position information and direction information of the vehicle or the display device 100, and the display control unit 20 may read the image data stored in the storage unit 24 based on the received specification data. Further, as another example, the cloud server 500 and the vehicle ECU 600 may output, to the display control unit 20, the image data of the augmented real image V or specification data for specifying the augmented real image V to be displayed, based on other information different from the position information and direction information of the vehicle or the display device 100.
[0048] Next, in step S2, the display control unit 20 receives, via the object information acquisition unit 30, the real object information including the type information, the position information, and the color information of the real object 300 existing in the foreground 200, which are results of analysis by the image analysis unit 32 of the captured image of the foreground 200 of the vehicle captured by the foreground image capturing unit 31. Further, the display control unit 20 receives, via the object information acquisition unit 30, the position information of the information area 311 (see
[0049] Next, in step S3, the object selection unit 21 of the display control unit 20 refers to the type information and the position information of the real object 300 received in step S2, and selects a specific real object 300 satisfying the first selection condition assigned to the image data received in step S1. Further, when the object selection unit 21 determines that there is no real object 300 satisfying the first selection condition in the foreground 200, the object selection unit 21 selects a real object 300 satisfying a second selection condition that is different from the first selection condition.
[0050] Next, in step S4, the display position adjustment unit 22 of the display control unit 20 determines a display position of the augmented real image V so that the augmented real image V does not overlap the information area 311 including information recognizable by the user in the real object 300. Specifically, the display position adjustment unit 22 determines a display position of the augmented real image V so that the augmented real image V adjoins or at least partly overlaps the non-information area 312, preferably the background area 313, of the real object 300, based on the position information of the information area 311 (see
[0051] Next, in step S5, the image processing unit 23 of the display control unit 20 determines the color of the augmented real image V so that the color of a portion of the augmented real image V is the same as or similar to the color of the real object 300, based on the color information of the real object 300 received in step S2. Specifically, an adjustment is made so that the color of a background image VB (see
[0052] Next, in step S6, the image processing unit 23 of the display control unit 20 performs the shading processing such as blur processing, translucent processing, and gradation processing on the augmented real image V.
[0053] Next, in step S7, the display control unit 20 causes the image display unit 10 to display the augmented real image V subjected to the shading processing in step S6 at the position determined in step S4 in the color determined in step S5.
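Steps S1 to S7 described above can be summarized as one display cycle. The sketch below represents each unit as a function; this decomposition is illustrative only, since the disclosure describes functional units rather than code, and the function names are assumptions:

```python
def display_cycle(display, acquire_image_data, acquire_real_objects,
                  select_object, choose_position, match_color, shade):
    """One pass through steps S1-S7 of the display control described above."""
    image = acquire_image_data()          # S1: image data from server/ECU
    objects = acquire_real_objects()      # S2: real object info (type, position, color)
    target = select_object(objects)       # S3: pick object per selection conditions
    pos = choose_position(image, target)  # S4: position avoiding the information area
    colored = match_color(image, target)  # S5: match background color to the object
    shaded = shade(colored)               # S6: blur/translucent/gradation processing
    display(shaded, pos)                  # S7: render via the image display unit
    return shaded, pos
```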
[0054] First to fourth embodiments will be specifically described below mainly with reference to
First Embodiment
[0055] In the first embodiment, the image processing unit 23 makes an adjustment so that the color of the background image VB visually recognized by the user is the same as or similar to the color of the real object 300. When the non-information area 312 of the first real object 310 is blue, the color of the background image VB of the first augmented real image V1 is set to blue or a color similar to blue. It is noted that a similar color in the present invention is a color in which the differences in the R, G, and B values in the RGB space each fall within a range of 15% or less, and/or the differences in the H (hue), S (saturation), and V (value) values in the HSV space each fall within a range of 15% or less. The image processing unit 23 does not need to make the entire background image VB the same color as the real object 300, but may make only part of it the same, and when 50% or more of the entire background image VB is similar in color to the real object 300, it is possible to harmonize the augmented real image V with the real object 300. It is noted that the image processing unit 23, if an area of the background image VB close to the real object 300 has a color similar to the real object 300, can harmonize the augmented real image V with the real object 300 even if only about 25% or more of the entire background image VB has the similar color. It is noted that the image processing unit 23 may make an adjustment so that the color of the background image VB visually recognized by the user is not similar to the color of the information area 311 of the real object 300.
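The 15% similarity criterion defined above can be written out directly. This is a sketch; the 8-bit RGB input convention and the hue wrap-around handling are assumptions on top of the stated definition:

```python
import colorsys

def is_similar_color(c1, c2, tol=0.15):
    """Return True if two 8-bit RGB colors are 'similar' in the sense used
    here: each R, G, B difference within 15% of full scale, and/or each
    H, S, V difference within 15% of full scale."""
    rgb_close = all(abs(a - b) / 255.0 <= tol for a, b in zip(c1, c2))
    h1, s1, v1 = colorsys.rgb_to_hsv(*(x / 255.0 for x in c1))
    h2, s2, v2 = colorsys.rgb_to_hsv(*(x / 255.0 for x in c2))
    # Hue is cyclic, so compare along the shorter arc of the hue circle.
    dh = min(abs(h1 - h2), 1.0 - abs(h1 - h2))
    hsv_close = dh <= tol and abs(s1 - s2) <= tol and abs(v1 - v2) <= tol
    return rgb_close or hsv_close
```

For instance, a slightly darker blue passes the RGB test against pure blue, while red against blue fails both tests.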
[0056] It is noted that the augmented real image V does not necessarily have the background image VB. In other words, the augmented real image V may be composed of only the information image VA indicating the presentation information. In this case, the image processing unit 23 makes an adjustment so that the color of part or all of the outermost edge of the information image VA is the same as or similar to the color of the real object 300.
Second Embodiment
[0057] The display position adjustment unit 22 controls the position of the augmented real image V so that at least a portion of the augmented real image V projects from the real object 300 and adjoins or at least partly overlaps the non-information area 312 of the real object 300. In other words, the display position adjustment unit 22 may arrange the augmented real image V so that the augmented real image V has an area VB2 (see
Third Embodiment
[0058] In the third embodiment, when the gazing position of the user detected by the gaze information acquisition unit 40 moves from another position onto the real object 300 on which the augmented real image V is displayed in the vicinity, the image processing unit 23 makes an adjustment so that the color of the augmented real image V visually recognized by the user is not the same as or similar to the color of the real object 300.
Fourth Embodiment
[0059] In the fourth embodiment, either when the gazing position of the user detected by the gaze information acquisition unit 40 is in the internal area 400 of the vehicle or until a predetermined time elapses from when the gazing position of the user is moved out of the internal area 400, the image processing unit 23 makes an adjustment so that the color of the augmented real image V visually recognized by the user is not the same as or similar to the color of the real object 300, and either when the gazing position moves from the internal area 400 to another area or when a predetermined time elapses from when the gazing position is moved out of the internal area 400, the image processing unit 23 gradually changes the color of the augmented real image V so that the color of the augmented real image V becomes the same as or similar to the color of the real object 300.
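The gradual color change in the fourth embodiment can be realized, for example, by interpolating between the current color and the target color over time. Linear interpolation is an assumption here; the disclosure does not specify the transition curve:

```python
def blend(c_from, c_to, t):
    """Linearly interpolate between two RGB colors; t runs from 0.0
    (entirely c_from) to 1.0 (entirely c_to). Stepping t each frame
    produces the gradual change toward the real object's color."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c_from, c_to))
```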
INDUSTRIAL APPLICABILITY
[0060] The present invention is suitable for a transmissive head-mounted display device or a head-up display device, each of which allows a viewer to visually recognize a virtual image superimposed on a landscape.
DESCRIPTION OF REFERENCE NUMERALS
[0061] 10 Image display unit
[0062] 20 Display control unit
[0063] 21 Object selection unit
[0064] 22 Display position adjustment unit
[0065] 23 Image processing unit
[0066] 24 Storage unit
[0067] 30 Object information acquisition unit (color information acquisition unit)
[0068] 40 Gaze information acquisition unit
[0069] 50 Position information acquisition unit
[0070] 60 Direction information acquisition unit
[0071] 70 Communication interface (color information acquisition unit)
[0072] 100 Augmented real image display device for vehicle
[0073] 101 Display area
[0074] 200 Foreground
[0075] 300 Real object
[0076] 310 First real object
[0077] 311 Information area
[0078] 312 Non-information area
[0079] 313 Background area
[0080] 320 Second real object
[0081] 330 Third real object
[0082] 400 Internal area
[0083] 500 Cloud server
[0084] 600 Vehicle ECU
[0085] V Augmented real image
[0086] VA Information image
[0087] VB Background image
[0088] WS Windshield