Display system
11514785 · 2022-11-29
Assignee
Inventors
CPC classification
G01C21/365
PHYSICS
G02B2027/0198
PHYSICS
G06F3/011
PHYSICS
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
G08G1/09626
PHYSICS
B60R2300/304
PERFORMING OPERATIONS; TRANSPORTING
G01C21/265
PHYSICS
B60R2300/308
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/305
PERFORMING OPERATIONS; TRANSPORTING
G08G1/096861
PHYSICS
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
G09G5/00
PHYSICS
G08G1/0968
PHYSICS
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A display system of the present disclosure forms an AR route by shifting node information included in road map data onto the lane on which a subject vehicle is to travel, on the basis of lane information. This makes it possible to display an AR route that matches the shape of the route on which the subject vehicle is to travel without providing a feeling of strangeness, while resolving the inconvenience that the AR route is largely displaced from that route at positions such as intersections and branch points, where a plurality of roads intersect.
Claims
1. A display system for displaying an AR (Augmented Reality) route which is a virtual image so as to be superimposed on a real image which is seen by a user, the display system comprising: a processor that forms the AR route; and a display that displays the AR route as a virtual image, wherein the processor inputs a node coordinate, included in road map data, of a center of a road on which a subject vehicle is to travel, the processor forms the AR route by shifting the node coordinate included in the road map data onto a lane on which the subject vehicle is to travel on a basis of lane information, a route section of the AR route includes a plurality of nodes, the processor performs a process including: connecting a start node of the plurality of nodes to an end node of the plurality of nodes in a first case where a distance of each of the plurality of nodes from a line which connects the start node to the end node of the plurality of nodes is less than or equal to a threshold; and dividing the route section into two route sections in a second case where the distance of at least one of the plurality of nodes from the line which connects the start node to the end node of the plurality of nodes is greater than the threshold, and in the second case, the processor divides the route section at a node of the at least one of the plurality of nodes to which the distance from the line is greatest.
2. The display system according to claim 1, wherein the lane information includes a number of lanes, and the processor changes a shifting amount of the node coordinate in accordance with the number of lanes.
3. The display system according to claim 1, wherein the processor changes the shifting amount of the node coordinate further in accordance with whether or not a road is a one-way road.
4. The display system according to claim 1, wherein the processor shifts the node coordinate further on a basis of second lane information of a second road after the subject vehicle turns left or right or third lane information of a third road on which the subject vehicle is to follow after a branch point.
5. A display apparatus which causes a driver to view a virtual image by projecting light on a windshield, the display apparatus comprising: the display system according to claim 1.
6. The display system according to claim 1, wherein the processor repeats the process for each of the two route sections.
7. The display system according to claim 1, wherein the processor performs curve interpolation using start nodes and end nodes of the two route sections to form the AR route along the route section.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENT
(15) An embodiment of the present invention will be described below with reference to the accompanying drawings.
<1> Schematic Configuration of Display Apparatus
(17) Display apparatus 100 in the present embodiment is embodied as an in-vehicle head-up display (HUD). Display apparatus 100 is attached near an upper face of dashboard 220 of vehicle 200.
(18) Display apparatus 100 projects light on region D10, indicated with a dashed-dotted line, within a field of view of a driver in windshield (so-called front windshield) 210. Part of the projected light passes through windshield 210, while the other part is reflected by windshield 210. This reflected light travels toward the driver's eyes. The driver perceives the reflected light which has entered the eyes as virtual image Vi, which looks like an image of an object located on the opposite side (outside of vehicle 200) of windshield 210, against a background of the real objects seen through windshield 210.
(20) Region D10 is located at a lower portion on the driver side of windshield 210, for example, as indicated by the region enclosed by a dashed line in
(21) Note that an image projected on windshield 210 can be perceived by the driver as virtual image Vi located at different distances depending on its vertical position within region D10. For example, in the examples in
(24) As illustrated in
(26) Display apparatus 100 includes map information acquirer 101, position detector 102, radar 103, vehicle behavior detector 104, viewpoint detector 105, image former 110, display controller 120 and HUD 130.
(27) Map information acquirer 101 acquires map information including information which expresses landforms, road shapes, or the like, with coordinates in an absolute coordinate system. The map information acquired by map information acquirer 101 may be information stored in a map information storage medium mounted on vehicle 200, or may be acquired through communication with an external apparatus. In the case of the present embodiment, map information acquirer 101, which is a so-called navigation system, acquires a course from a current location to a destination. Map information acquirer 101 outputs the map information and course information to image former 110.
(28) Position detector 102, which is embodied by a GPS receiver, a gyroscope, a vehicle speed sensor, or the like, detects a current location of subject vehicle 200.
(29) Radar 103 detects the presence or absence of an object, and the distance to the object, by emitting a radio wave or laser light toward a region ahead of subject vehicle 200 and receiving the reflected wave. Note that display apparatus 100 may include other detection apparatuses such as a camera and an infrared sensor in addition to radar 103 to detect an object in a peripheral region.
(30) Vehicle behavior detector 104, which is embodied by a gyroscope, a suspension stroke sensor, a vehicle height sensor, a vehicle speed sensor, an acceleration sensor, or the like, detects a physical quantity indicating behavior of the vehicle.
(31) Viewpoint detector 105 takes an image of the eyes of the driver with, for example, an infrared camera, and measures coordinates of the positions of the driver's eyes in a vehicle coordinate system from the captured image through image processing. The detection result by viewpoint detector 105 is output to display controller 120.
(32) Image former 110 forms an image which becomes a basis of virtual image Vi on the basis of input signals from map information acquirer 101, position detector 102, radar 103 and vehicle behavior detector 104. Image former 110 includes AR route former 111. AR route former 111 forms an image which becomes a basis of an AR route which is a virtual image on the basis of input signals from map information acquirer 101 and position detector 102.
(33) Display controller 120 displays virtual image Vi in region D10 of the windshield by controlling a light source, a scanner, a screen driver, or the like, which constitute HUD 130 on the basis of the image formed by image former 110 and viewpoint information.
<2> AR Route Formation
(34) Before characteristic AR route forming processing according to the present embodiment is described, typical route formation using map information will be described.
(35) Note that functions of AR route former 111 which will be described below are realized by a CPU copying the program stored in the storage apparatus to a RAM, sequentially reading out commands included in the program from the RAM and executing the commands. In other words, processing of AR route former 111 which will be described below is realized by the program.
(36) AR route former 111 inputs road map data from map information acquirer 101. In the road map data, a minimum unit indicating a road section is referred to as a link. That is, each road is constituted with a plurality of links set for each predetermined road section. Points which connect the links are referred to as nodes, and each of the nodes has position information (coordinate information). Further, points called shape interpolating points may be set between nodes within a link. Each of the shape interpolating points also has position information (coordinate information) in a similar manner to the nodes. A link shape, that is, a shape of a road is determined by position information of the nodes and the shape interpolating points.
(37) A node corresponds to an intersection, a branch point, a junction, or the like, and AR route former 111 inputs coordinate information of the intersection, the branch point, the junction, or the like, as information of the nodes. Further, AR route former 111 also inputs coordinate information of the shape interpolating points as described above.
(38) Each link is constituted with respective pieces of data, as attribute information of the link, such as a link length indicating a length of the link, shape information of the link, coordinates (latitude, longitude) of a start node and a terminal node of the link, a road name, a road type, a road width, a road attribute, a one-way attribute, the number of lanes, the presence or absence of a right-turn-only or left-turn-only lane, and the number of right-turn-only or left-turn-only lanes.
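The link attribute data described above can be sketched as a simple record type. The following Python dataclasses are purely illustrative; the field names and types are assumptions for the sake of the sketch, not the actual road-map-data schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names and types are assumptions,
# not an actual road-map-data schema.

@dataclass
class NodeCoord:
    lat: float  # latitude
    lon: float  # longitude

@dataclass
class Link:
    length_m: float                 # link length
    start: NodeCoord                # start node of the link
    end: NodeCoord                  # terminal node of the link
    shape_points: list = field(default_factory=list)  # shape interpolating points
    road_name: str = ""
    road_type: str = ""
    road_width_m: float = 0.0
    road_attribute: str = ""
    one_way: bool = False
    num_lanes: int = 1
    has_turn_only_lane: bool = False   # right-/left-turn-only lane present?
    num_turn_only_lanes: int = 0
```

A route section is then a chain of such links, with the terminal node of one link coinciding with the start node of the next.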
(39) AR route forming processing by AR route former 111 of the present embodiment will be described next. Information of the nodes and the links indicating the traveling route of the subject vehicle as described above is input to AR route former 111.
(40) <2-1> Line Correction
(42) Meanwhile, in the AR route forming processing in the present embodiment, first, it is determined whether or not a section from N1 to N5 is a linear section. In a case where it is determined that the section is a linear section, line L0 which connects start node N1 with terminal node N5 of the section is formed as the AR route and displayed. On the other hand, in a case where it is determined that the section is not a linear section, dividing processing or curve correction processing, which will be described later, is performed.
(43) The processing will be specifically described. In the AR route forming processing in the present embodiment, first, as illustrated in
(44) Then, distances h2, h3 and h4 between line L0 and other nodes N2, N3 and N4 included in the route section are calculated.
(45) Next, AR route former 111 compares distances h2, h3 and h4 with a predetermined threshold. In a case where distances h2, h3 and h4 are all equal to or less than the threshold, AR route former 111 creates the AR route by connecting node N1 to node N5 with a single line L0. On the other hand, in a case where any of distances h2, h3 and h4 is greater than the threshold, AR route former 111 divides the route section using the node whose distance is greatest as a dividing point. In the case of the examples in
(46) Then, as illustrated in
(47) In short, the line correction processing in the present embodiment creates a line while excluding (ignoring) nodes that do not deviate largely from the line connecting the start node with the terminal node of a section. This prevents the AR route from being unnaturally bent by how the node coordinates happen to be set. For example, it prevents the inconvenience that the AR route bends slightly at each intersection, because the node coordinates are set at the centers of the intersections, even though the road is actually straight.
(50) AR route former 111 first calculates, in step S1, distance h between the line connecting the start node with the terminal node and each of the other nodes included in the section. That is, in the example in
(51) In the subsequent step S2, AR route former 111 determines whether or not all distances h2, h3 and h4 are equal to or less than a threshold. In a case where a positive result is obtained in step S2 (step S2: Yes), the processing transitions to step S3, the section is determined to be a linear section, and an AR route is formed. In the example in
(52) Meanwhile, in a case where a negative result is obtained in step S2 (step S2: No), the processing transitions to step S4, and the section is divided using the node with the greatest distance h as a dividing point. That is, in the example in
(53) In this manner, by recursively repeating the processing of steps S1-S2-S4-S1 until a positive result is obtained in step S2, AR route former 111 divides the section until no node remains whose distance is greater than the threshold. Then, when no such node exists, the section is determined to be a linear section, and the processing in step S3 is performed to form a linear AR route within the linear section.
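The dividing processing of steps S1, S2 and S4 is essentially a recursive polyline simplification (it closely resembles the well-known Ramer-Douglas-Peucker algorithm). A minimal Python sketch, with hypothetical function names and plain (x, y) tuples standing in for node coordinates:

```python
import math

def point_line_distance(p, a, b):
    # Perpendicular distance from node p to the line through a and b (step S1).
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg = math.hypot(dx, dy)
    if seg == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - (ax - px) * dy) / seg

def simplify_section(nodes, threshold):
    """Recursive sketch of steps S1-S2-S4: keep only nodes that deviate
    more than `threshold` from the start-to-end line of their section."""
    if len(nodes) <= 2:
        return list(nodes)
    # S1: distances from intermediate nodes to line L0 (start node - terminal node).
    dists = [point_line_distance(n, nodes[0], nodes[-1]) for n in nodes[1:-1]]
    max_d = max(dists)
    # S2/S3: all distances within the threshold -> linear section, single line.
    if max_d <= threshold:
        return [nodes[0], nodes[-1]]
    # S4: divide at the node with the greatest distance, then recurse on both halves.
    i = dists.index(max_d) + 1
    left = simplify_section(nodes[:i + 1], threshold)
    right = simplify_section(nodes[i:], threshold)
    return left[:-1] + right
```

For instance, five nearly collinear nodes collapse to the single line from the start node to the terminal node, while a node that bends the route by more than the threshold is retained as a dividing point.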
(54) <2-2> Curve Interpolation
(55) AR route former 111 in the present embodiment performs curve interpolation on the route section which is determined to be a non-linear section.
(56) For example, curve interpolation as illustrated in
(57) At this time, AR route former 111 performs curve interpolation using nodes included in the section as control points. In the example in
(58) Here, the nodes are not always arranged so as to make a clean curve, and a distorted curve shape may result if curve interpolation is performed so that the curve passes through all nodes. In view of this, in the present embodiment, curve L10 without distortion is formed by performing interpolation using a B-spline curve. However, the curve interpolation is not limited to interpolation using a B-spline curve.
(59) Note that, while a case has been described here where curve interpolation is performed on a section which is not a linear section after the line correction as described in section <2-1> has been performed, the present invention is not limited to this, and, in short, it is only necessary to perform curve interpolation on the route section which is determined to be a non-linear section using nodes included in the section as the control points and output a curve subjected to the curve interpolation as the AR route.
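As one concrete possibility, a uniform cubic B-spline can be evaluated directly from its basis functions, with the section's nodes as control points. The embodiment only states that a B-spline curve is used, so the sampling scheme and function name below are assumptions; the key property is that the curve is smoothed by the control points without being forced through every node.

```python
def cubic_bspline_points(control, samples_per_span=8):
    """Sample a uniform cubic B-spline whose control points are the
    section's nodes. The curve need not pass through every node, which
    avoids the distortion caused by noisy node placement."""
    pts = []
    n = len(control)
    for i in range(n - 3):                 # one span per 4 consecutive control points
        p0, p1, p2, p3 = control[i:i + 4]
        for s in range(samples_per_span + 1):
            t = s / samples_per_span
            # Uniform cubic B-spline basis functions (they sum to 1).
            b0 = (1 - t) ** 3 / 6
            b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6
            b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6
            b3 = t ** 3 / 6
            x = b0 * p0[0] + b1 * p1[0] + b2 * p2[0] + b3 * p3[0]
            y = b0 * p0[1] + b1 * p1[1] + b2 * p2[1] + b3 * p3[1]
            pts.append((x, y))
    return pts
```

Because the basis functions form a partition of unity, control points that lie on a straight line produce sampled points on that same line, so the interpolation does not introduce bends where the nodes are already collinear.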
(61) <2-3> Processing of Shifting AR Route on the Basis of Lane Information
(62) As described above, the AR route is created on the basis of the nodes and the links included in the road map data. However, as described in the Technical Problem section, because the coordinate information included in the nodes and the links is often the coordinates of the centers of roads, if the AR route is formed using this information as it is, an AR route that provides a feeling of strangeness may be displayed. Such an AR route is particularly likely to be displayed at positions such as intersections and branch points, where a plurality of roads intersect.
(63) In view of this, in the present embodiment, the AR route is formed by shifting the coordinates of the nodes included in the road map data onto the lane on which the subject vehicle is to travel, on the basis of the lane information. This makes it possible to display an AR route, shifted to the side of the lane on which the subject vehicle travels, that does not provide a feeling of strangeness. Note that, because the AR route is formed by connecting the nodes, shifting the node coordinates is equivalent to shifting the AR route. Therefore, in the following description, shifting the node coordinates can be read as shifting the AR route and, conversely, shifting the AR route can be read as shifting the node coordinates.
(68) Here, a specific example of shifting of the AR route according to the present embodiment will be described.
(69) In a case where the road has three lanes each way and the lane on which the subject vehicle is to travel is the leftmost lane, the node positions are shifted onto the lane of the subject vehicle by "lane width (for example, 3.25 m) × (the number of lanes (in this example, 3) − 0.5)". This processing addresses the situation where the nodes are set on the center line.
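The shift amount in this example can be written as a small helper. The function name, and the generalization to an arbitrary lane counted outward from the center line, are assumptions; only the three-lane, leftmost-lane case is given in the text.

```python
DEFAULT_LANE_WIDTH_M = 3.25  # example lane width from the text

def shift_to_lane_m(lane_from_center, lane_width=DEFAULT_LANE_WIDTH_M):
    """Lateral offset (metres) from a node set on the road's center line to
    the centre of the target lane, where lane_from_center = 1 is the lane
    adjacent to the center line. For the text's example (three lanes each
    way, subject vehicle in the leftmost lane), lane_from_center = 3 and
    the shift is 3.25 * (3 - 0.5) = 8.125 m."""
    return lane_width * (lane_from_center - 0.5)
```

Applying this offset perpendicular to the link direction moves each node from the road center onto the center of the subject vehicle's lane before the AR route is formed.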
(70) In this manner, in the AR route shifting processing of the present embodiment, a node set at the center of the road is shifted onto the lane on which the subject vehicle is to travel. In the present embodiment, the AR route is shifted to the center of the lane on which the subject vehicle is to travel.
(71) In the AR route shifting processing in the present embodiment, in particular, because the AR route after the subject vehicle turns left or right, or the AR route the subject vehicle is to follow after a branch point, is shifted on the basis of the lane information of the road after the turn or the lane information of the road followed after the branch point, it is possible to reduce the feeling of strangeness of the AR route in those situations.
(72) Next, effects of the AR route shifting processing according to the present embodiment will be described using
(74) In this manner, by performing the AR route shifting processing of the present embodiment, even if the node position at an intersection is displaced to the left or right of an extension of the traveling lane, it is possible to display the AR route along the lane on which the subject vehicle is to travel without providing a feeling of strangeness.
<3> Conclusion
(75) As described above, according to the present embodiment, as described in section <2-3>, because the AR route is formed by shifting the node information included in the road map data onto the lane on which the subject vehicle is to travel on the basis of the lane information, it is possible to display an AR route which matches the shape of the route on which the subject vehicle is to travel without providing a feeling of strangeness, while resolving the inconvenience that the AR route is largely displaced from that route at positions such as intersections and branch points, where a plurality of roads intersect.
(76) The above-described embodiment is merely an example of an embodiment for implementing the present invention, and the technical scope of the present invention should not be interpreted restrictively on the basis of this embodiment. That is, the present invention can be implemented in various forms within a range not deviating from the gist or main features of the present invention.
(77) While, in the above-described embodiment, a case has been described where the display apparatus of the present disclosure is applied to an in-vehicle HUD, the present disclosure is not limited to this; in short, the display apparatus of the present disclosure can be widely applied to display systems and apparatuses which display the AR route, which is a virtual image, so as to be superimposed on a real image seen by the user.
(78) While various embodiments have been described herein above, it is to be appreciated that various changes in form and detail may be made without departing from the spirit and scope of the invention(s) presently or hereafter claimed.
INCORPORATION BY REFERENCE
(79) This application is entitled to and claims the benefit of Japanese Patent Application No. 2019-061464, filed on Mar. 27, 2019, the disclosure of which, including the specification, drawings and abstract, is incorporated herein by reference in its entirety.
INDUSTRIAL APPLICABILITY
(80) The display system of the present invention is suitable for, for example, a system including an in-vehicle HUD.
REFERENCE SIGNS LIST
(81) 100 Display apparatus 101 Map information acquirer 102 Position detector 103 Radar 104 Vehicle behavior detector 105 Viewpoint detector 110 Image former 111 AR route former 120 Display controller 130 HUD (Head-Up Display) 200 Vehicle 210 Windshield 220 Dashboard N1, N2, N3, N4, N5 Node h2, h3, h4 Distance L0, L1, L2 Line L10 Curve Vi Virtual image