DISPLAY CONTROL DEVICE, DISPLAY METHOD, AND STORAGE MEDIUM STORING DISPLAY PROGRAM
20220410711 · 2022-12-29
CPC classification
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A display control device includes a processor. The processor is configured to acquire a travel route of a vehicle, in a case in which the vehicle, traveling on a travel path, changes a travel direction to another travel path, set a display position of a guidance display that guides the travel direction with respect to a display region of a display unit, on the basis of a positional relationship between the display region and the other travel path, and display the guidance display, the display position of which has been set relative to a view ahead of the vehicle, so as to be superimposed on the display region.
Claims
1. A display control device, comprising: a processor, wherein the processor is configured to: acquire a travel route of a vehicle; in a case in which the vehicle, traveling on a travel path, changes a travel direction to another travel path, set a display position of a guidance display that guides the travel direction with respect to a display region of a display unit, on the basis of a positional relationship between the display region and the other travel path; and display the guidance display, the display position of which has been set relative to a view ahead of the vehicle, so as to be superimposed on the display region.
2. The display control device of claim 1, wherein the processor sets the display position of the guidance display at a side of the other travel path in the display region in a case in which the other travel path exists outside the display region, and sets the display position of the guidance display at a side of the travel path in the display region in a case in which the other travel path exists inside the display region.
3. The display control device of claim 1, wherein the processor sets the display position of the guidance display at a side of the other travel path in the display region in a case in which the other travel path is not visible, and sets the display position of the guidance display at a side of the travel path in the display region in a case in which the other travel path is visible.
4. The display control device of claim 1, wherein, in a case in which the other travel path exists at a side of a travel lane in a case in which the travel path has the travel lane and an oncoming lane, the processor sets the display position of the guidance display at a near side of an intersection between the travel path and the other travel path and, in a case in which the other travel path exists at a side of the oncoming lane, the processor sets the display position of the guidance display at a far side of the intersection.
5. The display control device of claim 1, wherein the processor displays the guidance display at the display region while changing the guidance display by animation.
6. The display control device of claim 1, wherein the view ahead is an image captured by an imaging unit installed in the vehicle, and the processor displays the guidance display at the display region so as to be superimposed on the captured image.
7. A display method, by which a computer executes processing comprising: acquiring a travel route of a vehicle; in a case in which the vehicle, traveling on a travel path, changes a travel direction to another travel path, setting a display position of a guidance display, which guides the travel direction with respect to a display region of a display unit, on the basis of a positional relationship between the display region and the other travel path; and displaying the guidance display, the display position of which has been set relative to a view ahead of the vehicle, so as to be superimposed on the display region.
8. A non-transitory storage medium storing a display program executable by a computer to perform processing, the processing comprising: acquiring a travel route of a vehicle; in a case in which the vehicle, traveling on a travel path, changes a travel direction to another travel path, setting a display position of a guidance display, which guides the travel direction with respect to a display region of a display unit, on the basis of a positional relationship between the display region and the other travel path; and displaying the guidance display, the display position of which has been set relative to a view ahead of the vehicle, so as to be superimposed on the display region.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] Exemplary embodiments of the present disclosure will be described in detail based on the following figures.
DETAILED DESCRIPTION
[0032] A display device 10 that is an exemplary embodiment of the present disclosure is described hereinafter with reference to the drawings.
[0034] The car navigation ECU 22A controls the car navigation system. A GPS device 30 and a center display 32 are connected to the car navigation ECU 22A. The GPS device 30 is a device that measures the current position of the vehicle 12. The GPS device 30 includes an antenna that receives signals from GPS satellites. Note that the GPS device 30 may be directly connected to the display control device 20. The center display 32 is a liquid crystal display that displays a map and a mark indicating the current position of the vehicle 12. The center display 32 is provided at a vehicle transverse direction central portion of the dashboard 14.
[0035] The ADAS-ECU 22B performs overall control of the advanced driver assistance system. An external camera 40 that captures images of at least the region in front of the vehicle 12 is connected to the ADAS-ECU 22B. Further, actuators that drive the brake and the accelerator respectively are connected to the ADAS-ECU 22B. Note that the external camera 40 may be directly connected to the display control device 20.
[0036] The body ECU 22C controls the wipers and the lights. The body ECU 22C is connected to, for example, the switches for the headlights, the turn signals, and the like.
[0037] The head-up display 24 is structured to include the projection device 26 and an actuator 28. The actuator 28 is a driving device for adjusting the angle of a mirror that reflects an image projected from the projection device 26, and the distance between the mirror and the projection device 26.
[0038] The display control device 20 is structured to include a CPU (Central Processing Unit) 20A, a ROM (Read Only Memory) 20B, a RAM (Random Access Memory) 20C, a storage 20D, a communication I/F (interface) 20E, and an input/output I/F 20F. The CPU 20A, the ROM 20B, the RAM 20C, the storage 20D, the communication I/F 20E, and the input/output I/F 20F are connected so as to be able to communicate with one another via an internal bus 20G.
[0039] The CPU 20A is a central processing unit, and executes various programs and controls various sections. Namely, the CPU 20A reads-out programs from the ROM 20B, and executes the programs by using the RAM 20C as a workspace.
[0040] The ROM 20B stores various programs and various data. In the present exemplary embodiment, the ROM 20B stores a processing program 100 and image data 110.
[0041] The processing program 100 that serves as a display program is a program for performing superimposition processing that is described later. The image data 110 is data that stores animation images to be displayed on the head-up display 24.
[0042] The RAM 20C serves as a workspace, and temporarily stores programs and data.
[0043] The storage 20D is structured by an HDD (Hard Disk Drive) or an SSD (Solid State Drive), and stores various programs and various data.
[0044] The communication I/F 20E is an interface for connection with the respective ECUs 22. This interface employs a communication protocol based on the CAN protocol. The communication I/F 20E is connected to the respective ECUs 22 via an external bus 20H.
[0045] The input/output I/F 20F is an interface for communicating with the projection device 26 and the actuator 28 that structure the head-up display 24.
[0046] In the display control device 20, the CPU 20A executes the processing program 100 and thereby functions as an acquisition unit 200, a determination unit 210, a setting unit 220, and a control unit 230.
[0047] The acquisition unit 200 acquires information relating to a travel route of the vehicle 12, and image information relating to images captured by the external camera 40. Information relating to the travel route is acquired from the car navigation ECU 22A. Image information relating to the images captured by the external camera 40 is acquired from the ADAS-ECU 22B.
[0048] Based on the travel route, the determination unit 210 determines whether or not it is necessary to carry out a course change from a first travel path T1, on which the vehicle 12 is traveling, to an arbitrary second travel path T2, or, in other words, whether or not it is necessary to perform a left or a right turn. Here, the first travel path T1 is the road on which the vehicle 12 is currently traveling and is an example of the travel path, and the second travel path T2 is the road onto which the vehicle 12 will change its travel direction along the travel route, and is an example of the other travel path.
[0049] Further, the determination unit 210 determines whether the second travel path T2, onto which the travel direction will be changed from the first travel path T1, exists inside or outside the projection surface 16A serving as the display region. This determination is made based on whether or not a main portion of the second travel path T2 is included in the foreground of the viewpoint of the driver on the projection surface 16A. The main portion includes, for example, the range from the intersection C between the first travel path T1 and the second travel path T2 until the course change has been completed (e.g., until the turn signal turns off), a range of a predetermined distance, and the like.
[0050] The determination unit 210 may determine whether or not the main portion of the second travel path T2 is included in the foreground of the viewpoint of the driver based on a sightline camera of the driver, or based on three-dimensional coordinates in a virtual space that is described hereinafter.
[0051] Moreover, in a case in which the projection surface 16A is large with respect to the front window 16, there is the possibility that the second travel path T2 will always be included in the foreground at the driver's viewpoint from far ahead of the course change. In this case, a predetermined determination region may be provided for the projection surface 16A, and the determination unit 210 may determine whether or not the second travel path T2 is included in the determination region.
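The inside/outside determination described above can be sketched as follows. This is an illustrative Python sketch only, not the device's actual implementation: the `Rect` region, the screen-space sample points of the second travel path's main portion, the `margin` (standing in for the predetermined determination region), and the `required_ratio` threshold are all assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned display region (e.g., the projection surface 16A) in screen coordinates."""
    left: float
    top: float
    right: float
    bottom: float

    def shrunk(self, margin: float) -> "Rect":
        # Optional inner "determination region" for a large projection surface.
        return Rect(self.left + margin, self.top + margin,
                    self.right - margin, self.bottom - margin)

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def path_inside_region(main_portion_px, region: Rect, margin: float = 0.0,
                       required_ratio: float = 0.5) -> bool:
    """Return True if the main portion of the other travel path lies inside the region.

    main_portion_px: screen-space (x, y) sample points of the second travel path's
    main portion, e.g. obtained by projecting its 3-D coordinates into the driver's
    view. The path counts as "inside" when at least `required_ratio` of the samples
    fall within the (optionally shrunk) determination region.
    """
    check = region.shrunk(margin) if margin > 0.0 else region
    inside = sum(1 for x, y in main_portion_px if check.contains(x, y))
    return inside >= required_ratio * len(main_portion_px)
```

With a determination-region margin, points hugging the edge of the display region are treated as outside, which addresses the case in which the projection surface is large relative to the front window.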
[0052] The setting unit 220 has the function of setting the display position of a mark M, which serves as the guidance display that guides the travel direction, with respect to the projection surface 16A.
[0053] In a case in which the second travel path T2 exists outside the projection surface 16A, the setting unit 220 sets the display position of the mark M at the second travel path T2 side of the projection surface 16A. In a case in which the display position of the mark M is set at the second travel path T2 side of the projection surface 16A, the setting unit 220 performs the following setting.
[0054] In a case in which the first travel path T1 includes a travel lane DL and an oncoming lane OL, and the second travel path T2 exists at the travel lane DL side, the setting unit 220 sets the display position of the mark M at the near side of the intersection C between the first travel path T1 and the second travel path T2.
[0055] Moreover, in a case in which the second travel path T2 exists at the oncoming lane OL side, the setting unit 220 sets the display position of the mark M at the far side of the intersection C.
[0056] Moreover, in a case in which the second travel path T2 exists inside the projection surface 16A, the setting unit 220 sets the display position of the mark M at the first travel path T1 side of the projection surface 16A.
[0057] The control unit 230 displays the mark M, whose display position has been set, on the projection surface 16A so as to be superimposed on the view ahead of the vehicle 12. This mark M is stored in the image data 110 as, for example, an animated video.
[0058] (Flow of Control)
[0059] The flows of the superimposition processing and the coordinate setting processing executed by the display control device 20 of the present exemplary embodiment are described hereinafter.
[0060] In step S100, the CPU 20A acquires information relating to the travel route of the vehicle 12.
[0061] In step S101, the CPU 20A acquires image information relating to the images captured by the external camera 40.
[0062] In step S102, the CPU 20A determines whether or not there is a course change. More specifically, the CPU 20A determines whether or not there is a course change from the first travel path T1 to the second travel path T2 at the intersection C. In a case in which the CPU 20A determines that there is a course change (in a case in which step S102 is YES), processing proceeds to step S103. On the other hand, in a case in which the CPU 20A determines that there is no course change (in a case in which step S102 is NO), processing proceeds to step S104.
[0063] In step S103, the CPU 20A executes coordinate setting processing. Details are described later.
[0064] In step S104, the CPU 20A executes usual display. Namely, the CPU 20A causes the usual navigation image or the like to be displayed on the projection surface 16A. Then, processing returns to step S100.
[0065] In step S105, the CPU 20A displays the mark M on the projection surface 16A. This mark M is displayed by animated video.
[0066] In step S106, the CPU 20A determines whether or not the course change has been completed. In a case in which the CPU 20A determines that the course change has been completed (in a case in which step S106 is YES), processing proceeds to step S107. On the other hand, in a case in which the CPU 20A determines that the course change has not been completed (in a case in which step S106 is NO), processing returns to step S100.
[0067] In step S107, the CPU 20A ends the display of the mark M. Then, processing returns to step S100.
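The overall flow of steps S100 through S107 can be sketched as a single pass of a control loop. The following Python sketch is illustrative only: every callable passed in (`course_change_pending`, `set_mark_position`, and so on) is a hypothetical stand-in for the corresponding unit of the display control device 20, and the returned labels exist only so that the branch taken can be observed.

```python
def superimposition_step(route, image, course_change_pending, set_mark_position,
                         show_mark, show_usual_display, course_change_completed):
    """One pass of the superimposition processing (steps S100-S107), as a sketch.

    Hypothetical stand-ins for the device's units:
      course_change_pending(route) -> bool        (step S102)
      set_mark_position(route, image) -> position (step S103, coordinate setting)
      show_usual_display() / show_mark(position)  (steps S104 / S105)
      course_change_completed() -> bool           (step S106)
    Returns a short label for the branch taken, for illustration only.
    """
    # Steps S100-S101: the travel route and the captured image are assumed
    # already acquired and passed in as `route` and `image`.
    if not course_change_pending(route):        # step S102: NO
        show_usual_display()                    # step S104: usual navigation display
        return "usual"
    position = set_mark_position(route, image)  # step S103: coordinate setting
    show_mark(position)                         # step S105: animated mark M
    if course_change_completed():               # step S106: YES
        return "mark_ended"                     # step S107: end the mark display
    return "mark_shown"                         # step S106: NO, loop back to S100
```

In the embodiment this pass would repeat continuously; the single-step form above simply makes each flowchart branch explicit.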
[0068] Next, the flow of the coordinate setting processing of step S103 will be described.
[0069] In step S200, the CPU 20A determines whether or not the second travel path T2 exists outside the projection surface 16A. In a case in which the CPU 20A determines that the second travel path T2 exists outside the projection surface 16A (in a case in which step S200 is YES), processing proceeds to step S201. On the other hand, in a case in which the CPU 20A determines that the second travel path T2 exists inside the projection surface 16A (in a case in which step S200 is NO), processing proceeds to step S204.
[0070] In step S201, the CPU 20A determines whether or not the course change is a left turn. In a case in which the CPU 20A determines that the course change is a left turn (in a case in which step S201 is YES), processing proceeds to step S202. On the other hand, in a case in which the CPU 20A determines that the course change is not a left turn, i.e., is a right turn (in a case in which step S201 is NO), processing proceeds to step S203.
[0071] In step S202, the CPU 20A sets the mark M at the near left corner of the intersection C. Then, the CPU 20A ends the coordinate setting processing, and returns to the superimposition processing.
[0072] In step S203, the CPU 20A sets the mark M at the far right corner of the intersection C. Then, the CPU 20A ends the coordinate setting processing, and returns to the superimposition processing.
[0073] In step S204, the CPU 20A sets the mark M at the side of the first travel path T1 on which the vehicle is currently traveling. Then, the CPU 20A ends the coordinate setting processing, and returns to the superimposition processing.
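Steps S200 through S204 amount to a two-level branch, which can be sketched as follows. This Python sketch assumes the left-hand-traffic layout of the present exemplary embodiment (a left turn enters a road on the travel lane DL side, a right turn crosses the oncoming lane OL); the boolean inputs and the returned position labels are illustrative assumptions, not part of the disclosure.

```python
def set_mark_coordinates(path_outside_region: bool, turn_is_left: bool) -> str:
    """Coordinate setting processing (steps S200-S204), sketched with assumed inputs.

    path_outside_region: result of step S200 (True when the second travel path T2
    lies outside the projection surface 16A; in the alternative embodiment, when
    T2 is not viewable).
    turn_is_left: result of step S201 (under left-hand traffic, a left turn is a
    turn toward the travel lane DL side, a right turn toward the oncoming lane OL).
    Returns a label describing the display position of the mark M.
    """
    if not path_outside_region:
        return "near travel path T1"                  # step S204: T2 inside the region
    if turn_is_left:
        return "near-left corner of intersection C"   # step S202: travel-lane side
    return "far-right corner of intersection C"       # step S203: oncoming-lane side
```

The same branch structure serves the alternative embodiment by feeding the viewability determination into `path_outside_region`.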
SUMMARY
[0074] The display control device 20 of the present exemplary embodiment is structured to display a mark M, which is a guidance display that guides the travel direction, on the projection surface 16A of the head-up display 24 so as to be superimposed on the view ahead of the vehicle 12.
[0075] In the present exemplary embodiment, in a case in which the second travel path T2 exists outside the projection surface 16A, the CPU 20A sets the display position of the mark M at the second travel path T2 side of the projection surface 16A.
[0076] Moreover, in the display control device 20 of the present exemplary embodiment, in a case in which the second travel path T2 exists at the travel lane DL side, the CPU 20A sets the display position of the mark M at the near side of the intersection C. For example, in a case in which the left side is the travel lane DL, the mark M is set at the near left side of the intersection C.
[0077] Moreover, in accordance with the present exemplary embodiment, the animated display of the mark M can reduce the sense of incongruity with respect to changes in display of the mark M during travel.
Other Exemplary Embodiments
[0078] In the above exemplary embodiment, the determination unit 210 determines whether the second travel path T2, onto which the travel direction will be changed from the first travel path T1, exists inside or outside the projection surface 16A that serves as the display region, and the setting unit 220 sets the display position of the mark M on the basis of this determination. However, the present disclosure is not limited to this, and, as another exemplary embodiment, there may be a structure in which the determination unit 210 determines whether or not the second travel path T2, onto which the travel direction is changed from the first travel path T1, is viewable, and the setting unit 220 sets the display position of the mark M on the basis of this determination.
[0079] Note that “is viewable” refers to a state in which it is possible to confirm the traffic situation, road surface situation, obstacles, and the like of the second travel path T2, which the vehicle 12 will be traveling on from here on, from the first travel path T1 that is currently being traveled. Accordingly, “is not viewable” means a state in which the traffic situation, road surface situation, obstacles, and the like of the second travel path T2 cannot be confirmed.
[0080] This other exemplary embodiment exhibits operation and effects similar to those of the above-described exemplary embodiment, except that the processing carried out in step S200 determines whether or not the second travel path T2 is viewable.
[Notes]
[0081] Although the present exemplary embodiment is structured such that the mark M is displayed so as to be superimposed on the view ahead of the vehicle 12 viewed through the front window 16, the present disclosure is not limited to this, and the mark M may be displayed so as to be superimposed on a captured image of the view ahead of the vehicle 12. For example, in a case in which images captured by the external camera 40 that structures the imaging unit can be displayed on the center display 32, the mark M can be displayed so as to be superimposed on the captured images of the view ahead of the vehicle 12. In this case as well, it is possible to guide the course change while reducing annoyance for the driver who views the center display 32.
[0082] Alternatively, the mark M may be displayed in a superimposed manner in a wearable device such as augmented reality (AR) glasses or a head-mounted display. In this case, the driver wearing the wearable device can ascertain the mark M that is superimposed on the view ahead of the vehicle 12 that is being viewed through a screen or the like in front of the eyes. In this case as well, it is possible to guide the course change while reducing annoyance for the driver.
[0083] Note that the various processing executed by the CPU 20A reading and executing software (a program) in the above exemplary embodiments may be executed by various types of processors other than a CPU. Examples of processors in this case are Programmable Logic Devices (PLDs) whose circuit structure can be modified after production, such as Field-Programmable Gate Arrays (FPGAs), and dedicated electric circuits that are processors having circuit structures designed for the sole purpose of executing specific processing, such as Application Specific Integrated Circuits (ASICs). The above-described processing may be executed by any one of these various types of processors, or by a combination of two or more processors of the same type or different types (e.g., plural FPGAs, or a combination of a CPU and an FPGA). Further, the hardware structures of these various types of processors are, more specifically, electric circuits that combine circuit elements such as semiconductor elements.
[0084] The above exemplary embodiments describe aspects in which the respective programs are stored in advance (installed) on a non-transitory storage medium readable by a computer. For example, the processing program 100 in the display control device 20 is stored in advance in the ROM 20B. However, the present disclosure is not limited to this, and the respective programs may be provided in the form of being stored on a non-transitory storage medium such as a CD-ROM (Compact Disc Read Only Memory), a DVD-ROM (Digital Versatile Disc Read Only Memory), a USB (Universal Serial Bus) memory, or the like. Alternatively, the programs may be provided in a form of being downloadable from an external device over a network.
[0085] The processing in the above exemplary embodiments is not limited to being executed by a single processor, and may be executed by plural processors in cooperation with one another. The flows of processing described in the above exemplary embodiments are also examples, and unnecessary steps may be deleted therefrom, new steps may be added thereto, or the order of the processing may be rearranged, within a scope that does not depart from the gist of the present disclosure.