Vehicle control system
11029689 · 2021-06-08
Inventors
- Kensei HATA (Sunto-gun, JP)
- Hiroaki Kodera (Susono, JP)
- Takahito Endo (Sunto-gun, JP)
- Naoki Harayama (Sunto, JP)
- Katsuya Iwazaki (Susono, JP)
- Yushi Seki (Susono, JP)
- Hideaki Komada (Gotemba, JP)
CPC classification
B60Q2200/30
PERFORMING OPERATIONS; TRANSPORTING
B60Q1/085
PERFORMING OPERATIONS; TRANSPORTING
B60Q2300/20
PERFORMING OPERATIONS; TRANSPORTING
Y02B20/40
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G08G1/166
PHYSICS
G01S17/86
PHYSICS
G01S2013/9316
PHYSICS
B60Q2300/45
PERFORMING OPERATIONS; TRANSPORTING
H05B47/115
ELECTRICITY
Abstract
A vehicle control system configured to maintain detection accuracy of an external sensor such as an on-board camera. The vehicle control system is applied to a vehicle that can be operated autonomously. A controller communicates with a database stored on the controller and a database stored in an external facility. The controller changes the orientation of a headlamp toward an object detected by an on-board camera when the headlamp is turned on and information about the detected object is not available in either database.
Claims
1. A vehicle control system for use with a vehicle having: a prime mover; a brake device that applies a braking force to a wheel; a steering system that turns the wheel; a lighting device that emits a light; an external sensor that detects external conditions; and a controller that controls the prime mover, the brake device, and the steering system based on information about the external conditions detected by the external sensor, so as to operate the vehicle autonomously without requiring a manual operation, the vehicle control system comprising: a network interface that communicates with: (a) a first database stored in the vehicle, and (b) a second database stored in an external facility, both the first and second databases containing a map database; and the controller being configured to: detect whether an object having road information is present in a vicinity of the vehicle via the external sensor, determine whether information about the road information of the detected object is not available in the map database of both the first database of the vehicle and the second database of the external facility via the network interface, change the orientation of the lighting device toward the detected object when: (i) the lighting device is turned on, and (ii) the information about the road information of the detected object is not available in the map database of both the first database of the vehicle and the second database of the external facility, and further change the orientation of the lighting device toward the detected object in a case when a detection accuracy of the external sensor is reduced as a result of orienting the lighting device toward the detected object, including: determining that the detection accuracy of the external sensor is reduced when the detected object is a light emitting object or a reflection object, further changing the orientation of the lighting device such that light is emitted to only a portion of the detected object that does not include a light emitting portion when the detected object corresponds to the light emitting object, and further changing the orientation of the lighting device such that light is emitted to only a portion of the detected object that does not include a reflecting portion when the detected object corresponds to the reflection object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Features, aspects, and advantages of exemplary embodiments of the present invention will become better understood with reference to the following description and accompanying drawings, which should not limit the invention in any way.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
(6) Embodiments of the present disclosure will now be explained with reference to the accompanying drawings. The control system according to at least one embodiment of the present disclosure may be applied to a hybrid vehicle powered by an engine and a motor(s), or to an electric vehicle powered by the motor(s). In vehicles of these kinds, electric power may be supplied to the motor not only from a battery but also from a fuel cell. In addition, the control system may also be applied to a conventional vehicle powered only by an engine.
(7) Referring now to
(8) One end of an input shaft 9 is connected to the output member 7 to be rotated integrally therewith, and the other end of the input shaft 9 is connected to a single-pinion planetary gear unit 10. The planetary gear unit 10 comprises a sun gear 11 fitted onto the input shaft 9, a ring gear 12 arranged concentrically with the sun gear 11, a plurality of pinion gears 13 interposed between the sun gear 11 and the ring gear 12, and a carrier 14 supporting the pinion gears 13 while allowing them to revolve around the sun gear 11.
(9) A first cylindrical shaft 15 extends from the sun gear 11 on the input shaft 9 toward the engine 1 to be connected to the first motor 2. For example, a permanent magnet type synchronous motor having a generating function may be used as the first motor 2. In the first motor 2, a rotor 2a is connected to the first cylindrical shaft 15 of the sun gear 11 to be rotated integrally therewith, and a stator 2b is fixed to a stationary member 16 such as a housing.
(10) A second cylindrical shaft 17 extends from the ring gear 12 toward the second motor 3, and a rotor 3a of the second motor 3 is connected to the second cylindrical shaft 17 to be rotated integrally therewith. A stator 3b of the second motor 3 is fixed to the stationary member 16 such as a housing.
(11) A leading end of the second cylindrical shaft 17 is connected to an output shaft 18 to be rotated integrally therewith, and a parking gear 19 as an external gear is fitted onto the output shaft 18 to be rotated integrally therewith. A parking lock mechanism 20 is arranged outside of the parking gear 19. The parking lock mechanism 20 comprises a parking pawl and a parking actuator (neither of which is shown). The parking actuator selectively brings the parking pawl into engagement with the parking gear 19, thereby locking the output shaft 18. The engagement between the parking pawl and the parking gear 19 may be maintained even after shutting down the battery serving as a power source 21.
(12) A leading end of the output shaft 18 is connected to a differential gear unit 22, and the differential gear unit 22 is connected to a pair of drive wheels 24 through drive shafts 23 extending laterally. The drive wheels 24 are turned by a steering system 25. Rotations of the drive wheels 24 and another pair of wheels 26 are individually stopped by a brake 27.
(13) An operating mode of the vehicle Ve may be selected from a hybrid mode (to be abbreviated as the “HV mode” hereinafter) in which the vehicle Ve is powered at least by the engine 1, and an electric vehicle mode (to be abbreviated as the “EV mode” hereinafter) in which the vehicle Ve is powered by at least one of the first motor 2 and the second motor 3. Specifically, in the HV mode, the engine 1 generates power in accordance with a required drive force calculated by a controller (i.e., ECU) 28, and the first motor 2 generates reaction torque in such a manner as to deliver the output power of the engine 1 to the drive wheels 24 through the planetary gear unit 10. In this situation, electric power generated by the first motor 2 may be supplied to the second motor 3 so that an output torque of the second motor 3 may be applied to the second cylindrical shaft 17. That is, the output power of the engine 1 may be partially translated into electric power by the first motor 2, and then translated into kinetic energy again by the second motor 3 to be applied to a torque transmission route between the engine 1 and the drive wheels 24. By contrast, when the first motor 2 serves as a motor while establishing the reaction torque, output torque of the first motor 2 applied to the transmission route may be translated into electric power by the second motor 3, thereby reducing the power transmitted through the transmission route.
(14) In the EV mode, the second motor 3 is operated as a motor in such a manner as to achieve a required drive force calculated by the controller 28. In this situation, fuel supply to the engine 1 and power supply to the first motor 2 may be stopped.
(15) As shown in
(16) A configuration of the controller 28 is shown in
(17) In order to selectively connect and disconnect the drive controller 37 to/from the battery 21 depending on an operating condition of the switch button or key for energizing the relay switch 35, a main switch 39 is arranged between the battery 21 and the drive controller 37. For example, when the switch button is pressed, the main switch 39 is turned on, and then, if the switch button is pressed for a predetermined period of time, the relay switch 35 is turned on. The main switch 39 is controlled by the main controller 36 to automatically allow and interrupt electric power supply to the drive controller 37.
(18) The main controller 36 is an electronic control unit composed mainly of a microcomputer. To the main controller 36, detection signals and information about operating conditions and behaviors of constituent elements of the vehicle Ve are transmitted from an internal sensor 40. Specifically, the internal sensor 40 includes an accelerator sensor 42 for detecting a position of an accelerator pedal 41, a brake sensor (or switch) 44 for detecting a depression of a brake pedal 43, a steering sensor 46 for detecting a steering angle of the steering wheel 45, a vehicle speed sensor 47 for detecting rotational speeds of the wheels 24 and 26, a longitudinal acceleration sensor 48 for detecting a longitudinal acceleration of the vehicle Ve, a lateral acceleration sensor 49 for detecting a lateral acceleration of the vehicle Ve, a yaw rate sensor 50 for detecting a yaw rate of the vehicle, a shift sensor 52 for detecting a position of a shift lever (or switch) 51 and so on. The main controller 36 transmits command signals for controlling the engine 1, the first motor 2 and the second motor 3 to the drive controller 37, and transmits command signals for controlling the brake 27 and so on to the sub-controller 38 based on incident signals from the internal sensor 40 as well as maps and formulas installed in advance. In
(19) The control system according to the embodiments of the present disclosure is configured to operate the vehicle Ve autonomously. Specifically, the control system is configured to execute a starting operation, an accelerating operation, a steering operation, a braking operation, a stopping operation, and so on of the vehicle Ve completely autonomously at level 4 defined by the NHTSA (National Highway Traffic Safety Administration) or level 4 or 5 defined by the SAE (Society of Automotive Engineers), while recognizing and observing an external condition and a travelling condition. For this reason, the vehicle Ve may be operated not only autonomously with or without a driver (and a passenger) but also manually by the driver. The control system may also be configured to operate the vehicle Ve at level 3, at which an accelerating operation, a steering operation, a braking operation, etc. are executed autonomously only in an allowable condition, and the driver has to manipulate the vehicle Ve upon request from the system.
(20) As described, the vehicle Ve is operated autonomously while manipulating the engine 1, the first motor 2, the second motor 3, the brake 27, and so on by the controller 28. In addition, the steering system 25, the parking lock mechanism 20 and so on are also controlled by the controller 28.
(21) In order to operate the vehicle Ve autonomously, detection signals from external sensors 53 for detecting external conditions are also sent to the main controller 36. For example, the external sensor 53 includes at least one of an on-board camera, a RADAR (i.e., radio detection and ranging), a LIDAR (i.e., laser imaging detection and ranging), an ultrasonic sensor and so on. Data detected by the external sensor 53 may be utilized in an inter-vehicle communication.
(22) Specifically, the on-board camera is arranged inside of a windshield glass, and transmits recorded information about the external condition to the main controller 36. To this end, not only a monocular camera but also a stereo camera having a plurality of lenses and image sensors to achieve a binocular vision may be used as the on-board camera. If the stereo camera is used as the on-board camera, the main controller 36 is allowed to obtain three-dimensional information in the forward direction.
(23) The RADAR is adapted to detect obstacles utilizing radio waves such as millimetric-waves and microwaves, and to transmit detected information to the main controller 36. Specifically, the RADAR detects an obstacle such as other vehicles and so on by emitting radio waves and analyzing the radio waves reflected from the obstacle.
(24) Likewise, the LIDAR is adapted to detect obstacles utilizing laser light and to transmit detected information to the main controller 36. Specifically, the LIDAR detects an obstacle such as other vehicles and so on by emitting laser light and analyzing the laser light reflected from the obstacle.
(25) Information about other vehicles around the vehicle Ve such as positions, speeds, directions, operating modes, etc. may be obtained through the inter-vehicle communication system to support safe driving. Such inter-vehicle communication is available among vehicles individually having on-board equipment for intelligent transport systems, even where the roadside infrastructure has not yet been developed.
(26) In addition, the vehicle Ve is further provided with a GPS (i.e., global positioning system) receiver 54, a digital map database 55, and a navigation system 56. Specifically, the GPS receiver 54 is adapted to obtain a position (i.e., latitude and longitude) based on incident signals from GPS satellites, and to transmit the positional information to the main controller 36. The map database 55 may be installed in the main controller 36, but a map database stored in an external facility such as an online information processing system may also be used. The navigation system 56 is configured to determine a travelling route of the vehicle Ve based on the positional information obtained by the GPS receiver 54 and the map database 55.
(27) The main controller 36 carries out calculations based on the incident data or information from the internal sensor 40 and the external sensor 53 as well as the preinstalled data, and calculation results are sent in the form of command signal to the drive controller 37, the sub-controller 38 and the auxiliary 57. The incident signals to the drive controller 37 are converted into drive commands, and further transmitted to the throttle actuator of the engine 1, and the first inverter 29 and the second inverter 30 of the first motor 2 and the second motor 3. The incident signals to the sub-controller 38 are converted into appropriate command signals and further transmitted to actuators 58 of the brake 27, the steering system 25 and so on.
(28) The actuator 58 includes a brake actuator, a steering actuator and so on. Specifically, the brake actuator is adapted to actuate the brake 27 to control braking force applied to the wheels 24 and 26 in response to the command signal from the sub-controller 38. The steering actuator is adapted to activate an assist motor of the steering system 25 to control a steering torque in response to the command signal from the sub controller 38.
(29) The auxiliary 57 includes devices that are not involved in propulsion of the vehicle Ve such as a wiper, a headlamp, a direction indicator, an air conditioner, an audio player and so on.
(30) The main controller 36 comprises a position recognizer 59, an external condition recognizer 60, a running condition recognizer 61, a travel plan creator 62, a travel controller 63, an auxiliary controller 64, a passenger detector 65 and so on.
(31) Specifically, the position recognizer 59 is configured to recognize a current position of the vehicle Ve on the map based on the positional information received by the GPS receiver 54 and the map database 55. The current position of the vehicle Ve may also be obtained from the positional information used in the navigation system 56. Optionally, the vehicle Ve may also be adapted to communicate with external sensors arranged along the road to obtain the current position of the vehicle Ve.
(32) The external condition recognizer 60 is configured to recognize external condition of the vehicle Ve such as a location of a traffic lane, a road width, a road configuration, a road gradient, an existence of obstacles around the vehicle Ve and so on, based on the recorded information of the on-board camera, or detection data of the RADAR or the LIDAR. Optionally, weather information, a friction coefficient of road surface etc. may be obtained according to need.
(33) The running condition recognizer 61 is configured to recognize running condition of the vehicle Ve such as a vehicle speed, a longitudinal acceleration, a lateral acceleration, a yaw rate and so on based on detection data collected by the internal sensors 40.
(34) The travel plan creator 62 is configured to create a travel locus of the vehicle Ve based on a target course determined by the navigation system 56, a position of the vehicle Ve recognized by the position recognizer 59, and an external condition recognized by the external condition recognizer 60. That is, the travel plan creator 62 creates a travel locus of the vehicle Ve within the target course in such a manner that the vehicle Ve is allowed to travel safely and properly while complying with traffic rules.
(35) In addition, the travel plan creator 62 is further configured to create a travel plan in line with the created travel locus. Specifically, the travel plan creator 62 creates a travel plan in line with the target course based on the external conditions recognized by the external condition recognizer 60 and the map database 55.
(36) Specifically, the travel plan is created based on prospective data a few seconds ahead of the present moment to determine a future condition of the vehicle Ve such as a driving force or the like required in the future. Optionally, the travel plan may also be created based on prospective data several tens of seconds ahead, depending on the external conditions and the running conditions. Thus, the travel plan creator 62 creates a future plan to change a vehicle speed, acceleration, steering torque, etc. during travelling along the target course in the form of, e.g., a map.
(37) Alternatively, the travel plan creator 62 may also create a pattern to change the vehicle speed, acceleration, steering torque etc. between predetermined points on the travel locus. Specifically, such patterns may be determined by setting target values of those parameters at each point on the travel locus taking account of a required time to reach the point at the current speed.
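The point-by-point pattern described in paragraph (37) can be sketched as follows. This is a minimal illustration only: the data layout, units (meters, m/s, seconds), and function name are assumptions, since the patent describes the flowchart-level behavior rather than an implementation.

```python
# Hypothetical sketch: assign target values to each point on the travel
# locus, taking account of the time required to reach the point at the
# current speed. Units are assumed (distances in m, speeds in m/s).

def create_pattern(segments, current_speed):
    """segments: list of (distance_to_next_point_m, target_speed_mps)."""
    pattern = []
    travelled = 0.0
    for distance, target_speed in segments:
        travelled += distance
        # Required time to reach this point at the current speed.
        eta = travelled / current_speed
        pattern.append({"distance": travelled,
                        "eta": eta,
                        "target_speed": target_speed})
    return pattern
```

Each entry pairs a point on the locus with the target speed to hold there and the estimated time of arrival used to schedule the change.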
(38) As described, the controller 28 is configured to work with the adaptive cruise control system or cooperative adaptive cruise control system, and the travel plan may also be created in such a manner as to follow the preceding vehicle while communicating with the other vehicles. The adaptive cruise control system may be manipulated by switches arranged in the vicinity of the steering wheel or within a steering pad. Specifically, activation of the cruise control system, selection of a control mode, setting a target distance from a preceding vehicle etc. may be executed by manipulating the switches.
(39) The travel controller 63 is configured to operate the vehicle Ve autonomously in line with the travel plan created by the travel plan creator 62. To this end, specifically, the travel controller 63 transmits command signals to the actuators 58, or the engine 1, the first motor 2 and the second motor 3 through the drive controller 37 and the sub-controller 38.
(40) The auxiliary controller 64 is configured to operate the auxiliaries 57 such as the wiper, the headlamp, the direction indicator, the air conditioner, the audio player and so on in line with the travel plan created by the travel plan creator 62.
(41) The passenger detector 65 is configured to determine the existence of a passenger in the vehicle Ve and the preceding vehicle. For example, the passenger detector 65 determines the existence of a passenger in the vehicle Ve based on a fact that a power switch, an ignition switch, or a start button is turned on, that a passenger sitting on a vehicle seat is detected, that a seat belt is fastened, or that the steering wheel is turned. Meanwhile, the passenger detector 65 determines the existence of a passenger in the preceding vehicle by obtaining information about the preceding vehicle through the inter-vehicle communication, or by analyzing information obtained by the on-board camera.
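The host-vehicle side of the passenger detector 65 is a disjunction of the enumerated indications: any one of them is treated as sufficient evidence of a passenger. A minimal sketch, with parameter names assumed for illustration:

```python
# Hypothetical sketch of the passenger detector 65 for the host vehicle:
# any single enumerated indication suffices to conclude that a passenger
# is present. Parameter names are illustrative assumptions.

def passenger_present(power_switch_on, seat_occupied, belt_fastened,
                      steering_wheel_turned):
    return (power_switch_on or seat_occupied or belt_fastened
            or steering_wheel_turned)
```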
(42) Thus, the vehicle Ve shown in
(43) At step S1, it is determined whether the vehicle Ve is being operated autonomously. Specifically, such determination of the current operating mode can be made based on a signal from the switch for selecting the operating mode, or based on a flag representing the autonomous mode. If the vehicle Ve is currently not operated autonomously so that the answer of step S1 is NO, the routine returns.
(44) In this case, the controller 28 determines that the vehicle Ve is currently operated manually by a driver in the manual mode. As described, the vehicle Ve may be operated autonomously with or without a driver (or a passenger(s)). A presence of the passenger may be determined by the passenger detector 65 based on operating states of the above-explained devices. Instead, a presence of the passenger may be determined based on a signal from a biometric passenger sensor such as an infrared sensor for detecting a body temperature of the passenger, or a motion sensor such as a Doppler sensor for detecting a body movement of the passenger.
(45) By contrast, if the vehicle Ve is being operated autonomously so that the answer of step S1 is YES, the routine progresses to step S2 to determine whether a headlamp as a lighting device of the vehicle Ve is turned on. According to the embodiment, the lighting device includes not only the headlamp but also a front fog lamp and a lamp of the on-board camera. The lighting device may further include an infrared lamp and a millimeter-wave RADAR. Thus, according to the embodiment, the lighting device includes not only the lighting device for emitting visible light but also the lighting device for emitting invisible light. As described, the headlamp is included in the auxiliary 57, and controlled automatically by the auxiliary controller 64.
(46) For example, the headlamp is turned on when travelling in the nighttime, when travelling in the fog, when travelling in the rain, and when travelling through a tunnel. If the headlamp is turned off so that the answer of step S2 is NO, the routine returns without carrying out any specific control.
(47) By contrast, if the headlamp is turned on so that the answer of step S2 is YES, the routine progresses to step S3 to determine whether an object whose information is not available in the database is detected, and whether the detected object is a physical object.
(48) During propulsion in the autonomous mode, the external sensor 53 detects various objects such as a road sign, a road depression, a railroad crossing and so on with reference to the database stored on the main controller 36 and the data which has been collected by the external sensor 53. If the information about the newly detected object is not found in the database stored on the main controller 36, and the newly detected object is a physical object, e.g., a disabled vehicle stopped on the road, so that the answer of step S3 is YES, the routine progresses to step S4 to change an orientation of the headlamp (i.e., a direction of radiation) toward the detected object. In other words, if the newly detected object is not available in the database, the headlamp is oriented toward the newly detected object.
(49) The external sensor 53 also detects road information about a traffic congestion, a speed limit and so on indicated on a road sign or a message board, and the detected road information is stored on the map database 55.
(50) The orientation of the lighting device may be changed arbitrarily by the auxiliary controller 64, not only vertically but also horizontally.
(51) According to the embodiment, therefore, the object whose information is not available in the database can be recognized clearly by the on-board camera even when the on-board camera is not allowed to recognize the object clearly, such as in the nighttime. That is, if a disabled vehicle stopped on the road is detected by the RADAR or the LIDAR but the details of the trouble have not yet been confirmed, the details of the trouble of the disabled vehicle can be confirmed by the on-board camera. For example, it is possible to confirm a fact that a tire(s) of the disabled vehicle is/are flat, or that the disabled vehicle is overturned. In addition, if the disabled vehicle is out of fuel, such information may be obtained through the inter-vehicle communication.
(52) By contrast, if the information about the newly detected object is not available in the database but the newly detected object is road information, so that the answer of step S3 is NO, the routine progresses to step S5 to determine whether the newly detected road information, e.g., a message board, is not available in the database. That is, at step S3, availability of information about the newly found physical object in the database is determined, whereas at step S5, availability of the newly obtained road information in the database is determined. As described, such road information includes messages indicated on a message board about a traffic congestion, a speed limit, etc. The answer of step S3 will also be NO if the information about the newly detected object has already been stored in the database; in this case, the routine returns through the below-mentioned step S5.
(53) For example, the speed limit may be lowered due to bad weather, and some of the traffic lanes may be closed due to a traffic accident or road construction. Such road conditions change constantly, and the information stored in the database is updated continuously. Even when the road sign or the message board is newly detected, it would be difficult for the on-board camera to read the message indicated on, e.g., the message board in, e.g., the nighttime. Specifically, in the nighttime, it would be difficult for the on-board camera to clearly recognize the speed limit, lowered due to bad weather or the like, indicated on the message board.
(54) If the newly detected road information is not available in the database so that the answer of step S5 is YES, therefore, the routine also progresses to step S4 to change an orientation of the headlamp toward the detected object such as the message board.
(55) By contrast, if the newly detected road information is available in the database so that the answer of step S5 is NO, the routine returns without changing the orientation of the headlamp.
(56) Then, the routine progresses to step S6 to specify the detected object by the on-board camera. Specifically, in the case that the newly detected object is the physical object, details of the physical object are specified by the on-board camera. By contrast, in the case that the newly detected object is the road information, details of the message indicated on the message board or the like are read by the on-board camera. Thereafter, the database stored on the main controller 36 is updated at step S7 based on the information specified by the on-board camera, and the command signals to operate the vehicle autonomously are calculated by the main controller 36 based on the updated database. For example, the travel plan, the target vehicle speed, the pattern to change the vehicle speed, the target course, and so on are updated by the main controller 36 based on the updated database. Optionally, the database stored in the external online information processing system may also be updated based on the updated database stored on the main controller 36. Thereafter, the orientation of the headlamp is returned to an original position, and the routine returns.
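One pass of the routine of steps S1 through S7 can be sketched as follows. This is a simplified illustration only: the function name, the string return values, and the representation of the databases as sets of object identifiers are all assumptions, since the patent defines the flowchart rather than an implementation.

```python
# Minimal sketch of one pass of the headlamp-orientation routine
# (steps S1-S7). Names and data structures are illustrative assumptions.

def headlamp_routine(is_autonomous, headlamp_on, detected_object,
                     local_db, remote_db):
    """Return the resulting action for one pass of the routine.

    detected_object -- identifier of the object found by the external
    sensor 53, or None; local_db / remote_db -- sets of identifiers
    already stored in the on-board and external-facility databases.
    """
    if not is_autonomous:                  # step S1: manual mode
        return "return"
    if not headlamp_on:                    # step S2: lighting device off
        return "return"
    if detected_object is None:            # nothing newly detected
        return "return"
    # Steps S3/S5: act only if the object is absent from BOTH databases.
    if detected_object in local_db or detected_object in remote_db:
        return "return"
    # Step S4: orient the headlamp toward the object; step S6 would then
    # specify it (e.g., read a message board) with the on-board camera.
    # Step S7: update the on-board database with the specified object.
    local_db.add(detected_object)
    return "orient_headlamp"
```

A second pass over the same object then finds it in the database and returns without changing the headlamp orientation, matching the update at step S7.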
(57) Thus, according to the embodiment of the present disclosure, the headlamp is oriented to the newly found object during autonomous propulsion in, e.g., the nighttime. According to the embodiment, therefore, the information about the newly found object that is not available in the database can be specified clearly by the external sensor 53 such as the on-board camera even in, e.g., the nighttime. In other words, the on-board camera is allowed to specify the newly found object accurately even in the nighttime, so that the external conditions around the vehicle Ve are detected correctly.
(58) In addition, if the object is detected by the RADAR or the LIDAR but the details of the object have not yet been specified, the details of the object can be obtained by the on-board camera while emitting the light to the object.
(59) If the detected object is a light emitting object, it would be difficult for the on-board camera to specify details of the object by emitting the light to the object, because the light emitted from the headlamp interferes with the light emitted from the object. In order to avoid such a disadvantage, the vehicle control system according to the embodiment may be further configured to execute a routine shown in
(60) In the routine shown in
(61) After changing the orientation of the headlamp to the detected object at step S4, it is determined at step S100 whether the object detected at step S3 or S5 is a light emitting object or a reflection object that is difficult for the on-board camera to specify if irradiated. For example, it would be difficult for the on-board camera to read a message indicated on an electronic message board if the message board is irradiated by the light emitted from the headlamp. At step S100, therefore, reduction in detection accuracy of the on-board camera after orienting the headlamp to the object is determined. Specifically, the light emitting object includes an electronic construction signage, a flare, an electronic message board and so on, and the reflection object includes a rubber pole (i.e., a lane separation pole). If the detected object is the light emitting object or the reflection object so that the answer of step S100 is YES, the routine progresses to step S200 to further change the orientation of the headlamp within the object in such a manner as to emit the light to a portion of the object other than a light emitting portion or a reflection portion.
(62) At step S100, reduction in the detection accuracy of the on-board camera is also determined even if the detection accuracy at the light emitting portion or the reflection portion of the object and the detection accuracy at the remaining portion of the object are identical to each other. In addition, the answer of step S100 will also be YES when the on-board camera is not allowed to clearly read the message indicated on the message board due to overexposure or the like. For example, such determination at step S100 may be made based on a threshold of illuminance or detection degree.
(63) If the detected object is not the light emitting object or the reflection object so that the answer of step S100 is NO, the routine progresses to step S300 to fix the orientation of the headlamp changed at step S4 until the on-board camera is allowed to specify the object. For example, the orientation of the headlamp is fixed until the on-board camera is allowed to read the message indicated on the message board. Then, the detected object is specified by the on-board camera at step S6, and the database stored on the main controller 36 is updated at step S7 based on the information specified by the on-board camera. Thereafter, the orientation of the headlamp is returned to the original position, and the routine returns. For example, in a case that the detected object is a disabled vehicle, the orientation of the headlamp is returned to the original position after passing the disabled vehicle.
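The branch of steps S100, S200, and S300 can be sketched as follows. The object categories and return values are illustrative assumptions; the patent names the categories (light emitting objects such as electronic message boards and flares, reflection objects such as rubber lane-separation poles) but not a data model.

```python
# Minimal sketch of steps S100-S300, which refine the headlamp
# orientation chosen at step S4. Categories and return values are
# illustrative assumptions.

def refine_orientation(object_type):
    """object_type: 'light_emitting' (e.g., electronic message board,
    flare), 'reflecting' (e.g., rubber lane-separation pole), or 'other'."""
    if object_type in ("light_emitting", "reflecting"):
        # Step S100 YES -> step S200: emit light only to the portion of
        # the object other than the light emitting or reflecting portion,
        # so that glare does not reduce the camera's detection accuracy.
        return "illuminate_remaining_portion"
    # Step S100 NO -> step S300: hold the orientation fixed until the
    # camera has specified the object (e.g., read the message board).
    return "hold_orientation"
```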
(64) Thus, by executing the routine shown in
(65) Although the above exemplary embodiments of the present disclosure have been described, it will be understood by those skilled in the art that the present disclosure should not be limited to the described exemplary embodiments, and various changes and modifications can be made within the scope of the present disclosure. For example, any kind of appropriate mechanism may be applied to change an orientation of the headlamp. In addition, if the newly found object is the light emitting object or the reflection object, a light emitting period may be reduced to allow the on-board camera to specify the object accurately. Further, if the detected object is large, it is also possible to obtain details of the object while changing the orientation of the on-board camera.