Guided vehicle positioning for inductive charging with the assistance of a vehicle camera
09802501 · 2017-10-31
Assignee
Inventors
CPC classification
G05D1/0246
PHYSICS
B60L53/38
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0225
PHYSICS
Y02T10/70
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
B60L53/126
PERFORMING OPERATIONS; TRANSPORTING
B62D6/002
PERFORMING OPERATIONS; TRANSPORTING
B60L53/37
PERFORMING OPERATIONS; TRANSPORTING
Y02T90/12
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
Y02T90/14
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
Y02T10/7072
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
International classification
B60L9/00
PERFORMING OPERATIONS; TRANSPORTING
B62D6/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method and a corresponding device are provided for positioning a vehicle above a primary coil for inductive charging of a rechargeable battery in the vehicle. A control unit for a vehicle is described. The vehicle has a secondary coil for receiving electrical energy from a primary coil outside the vehicle. The vehicle further has at least one camera, which is designed to detect an environment of the vehicle. The control unit is designed to receive image data from the at least one camera of the vehicle and to access reference data. The reference data includes information on at least one predefined reference object in the detected environment of the vehicle and on a position of the at least one predefined reference object relative to the primary coil. The control unit detects the at least one predefined reference object in the received image data on the basis of the reference data. In addition, the control unit determines a position of the secondary coil relative to the primary coil on the basis of the detected at least one reference object.
Claims
1. A control unit for a vehicle having a secondary coil for picking up electric power from a vehicle-external primary coil, wherein the vehicle has at least one camera configured to capture surroundings of the vehicle, wherein the control unit is configured to execute instructions stored on a computer-readable medium to: receive, from the at least one camera, image data of the surroundings of the vehicle captured by the at least one camera; access reference data comprising information about at least one predefined reference object located within the captured surroundings of the vehicle, the information at least determinative of an identity of the at least one predefined reference object and/or a position of the at least one predefined reference object relative to the primary coil; detect a presence of the at least one predefined reference object in the received image data based on the accessed reference data; ascertain the position of the secondary coil relative to the primary coil based on the detected at least one predefined reference object; and generate a graphical representation of the secondary coil with respect to the primary coil based on the ascertained position of the secondary coil relative to the primary coil.
2. The control unit according to claim 1, wherein the reference data comprises one or more of: information about visual features of the at least one reference object that are able to be detected in the image data from the camera; information about a physical size of the at least one reference object; information concerning spatial coordinates of the at least one reference object in a predefined coordinate system; and information concerning spatial coordinates of the primary coil in the predefined coordinate system.
3. The control unit according to claim 1, wherein the control unit is configured to further execute instructions stored on the computer-readable medium to: receive information concerning a steering turn and concerning a wheel speed of the vehicle; and ascertain a motion of the vehicle relative to the at least one reference object based on the received steering turn and wheel speed information.
4. The control unit according to claim 1, wherein the control unit is configured to further execute instructions stored on the computer-readable medium to: ascertain a camera angle for a ray between the at least one camera and the at least one reference object; and ascertain a distance from the secondary coil to the at least one reference object based on the ascertained camera angle.
5. The control unit according to claim 1, wherein the reference data comprise information about a multiplicity of predefined reference objects along an approach trajectory of the vehicle with respect to the primary coil; and the control unit is configured to further execute instructions stored on the computer-readable medium to: receive a time sequence of image data from the at least one camera of the vehicle along the approach trajectory; and detect the multiplicity of predefined reference objects in the time sequence of image data.
6. The control unit according to claim 1, wherein the control unit is configured to further execute instructions stored on the computer-readable medium to: receive the reference data from a pilot unit of the primary coil.
7. The control unit according to claim 5, wherein the control unit is configured to further execute instructions stored on the computer-readable medium to: receive the reference data from a pilot unit of the primary coil.
8. The control unit according to claim 1, wherein the control unit is configured to further execute instructions stored on the computer-readable medium to: prompt a control signal to be sent to a pilot unit of the primary coil in order to illuminate the at least one predefined reference object.
9. The control unit according to claim 7, wherein the control unit is configured to further execute instructions stored on the computer-readable medium to: prompt a control signal to be sent to a pilot unit of the primary coil in order to illuminate the at least one predefined reference object.
10. A vehicle, comprising: a secondary coil for picking up electric power from a vehicle-external primary coil; at least one camera configured to capture surroundings of the vehicle; a control unit for the vehicle, wherein the control unit is configured to: receive image data from the at least one camera of the vehicle; access reference data comprising information about at least one predefined reference object in the captured surroundings of the vehicle, the information at least determinative of a presence of the at least one predefined reference object and/or a position of the at least one predefined reference object relative to the primary coil; detect the presence of the at least one predefined reference object in the received image data based on the accessed reference data; ascertain the position of the secondary coil relative to the primary coil based on the detected presence of the at least one predefined reference object; and generate a graphical representation of the secondary coil with respect to the primary coil based on the ascertained position of the secondary coil relative to the primary coil; a memory unit configured to store the reference data; and a display configured to display the graphical representation.
11. The vehicle according to claim 10, wherein the reference data comprises one or more of: information about visual features of the at least one reference object that are able to be detected in the image data from the camera; information about a physical size of the at least one reference object; information concerning spatial coordinates of the at least one reference object in a predefined coordinate system; and information concerning spatial coordinates of the primary coil in the predefined coordinate system.
12. The vehicle according to claim 10, wherein the control unit is configured to: receive information concerning a steering turn and concerning a wheel speed of the vehicle; and ascertain a motion of the vehicle relative to the at least one reference object based on the received steering turn and wheel speed information.
13. The vehicle according to claim 10, wherein the control unit is configured to: ascertain a camera angle for a ray between the at least one camera and the at least one reference object; and ascertain a distance from the secondary coil to the at least one reference object based on the ascertained camera angle.
14. The vehicle according to claim 10, further comprising an onboard computer configured to permit a user to input the reference data.
15. A method for assisting in positioning a secondary coil of a vehicle relative to a vehicle-external primary coil, the method comprising the acts of: receiving image data from at least one vehicle camera configured to capture surroundings of the vehicle; receiving reference data comprising information about at least one predefined reference object in the captured surroundings of the vehicle, the information at least determinative of a presence of the at least one predefined reference object and/or a position of the at least one predefined reference object relative to the primary coil; detecting the at least one predefined reference object in the received image data based on the received reference data; ascertaining a position of the secondary coil relative to the primary coil based on the detected at least one reference object; and generating a graphical representation of the secondary coil with respect to the primary coil based on the ascertained position of the secondary coil relative to the primary coil.
16. The method according to claim 15, wherein the reference data comprises information about a multiplicity of predefined reference objects; and the method additionally comprises the acts of: detecting the multiplicity of predefined reference objects in the received image data; and comparing the reference data with the detected multiplicity of predefined reference objects.
17. The method according to claim 15, wherein the reference data comprise one or more of: information about visual features of the at least one reference object that are able to be detected in the image data from the camera; information about a physical size of the at least one reference object; information concerning spatial coordinates of the at least one reference object in a predefined coordinate system; and information concerning spatial coordinates of the primary coil in the predefined coordinate system.
18. The method according to claim 15, wherein the method further comprises the acts of: receiving information concerning a steering turn and concerning a wheel speed of the vehicle; and ascertaining a motion of the vehicle relative to the at least one reference object based on the received steering turn and wheel speed information.
19. The method according to claim 15, wherein the method further comprises the acts of: ascertaining a camera angle for a ray between the at least one camera and the at least one reference object; and ascertaining a distance from the secondary coil to the at least one reference object based on the ascertained camera angle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
(14) As already explained at the outset with respect to
(15) This document describes a method and a corresponding apparatus for assisting the driver in positioning the secondary coil 102 of a vehicle 100 over a primary coil 111. The method described and the apparatus described allow continuous and inexpensive assistance for the positioning operation.
(16) In particular, one or more of the ambient sensors installed in the vehicle 100 (e.g. cameras and/or ultrasonic sensors) are used for assisting in positioning.
(17) Using the ambient sensors 131, 132 (particularly using one or more cameras 131), a control unit 133 of the vehicle 100 can produce graphical representations of the surroundings of the vehicle 100 that are able to be displayed to the driver on a screen 134 of the vehicle 100 during positioning of the vehicle 100, and thus assist the driver in positioning the vehicle 100. In particular, a panoramic view of the vehicle 100 (e.g. what is known as a “top view” representation of the vehicle 100) can be produced. An exemplary “top view” representation is shown in
(18) As an alternative or in addition to the synthetically produced “top view” representation, direct use of the image from a camera 131 of the vehicle (e.g. a rear camera 131) is also possible. An exemplary schematic image from a camera 131 is shown in
(19)
(20) The control unit 133 may therefore be set up to identify the ground unit 111 on the basis of the data from the ambient sensors 131, 132 (particularly on the basis of the images from the one or more cameras 131) (e.g. using image processing algorithms). To assist in the automatic identification of the ground unit 111, the ground unit 111 can include specific visual features and/or reference points that can be identified by the control unit. As
(21) In addition, nonvisual features (e.g. identification that the vehicle 100 is situated in the reception range of the WLAN communication of the ground unit 111) can be used in order to identify the position of the ground unit 111 in the surroundings of the vehicle 100.
(22) Furthermore, the control unit 133 may be set up to identify further reference points 402, 403 in the surroundings of the vehicle 100. The reference points 402, 403 may be situated on the ground (i.e. in the ground plane z=0). Alternatively or additionally, reference points 402, 403 above or below the ground plane (z< >0) can also be used. In the example shown in
(23) The control unit 133 may be set up to use the ascertained reference points 401, 402, 403 to determine the position of the vehicle 100 relative to the ground unit 111. Furthermore, the control unit 133 may be set up to determine the relative motion of the vehicle 100 (relative to the ground unit 111). The relative motion of the vehicle 100 can be ascertained via the motion of the objects 111, 201, 202 identified in the images and reference points 401, 402, 403. In other words, the progression of the captured data from the ambient sensors 131, 132 over time (e.g. a time sequence of images) can be used to ascertain the motion of particular objects 111, 201, 202 and/or particular reference points 401, 402, 403. Furthermore, the steering angle of the vehicle 100 and/or the rotation of the wheels of the vehicle 100 can be ascertained. The control unit 133 may be set up to ascertain the relative motion of the vehicle 100 from this information. In particular, the control unit 133 may be set up to track the position of the vehicle 100 relative to the ground unit 111 (even when the ground unit 111 can no longer be captured by the ambient sensors 131, 132).
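The relative-motion estimation from steering angle and wheel rotation described above can be sketched with a minimal single-track ("bicycle") odometry step. This is an illustrative sketch, not the patent's prescribed implementation; the function and parameter names are assumptions.

```python
import math

def dead_reckon(x, y, heading, steering_angle, wheel_distance, wheelbase):
    """Advance the vehicle pose by one odometry step.

    x, y, heading   -- pose in the ground plane (m, m, rad)
    steering_angle  -- front-wheel angle from the steering sensor (rad)
    wheel_distance  -- distance covered, from wheel rotation (m)
    wheelbase       -- axle-to-axle distance of the vehicle (m)
    """
    if abs(steering_angle) < 1e-9:
        # Straight-line motion: no change of heading.
        x += wheel_distance * math.cos(heading)
        y += wheel_distance * math.sin(heading)
    else:
        # Turned steering: the vehicle follows a circular arc whose
        # radius follows from the single-track geometry.
        turn_radius = wheelbase / math.tan(steering_angle)
        dtheta = wheel_distance / turn_radius
        x += turn_radius * (math.sin(heading + dtheta) - math.sin(heading))
        y += turn_radius * (math.cos(heading) - math.cos(heading + dtheta))
        heading += dtheta
    return x, y, heading
```

Integrating such steps from a known starting position relative to the ground unit 111 keeps the relative position available even once the ground unit leaves the field of view of the ambient sensors 131, 132.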
(24) By way of example, the control unit 133 may be set up to capture and store a graphical representation (e.g. as a “top view” representation) of the charging station (and particularly of the ground unit 111) at the beginning of the positioning operation (for example see left-hand image in
(25) As already explained, in proximity, when the vehicle 100 is over the ground unit 111, typically only the further reference points 402, 403 and the ambient sensors 131, 132 are then available for distance measurement between ground unit 111 and secondary coil 102. For this reason, the control unit 133 is typically set up to track the entire approach trajectory in order to find the position of the vehicle 100. In particular, the control unit 133 is typically set up to determine a starting position for the vehicle 100 relative to the ground unit 111 right from when the ground unit 111 can still be captured by the ambient sensors 131, 132. Furthermore, the control unit 133 may be set up to update the position of the vehicle 100 continuously from the starting position. The update can take place on the basis of the ascertained motion of the further reference points 402, 403 and/or the further objects 201, 202, and also on the basis of the vehicle-internal information (such as steering angle and/or wheel rotation). This allows the vehicle 100 to be positioned precisely over the ground unit 111, even when the ground unit 111 can no longer be captured by the ambient sensors 131, 132.
(26) Furthermore, in direct proximity to the ground unit 111, a quality indicator for the magnetic coupling between the primary coil 111 and the secondary coil 102 can be used to check the final position.
(27) The control unit 133 may be set up to ascertain an intended trajectory 501 for the vehicle 100 from a current position of the vehicle 100 (see
(28)
(29) Further refinements of the positioning method described above are possible. By way of example, permanent monitoring of the vehicle surroundings (using the ambient sensors 131, 132) during the charging operation can be used to identify that an animal (e.g. a cat) runs under the vehicle 100. This allows the safety of the charging operation to be increased. In particular, the charging power could be reduced in such cases in order to reduce hazard for the animal as a result of high field strengths.
(30) The positioning method described can be combined with an automatic parking assistance function. In particular, an automatic parking function can be used on the basis of the reference points 401, 402, 403 and/or the reference objects 111, 201, 202 and also on the basis of the starting position of the vehicle 100 in order to position the vehicle 100 over the ground unit 111 (e.g. on the basis of the intended trajectory 501).
(31) The control unit 133 may be set up to learn reference points 401, 402, 403 when a charging station is approached repeatedly. This allows a primary coil 111 to be approached even when the primary coil 111 is not visible, e.g. when the primary coil 111 is concealed by a blanket of snow. In addition, this can continuously improve the precision of the positioning.
(32) As explained above, further reference points 402, 403 can be used in order to assist the positioning operation. The reference points 402, 403 can also be situated above the ground plane (z>0). The reference points 402, 403 can be produced both through teaching and through identification of a pattern 411 that is typical of the charging station. Furthermore, alternative reference points can be used in order to approach even completely concealed primary coils 111 (e.g. those inset in the ground).
(33) As described above, the controller 133 may be set up to detect reference objects 201, 202 and/or reference points 401, 402. These reference objects 201, 202 and/or reference points 401, 402 can be detected by image analysis methods. Furthermore, the geometric arrangement of the reference objects 201, 202 and/or reference points 401, 402 can be ascertained on the basis of the data from the ambient sensors 131, 132.
(34) In order to reduce the complexity of the ascertainment of the reference objects 201, 202 and/or reference points 401, 402 and of the ascertainment of the geometric arrangement and in order to increase the reliability of the ascertained information, it may be advantageous to use predefined markings or predefined reference objects. Information about such predefined reference objects can be made available to the control unit 133 as reference data. This allows the ascertainment of the position of the secondary coil 102 relative to the ground unit 111 to be improved.
(35)
(36) As shown in image a) in
(37) Image b) in
(38) Image c) in
(39)
(40) Alternatively or in addition, permanent markings 712, 713 that are in place in the environment of the parking space at which the charging station is situated can be used as reference objects (e.g. a car park sign 712 or a pillar 713), provided that these markings 712, 713 have sufficient contrast.
(41) The dimensions 721, 722 of the predefined reference objects 711, 712, 713 can be used to determine the distance between camera 131 and reference object 711, 712, 713. The actual dimensions 721, 722 (e.g. height, width) of the reference objects 711, 712, 713 are known to the control unit 133, which means that the distance between reference object 711, 712, 713 and vehicle camera 131 can be ascertained from the dimensions of the reference objects 711, 712, 713 as measured in the image data of the vehicle camera 131. Alternatively or in addition, it is also possible to use data from distance sensors 132 of the vehicle 100 to ascertain the distance.
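The size-based distance estimate follows the usual pinhole-camera relation. The sketch below assumes the camera's focal length is known in pixel units (a property of the calibrated camera 131); the function name is illustrative.

```python
def distance_from_size(real_height_m, pixel_height, focal_length_px):
    """Pinhole-camera relation: an object of known physical height that
    spans pixel_height pixels in the image lies at approximately
        distance = focal_length_px * real_height_m / pixel_height.
    """
    if pixel_height <= 0:
        raise ValueError("object not resolved in the image")
    return focal_length_px * real_height_m / pixel_height
```

For example, a 0.5 m tall marker imaged 100 pixels tall by a camera with a 1000-pixel focal length is roughly 5 m from the camera.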
(42) The use of a predefined reference object 701 having predefined dimensions 721, 722 allows the control unit 133 to ascertain the distance from the vehicle 100 to the reference object and the orientation of the vehicle 100 relative to the reference object. Since the one or more reference objects 701, 702 are arranged in a predefined manner relative to the ground unit 111, this allows the secondary coil 102 of the vehicle 100 to be positioned over the ground unit 111 on the basis of the data from the vehicle camera 131.
(43) To ascertain the distance between a reference object 701 and the camera 131 of the vehicle 100, it is possible to use triangulation methods. This is shown by way of example in
(44) Applying the law of sines to the triangle formed by the two measurement positions of the vehicle 100 and the reference object 701,

s_m1_2 / sin(φ_m1_1) = s_12 / sin(φ_m1_2 − φ_m1_1)

(where s_12 is the distance moved 807 by the vehicle 100, and where s_m1_2 is the current distance 802 from the vehicle 100 to the reference object 701) gives

(45) s_m1_2 = s_12 · sin(φ_m1_1) / sin(φ_m1_2 − φ_m1_1).

(46) Therefore, knowledge of the camera angles 806, 804 at two successive instants and knowledge of the distance 807 covered between these two instants allows the current distance 802 from the vehicle 100 to the reference object 701 to be determined.
(47) For the special case in which φ_m1_2 − φ_m1_1 is small or zero (i.e. if the vehicle 100 moves directly toward the reference object 701), it is admittedly not possible to compute the distance 802 using the aforementioned triangulation formula. In this case, however, the vehicle distance 807 directly provides information about how the distance between vehicle 100 and reference object 701 changes. In particular, the change of distance between the distance 801 at the first instant and the current distance 802 (at the second instant) then corresponds to the vehicle distance 807.
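Under the angle convention used above (camera angles measured against the driving direction), the triangulation including the degenerate straight-approach case might be coded as follows; this is a sketch with illustrative names, not the patent's implementation.

```python
import math

def distance_to_object(s_12, phi_1, phi_2, min_parallax=1e-3):
    """Distance from the second measurement position to the reference
    object, from two camera angles (rad) and the distance s_12 driven
    between the two measurements.

    Law of sines in the triangle (position 1, position 2, object):
        s_m1_2 / sin(phi_1) = s_12 / sin(phi_2 - phi_1)
    """
    parallax = phi_2 - phi_1
    if abs(parallax) < min_parallax:
        # Vehicle heading straight at the object: triangulation is
        # ill-conditioned; only the change of distance (= s_12) is known.
        return None
    return s_12 * math.sin(phi_1) / math.sin(parallax)
```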
(48) Even when the steering is turned, triangulation methods allow the distance 802 between reference object 701 and vehicle 100 to be ascertained by measuring the camera angles 804, 806 and by measuring the distance moved 807 by the vehicle 100 (e.g. on the basis of the wheel rotation). As shown in
(49) The absolute position of the vehicle can be ascertained on the basis of the identification of at least two reference objects 701 and 702, as shown in
(50)
where d_m1_m2 is the predefined distance 823 between the two reference objects 701, 702. Hence, the coordinates and the orientation of the vehicle 100 (i.e. the position and orientation of the secondary coil 102 of the vehicle 100) can be ascertained. The coordinates of the reference objects 701, 702 and of the primary coil 111 are known, which therefore means that all the requisite information for positioning the vehicle 100 is available.
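Once the distances to two reference objects with known coordinates have been ascertained as described, the vehicle position follows from intersecting the two range circles (trilateration). The sketch below uses illustrative names; the spurious second intersection can be discarded, e.g. because it lies on the far side of the baseline between the reference objects.

```python
import math

def trilaterate(p1, p2, r1, r2):
    """Position from distances r1, r2 to two known landmarks p1, p2.
    Returns both intersection points of the two circles, or None if
    the circles do not intersect (inconsistent measurements)."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return None
    # Foot of the common chord along the baseline, and half-chord height.
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    xm = x1 + a * (x2 - x1) / d
    ym = y1 + a * (y2 - y1) / d
    off = (-(y2 - y1) * h / d, (x2 - x1) * h / d)
    return (xm + off[0], ym + off[1]), (xm - off[0], ym - off[1])
```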
(51) The use of a series of reference objects 701, 702 allows the whole approach trajectory of the vehicle 100 to be accompanied and assisted. To this end, predefined reference objects 701, 702 may be placed along the approach trajectory, so that where possible at least one reference object 701, 702 can always be captured by the camera 131 of the vehicle. This is shown by way of example in
(52) In the example shown in
(53) The geometry of a charging station (i.e. particularly the coordinates of the reference objects 701, 702; the dimensions 721, 722 of the reference objects 701, 702; the description of the reference objects 701, 702 (for automatic identification); and/or the relative position of the ground unit) may be stored on a memory unit in the vehicle 100. The control unit 133 can access this memory unit when required. By way of example, the geometry of a charging station can be transmitted from the charging station (e.g. from the primary electronics 110) to the vehicle 100 via a wireless communication link (e.g. when the vehicle 100 approaches).
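The stored charging-station geometry could take a form like the following sketch; the field names and structure are illustrative assumptions, since the patent does not prescribe a storage format.

```python
from dataclasses import dataclass, field

@dataclass
class ReferenceObject:
    object_id: str
    x: float              # coordinates in the charging-station frame (m)
    y: float
    width_m: float        # physical dimensions used for distance estimation
    height_m: float
    description: str      # visual description for automatic identification

@dataclass
class ChargingStationGeometry:
    station_id: str
    primary_coil_x: float  # ground-unit position in the same frame (m)
    primary_coil_y: float
    reference_objects: list = field(default_factory=list)
```

A record of this kind can be held in the vehicle's memory unit, or transmitted from the charging station over a wireless communication link as the vehicle 100 approaches.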
(54) An application can be provided that assists a user in ascertaining the geometry of a charging station. The interface 900 of an exemplary application (e.g. an application for a smartphone or for the onboard computer of the vehicle 100) is shown in
(55) In one exemplary method, a first step can involve the primary unit 110, 111 and the reference objects (e.g. markers) being placed. The vehicle camera 131 or a tablet PC camera can then be used to photograph the reference objects. The data from the vehicle that is intended to use the charging station (geometry and possibly images) can be transmitted to the tablet PC. This information can then be used to configure the charging station (as described in
(56) To position another type of vehicle, the data concerning the reference objects can be transmitted from the off-board electronics 110 of the charging station, or from a database in a network, to the memory unit of the vehicle 100 (e.g. when the vehicle 100 approaches). The vehicle 100 then uses its known vehicle geometry to compute the complete data and can thus assist the positioning operation.
(57) The use of a charging station with predefined reference objects 701, 702 presupposes that the position of the ground unit 111 does not change relative to the predefined reference objects 701, 702. However, the ground unit 111 may be a shallow body on a nonslip mat, for example, said body not being firmly screwed to the ground. As a result, it is possible for the ground unit 111 to be moved unnoticed, as a result of which effective positioning of the secondary coil 102 over the ground unit 111 can no longer take place. The charging station (and the vehicle) should be set up to identify movement of the ground unit 111. The identification of movement should work even when a vehicle is not being charged or when there is no vehicle over the primary coil. In addition, identification should be possible even during a power failure. To solve this problem, the ground unit 111 may be positioned on a mechanical sensor ball, with the sensor ball recording motion by the ground unit 111. Alternatively or in addition, the ground unit 111 may be placed along a marking on the ground. An optical sensor of the ground unit 111 can then identify whether the ground unit 111 is still arranged along the marking or whether the ground unit 111 has been moved.
(58) This means that, in order to prevent an incorrect position from being approached when a ground unit 111 has moved, the ground unit 111 can have a sensor fitted that identifies motion (slipping, lifting, etc.) by the ground unit 111. After the ground unit 111 has moved, the driver can be notified and if need be asked to check the position.
(59) It should be pointed out that, in addition to one or more vehicle cameras 131, further external cameras can also be used. The external cameras can be used to capture the motion of the vehicle 100. The data from the one or more external cameras can be used with the data from the one or more vehicle cameras 131 for optimized position finding.
(60) This document has described a method and an apparatus (control unit) that assist the driver of a vehicle 100 in positioning the vehicle 100 over a ground unit 111. The method and the apparatus allow the use of components 131, 132 already in place in the vehicle 100 for positioning. As a result, the method and the apparatus can be implemented inexpensively (e.g. by software). In particular, the cameras 131 are today already integrated in vehicles 100 in optimum fashion (e.g. at positions with little soiling) for all-round vision, so that these properties can be used at the same time. The intended position can be displayed in a known representation, e.g. top or rear view representation. Therefore, provision of the positioning function described does not require a new form of presentation, and the positioning function described can be implemented as part of a consistent interface having other driver assistance functions. Furthermore, possible errors in the optical system (e.g. soiling of a camera 131) can be conclusively identified by the driver in the image shown, and therefore do not result in the vehicle being incorrectly positioned. The method described in this document therefore allows reliable positioning of the vehicle.
(61) Furthermore, the suitable choice of predefined reference objects allows the implementation of positioning that is independent of soiling and weather (e.g. snow). The installation of the charging station and the capture of the geometric data from the charging station can also be effected without a vehicle (e.g. using a tablet PC). Capture of the reference objects can be optimized through the use of original images from the vehicle camera and assisted by means of offline handling. This allows the reliability of the system to be increased. In addition, the use of the vehicle camera and the controller allow information from the chassis (steering angle, wheel rotation) about the vehicle motion to be used directly in the vehicle in a simple manner.
(62) The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.