Self-propelled construction machine and method for visualizing the working environment of a construction machine moving on a terrain

09719217 · 2017-08-01

Abstract

The invention relates to a self-propelled construction machine, in particular a road milling machine or a slipform paver, which can carry out translational and/or rotational movements for a planned project on a terrain. In addition, the invention relates to a method for visualizing the working environment of a construction machine moving on the terrain, in particular a road milling machine or a slipform paver. The construction machine comprises an image recording unit for recording an image segment of the terrain located in a coordinate system (X, Y, Z) dependent on the position and orientation of the construction machine on the terrain, and a display unit for displaying the image segment of the terrain. Moreover, the construction machine comprises a data processing unit which is configured in such a way that a depiction of a part of the project located in the image segment is superimposed on the image segment of the terrain displayed on the display unit, such that the project is visualized in the image segment. The display unit thus displays not only the actual image segment, but also a virtual image of the project, thus widening the perception of the machine operator. As a result, the machine operator can identify on the display unit whether the project forming the basis of the control matches the reality.

Claims

1. A self-propelled construction machine comprising: a chassis supporting a machine frame and comprising front and rear wheels or running gears for moving the machine in a working direction; a working tool for working a terrain; an image recorder fixed relative to the machine frame and configured to record an image segment of a region of the terrain in a first coordinate system dependent on a position and orientation of the construction machine on the terrain; a display configured to display the image segment of the terrain; a data storage comprising project data describing a shape and location of at least one project in a second coordinate system independent of the position and orientation of the construction machine, wherein the at least one project comprises one or more structures to be installed or one or more portions of the terrain to be modified; and a data processor configured to superimpose a depiction of at least part of the at least one project located in the region of the terrain associated with the image segment on the displayed image segment of the terrain, wherein at least part of the at least one project is visualised in the image segment.

2. The self-propelled construction machine of claim 1, further comprising one or more sensors configured to provide position and orientation data describing the position and orientation of the construction machine in the second coordinate system.

3. The self-propelled construction machine of claim 2, wherein the one or more sensors comprises a global navigation satellite system (GNSS).

4. The self-propelled construction machine of claim 2, wherein the one or more sensors comprises a first and a second GNSS receiver configured to decode GNSS signals from a global navigation satellite system (GNSS) and correction signals from a reference station, the first and second GNSS receivers being arranged in different positions on the construction machine.

5. The self-propelled construction machine of claim 2, wherein the data processor is configured to transform the project data describing the shape and location of the at least one project in the second coordinate system, based on the position and orientation of the construction machine in the second coordinate system, into the first coordinate system.

6. The self-propelled construction machine of claim 1, wherein the project data further comprises data describing at least one contour of the project, further wherein the data processor is configured to display the at least one contour of the project in the image segment of the terrain.

7. The self-propelled construction machine of claim 1, wherein the data processor is configured to determine object data describing a shape and location of at least one actual object in the image segment of the terrain, and compare the object data with the project data.

8. The self-propelled construction machine of claim 7, wherein the project data further comprises data describing at least one contour of the project, and a spacing is determined between at least one reference point relating to the contour of the project and at least one reference point relating to a contour of the actual object.

9. The self-propelled construction machine of claim 8, further comprising an alarm which produces one or more outputs from a group comprising an optical alarm, an acoustic alarm, a tactile alarm or a control signal for intervention in the machine control, upon the data processor identifying that the spacing is smaller than a predefined threshold value.

10. The self-propelled construction machine of claim 1, further comprising an interface for inputting the project data to the data storage.

11. A method for visualising the working environment of a construction machine moving on and working a terrain, the method comprising: displaying an image segment of a region of the terrain in a first coordinate system dependent on the position and orientation of the construction machine on the terrain; providing from data storage project data describing a shape and location of at least one project in a second coordinate system independent of the position and orientation of the construction machine, wherein the at least one project comprises one or more structures to be installed or one or more portions of the terrain to be modified; and superimposing a depiction of at least part of the at least one project located in the region of the terrain associated with the image segment on the displayed image segment, wherein at least part of the at least one project is visualised in the image segment.

12. The method of claim 11, further comprising determining position and orientation data describing the position and orientation of the construction machine in the second coordinate system.

13. The method of claim 12, wherein the position and orientation data describing the position and orientation of the construction machine are determined via a global navigation satellite system (GNSS).

14. The method of claim 12, further comprising transforming the project data into the first coordinate system based on the position and orientation of the construction machine in the second coordinate system.

15. The method of claim 11, wherein the project data describing the shape and location of the at least one project comprise data describing at least one contour of the project, the at least one contour of the project being displayed in the image segment of the terrain.

16. The method of claim 11, further comprising: determining object data describing a shape and location of at least one actual object in the image segment of the terrain, and comparing the object data with the project data.

17. The method of claim 16, further comprising determining a spacing between at least one reference point relating to at least one contour of the project and at least one reference point relating to at least one contour of the actual object.

18. The method of claim 11, further comprising determining the project data describing the shape and location of the at least one project in the second coordinate system independent of the position and orientation of the construction machine using a rover.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) In the following, different embodiments of the invention are described in more detail with reference to the drawings, in which:

(2) FIG. 1A is a side view of an embodiment of a slipform paver,

(3) FIG. 1B is a plan view of the slipform paver of FIG. 1A,

(4) FIG. 2A is a side view of an embodiment of a road milling machine,

(5) FIG. 2B is a plan view of the road milling machine of FIG. 2A,

(6) FIG. 3 shows the road surface to be worked using a road milling machine, together with the coordinate system independent of the movement of the construction machine, and the coordinate system dependent on the movement of the construction machine,

(7) FIG. 4 shows the image segment of the terrain displayed on the display unit of the road milling machine,

(8) FIG. 5A is an example of the superimposition of the contours of a project and an object in the image segment, in which the contours of the project and object do not intersect,

(9) FIG. 5B is an example of the superimposition of the contours of a project and an object in the image segment, in which the contours of the project and object intersect,

(10) FIG. 6 shows the image segment of the terrain displayed on the display unit of a slipform paver, in which the project and object exactly match,

(11) FIG. 7 shows the image segment of the terrain displayed on the display unit of a slipform paver, in which the project and object do not match, and

(12) FIG. 8 is a block diagram showing the components for visualising the working environment of the construction machine according to the invention.

DETAILED DESCRIPTION

(13) FIGS. 1A and 1B are the side view and plan view of a slipform paver as an example of a self-propelled construction machine. A slipform paver of this kind is described in detail in EP 1 103 659 B1. Since slipform pavers per se belong to the prior art, only the components of the construction machine related to the invention are described here.

(14) The slipform paver 1 comprises a machine frame 2 which is carried by a chassis 3. The chassis 3 comprises two front and two rear chain running gears 4A, 4B which are fastened to front and rear lifting columns 5A, 5B. The working direction (direction of travel) of the slipform paver is indicated by an arrow A. However, it is also possible for just one front or rear running gear to be provided.

(15) The chain running gears 4A, 4B and the lifting columns 5A, 5B form the drive device of the slipform paver for carrying out translational and/or rotational movements of the construction machine on the terrain. By raising and lowering the lifting columns 5A, 5B, the height and inclination of the machine frame 2 can be adjusted relative to the ground. The slipform paver can be moved forwards and backwards by means of the chain running gears 4A, 4B. The construction machine thus has three translational and three rotational degrees of freedom.

(16) The slipform paver 1 comprises a device 6 for moulding concrete (shown merely as an indication), referred to hereinafter as a concrete trough. The concrete trough 6 constitutes the working device of the slipform paver for installing a structure of a predefined shape on the terrain.

(17) FIGS. 2A and 2B show the side view and plan view of a road milling machine as a further example of a self-propelled construction machine, the same reference signs being used for the corresponding parts. The road milling machine 1 also comprises a machine frame 2 which is carried by a chassis 3. The chassis 3 again comprises front and rear chain running gears 4A, 4B which are fastened to front and rear lifting columns 5A, 5B. However, it is also possible for just one front or rear running gear to be provided. The road milling machine comprises a working device for modifying the terrain. In this case, this is a milling device 6 comprising a milling roller equipped with milling tools which, however, cannot be identified in the figures. The milling material is removed by means of a conveyor F.

(18) The road surface to be worked using a road milling machine is shown in FIG. 3. A road 8 delimited by curbs 7 extends over the terrain. In this embodiment, the project consists in milling off the surface of the road. In this case, it should be taken into account that certain objects O, for example manhole covers, are located in the center of the road surface, and water inlets are located at the side of the road surface. FIG. 3 shows two manhole covers 9, 10 and a water inlet 11, over which the road milling machine travels when milling off the road surface. However, the illustration in FIG. 3 does not correspond to the field of vision of the machine operator. The machine operator cannot see the objects O on the road from the cab of the construction machine, since said objects are located immediately in front of the construction machine or below the machine. In particular, the machine operator cannot identify the manhole cover when the milling roller is located just a short distance in front of the manhole cover, i.e. precisely at the time at which the machine operator must raise the milling roller. However, this region cannot be monitored by a camera either, on account of the milling material flying around in the housing of the milling roller.

(19) Since the machine operator cannot identify the manhole covers, in practice lateral markings are made at the level of the manhole covers, which markings are indicated as M.sub.1 and M.sub.2 in FIG. 3. These markings are intended to allow the machine operator or another person to identify the position of the manhole covers so that the milling roller can be raised at the correct time. However, such markings are not required in the case of the construction machine according to the invention.

(20) The location and shape of the circular manhole covers 9, 10 are each clearly delineated by three reference points O.sub.11, O.sub.12, O.sub.13 and O.sub.21, O.sub.22, O.sub.23 located on the circumference of the circle. The location and shape of the rectangular water inlet 11 are delineated by four reference points O.sub.31, O.sub.32, O.sub.33, O.sub.34 which are located at the corners of the water inlet.

(21) The project is described by the previously generated project data, which are input into a working memory 12 of the construction machine via an appropriate interface 12A (FIG. 8). The project data contain the coordinates of the reference points characteristic of the project, which are detected in a coordinate system (X, Y, Z) independent of the position and orientation of the construction machine. In this embodiment, the reference points are located on the contours 16, 17, 18 of the project, which surround the contours 13, 14, 15 of the objects O at a predefined minimum spacing Δ. Since, in this embodiment, the objects O are circular manhole covers 9, 10 and rectangular water inlets 11, the contours delineating the project are also circles and rectangles. The circular contours 16, 17 of the project are clearly delineated in the coordinate system (X, Y, Z) independent of the movement of the construction machine by the coordinates of three reference points P.sub.11, P.sub.12, P.sub.13 and P.sub.21, P.sub.22, P.sub.23, and the rectangular contour 18 of the project is delineated therein by the coordinates of four reference points P.sub.31, P.sub.32, P.sub.33, P.sub.34.

(22) The project data comprise the coordinates of the reference points of the project in the fixed coordinate system (X, Y, Z) independent of the movement of the construction machine. Said data mark the surface to be milled off, which lies outside the contours 16, 17, 18 of the project. The surface which is not to be worked is the surface located within the contours 16, 17, 18 of the project, in which the objects O are located. The project is clearly determined in this way.

(23) The project data can be determined in the following manner. The fixed coordinate system (X, Y, Z) is preferably the coordinate system of a global navigation satellite system (GNSS), with the result that the reference points of the objects can be detected in a simple manner using a measuring device (rover). The reference points P.sub.11, P.sub.12, P.sub.13 and P.sub.21, P.sub.22, P.sub.23 and P.sub.31, P.sub.32, P.sub.33, P.sub.34 of the project are determined from the reference points O.sub.11, O.sub.12, O.sub.13 and O.sub.21, O.sub.22, O.sub.23 and O.sub.31, O.sub.32, O.sub.33, O.sub.34 of the objects, while taking account of a minimum spacing Δ between the contours 16, 17, 18 of the project and the contours 13, 14, 15 of the objects. The project data can be stored in an external storage unit, for example a USB stick, and input into the internal storage unit 12 of the construction machine via the interface 12A. The construction machine can then be controlled using said data. When the road milling machine reaches a surface which is not to be worked, the milling roller is automatically raised relative to the ground. As soon as the road miller has traveled over the surface which is not to be worked, the milling roller is lowered again. This prevents the manhole covers 9, 10 or the water inlet 11 and/or the construction machine from being damaged. However, the milling roller may also be raised and lowered by means of manual intervention in the machine control, the point in time at which the intervention is to be made being signalled to the machine operator.
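
The derivation of a project contour from three measured object reference points, and the resulting decision to raise the working tool, can be sketched as follows. This is an illustrative planar reconstruction, not the patented implementation; the function names, the circle-through-three-points construction and the numeric values are assumptions:

```python
import math

def circle_through(p1, p2, p3):
    """Circumcircle (center, radius) through three planar points,
    e.g. measured reference points on a manhole-cover rim.
    Assumes the points are not collinear."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        raise ValueError("points are collinear")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def project_contour(obj_points, delta):
    """Project contour: the object circle enlarged by the minimum spacing delta."""
    center, r = circle_through(*obj_points)
    return center, r + delta

def must_raise_tool(tool_xy, contour):
    """True if the working tool lies inside a 'do not mill' project contour."""
    (cx, cy), r = contour
    return math.hypot(tool_xy[0] - cx, tool_xy[1] - cy) <= r
```

A control loop would evaluate `must_raise_tool` against the current tool position to trigger the automatic raising and lowering of the milling roller described above.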

(24) In practice, the reference points of the project may not be correctly detected in the GNSS coordinate system independent of the road milling machine relative to the objects O. There is then the risk that the manhole covers 9, 10 or the water inlet 11 are not located within the previously determined contours 16, 17, 18 of the project, resulting in damage to the manhole covers or water inlet and/or the machine.

(25) The road milling machine comprises an image recording unit 19 comprising a camera system 19A arranged on the machine frame 2, by means of which system an image segment 20A of the terrain to be worked, i.e. of the road surface comprising manhole covers and water inlets, is captured. The camera system 19A detects a region which cannot be seen by the machine operator in the cab. The image segment 20A is displayed on a display unit 20, for example an LC display. FIG. 4 shows the display of the display unit 20. While the road miller moves on the terrain, the image shown in the image segment 20A changes continuously, with the result that the machine operator can identify that he is approaching a manhole cover 9, 10 or water inlet 11 with the road miller.

(26) In addition, the road milling machine comprises a data processing unit 21, by means of which the available project data are processed. The data processing unit 21 is configured in such a way that the project located in the image segment is superimposed on the image segment 20A of the terrain displayed on the display unit 20. In this embodiment, the contours 16, 17, 18 of the project, which mark the surface to be worked and the surface not to be worked, are displayed in the image segment 20A in the manner in which they correspond to the previously determined project data. The machine operator can thus immediately identify on the display unit 20 if the project data do not correspond to the reality, i.e. if the contours 16, 17, 18 of the project do not concentrically surround the contours 13, 14, 15 of the object O at a predefined minimum spacing Δ. However, if the manhole covers and water inlets are located within the displayed contours, the road miller can be controlled without any intervention in the machine control.

(27) A coordinate system (x, y, z) dependent on the movement of the construction machine on the terrain is assigned to the image segment 20A, which coordinate system is shown in FIG. 3. The position (origin) and alignment of said coordinate system corresponds to the location and angle of view of the camera 19A on the construction machine. The location and shape of the objects O are also delineated by corresponding coordinates in this coordinate system.

(28) The coordinate system (x, y, z) dependent on the movement of the construction machine on the terrain may be a three-dimensional or a two-dimensional coordinate system. FIG. 3 shows the general case of a coordinate system having an x-axis, a y-axis and a z-axis. However, a two-dimensional coordinate system is sufficient when the curvature of the surface of the terrain is negligible and merely two-dimensional objects are observed. This presupposes, however, that the x/y plane of the coordinate system is parallel to the surface of the terrain, which is assumed to be flat. In the following, it is assumed that this is the case.

(29) The camera system may be a stereo camera system or a camera system comprising just one camera. A camera system comprising just one camera is sufficient when the curvature of the surface of the terrain is negligible and/or only two-dimensional objects are taken into account. If the camera system is a stereo camera system, three-dimensional images can also be displayed on the display unit 20 by means of known methods.

(30) In order to determine the position and orientation of the construction machine, and thus also the position and orientation (angle of view) of the camera system 19A, in the coordinate system (X, Y, Z) independent of the position and orientation of the construction machine, the construction machine comprises a device 22 which provides the position/orientation data of the construction machine (FIG. 8). This device may comprise a first GNSS receiver 22A and a second GNSS receiver 22B which are arranged in different positions S1, S2 on the construction machine. FIG. 1B shows the positions S1 and S2 of the two GNSS receivers 22A and 22B on the slipform paver. The first and second GNSS receivers 22A, 22B decode the GNSS signals from the global navigation satellite system (GNSS) and correction signals from a reference station in order to determine the position and orientation of the construction machine. Systems of this kind, which permit very precise determination of the position/orientation data, belong to the prior art. However, instead of the second GNSS receiver, an electronic compass K may also be provided in order to detect the orientation of the construction machine. FIG. 2B shows the position S1 of the first GNSS receiver 22A and the position S2 of the compass K on the road milling machine. However, the compass may also be dispensed with. The orientation can instead be calculated by determining the location of a reference point of the construction machine at successive points in time and deriving the direction of movement from the change in location. The accuracy can be additionally increased by including the steering angle in the calculation.
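
The two ways of obtaining the orientation described above (two GNSS antennas, or the change in location of one reference point between successive fixes) can be sketched as follows in a planar approximation. The function names, the assumed antenna mounting along the longitudinal axis and the 0.05 m movement threshold are illustrative assumptions, not taken from the patent:

```python
import math

def heading_from_antennas(s1, s2):
    """Machine heading (radians, East = 0, counter-clockwise) from the
    positions S1, S2 of two GNSS antennas mounted along the machine's
    longitudinal axis, S1 towards the front."""
    return math.atan2(s1[1] - s2[1], s1[0] - s2[0])

def heading_from_motion(prev_pos, cur_pos, min_dist=0.05):
    """Fallback without a second receiver or compass: direction of travel
    from the change in location of one reference point between two
    successive fixes; returns None if the machine has barely moved."""
    dx, dy = cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1]
    if math.hypot(dx, dy) < min_dist:
        return None
    return math.atan2(dy, dx)
```

The movement threshold guards against the noisy headings that very small position changes between fixes would otherwise produce.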

(31) The data processing unit 21 receives the current position/orientation data, which are continuously provided by the device 22 for determining the position and orientation of the construction machine, and transforms the project data, which describe the shape and location of the project in the coordinate system (X, Y, Z) independent of the position and orientation of the construction machine, into the machine coordinate system (x, y, z) dependent on the position and orientation of the construction machine, on the basis of the position and orientation of the construction machine in the machine-independent coordinate system. This data transformation takes place in real time. Once the coordinates of the reference points marking the contours of the project are known in the machine coordinate system, the contours 16, 17, 18 of the project are displayed in the image segment 20A (FIG. 4). The data processing operations required for generating the contours belong to the prior art.
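
In the planar case assumed above, the real-time transformation from the machine-independent system (X, Y) into the machine system (x, y) is a rigid rotation and translation. A minimal sketch, in which the function name and the convention (x ahead, y to the left) are assumptions; a full implementation would additionally handle Z and the camera's tilt:

```python
import math

def world_to_machine(point_XY, origin_XY, heading):
    """Transform a project reference point from the machine-independent
    system (X, Y) into the machine system (x, y) whose origin follows
    the machine/camera position and whose x-axis points along the
    heading (radians, East = 0, counter-clockwise)."""
    dx = point_XY[0] - origin_XY[0]
    dy = point_XY[1] - origin_XY[1]
    c, s = math.cos(heading), math.sin(heading)
    # rotate the world offset by -heading to express it in machine axes
    return (c * dx + s * dy, -s * dx + c * dy)
```

For example, with the machine at (10, 5) heading due north, a point three metres further north transforms to (3, 0): three metres straight ahead.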

(32) If no project data are present for the depicted image segment 20A, no visualisation occurs on the display unit 20. Otherwise, the machine operator is shown the relevant information as virtual objects beside the image of the actual objects (manhole covers 9, 10 or water inlet 11) by means of the contours 16, 17, 18, which contours should match the actual objects O detected in the camera image. As a result, the machine operator can constantly monitor the control of the construction machine.

(33) The data processing unit 21 may comprise an image processing unit which can automatically identify whether the actual objects O match the virtual objects, i.e. whether the actual contours 13, 14, 15 of an object O (manhole cover or water inlet) shown in the image segment are actually located within the associated virtual contours 16, 17, 18 of the project. The data processing unit 21 is configured such that the shape and location, in the image segment 20A, of the actual object O (manhole cover or water inlet) captured by the camera system 19A are determined. The data processing unit 21 can make use of the known methods of image recognition for this purpose. The shape and location of the actual object in the image segment are described by object data. For example, the circular contour of the manhole cover 9 is delineated by the three reference points O.sub.11, O.sub.12, O.sub.13 located on the contour (FIG. 3).
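
The description leaves the image-recognition method open ("known methods"). One common way to obtain the object data for a circular contour from detected rim points is a least-squares circle fit, sketched here in the Kåsa formulation; the function name and the edge-point input are assumptions:

```python
def fit_circle(points):
    """Least-squares (Kasa) circle fit to edge points of a detected
    object, e.g. a manhole-cover rim. Solves a*x + b*y + c = x^2 + y^2
    in the least-squares sense; center = (a/2, b/2),
    radius = sqrt(c + a^2/4 + b^2/4). Assumes non-collinear points."""
    # build the 3x3 normal equations M * [a, b, c]^T = v
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y in points:
        row, rhs = (x, y, 1.0), x * x + y * y
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * rhs
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for j in range(col, 3):
                M[r][j] -= f * M[col][j]
            v[r] -= f * v[col]
    sol = [0.0] * 3
    for r in (2, 1, 0):
        sol[r] = (v[r] - sum(M[r][j] * sol[j] for j in range(r + 1, 3))) / M[r][r]
    a, b, c = sol
    return (a / 2.0, b / 2.0), (c + a * a / 4.0 + b * b / 4.0) ** 0.5
```

The fitted center and radius then serve directly as the object data to be compared with the project contour.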

(34) In the data processing unit 21, the object data are compared with the project data in order to identify whether the actual objects match the virtual objects. In this embodiment, the data processing unit checks whether the contour 13 of the actual object, for example the manhole cover 9, is located within the contour 16 of the project. For this purpose, the data processing unit 21 checks whether the two contours 13, 16 intersect. If the contours 13, 16 do not intersect, it is concluded that the project data correspond to the reality. Otherwise, it is concluded that the project data have been incorrectly determined.

(35) FIG. 5A shows the case in which the object data match the project data, i.e. the contours 13, 16 do not have any intersection point, while FIG. 5B shows the case in which the object data do not match the project data, i.e. the contours 13, 16 intersect at two points R.
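For circular contours, the intersection test of FIGS. 5A and 5B reduces to comparing the center distance with the radii; a minimal sketch, in which the function name is an assumption:

```python
import math

def contours_intersect(c1, r1, c2, r2):
    """Do two circular contours (e.g. object contour 13 and project
    contour 16) intersect at two points R, as in FIG. 5B? This is the
    case exactly when the distance between the centers lies strictly
    between |r1 - r2| and r1 + r2."""
    d = math.hypot(c1[0] - c2[0], c1[1] - c2[1])
    return abs(r1 - r2) < d < r1 + r2
```

A `False` result covers both FIG. 5A (one contour nested inside the other) and fully separated contours; the nested case is the one expected when the project data are correct.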

(36) Furthermore, in a preferred embodiment the data processing unit 21 can also identify whether the minimum spacing Δ is adhered to. For this purpose, the data processing unit determines two reference points P.sub.A1 and P.sub.A2 which are assigned to the contour 13 of the object and the contour 16 of the project respectively. For example, points which are located particularly close to one another on the circular contours 13, 16 may be determined as reference points P.sub.A1, P.sub.A2 (FIG. 5A). The data processing unit 21 determines the spacing a between the reference points P.sub.A1, P.sub.A2 located on the contours and compares the spacing a with a predefined threshold value. If the spacing a is smaller than the predefined threshold value, the contour 13 of the object is still located within the contour 16 of the project, since the contours 13, 16 do not intersect; however, it is concluded that the minimum spacing Δ is not adhered to, with the result that there is still a risk of damage to the manhole cover or the construction machine. However, the reference points may also be the centers or centroids of the circular contours. In the case of an exact alignment taking account of the predefined minimum spacing Δ, the contours 13, 16 have a common center or centroid, i.e. the spacing between the centers should be as small as possible.
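For circular contours nested as in FIG. 5A, the spacing a between the closest points P.sub.A1, P.sub.A2 and the threshold comparison can be expressed in closed form; an illustrative sketch, in which the function names and the alarm wiring are assumptions:

```python
import math

def min_clearance(obj_center, obj_r, proj_center, proj_r):
    """Spacing a between the closest points of an object contour nested
    inside a project contour: the project radius minus the farthest
    reach of the object circle from the project center."""
    d = math.hypot(obj_center[0] - proj_center[0],
                   obj_center[1] - proj_center[1])
    return proj_r - (d + obj_r)

def spacing_alarm(obj_center, obj_r, proj_center, proj_r, threshold):
    """True if the minimum spacing falls below the predefined threshold,
    i.e. the alarm unit should be triggered even though the contours
    do not intersect."""
    return min_clearance(obj_center, obj_r, proj_center, proj_r) < threshold
```

When the centers coincide, the clearance equals the difference of the radii, matching the exact-alignment case described above.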

(37) The above embodiment is to be understood merely as an example of an embodiment in order to compare the project data and the object data. However, the data may also be evaluated using all other known algorithms in order to conclude whether the actual objects match the virtual objects.

(38) The construction machine comprises an alarm unit 23 which emits an optical and/or acoustic and/or tactile alarm if the data processing unit 21 has identified that the two contours 13, 16 do not match and/or that the spacing a is smaller than a predefined threshold value (FIG. 8). The machine operator can also be made aware of an incorrect determination of the object data by means of coloured underlays on specific surfaces, by hatchings or by markings. The spacing “a” can also be displayed on the display unit 20.

(39) In the following, a further embodiment of the invention will be described with reference to FIGS. 6 and 7, which embodiment differs from the previous embodiment in that the project is not the modification of the terrain by means of a road milling machine (FIG. 2) but rather the installation of a structure by means of a slipform paver (FIG. 1). Like the road milling machine, the slipform paver comprises an image recording unit 19 and a data processing unit 21, as well as a working memory 12 for providing the project data (FIG. 8). The corresponding parts are provided with the same reference signs.

(40) In the present embodiment, the project of the slipform paver is a traffic island which is laterally delimited by a concrete curb 25. The curb 25 comprises, for example, a straight portion 25A which is adjoined by a semicircular portion 25B. The curb 25 is to be located beside a rectangular water inlet 26, which requires exact control of the slipform paver.

(41) The project data again comprise the coordinates of reference points characteristic of the project, which are detected in a coordinate system (X, Y, Z) independent of the position and orientation of the construction machine. The project data describe the shape and location of the curb 25. The shape and location of the straight portion 25A may, for example, each be delineated by two reference points, P.sub.1, P.sub.2 and P.sub.3, P.sub.4 respectively, which are located at the beginning and end of the inner and outer contours 27, 28 respectively of the curb 25. The semicircular portion 25B may, for example, be delineated by three reference points P.sub.2, P.sub.5, P.sub.6 and P.sub.4, P.sub.7, P.sub.8, which are located on the inner and outer contours 27, 28 respectively.

(42) The previously determined project data, which relate to the GNSS coordinate system (X, Y, Z) independent of the position and orientation of the slipform paver, are input into the working memory 12 of the slipform paver via the interface 12A. The control unit of the slipform paver is configured such that the slipform paver moves along a path which corresponds to the course of the curb 25 to be installed.

(43) FIGS. 6 and 7 show the image segment 20A captured by the camera system 19A of the image recording unit 19 and displayed on the display unit 20, in which image segment the terrain located in front of the slipform paver in the working direction A and a part of the slipform paver comprising the concrete trough 6 can be identified.

(44) The device 22 for determining the position and orientation of the slipform paver on the terrain continuously calculates the current position/orientation data, the data processing unit 21 transforming the project data present in the GNSS system (X, Y, Z) independent of the position and orientation of the slipform paver into the machine coordinate system (x, y, z) dependent on the position and orientation of the slipform paver, which machine coordinate system corresponds to the angle of view of the camera system. Once the coordinates of the reference points in the machine coordinate system have been determined, the inner and outer contours 27, 28 of the straight and semicircular portion 25A, 25B are superimposed on the camera image.
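Superimposing transformed machine-system points on the camera image requires a camera model, which the description does not detail. As an illustration only, a simple pinhole model is sketched below; the mounting height, pitch angle, focal length and function name are all assumptions, and a real system would use a calibrated intrinsic/extrinsic camera model:

```python
import math

def ground_to_pixel(pt_xy, cam_height, pitch, f_px, img_w, img_h):
    """Project a ground point given in the machine system (x ahead,
    y left, metres, ground plane z = 0) into pixel coordinates of a
    pinhole camera mounted at the machine origin at height cam_height,
    pitched down by `pitch` radians.
    Returns (u, v) or None if the point is behind the camera."""
    x, y = pt_xy
    d = (x, y, -cam_height)                            # ray camera -> ground point
    fwd = (math.cos(pitch), 0.0, -math.sin(pitch))     # optical axis
    right = (0.0, -1.0, 0.0)                           # image x-axis in machine frame
    down = (-math.sin(pitch), 0.0, -math.cos(pitch))   # image y-axis in machine frame
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    z_c = dot(d, fwd)
    if z_c <= 0.0:
        return None
    u = img_w / 2.0 + f_px * dot(d, right) / z_c
    v = img_h / 2.0 + f_px * dot(d, down) / z_c
    return u, v
```

Applying this to each transformed reference point of the contours 27, 28 and connecting the resulting pixels yields the overlay shown in FIGS. 6 and 7.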

(45) FIGS. 6 and 7 show an option for depicting the curb 25 in the image segment 20A by means of the contours 27, 28 which show the machine operator the course of the curb to be produced by the slipform paver, when the stored project data form the basis of the control. In addition to the inner and outer contours 27, 28, coloured underlays, hatchings, subsidiary lines or markings can also be generated by the data processing unit 21 and displayed on the display unit 20 for the purpose of visualising the curb 25 in the camera image. The machine operator can check the correct course of the curb 25 in the image segment 20A. The operator can identify in advance whether the curb 25 extends beside the water inlet 26 for example.

(46) FIG. 6 shows the case of a correct course of the curb 25 immediately beside, i.e. at a predefined minimum spacing from, the water inlet 26, while FIG. 7 shows the case in which the curb 25 extends over the water inlet 26. In the second case, the alarm unit 23 generates an alarm signal so that the machine operator can intervene in the machine control.

(47) In a preferred embodiment, the data processing unit 21 determines, by means of image recognition, the coordinates of reference points O.sub.1, O.sub.2, O.sub.3, O.sub.4 of the rectangular water inlet 26 in the machine coordinate system (x, y, z) corresponding to the camera image. Since the standard shape and size of the water inlet 26 is known, the coordinates of the corners of the water inlet for example can be determined by means of image recognition without significant mathematical complexity. Said coordinates then provide the object data which are compared with the project data in order to be able to identify whether the plan corresponds to the reality. For this purpose, the data processing unit 21 can check, for example, whether the contours of the curb and water inlet intersect, and/or the data processing unit can calculate, for example, the spacing between the contours, as is described with reference to the other embodiment.