METHOD FOR PERCEIVING AN AUGMENTED REALITY FOR AN AGRICULTURAL UTILITY VEHICLE

20220332339 · 2022-10-20

    Abstract

    A method for perceiving an augmented reality during a work assignment of an agricultural utility vehicle includes processing data of a data communication system of the agricultural utility vehicle via a control unit, sending the data processed via the control unit to a perception device, projecting the data received from the control unit as an item of optical information via the perception device, and jointly perceiving the real surroundings of the agricultural utility vehicle and the projected optical information via the perception device.

    Claims

    1. A method for perceiving an augmented reality during a work assignment of an agricultural utility vehicle, comprising: processing data of a data communication system of the agricultural utility vehicle via a control unit; sending the data processed via the control unit to a perception device; projecting the data received from the control unit as an item of optical information via the perception device; and jointly perceiving the real surroundings of the agricultural utility vehicle and the projected optical information via the perception device.

    2. The method of claim 1, wherein the perception device includes a personally wearable viewing unit.

    3. The method of claim 1, wherein the perception device includes a viewing panel of a vehicle cab of the utility vehicle.

    4. The method of claim 3, wherein the perception device includes a windshield of the vehicle cab.

    5. The method of claim 1, wherein the data of the data communication system comprises at least one of the following data groups: agronomic data, which represent agronomic features of the present geographic usage region of the utility vehicle; working data, which represent features of the present work assignment of the utility vehicle; logistics data, which represent features of at least one object or obstacle in the present geographic usage region; position data, which represent a geographic position of the utility vehicle; orientation data, which represent an orientation of the utility vehicle relative to the present geographic usage region; and location data, which represent a location of the perception device relative to the utility vehicle.

    6. The method of claim 1, wherein during the processing of the data via the control unit, individual data are assigned supplementary geo-reference data, which represent a real geographic position of the optical information to be projected.

    7. The method of claim 6, wherein the optical information is projected based in part on the assigned geo-reference data as a three-dimensional relative arrangement with respect to the perception device.

    8. The method of claim 1, wherein the control unit is integrated inside the utility vehicle.

    9. The method of claim 1, wherein the control unit is arranged outside the utility vehicle.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0036] The detailed description of the drawings refers to the accompanying FIGURES in which:

    [0037] FIG. 1 schematically shows components of an agricultural utility vehicle, which has a data communication system, according to an embodiment.

    [0038] Like reference numerals are used to indicate like elements throughout the several FIGURES.

    DETAILED DESCRIPTION

    [0039] The embodiments or implementations disclosed in the above drawings and the following detailed description are not intended to be exhaustive or to limit the present disclosure to these embodiments or implementations.

    [0040] FIG. 1 schematically shows components of an agricultural utility vehicle 10 (in short utility vehicle 10 hereinafter), which has a data communication system 12. In some embodiments, the data communication system 12 is designed as a bus system (e.g., ISO, CAN). Various data of the data communication system 12 are processed in a control unit 14 and sent as optical data D_opt to a perception device 16.
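    The data flow described above (bus data in, processed optical data D_opt out to a perception device) can be sketched as follows. This is an illustrative sketch only; the class and method names (ControlUnit, PerceptionDevice, attach, process, receive) are hypothetical, and the disclosure does not specify any particular implementation of control unit 14.

```python
# Illustrative sketch: messages from the data communication system 12 are
# processed by control unit 14 and forwarded as optical data D_opt to a
# registered perception device 16. All names here are assumptions.

class ControlUnit:
    """Processes bus data into optical data for attached perception devices."""

    def __init__(self):
        self.subscribers = []  # perception devices registered to receive D_opt

    def attach(self, perception_device):
        self.subscribers.append(perception_device)

    def process(self, bus_data: dict) -> dict:
        # Minimal "processing": tag each datum with a rendering hint before
        # sending it on as optical data D_opt.
        d_opt = {key: {"value": value, "render": "overlay"}
                 for key, value in bus_data.items()}
        for device in self.subscribers:
            device.receive(d_opt)
        return d_opt


class PerceptionDevice:
    """Stands in for wearable glasses (16) or a windshield panel (16')."""

    def __init__(self):
        self.last_received = None

    def receive(self, d_opt: dict):
        self.last_received = d_opt


unit = ControlUnit()
glasses = PerceptionDevice()
unit.attach(glasses)
unit.process({"D_pos": (52.1, 10.5)})
```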

    [0041] The perception device 16 is designed as a viewing unit (for example, special glasses) personally wearable by the user or driver and is only schematically shown here. The perception device 16 projects optical data D_opt received from the control unit 14 as items of optical information I_opt in such a way that the user (for example, a vehicle driver) perceives the real surroundings of the utility vehicle 10 and the item(s) of optical information I_opt jointly by means of, or via, the perception device 16.

    [0042] Alternatively, or additionally, a perception device 16′ is provided. It is designed as a viewing panel of a vehicle cab 18 of the utility vehicle 10 and is schematically shown by dashed lines. The viewing panel 16′ is for example a windshield of the vehicle cab 18. If the viewing panel 16′ is used as the perception device, it receives optical data D_opt from the control unit 14 and projects the optical data D_opt as items of optical information I_opt.

    [0043] In the illustrated embodiment, the control unit 14 is integrated inside the utility vehicle 10. The control unit 14 is designed here, for example, as a microprocessor unit or the like.

    [0044] Alternatively, the control unit 14 is replaced or supplemented by a control unit 14′ arranged externally with respect to the utility vehicle 10. This external control unit 14′ is indicated by dashed lines and, in some embodiments, is a component of a stationary control center. The external control unit 14′ fundamentally operates like the internal control unit 14, i.e., various data of the data communication system 12 are processed in the control unit 14′ and sent as optical data D_opt to the perception device 16 and/or 16′.

    [0045] Different data and/or data groups are available on the data communication system 12. D_agr denotes agronomic data which represent agronomic features of the present geographic usage region of the utility vehicle 10. D_arb denotes work data which represent features of the present work assignment of the utility vehicle 10. D_log denotes logistics data which represent features of at least one object (for example, further working utility vehicles or obstacles) in the present geographic usage region. D_pos denotes position data, which represent a geographic position of the utility vehicle 10. For this purpose, the data communication system 12 can access the data of a position antenna 20 (for example, a GPS receiver) of the utility vehicle 10. D_or denotes orientation data, which represent an orientation (e.g., location, inclination) of the utility vehicle 10 or its vehicle cab 18 relative to the present geographic usage region. For this purpose, the data communication system 12 can access the data of a sensor system 22. The sensor system 22, for example, contains one or more inclination sensor(s).
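    The data groups enumerated in this paragraph can be collected in a single record. The field names below mirror the reference labels from the disclosure (D_agr, D_arb, D_log, D_pos, D_or), but the record structure itself is an assumption for illustration.

```python
# Illustrative record of the data groups available on the data
# communication system 12. Field types are assumptions.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class BusData:
    d_agr: Optional[dict] = None                  # agronomic features of the usage region
    d_arb: Optional[dict] = None                  # features of the present work assignment
    d_log: Optional[List[dict]] = None            # objects/obstacles in the usage region
    d_pos: Optional[Tuple[float, float]] = None   # geographic position from antenna 20
    d_or: Optional[Tuple[float, float]] = None    # orientation from sensor system 22


sample = BusData(d_pos=(52.27, 10.52), d_or=(0.5, -1.2))
```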

    [0046] By means of, or via, a suitable location sensor system 24, a present relative location of the perception device 16 relative to the vehicle cab 18 is registered and passed on as location data to the data communication system 12. In some embodiments, the location data contain a position pos(k) and an orientation or an angle of inclination w(k) of the perception device 16 relative to the vehicle cab 18.
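    Chaining the vehicle's orientation with the device-relative angle w(k) yields the perception device's orientation in world coordinates. The one-dimensional helper below is a deliberate simplification of that pose chaining; the function name and degree convention are assumptions, not part of the disclosure.

```python
# Illustrative 1-D pose chaining: the device's relative angle w(k) from the
# location sensor system 24 is combined with the vehicle's heading to give
# the device's heading in world coordinates.

def device_heading_world(vehicle_heading_deg: float, w_k_deg: float) -> float:
    """Return the perception device's world heading, wrapped to [0, 360)."""
    return (vehicle_heading_deg + w_k_deg) % 360.0
```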

    [0047] Furthermore, geo-reference data D_geo (for example, data from digital topography maps) are applied to the data communication system 12, which are also processed in the control unit 14 and assist a three-dimensional representation of various items of optical information I_opt. For this purpose, for example, the data D_agr, D_arb, and D_log can be assigned specific geo-reference data D_geo. The user can select at a user interface 26, for example, which agronomic data or which agronomic data map is to be processed in the control unit 14 or 14′ and is to be three-dimensionally projected by means of, or via, assigned geo-reference data D_geo.
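    The assignment of geo-reference data D_geo to selected data items, so that each item of optical information I_opt carries the real geographic position at which it should appear, can be sketched as below. The key names and the (lat, lon, elevation) tuple format are hypothetical.

```python
# Illustrative sketch of assigning geo-reference data D_geo to data items
# (e.g., D_agr) selected at the user interface 26, yielding items that carry
# the real geographic position of their optical information I_opt.

def assign_geo_reference(data_items: dict, geo_map: dict) -> dict:
    """Attach geo-reference data to each data item that has an entry in geo_map."""
    referenced = {}
    for key, value in data_items.items():
        if key in geo_map:
            referenced[key] = {"value": value, "geo": geo_map[key]}
    return referenced


selected = {"D_agr": {"map": "yield"}}           # user-selected agronomic data map
geo = {"D_agr": (52.27, 10.52, 85.0)}            # hypothetical lat, lon, elevation
out = assign_geo_reference(selected, geo)
```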

    [0048] The terminology used herein is for the purpose of describing example embodiments or implementations and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that any use of the terms “has,” “includes,” “comprises,” or the like, in this specification, identifies the presence of stated features, integers, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

    [0049] Those having ordinary skill in the art will recognize that terms such as “above,” “below,” “upward,” “downward,” “top,” “bottom,” etc., are used descriptively for the FIGURES, and do not represent limitations on the scope of the present disclosure, as defined by the appended claims. Furthermore, the teachings may be described herein in terms of functional and/or logical block components or various processing steps, which may be comprised of any number of hardware, software, and/or firmware components configured to perform the specified functions.

    [0050] Terms of degree, such as “generally,” “substantially,” or “approximately” are understood by those having ordinary skill in the art to refer to reasonable ranges outside of a given value or orientation, for example, general tolerances or positional relationships associated with manufacturing, assembly, and use of the described embodiments or implementations.

    [0051] As used herein, “e.g.,” is utilized to non-exhaustively list examples and carries the same meaning as alternative illustrative phrases such as “including,” “including, but not limited to,” and “including without limitation.” Unless otherwise limited or modified, lists with elements that are separated by conjunctive terms (e.g., “and”) and that are also preceded by the phrase “one or more of” or “at least one of” indicate configurations or arrangements that potentially include individual elements of the list, or any combination thereof. For example, “at least one of A, B, and C” or “one or more of A, B, and C” indicates the possibilities of only A, only B, only C, or any combination of two or more of A, B, and C (e.g., A and B; B and C; A and C; or A, B, and C).

    [0052] While the above describes example embodiments or implementations of the present disclosure, these descriptions should not be viewed in a restrictive or limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the appended claims.