METHOD AND SYSTEM FOR DISPLAYING A 3D MODEL
20210200192 · 2021-07-01
Inventors
CPC classification: G05B19/41885 (Physics); G06T19/20 (Physics); G06T19/00 (Physics)
International classification: G05B19/418 (Physics)
Abstract
The disclosure relates to a method for displaying a 3D model of an object, wherein the object includes a plurality of parts arranged in original positions. The method includes: actuating a control device by a user to select a region of the 3D model, wherein the parts of the object located in the selected region form the selected parts; and displaying the 3D model such that the selected parts are displayed in end positions in which they are moved away from their original positions such that distances between the selected parts increase. Displaying the selected parts in this manner allows the user to see them more clearly.
Claims
1. A method for displaying a three-dimensional (3-D) model of an object having a multiplicity of parts arranged in original positions, the method comprising: actuating a control device to select a region of the 3-D model, wherein parts of the multiplicity of parts of the object in the selected region form selected parts; and displaying the 3-D model in such a manner that the selected parts are displayed in end positions in which the selected parts are moved away from their original positions such that distances between the selected parts increase.
2. The method of claim 1, wherein the 3-D model is displayed in such a manner that parts of the multiplicity of parts of the object which are outside the selected region are displayed in their original positions.
3. The method of claim 1, wherein the selected parts are moved away from their original positions when displaying the 3-D model in such a manner that distances between the selected parts and a reference point inside the selected region increase.
4. The method of claim 3, wherein the 3-D model is displayed in a virtual reality (VR) environment and/or an augmented reality (AR) environment.
5. The method of claim 4, wherein the control device emits virtual beams in such a manner that the virtual beams are visible only in the VR environment and/or in the AR environment, and wherein the virtual beams are used to select the selected region during movement of the control device.
6. The method of claim 5, wherein the virtual beams are emitted by the control device in a form of truncated cones in the VR environment and/or the AR environment, and wherein a region of the object intersected by the virtual beams forms the selected region.
7. The method of claim 6, wherein the reference point is arranged on a central axis of a truncated cone formed by the virtual beams in a form of truncated cones.
8. The method of claim 1, wherein the control device is actuated in such a manner that: a position of a reference point is selected; a distance between the control device and the reference point is selected; an extent of the increase in the distances between the selected parts is selected; and/or a size of the selected region is determined.
9. The method of claim 1, further comprising: actuating the control device in such a manner that the selection of the selected region is canceled; and displaying the 3-D model in such a manner that the selected parts of the selected region are displayed in their original positions.
10. The method of claim 1, further comprising: selecting a predetermined part of the selected parts by the control device and/or a further control device.
11. The method of claim 1, wherein the 3-D model is displayed in such a manner that a transparency of at least some of the parts of the multiplicity of parts of the object is increased.
12. A computer program product which, when executed on a program-controlled device, causes the program-controlled device to: actuate the program-controlled device to select a region of a three-dimensional (3-D) model of an object having a multiplicity of parts arranged in original positions, wherein parts of the multiplicity of parts of the object in the selected region form selected parts; and display the 3-D model on a display device in such a manner that the selected parts are displayed in end positions in which the selected parts are moved away from their original positions such that distances between the selected parts increase.
13. A system for displaying a 3-D model of an object having a multiplicity of parts arranged in original positions, the system comprising: a control device configured to be actuated by a user in such a manner that a region of the 3-D model is selected, wherein the parts of the multiplicity of parts of the object in the selected region form selected parts; and a display device for displaying the 3-D model in such a manner that the selected parts are displayed in end positions in which the selected parts are moved away from their original positions such that distances between the selected parts increase.
14. The system of claim 13, wherein the display device is configured to display the 3-D model in such a manner that parts of the multiplicity of parts of the object which are outside the selected region are displayed in their original positions.
15. The system of claim 13, further comprising: a control device in a form of a flashlight, wherein the control device comprises: an actuation unit for switching the control device on and off; an extent unit for selecting an extent of the increase in the distances between the selected parts; a selection unit for selecting a predetermined part of the selected parts; and/or a determination unit for determining a position and/or a size of the selected region.
16. The method of claim 11, wherein the at least some of the parts of the multiplicity of parts of the object comprise parts that are not the selected parts of the multiplicity of parts of the object.
17. The method of claim 1, wherein the 3-D model is displayed in a virtual reality (VR) environment and/or an augmented reality (AR) environment.
18. The method of claim 1, wherein the control device emits virtual beams in such a manner that the virtual beams are visible only in a virtual reality (VR) environment and/or in an augmented reality environment, and wherein the virtual beams are used to select the selected region during movement of the control device.
19. The method of claim 18, wherein the virtual beams are emitted by the control device in a form of truncated cones in the VR environment and/or the AR environment, and wherein a region of the object intersected by the virtual beams forms the selected region.
20. The method of claim 19, wherein the selected parts are moved away from their original positions when displaying the 3-D model in such a manner that distances between the selected parts and a reference point inside the selected region increase, and wherein the reference point is arranged on a central axis of a truncated cone formed by the virtual beams in a form of truncated cones.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0043] The exemplary embodiments which are described below relate to further advantageous configurations and aspects of the disclosure. The disclosure is explained in more detail below on the basis of certain embodiments with reference to the enclosed figures.
[0050] In the figures, identical or functionally identical elements have been provided with the same reference signs unless stated otherwise.
DETAILED DESCRIPTION
[0052] A 3-D model 1 of an object 3 is displayed on the screen 2 in
[0053] The control device 10 is in the form of a flashlight and is actuated by a user picking it up and moving it. The actuation of the control device 10 is explained in yet more detail below with reference to
[0054] The system 20 is suitable for carrying out a method for displaying a 3-D model 1. Such a method is shown, for example, in
[0055] In act S1, the control device 10 is actuated by a user 7 in order to select a selected region 5 of the 3-D model 1. For this purpose, the user 7 picks up the control device 10 with his hand 13 and moves it in such a manner that virtual beams 11, which are emitted by the control device 10, point in the direction of the 3-D model 1. The virtual beams 11 are visible only in the VR environment, that is to say through the VR headset.
[0056] The user 7 moves the control device 10 in his hand 13 in such a manner that the virtual beams 11 emitted in the form of truncated cones intersect the object 3. The region of the object 3 within the beams 11 in the form of truncated cones forms the selected region 5. This is a region which the user 7 would like to visualize in more detail.
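The disclosure does not specify how the intersection of the truncated-cone beams 11 with the parts 4 is computed. One minimal sketch, assuming the selection test is applied to each part's centre point, is a point-in-truncated-cone test (all names — apex, axis, half_angle, d_near, d_far — are illustrative, not taken from the disclosure):

```python
import math

def in_selection_cone(p, apex, axis, half_angle, d_near, d_far):
    """Return True if point p lies inside the truncated selection cone.

    apex: position of the control device; axis: unit vector along the
    central axis MA; half_angle: half of the opening angle alpha;
    d_near/d_far: axial distances bounding the truncated cone.
    """
    v = tuple(pi - ai for pi, ai in zip(p, apex))
    along = sum(vi * ai for vi, ai in zip(v, axis))    # projection on MA
    if not (d_near <= along <= d_far):
        return False                                    # outside depth range
    radial = math.sqrt(max(sum(vi * vi for vi in v) - along * along, 0.0))
    return radial <= along * math.tan(half_angle)       # inside cone surface

def select_parts(parts, apex, axis, half_angle, d_near, d_far):
    """Parts whose centres fall inside the beam cone form the selected parts."""
    return [part for part in parts
            if in_selection_cone(part["centre"], apex, axis,
                                 half_angle, d_near, d_far)]
```

A real implementation would more likely test part bounding volumes rather than centre points, but the cone-membership geometry is the same.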
[0057] The parts 4 of the object 3 which are inside the selected region 5 form selected parts 14. The side surfaces of the selected parts 14 are illustrated using dotted lines in
[0058] In act S2, the 3-D model 1 is displayed in such a manner that the selected parts 14 are displayed in end positions.
[0059] In his VR headset 12, the user 7 sees how the selected parts 14 “fly apart”. As a result, the user 7 may better see the selected parts 14. He also sees, in particular, the selected parts 14 which were previously concealed by other parts 4.
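The "flying apart" of the selected parts 14 can be sketched as scaling each part's offset from the reference point 6. This is a hypothetical formula: the disclosure only requires that the distances between the selected parts, and to the reference point, increase, and a uniform scale factor greater than 1 achieves both:

```python
def exploded_position(original, reference, factor):
    """Compute a part's end position by scaling its offset from the
    reference point.

    factor > 1 moves the part away from the reference point (the
    'extent' of the increase); factor == 1 keeps the original position.
    """
    return tuple(r + factor * (o - r) for o, r in zip(original, reference))
```

Because every selected part is scaled about the same reference point, pairwise distances between the selected parts grow by the same factor, matching the behaviour recited in claims 1 and 3.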
[0061] The system 20 may alternatively also carry out a method for displaying a 3-D model 1 according to a second embodiment. Such a method is described below on the basis of
[0062] In the method according to the second embodiment,
[0063] In act S4, the control device 10 is actuated by the user 7 again, with the result that the selection of the selected region 5 is canceled. For this purpose, the user 7 moves the control device 10 away from the selected region 5, with the result that the virtual beams 11 no longer intersect the object 3 in the selected region 5. In particular, the selected parts 14 are displayed in their end positions only as long as the user 7 points to the selected region 5 with the control device 10.
[0064] In act S5, the 3-D model 1 is displayed again in such a manner that the previously selected parts 14 are displayed in their original positions again. The previously selected parts 14 are moved together again, with the result that a distance between the respective previously selected parts 14 and the reference point 6 is reduced again.
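The return of the previously selected parts 14 to their original positions can be sketched as a simple interpolation step. The disclosure does not specify whether or how this return is animated, so the following is one illustrative choice:

```python
def restore_step(current, original, t):
    """Linear step of the return movement toward the original position.

    t = 0.0 keeps the current (end) position; t = 1.0 snaps the part
    fully back to its original position; intermediate values animate
    the parts moving together again.
    """
    return tuple(c + t * (o - c) for c, o in zip(current, original))
```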
[0065] Acts S1-S5 may be repeated as often as desired. As a result, the user 7 may select and investigate individual regions of the object 3 in succession.
[0067] The beams 11 emitted by the control device 10 are emitted in the form of truncated cones with an opening angle α. The opening angle α is adjustable by virtue of the user 7 rotating the adjustment ring 16. A size of the selected region 5 may be changed by varying the opening angle α.
[0068] The beams 11 in the form of truncated cones are emitted along a central axis MA. The reference point 6 is on this central axis MA.
[0069] The user may adjust a distance d between the reference point 6 and the control device 10 by a sliding button 15 of the control device 10. The sliding button 15 and the adjustment ring 16 form a determination unit, in particular.
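Since the reference point 6 lies on the central axis MA at the distance d set by the sliding button 15, its position follows directly from the pose of the control device 10. A minimal sketch (names are illustrative):

```python
def reference_point(device_pos, axis, d):
    """Place the reference point on the central axis MA at distance d
    from the control device.

    device_pos: position of the control device; axis: unit vector along
    MA; d: distance set with the sliding button.
    """
    return tuple(p + d * a for p, a in zip(device_pos, axis))
```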
[0070] The user 7 may determine a depth h of the selected region 5 by a voice command. The control device 10 may therefore be operated using a single hand 13. Furthermore, it is possible to provide a haptic input device for adjusting the depth h, for example, a sliding button. Furthermore, a two-dimensional touchpad may also be used to adjust both the distance d and the depth h. In addition, the depth h may also be adjusted in some embodiments by rotating the control device 10 about its longitudinal axis.
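The disclosure leaves open exactly how the depth h positions the selected region along the axis. One plausible reading, sketched below as an assumption, is that the region extends over a depth h centred on the reference point at distance d:

```python
def depth_bounds(d, h):
    """Axial bounds of the selected region, assuming a depth h centred
    on the reference point at distance d (clamped at the device)."""
    return (max(d - h / 2.0, 0.0), d + h / 2.0)
```

These bounds would truncate the selection cone to the depth range the user has chosen.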
[0071] In embodiments, the control device 10 also includes an actuation unit for switching the control device 10 on and off and/or an extent unit for selecting an extent of the increase in the distances between the selected parts 14.
[0072] Although the present disclosure has been described on the basis of exemplary embodiments, it may be modified in various ways. Instead of the described motor, the object 3 may also be, for example, any desired machine of an industrial installation or an entire industrial installation. The parts 4 of the object 3 may also be arranged inside the object 3 in a different manner to that shown in
[0073] The control device 10 may also be in the form of a remote control having a multiplicity of buttons. Alternatively, the control device 10 may also be a movement detection device which detects movements of the user 7. The described control device 10 in the form of a flashlight may also be modified. It may have, for example, various buttons for adjusting the distance d and/or the opening angle α.
[0074] It is to be understood that the elements and features recited in the appended claims may be combined in different ways to produce new claims that likewise fall within the scope of the present disclosure. Thus, whereas the dependent claims appended below depend from only a single independent or dependent claim, it is to be understood that these dependent claims may, alternatively, be made to depend in the alternative from any preceding or following claim, whether independent or dependent, and that such new combinations are to be understood as forming a part of the present specification.
[0075] While the present disclosure has been described above by reference to various embodiments, it may be understood that many changes and modifications may be made to the described embodiments. It is therefore intended that the foregoing description be regarded as illustrative rather than limiting, and that it be understood that all equivalents and/or combinations of embodiments are intended to be included in this description.