System and Method Using a System

20230169684 · 2023-06-01

    Abstract

    A system and a method using a system for planning a use includes at least one marking, a control and evaluation unit, a database, a display unit, at least one time of flight sensor for a spatial scanning of a real environment, and at least one camera for imaging the real environment. The real environment in a spatial model can be displayed as a virtual environment on the display unit. The marking in the real environment is arranged at a position and has an orientation. The position and the orientation of the marking can be detected at least by the time of flight sensor, and the position and orientation of the marking are linked to a virtual sensor model. The virtual sensor model in the spatial model of the virtual environment can be displayed on the display unit at the position and having the orientation of the marking.

    Claims

    1. A system for planning a use, the system comprising at least one marking, a control and evaluation unit, a database, a display unit, at least one spatial model of a real environment, and at least one camera for imaging the real environment; wherein the real environment in the spatial model can be displayed as a virtual environment on the display unit; wherein the marking in the real environment is arranged at a position and has an orientation; wherein the position and the orientation of the marking can be detected at least by the camera; wherein the position and orientation of the marking are linked to a virtual sensor model; wherein the virtual sensor model in the spatial model of the virtual environment can be displayed on the display unit at the position and having the orientation of the marking; wherein the virtual sensor model has at least one virtual detection zone, wherein the virtual detection zone in the spatial model of the virtual environment can be displayed on the display unit.

    2. The system in accordance with claim 1, wherein the spatial model is present as a CAD model on the basis of a real environment.

    3. The system in accordance with claim 1, wherein the spatial model is formed or was formed by means of a 3D sensor.

    4. The system in accordance with claim 3, wherein the position and the orientation of the marking can be detected by the 3D sensor and the camera.

    5. The system in accordance with claim 3, wherein the 3D sensor is a stereo camera or a time of flight sensor.

    6. The system in accordance with claim 1, wherein the time of flight sensor is a laser scanner or a 3D time of flight camera.

    7. The system in accordance with claim 1, wherein the markings are at least two-dimensional matrix encodings.

    8. The system in accordance with claim 1, wherein the markings are at least real 3D models.

    9. The system in accordance with claim 1, wherein the marking is arranged at the real environment by means of a holding device.

    10. The system in accordance with claim 1, wherein the position and orientation of a second marking is linked to a virtual object, with the virtual object in the spatial model of the virtual environment being displayed on the display unit at the position and having the orientation of the second marking.

    11. The system in accordance with claim 1, wherein the display unit is a smartphone, a tablet computer, augmented reality glasses/headset, or virtual reality glasses/headset.

    12. The system in accordance with claim 1, wherein the markings have at least one transponder.

    13. The system in accordance with claim 1, characterized in that the virtual sensor model is configurable, with the configuration of the virtual sensor model being able to be transferred to the real sensor.

    14. The system in accordance with claim 1, characterized in that at least one virtual solution is transferred to a simulation after a virtual planning.

    15. A method using a system for planning a use, the system comprising at least one marking, a control and evaluation unit, a database, a display unit, at least one spatial model of a real environment, and at least one camera for imaging the real environment; wherein the real environment in the spatial model is displayed as a virtual environment on the display unit; wherein the marking in the real environment is arranged at a position and has an orientation; wherein the position and the orientation of the marking are detected at least by the camera; wherein the position and orientation of the marking are linked to a virtual sensor model; wherein the virtual sensor model in the spatial model of the virtual environment can be displayed on the display unit at the position and having the orientation of the marking; wherein the virtual sensor model has at least one virtual detection zone, wherein the virtual detection zone in the spatial model of the virtual environment is displayed on the display unit.

    Description

    [0063] The invention will also be explained in the following with respect to further advantages and features with reference to the enclosed drawing and to embodiments. The Figures of the drawing show in:

    [0064] FIG. 1 a system for planning a use with at least one marking;

    [0065] FIG. 2 a system for planning a use with at least one marking;

    [0066] FIG. 3 a further system for planning a use with at least one marking;

    [0067] FIG. 4 an exemplary marking;

    [0068] FIG. 5 a further system for planning a use with at least one marking.

    [0069] In the following Figures, identical parts are provided with identical reference numerals.

    [0070] FIG. 1 shows a system 1 for planning a use having at least one marking 2, having a control and evaluation unit 3, having a database 4, having a display unit 5, and having at least one spatial model of a real environment, and at least one camera 7 for imaging the real environment 8, wherein the real environment 8 in the spatial model can be displayed as a virtual environment 9 on the display unit 5, wherein the marking 2 in the real environment 8 is arranged at a position and having an orientation, wherein the position and the orientation of the marking 2 can be detected at least by the camera 7, wherein the position and orientation of the marking 2 are linked to a virtual sensor model 10, wherein the virtual sensor model 10 in the spatial model of the virtual environment 9 can be displayed on the display unit 5 at the position and having the orientation of the marking 2, wherein the virtual sensor model 10 has at least one virtual detection zone 11, and wherein the virtual detection zone 11 in the spatial model of the virtual environment 9 can be displayed on the display unit 5.

    [0071] FIG. 2 shows a system 1 for planning a use having at least one marking 2, having a control and evaluation unit 3, having a database 4, having a display unit 5, and having at least one time of flight sensor 6 for a spatial scanning of a real environment 8, and at least one camera 7 for imaging the real environment 8, wherein the real environment 8 in the spatial model can be displayed as a virtual environment 9 on the display unit 5, wherein the marking 2 in the real environment 8 is arranged at a position and having an orientation, wherein the position and the orientation of the marking 2 can be detected by the time of flight sensor 6, wherein the position and orientation of the marking 2 are linked to a virtual sensor model 10, wherein the virtual sensor model 10 in the spatial model of the virtual environment 9 can be displayed on the display unit 5 at the position and having the orientation of the marking 2, wherein the virtual sensor model 10 has a virtual protected field 11, and wherein the virtual protected field 11 in the spatial model of the virtual environment 9 can be displayed on the display unit 5.

    [0072] FIG. 3 shows a system in accordance with FIG. 2. A set of N markings 2 or markers or so-called visual tags is provided, for example. The marking 2 or the markings 2 are positioned or arranged in an environment of a planned use or application, for example fixedly at an infrastructure, e.g. at a wall or at a work place. The position is, for example, a spatially fixed position and the orientation is a spatially fixed orientation.

    [0073] The markings 2 are, for example, arranged at manipulators, e.g. at a robot 15 or at mobile platforms 16. The position is, for example, then a movable variable position and the orientation is a movable variable orientation.

    [0074] A virtual sensor model 10 that is presented in augmented reality faithful to the orientation with the aid of a display unit 5 or of a mobile end device can now be associated by means of the control and evaluation unit 3 with every marking 2 by software or an app. The effective range of the virtual sensor and the sensor function of the virtual sensor in the augmented reality are in particular also presented here. Provision can also be made to be able to set the relative translation and orientation between the marker and the augmented object on the digital end device.

    [0075] A digital end device 17 with the time of flight sensor 6, for example, for this purpose optically localizes the position and orientation of the markings 2 in the environment of the use using the time of flight sensor 6 and the camera 7. In this respect, for example, three different spatial directions and three different orientations, that is six degrees of freedom, of the marking 2 are determined. The time of flight sensor 6, for example a LIDAR scanner here, generates the associated virtual environmental model or a situative environmental model.
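The localization described in this paragraph can be sketched in code. The following is a minimal illustration only, assuming a pinhole camera model and a single depth value from the time of flight sensor at the detected marking centre; the intrinsics, pixel coordinates, and the `Pose` container are invented for the example, and a real system would additionally recover the three orientation angles from the marking's corner points.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Six degrees of freedom of a marking: three translations, three rotations."""
    x: float; y: float; z: float            # position in metres
    roll: float; pitch: float; yaw: float   # orientation in radians

def back_project(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a measured depth into a 3D point
    using a pinhole camera model with intrinsics fx, fy, cx, cy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Example: marking centre detected at pixel (320, 240), time-of-flight
# depth 2.0 m, with assumed camera intrinsics.
p = back_project(320, 240, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

The orientation part of the six degrees of freedom would, in practice, come from fitting the known marking geometry to its detected corner pixels.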

    [0076] The environment or an environmental model here also comprises persons 18, for example, that dwell in the real environment, for example an industrial scene. The markings 2 are now used to associate the corresponding sensor function in situ with a located marking 2 or to associate and visualize the position in the correct orientation.

    [0077] In this process the effective ranges of the virtual sensors 10 do not penetrate any virtual infrastructure; walls, floors, or persons 18, for example, are not irradiated in the augmented reality visualization, but are rather photorealistically recorded. An application can thus be planned interactively and/or immersively in a very efficient and transparent manner in a similar manner to “Post-It” notes.
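The cut-off of a virtual effective range at infrastructure can be sketched as a ray-clipping step. A minimal 2D illustration, assuming walls are available as line segments in the scan plane; the helper functions are assumptions for the example, not part of the described system.

```python
def ray_segment_hit(origin, direction, a, b):
    """Distance t along the ray origin + t*direction at which it crosses
    the wall segment a-b, or None if there is no crossing."""
    (ox, oy), (dx, dy) = origin, direction
    (ax, ay), (bx, by) = a, b
    ex, ey = bx - ax, by - ay
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:
        return None  # ray runs parallel to the wall segment
    t = ((ax - ox) * ey - (ay - oy) * ex) / denom
    s = ((ax - ox) * dy - (ay - oy) * dx) / denom
    return t if t >= 0.0 and 0.0 <= s <= 1.0 else None

def clip_range(origin, direction, nominal_range, walls):
    """Truncate the nominal effective range at the nearest wall, so the
    displayed detection zone does not penetrate virtual infrastructure."""
    hits = [ray_segment_hit(origin, direction, a, b) for a, b in walls]
    hits = [t for t in hits if t is not None]
    return min([nominal_range] + hits)

# A sensor at the origin looking along +x, with a wall at x = 2 m:
# its 5 m nominal range is cut off at the wall.
walls = [((2.0, -1.0), (2.0, 1.0))]
clipped = clip_range((0.0, 0.0), (1.0, 0.0), 5.0, walls)
```

Repeating this per ray of the scan fan yields the truncated zone outline that the visualization would draw.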

    [0078] Sensors are only represented by the marking 2 to which specific sensor properties can be assigned. The sensor properties assigned to a marking 2 are visualized in an augmented manner.

    [0079] In this respect, knowledge of the relative location of the marking 2 can also be updated on a location change or an orientation change of the marking 2.

    [0080] The already positioned markings 2 can thus be recognized in the application and can also be associated with a location or coordinates in the environment from the combination of image data of the camera 7 and from generated depth maps of the time of flight sensor 6 on the recording of the environment and the generation of the environmental model.

    [0081] On a change of a marking position or on the addition of a further marking 2, only the zone around the new position has to be recorded until the algorithm has enough references to correctly reposition the marking 2 in the virtual environment. This could also take place in real time if the end device 17 used for the visualization has the corresponding sensor system.

    [0082] If the environmental detection takes place in real time, e.g. via the end device used for the visualization, moving objects can also be taken into account.

    [0083] After a successful planning, the results can be documented by means of screenshots or videos and can be transferred into the application documentation, e.g. as part of a risk assessment.

    [0084] The virtual sensors 10 used in the simulation can, for example, be positioned in a shopping basket in a tool of the control and evaluation unit 3 and can be obtained by the clients as real sensors.

    [0085] Sensor configurations formed from the simulation can, for example, likewise be delivered to clients as a parameter file. The simple and fast setup of the real application is thereby made possible and promoted in addition to the simple and fast planning of applications.

    [0086] The markings 2 attached in the real environment 8 can also serve as installation instructions for an integrator team. For example, assembly instructions, in particular text messages, can be displayed in the virtual environment 9, for example having the wording: “Please attach a sensor X in the orientation shown Y with a parameter set Z”.

    [0087] Subsequent to the virtual planning, one or more virtual solutions can be transferred to a simulation. Both process-relevant parameters, for example speeds, components per hour, etc. and safety-directed parameters, for example, protected fields, speeds, routes, and/or interventions by workers can now be varied and simulated in the simulation. The safety concept can thereby be further validated, on the one hand, and a productivity and an added value of the safety solution can be quantified and compared, on the other hand.
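The variation of process-relevant and safety-directed parameters described above can be illustrated with a small parameter sweep. This is a toy sketch under invented assumptions: the throughput model, the stop-probability formula, and all numbers are illustrative and not from the source.

```python
def throughput(parts_per_hour, stop_probability):
    """Toy productivity model: nominal throughput reduced by the fraction
    of time the process is stopped by protected-field interventions."""
    return parts_per_hour * (1.0 - stop_probability)

# Sweep the protected-field range and compare productivity per candidate.
candidates = []
for field_range in (1.0, 1.5, 2.0, 2.5):
    stop_p = min(0.9, 0.1 * field_range)  # assumed: larger field, more stops
    candidates.append((field_range, throughput(120.0, stop_p)))

best = max(candidates, key=lambda c: c[1])
```

In a real simulation the stop probability would come from the simulated worker routes and interventions rather than from a closed-form guess.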

    [0088] An application planning by the user is possible in the future production environment, for example a human/robot collaboration (HRC). A user prepares a 3D environmental model in the virtual environment 9. A simulation and visualization of the safeguarding of the hazard sites takes place by positioning markings 2 in the environment that can be flexibly linked via a SICK library to different virtual sensors and their properties. The user can thus virtually plan and simulate his application, store sensor configurations, and place an individualized order via the app. The client receives preconfigured sensors, including assembly instructions, based on his environmental model. After the installation of the real application, a validation can be made, with the simulation in the virtual environment 9 being compared with the real application. Differences are recognized and can be readjusted.

    [0089] As in the described human/robot collaboration (HRC) application, the application planning can likewise be carried out for workstations of an autonomous mobile robot application (AMR), for example. A user prepares a 3D model in the virtual environment 9 for this purpose. The safeguarding of the hazard site can be simulated and visualized by positioning markings 2 or markers that are linked to virtual sensors 10 and their properties. The user can thus likewise plan and simulate his application in augmented reality, store sensor configurations, and place an individualized order via the app. Furthermore, not only workstations can be simulated and visualized, but also path sections with an autonomous mobile robot (AMR). For this purpose, a live 3D environmental model can be prepared and potential hazard sites simulated in the route planning.

    [0090] In accordance with FIG. 3, the time of flight sensor 6 is, for example, a laser scanner or a 3D time of flight camera.

    [0091] Sensor configurations of the virtual sensor 10, e.g. a protected field size of a virtual laser scanner, can be stored using the environmental model and the desired effective range of virtual sensors 10 spanned over the markings 2 and, for example, provided to a client as part of the ordering process for real sensors. In addition to the simple and fast planning of applications, the sensor configuration likewise supports the setting up of the real application.
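A sensor configuration delivered as a parameter file, as described above, could be as simple as a serialized dictionary. A minimal sketch: the field names and values below are illustrative assumptions, not a real device schema.

```python
import json

# Hypothetical configuration produced by the planning step for one marking.
config = {
    "sensor_type": "laser scanner",
    "marking_id": 2,
    "protected_field": {"shape": "sector", "range_m": 2.5, "angle_deg": 190},
}

# Serialize to a parameter file payload that could accompany the order.
parameter_file = json.dumps(config, indent=2)
restored = json.loads(parameter_file)
```

Loading the same file during commissioning would transfer the planned protected-field geometry to the real sensor's configuration tool.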

    [0092] In accordance with FIG. 3, two markings 2 are, for example, spatially fixedly arranged at a robot 15, and the virtual sensors linked to them should detect and safeguard a specific zone around the robot 15. Provision is, for example, made that this zone is likewise input or drawn in the application by a mobile end device 17 and zones are thus identified that are not detected by the safety sensors in their current attachment and configuration. Such zones are e.g. produced behind static objects at which the protected fields are cut off due to the taught environmental data. As shown in FIG. 4, warning notices, for example symbolically with an exclamation mark, can be displayed, for example, in the virtual environment.

    [0093] The protected fields can also be interrupted or cut off at moving obstacles such as the person 18 or the mobile platform 16. In principle, markings 2 are also provided at moving machine parts or vehicles.

    [0094] A correspondence between the markings 2 and the virtual sensors 10 can look as follows, for example:

    [0095] The data models of virtual sensors 10 are stored in a library of the database 4, for example. This library, for example, comprises the 3D CAD model, an effective range, and a set of functions for every sensor. In addition, for example, meta data are stored for every sensor; they can, for example, comprise a sensor type (e.g. laser scanner, 3D time of flight camera, etc.), physical connectors (e.g. Ethernet, IO link, etc.), or a performance level. The library or sensor library can be equipped with search filters so that the user can decide which virtual sensors 10 he would like to use for the association (e.g. a 3D time of flight camera with 940 nm wavelength, 3 m range, 70° field of vision, and Ethernet connector). Similar to a morphological box, different sensors can be associated with the markings and are then used for planning the application.
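A sensor library with search filters of the kind described above could be sketched as follows. The sensor entries, attribute names, and the filter helper are illustrative assumptions, not a real product catalogue.

```python
from dataclasses import dataclass

@dataclass
class VirtualSensor:
    name: str
    sensor_type: str        # e.g. "laser scanner", "3D time of flight camera"
    wavelength_nm: int
    range_m: float
    field_of_view_deg: float
    connector: str          # e.g. "Ethernet", "IO-Link"

# Hypothetical library entries stored in the database.
LIBRARY = [
    VirtualSensor("cam-A", "3D time of flight camera", 940, 3.0, 70.0, "Ethernet"),
    VirtualSensor("cam-B", "3D time of flight camera", 850, 6.0, 60.0, "IO-Link"),
    VirtualSensor("scan-C", "laser scanner", 905, 5.5, 270.0, "Ethernet"),
]

def search(library, **criteria):
    """Return all sensors whose attributes match every given criterion."""
    return [s for s in library
            if all(getattr(s, k) == v for k, v in criteria.items())]

# The filter example from the text: a 940 nm ToF camera with Ethernet.
matches = search(LIBRARY, sensor_type="3D time of flight camera",
                 wavelength_nm=940, connector="Ethernet")
```

The matching entries would then be offered for association with a marking, similar to picking a column from a morphological box.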

    [0096] In accordance with FIG. 3 and FIG. 4, the markings 2 are two-dimensional matrix encodings. In this respect, the unique direction and the unique orientation of the marking 2 can be recognized and determined from the two-dimensional matrix encoding. The matrix encoding can optionally comprise the kind of sensor as well as further properties of the sensor such as a protected field size, a protected field direction, and a protected field orientation.
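If the matrix encoding carries sensor properties as described, decoding could look like the following sketch. The `key=value` payload layout is an invented assumption for illustration; the source does not specify an encoding format.

```python
def decode_payload(payload: str) -> dict:
    """Decode a hypothetical 'key=value;...' payload carried by the
    two-dimensional matrix encoding into sensor properties."""
    fields = dict(item.split("=", 1) for item in payload.split(";") if item)
    return {
        "sensor_kind": fields["kind"],
        "field_size_m": float(fields["size"]),
        "field_direction_deg": float(fields["dir"]),
        "field_orientation_deg": float(fields["orient"]),
    }

# Example payload read from a marking.
info = decode_payload("kind=laser scanner;size=2.5;dir=0;orient=90")
```

After decoding, the control and evaluation unit could place the matching virtual sensor model with the stated protected-field geometry at the marking's pose.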

    [0097] In accordance with FIG. 3, the marking 2 is arranged in the real environment by means of a holding device.

    [0098] Universally suitable suspensions are, for example, provided to be able to position the markings 2 as freely as possible in the application for planning. A modular attachment concept comprising a base plate having the marking is, for example, provided in combination with selectively a magnetic device, an adhesive device, a clamping device, and/or a screw device.

    [0099] For example, the position and orientation of a second marking 2 is linked to a virtual object, with the virtual object in the spatial model of the environment being displayed on the display unit at the position and having the orientation of the second marking 2.

    [0100] The virtual object can, for example, be a virtual machine, a virtual barrier, a virtual store, virtual material, virtual workpieces, or similar.

    [0101] The movements, routes, interventions, etc. of a person 18 can be transferred into a simulation or the virtual environment by means of a similar technique. For this purpose, markings 2 or markers can be attached to the joints of the person 18 and the movements can be recorded and displayed.

    [0102] In accordance with FIG. 3 the display unit is, for example, a smartphone, a tablet computer, augmented reality glasses/headset, or virtual reality glasses/headset.

    [0103] Devices or systems can thus be considered as mobile end devices 17 that are equipped with at least one camera and a possibility for visualization.

    [0104] In accordance with FIG. 3, the markings 2 have, for example, a transponder.

    [0105] Provision is, for example, made that the visual markings are additionally provided with radio location, for example using a UWB technique.

    REFERENCE NUMERALS

    [0106] 1 system

    [0107] 2 marking

    [0108] 3 control and evaluation unit

    [0109] 4 database

    [0110] 5 display unit

    [0111] 6 3D sensor

    [0112] 7 camera

    [0113] 8 real environment

    [0114] 9 virtual environment

    [0115] 10 virtual sensor model/virtual sensor

    [0116] 11 virtual protected field

    [0117] 15 robot

    [0118] 16 mobile platforms

    [0119] 17 digital end device

    [0120] 18 person