AUGMENTED REALITY SYSTEM WITH INTERACTIVE OVERLAY DRAWING
20230237643 · 2023-07-27
CPC classification
G06F16/5866 (PHYSICS)
Abstract
A method allows an estimator or other party to “Walk the Drawings.” An estimator can open one or more sets of drawings on a mobile device and walk those same electronic drawings as they actually walk or traverse the physical site itself. In other words, as the estimator physically moves across the construction site in the real world, their electronic icon (avatar) moves across the corresponding electronic drawings on their mobile device. Now the estimator can identify and locate features that may need to be accessible while being buried under asphalt. While walking the drawings, the estimator can label any challenges or features by simply clicking their avatar, and it will post that geo-stamped location, complete with corresponding notes and photos, straight to the drawings for later review and analysis.
Claims
1. A method for navigation comprising: receiving, on a mobile device, a terrain image and a map overlay from a server, the terrain image and the map overlay being aligned; tracking movement of the mobile device as it moves within the area of the terrain image; and annotating a position of the mobile device superimposed on the terrain image and the map overlay.
2. A method for aligning layers on an augmented reality display, the method comprising: receiving an image file; rotating the image file to a predetermined heading; selecting a plurality of reference points on a terrain image; selecting a first alignment point in the image file corresponding to one of the plurality of reference points; and selecting a second alignment point in the image file, the second alignment point located on a line connecting two of the reference points.
3. A method for displaying annotated image data, the method comprising: receiving image data; processing the image data into a byteslist; injecting the byteslist into a native map image layer; combining the native map image layer with a visual object; and rendering a map image.
4. The method of claim 1 further comprising: receiving a camera heading associated with a camera; rotating the terrain image until a heading of the terrain image matches the camera heading; capturing a photo with the camera; tagging the photo with metadata to produce a tagged photo; and sending the tagged photo to a second server.
5. The method of claim 4 wherein the camera is a component of the mobile device.
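The two-point alignment recited in claim 2 can be illustrated with a short, non-limiting sketch. This is a hypothetical illustration, not the claimed implementation, and all names are invented: given two reference points on the terrain image and their two corresponding alignment points in the image file, a similarity transform (scale and rotation) that registers the two layers can be derived.

```python
import math

def two_point_alignment(ref_a, ref_b, img_a, img_b):
    """Derive the scale and rotation that map image-file coordinates
    onto terrain-image coordinates from two corresponding point pairs."""
    # Vector between the two reference points on the terrain image.
    rdx, rdy = ref_b[0] - ref_a[0], ref_b[1] - ref_a[1]
    # Vector between the two alignment points in the image file.
    idx, idy = img_b[0] - img_a[0], img_b[1] - img_a[1]
    scale = math.hypot(rdx, rdy) / math.hypot(idx, idy)
    rotation = math.atan2(rdy, rdx) - math.atan2(idy, idx)  # radians
    return scale, rotation

# Example: the image file is drawn at half scale and rotated 90 degrees
# relative to the terrain image.
scale, rotation = two_point_alignment((0, 0), (0, 100), (0, 0), (50, 0))
```

Once scale and rotation are known, every point of the image file can be transformed onto the terrain image so the two layers stay registered.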
Description
BRIEF DESCRIPTION OF THE FIGURES
[0016] Further features and advantages of the present invention will become apparent from the following detailed description, taken in combination with the appended drawings, in which:
[0024] It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
DETAILED DESCRIPTION
[0025] Embodiments will now be described with reference to the figures. For simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.
[0026] Various terms used throughout the present description may be read and understood as follows, unless the context indicates otherwise: “or” as used throughout is inclusive, as though written “and/or”; singular articles and pronouns as used throughout include their plural forms, and vice versa; similarly, gendered pronouns include their counterpart pronouns so that pronouns should not be understood as limiting anything described herein to use, implementation, performance, etc. by a single gender; “exemplary” should be understood as “illustrative” or “exemplifying” and not necessarily as “preferred” over other embodiments. Further definitions for terms may be set out herein; these may apply to prior and subsequent instances of those terms, as will be understood from a reading of the present description.
[0027] Embodiments of the present invention provide an augmented reality computer system, together with methods, that allows an estimator or other party to “Walk the Drawings.” An estimator can open one or more sets of drawings on a mobile device and walk those same electronic drawings as they actually walk or traverse the physical site itself. In other words, as the estimator physically moves across the construction site in the real world, their electronic icon (avatar) moves across the corresponding electronic drawings on their mobile device. Now the estimator can identify and locate features that may need to be accessible while being buried under asphalt. While walking the drawings, the estimator can label any challenges or features by simply clicking their avatar, and it will post that geo-stamped location, complete with corresponding notes and photos, straight to the drawings for later review and analysis.
[0028] Embodiments may provide extended mapping features to enable site leaders to design their site layouts for optimal construction efficiency. A user can label the best locations for haul roads, access points, equipment laydown areas, material storage spots, and any other job details that might improve site efficiency. Placing these details on the drawings translates into actual placement in the field at those exactly identified locations. These mapping features can also be used to document extras billings, quality control, or environmental challenges.
[0029] Embodiments may include project schedules that allow for unlimited sub-schedules, such as embedded equipment plans, embedded crew plans, or any other embedded schedule a user can think of. Schedule data, dates, and resources may be filtered and viewed separately. As an overall schedule changes at the highest level, all the embedded schedules change with it. These embedded schedules provide a holistic view of an entire site, or of an entire company's resource utilization.
[0030] Augmented reality systems as described in embodiments herein may be used in a number of industries and applications. Land developers and home builders may use embodiments to show how to drive a site by creating an annotated Google Maps equivalent before Google Maps is actually supported in that area. A person can use the augmented reality system to drive to their lot without it being staked and understand the orientation, size, and view of key features. The shipping industry may utilize a map overlay over waterways to facilitate the travel of vessels in predefined shipping lanes or parking spots/berths, while avoiding hazards. Similarly, the airline industry may overlay runways for pilots. Embodiments may also be used by the mining industry to accurately overlay features, obstacles, hazards, etc. on the mine area.
[0032] In embodiments, a user 108 may open a set of drawings or maps on their mobile device 106 and walk those same electronic drawings as they actually walk the physical site itself. In other words, as an estimator, foreman, or laborer (user 108) physically moves across a construction site in the real world, their electronic icon (avatar) moves across the corresponding electronic drawings or maps that they have chosen to view on their mobile device 106. These maps can include drawings, safety maps, underground locate maps, job construction details, and material details.
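The avatar tracking described above reduces to mapping a GPS fix onto drawing coordinates. The sketch below is a minimal, hypothetical illustration (all names invented) that assumes the drawing is georeferenced by its geographic bounding box:

```python
def gps_to_pixel(lat, lon, bounds, width_px, height_px):
    """Map a GPS fix to a pixel position on a georeferenced drawing.

    bounds is (south, west, north, east): the geographic extent the
    drawing covers. Image y grows downward, so north maps to y = 0.
    """
    south, west, north, east = bounds
    x = (lon - west) / (east - west) * width_px
    y = (north - lat) / (north - south) * height_px
    return x, y

# A user standing at the centre of the site appears at the centre of a
# 1000 x 800 pixel drawing.
x, y = gps_to_pixel(45.05, -75.05, (45.0, -75.1, 45.1, -75.0), 1000, 800)
```

As the device reports new fixes, re-running this mapping moves the avatar across every layer that shares the same bounding box.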
[0033] With reference to the appended figures:
[0034] While moving about the construction site, a user 108, such as a construction estimator, can simultaneously see themselves as an avatar on the drawings 900 on site and see information on the satellite image 800, map 900, and other layers that have been configured for the build site. Also, while walking the drawings, the user 108 can label any challenges by simply clicking their avatar, and the mobile device 106 software will post that geo-stamped location, complete with any corresponding notes and photos (showing the direction of the camera), straight to the shared drawings for later review and analysis. This feature can also be used to document extras billings, quality control, or environmental challenges, to name just a few use cases. Information, such as notes, may be entered by user 108 by typing at a keyboard or keypad, using voice-to-text software, adding voice recordings, etc. Other information, such as absolute or relative location, bearing, altitude, azimuth, etc., may be obtained from sensors included in the mobile device 106.
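The geo-stamped labelling step can be sketched with a hypothetical annotation record. The schema below is an illustrative assumption (field and function names invented), capturing the location, camera heading, notes, and photos that a tap on the avatar would post to the shared drawing layer:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class GeoAnnotation:
    """One geo-stamped label posted by tapping the avatar."""
    lat: float
    lon: float
    heading_deg: float          # direction the camera was pointing
    note: str
    photos: list = field(default_factory=list)
    timestamp: str = ""

def post_annotation(layer, lat, lon, heading_deg, note, photos=()):
    """Append a geo-stamped annotation to a shared drawing layer."""
    ann = GeoAnnotation(lat, lon, heading_deg, note, list(photos),
                        datetime.now(timezone.utc).isoformat())
    layer.append(ann)  # the shared layer is reviewed later in the office
    return ann

site_layer = []
post_annotation(site_layer, 45.05, -75.05, 270.0,
                "Valve buried under asphalt")
```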
[0035] Embodiments extend the “Walk the Drawings” mapping feature to enable other users, such as site leaders, to design their site layouts for optimal construction efficiency. For example, a user 108 can label the best locations for haul roads, access points, equipment laydown areas, material storage spots, and any other job details that might improve site efficiency. Placing these details on the drawings 900 translates into actual placement in the field 800 at those exactly identified locations; a person delivering items can follow their avatar to the drop-off spots using their own mobile device connected to the augmented reality system. In embodiments, the augmented reality system may be used to issue work orders, and if another user accepts a work order, they may be automatically added to the system for that build site. Users who are not able to access the system may be provided with static or dynamic drawings, images, maps, etc.
[0036] In embodiments, any user that has access and permission can label points from the field to the office or from the office to the field. Photos or video added from a phone will include metadata, including GPS location data showing where the picture was taken, will be geo-stamped, and may indicate the direction that the camera was pointing. These points and maps can be stored, catalogued, and filtered by category for quick filtering, depending on what the user requires, what they are doing, and whether they are on their phone or the computer. Categories may be customizable and may include additional information such as site logistics, safety/hazard points, indicators for extra billings, quality control points, environmental, tendering, etc. System administrators may configure the system to accept, store, filter, and display any number of metadata fields that may be used for that particular application.
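The category and metadata filtering described above can be sketched as follows. This is a minimal illustration; the point schema and field names are assumptions, not the system's actual data model:

```python
def filter_points(points, category=None, **metadata):
    """Filter catalogued map points by category and by arbitrary
    metadata fields (e.g. severity for safety/hazard points)."""
    result = []
    for p in points:
        if category is not None and p.get("category") != category:
            continue
        if all(p.get(k) == v for k, v in metadata.items()):
            result.append(p)
    return result

points = [
    {"id": 1, "category": "safety", "severity": "high"},
    {"id": 2, "category": "extras"},
    {"id": 3, "category": "safety", "severity": "low"},
]
hazards = filter_points(points, category="safety", severity="high")
```

Because the metadata fields are open-ended, an administrator can add new categories without changing the filtering logic.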
[0037] In embodiments, when looking at the drawings on the computer, the transparency of layers can be adjusted to simultaneously view an informational layer as well as the underlying terrain. Tools may be added to allow a user to perform absolute or relative measurements of distance, height, angle, etc. between points or other references.
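Adjustable layer transparency reduces to standard per-pixel alpha compositing; the sketch below is illustrative only and assumes 8-bit RGB pixels:

```python
def blend_pixel(terrain_rgb, overlay_rgb, alpha):
    """Alpha-composite one informational-layer pixel over the terrain.
    alpha = 0 hides the layer entirely; alpha = 1 hides the terrain."""
    return tuple(round(alpha * o + (1 - alpha) * t)
                 for t, o in zip(terrain_rgb, overlay_rgb))

# A half-transparent overlay pixel over grey terrain.
px = blend_pixel((100, 100, 100), (200, 0, 50), 0.5)
```

Sliding the transparency control simply changes `alpha` and re-blends the layer.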
[0038] In embodiments, locations may be indicated, such as spill piles, areas to place or not to place materials, recommended or prohibited routes, etc. When a user enters a site, their avatar will appear on the map of the area, and locations may be viewed with the user's avatar indicated. For example, if a driver arrives to deliver gravel through a dispatch received through the augmented reality system, they may access the system to see their location and where they should deliver the gravel. The driver may access the system through a user interface, such as by clicking an icon to access the build site, or through automatic detection of the driver's location. The system may indicate directions to the destination or launch a GPS program to direct the driver. Maps may be rotated so that the driver's direction of travel is in front of them for easier navigation, and the system may alert the driver when they reach their destination and provide additional information on how and where the gravel should be placed.
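The destination alert can be sketched with a great-circle distance check. The function names and the 20 m arrival radius below are illustrative assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def at_destination(pos, dest, radius_m=20.0):
    """Alert condition: the driver is within radius_m of the drop-off spot."""
    return haversine_m(*pos, *dest) <= radius_m
```

Polling this check against the driver's live position triggers the arrival alert and the follow-up placement instructions.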
[0042] Embodiments may provide enhancements to traditional project management software, including a macro scheduler with sub-schedules that move as the higher-level schedule changes. In the case of a construction site, this could be a job schedule at its highest level, similar to schedules supported by software such as Microsoft Project. Embodiments may improve on this by allowing for unlimited sub-schedules, such as embedded equipment plans, embedded crew plans, or any other required embedded information. Sub-schedules may be filtered using different criteria and viewed separately. Examples of filtering criteria are duration, dates, milestones, equipment, location, crew, other resources, etc. As the overall schedule changes at the highest level, all the embedded schedules change with it. These embedded schedules provide a holistic view of the entire company's resource utilization.
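The propagation of a top-level schedule change into its embedded sub-schedules can be sketched recursively. This is a minimal illustration with an invented schedule structure and task names:

```python
from datetime import date, timedelta

def shift_schedule(schedule, days):
    """Shift a task and all of its embedded sub-schedules by the same
    offset, so sub-schedules track changes made at the top level."""
    schedule["start"] += timedelta(days=days)
    schedule["end"] += timedelta(days=days)
    for sub in schedule.get("subs", []):
        shift_schedule(sub, days)

job = {
    "name": "Earthworks",
    "start": date(2023, 5, 1), "end": date(2023, 6, 1),
    "subs": [
        {"name": "Crew A", "start": date(2023, 5, 2),
         "end": date(2023, 5, 20), "subs": []},
        {"name": "Excavator 1", "start": date(2023, 5, 3),
         "end": date(2023, 5, 15), "subs": []},
    ],
}
shift_schedule(job, 7)  # the whole job slips one week; subs slip with it
```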
[0043] As an example, using embodiments of the augmented reality system described herein, an equipment sub-schedule may show the actual fleet available on any date and compare that to the aggregated projected equipment demands. This scheduler can include a specific piece of equipment or a type of equipment (for example, unit #1 or “large backhoe”). Embodiments may take into consideration factors such as equipment out of service, under repair, or to be rented out at a future date. As the job schedules change or the equipment fleet changes, the equipment schedule may change with it, keeping the information related to equipment demands and availability up to date and relevant. Being able to see equipment demands for each type of equipment well in advance helps companies make better decisions around renting vs. buying equipment, selling vs. repairing equipment, choosing hourly vs. monthly rental terms, or even moving projects around.
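Comparing the available fleet to aggregated demand on a given date can be sketched as follows; the schema and names are assumptions for illustration, not the system's actual data model:

```python
from collections import Counter

def equipment_gap(fleet, demands, day):
    """Compare the available fleet to aggregated demand for one date.

    fleet:   {equipment_type: units in service on `day`}
    demands: list of (equipment_type, start_day, end_day) job needs
    Returns {type: shortfall} for types where demand exceeds supply.
    """
    needed = Counter()
    for etype, start, end in demands:
        if start <= day <= end:      # job needs this equipment on `day`
            needed[etype] += 1
    return {t: n - fleet.get(t, 0)
            for t, n in needed.items() if n > fleet.get(t, 0)}

fleet = {"large backhoe": 2}
demands = [("large backhoe", 1, 10),
           ("large backhoe", 5, 12),
           ("large backhoe", 8, 9)]
gap = equipment_gap(fleet, demands, day=8)  # three jobs, two machines
```

Running this comparison across future dates surfaces shortfalls early enough to inform rent-vs-buy and scheduling decisions.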
[0044] Embodiments may include a user interface that may be used by methods of adding equipment, or other resources, to a project. Embodiments may also include a user interface that shows a holistic view of equipment, or other resources, and how they may move to illustrate schedule changes.
[0046] As shown, the device includes a processor 710, such as a central processing unit (CPU) or a specialized processor such as a graphics processing unit (GPU) or other such processor unit, memory 720, non-transitory mass storage 730, I/O interface 740, network interface 750, video adapter 770, and any required transceivers 760, all of which are communicatively coupled via a bidirectional bus 725. Video adapter 770 may be connected to one or more displays 775, and I/O interface 740 may be connected to one or more I/O devices 745, which may be used to implement a user interface. According to certain embodiments, any or all of the depicted elements may be utilized, or only a subset of the elements. Further, computing device 700 may contain multiple instances of certain elements, such as multiple processors, memories, or transceivers. Also, elements of the hardware device may be directly coupled to other elements without the bus 725. Additionally, or alternatively to a processor and memory, other electronics, such as integrated circuits, may be employed for performing the required logical operations.
[0047] The memory 720 may include any type of non-transitory memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), any combination of such, or the like. The mass storage element 730 may include any type of non-transitory storage device, such as a solid state drive, hard disk drive, magnetic disk drive, optical disk drive, USB drive, or any computer program product configured to store data and machine executable program code. According to certain embodiments, the memory 720 or mass storage 730 may have recorded thereon statements and instructions executable by the processor 710 for performing any of the method operations described above.
[0048] It will be appreciated that it is within the scope of the technology to provide a computer program product or program element, or a program storage or memory device such as a magnetic or optical wire, tape or disc, USB stick, file, or the like, for storing signals readable by a machine, for controlling the operation of a computer according to the method of the technology and/or to structure some or all of its components in accordance with the system of the technology. Acts associated with the method described herein can be implemented as coded instructions in a computer program product. In other words, the computer program product is a computer-readable medium upon which software code is recorded to execute the method when the computer program product is loaded into memory and executed on the microprocessor of computing devices.
[0049] Although the present invention has been described with reference to specific features and embodiments thereof, it is evident that various modifications and combinations can be made thereto without departing from the invention. The specification and drawings are, accordingly, to be regarded simply as an illustration of the invention as defined by the appended claims, and are contemplated to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the present invention.