
VISUAL POSITIONING DEVICE AND THREE-DIMENSIONAL SURVEYING AND MAPPING SYSTEM AND METHOD BASED ON SAME
20180005457 · 2018-01-04

Disclosed are a visual positioning device (101) and a three-dimensional surveying and mapping system (100) including at least one visual positioning device (101). The visual positioning device (101) includes an infrared light source (101b), an infrared camera (101a), a signal transceiver module (101d) and a visible light camera (101c). The three-dimensional surveying and mapping system (100) further includes a plurality of position identification points (102), a plurality of active signal points (103) and an image processing server (104). The image processing server (104) is configured to cache the infrared images and real-scene images shot by the infrared camera (101a) and the visible light camera (101c), together with their associated positioning information, and to store a three-dimensional model obtained through reconstruction. The present invention has the advantages of a simple structure, no need for a power supply, convenience of use, and high precision.

Systems and methods for collaborative location tracking and sharing using augmented reality
11710285 · 2023-07-25

Disclosed is a location tracking system and associated methods for precisely locating a target device with a recipient device via different forms of location tracking and augmented reality. The recipient device receives a first position of the target device over a data network. The recipient device is moved according to the first position until the target device is in Ultra-WideBand (“UWB”) signaling range of the recipient device. The recipient device then measures a distance and direction of the target device relative to the recipient device based on Time-of-Flight (“ToF”) measurements generated from the UWB signaling. The recipient device determines a second position of the target device based on the distance and direction of the target device, and generates an augmented reality view with a visual reference at a particular position in images of a captured scene that corresponds to the second position of the target device.
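The ranging step in this abstract — converting UWB Time-of-Flight measurements into a distance, then combining distance and direction into a second position — can be sketched as follows. The two-way-ranging formula and all function names are illustrative assumptions, not the patent's actual implementation.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_time_s: float, reply_delay_s: float) -> float:
    """Estimate range from a two-way UWB exchange: half of the
    signal's flight time (round trip minus the responder's known
    reply delay), multiplied by the speed of light."""
    one_way_time = (round_trip_time_s - reply_delay_s) / 2.0
    return one_way_time * SPEED_OF_LIGHT

def target_position(recipient_xy, distance_m, bearing_rad):
    """Combine the measured distance and direction into a second
    position for the target, relative to the recipient device."""
    x, y = recipient_xy
    return (x + distance_m * math.cos(bearing_rad),
            y + distance_m * math.sin(bearing_rad))
```

In a real UWB stack the reply delay is reported by the responder and clock drift must be compensated; the sketch omits both.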

Computer-Implemented Method For Positioning Patterns Around An Avatar

A computer-implemented method for designing a virtual garment or upholstery (G) in a three-dimensional scene, comprising the steps of: a) providing a three-dimensional avatar (AV) in the three-dimensional scene; b) providing at least one pattern (P) of said virtual garment or upholstery in the three-dimensional scene; c) determining a distance field from a surface of the avatar; d) positioning the pattern relative to the avatar by keeping a fixed orientation with respect to said distance field; and e) assembling the positioned pattern or patterns around the avatar to form said virtual garment or upholstery, and draping it onto the avatar. A computer program product, a non-volatile computer-readable data-storage medium, and a Computer Aided Design system for carrying out such a method are also disclosed, as is application of the method to the manufacturing of a garment or upholstery.
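Steps c) and d) rely on a distance field whose gradient direction gives the orientation to hold fixed. A minimal sketch, assuming the avatar surface is represented as a point sample (the patent does not specify a representation, and both function names are hypothetical):

```python
import math

def distance_field(point, surface_points):
    """Unsigned distance from `point` to a point-sampled surface:
    the minimum Euclidean distance to any surface sample."""
    return min(math.dist(point, s) for s in surface_points)

def field_gradient(point, surface_points, h=1e-4):
    """Central finite-difference gradient of the distance field.
    Its direction (roughly the outward normal) is what a pattern
    keeps a fixed orientation with respect to."""
    grad = []
    for axis in range(3):
        lo, hi = list(point), list(point)
        lo[axis] -= h
        hi[axis] += h
        grad.append((distance_field(hi, surface_points)
                     - distance_field(lo, surface_points)) / (2 * h))
    norm = math.sqrt(sum(g * g for g in grad)) or 1.0
    return [g / norm for g in grad]
```

A production system would use a signed distance field over a mesh rather than nearest-sample distance, but the orientation constraint works the same way.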

Method for scattering points in a uniform arbitrary distribution across a target mesh for a computer animated creature
11710270 · 2023-07-25

A programmatic arbitrary distribution of items in a modeling system may be provided. To perform the distribution, a surface may be received, and a point count of application points associated with locations on the surface may be determined. A density map may be applied over the surface to assign a density to portions of the surface for the point count. Application points are then assigned to locations on the surface according to the density map and a scattering function of the point count, where the scattering function is based on one or more repulsion forces between neighboring points. The one or more repulsion forces are treated as pushing each of the neighboring points apart. Thereafter, the surface may be provided having the application points scattered across the surface based on the one or more repulsion forces.
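The repulsion-based scattering can be sketched in two dimensions as an iterative relaxation on the unit square; the density map is omitted, the inverse-square force law is an assumption, and all names are illustrative rather than taken from the patent.

```python
import random

def scatter_points(count, iterations=50, step=0.01, max_move=0.05, seed=0):
    """Scatter `count` points in the unit square, then relax them
    with pairwise repulsion forces that push neighboring points
    apart, approximating a uniform arbitrary distribution."""
    rng = random.Random(seed)
    pts = [[rng.random(), rng.random()] for _ in range(count)]
    for _ in range(iterations):
        for i, p in enumerate(pts):
            fx = fy = 0.0
            for j, q in enumerate(pts):
                if i == j:
                    continue
                dx, dy = p[0] - q[0], p[1] - q[1]
                d2 = dx * dx + dy * dy + 1e-9
                # inverse-square repulsion from each neighbor
                fx += dx / d2
                fy += dy / d2
            # clamp the displacement so near-coincident points stay stable
            mx = max(-max_move, min(max_move, step * fx))
            my = max(-max_move, min(max_move, step * fy))
            p[0] = min(1.0, max(0.0, p[0] + mx))
            p[1] = min(1.0, max(0.0, p[1] + my))
    return pts
```

On a target mesh, the same relaxation would run in the surface's parameter space or along the mesh, with the density map scaling the local force.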

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
20230237751 · 2023-07-27

In accordance with an aspect of the present invention, an information processing apparatus includes an interface that transmits or receives data to or from an input apparatus that inputs user position information indicating a position of a user, and a processor that calculates an evaluation value for an evaluation target position based on the user position information, determines a position in a virtual space or a real space where an agent supporting the user is placed based on the evaluation value, and places the agent at the determined position in the virtual space or real space.
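The placement step — score each candidate position and place the agent at the best one — can be sketched as below. The abstract does not define the evaluation function, so the distance-based score here is purely an assumed example, as are the names.

```python
import math

def evaluate(candidate, user_pos, preferred_dist=1.0):
    """Assumed evaluation value: penalize deviation from a
    preferred agent-to-user distance (higher is better)."""
    d = math.dist(candidate, user_pos)
    return -abs(d - preferred_dist)

def place_agent(candidates, user_pos):
    """Place the agent at the candidate position (virtual or
    real) with the highest evaluation value."""
    return max(candidates, key=lambda c: evaluate(c, user_pos))
```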

Positioning Correction Method of Near Seabed Video Data Based on Ultra-short Baseline

Disclosed is a near-seabed video data positioning correction method based on an ultra-short baseline, comprising the following steps: acquiring ultra-short baseline positioning data; eliminating abnormal data in the ultra-short baseline positioning data by establishing a four-dimensional elimination model and removing the abnormal data in the X, Y and Z directions; modeling the correction of the recombined ultra-short baseline positioning data after the abnormal data are removed; and obtaining positioning data of the towed camera with a specified precision through simulation. The method achieves positioning correction of video data under existing conditions and established operation modes: by integrating multiple kinds of survey data, it eliminates, simulates and corrects the error data that time variation of the ultra-short baseline data introduces in the heading and other directions, so as to position the near-bottom video data.
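One axis of the abnormal-data elimination step can be sketched as a robust outlier filter; the abstract does not specify the rejection criterion, so the median/MAD threshold used here is an assumption standing in for the patent's four-dimensional elimination model.

```python
import statistics

def eliminate_abnormal(values, k=3.0):
    """Keep samples whose deviation from the median is within k
    times a robust spread estimate (MAD scaled to sigma). Applied
    independently to the X, Y and Z components of USBL fixes."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1e-12
    return [v for v in values if abs(v - med) <= k * 1.4826 * mad]
```

The time dimension of the four-dimensional model would additionally let the filter compare each fix against its neighbors along the tow track, not just against the batch statistics.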

MOVING CONTENT BETWEEN A VIRTUAL DISPLAY AND AN EXTENDED REALITY ENVIRONMENT

Systems, methods, and non-transitory computer readable media including instructions for extracting content from a virtual display are disclosed. Extracting content from a virtual display includes generating a virtual display via a wearable extended reality appliance, wherein the virtual display presents a group of virtual objects and is located at a first virtual distance from the wearable extended reality appliance; generating an extended reality environment via the wearable extended reality appliance including at least one additional virtual object at a second virtual distance from the wearable extended reality appliance; receiving input for causing a specific virtual object to move from the virtual display to the extended reality environment; and in response, generating a presentation of a version of the specific virtual object in the extended reality environment at a third virtual distance different from the first virtual distance and the second virtual distance.

Registration of a surgical image acquisition device using contour signatures

Registration of a surgical image acquisition device (e.g. an endoscope) using preoperative and live contour signatures of an anatomical object is described. A control unit includes a processor configured to compare the real-time contour signature to the database of preoperative contour signatures of the anatomical object, generating a group of potential contour signature matches from which a final contour signature match is selected. Registration of the image acquisition device to the surgical site is realized based upon an orientation corresponding to the selected final contour signature match.
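The compare step can be sketched as nearest-neighbor search over fixed-length signature vectors, with the "group of potential matches" as the top-k results. The sum-of-squared-differences metric and the function names are assumptions; the patent does not state the metric.

```python
def signature_distance(a, b):
    """Sum of squared differences between two equal-length
    contour signatures (e.g. radius sampled along the contour)."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match_signatures(live, database, k=3):
    """Return the k preoperative signatures closest to the live
    one: the group of potential contour signature matches from
    which a final match is then selected."""
    ranked = sorted(database, key=lambda entry: signature_distance(live, entry))
    return ranked[:k]
```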

Three-dimensional reconstruction method and apparatus for material pile, electronic device, and computer-readable medium

Embodiments of the present disclosure relate to a three-dimensional reconstruction method and apparatus for a material pile, an electronic device, and a computer-readable medium. The method may include: acquiring, in response to detecting an instruction for controlling an excavator body of an excavator to rotate to transport materials, a sequence of depth images of an excavated material pile collected by a binocular camera provided on a side of the excavator; and performing three-dimensional reconstruction based on the sequence of depth images of the material pile, to generate a three-dimensional model of the material pile.
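The core operation behind reconstructing from a depth-image sequence is back-projecting each depth image into a 3D point cloud with a pinhole camera model. A minimal sketch, assuming known intrinsics (focal lengths fx, fy and principal point cx, cy); the function name is illustrative.

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (list of rows, depths in
    meters) into camera-frame 3D points via a pinhole model."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # invalid / missing depth
            points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points
```

A full pipeline would transform each frame's points by the camera pose (from the known rotation of the excavator body) and fuse the clouds into one model of the pile.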

USER CONTROLLED THREE-DIMENSIONAL SCENE
20230236660 · 2023-07-27

The present disclosure relates generally to a system and method for a user to control a virtual representation of themselves within a three-dimensional virtual world. The system and method enable positioning the user in a three-dimensional scene using two-dimensional image or video data of the user together with extracted depth information. They also provide a control system and method for the user to control their virtual representation, using the output video as a visual feedback mechanism in a three-dimensional space that includes the virtual representation. A user may interact with other virtual objects or items in the scene, or even with other users visualized in the scene.