Patent classifications
G06F3/0325
RETRO-REFLECTIVE DISC TARGET
A retro-reflective marker comprising a bare retro-reflective layer; a protective layer with a near-infrared (NIR) wavelength specific anti-reflective coating; and a border with an NIR absorbent coating.
Systems and methods of displaying virtual elements on a multipositional display
A method of presenting visual information to a user includes detecting a position of a display relative to a surrounding environment, detecting movement of the display relative to the surrounding environment in real time, and updating a presentation of visual information on the display relative to a virtual reference frame updated in real time based upon the movement of the display relative to the surrounding environment.
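The core idea of this abstract, keeping a displayed element anchored to the surrounding environment as the display moves, can be sketched in two dimensions as follows. This is an illustrative sketch only; the function, its parameters, and the pixels-per-meter scale are assumptions, not taken from the patent:

```python
def screen_offset(display_pos, anchor_pos, pixels_per_meter=1000.0):
    """Where to draw a world-anchored element on screen, given the
    display's current position in the environment (2-D sketch;
    units and scale are assumed for illustration)."""
    # As the display moves, the element's on-screen position shifts
    # by the opposite amount, so it appears fixed in the environment.
    dx = anchor_pos[0] - display_pos[0]
    dy = anchor_pos[1] - display_pos[1]
    return (dx * pixels_per_meter, dy * pixels_per_meter)
```

Repeating this computation on every movement update implements the real-time virtual reference frame described above.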
HAND CONTROLLER FOR ROBOTIC SURGERY SYSTEM
A robotic control system has a wand that emits multiple narrow beams of light, which fall on a light sensor array or, when a camera is used, on a surface, defining the wand's changing position and attitude. A computer uses this information to direct relative motion of robotic tools or remote processes, such as those controlled by a mouse, but in three dimensions; the system further includes motion compensation means and means for reducing latency.
Arrangement for the relocating of virtual object images within a real non-electronic space
An arrangement for the relocating of virtual object images within a real non-electronic space includes a virtual object image creator adapted to project a plurality of virtual object images in a real non-electronic space, and at least one activatable tangible object. The or each activatable tangible object is locatable within a corresponding virtual object image created by the virtual object image creator, such that activation of the activatable tangible object by a user, while the activatable tangible object is located within a virtual object image, allows physical movement of the virtual object image within the real non-electronic space that corresponds with the physical movement and location of the activated activatable tangible object.
Marker-based tracking apparatus and method
A data processing device comprises: an analyser to analyse successive images captured by a camera and to detect an optically detectable marker in the captured images; a first location detector to detect a location of the optically detectable marker with respect to a location of the camera according to a first detection mode and to generate a first detection result; a second location detector to detect the location of the optically detectable marker with respect to the location of the camera according to a second detection mode different to the first detection mode and to generate a second detection result; and a processor to select at least one of the first detection result and the second detection result and to generate data indicative of the location of the optically detectable marker with respect to the location of the camera based on the selection.
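The selection step described above can be sketched as confidence-based fusion of the two detection results. This is only one possible reading: the `Detection` type, the confidence scores, and the blending heuristic are assumptions for illustration, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    # Marker position relative to the camera, plus a confidence score.
    x: float
    y: float
    z: float
    confidence: float  # 0.0 .. 1.0 (assumed scale)

def select_detection(first: Detection, second: Detection,
                     blend_margin: float = 0.1) -> Detection:
    """Pick one result, or blend both when their confidences are close."""
    diff = first.confidence - second.confidence
    if abs(diff) <= blend_margin:
        # Confidence-weighted average of the two position estimates.
        w = first.confidence / (first.confidence + second.confidence)
        return Detection(
            x=w * first.x + (1 - w) * second.x,
            y=w * first.y + (1 - w) * second.y,
            z=w * first.z + (1 - w) * second.z,
            confidence=max(first.confidence, second.confidence),
        )
    return first if diff > 0 else second
```

Blending when the modes agree in confidence, rather than always choosing one, avoids discontinuous jumps in the reported marker location.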
Systems for simulating joining operations using mobile devices
Systems are disclosed relating to a mobile device mounted to a welding helmet such that a wearer of the welding helmet can see a display of the mobile device when wearing the welding helmet. In some examples, the mobile device is mounted such that a camera of the mobile device is unobscured and positioned at approximately eye level, facing the same way the wearer's eyes are facing. In some examples, the simulated training environment may be presented to the user via the display screen of the mobile device, using images captured by the camera of the mobile device, when the mobile device is so mounted to the welding helmet.
Virtual reality
A virtual reality apparatus includes a head mountable display (HMD); a detector to detect a deviation of a current orientation of the HMD from a base orientation of the HMD; and a generator to generate content for presentation to the wearer of the HMD to prompt the wearer of the HMD to turn his head so as to change the orientation of the HMD towards the base orientation.
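The deviation detection and prompting described above can be sketched for the yaw axis as follows. The wrap-around handling, tolerance, and the convention that positive yaw is counter-clockwise (leftward) are assumptions for illustration:

```python
def yaw_deviation_deg(current_yaw: float, base_yaw: float) -> float:
    """Signed deviation of the current HMD yaw from the base yaw,
    wrapped into (-180, 180] degrees."""
    d = (current_yaw - base_yaw) % 360.0
    return d - 360.0 if d > 180.0 else d

def prompt_direction(current_yaw: float, base_yaw: float,
                     tolerance: float = 5.0) -> str:
    """Which way to prompt the wearer to turn their head, if at all.
    Assumes positive yaw = counter-clockwise, so a positive deviation
    (head turned left of base) prompts a rightward turn."""
    d = yaw_deviation_deg(current_yaw, base_yaw)
    if abs(d) <= tolerance:
        return "none"
    return "right" if d > 0 else "left"
```

The modulo-based wrapping matters: without it, a base yaw of 10 degrees and a current yaw of 350 degrees would read as a 340-degree deviation rather than 20 degrees the other way.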
Sensor-based Bare Hand Data Labeling Method and System
A sensor-based bare hand data labeling method and system are provided. The method comprises: performing device calibration processing on a depth camera and on one or more sensors respectively preset at one or more specified positions of a bare hand, so as to acquire coordinate transformation data; collecting a depth image of the bare hand by the depth camera, and collecting 6DoF data of one or more bone points; acquiring, based on the 6DoF data and the coordinate transformation data, three-dimensional position information of a preset number of bone points; determining two-dimensional position information of the preset number of bone points on the depth image based on the three-dimensional position information of the preset number of bone points; and labeling joint information on all of the bone points in the depth image according to the two-dimensional position information and the three-dimensional position information.
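The geometric core of the method above, transforming sensor-frame bone points into the depth camera's frame and projecting them onto the depth image, can be sketched with a rigid transform and a pinhole camera model. The intrinsics (fx, fy, cx, cy) and the shape conventions are assumptions, not specifics from the patent:

```python
import numpy as np

def sensor_to_camera(points: np.ndarray, R: np.ndarray,
                     t: np.ndarray) -> np.ndarray:
    """Apply the calibration-derived rigid transform (R, t) to move
    N x 3 sensor-frame points into the depth camera's frame."""
    return points @ R.T + t

def project_to_depth_image(points_3d: np.ndarray,
                           fx: float, fy: float,
                           cx: float, cy: float) -> np.ndarray:
    """Project N x 3 camera-frame bone points onto the depth image
    using a pinhole model (no lens distortion modelled here)."""
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=1)
```

Chaining these two steps yields, for each bone point, both the three-dimensional position and the corresponding two-dimensional pixel location used to label joints in the depth image.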
Animation production system
The principal invention for solving the above-described problem is an animation production method that provides a virtual space in which a given object is placed, the method comprising: detecting an operation of a user equipped with a head mounted display; controlling a movement of an object based on the detected operation of the user; shooting the movement of the object; storing action data relating to the movement of the object as shot in a first track; and storing audio from the user in a second track.
Systems and methods for interfacing with head worn display systems
A display system includes an interface unit, a head worn display in wireless communication with the interface unit, and a first tracker sensor remote from a head of a user and configured to wirelessly sense a head pose and provide first head tracking data to the interface unit. The interface unit is remote from the head worn display. The display system also includes a second tracker sensor associated with the head of the user and configured to provide second head tracking data associated with the head pose to the head worn display. The interface unit is configured to receive the second head tracking data from the head worn display via at least one wireless link and provide video information for display on the head worn display via the at least one wireless link.