Patent classifications
G06V30/228
Color-sensitive virtual markings of objects
Disclosed are systems, methods, and non-transitory computer readable media for making virtual colored markings on objects. Instructions may include receiving an indication of an object; receiving from an image sensor an image of a hand of an individual holding a physical marking implement; detecting in the image a color associated with the marking implement; receiving from the image sensor image data indicative of movement of a tip of the marking implement and locations of the tip; determining from the image data when the locations of the tip correspond to locations on the object; and generating, in the detected color, virtual markings on the object at the corresponding locations.
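The marking flow in the abstract above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation: it assumes color detection and tip tracking have already produced per-sample (x, y) tip positions plus a color name, and simply records a virtual marking whenever the tip falls inside the object's region. The `VirtualCanvas` class and its rectangular object region are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualCanvas:
    # Object region in image coordinates: (x_min, y_min, x_max, y_max)
    region: tuple
    markings: list = field(default_factory=list)

    def on_tip_sample(self, x, y, color):
        """Record a marking at (x, y) in the detected color if the tip is on the object."""
        x0, y0, x1, y1 = self.region
        if x0 <= x <= x1 and y0 <= y <= y1:
            self.markings.append((x, y, color))
            return True
        return False

canvas = VirtualCanvas(region=(0, 0, 100, 50))
canvas.on_tip_sample(10, 20, "red")   # tip over the object -> marking generated
canvas.on_tip_sample(150, 20, "red")  # tip off the object -> ignored
```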
DEVICES AND METHODS FOR GENERATING INPUT
Devices and methods are disclosed for generating input. In one implementation, a stylus is provided for generating writing input. The stylus includes an elongated body having a distal end, and a light source configured to project coherent light on an opposing surface adjacent the distal end. The stylus further includes at least one sensor configured to measure first reflections of the coherent light from the opposing surface while the distal end moves in contact with the opposing surface, and to measure second reflections of the coherent light from the opposing surface while the distal end moves above the opposing surface and out of contact with the opposing surface. The stylus also includes at least one processor configured to receive input from the at least one sensor and to enable determining three dimensional positions of the distal end based on the first reflections and the second reflections.
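One way to picture the position determination described above is dead reckoning from reflection-derived displacements. The sketch below is a hedged simplification: it assumes the sensor's first and second reflection measurements have already been converted into per-sample (dx, dy, dz) displacements, which stands in for the patent's actual coherent-light signal processing. The `TipTracker` class is hypothetical.

```python
class TipTracker:
    """Accumulates displacement samples into a 3-D tip position estimate."""

    def __init__(self, x=0.0, y=0.0, z=0.0):
        self.pos = [x, y, z]

    def integrate(self, dx, dy, dz=0.0):
        """Fold one displacement sample into the running 3-D position."""
        self.pos[0] += dx
        self.pos[1] += dy
        self.pos[2] += dz
        return tuple(self.pos)

tracker = TipTracker()
tracker.integrate(1.0, 2.0)       # first reflections: tip moving in contact (z unchanged)
tracker.integrate(0.5, 0.0, 3.0)  # second reflections: tip lifted above the surface
```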
VIRTUAL CONTENT SHARING ACROSS SMART GLASSES
Systems, methods, and non-transitory computer readable media configured for enabling content sharing between users of wearable extended reality appliances are provided. In one implementation, the computer readable medium may be configured to contain instructions to cause at least one processor to establish a link between a first wearable extended reality appliance and a second wearable extended reality appliance. The first wearable extended reality appliance may display first virtual content. The second wearable extended reality appliance may obtain a command to display the first virtual content via the second wearable extended reality appliance, and in response, this content may be transmitted to and displayed via the second wearable extended reality appliance. Additionally, the first wearable extended reality appliance may receive second virtual content from the second wearable extended reality appliance, and display said second virtual content via the first wearable extended reality appliance.
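The bidirectional sharing flow described above can be sketched with an in-process "link" standing in for the wireless channel between the two appliances. Everything here is illustrative: the `Appliance` class, its method names, and the string content identifiers are assumptions, not the patent's protocol.

```python
class Appliance:
    def __init__(self, name):
        self.name = name
        self.displayed = []  # content currently displayed on this appliance
        self.peer = None

    def link(self, other):
        """Establish a link between this appliance and a second one."""
        self.peer, other.peer = other, self

    def display(self, content):
        self.displayed.append(content)

    def request_from_peer(self, content_id):
        """Obtain a command to display the peer's content, then show it locally."""
        content = self.peer.send(content_id)
        self.display(content)

    def send(self, content_id):
        # In this sketch, content is identified by the displayed item itself.
        return next(c for c in self.displayed if c == content_id)

first = Appliance("first")
second = Appliance("second")
first.link(second)

first.display("deck")              # first appliance displays first virtual content
second.request_from_peer("deck")   # second obtains and displays it
second.display("notes")
first.request_from_peer("notes")   # first receives second virtual content back
```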
Systems and methods for virtual whiteboards
Methods, systems, apparatuses, and non-transitory computer-readable media are provided for tying virtual whiteboards to physical spaces. In one implementation, the computer-readable medium includes instructions to cause a processor to receive wirelessly, an indication of a location of a first wearable extended reality appliance; perform a lookup to determine that the location of the first wearable extended reality appliance corresponds to a location of a particular virtual whiteboard; transmit to the first wearable extended reality appliance, data corresponding to content of the particular virtual whiteboard; receive, during a first time period, virtual content added by a first user; receive wirelessly at a second time period an indication that a second wearable extended reality appliance is in the location of the particular virtual whiteboard; and transmit to the second wearable extended reality appliance, data corresponding to the content and the added content of the particular virtual whiteboard.
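The location-lookup-and-transmit sequence above can be approximated with a registry that keys whiteboards by a coarse location identifier. This is a sketch under stated assumptions: the `WhiteboardRegistry` class, the string location keys, and the list-of-items content model are hypothetical, and the wireless transport is elided entirely.

```python
class WhiteboardRegistry:
    def __init__(self):
        self.boards = {}  # location identifier -> list of content items

    def lookup(self, location):
        """Find (or create) the virtual whiteboard tied to a physical location."""
        return self.boards.setdefault(location, [])

    def on_appliance_arrival(self, location):
        """Return the board content to transmit to an appliance arriving at the location."""
        return list(self.lookup(location))

    def add_content(self, location, item):
        self.lookup(location).append(item)

registry = WhiteboardRegistry()
registry.add_content("room-12", "agenda")                # pre-existing content
first_view = registry.on_appliance_arrival("room-12")    # first appliance arrives
registry.add_content("room-12", "sketch by user 1")      # content added in first time period
second_view = registry.on_appliance_arrival("room-12")   # second appliance arrives later
```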
SIMULATING USER INTERACTIONS OVER SHARED CONTENT
Methods, systems, apparatuses, and computer-readable media are provided for simulating user interactions with shared content. In one implementation, the computer-readable medium includes instructions to cause a processor to establish a communication channel for sharing content and user interactions; transmit to at least one second wearable extended reality appliance first data representing an object associated with a first wearable extended reality appliance, enabling a virtual representation of the object to be displayed through the at least one second wearable extended reality appliance; receive image data from an image sensor associated with the first wearable extended reality appliance; detect in the image data at least one user interaction including a human hand pointing to a specific portion of the object; and transmit to the at least one second wearable extended reality appliance second data indicating an area of the specific portion of the object.
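Once the pointing gesture has been detected in the image data (the detection step itself is out of scope here), the "area of the specific portion" could be communicated as a cell in a grid laid over the object. The sketch below assumes a fingertip position in image coordinates and a rectangular object; the grid resolution and the `pointed_region` helper are illustrative, not the patent's representation.

```python
def pointed_region(tip_xy, object_bounds, rows=2, cols=2):
    """Return (row, col) of the object cell the fingertip falls in, or None if off-object."""
    x, y = tip_xy
    x0, y0, x1, y1 = object_bounds
    if not (x0 <= x < x1 and y0 <= y < y1):
        return None
    col = int((x - x0) / (x1 - x0) * cols)
    row = int((y - y0) / (y1 - y0) * rows)
    return (row, col)

# Fingertip at (75, 10) over a 100x40 object: top-right quadrant
region = pointed_region((75, 10), (0, 0, 100, 40))
```

The (row, col) pair is the "second data" an implementation might transmit, letting the receiving appliance highlight the same portion of its virtual representation.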
Methods and apparatuses for producing smooth representations of input motion in time and space
The present invention provides a method and apparatus directed to accepting a plurality of positional data with corresponding times of the motion, determining one or more continuous positional functions that together represent an approximation of the path of the positional data, and determining, for each positional function, one or more time functions that together represent an approximation of the times of the positional data corresponding to that positional function.
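The decomposition above (continuous positional functions over a parameter, plus time functions over the same parameter) can be sketched with the simplest possible choice: one linear least-squares fit per coordinate and one for time. This is an assumption for illustration; the invention permits any number of functions per segment and does not prescribe linear fits.

```python
def linear_fit(s_vals, y_vals):
    """Ordinary least-squares fit y ~ a*s + b; returns (a, b)."""
    n = len(s_vals)
    mean_s = sum(s_vals) / n
    mean_y = sum(y_vals) / n
    cov = sum((s - mean_s) * (y - mean_y) for s, y in zip(s_vals, y_vals))
    var = sum((s - mean_s) ** 2 for s in s_vals)
    a = cov / var
    return a, mean_y - a * mean_s

# Samples of a straight stroke with uneven timing: (time, (x, y))
samples = [(0.0, (0.0, 0.0)), (0.1, (1.0, 2.0)), (0.3, (2.0, 4.0)), (0.4, (3.0, 6.0))]
s = list(range(len(samples)))                    # shared parameter per sample
fx = linear_fit(s, [p[0] for _, p in samples])   # positional function x(s)
fy = linear_fit(s, [p[1] for _, p in samples])   # positional function y(s)
ft = linear_fit(s, [t for t, _ in samples])      # time function t(s)

# Evaluate the smooth representation at parameter s = 2
x2 = fx[0] * 2 + fx[1]
y2 = fy[0] * 2 + fy[1]
```

Evaluating x(s) and y(s) gives the smoothed path, while t(s) recovers an approximate timestamp for any point along it, so motion can be replayed smoothly in both space and time.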
GESTURE CONTROL DEVICE AND METHOD
A gesture control system for determining which one of a plurality of electronic devices is to be controlled by a gesture acquires images of the gesture from each of the electronic devices; establishes a three-dimensional coordinate system for the gesture images; and calculates an angle between a first vector, from a start point of the gesture to a center point of each electronic device, and a second vector, from an end point of the gesture to the center point of each electronic device. The electronic device intended as the object to be controlled by the gesture can thereby be determined according to whether the angle between the first vector and the second vector is less than a preset value. A gesture control method is also provided.
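The selection criterion above reduces to a vector-angle test per candidate device. The sketch below implements that test directly; the 20° threshold, the coordinate values, and the `intended_device` helper are illustrative assumptions, not values from the patent.

```python
import math

def angle_between(v1, v2):
    """Angle in radians between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def intended_device(start, end, centers, max_angle=math.radians(20)):
    """Return the index of the device the gesture points toward, or None.

    For each candidate device center, the first vector runs from the gesture
    start point to the center and the second vector runs from the gesture end
    point to the center; the device is selected when the angle between them
    is less than the preset value.
    """
    for i, c in enumerate(centers):
        v1 = tuple(ci - si for ci, si in zip(c, start))
        v2 = tuple(ci - ei for ci, ei in zip(c, end))
        if angle_between(v1, v2) < max_angle:
            return i
    return None

# A gesture moving straight toward device 0 at (0, 0, 10)
centers = [(0.0, 0.0, 10.0), (10.0, 0.0, 0.0)]
choice = intended_device((0.0, 0.0, 0.0), (0.0, 0.0, 5.0), centers)
```

A gesture aimed directly at a device keeps both vectors nearly collinear, so the angle stays small; gestures aimed elsewhere open the angle past the threshold.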