Patent classifications
A63F13/21
Resistance control systems and methods for amusement attractions
A resistance control system of an amusement attraction includes a support assembly having a base, a pivot joint, and a support beam extending between the base and the pivot joint. The resistance control system includes a spring plate coupled to the pivot joint of the support assembly, and includes at least one spring engaged with the spring plate. Additionally, the resistance control system includes an actuator plate positioned between the spring plate and the base of the support assembly, as well as at least one actuator coupled between the actuator plate and the base. The at least one actuator is configured to move and secure the actuator plate relative to the pivot joint to adjust a resistance to movement about the pivot joint.
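The adjustable resistance described above can be illustrated with a minimal sketch (not from the patent): assuming a linear spring and treating the actuator plate's displacement as setting the spring's compression, the resistance torque about the pivot joint follows from Hooke's law and the lever arm. The function name and parameters are hypothetical.

```python
def pivot_resistance(spring_rate, plate_offset, lever_arm):
    """Approximate the resistance torque about the pivot joint.

    spring_rate  -- spring constant k in N/m (assumed linear)
    plate_offset -- actuator plate displacement compressing the spring, in m
    lever_arm    -- distance from the pivot joint to the spring contact, in m
    """
    spring_force = spring_rate * plate_offset   # Hooke's law: F = k * x
    return spring_force * lever_arm             # torque = F * r
```

Moving the actuator plate toward the pivot (larger `plate_offset`) stiffens the joint; retracting it softens the ride, which is how one actuator setting can serve riders of different strengths.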
Interference based augmented reality hosting platforms
Interference-based augmented reality hosting platforms are presented. Hosting platforms can include networking nodes capable of analyzing a digital representation of a scene to derive interference among elements of the scene. The hosting platform utilizes the interference to adjust the presence of augmented reality objects within an augmented reality experience. Elements of a scene can constructively interfere, enhancing the presence of augmented reality objects, or destructively interfere, suppressing the presence of augmented reality objects.
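The constructive/destructive interference idea can be sketched as a signed scoring function (an illustration, not the patent's method): each scene element contributes a positive (constructive) or negative (destructive) term to an AR object's presence score, which is then clamped. All names here are hypothetical.

```python
def object_presence(base_presence, interference):
    """Adjust an AR object's presence score from per-element interference.

    base_presence -- nominal presence in [0, 1]
    interference  -- iterable of signed contributions: positive values
                     (constructive) enhance presence, negative values
                     (destructive) suppress it
    """
    score = base_presence + sum(interference)
    return max(0.0, min(1.0, score))   # clamp to [0, 1]
```

A score of 0 would suppress the object entirely, while a score near 1 would render it prominently in the experience.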
Systems and methods for assisting virtual gestures based on viewing frustum
An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.
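The path-adjustment step can be sketched as blending the motion input's launch direction toward the gaze vector (a minimal illustration, not the patented implementation; the function name and the `assist` weight are assumptions):

```python
import math

def adjust_throw(motion_dir, gaze_dir, speed, assist=0.3):
    """Blend a thrown object's launch direction toward the gaze vector.

    motion_dir -- (x, y, z) unit vector from the user's motion input
    gaze_dir   -- (x, y, z) unit vector from head/eye tracking
    speed      -- launch speed derived from the motion input, in m/s
    assist     -- blend weight in [0, 1]; 0 = raw motion, 1 = full gaze
    """
    blended = tuple((1 - assist) * m + assist * g
                    for m, g in zip(motion_dir, gaze_dir))
    norm = math.sqrt(sum(c * c for c in blended))
    direction = tuple(c / norm for c in blended)       # re-normalize
    return tuple(speed * c for c in direction)         # adjusted velocity
```

The simulated flight path would then be integrated from this adjusted velocity, so the object tends to travel where the avatar is looking rather than exactly where the gesture pointed.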
Fine-motion virtual-reality or augmented-reality control using radar
This document describes techniques for fine-motion virtual-reality or augmented-reality control using radar. These techniques enable small motions and displacements to be tracked, even at the millimeter or sub-millimeter scale, for user control actions even when those actions are small, fast, or obscured due to darkness or varying light. Further, these techniques enable fine resolution and real-time control, unlike conventional RF-tracking or optical-tracking techniques.
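Sub-millimeter sensitivity is plausible because radar phase is far finer-grained than radar range resolution: for a monostatic radar the round trip doubles the path, so a full 2π phase cycle corresponds to only half a wavelength of radial motion. A minimal sketch of that standard relationship (not taken from this document; the function name is hypothetical):

```python
import math

def displacement_from_phase(delta_phase, frequency_hz):
    """Estimate radial displacement from the phase change of a radar echo.

    delta_phase  -- unwrapped phase change between measurements, in radians
    frequency_hz -- radar carrier frequency (e.g. 60e9 for a 60 GHz radar)

    Round-trip propagation doubles the path, so:
        d = wavelength * delta_phase / (4 * pi)
    """
    wavelength = 3e8 / frequency_hz      # c / f
    return wavelength * delta_phase / (4 * math.pi)
```

At 60 GHz the wavelength is 5 mm, so even a few degrees of measurable phase change corresponds to tens of micrometers of motion, which is consistent with the millimeter and sub-millimeter tracking claimed above.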
Systems and methods for facilitating virtual vehicle operation based on real-world vehicle operation data
Systems and methods for facilitating virtual operation of a virtual vehicle within a virtual environment are disclosed. According to aspects, a computing device may access a data model indicative of real-life operation of a real-life vehicle by a real-life operator and, based on the data model, generate a set of virtual vehicle movements that are reflective of a performance of the real-life operation of the real-life vehicle by the real-life operator. The computing device may display, in a user interface, the virtual vehicle undertaking the set of virtual vehicle movements such that the real-life operator may review the virtual movements and potentially be motivated to improve his/her real-life vehicle operation.
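One simple way to turn a real-life operation data model into a set of virtual vehicle movements is dead reckoning over telemetry samples. The sketch below is purely illustrative (the sample format and function name are assumptions, not the patent's data model):

```python
import math

def movements_from_telemetry(samples):
    """Convert driving telemetry into virtual vehicle waypoints.

    samples -- list of (dt_seconds, speed_mps, heading_radians) tuples
               drawn from the real-life operation data (assumed format)

    Returns the (x, y) positions the virtual vehicle visits, by
    dead-reckoning each sample from the previous position.
    """
    x = y = 0.0
    path = [(x, y)]
    for dt, speed, heading in samples:
        x += speed * dt * math.cos(heading)
        y += speed * dt * math.sin(heading)
        path.append((x, y))
    return path
```

Replaying such a path in the user interface lets the operator watch a virtual vehicle reproduce their own driving performance.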
Play system and method for detecting toys
A play system, comprising: one or more toys comprising one or more electrically conductive parts, the one or more conductive parts defining a spatial pattern, the one or more toys having a physical configuration that is modifiable by a user, the spatial pattern being dependent on the physical configuration of the one or more toys, the one or more toys including a first toy; a magnetic field generating device arranged to generate a magnetic field for inducing an eddy current in one or more of the electrically conductive parts; a sensor configured to detect the induced eddy current; and a data processor; wherein the first toy and at least one of the magnetic field generating device and the sensor are movable relative to each other; wherein the sensor is configured to detect the induced eddy current during relative movement between the first toy and at least one of the magnetic field generating device and the sensor, and wherein the data processor is configured to: receive sensor data from the sensor, the sensor data being indicative of the eddy current detected during relative movement between the first toy and at least one of the magnetic field generating device and the sensor, detect the spatial pattern of the electrically conductive parts based at least in part on the received sensor data, determine the physical configuration of the one or more toys based on the detected spatial pattern.
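The final step of the claim, mapping a detected spatial pattern to a toy configuration, can be sketched as a nearest-match lookup against known patterns. This is an illustration only; the pattern encoding (eddy-current peak spacings) and all names are assumptions, not the claimed implementation:

```python
def match_configuration(detected_pattern, known_configs):
    """Map a detected conductive-part pattern to a toy configuration.

    detected_pattern -- sequence of eddy-current peak spacings measured
                        during relative movement (hypothetical encoding)
    known_configs    -- dict mapping configuration name to its expected
                        pattern in the same encoding
    """
    best, best_err = None, float("inf")
    for name, expected in known_configs.items():
        if len(expected) != len(detected_pattern):
            continue                     # patterns of different length can't match
        err = sum((d - e) ** 2 for d, e in zip(detected_pattern, expected))
        if err < best_err:
            best, best_err = name, err
    return best
```

Because the spatial pattern depends on how the user has physically configured the toy, recognizing the pattern is enough to recover that configuration.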