Patent classifications
G06F3/0325
Four dimensional energy-field package assembly
Four dimensional (4D) energy-field package assembly for projecting energy fields according to a 4D coordinate function. The 4D energy-field package assembly includes an energy-source system having energy sources capable of providing energy to energy locations, and energy waveguides for directing energy from the energy locations from one side of the energy waveguide to another side of the energy waveguide along energy propagation paths.
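The abstract's "4D coordinate function" can be illustrated as pairing 2D energy locations with 2D propagation directions. A minimal sketch, with all names and the (x, y, u, v) parameterization assumed for illustration:

```python
def four_d_coordinates(energy_locations, waveguide_angles):
    # Every (x, y) energy location is paired with every propagation
    # direction (u, v), yielding the 4D coordinates (x, y, u, v)
    # over which the coordinate function ranges. Illustrative only.
    return [(x, y, u, v)
            for (x, y) in energy_locations
            for (u, v) in waveguide_angles]
```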
Hand controller for robotic surgery system
A robotic control system has a wand that emits multiple narrow beams of light. The beams fall on a light sensor array (or, with a camera, on a surface), defining the wand's changing position and attitude, which a computer uses to direct the relative motion of robotic tools or remote processes, much as a mouse does but in three dimensions. The system includes motion compensation means and means for reducing latency.
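One way to recover a pointing position from the beam spots is to take their centroid on the sensor array. A minimal sketch, assuming spots arrive as 2D coordinates; attitude estimation from the spots' relative geometry is omitted:

```python
def wand_position_from_spots(spots):
    # Estimate the wand's pointing position as the centroid of the
    # bright spots its beams cast on the sensor array. Illustrative
    # reduction; real systems would also solve for attitude.
    n = len(spots)
    cx = sum(x for x, y in spots) / n
    cy = sum(y for x, y in spots) / n
    return cx, cy
```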
TRACKING SYSTEM, TRACKING METHOD AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM
A tracking method, for tracking an object based on computer vision, includes the following steps. A series of images is captured by a tracking camera. A first position of a trackable device is tracked within the images. An object is recognized around the first position in the images. In response to the object being recognized, a second position of the object is tracked in the images.
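The handoff the abstract describes (track the device first, then switch to the object once it is recognized nearby) can be sketched as a loop. The detect/recognize helpers and frame format below are hypothetical stand-ins for real computer-vision calls:

```python
def track_trackable_device(frame):
    # Hypothetical: returns the (x, y) of the trackable device.
    return frame["device_pos"]

def recognize_object_near(frame, pos, radius=50):
    # Hypothetical: returns the object's (x, y) if one is found
    # within `radius` pixels of `pos`, else None.
    obj = frame.get("object_pos")
    if obj and abs(obj[0] - pos[0]) <= radius and abs(obj[1] - pos[1]) <= radius:
        return obj
    return None

def track(frames):
    # Track the device's first position; once the object is recognized
    # around it, report the object's second position instead.
    positions = []
    object_found = False
    for frame in frames:
        first_pos = track_trackable_device(frame)
        if not object_found and recognize_object_near(frame, first_pos):
            object_found = True
        positions.append(frame["object_pos"] if object_found else first_pos)
    return positions
```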
DEVICES AND METHODS FOR GENERATING INPUT
Devices and methods are disclosed for generating input. In one implementation, a stylus is provided for generating writing input. The stylus includes an elongated body having a distal end, and a light source configured to project coherent light on an opposing surface adjacent the distal end. The stylus further includes at least one sensor configured to measure first reflections of the coherent light from the opposing surface while the distal end moves in contact with the opposing surface, and to measure second reflections of the coherent light from the opposing surface while the distal end moves above the opposing surface and out of contact with the opposing surface. The stylus also includes at least one processor configured to receive input from the at least one sensor and to enable determining three dimensional positions of the distal end based on the first reflections and the second reflections.
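Determining three-dimensional tip positions from successive reflection measurements amounts to accumulating per-frame displacements. A dead-reckoning sketch, assuming the reflections have already been reduced to (dx, dy, dz) increments (that reduction is the patent's subject and is not shown):

```python
def integrate_displacements(displacements, start=(0.0, 0.0, 0.0)):
    # Accumulate per-frame (dx, dy, dz) displacements into 3D tip
    # positions; dz covers motion above and out of contact with the
    # surface. Illustrative only.
    x, y, z = start
    path = []
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
        path.append((x, y, z))
    return path
```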
SYSTEMS AND METHODS FOR AUGMENTING AN APPEARANCE OF A HILT TO SIMULATE A BLADED WEAPON
This disclosure relates to systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon. A hilt may be augmented with a blade of a bladed weapon by detecting a landmark associated with the hilt, determining a position and/or an orientation of the hilt, determining an overlay image comprising the blade of the bladed weapon, wherein the blade is placed within the overlay image according to the determined position and/or the determined orientation of the hilt, and displaying the overlay image so that the blade of the bladed weapon appears to be attached to the hilt.
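Placing the blade within the overlay according to the hilt's position and orientation reduces, in 2D, to projecting a segment from the hilt along its orientation. A minimal sketch with assumed units (pixels, degrees from the +x axis):

```python
import math

def blade_endpoints(hilt_pos, angle_deg, blade_len):
    # Given the hilt's 2D position and orientation, return the blade's
    # base and tip so the overlay appears attached to the hilt.
    a = math.radians(angle_deg)
    tip = (hilt_pos[0] + blade_len * math.cos(a),
           hilt_pos[1] + blade_len * math.sin(a))
    return hilt_pos, tip
```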
CONTROL SYSTEM FOR NAVIGATING A PRINCIPAL DIMENSION OF A DATA SPACE
Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
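Translating a gesture to a gesture signal can be sketched as thresholding the change in absolute three-space position. Mapping the principal dimension to the z axis, and the zoom semantics and threshold, are assumptions for illustration:

```python
def gesture_to_signal(prev, curr, threshold=0.05):
    # Translate the change in absolute three-space hand position into
    # a discrete navigation signal along the principal dimension
    # (taken here to be z). Illustrative only.
    dz = curr[2] - prev[2]
    if dz > threshold:
        return "zoom_out"
    if dz < -threshold:
        return "zoom_in"
    return "hold"
```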
OPERATING DEVICE
An operating device includes a first light-emitting area provided on a front surface of the operating device, a first light-transmitting member that is formed with a material that transmits light and that makes up the first light-emitting area, a second light-emitting area provided on an upper surface of the operating device, a second light-transmitting member that is formed with a material that transmits light and that makes up the second light-emitting area, a light source, and a light guide member adapted to guide light of the light source to the first light-transmitting member and the second light-transmitting member. The light guide member includes a first guide section arranged behind the first light-transmitting member to guide light to the first light-transmitting member, and a second guide section that extends upward beyond a position of the first light-transmitting member toward the second light-transmitting member.
Devices and methods for monitoring gaze
A gaze monitoring system comprising an eye tracker that includes: a camera having an optical axis; a first IR source configured to illuminate a user's eyes, located relatively near the optical axis of the camera; and at least one second IR source configured to illuminate the user's eyes, located relatively far from the camera's optical axis, in a position such that when the gaze angle is too large to get corneal reflection images of the first IR source, the corneal reflection of the at least one second IR source remains visible to the camera.
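The source-selection logic implied by the abstract can be sketched as a cutoff on gaze angle: use the near-axis source while its corneal reflection is available, otherwise fall back to the off-axis source. The cutoff value below is an assumed placeholder, not from the patent:

```python
def select_ir_source(gaze_angle_deg, max_on_axis_angle=25.0):
    # Use the first (near-axis) IR source's corneal reflection for
    # small gaze angles; switch to the second (off-axis) source when
    # the angle is too large. The 25-degree cutoff is illustrative.
    return "first" if abs(gaze_angle_deg) <= max_on_axis_angle else "second"
```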
Systems and methods for detection of objects within a field of view of an image capture device
Robotic surgical systems and methods of operating them are provided. The methods include directing light at an optical element configured to be detected by an image capture device of the robotic surgical system, the optical element being configured to reflect light having a wavelength within a predetermined range; detecting, using the image capture device capturing images of the optical element, an absence or a presence of the reflected light from the optical element; and providing a notification in response to the detection by the image capture device of the absence of the reflected light from the optical element.
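The presence/absence check can be sketched as a test that reflected light falls inside the predetermined wavelength range with sufficient intensity, with a notification on absence. The band, threshold, and message below are illustrative assumptions:

```python
def check_optical_element(intensity, wavelength_nm,
                          band=(800.0, 900.0), threshold=0.5):
    # Reflected light counts as present only if its wavelength lies in
    # the predetermined range (assumed 800-900 nm here) and its
    # intensity exceeds a threshold; absence yields a notification.
    present = band[0] <= wavelength_nm <= band[1] and intensity > threshold
    return None if present else "notify: optical element not detected"
```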
Adaptive visual overlay for anatomical simulation
An anatomical feature simulation unit is a physical device designed to help simulate an anatomical feature (e.g., a wound) on an object (e.g., a human being or human surrogate such as a medical manikin) for instructing a trainee to learn or practice treatment skills. For the trainee, the simulation looks like a real body part when viewed using an Augmented Reality (AR) system. Responsive to a change in the anatomic state of the object (e.g., bending a knee or raising an arm), not only does the spatial location and orientation of the anatomical feature stay locked on the object in the AR system, but the characteristics of the anatomical feature also change based on the physiologic logic of the changed anatomical state (e.g., greater or lesser blood flow, opening or closing of a wound).
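The "physiologic logic" tying wound characteristics to anatomic state can be sketched as a function of joint angle. The linear rule and all numbers below are assumptions for illustration, not the patent's model:

```python
def wound_blood_flow(base_flow, joint_angle_deg, gain=0.01):
    # Illustrative physiologic rule: simulated blood flow scales with
    # joint flexion, so the AR overlay can, e.g., darken or open the
    # wound as the knee bends. Linear rule and gain are assumed.
    return base_flow * (1.0 + gain * joint_angle_deg)
```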