Patent classifications
A61F4/00: Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body
CONTROLLING DEVICES USING FACIAL MOVEMENTS
A system for controlling at least one device includes a pair of glasses having a glasses frame. A plurality of magnetic sensors, a processor coupled to the plurality of magnetic sensors, and a wireless communication transmitter coupled to the processor are arranged on or in the glasses frame. A plurality of magnetic skin tags are arranged on a human face. The plurality of magnetic sensors sense movement of at least one of the plurality of magnetic skin tags and transmit a signal corresponding to the sensed movement to the processor. The processor, responsive to receipt of the signal corresponding to the sensed movement, transmits a signal for controlling the at least one device via the wireless communication transmitter to a processor of a power-driven mobility device.
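A minimal sketch of the sensing-to-command loop this abstract describes, in Python. Everything here is an assumption for illustration: the sensor and radio interfaces, the gesture names, the command bytes, and the detection threshold are not specified by the patent.

    # Hypothetical mapping from facial gesture to mobility-device command.
    import time

    GESTURE_TO_COMMAND = {
        "cheek_raise": b"\x01",  # e.g. drive forward
        "jaw_clench":  b"\x02",  # e.g. stop
        "brow_raise":  b"\x03",  # e.g. turn
    }

    def classify_gesture(readings, baseline, threshold=5.0):
        """Map a change in the magnetic field at each sensor to a facial
        gesture. A real system would likely use a trained classifier; here
        we assume the sensor with the largest deviation from baseline
        identifies which magnetic skin tag moved."""
        deltas = [abs(r - b) for r, b in zip(readings, baseline)]
        peak = max(deltas)
        if peak < threshold:
            return None
        sensor_idx = deltas.index(peak)
        # Assumed fixed mapping from sensor position to gesture.
        return ["cheek_raise", "jaw_clench", "brow_raise"][sensor_idx % 3]

    def control_loop(sensors, radio):
        """sensors: objects with read_microtesla(); radio: object with
        send(bytes). Both interfaces are invented for this sketch."""
        baseline = [s.read_microtesla() for s in sensors]
        while True:
            readings = [s.read_microtesla() for s in sensors]
            gesture = classify_gesture(readings, baseline)
            if gesture is not None:
                # Forward the command to the mobility device's processor.
                radio.send(GESTURE_TO_COMMAND[gesture])
            time.sleep(0.02)  # ~50 Hz polling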
MOTOR VEHICLE HAND CONTROL FOR DIFFERENTLY ABLED INDIVIDUALS
A disclosed vehicle control system provides control over complex vehicle functions, including transmission shifting, to enable differently abled drivers to operate a motor vehicle.
Visually directed human-computer interaction for medical applications
The present invention relates to a method and apparatus for utilizing an eye detection apparatus in a medical application, which includes calibrating the eye detection apparatus to a user; performing a predetermined set of visual and cognitive steps using the eye detection apparatus; determining a visual profile of the user's workflow; creating a user-specific database to create an automated visual display protocol for that workflow; storing eye-tracking commands for individual user navigation and computer interactions; storing context-specific medical-application eye-tracking commands in a database; performing the medical application using the eye-tracking commands; and storing, in the database, eye-tracking data and the results of analyzing data from performance of the medical application. The method also includes analyzing the database to determine best-practice guidelines based on clinical outcome measures.
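The claimed method is essentially a pipeline: calibrate, profile the workflow, persist the user's commands, run the application, and store the results for later analysis. A minimal sketch of that pipeline follows; every class, method, and table name is invented for illustration, since the patent publishes no API.

    import sqlite3

    def run_session(eye_tracker, user_id, db_path="profiles.db"):
        """eye_tracker is an assumed object wrapping the eye detection
        apparatus; its methods below are hypothetical."""
        db = sqlite3.connect(db_path)
        db.execute("""CREATE TABLE IF NOT EXISTS events
                      (user_id TEXT, kind TEXT, payload TEXT)""")

        # 1. Calibrate the eye detection apparatus to this user.
        eye_tracker.calibrate(user_id)

        # 2-3. Run the predetermined visual/cognitive steps and derive a
        #      visual profile of the user's workflow from the gaze data.
        profile = eye_tracker.profile_workflow(user_id)

        # 4-6. Persist the user's eye-tracking commands, both generic
        #      navigation commands and context-specific medical ones.
        for cmd in profile.commands:
            db.execute("INSERT INTO events VALUES (?, ?, ?)",
                       (user_id, "command", cmd))

        # 7-8. Perform the medical application via eye-tracking commands
        #      and store the resulting data and analysis summary.
        results = eye_tracker.run_application(profile)
        db.execute("INSERT INTO events VALUES (?, ?, ?)",
                   (user_id, "results", results.summary()))
        db.commit()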
MOTION STABILIZATION BY A HANDHELD TOOL
Systems and methods for tracking unintentional muscle movements of a user and stabilizing a handheld tool while it is being used by the user are described. The method may include detecting motion of a handle of the handheld tool manipulated by a user while the user is performing a task with a user-assistive device attached to an attachment arm of the handheld tool. Furthermore, the method may include storing the detected motion in a memory of the handheld tool as motion data. The method may also include controlling, based on the motion data, a motion-generating mechanism of the handheld tool that moves the attachment arm relative to the handle in a single degree of freedom in a direction of the detected motion of the handle.
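The abstract amounts to a one-axis feedback loop: measure handle motion, log it, separate tremor from intentional motion, and drive the attachment arm along its single degree of freedom to compensate. A minimal sketch follows; the filter choice (an exponential moving average), the sensor and actuator interfaces, and the sign convention are assumptions, not details from the patent.

    class Stabilizer:
        def __init__(self, imu, actuator, alpha=0.1):
            self.imu = imu            # assumed: reports handle displacement (mm)
            self.actuator = actuator  # assumed: drives the attachment arm (1 DOF)
            self.alpha = alpha        # smoothing constant for the assumed filter
            self.slow = 0.0           # estimate of slow, intentional motion
            self.log = []             # the "memory of the handheld tool"

        def step(self):
            x = self.imu.read_displacement()
            self.log.append(x)  # store the detected motion as motion data

            # Separate slow, intentional motion from fast tremor with a
            # simple exponential moving average (an assumed filter choice).
            self.slow += self.alpha * (x - self.slow)
            tremor = x - self.slow

            # Command the arm along its single degree of freedom so the
            # attachment (e.g. a spoon) stays steady despite the tremor.
            # The sign depends on the chosen frame of reference.
            self.actuator.move_to(-tremor)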
REQUESTING ASSISTANCE BASED ON USER STATE
Assistance may be provided to a first user of a first device by monitoring sensors of the first device and providing assistance via a character presented by the device. Sensor data, such as audio, video, or biometric data, may be transmitted to a server, and the server may process the sensor data to determine a state of the first user. The server may determine to request assistance based on the state of the first user and may then send a request to a second device of a second user to guide the character presented by the first device. The second user may then provide assistance to the first user by guiding the character presented by the first device.
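A minimal sketch of the server-side flow: classify the first user's state from uploaded sensor data and, if assistance seems warranted, ask a second user's device to guide the character. The state heuristic, the message format, and the helper-selection call are all invented for illustration.

    import json

    def needs_assistance(sensor_data):
        """Assumed placeholder: a real system might run audio, video, or
        biometric models here to estimate the first user's state."""
        return (sensor_data.get("heart_rate", 0) > 120
                or sensor_data.get("distress_score", 0.0) > 0.8)

    def handle_sensor_upload(message, helpers):
        """message: JSON from the first device; helpers: assumed registry
        of second users' devices with pick_available()."""
        data = json.loads(message)
        if not needs_assistance(data["sensors"]):
            return
        # Ask a second user's device to take over guiding the character
        # presented on the first user's device.
        request = json.dumps({
            "type": "guide_character_request",
            "first_device": data["device_id"],
            "user_state": data["sensors"],
        })
        helpers.pick_available().send(request)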
Method for detecting voluntary movements of structures in the ear to trigger user interfaces
A sensor detects voluntary movements of ear structures, including the eardrum complex, and triggers user interfaces of electronic devices to enable communication and other activities through interaction with assistive technology. The method of detecting this voluntary movement may also be used to trigger and control user interfaces in connected devices such as mobile telephones, and may be incorporated into multi-function earphones.
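One plausible reading of the trigger mechanism is a thresholded in-ear signal with debouncing, sketched below. The sensor interface, threshold, and debounce window are assumptions; the abstract gives no such parameters.

    import time

    def watch_for_trigger(ear_sensor, on_trigger,
                          threshold=3.0, debounce_s=0.5):
        """Fire on_trigger() when the in-ear signal (e.g. a canal pressure
        change caused by a deliberate eardrum movement) exceeds a threshold,
        ignoring repeats inside the debounce window. ear_sensor is an
        assumed object with read()."""
        last = 0.0
        baseline = ear_sensor.read()
        while True:
            level = abs(ear_sensor.read() - baseline)
            now = time.monotonic()
            if level > threshold and now - last > debounce_s:
                last = now
                on_trigger()  # e.g. advance a scanning keyboard
            time.sleep(0.01)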
Hand support method and device for somatosensory input to the palm
A hand support device and method for somatosensory input to the palm comprise a support portion and an attachment portion. The support portion forms a rounded surface that supports a hand of a user. The attachment portion attaches to a structure. The device facilitates purposeful motor movement with accurate motor control for users with dyspraxia when employed in conjunction with evidence-based developmental pedagogy in highly stimulating tasks such as music education per the Rancer Method (Kupferstein & Walsh, 2015). In one example, the hand support device attaches to a piano, clamping to the keyslip and allowing the user to rest their hand while playing the piano. In another example, the hand support device is adapted to attach to a surface of a laptop or desk, with a base surface disposed above the laptop, allowing the user to rest their hand while typing on the keyboard.