Patent classifications
A61B3/085
A SYSTEM AND A METHOD FOR ALERTING ON VISION IMPAIRMENT
The present invention discloses a technique for alerting on vision impairment. The system comprises a processing unit configured and operable for receiving scene data being indicative of a scene of at least one consumer in an environment, identifying in the scene data a certain consumer, identifying an event being indicative of a behavioral compensation for vision impairment, and, upon identification of such an event, sending a notification relating to the vision impairment.
Methods And Kits For Assessing Neurological Function And Localizing Neurological Lesions
A method for assessing neurological function in a subject includes a) prompting the subject to follow a moving saccade-evoking stimulus on a display, b) tracking eye movement of the subject while the subject follows the moving stimulus, c) collecting first eye conjugacy data of the subject relating to the saccade-evoking stimulus, and d) comparing the first eye conjugacy data with second eye conjugacy data, the second eye conjugacy data relating to an anti-saccade stimulus.
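The comparison in step d) can be illustrated with a minimal Python sketch. All names here are hypothetical, and it assumes (as one plausible reading of the abstract) that conjugacy is quantified as the mean absolute disparity between paired left- and right-eye positions, with divergence between the saccade and anti-saccade tasks flagged against a threshold:

```python
# Hypothetical sketch: conjugacy as mean absolute left/right eye disparity.
def conjugacy_score(left_positions, right_positions):
    """Mean absolute difference between paired left/right eye positions (deg)."""
    assert len(left_positions) == len(right_positions) and left_positions
    return sum(abs(l - r) for l, r in zip(left_positions, right_positions)) / len(left_positions)

def compare_conjugacy(saccade_score, antisaccade_score, threshold_deg=0.5):
    """Flag a possible lesion when the two tasks diverge by more than a threshold."""
    return abs(saccade_score - antisaccade_score) > threshold_deg

# Example: well-yoked eyes during saccades, dysconjugate during anti-saccades.
s = conjugacy_score([1.0, 2.0, 3.0], [1.1, 2.1, 3.1])   # ~0.1 deg
a = conjugacy_score([1.0, 2.0, 3.0], [2.0, 3.5, 4.5])   # ~1.33 deg
print(compare_conjugacy(s, a))  # True
```

The 0.5-degree threshold is an arbitrary placeholder; the patent does not specify how the two data sets are compared.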
Electronic device and method of controlling the same
An example method of controlling an electronic device worn by a user includes constructing a user model by training a content feature according to response characteristics of an eye of a user who wears the electronic device, and in response to a content feature stored in the user model being detected from reproduced content during content reproduction, processing the reproduced content based on response characteristics of the eye of the user corresponding to the detected content feature.
Vestibular-ocular reflex test and training system
A system and method for testing a subject for cognitive or oculomotor impairment includes presenting the subject with a display of an object and, while the display is presented and the subject's head moves horizontally or vertically within a predefined range of movement rates, measuring the subject's right eye positions and/or left eye positions. The system generates a metric by statistical analysis of the measurements of the subject's right eye positions or left eye positions, and generates a report based on the metric. In some embodiments, the system is configured to train the subject to improve their performance, and thereby remediate or reduce cognitive and oculomotor impairment.
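The metric in systems of this kind is commonly a vestibulo-ocular reflex (VOR) gain: the ratio of eye velocity to head velocity, which is near 1.0 when the eyes counter-rotate to hold the object fixated. The abstract does not name its metric, so the following Python sketch (with hypothetical names) only illustrates that standard quantity, assuming velocities are finite differences of sampled positions:

```python
# Hypothetical sketch: VOR gain as ratio of mean eye speed to mean head speed.
def velocities(positions, dt):
    """Finite-difference velocities from positions sampled every dt seconds."""
    return [(b - a) / dt for a, b in zip(positions, positions[1:])]

def vor_gain(eye_positions, head_positions, dt=0.01):
    """Mean |eye velocity| / mean |head velocity|; near 1.0 for a healthy reflex."""
    ev = velocities(eye_positions, dt)
    hv = velocities(head_positions, dt)
    mean_abs = lambda xs: sum(abs(x) for x in xs) / len(xs)
    return mean_abs(ev) / mean_abs(hv)

# Eyes counter-rotating one-for-one with the head -> gain of 1.0.
head = [0.0, 1.0, 2.0, 3.0]
eyes = [0.0, -1.0, -2.0, -3.0]
print(vor_gain(eyes, head))  # 1.0
```

A gain well below 1.0 (the eyes lagging the head) would be one statistical signature of oculomotor impairment.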
System and method for measuring ocular motility
The present invention provides a system and a method for measuring ocular motility of a patient. The system comprises a display unit capable of presenting at least one target; a blocking unit configured and operable to selectively block/unblock at least one target in a field of view of at least one eye of the patient; a camera unit comprising at least one imaging element configured and operable to generate at least two image data indicative of at least one eye condition; and a processing unit connected to the blocking unit, to the display unit and to the camera unit, the processing unit being configured for performing the following steps: (a) displaying at least one target for at least one eye; (b) receiving image data indicative of at least one eye's condition from the camera unit; (c) controlling the blocking unit to block/unblock at least one target in the field of view of at least one eye of the patient; (d) detecting a change in at least one eye's condition; (e) displacing the target for at least one eye; and repeating steps (a)-(e) until no change in the eye's condition is measured, to thereby determine at least one ocular motility parameter.
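Steps (a)-(e) describe an iterative cover/uncover loop that displaces the target until the eye stops responding. A minimal control-flow sketch in Python, in which every device interface (`display`, `blocker`, `camera`, `detect_change`) is a hypothetical stand-in rather than the patent's actual hardware API:

```python
# Hypothetical sketch of the (a)-(e) loop: display a target, image the eye,
# block/unblock, detect change, displace, and repeat until no change remains.
def measure_motility(display, blocker, camera, detect_change, step=1.0, max_iters=50):
    offset = 0.0
    for _ in range(max_iters):
        display.show_target(offset)            # (a) display target at current offset
        before = camera.capture()              # (b) image data of eye condition
        blocker.toggle()                       # (c) block/unblock one eye's view
        after = camera.capture()
        if not detect_change(before, after):   # (d) no change -> limit reached
            return offset                      # ocular motility parameter
        offset += step                         # (e) displace target and repeat
    return offset
```

The returned offset, the last displacement at which the eye still reacted, plays the role of the "ocular motility parameter" in this sketch.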
MODIFICATION PROFILE GENERATION FOR VISION DEFECTS RELATED TO DOUBLE VISION OR DYNAMIC ABERRATIONS
In certain embodiments, vision defect information may be generated via a dynamic eye-characteristic-based fixation point. In some embodiments, a first stimulus may be displayed at a first location on a user interface based on a fixation point for a visual test presentation. The fixation point for the visual test presentation may be adjusted during the visual test presentation based on eye characteristic information related to a user. As an example, the eye characteristic information may indicate a characteristic of an eye of the user that occurred during the visual test presentation. A second stimulus may be displayed during the visual test presentation at a second interface location on the user interface based on the adjusted fixation point for the visual test presentation. Vision defect information associated with the user may be generated based on feedback information indicating feedback related to the first stimulus and feedback related to the second stimulus.
Method and system for generating a retail experience using an augmented reality system
An augmented reality (AR) system and method for a retail experience include a waveguide apparatus that includes a planar waveguide and at least one optical diffraction element. The AR retail system and method recognizes user location in a retail establishment, retrieves data corresponding to the retail establishment and generates virtual content relating to the retail establishment based on the retrieved data. The AR retail system and method creates a virtual user interface in a user's field of view. Virtual content is displayed on the virtual user interface while the user is engaged in retail activity and may be based on user input. The AR retail system and method may provide entertainment, facilitate the shopping experience, offer virtual coupons, render games based on locations throughout a store or based on a shopping list, provide information about food choices such as calorie counts, and identify metadata associated with items.
Mobile device application for ocular misalignment measurement
Disclosed are a mobile device and method which include acquiring, by an image acquisition unit installed in a mobile device, an image of eyes of a patient while light provided by a light source reflects from an optical surface of the eyes of the patient; and obtaining, by a processor installed in the mobile device, ocular misalignment measurements, including a magnitude and a direction of ocular misalignment in the eyes of the patient, using the acquired image or set of images.
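Corneal-reflection methods of this kind typically follow the Hirschberg principle: the displacement of the light reflex from the pupil center is converted to a deviation angle, at a commonly cited ratio of roughly 22 prism diopters per millimeter of decentration. The patent does not state its conversion, so this Python sketch (hypothetical names throughout) only illustrates that textbook rule:

```python
# Hypothetical sketch: Hirschberg-style misalignment estimate from a photo.
HIRSCHBERG_RATIO_PD_PER_MM = 22.0  # commonly cited value; varies between studies

def misalignment(nasal_offset_mm):
    """Magnitude (prism diopters) and direction from reflex decentration.

    nasal_offset_mm is the horizontal distance from pupil center to the
    corneal light reflex in the deviating eye (+ = nasal). A nasally
    displaced reflex indicates an outward-turned eye (exotropia); a
    temporally displaced reflex indicates esotropia.
    """
    magnitude_pd = abs(nasal_offset_mm) * HIRSCHBERG_RATIO_PD_PER_MM
    if nasal_offset_mm > 0:
        direction = "exotropia"
    elif nasal_offset_mm < 0:
        direction = "esotropia"
    else:
        direction = "orthotropic"
    return magnitude_pd, direction

print(misalignment(1.0))  # (22.0, 'exotropia')
```

In practice the offset would be extracted from the acquired image by locating the pupil center and the specular reflection of the light source.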
Systems and methods for measuring and classifying ocular misalignment
A device for measuring and classifying ocular misalignment of a patient's eyes includes an enclosure, two lenses at the front of the enclosure, one corresponding to each eye of a patient, a divider within the enclosure, positioned laterally between the lenses, a screen within the enclosure, an integrated microprocessor connected to the screen, and at least one input control connected to the integrated microprocessor, the at least one input control operable by the patient; where the integrated microprocessor generates and transmits two images to the screen, one image corresponding to each lens; where the integrated microprocessor receives input from the patient via the at least one input control to manipulate at least one image on the screen; and where the integrated microprocessor calculates and outputs a quantification of ocular misalignment based on that input.
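In a dichoptic device like this, the quantification step typically converts the on-screen displacement the patient applied (to make the two images align) into prism diopters, where 1 PD equals 1 cm of deviation at 1 m viewing distance. The patent does not give its formula; the Python sketch below, with hypothetical parameter names, shows only that standard unit conversion:

```python
# Hypothetical sketch: convert the patient's on-screen image adjustment
# into prism diopters (1 PD = 1 cm of deviation at 1 m viewing distance).
def offset_to_prism_diopters(offset_px, px_per_cm, viewing_distance_m):
    """Misalignment in PD from a pixel offset, screen density, and optical distance."""
    offset_cm = offset_px / px_per_cm
    return offset_cm / viewing_distance_m

# A 40 px adjustment on a 40 px/cm screen at 0.5 m -> 2 PD of misalignment.
print(offset_to_prism_diopters(40, 40.0, 0.5))  # 2.0
```

The sign of the offset (and whether it is horizontal or vertical) would then drive the classification into eso-, exo-, hyper-, or hypotropia.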