PORTABLE HEAD-MOUNTED OCULAR-VESTIBULAR TESTING SYSTEM

20170258325 · 2017-09-14

Abstract

A portable, miniaturized, lightweight ocular-vestibular testing unit with sensors in a head-mounted unit and with processor and battery units either incorporated into the head mount or worn on other parts of the body, such as on a waist belt, connecting to one or more displays via a wireless local area network. A digital camera for each eye and multiple sensors are configured for ocular and vestibular system testing in a wide variety of positions and locations of a user. The unit records data from the digital cameras and sensors locally to the unit. Recorded data may be viewed on external displays via a wireless local area network data connection.

Claims

1. A portable ocular-vestibular testing apparatus, comprising: a wearable mounting structure configured for fitting onto a user's head; a first camera directed toward a first eye of the user; a second camera directed toward a second eye of the user; a motion sensor configured for tracking movement of the user's head; a processor module attached to the wearable mounting structure; a battery unit attached to the wearable mounting structure and coupled to the processor module, the battery unit configured for counter-balancing the processor module; wherein the processor module comprises: camera controller circuitry coupled to the first camera and to the second camera, and processing circuitry coupled to the first camera, the second camera, and the motion sensor, the processing circuitry including memory for storing image data received from the first camera and from the second camera and for storing motion signals received from the motion sensor.

2. The apparatus of claim 1, wherein the processing circuitry includes clock circuitry for time stamping the image data and the motion signals.

3. The apparatus of claim 1, wherein the first camera and second camera are mounted to the processor module.

4. The apparatus of claim 1, wherein the motion sensor is mounted to the processor module.

5. The apparatus of claim 1, wherein the processing circuitry is configured to detect the user's pupil positions in real time based on the image data.

6. The apparatus of claim 5, wherein the processing circuitry is configured to generate a video rendering of the user's eye movements based on the image data.

7. The apparatus of claim 6, wherein the processing circuitry is configured to record the video rendering.

8. The apparatus of claim 6, wherein the processing circuitry is configured to record eye positions of the user and simultaneous head tilt of the user in 3 dimensions based on the image data and the motion signals.

9. The apparatus of claim 6, wherein the processing circuitry is configured to include an indication of the user's pupil positions in the video rendering of the user's eye movements.

10. The apparatus of claim 5, wherein the processor module comprises a stimulus projecting device configured to present a predetermined visual stimulus to the user.

11. The apparatus of claim 10 wherein the stimulus projecting device includes laser diodes directed to a vertical surface in front of the user, wherein the processing circuitry is configured to control the laser diodes.

12. The apparatus of claim 10, wherein the processor module includes an audio output port and wherein the processing circuitry is configured to communicate an audible stimulus signal to the audio output port, wherein the audible stimulus is synchronized with the predetermined visual stimulus.

13. The apparatus of claim 10, wherein the predetermined visual stimulus is predetermined to elicit specific responses that indicate operation of the user's ocular and vestibular systems and neurological pathways.

14. The apparatus of claim 13 wherein the predetermined visual stimulus comprises a plurality of stationary dots and moving patterns.

15. The apparatus of claim 10, wherein the processing circuitry comprises wireless transceiver circuitry configured for wireless communication of the video rendering to a separate external display device.

16. The apparatus of claim 15, wherein the wireless transceiver circuitry is configured for receiving control signals from the separate external display device.

17. The apparatus of claim 16, wherein the control signals include a first signal configured to cause the processing circuitry to start recording of the video rendering and a second signal configured to cause the processing circuitry to stop recording of the video rendering.

18. The apparatus of claim 17, wherein the control signals include a third signal configured to cause the processing circuitry to start presenting the predetermined visual stimulus and a fourth signal configured to cause the processing circuitry to stop presenting the predetermined visual stimulus.

19. The apparatus of claim 17, wherein the separate external display device is selected from the group consisting of a mobile computing device, a smart phone, a personal computer and a tablet computer.

20. The apparatus of claim 5, wherein the processing circuitry comprises wireless transceiver circuitry configured for wireless communication of a predetermined visual stimulus image to a separate external display device, wherein the predetermined visual stimulus image is predetermined to elicit specific responses that indicate operation of the user's ocular and vestibular systems and neurological pathways.

21. The apparatus of claim 20, wherein the wireless transceiver circuitry comprises a wireless data module configured to communicate in compliance with the IEEE 802.11 standard.

22. The apparatus of claim 1, wherein the cameras are IR cameras.

23. The apparatus of claim 22, further comprising an eye enclosure covering each eye of the user, the eye enclosure comprising an IR light source configured for illuminating a viewing area of a corresponding one of the IR cameras.

24. The apparatus of claim 23, further comprising a second eye enclosure covering each eye of the user, the second eye enclosure providing a light seal against the user's face and positioning the IR cameras to view the user's eyes via infrared reflective mirrors without obstructing the field of view of the user's eyes.

25. The apparatus of claim 23, further comprising an adjustable holder positioned over each eye of the user, the adjustable holder including an interchangeable cap configured for covering a corresponding one of the user's eyes.

26. The apparatus of claim 25 wherein the interchangeable cap is opaque for testing the user's eye position under conditions of darkness.

27. The apparatus of claim 25, wherein the interchangeable cap comprises a colored lens for testing the user's eye position under exposure to a selected frequency of visible light.

28. The apparatus of claim 1 wherein the motion sensor comprises inertial gyroscopic and accelerometer sensors generating the motion signals, the motion signals including X, Y and Z axis positions.

29. The apparatus of claim 28, wherein the motion sensor comprises a six degree of freedom sensor on each side of the processor module to provide differential readings.

30. A portable ocular-vestibular testing apparatus, comprising: a sensor module configured for fitting onto a user's head; a first camera directed toward a first eye of the user; a second camera directed toward a second eye of the user; a motion sensor configured for tracking movement of the user's head; a processor module configured for mounting to a first wearable article configured for wearing on a user's body; a battery unit configured for mounting to a second wearable article configured for wearing on a user's body, the battery unit coupled to the processor module; wherein the processor module comprises: camera controller circuitry coupled to the first camera and to the second camera, and processing circuitry coupled to the first camera, the second camera, and the motion sensor, the processing circuitry including memory for storing image data received from the first camera and from the second camera and for storing motion signals received from the motion sensor.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0033] The accompanying drawings, which are incorporated into and form a part of the specification, illustrate an embodiment of the present invention and, together with the description, serve to explain the principles of the invention. The drawings are only for the purpose of illustrating various embodiments of the invention and are not to be construed as limiting the invention.

[0034] FIG. 1 is a frontal view of the disclosed ocular-vestibular testing apparatus fitted on a user's head according to aspects of the present disclosure.

[0035] FIG. 2 is a right-side view of the disclosed ocular-vestibular testing apparatus fitted on a user's head according to an aspect of the present disclosure.

[0036] FIG. 3 is a right-side view of an alternative embodiment of the disclosed ocular-vestibular testing apparatus fitted on a user's head.

[0037] FIG. 4 is a schematic illustration of an embodiment of the disclosed ocular-vestibular testing apparatus in wireless communication with wireless display devices via a wireless network according to aspects of the present disclosure.

[0038] FIG. 5 is a schematic block diagram illustration of a processing unit configured for controlling an ocular-vestibular testing apparatus according to aspects of the present disclosure.

[0039] FIG. 6 is an illustration of a split embodiment configured with sensors on a head mounted unit and processing unit and battery mounted remotely on a waist belt.

DETAILED DESCRIPTION

[0040] Aspects of the present disclosure include a portable miniaturized, head-mounted, lightweight ocular-vestibular testing assembly that provides recording of selected inputs, the selected inputs comprising eye position and head tilt in 3 dimensions, together with selected automated stimulus. The automated stimulus may include stationary dots and/or moving patterns, to elicit specific responses that indicate the operation of the ocular and vestibular systems and the neurological pathways that drive those systems. The visual stimulus is configurable by the clinician.

[0041] Referring to FIG. 1, the assembly includes an adjustable head-mount base 102 designed to fit securely to a user's head. The diagram shows two alternative configurations: the top (composite version) with all components head-mounted; the lower (split version) showing a base containing only the sensor components, with the processing and battery components housed in a remote waist belt (not shown). The head-mount base 102 may include adjusting members so it can be fitted to different head sizes and shapes. The base 102 includes a front portion, two side portions and a back portion, in which the orientation of the front portion of the base coincides with an orientation of the user's face.

[0042] The composite version assembly also includes a processing unit 104 coupled to the base unit 102. In an illustrative embodiment, the processing unit includes left and right digital infrared (IR) eye cameras located on the front portion of the base 102. Each camera may include an associated controller board. The processing unit 104 may be connected to the base unit by a flexible coupling member 106. The split version assembly includes a simpler head mount 110, resembling a large pair of glasses, with a sensor unit 112 that connects to a remote processing and battery assembly (not shown).

[0043] FIG. 2 again shows the two configurations: the composite and split versions. In the composite version, a power supply unit 202 is connected to the back portion of the base. In a particular implementation, the power supply unit 202 may be a 5-volt universal serial bus rechargeable battery pack, for example. In the illustrative embodiment, the power supply unit 202 counter-balances the camera and processing units 104 located at the side portions of the system. The power supply unit 202 is electrically coupled to and provides power to the controller boards in the processing unit 104. In the split version, the simpler head mount 204 contains only the sensors, with the processing unit and battery mounted remotely on a waist belt (not shown). In the illustrative embodiment, at least two infrared digital cameras are located one over each of the user's eyes. The cameras record the position of the user's eyes in real time, allowing production of a real-time video recording together with digital analysis of the position of the pupil of each eye.
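The digital pupil-position analysis described above could take many forms; the sketch below shows one simplified approach that is not specified by the disclosure: under IR illumination the pupil appears as a dark disc, so a threshold-and-centroid pass over the frame yields an approximate pupil center. The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def pupil_center(ir_frame, threshold=40):
    """Estimate the pupil center as the centroid of the darkest pixels.

    Simplified sketch: threshold the IR frame and average the dark-pixel
    coordinates. Production eye trackers typically use more robust
    ellipse fitting rather than a bare centroid.
    """
    dark = ir_frame < threshold          # boolean mask of dark pixels
    if not dark.any():
        return None                      # no pupil candidate in frame
    ys, xs = np.nonzero(dark)
    return float(xs.mean()), float(ys.mean())
```

Run per frame on each eye camera, this produces the (x, y) pupil positions that the system records alongside the head-motion signals.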

[0044] Referring to FIG. 3, in an illustrative embodiment the digital cameras 302 are positioned on the processing unit 104 to view the area including the user's eyes via infrared reflective mirrors without obstructing the field of view of the user's eyes. Eye shields 306 extending from the processing unit 104 may be provided for positioning and support of the processing unit 104.

[0045] According to another aspect of the present disclosure, the apparatus also includes at least two eye enclosures 108 (FIG. 1). Each of the eye enclosures 108 includes multiple IR light emitting diodes, which provide camera illumination.

[0046] According to another aspect of the present disclosure, adjustable holders are positioned over each of the user's eyes. The adjustable holders include interchangeable caps. The interchangeable caps may include opaque caps, which enable testing of the user's eye position in complete darkness, and tinted caps, which enable testing of the user's eye position under conditions of a selected frequency of visible light, for example.

[0047] A real-time clock is configured to synchronize the data recording for the various inputs. This enables a view of an automated stimulus to be displayed in conjunction with a display of the resulting eye or head positions.
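The synchronization described above amounts to stamping every sensor reading against one shared timebase. A minimal sketch, assuming a software monotonic clock (the disclosure does not specify the clock hardware) and illustrative record field names:

```python
import time

class SampleClock:
    """Shared timebase for all inputs, so eye-image frames, head-tilt
    samples and stimulus events can later be displayed in sync.
    Sketch only; the real-time clock implementation is not specified
    in the disclosure."""

    def __init__(self):
        self.t0 = time.monotonic()   # session start

    def stamp(self, payload, source):
        # Every reading carries the same relative timebase, letting a
        # stimulus be replayed alongside the eye/head response it evoked.
        return {"t": time.monotonic() - self.t0,
                "source": source,
                "data": payload}
```

For example, a left-eye frame and an inertial sample stamped by the same `SampleClock` can be interleaved on one timeline regardless of which sensor produced them.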

[0048] In an illustrative embodiment, the apparatus also includes a wireless local area network data connection using the IEEE 802.11 standard for communicating with a wireless display device. Referring to FIG. 4, in an illustrative embodiment two types of displays may be coupled to the disclosed ocular-vestibular testing system 402 via a wireless local area network 404. A control display 406 may be coupled in wireless communication with the ocular-vestibular testing system 402 via the wireless local area network 404 and used by a clinician to initiate stimulus actions and to view the resultant video and charts, for example. A stimulus display 408 may also be coupled in wireless communication with the ocular-vestibular testing system 402 via the wireless local area network 404 to display selected stimuli for viewing by the user.

[0049] According to aspects of the present disclosure, the control display 406 and the stimulus display 408 each include a web browser supporting the HTTP standard. This avoids the need to download additional software to the control display 406 and stimulus display 408. The control display 406 and stimulus display 408 can each be a computer running a Windows, Apple or Linux operating system, a smart phone, a tablet computer, or a smart TV display, for example.

[0050] In addition to or as an alternative to the stimulus display 408, the system may include a set of three laser dot projectors, which can show bright dots against a suitable background in front of the patient. Stimulus actions include a variety of stimulus types, such as stationary and moving dots and stationary and moving patterns. Stimulus actions can be pre-programmed sequences or random sequences, both of which may be recorded during a testing session.

[0051] Resulting eye and head positions, together with the stimuli that evoked them, may be charted in a variety of formats and displayed in real-time on the control display. Previously recorded data may also be displayed in real-time, slow-speed or accelerated-speed to aid the clinician in diagnosis.
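The real-time, slow-speed and accelerated-speed replay described above reduces to scaling the gaps between recorded timestamps. A small sketch under that assumption (function name is illustrative):

```python
def replay_schedule(timestamps, speed=1.0):
    """Convert recorded sample timestamps into playback delays.

    speed > 1 accelerates the review, speed < 1 slows it, matching the
    clinician's real-time / slow-speed / accelerated-speed options.
    """
    delays = []
    prev = timestamps[0]
    for t in timestamps:
        delays.append((t - prev) / speed)   # wait this long before showing sample
        prev = t
    return delays
```

A playback loop would sleep for each returned delay before drawing the corresponding eye/head sample and the stimulus that evoked it.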

[0052] The gathered and recorded data is downloadable to an external device for long-term storage or further analysis. Data may be downloaded in a variety of standard formats, such as CSV, XLS or XML. The disclosed system may also include a security module to ensure that all data is protected from unapproved access, including access via the WLAN.
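Of the download formats listed, CSV is the simplest to sketch. The column names below are assumptions for illustration; the disclosure does not define the export schema.

```python
import csv
import io

def export_csv(samples):
    """Serialize timestamped sensor samples to CSV text.

    `samples` is a list of dicts with assumed keys t/source/x/y/z;
    the real system's column layout is not specified in the disclosure.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["t", "source", "x", "y", "z"])
    writer.writeheader()
    for s in samples:
        writer.writerow(s)
    return buf.getvalue()
```

The resulting text can be written to a file or streamed to the clinician's device over the WLAN for long-term storage.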

[0053] In an illustrative embodiment, the apparatus includes three laser diodes configured to project bright dots on a vertical surface in front of the user. The laser diodes may be controlled from the processing unit to show 0, 1, 2 or 3 dots at a time as a stimulus, for example.

[0054] Embodiments of the disclosed ocular-vestibular testing assembly may also include a stereo audio output to receive headphones for audible stimulus in the form of tones to be listened to by the system's user during testing.

[0055] In an illustrative embodiment, the ocular-vestibular testing assembly may also include an enclosure for each eye. The enclosures are configured to provide a light seal against the user's face. The enclosures may also position the digital cameras to view the area including the user's eyes via infrared reflective mirrors without obstructing the field of view of the user's eyes.

[0056] The disclosed ocular-vestibular testing assembly may also include multiple holders to which opaque caps may be attached so that they occlude external light, or colored lenses for testing sensitivity to different frequencies of visible light. Each eye enclosure is capable of holding a different cap or lens than the opposite eye enclosure.

[0057] According to aspects of the present disclosure, the disclosed ocular-vestibular testing assembly may also include inertial gyroscopic and accelerometer sensors connected to the processing controller boards and providing data showing the X, Y and Z axis positions, with a 6 DOF (degree of freedom) sensor on each side of the processing unit to provide differential readings.
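One plausible reading of the "differential readings" from the two side-mounted 6-DOF sensors is sketched below; this combining rule is an assumption, not taken from the disclosure. Averaging the paired samples cancels symmetric noise, while the per-axis difference can flag miscalibration or flexing of the head mount.

```python
def differential_reading(left, right):
    """Combine two 6-DOF samples, one from each side of the head mount.

    Each sample is a 6-tuple (x, y, z, roll, pitch, yaw). Returns the
    per-axis mean (the combined head-motion estimate) and the per-axis
    difference (assumed here to be used as a consistency check).
    """
    mean = tuple((l + r) / 2 for l, r in zip(left, right))
    diff = tuple(l - r for l, r in zip(left, right))
    return mean, diff
```

In use, a large sustained `diff` on any axis would suggest the two sensors no longer agree and the mount needs adjustment or recalibration.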

[0058] FIG. 5 is a schematic block diagram illustration of a processing unit configured for controlling an ocular-vestibular testing apparatus according to aspects of the present disclosure. The block diagram is divided into two major sections: Functional Block 1, containing the sensor components; and Functional Block 2, containing the processing and storage components. This division supports the alternative configurations, with either all components head-mounted, or the sensors head-mounted and the other components mounted remotely on a waist belt. The processing unit 500 may include a video capture module 502, a head tilt capture module 504 and other sensors 506 in communication with a sensor aggregation module 508. The sensor aggregation module 508 aggregates input from the video capture module 502, the head tilt capture module 504 and the other sensors 506 for storage in a data storage module 510. A data analysis module 512 analyzes data in the data storage module 510 and, co-operatively with a charting module 514, generates output for displaying to a user, displaying to an operator and/or for controlling the ocular-vestibular testing apparatus. A web server module 516 may be configured as a web server for communicating wirelessly with external devices such as displays 406, 408, for example. An audio output module 518 may be configured for generating audio stimulus for output to a user via an audio port, for example. A laser output module 520 may be configured for generating visual stimulus for presentation to a user. A security module 522 may be configured to ensure that user data is protected from unapproved access, including access via the WLAN, for example. A controller module 524 may be configured for controlling the ocular-vestibular testing apparatus to initiate and terminate testing, select visual and/or audible stimulus to be presented to a user, and control recording of testing video data, for example.
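The sensor aggregation module 508 described above merges several per-sensor streams into one time-ordered record stream for storage. A minimal sketch of that merge, assuming each sample is a `(t, source, data)` tuple and each stream is already ordered by its own timestamps (the actual record format is not specified in the disclosure):

```python
import heapq

def aggregate(*streams):
    """Merge per-sensor sample streams into one timeline for storage.

    Each stream is an iterable of (t, source, data) tuples, sorted by t.
    heapq.merge interleaves them without loading everything into memory,
    which suits a continuously recording head-mounted unit.
    """
    return list(heapq.merge(*streams, key=lambda s: s[0]))
```

The merged list is what the data storage module 510 would persist, ready for the analysis and charting modules to consume.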

[0059] FIG. 6 is an illustration of a configuration with the processing and battery components 601 mounted remotely on a waist belt and connected via a cable 602 to a simplified head-mount sensor component 603.