Method, System and Apparatus for Investigating or Assessing Eye or Pupil Movement

20230172507 · 2023-06-08

    Abstract

    Method, system and apparatus (10) for capturing eye/pupil movement(s). Wearable apparatus (10) includes image capture means (12, 18) and a light source (16) to illuminate the eye(s). Video recordings can be stored in memory (20) and/or transmitted to a remote location. The apparatus (10) can include on-board image processing. Pupil shape and/or position can be determined e.g. from glints (28). Direction and speed of eye motion can be determined. Head movement (50) and eye motion (52) measurements can be obtained. Measurements by head movement sensors (32), body sensors (34) and/or environment sensors (38) (e.g. gravity) can be factored into an assessment of the user's condition.

    Claims

    1. A method of assessing changes in eye state using an apparatus covering a user's eyes, wherein the apparatus detects or monitors changes in the eye(s) or movement of features of the eye(s) during a user vestibular or neurological episode or during a user vestibular or neurological event, the method including capturing image data of the user's eyes for a period of time during said episode or said event, and communicating the captured image data for assessing or informing diagnosis, absence or presence of a disorder or medical condition.

    2. The method of claim 1, wherein a starting point and a finishing point of the pupil(s) is used to determine direction and magnitude of eye or pupil movement and/or other eye kinematics.

    3. (canceled)

    4. The method of claim 1, including capturing rotational motion of the eye(s) and/or pupil(s).

    5. The method of claim 4, wherein time taken for a full or partial rotation is used to determine speed and/or rate of rotation and/or other eye kinematics.

    6. The method of claim 1, wherein variation/change in angle and/or rate of change of such angle of the pupil(s) relative to a reference axis or point is captured as image data and/or measured.

    7-9. (canceled)

    10. The method of claim 1, including using pixel count of a sensor of a camera or each said camera of the apparatus to determine angle of gaze of the user, the method including determining angle of gaze from captured image data and/or image data recordings of eye/pupil position/movement.

    11-12. (canceled)

    13. The method of claim 1, including determining frame position by detecting that one or both eyes of the user of the apparatus is/are within a frame of reference covered by the at least one image capture/recording apparatus.

    14. The method of claim 1, including identifying pupil position and/or pupil diameter in a reference region.

    15. (canceled)

    16. The method of claim 14, wherein a previsualization/detection field within which is captured the eye as the object of interest is used to ensure capturing the eye in image data and unnecessary frame/field beyond the eye is removed/reduced.

    17. (canceled)

    18. The method of claim 1, wherein a portion/window of an overall sensor area of the image capture apparatus is used for windowing such that captured image data is derived from the portion of the sensor area.

    19. The method of claim 18, wherein the portion/window is moved to centralize with respect to the eye(s)/pupil(s) to frame the eye(s)/pupil(s) in at least one frame.

    20. The method of claim 1, wherein position of the eye(s)/pupil(s) relative to one another and/or to a reference is used for positioning the apparatus on/relative to the user.

    21. The method of claim 20, wherein height of a feature of one said eye relative to a feature of the other said eye and/or to at least one reference is used for positioning the apparatus on/relative to the user.

    22. The method of claim 21, wherein positions of corners of the eyes are determined and used to compensate for position of the apparatus on the user's face.

    23. The method of claim 1, wherein one or more of pupil center, pupil shape or pupil position relative to a horizontal plane/reference is used for calibration/position determination.

    24. The method of claim 23, wherein, if the pupils are not aligned on a horizontal plane, degree of pupil misalignment is calculated and used to either realign eye images to a true horizontal plane or as an adjustment parameter in later image post processing and analysis.

    25. The method of claim 1, including measuring/imaging variation in pupil shape from generally circular to ovoid/ellipsoid and compensating for non-circular irregularity via the image capture means and/or processing means.

    26. The method of claim 25, including determining from the pupil shape a confidence factor in the accuracy of the eye kinematics.

    27. The method of claim 26, wherein the confidence factor is used during analysis to modify or reject eye kinematic measurements.

    28. The method of claim 26, wherein the confidence factor includes one or more of pixel count representing the shape of the pupil image, percentage of eyelid closure obstructing the view of the pupil, size of the pupil, angle of the pupil from the capture apparatus center.

    29. The method of claim 1, including detecting full eye closure or partial eye closure occurring within a threshold time or occurring within a confidence factor of eye kinematics accuracy.

    30-35. (canceled)

    36. The method of claim 1, including detecting and/or imaging of presence of and/or location of at least one glint of one or both eyes relative to the center and/or edge of the respective pupil or iris.

    37. (canceled)

    38. The method of claim 36, wherein, for a given light source position within the apparatus, position of the glint is used to determine eye and/or pupil center position and/or wherein distance and/or position of the glint(s) from the respective pupil center is used to determine position and/or orientation of the apparatus or of a camera of the apparatus relative to the face of the user.

    39. (canceled)

    40. The method of claim 36, wherein distance between two said glints and/or between at least one said glint and the center of the pupil or other reference is used to compensate for non-ideal apparatus placement/orientation relative to the user's head/face with respect to eye position or to compensate for non-ideal positioning of at least one camera of the apparatus relative to eye position.

    41-46. (canceled)

    47. The method of claim 1, wherein one or more images or image data is used to identify the wearer or features of the wearer for personal identification, authentication, security, access control, authority to control/use equipment, and/or authority to allow others such access/use.

    48-77. (canceled)

    78. A method of capturing and reporting eye motion data of a user for use in a medical diagnosis, or for producing a treatment regime, or for monitoring symptoms, for a user, the method including: a portable eye motion capture apparatus self-administered by the user to cover their eyes prior to onset of, at the onset of, or during, or after, an episode or event of dizziness or imbalance; the eye motion capture apparatus configured for capturing eye motion data of the user during their episode or event of dizziness or imbalance; processing or part-processing the captured eye motion data onboard the portable eye motion capture apparatus or remote from the portable eye motion capture apparatus or a combination of onboard and remotely; transmitting the respective processed, part processed or unprocessed eye motion data to a remote location; creating analytics based on the processed, part processed or unprocessed eye motion data for use by a clinician in providing the medical diagnosis, or producing the treatment regime, or reporting on the symptoms, for said user.

    79. The method of claim 78, including automatically detecting individual eye movements or patterns of eye movements within the processed and/or unprocessed said captured eye motion data.

    80. The method of claim 79 including detecting nystagmus eye movements or patterns of nystagmus eye movements.

    81. The method of claim 80, including determining speed and direction of the nystagmus eye movements and/or determining speed and direction of patterns of the nystagmus eye movements.

    82. The method of claim 81, including displaying pattern, speed and/or direction of the nystagmus eye movements or patterns of eye movements as at least one graphic representation and/or on an electronic display.

    83. The method of claim 81, including determining at least one driver or origin of the nystagmus from the pattern, speed or direction of the nystagmus.

    84. The method of claim 80 including analyzing at least one component of the nystagmus or analyzing within phase eye kinematics of the nystagmus or analyzing at least one component of at least one slow phase of the nystagmus or analyzing transition(s) of one or more phases of the nystagmus or analyzing between nystagmus, or analyzing a combination of any two or more thereof, to determine drivers of the nystagmus, origin of the nystagmus and/or physiological pathways contributing to the eye movements.

    85. The method of claim 78, wherein the eye motion data includes or is augmented by eye movement video sequences, 3D orientation and position of the user's head during their episode(s) or their event(s), eye kinematics and tracking data, other eye feature information, patient history, questionnaire data, information pertinent to diagnosis, or a combination of any two or more thereof.

    86. The method of claim 78, including identifying severity, timing and location of the episode or event.

    87. The method of claim 78, including instructing the user when and how to self-administer and use the eye motion capture apparatus.

    88. The method of claim 78, wherein episode capture and/or recording is monitored in real or near real time by at least one clinician following automatic detection and transmission of recorded or direct image data as the episode or event occurs.

    89. The method of claim 78, wherein the eye motion capture apparatus performs on-board processing that occurs during or after video capture of the eye motion data, wherein the on-board processing detects if the patient closes their eyes during the recording and/or after the data capture, compresses the video data and/or logs particular eye and head movements.

    90. A method of remote oculography includes: capturing nystagmus eye motion data of a user at their home, residence, place of work, or place of recreation, for use in a medical diagnosis, or for producing a treatment regime, or for monitoring symptoms, the method including: a user self-administering a portable eye motion capture apparatus over their eyes prior to onset of, at the onset of, or during, or after, their vestibular or neurological episode or event; the eye motion capture apparatus configured for capturing nystagmus eye motion data of the user during their episode or event; processing or part-processing the captured nystagmus eye motion data onboard the portable eye motion capture apparatus, or remote from the portable eye motion capture apparatus, or a combination of onboard and remote; transmitting the respective processed, part processed or unprocessed nystagmus eye motion data to a remote location; creating analytics based on the processed, part processed or unprocessed nystagmus eye motion data for use by a clinician in providing the medical diagnosis, or producing the treatment regime, or reporting on the symptoms, for said user.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0203] One or more embodiments of the present invention will hereinafter be described with reference to the accompanying Figures, in which:

    [0204] FIGS. 1A to 1C show examples of horizontal, vertical and torsional eye movement with respect to nystagmus.

    [0205] FIG. 2 shows a chart of nystagmus related to horizontal eye movement (speed and direction) plotted over time on a graph with horizontal eye position on the y axis and time on the x axis.

    [0206] FIGS. 3A to 3D show patterns (e.g. speed and direction of nystagmus over time) of captured nystagmus utilising an apparatus/method according to at least one embodiment of the present invention.

    [0207] FIGS. 4A to 4C show an alternative arrangement for displaying captured nystagmus patterns (speed and direction over a time period) using radial stacked histogram charts according to at least one embodiment of the present invention.

    [0208] FIGS. 5A to 5D relate to steps taken for adjusting for misalignment of the apparatus relative to eye position of a patient according to at least one embodiment of the present invention.

    [0209] FIG. 6 shows a side sectional view of an apparatus according to an embodiment of the present invention.

    [0210] FIGS. 7A and 7B show examples of glints reflected in the eye of a wearer as captured by image capture means of an apparatus according to an embodiment of the present invention.

    [0211] FIGS. 8A and 8B show an eyeball with its pupil in two different locations. Shown are 2 gaze vectors, the lines from the centre of the eye through the centre of the pupils.

    [0212] FIGS. 9A, 9B, 9C and 9D show an eyeball with its pupil (5 mm diameter) in two locations: centred with respect to the capture apparatus image, and 45 degrees upwards from the capture apparatus image.

    [0213] FIG. 10 shows an alternative embodiment of the present invention.

    [0214] FIG. 11 shows an example of an embodiment of the present invention incorporating environmental sensing/measurements.

    [0215] FIG. 12 shows an example according to an embodiment of the present invention incorporating body sensing/measurement.

    [0216] It will be appreciated that one or more embodiments of the present invention can incorporate environmental sensing/measurements and body sensing/measurements.

    [0217] FIG. 13 shows an example of capturing head movement and eye motion data according to an embodiment of the present invention.

    [0218] FIG. 14 shows an example of a 'patient experience storyboard' flowchart and employment of one or more embodiments of the present invention.

    DESCRIPTION OF PREFERRED EMBODIMENT(S)

    [0219] FIGS. 1A to 1C show, by way of example, typical presentations of nystagmus.

    [0220] FIG. 1A shows nystagmus presenting as horizontal nystagmus e.g. left nystagmus where the eye moves slowly to the patient's right and rapidly returns horizontally to the left.

    [0221] FIG. 1B shows nystagmus presenting as vertical nystagmus e.g. down beating nystagmus where the vertical slow eye movement upward is followed by rapid vertical downward rotation.

    [0222] FIG. 1C shows torsional nystagmus e.g. clockwise nystagmus with the eye slowly rotating anticlockwise followed by a rapid counter torsional clockwise rotation.

    [0223] FIG. 2 shows a chart of nystagmus related eye movement (e.g. speed and direction of movement) plotted over time on a graph with horizontal eye position on the y-axis (gaze right at the top and gaze left at the bottom) and time on the x-axis. This figure shows left beating nystagmus.

    [0224] Nystagmus over time can be plotted a number of ways. Horizontal and vertical tracings are currently adopted. There are also new ways to display the data. These are driven by knowing the pupil position exactly (due to realignment of the horizontal plane from lining up the pupils or using glints (28, FIGS. 7A, 7B)) and therefore accurately knowing the direction and speed of the nystagmus.

    [0225] The features of nystagmus such as shape, amplitude and frequency can be analysed. Nystagmus are usually described by speed of eye movement (e.g. of the slow phase) and direction of eye movement (e.g. of the fast phase). For example, 16°/sec right beating nystagmus. Instead of direction they can also be defined in relation to gravity (geotropic: towards earth, ageotropic: away from earth).
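    By way of a non-limiting illustration, the above description of nystagmus by slow-phase speed and fast-phase direction can be sketched in Python (the function name and the threshold used to separate fast-phase samples are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def nystagmus_summary(t, x):
    """Estimate slow-phase velocity and beat direction from a horizontal
    eye-position trace.

    t: sample times (seconds); x: horizontal eye position (degrees,
    positive = rightward gaze). Returns the mean slow-phase velocity
    (deg/s) and the beat direction (direction of the fast phase).
    """
    v = np.gradient(x, t)  # instantaneous velocity, deg/s
    # Illustrative assumption: fast-phase samples exceed twice the
    # median absolute speed of the trace.
    fast = np.abs(v) > 2.0 * np.median(np.abs(v))
    slow_v = float(np.mean(v[~fast]))  # mean slow-phase velocity
    # The fast phase beats opposite to the slow drift.
    beat = "left-beating" if slow_v > 0 else "right-beating"
    return slow_v, beat
```

    For instance, a sawtooth trace drifting slowly rightward with rapid leftward resets would be summarised as left-beating nystagmus, consistent with FIG. 1A.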

    [0226] According to one or more embodiments of the invention, captured nystagmus motions can be charted/displayed showing direction and velocity of movement over time.

    [0227] For example, FIG. 3A shows by way of example charted strong right-beating nystagmus that reduces in velocity over a minute and a half of time.

    [0228] FIG. 3B shows an example of charted captured strong persistent left-beating nystagmus.

    [0229] FIG. 3C shows charted captured mild persistent up-beating nystagmus.

    [0230] FIG. 3D shows charted captured direction changing nystagmus with respect to time. Moderate right-beating nystagmus for 40 seconds followed by a direction change to left beating nystagmus.

    [0231] FIGS. 4A to 4C show alternative visualisation of nystagmus results as captured by the apparatus/method of the present invention.

    [0232] In particular, FIG. 4A shows left beating nystagmus (some with slight upward component) of varying velocity.

    [0233] FIG. 4B shows occasional upward and to the right beating nystagmus of mild to moderate velocity.

    [0234] FIG. 4C shows strong left beating nystagmus.

    [0235] Being able to visualise the pattern (speed and direction) of nystagmus during an attack significantly aids in diagnosing the underlying cause of the attack. The present invention provides a new modality of capturing, measuring and displaying time-of-attack nystagmus, and being able to view and present the pattern (speed and direction) significantly improves the assessment methodology.

    [0236] It is important to have the apparatus correctly positioned on the patient's head/face or to be able to determine position of the apparatus on the patient's face and compensate for sub-optimal positioning to ensure useable/correct results.

    [0237] FIGS. 5A to 5D show a technique according to one or more embodiments of the present invention that can be employed to correct misalignment in recordings due to incorrect apparatus placement.

    [0238] In this example, the patient has pure left beating nystagmus, as shown in FIG. 5A. Eyes move slowly to the patient's right and then a fast movement back to the patient's left. In this embodiment, the tracing should show left beating nystagmus with no movement on the vertical scale. The upper tracing shows the horizontal (left and right) movement and the lower tracing represents the vertical (up and down) movement of the eyes.

    [0239] FIG. 5B shows an example (Case 1) of what happens to the eye movements in FIG. 5A when the apparatus is horizontally and vertically accurately aligned with the eyes. In this case, the eye movement recordings show the true direction of eye movement (e.g. right and left movement are true right and left movement). This translates to accurate tracings. Note that there is no vertical component shown in the tracings as there was no vertical eye movement.

    [0240] FIG. 5C shows an example (Case 2) of what happens to the eye movements in FIG. 5A when the apparatus is incorrectly positioned. That is, the apparatus is not horizontally and vertically aligned with the patient's eyes. In this case the tracings do not reflect the true direction of the nystagmus. Note that there is a vertical component shown in the tracings although no vertical eye movement occurred.

    [0241] When the apparatus is not positioned correctly (Case 2), the misalignment (correction factor) can be measured and used to recalculate eye position so the tracings reflect the true direction of the eye movements. This is explained in FIG. 5D.

    [0242] FIG. 5D shows that using the captured images (video images or series of still images), the degree of misalignment is calculated. This can be achieved by using the pupil centres as an indication of the horizontal plane.

    [0243] Other methods within the present invention can also be used to determine the horizontal plane. These include determining the position of the corners of the eye or any other feature of the eye and surrounds. For this example the pupil centres are used, as shown in FIG. 5D.

    [0244] As shown in Steps 1 to 4 of FIG. 5D:

    [0245] In step 1, the degree of misalignment of the pupils from horizontal is determined.

    [0246] In step 2, correction is applied by rotating the image/video back by the degree of misalignment so that the pupils are horizontal and the frame is non-horizontal.

    [0247] In step 3, frame correction is applied to centre the pupils within the frame.

    [0248] In step 4, the corrected tracings of eye movement during nystagmus are captured.

    [0249] As an alternative in steps 2 to 4, the video can be kept the same and the calculated degree of misalignment can be used to recalculate the tracings.
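    By way of a non-limiting Python sketch (the function names are illustrative, not part of the disclosure), the misalignment angle measured from the pupil centres can be used to recalculate the tracings as described in steps 1 to 4 or in the alternative above:

```python
import numpy as np

def misalignment_angle(left_pupil, right_pupil):
    """Angle (radians) of the line joining the two pupil centres
    relative to the image horizontal (image coordinates in pixels,
    x rightward, y downward)."""
    dx = right_pupil[0] - left_pupil[0]
    dy = right_pupil[1] - left_pupil[1]
    return np.arctan2(dy, dx)

def correct_tracings(h, v, theta):
    """Rotate horizontal/vertical eye-position tracings back by the
    measured misalignment angle theta so they reflect the true
    directions of eye movement."""
    c, s = np.cos(theta), np.sin(theta)
    h = np.asarray(h, float)
    v = np.asarray(v, float)
    return c * h + s * v, -s * h + c * v
```

    For example, pupil centres at (0, 0) and (100, 10) pixels imply roughly 5.7 degrees of roll; applying the rotation to the recorded tracings restores a purely horizontal tracing with no spurious vertical component, as in Case 2 of FIG. 5C.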

    [0250] It will therefore be appreciated that the present invention envisages many ways to plot the nystagmus. For example, horizontal and vertical tracings can be used.

    [0251] Embodiments of the present invention also envisage measuring, plotting and/or assessing nystagmus relative to gravity. Eye motion (optionally also head movement) can be tracked/measured and compared with gravity (gravity direction, strength of gravity, change in strength and/or direction).

    [0252] The nystagmus of lateral canal BPPV can be categorized as either always towards the ground (geotropic) or always towards the sky (ageotropic).
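    As a non-limiting sketch (the labels are illustrative), the geotropic/ageotropic categorisation follows directly once the fast-phase beat direction and the head-roll position are known, since geotropic nystagmus beats toward the lowermost ear:

```python
def classify_lateral_canal_nystagmus(head_roll, beat_direction):
    """Classify horizontal nystagmus observed with the patient lying
    on one side as geotropic (fast phase beats toward the ground) or
    ageotropic (fast phase beats away from the ground).

    head_roll: 'left-ear-down' or 'right-ear-down'
    beat_direction: 'left-beating' or 'right-beating' (fast phase)
    """
    # The beat direction that points toward the ground for each
    # head-roll position.
    toward_ground = {"left-ear-down": "left-beating",
                     "right-ear-down": "right-beating"}
    if beat_direction == toward_ground[head_roll]:
        return "geotropic"
    return "ageotropic"
```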

    [0253] Alternatively, knowing the position (due to realignment of the horizontal plane from lining up the pupil as well as the calibration data using glints 28 (FIGS. 7A, 7B) or ovoid shape of pupil as example), the direction of the nystagmus can be plotted.

    [0254] FIG. 6 shows a side sectional view of an apparatus 10 according to an embodiment of the present invention.

    [0255] In particular, the apparatus 10 is provided as a wearable apparatus, such as goggles or other type of headset.

    [0256] When the patient is experiencing an attack of vertigo, dizziness or imbalance they place the apparatus in front of their eyes 13. The apparatus is turned on, such as by an on-off switch 11 provided on the apparatus (or remotely by wireless communication control).

    [0257] Two lenses/cameras 12 record the eyes/eye features (e.g. movement of the pupils).

    [0258] There is preferably a cover for each lens. The cover may be opaque to the wearer/user so that the patient's eyes are in complete darkness with no internal reflections, in order to avoid any visual distractions. Recording can of course still be made through the opaque cover, i.e. the cover is opaque to the user and not to the lenses/cameras.

    [0259] A seal 14 (such as a flexible, foam, plasticized or rubberized seal) is preferably provided around a face side periphery of the apparatus to prevent external light entering the cavity so the patient cannot see any visual distractions.

    [0260] A separate, removable adapter (such as a face mask) can be removably attached to the apparatus in order to accommodate different head sizes/face shapes. The removable adapter (e.g. face mask) can be disposable or cleanable. The removable adapter (e.g. face mask) may include a surround for contact with the user's face, the surround may be removable from the apparatus.

    [0261] Preferably the surround and/or adapter (e.g. face mask) is formed of or includes a flexible material or combination of materials, such as a polymer, e.g. silicone, rubber and/or foam based. The face mask may be shaped so as to fit around the user's eyes and over their nose e.g. over the bridge of the nose, similar to goggles for use in water or skiing, for a comfortable light-excluding fit e.g. sealing around the contact area with the skin. The adapter (e.g. face mask) can be of a light blocking material to prevent light passing to the eyes and image capture means that would consequently impair image capture of the eyes.

    [0262] The patient's eyes are illuminated by a light source 16, such as one or more infrared LEDs, incorporated as part of/attached to/within the apparatus. This is so the wearer/user cannot fixate on any point, since fixation would override the spontaneous eye movements.

    [0263] One or more cameras 18, such as video cameras, (e.g. one for each eye, or a single camera and a mirror arrangement) record the patient's eye movements (nystagmus) during an attack episode e.g. a vertigo attack.

    [0264] The video recordings are stored, such as on the memory apparatus 20, which is preferably mounted within or on the apparatus.

    [0265] Preferably the memory apparatus is connected for data transfer with electronic circuitry 22 controlling the apparatus.

    [0266] When the patient's episode/attack has finished, the apparatus is removed from their head.

    [0267] The video recordings can be provided to a processing system.

    [0268] The recordings are preferably compressed and eye movements logged.

    [0269] Processing of the eye movements can occur within/on-board the apparatus.

    [0270] Other data, such as the quality of the image and how long the pupil was visible, can be stored and can be analysed on the unit or transmitted to be analysed remote from the apparatus, such as at a clinic.

    [0271] Capture/recording of eye position/movement (optionally with head position/orientation measurements) can be transmitted to a remote location, such as the ‘cloud’ or remote server, to be processed there.

    [0272] The apparatus can be battery powered, such as by an on-board battery 24 and/or can receive power through a power connection 26.

    [0273] Eye kinematic measurements, such as gaze angle and rate of movement/change of movement/rate of change of movement, can be calculated and used to assess presence of a medical condition, such as nystagmus or other neurological disorder.

    [0274] Angular velocity can be determined by calculating rate of change of gaze angle (e.g. change of angle between gaze vectors for a given time interval).

    [0275] Further, angular acceleration can be determined by calculating the rate of change of angular velocity for a given time interval.

    [0276] Angular jerk can be determined by calculating the rate of change of angular acceleration for a given time interval.

    [0277] Angular jerk measurements can be used to detect when the eye movement transitions from a slow phase to a fast phase and/or when the eye movement changes direction.
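    The chain of finite-difference calculations described in paragraphs [0274] to [0277] can be sketched as follows (a non-limiting Python example; the spike threshold used to flag transitions is an illustrative assumption, not part of the disclosure):

```python
import numpy as np

def eye_kinematics(t, gaze_angle):
    """Angular velocity, acceleration and jerk from a sampled
    gaze-angle trace (degrees vs seconds), each computed as the rate
    of change of the previous quantity over the sampling interval."""
    velocity = np.gradient(gaze_angle, t)    # deg/s
    acceleration = np.gradient(velocity, t)  # deg/s^2
    jerk = np.gradient(acceleration, t)      # deg/s^3
    return velocity, acceleration, jerk

def transition_indices(jerk, k=5.0):
    """Indices where |jerk| exceeds k times its median absolute value,
    flagging candidate slow-to-fast phase transitions or changes of
    direction of the eye movement."""
    threshold = k * np.median(np.abs(jerk))
    return np.flatnonzero(np.abs(jerk) > threshold)
```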

    [0278] Definition: ‘within phase’ means sub-periods or intervals of time within a traditionally identified phase of a nystagmus e.g. slow phase or fast phase, such as shown by way of example in FIGS. 2, 5A and 5B.

    [0279] Analysis of the eye kinetics within phases (e.g. slow phase) may find eye movement patterns relating to the speed, position, direction, changes of direction, acceleration of the pupil (or other eye features) and other parameters.

    [0280] For example, the slow phase of a horizontal right beating nystagmus, may have smaller periods of vertical up beating nystagmus or some other notable feature that are contained within the time interval defining the slow phase.

    [0281] Eye kinematics including the angular jerk measurements can be used to identify ‘within phase’ transitions that may not be detected with incumbent clinic eye tracking apparatus. These ‘within phase’ transitions will lead to new research opportunities for identifying and diagnosing diseases with dizziness symptoms.

    [0282] Identifying ‘within phase’ eye kinematics may lead to better diagnosis of complex cases. This includes cases where multiple diseases, conditions or factors may contribute to or confound dizziness symptoms when using traditional nystagmus analysis.

    [0283] Furthermore ‘within phase’ eye kinetics may lead to further information about the underlying pathology in patients with vestibular dysfunction. For example it may give information as to the functioning of the vestibulo-cerebellar pathway and/or the vestibular nuclei.

    [0284] The present invention includes algorithms to detect traditional nystagmus, their related eye kinematics and ‘within phase’ eye kinetics.

    [0285] The apparatus and software include algorithms to detect these 'within phase' eye kinematics.

    [0286] FIGS. 8A and 8B show identical pupil locations relative to the image capture means (e.g. camera). They illustrate how differences in the calculated eyeball centre result in errors in the angles between gaze vectors and therefore errors in angular velocity.
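    A non-limiting Python sketch of the geometry in FIGS. 8A and 8B (an eyeball radius of 12 mm and the coordinate values are illustrative assumptions): the same pair of pupil positions yields a different inter-vector angle, and hence a different angular velocity, when the eyeball centre is estimated incorrectly.

```python
import numpy as np

def gaze_angle_between(eye_centre, pupil_a, pupil_b):
    """Angle (degrees) between the two gaze vectors drawn from an
    assumed eyeball centre through two pupil-centre positions
    (3D coordinates, e.g. in mm)."""
    va = np.asarray(pupil_a, float) - np.asarray(eye_centre, float)
    vb = np.asarray(pupil_b, float) - np.asarray(eye_centre, float)
    cos_ang = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
    # Clip guards against floating-point values just outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0))))
```

    With the true centre, two pupil positions 10 degrees apart on a 12 mm eyeball subtend 10 degrees; shifting the assumed centre 2 mm backwards reduces the computed angle to roughly 8.6 degrees, so any angular velocity derived from it would be underestimated by a similar factor.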

    [0287] Results/statistics (such as summary or full statistics) of one or more patient episodes of nystagmus based on the eye movements can be provided by the apparatus, or as processed remotely, and displayed on a local display (local to the apparatus) or transmitted to a remote display (e.g. tablet, PC, web browser client). For example, the apparatus/method may categorise the extent or type of nystagmus (e.g. 6 degrees per second right-beating) and display the type or types (or pattern) to give a clinician instant information/knowledge. The apparatus may include or have access to processing means to process and supply a suggested test.

    [0288] The apparatus 10 for such remote oculography monitoring and software together provide medical specialists with a unique custom solution to aid their diagnosis of vertigo, dizziness and balance symptoms as a result of neurological and vestibular (inner ear) conditions.

    [0289] One or more embodiments of the present invention provides a portable apparatus designed to obtain and/or record patient's eye movements during episodes/attacks of dizziness away from the clinic. That is, patients can administer the apparatus themselves when they are at home, work or during recreational activities.

    [0290] The medical specialist can review the captured data. This application allows specialists to view and analyse the uploaded patient data as well as collaborate with colleagues regarding the assessment of patients' symptoms.

    [0291] The system allows for the day-to-day administration and collection of patient information. Stored information includes one or more of:

    [0292] eye movement video sequences;

    [0293] 3D orientation and position of the head during the episodes;

    [0294] eye kinematics and tracking data;

    [0295] other eye feature information (e.g. pupil size);

    [0296] history;

    [0297] questionnaire data; and

    [0298] information pertinent to diagnosis.

    [0299] A software platform facilitates use of the apparatus and can help to identify patients for specific treatment options and enables research from the captured data including severity, timing and location of episodes.

    [0300] Additionally, the system can include monitoring the use of a number of the apparatus, and can oversee the setup and administration of clinics employing the apparatus.

    [0301] The apparatus 10 can include goggles configured to monitor and/or capture patient's eye/pupil movement(s), such as during episodes of dizziness. When the patient begins to experience an episode, the patient wears the apparatus. The apparatus commences recording the episode automatically or when switched on.

    [0302] It is appreciated that users/patients will have the apparatus for 2-4 weeks and clinics will possess a number of the apparatus to deploy to a respective number of users/patients, with potential for spare apparatus being kept on hand.

    [0303] Alternatively, the patient or each patient may keep the respective apparatus. The patient could choose to keep the apparatus permanently (e.g. for a cost) and only the specialist/clinician can provide a report/analysis when the apparatus transmits new/updated measurements/data of one or more nystagmus episodes.

    [0304] In use, the apparatus captures video of the eye kinematics and preferably patient position (e.g. accelerometer, gyroscope) data.

    [0305] Preferably, the apparatus performs on-board processing during or after the video capture. For example, the on-board processing may detect whether the patient closes their eyes during the recording and/or, after the capture, may compress the video data and/or log particular events, such as variability in eye movement, eye closure, or periods of non-movement between bouts of eye movement.
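The on-board event logging described above can be sketched as follows. This is an illustrative example only: the `EventLogger` class, the `fps` parameter and the per-frame `pupil_visible` flag (assumed to come from an upstream pupil detector) are assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class EventLogger:
    """Minimal sketch of on-board event logging during video capture."""
    fps: int = 30                      # frames per second of the capture
    events: list = field(default_factory=list)
    _closed_frames: int = 0            # consecutive frames without a visible pupil

    def process_frame(self, frame_idx: int, pupil_visible: bool) -> None:
        if not pupil_visible:
            self._closed_frames += 1
            # Log an eye-closure event once the eye has been shut for ~1 second
            if self._closed_frames == self.fps:
                self.events.append(("eye_closure", frame_idx - self.fps + 1))
        else:
            self._closed_frames = 0
```

In practice, the same loop could also log periods of non-movement or unusual variability by comparing successive pupil positions.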

    [0306] The captured data can be sent (streamed, sent in packets or sent as a whole) from the apparatus to a platform e.g. via hard wire or wireless, such as via a mobile phone network.

    [0307] If a mobile network is not available, the apparatus can include the facility to store the data on-board to be downloaded e.g. at a later time (such as via USB, internal or removable storage).

    [0308] Preferably, software on the platform is cloud based and accessed through a web-app.

    [0309] By way of explanatory example, a patient can have custody of the apparatus permanently or for a time period. When the patient senses or begins experiencing an episode of dizziness, the patient puts on the apparatus and commences recording eye movements using the on-board image capture means.

    [0310] The patient can receive a visual, audible and/or vibration alert that the apparatus is recording. If the patient shuts their eyes during recording or if ambient light enters the apparatus thereby affecting recording, the apparatus gives an alert to the patient.

    [0311] If the eyes remain closed and/or light continues to enter for a period, e.g. 10 seconds, the recording can be halted/paused and the patient notified. Alternatively, if the apparatus successfully records eye movement images for a predetermined threshold period, say 120 seconds, the apparatus can indicate that recording has completed successfully. The patient can then cease recording. The captured image data is stored, processed on-board and/or sent for remote processing.
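The recording control rule described above can be expressed as a simple decision function. This is a sketch only; the function name and parameters are illustrative, and the 10-second and 120-second values are the example thresholds given in the text, not fixed requirements.

```python
def recording_status(elapsed_ok_s: float, continuous_bad_s: float,
                     abort_after_s: float = 10.0,
                     success_after_s: float = 120.0) -> str:
    """Decide whether to continue, pause, or finish a recording.

    elapsed_ok_s     -- total seconds of successfully captured eye images
    continuous_bad_s -- seconds of continuous eye closure or light ingress
    """
    if continuous_bad_s >= abort_after_s:
        return "pause_and_notify"      # halt/pause and alert the patient
    if elapsed_ok_s >= success_after_s:
        return "recording_complete"    # indicate successful recording
    return "continue"
```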

    [0312] With respect to the clinic side for deployment of the apparatus, a patient is instructed when and how to use the apparatus. After sufficient episode recordings are completed, the patient returns the apparatus to the clinician or, as otherwise arranged, keeps the apparatus.

    [0313] Recorded data may be obtained directly from the apparatus (as processed data, semi-processed data for additional processing, or raw data for subsequent processing) and the obtained data is analysed. The image data can be transmitted from the apparatus for remote processing, such as by hard-wired, local wireless transmission e.g. Wi-Fi and/or mobile phone network link for remote processing and/or storage and/or analysis.

    [0314] Episode recording may be monitored in real or near real time by clinicians, such as by automatic detection and transmission of recorded or direct image data as an episode occurs.

    [0315] The present invention provides advantages of:
    [0316] reduced contact time with patients and an increased likelihood of arriving at a definitive diagnosis;
    [0317] ability to diagnose more patients due to (a) the reduced occurrence of repeat appointments with existing undiagnosed patients, and (b) shorter appointments with all patients;
    [0318] a low-cost assessment option requiring minimal capital expenditure that provides a rapid return on investment;
    [0319] improved patient care; and
    [0320] a source of income for the clinics.

    [0321] For the patient, the value proposition of the apparatus/system is:
    [0322] a more convenient method of recording their spontaneous nystagmus;
    [0323] shorter waiting times for diagnosis and treatment;
    [0324] a quicker path to the correct specialist and an increased likelihood of a correct diagnosis;
    [0325] empowerment of the patient by being actively involved in identifying the cause;
    [0326] symptoms are recorded objectively and are not reliant on patient report, lending credibility to the patient's medical complaint;
    [0327] lower costs due to the need for fewer tests;
    [0328] fewer incidents of misdiagnosis and inappropriate treatment;
    [0329] reduced loss of income due to illness; and
    [0330] reduced stress.

    [0331] Further benefits are: reduced costs due to earlier identification of the causes of dizziness in the care chain, and hence fewer expensive specialist consultations; creation of efficiencies in the healthcare system by facilitating remote monitoring of patients' dizziness episodes, allowing clinicians to see more patients in the same amount of time; and a reduction in the volume of falls (and other outcomes that are the result of dizziness) due to earlier diagnosis of the causes of dizziness.

    [0332] With reference to FIGS. 9A to 9D, the reference numerals indicate the eyeball 1, pupil projection onto capture apparatus image plane 2, actual pupil height 3, projected pupil height 4 and projected pupil shape 5.

    [0333] In FIG. 9A the projection of the pupil image is 5 mm high, and the pupil projection is a circle with a 5 mm diameter.

    [0334] In FIG. 9C, where the gaze (from eyeball 1) is 45 degrees upward, the pupil is still 5 mm in diameter; however, the pupil projection is an ellipse 3.54 mm high by 5 mm wide (5 mm × cos 45°).

    [0335] The change in shape of the elliptical projection of the pupil can be used to calculate the gaze angle and, thereafter, all related eye kinematics.
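The foreshortening relationship in FIGS. 9A to 9D can be inverted to recover the gaze angle: the minor axis of the projected ellipse equals the pupil diameter multiplied by the cosine of the gaze angle. A minimal sketch (function name illustrative):

```python
import math

def gaze_angle_deg(projected_minor_mm: float, pupil_diameter_mm: float) -> float:
    """Gaze angle from foreshortening of the pupil projection.

    minor axis = pupil diameter * cos(gaze angle), so
    gaze angle = arccos(minor axis / pupil diameter).
    """
    ratio = projected_minor_mm / pupil_diameter_mm
    ratio = max(-1.0, min(1.0, ratio))   # guard against measurement noise
    return math.degrees(math.acos(ratio))
```

For the FIG. 9C example, a 3.54 mm by 5 mm ellipse yields a gaze angle of approximately 45 degrees; a circular 5 mm projection (FIG. 9A) yields 0 degrees.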

    [0336] Information/data 30 directly or indirectly from the apparatus 10 of the present invention can be combined with sensed information/data from one or more head movement sensors 32 (such as one or more accelerometers), one or more body/body function sensors 34 and/or one or more external sensors 36, or a combination of any two or more thereof.

    [0337] One or more external sensors 36 can include one or more environmental sensors 38a and/or one or more remote sensors 38b for remote monitoring of the person or their body factors e.g. remote cameras for capturing images for assessing behaviour, change in behaviour or activity (fitness, undertaking tasks, interaction with others etc.), temperature, blood pressure—see, for example, FIG. 11. For example, one or more environmental factors, such as temperature, air pressure, humidity, altitude, gravity, and/or rate of change of one or more thereof, can be measured. Measurement of the environmental factor(s) may be factored into assessing patient data, such as eye motion data 42 obtained via the apparatus 10. Analysis may be conducted using cloud computing/storage 44. Analysis in the cloud can examine and/or document the relationship between environmental factors (such as temperature, air pressure (barometric pressure), humidity, gravity etc.) and the presence of eye motion or type of eye motion (such as nystagmus).

    [0338] One or more body sensors 34 can include internal, body mounted or worn sensors, such as a heart rate monitor worn around the chest, a pulse sensor on the wrist, or a blood pressure monitoring cuff on the arm. See, for example, FIG. 12. For example, oxygen level and/or heart rate may be measured and optionally provided as data for analysis with eye motion data 42 obtained using the apparatus 10. Eye motion data and the body measurement data can be provided via a cloud data service 44.

    [0339] Analysis 46 can be conducted through a cloud computing service. Such analysis may examine and/or document the relationship between body functions (such as heart rate, oxygen level) and eye motion (such as nystagmus). Analysis/results can be displayed 48 for diagnosis/patient assessment.
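One simple way such an analysis could quantify the relationship between a body function and eye motion is a correlation statistic over time-aligned samples. The sketch below is illustrative only; the series names (heart rate versus a nystagmus intensity measure) are assumptions, and a real analysis pipeline would likely use more sophisticated methods.

```python
import math

def pearson_r(xs: list, ys: list) -> float:
    """Pearson correlation between a body-metric series (e.g. heart rate
    samples) and an eye-motion series (e.g. nystagmus intensity samples)
    taken over the same time window."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value near +1 or -1 would suggest the body function and the eye motion vary together, which the clinician could then investigate further.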

    [0340] Gravity (G) and/or changes in gravity (G.sub.t) over time can be measured.

    [0341] It will be appreciated that, whilst commensurate measurement of one or more of body, head and external parameters with eye state/movement is preferred, it may be possible to assess a person's physical parameters separately and assess gravity earlier or later.

    [0342] Sensor information/data and/or information/data from the apparatus 10 can be provided for analysis by an analyser/processor 40.

    [0343] Reaching or exceeding one or more threshold values of measured/sensed head metrics, body metrics and/or environmental metrics can be used to determine whether there is an effect on eye state, or a change in eye state, over time.
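The threshold check described above can be sketched as a small helper that flags which sensed metrics have reached or exceeded their configured thresholds. The metric names used here (head acceleration, heart rate, barometric pressure) are illustrative assumptions, not a defined set.

```python
def exceeded_metrics(readings: dict, thresholds: dict) -> list:
    """Return the names of sensed metrics whose current reading reaches
    or exceeds the configured threshold value.

    readings   -- metric name -> current measured value
    thresholds -- metric name -> threshold value
    """
    return [name for name, value in readings.items()
            if name in thresholds and value >= thresholds[name]]
```

Flagged metrics could then be examined alongside the contemporaneous eye state data to assess cause and effect.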

    [0344] Head movement 50 and eye motion characteristics 52 can be measured. Accelerometer data 54 related to head movement can be provided as an input to understanding the relationship between head movement and eye motion. Eye motion (such as a nystagmus pattern) can be detected 56, such as by measurement of eye motion over time.

    [0345] The eye motion pattern 58, such as indicating BPPV (benign paroxysmal positional vertigo) can be provided as part of the assessment of the relationship between head movement and eye motion—see 60.

    [0346] Cause and effect relationship between specific head movement and specific eye motion can be assessed 60.

    [0347] The specific otolith that is at fault can be assessed. For example, when an otolith is activated by a specific head movement, it causes an unwanted defect that is detectable, or it fails to cause an expected eye movement. By knowing the specific head movement and the expected corresponding eye motion, the particular ear can be determined, and the semi-circular canal within that ear that is causing the problem can be identified (see 62).