HEAD-MOUNTED DISPLAY APPARATUS

20220146856 · 2022-05-12

    Abstract

    A head mountable imaging apparatus (5) for assisting a user with reduced vision comprises a first display device (20) configured to provide a display to a first eye of the user. A first lens (27) is provided on a user side of the first display device (20). The first lens (27) is configured to form a focused image of the first display device (20). A first camera (25) is configured to provide an output representing a scene in front of the imaging apparatus (5). A processor (40) is configured to receive the output from the first camera (25), to perform one or more image enhancements to improve vision for the user and to provide a processed output to the first display device (20) for display to the user. The first display device (20) is circular or elliptical.

    Claims

    1. A head mountable imaging apparatus for assisting a user with reduced vision, the apparatus comprising: a first display device configured to provide a display to a first eye of the user; a first lens provided on a user side of the first display device, the first lens configured to form a focused image of the first display device; a first camera configured to provide an output representing a scene in front of the imaging apparatus; and a processor configured to receive the output from the first camera, to perform one or more image enhancements to improve vision for the user and to provide a processed output to the first display device for display to the user, wherein the first display device is circular or elliptical.

    2. An apparatus according to claim 1, further comprising a first tubular element which surrounds the first display device and the first lens, with the first lens located at an eye-facing end of the first tubular element.

    3. An apparatus according to claim 2, wherein the first tubular element provides a light tight shield.

    4. An apparatus according to claim 2, wherein the first lens is supported by the first tubular element.

    5. An apparatus according to claim 1, wherein the first lens is circular or round.

    6. An apparatus according to claim 1, further comprising an open region adjacent the first lens such that a user is able to view a combination of an image on the first display device and surrounding environment outside the imaging apparatus.

    7. An apparatus according to claim 1, wherein the apparatus is configured to display an image having a first angular field of view of the scene on the first display device, and to provide to the user an angular field of view of the first display device which is the same as the first angular field of view.

    8. An apparatus according to claim 1, wherein a distance between the first display device and the first lens is less than a diameter or height of the first display device.

    9. An apparatus according to claim 1, wherein the first display device is an opaque display device which does not allow a user to see through the display device.

    10. An apparatus according to claim 1, wherein the first camera has a first image sensor, and wherein the processor is configured to obtain the output from a selected region of the first image sensor which is a subset of an overall area of the first image sensor.

    11. An apparatus according to claim 10, wherein the first image sensor has a rectangular shape.

    12. An apparatus according to claim 10, wherein the first image sensor has an x-axis and a y-axis and wherein the processor is configured to vary the position of the selected region in at least one of the x-axis and the y-axis.

    13. An apparatus according to claim 10, wherein the processor is configured to vary a size of the selected region of the overall area of the first image sensor.

    14. An apparatus according to claim 1, wherein the first camera is positioned on an outer, forward-facing, side of the imaging apparatus in front of the first display device.

    15. An apparatus according to claim 14, wherein the first camera is aligned with a central axis of the first display device.

    16. An apparatus according to claim 15, wherein the first camera is substantially aligned with an optical axis of the user's first eye.

    17. An apparatus according to claim 1, wherein the first lens is a Fresnel lens, an aspheric lens or a plano convex lens.

    18. An apparatus according to claim 1, further comprising a second display device which is circular or elliptical.

    19. An apparatus according to claim 1, further comprising a second camera configured to provide an output representing a scene in front of the imaging apparatus.

    20. An apparatus according to claim 19, wherein the second camera is positioned on an outer, forward-facing, side of the imaging apparatus in front of the second display device.

    21. An apparatus according to claim 20, wherein the second camera is aligned with a central axis of the second display device.

    22. An apparatus according to claim 21, wherein the second camera is substantially aligned with an optical axis of the user's second eye.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0038] One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying figures in which:

    [0039] FIGS. 1-3 show examples of an imaging apparatus;

    [0040] FIG. 4 shows another example of an imaging apparatus;

    [0041] FIG. 5 shows image processing functionality in the imaging apparatus;

    [0042] FIG. 6 shows a camera for use in the imaging apparatus;

    [0043] FIG. 7 shows a conventional arrangement of a rectangular display and lens;

    [0044] FIG. 8 shows a relationship between a camera, a display and a lens in the imaging apparatus of FIGS. 1-4;

    [0045] FIG. 9 shows a relationship between fields of view of the imaging apparatus;

    [0046] FIGS. 10-12 show an arrangement of a display and lens for use in the imaging apparatus;

    [0047] FIG. 13 shows a relationship between parts of the imaging apparatus;

    [0048] FIGS. 14 and 15 show examples of processing performed by the imaging apparatus.

    DETAILED DESCRIPTION

    [0049] FIGS. 1-4 show examples of an imaging apparatus 5. The imaging apparatus 5 is configured to be worn on a user's head 1. The imaging apparatus 5 shown in these drawings is in the form of a head mountable pair of glasses, but it could be in the form of a headset. The imaging apparatus 5 has a frame 10 or a housing, which is worn in a similar manner to a conventional pair of glasses. The housing/frame 10 has a bridge region 11 which is configured to rest on a user's nose. The housing/frame 10 has a pair of arms 12, 13. Each of the arms 12, 13 is configured to rest on a user's ear.

    [0050] The imaging apparatus 5 provides each eye of the user with an image representing a view of the surrounding environment in front of the apparatus. In particular, the imaging apparatus 5 provides each eye of the user with an image representing a view of the surrounding environment that the user would normally experience with that eye. A display 20, 30 is provided in front of each of the user's eyes. A first display 20 is provided in front of the user's left eye. A second display 30 is provided in front of the user's right eye. Each display 20, 30 is supported by the frame/housing 10. The position of the displays 20, 30 is best seen in FIG. 4. Each of the displays 20, 30 may use any suitable display technology, such as backlit Liquid Crystal Display (LCD) or Organic Light Emitting Diode (OLED). OLED display technology is advantageous as the display does not require a backlight and can therefore be implemented with reduced physical depth and weight, and with lower power consumption. It will be understood that the display is opaque. That is, the user only sees an image displayed by the display 20, 30. The user cannot see through the display 20, 30.

    [0051] Each of the displays 20, 30 has a round shape, such as a circular shape or an oval/elliptical shape. In an example, the diameter of the circular OLED display is 35 mm. Other dimensions are possible. The displays 20, 30 may be the type of round displays used in smart watches, or any other suitable display.

    [0052] In FIGS. 1-4 a pair of cameras 25, 35 is provided. A first camera 25 is provided on an outer, forward-facing, side of the imaging apparatus in front of the first display device 20. A second camera 35 is provided on an outer, forward-facing, side of the imaging apparatus in front of the second (right) display 30. In this example, each camera 25, 35 is aligned with a central axis 21, 31 of the display 20, 30. Each camera 25, 35 may also be aligned with an optical axis of one of the user's eyes (when the eye is located at a rest position). Each camera 25, 35 provides an output image/video signal. Each camera 25, 35 is configured to provide an output image signal representing a field of view in front of the respective display. The first camera 25 provides an image which represents a view in front of the first (left) display 20. The second camera 35 provides an image which represents a view in front of the second (right) display 30. The use of two spaced-apart cameras 25, 35 provides a user with separate images at their left and right eyes, which can allow a perception of depth in an imaged scene. Visual navigation in the world is greatly assisted by depth perception. Binocular vision allows depth perception from a number of different cues including stereopsis, eye convergence, disparity and parallax.

    [0053] Referring again to FIG. 4, a display 20 is mounted on a first, eye-facing, side of a printed circuit board (PCB) 26 and a camera 25 is mounted on a second, outward-facing, side of the PCB 26. A first lens 27 is provided on a user-facing side, in front of the first (left) display 20. Similarly, a display 30 is mounted on a first, eye-facing, side of a PCB 36 and a camera 35 is mounted on a second, outward-facing, side of the PCB 36. A second lens 37 is provided on a user-facing side, in front of the second (right) display 30. Each lens 27, 37 is spaced apart from the respective display 20, 30. Each lens 27, 37 is a compact lens, such as a Fresnel lens, an aspheric lens or a plano-convex lens. These types of lenses can be moulded from lightweight materials, such as a polymeric material (e.g. plastic), which also has the advantage of reducing cost. The lenses 27, 37 have a short focal length. This allows each lens 27, 37 to be positioned close to the display 20, 30. Each display 20, 30 lies in, or near to, the focal plane of the respective lens 27, 37. The lenses 27, 37 allow the imaging apparatus 5 to be as physically compact as possible, compared with the conventional use of a single spherical lens or multi-element spherical lenses. This allows a lens with a very low F/# (typically ½-⅔) to be used and results in a more compact display. As shown in FIG. 11, in some examples, the distance between the display 20, 30 and the lens 27, 37 is ½-⅔ of the diameter D of the display 20, 30.

    [0054] Each lens 27, 37 is round and advantageously is slightly larger than the display. For example a 40 mm diameter lens may be used with a 35 mm diameter display. Each display 20, 30 is placed within, or near, the focal plane of the lens. This allows the full area of the display to be viewed and results in an image focused far away when placed close to the eye.

    [0055] A first tubular element 24 surrounds the display 20 and the lens 27, maintaining the lens 27 at a fixed distance from the display 20. The lens 27 is supported by the first tubular element 24. No other part of the frame 10 or housing is required to support the first lens 27. The display (or the PCB 26 on which the display 20 is mounted) is located at the outward-facing end of the tubular element 24. The Fresnel lens 27 is located at, or close to, the eye-facing end of the tubular element 24. A region of empty space separates the display 20 and the lens 27. The first tubular element 24 may also provide a light tight shield between the lens 27 and the display 20. That is, the only optical path to/from the display is via the lens 27. This prevents stray light from reaching the display 20. This can improve readability of the display, especially under bright conditions, while avoiding the need to fully isolate the user from the surrounding environment in the manner of a conventional shielded headset. High contrast is important for partially-sighted people due to a degradation in contrast sensitivity. A second tubular element 34 provides the same functions for the right eye display 30 and lens 37.

    [0056] The imaging apparatus 5 comprises an open region 16 adjacent to the first lens 27. The open region 16 is to the left hand side of the first lens 27. Similarly, the imaging apparatus 5 comprises an open region 17 adjacent to the second lens 37. The open region 17 is to the right hand side of the second lens 37. Instead of shielding the user from the surrounding environment, the user can view a combination of an image on the first display device (via the lens 27) and the surrounding environment outside the imaging apparatus. Similarly, the user can view a combination of an image on the second display device (via the lens 37) and the surrounding environment outside the imaging apparatus. The open region has an advantage of keeping the periphery clear for general spatial awareness, object location and obstacle avoidance. It has an advantage of reducing the feeling of isolation from the external world that is normally associated with a shielded headset type of imaging apparatus, and can reduce nausea. The open region has a further advantage of improving airflow, preventing an uncomfortable build-up of heat and moisture.

    [0057] A prescription lens 28, 38 may also be provided. The prescription lens may compensate for short-sightedness (myopia), far-sightedness (hyperopia) and/or some other condition. In FIG. 4 a first prescription lens 28 is shown in front of the Fresnel lens 27, and a second prescription lens 38 is shown in front of the Fresnel lens 37. Depending on a user's needs, a prescription lens may only be present for the left eye or the right eye. Where tubular elements 24, 34 are used, the prescription lens, or lenses, may be supported by the tubular elements 24, 34. For example, prescription lens 28 may be located within the eye-facing end of the tubular element 24.

    [0058] FIG. 5 schematically shows image processing functionality of the imaging apparatus 5. A processing unit 40 is configured to receive an image/video signal 41 from the left eye camera 25 and receive an image/video signal 42 from the right eye camera 35. The processing unit 40 may improve vision for the user by computationally enhancing a live image of the environment. Processing unit 40 may provide one or more image enhancements 45 to the image signal received from the cameras 25, 35. The processing unit 40 outputs a processed image/video signal 43 to the left eye display 20 and outputs a processed image/video signal 44 to the right eye display 30. The image enhancements may comprise one or more of: edge detection and presentation of the detected edges (e.g. as white edges on a black background, as white edges overlaid upon a colour or a grayscale image); an enhanced contrast between features of the image; a black and white high-contrast image with a global threshold that applies to the entire screen; a black and white high contrast image with multiple regional thresholds to compensate for lighting changes across a screen; an algorithm to detect large regions of similar hues (e.g. regardless of brightness) and then presenting these regions as high brightness swatches of the same colour. Other enhancements or image processing may be performed by the processing unit 40. Other processing functions include one or more of: magnification or minification, display of a high resolution static image, presentation of a picture-within-picture. The type of enhancement(s)/processing performed by the processing unit 40 may depend on the vision defects of the user.
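    The thresholding and edge-based enhancements listed above can be illustrated with a short sketch. The following is a minimal, hedged example only (the function names, the local-mean regional threshold and the simple gradient test are illustrative assumptions, not the algorithms used by the processing unit 40), operating on a grayscale image held as a list of rows of 0-255 values:

```python
def global_threshold(img, t=128):
    """Black/white high-contrast image with a single threshold for the whole frame."""
    return [[255 if p >= t else 0 for p in row] for row in img]

def regional_threshold(img, block=2):
    """Black/white image with per-block thresholds, compensating for lighting
    changes across the frame (here each block's threshold is its local mean)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            region = [img[y][x]
                      for y in range(by, min(by + block, h))
                      for x in range(bx, min(bx + block, w))]
            t = sum(region) / len(region)  # local mean as the regional threshold
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = 255 if img[y][x] >= t else 0
    return out

def edges(img, t=40):
    """White edges on a black background via a simple gradient-magnitude test."""
    h, w = len(img), len(img[0])
    return [[255 if y + 1 < h and x + 1 < w and
             abs(img[y][x] - img[y][x + 1]) + abs(img[y][x] - img[y + 1][x]) > t
             else 0
             for x in range(w)] for y in range(h)]
```

    A deployed system would typically use optimised library routines rather than pure Python, but the pixel mappings are of this kind.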

    [0059] One possible location for the processing unit 40 is the bridge region 11 of the frame/housing 10. Another possible location for the processing unit 40 is in one, or both, of the arms 12, 13. The imaging apparatus 5 may comprise a local power source, such as at least one battery housed in one, or both, of the arms 12, 13.

    [0060] FIG. 6 shows the camera 25 in more detail. Camera 35 is the same as camera 25. The camera 25 comprises an image sensor 25A and a lens, or lens array, 25B. The lens 25B of the camera forms a focused image on the image sensor 25A. The lens 25B has a field of view (FOV) 25C.

    [0061] For comparison purposes, FIG. 7 illustrates a conventional apparatus used in Virtual Reality (VR) or Augmented Reality (AR) applications. A rectangular display 101 is used with a macroscopic round lens 102. The lens 102 is required to produce a high-resolution image over a wide field of view with low field curvature and other aberrations, so typically either a complex multi-component lens or a customised moulded aspheric lens is required. This leads to significant compromises in form factor because the shapes of the display and the lens are mismatched. The diameter of the lens 102 has to be at least as large as the diagonal of the display 101 in order for the entire display 101 to be viewable. In addition, the lenses required have a relatively large F/# (>1). This results in the distance between the lens and the display being larger than the diagonal size of the display (typically by at least 1.5 to 2×). Both of these factors result in a large distance between the display and the lens and therefore result in either a large bulky headset or a small display with a small field of view.

    [0062] FIG. 8 illustrates an optical and a physical relationship of the components of the imaging apparatus. The relationships of the imaging apparatus shown in FIG. 8 can apply to the horizontal (x) plane (i.e. FIG. 8 can be understood as showing a top view of the apparatus) and to the vertical (y) plane (i.e. FIG. 8 can be understood as showing a side view of the apparatus). The angular ranges are wider in the horizontal plane compared to the vertical plane, but the same relationships apply. The imaging apparatus 5 comprises a display 20, a camera 25 and a Fresnel lens 27. The Fresnel lens 27 is positioned between the user's eye 2 and the display 20.

    [0063] FIG. 8 shows three eye positions 2A, 2B, 2C. Position 2A is a central position of the eye. In this central (or rest) position 2A, the main optical axis of the eye is aligned with a centre of the lens 27, display 20 and camera 25. The lens 27, display 20 and camera 25 are co-aligned with the same axis. Positions 2B, 2C represent positions at the limits of comfortable eye movement under normal conditions. Lines 6 and 7 represent the edges of the field of view for positions 2B, 2C. The eye can rotate further than positions 2B, 2C but this is generally uncomfortable. Usually, if the user wishes to view outside of the comfortable viewing range they will rotate their head to bring the eye position back to within this comfortable range. Typically, the range of eye movement is restricted to an elliptical region extending between +20 degrees and −20 degrees in the horizontal plane and between +15 degrees and −15 degrees in the vertical plane. These angles relate to the angular distance between the main optical axis in positions 2B, 2C and the main optical axis in a rest position (position 2A). Beyond this angular range of movement, a user typically moves their head (rather than their eyes) to reorient.

    [0064] FIG. 10 shows an elliptical region representing a typical range of eye movement, superimposed upon the circular display 20, 30. The region of typical eye movement lies within the circular display 20, 30.

    [0065] FIG. 11 shows a relationship between the Fresnel lens 27, 37 and the display 20, 30. The distance between the lens 27, 37 and the display 20, 30 is less than the diameter D of the display 20, 30.

    [0066] Advantageously the display 20, camera 25 and Fresnel lens 27 are all aligned, and are aligned with a main optical axis 21 of the user's eye 2. The Fresnel lens 27 is positioned within a field of view (FOV) of the user's eye. The lens 27 allows the user's eye to form a focused image of the display 20.

    [0067] An aim of the imaging apparatus 5 is to appear, to the user, as if there is nothing but an empty glasses frame in front of their eye. To achieve this, the FOV 22 of the lens 27 and display 20 as seen by the user's eye is matched to the FOV 25D of the scene displayed on the display 20. This gives a system magnification of 1× (unity) in terms of angular field of view. The relationship between the FOV 22 and FOV 25D is shown by FIG. 9.
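    The unity-magnification condition above can be expressed numerically. For a display placed in the focal plane of the lens, the angular field of view subtended at the eye is approximately 2·atan(D/2f). The focal length used below is an assumed value for illustration only; the 35 mm display diameter is taken from the description:

```python
import math

def display_fov_deg(d_mm, f_mm):
    """Angular field of view (degrees) of a display of diameter d_mm placed in
    the focal plane of a lens of focal length f_mm (thin-lens approximation)."""
    return math.degrees(2 * math.atan((d_mm / 2) / f_mm))

d = 35.0   # display diameter from the description (35 mm)
f = 20.0   # assumed lens focal length (illustrative only)
theta = display_fov_deg(d, f)
# For unity system magnification, the scene shown on the display is cropped from
# the camera output so that its angular FOV equals theta.
```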

    [0068] In conventional Virtual Reality (VR)/Augmented Reality (AR) imaging there is often a mismatch between the viewing angle of cameras and the viewing angle of their displays. This provides further difficulties with navigation due to mismatches in the optic flow of the visual scene. Close objects move faster in the visual field than distant objects. When the image on the display is zoomed in, and at a greater size than real life, the increased optic flow makes everything appear to be closer than it is.

    [0069] Without such matching, the user typically experiences a discontinuity between their view of the display 20, 30 and their view past the edge of the display 20, 30 due to distortions in the image. The discontinuity in the optic flow may induce nausea and make navigation around the world challenging. It may also make it difficult to perform tasks requiring hand-eye coordination.

    [0070] In the imaging apparatus 5, the effects of this optical discontinuity are reduced. A user experiences a system magnification of unity by matching the camera focal length and chip size to the display size and lens focal length. Fine adjustments to the system magnification are made digitally. This ensures that peripheral vision past the edge of the display and the image on the display are continuous. The user is then able to use peripheral vision with no mismatch in position, scale or flow of objects as they pass the boundary from peripheral vision to the display. The discontinuity at the boundary of the apparatus may be similar to that experienced at the frame of a conventional pair of glasses.

    [0071] The user's field of view FOV 22 of the lens 27, and the display 20 beyond the lens 27, is determined by factors such as the size of the lens 27, the size of the display 20 and the distance 50 between the lens 27 and the eye 2. As explained above, the eye 2 has a wider overall FOV than FOV 22. The extent of the wider FOV of the eye is shown by the dashed lines 6, 7. When the eye is located in one of the extreme positions 2B, 2C the user's gaze is directed approximately one third of the way across the display 20. The full display 20 will still be visible within the user's peripheral vision. The world beyond the edge of the lens 27 will also be visible in the user's peripheral vision, assuming the glasses frame does not obstruct this. This is true even when the gaze is directed straight ahead. Rotation of the eye 2 effectively translates the pupil, and the edge of the lens 27 effectively acts as a window that the display 20 is viewed through. This means that as the eye rotates the view of the display 20 will appear to be cropped differently. If the eye is rotated to the left then the display will be cropped on the left hand edge. By configuring the lens 27 with a larger diameter than the display 20, this effect should be minimised or negligible.

    [0072] Referring to FIG. 12, any point source of light (in this case a pixel 29 on display 20, 30) emits light in many directions. A few representative rays are shown. If the point source lies in the focal plane of the lens (as it does in this case) then the diverging rays from a point will exit the lens parallel to each other. These parallel rays are then focused by the lens in the eye 2 to a corresponding image point on the retina. The collection of a range of diverging rays from a point source by a lens, and their subsequent refocusing to a point, is a necessary requirement to form an image. For a point at the centre of the display 20, the lens 27 captures a much wider range of rays compared to a pixel located at the periphery of the display 20. In principle, this means the centre of the image would appear to be much brighter than the periphery. However, in this case, the pupil of the eye 2 limits the set of rays that contribute to the image formation. This means that if the diameter of lens 27 is made larger than the diameter of the display 20 by the size of the pupil (actually the size of the eyebox, because the pupil can move anywhere within the eyebox), then a reasonable image can be formed, with uniform brightness across the whole of the image. The eyebox is the three-dimensional region in front of the lens within which the user can see a reasonable image. So, if the eyebox has a dimension of 5 mm, then the pupil will need to be within this 5 mm region for the optimal view.
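    The sizing rule in the paragraph above reduces to a simple sum: the lens diameter is approximately the display diameter plus the eyebox dimension, so that every pupil position within the eyebox collects rays from every display pixel. A minimal sketch (the helper name is an illustrative assumption):

```python
def min_lens_diameter(display_d_mm, eyebox_mm):
    """Lens diameter needed for uniform image brightness:
    display diameter plus eyebox dimension."""
    return display_d_mm + eyebox_mm
```

    With the 35 mm display and a 5 mm eyebox, this gives the 40 mm lens diameter quoted earlier in the description.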

    [0073] As explained above, the scene displayed on the display 20 has a FOV 25D. Referring again to FIG. 6, the lens 25B of the camera 25 collects light over a wider FOV 25C. The lens 25B projects an image onto the image sensor 25A. The projected image is as wide as, or wider than, the image sensor 25A of the camera 25. By providing a FOV 25C which is wider than the FOV 25D, the image can be cropped to match the FOV of the display 20 and/or lens 27 as seen by the eye (theta). This wider camera FOV 25C can also be used for translation and/or digital zooming to calibrate for the user. This is explained in more detail with reference to FIGS. 13-15.

    [0074] The circular display 20 displays an image which is selected from a region of the image sensor 25A. Stated another way, the image sensor 25A is cropped to provide the image for display. FIG. 13 shows the circular display FOV superimposed upon the image/camera sensor FOV.

    [0075] FIG. 13 shows the relationship between the FOV of the image displayed by the display 20 compared to the FOV of the image on the image sensor. The image sensor 25A typically has smaller physical dimensions than the display 20. It should be understood that FIG. 13 does not show a relationship of the physical dimensions of the image sensor 25A and the display 20 but, instead, shows a relationship between FOVs of the image sensor 25A and the display 20.

    [0076] In FIG. 13 the display FOV has a height (DISPLAY_H) and a width (DISPLAY_W). The image sensor FOV has a height (SENSOR_H) and a width (SENSOR_W). The display FOV has a height (DISPLAY_H) which is substantially the same as the height (SENSOR_H) of the image sensor FOV, and the display FOV has a width (DISPLAY_W) which is less than a width (SENSOR_W) of the image sensor FOV.

    [0077] A position of the cropped region of the image sensor 25A may be selected by the processing unit 40. For example, the cropped region used for output to the display 20 may be moved in the x-axis and/or y-axis.
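    The selection of a cropped region can be sketched as follows. This is an illustrative assumption of one way to centre a square crop (suited to a round display) on a rectangular sensor and shift it in the x-axis and/or y-axis, clamped to the sensor bounds:

```python
def crop_region(sensor_w, sensor_h, dx=0, dy=0):
    """Return (left, top, width, height) of a square crop as tall as the sensor,
    centred by default, shifted by (dx, dy) and clamped to the sensor edges."""
    side = min(sensor_w, sensor_h)
    left = (sensor_w - side) // 2 + dx
    top = (sensor_h - side) // 2 + dy
    left = max(0, min(left, sensor_w - side))
    top = max(0, min(top, sensor_h - side))
    return (left, top, side, side)
```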

    [0078] The size of the cropped region may be varied by the processing unit 40. Size may be varied by a digital zoom operation, i.e. a digital-domain manipulation of the mapping between the pixels of the image sensor 25A and the pixels of the display 20. A digital zoom in function is shown in FIG. 14. To perform a digital zoom in, a pixel of the image sensor 25A is mapped to a plurality of neighbouring pixels of the display. Interpolation algorithms may be used to improve appearance. A digital zoom out function is shown in FIG. 15. To perform a digital zoom out, a plurality of pixels of the image sensor 25A are mapped to a pixel of the display 20. Digital zooming may be required to compensate for the position of the imaging apparatus relative to the user's eyes. For example, if the eye-to-lens distance is longer than normal, a digital zoom in may be required. Similarly, if the eye-to-lens distance is shorter than normal, a digital zoom out may be required.
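    The pixel mapping described above can be sketched with a nearest-neighbour version (a real system would apply the interpolation mentioned above; the function name and the centre-anchored sampling are illustrative assumptions):

```python
def digital_zoom(img, factor):
    """Nearest-neighbour digital zoom about the image centre. factor > 1 maps one
    sensor pixel to several display pixels (zoom in); factor < 1 maps several
    sensor pixels to one display pixel (zoom out)."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):                    # display resolution stays fixed
        sy = min(h - 1, max(0, int((y - h / 2) / factor + h / 2)))
        row = []
        for x in range(w):
            sx = min(w - 1, max(0, int((x - w / 2) / factor + w / 2)))
            row.append(img[sy][sx])
        out.append(row)
    return out
```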

    [0079] Position and/or size of the displayed region may be selected by manual control. For example, a user can manually enlarge (zoom in) or shrink (zoom out) the image based on their own needs and visual experience. A user interface to control the zoom function may be provided on the imaging apparatus 5 (e.g. buttons on arms 12, 13, FIG. 4). Additionally, or alternatively, a user interface to control the zoom function may be provided on a handheld control unit that may be physically attached (e.g. via a cable) to the imaging apparatus 5. Additionally, or alternatively, a user interface to control the zoom function may be provided on a portable device which communicates wirelessly (e.g. using a wireless transmission protocol such as Bluetooth™). When the user interacts with the control, such as by manipulating a button, knob, slider or graphical user interface (GUI), this instructs the processing unit 40 to enlarge or shrink the image as described above.

    [0080] In another situation, the zoom level may be preconfigured for the wearer by a qualified technician or clinician based on factors such as: the shape of the user's face; the distance from the eye to the lens 27 when the imaging apparatus is worn by the user.

    [0081] The camera lens 25B is a wide angle lens. This type of lens inevitably has non-ideal optical properties. FIG. 13 shows the effects of optical barrel distortion on a rectilinear grid. Barrel distortion has the effect of causing straight lines to appear curved. The barrel distortion is worst at the periphery of the FOV, and is most pronounced at the corners of a rectangular image. Barrel distortion (and other forms of optical distortion) may be corrected to some extent in the digital domain by the processing unit 40. However, this is computationally expensive, wastes power and increases the system latency, which is particularly significant for portable and wearable systems. Cropping the image sensor FOV has the effect of discarding the most heavily distorted region of the camera lens 25B's field of view, while avoiding this computationally expensive processing.
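    The radial character of barrel distortion, and why cropping the periphery avoids the worst of it, can be shown with the standard single-coefficient radial model (the coefficient value below is an illustrative assumption):

```python
def distort_radius(r, k=-0.2):
    """Barrel distortion: a point at normalised radius r from the optical axis is
    imaged at r * (1 + k * r**2); k < 0 bows straight lines outward (barrel)."""
    return r * (1 + k * r * r)

# The displacement |k| * r**3 grows with the cube of the radius, so a central
# crop discards exactly the region where correction would be most expensive.
```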

    [0082] As described above, the imaging apparatus can have a single camera, such as a single camera which is centrally-mounted on the front of the frame 10 or housing. The single camera has a FOV which is sufficient to provide images to each display. For example, to provide a FOV to each eye of 60 degrees, the single camera may have a FOV of 80 degrees. An output of the single camera is processed to provide an image to the left eye display 20 and to the right eye display 30. The images displayed by each display 20, 30 can have the same unity gain factor described above. That is, the left eye display 20 is configured to display an image having a first angular field of view of the scene in front of the left eye on the left eye display device, and the imaging apparatus is configured to provide to the user an angular field of view of the left eye display device which is the same as the first angular field of view. Similarly, the right eye display 30 is configured to display an image having a first angular field of view of the scene in front of the right eye on the right eye display device, and the imaging apparatus is configured to provide to the user an angular field of view of the right eye display device which is the same as the first angular field of view. This gives continuity between the displayed image and the real world, and continuity between the displayed image and the surrounding environment visible through the open regions 16, 17 to the side of the lenses 27, 37.

    [0083] Another aspect of the disclosure may be understood with reference to the following numbered clauses.

    [0084] 1. A head mountable imaging apparatus for assisting a user with reduced vision comprising:

    [0085] a first display device configured to provide a display to a first eye of the user;

    [0086] a first lens provided on a user side of the first display device, the first lens configured to form a focused image of the first display device;

    [0087] a first camera configured to provide an output representing a scene in front of the imaging apparatus;

    [0088] a processor configured to receive the output from the first camera and to provide an output to the first display device for displaying an image to the user,

    [0089] wherein the imaging apparatus is configured to display an image having a first angular field of view of the scene on the first display device, and the imaging apparatus is configured to provide to the user an angular field of view of the first display device which is the same as the first angular field of view. That is, the imaging apparatus is configured to provide to the user an angular field of view of the image displayed on the first display device which is the same as the first angular field of view.

    [0090] 2. An apparatus according to clause 1 comprising an open region adjacent the first lens such that a user is able to view a combination of an image on the first display device and the surrounding environment outside the imaging apparatus.

    [0091] 3. An apparatus according to clause 1 or 2 wherein the first camera is positioned on an outer, forward-facing, side of the imaging apparatus in front of the first display device.

    [0092] 4. An apparatus according to clause 3 wherein the first camera is aligned with a central axis of the first display device.

    [0093] 5. An apparatus according to clause 4 wherein the first camera is substantially aligned with an optical axis of the user's first eye in a rest position.

    [0094] 6. An apparatus according to any one of the preceding clauses wherein the angular field of view of the first display device provided to a user is based on an expected distance between the first lens and a position of a user's eye.

    [0095] 7. An apparatus according to any one of the preceding clauses wherein the first lens is a Fresnel lens or an aspheric lens.

    [0096] 8. An apparatus according to any one of the preceding clauses wherein a distance between the first display and the first lens is less than a diameter or height of the first lens.

    [0097] 9. An apparatus according to clause 8 wherein a distance between the first display and the first lens is between one half and two thirds of the diameter or height of the first display.

    [0098] 10. An apparatus according to any one of the preceding clauses wherein the first display is circular or elliptical.

    [0099] 11. An apparatus according to any one of the preceding clauses wherein the first camera has a first image sensor, and wherein the processor is configured to obtain the output from a selected region of the first image sensor which is a subset of an overall area of the first image sensor.

    [0100] 12. An apparatus according to clause 11 wherein the first image sensor has a rectangular shape.

    [0101] 13. An apparatus according to clause 11 or 12 wherein the first image sensor has an x-axis and a y-axis and wherein the processor is configured to vary the position of the selected region in at least one of the x-axis and the y-axis.

    [0102] 14. An apparatus according to clause 13 wherein the processor is configured to vary the position of the selected region based on a user input.

    [0103] 15. An apparatus according to any one of clauses 11 to 14 wherein the processor is configured to vary a size of the selected region to adjust the first angular field of view of the image displayed on the first display device.

    [0104] 16. An apparatus according to clause 15 wherein the processor is configured to vary a size of the selected region based on a user input.

    [0105] 17. An apparatus according to any one of the preceding clauses comprising:

    [0106] a second display device configured to provide a display to a second eye of the user;

    [0107] a second lens provided on a user side of the second display device, the second lens configured to form a focused image of the second display device;

    [0108] a second camera configured to provide an output representing a scene in front of the imaging apparatus;

    [0109] a processor configured to receive the output from the second camera and to provide an output to the second display device for displaying an image to the user,

    wherein the imaging apparatus is configured to display an image having a second angular field of view of the scene on the second display device, and the imaging apparatus is configured to provide to the user an angular field of view of the second display device which is the same as the second angular field of view.

    [0110] 18. An apparatus according to clause 17 wherein the first angular field of view is equal to the second angular field of view.

    [0111] Throughout the description and claims of this specification, the words “comprise” and “contain” and variations of the words, for example “comprising” and “comprises”, mean “including but not limited to”, and are not intended to (and do not) exclude other moieties, additives, components, integers or steps.

    [0112] Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.

    [0113] Features, integers or characteristics described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith.