Display Apparatus
20230116084 · 2023-04-13
Assignee
Inventors
- Franz Josef Maier (Poertschach am Woerthersee, AT)
- Mauel Dorfmeister (Wiesen, AT)
- Joerg REITTERER (Brunn Am Gebirge, AT)
- Louahab NOUI (East Sussex, GB)
CPC classification (all in section G, Physics)
- G09G2320/0247
- G02B26/085
- G02B2027/0147
- G02B2027/0187
- G09G2320/0233
- G09G2360/18
- G09G2340/0407
- G02B27/0179
International classification
Abstract
A display apparatus comprises a mirror oscillating about a first axis upon excitation by a first excitation signal and about a second axis upon excitation by a second excitation signal, a light source projecting a light beam onto the mirror for deflection towards an image plane, the light source being controlled according to pixels read-out by an image processor from a buffer, a gaze tracker detecting a user’s region of interest, ROI, within the image plane, and a controller modulating one of the excitation signals by a modulation signal which is dependent on the ROI such that the number of passes of the light beam per unit area is higher in the ROI than in a region outside thereof, wherein the number of pixels read-out per unit area by the image processor is higher in the ROI than in a region outside of the ROI.
Claims
1. A display apparatus, comprising: a mirror assembly, wherein a first mirror of the mirror assembly is configured to oscillate about a first axis upon excitation by a first excitation signal of a first frequency and wherein the first or a second mirror of the mirror assembly is configured to oscillate about a second axis upon excitation by a second excitation signal of a second frequency; a light source configured to project a light beam onto the mirror assembly for deflection by the mirror assembly towards an image plane, the light source having an input via which it can be controlled according to pixels of an image frame to be displayed on the image plane for a frame period, the pixels being read-out by an image processor from a buffer storing the image frame and applied sequentially to the input of the light source; a gaze tracker configured to detect a user’s region of interest, ROI, within the image plane by tracking a user’s gaze; and a controller connected to the gaze tracker and configured to modulate one of the first and second excitation signals by a first modulation signal which is dependent on the ROI detected by the gaze tracker such that the number of passes of the light beam per unit area of the image plane and per frame period is higher in the ROI than in a region outside of the ROI; wherein the image processor is connected to at least one of the gaze tracker and the controller and configured to control the number of pixels read-out per unit area and per frame period such that said number of pixels is higher in the ROI than in a region outside of the ROI.
2. The display apparatus according to claim 1, wherein the image processor is configured to receive a control signal from said at least one of the gaze tracker and the controller, wherein said control signal indicates a pixel read-out ratio to be used by the image processor when reading-out the pixels.
3. The display apparatus according to claim 1, further comprising a renderer which renders the image frame before it is stored in the buffer, wherein the renderer is configured to receive a control signal from said at least one of the gaze tracker and the controller, and wherein said control signal indicates a rendering resolution to be used by the renderer when rendering the image frame.
4. The display apparatus according to claim 1, wherein the image frame is one of a series of image frames to be displayed on the image plane with a frame rate, and wherein the frequency of the first modulation signal is a onefold or higher integer multiple of the frame rate.
5. The display apparatus according to claim 1, wherein the controller is configured to decrease the intensity of the light beam for pixels within a region of increased pixel resolution.
6. The display apparatus according to claim 1, wherein the controller is configured to amplitude-modulate said one of the first and second excitation signals by the first modulation signal.
7. The display apparatus according to claim 1, wherein the gaze tracker is configured to detect the ROI by predicting the ROI from an analysis of a past track of the user’s gaze on the image plane.
8. The display apparatus according to claim 7, wherein the gaze tracker is configured to determine a duration of a past saccade of the user’s gaze from the analysis, wherein the controller is configured to complete a change of the first modulation signal from one ROI to another ROI within that duration.
9. The display apparatus according to claim 1, wherein the display apparatus is configured to be head-mounted and the gaze tracker is an eye tracker.
10. The display apparatus according to claim 1, wherein the controller has a memory with a look-up table which stores, for each one of a set of different ROIs within the image plane, a respective first modulation signal dependent on that ROI, wherein the controller is configured to retrieve the first modulation signal dependent on the detected ROI from the look-up table.
11. The display apparatus according to claim 1, wherein the controller is configured to modulate the other one of the first and second excitation signals by a second modulation signal which is dependent on the ROI detected by the gaze tracker such that the number of passes of the light beam per unit area of the image plane and per frame period is higher in the ROI than in a region outside of the ROI.
12. The display apparatus according to claim 11, wherein the image frame is one of a series of image frames to be displayed on the image plane with a frame rate, and wherein the frequency of the second modulation signal is a onefold or higher integer multiple of the frame rate.
13. The display apparatus according to claim 11, wherein the controller is configured to amplitude-modulate said other one of the first and second excitation signals by the second modulation signal.
14. The display apparatus according to claim 10, wherein the controller is configured to modulate the other one of the first and second excitation signals by a second modulation signal which is dependent on the ROI detected by the gaze tracker such that the number of passes of the light beam per unit area of the image plane and per frame period is higher in the ROI than in a region outside of the ROI, and wherein the look-up table stores, for each one of a set of different ROIs within the image area, a respective second modulation signal dependent on that ROI, wherein the controller is configured to retrieve also the second modulation signal dependent on the detected ROI from the look-up table.
15. The display apparatus according to claim 11, wherein at least one of the first and second modulation signals is a triangular or saw-tooth signal with an offset, wherein the slopes of the saw-teeth or triangles and the offset depend on the detected ROI.
16. The display apparatus according to claim 11, wherein at least one of the first and second modulation signals is a sequence of sine halves with an offset, wherein the amplitudes of the sine halves and the offset depend on the detected ROI.
17. The display apparatus according to claim 11, wherein at least one of the first and second modulation signals is a repetition of a step-like function comprised of a first and a last section with high slope, a middle section with low slope, and an offset, wherein the respective slopes and lengths of the sections and the offset depend on the detected ROI.
18. The display apparatus according to claim 11, wherein at least one of the first and second modulation signals is a repetition of a step function comprised of at least two sections of different respective constant values, wherein the respective values and lengths of the sections depend on the detected ROI.
19. The display apparatus according to claim 1, wherein the controller is configured such that said number of passes is, in an intermediate region between the ROI and said region outside of the ROI, lower than in the ROI and higher than in said region outside, and wherein the image processor is configured such that said number of pixels read-out is, in said intermediate region, lower than in the ROI and higher than in said region outside.
20. The display apparatus according to claim 19, wherein there are several adjacent intermediate regions of gradually decreasing numbers of passes and numbers of read-out pixels.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The disclosed subject matter will now be described by means of exemplary embodiments thereof with reference to the enclosed drawings.
DETAILED DESCRIPTION
[0044] Reverting to
[0045] In the example of
[0046] Instead of the semi-transparent combiner 6 the display apparatus 1 could be used with any other image plane 4, e.g., a conventional reflective projection screen such as a miniature screen mounted on the frame of virtual reality (VR) glasses, or a projection wall or a movie screen, for example when the display apparatus 1 is used as a miniature (or full-scale) video beamer. The display apparatus 1 could even be used to directly project the image frames 2 into the user’s eye 5, i.e., use the retina of the eye 5 as image plane 4, optionally with suitable optics therebetween. It goes without saying that the image plane 4 can have any form, including a curved form such as the retina of the eye 5.
[0047] The display apparatus 1 comprises a light source 11 which emits a light beam 12. The light source 11 can be of any kind including gas lasers, fibre lasers, semiconductor lasers etc. For miniaturisation the light source 11 may employ LEDs, micro LEDs or laser diodes, e.g., edge-emitting laser diodes or surface-emitting laser diodes. For colour pixels 3, the light source 11 may be a polychromatic light source 11, e.g., a set of laser diodes of three primary colours which emit a light beam 12 comprised of three different wavelengths for colour perception. The light beam 12 carries the image frames 2 in a time-multiplexed manner, i.e., it carries the intensity (luminance) and/or colour values of the pixels 3 one after the other, in the sequence in which the pixels 3 are “painted” on the image plane 4 when the light beam 12 is moved along a trajectory over the image plane 4, as will be explained further on.
[0048] To control the intensity and/or colour of the light beam 12 pixelwise, the light source 11 has a control input 13 (
[0049] As the encoding of the pixels 3 in the image frames 2 is usually different from the order the pixels 3 are drawn by the light beam 12 on the image plane 4, the image processor 14a reads-out (“picks”) the respective intensity and/or colour values for each pixel 3 in a random-access manner – schematically shown by arrow R – from the buffer 14b and applies them sequentially, i.e., in the drawing order of the pixels 3 along the light beam’s trajectory on the image plane 4, to the input 13 of the light source 11.
[0050] To move (scan) the light beam 12 along its trajectory over the image plane 4, the display apparatus 1 comprises a mirror assembly 16, here: one single micro-electro-mechanical-system (MEMS) mirror, downstream of the light source 11 in the path of the light beam 12. The mirror assembly 16 deflects the light beam 12 into subsequent directions (angles) towards the image plane 4. Optionally, additional optics or waveguides can be interposed in the path of the light beam 12 from the light source 11 via the mirror assembly 16 to the image plane 4.
[0051] As shown in
[0052] To induce the oscillation of the mirror 17 about the first axis 19 a first actuator 21 acts between the mirror 17 and the support 18. The actuator 21 may be a coil attached to the mirror 17 and lying in a magnetic field of the support 18, through which coil a first excitation signal S.sub.1 (here: an excitation current) is passed. For inducing the oscillation of the mirror 17 about the second axis 20 a second actuator 22 acts between the mirror 17 and the support 18, e.g., also a coil, through which a second excitation signal S.sub.2 is passed. The excitation signals S.sub.1, S.sub.2 are obtained from signal generators 23, 24 which may be external or internal to the display apparatus 1 and may be a part of the MEMS mirror 16 or the controller 14. Instead of electromagnetic actuators 21, 22 with coils any other type of actuators for driving the oscillations of the mirror 17 about the two axes 19, 20 can be used, e.g., electrostatic, piezoelectric, electrothermal or magnetostrictive actuators.
[0053] The frequencies f.sub.1 and f.sub.2 of the two excitation signals S.sub.1 and S.sub.2 are chosen such that the mirror 17 oscillates about each axis 19, 20 at – or nearly at – the resonance frequency of the respective articulation of the mirror 17 on the support 18 (or a multiple thereof, e.g., a harmonic frequency of higher order). The resonance frequency or natural harmonics about the respective axis 19, 20 is defined by the mass distribution of the mirror 17 about that axis 19, 20, the spring forces and frictional resistances of the articulations of the mirror 17 about that axis 19, 20, and the magnetic, electrostatic, etc. counterforces of the actuators 21, 22. By oscillating the mirror 17 about the axes 19, 20 at – or in the vicinity of – its resonance frequency about the respective axis 19, 20 a large amplitude of the mirror movement (a large angular sway) can be achieved with small excitation signals S.sub.1, S.sub.2, i.e., of low power or low amplitude, which allows to use particularly small actuators with small moving masses and high resonance frequencies.
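The gain obtained by driving at (or near) resonance, and the drop in angular sway when the excitation frequency is detuned from it, can be illustrated with the standard driven damped harmonic oscillator model. The following sketch uses purely hypothetical values (a 10 kHz resonance and a quality factor of 100) that are illustrative only and not limitations of the apparatus:

```python
def resonant_amplitude(f_drive, f_res, q, a0=1.0):
    """Steady-state amplitude of the mirror oscillation about one axis,
    modelled as a driven damped harmonic oscillator: the response peaks
    near the resonance frequency f_res, with a peak height and sharpness
    set by the quality factor q (a0 is the static, low-frequency sway)."""
    r = f_drive / f_res
    return a0 / ((1.0 - r * r) ** 2 + (r / q) ** 2) ** 0.5

# Hypothetical mirror axis: 10 kHz resonance, quality factor 100.
at_resonance = resonant_amplitude(10_000, 10_000, 100)  # q-fold the static sway
detuned = resonant_amplitude(10_300, 10_000, 100)       # markedly smaller
```

This also underlies the later observation that moving the excitation frequency away from resonance is one way to reduce the oscillation amplitude.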
[0054] To excite and maintain the resonant oscillations of the mirror 17 about the axes 19, 20 the excitation signals S.sub.1, S.sub.2 can be of any form, e.g., pulse signals which trigger the mirror oscillations every oscillation period, every other oscillation period or even less often. However, usually the frequencies f.sub.1, f.sub.2 of the excitation signals S.sub.1, S.sub.2 will be the same as the oscillation frequencies of the mirror 17 about the axes 19, 20.
[0055] Most commonly, sinusoidal excitation signals S.sub.1, S.sub.2 will be used, as shown in
[0056] The frequencies f.sub.1 and f.sub.2 of the excitation signals S.sub.1 and S.sub.2 are chosen such that the trajectory of the light beam 12 on the image plane 4 is a Lissajous figure which densely covers the entire image plane 4 during the period 1/f.sub.fr of one image frame 2. Such a “complex” or “dense” Lissajous figure can be achieved when the frequencies f.sub.1, f.sub.2 are greater than the frame rate f.sub.fr, e.g., greater than 1 kHz or tens of kHz, and the beginnings B of their respective oscillation periods T.sub.i = 1/f.sub.1, T.sub.j = 1/f.sub.2 (i = 1, 2, ...; j = 1, 2, ...; see
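The dense coverage of the image plane by such a Lissajous trajectory can be illustrated numerically. The following sketch, with hypothetical frequencies in the 10 kHz range and a 60 Hz frame rate, estimates which fraction of a coarse grid over the image plane is visited by the beam within one frame period, and compares it against a simple 1:2 Lissajous figure:

```python
import math

def lissajous_point(t, f1, f2, a1=1.0, a2=1.0, phi=0.0):
    """Beam deflection (x, y) on the image plane at time t for a mirror
    oscillating sinusoidally about two perpendicular axes."""
    return (a1 * math.sin(2 * math.pi * f1 * t),
            a2 * math.sin(2 * math.pi * f2 * t + phi))

def coverage(f1, f2, frame_period, n_samples=150_000, grid=32):
    """Fraction of a grid x grid partition of the image plane that the
    trajectory visits within one frame period (a crude density measure)."""
    visited = set()
    for i in range(n_samples):
        x, y = lissajous_point(frame_period * i / n_samples, f1, f2)
        visited.add((min(int((x + 1) / 2 * grid), grid - 1),
                     min(int((y + 1) / 2 * grid), grid - 1)))
    return len(visited) / (grid * grid)

# Hypothetical frequencies: two near-resonant axes around 10 kHz, slightly
# detuned against each other, with a 60 Hz frame rate.
dense = coverage(10_000, 10_060, 1 / 60)   # complex, dense Lissajous figure
sparse = coverage(10_000, 20_000, 1 / 60)  # simple 1:2 figure for comparison
```

The slight detuning of the two frequencies against each other is what makes the figure "complex" and lets it sweep essentially the whole image plane within a frame period.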
[0057] Alternatively, instead of the single mirror 17 oscillating about two axes 19, 20, the mirror assembly 16 could comprise two mirrors (not shown) each of which oscillates about a respective one of the (e.g. perpendicular) axes 19, 20 in dependence on the respective excitation signal S.sub.1, S.sub.2 for successive deflection of the light beam 12. Of course, any of the embodiments described herein may be carried out for this variant as well.
[0059] The local pixel resolution achievable by the number of differing passes of the light beam 12 per unit area U of the image plane 4 and per frame period T.sub.fr is depicted in
[0060] In
[0061] In the example shown in
[0062] The amplitude of oscillations of the mirror 17 about the axes 19, 20 can be altered in different ways, for example, by changing the amplitude of the excitation signals S.sub.1, S.sub.2; by moving the frequencies f.sub.1, f.sub.2 of the excitation signals S.sub.1, S.sub.2 further away from the respective resonance frequency of the mirror 17 about the respective axis 19, 20, which leads to a drop of the oscillation amplitude from its maximum at resonance; by reducing the pulsewidth of a pulsed excitation signal S.sub.1, S.sub.2; etc. In general, the amplitude of the mirror oscillation about any of the two axes 19, 20 can be varied by amplitude modulation, frequency modulation, pulsewidth modulation or phase modulation of the respective excitation signal S.sub.1, S.sub.2 with a respective modulation signal M.sub.1, M.sub.2 generated by the controller 15.
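As a minimal illustration of one of these options, amplitude modulation of an excitation signal by a slow modulation signal can be sketched as follows; the 10 kHz excitation frequency, 60 Hz modulation frequency and 50% modulation depth are hypothetical example values:

```python
import math

def am_excitation(t, f_exc, modulation, depth=0.5):
    """Amplitude-modulated excitation signal:
    s(t) = (1 + depth * m(t)) * sin(2*pi*f_exc*t),
    where m(t) in [-1, 1] is the slow modulation signal that varies the
    mirror's angular sway (and hence the trajectory size) over time."""
    return (1.0 + depth * modulation(t)) * math.sin(2 * math.pi * f_exc * t)

# Hypothetical 10 kHz excitation, modulated at a 60 Hz frame rate;
# sampled at 1 MHz over 1 ms.
m = lambda t: math.sin(2 * math.pi * 60 * t)
samples = [am_excitation(i / 1_000_000, 10_000, m) for i in range(1000)]
```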
[0064] By varying the amplitude of the oscillations of the mirror 17 about the two axes 19, 20 and hence the current maximum size of the trajectory L while it is drawn to “build-up” a frame 2, as it is shown in
[0065] Therefore, for a specific area 25.sub.H of high pixel resolution R.sub.xy to achieve, the controller 15 calculates – in addition to specific modulation signals M.sub.1, M.sub.2 for the excitation signals S.sub.1, S.sub.2 – a respective control signal CS.sub.xy to control the pixel read-out ratio of the image processor 14a. The control signal CS.sub.xy indicates the number of pixels 3 the image processor 14a reads out from the image buffer 14b per local unit area U of the image plane 4 and per frame period T.sub.fr. This is shown in detail in
[0066] As shown in
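The effect of the control signal CS.sub.xy on the pixel read-out can be sketched as a simple subsampling rule: every pixel is read out inside the region of increased resolution, and only every n-th pixel per axis outside of it. The frame size, subsampling steps and ROI rectangle below are hypothetical:

```python
def read_out(frame, roi, full_step=1, low_step=4):
    """Foveated pixel read-out sketch: inside the ROI every pixel is read
    out; outside, only every low_step-th pixel per axis (subsampling).
    frame is a 2-D list of pixel values; roi = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = roi
    picked = []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            step = full_step if (x0 <= x < x1 and y0 <= y < y1) else low_step
            if x % step == 0 and y % step == 0:
                picked.append((x, y, value))
    return picked

# Hypothetical 8 x 8 frame with a 2 x 2 region of full resolution.
picked = read_out([[0] * 8 for _ in range(8)], roi=(2, 2, 4, 4))
```

In this toy case all four pixels inside the ROI are read out, but only every fourth pixel per axis outside of it.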
[0067] Alternatively, the image buffer 14b may hold the image frame 2 already in a locally varying resolution according to the regions 25.sub.L and 25.sub.H. This means that the image frame 2 is only generated in high resolution for the region 25.sub.H (or at least the ROI 26) by, e.g., a renderer 14c which renders the image frame 2 from, e.g., abstract object data D such as 3D models, 3D animations etc. For the region 25.sub.L (or any region outside of the ROI 26) the renderer 14c may render the image frame/s 2 in a reduced pixel resolution, saving both processing power in the renderer 14c and memory space in the buffer 14b. The information as to which parts of the image frame/s 2 shall be rendered in high or low resolution is fed to the renderer 14c in the form of the control signal CS.sub.xy by the controller 15.
[0068] To further save processing power in the renderer 14c, the renderer 14c may even abstain from rendering the image frame/s 2 in high resolution for those areas within the regions 25.sub.H which are not currently used for the ROI 26, see grey pixels 3 in the region 25.sub.H outside of the ROI 26 in
[0069] The above-mentioned modulation of the excitation signals S.sub.1, S.sub.2 by the modulation signals M.sub.1, M.sub.2 is now selectively used in the display apparatus 1 to increase the local pixel resolution R.sub.xy in a user’s region of interest, ROI, 26 (
[0070] In the head-mounted embodiment of the display apparatus 1 shown in
[0071] Parts of the gaze tracker 28, in particular its processing components, may be implemented by the controller 15 and/or the image processor 14a. In general, the controller 15, the image processor 14a, the image buffer 14b, the optional renderer 14c and the gaze tracker 28 may be implemented individually by separate electronic components or integrated into each other as needed, i.e., share common electronic components.
[0072] The gaze tracker 28 can work according to any principle known in the art, e.g., by eye-attached tracking with special contact lenses worn by the user which have embedded mirrors or sensors, or by optical tracking of corneal or retinal reflections of visible or invisible light rays. Most commonly, the gaze tracker 28 will be implemented optically, e.g., by means of a camera directed at the user’s eye or eyes to view and track the gaze 27. Such a video camera can be used either for eye tracking (when head-mounted) or for gaze tracking from a stationary point in the environment 10, when it views and analyses both eye movement and head movement.
[0073] The detection of the user’s ROI 26 on the image plane 4 is used to move the area 25.sub.H of increased pixel resolution R.sub.xy, achieved by the current modulation of the excitation signals S.sub.1, S.sub.2 applied and the current pixel read-out ratio of the image processor 14a, into the detected ROI 26. This is shown in
[0074] In a first variant shown in
[0075] As can be seen from
[0076] In a second variant shown by the sequence of
[0077] From an analysis of the past track 29 of the gaze 27 the gaze tracker 28 can then predict the current ROI 26 for a frame 2 to display, and the controller 15 can – even pre-emptively – change the modulation signals M.sub.1, M.sub.2 so that the ROI 26 will always be hit or covered by a region 25 of increased pixel resolution R.sub.xy. Concurrently, the corresponding control signal CS.sub.xy is issued by the gaze tracker 28 or the controller 15 to the image processor 14a and optionally the renderer 14c. The gaze tracker 28 can even predict a next saccade x.sub.i+1 from an analysis of the track 29, particularly from a past sequence of fixations p.sub.i and saccades x.sub.i, in order to adjust the display apparatus 1 for the next saccade x.sub.i+1 of the user’s gaze 27.
[0078] When predicting the ROI 26, the gaze tracker 28 can not only predict the location of the ROI 26 but optionally also the size of the ROI 26. For instance, the size of the ROI 26 can be determined in dependence on a calculated location prediction uncertainty, e.g., in order to have a larger size of the ROI 26 in case of a higher location prediction uncertainty.
[0079] Furthermore, the gaze tracker 28 can be configured to determine a duration (or average duration) d.sub.i of one (or more) past saccades x.sub.i, and the controller 15 can be configured to complete a change from a first set of modulation signals M.sub.1, M.sub.2 – which achieves, e.g., the region 25.sub.H of
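One simple way to obtain saccade durations from a sampled gaze track is a velocity-threshold classifier, sketched below; the 1 kHz sampling interval and the threshold value are hypothetical, and this classifier is only one of several known possibilities:

```python
import math

def saccade_durations(track, dt, v_thresh):
    """Durations of saccades in a sampled gaze track, using a simple
    velocity-threshold classifier: each run of consecutive samples whose
    speed exceeds v_thresh is counted as one saccade.
    track is a list of (x, y) gaze points sampled every dt seconds."""
    durations, run = [], 0
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        if speed > v_thresh:
            run += 1
        elif run:
            durations.append(run * dt)
            run = 0
    if run:
        durations.append(run * dt)
    return durations

# Hypothetical 1 kHz gaze samples: fixation, a fast 3-sample jump, fixation.
track = ([(0.0, 0.0)] * 5
         + [(0.5, 0.0), (1.0, 0.0), (1.5, 0.0)]
         + [(1.5, 0.0)] * 5)
d = saccade_durations(track, dt=0.001, v_thresh=100.0)
```

A duration determined in this way can then serve as the time budget within which the controller completes the change of the modulation signals.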
[0080] The modulation signals M.sub.1, M.sub.2 and control signals CS.sub.xy required to achieve a specific region 25.sub.H of increased pixel resolution R.sub.xy that hits or covers the ROI 26, i.e., the dependencies of the signals M.sub.1, M.sub.2, CS.sub.xy on the detected ROI 26, can be programmed into the controller 15 (and/or the gaze tracker 28 and/or the image processor 14a and/or the renderer 14c) in the form of a formula. Alternatively, as shown in
[0082] For a specific ROI 26 detected, the controller 15 looks up the region 25.sub.H into which the ROI 26 falls (or which falls into that ROI 26) and retrieves from the correspondence between the elements of the matrices 32, 33 the corresponding first and second modulation signals M.sub.1, M.sub.2. The controller 15 then modulates the excitation signals S.sub.1, S.sub.2 with the modulation signals M.sub.1, M.sub.2 retrieved from the look-up table 31. Concurrently, the controller 15 (or the gaze tracker 28) controls the image processor 14a – and optionally the renderer 14c – such that the number of pixels 3 read-out per unit area U by the image processor 14a is correspondingly higher in the ROI 26 than outside of the ROI 26.
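The look-up mechanism can be sketched as follows. The 2 x 2 grid of regions and the parameter dictionaries standing in for the stored modulation signals M.sub.1, M.sub.2 are hypothetical placeholders; a real table would store whatever representation of the signals the modulators accept:

```python
# Hypothetical look-up table: a 2 x 2 grid of regions of increased
# resolution, each mapped to placeholder parameter sets standing in for
# the first and second modulation signals M1, M2.
LUT = {
    (0, 0): ({"offset": 0.2, "slope": 1.0}, {"offset": 0.2, "slope": 1.0}),
    (0, 1): ({"offset": 0.2, "slope": 1.0}, {"offset": 0.6, "slope": 0.5}),
    (1, 0): ({"offset": 0.6, "slope": 0.5}, {"offset": 0.2, "slope": 1.0}),
    (1, 1): ({"offset": 0.6, "slope": 0.5}, {"offset": 0.6, "slope": 0.5}),
}

def region_for_roi(roi_centre, regions_per_axis=2):
    """Grid region of the look-up table into which a detected ROI centre
    (normalised to [0, 1) per axis) falls."""
    x, y = roi_centre
    return (int(y * regions_per_axis), int(x * regions_per_axis))

def modulation_for_roi(roi_centre):
    """Retrieve both modulation-signal entries for the detected ROI."""
    return LUT[region_for_roi(roi_centre)]

m1, m2 = modulation_for_roi((0.75, 0.25))  # ROI centre in the top-right region
```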
[0083] The corresponding control signal CS.sub.xy can instruct the image processor 14a to perform the reduced subsampling or non-subsampling – and optionally the renderer 14c to perform its full-resolution rendering – in the entire region 25.sub.H or only in the ROI 26. The control signal CS.sub.xy may, e.g., be a first predefined pixel read-out ratio and/or rendering resolution for the region 25.sub.H or ROI 26 and a second predefined pixel read-out ratio and/or rendering resolution for the region 25.sub.L. When generated by the controller 15, each corresponding control signal CS.sub.xy may be stored in the look-up table 31 together with the respective modulation signals M.sub.1, M.sub.2.
[0084] To perform the modulation, the display apparatus 1 may have discrete modulators 34, 35 receiving the excitation signals S.sub.1, S.sub.2 from the signal generators 23, 24 on the one hand and the modulation signals M.sub.1, M.sub.2 from the controller 15 on the other hand. Alternatively, the signal generators 23, 24 and modulators 34, 35 can be implemented by processing elements within the controller 15.
[0085] In general, different types of modulation signals M.sub.1, M.sub.2, can be used which lead to different shapes and sizes of regions 25 of increased pixel resolution R.sub.xy. Instead of the saw-tooth signals of
[0086] In
[0087] In
[0088] In an optional variant (not shown), the modulation signal M.sub.1 or M.sub.2 is a repetition of a step function comprised of at least two sections of different respective constant values, wherein the respective values and lengths of the sections depend on the detected ROI 26. It goes without saying that each of the excitation signals S.sub.1, S.sub.2 can be modulated with the same or different modulation signals M.sub.1, M.sub.2, i.e., with modulation signals M.sub.1, M.sub.2 of different frequencies, shapes, amplitudes and offsets.
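The waveform families mentioned for the modulation signals (a saw-tooth with an offset, a sequence of sine halves with an offset, and a repeated step function) can be sketched as simple generator functions; all parameter values here are illustrative:

```python
import math

def sawtooth(t, period, offset, amplitude):
    """Saw-tooth modulation signal with an offset: a linear ramp from
    offset to offset + amplitude over each period, then a reset."""
    return offset + amplitude * ((t % period) / period)

def sine_halves(t, period, offset, amplitude):
    """Sequence of positive sine half-waves riding on an offset."""
    return offset + amplitude * math.sin(math.pi * ((t % period) / period))

def step_signal(t, period, sections):
    """Repeated step function: sections is a list of
    (fraction_of_period, constant_value) pairs whose fractions sum to 1."""
    phase = (t % period) / period
    acc = 0.0
    for fraction, value in sections:
        acc += fraction
        if phase < acc:
            return value
    return sections[-1][1]
```

In each case the slopes, amplitudes, section lengths and offsets are the quantities that would be made dependent on the detected ROI.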
[0089] The region 25.sub.H of increased pixel resolution and/or the ROI 26 may be perceived by the user with an increased light intensity. This may be a desirable effect if the attention of the user shall be particularly drawn to the region 25.sub.H or the ROI 26, respectively. On the other hand, if this is an undesirable effect which shall be countered, the controller 15 can optionally decrease the intensity of the light beam 12 via the control input 13 of the light source 11 for pixels 3 within the region 25.sub.H, or at least within the ROI 26, and increase it outside thereof.
[0090] In a further optional variant multiple different regions 25.sub.H,i (i = 1, 2, ...) with respective different subsampling ratios may be used. This can be done, e.g., to “smooth” the transition of the pixel resolution between the ROI 26 and its surrounding, or between the areas 25.sub.H and 25.sub.L, respectively, by using one or several adjacent “intermediate” regions 25.sub.H,i of gradually decreasing pixel resolution until the pixel resolution has reached that of the neighbouring region 25.sub.L.
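Such a graded transition can be sketched as a distance-dependent subsampling ratio with several intermediate rings; the thresholds and ratios below are hypothetical example values:

```python
def subsampling_ratio(distance, thresholds=(0.1, 0.2, 0.3), ratios=(1, 2, 4, 8)):
    """Graded subsampling: full resolution (ratio 1) inside the ROI,
    progressively coarser ratios in intermediate rings, and the coarsest
    ratio everywhere else. distance is the normalised distance of a pixel
    from the ROI centre; thresholds delimit the rings."""
    for threshold, ratio in zip(thresholds, ratios):
        if distance < threshold:
            return ratio
    return ratios[-1]
```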
[0091] The disclosed subject matter is not restricted to the specific embodiments described herein, but encompasses all variants, modifications and combinations thereof that fall within the scope of the appended claims.