METHOD FOR CALIBRATING AN AUGMENTED REALITY VISUAL RENDERING SYSTEM COMPRISING AT LEAST ONE DISPLAY DEVICE THAT IS PARTIALLY TRANSPARENT WITH RESPECT TO THE USER THEREOF, AND ASSOCIATED SYSTEM
20170322020 · 2017-11-09
Inventors
CPC classification
B60J9/00
PERFORMING OPERATIONS; TRANSPORTING
G02B27/0179
PHYSICS
G01B11/25
PHYSICS
International classification
G01B11/25
PHYSICS
Abstract
A method for calibrating an augmented reality visual rendering system comprising at least one display device partially transparent with respect to the user thereof comprises carrying out the steps of: displaying a two-dimensional calibration curve on a partially transparent screen of the display device; recording a three-dimensional curve described by the user by means of a three-dimensional pointing device, so that the trajectory described by the three-dimensional pointing device is aligned, from the viewpoint of the user, with the two-dimensional calibration curve displayed on the display device; and matching the three-dimensional curve and the two-dimensional calibration curve displayed on the display device. The method further comprises a step of calibrating the augmented reality visual rendering system by determining, on the basis of the matching, the parameters of a model representing the set comprising the eye of the user and the partially transparent screen of the display device at which the eye is looking.
Claims
1. A method for calibrating an augmented reality visual rendering system comprising at least one display device that is partially transparent with respect to the user thereof, comprising carrying out at least once the steps of: displaying a two-dimensional calibration curve on a partially transparent display device; recording a three-dimensional curve described by the user by means of a three-dimensional pointing device so that the trajectory described by the three-dimensional pointing device is aligned, according to the viewpoint of the user, with the two-dimensional calibration curve displayed on the display device; and matching the three-dimensional curve and the two-dimensional calibration curve displayed on the display device; and comprising a step of calibrating said augmented reality visual rendering system by determining the parameters of a model making it possible to represent the set comprising the eye of the user and the partially transparent screen of the display device at which the eye is looking, on the basis of said matching.
2. The method as claimed in claim 1, wherein the step of matching the three-dimensional curve and the two-dimensional calibration curve displayed on the display device comprises the sub-steps of: projecting the three-dimensional curve onto the display device so as to provide a two-dimensional projected curve and an association between points of the three-dimensional curve and points of the projected curve; and projecting said two-dimensional projected curve onto the corresponding calibration curve so as to provide an association between points of the two-dimensional projected curve and points of the corresponding calibration curve.
3. The method as claimed in claim 1, comprising, furthermore, in an iterative manner, the following steps of: re-estimating the matching of the three-dimensional curve or curves and the respective two-dimensional calibration curve or curves displayed on the display device, on the basis of the new parameters of the model which were determined during the last calibration step; and re-calibrating said augmented reality visual rendering system, on the basis of the last matching of the three-dimensional curve or curves and the respective two-dimensional calibration curve or curves displayed on the display device.
4. The method as claimed in claim 3, wherein said iteration is stopped when a convergence criterion is attained.
5. The method as claimed in claim 1, wherein the calibration curve or curves comprise at least one curve exhibiting at least one discontinuity, and/or at least one polygonal curve.
6. The method as claimed in claim 1, wherein the calibration curve or curves comprise at least one curve whose display on the display device guarantees that no point of the display device is situated at a distance from the curve exceeding 25% of the diagonal of the display device.
7. The method as claimed in claim 6, wherein the calibration curve or curves comprise at least one Hilbert curve, and/or at least one Peano curve, and/or at least one Lebesgue curve.
8. The method as claimed in claim 1, wherein said calibration is applied to an augmented reality glasses system.
9. The method as claimed in claim 1, wherein said calibration is applied to an augmented reality windshield system.
10. An augmented reality visual rendering system, adapted to implement the method as claimed in claim 1.
11. The augmented reality visual rendering system as claimed in claim 10, being an augmented reality glasses system or an augmented reality windshield system.
Description
[0038] The invention will be better understood on studying a few embodiments described by way of wholly non-limiting examples and illustrated by the appended drawings in which:
[0040] In all the figures, elements having identical labels are similar.
[0041] The example mainly described uses a single calibration curve, in a nonlimiting manner, since, as a variant, a plurality of calibration curves can be used successively.
[0045] A two-dimensional projected curve CP.sub.2D on the semi-transparent screen 3 may, for example, be used as intermediate curve to match the three-dimensional curve C.sub.3D and the two-dimensional calibration curve CE.sub.2D which is displayed on the semi-transparent screen 3.
[0046] It is considered that the pointing device 5 provides a 3D position expressed in the reference frame 4 rigidly tied to the partially transparent screen 3. For example, the stylus 5 and the glasses 3 are located by one and the same tracking device, such as a camera fixed on the glasses, or a magnetic or acoustic tracking device some of whose receivers are fixed on the glasses as well as on the stylus 5. Consequently, the positions of the glasses and of the stylus are known in a common reference frame 4 (that of the tracking device).
[0047] For the two-dimensional calibration curve CE.sub.2D displayed on the semi-transparent screen 3, the parametric or non-parametric representation is denoted U.sub.0.
[0048] The user is invited to move the 3D pointing device 5 in such a way that the trajectory described by the tip 6 of the latter and the 2D calibration curve displayed on the partially transparent screen 3 are aligned from the viewpoint of the eye of the user 1. The 3D trajectory is stored by the system; this trajectory is denoted V.sub.0. The pointing device 5 can be a stylus whose tip 6 can be located with the aid of a magnetic, optical, mechanical or acoustic location device, a laser tracking device, etc. The pointing device can also be the finger of the user, the end of which can be located with the aid of an optical system such as a 3D camera, a stereovision head, etc.
[0049] If an approximate calibration of the device is available, for example performed in the factory, then it can be used to carry out the sub-step of projecting the 3D curve C.sub.3D, i.e. the 3D trajectory of the pointer 5, onto the partially transparent screen 3, so as to provide a 2D projected curve CP.sub.2D and an association between points of the 3D curve and points of the 2D projected curve CP.sub.2D. Each point of the 3D trajectory C.sub.3D can then be associated with the point of the calibration curve CE.sub.2D which is closest to the point corresponding to its projection on the projected curve CP.sub.2D.
[0050] This can therefore, for example, proceed in the following manner:
[0051] The trajectory V.sub.0, represented by the curve C.sub.3D, is sampled at a set {Xj.sub.0}.sub.j=0 . . . N of 3D points. Each point Xj.sub.0 is projected in 2D onto the partially transparent screen 3 according to the current calibration, thus providing a 2D point denoted Yj.sub.0, as illustrated in the appended drawings.
[0052] Each point Yj.sub.0 is then associated with the respective point Zj.sub.0 of the calibration curve CE.sub.2D displayed on the semi-transparent screen 3, having parametric or non-parametric representation U.sub.0, which is closest to it (the distance used can be any norm, such as the Euclidean norm, the L1 norm or the infinity norm), as illustrated in the appended drawings.
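The projection and nearest-point association of paragraphs [0051] and [0052] can be sketched numerically. The following Python fragment is purely illustrative and not part of the disclosure; it assumes the current calibration is represented by a 3x4 pinhole projection matrix and that the calibration curve CE.sub.2D is available as a dense sampling of 2D points, and the function names are hypothetical.

```python
import numpy as np

def project_points(P, X):
    """Project the sampled 3D points X (N, 3) onto the screen with a 3x4
    pinhole projection matrix P, returning the 2D points {Yj} (N, 2)."""
    Xh = np.hstack([X, np.ones((len(X), 1))])  # homogeneous coordinates
    Yh = Xh @ P.T                              # homogeneous 2D projections
    return Yh[:, :2] / Yh[:, 2:3]              # perspective division

def match_to_curve(Y, curve):
    """Associate each projected point Yj with the closest point Zj of the
    densely sampled calibration curve (M, 2), using the Euclidean norm."""
    d = np.linalg.norm(Y[:, None, :] - curve[None, :, :], axis=2)  # (N, M)
    return curve[np.argmin(d, axis=1)]
```

Any other norm mentioned in [0052] (L1, infinity) could be substituted in `match_to_curve` without changing the structure of the step.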
[0053] If no approximate calibration of the device is available, the geometric properties of the 2D calibration curve CE.sub.2D can be used to establish this correspondence. Thus, if the calibration curve CE.sub.2D displayed is polygonal, the 3D trajectory C.sub.3D of the pointer 5 can be analyzed so as to identify the points of the trajectory which correspond to the vertices of a polygonal curve. Once the vertices of the displayed polygon CE.sub.2D have been associated with their assumed respective counterparts from the 3D trajectory C.sub.3D, a transfer function can be estimated and then used to associate each point of the 3D trajectory C.sub.3D with a point of the 2D calibration curve CE.sub.2D.
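As an illustration of the vertex identification described in [0053] (a sketch, not part of the disclosure), a sampled trajectory can be scanned for samples where the direction of motion turns sharply; the angle threshold and the function name are arbitrary assumptions.

```python
import numpy as np

def polygon_vertices(traj, angle_thresh_deg=30.0):
    """Return the indices of trajectory samples where the direction between
    consecutive segments changes by more than a threshold, i.e. the candidate
    vertices of a traced polygonal curve (works for 2D or 3D samples)."""
    seg = np.diff(traj, axis=0)
    seg = seg / np.linalg.norm(seg, axis=1, keepdims=True)  # unit directions
    cosang = np.clip(np.sum(seg[:-1] * seg[1:], axis=1), -1.0, 1.0)
    turn = np.degrees(np.arccos(cosang))                    # turning angle
    return np.where(turn > angle_thresh_deg)[0] + 1         # indices into traj
```

In practice a real trajectory is noisy, so some smoothing before the angle test would be needed; the sketch only shows the principle.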
[0054] If an initial calibration is available, such as a calibration performed previously in the factory, then the new calibration can be estimated by using a nonlinear optimization algorithm (gradient descent, Levenberg-Marquardt, Dogleg, etc.) to modify these initial values so that the respective projections {Yj.sub.0} of the points {Xj.sub.0} according to the new calibration are situated as close as possible to the respective points {Zj.sub.0} of the displayed 2D calibration curve CE.sub.2D which have been associated with them.
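A minimal sketch of the nonlinear refinement of [0054], again assuming the model is parametrized by the 12 entries of a 3x4 pinhole projection matrix; the damped Gauss-Newton step below is a simplified stand-in for the Levenberg-Marquardt or Dogleg algorithms cited, and all names are hypothetical.

```python
import numpy as np

def reprojection_residuals(p, X, Z):
    """Residuals between the projections of the 3D points X under the 3x4
    matrix encoded in p (12 values) and the matched 2D calibration points Z."""
    P = p.reshape(3, 4)
    Xh = np.hstack([X, np.ones((len(X), 1))])
    Yh = Xh @ P.T
    Y = Yh[:, :2] / Yh[:, 2:3]
    return (Y - Z).ravel()

def refine_calibration(P0, X, Z, iters=20, damping=1e-3):
    """Refine an initial (e.g. factory) calibration P0 by damped Gauss-Newton
    steps with a numerical Jacobian, minimizing the reprojection residuals."""
    p = P0.astype(float).ravel().copy()
    eps = 1e-6
    for _ in range(iters):
        r = reprojection_residuals(p, X, Z)
        J = np.empty((r.size, p.size))
        for k in range(p.size):          # numerical Jacobian, column by column
            dp = p.copy()
            dp[k] += eps
            J[:, k] = (reprojection_residuals(dp, X, Z) - r) / eps
        step = np.linalg.solve(J.T @ J + damping * np.eye(p.size), -J.T @ r)
        p += step
    return p.reshape(3, 4)
```

The damping term plays the role of the Levenberg-Marquardt regularization and also handles the overall-scale ambiguity of the projection matrix.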
[0055] If no initial calibration is available, the calibration can be carried out with the aid of a linear estimation algorithm. In the case of the use of a pinhole model to represent the system, it is possible to use algorithms such as those cited in the following documents: "Multiple View Geometry in Computer Vision", Richard Hartley and Andrew Zisserman, Cambridge University Press, 2003, pp. 155-157, ISBN 0-521-54051-8; "Techniques for calibration of the scale factor and image center for high accuracy 3-D machine vision metrology", K. Lenz and R. Tsai, IEEE Trans. on Pattern Analysis and Machine Intelligence, PAMI-10:713-720, 1988; "Camera calibration with distortion models and accuracy evaluation", J. Weng, P. Cohen and M. Herniou, IEEE Trans. on Pattern Analysis and Machine Intelligence, PAMI-14(10):965-980, 1992; or "A comparison of new generic camera calibration with the standard parametric approach", Aubrey K. Dunne, John Mallon and Paul F. Whelan, MVA2007-IAPR Conference on Machine Vision Applications, Tokyo, Japan, 16-18 May 2007. A step of nonlinear optimization of this calibration can be added so as to increase the accuracy of the calibration.
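As an illustrative sketch (not the patent's implementation) of such a linear estimation, the direct linear transform (DLT) described in Hartley and Zisserman estimates a 3x4 pinhole projection matrix, up to scale, from matched 3D/2D point pairs:

```python
import numpy as np

def dlt_calibration(X, Z):
    """Linear (DLT) estimate of the 3x4 projection matrix from n >= 6
    non-coplanar 3D-2D correspondences (X: (n, 3), Z: (n, 2)).
    The matrix is the null vector of the stacked constraint rows."""
    rows = []
    for (x, y, z), (u, v) in zip(X, Z):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u*x, -u*y, -u*z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v*x, -v*y, -v*z, -v])
    _, _, Vt = np.linalg.svd(np.array(rows))
    return Vt[-1].reshape(3, 4)   # smallest-singular-value right vector
```

The scale (and sign) ambiguity of the returned matrix cancels in the perspective division, so projections are unaffected.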
[0056] It is possible to implement an iterative method. The step of matching the 3D curve and the 2D calibration curve displayed on the semi-transparent screen and the calibration step can be repeated so as to increase the quality of the calibration. The result of each calibration step is used as the approximate calibration for the matching of the following iteration. Since this calibration is more accurate than the one used in the matching step of the previous iteration, the new matching of the trajectory C.sub.3D with the displayed calibration curve CE.sub.2D is carried out with greater accuracy, and this improved matching in turn improves the accuracy of the calibration of the following iteration. The iterative process therefore improves both the matching between the trajectory C.sub.3D and the displayed calibration curve CE.sub.2D, and the calibration itself. Stated otherwise, the curve or curves is or are re-interpreted with the new optical center, or more generally with the new parameters of the model characterizing the projection of the image displayed on the display device onto the retina of the eye.
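The iterative scheme of [0056] can be sketched as an alternation of the two steps. This Python fragment is illustrative only; it assumes a pinhole-matrix model, a densely sampled calibration curve, and a linear (DLT) re-calibration as stand-ins, and its names are hypothetical.

```python
import numpy as np

def project(P, X):
    """Project 3D points X (N, 3) with the 3x4 matrix P of the current calibration."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    Yh = Xh @ P.T
    return Yh[:, :2] / Yh[:, 2:3]

def dlt(X, Z):
    """Linear re-estimation of the 3x4 projection matrix from matched pairs."""
    rows = []
    for (x, y, z), (u, v) in zip(X, Z):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u*x, -u*y, -u*z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v*x, -v*y, -v*z, -v])
    _, _, Vt = np.linalg.svd(np.array(rows))
    return Vt[-1].reshape(3, 4)

def iterative_calibration(P0, X, curve, n_iter=5):
    """Alternate (a) re-estimating the matching with the current calibration
    and (b) re-calibrating from the new matches, as described in [0056]."""
    P = P0.astype(float)
    for _ in range(n_iter):
        Y = project(P, X)                                    # current projections
        d = np.linalg.norm(Y[:, None] - curve[None, :], axis=2)
        Z = curve[np.argmin(d, axis=1)]                      # (a) re-matching
        P = dlt(X, Z)                                        # (b) re-calibration
    return P
```

A convergence criterion, as in claim 4, would typically stop the loop when the matching no longer changes or the residual error stabilizes; the fixed iteration count here is a simplification.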
[0057] It is also possible to use a plurality of calibration curves. The steps of displaying a calibration curve and of recording a three-dimensional curve described by the user by means of a three-dimensional pointing device can be repeated several times with different curves before a calibration step. A curve displayed on the screen can vary in shape each time, before executing the step of matching between a 3D curve and the corresponding calibration curve. Thus, a set of 2D calibration curves CE.sub.2D (denoted {Ui}.sub.i=0 . . . M-1 with M the number of curves displayed) displayed on the semi-transparent screen 3, and a set of associated 3D trajectories C.sub.3D (denoted {Vi}.sub.i=0 . . . M-1), are made available.
[0058] The step of association or matching between a 3D trajectory C.sub.3D and the corresponding calibration curve CE.sub.2D displayed is then repeated for each trajectory/calibration curve pair displayed. The calibration step is then modified so as to consider a set of calibration curves rather than just one. Thus, each trajectory Vi is sampled at a set of points denoted {Xj.sub.i}.sub.j=0 . . . N. For each point Xj.sub.i, the projection of this point according to the current calibration is denoted Yj.sub.i. Each point Yj.sub.i is then associated with the 2D point closest to the 2D calibration curve corresponding to it, this point being denoted Zj.sub.i. The calibration step is then estimated so as to minimize the distance between the projections of the points {Xj.sub.i}.sub.j=0 . . . N;i=0 . . . M-1 and the corresponding points on the calibration curve displayed {Zj.sub.i}.sub.j=0 . . . N;i=0 . . . M-1.
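The multi-curve cost described in [0058], i.e. the sum of squared distances between the projections of all sampled points {Xj.sub.i} and their matched calibration points {Zj.sub.i}, can be written as a short illustrative function (the pinhole-matrix parametrization is an assumption, not the patent's specification):

```python
import numpy as np

def total_reprojection_error(P, trajectories, matches):
    """Sum of squared distances between the projections of all sampled points
    (one (N, 3) array per recorded trajectory) and the matched calibration
    points (one (N, 2) array per trajectory): the quantity that the
    multi-curve calibration step minimizes over the model parameters P."""
    err = 0.0
    for X, Z in zip(trajectories, matches):
        Xh = np.hstack([X, np.ones((len(X), 1))])
        Yh = Xh @ P.T
        Y = Yh[:, :2] / Yh[:, 2:3]
        err += float(np.sum((Y - Z) ** 2))
    return err
```

Minimizing this single cost over all trajectory/curve pairs at once, rather than per curve, is what distinguishes the multi-curve calibration step from the single-curve one.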
[0059] The steps of the above-described method can be performed by one or more programmable processors executing a computerized program so as to execute the functions of the invention by operating on input data and generating output data.
[0060] A computer program can be written in any form of programming language, including compiled or interpreted languages, and the computer program can be deployed in any form, including as an autonomous program or as a subprogram, element or other unit appropriate for use in a computing environment. A computer program can be deployed so as to be executed on one computer or on several computers at a single site, or distributed over several sites linked together by a communication network.
[0061] The preferred embodiment of the present invention has been described. Various modifications may be made without departing from the spirit and the scope of the invention. Consequently, other implementations are within the scope of the following claims.