METHOD FOR SIMULATING THE ILLUMINATION OF AN INDOOR SCENE OBSERVED BY A CAMERA ILLUMINATED BY OUTDOOR LIGHT AND SYSTEM

20220279156 · 2022-09-01


    Abstract

    A method for simulating, at an instant, the illumination of an indoor scene observed by a camera and illuminated by outdoor light, the method comprising: a first preliminary phase including: obtaining a reflectance map of the scene, and elaborating normalized radiosity maps for the contributions of the sky and the ground; a second phase carried out within a first given duration from the instant including: obtaining the position of the sun, and obtaining a normalized radiosity map for the contribution of the sun; a third phase including: acquiring an image of the scene, and determining brightness scale parameters for the ground, the sky, and the sun. The disclosure also proposes tracking the camera position.

    Claims

    1. A method for simulating, at an instant, an illumination of an indoor scene observed by a camera and being illuminated by outdoor light, the method comprising: a first preliminary phase including: obtaining a 3D geometric model of the scene, obtaining image information of the scene, obtaining a reflectance map of the scene for a contribution of outdoor illumination to the scene based on the 3D geometric model of the scene and the image information of the scene, elaborating a normalized radiosity map for a contribution of the sky to the outdoor illumination based on the reflectance map, elaborating a normalized radiosity map for a contribution of a ground to the outdoor illumination based on the reflectance map, a second phase carried out within a first given duration from the instant including: obtaining a position of the sun using geolocation, time, and orientation of the scene, obtaining a normalized radiosity map for a contribution of the sun to the outdoor illumination based on the position of the sun and the reflectance map, a third phase carried out within a second given duration from the instant, shorter than the first given duration, including: acquiring at least one image of the scene, determining from the at least one image a brightness scale parameter for the contribution of the ground, a brightness scale parameter for the contribution of the sky, and a brightness scale parameter for the contribution of the sun.

    2. The method of claim 1, wherein obtaining the reflectance map comprises considering a plurality of light sources arranged on a sphere centered on the scene.

    3. The method of claim 2, wherein elaborating a normalized radiosity map for the contribution of the sky to the outdoor illumination based on the reflectance map comprises only considering the light sources in the upper half of the sphere, and elaborating a normalized radiosity map for the contribution of the ground to the outdoor illumination based on the reflectance map comprises only considering the light sources in the lower half of the sphere.

    4. The method of claim 1, wherein the preliminary phase comprises a ray-tracing step to determine a visibility of each vertex of the 3D geometric model, the method further comprising using the visibilities to obtain the normalized radiosity maps for the contribution of the sky, of the ground, and of the sun.

    5. The method of claim 1, wherein obtaining the reflectance map comprises using image information of the scene acquired when there is no direct sunlight entering the scene.

    6. The method of claim 1, wherein obtaining the reflectance map comprises using image information of the scene acquired when there is direct sunlight entering the scene at different instants associated with different positions of the sun.

    7. The method of claim 1, comprising: rendering a normalized image of the scene for the contribution of the sky to the outdoor illumination of the scene based on the normalized radiosity map for the contribution of the sky to the outdoor illumination, rendering a normalized image of the scene for the contribution of the ground to the outdoor illumination of the scene based on the normalized radiosity map for the contribution of the ground to the outdoor illumination, and rendering a normalized image of the scene for the contribution of the sun to the outdoor illumination of the scene based on the normalized radiosity map for the contribution of the sun to the outdoor illumination.

    8. The method of claim 7, wherein determining from the image a brightness scale parameter for the contribution of the ground, a brightness scale parameter for the contribution of the sky, and a brightness scale parameter for the contribution of the sun, comprises minimizing a difference between the image and a linear combination of products of each brightness scale parameter with the corresponding normalized image.

    9. The method of claim 1, wherein the third phase comprises acquiring a plurality of images of the scene at different instants, and wherein determining the brightness scale parameters uses the plurality of images.

    10. The method of claim 7, comprising elaborating an image of the scene from the normalized images of the scene and the brightness scale parameters.

    11. A method for tracking the position of a camera observing a scene over a plurality of instants, using the simulated illumination of the indoor scene by the method for simulating the illumination of an indoor scene in accordance with claim 1 at every instant of the plurality of instants.

    12. The method of claim 11, wherein the method for simulating the illumination of an indoor scene is the method in accordance with claim 10, and wherein tracking is performed on the image elaborated from the normalized images of the scene and the brightness scale parameters.

    13. A system comprising a camera and a control unit configured to perform the method according to claim 1.

    14. A recording medium readable by a computer and having recorded thereon a computer program including instructions for executing the steps of a method according to claim 1.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0062] How the present disclosure may be put into effect will now be described by way of example with reference to the appended drawings, in which:

    [0063] FIG. 1 is a graphical representation of a pipeline to implement the steps of the method according to an example,

    [0064] FIG. 2 is a schematic representation of a sphere associated with outside illumination,

    [0065] FIG. 3 is a representation of a system according to an example,

    [0066] FIG. 4A is an image of an indoor scene,

    [0067] FIG. 4B is the corresponding rendered image of the image of the indoor scene,

    [0068] FIG. 5A represents a sphere representing outdoor illumination,

    [0069] FIG. 5B is a representation of how this sphere is observed from the inside.

    DESCRIPTION OF THE EMBODIMENTS

    [0070] An exemplary method for simulating the illumination of an indoor scene will now be described. This method aims at simulating the illumination of an indoor scene at a given instant, typically in near real time. Also, in the present description, the simulation is used to perform tracking of the position of the camera observing the scene. The disclosure is however not limited to tracking this position and can also be applied to other applications in which a simulation of the illumination is required.

    [0071] The method limits the computational load to be performed in near real time. This results from the fact that a portion of the computations is performed only once, another portion is performed periodically (for example every five minutes), and the final portion is performed in real time.

    [0072] FIG. 1 shows the steps of a method according to an example. In a first phase P1 (which needs to be performed only once for a given scene), the method comprises a step S001 of reconstructing the 3D geometry of the scene. In some embodiments, this can be done on the basis of an image of the scene (an RGB image or an RGBD image, and possibly a plurality of images), and the pose of the camera (at this stage, this pose can be provided). The output of this step is the 3D geometric model of the scene in the form of a 3D mesh. The 3D geometric model of the scene can be obtained by Truncated Signed Distance Field (TSDF) integration, which requires a depth image stream scanning the scene and the corresponding camera poses. It should be noted that if this method is used, the RGBD images should be acquired when there is no direct sunlight visible in the scene, as most RGBD cameras operate using a near-infrared sensor which fails in bright sunlight.

    [0073] The disclosure is not limited to the use of images to reconstruct the 3D geometry of the scene and can also use 3D geometry models prepared by an operator.

    [0074] Subsequently, a step is performed in which reflectance maps (i.e. reflectance values at each vertex) are acquired. In some embodiments, this step uses as input the 3D geometry model of the scene, images of the scene, and camera poses.

    [0075] While not represented, the method can include a step of obtaining image information of the scene.

    [0076] It should be noted that the reflectance ρ of a vertex v_i is defined as the ratio between the reflected light B(v_i) (also called the radiosity) and the incoming light E(v_i) (also called the irradiance) at vertex v_i:

    B(v_i) = ρ(v_i) E(v_i)  Equation (1)

    The radiosity can be measured (from image information) and used, together with the scene geometry, to compute the irradiance, as will be shown hereinafter. Outdoor illumination complicates this procedure in two ways. First, it is not possible to reconstruct the outdoor geometry with current RGBD sensors, and the sky does not have a proper geometry in the common sense. Second, the sun radiosity is typically too high to be captured directly by consumer-grade RGBD cameras. The method described below overcomes these difficulties.

    [0077] More precisely, the subsequently described steps S002, S003, and S004 are three alternative steps that can be used to obtain reflectance maps from image information (one or more images).

    [0078] Step S002 is an alternative which is only represented for the sake of completeness, as it covers the case where there is no outdoor illumination. This step implements the method of the document “Camera tracking in lighting adaptable maps of indoor environments” and of European Patent Application 20 154 742.9, filed on Jan. 30, 2020, which further disclose how to obtain radiosity maps and reflectance maps.

    [0079] If the scene is lit by artificial light sources only, which are all inside the scene, it is possible to use the method known to the person skilled in the art as the “radiosity transport” method, which solves the equation system given by

    [00001] B(v_i) = ρ(v_i) Σ_{j≠i} B(v_j) F(v_i, v_j) G(v_i, v_j)  Equation (2)

    for all vertices v_i, wherein F(v_i, v_j) is the form factor describing the geometrical relation between vertices v_i and v_j, and G(v_i, v_j) ∈ {0, 1} is the visibility, equal to “1” if the line of sight between vertices v_i and v_j is free (i.e. not blocked) and “0” otherwise.

    [0080] Radiosity values B(v_i) are recovered by scanning the scene with an RGB camera while varying the camera's exposure time.
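
    The exposure-varying measurement can be sketched as follows (a non-limiting illustration in Python, assuming a linear camera response; a real pipeline would first calibrate the camera response function, and the function name is hypothetical):

```python
import numpy as np

def hdr_radiosity(images, exposures, low=0.05, high=0.95):
    """Recover radiosity values from an exposure stack (paragraph [0080]).

    images: list of (h, w) arrays with intensities in [0, 1];
    exposures: matching exposure times. Only mid-range pixels (neither
    under- nor over-exposed) contribute to the estimate.
    """
    num = np.zeros_like(images[0], dtype=float)
    den = np.zeros_like(images[0], dtype=float)
    for img, et in zip(images, exposures):
        w = ((img > low) & (img < high)).astype(float)  # trust mid-range pixels
        num += w * img / et                             # linear CRF: radiance = I / et
        den += w
    return num / np.maximum(den, 1e-9)
```

    A pixel saturated in the long exposure is then recovered from the short exposure, and vice versa.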

    [0081] Step S003, actually part of the present disclosure, is a first alternative step of obtaining a reflectance map of the scene for the contribution of outdoor illumination to the scene based on the 3D geometry model of the scene. This alternative is particularly suitable for scenes which receive outdoor light but not direct sunlight. In this case, the brightness of the sky is low enough to be directly measured on image information with a consumer grade RGB(D) camera (i.e. the image information can be obtained by acquiring at least one image of the scene with this outside illumination). In this step, it is considered that light passes, for example, through a window but no direct sunlight is visible in the scene itself. Thus, the radiosity can be measured as in step S002 described above, while the geometry outside the window is still missing (there are no vertices onto which the radiosity can be projected).

    [0082] To overcome this difficulty, the outdoor geometry is modeled as a sphere, or sky sphere. This sphere is formed by a set ℒ of sample points (i.e. light sources) placed on the sphere. Each point l ∈ ℒ is associated with a direction n(l) and a radiosity B(l). This radiosity is recovered by projecting the parts of the input image (or images, if a plurality of images of the scene is used) which do not fall onto the scene geometry onto the sky sphere instead. Then, a slightly modified equation 2 can be used for reflectance recovery:

    B(v_i) = ρ(v_i)[E_s(v_i) + E_l(v_i)]  Equation (3)

    wherein E_s is the irradiance from the scene and E_l is the irradiance from the light sources l, with:

    [00002] E_s(v_i) = Σ_{j≠i} B(v_j) F(v_i, v_j) G(v_i, v_j)  Equation (4)

    as before, and:

    [00003] E_l(v_i) = Σ_{l∈ℒ} B(l) F(v_i, l) G(v_i, l)  Equation (5)

    where F(v_i, l) = −cos(∠(n(l), n(v_i))), i.e. the negative dot product of the unit directions n(l) and n(v_i), and G(v_i, l) = 0 if and only if the ray with direction n(l) starting at v_i intersects any scene geometry other than v_i, and 1 otherwise.
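
    By way of a non-limiting illustration, the irradiance terms of equations 4 and 5 and the reflectance recovery of equation 3 can be sketched as follows (Python; the function names and dense-matrix layout are illustrative assumptions):

```python
import numpy as np

def irradiance_scene(B, F, G):
    """E_s(v_i) = sum_{j != i} B(v_j) F(v_i, v_j) G(v_i, v_j)  (Equation 4).

    B: (n,) vertex radiosities; F, G: (n, n) form factors and visibilities.
    """
    M = F * G
    np.fill_diagonal(M, 0.0)          # exclude the j == i term
    return M @ B

def irradiance_sky(B_l, F_l, G_l):
    """E_l(v_i) = sum_l B(l) F(v_i, l) G(v_i, l)  (Equation 5).

    B_l: (m,) radiosities of the sphere points; F_l, G_l: (n, m).
    """
    return (F_l * G_l) @ B_l

def recover_reflectance(B, E_s, E_l, eps=1e-9):
    """rho(v_i) = B(v_i) / (E_s(v_i) + E_l(v_i))  (Equation 3 solved for rho)."""
    return B / np.maximum(E_s + E_l, eps)
```

    With measured B and B_l and pre-computed F and G, the reflectance of every vertex follows in one pass.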

    [0083] The above defined sphere will be better described in reference to FIG. 2.

    [0084] Step S004 is another alternative for recovering the reflectance map. This alternative is particularly suitable for applications where the scene will receive direct sunlight.

    [0085] In this alternative, the sun is considered to be directly visible from inside the scene. Theoretically, this could be handled analogously to step S003; however, the minimal exposure time of most RGBD cameras is not short enough to capture the sun's radiosity. In other words, orienting a camera directly towards the sun does not allow measuring B(l) for a point l ∈ ℒ^sun, with ℒ^sun being the set of points placed where the sun appears, better described hereinafter in reference to FIG. 2.

    [0086] Thus, since the sun, if present/visible, is the dominant light source in the scene, the reflectance cannot be reliably recovered using radiosity transport directly.

    [0087] Nevertheless, under certain assumptions and as will be described hereinafter, a meaningful scene reflectance can be estimated, in particular by using at least two images of the scene (from the image information) where the scene receives outdoor illumination and direct sunlight, taken at different instants associated with different positions of the sun.

    [0088] Let ℒ_t1^sun be the set of points arranged on the above-mentioned sphere within the sun circle (the circle appearing where the sun is located), in other words the sky area covered by the sun at time t1 (an instant at which a first image has been captured). Further, let V_t1^sun be the set of vertices lit by direct sunlight, i.e. v ∈ V_t1^sun if and only if F(v, l)G(v, l) > 0 for a given l ∈ ℒ_t1^sun. It is assumed that B_t1(l) is the same for all l ∈ ℒ_t1^sun, which leads to a single unknown, the sun radiosity denoted B_t1^sun. It should be noted that for all other vertices, not in V_t1^sun, it is possible to recover the reflectance, as they only depend on the scene or sky radiosities, which can be measured. By capturing the scene again at a different time of day t2 (an instant at which a second image has been captured, chosen so that the position of the sun differs between the two instants, the larger the difference in position the better; alternatively, a difference of at least one vertex should be obtained between V_t1^sun and V_t2^sun, defined hereinafter), it is possible to recover B_t1^sun. Let V_t2^sun be defined similarly to V_t1^sun, and let V_t1−t2^sun be the set of vertices in V_t1^sun but not in V_t2^sun, i.e. lit by direct sunlight at t1 but not at t2. It is then possible to recover the sun brightness/radiosity B_t1^sun from any vertex v ∈ V_t1−t2^sun by exploiting the fact that the reflectance ρ(v) is the same at t1 and t2. Solving the above presented equation 3 for ρ(v) at t2 and inserting it in equation 3 for t1, it is possible to obtain:

    [00004] B_t1(v) = B_t2(v) [E_s^t1(v) + E_l^t1(v)] / [E_s^t2(v) + E_l^t2(v)]  Equation (6)

    with E_s^ti(v) and E_l^ti(v) being respectively the irradiance from the scene and from the sky at instant ti.

    [0089] In equation 6, all quantities except E_l^t1(v) are known. Indeed, B_t1(v) and B_t2(v) can be measured, E_s^t1(v) and E_s^t2(v) depend only on scene radiosities and can also be computed from measured values, and E_l^t2(v), by definition of v (not sunlit at t2), depends only on the sky radiosity, which is measured as well. E_l^t1(v) can be expressed as:

    [00005] E_l^t1(v) = E_sky^t1(v) + B_t1^sun Σ_{l∈ℒ_t1^sun} F(l, v) G(l, v)  Equation (7)

    with E_sky^t1(v) computed using the above defined equation 5 with the sky radiosities at t1. Using this expression in equation 6 and solving it for B_t1^sun provides:

    [00006] B_t1^sun = [ B_t1(v)[E_s^t2(v) + E_l^t2(v)] / B_t2(v) − E_s^t1(v) − E_sky^t1(v) ] / [ Σ_{l∈ℒ_t1^sun} F(l, v) G(l, v) ]  Equation (8)

    [0090] It should be noted that all the values on the right-hand side are known; thus, it is possible to compute B_t1^sun, use it in the original equation system for t1, and solve for the remaining reflectance values to obtain the reflectance map.
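
    Equation 8 is a direct closed-form expression; its evaluation can be sketched as follows (illustrative Python with hypothetical argument names, where sun_FG_sum denotes the sum over l ∈ ℒ_t1^sun of F(l, v) G(l, v)):

```python
def sun_radiosity_t1(B_t1, B_t2, Es_t1, Es_t2, El_t2, Esky_t1, sun_FG_sum):
    """Recover B_t1^sun from a vertex lit by the sun at t1 but not at t2
    (Equation 8). All arguments are measured or pre-computed scalars
    for a single vertex v in V_{t1-t2}^sun.
    """
    return (B_t1 * (Es_t2 + El_t2) / B_t2 - Es_t1 - Esky_t1) / sun_FG_sum
```

    In practice the estimate can be averaged over all vertices of V_t1−t2^sun for robustness.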

    [0091] Step S005 can then be carried out (this step is actually carried out before steps S003 and S004, as its output can be used in these steps), in which a ray tracing step is performed to determine the visibility of each vertex of the 3D geometric model. This step uses the 3D geometric model as input and the sky sphere separated into points/light sources, as defined above. This allows obtaining the visibility G in equations 4 and 5, for example through a pre-computation, so that these equations can be solved later. This eliminates the main bulk of computation during illumination estimation.
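
    The visibility pre-computation of step S005 can be sketched as follows (illustrative Python; ray_blocked stands in for the actual ray tracer, which is not specified here):

```python
import numpy as np

def precompute_visibility(vertices, directions, ray_blocked):
    """Precompute G(l, v_i) for every vertex / light-direction pair (step S005).

    ray_blocked(origin, direction) -> bool is a placeholder callback for a
    ray tracer; G(l, v_i) = 1 when the ray from v_i along n(l) does not
    intersect any scene geometry, 0 otherwise.
    """
    G = np.ones((len(vertices), len(directions)))
    for i, v in enumerate(vertices):
        for j, d in enumerate(directions):
            if ray_blocked(v, d):
                G[i, j] = 0.0
    return G
```

    The resulting table is computed once and reused by equations 4 and 5 at every later illumination update.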

    [0092] Step S006 can then be carried out, in which a normalized radiosity map (B̂^sky) for the contribution of the sky to the outdoor illumination is elaborated based on the reflectance map (obtained using either S003 or S004), and in which a normalized radiosity map (B̂^gr) for the contribution of the ground to the outdoor illumination is elaborated based on the reflectance map.

    [0093] In the present application, obtaining a normalized radiosity map is performed using the algorithm known to the person skilled in the art as the radiosity algorithm (Cindy M. Goral, Kenneth E. Torrance, Donald P. Greenberg, and Bennett Battaile. 1984. “Modeling the interaction of light between diffuse surfaces”. SIGGRAPH Comput. Graph. 18, 3 (July 1984), 213-222). In some embodiments, an iterative version of the radiosity algorithm is used.

    [0094] At this stage, the reflectance of each vertex of the scene is known.

    [0095] It is possible to obtain the normalized radiosities from the light sources and their brightness, G, F, and the reflectance map. For example, each vertex v ∈ V has a reflectance ρ(v), the radiosity B(v) here is the amount of light reflected by the surface patch at v, and B^x(v) is the radiosity from the light source set ℒ^x. In order to estimate B^x(v), it is possible to first compute the radiosity B^l(v) induced by a single directional light source l ∈ ℒ^x and then to sum up these components. Then, and as mentioned before, an iterative version of the basic radiosity algorithm is used, with:

    B_0^l(v_i) = ρ(v_i) S^l F(l, v_i) G(l, v_i)

    being the light reaching the scene directly from l, and
    B_{k+1}^l(v_i) = ρ(v_i) Σ_{j≠i} B_k^l(v_j) F(v_i, v_j) G(v_i, v_j) being an iterative estimate for light inter-reflections within the scene. The form factor F(v_i, v_j) is computed as in the document “Building dense reflectance maps of indoor environments using an rgb-d camera” (M. Krawez, T. Caselitz, D. Büscher, M. Van Loock, and W. Burgard, in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 3210-3217); for a directional light source it simplifies to F(l, v_i) = −cos(∠(n(l), n(v_i))). Also, the visibility term G(v_i, v_j) is 1 if the line of sight between v_i and v_j is free, and 0 otherwise. Similarly, G(l, v_i) = 1 only when the ray with direction n(l), starting at v_i, does not intersect any scene geometry. For example, B^l(v_i) = B_K^l(v_i), with K = 10 by way of example, and B^x(v_i) = Σ_{l∈ℒ^x} B^l(v_i). Setting S^l to 1 in the above equation provides a normalized radiosity B̂^x.
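
    The iterative scheme above can be sketched as follows (illustrative Python; by linearity, all directional sources of one set ℒ^x are processed at once, and the direct term is re-added at each iteration, which is one common variant of the basic radiosity iteration):

```python
import numpy as np

def normalized_radiosity(rho, F, G, F_l, G_l, K=10):
    """Iterative radiosity for one light source set L^x with S^l = 1.

    rho: (n,) reflectances; F, G: (n, n) vertex-to-vertex form factors
    and visibilities; F_l, G_l: (n, m) terms for the m directional lights.
    Iterates B_{k+1} = B_0 + rho * sum_j B_k F G for K bounces.
    """
    direct = rho * (F_l * G_l).sum(axis=1)   # B_0: light reaching the scene directly
    M = F * G
    np.fill_diagonal(M, 0.0)                 # exclude the j == i term
    B = direct.copy()
    for _ in range(K):
        B = direct + rho * (M @ B)           # add light inter-reflections
    return B
```

    With K = 10 the estimate is, for typical indoor reflectances, very close to the fixed point of the radiosity system.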

    [0096] To adapt the 3D geometric model to the current outdoor illumination in real time, a simplified model is used, consisting of sun, sky, and ground light components. The radiosity of each vertex v is given by B(v) = B^sun(v) + B^sky(v) + B^gr(v). The sky and ground components are computed by first splitting the sky sphere into two disjoint sets ℒ^sky and ℒ^gr, and adding the set of bins ℒ^sun describing the sun area. Then, equation 3 can be used to compute the radiosity contributions of each of the three light sources.

    [0097] Here, it is assumed that the sun, sky, and ground bins l have the same radiosities (or brightness scales) S^sun, S^sky, and S^gr respectively, such that the term B(l) in equation 5 can be replaced by the proper brightness scale parameter S^x, with x ∈ {sun, sky, gr} (gr standing for ground). In this case, the linearity of the radiosity transport equations allows pulling S^x out of the computation and scaling the vertex radiosities subsequently, i.e. B^x(v) = B̂^x(v) S^x. B̂^x is called a normalized radiosity.

    [0098] For the sky and the ground, it is possible to pre-compute the normalized radiosities to speed up the scene re-lighting. Given B̂^sky and B̂^gr for each vertex, these normalized radiosities are then multiplied by the corresponding brightness scale parameters in near real time, around the instant at which it is desired to simulate the illumination.
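
    The near-real-time re-lighting then reduces to a weighted sum of the pre-computed maps, as in this minimal illustrative sketch:

```python
import numpy as np

def relight(B_hat, S):
    """B(v) = sum_x S^x * B_hat^x(v), x in {sun, sky, gr}.

    B_hat: dict of (n,) normalized radiosity maps; S: dict of the
    corresponding brightness scale parameters.
    """
    return sum(S[x] * B_hat[x] for x in S)
```

    Only the three scale parameters change per frame; the expensive maps stay fixed.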

    [0099] A second phase P2 can be carried out. By way of example this second phase can be carried out every 5 minutes, or any duration of the order of 5 minutes, as the sun moves slowly throughout the day.

    [0100] In step S007 of the second phase, the position of the sun is computed using the scene geolocation (i.e. latitude and longitude), the orientation of the scene, and the time (i.e. yyyy.mm.dd.hh.mm.ss). The position of the sun can be expressed as a direction vector to reflect the sun direction relative to the scene.

    [0101] In order to simulate the sun illumination correctly, it is required to know the sun position relative to the scene at any given time. This can be calculated analytically by an astronomical model.
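
    A low-precision astronomical model of this kind can be sketched as follows (illustrative Python using the standard low-accuracy solar position approximations; the function name and output convention are illustrative assumptions):

```python
import math
from datetime import datetime, timezone

def sun_direction(lat_deg, lon_deg, when):
    """Approximate solar elevation and azimuth in degrees (step S007).

    `when` is a timezone-aware UTC datetime. Accuracy on the order of a
    fraction of a degree, sufficient for a sun-direction vector.
    """
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    n = (when - j2000).total_seconds() / 86400.0        # days since J2000.0

    L = math.radians((280.460 + 0.9856474 * n) % 360)   # mean longitude
    g = math.radians((357.528 + 0.9856003 * n) % 360)   # mean anomaly
    lam = L + math.radians(1.915) * math.sin(g) + math.radians(0.020) * math.sin(2 * g)
    eps = math.radians(23.439 - 4e-7 * n)               # obliquity of the ecliptic

    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    dec = math.asin(math.sin(eps) * math.sin(lam))

    ut_hours = when.hour + when.minute / 60 + when.second / 3600
    gmst = (6.697375 + 0.0657098242 * n + ut_hours) % 24
    lmst = math.radians((gmst * 15 + lon_deg) % 360)    # local sidereal angle
    H = lmst - ra                                       # hour angle

    lat = math.radians(lat_deg)
    elev = math.asin(math.sin(dec) * math.sin(lat)
                     + math.cos(dec) * math.cos(lat) * math.cos(H))
    az = math.atan2(-math.sin(H),
                    math.cos(lat) * math.tan(dec) - math.sin(lat) * math.cos(H))
    return math.degrees(elev), math.degrees(az) % 360
```

    The elevation/azimuth pair converts directly into the direction vector used by the following steps.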

    [0102] Using the sun position (for example expressed as a direction), it is possible to obtain the normalized radiosity map (B̂^sun) for the contribution of the sun to the outdoor illumination in step S008. This step also uses the reflectance map, the visibility terms G, and the 3D geometric model of the scene.

    [0103] Unlike the sky and ground contributions, the sun changes its position relative to the scene over time; therefore a radiosity pre-computation carried out in the first phase is not possible. B̂^sun should therefore be computed for every new sun position. Given that the sun area is much smaller than that of the sky or the ground, B̂^sun is obtained within a couple of minutes using GPU acceleration. Even though this update is not real-time, the sun position varies only very little over the course of minutes, and the steps of the second phase can be performed every couple of minutes, typically every five minutes.

    [0104] The final and third phase P3 can be carried out in near real time, that is to say within a duration which is less than the frame period (the inverse of the frame rate) from the instant (t) at which it is desired to simulate the illumination.

    [0105] This phase comprises step S009, in which a normalized image Î_t^sun of the scene for the contribution of the sun to the outdoor illumination is rendered on the basis of B̂^sun. This step can use an initial pose estimate T_t^≈. This pose is available because a frame-to-frame tracking method is used here. Thus, the change of camera pose is computed by matching an input image at instant t to the image of instant t−1. This tracking process is described in more detail in the document “Camera Tracking in Lighting Adaptable Maps of Indoor Environments” (T. Caselitz, M. Krawez, J. Sundram, M. Van Loock and W. Burgard, 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020, pp. 3334-3340). Also, the tracking may be done using the method described in the previously mentioned European Patent Application.

    [0106] Step S010 comprises analogous steps for B̂^sky and B̂^gr, in which normalized images (respectively Î_t^sky and Î_t^gr) are rendered.

    [0107] The normalized images can be combined in step S011.

    [0108] Also, in the third phase, an image I.sub.C of the scene is acquired with the camera in step S012.

    [0109] The subsequent steps describe how the brightness scale parameters S^sun, S^sky, and S^gr are estimated from the normalized images and the acquired image.

    [0110] This will be done by adjusting brightness scale parameter values to fit the acquired image.

    [0111] In step S013, a subset Ω of the pixels of the image I_C is selected so as to represent the image areas with the most information about illumination and containing the least amount of noise/errors.

    [0112] In fact, it has been observed that because T_t^≈ can differ from the actual pose of the camera used to capture I_C, the reference image I_C^ref, which will be obtained using the normalized images and the brightness scale parameters, can differ from I_C. In practice, image noise, errors in the reconstructed scene geometry and reflectance maps, and the simplified outdoor illumination model (i.e. the sphere) prevent this ideal matching. Hence, in some embodiments, Ω is used.

    [0114] Also, a weight W^spl can be used to represent the reliability of Ω for brightness scale estimation. This weight can either be set to 1, or be computed to reflect whether the sampled points are representative for the light distribution in the scene, and how much noise can be expected in these samples.

    [0115] The brightness scale estimation is carried out in step S014. For a single frame I_C and the samples u ∈ Ω, the following error Err is minimized over the three brightness scale parameters S^sun, S^sky, and S^gr:

    [00007] Err = Σ_{u∈Ω} ( I_C(u) − CRF(et · I_B(u)) )²  Equation (9)

    with I_B(u) = Σ_{x∈{sun, sky, gr}} S^x Î^x(u), CRF(·) being the camera response function, and et the exposure time of I_C. The brightness scale parameters are obtained by minimizing Err over the S^x. This minimization can be performed by a least squares solver.
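
    With a linear (or linearized) camera response, minimizing Err reduces to an ordinary linear least-squares problem, as in this illustrative sketch (the linear-CRF assumption and the function name are illustrative, not part of the disclosure):

```python
import numpy as np

def estimate_brightness_scales(I_C, I_hat, et=1.0):
    """Least-squares fit of S^sun, S^sky, S^gr (Equation 9), assuming a
    linear camera response so that CRF(et * I_B) = et * I_B.

    I_C: (k,) observed intensities at the sampled pixels Omega;
    I_hat: dict of (k,) normalized rendered intensities per component.
    """
    A = et * np.stack([I_hat["sun"], I_hat["sky"], I_hat["gr"]], axis=1)
    S, *_ = np.linalg.lstsq(A, I_C, rcond=None)
    return dict(zip(("sun", "sky", "gr"), S))
```

    With a non-linear CRF, the same objective can be handled by a non-linear least squares solver instead.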

    [0116] As an alternative to step S014, step S015 can be performed in which several frames are used.

    [0117] It has been observed that a single frame can still be error-prone; in some embodiments, S^sun, S^sky, and S^gr are therefore averaged over multiple frames using W^spl and Err as weights. In order to cope with dynamic illumination (e.g. clouds passing in front of the sun), a sliding window approach is used, where frames behind a time horizon are discarded. Outliers which deviate too much from previous frames are further filtered out. In step S016, this multi-frame brightness scale estimation is carried out.
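
    The sliding-window averaging with outlier rejection can be sketched as follows (illustrative Python for a single scale parameter; the window length and deviation threshold are arbitrary illustrative choices):

```python
from collections import deque

class BrightnessFilter:
    """Weighted sliding-window average of a brightness scale (steps S015/S016).

    Frames older than the horizon drop out of the deque automatically;
    estimates deviating by more than `max_dev` (relative) from the current
    average are rejected as outliers.
    """
    def __init__(self, horizon=30, max_dev=0.5):
        self.window = deque(maxlen=horizon)
        self.max_dev = max_dev

    def update(self, S, weight=1.0):
        avg = self.value()
        if avg is not None and abs(S - avg) > self.max_dev * avg:
            return avg                     # outlier: keep the previous estimate
        self.window.append((S, weight))
        return self.value()

    def value(self):
        if not self.window:
            return None
        total = sum(w for _, w in self.window)
        return sum(s * w for s, w in self.window) / total
```

    The weight argument corresponds to the per-frame reliability W^spl (or a function of Err) mentioned above.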

    [0118] Consequently, the simulated image I_C^ref can be rendered in step S017 using the rendered normalized images and the brightness scale parameters S^sun, S^sky, and S^gr.

    [0119] Tracking of the camera position can then be performed in step S018 by comparing I_C^ref and I_C. As explained above, tracking is performed using methods known to the person skilled in the art.

    [0120] FIG. 2 is a 2D representation of a cross section of the above-mentioned sphere centered on the scene. As can be seen on the figure, the points considered to be light sources are arranged on the sphere. More precisely, the points of ℒ^sky are arranged in the upper hemisphere while the points of ℒ^gr are arranged in the lower hemisphere. These points are spaced regularly around the sphere.

    [0121] Also, on this figure, for each point l_i, the direction n(l_i) is represented by an arrow.

    [0122] It should be noted that for the sun, a limited number of points are used, located where the sun appears to be when seen from the scene (the points ℒ^sun). The points of ℒ^sun are more densely spaced than the points of ℒ^sky and ℒ^gr.
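
    The regularly spaced arrangement of the sphere points, split into upper-hemisphere (sky) and lower-hemisphere (ground) sets, can be sketched as follows (illustrative Python; the ring parameterization is one possible regular spacing):

```python
import math

def sphere_points(n_rings=8, n_per_ring=16):
    """Sample light-source points on the sky sphere (FIG. 2).

    Returns (sky, ground): unit direction vectors in the upper and lower
    hemisphere respectively, regularly spaced in spherical angles (z up).
    """
    sky, ground = [], []
    for i in range(n_rings):
        theta = math.pi * (i + 0.5) / n_rings        # polar angle from zenith
        for j in range(n_per_ring):
            phi = 2 * math.pi * j / n_per_ring
            p = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            (sky if p[2] > 0 else ground).append(p)
    return sky, ground
```

    A denser cluster of additional points around the current sun direction would play the role of ℒ^sun.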

    [0123] In order to recover the reflectance in either step S003 or step S004, a radiosity value is assigned to each point l_i by projecting a portion of the image onto that point, as explained above.

    [0124] FIG. 3 shows a system 200 configured to perform the method described in reference to FIG. 1. This system is equipped with a camera 201 (for example an RGBD camera), and has a computer structure. The system comprises a processor 202, and a nonvolatile memory 203. In this nonvolatile memory 203, a set of computer program instructions 204 is stored. When these instructions 204 are executed by processor 202, the steps of the above defined method are performed.

    [0125] FIG. 4A is an image acquired by the camera, while FIG. 4B is the corresponding image rendered by the present method. It can be seen that the illumination simulation provides a good simulation of the actual scene.

    [0126] FIG. 5A is an image of a sphere of bins centered around the scene and determined in the above steps.

    [0127] FIG. 5B shows how this sphere is seen from the camera: this is an approximation of the outside which does not take into account the exact geometry of the outside but provides a good estimation of the illumination.

    [0128] Throughout the description, including the claims, the term “comprising a” should be understood as being synonymous with “comprising at least one” unless otherwise stated. In addition, any range set forth in the description, including the claims should be understood as including its end value(s) unless otherwise stated. Specific values for described elements should be understood to be within accepted manufacturing or industry tolerances known to one of skill in the art, and any use of the terms “substantially” and/or “approximately” and/or “generally” should be understood to mean falling within such accepted tolerances.

    [0129] Although the present disclosure herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present disclosure.

    [0130] It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims.