METHOD FOR SIMULATING THE ILLUMINATION OF AN INDOOR SCENE OBSERVED BY A CAMERA ILLUMINATED BY OUTDOOR LIGHT AND SYSTEM
20220279156 · 2022-09-01
Inventors
- Jugesh Sundram (Brussels, BE)
- Mark Van Loock (Brussels, BE)
- Michael Krawez (Freiburg, DE)
- Tim Caselitz (Freiburg, DE)
- Wolfram Burgard (Freiburg, DE)
Cpc classification
F21V33/0052
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F21W2121/008
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F21S10/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
H04N13/275
ELECTRICITY
H04N13/254
ELECTRICITY
International classification
H04N13/254
ELECTRICITY
F21V33/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
A method for simulating, at an instant, the illumination of an indoor scene observed by a camera and being illuminated by outdoor light, the method comprising: a first preliminary phase including: obtaining a reflectance map of the scene elaborating normalized radiosity maps for the contribution of the sky and the ground, a second phase carried out within a first given duration from the instant including: obtaining the position of the sun, obtaining a normalized radiosity map for the contribution of the sun, a third phase, including: acquiring an image of the scene, determining brightness scale parameters for the ground, the sky, and the sun. The disclosure also proposes tracking the camera position.
Claims
1. A method for simulating, at an instant, an illumination of an indoor scene observed by a camera and being illuminated by outdoor light, the method comprising: a first preliminary phase including: obtaining a 3D geometric model of the scene, obtaining image information of the scene, obtaining a reflectance map of the scene for a contribution of outdoor illumination to the scene based on the 3D geometric model of the scene and the image information of the scene, elaborating a normalized radiosity map for a contribution of the sky to the outdoor illumination based on the reflectance map, elaborating a normalized radiosity map for a contribution of a ground to the outdoor illumination based on the reflectance map, a second phase carried out within a first given duration from the instant including: obtaining a position of the sun using geolocation, time, and orientation of the scene, obtaining a normalized radiosity map for a contribution of the sun to the outdoor illumination based on the position of the sun and the reflectance map, a third phase carried out within a second given duration from the instant, shorter than the first given duration, including: acquiring at least one image of the scene, determining from the at least one image a brightness scale parameter for the contribution of the ground, a brightness scale parameter for the contribution of the sky, and a brightness scale parameter for the contribution of the sun.
2. The method of claim 1, wherein obtaining the reflectance map comprises considering a plurality of light sources arranged on a sphere centered on the scene.
3. The method of claim 2, wherein elaborating a normalized radiosity map for the contribution of the sky to the outdoor illumination based on the reflectance map comprises only considering the light sources in the upper half of the sphere, and elaborating a normalized radiosity map for the contribution of the ground to the outdoor illumination based on the reflectance map comprises only considering the light sources in the lower half of the sphere.
4. The method of claim 1, wherein the preliminary phase comprises a ray-tracing step to determine a visibility of each vertex of the 3D geometric model, the method further comprising using the visibilities to obtain the normalized radiosity maps for the contribution of the sky, of the ground, and of the sun.
5. The method of claim 1, wherein obtaining the reflectance map comprises using image information of the scene acquired when there is no direct sunlight entering the scene.
6. The method of claim 1, wherein obtaining the reflectance map comprises using image information of the scene acquired when there is direct sunlight entering the scene at different instants associated with different positions of the sun.
7. The method of claim 1, comprising: rendering a normalized image of the scene for the contribution of the sky to the outdoor illumination of the scene based on the normalized radiosity map for the contribution of the sky to the outdoor illumination, rendering a normalized image of the scene for the contribution of the ground to the outdoor illumination of the scene based on the normalized radiosity map for the contribution of the ground to the outdoor illumination, and rendering a normalized image of the scene for the contribution of the sun to the outdoor illumination of the scene based on the normalized radiosity map for the contribution of the sun to the outdoor illumination.
8. The method of claim 7, wherein determining from the image a brightness scale parameter for the contribution of the ground, a brightness scale parameter for the contribution of the sky, and a brightness scale parameter for the contribution of the sun, comprises minimizing a difference between the image and a linear combination of products of each brightness scale parameter with the corresponding normalized image.
9. The method of claim 1, wherein the third phase comprises acquiring a plurality of images of the scene at different instants, and wherein determining the brightness scale parameters uses the plurality of images.
10. The method of claim 7, comprising elaborating an image of the scene from the normalized images of the scene and the brightness scale parameters.
11. A method for tracking the position of a camera observing a scene over a plurality of instants, using the simulated illumination of the indoor scene by the method for simulating the illumination of an indoor scene in accordance with claim 1 at every instant of the plurality of instants.
12. The method of claim 11, wherein the method for simulating the illumination of an indoor scene is the method in accordance with claim 10, and wherein tracking is performed on the image elaborated from the normalized images of the scene and the brightness scale parameters.
13. A system comprising a camera and a control unit configured to perform the method according to claim 1.
14. A recording medium readable by a computer and having recorded thereon a computer program including instructions for executing the steps of a method according to claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0062] How the present disclosure may be put into effect will now be described by way of example with reference to the appended drawings.
DESCRIPTION OF THE EMBODIMENTS
[0070] An exemplary method for simulating the illumination of an indoor scene will now be described. This method aims at simulating the illumination of an indoor scene at a given instant, typically in near real time. Also, in the present description, the simulation is aimed at performing a tracking of the position of the camera observing the scene. The disclosure is however not limited to tracking this position and can also be applied to other applications in which it is required to use a simulation of the illumination.
[0071] The method limits the computational load to be performed in near real-time. This results from the fact that a portion of the computations is performed once, another portion is performed for example every five minutes, and the final portion is done in real time.
[0073] The disclosure is not limited to the use of images to reconstruct the 3D geometry of the scene and can also use 3D geometry models prepared by an operator.
[0074] Subsequently, a step is performed in which reflectance maps (i.e. reflectance values at each vertex) are obtained. In some embodiments, this step uses as input the 3D geometry model of the scene, images of the scene, and camera poses.
[0075] While not represented, the method can include a step of obtaining image information of the scene.
[0076] It should be noted that the reflectance ρ of a vertex v.sub.i is defined as the ratio between the reflected light B(v.sub.i) (also called the radiosity) and the incoming light E(v.sub.i) (also called the irradiance) at vertex v.sub.i:
B(v.sub.i)=ρ(v.sub.i)E(v.sub.i) Equation (1)
The radiosity can be measured (from image information) and used, together with the scene geometry, to compute the irradiance, as will be shown hereinafter. Outdoor illumination complicates this procedure in two ways. First, it is not possible to reconstruct the outdoor geometry with current RGBD sensors, and the sky does not have a proper geometry in the common sense. Second, the sun radiosity is typically too high to be captured directly by consumer-grade RGBD cameras. The method described below overcomes these difficulties.
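As a minimal illustration of Equation (1), the following hypothetical helper (not part of the disclosed method) recovers a per-vertex reflectance from a measured radiosity and a computed irradiance:

```python
# Hypothetical illustration of Equation (1): reflectance as the ratio
# between reflected light (radiosity B) and incoming light (irradiance E).
def reflectance(radiosity_b: float, irradiance_e: float) -> float:
    """Return rho = B / E for one vertex (values are per color channel)."""
    if irradiance_e <= 0.0:
        raise ValueError("irradiance must be positive to recover reflectance")
    return radiosity_b / irradiance_e

# A vertex receiving E = 200 units and reflecting B = 90 units
# has reflectance 0.45.
```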
[0077] More precisely, the subsequently described steps S002, S003, and S004 are three alternative steps that can be used to obtain reflectance maps from image information (one or more images).
[0078] Step S002 is an alternative which is only represented for the sake of completeness, as it represents a case where there is no outdoor illumination. This step implements the method of European Patent Application 20 154 742.9, filed on Jan. 30, 2020; the document “Camera tracking in lighting adaptable maps of indoor environments” further discloses how to obtain radiosity maps/reflectance maps.
[0079] If the scene is lit by artificial light sources only, which are all inside the scene, it is possible to use the method known to the person skilled in the art as the “radiosity transport” method, which solves the equation system given by
B(v.sub.i)=ρ(v.sub.i)Σ.sub.j≠iB(v.sub.j)F(v.sub.i,v.sub.j)G(v.sub.i,v.sub.j) Equation (2)
for all vertices v.sub.i, and wherein F(v.sub.i,v.sub.j) is the form factor describing the geometrical relation between vertices v.sub.i and v.sub.j, and G(v.sub.i,v.sub.j)∈{0,1} the visibility, equal to “1” if the line of sight between vertices v.sub.i and v.sub.j is free (i.e. not blocked).
[0080] Radiosity values B(v.sub.i) are recovered from scanning the scene with an RGB camera while varying the camera's exposure time.
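The radiosity transport recovery above can be sketched as follows. This is an illustrative sketch only, assuming the form factors F and visibilities G have already been precomputed; the function name is hypothetical:

```python
import numpy as np

# Sketch of reflectance recovery by "radiosity transport" (Equation (2)):
# given measured radiosities B, the irradiance at each vertex is
# E_i = sum_{j != i} B_j * F_ij * G_ij, and rho_i = B_i / E_i.
def recover_reflectance(B: np.ndarray, F: np.ndarray, G: np.ndarray) -> np.ndarray:
    T = F * G
    np.fill_diagonal(T, 0.0)   # exclude the j == i term
    E = T @ B                  # irradiance at each vertex
    return B / E               # reflectance rho_i = B_i / E_i
```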
[0081] Step S003, actually part of the present disclosure, is a first alternative step of obtaining a reflectance map of the scene for the contribution of outdoor illumination to the scene based on the 3D geometry model of the scene. This alternative is particularly suitable for scenes which receive outdoor light but not direct sunlight. In this case, the brightness of the sky is low enough to be directly measured on image information with a consumer-grade RGB(D) camera (i.e. the image information can be obtained by acquiring at least one image of the scene with this outside illumination). In this step, it is considered that light is passing for example through a window but no direct sunlight is visible in the scene itself. Thus, the radiosity can be measured as in the above described step S002, while the geometry outside this window is still missing (there are no vertices onto which the radiosity can be projected).
[0082] To overcome this difficulty, the outdoor geometry is modeled as a sphere, referred to as the sky sphere. This sphere is formed by a set of sample points (i.e. light sources) placed on the sphere. Each point l is associated with a direction n(l) and a radiosity B(l). This radiosity is recovered by projecting parts of the input image (or images, if a plurality of images of the scene is used) which do not fall onto the scene geometry onto the sky sphere instead. Then, a slightly modified equation 2 can be used for reflectance recovery:
B(v.sub.i)=ρ(v.sub.i)[E.sub.s(v.sub.i)+E.sub.l(v.sub.i)] Equation (3)
Wherein E.sub.s is the irradiance from the scene and E.sub.l is the irradiance from l, with:
E.sub.s(v.sub.i)=Σ.sub.j≠iB(v.sub.j)F(v.sub.i,v.sub.j)G(v.sub.i,v.sub.j) Equation (4)
as before, and:
E.sub.l(v.sub.i)=Σ.sub.lB(l)F(v.sub.i,l)G(v.sub.i,l) Equation (5)
where F(v.sub.i,l)=−cos(n(l)·n(v.sub.i)) and G(v.sub.i,l)=0 if and only if the ray n(l) intersects any scene geometry other than v.sub.i, and 1 otherwise.
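The split of the irradiance into a scene term and a sky-sphere term can be sketched as below. All inputs are illustrative placeholders (precomputed radiosities, form factors, and visibilities for one vertex); the function name is hypothetical:

```python
import numpy as np

# Sketch of Equations (3)-(5): the total irradiance at one vertex is the
# sum of the scene irradiance E_s and the sky-sphere irradiance E_l.
def irradiance(B_scene, F_scene, G_scene, B_sky, F_sky, G_sky):
    E_s = np.sum(B_scene * F_scene * G_scene)  # Equation (4), scene term
    E_l = np.sum(B_sky * F_sky * G_sky)        # Equation (5), sky-sphere term
    return E_s + E_l
```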
[0083] The above defined sphere will be better described hereinafter in reference to the appended drawings.
[0084] Step S004 is another alternative for recovering the reflectance map. This alternative is particularly suitable for applications where the scene will receive direct sunlight.
[0085] In this alternative, the sun is considered to be directly visible from the inside, in the scene. Theoretically, this could be handled analogously to step S003; however, the minimal exposure time of most RGBD cameras is not short enough to capture the sun's radiosity. In other words, orienting a camera directly towards the sun does not allow measuring B(l) for a point l∈L.sup.sun placed on the sphere where the sun is, with L.sup.sun being a set of points placed where the sun is, better described hereinafter.
[0086] Thus, since the sun, if present/visible, is the dominant light source in the scene, the reflectance cannot reliably be recovered using radiosity transport directly.
[0087] Nevertheless, under certain assumptions and as will be described hereinafter, a meaningful scene reflectance can be estimated, in particular by using at least two images of the scene (from the image information) where the scene receives outdoor illumination and direct sunlight, taken at different instants associated with different positions of the sun.
[0088] Let L.sub.t1.sup.sun be a set of points arranged on the above-mentioned sphere within the sun circle (the circle appearing where the sun is located), or in other words the sky area covered by the sun at time t.sub.1 (an instant at which a first image has been captured). Further, let V.sub.t1.sup.sun be the set of vertices lit by direct sunlight, i.e. v∈V.sub.t1.sup.sun if and only if F(v,l)G(v,l)>0 for a given l∈L.sub.t1.sup.sun. It is assumed that B.sub.t1(l) is the same for all l∈L.sub.t1.sup.sun, which leads to only one unknown, the sun radiosity denoted B.sub.t1.sup.sun. It should be noted that for all other vertices not in V.sub.t1.sup.sun it is possible to recover the reflectance, as they only depend on the scene or sky radiosities, which can be measured. By capturing the scene again at a different time of day t.sub.2 (an instant at which a second image has been captured, chosen so that the position of the sun differs between the two instants, the larger the difference in position the better, or, alternatively, so that a difference of at least one vertex is obtained between V.sub.t1.sup.sun and V.sub.t2.sup.sun defined hereinafter), it is possible to recover B.sub.t1.sup.sun. Let V.sub.t2.sup.sun be defined similarly to V.sub.t1.sup.sun, and V.sub.t1−t2.sup.sun the set of vertices in V.sub.t1.sup.sun but not in V.sub.t2.sup.sun, i.e. lit by direct sunlight at t.sub.1 but not at t.sub.2. It is then possible to recover the sun brightness/radiosity B.sub.t1.sup.sun from any vertex v∈V.sub.t1−t2.sup.sun by exploiting the fact that the reflectance ρ(v) is the same at t.sub.1 and t.sub.2. Solving the above presented equation 3 for ρ(v) at t.sub.2 and inserting it in equation 3 for t.sub.1, it is possible to obtain:
B.sub.t1(v)=B.sub.t2(v)[E.sub.s.sup.t1(v)+E.sub.l.sup.t1(v)]/[E.sub.s.sup.t2(v)+E.sub.l.sup.t2(v)] Equation (6)
With E.sub.s.sup.ti(v) and E.sub.l.sup.ti(v) being respectively the irradiance from the scene and from the sky at instant ti.
[0089] In equation 6, all quantities except E.sub.l.sup.t1(v) are known. Indeed, B.sub.t1(v) and B.sub.t2(v) can be measured, E.sub.s.sup.t1(v) and E.sub.s.sup.t2(v) depend only on scene radiosities and can also be computed from measured values, and E.sub.l.sup.t2(v), by definition of v, depends only on the sky radiosity, which is measured as well. E.sub.l.sup.t1(v) can be expressed as:
E.sub.l.sup.t1(v)=E.sub.sky.sup.t1(v)+B.sub.t1.sup.sunΣ.sub.l∈L.sub.t1.sup.sunF(v,l)G(v,l) Equation (7)
With E.sub.sky.sup.t1(v) computed using the above defined equation 5 using sky radiosities at t.sub.1. Using this expression in equation 6 and solving it for B.sub.t1.sup.sun provides:
B.sub.t1.sup.sun=[B.sub.t1(v)[E.sub.s.sup.t2(v)+E.sub.l.sup.t2(v)]/B.sub.t2(v)−E.sub.s.sup.t1(v)−E.sub.sky.sup.t1(v)]/Σ.sub.l∈L.sub.t1.sup.sunF(v,l)G(v,l) Equation (8)
[0090] It should be noted that all the values on the right side are known, thus, it is possible to compute B.sub.t1.sup.sun, use it in the original equation system for t1 and solve for the remaining reflectance values to obtain the reflectance map.
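The two-capture sun radiosity recovery of paragraphs [0088]-[0090] can be sketched as follows, for a single vertex lit by the sun at t1 but not at t2. Variable names are illustrative; all inputs are assumed to be measured or computed as described in the text:

```python
# Sketch of the two-capture sun radiosity recovery (equations 6 to 8)
# for one vertex v lit by direct sunlight at t1 but not at t2.
def sun_radiosity_t1(B_t1, B_t2, E_s_t1, E_sky_t1, E_s_t2, E_l_t2, sum_FG_sun):
    # rho(v) is the same at t1 and t2, so rho = B_t2 / (E_s_t2 + E_l_t2);
    # inserting it into B_t1 = rho * (E_s_t1 + E_sky_t1 + B_sun * sum_FG_sun)
    # and solving for the sun radiosity B_sun gives:
    return (B_t1 * (E_s_t2 + E_l_t2) / B_t2 - E_s_t1 - E_sky_t1) / sum_FG_sun
```

For a consistency check, a scene with reflectance 0.5 and sun radiosity 10 yields B_t1 = 0.5 × (1 + 1 + 10 × 2) = 11 and B_t2 = 0.5 × 4 = 2, from which the formula recovers 10.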
[0091] Step S005 can then be carried out (this step is carried out before steps S003 and steps S004 as its output can be used in these steps) in which a ray tracing step is performed to determine the visibility of each vertex of the 3D geometric model. This step uses the 3D geometric model as input and the sky sphere separated into points/light sources, as defined above. This allows obtaining the visibility G in equations 4 and 5, for example through a pre-computation so that these equations can be solved later. This eliminates the main bulk of computation during illumination estimation.
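The visibility pre-computation of step S005 can be sketched as below. The ray-mesh intersection test is a placeholder (`intersects_scene` is hypothetical and would wrap an actual ray tracer):

```python
# Sketch of step S005: for every vertex and every sky-sphere direction,
# cast a ray and record the visibility G in {0, 1}, so that equations 4
# and 5 can later be evaluated without further ray tracing.
def precompute_visibility(vertices, directions, intersects_scene):
    G = {}
    for i, v in enumerate(vertices):
        for k, d in enumerate(directions):
            # G = 0 if the ray from v in direction d hits scene geometry.
            G[(i, k)] = 0 if intersects_scene(v, d) else 1
    return G
```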
[0092] Step S006 can then be carried out in which a normalized radiosity map ({circumflex over (B)}.sup.sky) for the contribution of the sky to the outdoor illumination based on the reflectance map (obtained using either S003 or S004) is elaborated, and in which a normalized radiosity map ({circumflex over (B)}.sup.gr) for the contribution of the ground to the outdoor illumination based on the reflectance map is elaborated.
[0093] In the present application, obtaining a normalized radiosity map is performed using the algorithm known to the person skilled in the art as the radiosity algorithm (Cindy M. Goral, Kenneth E. Torrance, Donald P. Greenberg, and Bennett Battaile. 1984. “Modeling the interaction of light between diffuse surfaces”. SIGGRAPH Comput. Graph. 18, 3 (July 1984), 213-222). In some embodiments, an iterative version of the radiosity algorithm is used.
[0094] At this stage, the reflectance of each vertex of the scene is known.
[0095] It is possible to obtain the normalized radiosities from the light sources and their brightness, G, F, and the reflectance map. For example, each vertex v∈V has a reflectance ρ(v), the radiosity B(v) here is the amount of light reflected by the surface patch at v, and B.sup.x(v) is the radiosity from light source set L.sup.x. In order to estimate B.sup.x(v), it is possible to first compute the radiosity B.sup.l(v) induced by a single directional light source l∈L.sup.x and then to sum up these components. Then, and as mentioned before, an iterative version of the basic radiosity algorithm is used, with:
B.sub.0.sup.l(v.sub.i)=ρ(v.sub.i)S.sup.lF(l,v.sub.i)G(l,v.sub.i)
being the light reaching the scene directly from l, and
B.sub.k+1.sup.l(v.sub.i)=ρ(v.sub.i)Σ.sub.j≠iB.sub.k.sup.l(v.sub.j)F(v.sub.i,v.sub.j)G(v.sub.i,v.sub.j) being an iterative estimate for light inter-reflections within the scene. The form factor F(v.sub.i,v.sub.j) is computed as in document “Building dense reflectance maps of indoor environments using an rgb-d camera” (M. Krawez, T. Caselitz, D. Büscher, M. Van Loock, and W. Burgard, in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 3210-3217); for a directional light source it simplifies to F(l,v.sub.i)=−cos(n(l)·n(v.sub.i)). Also, the visibility term G(v.sub.i,v.sub.j) is 1 if the line of sight between v.sub.i and v.sub.j is free, and 0 otherwise. Similarly, G(l,v.sub.i)=1 only when the ray with direction n(l), starting at v.sub.i, does not intersect with any scene geometry. For example, B.sup.l(v.sub.i)=B.sub.K.sup.l(v.sub.i), with K=10 by way of example, and B.sup.x(v.sub.i)=Σ.sub.l∈L.sup.xB.sup.l(v.sub.i). Setting S.sup.l to 1 in the above equation provides a normalized radiosity {circumflex over (B)}.sup.x.
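The iterative radiosity update of paragraph [0095] can be sketched as below for one directional light source. The text gives the inter-reflection update; here a Jacobi-style iteration that re-adds the direct term at each step is assumed, and all names are illustrative:

```python
import numpy as np

# Sketch of the iterative radiosity of paragraph [0095]: direct light B_0
# from a directional source l, then K iterations of inter-reflections.
# Setting S_l = 1 yields a normalized radiosity B-hat.
def normalized_radiosity(rho, F_l, G_l, F, G, K=10, S_l=1.0):
    direct = rho * S_l * F_l * G_l     # B_0: light reaching the scene from l
    T = F * G
    np.fill_diagonal(T, 0.0)           # exclude the j == i term
    B = direct.copy()
    for _ in range(K):                 # Jacobi-style bounce iterations (assumed)
        B = direct + rho * (T @ B)
    return B
```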
[0096] To adapt the 3D geometric model to the current outdoor illumination in real time, a simplified model is used consisting of sun, sky, and ground light components. The radiosity of each vertex v is given by B(v)=B.sup.sun(v)+B.sup.sky(v)+B.sup.gr(v). The sky and ground components are computed by first splitting the sky sphere into two disjoint sets L.sup.sky and L.sup.gr, and adding the set of bins L.sup.sun describing the sun area. Then, equation 3 can be used to compute the radiosity contributions of each of the three light sources.
[0097] Here, it is assumed that the sun, sky, and ground bins l have the same radiosities (or brightness scales) S.sup.sun, S.sup.sky, and S.sup.gr respectively, such that the term B(l) in equation 5 can be replaced by the proper brightness scale parameter S.sup.x, with x∈{sun, sky, gr} (with gr for ground). In this case, the linearity of the radiosity transport equations allows pulling out S.sup.x from the computation and to scale the vertex radiosities subsequently, i.e. B.sup.x(v)={circumflex over (B)}.sup.x(v)S.sup.x. {circumflex over (B)}.sup.x is called a normalized radiosity.
[0098] For the sky and the ground, it is possible to pre-compute the normalized radiosities to speed up the scene re-lighting. Given {circumflex over (B)}.sup.sky and {circumflex over (B)}.sup.gr for each vertex, these normalized radiosities will then be multiplied by the corresponding brightness scale parameters in near real time, around the instant at which it is desired to simulate the illumination.
[0099] A second phase P2 can be carried out. By way of example this second phase can be carried out every 5 minutes, or any duration of the order of 5 minutes, as the sun moves slowly throughout the day.
[0100] In step S007 of the second phase, the position of the sun is computed using the scene geolocation (i.e. latitude and longitude), the orientation of the scene, and the time (i.e. yyyy.mm.dd.hh.mm.ss). The position of the sun can be expressed as a direction vector to reflect the sun direction relative to the scene.
[0101] In order to simulate the sun illumination correctly, it is required to know the sun position relative to the scene at any given time. This can be calculated analytically by an astronomical model.
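A much simplified astronomical sketch (declination plus hour angle) illustrates the kind of computation involved. This rough approximation is for illustration only; a production system would use a full astronomical model as the text indicates, and all names here are hypothetical:

```python
import math

# Simplified sun-position sketch: solar declination from the day of year,
# hour angle from the solar hour, then elevation and azimuth at a latitude.
def sun_elevation_azimuth(latitude_deg, day_of_year, solar_hour):
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = map(math.radians, (latitude_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(d)
                + math.cos(lat) * math.cos(d) * math.cos(h))
    elev = math.degrees(math.asin(sin_elev))
    # Approximate azimuth measured from north, clockwise.
    cos_az = (math.sin(d) - math.sin(lat) * sin_elev) / (
        math.cos(lat) * math.cos(math.radians(elev)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if solar_hour > 12.0:
        az = 360.0 - az
    return elev, az
```

At the equator around an equinox at solar noon, this sketch places the sun nearly at the zenith, as expected.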
[0102] Using the sun position (for example expressed as a direction), it is possible to obtain the normalized radiosity map ({circumflex over (B)}.sup.sun) for the contribution of the sun to the outdoor illumination in step S008. This step also uses the reflectance map, the visibility terms G, and the 3D geometric model of the scene.
[0103] Unlike the sky and ground contributions, the sun changes its relative position to the scene depending on time, therefore a radiosity pre-computation carried out in the first phase is not possible. {circumflex over (B)}.sup.sun should therefore be computed for every new sun position. Given that the sun area is much smaller than that of sky or ground, {circumflex over (B)}.sup.sun is obtained within a couple of minutes using GPU acceleration. Even though this update is not real-time, the sun position varies only very little in the course of minutes, and the steps of the second phase can be performed every couple of minutes, typically every five minutes.
[0104] The final and third phase P3 can be carried out in near real-time, that is to say within a duration, from the instant (t) at which it is desired to simulate the illumination, which is shorter than the frame period of the camera.
[0105] This phase comprises step S009 in which a normalized image Î.sub.t.sup.sun of the scene for the contribution of the sun to the outdoor illumination of the scene is rendered on the basis of {circumflex over (B)}.sup.sun. This step can use an initial pose estimate T.sub.t.sup.≈. This pose is obtained because a frame-to-frame tracking method is used here. Thus, the change of camera pose is computed by matching an input image at instant t to the image of instant t−1. This tracking process is described in more detail in document “Camera Tracking in Lighting Adaptable Maps of Indoor Environments” (T. Caselitz, M. Krawez, J. Sundram, M. Van Loock and W. Burgard, 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020, pp. 3334-3340). Also, the tracking may be done using the method described in the previously mentioned European Patent Application.
[0106] Step S010 comprises analogous steps for {circumflex over (B)}.sup.sky and {circumflex over (B)}.sup.gr in which normalized images are rendered (respectively Î.sub.t.sup.sky and Î.sub.t.sup.gr).
[0107] The normalized images can be combined in step S011.
[0108] Also, in the third phase, an image I.sub.C of the scene is acquired with the camera in step S012.
[0109] The subsequent steps will describe how the brightness scale parameters S.sup.sun, S.sup.sky, and S.sup.gr are estimated from the normalized images and the acquired image.
[0110] This will be done by adjusting brightness scale parameter values to fit the acquired image.
[0111] In step S013, a subset Ω of the pixels of the image I.sub.C is selected so as to represent the image areas with the most information about illumination and containing the least amount of noise/errors.
[0112] In fact, it has been observed that because T.sub.t.sup.≈ can differ from the actual pose of the camera used to capture I.sub.C, the reference image I.sub.C.sup.ref which will be obtained using the normalized images and the brightness scale parameters can differ from I.sub.C. In practice, image noise, errors in the reconstructed scene geometry and reflectance maps, and the simplified outdoor illumination model (i.e. the sphere) prevent this ideal matching. Hence, in some embodiments, Ω is used.
[0114] Also, a weight W.sup.spl can be used to represent the reliability of Ω for brightness scale estimation. This weight can either be set to 1, or be computed to reflect whether the sampled points are representative for the light distribution in the scene, and how much noise can be expected in these samples.
[0115] The brightness scale estimation is carried out in step S014. For a single frame I.sub.C and the samples u∈Ω the following error Err is minimized for the three brightness scale parameters S.sup.sun, S.sup.sky, and S.sup.gr:
Err=Σ.sub.u∈Ω[I.sub.C(u)−CRF(e.sub.tI.sub.B(u))].sup.2
With I.sub.B(u)=Σ.sub.x∈{sun,sky,gr}S.sup.xÎ.sup.x(u), CRF( ) being the camera response function, and e.sub.t the exposure time of I.sub.C. The brightness scale parameters are obtained by minimizing Err over S.sup.x. This minimization can be performed by a least squares solver.
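The brightness-scale estimation of step S014 can be sketched as below. For illustration the camera response function is assumed linear (CRF(x) = x), which reduces the minimization to a plain linear least-squares fit over the sampled pixels; all names are illustrative:

```python
import numpy as np

# Sketch of step S014 with a linear CRF assumed: solve for the three
# brightness scale parameters S_sun, S_sky, S_gr by least squares over
# the sampled pixel values of the three normalized images.
def estimate_brightness_scales(I_C, I_sun, I_sky, I_gr, exposure_time=1.0):
    # Each column holds one normalized image evaluated at the samples u.
    A = exposure_time * np.stack([I_sun, I_sky, I_gr], axis=1)
    scales, *_ = np.linalg.lstsq(A, I_C, rcond=None)
    return scales  # [S_sun, S_sky, S_gr]
```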
[0116] As an alternative to step S014, step S015 can be performed in which several frames are used.
[0117] It has been observed that a single frame can still be error-prone, and that in some embodiments, S.sup.sun, S.sup.sky, and S.sup.gr are averaged over multiple frames using W.sup.spl and Err as weights. In order to cope with dynamic illumination (i.e. clouds passing in front of the sun), it is proposed to use a sliding window approach, where frames behind a time horizon are discarded. It is further proposed to filter out outliers which deviate too much from previous frames. In step S016, the brightness scale parameters are thus obtained over the plurality of frames.
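The sliding-window smoothing of step S015 can be sketched as below: estimates behind the time horizon are discarded, outliers deviating too much from the running weighted mean are rejected, and the remaining frames are averaged with their weights. Class name and thresholds are illustrative:

```python
from collections import deque

# Sketch of step S015: sliding-window, weighted, outlier-rejecting
# average of per-frame brightness scale estimates.
class ScaleFilter:
    def __init__(self, horizon=30.0, outlier_ratio=0.5):
        self.window = deque()          # entries (time, scale, weight)
        self.horizon = horizon
        self.outlier_ratio = outlier_ratio

    def update(self, t, scale, weight=1.0):
        # Discard frames behind the time horizon.
        while self.window and t - self.window[0][0] > self.horizon:
            self.window.popleft()
        mean = self.value()
        # Reject outliers deviating too much from previous frames.
        if mean is None or abs(scale - mean) <= self.outlier_ratio * mean:
            self.window.append((t, scale, weight))
        return self.value()

    def value(self):
        if not self.window:
            return None
        wsum = sum(w for _, _, w in self.window)
        return sum(s * w for _, s, w in self.window) / wsum
```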
[0118] Consequently, the simulated image I.sub.C.sup.ref can be rendered in step S017 using the rendered normalized images and the brightness scale parameters S.sup.sun, S.sup.sky, and S.sup.gr.
[0119] Tracking of the camera position can then be performed in step S018 by comparing I.sub.C.sup.ref and I.sub.C. As explained above, tracking is performed using methods known to the person skilled in the art.
[0120] The points L.sup.sky are aligned in the upper hemisphere while the points L.sup.gr are aligned in the lower hemisphere. These points are spaced regularly around the sphere.
[0121] Also, on this figure, for each point l.sub.i, the direction n(l.sub.i) is represented by an arrow.
[0122] It should be noted that for the sun, a limited number of points are used, placed where the sun appears to be when seen from the scene (points L.sup.sun). The points of L.sup.sun are more densely spaced than the points of L.sup.sky and L.sup.gr.
[0123] In order to recover the reflectance in either step S003 or step S004, a radiosity value is assigned to each point l.sub.i by projecting a portion of the image onto it, as explained above.
[0128] Throughout the description, including the claims, the term “comprising a” should be understood as being synonymous with “comprising at least one” unless otherwise stated. In addition, any range set forth in the description, including the claims should be understood as including its end value(s) unless otherwise stated. Specific values for described elements should be understood to be within accepted manufacturing or industry tolerances known to one of skill in the art, and any use of the terms “substantially” and/or “approximately” and/or “generally” should be understood to mean falling within such accepted tolerances.
[0129] Although the present disclosure herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present disclosure.
[0130] It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims.