Method for installation of a ceiling system

12404684 · 2025-09-02

Assignee

Inventors

CPC classification

International classification

Abstract

A method is provided for installing a ceiling system attached to a structural ceiling surface. The method includes illuminating, in sequence and by a projector, different subsets of a set of optical sensors separately arranged in a known distribution pattern on the structural ceiling surface. Each optical sensor of the set of optical sensors is arranged to output a temporal signal indicative of a level of illumination. The method also includes determining an image plane relative to the projector based on the known distribution pattern of the set of optical sensors and the outputted temporal signals of each of the optical sensors, and displaying, by the projector, an image indicative of installation locations of the ceiling system at the image plane.

Claims

1. A method for installation of a ceiling system attached to a structural ceiling surface, the method comprising: illuminating, in sequence and by a projector, different subsets of a set of optical sensors separately arranged in a known distribution pattern on the structural ceiling surface, wherein each optical sensor of the set of optical sensors is arranged to output a temporal signal indicative of a level of illumination, determining an image plane relative to the projector based on the known distribution pattern of the set of optical sensors and the outputted temporal signals of each of the optical sensors, and displaying, by the projector, an image indicative of installation locations of the ceiling system at the image plane.

2. The method according to claim 1, wherein the ceiling system comprises a grid of profiles comprising a plurality of primary profiles, and wherein the displaying the image comprises displaying information indicative of installation locations of the primary profiles at the image plane.

3. The method according to claim 1, wherein the determining the image plane comprises determining the image plane such that the image plane includes physical locations of the optical sensors of the set of optical sensors.

4. The method according to claim 1, wherein the displaying the image comprises displaying the image on the structural ceiling surface.

5. The method according to claim 4, wherein the displaying the image comprises calculating a projective transform based on a location of the projector relative to the image plane, and applying the projective transform to the image such that the displayed image is compensated for size and warp with respect to the structural ceiling surface.

6. The method according to claim 1, wherein the determining the image plane comprises determining a rotation of the projector relative to a desired direction of the installation locations of the ceiling system at the image plane based on the known distribution pattern of the set of optical sensors and the outputted temporal signals of each of the optical sensors, and wherein the displaying the image comprises compensating a rotation of the image based on the determined rotation.

7. The method according to claim 1, wherein the displaying the image comprises adjusting a position of the image at the image plane based on building information model (BIM) data.

8. The method according to claim 1, wherein the displaying the image comprises adjusting a position of the image at the image plane based on a user-initiated input signal.

9. The method according to claim 2, wherein the optical sensors of the set of optical sensors are arranged at locations corresponding to junctions of the grid of profiles.

10. The method according to claim 1, wherein the optical sensors of the set of optical sensors are arranged on a common substrate which is adapted to be arranged at the structural ceiling surface.

11. The method according to claim 1, wherein the displaying the image comprises displaying installation guidance based on building information model (BIM) data.

12. The method according to claim 11, wherein the displaying the image comprises displaying fix points of the ceiling system to be installed.

13. The method according to claim 11, wherein the displaying the image comprises displaying forbidden areas for fix points of the ceiling system to be installed.

14. The method according to claim 11, wherein the displaying the image comprises displaying fix points of primary profiles to be installed.

15. The method according to claim 11, wherein the displaying the image comprises displaying forbidden areas for fix points of primary profiles to be installed.

16. The method according to claim 1, wherein the displaying the image further comprises displaying building information model (BIM) data.

17. The method according to claim 1, wherein the illuminating different subsets of the set of optical sensors comprises: illuminating, in sequence, different subsets of the set of optical sensors according to a first predetermined dichotomy pattern along a first major direction, and illuminating, in sequence, different subsets of the set of optical sensors according to a second predetermined dichotomy pattern along a second major direction, the first major direction being perpendicular to the second major direction.

18. The method according to claim 1, further comprising: arranging a second projector, separate from the projector, at a distance from the structural ceiling surface, and displaying, by the projector and the second projector, the image indicative of installation locations of the ceiling system at the image plane.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The aspects of the inventive concept, including its particular features and advantages, will be readily understood from the following detailed description and the accompanying drawings, in which:

(2) FIG. 1 conceptually illustrates a room in which a ceiling system including a grid of profiles is to be installed using a method for installation of a ceiling system.

(3) FIG. 2 conceptually illustrates the room of FIG. 1 with a different sensor setup as compared to FIG. 1.

(4) FIGS. 3a-3i schematically illustrate an illumination sequence and temporal signals from two different sensors.

(5) FIG. 4 conceptually illustrates the room of FIG. 1 at a later stage of the method where installation locations of primary profiles of the grid of profiles are displayed at a structural ceiling surface.

(6) FIG. 5 conceptually illustrates the room of FIG. 1 while displaying additional information on a structural ceiling surface.

(7) FIG. 6 conceptually illustrates the room of FIG. 1 while using a second projector in addition to the projector.

(8) FIG. 7 is a flowchart of a method for installation of a ceiling system attached to a structural ceiling surface.

DETAILED DESCRIPTION

(9) The present inventive concept will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred variants of the inventive concept are shown and discussed. This inventive concept may, however, be embodied in many different forms and should not be construed as limited to the variants set forth herein; rather, these variants are provided for thoroughness and completeness, and fully convey the scope of the inventive concept to the skilled person. Like reference numerals refer to like elements throughout the description.

(10) As has been described above, the present inventive concept generally relates to installation of ceiling systems attached to structural ceiling surfaces. Hence, installations of any type of ceiling system are encompassed by the present inventive concept. In the following description, the present inventive concept will be described in conjunction with ceiling systems including a grid of profiles. However, the present inventive concept is equally applicable to ceiling systems void of any grid of profiles.

(11) FIG. 1 conceptually illustrates a room 100 in which a grid of profiles is to be installed using a method for installation of a grid of profiles of a ceiling system. A number of components used for executing the method will be described below in conjunction with describing the method. The method will be described sequentially below, and variants thereof will be discussed in-line. The components used for executing the method will sometimes be collectively referred to as a system.

(12) More specifically, a grid of profiles of a ceiling system is to be attached to the structural ceiling surface 102 of the room 100. The grid of profiles to be installed comprises a plurality of primary profiles. The primary profiles typically extend in a non-interrupted manner along an installation of a ceiling system. The method described for installation of a grid of profiles of a ceiling system includes displaying an image indicative of installation locations of the primary profiles. Primary profiles are thereafter typically attached to the structural ceiling surface 102 in locations corresponding to the installation locations indicated in the displayed image.

(13) In the room 100 there is provided a projector 104. The projector 104 forms part of the system used for executing the method. The projector 104 is arranged on the floor 106 of the room 100. The projector 104 may alternatively be arranged on a wall or at the ceiling of the room 100. The projector 104 may be of any suitable kind. Examples of suitable projectors 104 include SD, HD and 4K video projectors or beamers. Projectors 104 capable of displaying colour images are preferred, while projectors 104 capable of displaying black-and-white or monochrome images will suffice. Also, a laser projector 104 may be used to advantage. The projector 104 is connected to an image source 108. The image source 108 may be any device capable of providing an image signal to the projector 104. Examples of suitable image sources include computers, laptops, tablets and media players such as an Apple TV, to give a few non-limiting examples. The image source 108 may be connected to the projector 104 by a wired connection, a wireless connection or a combination thereof. In the depicted system, the image source 108 is connected by a wired connection to the projector 104.

(14) The image source 108 includes a processor 108a capable of processing and altering image data. The processor 108a is typically capable of processing an incoming image signal in order to feed out an image signal suitable for the projector 104 in question. Alternatively, or additionally, the image source 108 may include a memory 108b having stored thereon image data used to produce an image signal suitable for the projector 104 in question. The processor 108a may alter or manipulate the image data so as to alter or manipulate the image displayed by the projector 104.

(15) The image source 108 is preferably attached to the projector 104 so as to form a single unit. The image source 108 may be integral to the projector 104. The image source 108 may be separate from the projector 104.

(16) The depicted system used for executing the method further includes a user device 110. The user device 110 may be connected to the image source 108 by a wired connection, a wireless connection or a combination thereof. The depicted user device 110 is a handheld device which is connected to the image source 108 by a wireless connection. The user device 110 may be any suitable device such as computers, laptops, media players and tablets such as an iPad, to give a few non-limiting examples. The user device 110 is arranged to transmit an image signal to the image source 108 or to transmit data relevant for the image source 108 to produce an image signal for the projector 104. The user device is typically capable of receiving different types of input from a user. The input may be input related to image adjustment, image manipulation, sensor locations, mutual distances, mutual directions, brightness, contrast, amount of image information, etc.

(17) The user device 110 and the image source 108 may be the same device.

(18) The depicted system used for executing the method further includes a set of optical sensors including four optical sensors 112a-d. The optical sensors 112a-d are arranged on the structural ceiling surface 102 of the room 100. More specifically, the optical sensors 112a-d are arranged on a common substrate 114. Hence, the optical sensors 112a-d are arranged indirectly on the structural ceiling surface 102 of the room 100. The common substrate 114 is adapted to be arranged on the structural ceiling surface 102. The common substrate 114 may for example be a board of thin material, a plate or similar. The common substrate 114 may be foldable and may for that purpose include hinges, fold lines or similar. By the common substrate 114, the optical sensors 112a-d are separately arranged in a known distribution pattern on the common substrate 114 and hence on the structural ceiling surface 102. In other words, the mutual distances and mutual directions between the respective optical sensors 112a-d are fixed by the common substrate 114 and hence the distribution pattern is known. The common substrate 114 is held at the structural ceiling surface 102 by means of a holder 116 in the form of an adjustable tripod.

(19) In FIG. 1 four optical sensors 112a-d are depicted. However, more optical sensors may be used, such as 5, 7, 10 or 50 optical sensors. By using more optical sensors, an increased accuracy may be achieved. However, in the following a system having four optical sensors 112a-d will be described.

(20) Each of the four optical sensors 112a-d is connected to the user device 110 by means of a wireless connection. Each optical sensor 112a-d of the set of optical sensors is arranged to output a temporal signal indicative of a level of illumination. In other words, each optical sensor 112a-d is arranged to output a time sequential signal which is dependent on a level of illumination impinging on the optical sensor 112a-112d in question. The temporal signal indicative of a level of illumination will be discussed in greater detail below when referring to FIG. 3.

(21) As an alternative, each optical sensor 112a-d of the set of optical sensors may be attached by a respective wired connection to a common wireless transmitter. The wireless transmitter may in turn be connected to the user device 110 by means of a wireless connection. The wireless transmitter may in this case advantageously be arranged on the common substrate 114.

(22) As a further alternative, each optical sensor 112a-d of the set of optical sensors may be attached by a respective wired connection to a common wireless transmitter. The wireless transmitter may in turn be connected to the projector 104 by means of a wireless connection. The wireless transmitter may in this case advantageously be arranged on the common substrate 114.

(23) As can be seen in FIG. 1, the projector 104 illuminates a portion 118 of the structural ceiling surface 102 of the room 100. The depicted illuminated portion 118 covers a major portion of the structural ceiling surface 102 of the room 100. As depicted in FIG. 1, the illuminated portion 118 of the structural ceiling surface 102 includes a bright area 118a with high illumination intensity and a less bright area 118b with a lower illumination intensity. The optical sensors 112a and 112c are present in the bright area 118a with high illumination intensity whereas the sensors 112b and 112d are present in the less bright area 118b with a lower illumination intensity. Hence, in the depicted moment of FIG. 1, optical sensors 112a and 112c will output a respective signal indicative of a higher level of illumination as compared to optical sensors 112b and 112d. Signals outputted by the optical sensors 112a-112d will be discussed in greater detail below when referring to FIG. 3.

(24) Now referring to FIG. 2, here room 100 is depicted with a different setup of optical sensors 212a-212d. In order to avoid undue repetition, only differences in relation to FIG. 1 will be discussed below.

(25) The optical sensors 212a-d are arranged on the structural ceiling surface 102 of the room 100. More specifically, the optical sensors 212a-d are arranged directly on the structural ceiling surface 102 of the room 100. The respective optical sensors 212a-d may for instance be glued to or screwed to the structural ceiling surface 102 of the room 100. The optical sensors 212a-d are separately arranged on the structural ceiling surface 102. In other words, the mutual distances and directions between the respective optical sensors 212a-d are fixed by the optical sensors 212a-d being arranged on the structural ceiling surface 102. However, in this case, the distribution pattern will not directly be known per se. This means that a user executing the method will have to carefully determine the mutual distances and directions between the respective optical sensors 212a-d after the optical sensors 212a-d have been arranged on the structural ceiling surface 102, or the user executing the method will have to carefully arrange the optical sensors 212a-d on the structural ceiling surface 102 according to a predetermined known distribution pattern. Hence, the distribution pattern of the optical sensors 212a-d may be known either from measurements of the actual locations or from the fact that the optical sensors 212a-d are mounted according to a known distribution pattern.

(26) The optical sensors 212a-d are connected to the projector 104 by respective wired connections. The projector 104 of FIG. 2 includes a user device 110 and an image source 108 integrally formed with the projector 104. The user device 110 and the image source 108 fulfil purposes corresponding to those described in conjunction with FIG. 1 and will not be described in greater detail here.

(27) Now referring to FIGS. 3a-3i. According to the method, different subsets of the set of optical sensors 112a-112d (or 212a-d) are illuminated in sequence by the projector 104.

(28) In FIGS. 3a-3i it is schematically depicted how two optical sensors denoted S1, S2, corresponding to any of the optical sensors 112a-112d (or 212a-d), are illuminated in sequence by the projector 104. In FIGS. 3a-3i, two sensors S1, S2 are depicted for reasons of simplicity. The same principle does, however, apply to the situation where more optical sensors are used. In FIGS. 3a-3i, different subsets of the optical sensors S1 and S2 are illuminated in sequence by an illumination sequence including six different illumination patterns as depicted in FIGS. 3a-3f. The different subsets correspond to the actual optical sensors S1, S2 being illuminated at a certain point in time.

(29) In the following, FIGS. 3a-3i will be referred to as schemes a-i, where scheme a refers to FIG. 3a, scheme b to FIG. 3b, etc. All six schemes a-f show the same area of a structural ceiling surface 102 on which the two optical sensors S1 and S2 are arranged. The respective schemes a-f show the same area of the structural ceiling surface 102 at different points in time, where scheme a corresponds to a first point in time and scheme f corresponds to a last point in time of the respective schemes. The sequential illumination of the optical sensors S1, S2 is performed according to the illumination sequence including six different illumination patterns. The six different illumination patterns will result in each of the optical sensors S1, S2 outputting a temporal signal TS1, TS2 indicative of a level of illumination at the respective schemes a-f. The temporal signals TS1 and TS2 are depicted in the respective graphs g and h in FIG. 3.

(30) The outputted temporal signals TS1 and TS2 may in practice be used to determine in which directions the optical sensors S1, S2 are arranged in relation to the projector 104. In practice, a space angle interval with respect to the projector may be determined for each optical sensor S1, S2. The space angle interval will in practice correspond to an angle interval along a first major direction and an angle interval along a second major direction. The space angle intervals of the optical sensors S1, S2 will thus correspond to a respective area A1, A2 at the structural ceiling surface 102. This is illustrated in scheme i in FIG. 3.

(31) In the following it will be described in greater detail how the respective directions to sensors S1, S2 are determined by illuminating different subsets of the sensors S1, S2 in sequence by the projector 104.

(32) Before the first illumination pattern of scheme a is employed, a dark level signal with the projector turned off is preferably recorded as a reference signal.
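The dark-level reference can later be used to convert each sensor's temporal signal into one bit per illumination pattern. A minimal sketch in Python follows; the `margin` rule and the numeric sample values are illustrative assumptions, not part of the described method:

```python
def signal_to_bits(samples, dark_level, margin=0.5):
    """Threshold a sensor's temporal signal into per-pattern bits.

    samples: one illumination reading per pattern.
    dark_level: reference reading recorded with the projector turned off.
    A sample counts as 'lit' when it exceeds the dark level by more than
    `margin` times the largest observed swing above the dark level.
    """
    swing = max(samples) - dark_level
    return [s - dark_level > margin * swing for s in samples]

# A high-low-high-high-low-high reading, mirroring temporal signal TS1
# over schemes a-f (values hypothetical):
bits = signal_to_bits([10, 2, 10, 10, 2, 10], dark_level=2)
# -> [True, False, True, True, False, True]
```

Subtracting the dark level first makes the thresholding robust against ambient background illumination, which is what the reference recording is for.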

(33) In the first illumination pattern of scheme a, sensor S1 is illuminated while sensor S2 is not. In other words, sensor S1 is at scheme a located in an area with high illumination intensity while sensor S2 is located in an area with a lower illumination intensity such as a background illumination intensity. At the point in time corresponding to scheme a, sensor S1 will output a signal indicative of a higher level of illumination as compared to sensor S2. This is illustrated in schemes g and h where it may be seen that the temporal signal TS1 of sensor S1 has a higher signal level than the temporal signal TS2 of sensor S2 at point a. In fact, the temporal signal TS2 of sensor S2 is set to zero since it corresponds to a background illumination level.

(34) From the respective temporal signals TS1, TS2 and the illumination pattern at scheme a, it may be concluded that sensor S1 is located somewhere in the area indicated by the bracket above scheme a. Correspondingly, it may be concluded that sensor S2 is located somewhere in the area indicated by the bracket below scheme a. This is because sensor S1 is in the illuminated area at the right of scheme a, while sensor S2 is in the non-illuminated area at the left of scheme a.

(35) Subsequently, the sensors S1, S2 are illuminated with a second illumination pattern of scheme b. Sensors S1 and S2 are not illuminated when the second illumination pattern of scheme b is employed. In other words, both sensor S1 and sensor S2 are located in an area with a low illumination intensity such as a background illumination intensity. At the point in time corresponding to scheme b, sensors S1 and S2 will output signals indicative of a low level of illumination. This is illustrated in schemes g and h where it may be seen that the temporal signals TS1 and TS2 both have a low signal level at point b.

(36) From the respective temporal signals TS1, TS2 and the illumination pattern at scheme b, it may now be concluded that sensor S1 is located somewhere in the area indicated by the bracket above scheme b. Correspondingly, it may be concluded that sensor S2 is located somewhere in the area indicated by the bracket below scheme b.

(37) Subsequently, sensors S1, S2 are illuminated with a third illumination pattern of scheme c. Sensors S1 and S2 are illuminated when the third illumination pattern of scheme c is employed. In other words, both sensor S1 and sensor S2 are located in an area which is illuminated by the projector 104. At the point in time corresponding to scheme c, sensors S1 and S2 will output signals indicative of a high level of illumination. This is illustrated in schemes g and h where it may be seen that the temporal signals TS1 and TS2 both have a high signal level at point c.

(38) From the respective temporal signals TS1, TS2 and the illumination pattern at scheme c, it may now be concluded that sensor S1 is located somewhere in the area indicated by the bracket above scheme c. Correspondingly, it may be concluded that sensor S2 is located somewhere in the area indicated by the bracket below scheme c.

(39) The three initial illumination patterns of schemes a-c are more specifically predetermined dichotomy patterns along a first major direction.

(40) Subsequently, a fourth illumination pattern of scheme d is employed. When employing the illumination pattern of scheme d, sensor S1 is illuminated while sensor S2 is not. In other words, sensor S1 is at scheme d located in an area with high illumination intensity while sensor S2 is located in an area with a lower illumination intensity such as a background illumination intensity. At the point in time corresponding to scheme d, sensor S1 will output a signal indicative of a higher level of illumination as compared to sensor S2. This is illustrated in schemes g and h where it may be seen that the temporal signal TS1 of sensor S1 has a higher signal level than the temporal signal TS2 of sensor S2 at point d.

(41) From the respective temporal signals TS1, TS2 and the illumination pattern at scheme d, it may be concluded that sensor S1 is located somewhere in the area indicated by the bracket to the right of scheme d. Correspondingly, it may be concluded that sensor S2 is located somewhere in the area indicated by the bracket to the left of scheme d.

(42) Subsequently, the sensors S1, S2 are illuminated with a fifth illumination pattern of scheme e. Sensors S1 and S2 are not illuminated when the fifth illumination pattern of scheme e is employed. In other words, both sensor S1 and sensor S2 are located in an area with a low illumination intensity such as a background illumination intensity. At the point in time corresponding to scheme e, sensors S1 and S2 will output signals indicative of a low level of illumination. This is illustrated in schemes g and h where it may be seen that the temporal signals TS1 and TS2 both have a low signal level at point e.

(43) From the respective temporal signals TS1, TS2 and the illumination pattern at scheme e, it may now be concluded that sensor S1 is located somewhere in the area indicated by the bracket to the right of scheme e. Correspondingly, it may be concluded that sensor S2 is located somewhere in the area indicated by the bracket to the left of scheme e.

(44) Subsequently, sensors S1, S2 are illuminated with a sixth illumination pattern of scheme f. Sensors S1 and S2 are illuminated when the sixth illumination pattern of scheme f is employed. In other words, both sensor S1 and sensor S2 are located in an area which is illuminated by the projector 104. At the point in time corresponding to scheme f, sensors S1 and S2 will output signals indicative of a high level of illumination. This is illustrated in schemes g and h where it may be seen that the temporal signals TS1 and TS2 both have a high signal level at point f.

(45) From the respective temporal signals TS1, TS2 and the illumination pattern at scheme f, it may now be concluded that sensor S1 is located somewhere in the area indicated by the bracket to the right of scheme f. Correspondingly, it may be concluded that sensor S2 is located somewhere in the area indicated by the bracket to the left of scheme f.

(46) The three final illumination patterns of schemes d-f are more specifically predetermined dichotomy patterns along a second major direction, the first major direction being perpendicular to the second major direction.

(47) It may be concluded from the temporal signal TS1 of the six above described illumination patterns that the sensor S1 is located in the area A1 of scheme i. Correspondingly, it may be concluded from the temporal signal TS2 of the six above described illumination patterns that the sensor S2 is located in the area A2 of scheme i. In practice, as described above, the respective areas A1 and A2 of scheme i correspond to the directions in which the sensors S1, S2 are arranged in relation to the projector 104. In practice, the number of illumination patterns of the above type may be increased so as to more accurately determine the locations of the sensors S1 and S2. In other words, by increasing the number of illumination patterns of the above type, the areas A1 and A2 of scheme i may be reduced in size.
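One possible realisation of the dichotomy decoding described above is a plain binary stripe code, where pattern k along an axis lights stripes at scale 2^(k+1) and a sensor's bit sequence directly indexes the sub-interval containing it. The following sketch assumes ideally thresholded readings; the function names are illustrative, not from the source:

```python
def decode_bin(bits):
    """Decode one axis of a binary dichotomy sequence.

    bits[k] is True when the sensor saw light during the k-th pattern
    along this axis. Returns (index, n_bins): the sensor lies in
    sub-interval `index` out of n_bins equal bins along the axis.
    """
    index = 0
    for b in bits:
        index = index * 2 + (1 if b else 0)
    return index, 2 ** len(bits)

def bin_interval(index, n_bins, lo=0.0, hi=1.0):
    """Convert a bin index into the normalised angular interval."""
    width = (hi - lo) / n_bins
    return lo + index * width, lo + (index + 1) * width

# Three patterns per axis, as in schemes a-c: the readings
# lit/dark/lit place the sensor in bin 5 of 8 along that axis.
index, n_bins = decode_bin([True, False, True])
```

With three patterns per major direction, as in schemes a-f, each axis is resolved into 2³ = 8 intervals; each added pattern halves the interval, which is why the areas A1 and A2 shrink as the number of patterns grows.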

(48) As an alternative to predetermined dichotomy patterns, a relatively narrow light cone may scan the structural ceiling surface 102 on which the optical sensors S1 and S2 are attached. It may then be recorded for what space angle the light cone impinges on the respective sensors S1 and S2.

(49) As an alternative to predetermined dichotomy patterns, a laser beam may scan the structural ceiling surface 102 on which the optical sensors S1 and S2 are attached. It may then be recorded for what directions the laser beam impinges on the respective sensors S1 and S2.

(50) Other strategies may also be used to advantage for locating the optical sensors S1 and S2 attached to the structural ceiling surface 102.

(51) Now referring back also to FIG. 1 and FIG. 2, as previously indicated, sensor S1 and sensor S2 may correspond to any of the optical sensors 112a-112d of FIG. 1 or the optical sensors 212a-d of FIG. 2. Hence, in a manner corresponding to what has been described above in conjunction with FIGS. 3a-3i, it may be concluded how the respective optical sensors 112a-112d (or 212a-d) are located relative to the projector 104 in terms of the directions in which the sensors 112a-112d (or 212a-d) are arranged in relation to the projector 104.

(52) Based on the determined directions of the respective sensors 112a-112d (or 212a-d) and the known distribution pattern, an image plane relative to the projector may now be determined by mathematical calculations. Needless to say, the directions of the respective sensors 112a-112d (or 212a-d) need not be determined explicitly. Rather, an image plane relative to the projector 104 may be determined based on the known distribution pattern of the set of optical sensors 112a-112d (or 212a-d) and the outputted temporal signals TS1, TS2 of each of the optical sensors 112a-112d (or 212a-d).

(53) The image plane relative to the projector 104 may be determined using homography algorithms as is known in the art. The main purpose of determining the image plane relative to the projector 104 is to correct the image projected by the projector 104 to fit the plane of the structural ceiling surface 102 or to at least compensate the image projected by the projector 104 to become parallel with the plane of the structural ceiling surface 102.

(54) The image plane relative to the projector 104 may be determined by inverting the homography matrix. In the subject case, the homography matrix includes elements relating to positions of at least four optical sensors 112a-112d (or 212a-d).

(55) For instance, the optical sensors 112a-112d may, as depicted in FIG. 1, be positioned at equivalent distances around a centre point. Hence, in FIG. 1 the intersection of the adjustable holder 116 and the common substrate 114 may correspond to the centre point. The centre point may for reasons of simplicity be given the coordinates (0, 0), i.e. the centre point may be considered to be located at the origin of a coordinate system having an X-axis and a Y-axis. The respective optical sensors 112a-112d may for reasons of simplicity be given the coordinates (1, 1), (−1, 1), (−1, −1) and (1, −1). That is, each optical sensor may be considered located one distance or length unit away from the centre point in each one of the X-direction and the Y-direction. All optical sensors 112a-112d depicted in FIG. 1 are located in different directions with respect to the centre point, which is reflected in the signs of the respective coordinates of the optical sensors 112a-112d.

(56) Hence, using the outputted temporal signals TS1, TS2 of each of the optical sensors 112a-112d, the position of each of the optical sensors 112a-112d in the view of the projector 104 may be determined. The positions of each of the optical sensors 112a-112d in the view of the projector may be denoted by the coordinates (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4).

(57) Now consider the eight points (1, 1), (−1, 1), (−1, −1), (1, −1), (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4), which may be included in a 3×3 homography matrix.

(58) The homography matrix including the eight points may now be inverted in order to determine the image plane relative to the projector 104.

(59) More specifically, the inverted homography matrix will result in a transform matrix or projective transform which, when applied, e.g. to an image, will place the point (X.sub.1, Y.sub.1) in the point (1, 1), the point (X.sub.2, Y.sub.2) in the point (-1, 1), the point (X.sub.3, Y.sub.3) in the point (-1, -1) and the point (X.sub.4, Y.sub.4) in the point (1, -1). The projective transform will consequently include translation, rotation and scale.
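The effect of applying such a projective transform to points can be illustrated with a minimal NumPy sketch; the matrix below is an arbitrary example (a pure translation), not one derived from the figures:

```python
import numpy as np

# Illustrative sketch: applying a 3x3 projective transform to 2D points in
# homogeneous coordinates. The matrix H is a made-up example (translation
# by (2, 3)); in general H also carries rotation, scale and perspective.

def apply_homography(H, pts):
    """Map Nx2 points through H and dehomogenise."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # append w = 1
    mapped = pts_h @ H.T                              # row-vector convention
    return mapped[:, :2] / mapped[:, 2:3]             # divide by w

H = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0]])
# Maps (0, 0) -> (2, 3) and (1, 1) -> (3, 4).
print(apply_homography(H, np.array([[0.0, 0.0], [1.0, 1.0]])))
```

An equivalent operation on a full image, rather than individual points, is what a routine such as OpenCV's warpPerspective performs.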

(60) The inversion of the homography matrix may for instance be calculated using the tool OpenCV, which is a programming library mainly aimed at computer vision.

(61) Since a projective transform has 8 degrees of freedom, the locations of at least four optical sensors 112a-112d (or 212a-d) need to be determined in relation to the projector 104. The projective transform has 8 degrees of freedom for 9 coefficients (3×3 matrix). However, because such transforms are defined up to a scaling factor, which removes one degree of freedom, one of the coefficients can be arbitrarily set to 1.
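The eight-degrees-of-freedom estimate can be sketched directly: four point correspondences give eight linear equations once the coefficient h33 is fixed to 1. Production code would rather use a robust library routine such as OpenCV's findHomography; this minimal direct linear transform is for illustration only:

```python
import numpy as np

# Minimal direct linear transform: solve for the 8 free homography
# coefficients from 4 point correspondences, with h33 fixed to 1.

def homography_from_4_points(src, dst):
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11 x + h12 y + h13) / (h31 x + h32 y + 1), likewise for v.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b += [u, v]
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # h33 set to 1

# Sanity check: a unit square shifted by (1, 1) should give a translation.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(1, 1), (2, 1), (2, 2), (1, 2)]
H = homography_from_4_points(src, dst)
print(np.round(H, 6))
```

With the four known sensor coordinates as one point set and the measured projector-view coordinates as the other, the same routine yields the homography discussed above, whose inverse is the projective transform applied to the displayed image.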

(62) After having determined the image plane relative to the projector 104, the projector may display an image indicative of installation locations 120 of primary profiles at the image plane, as is shown in FIG. 4.

(63) More specifically, the user device 110 may determine the image plane relative to the projector 104 based on the known distribution pattern of the set of optical sensors 112a-112d (or 212a-d) and the outputted temporal signals TS1, TS2 of each of the optical sensors 112a-112d (or 212a-d). This may typically be determined by mathematical calculations performed in the user device 110, e.g. according to the above example. The user device 110 may thus transmit a signal to the image source 108. The signal may be constituted such that the image displayed by the projector 104, based on the image signal from the image source 108, is displayed at the determined image plane.

(64) In FIG. 4, it is illustrated how an image 122 is displayed at the structural ceiling surface 102 of the room 100. Hence, the structural ceiling surface 102 corresponds to the determined image plane. The image 122 includes installation locations 120 of primary profiles of the grid of profiles. In FIG. 4 it is depicted how installation locations 120 of five primary profiles are indicated as straight parallel lines displayed on the structural ceiling surface 102 of the room 100.

(65) The optical sensors 112a-d have been removed in FIG. 4 but may very well be present at the time of displaying the image 122.

(66) The displayed image 122 comprises displayed fix points 124 of the primary profiles to be installed. The fix points 124 may be indicated with image features such as dots or crosses at locations where it is appropriate to fix the primary profiles to be installed. The displayed fix points 124 may consequently be displayed at certain distances where a firm installation of the primary profiles will be achieved while not using an excessive number of fix points 124.
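The spacing logic for such fix points could, for example, be sketched as follows; the maximum spacing value is an illustrative assumption, not a figure taken from the text:

```python
import math

# Hypothetical helper: spacing fix points evenly along one primary-profile
# line so that no interval exceeds a chosen maximum. The 1.2 m maximum
# spacing is an illustrative assumption, not a value from the patent.

def fix_points(length_m, max_spacing_m=1.2):
    """Return fix-point positions (m) measured from one end of the profile."""
    spans = max(1, math.ceil(length_m / max_spacing_m))
    return [round(i * length_m / spans, 3) for i in range(spans + 1)]

print(fix_points(4.0))  # -> [0.0, 1.0, 2.0, 3.0, 4.0]
```

A 4.0 m profile thus gets five evenly spaced fix points at 1.0 m intervals, firm enough without an excessive number of fixings.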

(67) Hence, in the situation depicted in FIG. 4 the image plane has been determined such that the image plane substantially includes physical locations of the optical sensors 112a-112d (or 212a-d) of the set of optical sensors. Moreover, the image 122 displayed by the projector 104 is displayed on the structural ceiling surface 102. For this reason, a projective transform has been calculated based on a location of the projector 104 relative to the image plane. The projective transform may advantageously be calculated by the user device 110. The projective transform has been applied to the displayed image 122 depicted in FIG. 4. By applying the projective transform, the displayed image 122 may be compensated for size and warp with respect to the image plane and consequently to the structural ceiling surface 102 of the room 100. In other words, the projective transform may be applied such that the image 122 is correctly displayed on the structural ceiling surface 102, i.e. the displayed image 122 indicative of installation locations 120 of the primary profiles corresponds to actual and desired installation locations of the primary profiles to be installed.

(68) More specifically, the projective transform may be determined by inverting the homography matrix as described above.

(69) Also, the rotation of the projector 104 relative to a desired direction of the installation locations 120 of primary profiles at the image plane may be determined. The rotation may be determined by mathematical calculations based on the known distribution pattern of the set of optical sensors 112a-112d (or 212a-d) and the outputted temporal signals TS1, TS2 of each of the optical sensors 112a-112d (or 212a-d). The rotation may be calculated by the user device 110 and may consequently be transmitted to the image source 108 such that the rotation of the displayed image 122 is compensated based on the determined rotation.

(70) More specifically, the rotation of the projector 104 relative to a desired direction of the installation locations 120 of primary profiles at the image plane may be determined from the projective transform resulting from inversion of the homography matrix as described above.
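Reading a rotation angle out of the projective transform can be sketched as below. The formula is exact for a similarity transform (rotation plus uniform scale and translation); a full homography with perspective terms would need a proper decomposition instead:

```python
import math

# Sketch: extracting a rotation angle from the upper-left 2x2 block of a
# homogeneous 3x3 transform. Valid for rotation + uniform scale; a general
# projective transform requires a full decomposition.

def rotation_deg(H):
    return math.degrees(math.atan2(H[1][0], H[0][0]))

# A pure 30-degree rotation as a homogeneous 3x3 matrix:
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
H = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
print(round(rotation_deg(H), 1))  # -> 30.0
```

The negative of this angle would then be applied to the displayed image to compensate the projector's rotation.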

(71) The position of the displayed image 122 of FIG. 4 may be adjusted based on building information model data, BIM data. By adjusting the position of the displayed image 122 based on BIM data, the image 122 including the installation locations 120 of primary profiles at the image plane, i.e. at the structural ceiling surface 102, may be adjusted sideways such that the installation locations 120 of primary profiles at the image plane are aligned with certain building features, such as walls or air outlets. Correspondingly, the position of the image 122 may be adjusted such that installation locations 120 of the primary profiles are positioned at locations where there is a reduced risk of interfering with or damaging concealed features of the building, such as piping or electrical cables. In order to achieve this, the user device 110 may include BIM data stored therein, or the user device 110 may receive BIM data from a remote resource such as a server or a cloud service. The user device 110 may thus, based on the BIM data, adjust the signal transmitted to the image source 108 such that the position of the displayed image 122 is adjusted sideways as desired.
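Such a sideways adjustment can be sketched as composing a translation with the projective transform before it is applied; the 0.3 m offset below is an assumption standing in for a value read from BIM data:

```python
import numpy as np

# Illustrative sketch: shifting the whole displayed layout sideways by
# composing a translation (e.g. an offset derived from BIM data) with the
# projective transform. The 0.3 m offset is an assumed example value.

def shift_transform(H, dx, dy):
    """Pre-multiply H with a translation so the image moves by (dx, dy)."""
    T = np.array([[1.0, 0.0, dx],
                  [0.0, 1.0, dy],
                  [0.0, 0.0, 1.0]])
    return T @ H

H = np.eye(3)                       # placeholder for the computed transform
H2 = shift_transform(H, 0.3, 0.0)   # nudge the layout 0.3 units sideways
print(H2[0, 2], H2[1, 2])           # -> 0.3 0.0
```

The same composition serves for a user-initiated nudge, with (dx, dy) taken from the input on the user device instead of from BIM data.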

(72) The position of the displayed image 122 of FIG. 4 may be adjusted based on a user-initiated input signal. Hence, a user may typically adjust the position of the displayed image 122 by an input on the user device 110. The user device 110 may in response to the user input adjust the signal transmitted to the image source 108 such that the position of the displayed image 122 including the installation locations 120 of primary profiles at the image plane, i.e. at the structural ceiling surface 102, is adjusted sideways as desired. By doing this, a user may adjust the position of the image 122 such that installation locations 120 of primary profiles at the image plane are aligned with certain building features, such as walls or air outlets. Correspondingly, the position of the image 122 may be adjusted such that installation locations 120 of the primary profiles are positioned at locations where there is a reduced risk of interfering with or damaging concealed features of the building, such as piping or electrical cables.

(73) Correspondingly, a rotation of the displayed image 122 of FIG. 4 may be adjusted based on a user-initiated input signal.

(74) The optical sensors 112a-112d (or 212a-d) may be arranged at locations corresponding to junctions of the grid of profiles to be installed. By this, a user may get an instant and visual confirmation that the image 122 is displayed correctly, i.e. a user may verify that the sensors 112a-112d (or 212a-d) actually are aligned at locations corresponding to junctions of the grid of profiles in the displayed image 122.

(75) Now referring to FIG. 5, here room 100 is depicted with a different image 222 displayed at the structural ceiling surface 102. In order to avoid undue repetition, only differences in relation to FIG. 4 will be discussed below.

(76) As can be seen in FIG. 5, installation locations 220 of primary profiles to be installed are shown in the image 222. The installation locations 220 of the primary profiles to be installed are shown as straight lines extending at oblique angles with respect to each other. Hence, the installation locations 220 of the primary profiles to be installed are shown as non-parallel lines in the image 222.

(77) The image 222 includes more information as compared to the image 122 of FIG. 4. The depicted image 222 of FIG. 5 includes installation guidance 126 based on BIM data. The installation guidance 126 is thus displayed as a part of the image 222. The included installation guidance 126 is based on BIM data in the sense that the displayed installation guidance 126 is displayed in a manner where features of the building are taken into account. The displayed installation guidance 126 may for instance take concealed features of the building into account. Not yet installed features of the building may be taken into account such that sufficient space is left for the features to be installed later on. The displayed installation guidance 126 may thus be positioned in areas which are suitable for installation of the primary profiles to be installed.

(78) The displayed image 122 may include fix points 128 of primary profiles to be installed. The fix points 128 may thus be displayed at suitable locations where there is a reduced risk of interfering with or damaging concealed features of the building. The fix points 128 may be indicated with image features such as dots or crosses at locations where it is appropriate to fix the primary profiles to be installed. The displayed fix points 128 may consequently be displayed at certain distances where a firm installation of the primary profiles will be achieved while not using an excessive number of fix points 128. The displayed fix points 128 are an example of displayed installation guidance 126.

(79) The displayed image 122 may include forbidden areas 130 for fix points 128 of primary profiles to be installed. The forbidden areas 130 may be displayed in areas where fix points 128 would risk interfering with or damaging concealed features of the building, such as piping or electrical cables. The forbidden areas 130 may be displayed in areas where fix points 128 would risk interfering with features of the building that have not yet been installed. The displayed forbidden areas 130 are an example of displayed installation guidance 126.

(80) The displayed image 122 may include building information model data, BIM data 132. The displayed image 122 may consequently visualise concealed features 132 of the building, such as piping or electrical cables. The displayed image 122 may consequently visualise features 132 of the building that have not yet been installed. The image 122 may as a few non-limiting examples include visual representations 132 of cables, pipes, wires, air ducts, air outlets, lighting appliances, heaters, smoke detectors, fans, wi-fi access points, sprinklers, interior walls and windows.

(81) The displayed installation guidance 126 may as a further example include a movie clip 134 showing an installation scheme for how a typical primary profile is installed at the structural ceiling 102 of the room 100. The movie clip 134 may include indications on where to fix the primary profiles to be installed. That is, the movie clip 134 may include information regarding fix points 128 of the primary profiles to be installed. The movie clip 134 may also include information regarding suitable ways of drilling in the structural ceiling 102, as well as which type of fasteners to suitably use for the structural ceiling 102 at hand. Hence, different drilling techniques and fasteners may be displayed based on the material of the structural ceiling 102 at hand. Information regarding the material of the structural ceiling 102 at hand may for instance be gathered by the user device 110 from a server having BIM data stored thereon.

(82) Now referring to FIG. 6, here room 100 is depicted with two different projectors 104, 204 used to display an image 322 at the structural ceiling surface 102. In order to avoid undue repetition, only differences in relation to FIG. 4 will be discussed below.

(83) As is illustrated in FIG. 6, a second projector 204 is arranged on the floor 106 of the room 100 in addition to the projector 104. The second projector 204 is, like the projector 104, arranged at a distance from the structural ceiling surface 102. The image 322 is displayed at the image plane by the projector 104 and the second projector 204. The image 322 is jointly formed by the projector 104 and the second projector 204. The image source 108 connected to the projector 104 and the image source 208 connected to the second projector 204 are both connected to the user device 110. The user device 110 may thus distribute image data to the image source 108 and the image source 208 such that the image 322 is jointly formed by the projector 104 and the second projector 204. Before the image 322 is displayed by the projector 104 and the second projector 204, a common image plane is determined for each one of the projector 104 and the second projector 204 as described above.

(84) In the depicted setup of FIG. 6, the image plane coincides with the structural ceiling surface 102. Hence, the image 322 is displayed at the structural ceiling surface 102. The image 322 includes installation locations 120 of primary profiles. The installation locations 120 are depicted as straight lines extending in parallel. However, different layouts of the installation locations 120 may be used to advantage when using a second projector 204 in addition to the projector 104. Also, information of the kind described in conjunction with FIG. 5 may be displayed in image 322. By utilising the projector 104 and the second projector 204 a larger installation area may be covered by the image 322 at the same time. Moreover, an increased redundancy may be achieved in case of failure of the projector 104 or the second projector 204. Furthermore, the risk of certain areas of the structural ceiling surface 102 being shadowed, by for instance pillars, light armatures, inner walls or similar, may also be reduced.

(85) Now referring to FIG. 7 in addition to FIGS. 1-6. In FIG. 7 is shown a flow scheme of a method 700 which may be used for installation of a grid of profiles of a ceiling system attached to a structural ceiling surface 102, as has been described above in conjunction with FIGS. 1-6. The method 700 comprises: illuminating 702, in sequence and by a projector 104, different subsets of a set of optical sensors 112a-d, 212a-d, S1, S2 separately arranged in a known distribution pattern on the structural ceiling surface 102, wherein each optical sensor 112a-d, 212a-d, S1, S2 of the set of optical sensors is arranged to output a temporal signal TS1, TS2 indicative of a level of illumination; determining 704 an image plane relative to the projector 104 based on the known distribution pattern of the set of optical sensors 112a-d, 212a-d, S1, S2 and the outputted temporal signals TS1, TS2 of each of the optical sensors 112a-d, 212a-d, S1, S2; and displaying 706, by the projector 104, an image 122, 222, 322 indicative of installation locations 120, 220 of the primary profiles at the image plane.
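The three steps of method 700 can be outlined as plain functions; the bodies below are placeholders, since the patent specifies what each step achieves rather than a concrete implementation:

```python
# High-level sketch of method 700's steps 702, 704 and 706 as functions.
# Bodies are illustrative placeholders, not an implementation from the text.

def illuminate_subsets(projector, sensors):          # step 702
    """Illuminate subsets in sequence; collect each sensor's temporal signal."""
    return {s: "temporal-signal" for s in sensors}

def determine_image_plane(pattern, signals):         # step 704
    """Determine the image plane from the known pattern and the signals."""
    return "image-plane"

def display_installation_image(projector, plane):    # step 706
    """Display installation locations at the determined image plane."""
    return f"image@{plane}"

signals = illuminate_subsets("projector-104", ["S1", "S2"])
plane = determine_image_plane("known-pattern", signals)
print(display_installation_image("projector-104", plane))  # -> image@image-plane
```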

(86) As is understood, the respective features and elements described in conjunction with the respective FIGS. 1-7 may be combined or interchanged to suit specific installation needs, when installing a grid of profiles for a ceiling system.

(87) Moreover, it will be appreciated that the concept of displaying an image indicative of installation locations at an image plane may advantageously be used in the course of installing other entities than primary profiles of a grid of profiles for a ceiling system.

(88) For instance, installation locations of ceiling tiles installed independently, i.e. without a grid of profiles, may be displayed to advantage at a determined image plane.

(89) Moreover, installation locations of sound absorbing baffles, such as ceiling or wall mounted baffles, may advantageously be displayed at a determined image plane.

(90) Also, installation locations of other devices, such as fans, lightings, loudspeakers, sprinklers, wi-fi transceivers and air inlets may be displayed to advantage at a determined image plane.

(91) It will be appreciated that the present inventive concept is not limited to the variants shown. Several modifications and variations are thus conceivable within the scope of the invention which thus is exclusively defined by the appended claims.