Methods and apparatus for light-based positioning and navigation

10564253 · 2020-02-18

Abstract

Systems, methods, mobile computing devices and computer-readable media are described herein relating to light-based positioning. In various embodiments, light sources (106) may be commissioned to selectively energize one or more LEDs (520) to emit light carrying a coded light signal. The coded light signal may convey information about a location of a lighting effect (102) projected by the one or more LEDs onto a surface (104). In various embodiments, mobile computing devices (100) such as smart phones or tablets may detect these coded light signals from the lighting effects and/or from the light sources, extract the location information, and utilize it to determine their locations within an environment.

Claims

1. A computer-implemented method for commissioning a light source, comprising: placing a commissioning device in a lighting effect projected by the light source onto a surface; determining, by the commissioning device, a location of the commissioning device within an environment; and transmitting, by the commissioning device to the light source, a location of the lighting effect within the environment, wherein the location of the lighting effect is based at least in part on the determined location of the commissioning device.

2. The computer-implemented method of claim 1, wherein the transmitting further comprises transmitting a reference distance between the light source and the surface.

3. The computer-implemented method of claim 1, wherein the transmitting further comprises transmitting an angle between a first vector that is normal to the surface and extends from a center of the lighting effect, and a second vector from the commissioning device to the light source.

4. The computer-implemented method of claim 3, further comprising calculating, by the commissioning device, the angle based at least in part on a distance between a rendition of the lighting effect on a display of the commissioning device and a center of the display.

5. The computer-implemented method of claim 1, wherein the transmitting further comprises transmitting an angle between a first vector that extends along the surface from a center of the lighting effect to a position on the surface opposite the light source, and a second reference vector that is predefined relative to a magnetic pole.

6. The computer-implemented method of claim 5, further comprising calculating, by a mobile computing device, the angle based at least in part on an orientation of the mobile computing device relative to the magnetic pole.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention.

(2) FIG. 1 schematically illustrates one example of how a mobile computing device may determine its location within an environment by determining its distance from a lighting effect, in accordance with various embodiments.

(3) FIG. 2 schematically depicts an example of how a mobile computing device may determine an incident angle using its camera and display, in accordance with various embodiments.

(4) FIG. 3 schematically depicts one example of how a mobile computing device may determine its location within an environment by determining its distance from a light source, and then determining the light source's distance from the lighting effect it produces, in accordance with various embodiments.

(5) FIG. 4 schematically depicts one example of how a commissioning computing device may be utilized to commission a light source with information about a location of a lighting effect projected by the light source onto a surface, in accordance with various embodiments.

(6) FIG. 5 schematically depicts an example light source, in accordance with various embodiments.

(7) FIG. 6 schematically depicts an example method that may be implemented by a mobile computing device to determine its location within an environment, in accordance with various embodiments.

(8) FIG. 7 depicts an example method of commissioning a light source, in accordance with various embodiments.

DETAILED DESCRIPTION

(9) More and larger indoor and/or underground environments are being built for shopping, parking, traffic, living, and so forth. Many such environments may interfere with or even block GPS signals, making conventional GPS-based navigation with mobile computing devices difficult. Those same environments may lack natural sunlight, and therefore may be lit with artificial lighting. Technology exists that enables sources of that artificial light to emit locational information that may be used by mobile computing devices for positioning and/or navigational purposes. However, a local network connection (e.g., Wi-Fi) may be required for the mobile computing device to associate a particular light source with a particular location, and such systems may not provide sufficient information for a mobile computing device to determine its location with sufficient accuracy.

(10) Accordingly, Applicants have recognized and appreciated that it would be beneficial to utilize lighting infrastructure to facilitate location determination and navigation by mobile computing devices within an enclosed environment, without requiring a network connection by the mobile computing devices. Applicants further recognized and appreciated that it would be beneficial to provide light-based navigation and positioning to facilitate calculation of a mobile computing device's location within an environment with a higher degree of accuracy than has been possible in the past.

(11) In view of the foregoing, various embodiments and implementations of the present invention are directed to light-based navigation and positioning. In various embodiments, light sources may be selectively energized to project lighting effects on surfaces. Those lighting effects may carry coded light signals that convey various types of information about a location of the lighting effect. Mobile computing devices such as smart phones and tablet computers may be equipped with cameras configured to utilize rolling shutter techniques to capture these coded light signals. The mobile computing devices may then extract and use the location information for navigation and positioning.

(12) In some embodiments, the coded light signals may simply convey geographic coordinates. For instance, in some embodiments, the coded light signals carried in the lighting effects may convey location data formatted using a version of the World Geodetic System. In such embodiments, a location on Earth may be expressed using latitude, longitude and height. In other embodiments, the coded light signals may convey more literal data, such as "northwest corner of first floor," "women's shoes," "southeast corner of garage floor A2," "floor 5," and so forth. In yet other embodiments, the coded light signals carried in the lighting effects may convey location data that is pertinent in a particular environment, such as an underground parking lot or shopping mall. For example, the location data may include Cartesian coordinates defined relative to a predefined origin within the environment. While many of the following examples describe transmission of Cartesian coordinates in coded light signals, this is not meant to be limiting, and other coordinate systems, such as polar coordinates, may be used instead.
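
As a purely illustrative sketch of the Cartesian case, the snippet below packs and unpacks a lighting-effect location as a fixed-size coded-light payload. The field layout, units, and function names are assumptions for illustration only; the disclosure does not specify any particular payload format.

```python
import struct

# Hypothetical payload layout (not specified in this disclosure): three signed
# 32-bit integers giving the lighting effect's X, Y, and Z coordinates in
# centimeters, relative to a predefined origin within the environment.
def pack_location_payload(x_cm: int, y_cm: int, z_cm: int) -> bytes:
    """Pack a Cartesian lighting-effect location into a coded-light payload."""
    return struct.pack("<iii", x_cm, y_cm, z_cm)

def unpack_location_payload(payload: bytes) -> tuple:
    """Recover (x_cm, y_cm, z_cm) from a received coded-light payload."""
    return struct.unpack("<iii", payload)

# Example: a lighting effect 12.40 m along X and 3.75 m along Y from the origin.
payload = pack_location_payload(1240, 375, 0)
assert unpack_location_payload(payload) == (1240, 375, 0)
```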

(13) Referring to FIG. 1, in one embodiment, a mobile computing device in the form of a smart phone 100 may be configured to calculate its location in an environment (e.g., garage, store, mall, airport, etc.) by determining a distance to the center of a lighting effect 102 projected onto a surface 104 by a light source 106. In various embodiments, smart phone 100 may be equipped with a camera having a lens 108, and may be configured to utilize rolling shutter to detect a coded light signal carried in the lighting effect.

(14) Assume lighting effect 102 is located at point (X_1, Y_1, Z_1), and that smart phone 100 is located at point (X_2, Y_2, Z_2). In various embodiments, light source 106 may be configured to emit light that carries a coded light signal. In various embodiments, the coded light signal may carry information about a location of lighting effect 102 projected onto surface 104. For example, the coded light signal may carry the location of the center of the lighting effect, (X_1, Y_1, Z_1).

(15) In various embodiments, smart phone 100 may have stored in memory a reference height, h_phone, which may be an estimate of the distance between smart phone 100 and surface 104 when smart phone 100 is carried in a typical manner. For example, if a user of smart phone 100 indicates that her age is 10, then smart phone 100 may assume the average height at which a smart phone is carried by a typical ten-year-old girl. In other embodiments, h_phone may be conveyed by the coded light signal emitted by light source 106.

(16) In various embodiments, smart phone 100 may determine its orientation relative to surface 104. For instance, in various embodiments, smart phone 100 may determine the angle α between a vector represented by the line h_phone in FIG. 1 and a first vector 110 extending from a focal point of a camera lens 108 of smart phone 100 to surface 104 along a central axis of the camera lens. To determine α, smart phone 100 may utilize one or more of an accelerometer and/or a gyroscope.

(17) In various embodiments, smart phone 100 may determine an angle β between first vector 110 and a second vector 112 extending from the focal point to the center of lighting effect 102. If camera lens 108 is pointed directly at the center of lighting effect 102, β may be zero. In various embodiments, the angle β may be calculated based on a distance between a rendition of lighting effect 102 on a display of smart phone 100 and a center of the display. An example of this is shown in FIG. 2, where a rendition of lighting effect 102 is rendered on a display 114 of smart phone 100. A distance 116 between a center of the rendition of lighting effect 102 and a center of display 114 may be proportionate to, or otherwise related to, the angle β of FIG. 1.
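
One way to relate distance 116 to the angle β is to assume an ideal pinhole camera with a known horizontal field of view. The sketch below makes that assumption explicit; the pinhole model, function name, and example values are illustrative assumptions, since the disclosure only states that the offset is proportionate to, or otherwise related to, β.

```python
import math

def offset_to_angle_beta(offset_px: float, image_width_px: int,
                         horizontal_fov_deg: float) -> float:
    """Estimate beta (in degrees) from the pixel offset between the rendition
    of the lighting effect and the center of the display.

    Assumes an ideal pinhole camera with a known horizontal field of view;
    this model is an illustrative assumption, not part of the disclosure.
    """
    focal_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg / 2))
    return math.degrees(math.atan(offset_px / focal_px))

# Example: effect rendered 300 px off-center on a 4000 px wide image, 70 degree FOV.
print(round(offset_to_angle_beta(300, 4000, 70.0), 1))  # ~6.0 degrees
```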

(18) Once the angles α and β and the reference height h_phone are known, smart phone 100 may be configured to calculate various distances between smart phone 100 and a center of lighting effect 102. For instance, smart phone 100 may calculate Y using the following equation:
Y = h_phone · tan(α + β)    (1)
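
A minimal worked example of equation (1); the function name and example values are illustrative, not taken from the disclosure.

```python
import math

def distance_to_effect_center(h_phone_m: float, alpha_deg: float, beta_deg: float) -> float:
    """Horizontal distance Y from the phone to the center of the lighting
    effect, per equation (1): Y = h_phone * tan(alpha + beta)."""
    return h_phone_m * math.tan(math.radians(alpha_deg + beta_deg))

# Example: phone held 1.3 m above the surface, alpha = 20 degrees, beta = 15 degrees.
print(round(distance_to_effect_center(1.3, 20.0, 15.0), 2))  # ~0.91 m
```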

(19) FIGS. 1 and 2 demonstrate a simple example of a mobile computing device (i.e. smart phone 100) determining its location in an environment primarily in two dimensions, using a lighting effect. However, disclosed techniques are equally applicable in three dimensions. Further, if a mobile computing device detects more than one lighting effect 102 (or more than one light source 106 as described below), in various embodiments, the mobile computing device may calculate its location within the environment using information conveyed in a coded light signal carried by the brightest observed lighting effect 102 (or light source 106).

(20) FIG. 3 depicts a three-dimensional example of how a mobile computing device such as smart phone 100 may determine its location within an environment. In this example, smart phone 100 may determine its location on the X/Y plane, (X_1, Y_1), based on its distance from light source 106 and a distance between light source 106 and the lighting effect 102 it projects. Assume that light source 106 projects a lighting effect 102 on a surface 104 that is the X/Y plane. In some embodiments, Z_1 may be based on h_phone in FIG. 1 because it may represent an estimated height of smart phone 100 when held by a user. Assume also that light source 106 is located at point (X_2, Y_2, h_light), and that lighting effect 102 is projected onto surface 104 at point (X_3, Y_3, Z_3).

(21) Light source 106 may be commissioned in a process described below to emit a coded light signal. The coded light signal may convey various information about a location of lighting effect 102 in the environment. For example, the coded light signal may convey a reference distance h_light between light source 106 and surface 104. Smart phone 100 may extract this information from the coded light signal and use it to perform various calculations to determine its location within the environment.

(22) In various embodiments, smart phone 100 may calculate an angle α_2 between a first vector, e.g., n_phone in FIG. 3, that is normal to surface 104, and a second vector, r_2, that extends from light source 106 to smart phone 100. In various embodiments, and similarly as described above with reference to FIG. 2, this calculation may be based on an orientation of smart phone 100 as detected by a gravity sensor, as well as a distance 116 between a rendition of the light source 106 on display 114 of smart phone 100 and a center of display 114.

(23) In various embodiments, a distance between smart phone 100 and light source 106 along the X/Y plane, r_2x,y, may be calculated based on h_light and α_2, using an equation such as one of the following:
r_2x,y = h_light / tan(90° - α_2)    (2)
r_2x,y = h_light · tan(α_2)    (3)
These equations and others described above and below are not meant to be limiting, and it should be understood that other equations may be used, and the calculations may be performed in other orders, without departing from the present disclosure.

(24) In various embodiments, smart phone 100 may calculate an angle β_2 between r_2x,y and a reference vector. In various embodiments, the reference vector may be transmitted in the coded light signal emitted by light source 106 or preprogrammed into smart phone 100. In some embodiments, the reference vector may be predefined relative to a magnetic pole (including parallel to the pole). For instance, in FIG. 3, the Y-axis is the reference vector, and is aligned with magnetic north/south. Smart phone 100 may be equipped with a sensor such as a compass to detect the magnetic pole, the reference vector, its own orientation relative to the reference vector, and ultimately, angle β_2. Once angle β_2 is known, X_2 and Y_2 may be calculated using equations such as the following:
X_2 = r_2x,y · sin(β_2)    (4)
Y_2 = r_2x,y · cos(β_2)    (5)
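
The following sketch combines equations (3)-(5) to compute the phone's offset from the light source on the X/Y plane. The function name and example values are illustrative; any correction for the phone's own height above the surface (h_phone) is omitted for simplicity.

```python
import math

def phone_offset_from_light_source(h_light_m: float, alpha2_deg: float,
                                   beta2_deg: float) -> tuple:
    """Offset (X_2, Y_2) of the smart phone from the light source on the X/Y
    plane, per equations (3)-(5):
        r_2x,y = h_light * tan(alpha_2)
        X_2    = r_2x,y * sin(beta_2)
        Y_2    = r_2x,y * cos(beta_2)
    """
    r_2xy = h_light_m * math.tan(math.radians(alpha2_deg))
    return (r_2xy * math.sin(math.radians(beta2_deg)),
            r_2xy * math.cos(math.radians(beta2_deg)))

# Example: luminaire 3.0 m above the surface, alpha_2 = 30 degrees, beta_2 = 45 degrees.
print(phone_offset_from_light_source(3.0, 30.0, 45.0))  # ~(1.22, 1.22)
```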

(25) In some embodiments, once X_2 and Y_2 are known, smart phone 100 may determine its location within the environment further based on a location of lighting effect 102. For instance, the coded light signal emitted by light source 106 may convey, in addition to h_light, the coordinates (X_3, Y_3, Z_3) of lighting effect 102, as well as an angle α_3 between a vector n_l.e. that is normal to surface 104 and extends from a center of lighting effect 102, and a vector r_3 from light source 106 to the center of lighting effect 102. Once α_3 is known, a distance r_3x,y of light source 106 from lighting effect 102 along surface 104 may be calculated based on h_light and α_3, using an equation such as one of the following:
r_3x,y = h_light / tan(90° - α_3)    (6)
r_3x,y = h_light · tan(α_3)    (7)

(26) In various embodiments, the coded light signal emitted by light source 106 may convey an angle β_3 between r_3x,y and the Y-axis (which, as mentioned above, is aligned with magnetic north). Once r_3x,y and β_3 are known, X_3 and Y_3 may be calculated, e.g., by smart phone 100, using equations such as the following:
X_3 = r_3x,y · sin(β_3)    (8)
Y_3 = r_3x,y · cos(β_3)    (9)

(27) Once X_2, X_3, Y_2 and Y_3 are known, smart phone 100 may calculate its location (X_1, Y_1) on the X/Y plane relative to the location of the center of the lighting effect, (X_3, Y_3), using an equation such as the following:
(X_1, Y_1) = (X_3 + X_2 + X_3, Y_3 + Y_2 + Y_3)    (10)
Z_3 may simply be h_phone, unless lighting effect 102 is projected onto a different surface than the one on which the user holding smart phone 100 stands. In such a case, Z_3 may be the difference in height between the two surfaces.
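
Putting equations (2)-(10) together, a sketch of the full FIG. 3 calculation might look as follows. The function name and example values are illustrative, and the signs of the two offsets follow equation (10) and the convention of paragraph (37); they would need to be adjusted if the angles were defined with opposite sense.

```python
import math

def phone_position(effect_xy: tuple, h_light_m: float,
                   alpha2_deg: float, beta2_deg: float,
                   alpha3_deg: float, beta3_deg: float) -> tuple:
    """Location (X_1, Y_1) of the smart phone on the X/Y plane.

    effect_xy is the absolute location (X_3, Y_3) of the lighting effect
    center conveyed by the coded light signal; the two offsets computed
    below are added to it per equation (10).
    """
    # Offset between the light source and the smart phone (equations (3)-(5)).
    r_2xy = h_light_m * math.tan(math.radians(alpha2_deg))
    x2 = r_2xy * math.sin(math.radians(beta2_deg))
    y2 = r_2xy * math.cos(math.radians(beta2_deg))
    # Offset between the light source and the lighting effect (equations (7)-(9)).
    r_3xy = h_light_m * math.tan(math.radians(alpha3_deg))
    x3 = r_3xy * math.sin(math.radians(beta3_deg))
    y3 = r_3xy * math.cos(math.radians(beta3_deg))
    # Combine with the absolute effect location per equation (10).
    return effect_xy[0] + x2 + x3, effect_xy[1] + y2 + y3

# Example values: effect center at (12.4, 3.75), h_light = 3.0 m.
print(phone_position((12.4, 3.75), 3.0, 30.0, 45.0, 25.0, 130.0))
```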

(28) In some embodiments, light source 106 may be commissioned to emit a coded light signal that carries its own location, in addition to or instead of the location of the center of lighting effect 102. In such embodiments, it may be possible for smart phone 100 to calculate its position using equations such as (2)-(5), without performing equations (6)-(9).

(29) It should be noted that, in the simplest case where smart phone 100 is placed directly in the light beam emitted by light source 106, e.g., on top of or near the center of lighting effect 102, smart phone 100 may calculate its position as simply the position of the lighting effect, (X_3, Y_3, Z_3).

(30) In some scenarios, the mobile computing device may move through an environment quickly. For example, a mobile computing device associated with a vehicle (e.g., a GPS navigation unit) may move through a tunnel, where GPS is unavailable, at a high rate of speed. Light sources in the tunnel may emit coded light signals conveying location information. Because the vehicle is moving quickly, a light sensor matrix may be installed on the vehicle. To compensate for short exposure time, in various embodiments, multiple light sources in the tunnel may emit coded light signals conveying the same location information, e.g., in a synchronized manner to create a longer beam.

(31) As mentioned previously, in order for light source 106 to emit a coded light signal conveying information such as h_light, the location of the center of lighting effect 102, (X_3, Y_3, Z_3), α_3 or β_3, it may first be commissioned with this data. In some embodiments, each light source 106 may be commissioned manually, e.g., by the manufacturer or by someone installing light source 106 in an environment. In some embodiments, light source 106 may be commissioned using a commissioning device. A commissioning device may in some embodiments be a portable computing device designed specifically for commissioning light sources. For instance, an autonomous robotic commissioning device may be configured to autonomously travel around an environment to multiple lighting effects 102, where it commissions the corresponding light sources 106. In other embodiments, the commissioning device may be a general purpose mobile computing device, such as a smart phone or tablet, that may be placed into a lighting effect 102.

(32) FIG. 4 depicts one example of how an example commissioning device 418 may be used to commission light source 106 so that other mobile computing devices (e.g., smart phone 100) are able to calculate their locations within an environment. As was the case with FIG. 3, assume that a center of lighting effect 102 is located at point (X_3, Y_3, Z_3), and the light source 106 is located at point (X_2, Y_2, h_light). In some embodiments, light source 106 may emit a coded light signal identifying itself. In other embodiments, an identifier of light source 106 may be input into commissioning device 418 manually.

(33) Commissioning device 418 may be positioned, or may position itself if autonomous, at the center of lighting effect 102, i.e., at point (X_3, Y_3, Z_3). Assume that commissioning device 418 knows its location, e.g., using GPS or by tracking wheel rotations and turns from a known starting point. Once commissioning device 418 is so positioned, it may commission light source 106 by transmitting information about the location of lighting effect 102 to light source 106, e.g., using various communication technologies such as Wi-Fi, Bluetooth, NFC, RFID, coded light, and so forth.

(34) For instance, commissioning device 418 may transmit its location (which is at the center of lighting effect 102) to light source 106. Commissioning device 418 may also transmit to light source 106 a reference height h_light of light source 106. In some embodiments, commissioning device 418 may calculate and transmit to light source 106 various angles, such as angles α_3 or β_3.
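
A sketch of what such a commissioning transmission could look like is shown below. The record fields, units, and JSON serialization are illustrative assumptions, and the actual transport (Wi-Fi, Bluetooth, NFC, RFID, coded light, etc.) is abstracted behind a callback.

```python
import json
from dataclasses import dataclass, asdict
from typing import Callable

@dataclass
class CommissioningData:
    """Hypothetical record of the values a commissioning device may transmit;
    field names and units are illustrative, not part of this disclosure."""
    effect_x_m: float   # X_3: X coordinate of the lighting effect center
    effect_y_m: float   # Y_3: Y coordinate of the lighting effect center
    effect_z_m: float   # Z_3: Z coordinate of the lighting effect center
    h_light_m: float    # reference distance between light source and surface
    alpha3_deg: float   # angle between r_3 and the surface normal n_l.e.
    beta3_deg: float    # angle between r_3x,y and the magnetic reference vector

def commission_light_source(data: CommissioningData,
                            transmit: Callable[[bytes], None]) -> None:
    """Serialize the commissioning data and hand it to whichever transport
    (Wi-Fi, Bluetooth, NFC, RFID, coded light, ...) is available."""
    transmit(json.dumps(asdict(data)).encode("utf-8"))

# Example with a stand-in transport that simply prints the payload.
commission_light_source(
    CommissioningData(12.4, 3.75, 0.0, 3.0, 25.0, 130.0),
    transmit=lambda payload: print(payload.decode()),
)
```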

(35) In various embodiments, commissioning device 418 may calculate the angle α_3 between r_3 and the normal vector n_l.e. In various embodiments, the angle α_3 may be calculated using techniques similar to those used to calculate the angle β in FIGS. 1 and 2. For instance, commissioning device 418 may know an orientation of its camera (or other light sensor), similar to the first vector 110 of FIG. 1. Commissioning device 418 may then calculate α_3 based on a difference between a center of a display (or a memory buffer containing two-dimensional data representing a captured image) and a rendition of light source 106 on the display (or the memory buffer).

(36) In some embodiments, commissioning device 418 may additionally or alternatively calculate the angle β_3 between r_3x,y and a reference vector that is predefined relative to a magnetic pole. For instance, in FIG. 4, the reference vector is the Y-axis, which is predefined along the magnetic pole. Commissioning device 418 may be equipped with a sensor such as a compass to detect the magnetic pole, its own orientation relative to the magnetic pole, and ultimately, angle β_3. The commissioning device may then transmit this angle β_3 to light source 106.

(37) In some embodiments, commissioning device 418 may transmit to light source 106 the location of light source 106, although this is not required when using the techniques demonstrated in FIG. 3. For instance, based on the reference height h_light and the angle α_3, commissioning device 418 may calculate r_3x,y. Once r_3x,y is known, it can be used with the angle β_3 to calculate X_3 and Y_3. These values may be added to the position of commissioning device 418 on the X/Y plane, (X_3, Y_3), to determine the position of light source 106 on the X/Y plane.
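
As a sketch of the calculation in this paragraph, using the same angle conventions assumed above (function name and example values are illustrative), the light source's X/Y position can be obtained from the commissioning device's own position, h_light, α_3 and β_3:

```python
import math

def light_source_position(effect_xy: tuple, h_light_m: float,
                          alpha3_deg: float, beta3_deg: float) -> tuple:
    """X/Y position of the light source, computed as described above: r_3x,y
    is derived from h_light and alpha_3, resolved into X/Y components using
    beta_3, and added to the commissioning device's position (which is the
    center of the lighting effect)."""
    r_3xy = h_light_m * math.tan(math.radians(alpha3_deg))
    return (effect_xy[0] + r_3xy * math.sin(math.radians(beta3_deg)),
            effect_xy[1] + r_3xy * math.cos(math.radians(beta3_deg)))

# Example: commissioning device at (12.4, 3.75), h_light = 3.0 m.
print(light_source_position((12.4, 3.75), 3.0, 25.0, 130.0))
```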

(38) FIG. 5 schematically depicts components of an example light source 106, in accordance with various embodiments. Light source 106 may include one or more light-emitting diodes (LEDs) 520 and a controller 522 operably coupled with the one or more LEDs 520 and configured to selectively energize the one or more LEDs 520 to emit light carrying a coded light signal. As noted above, in various embodiments, the coded light signal may convey various information about a location of lighting effect 102 projected by the one or more LEDs onto surface 104. For example, in some embodiments, the information about the location of lighting effect 102 includes a location of a center of lighting effect 102. In some embodiments, the coded light signal also conveys a reference distance (e.g., h_light) between light source 106 and surface 104.

(39) In some embodiments, controller 522 may be configured to derive the information about the location of lighting effect 102 based on a direction of a light beam produced by the one or more LEDs 520. For instance, light source 106 may be aware of its location, either by being commissioned by a commissioning device or via a GPS unit 524. Light source 106 may also have stored in memory a distance (e.g., h_light) between light source 106 and surface 104 onto which it projects a lighting effect 102. Using these values, as well as the direction of the light beam it emits, light source 106, e.g., by way of controller 522, may be configured to calculate a location of lighting effect 102. In other embodiments, controller 522 may be configured to derive the information about the location of the lighting effect based on a width of a light beam produced by the one or more LEDs 520.
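
A minimal sketch of the beam-direction variant: assuming a horizontal surface a known distance h_light below the light source, the controller can intersect the beam's central axis with that surface to obtain the lighting effect's location. The vector math and example values below illustrate that idea only; they are not a prescribed implementation.

```python
import math

def lighting_effect_location(source_xyz: tuple, beam_dir: tuple, h_light: float) -> tuple:
    """Point where the beam's central axis meets the surface, assuming a
    horizontal surface a distance h_light below the light source.

    source_xyz: (x, y, z) of the light source.
    beam_dir:   direction of the beam's central axis (unit length, pointing
                downward, i.e., negative z component).
    """
    if beam_dir[2] >= 0:
        raise ValueError("beam must point toward the surface")
    t = h_light / -beam_dir[2]  # scale factor needed to descend h_light along the beam
    return tuple(s + t * d for s, d in zip(source_xyz, beam_dir))

# Example: luminaire at (5.0, 2.0, 3.0) aimed 20 degrees off vertical toward +X.
tilt = math.radians(20.0)
direction = (math.sin(tilt), 0.0, -math.cos(tilt))
print(lighting_effect_location((5.0, 2.0, 3.0), direction, 3.0))  # z component ~0
```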

(40) Although in examples described herein, lighting effect 102 has been projected on a horizontal surface, this is not meant to be limiting. Lighting effect 102 may be projected onto surfaces of any orientation, including horizontal, vertical, and anything in between. Moreover, surfaces 104 are not necessarily limited to floors. In some cases, the surfaces 104 may be raised surfaces of tables or other furniture. In such case, the Z-coordinate of lighting effect 102 and/or a reference distance between light source 106 and surface 104 (e.g., h.sub.light) may reflect the raised surface.

(41) FIG. 6 depicts an example method 600 that may be implemented by a mobile computing device such as smart phone 100 to calculate its position within an environment, in accordance with various embodiments. At block 602, a coded light signal may be received, e.g., by smart phone 100 from light source 106. At block 604, smart phone 100 may extract a location of a lighting effect 102 produced by light source 106 on a surface 104.

(42) At block 606, smart phone 100 may determine a reference distance. If a camera of smart phone 100 is pointed at lighting effect 102, then the reference distance may be an estimated distance between smart phone 100 and surface 104, e.g., h_phone in FIG. 1. If the camera of smart phone 100 is pointed at light source 106, on the other hand, then the reference distance may be a distance between light source 106 and the surface 104 on which light source 106 projects its lighting effect 102, e.g., h_light in FIGS. 3 and 4. In some cases, h_phone may be subtracted from h_light to reflect a true distance in a direction of the Z-axis between smart phone 100 and light source 106.

(43) At block 608, smart phone 100 may determine its orientation relative to a magnetic pole, e.g., using a compass. For example, in FIG. 3, smart phone 100 determined the angle β_2. At block 610, smart phone 100 may determine its orientation relative to surface 104, e.g., using one or more accelerometers and/or gyroscopes. For example, smart phone 100 in FIG. 1 determined the angle α. At block 612, smart phone 100 may determine an incident angle between a central axis of its camera and a vector from light source 106 or a center of lighting effect 102 to the camera's focal point. For instance, smart phone 100 in FIG. 1 determined the angle β, as demonstrated in FIG. 2, by determining a distance 116 between a center of display 114 and a rendition of lighting effect 102 on display 114.

(44) At block 614, smart phone 100 may calculate a distance between itself and lighting effect 102 and/or light source 106. For example, in FIG. 1, smart phone 100 used the sum of the two angles α and β, in addition to the reference distance h_phone, to calculate Y. Similar techniques were implemented by smart phone 100 in FIG. 3 to determine r_2x,y and r_3x,y.

(45) At block 616, based on the distance between smart phone 100 and lighting effect 102 and/or light source 106, as well as the location of lighting effect 102 (as conveyed by the coded light signal emitted by light source 106), smart phone 100 may calculate its location in the environment.

(46) FIG. 7 depicts an example method 700 that may be implemented using commissioning device 418, in accordance with various embodiments. At block 702, commissioning device 418 may be placed in lighting effect 102, e.g., at its center. At block 704, commissioning device 418 may determine its location, e.g., using GPS or by tracking turns and rotations of its wheels.

(47) At block 706, commissioning device 418 may determine its orientation relative to a magnetic pole, e.g., using a compass. For example, in FIG. 4, the commissioning device determined the angle β_3. At block 708, commissioning device 418 may determine its orientation relative to surface 104. For example, in FIG. 4, commissioning device 418 determined the angle α_3 based at least in part on an orientation of its camera or light sensor.

(48) At block 710, commissioning device 418 may transmit the information determined at blocks 704-708 to light source 106, e.g., using various communication technologies such as Wi-Fi, Bluetooth, coded light, NFC, RFID, and so forth.

(49) While several inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.

(50) All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.

(51) The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."

(52) As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.

(53) It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.

(54) Reference numerals appearing in the claims, if any, are provided merely for convenience and should not be construed as limiting the claims in any way.

(55) In the claims, as well as in the specification above, all transitional phrases such as "comprising," "including," "carrying," "having," "containing," "involving," "holding," "composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of" and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.