DEVICE FOR TRAINING USERS OF AN ULTRASOUND IMAGING DEVICE
20200402425 · 2020-12-24
Assignee
Inventors
CPC classification
G09B23/286
PHYSICS
G06T19/00
PHYSICS
International classification
A61B10/00
HUMAN NECESSITIES
Abstract
Disclosed are methods and devices for simulating ultrasound procedures and for training ultrasound users. Additionally disclosed are methods and devices for simulating needle insertion procedures, such as amniocentesis procedures, and for training physicians to perform such needle insertion procedures.
Claims
1. An ultrasound simulator comprising: a digital repository, the repository including at least one virtual three-dimensional model of a torso, the model comprising a plurality of simulated body organs; a processor associated with said repository and configured to use at least one of said virtual three-dimensional models in said repository; a location-identifying surface associated with said processor, the location-identifying surface being a touch sensitive surface which is built into a body simulating unit, the location-identifying surface further providing pressure sensing; and a physical ultrasound transducer simulator associated with said processor, said ultrasound transducer simulator comprising an orientation sensor configured to provide to said processor a measurement of an orientation of said ultrasound transducer simulator relative to said location-identifying surface, said processor configured to relate said measurement to a surface location on said at least one three-dimensional model, different surface locations being in proximity to different ones of said simulated body organs, the simulator further being configured to provide an ultrasound image having the orientation of the physical ultrasound transducer simulator at a two-dimensional location on said location-identifying surface, the ultrasound image provided being a two-dimensional section through said three-dimensional model, the section changeably moving or rotating within said three-dimensional model as said ultrasound transducer simulator is moved or rotated, thereby to simulate ultrasound viewing of any one of said simulated organs.
2. The ultrasound simulator of claim 1, wherein at least one of said three-dimensional models is at least one of an ultrasound model, a Magnetic Resonance Imaging (MRI) model and an X-ray computed tomography (CT) model.
3. The ultrasound simulator of claim 1, also comprising a user-assessment module operative to assess at least one criterion of a performance of a user operating said ultrasound transducer simulator, and configured to instruct said user to reach a specified section of said at least one virtual three-dimensional model used by said processor.
4. The ultrasound simulator of claim 3, wherein said user-assessment module is also configured to provide a grade to said user, said grade being based on said user's performance in said at least one criterion.
5. The ultrasound simulator of claim 3, wherein said user-assessment module is configured to provide to said user, in real time, at least one of guidance for reaching said specified section and guidance for using appropriate pressure when attempting to reach said specified section.
6. The ultrasound simulator of claim 1, also comprising a physical needle simulator associated with said processor, in addition to and different from said ultrasound transducer simulator, said physical needle simulator comprising: a three-dimensional orientation sensor configured to sense and to provide to said processor a three-dimensional orientation of said needle simulator; and an insertion depth sensor configured to sense and to provide to said processor information regarding a simulated depth of insertion of said needle simulator.
7. The ultrasound simulator of claim 6, wherein said user assessment module is configured to train said user to insert a needle into a first volume while not contacting a second volume.
8. The ultrasound simulator of claim 6, wherein said user-assessment module is configured to at least one of: provide a warning indication to said user when a position of said needle simulator and a virtual insertion depth of said needle simulator correspond to a simulated needle being within a predetermined distance of contacting said second volume; and provide a contact indication to said user when a position of said needle simulator and a virtual insertion depth of said needle simulator correspond to a simulated needle contacting said second volume.
9. A method for simulating an ultrasound, comprising: providing a digital repository, the repository including at least one virtual three-dimensional model of a torso, the model comprising a plurality of simulated organs located around said torso; associating at least one of said virtual three-dimensional models in said repository with a processor; from a physical ultrasound transducer simulator comprising a three-dimensional orientation sensor, independently providing to said processor an orientation of said ultrasound transducer simulator relative to a location-identifying surface, said location-identifying surface being a touch-sensitive surface; providing pressure information from said touch-sensitive surface; providing to said processor information regarding a two-dimensional location of said ultrasound transducer simulator on said touch-sensitive surface; relating said two-dimensional location on said touch sensitive surface to a location on said associated three-dimensional model; and providing in real time an ultrasound image having the three-dimensional orientation of the physical ultrasound transducer simulator at said two-dimensional location on said associated three-dimensional model, thereby to allow a user to find a required one of said simulated organs and obtain a simulated ultrasound image thereof.
10. The method of claim 9, wherein said providing a repository comprises providing at least one ultrasound model.
11. The method of claim 9, wherein said providing a repository comprises providing at least one of at least one Magnetic Resonance Imaging (MRI) model and at least one X-ray computed tomography (CT) model.
12. The method of claim 9, also comprising assessing at least one criterion of a performance of a user operating said ultrasound transducer simulator, wherein said assessing comprises instructing said user to reach a specified section of said at least one virtual three-dimensional model used by said processor.
13. The method of claim 12, wherein said assessing comprises providing a grade to said user, said grade being based on said user's performance in said at least one criterion.
14. The method of claim 12, wherein said assessing comprises providing to said user, in real time, at least one of guidance for reaching said specified section and guidance for using appropriate pressure when attempting to reach said specified section.
15. The method of claim 9, also comprising: associating a physical needle simulator with said processor, in addition to and different from said ultrasound transducer simulator; from a three-dimensional orientation sensor included in said physical needle simulator, providing to said processor information regarding a three-dimensional orientation of said needle simulator; and from an insertion depth sensor included in said physical needle simulator, providing to said processor information regarding a simulated depth of insertion of said needle simulator.
16. The method of claim 13, wherein said assessing comprises using said physical needle simulator, training said user to insert a needle into a first volume while not contacting a second volume.
17. The method of claim 14, wherein said assessing comprises at least one of: providing a warning indication to said user when a position of said needle simulator and a virtual insertion depth of said needle simulator correspond to a simulated needle being within a predetermined distance of contacting said second volume; and providing a contact indication to said user when a position of said needle simulator and a virtual insertion depth of said needle simulator correspond to a simulated needle contacting said second volume.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0116] Some embodiments of the invention are described herein with reference to the accompanying figures. The description, together with the figures, makes apparent to a person having ordinary skill in the art how some embodiments of the invention may be practiced. The figures are for the purpose of illustrative discussion and no attempt is made to show structural details of an embodiment in more detail than is necessary for a fundamental understanding of the invention. For the sake of clarity, some objects depicted in the figures are not to scale.
[0117] In the Figures:
DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION
[0123] The invention, in some embodiments, relates to the field of medical simulators, and more particularly, in some embodiments, to methods and devices for training ultrasound users to perform medical sonography, such as gynaecological sonography, cardiological sonography, gastroenterological sonography, neurological sonography, musculoskeletal sonography, and CT scans, and to identify abnormalities seen in such tests.
[0124] As discussed above, methods and devices are needed in order to train users such as doctors and ultrasound technicians to recognize abnormalities and anomalies, such as embryonic abnormalities, or to safely guide medical devices, such as amniocentesis needles, using ultrasound imaging.
[0125] The principles, uses and implementations of the teachings of the invention may be better understood with reference to the accompanying description and figures. Upon perusal of the description and figures present herein, one skilled in the art is able to implement the teachings of the invention without undue effort or experimentation.
[0126] Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth herein. The invention is capable of other embodiments or of being practiced or carried out in various ways. The phraseology and terminology employed herein are for descriptive purpose and should not be regarded as limiting.
[0127] According to an aspect of some embodiments of the invention there is provided an ultrasound simulator comprising:
[0128] a digital repository of virtual three-dimensional models, including at least one virtual three-dimensional model;
[0129] a processor associated with the repository and configured, during operation of the simulator, to use at least one of the virtual three-dimensional models in the repository;
[0130] a location-identifying surface associated with the processor; and
[0131] a physical ultrasound transducer simulator associated with the processor, the ultrasound transducer simulator comprising a three-dimensional orientation sensor configured to provide to the processor information regarding a three-dimensional orientation of the ultrasound transducer simulator relative to the location-identifying surface,
[0132] wherein at least one of the location-identifying surface and a device bearing the location-identifying surface is operative to provide to the processor information regarding a two-dimensional location of the ultrasound transducer simulator on the surface.
[0133] According to an aspect of some embodiments of the invention there is also provided a method for simulating the use of ultrasound imaging, comprising:
[0134] providing a digital repository of virtual three-dimensional models, including at least one virtual three-dimensional model;
[0135] associating at least one of the virtual three-dimensional models in the repository with a processor;
[0136] from a physical ultrasound transducer simulator comprising a three-dimensional orientation sensor, providing to the processor information regarding a three-dimensional orientation of the ultrasound transducer simulator relative to a location-identifying surface functionally associated with the processor; and
[0137] providing to the processor information regarding a two-dimensional location of the ultrasound transducer simulator on the surface.
[0138] In the context of the present application, the two-dimensional location of the ultrasound transducer simulator on the surface is defined as a two-dimensional point, or a two-dimensional area, at which the ultrasound transducer simulator is in touching contact with the surface.
[0139] As used herein, when a numerical value is preceded by either of the terms about and around, the terms about and around are intended to indicate ±10%.
[0140] Reference is now made to
[0141] As seen in
[0142] Above the basin 12 is located a robotic arm 16, which is movable along the X and Y axes of the basin 12. In some embodiments the robotic arm moves at a relatively slow speed, such as around 1 mm per second. At a bottom end of the robotic arm 16 is placed an ultrasound transducer 20, which is immersed in the water located in basin 12. Typically, the ultrasound transducer 20 is functionally associated with an ultrasound imaging device (not depicted), in some embodiments together configured to repeatedly acquire an ultrasound image of a plane.
[0143] For use for creating a repository of virtual three-dimensional images, the robotic arm 16 travels along the X and Y axes in the basin 12 while ultrasound transducer 20 is operational, such that the ultrasound transducer 20 obtains image information for multiple sections of the object 14. In some embodiments, the robotic arm 16 travels at a rate that allows transducer 20 to obtain approximately 300-400 section images per 15 to 20 centimeter of object 14. Once the section images are obtained, a processor (not shown) (e.g., of an associated ultrasound imaging device or of a different device) uses the section images to recreate a virtual three-dimensional model of the object 14, as known in the art of tomography for storage in a repository.
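The reconstruction step of paragraph [0143] reduces, at its simplest, to stacking the parallel section images along the robotic arm's sweep axis. The short Python sketch below illustrates only that stacking step, under assumptions the text does not state (evenly spaced, equally sized frames); the function name and the frame dimensions are illustrative, not taken from the disclosure.

```python
import numpy as np

def build_volume(section_images, sweep_length_mm):
    """Stack parallel ultrasound sections, acquired along the robotic
    arm's sweep axis, into a voxel volume, and report the resulting
    slice spacing.  Assumes evenly spaced, equally sized frames."""
    volume = np.stack(section_images, axis=0)          # (n_slices, H, W)
    slice_spacing_mm = sweep_length_mm / len(section_images)
    return volume, slice_spacing_mm

# Hypothetical sweep: 350 sections over a 17.5 cm object, consistent
# with the 300-400 sections per 15-20 cm mentioned above.
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(350)]
volume, spacing = build_volume(frames, sweep_length_mm=175.0)
```

A real tomographic reconstruction would also account for probe geometry and speed-of-sound effects; the stack above is only the bookkeeping core.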
[0144] The three-dimensional model of the object created by the device 10 is added to an image repository (not shown), that can be used to implement the teachings herein, for example, together with an ultrasound simulator according to the teachings herein, an embodiment of which is described hereinbelow with reference to
[0145] It is appreciated that the embodiment of
[0146] It is further appreciated that an image and/or virtual model repository according to the teachings herein may include models and/or images of any volume, including three-dimensional geometrical volumes such as spheres, ellipsoids, convex three-dimensional volumes, concave three-dimensional volumes, irregular three-dimensional volumes, and three-dimensional volumes representing anatomical volumes, for example human or mammalian organs.
[0147] Reference is now made to
[0148] As seen in
[0149] In some embodiments, the location-identifying surface 32 comprises a touch-sensitive surface, such that the touch-sensitive surface provides to the processor 35 information regarding the two-dimensional location at which the physical transducer simulator 36 is positioned. The touch-sensitive surface may be any suitable touch-sensitive surface, such as a touch screen known in the art of user-machine interfaces. In some embodiments the touch-sensitive surface is of a tablet computer or smartphone, such as an iPad or iPod respectively, both commercially available from Apple Inc. of Cupertino, Calif., USA. In some such embodiments, the processor 35 is the processor of the tablet computer/smartphone. In some embodiments, the touch-sensitive surface comprises a touch pad, such as typically available in laptop computers, using a suitable technology. Suitable touchpads are commercially available, for example the T650 by Logitech SA, Morges, Switzerland.
[0150] In some embodiments, the location-identifying surface 32 uses an optoelectronic sensor (e.g., as used in computer mouse technology) in order to identify the two-dimensional location at which the physical transducer simulator 36 is positioned.
[0151] In some embodiments, the simulator 30 uses multiple cameras and an infra-red transmitter associated with the physical ultrasound transducer simulator 36 to determine the two-dimensional location of the transducer simulator 36 relative to the location-identifying surface 32, in a technology similar to that provided by IntelliPen.
[0152] In some embodiments, the simulator 30 uses a three-dimensional camera, such as a 3D Time of Flight camera commercially available from Mesa Imaging AG of Zurich, Switzerland, associated with the physical ultrasound transducer simulator 36 to determine the two-dimensional location of the transducer simulator 36 relative to the location-identifying surface 32.
[0153] In some embodiments, the location-identifying surface 32 uses a magnetic sensor comprising a solenoid and a magnetic field (e.g., generated by a magnetic-field generating component) in order to identify the two-dimensional location. In this case, the solenoid is located in the physical transducer simulator 36, and the two-dimensional location of the physical transducer simulator 36 is identified based on the magnitude of current passing through the solenoid.
[0154] In some embodiments, such as the embodiments depicted in
[0155] In some embodiments, such as the embodiment illustrated in
[0157] In some embodiments, the physical transducer simulator 36 is functionally associated with the processor 35, and provides the processor 35 information regarding its own three-dimensional orientation, including the yaw, pitch, and roll of the physical transducer simulator 36. In some embodiments, such as the embodiment illustrated in
[0158] In some embodiments, the physical transducer simulator 36 comprises a gyroscope (not shown) used to identify the angular velocity of the transducer simulator 36, or, if the transducer simulator 36 is not moving, the three-dimensional orientation of the transducer simulator. The transducer simulator 36 may further include a compass (not shown) which indicates the direction in which the transducer simulator 36 is oriented and an accelerometer (not shown) used to obtain the direction in which the transducer simulator 36 is moving, or, when the transducer simulator 36 is not moving, the three-dimensional orientation of the transducer simulator 36. The three-dimensional orientation of the physical transducer simulator 36 is obtained by combining the information from the gyroscope, compass, and accelerometer using any suitable filter, such as a Kalman filter and/or LPF filters and/or HPF filters according to any method and using any suitable component with which a person having ordinary skill in the art is familiar.
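Paragraph [0158] leaves the sensor-fusion filter to the skilled person. A common lightweight alternative to a full Kalman filter is a complementary filter, sketched below for a single axis; the sample rates, angles, and the alpha crossover constant are illustrative assumptions, not values from the disclosure.

```python
def complementary_filter(angle_deg, gyro_rate_dps, accel_angle_deg,
                         dt, alpha=0.98):
    """One update of a single-axis complementary filter: trust the
    integrated gyroscope rate at high frequency (smooth but drifting)
    and the accelerometer tilt at low frequency (noisy but drift-free).
    alpha sets the crossover between the two sources."""
    gyro_estimate = angle_deg + gyro_rate_dps * dt
    return alpha * gyro_estimate + (1.0 - alpha) * accel_angle_deg

# Hypothetical stream: probe held still at 10 degrees of tilt, the
# gyroscope reporting a small constant drift, the accelerometer
# reading the true tilt.  The estimate stays near 10 degrees.
angle = 0.0
for _ in range(2000):
    angle = complementary_filter(angle, gyro_rate_dps=0.5,
                                 accel_angle_deg=10.0, dt=0.01)
```

A Kalman filter, as the paragraph suggests, generalizes this by weighting the two sources according to their estimated noise rather than a fixed alpha.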
[0159] It is appreciated that the gyroscope and the accelerometer provide very similar, if not identical, information regarding the orientation of the transducer simulator 36. However, due to the relatively noisy output of typical accelerometers, and to the drift problem often associated with gyroscopes, the combination of the outputs of the two provides more accurate positioning information than would be provided when using only one of the two. That said, in some embodiments a no-drift gyroscope is used to obtain accurate positioning information for the transducer simulator 36.
[0160] Alternatively, in some embodiments transducer simulator 36 includes three non-parallel solenoids (e.g., mutually-orthogonally defining X, Y, and Z axes) and a source of a magnetic field in a specified plane. The current passing through each of the solenoids at any given moment is used to calculate the three-dimensional orientation of the transducer 36, in the usual way.
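For the three-solenoid arrangement of paragraph [0160], the "usual way" can be sketched as follows: if each induced signal is proportional to the cosine of the angle between that coil's axis and the transmitted field, the three readings are the components of the field direction expressed in the probe's own frame. Note this recovers one reference direction only; a full three-degree-of-freedom orientation needs more information (e.g., a second transmitted field), which this illustrative sketch omits.

```python
import math

def field_direction_in_probe_frame(ix, iy, iz):
    """Recover the transmitter field direction, expressed in the
    probe's frame, from the signals induced in three mutually
    orthogonal solenoids.  Each signal is assumed proportional to
    the cosine of the angle between that coil's axis and the field,
    so the three readings are the field vector's components along
    the probe's X, Y, and Z axes; normalizing yields a unit vector."""
    norm = math.sqrt(ix * ix + iy * iy + iz * iz)
    return (ix / norm, iy / norm, iz / norm)

# Hypothetical readings from a probe pitched 45 degrees in the X-Z plane.
d = field_direction_in_probe_frame(0.707, 0.0, 0.707)
```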
[0161] As a further alternative, in some embodiments physical transducer simulator 36 includes a mechanical device, similar to a joystick, which provides the three-dimensional orientation of the transducer simulator 36.
[0162] In some embodiments, the simulator 30 uses a three-dimensional camera, such as a 3D Time of Flight camera commercially available from Mesa Imaging AG of Zurich, Switzerland, associated with the physical ultrasound transducer simulator 36 to determine the three-dimensional orientation of the physical transducer simulator 36. This aspect is particularly useful when the three-dimensional camera is used also to identify the two dimensional location of the ultrasound simulator transducer 36 on surface 32.
[0163] During use of the simulator, for example for training, a specified virtual three-dimensional model from the repository is selected and uploaded to the processor 35. As seen in
[0164] The user places the physical transducer simulator 36 in contact with the location-identifying surface 32 at a specific two-dimensional location and with a specific three-dimensional orientation. The processor 35 is provided information regarding the two-dimensional location of the transducer 36 on the location-identifying surface 32, and the transducer simulator 36 provides the processor 35 information regarding its three-dimensional orientation relative to surface 32. In some embodiments, the processor 35 is provided information regarding the two-dimensional location of the transducer 36 on surface 32 directly from surface 32, for example when surface 32 is a touch surface operative to identify the two-dimensional location at which it is contacted. In some embodiments, the processor 35 is provided information regarding the two-dimensional location of transducer 36 on surface 32 from a device associated with surface 32, such as a three-dimensional camera operative to capture an image of transducer 36 located on surface 32.
[0165] In response, the processor 35 displays to the user on display 34 an image of a section of the selected three-dimensional virtual model, such that the section corresponds to an ultrasound image of the specified virtual three-dimensional model from the repository acquired by an ultrasound imaging transducer having the three-dimensional orientation of the transducer simulator 36 and at the location of the transducer simulator 36 relative to surface 32, as indicated by reference numeral 40 in
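The section display of paragraph [0165] amounts to resampling the voxel model along a plane determined by the probe's contact point and orientation. Below is a minimal nearest-neighbour sketch; the axis conventions, the toy model, and the function signature are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def extract_section(volume, origin, u_dir, v_dir, size, step=1.0):
    """Sample a 2D section through a voxel volume.  The plane is
    anchored at `origin` (the probe's contact point mapped into the
    model) and spanned by the orthogonal unit vectors `u_dir` (across
    the probe face) and `v_dir` (the imaging depth direction), both
    derived from the probe's orientation.  Nearest-neighbour
    sampling; points outside the volume read as 0."""
    origin = np.asarray(origin, float)
    u = np.asarray(u_dir, float)
    v = np.asarray(v_dir, float)
    section = np.zeros((size, size), dtype=volume.dtype)
    for i in range(size):          # depth along the beam
        for j in range(size):      # lateral position across the face
            p = origin + i * step * v + j * step * u
            idx = np.round(p).astype(int)
            if np.all(idx >= 0) and np.all(idx < volume.shape):
                section[i, j] = volume[tuple(idx)]
    return section

# Toy model: a bright cubic "organ" inside a dark volume.
model = np.zeros((40, 40, 40), dtype=np.uint8)
model[10:30, 10:30, 10:30] = 255
# Probe flat on top of the model, imaging straight down along axis 0.
sec = extract_section(model, origin=(0, 20, 5),
                      u_dir=(0, 0, 1), v_dir=(1, 0, 0), size=40)
```

As the probe simulator is moved or rotated, re-running the sampling with the updated origin and direction vectors yields the moving section described in the text.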
[0166] In some embodiments, the ultrasound simulator device 30 may be used for assessing the performance of a user. In some embodiments, as seen in
[0167] In some such embodiments, device 30 instructs the user to display an image of a specific section, for example by displaying an image or a verbal description of the specific section on display 34, on display 44, or overlaid on a surface 32, or by verbally specifying the section to be displayed, for example aurally using speakers 46.
[0168] In some such embodiments, device 30 is configured to assess whether the user has reached the correct section for display, how many attempts the user made until reaching the correct section, how many hand motions were required for the user to reach the correct section, and the amount of pressure applied by the user on surface 32. For this purpose processor 35 may include a user assessment module 50 including a motion assessment module 52 functionally associated with the ultrasound transducer simulator 36 and a pressure assessment module 54 functionally associated with surface 32. The assessment information collected from modules 52 and 54 is summarized, and, in some embodiments, a scoring module 56 functionally associated with display 34, display 44, and/or speakers 46 presents the user with a grade of the test, and, in some cases, with comments and/or guidance for improvement, visually on display 34 and/or 44, and/or aurally using speakers 46.
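The disclosure does not specify how scoring module 56 combines the criteria of paragraph [0168] into a grade; the weights, thresholds, and pressure band in the sketch below are therefore purely illustrative assumptions.

```python
def score_attempt(reached, attempts, hand_motions, pressure_newtons,
                  max_attempts=5, max_motions=50,
                  pressure_range=(2.0, 10.0)):
    """Combine the assessed criteria into a 0-100 grade: whether the
    trainee reached the requested section, how many attempts and hand
    motions it took, and whether the applied pressure stayed within a
    safe, effective band.  All weights and thresholds are
    illustrative, not from the patent."""
    if not reached:
        return 0.0
    attempt_score = max(0.0, 1.0 - (attempts - 1) / max_attempts)
    motion_score = max(0.0, 1.0 - hand_motions / max_motions)
    low, high = pressure_range
    pressure_score = 1.0 if low <= pressure_newtons <= high else 0.5
    return round(100 * (0.5 * attempt_score
                        + 0.3 * motion_score
                        + 0.2 * pressure_score), 1)

grade = score_attempt(reached=True, attempts=1, hand_motions=10,
                      pressure_newtons=5.0)
```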
[0169] In some embodiments, processor 35 also includes a user guidance module 58, functionally associated with the user assessment module 50 and configured, during a training or testing session, to guide the user to move the transducer simulator 36 (e.g., to the left or to the right), or to change the orientation of the transducer simulator 36, or to change the pressure applied to transducer simulator 36 in order to help the user reach the required section. In some such embodiments, the guidance information is provided as an overlay on the surface 32. In some such embodiments, the guidance information is provided to the user visually, such as on display 34 and/or on display 44. In some embodiments the guidance is provided audibly (e.g., higher or lower tones), for example using speakers 46. In some embodiments the guidance is provided tactilely, for example using tactile signal generator 48.
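A guidance module of the kind described in paragraph [0169] can be reduced to comparing the probe's current location on the location-identifying surface with the target location and emitting a directional hint. The axis conventions, tolerance, and hint wording below are illustrative assumptions.

```python
def guidance_hint(current_xy, target_xy, tolerance_mm=2.0):
    """Turn the gap between the probe's current location on the
    location-identifying surface and the target location into a
    simple spoken-style hint, as user guidance module 58 might.
    Assumed conventions: x grows to the user's right, y upward."""
    dx = target_xy[0] - current_xy[0]
    dy = target_xy[1] - current_xy[1]
    hints = []
    if abs(dx) > tolerance_mm:
        hints.append("move right" if dx > 0 else "move left")
    if abs(dy) > tolerance_mm:
        hints.append("move up" if dy > 0 else "move down")
    return " and ".join(hints) if hints else "hold position"

# Hypothetical situation: probe 20 mm left of the target.
hint = guidance_hint((30.0, 40.0), (50.0, 38.0))
```

The same string could equally drive an overlay on surface 32, audio via speakers 46, or the tactile signal generator 48.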
[0170] In some embodiments, processor 35 also includes a model modifying module 60 functionally associated with the repository 33, which is configured to modify at least part of the virtual three-dimensional model (e.g., its shape or orientation) during user-assessment, for example, to simulate muscular or fetal motion during an ultrasound procedure. The model modifying module 60 may modify the model at regular intervals, at random intervals, or upon receipt of input from an assessing entity as indicated by input arrow 62. In some embodiments, model modifying module 60 is functionally associated with the user assessment module 50 and specifically with user guidance module 58, so that guidance provided to the user of transducer simulator 36 may be updated upon modification by module 60 of the model being used for user assessment.
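One simple way a model modifying module could perturb the model at random intervals is to rigidly shift the voxels of a labeled organ by a small random amount. The sketch below uses a wrap-around shift as an illustrative stand-in for realistic deformation; the function name and parameters are assumptions, not from the disclosure.

```python
import random
import numpy as np

def jitter_organ(model, organ_mask, max_shift=2, rng=None):
    """Simulate fetal or muscular motion by shifting the voxels under
    `organ_mask` a small random amount along each axis, as model
    modifying module 60 might do at random intervals.  A rigid
    np.roll shift stands in for real tissue deformation."""
    rng = rng or random.Random()
    organ = np.where(organ_mask, model, 0)
    background = np.where(organ_mask, 0, model)
    for axis in range(model.ndim):
        organ = np.roll(organ, rng.randint(-max_shift, max_shift),
                        axis=axis)
    return np.maximum(background, organ)

# Toy model with a single bright organ near the center.
model = np.zeros((20, 20, 20), dtype=np.uint8)
model[8:12, 8:12, 8:12] = 200
moved = jitter_organ(model, model > 0, rng=random.Random(0))
```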
[0171] Reference is now made to
[0172] As seen in
[0173] In some embodiments, the three-dimensional orientation sensor 72 comprises a pen associated with a tablet computer, such as the Intuos3 Grip Pen commercially available from Wacom Company Ltd. of Tokyo, Japan.
[0174] In some embodiments, the insertion depth simulator 74 comprises a component similar to a computer mouse, mounted onto the three-dimensional orientation sensor 72, such that a lower position of the component along the three-dimensional orientation sensor 72 indicates a deeper virtual insertion of the needle simulator. In some such embodiments, the mouse is associated with the processor and provides to the processor information regarding its height over the surface 32, thereby providing to the processor information regarding the virtual depth to which the needle simulator is inserted.
[0175] In some embodiments, the insertion depth simulator 74 comprises a distance sensor. In some such embodiments, the distance sensor comprises a potentiometer. In some such embodiments, the distance sensor comprises a linear encoder. In some such embodiments, the distance sensor comprises a laser distance sensor. In some such embodiments, the distance sensor comprises an ultrasonic distance sensor.
[0176] In some embodiments, the three-dimensional orientation sensor 72 and/or the insertion depth simulator 74 comprises a three-dimensional camera, such as a 3D Time of Flight camera, commercially available from Mesa Imaging AG of Zurich, Switzerland, which camera may provide information regarding the three-dimensional orientation of the simulated needle and/or information regarding the depth to which the needle was inserted.
[0177] In some such embodiments, the insertion depth simulator 74 comprises a pressure sensor.
[0178] In some embodiments, such as the embodiment illustrated in
[0179] In some embodiments, the physical needle simulator 70 is configured to simulate an amniocentesis needle. In some embodiments, the physical needle simulator 70 is configured to simulate a laparoscopic needle. In some embodiments, the physical needle simulator 70 is configured to simulate a biopsy needle.
[0180] In some embodiments, a physical needle simulator is configured to simulate a different type of hard device used to penetrate into a body and guided by a user to a location in the body with the help of ultrasound imaging.
[0181] In use, a virtual three-dimensional model from the model repository 33 is specified and uploaded by the processor 35, in a similar manner to that described hereinabove with reference to
[0182] In addition to placing the physical transducer simulator 36 on the location-identifying surface 32 as described hereinabove with reference to
[0183] The processor receives information regarding the two-dimensional location of the transducer simulator 36 and information regarding the three-dimensional orientation of the transducer simulator 36, substantially as described above.
[0184] Additionally, the needle simulator 70 provides the processor 35 with information regarding the three-dimensional orientation of the needle simulator 70 and about the virtual depth of insertion of the needle simulator 70. In some embodiments, the information regarding the three-dimensional orientation is provided by the three-dimensional orientation sensor 72 and the information regarding the virtual depth of insertion of the needle is provided by the insertion depth sensor 74.
[0185] In response, the processor 35 provides to display 34 an image of a section of the model, indicated by reference numeral 80, such that the section corresponds to the three-dimensional orientation of the transducer 36, with a superimposed image 82 of a virtual needle having a location corresponding to the location, orientation and virtual insertion depth of the needle simulator 70.
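Superimposing the virtual needle of paragraph [0185] requires locating its tip inside the model from the quantities the needle simulator reports: the entry point, the orientation, and the virtual insertion depth. A minimal sketch follows; the angle conventions (pitch measured from the surface plane, yaw within it) are illustrative assumptions.

```python
import math

def needle_tip(entry_xy, yaw_deg, pitch_deg, depth_mm):
    """Locate the simulated needle tip from the needle simulator's
    reported entry point on the surface, its orientation, and the
    virtual insertion depth along the shaft.  Assumed conventions:
    pitch is the angle from the surface plane, yaw the heading
    within it, z the depth below the surface."""
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    horizontal = depth_mm * math.cos(pitch)
    x = entry_xy[0] + horizontal * math.cos(yaw)
    y = entry_xy[1] + horizontal * math.sin(yaw)
    z = depth_mm * math.sin(pitch)      # depth below the surface
    return (x, y, z)

# Hypothetical insertion: straight down (pitch 90 degrees), 30 mm deep.
tip = needle_tip((50.0, 40.0), yaw_deg=0.0, pitch_deg=90.0, depth_mm=30.0)
```

Mapping this tip into model coordinates lets the processor draw image 82 on the displayed section 80.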
[0186] As described hereinabove, in some embodiments the ultrasound simulator device 30 and the needle simulator 70 may be used for assessing the performance of a user, by instructing the user to insert the needle into a certain place in the three dimensional model and assessing the user's performance, substantially as described hereinabove with reference to
[0187] In some embodiments, a user assessment module of processor 35, such as user assessment module 50 of
[0188] In some embodiments, such as in a first training stage, the first virtual volume comprises a first three-dimensional volume and the second virtual volume comprises a second three-dimensional volume located near to, within, or surrounding the first virtual volume.
[0189] In some embodiments the first virtual volume simulates a uterine volume with amniotic fluid and the second virtual volume simulates an embryo or fetus thereinside, and the user-assessment module is configured to train the user to perform an amniocentesis procedure without harming the embryo or fetus.
[0190] In some embodiments the first virtual volume simulates a tumor tissue and the second virtual volume simulates healthy tissue, and the user-assessment module is configured to train the user to perform a biopsy of the tumor tissue without harming the healthy tissue.
[0191] In some embodiments the first virtual volume simulates a tissue of unknown character and the second virtual volume simulates healthy tissue, and the user-assessment module is configured to train the user to perform a biopsy of the tissue of unknown character without harming the healthy tissue in order to perform cytology tests to identify the type of tissue of unknown character.
[0192] In some embodiments, the first virtual volume simulates an undesired substance, and the second virtual volume simulates body tissue. For example, the first virtual volume may simulate a gall stone, a kidney stone, a lipoma, or a ganglion cyst.
[0193] In some embodiments, the user-assessment module is configured to provide a warning indication to the user when the needle simulator position, orientation and virtual insertion depth correspond to a simulated needle being dangerously close to the second virtual volume. For example, the user may be warned if the simulated needle is within one millimeter of the second virtual volume.
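The warning and contact logic of paragraphs [0193] and [0195] can be sketched as a distance test between the needle tip and the protected "second" volume. Idealising that volume as a sphere is an illustrative simplification; real anatomy would need a mesh or voxel-mask distance query.

```python
import math

def proximity_status(tip, protected_center, protected_radius_mm,
                     warning_distance_mm=1.0):
    """Classify the needle tip's relation to a protected ("second")
    volume, idealised here as a sphere: 'contact' when the tip is
    inside it, 'warning' when within the predetermined distance
    (1 mm in the example above), 'clear' otherwise."""
    gap = math.dist(tip, protected_center) - protected_radius_mm
    if gap <= 0.0:
        return "contact"
    if gap <= warning_distance_mm:
        return "warning"
    return "clear"

# Hypothetical geometry: tip 0.5 mm from a 9 mm-radius protected volume.
status = proximity_status((0.0, 0.0, 10.5), (0.0, 0.0, 20.0),
                          protected_radius_mm=9.0)
```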
[0194] In some embodiments, the warning indication comprises a visual indication. For example, the visual indication may be provided on the display, such as display 34 or 44 of
[0195] In some embodiments, the user-assessment module is configured to provide a contact indication to the user when the needle simulator position, orientation and virtual insertion depth correspond to a simulated needle being in contact with the second virtual volume.
[0196] In some embodiments, the contact indication comprises a visual indication. For example, the visual indication may be provided on the display, such as display 34 or 44 of
[0197] As described hereinabove with reference to
[0198] As described hereinabove with reference to
[0199] It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
[0200] Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the scope of the appended claims.
[0201] Citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the invention.
[0202] All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting. In addition, any priority document(s) of this application is/are hereby incorporated herein by reference in its/their entirety.