DETERMINING INTERACTIONS BETWEEN CELLS BASED ON FORCE SPECTROSCOPY
20220366708 · 2022-11-17
Assignee
Inventors
CPC classification
G06V10/751
PHYSICS
G01N33/4833
PHYSICS
B01L3/50273
PERFORMING OPERATIONS; TRANSPORTING
G06V20/69
PHYSICS
G02B21/367
PHYSICS
B01L3/502715
PERFORMING OPERATIONS; TRANSPORTING
B01L2400/0436
PERFORMING OPERATIONS; TRANSPORTING
International classification
G06V20/69
PHYSICS
Abstract
Methods and systems for determining interaction between cells are described wherein the method includes determining or receiving a sequence of images representing manipulating first cells, in a holding space, the holding space including a functionalized wall comprising second cells, the manipulating including settling of the first cells onto the functionalized wall and applying a force on the settled first cells; detecting groups of pixels representing first cells in first images representing the settling of the first cells onto the functionalized wall; tracking locations of detected first cells in the first images; and, determining settling events, a settling event being determined if a cell in a first image is not distinguishable from background of the first image, the location in the image at which a cell settling event is detected defining a cell settling location; detecting groups of pixels representing cells in second images captured during the application of the force and tracking locations of detected cells, wherein tracked locations of a detected cell in the second images form a tracking path, the first location of the tracking path defining a pop-up event, the location in a second image at which a pop-up event is detected defining a pop-up location; and, determining detachment events based on the settling locations and based on the pop-up locations, a detachment event defining a first cell being detached from a second cell due to application of the force on the first cell, and determining information about the interaction between first and second cells based on the force applied to the first cells.
Claims
1. A method for determining interaction between cells comprising: determining or receiving a sequence of images representing manipulating first cells, in a holding space, the holding space including a functionalized wall comprising second cells, the manipulating including settling of the first cells onto the functionalized wall and applying a force on the settled first cells; detecting groups of pixels representing the first cells in first images representing the settling of the first cells onto the functionalized wall; and, tracking locations of the detected first cells in the first images; and, determining settling events, a settling event being determined if a cell is no longer distinguishable from the background, the location in the first image at which a cell settling event is detected defining a cell settling location; detecting groups of pixels representing first cells in second images captured during the application of the force and tracking locations of the detected first cells, wherein tracked locations of the detected first cells in the second images form a tracking path, the first location of the tracking path defining a pop-up event, the location in a second image at which a pop-up event is detected defining a pop-up location; and, determining detachment events based on the settling locations and based on the pop-up locations, a detachment event defining a first cell being detached from a second cell due to application of the force on the first cell, and determining information about the interaction between the first and the second cells based on the force applied to the first cells.
2. The method according to claim 1 wherein the determining of the detachment events includes: determining a distance between one said settling location and one said pop-up location and determining one said detachment event based on a threshold value.
3. The method according to claim 1, wherein classification into a trackable or non-trackable cell is based on at least one of: intensity values of pixels of the group of pixels representing a cell; a shape, texture and/or dimensions of the group of pixels representing a cell; and/or, a contrast ratio between pixel values of the group of pixels representing a cell and pixel values representing the background of an image in which the group of pixels is tracked.
4. The method according to claim 1, wherein the determining if a cell is no longer distinguishable from the background is based on at least one of: one or more changes in pixel values of a group of pixels representing a cell; a shape, texture and/or dimensions of the group of pixels representing a cell; and/or, a contrast ratio between pixel values of the group of pixels representing a cell and pixel values representing the background of an image in which the group of pixels is tracked.
5. The method according to claim 1, wherein the determining of the images includes: determining or receiving one or more background images of the functionalized wall comprising the second cells; and removing the background from the images representing the manipulation of the first cells by using the one or more background images.
6. The method according to claim 1, wherein the determining of the detachment events includes: determining or receiving locations of one or more non-functional areas in the images of the functionalized wall surface, a non-functional area defining an area of the functionalized wall surface from which the second cells are absent; and disregarding one said pop-up location in the determining of detachment events if the pop-up location is located in or within a predetermined distance of the one or more non-functional areas.
7. The method according to claim 1, wherein the determining of the detachment events includes: determining or receiving one or more cluster locations in the images captured during the application of the force, a cluster defining an aggregation of cells which are not bound to the functionalized cell surface in the images when the force is applied to the first cells; and disregarding one said pop-up location in the determining of detachment events if the pop-up location is detected within one of the one or more cluster locations.
8. The method according to claim 1, wherein the determining of the detachment events includes: disregarding one said pop-up location in the determining of detachment events if the pop-up location is located within a predetermined distance of the edges of the images.
9. The method according to claim 1, wherein tracking the location of each detected cell includes: linking positions of detected cells in subsequent images using a minimization technique.
10. The method according to claim 1, wherein the method further includes: determining an avidity curve based on the detachment events and the force associated with each of the detachment events.
11. The method according to claim 1, wherein the first cells are effector cells and the second cells are target cells.
12. The method according to claim 1, wherein first cells include at least one of lymphocytes, monocytic cells, granulocytes, T cells, natural killer cells, B-Cells, CAR-T cells, dendritic cells, Jurkat cells, bacterial cells, red blood cells, macrophages, TCR Tg T-cells, OT-I/OT-II cells, splenocytes, thymocytes, BM derived hematopoietic stem cells, TILs, tissue derived macrophages, and innate lymphoid cells; and/or, the second cells include at least one of: tumor cells, stem cells, epithelial cells, B16 melanoma, fibroblasts, endothelial cells, HEK293, HeLa, 3T3, MEFs, HuVECs, microglia, and neuronal cells.
13. A module for analyzing images of cells being manipulated in a holding space, the module comprising a non-transitory computer readable storage medium having computer readable program code embodied therewith, and a processor coupled to the computer readable storage medium, wherein responsive to executing the computer readable program code, the processor is configured to perform executable operations comprising: determining or receiving a sequence of images representing manipulating first cells, in the holding space, the holding space including a functionalized wall comprising second cells, the manipulating including settling of the first cells onto the functionalized wall and applying a force on the settled first cells; detecting groups of pixels representing the first cells in first images representing the settling of the first cells onto the functionalized wall; and, tracking locations of detected first cells in the first images; and, determining settling events, a settling event being determined if a cell is no longer distinguishable from the background, the location in the first image at which one said cell settling event is detected defining a cell settling location; detecting groups of pixels representing the cells in second images captured during the application of the force and tracking locations of detected cells, wherein tracked locations of detected cells in the second images form a tracking path, the first location of the tracking path defining a pop-up event, the location in a second image at which a pop-up event is detected defining a pop-up location; and, determining detachment events based on the settling locations and based on the pop-up locations, a detachment event defining one said first cell being detached from one said second cell due to application of the force on the first cell, and determining information about the interaction between the first and the second cells based on the force applied to the first cells.
14. A system for determining interaction between cells comprising: a sample holder comprising a holding space for cells; a force generator for applying a force to the cells; an imaging system capturing images of the cells in the holding space; a controller module for controlling the force generator and the imaging system; a non-transitory computer readable storage medium having computer readable program code embodied therewith, and a processor coupled to the computer readable storage medium, wherein responsive to executing the computer readable program code, the processor is configured to perform executable operations comprising: determining or receiving a sequence of images representing manipulating first cells, in the holding space, the holding space including a functionalized wall comprising second cells, the manipulating including settling of the first cells onto the functionalized wall and applying a force on the settled first cells; detecting groups of pixels representing the first cells in first images representing the settling of the first cells onto the functionalized wall; tracking locations of detected first cells in the first images; and, determining settling events, a settling event being determined if one said cell is no longer distinguishable from background of the first image, the location in the image at which one said cell settling event is detected defining a cell settling location; detecting groups of pixels representing the cells in second images captured during the application of the force and tracking locations of detected cells, wherein the tracked locations of detected cells in the second images form a tracking path, the first location of the tracking path defining a pop-up event, the location in a second image at which the pop-up event is detected defining a pop-up location; and, determining detachment events based on the settling locations and based on the pop-up locations, a detachment event defining one said first cell being detached 
from one said second cell due to application of the force on the first cell, and determining information about the interaction between first and second cells based on the force applied to the first cells.
15. A computer program or suite of computer programs comprising at least one software code portion or a computer program product storing at least one software code portion, the software code portion, when run on a computer system, being configured for executing the method steps according to claim 1.
16. The method of claim 2, wherein the detachment event is determined if the distance between the settling location and the pop-up location is smaller than the threshold value.
17. The method of claim 4, wherein the changes in the pixel values include changes of intensity values.
18. The method according to claim 1, wherein the determining of the detachment events includes: determining or receiving one or more cluster locations in the images captured during the application of the force, a cluster defining an aggregation of cells which are not bound to the functionalized cell surface in the images when the force is applied to the first cells; and disregarding one said pop-up location in the determining of detachment events if the pop-up location is detected within a predetermined distance of one of the one or more cluster locations.
19. The method according to claim 1, wherein the second cells are effector cells and the first cells are target cells.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0054]
[0055]
[0056]
[0057]
[0058]
[0059]
[0060]
[0061]
[0062]
[0063]
[0064]
[0065]
[0066]
DETAILED DESCRIPTION
[0067]
[0068] The system of
[0069] The system may further comprise a light source 120 for illuminating the sample using any suitable optics (not shown) to provide a desired illumination intensity and intensity pattern, e.g. plane wave illumination, Köhler illumination, etc., known per se. Here, the light 122 emitted from the light source may be directed through the force field generator 108 to (the sample in) the sample holder 102 and sample light 124 from the sample is transmitted through the objective 114 and through an optional tube lens 126 and/or further optics (not shown) to the camera 116. The objective and the camera may be integrated. In an embodiment, two or more optical detection tools, e.g. with different magnifications, may be used simultaneously for detection of sample light, e.g. using a beam splitter.
[0070] In another embodiment, not shown but discussed in detail in WO2014/200341, the system may comprise a partially reflective reflector and light emitted from the light source is directed via the reflector through the objective and through the sample, and light from the sample is reflected back into the objective, passing through the partially reflective reflector and directed into a camera via optional intervening optics. Further embodiments may be apparent to the reader.
[0071] The sample light may comprise light affected by the sample (e.g. scattered and/or absorbed) and/or light emitted by one or more portions of the sample itself e.g. by chromophores/fluorophores attached to the cellular bodies.
[0072] Some optical elements in the system may be at least one of partly reflective, dichroic (having a wavelength specific reflectivity, e.g. having a high reflectivity for one wavelength and high transmissivity for another wavelength), polarisation selective and otherwise suitable for the shown setup. Further optical elements e.g. lenses, prisms, polarizers, diaphragms, reflectors etc. may be provided, e.g. to configure the system 100 for specific types of microscopy.
[0073] The sample holder 102 may be formed by a single piece of material with a channel inside, e.g. glass, injection moulded polymer, etc. (not shown) or by fixing different layers of suitable materials together more or less permanently, e.g. by welding, glass bond, gluing, taping, clamping, etc., such that a holding space 106 is formed in which the fluid sample is contained, at least during the duration of an experiment. While the force spectroscopy system of
[0074]
[0075] Further, the sample holder 212 may be connected to a fluid flow system 214 for introducing fluid and unbound cells into the holding space of the sample holder and/or removing fluid from the holding space, e.g. for flowing fluid through the holding space (see arrows in
[0076]
[0077] One or more software programs that run on the computer 118 of the force spectroscopy system may be configured to control the camera, the force field generator and the flow cell to conduct different experiments. In a typical experiment, cells, e.g. effector cells, may be flushed into the holding space of the flow cell and may interact, e.g. bind, with the target cells. This interaction can be probed by analysing the response of cells that are bound to target cells as a function of the force applied. Typically, the response of the cells is determined by analysing video frames that are captured by the camera. To that end, the computer may include an image processing module 128 comprising one or more image processing algorithms for analysing the response of the cells when they are manipulated in the flow cell using the force field generator. The image analysis of the video frames is described hereunder in greater detail.
[0078]
[0079] As will be described hereunder in more detail, the incubation phase may be imaged and, when the cells are introduced into the holding space and move towards the functionalized wall, groups of pixels in the captured images may be detected and tracked. However, as effector cells approach the functionalized wall, move over the wall surface and bind to target cells, the contrast between pixels representing effector cells and pixels representing the functionalized wall including the target cells may become very low, so that if the contrast drops below a certain level, cells can no longer be reliably detected and tracked.
[0080] After the incubation phase, a force may be applied to the effector cells that are bound to the target cells. The force may have a direction away from the functionalized wall surface. Typically, a force ramp will be applied to the effector cells, so that if the force becomes larger than the binding force, effector cells will detach from the target cells and move away in the direction of the force (
[0081] When the force is larger than the binding force, the effector cell will detach from the target cell and move in a direction that depends on the applied force, which may have a component perpendicular to the functionalized wall (e.g. the z-direction) and two components in the plane of the functionalized wall (e.g. the x and y directions). The location in the image at which a pop-up event is detected (i.e. the contrast between the group of pixels representing the cell and pixels representing the background is above a certain level) and the point in time at which the pop-up event occurred can be determined on the basis of the images (video frames) captured during the experiment. The time at which a cell detaches thus determines the force that was exerted on the effector cell at detachment. In a typical experiment, the force ramp may take between 2 and 10 minutes, but it can also be shorter or longer.
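For a linear force ramp, the force at the moment a pop-up event is detected can be recovered from the event's time stamp. The sketch below assumes a linear ramp; the function and parameter names are illustrative and not taken from the specification.

```python
def force_at_time(t, ramp_duration, f_max, f_start=0.0):
    """Return the force (e.g. in pN) exerted at time t (in seconds) by a
    linear ramp rising from f_start to f_max over ramp_duration seconds."""
    if t <= 0:
        return f_start
    if t >= ramp_duration:
        return f_max
    return f_start + (f_max - f_start) * (t / ramp_duration)

# A pop-up event halfway through a 300 s ramp to 1000 pN corresponds
# to a binding force of about 500 pN.
print(force_at_time(150.0, 300.0, 1000.0))  # → 500.0
```

A cell's time stamp at its pop-up event, fed through such a mapping, yields the binding force estimate for that cell.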
[0082] Based on a measurement scheme as described with reference to
[0083] While
[0084]
[0085]
[0086] A dotted circle 505 in the image may indicate that the algorithm has detected a group of pixels 506 within the circular area which is classified as a trackable cell. The cells that are visible in the images are depicted using a white color. This detection and classification process may be applied to the captured images. The location of detected groups of pixels that are classified as trackable, unbound cells (which may move both parallel and perpendicular to the functionalized wall) may be determined so that the movement of the detected cell as a function of time can be determined. The locations of a cell that is detected and tracked may form a so-called tracking path (not shown). Each cell that is detected and tracked in subsequent images may be linked with a unique identifier so that the locations of a tracked cell (the tracking path) and other information can be stored on a storage medium of the computer that executes the image processing algorithm.
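The contrast-based classification into trackable and non-trackable groups of pixels might be sketched as follows. The mean-intensity contrast measure and the threshold value are simplifying assumptions for illustration only.

```python
import numpy as np

def contrast_ratio(patch, background_level):
    """Contrast between a candidate group of pixels and the local
    background, based on mean intensities (a simplifying assumption)."""
    mean_cell = float(np.mean(patch))
    return abs(mean_cell - background_level) / (mean_cell + background_level + 1e-9)

def is_trackable(patch, background_level, threshold=0.2):
    """Classify a group of pixels as a trackable cell when its contrast
    against the background exceeds the threshold."""
    return contrast_ratio(patch, background_level) > threshold

bright = np.full((5, 5), 200.0)   # clearly visible unbound cell
faint = np.full((5, 5), 55.0)     # cell close to the wall surface
print(is_trackable(bright, 50.0))  # → True
print(is_trackable(faint, 50.0))   # → False
```

In practice the classifier could also weigh shape, texture and dimensions of the group of pixels, as the claims suggest.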
[0087] During the incubation phase, effector cells that are introduced into the microfluidic cell will gradually descend, e.g. under gravitational forces, towards the functionalized wall. Further, the cells may move over the surface of the functionalized wall until they encounter suitable target cells to bind to. When cells are descending towards the functionalized wall, the change in contrast, shape and/or dimensions of the group of pixels representing a cell may cause the image processing algorithm to classify the group of pixels as no longer trackable. Thus, during the incubation phase, the classification of a group of pixels that is classified as a trackable cell may change into non-trackable (i.e. a cell in a first image is no longer distinguishable from the background of the first image) when it moves towards the wall surface and binds to a target cell. Detected and tracked groups of pixels that are classified as cells may “disappear” into the background of pixels representing the functionalized wall surface. In
[0088] During the tracking of detected cells in subsequent images, the image processing algorithm may determine a so-called cell settling event in an image if the classification of a group of pixels is changed from a trackable cell into a non-trackable cell. The settling event may occur at a location in the image as schematically depicted in
[0089] A cell settling event may further be associated with a time instance indicating at which time (or in which image of the video) the cell settling event was determined. Such time instance may for example be determined based on a clock or a time stamp of the image in which the event was detected. As shown in
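The bookkeeping of settling events described above can be illustrated as follows. The sketch assumes tracking output in the form of one `{cell_id: (x, y)}` dictionary per frame; a settling event is recorded, together with its frame index (the time instance), the first time a tracked identifier disappears.

```python
def detect_settling_events(tracks_per_frame):
    """tracks_per_frame: list of {cell_id: (x, y)} dicts, one per frame.
    Returns {cell_id: (frame_index, (x, y))}, where frame_index is the
    last frame in which the cell was still trackable and (x, y) is the
    cell settling location."""
    events = {}
    for i in range(len(tracks_per_frame) - 1):
        current, following = tracks_per_frame[i], tracks_per_frame[i + 1]
        for cell_id, location in current.items():
            if cell_id not in following and cell_id not in events:
                events[cell_id] = (i, location)
    return events

# Cell 2 settles in frame 0 at (5, 5); cell 1 settles in frame 1 at (1, 1).
frames = [{1: (0, 0), 2: (5, 5)}, {1: (1, 1)}, {}]
print(detect_settling_events(frames))
```

A real implementation would also have to cope with transient detection drop-outs, e.g. by requiring the identifier to stay absent for several frames before declaring a settling event.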
[0090] Thereafter, a force generator of the force spectroscopy system, e.g. an acoustic force generator as explained with reference to
[0091] As shown in
[0092] Further, the location at which a cell is detected for the first time may define a cell pop-up location. The locations of detected cells in subsequent images may be tracked, wherein the tracked cells may not only move away from the functionalized wall but also sideways within the plane of the functionalized wall. For example, as illustrated in
[0093]
[0094] The tracking paths, the pop-up events and settling events may be used by the image processing algorithm to distinguish detachment events, i.e. pop-up events associated with detachment of a cell that was bound to a target cell, from other events, i.e. pop-up events that were recognized by the image processing algorithm but relate to other processes. As shown in
[0095] Further, detached cells or debris that enter the field of view of the imaging system may be detected, classified as trackable cells and tracked until the cells accumulate at the cluster area. For example, tracking paths 522.sub.4,5,9 relate to detected cells or detected debris that is recognized by the algorithm as a cell. Such cells may come from an area outside the field of view and enter it when the acoustic force attracts them towards the node. Events such as detached target cells, or cells or debris that enter the field of view, may be filtered out (disregarded) by using a distance correlation between the settling and pop-up events. This correlation is based on the observation that settled cells may move a certain limited distance before they find a suitable target cell to bind to. Thus, the location of a settling event and a pop-up event of a detached cell (a detachment event) should be within a certain distance of each other. When using the distance correlation to filter out the relevant pop-up events, the image processing algorithm may determine five detachment events 524.sub.1-5 in
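The distance correlation described above might be implemented as a greedy nearest-neighbour match between pop-up and settling locations. The greedy scheme and the threshold are assumptions; a global assignment (e.g. the Hungarian algorithm) could be used instead.

```python
import math

def match_detachment_events(settling_locations, popup_locations, max_distance):
    """Pair each pop-up location with the nearest unused settling
    location; keep the pair as a detachment event only if the two
    locations lie within max_distance (in pixels) of each other."""
    remaining = list(settling_locations)
    detachments = []
    for popup in popup_locations:
        if not remaining:
            break
        nearest = min(remaining, key=lambda s: math.dist(s, popup))
        if math.dist(nearest, popup) <= max_distance:
            detachments.append((nearest, popup))
            remaining.remove(nearest)
    return detachments

# The pop-up at (400, 400) has no nearby settling event (e.g. debris
# entering the field of view) and is therefore disregarded.
print(match_detachment_events([(10, 10), (100, 100)], [(12, 11), (400, 400)], 10.0))
```

Pop-up locations in cluster areas, non-functional areas, or near the image edges would be discarded before this matching step, as described in the claims.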
[0096]
[0097]
[0098] The method may start with a step 802 of determining or receiving a sequence of images of manipulating first cells, e.g. effector cells, in a holding space, wherein the holding space may include a functionalized wall comprising second cells, e.g. target cells. Here, the target cells may be connected to the wall of the flow cell so that they will not detach when a force is applied to the functionalized wall surface. The manipulating of the cells may include settling of the first cells onto the functionalized wall. The settling allows the cells to move around over the wall in order to find suitable target cells to bind to. The process of settling and binding may be referred to as the incubation phase. Thereafter, a force may be applied to the settled first cells.
[0099] In step 804, groups of pixels may be detected in first images representing the settling of the first cells onto the functionalized wall. These detected groups of pixels may represent first cells. Further, locations of detected first cells may be tracked in the first images, wherein during tracking a cell may be classified as no longer trackable. In that case, a cell that is tracked in consecutive images is no longer distinguishable from the background of the first image, because the cell is close to the functionalized wall surface. Such an event may be referred to as a settling event. The location in the image at which a cell settling event is detected defines a cell settling location.
[0100] In a further step 806, groups of pixels may be detected in second images captured during the application of the force. These detected groups of pixels may represent first cells, and locations of detected first cells may be tracked, wherein tracked locations of a detected first cell in the second images may form a tracking path, the first location of the tracking path defining a pop-up event, the location in a second image at which a pop-up event is detected defining a pop-up location.
[0101] Thereafter, detachment events may be determined (step 808), wherein detachment events are related to first cells being detached from the second cells due to application of the force on the first cells. These events may be determined based on the settling locations and the pop-up locations. The detachment events are then used to determine information about the interaction between first and second cells. In particular, the information about the interaction between the cells may be based on the force that was applied to the first cells when the detachment events occurred.
[0102] In an embodiment, a background correction method may be applied to the captured video images before the images are processed for cell detection and tracking. In such a method, the background of the captured images (video frames), i.e. the functionalized wall comprising the target cells, may be removed based on one or more captured background images. This way, the foreground information, i.e. the unbound cells, may be more clearly visible so that subsequent image processing, e.g. detection and tracking, can be improved. Thus, the background correction method may be used as a pre-processing step for improving the accuracy of the detection and tracking algorithm. This background subtraction may be based on a pre-acquired image or series of images (before flush-in of the first cells) and may be static (unchanging). It may however also involve a background model which is dynamically updated based on images taken during the experiment, and may involve an advanced background model describing e.g. the dynamic behavior of the target cells.
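A minimal sketch of the static variant of this background correction, assuming a single pre-acquired background image and 8-bit grayscale frames (illustrative assumptions):

```python
import numpy as np

def subtract_background(frame, background):
    """Subtract a pre-acquired background image (the functionalized
    wall with bound target cells) from a video frame. The absolute
    difference keeps cells visible whether they appear brighter or
    darker than the background."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return diff.astype(np.uint8)

background = np.full((4, 4), 40, dtype=np.uint8)
frame = background.copy()
frame[1, 1] = 200  # an unbound cell in the foreground
corrected = subtract_background(frame, background)
print(corrected[1, 1], corrected[0, 0])  # → 160 0
```

After subtraction, only the unbound (foreground) cells retain significant intensity, which simplifies the subsequent detection and tracking.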
[0103]
[0104] Instead of using a static background picture, a rolling median background correction algorithm may be used, or any other suitable algorithm for dynamically correcting the background. In the case of rolling median background correction, a background image may be formed by taking the median intensity value over a predetermined number of images, e.g. 10 images, for each pixel separately or for groups of pixels. This background image may then be used for removing the background of the current image (or at least a substantial part thereof).
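The rolling median correction described above might be sketched as follows; the window of 10 frames follows the example in the text, while the class and method names are assumptions.

```python
from collections import deque
import numpy as np

class RollingMedianBackground:
    """Rolling median background correction: the background is the
    per-pixel median over the last `window` frames and is subtracted
    from the current frame."""

    def __init__(self, window=10):
        self.frames = deque(maxlen=window)

    def correct(self, frame):
        self.frames.append(np.asarray(frame, dtype=np.float64))
        background = np.median(np.stack(self.frames), axis=0)
        return self.frames[-1] - background

corrector = RollingMedianBackground(window=10)
base = np.full((4, 4), 5.0)
for _ in range(3):
    corrector.correct(base)   # build up the background model
frame = base.copy()
frame[0, 0] = 50.0            # a moving cell appears
out = corrector.correct(frame)
print(out[0, 0], out[1, 1])  # → 45.0 0.0
```

Because the median is robust against short-lived outliers, a cell that moves through the field of view does not contaminate the background estimate the way it would with a rolling mean.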
[0105]
[0106]
[0107]
[0108]
[0109] Memory elements 1304 may include one or more physical memory devices such as, for example, local memory 1308 and one or more bulk storage devices 1310. Local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 1300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from bulk storage device 1310 during execution.
[0110] Input/output (I/O) devices depicted as input device 1312 and output device 1314 optionally can be coupled to the data processing system. Examples of an input device may include, but are not limited to, a keyboard, a pointing device such as a mouse, or the like. Examples of an output device may include, but are not limited to, a monitor or display, speakers, or the like. The input device and/or output device may be coupled to the data processing system either directly or through intervening I/O controllers. A network adapter 1316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to said data receiver, and a data transmitter for transmitting data to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with data processing system 1300.
[0111] As pictured in
[0112] In one aspect, for example, data processing system 1300 may represent a client data processing system. In that case, application 1318 may represent a client application that, when executed, configures data processing system 1300 to perform the various functions described herein with reference to a “client”. Examples of a client can include, but are not limited to, a personal computer, a portable computer, a mobile phone, or the like.
[0113] In another aspect, data processing system may represent a server. For example, data processing system may represent an (HTTP) server in which case application 1318, when executed, may configure data processing system to perform (HTTP) server operations. In another aspect, data processing system may represent a module, unit or function as referred to in this specification.
[0114] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0115] The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.