Smart Cell Monitoring Webinar - Essen BioScience
Mr. Tim O’Callaghan: Hello, and welcome to the latest Essen BioScience Webinar in our webinar series. My name is Tim O’Callaghan. I am Global Product Manager at Essen BioScience. It’s really great to see there is such a large audience from all over the world today.
Today’s webinar will be 35 minutes long, after which, we will have a brief Q & A session and provide access to the slides used in the presentation today. So, without further ado, it is my pleasure to introduce our speaker today Del Trezise who is Co-Founder and Director of the Essen BioScience labs in Europe. Del, the floor is yours.
Dr. Del Trezise: Okay. Hello, everybody, wherever you may be. Thank you for the introduction, Tim, and thank you to everybody who’s dialed in for today’s webinar. The title of the presentation is, as hopefully you can see on the slides, “Smart Cell Monitoring, Resolving the Changing Needs of Cellular Research With Continuous Live-cell Analysis”.
So, the structure of the webinar is as follows. To begin with, I will overview some of the fundamental challenges that we face in biomedical research and then discuss the changing landscape of cell models and cell systems that we use. I’ll then go on to introduce continuous live-cell analysis as a concept and as a method, and then describe some of the current and emerging technologies which are bringing this method to the everyday researcher. I’ll then spend some time illustrating the application of continuous live-cell analysis in some worked examples before wrapping up and providing a perspective on the future.
Okay. So, the fundamental challenges in biomedical research that we face: there probably isn’t a day that goes by where we shouldn’t be thinking, whether we’re doing in vivo models or in vitro work, how relevant and translational the model systems that we are using are to the things that we’re trying to understand. How relevant are they? How reliably do they inform us, and are they truly predictive and translational? All big questions.
Secondly, over the years, and largely through the input of the science of genetics and epigenetics and studies on the environment, it’s increasingly clear that patients are ever more complex than perhaps we had originally imagined. They are unique, and their biology changes over time. So, anything that we understand or can measure from a patient is only ever true at that particular point in time, and their biology will change and is dynamic as they age. So, these are two of the fundamental challenges that we need to consider.
On this next slide, I’ve tried to capture some of the major applications of cells in biomedical research within the context of these fundamental challenges. So, in the top left hand corner, basic mechanisms of biology: understanding signaling pathways in cells, understanding how cells join together to form tissues, understanding how tissues form together to form organs and whole beings.
In the top right hand corner, using cells to try to predict the efficacy and safety of potential new therapies. Do we see, in those cell models, what we would ultimately see when we take those potential new medicines to our patient population? In the bottom left hand corner, biomanufacture. Cells are very good at making certain biologicals. For example, antibodies.
And fourthly, the science of cell therapy has moved on tremendously over the last decade, so cells can be considered therapeutics in their own right.
Okay. So, this slide captures what I’ll refer to as the changing landscape of the use of cellular models. It’s illustrative and shows on the X axis how increasingly researchers are moving towards more relevant cell types in their research, whether those are primary cells or stem cell derived cells or even patient specific cells. And organizing them in more relevant ways, shown here on the Y axis: moving from simple two dimensional monocultures towards co-cultures and three dimensional microtissues and, indeed, organs on a chip. So, this landscape has moved towards researchers using more advanced cell systems in their research.
Now, these new, or newer, advanced cell systems certainly present some great opportunities for translational research, but they also present some different challenges compared to working with immortalized cell lines or tumor cell lines. As a generalization, they’re typically harder to source and to maintain. They’re often in limited supply. Some of the cells are expensive to purchase, and they’re more challenging to work with. The exemplification of this would be: for any of you working with CHO cells or HEK cells or HeLa cells, you probably get to the end of your working week and you throw away unused cells. You discard them. They’re low cost, and you don’t worry too much about that. I suspect for those of you working with human stem cell derived neurons, for example, you’re not throwing away cells as a matter of course, because they’re expensive and they’re precious to you. And you’re trying to extract the most information that you can from them.
Secondly, again compared to immortalized cells, these advanced cell systems are typically more dynamic. The properties of the cells change more over time. Some of the cells have a limited lifespan. Cells will differentiate, and the way people are organizing them, to take advantage of our understanding of the importance of the microenvironment, means that, for example, cell to cell communication is something which is more prevalent and interesting in these advanced cell models compared to immortalized cells.
This next slide really recapitulates what I just told you, so I won’t spend too long on it, but it’s a basic scoring of some of the high level attributes of different cell model systems. So, the top row, host cell lines, is the most reductionist and simple, and at the bottom, microtissues and organoids. And, as I mentioned, if you consider the relative cost, ease of use, and availability, there is certainly a trade off in working with these advanced cell models. So, the experiment that we’re all part of is this search for greater translation or relevance. But the quid pro quo is that these new cell models are more challenging to work with and typically more costly, and certain cell types have a limited availability, which forces the need to do what I’m going to refer to as “economic experiments”.
Okay. So, working with these advanced cell models presents some new requirements for analytical methods. It stands to reason that if you are going to take the time and effort to work with advanced cell models, your analytical methods should be commensurate with them. You don’t want to be taking really simplistic or low value measurements from an advanced cell model. That doesn’t really make too much sense.
Secondly, given the dynamic nature of these advanced cell models, analytical methods that track parameters over time and allow biologists to understand temporal changes will be ever more valuable. Thirdly, given that these cells are precious, any methods that are cell-sparing will also be favored, whether that be through true assay miniaturization or through streamlined workflows where you just don’t use as many cells, you don’t waste as many cells, and you don’t need as many cells for assay development and assay optimization.
And finally, you really want your analytical methods to be sufficiently user friendly that you can use them to focus on the complexities in the biology rather than focusing on the complexities of the analytical method in its own right.
So, with all that in mind, continuous monitoring and analysis as a method is ideally suited to meeting these needs presented by advanced cell models. And that will be the topic for the next section of this webinar. Now, just before we do that, we’d just like to take a short break, and a question will pop up on your screen that we’d like you to consider. Are you using an advanced cell system as we’ve defined here? If you wouldn’t mind just giving us a quick yes or no answer, we’ll just spend a few seconds on this, and then we’ll move on to the next part of the webinar.
Okay. Thank you very much. So, section two, Continuous Monitoring and Analysis. Now, I’m aware that many of you will already understand and appreciate what we refer to by this, but for those of you to whom this may be new, I’d like to start by presenting an analogy to another healthcare situation to really illustrate what we mean by continuous monitoring and analysis.
So, if we consider how traditionally healthcare checks have been done, it’s been done by visits to your general practitioner, who will measure your, for example, cardiovascular parameters. Let’s take heart rate as an example. You will have an appointment. Your GP will measure your heart rate. Hopefully, it’s within range. They will send you away and say come back in six months’ time. And then you come back and they’ll make the same measurement and determine whether it’s gone up, down, or stayed the same.
In the modern world, with the developments of new technology, and the illustration here is of a Fitbit wearable technology monitor, you can continuously measure your heart rate. You don’t need a general practitioner to help you.
Now, in the two scenarios shown here by A and B: in scenario A, the parameter does change as a function of time; it generally oscillates above and below a straight line here. There may be days where you’ve drunk too much coffee or had a stressful day at work, where the parameter is slightly higher, and conversely it may be slightly lower than the average in a more relaxed moment. However, if your continuous measurement looks more like line B, you probably should be concerned about this trend for your heart rate to increase over time, and you really should be going to see an expert to get some more detailed analysis done.
Hopefully, it’s clear from this example what the benefits of continuous monitoring are: firstly, a more accurate baseline; secondly, the ability to determine or detect trends and outliers in the data set. And you can use early detection to make an early decision. You wouldn’t wait six months to go and see your GP if your trend was a rising parameter.
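The trend detection described in this analogy can be sketched in a few lines. This is a minimal, illustrative example only: the `slope` helper, the readings, and the threshold are all hypothetical and not part of any product discussed in the webinar.

```python
def slope(readings):
    """Least-squares slope of evenly spaced readings (units per sample)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def is_trending_up(readings, threshold=0.1):
    """Flag a sustained upward drift (line B) vs benign oscillation (line A)."""
    return slope(readings) > threshold

# Hypothetical daily resting heart-rate readings (bpm)
stable = [70, 72, 69, 71, 68, 73, 70, 69]   # oscillates around a baseline
rising = [70, 72, 74, 75, 77, 79, 81, 83]   # steady upward drift
```

A fitted slope near zero corresponds to scenario A, while a persistently positive slope corresponds to scenario B and would trigger the early decision described above.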
Importantly, the detection method, in this case the Fitbit monitor, is very non-perturbing to the biology of interest. So, firstly, your lifestyle: you don’t need to take a day off work to go and see your general practitioner. And secondly, there’s this important thing we refer to as white coat syndrome. It’s inherently anxiety inducing going into an unfamiliar environment to have your cardiovascular parameters monitored. Whereas your Fitbit, you can just wear it day and night, and it’s completely non-perturbing to the measurement itself. Ultimately, this approach should be simpler and cheaper, too.
Now, I don’t want anybody to take this as saying visits to your general practitioner are no longer required, but hopefully you get the point that continuous monitoring provides something different and useful compared to these snapshot measurements.
Okay. So, applying that same logic to live-cell analysis: continuous live-cell analysis. What is it? Well, put simply, it’s the continuous or semi-continuous measurement of parameters in cells. These should be relevant analyses in real time: things like cell function measurements or morphology measurements or phenotype measurements. The measurements should be made from completely non-perturbed, living cells, in which case the environmental control of those cells is imperative.
Thirdly, the measurement should be non-exhaustive. So, you don’t want to be depleting your cells each time you make a measurement from them. If the biology of interest unfolds over hours, days, or weeks, then your continuous live-cell analysis approach should be able to follow that long-term biology.
And finally, the live-cell analysis approach should be applicable across many workflows. So, not just for the types of kinetic assays that you may think about, but also throughout the husbandry of the cells and any manipulations to the cells that you may make.
Okay. So, just thinking about that kind of continuum of cell culture, cell manipulation, and phenotypic assay, I’ve tried to list on this slide a few of the workflow applications where continuous live-cell analysis adds value. I won’t go through them one by one, but they’re clustered as cell culture activities, cell manipulation activities, for example, stem cell reprogramming or clonal dilution, and cell assay activities, starting from quality control of cell plating through to detailed mechanism of action studies.
To loop back to the analogy I drew earlier, just to illustrate how this differs from traditional endpoint measurement: for those of you making endpoint measurements with plate readers, for the most part, the first true measurement that you make from your cells is a single time slice of the assay part of this continuum. You don’t make measurements through the culture and manipulation phases of this.
Whereas, with continuous live-cell monitoring, you have the opportunity to derive parameters on cell health and morphology and growth through your cell husbandry activities. You can understand the efficiency of transfection if you’re introducing new genes, for example. You can verify the consistency of plating if you’re using plates for your assays, and the effects of any pretreatments, before you get into the kinetic assay itself, where you can follow the full time course of the biology and not have to worry about which particular time point may be most relevant for the mechanism of interest.
Okay. So, I want to move on now to introduce a couple of the technical approaches which are used for continuous live-cell analysis. The first is electrical measurements. So, there are a number of systems out there now that allow researchers to measure impedance or extracellular field potentials from cells. These methods can be used by systems that reside within a standard cell incubator, providing the essential environmental control. They do require specialist electrode plates in order to resolve the signals, but the signals are non-invasive. They’re label free, and they can be applied repeatedly and over the long-term.
In my view, the most relevant and value-adding applications for the electrical technologies are for measuring things from excitable cells, so neurons and cardiomyocytes. However, impedance is also a surrogate for a number of other cellular phenotypes, such as cell migration, cell proliferation, and cell adhesion, among others, and has been used in that way.
Some of the downsides, or the challenges, with the electrical approach are that the impedance signals can be difficult to interpret, because they translate to so many different cellular phenotypes. In some cases, rather high cell numbers are required in order to make robust impedance measurements, which is somewhat of a downside compared to the premise of cell-sparing requirements with the advanced cell models. And finally, some of the electrode plate consumables can be reasonably costly.
I'm showing two examples on the right hand side of this slide, so just to talk about those very briefly. The top right hand panel shows some cellular impedance measurements over a three-day period of A549 cells plated onto these electrode plates. During the first 24 hours, the impedance increases as the cells adhere to the plates, and then the different groups overlaid here are treated with different cytotoxic compounds. And you can see that some prevent the increase in impedance associated with proliferation of the cells. So, this is, in effect, a proliferation-based cytotoxicity assay.
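The readout described for this impedance example reduces to a simple calculation: the net impedance gain after the adhesion phase, expressed as percent inhibition relative to a vehicle control. The sketch below is illustrative only; the trace values, sampling interval, and function names are all hypothetical, not taken from any instrument’s software.

```python
def impedance_increase(trace, baseline_hours=24, interval_hours=1.0):
    """Net impedance gain after the adhesion phase (first `baseline_hours`)."""
    idx = int(baseline_hours / interval_hours)
    return trace[-1] - trace[idx]

def percent_inhibition(treated, vehicle, **kwargs):
    """Inhibition of the proliferation-driven impedance gain vs vehicle control."""
    return 100.0 * (1.0 - impedance_increase(treated, **kwargs)
                    / impedance_increase(vehicle, **kwargs))

# Hypothetical impedance traces sampled every 6 hours over 72 hours
vehicle = [0, 2, 4, 5, 5, 6, 7, 8, 9, 10, 11, 12, 13]
fully_blocked = [0, 2, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5]  # cytotoxic compound
```

A compound that fully prevents the post-adhesion impedance rise would score 100 percent inhibition in this scheme.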
The lower panel shows some measurements from a multi-electrode array, so field potentials in mouse spinal cord neurons. During the first couple of weeks of these cultures, you don’t detect much electrical activity in this paradigm. But if you leave the cells long enough, they will start to communicate with each other, and using extracellular field potentials, you can start to measure these action potentials in these spinal cord neurons. And you can do this for weeks and weeks on end using the appropriate technology.
Okay. So, that’s the electrical approach. Let’s now talk about the imaging approach. Again, this will hopefully be obvious to many, but this approach involves taking time-lapse images of cells using imaging devices or microscope systems that are housed within environmentally controlled chambers in order to keep the cells happy and healthy. You can make label free measurements from cells using phase contrast or brightfield microscopy, or you can use non-perturbing fluorescent probes in order to measure other parameters from cells, which I’ll come and talk more about in a moment.
As I mentioned, these devices are designed to work within the humidified and warm environment of a cell incubator, therefore, keeping the cells happy. But, in comparison to the electrical devices, you don’t need specialized consumables. You can do this with regular tissue culture plastic. Applications are numerous, ranging from cell health through to cell quality control and advanced phenotypic assays.
And a final important point on here is that these types of imaging systems do not require particularly large numbers of cells, so they are very well aligned to some of the requirements of the advanced cell models that I have mentioned.
Okay. So, on this table, I’ve tried to capture some of the different types of live-cell analysis imaging solutions. I’ve clustered them into four types: compact microscopes, live-cell imagers dedicated to this application, microscope workstations, and high content imagers. The compact microscopes and live-cell imagers work within a standard cell incubator. The workstations typically have their own housing for environmental control, and the high content imaging devices typically have bolt-on stages which, in fairness, are not particularly good at long-term cell stability. They’re okay for one or two days, but much beyond that, they’re less good. The compact microscopes are typically of lower cost, but the consequence is they have less flexibility in the optical systems, lower capacity and throughput, and little in the way of automated image analysis tools.
The dedicated live-cell imagers, as you would expect, are targeted towards this application, so they have a very balanced profile in terms of throughput, capacity, automated analysis, and flexibility. The capacity and throughput scoring on the high content imager here may surprise you, so just to point that out: that’s with regard to use for continuous analysis. High content readers are very good for single endpoint analysis, but bear in mind they can only read one plate at a time. If you want to continuously analyze from cells, then their overall capacity is fairly low.
Okay. So, just before we go on to the third section of the webinar, we have another question that we’d like you to think about for a moment. The question is on your screen. Are you currently performing any continuous or real time live-cell analyses in your research? I’ll just give you a few moments to think about that. Okay. Thank you very much.
So, let's come on to some examples, and hopefully this is where it will all fall into place. The first example I'd like to talk about with regards to the application and the value of continuous live-cell analysis is a project in which we were looking to differentiate human cortical neural stem cells and develop long-term, stable neural networks in order to do pharmacological studies upon. So, to explain the graph on the left, this is the time course of the development of neurites in these cultures over an eight-day period. We used neurite length as a marker for the differentiation into the neuronal phenotype and then, subsequently, as an index of the neural network.
We took these cryopreserved cells, plated them into the wells of a 96 well plate, and started to measure neurite length. For the first 24 hours or so, there were little or no neurites in these cultures, as you would expect; these were not differentiated, these were precursor cells. After 24 hours, we added an exogenous differentiation stimulus, and then we monitored for the next four or five days the development of these neurites. And, lo and behold, we saw this consistent development of neurites and this differentiation of the precursors into mature neurons; each line here represents data from a single well.
And after 200 hours, you could see a relatively stable neurite network under these conditions, which is all very pleasing. Just to show you briefly what this looks like in video: this is a video now of the neurites differentiating; they’re changing into this elongated neuronal phenotype. And this is the corresponding quantification. These measurements were made using an IncuCyte ZOOM live-cell imager, which applies an algorithm to look for long, thin structures, which in this case are labeled magenta, and then it quantifies the total length and number of branch points of these neurites. And that’s the data which is shown in the graphic. So, hopefully, that’s clear.
Okay. Now, this is where it got interesting. When we continued to monitor these cells, what we noticed was that after about 300 hours in culture, in a number of the wells we began to see this deterioration and drop off in the total neurite length. I’ve separated these wells out and color coded them red. So, in 7 of the 24 wells, we began to see this deterioration in phenotype, which we weren’t expecting.
When we reviewed those images and reanalyzed them for confluence as well as neurite length, it was very clear that there was a strong correlation between the drop off in neurites and this increase in cell confluence. And it didn’t take us long, by retrospective analysis of the images, to verify that the reason why we saw this unstable neurite network in the subset of these wells was because the non-terminally differentiated precursors, which were relatively few and far between, had begun to proliferate and dominate the culture. So, throughout the course of the assay, we had the ability to understand what was going on.
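The retrospective analysis described here, correlating per-well confluence against neurite length to flag deteriorating wells, can be sketched as follows. This is a hypothetical illustration, not the actual IncuCyte analysis: the well IDs, time series, and the correlation cutoff are invented for the example.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def flag_unstable_wells(neurite, confluence, r_cutoff=-0.8):
    """Wells where rising confluence tracks falling neurite length."""
    return [well for well in neurite
            if pearson(confluence[well], neurite[well]) < r_cutoff]

# Hypothetical late-phase time series for two wells
neurite = {"A1": [10, 20, 30, 35, 36],   # stable network
           "B2": [36, 30, 22, 15, 8]}    # deteriorating network
confluence = {"A1": [30, 31, 32, 31, 33],
              "B2": [30, 45, 60, 75, 90]}   # precursors taking over
```

A strongly negative correlation in a well is the signature of the precursor proliferation complication described above.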
So, just to reflect on that example: in this case, continuous live-cell analysis provided us with two things. One was better biological insight than we feel we would have had with an endpoint approach. And the second was greater productivity. Real time observation and measurement gave us verification of cell health and morphology throughout the study. All of our measurements were made from living, non-perturbed cells; we didn’t need to fix or stain or wash them. We, at all times, understood the time course of the neurite outgrowth and stabilization, and we could troubleshoot to identify this precursor proliferation complication.
With regards to productivity, all of the images were automatically taken by the IncuCyte and automatically analyzed, so there was no overhead to the researcher. I showed you data from the 24 control wells in this study. We took 78 time point measurements, equating to just under 1,900 data points. These wells used no more than half a million total cells which, given that these are relatively expensive and precious cells, was an acceptable cost to us. And we didn’t need to purchase any labeling antibodies for the experiments.
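The data point figure quoted here follows from simple arithmetic, using the well and time point counts as stated in the talk:

```python
wells = 24         # control wells shown in the study
time_points = 78   # automated measurements per well
data_points = wells * time_points  # "just under 1,900 data points"
```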
And, importantly, all of the measurements were non-exhaustive, so the cells were still viable at the end of all of this for follow up studies, for example things like immunocytochemistry.
Okay. So, let's think about the second example. This is slightly different. This is an example where we assembled a co-culture assay to look at immune cell killing of adherent tumor cells for an immuno-oncology project. So, briefly, what we did is we created a co-culture of ovarian cancer cells with human peripheral blood mononuclear cells, PBMCs, as the effector cells. We first labeled the nuclei of the ovarian cancer cells with a nuclear targeted red fluorescent protein so that we could quantify the cancer cells in their own right.
We duplexed this with a measurement of apoptosis using a fluorescent caspase 3/7 substrate which only fluoresces when cells begin to express an active caspase enzyme. We activated the co-culture with IL-2/CD3 to prompt the effector cells to target the tumor cells. We measured the tumor cell number and the apoptotic cell death by quantifying the red and green channels respectively.
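The duplexed red/green classification described here can be sketched as a simple per-object gating step. This is a hypothetical illustration of the logic only, not the actual IncuCyte processing; the thresholds and intensity values are invented for the example.

```python
# Illustrative intensity cutoffs (fold over background); a real analysis
# would derive these from the images themselves
RED_THRESHOLD = 1.5
GREEN_THRESHOLD = 2.0

def classify(objects):
    """objects: (red_intensity, green_intensity) per detected object.
    Returns (live_tumor_count, apoptotic_tumor_count)."""
    live = apoptotic = 0
    for red, green in objects:
        if red < RED_THRESHOLD:
            continue  # not a labeled tumor cell (e.g. an unlabeled PBMC)
        if green >= GREEN_THRESHOLD:
            apoptotic += 1  # caspase-3/7 substrate has been cleaved
        else:
            live += 1
    return live, apoptotic
```

Counting the red channel gives the living tumor cell number; the red-plus-green overlap gives the apoptotic cell death readout.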
So, to begin with, how was continuous live-cell analysis applied? Well, from the outset, when we grew these SKOV-3 target cells in six well plates, we did the transfection with the NucLight Red lentivirus construct. At that point, we didn’t know the optimal concentration (MOI) of lentivirus to use. So, we did a simple matrix experiment where we used a range of different MOIs in the absence and presence of polybrene. What you can see from this analysis, in the right half of this slide, is that the optimal condition was MOI 6 plus polybrene; after two days, we saw a high infection rate. And the image on the bottom left hand side shows that roughly 70 to 80 percent of these cells were expressing the nuclear targeted RFP.
We then took these cells and continued to monitor them. We put them into a T175 flask to bulk them up. This now shows live-cell analysis of the cells in this T flask. And what you can see here is the growth curve for these cells and an image taken at 100 hours, showing that, in the presence of the puromycin selection antibiotic, by eliminating cells not expressing the RFP, we could purify the cell population.
So, we then progressed with this purified cell population into plating these cells in preparation for our co-culture assay. We worked in 96 well plates, and our initial target, by seeding the stock cells at 2,000 cells per well, was to bring about roughly 20 percent confluence, or 20 percent area occupancy, across this plate. The data on the bottom left hand side of this slide shows we did a reasonably good job here. On average, we had 18 percent confluence with relatively little variance from one well to the next and an SD value of two percent. We considered this a good starting point. And you can see the image of those healthy, happy SKOV-3 cells in the upper right hand panel.
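The plating QC summarized here (mean confluence, well-to-well SD, and a check for aberrant wells) is a straightforward calculation. The sketch below is illustrative; the well IDs, confluence values, and the outlier rule are hypothetical, not the actual plate data.

```python
import math

def plating_qc(confluence, sd_limit=2.0):
    """Mean and SD of per-well confluence, plus wells beyond mean +/- sd_limit*SD."""
    vals = list(confluence.values())
    n = len(vals)
    mean = sum(vals) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / (n - 1))
    outliers = [w for w, v in confluence.items() if abs(v - mean) > sd_limit * sd]
    return mean, sd, outliers
```

A low SD with no flagged wells corresponds to the consistent plating described above.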
We then added the PBMCs; we required 8,000 cells per well for this experiment, and the yellow arrow highlights one of these. It’s a very small, round, bright, high contrast cell, contrasting, obviously, with the larger, red-nucleated SKOV-3 cells. So, continuous live-cell analysis was used, basically, to QC this plate before and after the addition of the effector cells.
So, moving on to the assay itself. What you see here now is the continuous monitoring of these cultures over the seven-day period, and the top left hand panel is the microplate view. This analysis is done in real time, so you don’t have to wait until the end of seven days to see the outcome of the experiment. You’re watching what happens as time goes by.
The parameter on the Y axis is the number of red objects, so the number of living SKOV-3 cells in the culture. The controls are shown in pink, and you can see that as the SKOV-3 cells proliferate over time, after about 120 hours they are fully confluent. In the presence of PBMCs alone, you see no effect on the ability of the SKOV-3 cells to proliferate, but if you activate those PBMCs with IL-2/CD3, which is now the blue line in the panel on the top right, you can see that the cells stop proliferating at around about 60 hours and the cell number starts to decrease as the effector cells start to kill the target cells.
If we look at the green channel to monitor the appearance of apoptotic SKOV-3 cells: in the controls, the pink and the dark blue, we saw little or no apoptosis up to that 100 hours. After that, as the cultures began to get over confluent, you start to see some dying cells, but little in comparison with what you see with the activated PBMCs. So, in the IL-2/CD3 group, at around about 60 hours, we start to see the appearance of apoptotic tumor cells.
We can really verify that that’s happening by inspecting the movie. So, this now shows a movie of these cells. The red cells are the tumor cells, and you see the very highly motile PBMCs, and in a moment, you’ll see one particularly large tumor cell, having been attacked by these effector cells, begin to die. There it goes, and then it glows green as the caspase substrate is cleaved and the fluorophore binds to the DNA. So, we have a very visual method of verifying what we’re trying to measure.
Okay. And then, finally, for this story, we can do this at scale. I show a four plate study here with the IncuCyte ZOOM. You could actually do six plates, and you could do six 384-well plates. But the example here is 4 × 96-well plates. This, in fact, was an antibody dependent cell killing assay where we were looking to test the effect of Herceptin and Herceptin biosimilars on PBMC killing of tumor cells. The important values for the industrious screeners in the audience are these plate QC Z-prime parameters measured from the high and low controls, which were all in excess of 0.6, illustrating a highly precise and robust screening assay.
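For the screeners in the audience, the plate QC statistic referred to here is commonly the Z-prime (Z′) factor, computed from the high and low control wells. A minimal sketch, with made-up control values for illustration:

```python
import math

def z_prime(high, low):
    """Z' = 1 - 3*(sd_high + sd_low) / |mean_high - mean_low|."""
    def stats(vals):
        m = sum(vals) / len(vals)
        sd = math.sqrt(sum((v - m) ** 2 for v in vals) / (len(vals) - 1))
        return m, sd
    mean_hi, sd_hi = stats(high)
    mean_lo, sd_lo = stats(low)
    return 1.0 - 3.0 * (sd_hi + sd_lo) / abs(mean_hi - mean_lo)

# Hypothetical high (maximum killing) and low (no killing) control readouts
high_controls = [100, 102, 98, 100]
low_controls = [10, 11, 9, 10]
```

Values above roughly 0.5 to 0.6, as quoted in the talk, indicate a wide, well-separated assay window suitable for screening.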
We required only a small number of cells for the entire study: 3 million PBMCs in total. And I just show you one representative concentration response curve here, for Herceptin, to illustrate how this approach can be turned into robust, quantitative pharmacology at scale.
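Concentration response curves like the one shown are typically summarized by fitting a four-parameter logistic model; in practice you would use nonlinear least squares (for example, SciPy's curve fitting), but the sketch below shows the model and a crude log-interpolated EC50 estimate. All parameter values and data here are hypothetical, not the Herceptin data from the slide.

```python
import math

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic model for a concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

def ec50_by_interpolation(concs, responses):
    """Log-linear interpolation of the concentration at half-maximal response."""
    half = (min(responses) + max(responses)) / 2.0
    points = list(zip(concs, responses))
    for (c1, r1), (c2, r2) in zip(points, points[1:]):
        lo, hi = sorted((r1, r2))
        if lo <= half <= hi:
            frac = (half - r1) / (r2 - r1)
            return 10 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    raise ValueError("half-maximal response not bracketed by the data")
```

Applied to synthetic data generated from the model itself, the interpolation recovers the EC50 used to generate it.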
Okay. So, just to reflect on example two: really similar high level messages about the value of the approach, biological insight and enhanced productivity. We could monitor the full time course using a direct readout, in this case target cell death, throughout the course of this assay. We could multiplex that with the apoptosis readout and the cell morphology analysis to really build confidence in the signals that we were measuring. The time lapse movies verify certain physical interactions between target and effector cells and really support and add credibility to the quantified data.
As with the previous example, all of the images were taken and analyzed automatically by the IncuCyte system. Hopefully, I showed you how we can streamline some of the reagent and assay builds using this continuous live-cell analysis approach. In the assay itself, we had four 96 well plates with two metrics and 64 time points, so we analyzed around 49,000 data points in this study.
And again, in similar ways to what I showed you before, the cells remained viable throughout the experiment for follow up studies. So, you could take them for flow cytometry analysis, or you could sample supernatants for further measurements, for example.
Okay. So, that’s the story. Just to wrap up: I showed you, to begin with, how the cellular research landscape is changing, and researchers are moving towards more complex models where cells become more precious and dynamic entities. I hopefully showed you how well aligned continuous live-cell analysis is to the new requirements presented by these advanced cell models. The continuous live-cell imaging methods I showed you provide both valuable biological insight as well as enhanced productivity. And this can be done at scale and across a range of different cellular workflows.
My concluding statement is that continuous live-cell imaging and analysis is strongly emerging as a powerful tool for the cell biologist, particularly for those of you who are working with, or considering working with, more advanced cell systems. So, on that note, I will finish up. I’ll hand back to Tim, who will facilitate the question and answer session. All right. Thank you.
Mr. Tim O’Callaghan: Great. Thanks, Del. Thanks for a really great overview of how live-cell imaging and analysis can be used across a range of different cell workflows, from characterizing cells in culture, to QCing manipulated and modified cells, and also in real-time functional assays. But, before we jump into the Q&A session, I'd just like to draw your attention to the handouts section of the webinar toolbar, where you can now download a copy of the slides used in today’s presentation. You can take a look at the quick snapshot I've taken of the toolbar here on the right-hand side of the slide. Hopefully, you can recognize it in your webinar toolbar. Click on it to get access to all of the slides from today’s presentation.
We will also be sending you an e-mail with links to a white paper that expands on the content and themes covered in today’s session, and to a full recording of the webcast. In addition, we will provide links to publications showing how live-cell imaging and analysis has been used in labs around the world to drive research in oncology, immunology, neuroscience, and many other disciplines. We will also provide a link to the SelectScience website, where you can access independent reviews and customer testimonials for all of the live-cell imaging technologies discussed here today.
Okay, so at this point, I'd like to open the floor to a Q&A session. I can see a number of questions already coming in from the audience. So, please continue to send your questions for Del and the team via the question box in the webinar toolbar, and we’ll continue to address them for you. So, looking at what we've got here at the moment, we've got a question from Carter. This is a question about the T cell [inaudible], actually. Why start at 20 percent confluence for a short-term assay like ADCC? Wouldn’t you want more target cells so that you have a denser image, especially if you are anticipating the loss of target cells by effector cell killing? So, I guess this is a question combining proliferation and cell killing there.
Dr. Del Trezise: Yeah. That’s a great question, and it’s a perfectly reasonable standpoint. We have taken the view of building our assays to grow cells from low confluence and to look for effects on proliferation as well as the appearance of dead cells. But, absolutely, you can equally set these cells up at a much higher confluence and look for a reduction in cell number and the appearance of apoptotic cells.
The only downside I can think of with that approach is that if the killing time course is slow, then potentially the cell cultures will reach confluence before you see any killing. But I have to agree with the suggestion, particularly for ADCC, where we know that the killing profile can be relatively quick. And in this case, the SKOV-3 cells are not so precious, inasmuch as they’re an immortalized cell line, so you could certainly use a higher cell number to begin with. So, yeah, that’s a great comment. Thank you.
Mr. Tim O’Callaghan: Okay. I've got a question here from Susan. I liked your example of neuronal differentiation. Can you provide some other examples of the use of live-cell analysis in monitoring cell manipulations?
Dr. Del Trezise: Okay. Thank you, Susan. Yeah, that’s a great question. I just picked two examples for today, but yes, there are certainly others. Off the top of my head, there are a couple I can think of. We recently did a study with human monocytes, where we took monocytes and used two different differentiation protocols to induce the macrophage phenotype. So, we differentiated cells into M1 and M2 macrophages, where we could verify their difference in cell morphology using continuous live-cell analysis. And then, we used those selfsame cells to do real-time chemotaxis assays. So, that was, I think, a nice example where live-cell analysis was used from the cell husbandry through to the kinetic assay.
Another slightly different example I can think of is clonal dilution, where we did some experiments to try to generate truly clonal cell lines. So, we seeded cells out at very low density on plates, typically less than half a cell per well on average, and then looked for the emergence of colonies over time. That was a nice way, again, of using the ability to review historic data and couple it to later time points: you can wind back and verify that the healthy colonies which emerged really, truly formed from a single cell. So, hopefully, those two examples are helpful. Thank you.
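[Editor's note: the "less than half a cell per well" seeding density Del mentions is the standard limiting-dilution regime, where Poisson statistics give the expected fraction of growing wells and the chance that a growing well is truly clonal. A general sketch of that reasoning, not part of the protocol described in the talk:]

```python
import math

def poisson_clonality(mean_cells_per_well):
    """Return (fraction of wells expected to show any growth, probability
    that a growing well started from exactly one cell), assuming the number
    of cells seeded per well is Poisson distributed."""
    lam = mean_cells_per_well
    p_nonempty = 1.0 - math.exp(-lam)         # P(k >= 1): well shows growth
    p_single = lam * math.exp(-lam)           # P(k == 1): exactly one cell
    return p_nonempty, p_single / p_nonempty  # P(k == 1 | k >= 1)

growth, clonal = poisson_clonality(0.5)  # 0.5 cells/well, as in the talk
# At this density, roughly 39% of wells are expected to show growth,
# and about 77% of those growing wells are truly clonal; the imaging
# history is what lets you confirm clonality for each individual well.
```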
Mr. Tim O’Callaghan: All right. Thank you. A question here from Kyulee [phonetic]. Could we use your ADCC protocol for IC50 calculation?
Dr. Del Trezise: Yes. Yes, we could. I somewhat glossed over the pharmacology that came out at the end of that, but we have routinely been using that protocol to test Herceptin biosimilars, for example. We have a similar assay in suspension cells, where we've been looking at Rituximab and Rituximab biosimilars and doing full quantitative IC50 pharmacology for decision making and for release and potency assays in the biosimilars field. So, absolutely, these types of assays very much lend themselves to full quantitative pharmacology.
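[Editor's note: concentration-response data of the kind discussed here are conventionally fit with a four-parameter logistic (Hill) model to extract the IC50. The sketch below uses made-up illustrative numbers, not the Herceptin data from the talk, and SciPy's general-purpose `curve_fit` rather than any specific IncuCyte analysis software:]

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic: response as a function of concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Synthetic concentration-response data (illustrative only).
conc = np.array([1e-3, 1e-2, 1e-1, 1.0, 10.0, 100.0])  # e.g. ug/mL antibody
resp = np.array([98.0, 95.0, 80.0, 48.0, 12.0, 4.0])   # % target-cell survival

# Fit the curve; p0 gives rough starting guesses for bottom, top, IC50, Hill.
params, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 1.0, 1.0])
bottom, top, ic50, hill = params
print(f"IC50 ~= {ic50:.2g}")  # concentration giving half-maximal effect
```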
Mr. Tim O’Callaghan: Okay. I've got a nice question here from Peggy. Do you change the media during the open-ended experiments?
Dr. Del Trezise: Yeah. Good. Thank you, Peggy. Yes, I glossed over that detail, but certainly, if I take something like the neuronal differentiation protocol, where I showed you data running out for, you know, a couple of weeks, then, absolutely, it’s essential to replace some of the media. We typically try not to wash the plate and remove the entire media; instead, we may replace, say, 70 percent of the media every third or fourth day.
It’s interesting. Sometimes in the data you can actually see the proliferating cells begin to decrease in their proliferative rate as they exhaust the nutrients in the media and as the buildup of toxic metabolites occurs. Then you know that it’s a good time to refeed them, and when you refeed them, they will start to look healthier and to proliferate more rapidly. So, you can use the data to really help drive your refeeding paradigms.
Just to add to that: on a couple of the movies I showed you, you may have noticed that you don’t always see exactly the same field of view, and that’s because the plate will have been taken out of the IncuCyte, out of the incubator, for refeeding, and then repositioned in order to continue the analysis.
Mr. Tim O’Callaghan: Okay. So, a question from Richard here. In our live-cell imaging experiments, cells frequently change morphology on drug treatment. The algorithm often cannot be optimized for both untreated and treated morphologies for an accurate representation of confluency. What do you suggest as a solution? So, it might be interesting to, sort of...
Dr. Del Trezise: Yeah.
Mr. Tim O’Callaghan: ...touch on some other labeling solutions.
Dr. Del Trezise: Yeah. Thank you, Richard. Again, it’s a complication that, I guess, many imaging or analytical methods face: the range of different phenotypes that you see, particularly when you start to work with unknown chemicals, where you can see all sorts of different things. As Tim alluded to there, one of the reasons why we introduced these nuclear-targeting RFPs is so that we can do a true cell count. So, even if the cell morphology looks really different to your control cells and presents challenges for the phase contrast confluence algorithm, you should still be able to count the number of viable cells. You can do a true cell count over time.
Those reagents are commercially available. The NucLights can be introduced either using a lentivirus to make stable cell lines, as in the T cell killing example I showed you, or using a baculovirus as a transient transfection method. If you want to look at a range of different cell types, that’s a viable and economical way to proceed. So, I would recommend cell counting, but your comment about the differences in cell morphology and the challenges for the confluence algorithm is very well made and understood. So, thank you.
Mr. Tim O’Callaghan: Okay. I think we've got time for just one more during this session. A question from David. What are your recommendations for trying to validate the cell confluence assay as far as accuracy is concerned? What would you use as a reference standard?
Dr. Del Trezise: Oh, now there’s an interesting question. So, if by that you mean, how can we verify that the change in cell number is accurately reported by the change in confluence, then hopefully my answer to the previous question, to do true cell counting with nuclear labeling, is one approach. We have also validated by correlating our confluence values against other cell counting methods: traditional cell counting methods, or luminescence or MTT assays, for example. We've used a number of methods to verify the confluence algorithm.
But, again, from a higher-level view, my perspective is that so much of biology is about measuring differences between treatment groups. And the nice thing is that you have enough capacity and bandwidth here to run controls and to compare effects, you know, effect A versus effect B and vehicle-treated controls, so that it’s the differences you’re focusing on, and you don’t get too obsessed by the absolute values.
Mr. Tim O’Callaghan: Great. Okay. So, I think, in the interest of time, we need to draw this session to a close. But it’s really great to see the amount of audience engagement here. We continue to get questions in, so we will, of course, be responding to all the questions we receive today, and we invite you to continue this conversation by e-mailing us at the address you see now on your screen, [email protected]. It’ll be a pleasure to continue to talk with you.
So, it is left to me now to thank you all for your attendance today. As you log out of the webinar, you will be provided with a feedback survey, so please do fill this in, as it helps us tailor these events for you. We hope that you found this webinar thought provoking and the content useful, and that you take the opportunity to explore the use of live-cell imaging and analysis for your own research and cell workflows. Please stay tuned to our social media channels, like LinkedIn and Facebook, for information about the next webinar in our series. We will see you next time.