Quantitative Systems Pharmacology


Session Description:  In an influential white paper in October 2011, the NIH Quantitative & Systems Pharmacology Workshop Group wrote, “We require better quantitative models of pharmacological mechanism at all scales, starting with single targets and drugs and scaling to vertically and horizontally integrated multi-scale models.” Seven years later, most major pharmaceutical companies have built systems pharmacology groups, separate from their traditional pharmacometrics groups, that employ mechanistic multi-scale computational modeling to evaluate how drugs perturb intracellular signaling networks, to predict drug pharmacokinetics, and to inform clinical trial design. This session will highlight recent developments in systems pharmacology across academia and industry.

2008 and 2010 NIH Workshops on Quantitative Systems Pharmacology

2014 MSM Theme Discussion on Drug Discovery and Development

Organizers:     Jeff Saucerman, Feilim Mac Gabhann; Colleen Kuemmel (FDA), IMAG moderator

2:00-2:15     Brian Schmidt, Bristol-Myers Squibb - Application of Quantitative Systems Pharmacology in Early Clinical Development for Immuno-Oncology Drug Combinations

2:15-2:30     Jeff Saucerman, University of Virginia - Systems pharmacology model for control of cardiac fibrosis

2:30-2:45    Feilim Mac Gabhann, Johns Hopkins University - QSP for non-drug treatments

2:45-3:00     Denise Kirschner, University of Michigan - A Multi-scale Systems Pharmacology approach to tuberculosis therapy

3:00-3:30     Panel Discussion; panelists: Jennifer Linderman, Denise Kirschner, Brian Schmidt, Jeff Saucerman, Feilim Mac Gabhann

Speaker Bios and Abstracts:

Jeff Saucerman, University of Virginia (@sauce_lab)

Systems Pharmacology Model for Control of Cardiac Fibrosis. R01-HL137755 (PIs: Jeff Saucerman and Jeff Holmes)

The dynamic wound-healing process after myocardial infarction (MI) can contribute to adverse remodeling events such as wall thinning and fibrosis. Fibroblasts regulate extracellular matrix degradation and deposition throughout wound healing. Here, we used a computational model of the cardiac fibroblast signaling network to predict how fibroblasts respond to the multiple dynamic stimuli present post-MI and to identify putative drivers of pro- and anti-fibrotic activity. We then integrated this model with drug-target interaction databases, simulating how these drugs regulate fibroblast phenotypes and matrix turnover after myocardial infarction. Several predictions are consistent with independent studies in the literature. Ultimately, network modeling, and its ongoing integration into multi-scale models, is expected to identify mechanisms by which fibroblasts are regulated post-MI and therapeutic strategies for controlling ECM remodeling.
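For readers unfamiliar with the formalism, below is a minimal sketch of a single logic-based differential equation network node of the kind used in such fibroblast signaling models; the species names, parameter values, and "drug" effect are hypothetical stand-ins, not values from the actual model:

```python
# Minimal sketch of one logic-based differential equation (LDE) node.
# Species names, parameters, and the "drug" effect are hypothetical.
from scipy.integrate import solve_ivp

def hill_act(x, n=1.4, ec50=0.5):
    """Normalized Hill activation: maps input activity in [0,1] to [0,1]."""
    beta = (ec50**n - 1.0) / (2.0 * ec50**n - 1.0)
    kn = beta - 1.0  # K^n, chosen so that f(1) = 1 and f(ec50) = 0.5
    return beta * x**n / (kn + x**n)

def dydt(t, y, tgfb, tau=1.0, ymax=1.0):
    """One node (e.g., collagen mRNA) driven by a TGF-beta input."""
    return [(ymax * hill_act(tgfb) - y[0]) / tau]

# Untreated vs. a hypothetical drug that halves the effective TGF-beta input
for label, tgfb in [("untreated", 0.8), ("drug (input halved)", 0.4)]:
    sol = solve_ivp(dydt, (0.0, 10.0), [0.0], args=(tgfb,))
    print(f"{label}: node activity at t=10 ~ {sol.y[0, -1]:.3f}")
```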

Biography: Dr. Jeff Saucerman is an Associate Professor of Biomedical Engineering at the University of Virginia. He leads a research group in cardiac systems biology, focused on identifying and controlling the molecular networks involved in heart failure. He received a B.S. in Engineering Science from Pennsylvania State University and a Ph.D. in Bioengineering from the University of California San Diego with Dr. Andrew McCulloch, and completed a postdoctoral fellowship with Dr. Don Bers at Loyola University Chicago. Dr. Saucerman has received a number of honors, including an NSF CAREER Award, election as a Fellow of the American Heart Association, a Dean’s Excellence in Teaching Award, and designation as a Pinn Scholar at the University of Virginia School of Medicine.

 

Brian J. Schmidt, Bristol-Myers Squibb

Application of Quantitative Systems Pharmacology in Early Clinical Development for Immuno-Oncology Drug Combinations

The cancer-immunity cycle is a complex, multifactorial system that offers many opportunities for cancer cells to subvert immune control, but it also provides a number of potential targets for therapeutic intervention. Modeling aspects of this complex system holds promise for guiding multiple aspects of drug development, but it also presents clear challenges: developing reliable models that are predictive of clinical endpoints, that have a reasonable mechanistic structure and capture clinical population variability, and that can be developed and applied on time frames suitable to impact drug development. Examples where these challenges have been addressed in the development and application of QSP cancer-immunity models to inform clinical development of combination immuno-oncology agents are presented, and model calibration and expansion in the context of targeting new pathways are discussed.

Biography: Dr. Brian J. Schmidt is a Senior Principal Scientist in Quantitative Clinical Pharmacology at Bristol-Myers Squibb. His team applies QSP to inform multiple immuno-oncology (I-O) clinical development groups within BMS as well as regulatory interactions, and his current research interests include the development of QSP immuno-oncology models, new QSP tools, and new algorithms. Brian completed his PhD training in biomedical engineering and has since performed research in QSP, systems biology, and computational biology. Brian is an active leadership team member of the International Society of Pharmacometrics QSP Special Interest Group.

 

Feilim Mac Gabhann, Johns Hopkins University (@FMacG)

QSP for non-drug treatments.

Building mechanistic models of drug targets, and of those targets' interaction networks and environment, enables us to use these models for more than drugs. Gene therapy, biomaterials, cell transplantation, and more are viable therapeutics that can be simulated and tested with the same models. Incorporating patient data further allows us to simulate the variability observed in clinical trials.

 

Biography: Dr. Feilim Mac Gabhann is an Associate Professor of Biomedical Engineering at Johns Hopkins University and part of the Institute for Computational Medicine. An alum of University College Dublin, Johns Hopkins University, and the University of Virginia, his lab builds mechanistically detailed computational models of various pharmacological approaches, including small molecules, antibodies, gene therapy, and cell-transplant therapy. Feilim serves as Deputy Editor-in-Chief of PLOS Computational Biology.

 

 

Denise Kirschner, University of Michigan (@KirschnerDenise)

A Multi-scale Systems Pharmacology approach to tuberculosis therapy

Designing successful drug regimens to treat disease can be difficult. Positive results from preclinical and in vitro experiments do not necessarily translate to clinical efficacy, resulting in failed clinical trials. Tuberculosis (TB), caused by the pathogen Mycobacterium tuberculosis, is treated with combinations of antibiotics to limit the development of resistance, so designing the best combination and dosing schedule is a complex problem. One pathological characteristic of TB is the formation of lesions called granulomas. These granulomas further complicate regimen design by introducing a physiological environment that harbors subpopulations of bacteria that are phenotypically tolerant to certain antibiotics and also limits antibiotic distribution. Rational design of new antibiotic regimens to treat TB requires an understanding of drug distribution in these granulomas, as well as the bactericidal activity of different antibiotics.

We designed an integrated computational and experimental approach to optimizing drug regimens for TB. We developed a multi-scale computational model to simulate granuloma formation, antibiotic distribution, and antibiotic treatment. We use a cellular- and tissue-scale agent-based model to simulate immune cell and bacterial interactions on a two- or three-dimensional spatial grid, capturing the emergent behavior of granuloma formation. At the molecular scale, blood vessels in the agent-based model deliver antibiotics onto the grid (lung tissue), where the antibiotics undergo diffusion, extracellular binding, and cellular partitioning. A pharmacodynamics model estimates the concentration-dependent killing rate constant and the probability that a given bacterium will be killed during treatment simulation. Calibrating the pharmacokinetics and pharmacodynamics of different antibiotics to experimental data allows us to simulate treatment with a regimen of any combination of antibiotics. However, the ‘regimen design space’ of possible antibiotic combinations is still too large to search exhaustively, even computationally. We apply a surrogate-assisted optimization framework to predict which combinations of antibiotics and dosing schedules produce optimal sterilizing regimens, providing an efficient way to identify them. Ultimately, this computational framework provides a pipeline to predict regimen efficacy on a sample of virtual patients through a virtual clinical trial. Using a computational framework to predict optimal drug regimens can aid the rational design of regimens with higher success rates.
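As a concrete illustration of the pharmacodynamics step described above, the sketch below converts a Hill/Emax concentration-dependent kill rate into a per-bacterium kill probability per time step; all parameter values are illustrative and not calibrated to any real antibiotic:

```python
# Sketch of the pharmacodynamics step: a Hill/Emax model gives a
# concentration-dependent kill rate constant, which is converted to the
# probability that a given bacterium is killed within one time step.
import numpy as np

def kill_rate(c, emax=0.5, ec50=1.0, h=2.0):
    """Concentration-dependent kill rate constant k(C) [1/h], Hill/Emax form."""
    return emax * c**h / (ec50**h + c**h)

def kill_probability(c, dt=1.0):
    """P(bacterium killed within dt hours) for first-order killing at rate k(C)."""
    return 1.0 - np.exp(-kill_rate(c) * dt)

rng = np.random.default_rng(0)
for c in [0.5, 1.0, 4.0]:  # local antibiotic concentrations (illustrative units)
    p = kill_probability(c)
    killed = rng.random(10_000) < p  # one Bernoulli trial per simulated bacterium
    print(f"C={c}: P(kill/step)={p:.3f}, simulated fraction killed={killed.mean():.3f}")
```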

Biography: Over the past 25 years, Dr. Kirschner has focused on questions related to host-pathogen interactions in infectious diseases, with a main focus on persistent infections (e.g., Helicobacter pylori, Mycobacterium tuberculosis, and HIV-1). Dr. Kirschner is a Professor of Microbiology and Immunology at the University of Michigan. She currently serves, and has for the past decade and a half, as co-Editor-in-Chief of the Journal of Theoretical Biology, the oldest and a leading theoretical biology journal. She has served as both a member and chair of many study section review panels at the National Institutes of Health, giving her deep experience in the field and broad expertise in many areas of computational and mathematical biology. She is the founding co-director of the Center for Systems Biology at the University of Michigan, an interdisciplinary center aimed at facilitating research and training between wet-lab and theoretical scientists. She currently serves as President of the Society for Mathematical Biology.

 

Interactive Discussion (please put your name before your comments):

Jacob Barhak

Brian and Jeff discussed validation. Two questions: 1) How do you obtain validation data? Jeff mentioned the literature; do you copy validation parameters by hand? 2) Are there more or less equivalent models to the ones you develop? If so, how do they compare in your model validation?


Jeff Saucerman

Jacob,

Essentially, yes: we are automating our validation test suites so that we can evaluate the robustness of model validation to variation in model structure and parameters. Here is the poster we'll be presenting today on this topic; I'd love to chat further.

https://msmmeeting.nibib.nih.gov/sites/default/files/MSM%20Saucerman%20_%202019%20IMAG%20MSM%20Meeting%20Poster.pdf

Jeff
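For illustration, a minimal sketch of what one such automated qualitative validation test could look like; the model interface, parameter sets, and literature expectation below are hypothetical, not taken from the poster:

```python
# Hypothetical sketch of an automated qualitative validation test: each test
# encodes one literature observation (here, that a TGF-beta stimulus increases
# collagen output) and is run against every candidate model variant.
import pytest

def simulate_collagen(model_params, tgfb):
    """Stand-in for the real network simulation; returns steady-state collagen."""
    return model_params["basal"] + model_params["tgfb_gain"] * tgfb

CANDIDATE_MODELS = [
    {"basal": 0.1, "tgfb_gain": 0.8},  # nominal parameter set
    {"basal": 0.2, "tgfb_gain": 0.5},  # perturbed variant
]

@pytest.mark.parametrize("params", CANDIDATE_MODELS)
def test_tgfb_increases_collagen(params):
    # Literature-derived qualitative expectation: TGF-beta up => collagen up
    control = simulate_collagen(params, tgfb=0.0)
    stimulated = simulate_collagen(params, tgfb=1.0)
    assert stimulated > control
```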


Darren Tyson

Model testing instead of validation.


Darren Tyson

Are there any ways of assessing model consistency or similarity among different modeling approaches? For example, how would we define the similarities between a Boolean model and a mass-action kinetics model when they describe the same system?


Jeff Saucerman

We have used sensitivity analysis to compare mass action kinetic models vs. logic-based differential equation models of the same network structure. https://www.ncbi.nlm.nih.gov/pubmed/21087478

I agree, it would be great to see more comprehensive comparison between models of differing formalisms.

 

Eric Sobie previously did great work using sensitivity analysis to compare the functional relationships among various models of cardiomyocyte electrophysiology.

https://www.ncbi.nlm.nih.gov/pubmed/19217846
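To make the comparison idea concrete, one minimal approach is to compute normalized sensitivities of a shared output in each formalism and compare them across input levels; the toy models below, a mass-action and a logic-based representation of the same single A -> B link, are illustrative only:

```python
# Toy sketch: compare two formalisms describing the same A -> B link by
# computing the normalized sensitivity d(ln B)/d(ln A) in each formalism.
# Models and parameter values are illustrative.
import numpy as np

def b_mass_action(a, k_f=1.0, k_r=0.5):
    """Mass-action steady state of dB/dt = k_f*A*(1-B) - k_r*B."""
    return k_f * a / (k_f * a + k_r)

def b_logic(a, n=1.4, ec50=0.5):
    """Logic-based (normalized Hill) transfer of A onto B."""
    beta = (ec50**n - 1.0) / (2.0 * ec50**n - 1.0)
    return beta * a**n / ((beta - 1.0) + a**n)

inputs = np.linspace(0.05, 0.95, 10)
eps = 1e-6
for name, f in [("mass action", b_mass_action), ("logic-based", b_logic)]:
    # finite-difference estimate of the log-log sensitivity at each input level
    sens = [(np.log(f(a + eps)) - np.log(f(a))) / (np.log(a + eps) - np.log(a))
            for a in inputs]
    print(f"{name:12s}", np.round(sens, 2))
```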


Justin Melunis

Dr. Saucerman, you talked specifically about how you built your network model from the literature and that your model is strictly mechanistic rather than, say, a data-science approach. Because of this, finding the relationships between two nodes in your network seems non-trivial, so you work with non-absolute values and use drug-response results to refine these relationships. Have you looked into the concept of visible neural networks for training these relationships? By that term, I mean a neural network whose structure is derived from the biological relevance of each node. The paper I am thinking of, which might have some relevance to your work, is on the development of a program called DCell, published in March of last year: https://www.nature.com/articles/nmeth.4627. I understand that you may not have a dataset large enough to fully train one of these networks from scratch, but perhaps you could take your currently mechanistically trained model and use backpropagation, much as in typical neural network training, to optimize it.


Justin Melunis

I apologize; I meant my comment as a post-panel suggestion directed at Dr. Saucerman. My understanding was that you have inputs to your model and outputs from your model, but no way to test the intermediate relationships in between, so the real question becomes how to optimize the parameters linking the nodes of your model to best capture the input-output relationship. I believe backpropagation would be a very useful technique in this case. It has been shown to work with great results, such as in the paper I listed.
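To make the suggestion concrete, here is a minimal sketch of keeping literature-derived wiring fixed (input -> intermediate -> output) while tuning only the edge weights by gradient descent / backpropagation; the pathway, synthetic data, and learning rate are all hypothetical:

```python
# Minimal sketch: fixed network wiring, edge weights fit by backpropagation
# against input-output data. All values are hypothetical.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic "dose-response" data the model should reproduce
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_obs = np.array([0.05, 0.15, 0.45, 0.75, 0.9])

w1, w2, b1, b2 = 0.5, 0.5, 0.0, 0.0  # edge weights and biases to learn
lr = 1.0
for step in range(2000):
    h = sigmoid(w1 * x + b1)  # intermediate node activity
    y = sigmoid(w2 * h + b2)  # output node activity
    err = y - y_obs
    # Backpropagate the mean squared error through both edges
    dy = err * y * (1 - y)
    dw2, db2 = np.mean(dy * h), np.mean(dy)
    dh = dy * w2 * h * (1 - h)
    dw1, db1 = np.mean(dh * x), np.mean(dh)
    w1, w2, b1, b2 = w1 - lr * dw1, w2 - lr * dw2, b1 - lr * db1, b2 - lr * db2

print(f"fitted weights: w1={w1:.2f}, w2={w2:.2f}; final MSE={np.mean(err**2):.4f}")
```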


Jeff Saucerman

We validate and revise models against a fair amount of data from intermediate components of these networks and experiments with internal perturbations. But I agree that there is substantial potential in doing this more systematically with such inference approaches.

Gary An

Comment in support of Dr. Kirschner's and Dr. Jacobosen's comments: in addition to intrinsic biological differences between model organisms and human populations, there is the issue of patient heterogeneity in clinical trials. Our contention is that, in order to get closer to the clinical population, it is critical to operate over a wide parameter space of clinical/biological plausibility when you intend to examine the efficacy (née robustness) of an intervention across a heterogeneous population. This implies an exponential increase in the number of simulations that need to be run, and ML/DL approaches are an important tool for making this tractable; this is the basis of our current work looking at the possible space of treatment/control of sepsis.
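As a small illustration of that tractability point, the sketch below trains a cheap surrogate on a limited budget of full simulator runs and then screens a much larger candidate parameter space with the surrogate alone; the stand-in "simulator" and the model choice are hypothetical:

```python
# Sketch of the surrogate idea: run the expensive simulator on a small sample
# of parameter sets, fit a cheap ML surrogate, then screen a large candidate
# space with the surrogate. The "simulator" here is a stand-in function.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def expensive_simulator(theta):
    """Stand-in for a full simulation; returns a treatment-efficacy score."""
    dose, interval = theta
    return np.exp(-((dose - 0.7)**2 + (interval - 0.3)**2) / 0.05)

# Budget: 200 full simulations over a 2-D (dose, interval) parameter space
train_x = rng.random((200, 2))
train_y = np.array([expensive_simulator(t) for t in train_x])

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(train_x, train_y)

# Screen 100,000 candidate parameter sets with the surrogate (cheap)
candidates = rng.random((100_000, 2))
scores = surrogate.predict(candidates)
best = candidates[np.argmax(scores)]
print(f"surrogate-suggested (dose, interval): {np.round(best, 2)}")
```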


Raj Vadigepalli, Thomas Jefferson University, Philadelphia

It is quite a logical twist to go from “when can we simulate a multi-organ QSP model of a human?” to “it is ridiculous that we cannot learn something useful until we build a virtual human”. In complex systems with many unexplored emergent properties, we cannot predict ahead of time the insights we would learn from putting a larger system together from component parts. To stay away from complexity because “models have to be simpler abstractions” seems like a missed opportunity.

Much is being learned from Whole Cell Models… that was not possible to query in models of individual pathways and networks... Similarly from Whole Brain Simulations... and so on.

There is a case to be made for a Virtual Physiological Human and we should recognize it for what it is, rather than be dismissive.

 


Jeff Saucerman

Definitely agree that such models can be very insightful. One obvious but highly impactful historical example would be Art Guyton's multi-organ models of the circulation, which predicted disease and treatment outcomes.

Annu Rev Physiol. 1972;34:13-46. https://www.ncbi.nlm.nih.gov/pubmed/4334846
