Theme 5 - New Methodologies for Multiscale Modeling

Day 3: Friday March 24, 2017

Moderators:

Pedja Neskovic (ONR)

 

Virginia Pasour (ARL)

 

Ramana Madupu (DOE)

 

Beth Lewandowski (NASA)

 

New Methodologies for Multiscale Modeling

Agenda for the Session: 

8:30 - 8:50am - George Karniadakis (Brown) – Multi-fidelity Modeling and Propagation of Uncertainty in Multi-scale Biological Systems 

 

8:50 - 9:10am - Larry Carin (Duke) - Deep Poisson Factor Analysis

 

9:10 - 9:30am - Elchanan Mossel (MIT) - On Models and Theory

 

9:30 - 10:00am - DISCUSSION

Comments/Questions (please identify yourself):

 

10:00 - 10:30am BREAK

10:30 - 10:50am – David Dunson (Duke) – Bayesian Approaches for Multi-scale Modeling

 

10:50 - 11:10am – Le Song (GTech) – Machine Learning Tools for Networked Biological Processes

 

11:10 - 11:30am – Elizabeth Ogburn (JHU) – Inferring Causal Relationships from Observational Data

 

11:30 - 12:00pm - DISCUSSION

Comments/Questions (please identify yourself):

 

The goal of this special session is to bring together scientists who have recently developed powerful tools for the analysis of big data and to introduce those tools to the MSM community.  The focus of the talks will not be on the results of specific projects, but on the potential benefits and limitations of new mathematical and computational methods for use in modeling.

The specific charge to the speakers is to:

  • present new methodologies that can be used by the MSM community (and avoid simply sharing their own research successes!);
  • describe the limitations and appropriate use contexts of the methods presented;
  • discuss sociocultural issues in using these methods in the MSM community, and how IMAG can facilitate their use in the MSM Consortium.



Comments

Curtis Larsen:

To Lawrence Carin: How did you separate the training data from the evaluation data for your model?

 

Submitted by Anonymous (not verified) on Fri, 03/24/2017 - 09:32

Lester Chiu:

To Dr. Carin:

  1. A naive question: is the context of the words important in your case (beyond word counting)? Context might change the meaning or importance of a term. Could other deep learning methods, for example word embeddings, recurrent neural networks, or convolutional neural networks, be applied to your question?

  2. When increasing the depth of your model, are you still able to interpret it, for example, to identify which features (factors) contribute most?

  3. Following up on Curtis Larsen's question, how do you define your background (negative) data? The way the negative data are chosen, and the size of the negative set, might affect prediction performance (your use of AUROC implies that the positive and negative sets are of similar size); see the sketch below.
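
A minimal, purely illustrative sketch of the issue raised in item 3 (not part of the speaker's method): it fits a toy scikit-learn classifier and scores it against background (negative) sets of different sizes, so one can check empirically how the positive/negative ratio affects AUROC versus a threshold-based metric such as accuracy. The data generator, class sizes, and all numbers below are hypothetical assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, accuracy_score

    rng = np.random.default_rng(0)

    def make_data(n_pos, n_neg):
        # Toy 1-D feature: positives centered at +1, background negatives at 0
        x = np.concatenate([rng.normal(1.0, 1.0, n_pos),
                            rng.normal(0.0, 1.0, n_neg)]).reshape(-1, 1)
        y = np.concatenate([np.ones(n_pos), np.zeros(n_neg)])
        return x, y

    # Train on a balanced set
    x_train, y_train = make_data(500, 500)
    clf = LogisticRegression().fit(x_train, y_train)

    # Evaluate against a balanced and a much larger background set
    for n_neg in (500, 5000):
        x_test, y_test = make_data(500, n_neg)
        scores = clf.predict_proba(x_test)[:, 1]
        print(f"negatives={n_neg:5d}  "
              f"AUROC={roc_auc_score(y_test, scores):.3f}  "
              f"accuracy={accuracy_score(y_test, (scores > 0.5).astype(int)):.3f}")

Running the comparison makes explicit how much each metric does, or does not, depend on the size of the background set.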

Submitted by Anonymous (not verified) on Fri, 03/24/2017 - 09:52
