Abstract: After a brief overview of the many uses of finite mixture modeling, applications of new mixture modeling developments are discussed. One major development goes beyond the conventional mixture of normal distributions to allow mixtures with flexible non-normal distributions. This has interesting applications to cluster analysis, factor analysis, SEM, and growth modeling. The talk focuses on applications of Growth Mixture Modeling for continuous outcomes that are skewed. Examples are drawn from national longitudinal surveys of BMI as well as twin studies. Extensions of this modeling to the joint study of survival and non-ignorable dropout are also discussed.
Bengt Muthén obtained his Ph.D. in Statistics at the University of Uppsala, Sweden and is Professor Emeritus at UCLA. He was the 1988-89 President of the Psychometric Society and the 2011 recipient of the Psychometric Society’s Lifetime Achievement Award. He has published extensively on latent variable modeling and is one of the developers of the Mplus computer program, which implements many of his statistical procedures.
Dr. Muthén’s research interests focus on the development of applied statistical methodology in areas of education and public health. Education applications concern achievement development while public health applications involve developmental studies in epidemiology and psychology. Methodological areas include latent variable modeling, analysis of individual differences in longitudinal data, preventive intervention studies, analysis of categorical data, multilevel modeling, and the development of statistical software (namely Mplus!).
For more information about Bengt Muthén, check out his website: www.statmodel.com/bmuthen/
Andrew Gelman is a professor of statistics and political science and director of the Applied Statistics Center at Columbia University. He has received the Outstanding Statistical Application award from the American Statistical Association, the award for best article published in the American Political Science Review, and the Council of Presidents of Statistical Societies award for outstanding contributions by a person under the age of 40. His books include Bayesian Data Analysis (with John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Don Rubin), Teaching Statistics: A Bag of Tricks (with Deb Nolan), Data Analysis Using Regression and Multilevel/Hierarchical Models (with Jennifer Hill), Red State, Blue State, Rich State, Poor State: Why Americans Vote the Way They Do (with David Park, Boris Shor, and Jeronimo Cortina), and A Quantitative Tour of the Social Sciences (co-edited with Jeronimo Cortina).
Andrew has done research on a wide range of topics, including: why it is rational to vote; why campaign polls are so variable when elections are so predictable; why redistricting is good for democracy; reversals of death sentences; police stops in New York City; the statistical challenges of estimating small effects; the probability that your vote will be decisive; seats and votes in Congress; social network structure; arsenic in Bangladesh; radon in your basement; toxicology; medical imaging; and methods in surveys, experimental design, statistical inference, computation, and graphics.
Jeffrey R. Harring
Dr. Harring is Associate Professor of Measurement, Statistics, and Evaluation (EDMS) in the Department of Human Development and Quantitative Methodology at the University of Maryland. Prior to joining the EDMS faculty in the fall of 2006, Dr. Harring received an M.S. degree in Statistics in 2004 and completed his Ph.D. in the Quantitative Methods Program within Educational Psychology in 2005, both degrees coming from the University of Minnesota. Before that, Dr. Harring taught high school mathematics for 12 years.
Dr. Harring teaches a variety of graduate-level quantitative methods courses, including: General Linear Models I & II, Statistical Analysis of Longitudinal Data, Statistical Computing and Monte Carlo Simulation, Multivariate Data Analysis, and Finite Mixture Models in Measurement and Statistics.
Dr. Harring’s research interests focus on applications of (i) statistical models for repeated measures data, (ii) linear and nonlinear structural equation models, (iii) multilevel models and (iv) statistical computing.
The principal focus of Dr. Robins’ research has been the development of analytic methods appropriate for drawing causal inferences from complex observational and randomized studies with time-varying exposures or treatments. The new methods are to a large extent based on the estimation of the parameters of a new class of causal models – the structural nested models – using a new class of estimators – the G estimators. The usual approach to estimating the effect of a time-varying treatment or exposure on time to disease is to model the hazard of failure at time t as a function of past treatment history using a time-dependent Cox proportional hazards model. Dr. Robins has shown that the usual approach may be biased, whether or not one further adjusts for past confounder history in the analysis, when:
(A1) there exists a time-dependent risk factor for or predictor of the event of interest that also predicts subsequent treatment, and (A2) past treatment history predicts subsequent risk factor level.
Conditions (A1) and (A2) will be true whenever there are time-dependent covariates that are simultaneously confounders and intermediate variables.
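The bias under conditions (A1) and (A2) can be illustrated with a small simulation. The following is a hypothetical sketch, not an example from Dr. Robins' own work, and it uses a continuous outcome rather than a survival time for simplicity: a time-dependent covariate is affected by past treatment (A2), predicts subsequent treatment (A1), and shares an unmeasured cause with the outcome. Neither treatment has any true effect on the outcome, yet standard regression gives nonzero effect estimates both with and without adjustment for the covariate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Unmeasured common cause of the risk factor L1 and the outcome Y.
U = rng.normal(size=n)

# Baseline treatment A0, randomized; its true effect on Y is zero.
A0 = rng.integers(0, 2, size=n).astype(float)

# Time-dependent risk factor: affected by past treatment (condition A2)
# and by the unmeasured cause U.
L1 = A0 + U + rng.normal(size=n)

# Subsequent treatment depends on the risk factor (condition A1).
A1 = (rng.random(size=n) < 1.0 / (1.0 + np.exp(-L1))).astype(float)

# Outcome depends only on U: the true effect of A0 and A1 is zero.
Y = U + rng.normal(size=n)

def ols(columns, y):
    """Least-squares coefficients, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Unadjusted analysis: A1's coefficient absorbs the L1 <- U -> Y path.
b_unadj = ols([A0, A1], Y)

# Adjusting for L1: conditioning on a variable that is both a
# confounder and an intermediate biases the A0 coefficient instead.
b_adj = ols([A0, A1, L1], Y)

print("unadjusted (A0, A1):", b_unadj[1:].round(2))
print("adjusted   (A0, A1):", b_adj[1:3].round(2))
```

Both analyses report substantial treatment effects where none exist, which is the situation structural nested models and G-estimation were designed to handle.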
In contrast to previously proposed methods, Dr. Robins’ methods can:
- be used to estimate the effect of a treatment (e.g., prophylaxis for PCP) or exposure on a disease outcome in the presence of time-varying covariates (e.g., number of episodes of PCP) that are simultaneously confounders and intermediate variables on the causal pathway from exposure to disease;
- allow an analyst to adjust appropriately for the effects of concurrent non-randomized treatments or non-random non-compliance in a randomized clinical trial. For example, in the AIDS Clinical Trial Group (ACTG) trial 002 of the effects of high-dose versus low-dose AZT on the survival of AIDS patients, patients in the low-dose arm had improved survival, but they also took more aerosolized pentamidine (a non-randomized concurrent treatment);
- allow an analyst to adequately incorporate information on surrogate markers (e.g., CD4 count) in order to stop, at the earliest possible moment, randomized trials of the effect of the treatment (e.g., AZT) on survival.
Dr. Robins teaches at the Harvard School of Public Health.
Edward Vytlacil received his PhD in Economics from the University of Chicago in 2000. He is currently a Professor of Economics at New York University, having previously been a faculty member at Stanford University, Columbia University, and Yale University. He is a Co-Editor of the Journal of Applied Econometrics, and an Associate Editor for Econometrica and the Journal of Econometrics. Vytlacil’s work has focused on micro-econometric methodology for treatment effect and policy evaluation using disaggregate data. A theme in his work has been allowing the effects of a treatment to vary across people, and allowing individuals to have some knowledge of their own idiosyncratic treatment effect and to act upon that knowledge. In addition to his work in econometric methodology, he has published empirical work in labor economics and health economics evaluating the returns to schooling, the returns to job training programs, and the effectiveness of medical interventions.
Dr. Vytlacil will be speaking about Accounting for Individual Heterogeneity in Treatment Effect Analysis.