Colloquium Fall 2020

The Quantitative Psychology colloquium series meets weekly in the Autumn and Spring semesters. The primary activity is the presentation of ongoing research by students and faculty of the quantitative psychology program. Guest speakers from allied disciplines (e.g., Education, Statistics, and Linguistics), both within and external to The Ohio State University, frequently present on contemporary quantitative methodologies. Colloquium meetings also often include discussions of quantitative issues in research and of recently published articles.

Faculty coordinator: Dr. Jolynn Pek
Venue: Online or Jennings 001
Time: 12:30-1:30 p.m.


August 31, 2020

Organizational Meeting

 

September 07, 2020

Labor Day

 

September 14, 2020

Title: Professional Development: Strategic Planning
Abstract: This session will focus on taking time out of our busy schedules to identify personal and professional goals for the semester, create a strategic plan to accomplish them, and determine the types of community, support, and accountability required to make this our most productive and balanced semester.
Discussant: Dr. Jolynn Pek

 

September 21, 2020

Title: Professional Development: Weekly Planning
Abstract: This session will focus on devising a weekly plan to align day-to-day and week-to-week work with our strategic plan.
Discussant: Dr. Jolynn Pek

 

September 28, 2020

Speaker: Dr. Bob Gore
Department of Psychology, The Ohio State University
Title: Perceptions of Safety among LGBTQ Students in a Large, Urban School District
Abstract:
The present study examines perceived safety among LGBTQ students in the Los Angeles Unified School District. Perceived safety is viewed as a protective factor against mental health consequences of peer victimization. In collaboration with researchers at LAUSD, we examined the influence of general and minority stress experiences, both distal and proximal, drawing on Meyer’s Minority Stress Theory as a heuristic framework. We found that LGBTQ students reported higher rates of peer victimization than their non-LGBTQ peers (supporting the notion that a significant portion of their experience of social stress is targeted based on their sexual minority status). We then sought to determine whether school supports plausibly influence perceived safety, whether that influence is mediated by an experience of connection to school, and whether peer victimization interrupts this process as a moderator. In a complex multilevel data structure, results obtained from the MLMED macro (Rockwood and Hayes) suggest mediation but not moderation of that mediation. The normal developmental process seems to involve drawing on supports to feel connected to school, which is associated with perceptions of safety. Results as to moderation of the mediation (by minority and general stress experiences) were mixed. Teacher behaviors associated with greater experience of safety were identified in stepwise regression.
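For readers less familiar with the mediation logic described above, the sketch below illustrates the basic idea of an indirect effect in Python: a predictor (school supports) influences an outcome (perceived safety) through a mediator (connection to school). This is a minimal single-level illustration with simulated data and hypothetical variable names; the study itself analyzed a multilevel data structure with the MLMED macro (Rockwood & Hayes), which this sketch does not reproduce.

```python
# Minimal single-level mediation sketch (illustrative only; the study used the
# multilevel MLMED macro by Rockwood & Hayes). Variable names are hypothetical:
# supports -> connection (mediator) -> safety (outcome).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
supports = rng.normal(size=n)
connection = 0.5 * supports + rng.normal(size=n)                  # a path
safety = 0.4 * connection + 0.1 * supports + rng.normal(size=n)   # b and c' paths

# a path: mediator regressed on the predictor
a_fit = sm.OLS(connection, sm.add_constant(supports)).fit()
# b and c' paths: outcome regressed on mediator and predictor
X = sm.add_constant(np.column_stack([connection, supports]))
b_fit = sm.OLS(safety, X).fit()

a = a_fit.params[1]          # effect of supports on connection
b = b_fit.params[1]          # effect of connection on safety, holding supports constant
print("indirect effect (a*b):", a * b)
print("direct effect (c'):   ", b_fit.params[2])
```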

 

October 5, 2020

Title: Professional Development: Developing a Daily Writing Practice
Abstract:
This session will focus on debunking myths about writing productivity and will highlight the practice of setting aside 30 minutes of daily writing, which can increase writing productivity and decrease stress, anxiety, and guilt.
Discussant: Dr. Jolynn Pek

 

October 12, 2020

Title: Paper discussion of Jones & Thissen (2006), "A History and Overview of Psychometrics," Handbook of Statistics, 26, 1-27.
Discussant: Dr. Jolynn Pek

 

October 19, 2020

Speaker: Dr. David Thissen
Department of Psychology & Neuroscience, University of North Carolina at Chapel Hill
*joint event with University of Maryland, College Park Measurement, Statistics and Evaluation Program

Title: Data visualization and statistical graphics
Abstract: Statistical graphics can be used for data analysis, for displaying results, and for teaching statistical concepts. Over the last fifty years, advances in computing and software development have moved the creation of statistical graphics from the exclusive purview of the trained draftsman or graphic artist to a tool accessible to everyone. In this presentation we discuss examples of a variety of statistical graphics that are intended to inspire creative and effective use.

Dr. David Thissen is at the Department of Psychology & Neuroscience, and the L.L. Thurstone Psychometric Laboratory, at the University of North Carolina at Chapel Hill. He teaches in the graduate program in quantitative psychology and in the undergraduate program in psychology, mostly on topics in psychological testing and measurement and item response theory.

In addition, in collaboration with graduate students at UNC and colleagues elsewhere, he is involved in research on contemporary developments in the theory of educational and psychological testing. Support from the North Carolina Department of Public Instruction and the National Institutes of Health facilitates this work.

 

October 26, 2020

Title: Alumni Panel Discussion: Careers in Industry for Quantitative Methodologists
*Event organized by the Quantitative Psychology Group
Abstract: Quantitative psychology is a broad field encompassing mathematical modeling, research design and methodology, and statistical analysis of psychological data. The field traces its roots to the study of human (mental) abilities and psychological measurement, which gave rise to psychometric models such as factor analysis and item response theory. The interdisciplinary nature of quantitative psychology is evident in its broad impact on social science research, including the disciplines of education, public health, and data analytics. In this panel discussion, experts in these fields, all of whom graduated from The Ohio State University, will describe their experiences working in industry.

Panel Speakers

Dr. Carrie Houts
Director, Psychometrics, Vector Psychometric Group, LLC
Graduated from OSU in 2011

Dr. Aleks Sinayev
Quantitative User Experience Researcher, Google
Graduated from OSU in 2016

Dr. Joonsuk Park
Data Scientist, Duetto Research
Graduated from OSU in 2019

Dr. Jack DiTrapani
International Graduate Trainee in Predictive Analytics, Munich Reinsurance
Graduated from OSU in 2019

 

November 2, 2020

Speaker: Dr. Samantha Anderson
Department of Psychology, Arizona State University
*joint event with University of Maryland, College Park Measurement, Statistics and Evaluation Program

Title: Multiple Testing in ANOVA and Regression: Issues and Solutions for Fostering a More Reproducible Science
Abstract: Multiple testing, or conducting multiple significance tests on the same set of variables, arises quite frequently in psychological studies and can have unintended consequences when left uncorrected. In the first portion of this presentation, I take a Bayesian perspective and consider the probability favoring the null hypothesis implied by certain statistically significant p-values. I show that multiple testing, in the form of using multiple parallel dependent variables, drastically increases the discrepancy between the p-value and the probability that the null hypothesis is true and reduces the evidence in favor of the alternative hypothesis, despite conventional significance. In the second portion of this presentation, I examine a type of multiple testing that is rarely considered or adjusted for: tests of multiple predictors in multiple regression analysis. I describe preliminary simulations that assess a variety of adjustment procedures developed for the ANOVA case but applicable to regression, and I provide initial recommendations on procedures that balance Type I error rates with statistical power. Finally, I describe a few insights that link this work with the replication crisis in psychology.
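As a small illustration of the adjustment idea discussed in the abstract, the sketch below applies Holm's step-down procedure to the p-values of several regression predictors using statsmodels. It is only one example of a multiplicity adjustment, not the specific procedures evaluated in the talk, and the data and variable names are hypothetical.

```python
# Sketch: adjusting the p-values of several regression predictors for multiple
# testing, using Holm's step-down procedure as one example. Illustrative only;
# not the simulation study or recommendations described in the talk.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n, k = 200, 5
X = rng.normal(size=(n, k))                  # five predictors, only one has a real effect
y = 0.3 * X[:, 0] + rng.normal(size=n)

fit = sm.OLS(y, sm.add_constant(X)).fit()
raw_p = fit.pvalues[1:]                      # skip the intercept

reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="holm")
for j, (p, p_adj, r) in enumerate(zip(raw_p, adj_p, reject), start=1):
    print(f"x{j}: raw p = {p:.4f}, Holm-adjusted p = {p_adj:.4f}, reject = {r}")
```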

Dr. Samantha Anderson is an Assistant Professor in the Department of Psychology at Arizona State University. She is interested in research design and quantitative methodology. Her work focuses on improvements to sample size planning and power analysis, with an emphasis on user-friendly approaches and software. She also has research interests in missing data handling and longitudinal analysis.

 

November 9, 2020

Speaker: Inhan Kang
Department of Psychology, The Ohio State University

Title: Modeling the Psychological Process underlying Responses and Response Times of Psychometric Measurement Data
Abstract: There has been increasing interest in measuring and modeling responses and response times (RTs) in psychometrics. Many previous approaches extend latent variable models, such as factor analysis and item response theory models, to use RTs to predict response proportions, to jointly model responses and RTs, or to account for local dependency between them. However, these approaches do not shed light on how psychological constructs produce both measures. In this presentation, a process-based modeling approach is proposed in which different theories of the psychological response process are implemented as mathematical models and compared. The modeling is based on the evidence accumulation account of decision-making from mathematical psychology, and three models with different representations of how evidence drives responses are examined. Parameter recovery and empirical model fitting results will be provided.
Discussant: Yiyang Chen
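To make the evidence-accumulation idea concrete, the sketch below simulates a simple two-boundary, diffusion-style process in Python that generates a choice and a response time on each trial. The drift, boundary, and step-size values are arbitrary illustrations and this is not one of the three models compared in the talk.

```python
# Minimal sketch of a two-boundary evidence-accumulation (diffusion-style)
# process that jointly produces a choice and a response time. Parameter values
# and the Euler discretization are illustrative only.
import numpy as np

def simulate_trial(drift=0.8, boundary=1.0, noise=1.0, dt=0.001, rng=None):
    rng = rng or np.random.default_rng()
    evidence, t = 0.0, 0.0
    while abs(evidence) < boundary:           # accumulate until a boundary is hit
        evidence += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return int(evidence > 0), t               # (response, response time in seconds)

rng = np.random.default_rng(2)
trials = [simulate_trial(rng=rng) for _ in range(1000)]
responses, rts = np.array(trials).T
print("P(upper boundary):", responses.mean(), " mean RT:", round(rts.mean(), 3))
```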

 

November 16, 2020

Speaker: Dr. Zhiyong (Johnny) Zhang
Department of Psychology, University of Notre Dame
*joint event with University of Maryland, College Park Measurement, Statistics and Evaluation Program

Title: Quantitative Psychology at the Age of Data Science
Abstract: Data science is quickly becoming a buzzword in many fields, including quantitative psychology and psychology in general. I will argue that the development of data science provides unique opportunities for quantitative researchers. Through several recent studies, I will show how a quantitative psychologist can contribute to and benefit from the development of data science. Examples will be drawn from network analysis, text mining, and big data analysis.

Dr. Zhiyong Zhang is interested in developing and applying statistical methods in the areas of developmental and health research. His methodological research interests include (1) continuous and categorical dynamic factor models, nonlinear time series models, and dynamical systems analysis, (2) linear and nonlinear models for analyzing longitudinal data, and (3) Bayesian methods and statistical computing. His substantive interests are in the analysis of intraindividual change and interindividual differences in change of life span development, cognitive aging, and emotion.

 

November 23, 2020

Speaker: Yiyang Chen
Department of Psychology, The Ohio State University

Title: Bayesian hierarchical modeling of the memory updating task
Abstract: The memory updating task is a common paradigm in working memory studies, mainly targeted at measuring memory capacity and efficiency. I investigate the possibility of modeling the task with a Bayesian hierarchical structure, which may be able to incorporate existing working memory theories with reaction time data. The model can potentially reveal mechanisms of working memory at both the group and individual levels.

Discussant: Selena Wang
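The sketch below shows what a Bayesian hierarchical structure for reaction-time data can look like, assuming the PyMC library: log-RTs vary around subject-level means that are themselves drawn from a group-level distribution. The model form, priors, and simulated data are hypothetical and are not the memory-updating model described in the talk.

```python
# Minimal sketch of a Bayesian hierarchical reaction-time model (assumes PyMC).
# Structure and priors are illustrative only, not the talk's memory-updating model.
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)
n_subj, n_trials = 20, 50
subj_idx = np.repeat(np.arange(n_subj), n_trials)
true_mu = rng.normal(-0.5, 0.3, size=n_subj)            # subject-level log-mean RTs
log_rt = rng.normal(true_mu[subj_idx], 0.4)              # simulated log reaction times

with pm.Model() as model:
    mu_group = pm.Normal("mu_group", 0.0, 1.0)            # group-level mean
    sd_group = pm.HalfNormal("sd_group", 1.0)             # between-subject spread
    mu_subj = pm.Normal("mu_subj", mu_group, sd_group, shape=n_subj)
    sigma = pm.HalfNormal("sigma", 1.0)                   # within-subject noise
    pm.Normal("obs", mu_subj[subj_idx], sigma, observed=log_rt)
    idata = pm.sample(1000, tune=1000, chains=2)

print(float(idata.posterior["mu_group"].mean()))          # group-level estimate
```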

Speaker: Diana Zhu
Department of Psychology, The Ohio State University

Title: Measurement Invariance Relationships between Multilevel Factor Models and Multigroup Factor Models
Abstract: For multivariate data with different groups of individuals, two factor model approaches are available: multigroup factor models and multilevel factor models. The former was originally developed for a small number of groups, while the latter requires a large number of groups. Except for a few articles in the literature (Jak, 2019; Jak & Jorgensen, 2017; Jak, Oort, & Dolan, 2013, 2014), not much attention has been given to the relationship between the two approaches.

In this presentation, I will (1) review multigroup and multilevel factor models; (2) map cross-level invariance (or the lack thereof) onto multigroup scalar invariance (or the lack thereof) and vice versa; and (3) briefly discuss a set of demonstration studies, practical consequences, and future research directions.

Discussant: Jacob Coutts
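To make the scalar-invariance idea concrete, the sketch below generates one-factor data for several groups that share the same loadings and intercepts, with groups differing only in their factor means. All values and group labels are hypothetical; this is only the data-generating side of invariance, not the model-fitting comparison discussed in the talk.

```python
# Sketch of the data-generating side of scalar invariance: all groups share the
# same loadings and intercepts; only the factor means differ. Values are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
loadings = np.array([0.8, 0.7, 0.6, 0.9])     # shared across groups
intercepts = np.array([1.0, 2.0, 1.5, 0.5])   # shared across groups
factor_means = {"group_A": 0.0, "group_B": 0.5, "group_C": -0.3}

data = {}
for group, fmean in factor_means.items():
    n = 300
    eta = rng.normal(fmean, 1.0, size=n)                       # factor scores
    eps = rng.normal(0.0, 0.5, size=(n, loadings.size))        # unique errors
    data[group] = intercepts + np.outer(eta, loadings) + eps   # observed items

for group, y in data.items():
    print(group, "observed item means:", y.mean(axis=0).round(2))
```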


November 30, 2020

Speaker: Dr. Steffi Pohl
Department of Education and Psychology, Freie Universität Berlin
*joint event with University of Maryland, College Park Measurement, Statistics and Evaluation Program

Title: Modeling and Reporting Assessment Results: Disentangling Different Aspects of Test Performance using Log Data

Abstract: Data from low-stakes assessments contain different kinds of test-taking behavior: test takers omit items, do not reach the end of the test because of time limits or because they quit, randomly guess on items, or apply different solution strategies. I show how current practice in estimating competence scores threatens the validity and comparability of test scores. I propose to deal with this issue by disentangling the different aspects that impact performance and by reporting the results of competence assessments in the form of a profile of these different aspects. I present models that make use of log data from computerized testing to identify and investigate test-taking behavior and to disentangle the different aspects that impact performance. Disentangling these aspects allows for a deeper understanding of performance and a fairer comparison of groups, and may be more valid for situations outside of testing sessions. I will discuss the implications of disentangling different aspects of test performance for reporting results and country rankings.
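As one concrete illustration of how log data can flag a single aspect of test-taking behavior, the sketch below marks responses faster than an item-specific time threshold as likely rapid guesses and compares accuracy with and without them. The threshold rule, column names, and toy data are hypothetical; the talk presents model-based approaches, not this simple classification.

```python
# Illustration only: flag responses below an item-specific response-time
# threshold as likely rapid guesses. The 10%-of-median rule and the toy data
# are hypothetical, not the models presented in the talk.
import pandas as pd

log = pd.DataFrame({
    "person":  [1, 1, 2, 2, 3, 3],
    "item":    ["i1", "i2", "i1", "i2", "i1", "i2"],
    "rt_sec":  [35.0, 2.0, 28.0, 41.0, 1.5, 30.0],
    "correct": [1, 0, 1, 1, 0, 1],
})

# Item-specific threshold: a small fraction of that item's median response time.
threshold = 0.10 * log.groupby("item")["rt_sec"].transform("median")
log["rapid_guess"] = log["rt_sec"] < threshold

# Compare accuracy on all responses versus solution-behavior responses only.
print(log)
print("accuracy, all responses:    ", log["correct"].mean())
print("accuracy, excluding guesses:", log.loc[~log["rapid_guess"], "correct"].mean())
```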

Dr. Steffi Pohl is a Full Professor of Methods and Evaluation/Quality Assurance in the Department of Education and Psychology, Freie Universität Berlin. She was awarded the Psychometric Society Early Career Award in July 2020. Her research focuses on developing methods in the areas of psychometrics, log data analysis, missing values, and causal inference.

 

Robert Wherry Speaker Series
Colloquium Archive