IRC 2023 – Pre-Conference Workshops

Prior to the IEA International Research Conference (IEA IRC), IEA offered two-day, on-site workshops on 26–27 June 2023, covering specialized topics related to large-scale assessment.


The workshops provided a stimulating and practical learning environment for those who wished to improve their understanding of, and gain practice in working with, data from large-scale international assessments such as those conducted by IEA.

 
 
1. Using large-scale assessment data for informing policy and practice
David Rutkowski and Sabine Meinck

Over their 60-year history, modern ILSAs (international large-scale assessments) have become influential educational policy tools, moving beyond their historical role as descriptive “snapshots” of educational systems. As ILSAs grow in both the number of participants and the range of subjects assessed, policymakers often ask how the resulting information can help them inform the policy process. One important way to meet such requests is to present policy-relevant information from ILSAs in a brief and accessible format. This workshop discussed the utility and limitations of ILSAs for informing policymakers and education practitioners. We provided an overview of how policy briefs can be structured, along with illustrative examples, specifically using briefs published in IEA’s Compass Briefs series. Participants were encouraged to start preparing their own policy brief and left the workshop with a well-developed outline that they could later develop into a publishable document, potentially as an IEA Compass Brief.

Contents
The workshop focused on the studies PIRLS 2021 and REDS, and emphasized the following key topics:

  • Challenges and solutions in the construction and development of educational briefs that utilize ILSA data (defining and addressing the audience, structure and length, tables and visual displays) 
  • General information about PIRLS and REDS: goals, purposes and intent, theoretical frameworks, target populations, achievement domains and background information collected
  • Introduction to finding, using, and interpreting data from the studies
  • Interpretation and discussion of findings, using terminology and wording that is accessible to the defined audience

Methods
Lectures and group work alternated. Time was provided for participants to discuss possible topics and outlines for briefs relevant to their specific contexts. Participants also had the opportunity to work with IEA data, either referring to readily available statistics or, for those familiar with it, using simple statistical analysis tools such as IEA’s IDB Analyzer, with support from the instructors. Participants inspired each other by exchanging ideas on nationally focused analyses, while the instructors advised on possibilities and limitations.

About the instructors
David Rutkowski is a Professor of Research Methods at Indiana University (IU) and previously worked as a researcher for IEA in Hamburg, Germany. David’s research focuses on educational measurement and policy. He has collaborated with or consulted for national and international organizations including the US State Department, USAID, UNESCO, the World Bank, IEA, and the OECD. He is currently the editor of the IEA policy brief series and co-editor of the journal Discourse, serves on the IEA publication editorial committee, and is a board member of several academic journals. He also co-leads a project on improving assessment literacy among teachers.

Dr. Sabine Meinck is co-head of IEA’s Research and Analysis Unit and head of its Sampling Unit. She is responsible for the coordination and conduct of research- and sampling-related activities within IEA. Beyond her research activities, one of her main interests is the dissemination of the results of IEA studies to policy and practice.

 
 
2. Analyzing IEA Data with R
Umut Atasever and Diego Cortes

To conduct targeted research using existing education data from international large-scale assessments (ILSAs), researchers need a good knowledge of statistical software in order to perform preliminary analyses, subset data to the target population of interest, and finally carry out statistical analyses that take the specifics of ILSA data into account. A number of such software applications exist, but many are commercial and still require a steep learning curve for basic functionality. R, a free software application growing in use among the research and academic community, has an active user base willing to discuss and exchange ideas, as well as to offer solutions to basic and difficult programming questions. It has also been integrated with the IEA IDB Analyzer, a freely available tool for ILSA data analysis.


R is presently one of the most important tools used by statisticians. It is an open-source software environment and programming language with a wide variety of options and is highly extensible. It is used and supported not only by a broad and very diverse research community, but also by companies like Microsoft, Google, RStudio, and DataCamp, which actively promote R as a leading platform.


This hands-on workshop had two goals. Firstly, it aimed to help individuals who wanted to advance their data science skills using R and become familiar with RStudio, software that makes R user-friendly. It showed participants how to get their data into R, structure and transform it, analyze it, and visualize the results using the Tidyverse tools for data exploration. Secondly, participants learned about the specifics of analyzing ILSA data, such as using plausible values and weights; we used PIRLS 2021 data for our analysis examples, in the spirit of the sketch below. Participants were then introduced to the IEA IDB Analyzer’s integration with R to perform analyses with large-scale assessment data.
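
To give a flavor of those specifics, here is a minimal sketch, not taken from the workshop materials, of a point estimate that accounts for plausible values and sampling weights. The toy data frame stands in for a real PIRLS 2021 student file; ASRREA01–ASRREA05 (reading plausible values) and TOTWGT (total student weight) are the standard PIRLS variable names.

    library(dplyr)

    # Toy stand-in data so the sketch runs; in practice, read the real
    # PIRLS 2021 student file instead (all values below are simulated).
    set.seed(1)
    pirls <- as.data.frame(replicate(5, rnorm(500, mean = 500, sd = 100)))
    names(pirls) <- paste0("ASRREA0", 1:5)
    pirls$TOTWGT <- runif(500, min = 0.5, max = 2)

    # One weighted mean per plausible value...
    pv_means <- pirls %>%
      summarise(across(num_range("ASRREA0", 1:5),
                       ~ weighted.mean(.x, w = TOTWGT, na.rm = TRUE)))

    # ...then the average over the five plausible values is the point
    # estimate. A defensible standard error would additionally require the
    # jackknife replication variables (JKZONE/JKREP), omitted here.
    point_estimate <- rowMeans(pv_means)
    point_estimate
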
After the workshop, participants were able to:

  • Understand the use of R and RStudio
  • Import data into R
  • Perform subsetting, filtering, and recoding of data
  • Use the Tidyverse packages to explore and visualize data
  • Use functions to perform descriptive and inferential analyses
  • Understand how to account for the specifics of ILSA data in statistical analysis
  • Use R in conjunction with the IEA IDB Analyzer to answer research questions
  • Compile results and graphs into a report using R Markdown, e.g., for writing a journal article (see the sketch below)
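
On that last point, a single call renders such a file into a shareable report; this is a hypothetical illustration, and "pirls_report.Rmd" is a placeholder file name:

    # Render an R Markdown file containing the analysis code, prose, and
    # figures into an HTML report ("pirls_report.Rmd" is a placeholder).
    library(rmarkdown)
    render("pirls_report.Rmd", output_format = "html_document")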

This course aimed to provide researchers with a foundation in the programming knowledge they need to use R and its resources.

 

About the instructors
Umut Atasever works at the IEA Sampling Unit with major responsibilities in TIMSS, ICILS, and ICCS. He focuses on various aspects of sampling methodology, with studies on the validity of international assessments.

Diego Cortes is a research analyst at IEA Hamburg with experience in survey methodology and data analysis in the context of international large-scale assessments in the field of education. He specializes in generating probabilistic samples with complex designs and in examining how survey-design features affect the inferences one can draw about population parameters.

 
 
3. Methods for causal inference with observational data from international assessments
Alec Kennedy, Andrés Strello and Rolf Strietholt

Much comparative educational research poses questions about causal effects but cannot employ experimental techniques to investigate them. Over the last couple of decades, however, analytical techniques that support causal inference from cross-sectional and longitudinal data have been developed within statistics, econometrics, and other disciplines. The workshop presented so-called quasi-experimental designs for the identification of causal effects in the context of educational research: instrumental variables, regression discontinuity, difference-in-differences, and fixed effects. These include approaches such as exploiting random variation due to class-size or age-of-entry policies in schools, country-level longitudinal analyses (e.g., TIMSS 2011, 2015, 2019), comparisons between educational stages (e.g., fourth grade, eighth grade), and within-student-between-grade designs (e.g., math, science), among other strategies; a minimal sketch of one such design follows below.
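
The sketch below illustrates the difference-in-differences idea via a two-way fixed effects regression on simulated country-level data. It is not workshop material, and all names (timss_panel, reform, mean_score) are invented for illustration.

    # Simulated country-by-cycle panel (invented for illustration): countries
    # C1..C8 adopt a hypothetical policy ("reform") from 2015 onwards.
    set.seed(1)
    timss_panel <- expand.grid(country = paste0("C", 1:20),
                               cycle   = c(2011, 2015, 2019))
    timss_panel$reform <- as.integer(timss_panel$country %in% paste0("C", 1:8) &
                                       timss_panel$cycle >= 2015)
    timss_panel$mean_score <- 500 + 5 * timss_panel$reform +
      rnorm(nrow(timss_panel), sd = 10)

    # Two-way fixed effects regression: the coefficient on `reform` is the
    # difference-in-differences estimate. Proper inference would also cluster
    # standard errors by country, which this sketch omits.
    did_model <- lm(mean_score ~ reform + factor(country) + factor(cycle),
                    data = timss_panel)
    summary(did_model)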

In this workshop, causal effects were defined using the potential outcomes framework of the Rubin causal model (RCM). Key differences between experimental and observational designs were discussed to explain issues such as selection into treatment, omitted variable bias, and reverse causality. Thereafter, we introduced, in a non-technical way, different methods that address these issues. We reviewed studies that have employed data from international assessments such as PIRLS and TIMSS to illustrate the methods. These studies were discussed in groups, highlighting the authors’ identification strategies as well as the advantages and shortcomings of each method.
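
In standard potential outcomes notation (a textbook formulation, not necessarily the workshop's exact slides), the unit-level causal effect and the average treatment effect (ATE) can be written as:

    \tau_i = Y_i(1) - Y_i(0), \qquad \mathrm{ATE} = \mathbb{E}[Y_i(1) - Y_i(0)]

Here Y_i(1) and Y_i(0) denote the outcomes unit i would realize with and without the treatment; since only one of the two is ever observed (the fundamental problem of causal inference), causal estimands must be identified through design assumptions like those discussed above.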

After the workshop, participants were able to:

  • Understand what a causal effect is
  • Recognize issues related to the identification of causal effects with observational data
  • Understand the limitations of regression analysis
  • Employ strategies to identify causal effects using observational data from large-scale assessments

Methods
The workshop was a mixture of lectures, group discussions, and hands-on examples, ensuring that participants gained both the theoretical background and real-life examples applicable to educational research.

About the instructors
Alec Kennedy is a researcher at the Research and Analysis Unit of the IEA. His interests are in education policy and quantitative research methodology.

Andrés Strello is a researcher at the Research and Analysis Unit of the IEA. He is interested in educational inequality, sociology of education, and comparative analyses. 

Rolf Strietholt is the co-lead of the Research and Analysis Unit of the IEA. His main research interests include school and educational effectiveness research and research methodology.