Alexandra Sarafoglou
Sessions
Most empirical research articles present a single analysis conducted by the authors. However, many-analysts studies have shown that research teams often use distinct analytical approaches to the same data, which frequently leads to varying conclusions.
To promote robustness and encourage a more diverse statistical perspective in research, we launched the diamond open-access Journal of Robustness Reports. The journal publishes concise 500-word summaries of alternative analyses conducted by independent analysts, which enables a more comprehensive and balanced interpretation of empirical data.
In this hackathon, we invite participants to propose target articles for Robustness Reports, that is, influential and widely debated scientific studies where alternative analyses could complement the original findings and provide valuable new insights. Participants will then break out into groups to conduct the reanalyses, write submission-ready Robustness Reports, and present their findings in a plenary discussion.
While the social sciences have adopted preregistration as a preferred method to prevent bias, astrophysics and cosmology have embraced analysis blinding to safeguard confirmatory research since the early 2000s. In this workshop, I will discuss the strengths and challenges of analysis blinding, a technique in which the data are temporarily altered before analysis. I will briefly review empirical findings comparing analysis blinding to preregistration and highlight the types of projects for which this approach is particularly valuable. As a practical exercise, participants will have the opportunity to apply analysis blinding to an empirical dataset.
In this unconference, we want to explore the potential of the many-analysts approach in the context of exploratory research. To date, the many-analysts approach—where multiple research teams address the same research question using the same dataset—has been primarily applied to confirmatory research. In that domain, it has demonstrated a surprising diversity in how researchers preprocess data, operationalize key constructs, and select statistical models to test hypotheses of interest. However, its potential for exploratory research has received relatively little attention.
We believe that many-analysts approaches could provide valuable insights into intriguing patterns in the data and enable a systematic exploration of the variable space. These insights, in turn, can refine auxiliary assumptions, inform theory development, and guide the design of subsequent confirmatory studies.
Keynotes: Alexandra Sarafoglou, Marton Kovacs, Jeffrey Lees, Lisa Spitzer, and Agata Bochynska.
Moderator: Ekaterina Pronizius
This year, we are departing from tradition at SIPS. Rather than inviting individual keynote speakers to open and close the conference, we will host two open roundtable discussions.
For the final day, we would like to host a second roundtable, this time highlighting the rising stars of open science. We would love to hear their personal stories and perspectives, and partway through, we will open the conversation to all SIPS attendees. We are confident that this discussion will provide fresh ideas and inspiration that will carry into the SIPS sessions.