Featured Articles

We Want Open Science in I-O! ...Do We?

Vinay Patel and Joe Meyer, Louisiana Tech University

Open Science in I-O

What is science? Is it a conglomeration of gray-haired men and women swirling questionable concoctions in test tubes, a group of gung-ho graduate students running statistics, or a practice aimed at achieving the “greater good” (whatever that is)? These are all relevant reflections when considering the implications of what has been labeled “open science,” or open-sourced scientific research.¹ Many questions surround the idea of open science, and we aim to answer a few of them. The major purposes of this article are threefold:

  1. Spread awareness of open science practices so that scientists and practitioners have the knowledge to engage in them and address issues prevalent within the field.
  2. Gauge SIOP professionals’ attitudes toward open science and how I-O professionals can adopt and implement these practices.
  3. Show our readers the steps certain I-O professionals have taken to promote open science within our field.

We reviewed various sources and identified potential benefits and caveats of adopting open science. We then interviewed a handful of SIOPers on the subject and touched on the current uptake of open science in the field. Finally, we make some recommendations for scientists, practitioners, and the general public alike.

Potential benefits:

  • Paid readership: We’ve all been in this situation: you find the perfect abstract, but wait, just four easy payments of $19.99 and you can access it. The current scientific publication model is one of the most profitable industries in existence (Buranyi, 2017). Open access would allow free access to researchers and practitioners alike.
  • Combatting the publish-or-perish model: Some are beginning to question the current publish-or-perish model (Smaldino & McElreath, 2016): should quantity of publications determine the award of tenure, or should quality and impact be the primary determinant?
  • Large collaborative data sets: The mass sharing of information through the digital revolution has transformed society and brought great changes to the scientific enterprise. Scientists from around the globe can collaborate on data sets more effectively and with less restraint. The natural sciences have, to a considerable degree, adopted an open science approach in recent years; should the social sciences follow? (National Academies of Sciences, Engineering, and Medicine, 2018)
  • Transparency and reproducibility: Should science be conducted covertly? Open science calls for transparency in all stages of research. This can increase honesty in research practice and make replication more viable. Research suggests that questionable research practices (QRPs) are pervasive in research culture and that open science approaches can inhibit them (Fiedler & Schwarz, 2016; John, Loewenstein, & Prelec, 2012; Open Science Collaboration, 2015; Simmons, Nelson, & Simonsohn, 2011).
  • Preregistration: Related to the previous point, preregistration of studies (taken up by the Journal of Business and Psychology and the Journal of Personnel Psychology) increases candor and adherence to the researcher’s original methodological plan at every stage of the process, from hypothesis formulation to data collection. With a much clearer blueprint of the methodology, successful replication becomes more likely. After all, we are in a “reproducibility crisis” (Banks et al., 2018; Open Science Collaboration, 2015).
  • Addressing the scientist–practitioner gap: Many practitioners lose access to scientific journals once they depart from academia. This may curtail evidence-based practice and take them away from the “pulse” of the field. If practitioners could openly access journals, their practice would be more informed and grounded in science, possibly leading to further contributions to research and the field (Banks et al., 2018).
  • Not only the significant studies get published: Preregistration combats the current paradigm of publishing only significant results (including false positives) and thwarts deceptive tactics such as p-hacking and significance farming. There is much to be learned from a study that does not reach significance, and what was the effect size, anyway? (Simmons, Nelson, & Simonsohn, 2011)
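The inflation that undisclosed flexibility produces is easy to demonstrate. The following simulation is our own sketch (not from Simmons et al.): it measures how often a study with no true effect still yields at least one p < .05 when the researcher measures several outcome variables and reports whichever one "worked."

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def false_positive_rate(n_outcomes, n_sims=5000, n_per_group=20):
    """Fraction of simulated studies reporting at least one p < .05
    when the null hypothesis is true for every outcome (pure noise)."""
    hits = 0
    for _ in range(n_sims):
        # Two groups measured on several outcome variables, all noise
        a = rng.standard_normal((n_per_group, n_outcomes))
        b = rng.standard_normal((n_per_group, n_outcomes))
        pvals = stats.ttest_ind(a, b).pvalue  # one p-value per outcome
        if (pvals < .05).any():               # report "whichever worked"
            hits += 1
    return hits / n_sims

print(false_positive_rate(1))   # close to the nominal .05
print(false_positive_rate(5))   # well above .05
```

With one preregistered outcome the false positive rate stays near the nominal 5%; with five outcomes and selective reporting it climbs past 20%, which is the mechanism Simmons, Nelson, and Simonsohn (2011) describe.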

Open science is not, however, without its limitations and caveats:

  • Costs and infrastructure: The technology needed for researchers to adopt open science practices isn’t science fiction; we have it. However, hosting and maintaining large amounts of easily accessible data is costly. Furthermore, it has been argued that open science may further marketize research (Tyfield, 2013).
  • Less rigorous review: Quick review in some open science journals has its benefits; however, it may result in less rigorous peer review, possibly allowing QRPs or less methodologically rigorous studies into publication (Lancaster, 2016).
  • Privacy: As I-O is inherently involved in organizational settings, researchers need to be heedful of potentially confidential information and proprietary knowledge when publishing open access research.
  • Payment where payment is due: For a journal to operate, funding needs to come from somewhere, and no, it does not grow on trees. Some open science journals operate through generous donors and institutions; however, this model depends on sustained generosity, so a sustainable funding framework needs to be established, especially where funding from traditional sources is lost (Harnad et al., 2004).

Open science has gained considerable backing in recent years, with interest spanning the natural and psychological sciences. We now aim to outline the prevalence of two issues relevant to psychology in general, as well as to industrial-organizational psychology, and to consider the benefits of adopting open science in I-O.


Reproducibility

There has been specific mention of marginal reproducibility percentages in the psychological sciences. Recent initiatives have attempted to estimate psychology’s reproducibility rate, finding it to be around 36% (Open Science Collaboration, 2015). Similarly, it has been suggested that the average power of psychological research is approximately 36%, in stark contrast to the .80 standard of adequate power (Stanley, Carter, & Doucouliagos, 2018). Although no figures have been reported for industrial-organizational psychology specifically, the field is likely comparable. In response to the concern over reproducibility, numerous calls to action have taken place, including a track at SIOP 2017 dedicated to reproducible research (Horn, Stilson, & Vaughn, 2017). Additionally, SIOP’s task force on Robust and Reliable Science, aimed at examining the current climate of research, has specifically cited open science as an important consideration as the field moves forward (Grand et al., 2018). The task force’s goal was to urge fellow researchers and practitioners in our community to engage in practices and efforts that better I-O science, setting forth six recommendations: relevance, rigor, replication, accumulative and cumulative research, openness and transparency, and theory-oriented research (Grand et al., 2018). Considering the recent findings on reproducibility rates, the rising concerns about questionable research practices, and the specific calls to action in industrial-organizational psychology, we advocate implementing open science in I-O to mitigate these issues. Reproducibility is integral to science (Nosek et al., 2015). Transparency in methodology, procedure, and data analysis (for example, sharing R code) allows researchers to construct similar experiments to reproduce results.
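The gap between 36% average power and the .80 benchmark is concrete, and a rough Monte Carlo sketch (ours, using a hypothetical two-group design with a medium effect of d = .5) makes it tangible: at a common sample size the chance of detecting a real effect falls far short of the standard.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def estimated_power(d, n_per_group, n_sims=5000, alpha=.05):
    """Monte Carlo power estimate for a two-sided independent-samples
    t test with true standardized effect size d."""
    hits = 0
    for _ in range(n_sims):
        a = rng.standard_normal(n_per_group)      # control group
        b = rng.standard_normal(n_per_group) + d  # treatment, shifted by d
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_sims

# A hypothetical study: medium effect (d = .5), 30 participants per group
print(round(estimated_power(0.5, 30), 2))  # well under the .80 standard
# The classic benchmark: roughly 64 per group is needed for .80 at d = .5
print(round(estimated_power(0.5, 64), 2))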

Questionable Research Practices

Ironically, considerable research has investigated questionable research practices (QRPs; Fiedler & Schwarz, 2016; John et al., 2012; Simmons et al., 2011). There is no consensus regarding the prevalence of QRPs; however, most would agree they are an issue that needs correction. Fanelli (2009) conducted the first meta-analysis attempting to gauge the frequency of QRPs, finding a pooled weighted average of 1.97% of researchers openly admitting to falsifying or modifying data, and up to 72% admitting to participating in “other” QRPs. Furthermore, up to 72% of researchers indicated they were aware of QRPs committed by colleagues (Fanelli, 2009). Obvious limitations exist in examining QRPs, foremost among them social desirability bias. Some have attempted to alleviate social desirability bias by incentivizing truth telling (John et al., 2012); in that study, 94% of researchers surveyed admitted to having engaged in at least one QRP. It can therefore be concluded that QRPs play a significant role in research culture and are pervasive. As noted above, preregistration may combat this by increasing candor and adherence to the researcher’s original methodological plan at every stage of the process, from hypothesis formulation to data collection. It behooves researchers to provide the best possible chance for reproduction, and open science may potentiate this.

What Can We Do?

As Bharati Belwalker (personnel administrator for the City of New Orleans) pointed out, changes need to happen at a micro level, which could ignite interest in the subject and eventually change incentives (moving toward a reward system that does not encourage QRPs). She recommends taking up open science as an individual, a form of grassroots movement, by conducting transparent, theoretically sound, methodologically rigorous, and ethical I-O research and practice. It could be simple: for example, if you have a poster accepted at SIOP, sign up to share your data, design, and so on. From there, word could spread. Neil Morelli (head of selection for the Cole Group) discussed how open access can lead to “cross-pollination,” taking our research to the next level by incorporating methods and technologies used in other disciplines. Not only scientists but also the general public would be able to access it, furthering dissemination and honesty in research approaches throughout the entire process. Chris Castille compared combating QRPs to fighting pollution, where the pollution represents QRPs. A front-end approach, rather than a back-end one, can alleviate some of the issues mentioned throughout this article: take care of problems sooner using open science practices, as opposed to catching QRPs 10 years down the road, after the work has already been published, replicated, and shared on the CNN nightly news. Our current research culture is a rewards-based publishing system. This is not necessarily inherently flawed, but when one’s career is on the line, some researchers have gone to questionable lengths to keep that publication count up (Fanelli, 2009; Simmons et al., 2011). Changing this framework will be difficult; it amounts to a paradigm shift. However, through this article, we aim to call the current paradigm into question and stimulate thoughts and conversations on how to change it.

What’s Going on Right Now?

Some notable adoptions of the practice have occurred in recent years. The journal Personnel Assessment and Decisions (PAD) is one of note. Scott Highhouse and Dennis Doverspike outlined the creation of PAD in their Summer 2017 TIP article, pointing toward five goals: “1) counter the trend away from practical scientific research; 2) be no cost to authors and readers; 3) ease the burden on reviewers; 4) publish shorter and more accessible articles; and, 5) begin to respond to calls for changes in the basic nature of the publication enterprise” (Highhouse & Doverspike, 2017). The online journal displays a world map with gray pins indicating where each open-access download occurred, exemplifying worldwide interest in and collaboration on its content. As of early October, recent downloads have occurred in Nigeria, Australia, Russia, and Brazil. Frank Igou (International Personnel Assessment Council [IPAC] board member) commented on PAD’s practitioner focus and quick turnaround (time from submission to publication), believing that its benefit comes down to time allocation. He states, “Practitioners work 40-60 hours per week, they (practitioners) can allocate their time towards practice, not allocating another 200 hours, or 3–6 months to put together an article.”

Neil and Bharati both report that isolation from the “pulse” of the field may occur when one leaves graduate school and begins to practice, due to the paid-readership and accessibility issues discussed above. Neil told us about a website created by Alison Mallard that is already helping a large number of students and practitioners: IOatwork.com. Alison realized that restricted access keeps research from reaching the people who would use it most, so the website summarizes recent articles as a means of getting research into the right hands. With this site, spending a few minutes every month allows practitioners and students to keep up with the most current, relevant literature and actually use the research.


  1. As Gema Bueno de la Fuente (n.d.) of FOSTER Open Science outlines, open science is a general term comprising the following subcomponents: open source, open access, open data, open education, and open methodology. In some cases, however, not all of these components are met, nor must they all be for a practice to count as open science.


References

Banks, G. C., Field, J. G., Oswald, F. L., O’Boyle, E. H., Landis, R. S., Rupp, D. E., & Rogelberg, S. G. (2018). Answers to 18 questions about open science practices. Journal of Business and Psychology, 1-14.

Bueno de la Fuente, G. (n.d.). What is open science? https://www.fosteropenscience.eu/content/what-open-science-introduction

Buranyi, S. (2017, June 27). Is the staggeringly profitable business of scientific publishing bad for science? The Guardian. Retrieved November 2018.

Fiedler, K., & Schwarz, N. (2016). Questionable research practices revisited. Social Psychological and Personality Science, 7(1), 45-52.

Grand, J., Rogelberg, S., Allen, T., Landis, R., Reynolds, D., Scott, J., . . . Truxillo, D. (2018). A systems-based approach to fostering robust science in industrial-organizational psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice, 11(1), 4-42. doi:10.1017/iop.2017.55

Harnad, S., Brody, T., Vallières, F., Carr, L., Hitchcock, S., Gingras, Y., ... & Hilf, E. (2004). The green and the gold roads to open access. Nature Web Focus. http://www.nature.com/nature/focus/accessdebate/21.html

Highhouse, S., & Doverspike, D. (2017). Creating an open-access, practitioner-friendly, scientific journal for I-O psychology: The case of Personnel Assessment and Decisions (PAD). The Industrial-Organizational Psychologist, 55(1). http://www.siop.org/tip/july17/pad.aspx

Horn, Z., Stilson, R., & Vaughn, D. (2017). Your guide to reproducible research (RR) at SIOP 2017. The Industrial-Organizational Psychologist, 54(4). http://www.siop.org/tip/april17/rr.aspx

John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524-532.

Lancaster, A. (2016). Open science and its discontents. http://ronininstitute.org/open-science-and-its-discontents/1383/

National Academies of Sciences, Engineering, and Medicine. (2018). Open science by design: Realizing a vision for 21st century research. Washington, DC: The National Academies Press.

Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., ... & Contestabile, M. (2015). Promoting an open research culture. Science, 348(6242), 1422-1425.

Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. doi:10.1126/science.aac4716

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359-1366.

Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384.

Stanley, T. D., Carter, E. C., & Doucouliagos, H. (2018). What meta-analyses reveal about the replicability of psychological research. Psychological Bulletin. Advance online publication.

Tyfield, D. (2013). Transition to Science 2.0: “Remoralizing” the economy of science. Spontaneous Generations: A Journal for the History and Philosophy of Science, 7(1), 29-48.
