
Building on Our Success With the SIOP/CARMA Open Science Summer Series

Christopher M. Castille, Nicholls State University

In May and June of this year, SIOP, in conjunction with the Consortium for the Advancement of Research Methods and Analysis (CARMA), hosted the SIOP/CARMA Open Science Summer Series. The series was offered free of charge, and we would again like to thank CARMA's director, Dr. Larry Williams, for his support of the open science community. Our sessions were led by George Banks of the University of North Carolina at Charlotte, whom we also thank for taking time from his busy schedule to make the series happen. All of George's materials can be found in a Dropbox folder hosted by CARMA (https://tinyurl.com/ca3anaz9). Many of these materials draw on presentations given by scholars advancing open science, such as Herrera Bennett, Chris Chambers, Brian Nosek, Claire Riss, Steven Rogelberg, and Rolf Zwaan, as well as scholars contributing to the Center for Open Science ambassador-resource page.

Overall, the series was a resounding success! Participants who responded to our survey (n = 23) said they would recommend the series to their friends or colleagues. Attendance was strong: over 200 scholars signed up for the series, and each session drew roughly 20–50 viewers. Our lead instructor, George Banks, led us through discussions and exercises on a variety of open science topics. We also ended each day of the series with a panel discussion featuring editors from a variety of journals.

Table 1 gives a broad overview of the topics discussed, panelists, and supporting journals. Topics ranged from open science workflows to how to use the Open Science Framework to how to speak openly about key issues in opening up our science (e.g., authorship). Panelists consisted of editors and associate editors from several journals. Collectively, we explored the many ways of encouraging greater openness and transparency in our science.

Table 1
Open Science Topics Discussed on Each Day of the SIOP/CARMA Open Science Virtual Summer Series

Day 1
Workshop topics: (a) What is open science? (b) Accelerating robust research in the organizational sciences
Panelists: Scott Highhouse (Personnel Assessment and Decisions), Andrew Timming (Human Resource Management Journal), Mo Wang (Work, Aging and Retirement)

Day 2
Workshop topics: (a) What is the Open Science Framework? (b) An ounce of prevention is worth more than a pound of cure: The various forms of preregistering research
Panelists: John Antonakis (Leadership Quarterly), Maryam Kouchaki (Organizational Behavior and Human Decision Processes), Cornelius König (International Journal of Selection and Assessment)

Day 3
Workshop topics: (a) An open science workflow template (b) Reviewing with open science in mind
Panelists: Lillian Eby (Journal of Applied Psychology), Nadya A. Fouad (Journal of Vocational Behavior), Jonas W. B. Lang (Journal of Personnel Psychology)

Day 4
Workshop topics: (a) The many ways of ensuring analytic reproducibility (b) Promoting open science and replication work
Panelists: Paul Bliese (Organizational Research Methods), Berrin Erdogan (Personnel Psychology), Ioannis Nikolaou (International Journal of Selection and Assessment), Lucy Gilson (Group & Organization Management)

Day 5
Workshop topics: (a) How to have better conversations when making authorship decisions (b) Transparency and openness guidelines, preprints, and our publishing model
Panelists: Steven Rogelberg (Journal of Business and Psychology), Christian Resick (Journal of Organizational Behavior), Ingo Zettler (Journal of Personnel Psychology)

Note: Broadly speaking, panelists addressed two questions: (a) What problems or growing pains have you encountered in adopting or encouraging open science practices? (b) What advice do you have for authors who hope to adopt open science practices but see the challenges associated with doing so? Panel discussions did not necessarily track each day's workshop topics.

Several interesting insights and takeaways emerged from our panel discussions, a few of which I will highlight. Lillian Eby, editor of the Journal of Applied Psychology, encouraged authors to adopt methods checklists such as the one used at JAP (e.g., see Eby et al., 2020) to ensure that key methodological details are reported in a manuscript. Lillian also noted that JAP and all core APA journals are committing to the Transparency and Openness Promotion Guidelines (Nosek et al., 2015), which for JAP take effect in November 2021. Nadya Fouad, editor of the Journal of Vocational Behavior, reported that authors submitting under the results-blind review format, in which the methods are reviewed separately from the results of an investigation, currently enjoy nearly twice the acceptance rate of the journal's general submission format. These and other benefits of results-blind reviewing have been reported in the literature (see Woznyj et al., 2018). Far from harming journal prestige, open science publishing practices such as registered reports were cast by John Antonakis, editor of Leadership Quarterly, as both promoting good science and elevating his journal's standing. Representatives of several other journals (Leadership Quarterly, Journal of Business and Psychology, Human Resource Management Journal, International Journal of Selection and Assessment, Group & Organization Management, Journal of Organizational Behavior) highlighted new publishing practices (e.g., registered reports, results-blind reviewing) they have introduced to further enhance the rigor and reproducibility of our science.

The open science movement has already brought about many changes, but when it comes to further strengthening the robustness of our science, there is always more work to be done. One of our panelists, Cornelius König, an associate editor for the International Journal of Selection and Assessment, challenged attendees to see the replication crisis in social psychology as offering lessons for strengthening our own field. He noted that some may view the crisis, if we call it that, as largely confined to social psychology; he urged attendees to challenge this view and to consider which key claims in our field may need to be revisited in light of the open science movement. Another panelist, Lillian Eby, also noted that applying open science practices in I-O psychology raises challenges specific to our domain. For example, the proprietary nature of organizational data or measures can prevent sharing them openly. We should certainly strive for some forms of openness even in these cases (e.g., sharing sample items, providing detailed descriptive statistics), yet we will likely have to accept some data-privacy constraints if practitioners are to continue making timely and important contributions to the field.

We are continuing to leverage our relationship with CARMA to promote more open science activities sponsored by SIOP. We also hope to build more collaborations with other professional societies (e.g., Southern Management Association, Academy of Management). Many of these developments are ongoing, and we look forward to sharing more details in a subsequent entry in TIP.

Author Note

Thanks to Fred Oswald for his feedback on this manuscript.

If you are interested in contributing to Opening Up, TIP's column for all things open science, please contact christopher.castille@nicholls.edu. We are considering topics such as diversity and inclusivity, teaching open science, and areas where there may be value in spurring different kinds of replication projects (e.g., registered reports vs. registered replication reports).

References

Eby, L. T., Shockley, K. M., Bauer, T. N., Edwards, B., Homan, A. C., Johnson, R., Lang, J. W. B., Morris, S. B., & Oswald, F. L. (2020). Methodological checklists for improving research quality and reporting consistency. Industrial and Organizational Psychology, 13(1), 76–83. https://doi.org/10.1017/iop.2020.14

Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374

Woznyj, H. M., Grenier, K., Ross, R., Banks, G. C., & Rogelberg, S. G. (2018). Results-blind review: A masked crusader for science. European Journal of Work and Organizational Psychology, 27(5), 561–576. https://doi.org/10.1080/1359432X.2018.1496081
