

Volume 54, Number 1, July 2016 | Editor: Tara Behrend


Call for Proposals for I-O Graduate Program Rankings

Nicholas P. Salter, Joseph A. Allen, Allison S. Gabriel, David Sowinski, and Loren Naidoo

Are you part of an I-O graduate program that is truly excellent but whose strengths the typical ranking systems do not necessarily capture? Would you like to help I-O psychologists (both current members and those who will enter our field in the future) develop a better understanding of the different strengths of various graduate programs? We are issuing a Call for Proposals for rankings of I-O graduate programs. This is an excellent opportunity for graduate programs to highlight the ways in which they excel and for individual SIOP members to contribute to our field.

Through this Call for Proposals, we are seeking new and unique methodologies for ranking I-O Ph.D. and M.A./M.S. programs that reflect the diversity of values and strengths across the field of I-O. Multiple ranking methodologies will be accepted for publication, resulting in multiple rankings featured in an upcoming issue of TIP. We have developed this call in consultation with the TIP Editor, in response to a need for more comprehensive and updated information about graduate programs.

Proposal Rationale

Ranking graduate programs is not uncommon in our field; TIP features program rankings on a somewhat regular basis (and rankings are available elsewhere as well).  However, the criteria programs are ranked on typically focus on a program’s research productivity (Beiler, Zimmerman, Doerr, & Clark, 2014; Gibby, Reeve, Grauer, Mohr, & Zickar, 2002; Oliver, Blair Gorman, & Woehr, 2005; Payne, Succa, Maxey, & Bolton, 2001; Winter, Healy, & Svyantek, 1995). To measure research productivity, previous rankings have counted the number of faculty journal publications (and in some cases, only from a select subset of influential I-O journals) as well as the number of SIOP presentations.  One exception to this trend was Kraiger and Abalos (2004), who ranked programs based on student ratings of the program.  Their ranking captured research productivity (as self-defined by the students), but also captured criteria such as the general culture of the program and program resource availability.

Ranking I-O graduate programs based on research productivity is appropriate in many ways. First, research productivity is considered important among many I-O psychologists (and indeed among many academics in general). Moreover, for prospective students who enter the field of I-O because they want to conduct interesting and important research, rankings based on research help direct individuals to programs that maximize person–research fit. Beyond this, though, there are practical reasons for ranking on this criterion. It is difficult to rank the quality of a graduate program, given that many of the factors that make programs unique are intangible, subjective, and challenging to measure. Research productivity is somewhat more objective and therefore easier to measure: those conducting the rankings can count the number of publications coming out of a program and differentiate publications of varying quality.
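To make that counting logic concrete, here is a minimal illustrative sketch (in Python) of how a weighted publication count could be turned into a ranking. The programs, counts, and journal-tier weights are entirely hypothetical assumptions for illustration; they do not reflect the methodology of any ranking cited above.

```python
# Illustrative sketch only: a hypothetical productivity score computed from
# made-up publication counts, weighted by a made-up journal-quality tier.
# Neither the programs, the counts, nor the weights reflect any real ranking.

# Hypothetical counts of faculty publications per program, split by journal tier.
publication_counts = {
    "Program A": {"top_tier": 12, "other": 20},
    "Program B": {"top_tier": 7, "other": 35},
    "Program C": {"top_tier": 15, "other": 10},
}

# Hypothetical weights that count top-tier journal articles more heavily.
tier_weights = {"top_tier": 2.0, "other": 1.0}

def productivity_score(counts):
    """Weighted sum of publication counts across journal tiers."""
    return sum(tier_weights[tier] * n for tier, n in counts.items())

# Rank programs from highest to lowest weighted productivity.
ranked = sorted(publication_counts,
                key=lambda p: productivity_score(publication_counts[p]),
                reverse=True)
for rank, program in enumerate(ranked, start=1):
    print(rank, program, productivity_score(publication_counts[program]))
```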

However, research productivity is not the only criterion for many people deciding where to apply to graduate school. Rankings that look only at research productivity can have the unintended consequence of undervaluing programs that excel in other ways. For example, programs that place greater emphasis on practice, on teaching, or on a balance among these may be ranked poorly despite being valuable to many students. If such programs do not appear in the rankings, others may conclude that they are of poor quality or less rigorous, which may not be the case. Another consequence of focusing on research productivity is that there are fewer rankings of terminal M.A./M.S. programs, likely because these programs tend to be less research focused. Indeed, with few exceptions (e.g., Kraiger & Abalos, 2004), prior efforts have ranked only Ph.D. programs. In light of these considerations, the time is ripe for new approaches to graduate program rankings across different degrees, using a variety of criteria beyond research productivity.

Possible Submission Methodologies

We are looking for submissions from teams and/or individuals to conduct updated rankings of I-O Ph.D. and/or M.A./M.S. programs. We hope to receive submissions that rank programs on criteria demonstrating the ways I-O programs excel beyond research productivity, which will help students applying to graduate school choose a program best suited to their needs. We hope teams will submit proposals that include both unique criteria and common criteria; areas of overlap between teams are not only acceptable but desired. Alternative ranking criteria may include (but are not limited to):

  • How well a program prepares students for a career outside of a traditional research-focused school, such as at a teaching-focused or comprehensive school or at an applied job. Measures of this outcome might include alumni placement records as well as opportunities for teaching and/or applied experiences while still in school.
  • How much of an “impact” members from each program make, such as scholarly impact outside of I-O or involvement with the community. Measures of this outcome might include how often members of the programs are quoted in the news media as well as research productivity in non-I-O journals.
  • How graduate students feel about the program, such as their engagement, well-being, or turnover/retention (similar to Kraiger & Abalos, 2004).
  • The amount of diversity that is reflected by members (students and faculty) of a program.
  • The growth trajectory of a program; some newer or developing programs are really excelling in many ways but may not yet “show up” on traditional ranking systems.

Please note: the above-mentioned ideas are only suggestions, not requirements.  Those submitting a proposal may choose to include some of these, but are encouraged to develop other innovative criteria and measures.
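As one concrete, purely illustrative picture of what a multi-criteria methodology might look like, the short Python sketch below rescales several hypothetical criteria to a common range, weights them, and sums them into a composite score. The programs, criterion names, scores, and weights are invented assumptions for illustration only; they are not requirements of this call, and any real proposal would need to justify its own criteria, measures, and weighting.

```python
# Illustrative sketch only: one hypothetical way to combine several criteria
# (e.g., applied placement, student satisfaction, diversity) into a composite
# ranking. All programs, scores, criteria, and weights below are invented.

# Hypothetical raw scores on three criteria, each on its own measurement scale.
raw_scores = {
    "Program A": {"applied_placement_rate": 0.80, "student_satisfaction": 4.1, "diversity_index": 0.55},
    "Program B": {"applied_placement_rate": 0.65, "student_satisfaction": 4.6, "diversity_index": 0.70},
    "Program C": {"applied_placement_rate": 0.90, "student_satisfaction": 3.9, "diversity_index": 0.60},
}

# Hypothetical weights reflecting how much each criterion counts toward the composite.
weights = {"applied_placement_rate": 0.4, "student_satisfaction": 0.4, "diversity_index": 0.2}

def min_max_normalize(values):
    """Rescale a list of values to the 0-1 range so criteria on different scales are comparable."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.5 for v in values]

programs = list(raw_scores)
criteria = list(weights)

# Normalize each criterion across programs, then take the weighted sum per program.
normalized = {
    c: dict(zip(programs, min_max_normalize([raw_scores[p][c] for p in programs])))
    for c in criteria
}
composite = {p: sum(weights[c] * normalized[c][p] for c in criteria) for p in programs}

# Higher composite score = higher rank.
for rank, program in enumerate(sorted(programs, key=composite.get, reverse=True), start=1):
    print(rank, program, round(composite[program], 3))
```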

Submission Guidelines

We are seeking two-page submissions that briefly describe the proposed ranking methodology. Specifically, the first page should explain the criteria to be ranked, the rationale for including them, and the specific measures to be used. It should also include a timeline detailing when each step of the plan will be executed. The second page should provide information about who is on the author team and their qualifications for successfully executing the proposal. Successful proposals will clearly define how the authors plan to rank I-O programs as well as the overarching aims and goals such a ranking would achieve.

Proposals are due to Nicholas Salter (nsalter@ramapo.edu) by October 15, 2016. Authors should expect to receive acceptance decisions by November 15, 2016, and final reports are due November 15, 2017. We note that our goal is to promote open scientific practices in TIP; as such, authors should agree to make their data available to others upon request.
