Ranking I-O Master's Programs Using Objective Data From I-O Coordinators
Stephen J. Vodanovich, Valerie J. Morganson, and Steven J. Kass, University of West Florida
Past I-O psychology program ranking surveys have differed in a number of respects, such as sample type and size (e.g., faculty, students), type of data collected (e.g., objective, subjective), and number of programs ranked; the majority have focused on ranking doctoral programs based on research productivity (Beiler, Zimmerman, Doerr, & Clark, 2014; Gibby, Reeve, Grauer, Mohr, & Zickar, 2002; Oliver, Blair, Gorman, & Woehr, 2005; Payne, Succa, Maxey, & Bolton, 2001; Winter, Healy, & Svyantek, 1995). An exception is the research of Kraiger and Abalos (2004), who surveyed master's and doctoral students with a focus on nonresearch factors (e.g., instruction quality, faculty support, funding, class size).
Given that previous survey research focused on doctoral programs and research output, our project extended current rankings by concentrating on master's-level programs and on applied experiences. Additionally, we relied upon objective data. Highlighting applied experience seemed warranted given that applied skills are arguably more pertinent to ranking MA/MS programs than to assessing doctoral training. Indeed, past research on master's I-O programs has discussed the vital role of applying classroom material through such mechanisms as consulting projects and internships (e.g., Hays-Thomas & Kass, 2003; Kottke, Shoenfelt, & Stone, 2014; Lowe, 1993; Schneider, Piotrowski, & Kass, 2007; Shoenfelt, Kottke, & Stone, 2012; Shoenfelt, Stone, & Kottke, 2013). Also, the focus on application is consistent with SIOP's Guidelines for Education and Training, which recognize that graduates of master's programs are more likely to be consumers of I-O research (e.g., using the literature to solve work-related problems) than to be researchers themselves. We wrote items, a priori, to assess the following dimensions: (a) applied experience, (b) curriculum, (c) faculty experience/information, and (d) student accomplishments/information. We discuss each of these dimensions and their rationale below.
Applied Experience
The first, quintessential dimension—applied experience—reflects the degree to which students are engaged in various real-world opportunities to use their I-O skills. These include the availability of applied projects and internships and the presence of a consulting unit. Such activities provide enriching applied experiences for graduate students (e.g., Byrne et al., 2014; Dickson & Mullins, 2016; Kottke et al., 2014).
Curriculum
Curriculum is inherently critical to ranking educational programs. The curriculum dimension included the number of credit hours, I-O-related credit hours, time to graduation, the proportion of students who graduate on time, and course offerings. Our assessment of I-O master's programs' course offerings was based on past work that identified the importance of specific courses and competencies for graduate study, including the SIOP Guidelines for Education and Training in Industrial-Organizational Psychology (2016) as well as other relevant research (e.g., Erffmeyer & Mendel, 1990; Tett, Walser, Brown, Simonet, & Tonidandel, 2013; Trahan & McAllister, 2002).
Faculty Experience/Information
In parallel to historical rankings that assessed programs based on faculty research productivity (see citations above), faculty applied experiences and qualifications are a component of our ranking procedure. Faculty applied experience was included to help applicants evaluate the type of mentoring they can expect in a program. One might expect that faculty who have themselves engaged in applied work (e.g., consulting for clients) would be better able to supervise students in hands-on experiences (e.g., fieldwork, internships, projects) by modeling the expected behavior. This is the educational approach suggested in the SIOP Guidelines (2016) for building professional competence. The faculty experience/information dimension included faculty-to-student ratio, the proportion of faculty who engage in consulting, and the proportion of faculty who supervise consulting projects. Because scholarship is essential to quality applied intervention, we also included an item reflecting faculty research productivity.
Student Accomplishments and Information
The assessment of student accomplishments evaluates programs in terms of what applicants could expect while matriculating (e.g., assistantships) as well as upon graduation (e.g., obtaining jobs). To the extent that strong students help attract and retain other strong students and build the reputation and human capital of a program, student accomplishments can be viewed as an indicator of program quality (cf. Schneider, 1987). Perhaps more intuitively, student accomplishments can also be viewed as an outcome of program quality. Therefore, we decided both to present program rankings on the student accomplishments/information dimension and to treat it as an outcome of the other three factors that we measured. Student accomplishments/information included items concerning the percentage of non-doctoral-bound students who obtain an I-O job within a year, active participation in I-O-related student chapters, assistantships, graduation rates, and presentations at conferences.
Method
Participants, Procedure, and Instrumentation
Prior to administering the survey, the project was approved by the university's institutional review board. After receiving contact information from SIOP for each of the 127 coordinators of I-O MA/MS programs, we emailed each coordinator a brief description of our research, an informed consent statement, and a link to our survey in Qualtrics. We surveyed coordinators of terminal master's programs as well as of MA/MS programs contained within doctoral programs. Weekly reminders were sent to increase the response rate, targeting coordinators who had not yet begun the survey or whose responses were incomplete. Overall, coordinators were allowed approximately 6 weeks to complete the survey. This extended window was provided because several items required coordinators to consult archival data (e.g., graduation rates) in order to respond accurately.
Our final sample consisted of 69 completed surveys, for a response rate of 54%.2 Degree types included MS (47.8%), MA (44.9%), and other degrees (7.2%; e.g., both MA and MS, MPS). The vast majority (92.8%) of respondents indicated that their program was located within psychology. Programs embedded within public institutions (68.1%) were more prevalent than programs within private institutions (31.9%). Whereas 69.6% of respondents indicated that they did not have a doctoral program, 30.4% did. The majority of respondents reported having a face-to-face program (81.2%), but our sample also included hybrid programs (10.1%) and exclusively virtual programs (8.7%). With the exception of two programs, respondents were located at universities within the United States.3
The final survey consisted of 53 questions. The items were written and placed into the four dimensions noted earlier for ranking purposes: applied experience, curriculum, faculty experience/information, and student accomplishments/information. Additional items assessed "demographic" program data that were not used for ranking purposes (e.g., faculty diversity, job tenure, private or public university). At the end of our survey, we provided a link to a page maintained by a team of researchers at Appalachian State University who were conducting a different ranking survey for SIOP.
Several considerations guided item development. One was the inclusion of items to collect program data that are regularly reported by SIOP via input from program coordinators (e.g., number of I-O faculty, graduate employment, average program completion time). Other items were written to be consistent with previous surveys (e.g., Tett et al., 2013) and SIOP's 2016 Guidelines for Education and Training in Industrial-Organizational Psychology (e.g., curriculum coverage, applied focus for MA/MS programs). Finally, the items reflected those contained in previous program ranking surveys, covering areas such as applied experience, coursework, financial aid, and student research opportunities (Bulger, Horvath, & Zickar, 2006; Kraiger & Abalos, 2004). We originally planned to conduct a pilot study with a random sample of I-O coordinators to assist in item construction, but the tight timeframe for data collection made this approach untenable.
Because items varied in response format (e.g., ordinal categories, continuous data), all responses were rescaled from 0 to 1 so that items could be combined into category scores. For example, responses were assigned point values (e.g., 0 to 5), with higher scores indicating greater quality (e.g., number of hours needed for internships, total I-O-related hours required, total program hours, number of I-O faculty); points were then divided by the number of options available on each item. Items that asked about proportions were divided by 100 (e.g., percentage of I-O students who graduate, percentage of students who complete an internship). Items were averaged to create the category scores on which programs were ranked, and category scores were combined to provide an overall ranking. Data cleaning was informed by content coding of qualitative write-in responses, by referencing the SIOP program database, and, in some cases, by corresponding with respondents.4 For instance, to enhance the accuracy and consistency of survey responses, we answered several emails asking us to clarify the meaning of various items.
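To make the rescaling and averaging steps concrete, the sketch below illustrates the procedure in Python. It is a minimal illustration only: the item names and values are hypothetical stand-ins, not the actual survey data or the authors' code.

```python
import pandas as pd

# Hypothetical responses for three programs (illustrative values only).
raw = pd.DataFrame({
    "internship_points": [5, 3, 0],  # ordinal item scored 0-5
    "pct_interning": [80, 55, 20],   # reported as a percentage
    "pct_graduating": [95, 90, 70],  # reported as a percentage
})

scaled = pd.DataFrame()
# Ordinal items: rescale assigned points to a 0-1 metric (per the article,
# points were divided by the number of response options on each item).
scaled["internship"] = raw["internship_points"] / 5
# Proportion items: divide percentages by 100.
scaled["interning"] = raw["pct_interning"] / 100
scaled["graduating"] = raw["pct_graduating"] / 100

# Category score = average of that dimension's 0-1 items; the overall
# score averages the four category scores (only one is shown here).
category_score = scaled.mean(axis=1)
ranking = category_score.rank(ascending=False).astype(int)
print(pd.DataFrame({"score": category_score.round(2), "rank": ranking}))
```

Because every item ends up on the same 0-to-1 metric, a simple unweighted mean yields comparable category scores across programs.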
Results
Means, standard deviations, and intercorrelations among dimensions are presented in Table 1. Programs were ranked on each of the scale facets as well as overall. The top 20 rankings for each dimension are presented in Tables 2 through 5. As indicated earlier, we also regressed Student Accomplishments/Information on each of the other facets: Applied Experience (β = .29, p = .02, R² = .08), Curriculum (β = .29, p = .02, R² = .09), and Faculty Experience/Information (β = .35, p = .02, R² = .12). Each significantly predicted Student Accomplishments/Information. The top 20 overall rankings (an average across the four dimensions) are presented in Table 6.
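For readers wishing to reproduce this type of analysis with their own program data, the following is a hedged sketch of one of the three simple regressions. The program-level scores are not public, so the values below are illustrative stand-ins.

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import zscore

# Illustrative stand-in scores for six programs (not the actual data).
scores = pd.DataFrame({
    "applied": [0.94, 0.93, 0.90, 0.72, 0.59, 0.41],
    "student": [0.97, 0.84, 0.83, 0.77, 0.66, 0.52],
})

# Standardizing both variables first makes the OLS slope a standardized
# beta, comparable to the betas reported above (e.g., beta = .29 for
# Applied Experience in the actual analysis).
X = sm.add_constant(zscore(scores["applied"]))
fit = sm.OLS(zscore(scores["student"]), X).fit()
print(f"beta = {fit.params[1]:.2f}, R^2 = {fit.rsquared:.2f}, p = {fit.pvalues[1]:.3f}")
```

In a simple (one-predictor) regression, the standardized beta equals the zero-order correlation, which is why the betas reported above match the corresponding correlations in Table 1.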
Table 1
Means, Standard Deviations, and Intercorrelations Among Dimensions

|                                        | M   | SD  | 1     | 2     | 3     | 4     |
|----------------------------------------|-----|-----|-------|-------|-------|-------|
| 1. Applied experience                  | .59 | .20 |       |       |       |       |
| 2. Curriculum                          | .71 | .15 | .31** |       |       |       |
| 3. Faculty experience/information      | .52 | .17 | .28*  | .18   |       |       |
| 4. Student accomplishments/information | .66 | .16 | .29*  | .29*  | .35** |       |
| 5. Overall                             | .62 | .11 | .73** | .63** | .67** | .69** |

Note. *p < .01. **p < .001.
Table 2
Top 20 Institutions/Programs for the Applied Experience Dimension

| Rank | Institution/Program                          | Score | z    |
|------|----------------------------------------------|-------|------|
| 1    | Minnesota State University, Mankato          | .94   | 1.83 |
| 2    | University of West Florida                   | .93   | 1.77 |
| 3    | University of Detroit Mercy                  | .90   | 1.61 |
| 4    | Middle Tennessee State University            | .90   | 1.60 |
| 5    | St Mary's University                         | .90   | 1.58 |
| 6    | Southern Illinois University at Edwardsville | .89   | 1.54 |
| 7    | San Francisco State University               | .88   | 1.51 |
| 8    | Saint Cloud State University                 | .88   | 1.51 |
| 9    | Valdosta State University                    | .87   | 1.43 |
| 10   | Florida Institute of Technology              | .83   | 1.26 |
| 11   | George Mason                                 | .83   | 1.23 |
| 12   | William James College                        | .82   | 1.20 |
| 13   | Angelo State University                      | .81   | 1.15 |
| 14   | University of Texas at Arlington             | .78   | .97  |
| 15   | Roosevelt University                         | .76   | .89  |
| 16   | University of Tennessee at Chattanooga       | .75   | .83  |
| 17   | California State University, Long Beach      | .75   | .83  |
| 18   | Western Kentucky University                  | .72   | .67  |
| 19   | Emporia State University                     | .72   | .66  |
| 20   | Springfield College                          | .71   | .62  |

Note. Score = mean facet score (0-1 metric); z = standardized score.
Table 3
Top 20 Institutions/Programs for the Curriculum Dimension

| Rank | Institution/Program                          | Score | z    |
|------|----------------------------------------------|-------|------|
| 1    | University of Tennessee at Chattanooga       | .94   | 1.50 |
| 2    | Middle Tennessee State University            | .90   | 1.25 |
| 3    | University of New Haven                      | .88   | 1.13 |
| 4    | Appalachian State University                 | .88   | 1.12 |
| 5    | California State University, San Bernardino  | .88   | 1.10 |
| 6    | Hofstra University                           | .87   | 1.02 |
| 7    | Carlos Albizu University-Online              | .85   | .91  |
| 8    | Minnesota State University, Mankato          | .84   | .88  |
| 9    | Carlos Albizu University-Miami               | .84   | .85  |
| 10   | South Dakota State University                | .84   | .84  |
| 11   | Iona College                                 | .83   | .77  |
| 12   | San Francisco State University               | .82   | .70  |
| 13   | Salem State University                       | .82   | .68  |
| 14   | Xavier University                            | .81   | .67  |
| 15   | St Mary's University                         | .81   | .66  |
| 16   | University of Georgia                        | .81   | .64  |
| 17   | Florida Institute of Technology              | .80   | .61  |
| 18   | University of Detroit Mercy                  | .80   | .60  |
| 19   | Valdosta State University                    | .80   | .60  |
| 20   | Lamar University                             | .80   | .59  |
Table 4
Top 20 Institutions/Programs for the Faculty Experience/Information Dimension

| Rank | Institution/Program                          | Score | z    |
|------|----------------------------------------------|-------|------|
| 1    | Illinois State University                    | .90   | 2.22 |
| 2    | Florida Institute of Technology              | .76   | 1.39 |
| 3    | University of Tulsa                          | .75   | 1.34 |
| 4    | Appalachian State University                 | .74   | 1.28 |
| 5    | University at Albany, SUNY                   | .73   | 1.23 |
| 6    | Saint Mary's University                      | .71   | 1.13 |
| 7    | Minnesota State University, Mankato          | .70   | 1.04 |
| 8    | George Mason                                 | .70   | 1.04 |
| 9    | Southern Illinois University at Edwardsville | .70   | 1.04 |
| 10   | Elmhurst College                             | .70   | 1.02 |
| 11   | Middle Tennessee State University            | .70   | 1.02 |
| 12   | Eastern Kentucky University                  | .69   | .97  |
| 13   | Central Michigan University                  | .69   | .96  |
| 14   | Colorado State University                    | .68   | .90  |
| 15   | Radford State University                     | .67   | .85  |
| 16   | University of Oklahoma                       | .67   | .85  |
| 17   | Wayne State University                       | .66   | .80  |
| 18   | Western Kentucky University                  | .66   | .79  |
| 19   | Xavier University                            | .65   | .75  |
| 20   | San Diego State University                   | .64   | .71  |
Table 5
Top 20 Institutions/Programs for the Student Accomplishments and Information Dimension

| Rank | Institution/Program                          | Score | z    |
|------|----------------------------------------------|-------|------|
| 1    | South Dakota State University                | 1.00  | 2.15 |
| 2    | Middle Tennessee State University            | .97   | 1.96 |
| 3    | Southern Illinois University at Edwardsville | .94   | 1.76 |
| 4    | University of Minnesota Duluth               | .92   | 1.65 |
| 5    | Central Michigan University                  | .87   | 1.34 |
| 6    | San Diego State University                   | .87   | 1.31 |
| 7    | Saint Mary's University                      | .86   | 1.23 |
| 8    | Lamar University                             | .85   | 1.19 |
| 9    | Appalachian State University                 | .84   | 1.14 |
| 10   | Illinois State University                    | .84   | 1.13 |
| 11   | Minnesota State University, Mankato          | .84   | 1.13 |
| 12   | Western Kentucky University                  | .83   | 1.05 |
| 13   | University of Central Florida                | .81   | .92  |
| 14   | University of Nebraska at Omaha              | .80   | .89  |
| 15   | Florida Institute of Technology              | .79   | .82  |
| 16   | University of Oklahoma                       | .79   | .81  |
| 17   | Eastern Kentucky University                  | .77   | .68  |
| 18   | Radford State University                     | .77   | .66  |
| 19   | Indiana University Purdue University Indianapolis | .76 | .59 |
| 20   | University of Tulsa                          | .75   | .57  |
Table 6
Overall Rankings of Top 20 Institutions/Programs

| Rank | Institution/Program                          | Score | z    |
|------|----------------------------------------------|-------|------|
| 1    | Middle Tennessee State University            | .87   | 2.14 |
| 2    | Minnesota State University, Mankato          | .83   | 1.84 |
| 3    | Florida Institute of Technology              | .80   | 1.54 |
| 4    | Southern Illinois University at Edwardsville | .77   | 1.33 |
| 5    | University of West Florida                   | .76   | 1.23 |
| 6    | University of Tennessee at Chattanooga       | .76   | 1.19 |
| 7    | Saint Cloud State University                 | .76   | 1.18 |
| 8    | Appalachian State University                 | .75   | 1.15 |
| 9    | George Mason                                 | .75   | 1.10 |
| 10   | Western Kentucky University                  | .75   | 1.09 |
| 11   | St Mary's University                         | .74   | 1.06 |
| 12   | San Francisco State University               | .74   | 1.05 |
| 13   | University of Detroit Mercy                  | .73   | .93  |
| 14   | University of Central Florida                | .72   | .90  |
| 15   | California State University, San Bernardino  | .71   | .80  |
| 16   | Central Michigan University                  | .71   | .76  |
| 17   | San Diego State University                   | .70   | .73  |
| 18   | Eastern Kentucky University                  | .70   | .72  |
| 19   | Illinois State University                    | .69   | .64  |
| 20   | Angelo State University                      | .69   | .63  |
Discussion
This project offers a contribution to SIOP and to prospective graduate students. Past program rankings have primarily assessed academic factors that are more relevant to doctoral-level education. A strength of the current study is its focus on applied factors, which are more applicable to the mission of I-O master's programs and more pertinent to graduates who are likely to become practitioners. Also, besides providing an overall program ranking, our results offer rankings on four specific, relevant program dimensions. A major goal of our survey was to give prospective students a useful way to view the quality of master's programs on multiple dimensions. As such, this article provides students with a resource for choosing the program that best fits them, based on the dimensions they consider most important. That is, they can decide whether to focus on overall program rankings or to emphasize specific program components (e.g., applied experience, curriculum). Our findings can also serve as a benchmark for I-O faculty who want to build or enhance their programs, while allowing them to highlight key, positive programmatic features to appropriate audiences.
Strengths and Limitations
Our project has limitations as well as strengths. Ideally, formative measures should be identified and validated with criterion-related validity evidence using structural equation modeling (e.g., Diamantopoulos, Riefler, & Roth, 2008). However, our sample size and project scope did not permit such a validation effort. Regression results and facet intercorrelations, however, provide some evidence of criterion-related validity. The content of some items could be considered related to more than one category. This was especially true for the "applied experience" and "curriculum" dimensions. For instance, questions about practicums, internships, and the number of courses requiring formal presentations were included under the "applied experience" category, although they could also be considered part of a program's curriculum. Also, as noted earlier, our student accomplishments/information dimension contained program factors (e.g., availability of graduate assistantships) as well as program "outcomes," such as gaining employment after graduation and graduation rates. Our regression analysis partly addressed this issue by using student accomplishments/information as an outcome variable. Regardless of how this dimension is conceptualized, its contents are likely to be weighed by students in evaluating the quality of I-O master's programs.
Our survey required answers to objective items. However, it is possible that some I-O coordinators interpreted the meaning of various questions differently (e.g., what qualifies as an internship or an applied project, the extent to which various topics are covered in the curriculum). Also, certain data may not have been readily available to coordinators (e.g., what percentage of graduates obtain an I-O-related job within a year, what percentage of I-O students presented papers at conferences). If so, the accuracy of the responses could be an issue. Regarding the latter point, I-O programs would benefit if their coordinators established mechanisms to track and assess program quality and identify areas for improvement. Finally, some programs were too new to adequately answer many of the survey questions (e.g., graduation rate) and therefore were not included in our rankings.
A strength of our project was its focus on objective, quantifiable factors rather than the potentially biased opinions of satisfied (or dissatisfied) respondents. We tried to include data that most program coordinators are likely already collecting as they assess and track their own programs. Finally, at 54%, our response rate was quite respectable, adding to the representativeness of our findings. With more and more programs competing for qualified students, we expect that future updates to the rankings will include data from all available programs. The survey results could also help guide decision makers in taking steps to improve specific areas of their programs (e.g., curriculum, applied experiences). It is our hope that the discrete, objective information contained in our results will be combined with additional subjective data to allow prospective students to identify programs that best fit their interests and career goals.
Notes
1 The authors wish to thank Skye Evans and Michael DeNoia for their assistance on this project.
2 Eleven surveys were partially completed and were not included in the analyses.
3 In one case, an international university’s response had to be omitted for the curriculum dimension because they used a points system rather than a credit hour system; no conversion was available.
4 Data were extensively cleaned prior to analysis. Additional detail on specific data cleaning decisions is available upon request from the authors.
References
Beiler, A. A., Zimmerman, L. M., Doerr, A. J., & Clark, M. A. (2014). An evaluation of research productivity among I-O psychology doctoral programs. The Industrial-Organizational Psychologist, 51(3), 40-52. http://0-www-siop-org.library.alliant.edu/tip/Beiler_et_al.pdf
Bulger, C. A., Horvath, M., & Zickar, M. (2006). Industrial-organizational (I-O) psychology graduate school rankings: A guide for I-O graduate school applicants. The Industrial-Organizational Psychologist, 43, 121-131.
Byrne, Z. S., Hayes, T. L., Mort McPhail, S., Hakel, M. D., Cortina, J. M., & McHenry, J. J. (2014). Educating industrial–organizational psychologists for science and practice: Where do we go from here? Industrial and Organizational Psychology, 7(1), 2-14. doi:10.1111/iops.12095
Diamantopoulos, A., Riefler, P., & Roth, K. P. (2008). Advancing formative measurement models. Journal of Business Research, 61, 1203-1218.
Dickson, M., & Mullins, M. (2016, April). Formalized programs for providing graduate students with professional practice experience. Programs for Graduate Student Professional Practice. Panel discussion presented at the 31st Annual Conference of the Society for Industrial and Organizational Psychology, Anaheim, CA.
Erffmeyer, E. S., & Mendel, R. M. (1990). Master's level training in industrial/organizational psychology: A case study of the perceived relevance of graduate training. Professional Psychology: Research and Practice, 21(5), 405. doi:10.1037/0735-7028.21.5.405
Gibby, R. E., Reeve, C. L., Grauer, E., Mohr, D., & Zickar, M. J. (2002). The top I-O psychology doctoral programs of North America. The Industrial-Organizational Psychologist, 39(4), 17-25. http://0-www-siop-org.library.alliant.edu/tip/backissues/TIPApr02/02gibby.aspx
Hays-Thomas, R., & Kass, S. J. (2003). Integrating classroom knowledge and application: The Industrial/Organizational Psychology internship presentation. Teaching of Psychology, 30(1), 70-71.
Kottke, J. L., Shoenfelt, E. L., & Stone, N. J. (2014). Educating industrial–organizational psychologists: Lessons learned from master's programs. Industrial and Organizational Psychology, 7, 26-31. doi:10.1111/iops.12099
Kraiger, K., & Abalos, A. (2004). Rankings of graduate programs in I-O psychology based on student ratings of quality. The Industrial-Organizational Psychologist, 42(1), 28-43.
Lowe, R. H. (1993). Master's programs in industrial-organizational psychology: Current status and a call for action. Professional Psychology: Research and Practice, 24, 27-34.
Oliver, J., Blair, C. A., Gorman, A., & Woehr, D. J. (2005). Research productivity of I-O psychology doctoral programs in North America. The Industrial-Organizational Psychologist, 43(1), 55–63. https://0-www-siop-org.library.alliant.edu/tip/backissues/July05/pdf/Sheridan%20PDFs/431_055to063.pdf
Payne, S. C., Succa, C. A., Maxey, T. D., & Bolton, K. R. (2001). Institutional representation in the SIOP conference program: 1986-2000. The Industrial-Organizational Psychologist, 39(1), 53-60. http://0-www-siop-org.library.alliant.edu/tip/backissues/TipJul01/12payne.aspx
Schneider, B. (1987). The people make the place. Personnel Psychology, 40, 437-453.
Schneider, S. K., Piotrowski, C., & Kass, S. J. (2007). Training masters students through consulting experiences: Benefits and pitfalls. Organization Development Journal, 25, 47-55.
Shoenfelt, E. L., Kottke, J. L., & Stone, N. J. (2012). Master's and undergraduate I/O internships: Data-based recommendations for successful experiences. Teaching of Psychology, 39(2), 100–106. doi:10.1177/0098628312437724
Shoenfelt, E. L., Stone, N. J., & Kottke, J. L. (2013). Internships: An established mechanism for increasing employability. Industrial and Organizational Psychology: Perspectives on Science and Practice, 6, 24–28. doi:10.1111/iops.12004
Society for Industrial and Organizational Psychology, Inc. (2016). Guidelines for education and training in industrial-organizational psychology. Bowling Green, OH: Author. http://0-www-siop-org.library.alliant.edu/Instruct/SIOP_ET_Guidelines_2017.pdf
Tett, R. P., Walser, B., Brown, C., Simonet, D. V., & Tonidandel, S. (2013). 2011 SIOP graduate program benchmarking survey: Part 3: Curriculum and competencies. The Industrial-Organizational Psychologist, 50(4), 69–90.
Trahan, W. A., & McAllister, H. A. (2002). Master's level training in industrial/organizational psychology: Does it meet the SIOP Guidelines? Journal of Business and Psychology, 16(3), 457–465.
Winter, J. L., Healy, M. C., & Svyantek, D. J. (1995). North America’s top I-O psychology/doctoral programs: U.S. News and World Report revisited. The Industrial-Organizational Psychologist, 33(1), 54-58.
Appendix A
Items Used in Rankings by Facet
Applied Dimension
1. Responses were combined on the following two questions:
a. Does your I-O program contain a formalized, applied internship within your curriculum?
b. How many hours are required to successfully complete the internship? If variable, provide an average. Type in number of hours.
2. Typically, what percent of students perform an internship?
3. Does your program allow for students to enroll in a practicum?
4. Does your program have a designated unit (e.g., consulting clinic, center) to acquire consulting contracts and/or grants?
5. How many courses in your program, including an internship, if applicable, involve students conducting applied projects (e.g., job analysis, training programs, organizational development) outside of the classroom?
6. How many courses in your program require formal presentations (group or individual) designed for applied audiences?
Curriculum Dimension
1. Responses were combined on the following two questions:
a. How many total credit hours are required for your I-O MA/MS degree? Please indicate in semester hours (1.5 quarter hours = 1 semester hour).
b. Given the number of hours in your program, what percent of MA/MS students graduate in the expected timeframe (e.g., “on time”)?
2. How many I-O-related hours (including research methods and statistics) are required for your I-O MA/MS degree? Please indicate in semester hours (1.5 quarter hours = 1 semester hour).
3. To what extent are the following topics covered in your program? Use the guidelines listed below in providing your answers: (0 = never: not covered at all, .5 = somewhat, 1 = extensively). [job analysis, personnel recruitment/selection, training and development, performance appraisal, job evaluation/compensation, employment law, work motivation, work attitudes, work groups/teams, leadership/management, judgment/decision making, organizational development, organizational theory, work/family, work stress, human factors, consulting/business skills, workforce diversity, workforce aging, individual differences in the workplace]
Faculty Information/Experience
1. Faculty to student ratio
a. How many I-O faculty teach in your I-O program? (three-quarter appointments count as .75; half-time appointments count as .5; one-third appointments count as .3; Do not count adjunct instructors)
b. This was divided by responses to the following question: What number of students typically enter your I-O program each year?
2. What number of I-O faculty in your program have worked on a consulting project in the past 5 years? (This was divided by the number of I-O faculty who teach)
3. How many I-O faculty have supervised I-O students on external consulting projects? (This was divided by the number of students enrolled annually)
4. How many total articles have been published by your I-O faculty in refereed journals from 2012 to 2016, including "in press" articles? (See the scoring sketch following this list.)
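As an illustration only, the sketch below (with hypothetical numbers) shows how the divisions described in items 1-3 above yield the values used for this facet; it is not the authors' scoring code.

```python
# Hypothetical program values (illustrative only).
io_faculty_fte = 4.75      # item 1a: e.g., four full-time faculty plus one .75 appointment
entering_students = 12     # item 1b: typical annual intake
consulting_faculty = 3     # item 2: faculty with a consulting project in the past 5 years
supervising_faculty = 2    # item 3: faculty who supervised external consulting projects

ratio = io_faculty_fte / entering_students              # item 1: faculty-to-student ratio
prop_consulting = consulting_faculty / io_faculty_fte   # item 2, per the note above
prop_supervising = supervising_faculty / entering_students  # item 3, per the note above
print(round(ratio, 2), round(prop_consulting, 2), round(prop_supervising, 2))
```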
Student Accomplishments/Information
1. Of the graduate students who do not pursue doctoral degrees, what percentage obtain work in an I-O-related job within a year after graduation?
2. What percent of current I-O graduate students are active participants in I-O-related student chapters (e.g., SHRM, ATD, IOPSA)?
3. What percent of current I-O graduate students received assistantships?
4. Of those who enroll in your program, what percent of your I-O students graduate?
5. What percentage of your MA/MS students have presented a paper or poster at a regional, national, or international conference in the past 5 years?