Opinion: Rethinking How We Rate and Rank MLIS Programs

Throughout the United States and Canada, there are more than 63 ALA-accredited programs offering advanced degrees in library and information science. While the number of programs has grown over the years, the field has yet to develop any significant, rigorous measures of evaluation to assess them. Even as interest in LIS education grows, the tools for determining which programs will match a student’s goals or establishing a hierarchy of quality remain stuck in neutral.

Note: This article has been updated to reflect a previously incorrect figure in the table.

Historically, many LIS students chose to attend a particular program because it was geographically close, so, for most, the decision was relatively easy. Today, however, thanks to online education, many programs are national, providing ease of access and in some cases ease of affordability as well. San José State, CA; Emporia, KS; Clarion, PA; and Texas Woman’s University, Denton, all have quite large distance education programs.

Attendance, budgets grow unevenly

Over the past decade, enrollment has grown by 1,823 students, from 14,683 to 16,506, a rate of just over one percent per year. This growth was not equally shared, however: 25 programs declined in enrollment, six of which together lost a total of 1,076 students over the decade, while 11 programs gained a total of 3,056 students.
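
Those figures are easy to verify. A minimal sketch in Python, using only the head counts quoted above (the annualized figure treats the decade as compound growth):

```python
# Check of the enrollment figures cited above; the start and end
# head counts are the ones quoted in this paragraph.
start, end, years = 14_683, 16_506, 10

overall_growth = (end - start) / start          # ~0.124, i.e., 12.4% over the decade
annual_rate = (end / start) ** (1 / years) - 1  # ~0.0118, just over 1% per year

print(f"Overall growth: {overall_growth:.1%}")   # 12.4%
print(f"Annualized rate: {annual_rate:.2%}")     # 1.18%
```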

Not all demographic groups have shared equally in the expansion of head counts, either. Minority enrollments grew by 37 percent over the past ten years. This might seem like a large gain had the starting numbers not been so small. In 2002, minority enrollment was 1,698 students, representing 11.6 percent of enrollment. By 2012, minority enrollment stood at 14.1 percent of all MLIS students. This is still a long way from the 36.2 percent of the U.S. population comprising people of color in 2011, according to the Census Bureau, or even the 22.7 percent of master’s degrees awarded to people of color in 2007, according to the National Center for Education Statistics.
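
Those percentages hang together, as a quick consistency check shows (all inputs are numbers quoted in this and the preceding paragraph):

```python
# Consistency check of the minority-enrollment figures above.
minority_2002 = 1_698        # 11.6 percent of 2002 enrollment
total_2012 = 16_506          # total MLIS enrollment in 2012, from above

minority_2012 = minority_2002 * 1.37      # 37% growth -> ~2,326 students
share_2012 = minority_2012 / total_2012   # ~0.141
print(f"2012 minority share: {share_2012:.1%}")  # 14.1%, matching the text
```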

Alongside the inequality in head-count gains and losses, there is tremendous inequality in the budgets of LIS programs. Over the past decade, nine programs each had budget increases of more than $5 million, totaling over $84 million, while 17 programs either lost money or received an increase of less than $500,000.

Perception, standards, competencies

Historically, there have been two means of evaluating LIS programs. The first, popularized by Herb White, dean of the LIS program at Indiana University, was through perception studies: asking faculty and administrators in LIS programs which programs were the best. White’s three studies using this technique produced consistent results, showing that respondents preferred programs that were large and offered doctoral options. This methodology disappeared for a while, only to reappear, slightly modified, in rankings from U.S. News & World Report, which gathers data on program quality via a survey sent to three upper-echelon members of each LIS program, asking them to rank each school on a scale of one to five. Compared to the ranking system for business programs, which takes into account reviews from recruiters, salary and employment figures for graduates, and test scores and grades of incoming students alongside peer reviews, the LIS program rankings from one of the leading voices in academic ratings remain fairly primitive.

The second method used to evaluate programs is accreditation, by which programs are held to and measured against standards. The problem is that the ALA Standards for Accreditation are liminal; they are a threshold, designed so as not to be tripped over. They are meant to be inclusive, not exclusive, making them a poor bar at best for judging program quality.

At the same time, library education has been under attack for not properly preparing its students for their subsequent job responsibilities. These attacks have strengthened interest in a competency movement: ALA and its divisions, along with other organizations, have prepared lists of competencies that new graduates should have. Moreover, ALA’s Committee on Accreditation (COA), in its Standards, requires LIS programs to “take into consideration” the various competency statements. LIS programs, we presume, now teach to these competencies. Neither the value of the competencies nor their ability to address the pressing financial and political needs of the field has been demonstrated.

All of this raises an important question: How dependent is the ranking of a program on the ranking of its home university? If programs are ranked highly because they are in large, highly ranked universities (and we have seen that they are), how useful are those rankings, given that being based at a large, renowned home university does not necessarily ensure a top-tier LIS program? On the other hand, if the rankings are accurate, the only purpose served by accreditation is to falsely equate the programs at ranked and nonranked universities, thus devaluing the profession itself.

A damaging admission

In a previous issue of LJ, the authors suggested public accountability measures that could be provided by programs and published by COA and that could serve as indicators of interest to prospective students and employers, mirroring the ranking standards already in place for other programs. One factor we suggest exploring is programs’ admissions requirements. In an era of performance-based budgeting, in which programs are rewarded for the number of students they teach, shrinking budgets have put pressure on programs to loosen admission standards in the interest of bringing in more tuition dollars.

Table: Top and Bottom Schools by Time to Degree

Ten programs with SHORTEST time to degree
School Degrees Awarded Total Head Count Duration in Years*
Long Island 384 274 0.71
Indiana 268 267 1.00
Pittsburgh 189 241 1.28
North Texas 485 664 1.37
Rutgers 178 269 1.51
Florida State 231 394 1.71
California (UCLA) 66 118 1.79
Texas Austin 124 222 1.79
Rhode Island 58 104 1.79
Emporia 157 284 1.81
Programs with LONGEST time to degree
School Degrees Awarded Total Head Count Duration in Years*
Syracuse 99 298 3.01
Clarion 153 476 3.11
Simmons 202 630 3.12
San José State 630 1,986 3.15
Kentucky 79 250 3.16
Hawaii 29 97 3.34
St. John's 20 67 3.35
Alabama 72 249 3.46
St. Catherine 48 185 3.85
*Duration calculated as head count divided by number of degrees awarded

We looked at 2012 admissions data for the ten programs that graduated over 43 percent of all MLIS students from ALA-accredited programs, according to COA trend data. One thing that became clear was that, by and large, academic admissions requirements are stated conditionally and are nearly nonexistent. Common criteria, like a 3.0 GPA, were met by nearly 80 percent of college graduates, suggesting that programs granting a huge number of degrees were not picky about the students they accepted.

Testing was far from universal as well. For many programs, GRE scores are required only when the GPA is low, and even programs that do require GRE scores do not demand exceptional ones from applicants. Annually, about 852 students reported GRE scores as MLIS applicants, suggesting that just over ten percent of the total applicants to MLIS programs report GRE scores to those institutions. Those of us who have served on admissions committees know that GPAs from different schools can mean very different things, making GRE scores an important and impartial indicator of student aptitude.
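
Taken at face value, those two figures imply the size of the total applicant pool; the sketch below is our back-of-envelope inference, not a published COA statistic:

```python
# Rough implication of the GRE figures above (an inference, not a COA figure).
gre_reporters = 852      # students reporting GRE scores annually
reporting_share = 0.10   # "just over ten percent" of all applicants

implied_applicants = gre_reporters / reporting_share
print(f"Implied MLIS applicants per year: ~{implied_applicants:,.0f}")  # ~8,520
```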

With today’s programs being funded on the basis of student credit hours and with most having no substantive admissions criteria, many schools are no longer acting as gatekeepers for the profession.

Further complicating the situation, those entering the profession today include fewer humanities majors and more business and education majors, raising questions as to whether today’s educational programs have adapted to the changing student demographics and whether they are properly serving this new breed of student.

Admissions standards for LIS programs in general need to be significantly strengthened to guarantee those programs are bringing the best and brightest into the profession and rigorously preparing them for their professional futures. Different skills, and not those in the various competency statements, need to be taught. This may result in fewer, and smaller, programs, but ones that better prepare students for the complex and changing world of modern librarianship.

What counts

COA tracks some statistics in its publication “Data on Program Performance,” including some that shed light on indicators like the number of students who graduate from a program and how long they take to do so. It records, by year, ALA degrees awarded and total ALA head count. Dividing head count by degrees awarded gives an indication of how long the average student takes to complete a course of study.
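
The calculation is simple to reproduce. A minimal sketch, using two rows from the table above:

```python
# Time-to-degree proxy described above: head count divided by degrees awarded.
# The figures are taken from two rows of the table.
programs = {
    "Long Island":    {"degrees": 384, "head_count": 274},
    "San José State": {"degrees": 630, "head_count": 1_986},
}

for school, data in programs.items():
    duration = data["head_count"] / data["degrees"]  # average years to complete
    print(f"{school}: {duration:.2f} years")         # 0.71 and 3.15, matching the table
```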

The table above puts the available data to use, showing the average length of time it might take a student to complete an MLIS program, a factor that can have major financial implications for students. However, because the data is self-reported by schools and is neither confirmed nor standardized by COA, these numbers offer far less value to prospective students than they could. By way of illustration, an earlier version of this story cited the University of Washington as having 746 students in 2012, an erroneous figure that remains embedded in the official ALA data despite efforts by UW to have it corrected. Another resource is LJ’s annual placements and salaries survey.

There does not seem to be any available data on the number of students enrolled in various courses within programs, how many students take courses online versus in person, or how many full-time versus part-time faculty members are engaged in instruction. At present, COA’s Standards devote only one sentence to distance education, which suggests that the field has a long way to go to keep up with the proliferation of online education and to ensure it meets rigorous standards. Currently, there is no way to determine the number of on-campus, online, or hybrid students or courses at LIS programs; neither ALISE nor COA collects such statistics. Nor is there a way for potential students to determine whether courses, especially core courses, are offered frequently enough for them to graduate in a timely manner.

A quantified proposal

Is there a way to make the COA Standards indicators of quality? The answer is yes, provided data such as that suggested above were collected, published, and written into the Standards. Whether schools would be willing to submit to this increased data gathering, and what effect it would have on the accreditation process, remains to be seen, but a larger body of available data would let prospective students make better informed decisions about their educational future.

Until such measures are implemented, there are some proxies that might be useful to prospective students.

  • While past performance is no guarantee of future results, schools can still be evaluated based on the students they graduate. All programs have a goal of educating students to be leaders in the profession, which could be measured by examining alumni output in the form of publications or presentations by graduates, or by looking at the positions those grads hold in professional organizations or their participation at state or national meetings.
  • Another indicator of interest to potential students is the university’s undergraduate graduation rate and its student loan default rate, both of which are tracked by the U.S. federal government. While these are undergraduate statistics, it is not clear that outcomes for graduate students at a particular university differ from those for undergraduates. Prospective students need to be cognizant of a university’s strengths to make sure they can complete their degrees in a timely fashion, and these statistics can be particularly useful for that.

Raising the bar

There is also a third possible means of evaluating program quality: certification of graduates by examination, similar to the bar examination lawyers must pass. This has not been tried in LIS education, seemingly because the problems appear insurmountable. The breadth of the field and of education within it makes this sort of examination difficult to picture, but that is no reason to write off an exam, or series of exams, that could be developed and administered by the relevant professional associations, such as the Association of College & Research Libraries, the Public Library Association, and the Special Libraries Association. The process would be expensive and time-consuming and would still leave many areas (rare books, preservation, archives and records management, digital libraries, and all areas of technology) uncovered. However, the difficulty of creating such a tool is not, by itself, sufficient reason not to explore its potential.

The authors write as believers in accreditation and see it as a process that can benefit programs by requiring them to examine themselves periodically, to the advantage of both the program and its constituents. As practiced now, though, accreditation is not really relevant to the needs of the profession or of students. A good beginning would be to require LIS programs to address the concerns noted above. The statistical indicators might help make accreditation relevant to prospective and current students as well as to prospective employers.

It has also become clear to the authors that while the statistics gathered by ALISE, COA, and LJ range far and wide, their quality, accuracy, and usefulness leave much to be desired. If we are to be publicly accountable, we as a profession need to do a much better job of collecting and publishing meaningful statistics and making them available to the newest members of our field. After all, how can we train new librarians to present those they serve with the best information available if we’re not willing to do the same for these graduates from day one?

Phil Mulvaney is Library Director Emeritus, Northern State University, Aberdeen, SD. Dan O’Connor is an Associate Professor, Department of Library & Information Science, Rutgers University, New Brunswick, NJ.



Kim

I'm curious about how to acquire admissions data. When I was looking at LIS programs I tried to find this information and was not successful. Are the LIS programs afraid to publish this data publicly?

Posted : Sep 01, 2014 06:54


Gabriel Richardson

Hi, great recommendations and a fascinating article; it'll be interesting to see if this is still the case in a few years' time.

Posted : Jul 24, 2014 04:23


Annie

I agree with some of the points raised above. I am an online MLIS student. I know that my degree will take longer than average because I am part time. When choosing my program I did look at entrance requirements and accreditation status. When choosing a program, you'll find that entrance requirements are either insanely easy or insanely difficult if you are a non-traditional student. Taking the GRE again was not a financial option for me; neither was trying to find letters of recommendation from professors, since I had been out of university for a decade.

We should also look at perceived value of the MLIS degree from the employer's point of view. Librarianship has become devalued in many markets because hiring a librarian is more expensive than hiring a "paraprofessional." In my area, the public and academic libraries tend to hire anyone with a degree (the academic library) and/or people with bookstore experience, feeling that it is commensurate with MLIS training. As someone who utilized these local library resources during a recent reference course for my MLIS, I can say these people are woefully inadequate in reference skills. I met person after person manning the reference desks who didn't know how to search, use their databases, or perform even the most basic of reference interviews. They were graduates of Google U, and it showed. Bean counters are saving money and hurting libraries by not hiring librarians. Eventually, it will start to cost them more and more.

We also forget that ranking programs is a necessary, yet terrible idea in general. Students who find they are attending a less than stellar program, because it was the program they could afford or for some other reason, feel the sting of being labeled as going to a "lackluster" program before they even get to show employers who they are and what they can do. Could this affect their performance in the job market? Do they try less hard, or do these rankings color the view of hiring managers such that they won't hire students from X University because it was ranked lower and they have a bias towards schools that were ranked higher? Ultimately, we should be looking at individual graduates on their merits, not their school. School rankings only do a disservice to graduates. I know many an Ivy League graduate who amounted to nothing, and I have seen greatness emerge from mediocrity.

Posted : Jun 25, 2014 09:28


Frumious Bandersnatch

Until the culture and inadequacies of the profession itself are addressed, addressing any perceived deficiency on the part of the schools is pointless. Bluntly put, it is wasted effort to discuss more rigorous and exclusive gateways to a profession where one can often expect to start out in a part time position making $14.00 an hour. If our academic rigor is substandard, so, by and large, is the compensation and regard we offer the degree holders.

Posted : Jun 18, 2014 12:54


knownever

It's ironic that the profession is concerned with increasing diversity in the ranks but simultaneously looks down on MLIS programs that have the very qualities that make them more accessible. Schools without GRE requirements make applying less expensive. Schools with longer durations to graduation may have these numbers (and we need some more investigation into why) because they make it easier for people who are working full time to pursue their MLIS.

Posted : Jun 16, 2014 09:59


YetAnotherUnderemployedLibrarian

Every time I read about people ranking MLIS schools I have to laugh. It's like trying to rank romance novels on their literary merit. I do not think that having a final exam would be beneficial either; most people can remember and regurgitate things on command. I do believe that MLIS schools need to have higher entrance standards. But money makes the world go round, so it's easy money for schools to let anyone off the street join what's supposedly a "graduate" program.

Posted : Jun 13, 2014 08:51

