2014-06-13

Read the related LIS Education article: “How To Choose Your Library School”

Throughout the United States and Canada, there are more than 63 ALA-accredited programs offering advanced degrees in library and information science. While the number of programs has grown over the years, the field has yet to develop any significant, rigorous measures of evaluation to assess them. Even as interest in LIS education grows, the tools for determining which programs will match a student’s goals or establishing a hierarchy of quality remain stuck in neutral.

Historically, many LIS students chose to attend a particular program because it was geographically close, so, for most, the decision was relatively easy. Today, however, thanks to online education, many programs are national, providing ease of access and in some cases ease of affordability as well. San José State, CA; Emporia, KS; Clarion, PA; and Texas Woman’s University, Denton, all have quite large distance education programs.

Attendance, budgets grow unevenly

Over the past decade, enrollment has grown by 1,823 students, from 14,683 to 16,506, a rate of just over one percent per year. This growth was not equally shared, however: 25 programs declined in enrollment, with six of those losing a combined 1,076 students over the decade, while 11 programs gained a combined 3,056 students.
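Those figures are easy to verify. A quick back-of-the-envelope sketch in Python, treating the per-year figure as a compound growth rate:

```python
# Check the enrollment growth cited above (figures from COA trend data).
start, end, years = 14_683, 16_506, 10

total_growth = end - start                      # 1,823 students
annual_rate = (end / start) ** (1 / years) - 1  # compound annual rate

print(f"Total growth: {total_growth:,} students")
print(f"Annualized rate: {annual_rate:.2%}")    # ~1.18% per year
```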

Not all demographic groups have shared equally in the expansion of head counts, either. Minority enrollments grew by 37 percent over the past ten years, a figure that would seem impressive had the baseline not been so small. In 2002, minority enrollment was 1,698 students, representing 11.6 percent of enrollment. By 2012, minority enrollment stood at 14.1 percent of all MLIS students. This is still a long way from the 36.2 percent of the U.S. population comprising people of color in 2011, according to the Census Bureau, or even the 22.7 percent of master’s degrees awarded to people of color in 2007, according to the National Center for Education Statistics.
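The 37 percent figure can be reconstructed from the shares above; a short sketch in Python, noting that the 2012 minority head count is derived from the 14.1 percent share rather than reported directly:

```python
# Reconstruct the 37 percent growth figure from the shares cited above.
minority_2002 = 1_698                   # reported for 2002
minority_2012 = round(0.141 * 16_506)   # ~2,327, derived from the 14.1% share

growth = minority_2012 / minority_2002 - 1
print(f"Minority enrollment growth, 2002-2012: {growth:.0%}")  # ~37%
```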

In addition to the unequal gains and losses in head count, there is tremendous inequality in the budgets of LIS programs. Over the past decade, nine programs each had increases of more than $5 million, totaling over $84 million, while 17 programs either lost money or received an increase of less than $500,000.

Perception, standards, competencies

Historically, there have been two means of evaluating LIS programs. The first, popularized by Herb White, dean of the LIS program at Indiana University, was perception studies: asking faculty and administrators in LIS programs which programs were the best. White’s three studies using this technique produced consistent results, showing that respondents preferred programs that were large and offered doctoral options. This methodology disappeared for a while, only to reappear, slightly modified, in rankings from U.S. News & World Report, which gathers data on program quality via a survey sent to three upper-echelon members of each LIS program, asking them to rate each school on a scale of one to five. Compared with the ranking system for business programs, which takes into account reviews from recruiters, salary and employment figures for graduates, and the test scores and grades of incoming students alongside peer reviews, the LIS rankings from one of the leading voices in academic ratings remain fairly primitive.

The second method used to evaluate programs is accreditation, by which programs are held to and measured against standards. The problem is that the ALA Standards for Accreditation are liminal; they are a threshold, designed so as not to be tripped over. They are meant to be inclusive, not exclusive, making them a poor bar at best for judging program quality.

At the same time, library education has been under attack for not properly preparing its students for their subsequent job responsibilities. These attacks have strengthened interest in a competency movement: ALA and its divisions, along with other organizations, have prepared lists of competencies that new graduates should have. Moreover, ALA’s Committee on Accreditation (COA), in its Standards, requires LIS programs to “take into consideration” the various competency statements. LIS programs, we presume, now teach to these competencies. Yet neither the value of the competencies nor their ability to address the pressing financial and political needs of the field has been demonstrated.

All of this raises an important question: How dependent is the ranking of a program on the ranking of its home university? If programs are ranked because they are in large, ranked universities (and we have seen that they are), how useful are those rankings, given that being based at a large, renowned home university does not necessarily ensure a top-tier LIS program? On the other hand, if the rankings are accurate, the only purpose served by accreditation is to falsely equate the programs at ranked and nonranked universities, thus devaluing the profession itself.

A damaging admission

In a previous issue of LJ, the authors suggested public accountability measures that could be provided by programs and published by COA, serving as indicators of interest to prospective students and employers and mirroring the ranking standards already in place for other disciplines. One of those factors worth exploring is programs’ admissions requirements. In an era of performance-based budgeting, in which programs are rewarded for the number of students they teach, shrinking budgets have put pressure on programs to loosen admission standards in the interest of bringing in more tuition dollars.

Table: Top and Bottom Schools by Time to Degree

Ten programs with SHORTEST time to degree

School            | Degrees Awarded | Total Head Count | Duration in Years*
Long Island       | 384             | 274              | 0.71
Indiana           | 268             | 267              | 1.00
Pittsburgh        | 189             | 241              | 1.28
North Texas       | 485             | 664              | 1.37
Rutgers           | 178             | 269              | 1.51
Florida State     | 231             | 394              | 1.71
California (UCLA) | 66              | 118              | 1.79
Texas Austin      | 124             | 222              | 1.79
Rhode Island      | 58              | 104              | 1.79
Emporia           | 157             | 284              | 1.81

Ten programs with LONGEST time to degree

School            | Degrees Awarded | Total Head Count | Duration in Years*
Syracuse          | 99              | 298              | 3.01
Clarion           | 153             | 476              | 3.11
Simmons           | 202             | 630              | 3.12
San José State    | 630             | 1,986            | 3.15
Kentucky          | 79              | 250              | 3.16
Hawaii            | 29              | 97               | 3.34
St. Johns         | 20              | 67               | 3.35
Alabama           | 72              | 249              | 3.46
St. Catherine     | 48              | 185              | 3.85
Washington        | 130             | 746              | 5.74

*Duration calculated as head count divided by number of degrees awarded

We looked at 2012 admissions data for the ten programs that graduated over 43 percent of all ALA-accredited MLIS students according to COA trend data statistics. One thing that became clear was that, by and large, academic admissions requirements are stated conditionally and are nearly nonexistent. Common criteria, like a 3.0 GPA, were met by nearly 80 percent of college graduates, suggesting that programs granting a huge number of degrees were not picky about the students they accepted.

Testing was far from universal as well. For many programs, GRE scores are required only when the GPA is low, and even programs that do require GRE scores do not demand exceptional ones from applicants. Annually, about 852 students reported GRE scores as MLIS applicants, suggesting that just over ten percent of all applicants to MLIS programs report GRE scores to those institutions. Those of us who have served on admissions committees know that GPAs from different schools can mean very different things, making GRE scores an important and impartial indicator of student aptitude.

With today’s programs being funded on the basis of student credit hours and with most having no substantive admissions criteria, many schools are no longer acting as gatekeepers for the profession.

Further complicating the situation, those entering the profession today include fewer humanities and more business and education majors, raising questions as to whether the educational programs of today have adapted to the changing student demographics and whether they are properly serving this new breed of student.

Admissions standards for LIS programs in general need to be significantly strengthened to guarantee those programs are bringing the best and brightest into the profession and rigorously preparing them for their professional futures. Different skills need to be taught, not those in the various competency statements. This may result in fewer, and smaller, programs, but ones that better prepare students for the complex and changing world of modern librarianship.

What counts

COA tracks some statistics in its publication “Data on Program Performance,” including some that shed light on indicators such as the number of students who graduate from a program and how long they take to do so. It records, by year, ALA degrees awarded and total ALA head count; dividing head count by degrees awarded gives an indication of how long the average student takes to complete a course of study. The table above shows, on average, the length of time it might take a student to complete an MLIS program, a factor that can have major financial implications for students. Another resource is LJ’s annual placements and salaries survey.
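That proxy is simple to reproduce. A minimal sketch in Python using three rows from the table above:

```python
# Time-to-degree proxy from the table: head count divided by degrees awarded.
programs = {
    "Long Island": (384, 274),       # (degrees awarded, total head count)
    "San José State": (630, 1_986),
    "Washington": (130, 746),
}

for school, (degrees, head_count) in programs.items():
    print(f"{school}: {head_count / degrees:.2f} years")
# Long Island: 0.71, San José State: 3.15, Washington: 5.74
```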

There does not seem to be any available data on the number of students enrolled in various courses within programs, how many students take courses online versus in person, or how many full-time versus part-time faculty members are engaged in instruction. At present, COA’s Standards devote only one sentence to distance education, which suggests that the field has a long way to go to keep up with the proliferation of online education and to ensure it meets rigorous standards. Currently, there is no way to determine the number of on-campus, online, or hybrid students or courses at LIS programs; neither ALISE nor COA collects such statistics. Nor is there a way for potential students to determine whether courses, especially core courses, are offered frequently enough for them to graduate in a timely manner.

A quantified proposal

Is there a way to make the COA Standards indicators of quality? The answer is yes, if data such as that suggested above were collected, published, and written into the Standards themselves. Whether schools would be willing to submit to this increased data gathering, and what effect it would have on the accreditation process, remains to be seen, but a larger body of available data would let prospective students make better informed decisions about their educational futures.
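To make the proposal concrete, the sketch below shows one possible shape for such published indicators. Every field name is illustrative, our own invention rather than part of any existing COA or ALISE schema:

```python
from dataclasses import dataclass

# Hypothetical record for the public accountability data proposed above;
# no such schema exists today, and all field names are illustrative.
@dataclass
class ProgramIndicators:
    school: str
    degrees_awarded: int
    total_head_count: int
    online_enrollment: int       # currently uncollected, per the discussion above
    full_time_faculty: int
    part_time_faculty: int
    core_courses_per_year: int   # how often required courses are offered

    @property
    def years_to_degree(self) -> float:
        """The table's proxy: head count divided by degrees awarded."""
        return self.total_head_count / self.degrees_awarded
```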

Until such measures are implemented, there are some proxies that might be useful to prospective students.

While past performance is no guarantee of future results, schools can still be evaluated on the students they graduate. All programs share the goal of educating students to be leaders in the profession, a goal that could be measured by examining alumni output in the form of publications or presentations, the positions graduates hold in professional organizations, or their participation at state or national meetings.

Other indicators of interest to potential students are the undergraduate graduation rate of the university and its student loan default rate, both tracked by the U.S. federal government. While these are undergraduate statistics, there is no clear evidence that outcomes for graduate students at a particular university differ from those of undergraduates. Prospective students need to be cognizant of a university’s strengths to make sure they can complete their degrees in a timely fashion, and these statistics can be particularly useful there.

Raising the bar

There is also a third means of evaluating program quality: certification of graduates by means of an examination, similar to the bar examination lawyers must pass. This has not been tried in LIS education, apparently because the obstacles appear insurmountable. The breadth of the field and of education within it makes this sort of examination difficult to picture, but an exam, or series of exams, could conceivably be developed and administered by the relevant professional associations, such as the Association of College & Research Libraries, Public Library Association, and Special Libraries Association. The process would be expensive and time-consuming and would still leave many areas (rare books, preservation, archives and records management, digital libraries, and all areas of technology) uncovered. However, the difficulty of creating such a tool is not, by itself, sufficient reason not to explore its potential.

The authors write as believers in accreditation and see it as a process that can benefit programs, prompting them to examine themselves periodically to the advantage of both the program and its constituents. As practiced now, though, accreditation is not really relevant to the needs of the profession or of students. A good beginning would be to require LIS programs to address the concerns noted above. The statistical indicators might help make accreditation relevant to prospective and current students as well as to prospective employers.

It has also become clear to the authors that while the statistics gathered by ALISE, COA, and LJ range far and wide, their quality, accuracy, and usefulness leave much to be desired. If we are to be publicly accountable, we as a profession need to do a much better job of collecting and publishing meaningful statistics and making them available to the newest members of our field. After all, how can we train new librarians to present those they serve with the best information available if we’re not willing to do the same for these graduates from day one?

Phil Mulvaney is Library Director Emeritus, Northern State University, Aberdeen, SD. Dan O’Connor is an Associate Professor, Department of Library & Information Science, Rutgers University, New Brunswick, NJ.
