
At the Library Assessment Conference in August, Rick and I met Kathy Clarke, librarian and associate professor at James Madison University Libraries. Kathy oversees JMU's information literacy assessment and described the program in a lightning talk titled "The Role of a Required Information Literacy Competency Exam in the First College Year: What Test Data Can, Cannot, and Might Reveal." We asked Kathy for her thoughts about standardized assessments. Here's what she shared with us:

What’s the future of an information literacy test?

Kathy Clarke

James Madison University has been a pioneer of multiple-choice information literacy testing of student skills. Many of you are probably tired of hearing that, but it is hard to know how to begin a piece like this without that statement/disclaimer. It's a professional blessing and curse.

The JMU General Education program adopted the original ACRL Information Literacy Competency Standards for Higher Education as learning outcomes shortly after their adoption in 2001. As such, all JMU students have had to meet an information literacy competency requirement since that time.

Whenever we talk about a competency requirement, we mean all students will achieve a certain standard, and we will be able to demonstrate that they have met it. At JMU this is accomplished via a required test, Madison Research Essentials, that all our incoming students (n=4500) must complete before the end of their first year at JMU.

With a group that large and a strict reporting mandate, we realistically have one option – a multiform, fixed-answer multiple-choice test with two set scores (proficient and advanced). Our first-year class is too large for an information literacy course (too many students and too few librarian-faculty), and our GenEd program is distributed such that departments offer specific courses, none of which could meet or absorb the IL standards/learning outcomes.

At the August Library Assessment Conference (and in the recent library literature) there was, and is, much talk of rubrics but scant attention to tests. One might go so far as to say that tests are starting to seem passé in favor of rubrics. It might surprise many to learn that rubrics play an important role in information literacy assessment even at JMU.

But not at the scale of n=4500, reported out every single year.

As I have become more familiar with assessment issues and concerns, my assessment colleagues have taught me that you assess what you need to know about what students are able to do, and that you do it strategically to find out:

  • Is there a difference?
    • Students who complete the XYZ program perform better on the ABC instrument than students who did not.
  • Is there a relationship?
    • Students who get an A in the Basic Communication course get better grades on presentations in downstream courses.
  • Is there change or growth over time?
    • Student writing improves from first-year assessment day to sophomore assessment day.
  • Do you need all students to meet a set standard or competency?
    • All first-year students will pass the University's research competency exam by a given day.

When we test our first-year students, we are doing so to verify a competency, and in all honesty it works well. They all have to pass the test by a certain date, and we report out the results (for example, 97.9% of first-year JMU students demonstrate information literacy competence) by the end of their first year. But the other three questions – difference, relationship, and change – could certainly be measured with a rubric, or with a well-configured pre/post test design. So it depends on what you want to know, but it also depends on how many students you have to assess, how often, why, and for whom.
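To make those four questions concrete, here is a minimal sketch in Python of the kind of statistical check each one implies. Everything below – the scores, group sizes, and cut score – is invented for illustration; none of it is MREST or JMU data.

    # Illustrative sketch only: all scores below are simulated, not real assessment data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # 1. Difference: do students who completed program XYZ score higher on instrument ABC?
    completers = rng.normal(78, 10, 200)        # hypothetical ABC scores
    non_completers = rng.normal(74, 10, 200)
    t, p = stats.ttest_ind(completers, non_completers)
    print(f"Difference: t = {t:.2f}, p = {p:.4f}")

    # 2. Relationship: do Basic Communication grades track later presentation grades?
    course_grades = rng.uniform(60, 100, 150)
    presentation_grades = 0.5 * course_grades + rng.normal(40, 8, 150)
    r, p = stats.pearsonr(course_grades, presentation_grades)
    print(f"Relationship: r = {r:.2f}, p = {p:.4f}")

    # 3. Change: does writing improve from first-year to sophomore assessment day?
    first_year = rng.normal(70, 12, 300)
    sophomore = first_year + rng.normal(4, 6, 300)   # same students, measured again later
    t, p = stats.ttest_rel(first_year, sophomore)
    print(f"Change: t = {t:.2f}, p = {p:.4f}")

    # 4. Competency: what share of the incoming class meets a set cut score?
    cut_score = 53                                   # hypothetical proficiency cut score
    test_scores = rng.normal(65, 8, 4500)
    pass_rate = np.mean(test_scores >= cut_score)
    print(f"Competency: {pass_rate:.1%} of students met the cut score")

The point is only that the design follows from the question: two groups for a difference, a correlation for a relationship, a paired pre/post measure for change, and a cut score applied to everyone for a competency.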

Assessing information literacy, whether against the old standards or with new metrics built for the new framework, is fairly new to most librarians and downright foreign to many. Fixed-choice multiple-choice instruments, such as SAILS, our locally grown Madison Research Essentials Skills Test (MREST), or Madison Assessment's Information Literacy Test (ILT), do one kind of assessment. But they do that kind of assessment efficiently and quickly, and for a large population they may be a good option, or even the only doable one.
____________________________________________________________

You can read Kathy's Library Assessment Conference presentation here:
http://libraryassessment.org/bm~doc/8clarkelightningtalk.pdf

 

Carrick Enterprises was represented at the 2014 Library Assessment Conference in early August. Rick Wiggins gave a lunchtime talk to a large audience about Project SAILS, explaining the purpose and benefits of the assessment program.

I attended many presentations and participated in a panel about three IMLS-funded information literacy projects (Project SAILS, RAILS, and Project Information Literacy).

Some highlights and takeaways from two of the presentations:

Alan Carbery (Champlain College) talked about assessing student work using rubrics and citation analysis. Among his findings:

  • Misunderstanding of the definition of "primary source" – "the main source I will use."
  • Students who choose research topics from popular culture are more likely to omit academic sources because they assume there won't be any.
  • Students have less trouble finding sources and more difficulty writing annotations.
  • Overreliance on citation management tools, resulting in important elements, such as page numbers, being omitted.

Christine Tawatao, Robin Chin Roemer, Verletta Kern, and Nia Lam (University of Washington) examined video tutorials. Some results:

  • No correlation between marketing efforts and consistent increases in usage.
  • No correlation between presence in LibGuides and usage.

Best practices for increasing user motivation:

  • Focus on a very specific problem or goal.
  • Assign a practical title.
  • Keep it short: 30-60 seconds. Skip the intro, get to the point.
  • Embrace quality production values – sound and images.
  • Place at point-of-need.

And, because I like diagrams and charts, here are two sources mentioned during keynote talks:

Extinction timeline 1950 - 2050: http://www.nowandnext.com/PDF/extinction_timeline.pdf
HT Margie Jantti, University of Wollongong

Non-cognitive elements critical to student success - Productive Persistence: http://www.carnegiefoundation.org/sites/default/files/PP_driver_diagram.pdf
HT Deb Gilchrist, Pierce College

The conference web site is at http://libraryassessment.org/. Conference proceedings will be published in a few months.

--Carolyn Radcliff, Information Literacy Librarian

In light of the new framework for information literacy being developed by ACRL, the Project SAILS team is working toward a new assessment. The current SAILS assessments, including both the cohort and individual scores versions, will continue to be available for the foreseeable future to any institution that would like to use them. In fact, we are rolling out a number of improvements to SAILS.

The emerging ACRL framework for information literacy will affect libraries and librarians in many ways. The framework will offer new approaches to conceptualizing information literacy instruction and creating assessments. At Project SAILS, we are challenged to create a whole new standardized assessment that, like SAILS, can be used across institutions and that can span a college career.

To assist us with this process, in March we brought together a panel of knowledgeable and insightful librarians and measurement experts. During a two-day workshop, we discussed the existing draft of the framework, including metacognition and threshold concepts. We reviewed types of assessments and discussed the desired characteristics of assessments and reports. We conceptualized what an assessment based on the new framework would look like. Work continues, and we are closely following the framework's ongoing development.

Members of the panel include:

  • Joanna Burkhardt, Professor; Head of CCE Library; Chair of Technical Services, University of Rhode Island
  • April Cunningham, Instruction/Information Literacy Librarian, Palomar College, San Marcos, California
  • Jessame Ferguson, Director of Hoover Library, McDaniel College, Westminster, Maryland
  • Wendy Holliday, Head, Academic Programs and Course Support, Cline Library, Northern Arizona University
  • Penny Beile O’Neill, Associate Director, Information Services and Scholarly Communication, University of Central Florida
  • Dominique Turnbow, Instructional Design Coordinator, UC San Diego
  • Steve Wise, Senior Research Fellow at Northwest Evaluation Association, Portland, Oregon

We are grateful to the members of the panel and energized by the conversation, debate, and exchange of ideas.

As the framework moves toward completion and approval by the ACRL Board, we are continuing our work. We will have a beta version of our brand-new assessment ready for testing soon, and we will reveal its name shortly.

What will happen with the existing SAILS assessments? As noted above, they will remain available to any institution that would like to use them, and we are rolling out a number of improvements to them.

Reinventing Libraries: Reinventing Assessment
Baruch College, New York, New York USA
June 6, 2014
Proposals due March 1, 2014.

California Conference on Library Instruction
CSU East Bay, Oakland, California USA
April 18, 2014
Poster session proposals due March 4, 2014.

See the full list of information literacy conferences and calls for proposals.