
Updated December 11, 2017

Lots of great conferences coming up! Most of the events listed below have an emphasis on information literacy. Each listing includes the conference date and location, along with a link to the conference home page and the deadline for submitting proposals, if available.

Please let us know if you're thinking about presenting on your experience with the SAILS assessments or with the Threshold Achievement Test of Information Literacy (TATIL). We'll be glad to help!

The Project SAILS tests were developed soon after the Association of College and Research Libraries adopted the “Information Literacy Competency Standards for Higher Education” in 2000. The Standards received wide attention and many academic libraries and their parent organizations embraced all or part of the Standards as guideposts for their information literacy programs.

The Standards were structured so that each of the five standards had performance indicators, and each performance indicator had outcomes. Subsequent to the publication of the Standards, a task force created objectives for many of the outcomes. (See “Objectives for Information Literacy Instruction: A Model Statement for Academic Librarians.”) The resulting combination of standards, performance indicators, outcomes, and objectives served as the foundation of the SAILS tests, with test items based on most of the objectives (or, in cases where no objective was written, on the outcomes).

Since 2006, hundreds of colleges and universities have used the SAILS tests to measure the information literacy knowledge of their students. The Cohort version of the SAILS test was released in 2006 with the Individual Scores version becoming available in 2010. More recently, the Build Your Own Test (BYOT) option went live in 2016.

Carrick Enterprises assumed responsibility for the continued operation of Project SAILS in 2012. Since that time, we have repeatedly stated our intention to continue offering the SAILS tests as long as they prove useful to the higher education community. That promise continues to this day. The Association of College and Research Libraries rescinded the “Information Literacy Competency Standards for Higher Education” earlier this year, but we stand by our commitment to offer the SAILS tests well into the future. We know that many institutions want a long-term solution to information literacy assessment and SAILS is one such solution.

The SAILS tests will be available as long as they are needed. We continue to monitor how well the test items perform, to make updates to test items, and to improve the underlying systems. If you would like to discuss how the SAILS tests can help you and your institution, please contact us.

Today our guest is Caroline Reed, Director of Research, Instruction and Outreach Services in the Jane Bancroft Cook Library at New College of Florida in Sarasota. I met Caroline at ACRL 2015, and when she told me about her innovative use of the Project SAILS test, I asked her to tell the story here.

Question: Would you briefly describe the information literacy program at New College of Florida?

Caroline: We are in the early stages of developing our information literacy program. Currently we do the traditional one-shots requested by faculty. We also encourage students to make consultation appointments with librarians. We have recently developed a liaison program with faculty, in which each of our instruction librarians is responsible for one of our three divisions: Humanities, Natural Sciences, and Social Sciences.

Library instruction is a part of all Seminars in Critical Thinking, which are research- and writing-intensive classes originally set up as part of our Quality Enhancement Plan (QEP), as well as our Writing Enhanced Classes (WEC).

We have a librarian who is a Wikipedia Ambassador. She has been able to work with faculty and students to edit and create Wikipedia entries as replacements for traditional research paper assignments.

Librarians work with students on annotated bibliography projects as part of the January Independent Study Project (ISP) that first- through third-year students must complete. This year one of our librarians sponsored the ISP, making her the faculty member of record on those projects.


At the Library Assessment Conference in August, Rick and I met Kathy Clarke, librarian and associate professor at James Madison University Libraries. Kathy oversees JMU's information literacy assessment and described the program in a lightning talk titled "The Role of a Required Information Literacy Competency Exam in the First College Year: What Test Data Can, Cannot, and Might Reveal." We asked Kathy for her thoughts about standardized assessments. Here's what she shared with us:

What’s the future of an information literacy test?

Kathy Clarke

James Madison University has been a pioneer in multiple-choice information literacy testing of student skills. Many of you are probably pretty tired of hearing that, but it is hard to know how to begin these types of pieces without that statement/disclaimer. It’s a professional blessing/curse.

The JMU General Education program adopted the original ACRL Information Literacy Competency Standards for Higher Education as learning outcomes in 2001, shortly after ACRL approved them. As such, all JMU students have had to meet an information literacy competency requirement since that time.

When we talk about a competency requirement, we mean that all students will achieve a certain standard and that we will be able to demonstrate they have met it. At JMU this is accomplished via a required test, Madison Research Essentials, that all of our incoming students (n=4500) must complete before the end of their first year at JMU.

When we are dealing with a group that large and a strict reporting mandate, we realistically have one option: a multiform, fixed-answer multiple-choice test with two set scores (proficient and advanced). Our first-year class is too large for an information literacy course (too many students and too few librarian-faculty), and our GenEd program is distributed such that departments offer specific courses, none of which could meet or absorb the IL standards and learning outcomes.
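To make the "two set scores" idea concrete, here is a minimal sketch (in Python) of how raw scores could be mapped to reported competency levels. The cut scores, the 0-100 scale, and the helper function are hypothetical illustrations, not the actual MREST configuration:

    # Hypothetical cut scores on a 0-100 scale; the real MREST
    # standards are not published here.
    PROFICIENT_CUT = 65
    ADVANCED_CUT = 85

    def performance_level(raw_score: int) -> str:
        """Map a raw score to one of the reported competency levels."""
        if raw_score >= ADVANCED_CUT:
            return "advanced"
        if raw_score >= PROFICIENT_CUT:
            return "proficient"
        return "not yet proficient"

    scores = [92, 71, 58, 66, 88]  # hypothetical student scores
    levels = [performance_level(s) for s in scores]
    passed = sum(level != "not yet proficient" for level in levels)
    print(f"{passed / len(scores):.1%} met the competency requirement")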

At the August Library Assessment Conference (and in the library literature recently) there was and is much talk of rubrics, but scant attention to tests. One might go so far as to say that tests are starting to seem passé in favor of rubrics. It might surprise many to learn that rubrics play an important role in information literacy assessment even at JMU.

But not to the tune of an n=4500 reported out every single year.

As I have become more familiar with assessment issues and concerns, my assessment colleagues have taught me that you assess what you need to know about what students are able to do, and that you do it strategically to find out:

  • Is there a difference?
    • Students who complete the XYZ program perform better on the ABC instrument than students who did not complete the program.
  • Is there a relationship?
    • Students who get an A in the Basic Communication course get better grades on presentations in downstream courses.
  • Is there change or growth over time?
    • Student writing improves from first-year assessment day to sophomore assessment day.
  • Do you need all students to meet a set standard or competency?
    • All first-year students will pass the University’s research competency exam by a given day.

When we test our first-year students, we are doing so to set a competency, and in all honesty, it works well. They all have to pass it by a certain date, and we report out results (for example, 97.9% of first-year JMU students demonstrate information literacy competence) by the end of their first year. The other three questions (difference, relationship, and change) could certainly be measured by a rubric, but also by a well-configured pre-post test design. So it depends on what you want to know, but it also depends on how many students you have to assess, how often, why, and for whom.
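As a minimal sketch (in Python) of what a pre-post comparison measures, the snippet below computes a paired t statistic over hypothetical pre- and post-test scores for the same students. The data, scale, and sample size are invented for illustration and are not JMU results:

    from math import sqrt
    from statistics import mean, stdev

    # Hypothetical scores (0-100) for the same ten students, tested at
    # first-year assessment day and again a year later.
    pre  = [52, 61, 48, 70, 55, 63, 58, 49, 66, 57]
    post = [60, 64, 55, 75, 61, 70, 62, 58, 71, 65]

    gains = [b - a for a, b in zip(pre, post)]
    n = len(gains)

    # Paired t statistic: mean gain divided by its standard error.
    t = mean(gains) / (stdev(gains) / sqrt(n))
    print(f"mean gain: {mean(gains):.1f} points, t = {t:.2f} (df = {n - 1})")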

Assessing information literacy, whether with the old Standards or by setting up new metrics for the new Framework, is fairly new to most librarians and downright foreign to many. Fixed-answer multiple-choice instruments, like SAILS, our locally grown Madison Research Essentials Skills Test (MREST), or Madison Assessment’s Information Literacy Test (ILT), do one kind of assessment. But they do that kind of assessment efficiently and quickly, and for a large population they might be a good option, or even the only doable one.
____________________________________________________________

You can read Kathy's Library Assessment Conference presentation here:
http://libraryassessment.org/bm~doc/8clarkelightningtalk.pdf


Carrick Enterprises was represented at the 2014 Library Assessment Conference in early August. Rick Wiggins gave a lunchtime talk to a large audience about Project SAILS, explaining the purpose and benefits of the assessment program.

I attended many presentations and participated in a panel about three IMLS-funded information literacy projects (Project SAILS, RAILS, and Project Information Literacy).

Some highlights and takeaways from two of the presentations:

Alan Carbery (Champlain College) talked about assessing student work using rubrics and citation analysis. Among his findings:

  • Misunderstanding of the definition of a primary source: “the main source I will use.”
  • Students who choose research topics from popular culture are more likely to omit academic sources because they assume there won’t be any.
  • Students have less trouble finding sources and more difficulty writing annotations.
  • Overreliance on citation management tools, resulting in important pieces, such as page numbers, being omitted.

Christine Tawatao, Robin Chin Roemer, Verletta Kern, and Nia Lam (University of Washington) examined video tutorials. Some results:

  • No correlation between marketing and a consistent increase in usage.
  • No correlation between presence in LibGuides and usage.

Best practices for increasing user motivation:

  • Focus on a very specific problem or goal.
  • Assign a practical title.
  • Keep it short: 30-60 seconds. Skip the intro, get to the point.
  • Embrace quality production values: sound and images.
  • Place at point-of-need.

And, because I like diagrams and charts, here are two sources mentioned during keynote talks:

Extinction timeline 1950 - 2050: http://www.nowandnext.com/PDF/extinction_timeline.pdf
HT Margie Jantti, University of Wollongong

Non-cognitive elements critical to student success - Productive Persistence: http://www.carnegiefoundation.org/sites/default/files/PP_driver_diagram.pdf
HT Deb Gilchrist, Pierce College

Conference proceedings will be published in a few months; see the conference web site for details.

--Carolyn Radcliff, Information Literacy Librarian