Investigating Information Literacy Among Occupational Therapy Students at Misericordia University Using SAILS Build-Your-Own-Test

BY: Elaina DaLomba, PhD, OTR/L, MSW
Assistant Professor, Occupational Therapy Department
Misericordia University

Information literacy (IL) skills, as a component of evidence-based practice (EBP), are critical for healthcare practitioners. Most occupational therapy programs and the American Occupational Therapy Association require that curricula address IL/EBP skills development. However, evidence shows that occupational therapists rarely use IL/EBP once they graduate; therapists don't feel they have the resources or skills to find current, applicable evidence in the literature. At Misericordia University's Occupational Therapy program we decided to examine our students' IL/EBP skills and trial a different method of enhancing them. Measuring these constructs in a way that has clinical meaning is difficult. Misericordia uses SAILS for pre- and post-testing of all students' IL skills development (during the freshman and senior years), so it seemed a natural fit to use it within a research project. Because of time constraints we didn't want to collect unnecessary data, so we chose the Build Your Own Test (BYOT), with three questions from each of the first six SAILS skill sets. These 18 questions could be answered quickly, and the data would be analyzed for us, freeing us to focus on the qualitative portions of our research. Although the SAILS BYOTs don't have reliability and validity measures particular to them (because they are individually constructed), the overall metrics of SAILS are very good.

We designed an intensive embedded-librarian model to explore what impact it would have on students' skill development in IL standards one, two, and three, as per the objectives of our Conceptual Foundations of Occupational Therapy course. The librarian handled all of the pre- and post-testing, with students simply entering their SAILS unique identifier codes (UIC) on computers in the library's lab. Students then used their SAILS UIC for all study-related protocols. The intervention started with an interactive lecture in the computer lab, with simple but thorough instructional sheets for the students to use throughout the semester. For each clinical topic introduced, the instructor used the librarian's model to create and complete searches in vivo, allowing the students to add, modify, or eliminate keywords, Boolean operators, MeSH terms, etc. The librarian was an active presence on our Blackboard site and maintained office hours within the College of Health Sciences and Education. Students were also instructed to bring their database search strategies and results to the librarian for approval before writing their research papers, exposing them to her expertise even if they had chosen not to seek her assistance initially. The data will be analyzed in spring 2017, but data collection was a breeze!

The SAILS BYOT gave us meaningful, quantitative data in a quickly delivered format. While we might not conduct this same study again, we will continue to use the SAILS BYOT for program development and course assessment due to its ease of use and practical data.

The Project SAILS tests were developed soon after the Association of College and Research Libraries adopted the “Information Literacy Competency Standards for Higher Education” in 2000. The Standards received wide attention and many academic libraries and their parent organizations embraced all or part of the Standards as guideposts for their information literacy programs.

The Standards were structured so that each of the five standards had performance indicators, and each performance indicator had outcomes. Subsequent to the publication of the Standards, a task force created the objectives for many of the outcomes. (See “Objectives for Information Literacy Instruction: A Model Statement for Academic Librarians.”) The resulting combination of standards, performance indicators, outcomes, and objectives served as the foundation of the SAILS tests, with test items based on most of the objectives (or for cases in which no objective was written, on outcomes).

Since 2006, hundreds of colleges and universities have used the SAILS tests to measure the information literacy knowledge of their students. The Cohort version of the SAILS test was released in 2006 with the Individual Scores version becoming available in 2010. More recently, the Build Your Own Test (BYOT) option went live in 2016.

Carrick Enterprises assumed responsibility for the continued operation of Project SAILS in 2012. Since that time, we have repeatedly stated our intention to continue offering the SAILS tests as long as they prove useful to the higher education community. That promise continues to this day. The Association of College and Research Libraries rescinded the “Information Literacy Competency Standards for Higher Education” earlier this year, but we stand by our commitment to offer the SAILS tests well into the future. We know that many institutions want a long-term solution to information literacy assessment and SAILS is one such solution.

The SAILS tests will be available as long as they are needed. We continue to monitor how well the test items perform, to make updates to test items, and to improve the underlying systems. If you would like to discuss how the SAILS tests can help you and your institution, please contact us.

Lots of great conferences coming up! Most of the events listed below have an emphasis on information literacy. Conference date and location are provided along with a link to the conference home page and deadline for submitting proposals, if available.

Please let us know if you're thinking about presenting on your experience with the SAILS assessments or with the Threshold Achievement Test of Information Literacy. We'll be glad to help!

Today our guest is Caroline Reed, Director of Research, Instruction and Outreach Services in the Jane Bancroft Cook Library at New College of Florida in Sarasota. I met Caroline at ACRL 2015 and when she told me about her innovative use of the Project SAILS test, I asked her to tell the story here.

Question: Would you briefly describe the information literacy program at New College of Florida?

Caroline: We are in the early stages of developing our information literacy program. Currently we do the traditional one-shots requested by faculty. We also encourage students to make consultation appointments with librarians. We have recently developed a liaison program with faculty in which each of our instruction librarians is responsible for one of our three divisions: Humanities, Natural Sciences, and Social Sciences.

Library instruction is a part of all Seminars in Critical Thinking, which are research- and writing-intensive classes originally set up as part of our QEP, as well as our WEC (Writing Enhanced Classes).

We have a librarian who is a Wikipedia Ambassador. She has been able to work with faculty and students to edit and create Wikipedia entries as replacements for the traditional research paper assignments.

Librarians work with students on annotated bibliography projects as part of the January Independent Study Project (ISP) that first- through third-year students have to complete. This year one of our librarians actually sponsored the ISP so that she was the faculty member of record on those projects.


At the Library Assessment Conference in August, Rick and I met Kathy Clarke, librarian and associate professor at James Madison University Libraries. Kathy oversees JMU's information literacy assessment and described the program in a lightning talk titled "The Role of a Required Information Literacy Competency Exam in the First College Year: What Test Data Can, Cannot, and Might Reveal." We asked Kathy her thoughts about standardized assessments. Here's what she shared with us:

What’s the future of an information literacy test?

Kathy Clarke

James Madison University has been a pioneer of multiple-choice information literacy testing of student skills. Many of you are probably pretty tired of hearing that, but it is hard to know how to begin these types of pieces without that statement/disclaimer. It's a professional blessing/curse.

The JMU General Education program adopted the original ACRL Information Literacy Competency Standards for Higher Education as learning outcomes in 2001, shortly after ACRL approved them. As such, all JMU students have had to meet an information literacy competency requirement since that time.

Whenever we talk about a competency requirement, we mean all students will achieve a certain standard, and we will be able to demonstrate that they have met it. At JMU this is accomplished via a required test, Madison Research Essentials, that all our incoming students (n=4500) must complete before the end of their first year at JMU.

When we are speaking about a group that large and a strict reporting mandate, we realistically have one option: a multiform, fixed-answer multiple-choice test with two set scores (proficient and advanced). Our first-year class is too large to offer an information literacy course (too many students and too few librarian-faculty), and our GenEd program is distributed such that departments offer specific courses, none of which could meet or absorb the IL Standards/learning outcomes.

At the August Library Assessment Conference (and in the library literature recently) there was and is much talk of rubrics, but scant attention to tests. One might go so far as to say that tests are starting to seem passé in favor of rubrics. It might surprise many to learn that rubrics play an important role in information literacy assessment even at JMU.

But not to the tune of an n=4500 reported out every single year.

As I have become more familiar with assessment issues and concerns, I have been taught by my assessment colleagues that you assess what you need to know about what students are able to do, but that you do it strategically to find out:

  • Is there a difference?
    • Students who complete XYZ program perform better on the ABC instrument than students who did not complete the program.
  • Is there a relationship?
    • Students who get an A in the Basic Communication course get better grades on presentations in downstream courses.
  • Is there change or growth over time?
    • Student writing improves from first-year assessment day to sophomore assessment day.
  • Do you need all students to meet a set standard or competency?
    • All first-year students will pass the University's research competency exam by a given day.

When we test our first-year students, we are doing so to set a competency, and in all honesty it works well. They all have to pass the test by a certain date, and we report out results (for example, 97.9% of first-year JMU students demonstrate information literacy competence) by the end of their first year. The other three questions -- difference, relationship, and change -- could certainly be measured with a rubric, but also with a well-configured pre-post test design. So it depends on what you want to know, but it also depends on how many students you have to assess, how often, why, and for whom.
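For readers who like to see the four questions in concrete terms, here is a minimal sketch of how each could map onto a simple quantitative check. The data are entirely hypothetical (invented score arrays, not JMU, MREST, or SAILS results), and the choice of Python with scipy is just one of many ways this could be done.

```python
# Hypothetical illustration of the four assessment questions above.
# All scores are simulated; cut score and group sizes are invented for the example.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

program = rng.normal(78, 8, 200)       # completed an instruction program
no_program = rng.normal(72, 8, 200)    # did not complete the program
pre = rng.normal(70, 10, 150)          # first-year scores for one cohort
post = pre + rng.normal(5, 6, 150)     # sophomore scores for the same students
course_grade = rng.normal(85, 7, 150)  # grade in a prerequisite course

# 1. Is there a difference? (independent-samples t-test between groups)
t, p = stats.ttest_ind(program, no_program)
print(f"difference: t={t:.2f}, p={p:.4f}")

# 2. Is there a relationship? (correlation between course grade and later score)
r, p = stats.pearsonr(course_grade, post)
print(f"relationship: r={r:.2f}, p={p:.4f}")

# 3. Is there change or growth over time? (paired t-test, pre vs. post)
t, p = stats.ttest_rel(post, pre)
print(f"change: t={t:.2f}, p={p:.4f}")

# 4. Do all students meet a set standard? (proportion at or above a cut score)
cut_score = 60
pass_rate = np.mean(post >= cut_score) * 100
print(f"competency: {pass_rate:.1f}% at or above the cut score")
```

The point is not the particular statistics but the fit between question and design: the first three questions compare groups or time points, while the competency question only needs a pass rate against a fixed standard, which is exactly what a large-scale fixed-answer test reports well.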

Assessing information literacy, whether with the old Standards or with new metrics built for the new Framework, is fairly new to most librarians and downright foreign to many. Fixed-choice multiple-choice instruments, like SAILS, our locally grown Madison Research Essentials Skills Test (MREST), or Madison Assessment's Information Literacy Test (ILT), do one kind of assessment. But they do that kind of assessment efficiently and quickly, and for a large group they might be a good, or even the only, doable option.
____________________________________________________________

You can read Kathy's Library Assessment Conference presentation here:
http://libraryassessment.org/bm~doc/8clarkelightningtalk.pdf