Hi Everyone,

In November we made significant headway in our test development.  One essential component of designing a test is defining the concepts/skills we’ll assess and identifying the levels of performance we’ll expect to see students demonstrate.  Since this will be an assessment of undergraduates’ IL, we’re drafting plans for a test that will differentiate among students at a beginning level (at or near entry), an intermediate level (at or near completion of lower-division coursework), and an emerging expert level (at or near completion of a bachelor’s degree).

The Advisory Board generated lists of the IL skills, concepts, and dispositions that we have observed among college students throughout their education.  We organized those observations so that they fit into the 6 frames of the latest draft of the ACRL Framework for IL in Higher Education.  This gave us our first model of how students’ IL changes as they encounter the threshold concepts at the heart of the 6 frames.  We will continue to refine our model and use it to guide our development of test items beginning in early 2015.

We have also added details to our plans for using innovative item types that take advantage of the flexibility of computer-based testing.  Innovative items can include images and offer alternative response modes, so we can increase the questions’ fidelity to students’ research experiences and gauge their knowledge practices and dispositions in ways that have not been possible with traditional multiple-choice questions.  For example, an innovative item might present an image of typical search results and ask students to select a set of appropriate sources for a specific information need.  Determining our full range of item types is the next phase before we begin writing items.

Finally, in November, we began discussions with an expert in educational adaptive technology.  We are using resources like Web Accessibility in Mind to incorporate elements of universal design at this early stage of planning for test item types and response systems.

We’re looking forward to a busy winter as we conclude our planning process and begin test development.  I’ll keep you up to date with another post soon.

--April

As you may be aware, Project SAILS has been operated by Carrick Enterprises, Inc. since 2012.  Two of the original SAILS team members formed the company to continue providing the SAILS tests to institutions throughout the United States and, starting this year, around the world.

Project SAILS is based on ACRL’s 2000 Information Literacy Competency Standards for Higher Education.  With the upcoming move to the new ACRL Framework, Carrick Enterprises will be developing an entirely new assessment instrument. This is a big job, and we plan to provide more information about the instrument at the ACRL conference in Portland in March.

We are extremely happy to announce that Dr. April Cunningham has taken on the job of coordinating the design of this new instrument. April is the Instruction/Information Literacy Librarian at Palomar College, a comprehensive community college in northern San Diego County. She is active on the Learning Outcomes Council, which coordinates institutional student learning outcomes assessments (including assessment of Palomar's general education information literacy outcome).  She is also one of the curriculum developers/facilitators for ACRL's Assessment in Action project. We could not have found a more qualified person to lead this effort. Please help us welcome April to the project!

The emerging ACRL framework for information literacy will affect libraries and librarians in many ways. The framework will offer new approaches to conceptualizing information literacy instruction and creating assessments. At Project SAILS, we are challenged to create a whole new standardized assessment that, like SAILS, can be used across institutions and span a college career.

To assist us with this process, in March we brought together a panel of knowledgeable and insightful librarians and measurement experts. During a two-day workshop, we discussed the existing draft of the framework, including metacognition and threshold concepts. We reviewed types of assessments and discussed the desired characteristics of assessments and reports. We conceptualized what an assessment based on the new framework would look like. Work continues, and we are closely following the framework's ongoing development.

Members of the panel include:

  • Joanna Burkhardt, Professor; Head of CCE Library; Chair of Technical Services, University of Rhode Island
  • April Cunningham, Instruction/Information Literacy Librarian, Palomar College, San Marcos, California
  • Jessame Ferguson, Director of Hoover Library, McDaniel College, Westminster, Maryland
  • Wendy Holliday, Head, Academic Programs and Course Support, Cline Library, Northern Arizona University
  • Penny Beile O’Neill, Associate Director, Information Services and Scholarly Communication, University of Central Florida
  • Dominique Turnbow, Instructional Design Coordinator, UC San Diego
  • Steve Wise, Senior Research Fellow at Northwest Evaluation Association, Portland, Oregon

We are grateful to the members of the panel and energized by the conversation, debate, and exchange of ideas.

As the framework moves toward completion and approval by the ACRL Board, we are continuing our work. We will have a beta version of our brand-new assessment ready for testing soon, and we will reveal its name shortly.

What will happen with the existing SAILS assessments?