Carrick Enterprises was represented at the 2014 Library Assessment Conference in early August. Rick Wiggins gave a lunchtime talk to a large audience about Project SAILS, explaining the purpose and benefits of the assessment program.

I attended many presentations and served on a panel about three IMLS-funded information literacy projects (Project SAILS, RAILS, and Project Information Literacy).

Some highlights and take-aways from two of the presentations:

Alan Carbery (Champlain College) talked about assessing student work using rubrics and citation analysis. Among his findings:

  • Students misunderstand the definition of primary source – “the main source I will use.”
  • Students who choose research topics from popular culture are more likely to omit academic sources because they assume there won’t be any.
  • Students have less trouble finding sources and more difficulty writing annotations.
  • Students over-rely on citation management tools, so important details, such as page numbers, are omitted.

Christine Tawatao, Robin Chin Roemer, Verletta Kern, and Nia Lam (University of Washington) examined video tutorials. Some results:

  • No correlation between marketing and consistent increase in usage.
  • No correlation between presence in LibGuides and usage.

Best practices for increasing user motivation:

  • Focus on a very specific problem or goal.
  • Assign a practical title.
  • Keep it short: 30-60 seconds. Skip the intro, get to the point.
  • Embrace quality production values - sound and images.
  • Place at point-of-need.

And, because I like diagrams and charts, here are two sources mentioned during keynote talks:

Extinction timeline 1950 - 2050: http://www.nowandnext.com/PDF/extinction_timeline.pdf
HT Margie Jantti, University of Wollongong

Non-cognitive elements critical to student success - Productive Persistence: http://www.carnegiefoundation.org/sites/default/files/PP_driver_diagram.pdf
HT Deb Gilchrist, Pierce College

See the conference web site for more information. Conference proceedings will be published in a few months.

--Carolyn Radcliff, Information Literacy Librarian

In light of the new framework for information literacy being developed by ACRL, the Project SAILS team is working toward a new assessment.

The emerging ACRL framework for information literacy will affect libraries and librarians in many ways. The framework will offer new approaches to conceptualizing information literacy instruction and creating assessments. At Project SAILS, we are challenged to create a whole new standardized assessment that, like SAILS, can be used across institutions and that can span a college career.

To assist us with this process, in March we brought together a panel of knowledgeable and insightful librarians and measurement experts. During a two-day workshop, we discussed the existing draft of the framework, including metacognition and threshold concepts. We reviewed types of assessments and discussed the desired characteristics of assessments and reports. We conceptualized what an assessment based on the new framework would look like. Work continues, and we are closely following the framework’s ongoing development.

Members of the panel include:

  • Joanna Burkhardt, Professor; Head of CCE Library; Chair of Technical Services, University of Rhode Island
  • April Cunningham, Instruction/Information Literacy Librarian, Palomar College, San Marcos, California
  • Jessame Ferguson, Director of Hoover Library, McDaniel College, Westminster, Maryland
  • Wendy Holliday, Head, Academic Programs and Course Support, Cline Library, Northern Arizona University
  • Penny Beile O’Neill, Associate Director, Information Services and Scholarly Communication, University of Central Florida
  • Dominique Turnbow, Instructional Design Coordinator, UC San Diego
  • Steve Wise, Senior Research Fellow at Northwest Evaluation Association, Portland, Oregon

We are grateful to the members of the panel and energized by the conversation, debate, and exchange of ideas.

As the framework moves toward completion and approval by the ACRL Board, we are continuing our work. We will have a beta version of the brand-new assessment ready for testing soon, and we will reveal its name shortly.

What will happen with the existing SAILS assessments?

The current SAILS assessments, including both the cohort and individual scores measures, will continue to be available for the foreseeable future to any institution that would like to use them. In fact, we are rolling out a number of improvements to SAILS.

Reinventing Libraries: Reinventing Assessment
Baruch College, New York, New York USA
June 6, 2014
Proposal deadline is March 1, 2014.

California Conference on Library Instruction
CSU East Bay, Oakland, California USA
April 18, 2014.
Poster session proposals due March 4, 2014.

See the full list of information literacy conferences and calls for proposals.

SAILS participants often ask if our information literacy assessments can be customized with additional questions. The answer is ‘yes’ and ‘no.’

The test questions themselves cannot be changed and no test questions can be added. The calibrations, scoring, reliability, and reporting all depend on having one set of validated test questions for all participants.

However, we do offer the option for participants to slightly modify two standard demographic questions and to create two custom questions.

Standard Demographic Questions

All test administrations have two standard demographic questions, class standing and major.

When completing these standard demographic questions, test administrators can rename the class standing and major labels to fit their institution, or delete class standings and majors that are not needed. These options let each test administrator match the terminology used on their own campus.

Creating Custom Questions

You have the option to create two custom questions of your own choosing. Each question can be up to 255 characters long (typically 30 – 40 words), with up to nine responses of 40 characters each. In June 2014, the number of response options will increase from nine to fifty.
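
If you draft custom questions in a document or spreadsheet before entering them, it can help to check them against these limits first. Below is a minimal sketch of such a check, written in Python; the function and the sample question are illustrative only and are not part of the SAILS interface.

    # Check a draft custom question against the stated SAILS limits:
    # up to 255 characters of question text and up to nine responses
    # of 40 characters each (rising to fifty responses in June 2014).
    MAX_QUESTION_CHARS = 255
    MAX_RESPONSE_CHARS = 40
    MAX_RESPONSES = 9  # 50 after the June 2014 update

    def check_custom_question(question, responses):
        """Return a list of problems; an empty list means the question fits."""
        problems = []
        if len(question) > MAX_QUESTION_CHARS:
            problems.append(f"Question is {len(question)} characters; the limit is {MAX_QUESTION_CHARS}.")
        if len(responses) > MAX_RESPONSES:
            problems.append(f"{len(responses)} responses given; the limit is {MAX_RESPONSES}.")
        for i, r in enumerate(responses, start=1):
            if len(r) > MAX_RESPONSE_CHARS:
                problems.append(f"Response {i} is {len(r)} characters; the limit is {MAX_RESPONSE_CHARS}.")
        return problems

    # Illustrative usage with a hypothetical custom question.
    issues = check_custom_question(
        "Have you received library instruction in a previous course?",
        ["Yes", "No", "Not sure"],
    )
    print(issues or "This question fits within the limits.")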

What kind of questions would you want to create? Perhaps you want to compare test performance among students enrolled in certain courses. Or you want to see if students who had prior information literacy instruction score higher than those who did not. We analyzed years’ worth of custom questions created by our participants and discovered that most custom questions fall into these categories:

[Chart: categories of custom questions created by participants]

The Value of Custom Questions

Custom questions not only allow test administrators to learn more about the students taking the assessment, but also make the resulting data more valuable.

By having additional information about test takers, administrators can slice the data in more ways and develop additional findings that can lead to positive changes in instruction. For example, by asking whether students have received information literacy instruction in a previous course, test administrators can see whether those courses have a positive impact on students’ information literacy skills compared with peers who have not received prior instruction.
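
As a purely hypothetical illustration of that kind of analysis, the sketch below assumes the individual scores and custom-question responses have been exported to a CSV file with columns named score and prior_instruction; the file name and column names are assumptions for the example, not part of a SAILS report.

    # Compare mean scores by response to a custom question about prior
    # information literacy instruction. File and column names are assumed.
    import pandas as pd

    results = pd.read_csv("sails_individual_scores.csv")
    summary = results.groupby("prior_instruction")["score"].agg(["count", "mean", "std"])
    print(summary)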

Project SAILS is dedicated to providing valuable data to testing institutions, and we have seen custom demographic questions add value for the institutions that use them. We hope that whether you are setting up your first test administration or your tenth, you will use custom demographic questions to their full potential.

Ready to start? Register for a free account and begin your test administration today!