
Sometime around 1996 I attended a conference on communication studies. I was working on a master’s degree in Comm Studies and this was my first conference in an area outside of librarianship. I was happy to discover a presentation on research related to libraries, specifically the nonverbal behaviors of reference librarians. As the researcher described her findings and quoted from student statements about their interactions with librarians, I experienced a range of emotions. Interest and pride soon gave way to embarrassment and frustration. The way I remember it now, there were a host of examples of poor interactions. “The librarian looked at me like I was from Mars,” that sort of thing. Most memorable to me was a comment/question from an audience member: “Librarians need to fix this. What are they going to do about it?” It was as though this study had uncovered a heretofore invisible problem that we should urgently address. (Did I mention feeling defensive, too?) I didn’t dispute the findings. What I struggled with was the sense that the people in the room thought that we librarians didn’t already know about the importance of effective communication and that we weren’t working on it. Was there room for improvement? For sure! But it wasn’t news to us.

I thought about that presentation again recently after viewing a webinar by Lisa Hinchliffe about her research project, Predictable Misunderstandings in Information Literacy: Anticipating Student Misconceptions To Improve Instruction. Using data from a survey of librarians who provide information literacy instruction to first year students, Lisa and her team provisionally identified nine misconceptions that lead to errors in information literacy practice. For example, first year students “believe research is a linear (uni-directional) process (and therefore do not see it as an iterative process and integrated into their work).” The project is a partnership with Credo. See the press release or view the webinar slides.

The reason the webinar took me back to the communication studies conference was that I found myself starting down that path. Listening to the misconceptions, I thought, hey, we should do something about that. For a moment I was that audience member, racing from becoming aware to expecting solutions. But only for a moment. Then it hit me – we ARE working on it.

I’ve been a reference and instruction librarian for more than 20 years. I’ve attended, participated in, watched, and read many, many discussions of how we can be better: better at reference work, better colleagues, administrators, mentors, leaders, collaborators with faculty, and members of our communities. We can be more effective teachers. All of these things are true, and if you commit to lifelong learning, constant striving to improve is the only way. Luckily our profession is heavily invested in doing better. We seek to understand, to experiment, and to share.

The Predictable Misunderstandings in Information Literacy project is one of the latest in a long line of examples of trying to understand. Project Information Literacy is another, undertaking multiple investigations into how college students seek and use information in many aspects of their lives. Examples of how PIL studies are applied in libraries are readily available.

A brief look at books recently published by the American Library Association demonstrates that library professionals are positively doing the work and sharing their expertise. Here is just a sample:

Then there are the blog posts, web sites, journal articles, workshops, webinars, conferences, and other events aimed at sharing research and improving the educational experience for college students and instructors alike. Many of the regional, national, and international information literacy conferences are listed on this blog.

How does an instruction librarian balance the need to learn and improve with meeting the daily demands of the job? Every person has their own answer. For me, it has been a combination of getting the work done while staying open to opportunities. My library colleagues have always been the best partners in this endeavor while administrators have been supportive. I know I have not transformed my teaching for the better in all possible ways. But I keep trying.

 

The cornerstone of the Threshold Achievement Test for Information Literacy is the set of outcomes and performance indicators we wrote, inspired by the ACRL Framework for Information Literacy for Higher Education.

Working with members of our Advisory Board, we first defined the information literacy skills, knowledge, dispositions, and misconceptions that students commonly demonstrate at key points in their education: entering college, completing their lower division or general education requirements, and preparing for graduation. These definitions laid the groundwork for analyzing the knowledge practices and dispositions in the Framework in order to define the core components that would become the focus of the test. Once we decided to combine frames into four test modules, the performance indicators guided item writing for each module. Further investigation of the Framework dispositions through a structural analysis led us to identify and define information literacy dispositions for each module.

We invite you to read the outcomes, performance indicators, and dispositions that we created. They are available on the Threshold Achievement web site and as a PDF document. The work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. Module 4, The Value of Information, is presented below as an example.

Many other librarians and educators have developed learning outcomes related to the Framework. One excellent site that brings together the work of many librarians is the ACRL Framework sandbox. As of this writing there are 10 listings in the Framework sandbox for learning outcomes. Another worthwhile source is the collection of information literacy learning outcomes from members of the Private Academic Library Network of Indiana (PALNI).

Threshold Achievement Test for Information Literacy Module 4: Value of Information

This module focuses on the norms of academic information creation and the factors that affect access to information. There are two knowledge outcomes and two dispositions that make up this module.

Outcome 4.1: Recognize the rights and responsibilities of information creation.

Performance Indicators:

1.1:  Identify reasons why plagiarism is prohibited.

1.2:  Determine whether or not a passage is plagiarized.

1.3:  Identify appropriate citation options when using material from a source that is cited within the source at hand.

1.4:  Identify the type of plagiarism when presented with a plagiarized passage.

1.5:  Recognize the benefits of copyright protections.

1.6:  Given a list, select the purposes of citation.

1.7:  Recognize the rights or interests of an author's sources.

1.8:  Recognize that where a source is found has no bearing on whether or not the source should be cited.

Outcome 4.2: Recognize social, legal, and economic factors affecting access to information.

Performance Indicators:

2.1:  Recognize how reporting on the same event offers disparate levels of coverage when the sources are written to be disseminated in different venues.

2.2:  Identify the relationship between individuals' organizational affiliations and their access to information.

2.3:  Identify reasons that some people's views are not disseminated to the larger community.

2.5:  Identify the meaning and scope of the concept of intellectual property.

2.6:  Identify the circumstances in which one's personal information may be used by other individuals, groups, and organizations.

2.7:  Identify reasons that access to information may be restricted, including copyright, licensing, and other practices.

2.8:  Distinguish among the common reasons that information may be freely available, including open access, public domain, and other practices.

Disposition 4.1: Mindful self-reflection

Learners who are disposed to demonstrate self-reflection in the context of the information ecosystem recognize and challenge information privilege.

Example behaviors:

  • Considering how to use existing intellectual property to spur creative work without violating the creators' rights.
  • Participating in informal networks to reduce disparities caused by the commodification of information.
  • Recognizing and suggesting ways to reduce the negative effects of the unequal distribution of information.

Disposition 4.2: Responsibility to community

Learners who are disposed to demonstrate a sense of responsibility to the scholarly community recognize and conform to academic norms of knowledge building.

Example behaviors:

  • Accessing scholarly sources through formal channels.
  • Avoiding plagiarism in their own work and discouraging plagiarism by others.
  • Recognizing the value of their own original contributions to the scholarly conversation.

 

After three years of development, two years of field testing, and countless hours of creative innovation and hard work, Carrick Enterprises is proud to announce the availability of the Threshold Achievement Test for Information Literacy!

We are fortunate to work with many librarians, professors, measurement and evaluation experts, and other professionals on the development of this test. We are grateful for the opportunity to collaborate with these creative people and to benefit from their insights and wisdom.


Test Item Developers
Jennifer Fabbi – Cal State San Marcos
Hal Hannon – Palomar and Saddleback Colleges
Angela Henshilwood – University of Toronto
Lettycia Terrones – Los Angeles Public Library
Dominique Turnbow – UC San Diego
Silvia Vong – University of Toronto
Kelley Wantuch – Los Angeles Public Library

Test Item Reviewers
Joseph Aubele – CSU Long Beach
Liz Berilla – Misericordia University
Michelle Dunaway – Wayne State University
Nancy Jones – Encinitas Unified School District

Cognitive Interviewers
Joseph Aubele – CSU Long Beach
Sophie Bury – York University, Toronto
Carolyn Gardner – CSU Dominguez Hills
Jamie Johnson – CSU Northridge
Pearl Ly – Skyline College
Isabelle Ramos – CSU Northridge
Silvia Vong – University of Toronto

Field Test Participants
Andrew Asher – Indiana University
Joseph Aubele – California State University, Long Beach
Sofia Birden – University of Maine Fort Kent
Rebecca Brothers – Oakwood University
Sarah Burns Feyl – Pace University
Kathy Clarke – James Madison University
Jolene Cole – Georgia College
Gloria Creed-Dikeogu – Ottawa University
David Cruse – Adrian College
April Cunningham – Palomar College
Diane Dalrymple – Valencia College
Christopher Garcia – University of Guam
Rumi Graham – University of Lethbridge
Adrienne Harmer – Georgia Gwinnett College
Rosita Hopper – Johnson & Wales University
Suzanne Julian – Brigham Young University
Cynthia Kane – Emporia State University
Martha Kruy – Central Connecticut State University
Jane Liu – Pomona College
Talitha Matlin – California State University at San Marcos
Courtney Moore – Valencia College
Colleen Mullally – Pepperdine University
Dena Pastor – James Madison University
Benjamin Peck – Pace University
Carolyn Radcliff – Chapman University
Michelle Reed – University of Kansas
Stephanie Rosenblatt – Cerritos College
Heidi Senior – University of Portland
Chelsea Stripling – Florida Institute of Technology
Kathryn Sullivan – University of Maryland, Baltimore County
Rosalind Tedford – Wake Forest University
Sherry Tinerella – Arkansas Tech
Kim Whalen – Valparaiso University

Standard Setters
Joseph Aubele – California State University, Long Beach
Stephanie Brasley – California State University Dominguez Hills
Jennifer Fabbi – California State University San Marcos
Hal Hannon – Palomar and Saddleback Colleges
Elizabeth Horan – Coastline Community College
Monica Lopez – Cerritos College
Natalie Lopez – Palomar College
Talitha Matlin – California State University San Marcos
Cynthia Orozco – East Los Angeles College
Stephanie Rosenblatt – Cerritos College

The Threshold Achievement Test for Information Literacy (TATIL) measures student knowledge and dispositions regarding information literacy. The test is inspired by the Association of College and Research Libraries' Framework for Information Literacy for Higher Education and by expectations set by the nation's accrediting agencies. TATIL offers librarians and other educators a better understanding of the information literacy capabilities of their students. These insights inform instructors of improvement areas, guide course instruction, affirm growth following instruction, and prepare students to be successful in learning and life. Each test is made up of a combination of knowledge items and disposition items.

About the Test

The Threshold Achievement Test assesses students’ ability to recall and apply their knowledge, as well as their metacognition about the core information literacy dispositions that underlie their behaviors. Through this combination of knowledge and dispositional assessment, TATIL offers a unique and valuable measure of the complexities of information literacy.

The knowledge items in TATIL are based on information literacy outcomes and performance indicators created by the test developers and advisory board of librarians and other educators. Knowledge items assess an array of cognitive processes that college students develop as they transition from pre-college to college ready to research ready. Mental behaviors tested include understanding (facts, concepts, principles, procedures), problem solving (problem identification, problem definition, analysis, solution proposal), and critical thinking (evaluating, predicting, deductive and inductive thinking). The items are presented in a variety of structured response formats to assess students' information literacy knowledge, skills, and abilities.

Dispositions are at the heart of a student’s temperament and play an important role in learning transfer. Dispositions constitute affective facets of information literacy and are essential to students’ information literacy outcomes. They indicate students’ willingness to consistently apply the skills they have learned in one setting to novel problems in new settings. While some dispositions can be seen as natural tendencies, they may also be cultivated over time through intentionally designed instruction and through exposure to tacit expectations for student behavior.

To address dispositions in the test, we use scenario-based problem solving items. Students are presented with a scenario describing an ill-defined information literacy challenge related to the content of the module. Following the scenario, students are presented with strategies for addressing the challenge. Students evaluate the usefulness of each strategy.
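To make that item format concrete, here is a minimal sketch in Python of how a scenario-based disposition item might be represented. The scenario text, strategies, rating scale, and all names are invented for illustration; they are not actual TATIL content or code.

```python
# Hypothetical sketch of a scenario-based disposition item; the scenario,
# strategies, and 1-5 usefulness scale are invented, not actual TATIL content.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Strategy:
    text: str                             # one possible response to the challenge
    student_rating: Optional[int] = None  # usefulness rating given by the student


@dataclass
class ScenarioItem:
    scenario: str                         # an ill-defined information literacy challenge
    strategies: List[Strategy] = field(default_factory=list)

    def rate(self, index: int, rating: int) -> None:
        """Record a student's usefulness rating (1 = not useful, 5 = very useful)."""
        if not 1 <= rating <= 5:
            raise ValueError("rating must be between 1 and 5")
        self.strategies[index].student_rating = rating


# Example with invented content:
item = ScenarioItem(
    scenario="You need an article that your library does not license.",
    strategies=[
        Strategy("Request the article through interlibrary loan."),
        Strategy("Cite the article's abstract as if you had read the full text."),
    ],
)
item.rate(0, 5)  # the student judges the first strategy very useful
```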

About the Reports

Threshold Achievement Test reports provide test managers with detailed and robust analyses of student performance. Sections include:

  • Summary results for knowledge and disposition dimensions
  • Detailed results for each knowledge outcome
  • Performance indicator rankings that identify students' relative strengths and weaknesses
  • Performance level indicators ranging from conditionally ready to college ready to research ready
  • Disposition results with descriptions that align with students' scores
  • Breakouts for subgroups such as first year students or transfer students
  • Cross-institutional comparisons with peer institutions and other institutional groupings
  • Suggestions for targeted readings that can assist in following up on the results

Test managers also receive a set of supporting files (a sketch of reading them programmatically follows the list):

  • Test Item document. A PDF document with a description of each test item.
  • Raw data file. Contains all of the scores presented in the report.
  • Student data file. Contains scores for every student.
  • Student data codebook. Describes the demographic options that were configured for the test.
  • Student Report zip file. Contains a directory of PDF documents with an analysis of each student's performance.
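For test managers who want to work with these files directly, here is a minimal sketch, assuming the student data file and codebook are delivered as CSVs. The file names and column names below are hypothetical, not the actual TATIL formats; check the codebook delivered with your report for the real layout.

```python
# Minimal sketch of combining a student data file with its codebook.
# File and column names are hypothetical, not the actual TATIL formats.
import csv

# Load the codebook: maps (question, code) pairs to the demographic
# labels that were configured for the test.
codebook = {}
with open("student_data_codebook.csv", newline="") as f:
    for row in csv.DictReader(f):
        codebook[(row["question"], row["code"])] = row["label"]

# Read per-student scores and translate demographic codes into labels.
with open("student_data.csv", newline="") as f:
    for row in csv.DictReader(f):
        cohort = codebook.get(("class_standing", row["class_standing"]), "Unknown")
        print(row["student_key"], cohort, row["overall_score"])
```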

Test managers have the option to present students with personalized reports upon completing the test. As soon as the student finishes the test, a dynamically generated report is displayed describing the student’s performance and offering recommendations for improvement. The report content is connected directly with the knowledge outcomes, performance indicators, and dispositions of the module being tested.

About the Modules

Two TATIL modules are available now! Two more will come online in 2018. Read brief descriptions below and click on the module titles to see the outcomes, performance indicators, and dispositions. You may also download a PDF document with descriptions for all four modules.

Evaluating Process & Authority (the first module, available now!) focuses on the process of information creation and the constructed and contextual nature of source authority. It assesses how students understand and value authority, how they define their role in evaluating sources, and how they perceive the relative value of different types of sources for common academic needs.

Strategic Searching (the second module, also available now!) focuses on the process of planning, evaluating, and revising searches during strategic exploration. It tests students' ability to recall and apply their knowledge of searching and it tests their metacognition about a core information literacy disposition that underlies their searching behaviors.

Research & Scholarship is the third module and will be available in 2018. It addresses students' ability to apply the research process to their college work in order to participate in the scholarly conversation, and it assesses how students understand and value their role within the scholarly community.

The Value of Information (fourth module, coming in 2018) assesses how students understand and value their role within the information ecosystem. It focuses on the norms of academic information creation and the factors that affect access to information. It tests students' ability to recall and apply their knowledge of information rights and responsibilities and it tests their metacognition about core information literacy dispositions that underlie their behaviors.

Learn More

The Threshold Achievement Test for Information Literacy (TATIL) is a unique and valuable tool to add to your assessment program. Explore the Threshold Achievement Test website to learn more about the test, the cost and requirements for administering the finished modules, and how to participate in field testing for the remaining two modules.

Last week I was fortunate to get to attend and present at LOEX 2017, in Lexington, KY.  I’m excited to have joined the LOEX Board of Trustees this year and it was great to see familiar faces and meet new, energized librarians, too.

I presented a one-hour workshop where I walked participants through a comparison of two common types of results reports from large-scale assessments.  We looked at an example of a rubric-based assessment report and a report from the Evaluating Process and Authority module of the Threshold Achievement Test.  We compared them on the criteria of timeliness, specificity, and actionability, and found that rubric results reports from large-scale assessments often lack the specificity that makes it possible to use assessment results to make plans for instructional improvement.  The TATIL results report, on the other hand, offered many ways to identify areas for improvement and to inform conversations about next steps.  Several librarians from institutions that are committed to using rubrics for large-scale assessment said at the end of the session that the decision between rubrics and tests now seemed more complicated than it had before.  Another librarian commented that rubrics seem like a good fit for assessing outcomes in a course, but perhaps are less useful for assessing outcomes across a program or a whole institution.  It was a rich conversation that also highlighted some confusing elements in the TATIL results report that we are looking forward to addressing in the next revision.

Overall, I came away from LOEX feeling excited about the future of instruction in the IL Framework era.  While the Framework remains an enigma for some of us, presenters at LOEX this year found many ways to make practical, useful connections between their work and the five frames.

Carrick Enterprises has begun to modernize the Project SAILS web site, administrator tools, and reports. This work will continue through the 2017-2018 academic year and will be put into production June 15, 2018. There will be no disruption of service during this work and all existing information will be migrated to the new system.

What’s new:

Peer institution scoring

You will select the tests from your peer institutions to include as a cross-institutional score. This will be reported with all score reporting except your Custom Demographic Questions. You will continue to see cross-institutional scores by institution type; however, you will now be able to include multiple institution types in these scores.
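As a rough illustration of what a cross-institutional score involves, here is a minimal sketch of averaging student scores across a selected peer group. The data, institution names, and scoring scale are invented; this is not the actual Project SAILS implementation.

```python
# Rough sketch of a cross-institutional score: the mean of student scores
# from a chosen set of peer-institution tests. All data here is invented.
from statistics import mean

# Hypothetical student scores from each institution's completed tests.
scores_by_institution = {
    "Institution A": [512, 547, 530],
    "Institution B": [498, 520],
    "Institution C": [560, 555, 541],
}

# The peer group can mix institution types freely.
peer_group = {"Institution A", "Institution C"}

peer_scores = [
    score
    for institution, scores in scores_by_institution.items()
    if institution in peer_group
    for score in scores
]
print(f"Cross-institutional mean: {mean(peer_scores):.1f}")
```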

On-demand Cohort report creation

Cohort reports will no longer be restricted to being created at the end of December and the beginning of June. Once you have stopped testing, you will be able to configure your report for production. As long as all of the tests you have included in your peer institution list are completed, your report will be generated overnight and available to you the following day. We must still receive your payment before you can download your report.

Student reports for Individual Scores

You will have the option to display an individualized analysis of your students’ performance when they complete the test. Students will have the option to download this report as a PDF document. If you choose not to display this report to your students, you will still receive the reports in your report download.

Detailed narrative report for Individual Scores

In addition to student data, you will receive a narrative report analyzing your students’ performance on the test. This report is something that can be shared with your faculty collaborators and your library administration.

Student activity monitoring

You will be able to monitor in real time how far along your students are as they take the test. You will see the Student Identifier (which will be called the Student Key), the start time, and the page each student is currently answering. You will still be able to download a list of Student Keys for students who have completed the test. This list will continue to include the start time, end time, and number of seconds elapsed for each student.
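As a small example of working with that downloaded completion list, here is a sketch that recomputes elapsed time from the start and end timestamps. The file name, column names, and timestamp format are hypothetical; the actual export may differ.

```python
# Sketch of reading a hypothetical completion-list export; the file name,
# columns, and ISO timestamp format are assumptions, not the actual format.
import csv
from datetime import datetime

with open("completed_tests.csv", newline="") as f:
    for row in csv.DictReader(f):
        start = datetime.fromisoformat(row["start_time"])
        end = datetime.fromisoformat(row["end_time"])
        elapsed = (end - start).total_seconds()
        print(row["student_key"], f"{elapsed:.0f}s elapsed")
```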

What’s changing:
