I was fortunate to get to attend ALA in Orlando. When I’m at ALA, I make sure to always attend the ACRL Instruction Section panel. This year, I was especially interested because the panel took on Authority is Constructed and Contextual, a very rich concept in the Framework that we’ve had many conversations about as we’ve worked on the first module of the test: Evaluating Process and Authority.
The panelists described how they have engaged with the concept of authority in their own teaching and how the Framework has inspired them to think about this concept in new ways. Though the panel itself raised many interesting questions, a comment from the audience particularly piqued my interest. Jessica Critten, from the University of West Georgia, highlighted a gap in librarians’ discourse: we talk about what constitutes evidence, but rarely about how students are taught to understand what they’re doing with the information sources we ask them to evaluate. She clearly identified the implication of the Authority is Constructed and Contextual frame: we evaluate authority for a purpose, and librarians need to engage in more meaningful discussion about those purposes if we are going to do more than leave students with the sense that everything is relative. Jessica has been thinking about these issues for a while. She co-authored a chapter called “Logical Fallacies and Sleight of Mind: Rhetorical Analysis as a Tool for Teaching Critical Thinking” in Not Just Where to Click: Teaching Students How to Think about Information.
Jessica’s remarks showed me a connection that we need to continue to strengthen between our work in libraries and our colleagues’ work in composition studies and rhetoric. Especially at a time of increasing polarization in public discourse, the meaning of concepts like authority, facts, and evidence cannot be taken for granted as neutral constructions that we all define the same way. When I got back from Orlando, I sat down with our Rhetoric and Composition consultant, Richard Hannon, to ask him to elaborate on the connection between the Framework and how he gets students to think critically about facts, evidence, and information sources.
At ALA in Orlando on June 24 and 25, the final cohort of ACRL’s Assessment in Action team leaders will present the results of their assessment projects. This will be the culmination of 15 months of work that they have done on their own campuses and in our community of learners. For me, it will also be the culmination of about three and a half years of collaboration with Deb Gilchrist, Lisa Hinchliffe, Carrie Donovan, and Kara Malenfant, as well as John Watts and Eric Resnis, who joined the team in 2015. I have been a facilitator and curriculum developer for Assessment in Action since the first cohort began in 2013, and I have learned so much about assessment by working with librarians as they designed and implemented their projects.
In particular, I have learned about the value of thinking carefully about my institutional culture and norms when I am weighing different methods of assessment. Since there is no single right answer to the question of what type of assessment method or instrument we should use, the best guidance I have found has been to ask the question: “What will result in findings that we can use to ask new questions about our practice and that we can make meaningful to our colleagues?” Keeping my institution’s priorities in mind helps me to manage the sometimes overwhelming variety of approaches to assessing learning.
I have also learned that perseverance and a willingness to treat assessment as serious play will make it possible for librarians to sustain assessment projects over time. We all know that assessment is not a one-and-done activity, no matter how well designed, and so it is important to see it as a puzzle that we’ll get better at creating and solving as we become more practiced. The most important step toward successful assessment is simply to get started doing something, because the best assessments don’t just answer questions; they also raise new ones, which means there is never a final assessment project. For the AiA team leaders, I know that the results they’re sharing at ALA are just the first step in an ongoing process of learning more about their own contributions to students’ success.
The California Academic & Research Libraries, the state chapter of ACRL, held its biennial conference in Costa Mesa March 31-April 2. I was there to present a poster describing our approach to analyzing the Framework. I outlined our process for developing student outcomes and performance indicators. And I explained the rhetorical analysis that resulted in our four Dispositions: Toleration for Ambiguity, Feeling Responsible to the Community, Productive Persistence, and Mindful Self-Reflection. (You can read more about Richard Hannon’s work with the Dispositions here.)
Earlier that day, pre-conference workshop participants had met with Allison Carr and Talitha Matlin from Cal State University, San Marcos, to grapple with the Framework and apply constructivist teaching approaches to develop new lesson plans and activities. Discussions like these are helping librarians to bridge the divide between their practices and the Framework’s aspirations. It was a wonderful opportunity for librarians to spend some focused time making the Framework practical.
Throughout the conference, what I heard from my colleagues at colleges and universities across the state is that the Framework remains an inspiring but daunting document. We discussed its value as a renewed vision for information literacy and encouraged one another to keep up the challenging but rewarding work of adapting the Framework to our needs. I heard this message from my community college colleagues who were trying to determine how much of the Framework to bite off in any given research session and in their program overall. And I heard it from my research university colleague who is trying to incorporate the Framework’s knowledge practices as well as its dispositions into their campus-wide discussions about assessing institutional student learning outcomes.
The efforts of our advisory board and consultants to distill the Framework in order to create TATIL offer one piece of the foundation on which we will continue to build the future of information literacy.
There's still time to participate in field testing one or more modules this semester. This is a great opportunity to contribute to the effectiveness and rigor of the test. If you're interested, please contact me (firstname.lastname@example.org) or Rick Wiggins (email@example.com) to get started.
We continue to make strides in developing the test. We've just completed cognitive interviews and usability testing for the third test module, and we are writing items for the fourth and final module: The Value of Information. Thanks to our talented team of test question writers, we are making exciting progress.
I had a chance this month to check in with Carrie Donovan to find out what she's thinking about the Framework now that it's been a little more than one year since ACRL filed the document. Carrie is Assistant Dean for Research & Instruction Services at Ferris State University’s Ferris Library. She is a curriculum designer and facilitator for ACRL's Assessment in Action. She also serves as the ACRL Instruction Section Member-at-Large, and as the ACRL Liaisons Training and Development Committee Vice-Chair. Carrie has been a member of the TATIL Advisory Board since 2014.
I asked her the following questions and she shared her insights.
At the ALA Midwinter Meeting this month, the ACRL Board of Directors formally adopted the Framework for Information Literacy for Higher Education. At last year’s meeting, the Board filed the Framework, making it an official document of the organization. By formally adopting it, the Board is signaling to librarians who might have considered their previous action to be ambiguous that the Framework is here to stay. You can read more about the timing of their action at ACRL Insider.
In their post, ACRL’s president, vice-president, and past president acknowledge that there may be a role for standards or outcomes to be created in tandem with the Framework to support librarians’ ongoing assessment efforts. That’s certainly what we’ve found by working on the Threshold Achievement Test of Information Literacy. We’ve spent a lot of time developing and refining the outcomes and performance indicators that we use as the skeleton for our test. These guide how we create test questions because they describe the understanding, critical thinking, problem solving, and dispositions we expect students to demonstrate on the test. We’re excited to see how the ACRL leadership and membership work together in the coming years to define additional outcomes.
If you haven’t checked it out already, visit ACRL’s Framework for Information Literacy blog, where Donna Witek is posting weekly links to scholarship related to the Framework.