
December Update: How Students Experience the Test


Download sample student report

This semester Carolyn Radcliff and I had the opportunity to discuss the test and the students’ results reports with our own classes or with students in our colleagues’ classes. You can see an example of a student’s personalized results report by clicking the thumbnail to the right. These reports are currently available for the field testing versions of modules 1 and 2 and will be available for the field testing versions of modules 3 and 4 in 2017.

Students’ Responses to their Personalized Results

Our conversations with students gave us a new perspective on the test.   As with any test results, some students were disappointed by their results and others disagreed with the evaluation of their performance, but overall students found value in the reports.  Here are some samples of reflective responses from students:

  • I felt most engaged when the results said that I ‘have the habit of challenging (my) own assumptions.’ That’s something I definitely do and I was surprised that the test was able to detect that.
  • I was most surprised that the report said that I defer to particular kinds of authority a bit more than others; I will be sure to keep the recommendations in mind.
  • It was surprising that I wasn’t as proficient as I thought but I felt most engaged by the results when I learned that most college students are also at my level.
  • It was surprising that the results reminded me to seek out additional perspectives and not only ones that support my claim or topic.
  • The chart of my score was interesting.
  • I felt most engaged at the beginning [of the results report] when they analyzed my results directly by using [the pronoun] ‘you.’
  • The test was beneficial by making me think about the use of different sources.
  • Nothing was surprising, but I did agree with the recommendations to strengthen my writing/reading abilities, which I found very helpful.

Students appreciate receiving their results immediately. In one class where we had promised results, an error on my part during test set-up delayed the reports; students expressed disappointment and were relieved to learn that they would still receive their personalized reports later. Nevertheless, we know that not every testing situation is intended to provide direct feedback to students, so the student reports are an optional feature that you can turn on or off each time you set up the test.

Students’ Responses to the Test Questions

In conversations with students about the different types of items used on the test, we continue to hear that the openness of our dispositional items challenges some students’ assumptions about what tests should ask them to do. The dispositional items present information problems that students are likely to face and ask them to judge the usefulness of suggested strategies. Students who are particularly challenged by these items explain that they are not accustomed to test questions stating that there is no correct answer, which makes them uneasy. Other students felt validated by the dispositional items: they found themselves freed from their usual test anxiety by items that did not require the “right answer” but instead asked them to use their best judgment. In addition to the dispositional scenario items, students also expressed interest in the items that use alternative response methods like matching and categorizing, which require more tactile interaction with the answer options.

Students’ Responses to the Test Content

In one class, students prepared discussion questions after taking the test.  Their questions demonstrate the curiosity and critical thinking that the tests can prompt.  Each one could be the seed for an in-depth class discussion.

  • Is determining the credibility of a source subjective?
  • What other information literacy tests are out there and how do students find and take these tests?
  • What is the difference between accuracy and relevance [in the context of evaluating sources]?
  • Why did the test try and learn our research style and how does it know what we need to improve on?
  • What kind of responsibility do we, as authors of our own papers, have to use accurate data and sources?

When students reflect on the content of the tests overall, they report feeling confident that they are familiar with the concepts covered, but they express uncertainty about where and when they learned the skills being tested. Because information literacy is not often taught explicitly in students’ classes, some students have not thought about their skills, knowledge, and dispositions in this area before taking the test. One student even expressed gratitude for the chance to demonstrate her understanding of how important it is to evaluate sources and her ability to consider complex criteria, because she felt she had never before had an opportunity to highlight those skills and get feedback on them.

Overall, our experiences talking with students about the test underscore its value not just as an assessment but also as an opportunity for developing students’ awareness of their own information literacy.  We look forward to hearing from librarians during field testing and beyond about their experiences discussing the tests with their students.
