Cut! How assessing student learning helped us focus our video production
Library Instruction Services at UT Austin faces the challenge of ensuring that 8,000 new students achieve a set of basic information literacy learning outcomes each year. To maximize the time we spend in the classroom and to meet students at their points of need, we began creating videos that teach concepts and skills ranging from understanding the importance of background information to finding the full text of articles. We planned and recorded the videos following best practices and began including them in pre-session assignments for students and in stand-alone research guides. Our web analytics told us that the videos were being used, but the question remained: did students achieve the desired learning outcomes after watching them?

Approach: To answer this question, we recruited 10 first-year students at the beginning of the fall semester to participate in a video assessment project measuring their learning after watching one of the two most widely used videos we created. In individual sessions, student volunteers were asked to think aloud as they completed a pre-test composed of a series of tasks mirroring the process of finding materials for a typical first-year assignment, and then watched a video walking them through the steps of either finding an article or finding a book on a topic. After watching the video, students walked through the same series of tasks as before, again thinking aloud and explaining their decisions as they did so.

Findings: In the pre-test, only two of five students fulfilled the learning outcomes tied to the video on finding an article for a paper, and four of five students fulfilled the learning outcomes tied to the video on finding a book. In the post-test, four of five students successfully found a relevant article, and four of five students found and explained how to retrieve a relevant book. By analyzing students' comments and performance, we learned that our videos were more successful at teaching conceptual ideas (such as where different kinds of information are published, or how to brainstorm keywords) than at demonstrating navigation or click-by-click instruction.

Practical Implications: Although our assessment project had a very small sample size, we gained valuable information that has guided our decisions about the videos we create. In the future, we will focus on creating videos that explain and reinforce complex ideas and difficult concepts (such as evaluating information) or provide real-life visuals (such as finding a call number in the stacks), rather than screencasts that demonstrate tools. This shift will allow us to spend more time designing instructional tools that help students understand threshold concepts and less time re-recording screencasts of changing database interfaces. As instruction increasingly moves online, it is essential to know which formats work best for different kinds of learning and where to focus our energy to provide the most value. After assessing the learning outcomes tied to our videos, we know better how to facilitate deeper learning.