Inspired by conversations in the EdCampSTL Makerspace and with Adam Maltese from Indiana University, I’ve been thinking a lot about how I assess learning in our Makerspace. Here, I propose five ideas for assessing student learning and validating (or invalidating) our approach.
A) Longitudinal Outcomes Study – I have had the rare pleasure of looping with a handful of students for the last three years, and it is likely that many of them will continue in Makerspace throughout high school as well. As a 6th–12th grade school, this opportunity to loop with a cohort of students will likely persist. I would be curious to see growth over time in career and citizenship skills (critical thinking, problem solving, collaboration, leadership, initiative, communication, information analysis, and curiosity). My hypothesis is that Makerspace students would show significantly greater growth in these areas than their peers who were not placed in a Makerspace class. We could also look at data like MAP scores, attendance, and GPA.
B) Portfolios – Here are two example in-progress student portfolios (1 & 2) and my “exemplar.” Note – these are very text heavy. As students apply to college, these will be tweaked for an admissions-officer audience. This approach would assess mastery of design thinking, metacognition, and writing. There is a growing movement to collect student portfolio best practices via MakerEd’s Open Portfolio Project. At the end of the year, students will summarize their learning in a defense-style presentation.
C) Student STEM Attitudes (pre- & post-course survey) – I’ve noticed that when students are told about STEM, they often learn that they are not, cannot be, and do not want to be involved with STEM. This is brutally apparent in media – see: “Famous Scientist.” I propose that when students do STEM within a framework that emphasizes choice and an authentic audience (like Makerspace), their STEM affinity grows.
D) Content Mastery – Makerspace-style projects can be built around content standards. In fact, some believe that to be scalable, Makerspaces must be a component of core content classrooms rather than separate courses. I’d propose that, yes, the student-centered approach to teaching and learning in a makerspace could be integrated into all content areas. However, a separate course offers a unique opportunity for student-led, integrative learning experiences. In the realm of core content alignment, last year I piloted an instrument design project built around the Missouri Science Content Learning Expectations. Students took an aligned pre-test and post-test, and demonstrated significant growth in their understanding of sound and waves.
E) Performance-Based Assessment – There is a growing body of research on assessing the value of undergraduate research experiences, tinkering, and collaboration. One major contributor in this field is Adam Maltese at Indiana University. Perhaps we could create a “Design Thinking” practical exam that students would take at the beginning and end of a course to assess their approach to human-centered problem solving.
How do you assess student learning in a Makerspace?