History of the GCAA Makerspace

“A good design is never finished.”  

Just as our students are encouraged to iterate on their solutions based on user testing & feedback, our approach to learning in the Makerspace has evolved significantly as well.


GCAA Makerspace v1.0

Version 1.0 – Spring 2013:  The Disruption Department, via Gregory Hill, piloted Makerspace programming at Grand Center Arts Academy in a small office at the school.  We used a $100 budget to go on a Dollar Tree shopping spree for prototyping materials.  The program was only open to 7th graders on Fridays.  The material constraints meant a focus on learning the framework of Design Thinking rather than specific tools.  On the first day, students walked into an empty room, which they then designed using Design Thinking.


GCAA Makerspace v2.0

Version 2.0 – (2013/2014):  The success of the spring pilot encouraged us to expand to a full space the next year.  This space used a “drop-in” model: students visited much as they would a library, coming as part of a class, during study hall, or before/after school.  Students completed a “Design Proposal Form” that guided their thinking through the product development process before making a prototype and soliciting feedback from other students.  Eventually, students asked to make the Makerspace a regular part of their day, so we piloted the idea of a class (Design Thinking 101) in Spring 2014.


GCAA Makerspace v3.0

Version 3.0 – (2014/2015): We were incredibly fortunate to start this year with a grant from CORTEX that supported a wide variety of Makerspace tool, material, and infrastructure enhancements.  This year we also expanded to a full-course model, adding two middle school & two high school courses and expanding into “The Briefing Room,” which provides a space for students to reflect on and present their learnings.  In October we hosted a “Makerspace Redesign Weekend” where community members specializing in architecture, design, education, and business helped us identify challenges, brainstorm, and prototype solutions to enhance the space.  Solutions included everything from a storage system redesign to a new seating arrangement.  This process is now replicated as a project in one of the middle school courses.


GCAA Makerspace v4.0

Version 4.0 – (2015/2016): We added a third middle school offering.  That class is currently collaborating with the Jefferson National Expansion Memorial to design a scale model of the updated Arch Grounds.  Students now visit the Makerspace via one of five distinct courses offered for Middle School & High School.  Another enhancement was the switch to Chromebooks & Google Classroom, which has improved the quality and quantity of feedback.  Guiding learning in all courses is the framework of Design Thinking as a human-centered approach to problem solving.

Coming Soon:
Version 5.0 – (2016/2017):  Version 5.0 will include a new course offering focused on game design.  We also plan to scale the knowledge, skills, and mindsets of Design Thinking that students use in the Makerspace to other areas within the building.  We’re excited to foster new relationships with community partners to offer students authentic audiences for whom to design solutions.

Five Models of Makerspace Assessment

Inspired by conversations in the EdCampSTL Makerspace and with Adam Maltese from Indiana University, I’ve been thinking a lot about how I assess learning in our Makerspace. Here, I propose five ideas for assessing student learning and validating/invalidating our approach.

A) Longitudinal Outcomes Study – I have had the rare pleasure of looping with a handful of students for the last three years.  It is likely that many of them will continue in Makerspace throughout high school as well.  Because we are a 6th-12th grade school, this opportunity to loop with a cohort of students will likely persist.  I would be curious to see growth over time in career/citizenship skills (critical thinking, problem solving, collaboration, leadership, initiative, communication skills, information analysis, and curiosity).  My hypothesis is that Makerspace students would show significant growth in these areas over their peers who were not placed in a Makerspace class.  We could also look at data like MAP scores, attendance, and GPA.
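
As a sketch of what testing that hypothesis might look like, one could compare score gains between the two groups.  Here’s a minimal illustration in Python (the rubric and all numbers are invented for the example; this is not a study we’ve run):

```python
# Hypothetical sketch: comparing growth on a career/citizenship skills
# rubric between Makerspace students and peers not in a Makerspace class.
# All numbers below are invented for illustration only.
from scipy import stats

# Pre-to-post gains (post score minus pre score) per student
makerspace_gains = [12, 9, 15, 7, 11, 14, 8, 10]
comparison_gains = [5, 7, 4, 9, 6, 3, 8, 5]

# One-sided independent-samples t-test: is the mean gain of the
# Makerspace group significantly greater than the comparison group's?
t_stat, p_value = stats.ttest_ind(makerspace_gains, comparison_gains,
                                  alternative="greater")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```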

B) Portfolios – Here are two example in-progress student portfolios and my “exemplar.”  Note – these are very text heavy.  As students apply to college these will be tweaked for an admissions officer audience.  This approach would assess mastery of design thinking, metacognition, and writing.  There is a growing movement for collecting student portfolio best practices via MakerEd’s Open Portfolio Project.  At the end of the year students will summarize their learnings in a defense-style format.

C) Student STEM Attitudes (pre- & post-course survey) – I’ve noticed that when students are told about STEM, they often learn that they are not, cannot be, and do not want to be involved with STEM.  This is brutally apparent in media – see the Google Image results for “Famous Scientist.”  I propose that when students do STEM within a framework that emphasizes choice and an authentic audience (like Makerspace), their STEM affinity grows.


Our vision for the future of the “Famous Entrepreneur” Google Image Search.


D) Content Mastery – Makerspace-style projects can be built around content standards.  In fact, some believe that to be scalable, Makerspaces must be a component of core content classrooms rather than separate spaces.  I’d propose that, yes, the student-centered approach to teaching & learning in a makerspace could be integrated into all content areas.  However, a separate course offers a unique opportunity for student-led, integrative learning experiences.  In the realm of core content alignment, last year I piloted an instrument design project built around the Missouri Science Content Learning Expectations.  Students took an aligned pre-test & post-test and demonstrated significant growth in their understanding of sound & waves.
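
For anyone curious how that pre/post growth might be checked, here’s a minimal sketch (invented scores, not our actual data; it assumes a paired design where each student takes both tests):

```python
# Hypothetical sketch: paired pre-test/post-test comparison for a
# content-aligned project (e.g., sound & waves). Scores are invented.
from scipy import stats

pre_scores  = [45, 52, 38, 60, 47, 55, 41, 50]   # percent correct, before
post_scores = [68, 75, 59, 82, 70, 77, 63, 74]   # percent correct, after

# One-sided paired t-test: did the same students score significantly
# higher on the post-test than on the pre-test?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores,
                                  alternative="greater")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```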

E) Performance Based Assessment – There is a growing body of research in the realm of assessing the value of undergraduate research experience, tinkering, and collaboration.  One major contributor in this field is Adam Maltese at Indiana U.  Perhaps we could create a “Design Thinking” practical exam that students would take at the beginning & end of a course to assess their approach to human-centered problem solving.

How do you assess student learning in a Makerspace?

Why Design Thinking?

Soliciting user feedback.

There are various approaches to learning in a Makerspace.  In the GCAA Makerspace we primarily use Design Thinking, a human-centered framework where products and processes are designed entirely around & for an end-user.  

We use design thinking for four reasons:

1) It’s a simple process that takes lots of practice to master (like learning an instrument or pursuing any of the other arts pathways at our school).

2) It’s applicable to solving “bugs” that students encounter on a daily basis.  “How might we design a way to reduce tardies?”

3) It’s relevant to any profession: education, computer programming, business, medicine, etc.  Almost any real-world problem can likely be solved with a design thinking approach.

4) Given the “human-centered” approach, students learn about themselves and others in an authentic way.

When visitors tour the makerspace, they often initially perceive our work to be building balsa bridges, simple machines, and Rube Goldberg devices.  While there is definitely value in non-human-centered engineering projects, we’ve shifted away from them.  I found that the competitive nature of engineering challenges tends to discourage some students and to work against the iterative nature of design thinking.  I’m often reminded of the Marshmallow Challenge – how motivated would participants be to build another tower after working 18 minutes on that first one?  I’ve found that students are way more motivated to build, re-build, re-empathize, and re-ideate when there is an actual human user of their product.

For many engineering challenges, the “empathy” portion of design thinking could be switched to “research.”  Rather than observing, interviewing, or immersing oneself in the shoes of a user, students could collect and condense background information on the problem.  If you do this, be cautious about students who get “stuck” in the research phase.  Remember the marshmallow challenge and how a bias toward action tends to result in the best product!  Prototype <-> Test!  After the research phase, students could define the challenge, ideate solutions, prototype, test, and iterate.

Alternatively, most engineering challenges (as they’re presented in curriculum documents) could be shifted to a human-centered design challenge.  In fact, I used design thinking to work through how one might do that.

Empathy:  A teacher wants to set up a catapult contest, but acknowledges that it doesn’t necessarily align with Design Thinking.

Define:  How might we turn a catapult launch into a Design Thinking challenge?

Ideate:

-Students could design catapult related games for users to play.

-Students could design a user-friendly manual for operation of their catapults.

-The challenge could be “get your ball to the target in the most interesting way” with users voting & providing feedback.

Today in Makerspace

Thanks to Rob at Hixon Middle School for inspiring today’s post.  Check out the great things happening in their Makerspace here.

Draw bots become battle bots.

Prototype of a time management system.

Human drum machine via Scratch & Makey Makey.