
GAMEFUL LEARNING PILOT

Learning Management System Pilot for the Duke Game Lab

I worked with an instructional team of three to adapt existing course materials for GradeCraft, a game-based LMS. I was responsible for building and managing the necessary assets, conducting class observations, and evaluating the program's success for all users.

 

THE QUESTIONS

  • How can game-based learning improve learning outcomes in an undergraduate game studies seminar?

  • Is the return on investment sufficient to justify purchasing a software license?

 

MY APPROACH

IMMERSION PHASE

  • Understand stakeholders’ definition of a successful LMS.

  • Develop a working knowledge of the platform and existing best practices.

DESIGN PHASE

  • Adapt materials to a game-based education model.

  • Create assets unique to GradeCraft.

EVALUATION PHASE

  • Assess student and faculty experience of the LMS.

  • Understand whether the LMS contributed to positive learning outcomes.

 

IMMERSION PHASE

What: I interviewed stakeholders to establish benchmarks for success that would determine whether future resources should be invested in the LMS. We reviewed existing use cases to identify which practices best fit, or could be adapted to, the faculty's existing course.


Key takeaways:

  • GradeCraft focuses on non-linear course design, where students can ‘choose their own adventure’ to the grade they’re after, within the limits set by the instructor.

  • Students’ increased agency within this model could allow them to take greater ownership of what they want to learn and how.

  • Gameful learning could drastically increase faculty workloads by allowing students to continually re-do assignments or take on additional assignments to make up for poor performance.

 

DESIGN PHASE

What: I mapped out potential learning pathways based on the faculty's existing syllabus and assignments. This course design let students specialize into tracks so they could work on the skills most relevant to them. We gave each track unique names, badges, and achievements, which I wrote and illustrated for inclusion on the platform.


I set up and maintained the grading and achievement systems throughout the semester, troubleshooting bugs as they arose and adjusting the organization in response to faculty and student feedback.

 

EVALUATION PHASE

What: I evaluated program success at intervals throughout the semester through...

  • Formal surveys of students and faculty administered through Qualtrics at mid-term and the end of the semester.

  • Formal and informal interviews of faculty and students about their experience.

  • Classroom observations, recording questions and challenges students had with the program.

  • Monitoring site use data.

Key takeaways:

  • Both faculty and students faced a substantial learning curve in understanding the platform's functions, particularly features that behaved differently from more familiar LMSs.

  • Most students did not use the features unique to the GradeCraft LMS despite being taught about them in writing and in class.

  • There was no clear indication that the LMS contributed substantively to either the student or the faculty experience.

Overall, I found the results of the study inconclusive because the pandemic required a rapid shift to distance learning halfway through the term. The majority of the gamified syllabus occurred in the final assessment stages, which began around the same time. The data collected could not show whether the challenges were created by the learning conditions or by the course structure itself. Student participation in evaluations also dropped drastically during this period.


I also noted to stakeholders that similar course designs could be administered successfully within the institution's existing LMSs, which would conserve limited resources. Students might also find it easier to navigate an unfamiliar course design while using a familiar LMS. If there were future interest in a second pilot under improved conditions, I suggested incorporating improvements based on early data, such as creating tutorial assignments that require students to access key features.