G4C Industry Circle: How GlassLab’s analytics support impact at scale


RSVP HERE TO JOIN G4C’S GOOGLE HANGOUT WITH GLASSLAB

The GlassLab Analytics Engine: Supporting Impact at Scale

By Paula Escuadra, Head of Content Partnerships

Games can empower players to take on new perspectives and challenges, and to face both failure and success. As part of fundamental game design, this sense of agency and exploration is supported by the mechanics of a “core loop”: a sequence of critical verbs that loops back on itself [1].

In game design, this is important because repetition enables the mastery of a concept, whether simple or complex. While the loop in an entertainment game builds interest and retention, the loop in a learning game intertwines engagement with instructional interaction.

For instance, the adventure role-playing game Mars Generation One: Argubot Academy EDU, developed in collaboration with NASA and the National Writing Project, teaches argumentation skills through robotic battles. Targeted argumentation skills were informed by existing Educational Testing Service and Common Core standards in argumentation and writing.


Looping Data, Gameplay, and Outcomes Together

The game mechanics are as follows: find evidence; construct the argument (equip your argubot); and critique the argument (win an argubot battle). As players proceed through the core loops of the game, the robot battles become more complex; as players progress, so can their understanding and mastery of how to create a valid argument.
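The three verbs above can be sketched as code. This is a minimal, hypothetical illustration of the core loop, assuming invented function names (`find_evidence`, `equip_argubot`, `battle`) rather than GlassLab's actual implementation:

```python
def find_evidence(difficulty):
    # Stand-in: harder rounds require more pieces of evidence.
    return [f"evidence_{i}" for i in range(difficulty)]

def equip_argubot(evidence):
    # Construct the argument: pair a claim with its supporting evidence.
    return {"claim": "the colony should grow its own food", "evidence": evidence}

def battle(argubot, difficulty):
    # Critique the argument: a valid argument needs enough evidence to win.
    return len(argubot["evidence"]) >= difficulty

def run_core_loop(rounds=3):
    """Each pass repeats the same three verbs at higher complexity,
    mirroring how repetition builds mastery."""
    mastery = 0
    for difficulty in range(1, rounds + 1):
        evidence = find_evidence(difficulty)   # 1. find evidence
        argubot = equip_argubot(evidence)      # 2. construct the argument
        if battle(argubot, difficulty):        # 3. critique the argument
            mastery += difficulty              # mastery grows with complexity
    return mastery
```

The point of the sketch is the shape, not the details: the same verbs repeat in a cycle, and each iteration raises the stakes.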

Digital games afford deeper and richer streams of data to assess players’ learning and improvement [2], increasing opportunities and impact for players. A major challenge game developers and educators run into, however, lies in identifying meaningful learning evidence within thousands of virtual data points. How do we make all of this information useful?

The GlassLab Analytics Engine was created to empower developers to systematically connect in-game events to evidence of learning and visualize it in a way that is easy to use in any learning environment. It supports the alignment of learning design with system-wide data structures that enable powerful learning insights [3]. Beyond that, the Engine facilitates the more effective onboarding of high-quality digital games onto GlassLab Games, also known as the STEAM engine of learning games.


When GlassLab Game Services first launched, connecting these in-game events to learning reports was time-consuming: a data engineer hooked each reporting event to individual pieces of data, one by one. The thoughtful design of how to visualize learning events in easy-to-use reports is still critical, but the pain of implementing those reports has been greatly reduced.
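One way to picture the shift away from hand-wiring each event is a declarative mapping from telemetry events to learning-evidence categories. This is a hypothetical sketch; the event names and report fields are invented, not GlassLab's actual schema:

```python
# Declarative map: which raw game events count as evidence of which skill.
EVENT_TO_EVIDENCE = {
    "evidence_collected": "evidence_gathering",
    "argubot_equipped": "argument_construction",
    "battle_won": "argument_critique",
}

def build_report(telemetry):
    """Aggregate a raw event stream into per-skill counts,
    ignoring events that carry no learning evidence."""
    report = {skill: 0 for skill in EVENT_TO_EVIDENCE.values()}
    for event in telemetry:
        skill = EVENT_TO_EVIDENCE.get(event["type"])
        if skill:
            report[skill] += 1
    return report

events = [
    {"type": "evidence_collected"},
    {"type": "evidence_collected"},
    {"type": "battle_won"},
    {"type": "ui_click"},  # not learning evidence; ignored
]
```

With a table like this, adding a new reporting event means adding one entry to the map rather than writing new plumbing code, which is the kind of reduction in integration effort the post describes.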

Soon, GlassLab will be releasing new tools that make it easier for developers to connect the data from what happens in-game to tangible learning outcomes — potentially reducing a typical two-month integration to an eighth of that time (as little as a single week!).

By applying what GlassLab has learned during its three-year research period, the studio hopes to streamline the creation of data-powered learning reports into archetypal structures. New developers can simply plug into these archetypes, implement them according to their games’ natural structure, and begin producing reports that GlassLab’s design process helps connect to content and learning standards.
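An "archetypal structure" of this kind might look like a reusable report shape that any game with staged challenges can adopt. The class and method names below are assumptions for illustration, not GlassLab's actual API:

```python
class ProgressionReport:
    """Archetype: any game whose learning design is a sequence of staged
    challenges can plug its own stage names into this report shape."""

    def __init__(self, stages):
        self.stages = list(stages)
        self.completed = []

    def record(self, stage, success):
        # Each game maps its own in-game events onto record() calls.
        if success and stage in self.stages and stage not in self.completed:
            self.completed.append(stage)

    def summary(self):
        # Fraction of the game's staged challenges completed so far.
        return {"progress": len(self.completed) / len(self.stages)}

# Usage: a new game declares its stages and feeds in outcomes.
report = ProgressionReport(["battle_1", "battle_2"])
report.record("battle_1", True)
report.record("battle_2", False)
```

The developer supplies only the game-specific stage names and event hookups; the report structure, and its connection to standards, comes from the archetype.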
