GlassLab’s assessment engine can be hooked into video games through an API available on a variety of platforms. Players’ choices while playing the game are sent to the assessment engine, which provides multiple views into the metrics generated during each game session. Each player’s performance can then be assessed to determine whether they are actually learning.
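To make that concrete, here is a minimal sketch of what the game-side hook might look like: a client posting one player choice to a remote assessment engine over HTTP. This is not GlassLab's actual SDK; the endpoint URL, event names, and payload shape are all assumptions made for illustration.

```python
# Hypothetical sketch: report one gameplay event (a player choice) to an
# assessment engine over HTTP. Endpoint, event names, and payload are invented.
import json
import time
import urllib.request

ASSESSMENT_ENDPOINT = "https://assessment.example.com/api/sessions/{session_id}/events"

def send_game_event(session_id: str, event_name: str, data: dict) -> int:
    """Post a single player choice to the assessment engine; returns the HTTP status."""
    payload = {
        "event": event_name,       # e.g. "power_plant_built", "zoning_changed"
        "timestamp": time.time(),  # when the choice was made
        "data": data,              # event-specific details the metrics are computed from
    }
    request = urllib.request.Request(
        ASSESSMENT_ENDPOINT.format(session_id=session_id),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example: the player replaced a coal plant with a solar farm.
# send_game_event("session-42", "power_plant_built",
#                 {"type": "solar", "replaced": "coal"})
```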
On Monday I visited Zynga’s San Francisco HQ in the company of Tamas Makany, a learning designer at GlassLab Games. GlassLab is a non-profit put together by the Institute of Play, Electronic Arts, and other entities interested in the intersection of digital games and education.
Creating the right hooks requires not only the API, but also the assistance of GlassLab's learning experts and statisticians to:
- Identify what students are supposed to learn.
- Construct metrics that measure whether this learning is actually taking place.[1]
- Identify (or create) the game mechanics that provide these metrics (see the sketch after this list).
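The following toy illustration ties these three steps together: a learning goal, the game events that can serve as evidence for it, and a metric computed from those events. It is not GlassLab's methodology as implemented; every name, field, and scoring rule is invented.

```python
# Toy illustration of mapping a learning goal to metrics derived from game
# events. All names and scoring rules are assumptions, not GlassLab's design.
from dataclasses import dataclass

@dataclass
class LearningGoal:
    description: str              # what students are supposed to learn
    evidence_events: list[str]    # game mechanics that emit usable evidence

    def score(self, events: list[dict]) -> float:
        """Fraction of relevant choices that reflect the targeted understanding."""
        relevant = [e for e in events if e["event"] in self.evidence_events]
        if not relevant:
            return 0.0
        on_target = sum(1 for e in relevant if e["data"].get("reduces_emissions"))
        return on_target / len(relevant)

# Example: a SimCityEDU-style goal about balancing pollution against power supply.
goal = LearningGoal(
    description="Understands the trade-off between power generation and air pollution",
    evidence_events=["power_plant_built", "power_plant_demolished", "zoning_changed"],
)
session_events = [
    {"event": "power_plant_built", "data": {"type": "solar", "reduces_emissions": True}},
    {"event": "power_plant_built", "data": {"type": "coal", "reduces_emissions": False}},
]
print(goal.score(session_events))  # 0.5 -> mixed evidence so far
```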
GlassLab currently has two games. The first is SimCityEDU, a modified version of SimCity that GlassLab built using the actual SimCity code under license. The game provides multiple missions that start SimCity in a specific state and ask students to solve a particular problem, such as reducing carbon emissions while still supplying the city with enough power. The second game is Mars Generation One: Argubot Academy, an original game from GlassLab featuring squabbling Martians, in which players must assess whether, and how strongly, certain sentences support an argument. The Martians then simulate a debate using the players’ arguments as weapons.
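As a rough sketch of how a mechanic like Argubot Academy's could double as an assessment instrument, imagine the game scoring each claim/evidence pairing a player assembles for the debate. The categories and scoring rules below are entirely made up; the point is only that the player's judgment about evidence strength becomes a metric the engine can record.

```python
# Hedged sketch of an Argubot-style scoring mechanic. Categories and rules
# are invented for illustration; this is not the game's actual logic.
SUPPORT_STRENGTH = {
    "direct_observation": 3,   # evidence states exactly what the claim asserts
    "expert_statement": 2,     # relevant, but secondhand
    "anecdote": 1,             # weakly related
    "irrelevant": 0,           # does not bear on the claim at all
}

def judge_pairing(claim: str, evidence: str, evidence_kind: str) -> dict:
    """Score one claim/evidence pairing the player built for the debate."""
    strength = SUPPORT_STRENGTH.get(evidence_kind, 0)
    return {
        "claim": claim,
        "evidence": evidence,
        "strength": strength,            # recorded as an assessment metric
        "wins_exchange": strength >= 2,  # whether the argument survives a rebuttal
    }

print(judge_pairing(
    claim="The colony should grow food in the greenhouse dome",
    evidence="Soil tests show the dome's soil can support crops",
    evidence_kind="direct_observation",
))
```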
Both games are built to teach lessons based on Common Core educational standards.
[1] ECgD (evidence-centered game design) is based on the principles of evidence-centered design, an assessment methodology for verifying that the learning you think is happening is really happening.