I’m working on a paper right now on what might be called ‘games for history’, but I’ll admit, I’m stumped. The anonymous reviewer wants to see some stats, some formalized quantitative or qualitative results demonstrating that students have learned something, and a discussion of the metrics used.
So in a mad dash today I’ve been burning the aether, trying to find anything other than anecdotal evidence for something I firmly believe: that game-based learning in the humanities can achieve deep learning.
In one sense, it seems a bit much to demand of game-based learning something we rarely demand of chalk-and-talk or other approaches used in higher ed… but that’s really not a useful response. It should be out there… Any ideas?
Last word to the Federation of American Scientists (FAS):
These higher-order knowledge and skills [learned in games] are typically not revealed by tests of facts, or standards-of-learning-type examinations. Instead of concrete measures of learning outcomes, what is available is typically strong anecdotal evidence — kids that participate in game- and simulation-like learning are very excited, they’re motivated, they’re immersed, and they seem to do better. In addition, games and simulations tend to blur the line between education and training, as they involve learning-by-doing. For example, decision-making may be best assessed in a test of its practical use.
If assessments are not measuring the right skills and knowledge — the higher order skills that games may be able to develop — then the use of educational games and simulations may be viewed as having poor efficacy. In reality, the assessment is designed to measure something other than what the game is designed to teach.
Federation of American Scientists. 2006. Summit on Educational Games: Harnessing the power of video games for learning. http://www.fas.org/gamesummit/Resources/Summit%20on%20Educational%20Games.pdf