Digital Games for Creativity Assessment: Strengths, Weaknesses and Opportunities
Creativity assessments should be valid, reliable, and scalable to support various stakeholders (e.g., policy-makers, educators, corporations, and the general public) in their decision-making processes. Established initiatives toward scalable creativity assessments have relied on well-studied standardized tests. Although robust in many ways, most of these tests adopt unnatural and unmotivating environments for expression of creativity, mainly observe coarse-grained snippets of the creative process, and rely on subjective, resource-intensive, human-expert evaluations. This article presents a literature review of game-based creativity assessment and discusses how digital games can potentially address the limitations of traditional testing. Based on an original sample of 127 papers, this article contributes an in-depth review of 16 papers on 11 digital creativity assessment games. Despite the relatively small sample, a wide variety of design decisions are covered. Major findings and recommendations include identifying (1) a disconnect between the potential of scaling up assessment of creativity with the use of digital games, and the actual reach achieved in the examined studies, (2) the need for complementary methods such as stealth assessment, algorithmic support and crowdsourcing when designing creativity assessment games, and (3) a need for interdisciplinary dialogs to produce, validate and implement creativity assessment games at scale.
Saved in:
Published in: | Creativity research journal, 2022-01, Vol. 34 (1), p. 28-54 |
---|---|
Main authors: | Rafner, Janet; Biskjær, Michael Mose; Zana, Blanka; Langsford, Steven; Bergenholtz, Carsten; Rahimi, Seyedahmad; Carugati, Andrea; Noy, Lior; Sherson, Jacob |
Format: | Article |
Language: | English |
Subjects: | Computer Assisted Testing; Creativity; Design; Evaluation Methods; Motivation; Scoring; Student Evaluation; Test Validity; Video Games |
Online access: | Full text |
container_end_page | 54 |
---|---|
container_issue | 1 |
container_start_page | 28 |
container_title | Creativity research journal |
container_volume | 34 |
creator | Rafner, Janet; Biskjær, Michael Mose; Zana, Blanka; Langsford, Steven; Bergenholtz, Carsten; Rahimi, Seyedahmad; Carugati, Andrea; Noy, Lior; Sherson, Jacob |
description | Creativity assessments should be valid, reliable, and scalable to support various stakeholders (e.g., policy-makers, educators, corporations, and the general public) in their decision-making processes. Established initiatives toward scalable creativity assessments have relied on well-studied standardized tests. Although robust in many ways, most of these tests adopt unnatural and unmotivating environments for expression of creativity, mainly observe coarse-grained snippets of the creative process, and rely on subjective, resource-intensive, human-expert evaluations. This article presents a literature review of game-based creativity assessment and discusses how digital games can potentially address the limitations of traditional testing. Based on an original sample of 127 papers, this article contributes an in-depth review of 16 papers on 11 digital creativity assessment games. Despite the relatively small sample, a wide variety of design decisions are covered. Major findings and recommendations include identifying (1) a disconnect between the potential of scaling up assessment of creativity with the use of digital games, and the actual reach achieved in the examined studies, (2) the need for complementary methods such as stealth assessment, algorithmic support and crowdsourcing when designing creativity assessment games, and (3) a need for interdisciplinary dialogs to produce, validate and implement creativity assessment games at scale. |
doi_str_mv | 10.1080/10400419.2021.1971447 |
format | Article |
fullrecord | <record><control><sourceid>proquest_cross</sourceid><recordid>TN_cdi_crossref_primary_10_1080_10400419_2021_1971447</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><ericid>EJ1332492</ericid><sourcerecordid>2633103868</sourcerecordid><originalsourceid>FETCH-LOGICAL-c360t-ad2a1ea5832167f31fb147f027f905612e8de439e7408da5a2de10fe1dd44f633</originalsourceid><addsrcrecordid>eNp9kE1Lw0AQhoMoWKs_oRDwaurM7ubLk6XWqhQqqHhc1mS2bk2TurtV-u9NSfXoaQbe552BJwgGCEOEDC4RBIDAfMiA4RDzFIVID4IexpxFSc7FYbu3TLSDjoMT55YAkDIBveDxxiyMV1U4VStyoW5sOLakvPkyfhuOnCPnVlT7q_DJW6oX_t1dhK-kPmraZaGqy3C-XjfWb2rjDbnT4EirytHZfvaDl9vJ8_gums2n9-PRLCp4Aj5SJVNIKs44wyTVHPUbilQDS3UOcYKMspIEzykVkJUqVqwkBE1YlkLohPN-cN7dXdvmc0POy2WzsXX7UrI2RuBZkrVU3FGFbZyzpOXampWyW4kgd_Lkrzy5kyf38treoOuRNcVfZ_KAnDORsza_7nJTt8pW6ruxVSm92laN1VbVhXGS___iBx1tfwc</addsrcrecordid><sourcetype>Aggregation Database</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>2633103868</pqid></control><display><type>article</type><title>Digital Games for Creativity Assessment: Strengths, Weaknesses and Opportunities</title><source>Education Source</source><creator>Rafner, Janet ; Biskjær, Michael Mose ; Zana, Blanka ; Langsford, Steven ; Bergenholtz, Carsten ; Rahimi, Seyedahmad ; Carugati, Andrea ; Noy, Lior ; Sherson, Jacob</creator><creatorcontrib>Rafner, Janet ; Biskjær, Michael Mose ; Zana, Blanka ; Langsford, Steven ; Bergenholtz, Carsten ; Rahimi, Seyedahmad ; Carugati, Andrea ; Noy, Lior ; Sherson, Jacob</creatorcontrib><description>Creativity assessments should be valid, reliable, and scalable to support various stakeholders (e.g., policy-makers, educators, corporations, and the general public) in their decision-making processes. Established initiatives toward scalable creativity assessments have relied on well-studied standardized tests. Although robust in many ways, most of these tests adopt unnatural and unmotivating environments for expression of creativity, mainly observe coarse-grained snippets of the creative process, and rely on subjective, resource-intensive, human-expert evaluations. This article presents a literature review of game-based creativity assessment and discusses how digital games can potentially address the limitations of traditional testing. Based on an original sample of 127 papers, this article contributes an in-depth review of 16 papers on 11 digital creativity assessment games. Despite the relatively small sample, a wide variety of design decisions are covered. 
Major findings and recommendations include identifying (1) a disconnect between the potential of scaling up assessment of creativity with the use of digital games, and the actual reach achieved in the examined studies (2) the need for complementary methods such as stealth assessment, algorithmic support and crowdsourcing when designing creativity assessment games, and (3) a need for interdisciplinary dialogs to produce, validate and implement creativity assessment games at scale.</description><identifier>ISSN: 1040-0419</identifier><identifier>EISSN: 1532-6934</identifier><identifier>DOI: 10.1080/10400419.2021.1971447</identifier><language>eng</language><publisher>Philadelphia: Routledge</publisher><subject>Computer Assisted Testing ; Creativity ; Design ; Evaluation Methods ; Motivation ; Scoring ; Student Evaluation ; Test Validity ; Video Games</subject><ispartof>Creativity research journal, 2022-01, Vol.34 (1), p.28-54</ispartof><rights>2021 Taylor & Francis Group, LLC 2021</rights><rights>2021 Taylor & Francis Group, LLC</rights><lds50>peer_reviewed</lds50><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c360t-ad2a1ea5832167f31fb147f027f905612e8de439e7408da5a2de10fe1dd44f633</citedby><cites>FETCH-LOGICAL-c360t-ad2a1ea5832167f31fb147f027f905612e8de439e7408da5a2de10fe1dd44f633</cites><orcidid>0000-0001-6048-587X ; 0000-0001-9264-3334</orcidid></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><link.rule.ids>314,777,781,27905,27906</link.rule.ids><backlink>$$Uhttp://eric.ed.gov/ERICWebPortal/detail?accno=EJ1332492$$DView record in ERIC$$Hfree_for_read</backlink></links><search><creatorcontrib>Rafner, Janet</creatorcontrib><creatorcontrib>Biskjær, Michael Mose</creatorcontrib><creatorcontrib>Zana, Blanka</creatorcontrib><creatorcontrib>Langsford, Steven</creatorcontrib><creatorcontrib>Bergenholtz, Carsten</creatorcontrib><creatorcontrib>Rahimi, Seyedahmad</creatorcontrib><creatorcontrib>Carugati, Andrea</creatorcontrib><creatorcontrib>Noy, Lior</creatorcontrib><creatorcontrib>Sherson, Jacob</creatorcontrib><title>Digital Games for Creativity Assessment: Strengths, Weaknesses and Opportunities</title><title>Creativity research journal</title><description>Creativity assessments should be valid, reliable, and scalable to support various stakeholders (e.g., policy-makers, educators, corporations, and the general public) in their decision-making processes. Established initiatives toward scalable creativity assessments have relied on well-studied standardized tests. Although robust in many ways, most of these tests adopt unnatural and unmotivating environments for expression of creativity, mainly observe coarse-grained snippets of the creative process, and rely on subjective, resource-intensive, human-expert evaluations. This article presents a literature review of game-based creativity assessment and discusses how digital games can potentially address the limitations of traditional testing. Based on an original sample of 127 papers, this article contributes an in-depth review of 16 papers on 11 digital creativity assessment games. Despite the relatively small sample, a wide variety of design decisions are covered. 
Major findings and recommendations include identifying (1) a disconnect between the potential of scaling up assessment of creativity with the use of digital games, and the actual reach achieved in the examined studies (2) the need for complementary methods such as stealth assessment, algorithmic support and crowdsourcing when designing creativity assessment games, and (3) a need for interdisciplinary dialogs to produce, validate and implement creativity assessment games at scale.</description><subject>Computer Assisted Testing</subject><subject>Creativity</subject><subject>Design</subject><subject>Evaluation Methods</subject><subject>Motivation</subject><subject>Scoring</subject><subject>Student Evaluation</subject><subject>Test Validity</subject><subject>Video Games</subject><issn>1040-0419</issn><issn>1532-6934</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2022</creationdate><recordtype>article</recordtype><recordid>eNp9kE1Lw0AQhoMoWKs_oRDwaurM7ubLk6XWqhQqqHhc1mS2bk2TurtV-u9NSfXoaQbe552BJwgGCEOEDC4RBIDAfMiA4RDzFIVID4IexpxFSc7FYbu3TLSDjoMT55YAkDIBveDxxiyMV1U4VStyoW5sOLakvPkyfhuOnCPnVlT7q_DJW6oX_t1dhK-kPmraZaGqy3C-XjfWb2rjDbnT4EirytHZfvaDl9vJ8_gums2n9-PRLCp4Aj5SJVNIKs44wyTVHPUbilQDS3UOcYKMspIEzykVkJUqVqwkBE1YlkLohPN-cN7dXdvmc0POy2WzsXX7UrI2RuBZkrVU3FGFbZyzpOXampWyW4kgd_Lkrzy5kyf38treoOuRNcVfZ_KAnDORsza_7nJTt8pW6ruxVSm92laN1VbVhXGS___iBx1tfwc</recordid><startdate>20220102</startdate><enddate>20220102</enddate><creator>Rafner, Janet</creator><creator>Biskjær, Michael Mose</creator><creator>Zana, Blanka</creator><creator>Langsford, Steven</creator><creator>Bergenholtz, Carsten</creator><creator>Rahimi, Seyedahmad</creator><creator>Carugati, Andrea</creator><creator>Noy, Lior</creator><creator>Sherson, Jacob</creator><general>Routledge</general><general>Taylor & Francis Ltd</general><scope>7SW</scope><scope>BJH</scope><scope>BNH</scope><scope>BNI</scope><scope>BNJ</scope><scope>BNO</scope><scope>ERI</scope><scope>PET</scope><scope>REK</scope><scope>WWN</scope><scope>AAYXX</scope><scope>CITATION</scope><orcidid>https://orcid.org/0000-0001-6048-587X</orcidid><orcidid>https://orcid.org/0000-0001-9264-3334</orcidid></search><sort><creationdate>20220102</creationdate><title>Digital Games for Creativity Assessment: Strengths, Weaknesses and Opportunities</title><author>Rafner, Janet ; Biskjær, Michael Mose ; Zana, Blanka ; Langsford, Steven ; Bergenholtz, Carsten ; Rahimi, Seyedahmad ; Carugati, Andrea ; Noy, Lior ; Sherson, Jacob</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c360t-ad2a1ea5832167f31fb147f027f905612e8de439e7408da5a2de10fe1dd44f633</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2022</creationdate><topic>Computer Assisted Testing</topic><topic>Creativity</topic><topic>Design</topic><topic>Evaluation Methods</topic><topic>Motivation</topic><topic>Scoring</topic><topic>Student Evaluation</topic><topic>Test Validity</topic><topic>Video Games</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Rafner, Janet</creatorcontrib><creatorcontrib>Biskjær, Michael Mose</creatorcontrib><creatorcontrib>Zana, Blanka</creatorcontrib><creatorcontrib>Langsford, Steven</creatorcontrib><creatorcontrib>Bergenholtz, Carsten</creatorcontrib><creatorcontrib>Rahimi, Seyedahmad</creatorcontrib><creatorcontrib>Carugati, Andrea</creatorcontrib><creatorcontrib>Noy, Lior</creatorcontrib><creatorcontrib>Sherson, 
Jacob</creatorcontrib><collection>ERIC</collection><collection>ERIC (Ovid)</collection><collection>ERIC</collection><collection>ERIC</collection><collection>ERIC (Legacy Platform)</collection><collection>ERIC( SilverPlatter )</collection><collection>ERIC</collection><collection>ERIC PlusText (Legacy Platform)</collection><collection>Education Resources Information Center (ERIC)</collection><collection>ERIC</collection><collection>CrossRef</collection><jtitle>Creativity research journal</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Rafner, Janet</au><au>Biskjær, Michael Mose</au><au>Zana, Blanka</au><au>Langsford, Steven</au><au>Bergenholtz, Carsten</au><au>Rahimi, Seyedahmad</au><au>Carugati, Andrea</au><au>Noy, Lior</au><au>Sherson, Jacob</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><ericid>EJ1332492</ericid><atitle>Digital Games for Creativity Assessment: Strengths, Weaknesses and Opportunities</atitle><jtitle>Creativity research journal</jtitle><date>2022-01-02</date><risdate>2022</risdate><volume>34</volume><issue>1</issue><spage>28</spage><epage>54</epage><pages>28-54</pages><issn>1040-0419</issn><eissn>1532-6934</eissn><abstract>Creativity assessments should be valid, reliable, and scalable to support various stakeholders (e.g., policy-makers, educators, corporations, and the general public) in their decision-making processes. Established initiatives toward scalable creativity assessments have relied on well-studied standardized tests. Although robust in many ways, most of these tests adopt unnatural and unmotivating environments for expression of creativity, mainly observe coarse-grained snippets of the creative process, and rely on subjective, resource-intensive, human-expert evaluations. This article presents a literature review of game-based creativity assessment and discusses how digital games can potentially address the limitations of traditional testing. Based on an original sample of 127 papers, this article contributes an in-depth review of 16 papers on 11 digital creativity assessment games. Despite the relatively small sample, a wide variety of design decisions are covered. Major findings and recommendations include identifying (1) a disconnect between the potential of scaling up assessment of creativity with the use of digital games, and the actual reach achieved in the examined studies (2) the need for complementary methods such as stealth assessment, algorithmic support and crowdsourcing when designing creativity assessment games, and (3) a need for interdisciplinary dialogs to produce, validate and implement creativity assessment games at scale.</abstract><cop>Philadelphia</cop><pub>Routledge</pub><doi>10.1080/10400419.2021.1971447</doi><tpages>27</tpages><orcidid>https://orcid.org/0000-0001-6048-587X</orcidid><orcidid>https://orcid.org/0000-0001-9264-3334</orcidid></addata></record> |
fulltext | fulltext |
identifier | ISSN: 1040-0419 |
ispartof | Creativity research journal, 2022-01, Vol.34 (1), p.28-54 |
issn | 1040-0419; 1532-6934 |
language | eng |
recordid | cdi_crossref_primary_10_1080_10400419_2021_1971447 |
source | Education Source |
subjects | Computer Assisted Testing; Creativity; Design; Evaluation Methods; Motivation; Scoring; Student Evaluation; Test Validity; Video Games |
title | Digital Games for Creativity Assessment: Strengths, Weaknesses and Opportunities |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-18T20%3A09%3A53IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Digital%20Games%20for%20Creativity%20Assessment:%20Strengths,%20Weaknesses%20and%20Opportunities&rft.jtitle=Creativity%20research%20journal&rft.au=Rafner,%20Janet&rft.date=2022-01-02&rft.volume=34&rft.issue=1&rft.spage=28&rft.epage=54&rft.pages=28-54&rft.issn=1040-0419&rft.eissn=1532-6934&rft_id=info:doi/10.1080/10400419.2021.1971447&rft_dat=%3Cproquest_cross%3E2633103868%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2633103868&rft_id=info:pmid/&rft_ericid=EJ1332492&rfr_iscdi=true |
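The `url` field above is an OpenURL 1.0 query (Z39.88-2004, key/encoded-value format), which is how a link resolver such as SFX receives the citation metadata for full-text routing. As a minimal sketch, the Python snippet below recovers the citation from the `rft.*` (referent) parameters using only the standard library; the URL string is an abridged copy of the record's `url` field, shortened here for readability.

```python
from urllib.parse import parse_qs, urlsplit

# Abridged copy of this record's `url` field: an OpenURL 1.0
# (Z39.88-2004) query in key/encoded-value (KEV) format.
# The citation itself travels in the `rft.*` parameters.
openurl = (
    "https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004"
    "&rft.genre=article"
    "&rft.atitle=Digital%20Games%20for%20Creativity%20Assessment"
    "&rft.jtitle=Creativity%20research%20journal"
    "&rft.au=Rafner,%20Janet"
    "&rft.date=2022-01-02&rft.volume=34&rft.issue=1"
    "&rft.spage=28&rft.epage=54&rft.issn=1040-0419"
    "&rft_id=info:doi/10.1080/10400419.2021.1971447"
)

# parse_qs percent-decodes the values and groups repeated keys into lists.
params = parse_qs(urlsplit(openurl).query)

# Keep only the referent (rft.) keys, i.e. the citation fields;
# rft_id (the DOI) uses an underscore, so it is filtered out here.
citation = {key.split(".", 1)[1]: values[0]
            for key, values in params.items() if key.startswith("rft.")}

print(citation["jtitle"], citation["volume"], citation["spage"])
# Creativity research journal 34 28
```

The same keys can be read off the full `url` field directly; every value in the sketch comes from this record.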