Sparsity in an artificial neural network predicts beauty: Towards a model of processing-based aesthetics
Generations of scientists have pursued the goal of defining beauty. While early scientists initially focused on objective criteria of beauty ('feature-based aesthetics'), philosophers and artists alike have since proposed that beauty arises from the interaction between the object and the individual who perceives it.
Saved in:
Published in: | PLoS computational biology 2023-12, Vol.19 (12), p.e1011703-e1011703 |
---|---|
Main authors: | Dibot, Nicolas M; Tieo, Sonia; Mendelson, Tamra C; Puech, William; Renoult, Julien P |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | e1011703 |
---|---|
container_issue | 12 |
container_start_page | e1011703 |
container_title | PLoS computational biology |
container_volume | 19 |
creator | Dibot, Nicolas M; Tieo, Sonia; Mendelson, Tamra C; Puech, William; Renoult, Julien P |
description | Generations of scientists have pursued the goal of defining beauty. While early scientists initially focused on objective criteria of beauty ('feature-based aesthetics'), philosophers and artists alike have since proposed that beauty arises from the interaction between the object and the individual who perceives it. The aesthetic theory of fluency formalizes this idea of interaction by proposing that beauty is determined by the efficiency of information processing in the perceiver's brain ('processing-based aesthetics'), and that efficient processing induces a positive aesthetic experience. The theory is supported by numerous psychological results; however, to date there is no quantitative predictive model to test it on a large scale. In this work, we propose to leverage the capacity of deep convolutional neural networks (DCNNs) to model the processing of information in the brain by studying the link between beauty and neuronal sparsity, a measure of information processing efficiency. Whether analyzing pictures of faces, figurative or abstract art paintings, neuronal sparsity explains up to 28% of variance in beauty scores, and up to 47% when combined with a feature-based metric. However, we also found that sparsity is either positively or negatively correlated with beauty across the multiple layers of the DCNN. Our quantitative model stresses the importance of considering how information is processed, in addition to the content of that information, when predicting beauty, but also suggests an unexpectedly complex relationship between fluency and beauty. |
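The abstract names neuronal sparsity as the measure of processing efficiency but does not state the formula used. A standard population-sparseness index in this literature is the Treves–Rolls measure, which scores a layer's activation vector between 0 (all units equally active) and 1 (a single unit carries all activity). The sketch below is an illustration under that assumption, not the authors' exact implementation; the function name and the example activation vectors are hypothetical.

```python
import numpy as np

def treves_rolls_sparseness(activations):
    """Population sparseness of a non-negative activation vector.

    Returns 0 for a perfectly uniform response and approaches 1 when a
    single unit carries all the activity (sparser response = closer to 1).
    """
    r = np.asarray(activations, dtype=float).ravel()
    r = np.maximum(r, 0.0)  # ReLU-style DCNN activations are non-negative
    n = r.size
    if n < 2 or r.sum() == 0.0:
        return 0.0
    # Treves-Rolls activity ratio: a = (mean r)^2 / mean(r^2)
    a = r.mean() ** 2 / np.mean(r ** 2)
    # Normalize so the score spans [0, 1] regardless of layer size n
    return (1.0 - a) / (1.0 - 1.0 / n)

# Hypothetical layer responses to two images:
uniform = [1.0, 1.0, 1.0, 1.0]        # every unit equally active -> 0.0
one_hot = [0.0, 0.0, 0.0, 5.0]        # one active unit -> 1.0
print(treves_rolls_sparseness(uniform), treves_rolls_sparseness(one_hot))
```

In the study's setup, a score like this would be computed per layer for each image and then correlated with human beauty ratings, which is how a layer-wise relationship (positive in some layers, negative in others) can emerge.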
doi_str_mv | 10.1371/journal.pcbi.1011703 |
format | Article |
fullrecord | Article record. Title: Sparsity in an artificial neural network predicts beauty: Towards a model of processing-based aesthetics. Authors: Dibot, Nicolas M; Tieo, Sonia; Mendelson, Tamra C; Puech, William; Renoult, Julien P. Editor: Fleming, Roland W. Published in: PLoS computational biology, 2023-12-01, Vol.19 (12), p.e1011703. Publisher: Public Library of Science (United States). ISSN: 1553-7358; 1553-734X. EISSN: 1553-7358. DOI: 10.1371/journal.pcbi.1011703. PMID: 38048323. ORCID: 0000-0001-6097-0293; 0000-0001-9383-2401. Rights: Copyright © 2023 Dibot et al.; open access under the Creative Commons Attribution License. Full text: PubMed Central (PMC10721202); DOAJ. Related records: PubMed 38048323; HAL hal-04660672. |
fulltext | fulltext |
identifier | ISSN: 1553-7358 |
ispartof | PLoS computational biology, 2023-12, Vol.19 (12), p.e1011703-e1011703 |
issn | 1553-7358; 1553-734X |
language | eng |
recordid | cdi_plos_journals_3069179451 |
source | MEDLINE; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; Public Library of Science (PLoS); PubMed Central |
subjects | Aesthetics; Artificial Intelligence; Artificial neural networks; Artists; Asymmetry; Beauty; Biology and Life Sciences; Brain; Cognition; Computer and Information Sciences; Computer Science; Datasets; Efficiency; Esthetics; Image Processing; Information processing; Information processing (biology); Judgment - physiology; Neural and Evolutionary Computing; Neural networks; Neural Networks, Computer; Neurons; Neurophysiology; Perceptions; Physical Sciences; Prediction models; Research and Analysis Methods; Scientists; Social Sciences; Sparsity; Symmetry |
title | Sparsity in an artificial neural network predicts beauty: Towards a model of processing-based aesthetics |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-18T13%3A12%3A37IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Sparsity%20in%20an%20artificial%20neural%20network%20predicts%20beauty:%20Towards%20a%20model%20of%20processing-based%20aesthetics&rft.jtitle=PLoS%20computational%20biology&rft.au=Dibot,%20Nicolas%20M&rft.date=2023-12-01&rft.volume=19&rft.issue=12&rft.spage=e1011703&rft.epage=e1011703&rft.pages=e1011703-e1011703&rft.issn=1553-7358&rft.eissn=1553-7358&rft_id=info:doi/10.1371/journal.pcbi.1011703&rft_dat=%3Cgale_plos_%3EA780598982%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3069179451&rft_id=info:pmid/38048323&rft_galeid=A780598982&rft_doaj_id=oai_doaj_org_article_510dbad915904fa99f3c8ce533e8eb69&rfr_iscdi=true |