Modes. A Multi-sensory Media Experience for Stress Reduction

Bibliographic Details
Published in: The Design Journal, 2017-07, Vol. 20 (sup1), p. S4771-S4773
Main Authors: Fischer, Emily Verba; Hebbeler, John
Format: Article
Language: English
Online Access: Full text
Description
Summary: In societies where productivity is prioritized over presence, stress abounds. The extensive and alarming effects of stress on the mental and physiological wellbeing of college students inspired a cross-disciplinary team to tackle this problem using their combined expertise in visual design, music technology, psychology, art therapy, and mindfulness. The Modes experience is an atmospheric, introspective, and aesthetically sophisticated engagement of three senses: ophthalmoception (sight), audioception (hearing), and tactioception (touch). Through immersive interaction, mesmerizing visual and aural landscapes are generated in order to reduce stress in college students while simultaneously entertaining them. The two measurable outcomes of Modes are (1) the reduction of the stress hormone cortisol in users, and (2) the reduction of user heart rates. The design and functionality of Modes are rooted in tenets of mindfulness practice and Ayurveda, an ancient Indian healing system emphasizing inner balance as a method for maintaining health and wellness (Kiefer, 2016). Interacting with Modes is like playing in a sandbox of dynamic visuals and sounds. Users begin by selecting and entering one of three digital environments, each offering a unique mindfulness practice designed specifically for stress reduction.
ISSN: 1460-6925; 1756-3062
DOI: 10.1080/14606925.2017.1352988