Lyricon (Lyrics + Earcons) improves identification of auditory cues

Document Type

Article

Publication Date

7-21-2015

Department

Department of Cognitive and Learning Sciences; Center for Human-Centered Computing; Center for Scalable Architectures and Systems

Abstract

Auditory researchers have developed various non-speech cues for auditory user interfaces. A preliminary study of “lyricons” (lyrics + earcons [1]) provided a novel approach to designing auditory cues for electronic products by combining two concurrent layers: musical speech and earcons (short musical motives). An experiment on sound-to-function meaning mapping was conducted comparing earcons and lyricons. It demonstrated that lyricons enhanced the relevance between sound and meaning significantly more than earcons did. Further analyses of error types and the confusion matrix revealed that lyricons yielded a higher identification rate and a shorter mapping time than earcons. Factors affecting auditory cue identification and directions for applying lyricons are discussed.
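
The identification-rate analysis mentioned above rests on a confusion matrix of intended versus selected functions. As a minimal sketch only (not the authors' actual analysis), with hypothetical function labels and counts, per-cue and overall identification rates could be computed like this:

import numpy as np

# Hypothetical confusion matrix for a sound-to-function mapping task:
# rows = intended function of the auditory cue, columns = function the
# participant chose. All labels and counts below are illustrative.
functions = ["power on", "power off", "volume up", "volume down"]
confusion = np.array([
    [18, 1, 1, 0],
    [2, 16, 0, 2],
    [1, 0, 17, 2],
    [0, 2, 3, 15],
])

# Identification rate per cue = correct responses / total responses for that cue.
per_cue_rate = np.diag(confusion) / confusion.sum(axis=1)

# Overall identification rate = all correct responses / all responses.
overall_rate = np.trace(confusion) / confusion.sum()

for name, rate in zip(functions, per_cue_rate):
    print(f"{name}: {rate:.2%}")
print(f"overall: {overall_rate:.2%}")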

Publisher's Statement

© Springer International Publishing Switzerland 2015. Publisher's version of record: https://doi.org/10.1007/978-3-319-20898-5_37

Publication Title

International Conference of Design, User Experience, and Usability 2015