Lyricon (Lyrics + Earcons) improves identification of auditory cues
Department of Cognitive and Learning Sciences, Center for Human-Centered Computing, Center for Scalable Architectures and Systems
Auditory researchers have developed various non-speech cues for auditory user interfaces. A preliminary study of “lyricons” (lyrics + earcons) offered a novel approach to designing auditory cues for electronic products by combining two concurrent layers: musical speech and earcons (short musical motives). An experiment on sound-function meaning mapping compared earcons and lyricons. It demonstrated that lyricons enhanced the relevance between sound and meaning significantly more than earcons did. Further analyses of error types and the confusion matrix showed that lyricons yielded a higher identification rate and a shorter mapping time than earcons. Factors affecting auditory cue identification and application directions for lyricons are discussed.
International Conference of Design, User Experience, and Usability 2015, 382-389.
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/742