Lyricon (Lyrics + Earcons) improves identification of auditory cues

Yuanjing Sun, Michigan Technological University
Myounghoon Jeon, Michigan Technological University

Copyright © 2015, Springer Nature. Publisher's version of record: https://doi.org/10.1007/978-3-319-20898-5_37

Abstract

Auditory researchers have developed various non-speech cues for designing auditory user interfaces. A preliminary study of "lyricons" (lyrics + earcons [1]) offered a novel approach to devising auditory cues for electronic products by combining two concurrent layers: musical speech and earcons (short musical motives). An experiment on sound-to-function meaning mapping was conducted comparing earcons and lyricons. It demonstrated that lyricons enhanced the relevance between sound and meaning significantly more than earcons did. Further analyses of error types and the confusion matrix showed that lyricons yielded a higher identification rate and a shorter mapping time than earcons. Factors affecting auditory cue identification and directions for applying lyricons are discussed.