Enhancing the composition task in text entry studies: Eliciting difficult text and improving error rate calculation

Document Type

Conference Proceeding

Publication Date



Department of Computer Science


Participants in text entry studies usually copy phrases or compose novel messages. A composition task mimics actual user behavior and can allow researchers to better understand how a system might perform in reality. A problem with composition is that participants may gravitate toward writing simple text, that is, text containing only common words. Such simple text is insufficient to explore all factors governing a text entry method, such as its error correction features. We contribute to enhancing composition tasks in two ways. First, we show participants can modulate the difficulty of their compositions based on simple instructions. While it took more time to compose difficult messages, they were longer, had more difficult words, and resulted in more use of error correction features. Second, we compare two methods for obtaining a participant's intended text, comparing both methods with a previously proposed crowdsourced judging procedure. We found participant-supplied references were more accurate.
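Error rate calculation in composition tasks hinges on having an accurate reference for the participant's intended text, since error rates are conventionally derived from the edit distance between the transcribed and intended strings. The sketch below is not the paper's method; it is a minimal illustration of the standard minimum-string-distance error rate, with function names chosen for this example.

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum string distance: fewest insertions, deletions, and
    substitutions needed to turn string a into string b."""
    # prev holds edit distances for the previous row of the DP table
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,        # deletion
                            curr[j - 1] + 1,    # insertion
                            prev[j - 1] + cost))  # substitution/match
        prev = curr
    return prev[-1]

def msd_error_rate(reference: str, transcribed: str) -> float:
    """Character-level error rate: edit distance normalized by the
    longer of the two strings. Depends entirely on the reference,
    which is why obtaining the intended text accurately matters."""
    if not reference and not transcribed:
        return 0.0
    return levenshtein(reference, transcribed) / max(len(reference), len(transcribed))
```

An inaccurate reference (e.g., from a third-party judge misreading intent) inflates or deflates this rate, which motivates comparing methods for recovering the intended text.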

Publication Title

Conference on Human Factors in Computing Systems - Proceedings