Research on human-centered computing has rapidly expanded from user interface design and evaluation to the creation of entire user experiences and even novel lifestyles and human values. The design, construction, and evaluation of computational technologies should therefore account for people's capabilities, limitations, and environments, and should reflect how these technologies affect society. Part of the Institute of Computing and Cybersystems, the Center for Human-Centered Computing (HCC) leverages broad expertise in people and technology to help lead this timely effort. Specifically, we integrate art, people, design, technology, and experiences, and conduct novel experiments and research across multiple areas of human-centered computing. HCC prepares Michigan Tech students to become future creators with balanced viewpoints by educating them in computing, in human factors, and in the interaction between the two.

Peruse a selection of publications from the Center for Human-Centered Computing below. Visit the ICC homepage for more information about the Institute and its Centers.

Submissions from 2018

Robotic motion learning framework to promote social engagement, Rachel Burns, Myounghoon Jeon, and Chung Hyuk Park

Robot-assisted socio-emotional intervention framework for children with Autism spectrum disorder, Hifza Javed, Myounghoon "Philart" Jeon, Ayanna Howard, and Chung Hyuk Park

Submissions from 2017

The influence of robot design on acceptance of social robots, Jaclyn Barnes, Maryam FakhrHosseini, Myounghoon Jeon, Chung Hyuk Park, and Ayanna Howard

Child-robot theater: STEAM education in an afterschool program, Jaclyn Barnes, Maryam FakhrHosseini, Eric Vasey, Zackery Duford, Joseph Ryan, and Myounghoon Jeon

Emotions and affect in human factors and human-computer interaction, Myounghoon "Philart" Jeon

Robotic arts: Current practices, potentials, and implications, Myounghoon "Philart" Jeon

Linking actions and objects: Context-specific learning of novel weight priors, Kevin Trewartha and J. Randall Flanagan

Submissions from 2016

Multisensory robotic therapy to promote natural emotional interaction for children with ASD, Rachel Bevill, Paul Azzi, Matthew Spadafora, Chung Hyuk Park, Hyung Jung Kim, JongWon Lee, Kazi Raihan, Myounghoon Jeon, and Ayanna Howard

Interactive robotic framework for multi-sensory therapy for children with Autism spectrum disorder, Rachel Bevill, Chung Hyuk Park, Hyung Jung Kim, JongWon Lee, Ariena Rennie, Myounghoon Jeon, and Ayanna Howard

A survey on hardware and software solutions for multimodal wearable assistive devices targeting the visually impaired, Ádám Csapó, György Wersényi, and Myounghoon Jeon

Getting active with passive crossings: Investigating the use of in-vehicle auditory alerts for highway-rail grade crossings, Steven Landry, Myounghoon Jeon, Pasi T. Lautala, and David Nelson

Human-Car confluence: “Socially-inspired driving mechanisms”, Andreas Riener, Myounghoon Jeon, and Alois Ferscha

Distinct contributions of explicit and implicit memory processes to weight prediction when lifting objects and judging their weights: an aging study, Kevin Trewartha and J. Randall Flanagan

Submissions from 2015

Estimation of drivers' emotional states based on neuroergonomic equipment: An exploratory study using fNIRS, Maryam FakhrHosseini, Myounghoon Jeon, and Rahul Bose

An investigation on driver behaviors and eye-movement patterns at grade crossings using a driving simulator, Maryam FakhrHosseini, Myounghoon Jeon, Pasi T. Lautala, and David Nelson

Regulating drivers’ aggressiveness by sonifying emotional data, Maryam FakhrHosseini, Paul Kirby, and Myounghoon Jeon

An exploration of semiotics of new auditory displays: A comparative analysis with visual displays, Myounghoon Jeon

Development and evaluation of emotional robots for children with Autism spectrum disorders, Myounghoon Jeon

Sorry, I’m late; I’m not in the mood: Negative emotions lengthen driving time, Myounghoon Jeon and Jayde Croschere

Menu navigation with in-vehicle technologies: Auditory menu cues improve dual task performance, preference, and workload, Myounghoon Jeon, Thomas M. Gable, Benjamin K. Davison, Michael A. Nees, Jeff Wilson, and Bruce N. Walker

Report on the in-vehicle auditory interactions workshop: Taxonomy, challenges, and approaches, Myounghoon Jeon, T. Hermann, P. Bazilinskyy, J. Hammerschmidt, K. A. E. Wolf, I. Alvarez, et al.

Cultural differences in preference of auditory emoticons: USA and South Korea, Myounghoon Jeon, Ju-Hwan Lee, Jason Sterkenburg, and Christopher Plummer

Technologies expand aesthetic dimensions: Visualization and sonification of embodied Penwald drawings, Myounghoon Jeon, Steven Landry, Joseph Ryan, and James Walker

The effects of social interactions with in-vehicle agents on a driver's anger level, driving performance, situation awareness, and perceived workload, Myounghoon Jeon, Bruce N. Walker, and Thomas M. Gable

Robotic framework with multi-modal perception for physio-musical interactive therapy for children with autism, Chung Hyuk Park, Myounghoon Jeon, and Ayanna Howard

Robotic framework for music-based emotional and social engagement with children with Autism, Chung Hyuk Park, Neetha Pai, Jayashan Bakthavatchalam, Yaojie Li, Myounghoon Jeon, and Ayanna Howard

Lyricon (Lyrics + Earcons) improves identification of auditory cues, Yuanjing Sun and Myounghoon Jeon

Interactive Sonification Markup Language (ISML) for efficient motion-sound mappings, James Walker, Michael T. Smith, and Myounghoon Jeon

Robotic sonification for promoting emotional and social interactions of children with ASD, Ruimin Zhang, Myounghoon Jeon, Chung Hyuk Park, and Ayanna Howard