HCI Bibliography : Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,536,840
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server 2015-05-12 and again 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Findlater_L* Results: 43 Sorted by: Date
Records: 1 to 25 of 43
[1] The AT Effect: How Disability Affects the Perceived Social Acceptability of Head-Mounted Display Use Diverse Disabilities and Technological Support / Profita, Halley / Albaghli, Reem / Findlater, Leah / Jaeger, Paul / Kane, Shaun K. Proceedings of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.1 p.4884-4895
ACM Digital Library Link
Summary: Wearable computing devices offer new possibilities to increase accessibility and independence for individuals with disabilities. However, the adoption of such devices may be influenced by social factors, and useful devices may not be adopted if they are considered inappropriate to use. While public policy may adapt to support accommodations for assistive technology, emerging technologies may be unfamiliar or unaccepted by bystanders. We surveyed 1200 individuals about the use of a head-mounted display in a public setting, examining how information about the user's disability affected judgments of the social acceptability of the scenario. Our findings reveal that observers considered head-mounted display use more socially acceptable if the device was being used to support a person with a disability.

[2] Supporting Everyday Activities for Persons with Visual Impairments Through Computer Vision-Augmented Touch Poster Session 2 / Findlater, Leah / Stearns, Lee / Du, Ruofei / Oh, Uran / Ross, David / Chellappa, Rama / Froehlich, Jon Seventeenth International ACM SIGACCESS Conference on Computers and Accessibility 2015-10-26 p.383-384
ACM Digital Library Link
Summary: The HandSight project investigates how wearable micro-cameras can be used to augment a blind or visually impaired user's sense of touch with computer vision. Our goal is to support an array of activities of daily living by sensing and feeding back non-tactile information (e.g., color, printed text, patterns) about an object as it is touched. In this poster paper, we provide an overview of the project, our current proof-of-concept prototype, and a summary of findings from finger-based text reading studies. As this is an early-stage project, we also enumerate current open questions.

[3] Personalized, Wearable Control of a Head-mounted Display for Users with Upper Body Motor Impairments HMDs & Wearables to Overcome Disabilities / Malu, Meethu / Findlater, Leah Proceedings of the ACM CHI'15 Conference on Human Factors in Computing Systems 2015-04-18 v.1 p.221-230
ACM Digital Library Link
Summary: Head-mounted displays provide relatively hands-free interaction that could improve mobile computing access for users with motor impairments. To investigate this largely unexplored area, we present two user studies. The first, smaller study evaluated the accessibility of Google Glass, a head-mounted display, with 6 participants. Findings revealed potential benefits of a head-mounted display yet demonstrated the need for alternative means of controlling Glass: 3 of the 6 participants could not use it at all. We then conducted a second study with 12 participants to evaluate a potential alternative input mechanism that could allow for accessible control of a head-mounted display: switch-based wearable touchpads that can be affixed to the body or wheelchair. The study assessed input performance with three sizes of touchpad, investigated personalization patterns when participants were asked to place the touchpads on their body or wheelchair, and elicited subjective responses. All 12 participants were able to use the touchpads to control the display, and patterns of touchpad placement point to the value of personalization in providing support for each user's motor abilities.

[4] Designing Conversation Cues on a Head-Mounted Display to Support Persons with Aphasia HMDs & Wearables to Overcome Disabilities / Williams, Kristin / Moffatt, Karyn / McCall, Denise / Findlater, Leah Proceedings of the ACM CHI'15 Conference on Human Factors in Computing Systems 2015-04-18 v.1 p.231-240
ACM Digital Library Link
Summary: Symbol-based dictionaries of text, images and sound can help individuals with aphasia find the words they need, but are often seen as a last resort because they tend to replace rather than augment the user's natural speech. Through two design investigations, we explore head-worn displays as a means of providing unobtrusive, always-available, and glanceable vocabulary support. The first study used narrative storyboards as a design probe to explore the potential benefits and challenges of a head-worn approach over traditional augmentative and alternative communication (AAC) tools. The second study then evaluated a proof-of-concept prototype in both a lab setting with the researcher and in situ with unfamiliar conversation partners at a local market. Findings suggest that a head-worn approach could better allow wearers to maintain focus on the conversation, reduce reliance on the availability of external tools (e.g., paper and pen) or people, and minimize visibility of the support by others. These studies should motivate further investigation of head-worn conversational support.

[5] Head-Mounted Display Visualizations to Support Sound Awareness for the Deaf and Hard of Hearing HMDs & Wearables to Overcome Disabilities / Jain, Dhruv / Findlater, Leah / Gilkeson, Jamie / Holland, Benjamin / Duraiswami, Ramani / Zotkin, Dmitry / Vogler, Christian / Froehlich, Jon E. Proceedings of the ACM CHI'15 Conference on Human Factors in Computing Systems 2015-04-18 v.1 p.241-250
ACM Digital Library Link
Summary: Persons with hearing loss use visual signals such as gestures and lip movement to interpret speech. While hearing aids and cochlear implants can improve sound recognition, they generally do not help the wearer localize sound necessary to leverage these visual cues. In this paper, we design and evaluate visualizations for spatially locating sound on a head-mounted display (HMD). To investigate this design space, we developed eight high-level visual sound feedback dimensions. For each dimension, we created 3-12 example visualizations and evaluated these as a design probe with 24 deaf and hard of hearing participants (Study 1). We then implemented a real-time proof-of-concept HMD prototype and solicited feedback from 4 new participants (Study 2). Study 1 findings reaffirm past work on challenges faced by persons with hearing loss in group conversations, provide support for the general idea of sound awareness visualizations on HMDs, and reveal preferences for specific design options. Although preliminary, Study 2 further contextualizes the design probe and uncovers directions for future work.

[6] Design of and subjective response to on-body input for people with visual impairments Interaction / Oh, Uran / Findlater, Leah Sixteenth International ACM SIGACCESS Conference on Computers and Accessibility 2014-10-20 p.115-122
ACM Digital Library Link
Summary: For users with visual impairments, who do not necessarily need the visual display of a mobile device, non-visual on-body interaction (e.g., Imaginary Interfaces) could provide accessible input in a mobile context. Such interaction provides the potential advantages of an always-available input surface, and increased tactile and proprioceptive feedback compared to a smooth touchscreen. To investigate preferences for and design of accessible on-body interaction, we conducted a study with 12 visually impaired participants. Participants evaluated five locations for on-body input and compared on-phone to on-hand interaction with one versus two hands. Our findings show that the least preferred areas were the face/neck and the forearm, while locations on the hands were considered to be more discreet and natural. The findings also suggest that participants may prioritize social acceptability over ease of use and physical comfort when assessing the feasibility of input at different locations of the body. Finally, tradeoffs were seen in preferences for touchscreen versus on-body input, with on-body input considered useful for contexts where one hand is busy (e.g., holding a cane or dog leash). We provide implications for the design of accessible on-body input.

[7] Accessibility in context: understanding the truly mobile experience of smartphone users with motor impairments Mobility / Naftali, Maia / Findlater, Leah Sixteenth International ACM SIGACCESS Conference on Computers and Accessibility 2014-10-20 p.209-216
ACM Digital Library Link
Summary: Lab-based studies on touchscreen use by people with motor impairments have identified both positive and negative impacts on accessibility. Little work, however, has moved beyond the lab to investigate the truly mobile experiences of users with motor impairments. We conducted two studies to investigate how smartphones are being used on a daily basis, what activities they enable, and what contextual challenges users are encountering. The first study was a small online survey with 16 respondents. The second study was much more in depth, including an initial interview, two weeks of diary entries, and a 3-hour contextual session that included neighborhood activities. Four expert smartphone users participated in the second study and we used a case study approach for analysis. Our findings highlight the ways in which smartphones are enabling everyday activities for people with motor impairments, particularly in overcoming physical accessibility challenges in the real world and supporting writing and reading. We also identified important situational impairments, such as the inability to retrieve the phone while in transit, and confirmed many lab-based findings in the real-world setting. We present design implications and directions for future work.

[8] "OK glass?": a preliminary exploration of Google Glass Poster abstracts / Malu, Meethu / Findlater, Leah Sixteenth International ACM SIGACCESS Conference on Computers and Accessibility 2014-10-20 p.267-268
ACM Digital Library Link
Summary: Head-mounted displays such as Google Glass offer potential advantages for persons with motor impairments (MI). For example, they are always available and offer relatively hands-free interaction compared to a mobile phone. Despite this potential, there is little prior work examining the accessibility of such devices. In this poster paper, we perform a preliminary assessment of the accessibility of Google Glass for users with MI and the potential impacts of a head-mounted interactive computer. Our findings show that, while the touchpad is particularly difficult to use (impossible for three participants), advantages over a phone include that it is relatively hands free, does not require looking down at the display, and cannot be easily dropped.

[9] Incorporating peephole interactions into children's second language learning activities on mobile devices Crafting interactions / McNally, Brenna / Guha, Mona Leigh / Norooz, Leyla / Rhodes, Emily / Findlater, Leah Proceedings of ACM IDC'14: Interaction Design and Children 2014-06-17 p.115-124
ACM Digital Library Link
Summary: Physical movement has the potential to enhance learning activities. To investigate how movement can be incorporated into children's mobile language learning, we designed and evaluated two versions of a German vocabulary game called Scenic Words. The first version used movement-based dynamic peephole navigation, which requires physical movement of the arms, while the second version used touch-based static peephole navigation, which only requires standard touchscreen interactions; static peepholes are the status quo interaction technique for navigation, commonly found, for example, in map applications and games. To compare the two types of navigation and to assess children's reactions to dynamic peepholes, we conducted an in-home study with 16 children (ages 8-9). The children participated in pairs but individually played each version of the game on a mobile device. While results showed that the more familiar static peepholes were the preferred interaction style overall, participants became accustomed to the movement-based dynamic peepholes during the study. Participants noted that the dynamic peephole interaction became easier over time, and that it had some advantages such as for dragging-and-dropping elements in the game.

[10] Understanding child-defined gestures and children's mental models for touchscreen tabletop interaction Wednesday short papers / Rust, Karen / Malu, Meethu / Anthony, Lisa / Findlater, Leah Proceedings of ACM IDC'14: Interaction Design and Children 2014-06-17 p.201-204
ACM Digital Library Link
Summary: Creating a predefined set of touchscreen gestures that caters to all users and age groups is difficult. To inform the design of intuitive and easy to use gestures specifically for children, we adapted a user-defined gesture study by Wobbrock et al. [12] that had been designed for adults. We then compared gestures created on an interactive tabletop by 12 children and 14 adults. Our study indicates that previous touchscreen experience strongly influences the gestures created by both groups; that adults and children create similar gestures; and that the adaptations we made allowed us to successfully elicit user-defined gestures from both children and adults. These findings will aid designers in better supporting touchscreen gestures for children, and provide a basis for further user-defined gesture studies with children.

[11] Current and future mobile and wearable device use by people with visual impairments Accessibility / Ye, Hanlu / Malu, Meethu / Oh, Uran / Findlater, Leah Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.1 p.3123-3132
ACM Digital Library Link
Summary: With the increasing popularity of mainstream wearable devices, it is critical to assess the accessibility implications of such technologies. For people with visual impairments, who do not always need the visual display of a mobile phone, alternative means of eyes-free wearable interaction are particularly appealing. To explore the potential impacts of such technology, we conducted two studies. The first was an online survey that included 114 participants with visual impairments and 101 sighted participants; we compare the two groups in terms of current device use. The second was an interview and design probe study with 10 participants with visual impairments. Our findings expand on past work to characterize a range of trends in smartphone use and accessibility issues therein. Participants with visual impairments also responded positively to two eyes-free wearable device scenarios: a wristband or ring and a glasses-based device. Discussions on projected use of these devices suggest that small, easily accessible, and discreet wearable input could positively impact the ability of people with visual impairments to access information on the go and to participate in certain social interactions.

[12] Follow that sound: using sonification and corrective verbal feedback to teach touchscreen gestures Papers / Oh, Uran / Kane, Shaun K. / Findlater, Leah Fifteenth Annual ACM SIGACCESS Conference on Assistive Technologies 2013-10-21 p.13
ACM Digital Library Link
Summary: While sighted users may learn to perform touchscreen gestures through observation (e.g., of other users or video tutorials), such mechanisms are inaccessible for users with visual impairments. As a result, learning to perform gestures can be challenging. We propose and evaluate two techniques to teach touchscreen gestures to users with visual impairments: (1) corrective verbal feedback using text-to-speech and automatic analysis of the user's drawn gesture; (2) gesture sonification to generate sound based on finger touches, creating an audio representation of a gesture. To refine and evaluate the techniques, we conducted two controlled lab studies. The first study, with 12 sighted participants, compared parameters for sonifying gestures in an eyes-free scenario and identified pitch + stereo panning as the best combination. In the second study, 6 blind and low-vision participants completed gesture replication tasks with the two feedback techniques. Subjective data and preliminary performance findings indicate that the techniques offer complementary advantages.

[13] Surveying the accessibility of touchscreen games for persons with motor impairments: a preliminary analysis Posters and demos / Kim, Yoojin / Sutreja, Nita / Froehlich, Jon / Findlater, Leah Fifteenth Annual ACM SIGACCESS Conference on Assistive Technologies 2013-10-21 p.67
ACM Digital Library Link
Summary: Touchscreen devices have become one of the most pervasive video game platforms in the world and, in turn, an integral part of popular culture; however, little work exists on comprehensively examining their accessibility. In this poster paper, we present initial findings from a survey and qualitative analysis of popular iPad touchscreen games with a focus on exploring factors relevant to persons with motor impairments. This paper contributes a novel qualitative codebook with which to examine the accessibility of touchscreen games for users with motor impairments and the results from applying this codebook to 72 iPad games.

[14] Effects of hand drift while typing on touchscreens Input 1: pens and consistency / Li, Frank Chun Yat / Findlater, Leah / Truong, Khai N. Proceedings of the 2013 Conference on Graphics Interface 2013-05-29 p.95-98
ACM Digital Library Link
Summary: On a touchscreen keyboard, it can be difficult to continuously type without frequently looking at the keys. One factor contributing to this difficulty is called hand drift, where a user's hands gradually misalign with the touchscreen keyboard due to limited tactile feedback. Although intuitive, there remains a lack of empirical data to describe the effect of hand drift. A formal understanding of it can provide insights for improving soft keyboards. To formally quantify the degree (magnitude and direction) of hand drift, we conducted a 3-session study with 13 participants. We measured hand drift with two typing interfaces: a visible conventional keyboard and an invisible adaptive keyboard. To expose drift patterns, both keyboards used relaxed letter disambiguation to allow for unconstrained movement. Findings show that hand drift occurred in both interfaces, at an average rate of 0.25mm/min on the conventional keyboard and 1.32mm/min on the adaptive keyboard. Participants were also more likely to drift up and/or left instead of down or right.

[15] Grand challenges in text entry Workshop summaries / Kristensson, Per Ola / Brewster, Stephen / Clawson, James / Dunlop, Mark / Findlater, Leah / Isokoski, Poika / Martin, Benoît / Oulasvirta, Antti / Vertanen, Keith / Waller, Annalu Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing Systems 2013-04-27 v.2 p.3315-3318
ACM Digital Library Link
Summary: Our workshop serves two purposes. First, to bring text entry researchers working in the human-computer interaction (HCI), natural language processing (NLP) and augmentative and alternative communication (AAC) communities together at CHI. Second, we will set three major grand challenges for text entry research: a) removing the performance bottleneck in text entry; b) designing efficient localized text entry methods; and c) bridging the communication gap between users with disabilities and society at large. These challenges will be discussed in a panel format at the workshop. The discussions will center on support activities, such as identifying obstacles for success in meeting these challenges and formalizing procedures for measuring progress in the text entry field.

[16] Age-related differences in performance with touchscreens compared to traditional mouse input Papers: technologies for life 1 / Findlater, Leah / Froehlich, Jon E. / Fattal, Kays / Wobbrock, Jacob O. / Dastyar, Tanya Proceedings of ACM CHI 2013 Conference on Human Factors in Computing Systems 2013-04-27 v.1 p.343-346
ACM Digital Library Link
Summary: Despite the apparent popularity of touchscreens for older adults, little is known about the psychomotor performance of these devices. We compared performance between older adults and younger adults on four desktop and touchscreen tasks: pointing, dragging, crossing and steering. On the touchscreen, we also examined pinch-to-zoom. Our results show that while older adults were significantly slower than younger adults in general, the touchscreen reduced this performance gap relative to the desktop and mouse. Indeed, the touchscreen resulted in a significant movement time reduction of 35% over the mouse for older adults, compared to only 16% for younger adults. Error rates also decreased.

[17] The challenges and potential of end-user gesture customization Papers: gesture studies / Oh, Uran / Findlater, Leah Proceedings of ACM CHI 2013 Conference on Human Factors in Computing Systems 2013-04-27 v.1 p.1129-1138
ACM Digital Library Link
Summary: The vast majority of work on understanding and supporting the gesture creation process has focused on professional designers. In contrast, gesture customization by end users, which may offer better memorability, efficiency and accessibility than pre-defined gestures, has received little attention. To understand the end-user gesture creation process, we conducted a study where 20 participants were asked to: (1) exhaustively create new gestures for an open-ended use case; (2) exhaustively create new gestures for 12 specific use cases; (3) judge the saliency of different touchscreen gesture features. Our findings showed that even when asked to create novel gestures, participants tended to focus on the familiar. Misconceptions about the gesture recognizer's abilities were also evident, and in some cases constrained the range of gestures that participants created. Finally, as a calibration point for future research, we used a simple gesture recognizer ($N) to analyze recognition accuracy of the participants' custom gesture sets: accuracy was 68-88% on average, depending on the amount of training and the customization scenario. We conclude with implications for the design of a mixed-initiative approach to support custom gesture creation.

[18] Analyzing user-generated YouTube videos to understand touchscreen use by people with motor impairments Papers: impairment and rehabilitation / Anthony, Lisa / Kim, YooJin / Findlater, Leah Proceedings of ACM CHI 2013 Conference on Human Factors in Computing Systems 2013-04-27 v.1 p.1223-1232
ACM Digital Library Link
Summary: Most work on the usability of touchscreen interaction for people with motor impairments has focused on lab studies with relatively few participants and small cross-sections of the population. To develop a richer characterization of use, we turned to a previously untapped source of data: YouTube videos. We collected and analyzed 187 non-commercial videos uploaded to YouTube that depicted a person with a physical disability interacting with a mainstream mobile touchscreen device. We coded the videos along a range of dimensions to characterize the interaction, the challenges encountered, and the adaptations being adopted in daily use. To complement the video data, we also invited the video uploaders to complete a survey on their ongoing use of touchscreen technology. Our findings show that, while many people with motor impairments find these devices empowering, accessibility issues still exist. In addition to providing implications for more accessible touchscreen design, we reflect on the application of user-generated content to study user interface design.

[19] Personalized input: improving ten-finger touchscreen typing through automatic adaptation Pen + touch / Findlater, Leah / Wobbrock, Jacob Proceedings of ACM CHI 2012 Conference on Human Factors in Computing Systems 2012-05-05 v.1 p.815-824
ACM Digital Library Link
Summary: Although typing on touchscreens is slower than typing on physical keyboards, touchscreens offer a critical potential advantage: they are software-based, and, as such, the keyboard layout and classification models used to interpret key presses can dynamically adapt to suit each user's typing pattern. To explore this potential, we introduce and evaluate two novel personalized keyboard interfaces, both of which adapt their underlying key-press classification models. The first keyboard also visually adapts the location of keys while the second one always maintains a visually stable rectangular layout. A three-session user evaluation showed that the keyboard with the stable rectangular layout significantly improved typing speed compared to a control condition with no personalization. Although no similar benefit was found for the keyboard that also offered visual adaptation, overall subjective response to both new touchscreen keyboards was positive. As personalized keyboards are still an emerging area of research, we also outline a design space that includes dimensions of adaptation and key-press classification features.

[20] The design and evaluation of prototype eco-feedback displays for fixture-level water usage data Defying environmental behavior changes / Froehlich, Jon / Findlater, Leah / Ostergren, Marilyn / Ramanathan, Solai / Peterson, Josh / Wragg, Inness / Larson, Eric / Fu, Fabia / Bai, Mazhengmin / Patel, Shwetak / Landay, James A. Proceedings of ACM CHI 2012 Conference on Human Factors in Computing Systems 2012-05-05 v.1 p.2367-2376
ACM Digital Library Link
Summary: Few means currently exist for home occupants to learn about their water consumption: e.g., where water use occurs, whether such use is excessive and what steps can be taken to conserve. Emerging water sensing systems, however, can provide detailed usage data at the level of individual water fixtures (i.e., disaggregated usage data). In this paper, we perform formative evaluations of two sets of novel eco-feedback displays that take advantage of this disaggregated data. The first display set isolates and examines specific elements of an eco-feedback design space such as data and time granularity. Displays in the second set act as design probes to elicit reactions about competition, privacy, and integration into domestic space. The displays were evaluated via an online survey of 651 North American respondents and in-home, semi-structured interviews with 10 families (20 adults). Our findings are relevant not only to the design of future water eco-feedback systems but also for other types of consumption (e.g., electricity and gas).

[21] Beyond QWERTY: augmenting touch screen keyboards with multi-touch gestures for non-alphanumeric input Touch text entry / Findlater, Leah / Lee, Ben / Wobbrock, Jacob Proceedings of ACM CHI 2012 Conference on Human Factors in Computing Systems 2012-05-05 v.1 p.2679-2682
ACM Digital Library Link
Summary: Although many techniques have been proposed to improve text input on touch screens, the vast majority of this research ignores non-alphanumeric input (i.e., punctuation, symbols, and modifiers). To support this input, widely adopted commercial touch-screen interfaces require mode switches to alternate keyboard layouts for most punctuation and symbols. Our approach is to augment existing ten-finger QWERTY keyboards with multi-touch gestural input that can exist as a complement to the moded-keyboard approach. To inform our design, we conducted a study to elicit user-defined gestures from 20 participants. The final gesture set includes both multi-touch and single-touch gestures for commonly used non-alphanumeric text input. We implemented and conducted a preliminary evaluation of a touch-screen keyboard augmented with this technique. Findings show that using gestures for non-alphanumeric input is no slower than using keys, and that users strongly prefer gestures to a moded-keyboard interface.

[22] WalkType: using accelerometer data to accommodate situational impairments in mobile touch screen text entry Touch text entry / Goel, Mayank / Findlater, Leah / Wobbrock, Jacob Proceedings of ACM CHI 2012 Conference on Human Factors in Computing Systems 2012-05-05 v.1 p.2687-2696
ACM Digital Library Link
Summary: The lack of tactile feedback on touch screens makes typing difficult, a challenge exacerbated when situational impairments like walking vibration and divided attention arise in mobile settings. We introduce WalkType, an adaptive text entry system that leverages the mobile device's built-in tri-axis accelerometer to compensate for extraneous movement while walking. WalkType's classification model uses the displacement and acceleration of the device, and inference about the user's footsteps. Additionally, WalkType models finger-touch location and finger distance traveled on the screen, features that increase overall accuracy regardless of movement. The final model was built on typing data collected from 16 participants. In a study comparing WalkType to a control condition, WalkType reduced uncorrected errors by 45.2% and increased typing speed by 12.9% for walking participants.

[23] From plastic to pixels: in search of touch-typing touchscreen keyboards Features / Findlater, Leah / Wobbrock, Jacob O. interactions 2012-05-01 v.19 n.3 p.44-49
ACM Digital Library Link

[24] Personalized dynamic accessibility Forums: Universal Interactions / Gajos, Krzysztof Z. / Hurst, Amy / Findlater, Leah interactions 2012-03-01 v.19 n.2 p.69-73
Juan Pablo Hourcade, Editor
ACM Digital Library Link

[25] The aligned rank transform for nonparametric factorial analyses using only anova procedures Research methods / Wobbrock, Jacob O. / Findlater, Leah / Gergle, Darren / Higgins, James J. Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011-05-07 v.1 p.143-146
ACM Digital Library Link
Summary: Nonparametric data from multi-factor experiments arise often in human-computer interaction (HCI). Examples may include error counts, Likert responses, and preference tallies. But because multiple factors are involved, common nonparametric tests (e.g., Friedman) are inadequate, as they are unable to examine interaction effects. While some statistical techniques exist to handle such data, these techniques are not widely available and are complex. To address these concerns, we present the Aligned Rank Transform (ART) for nonparametric factorial data analysis in HCI. The ART relies on a preprocessing step that "aligns" data before applying averaged ranks, after which point common ANOVA procedures can be used, making the ART accessible to anyone familiar with the F-test. Unlike most articles on the ART, which only address two factors, we generalize the ART to N factors. We also provide ARTool and ARTweb, desktop and Web-based programs for aligning and ranking data. Our re-examination of some published HCI results exhibits advantages of the ART.
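The alignment step described in the abstract (strip all effects, re-add only the effect of interest, then rank) can be sketched in plain Python for a two-factor design. This is an illustrative sketch only, not the authors' ARTool implementation; the function and variable names are hypothetical, and the toy error-count data is invented for the example.

```python
from statistics import mean

def art_aligned_ranks(rows):
    """Aligned Rank Transform sketch for a 2-factor design.
    rows: list of (factor1_level, factor2_level, response).
    Returns, per effect, the aligned ranks (average ranks for ties);
    a standard ANOVA on each rank column then tests that one effect."""
    ys = [y for _, _, y in rows]
    grand = mean(ys)

    def group_mean(key):
        groups = {}
        for r in rows:
            groups.setdefault(key(r), []).append(r[2])
        return {k: mean(v) for k, v in groups.items()}

    m1 = group_mean(lambda r: r[0])            # factor-1 marginal means
    m2 = group_mean(lambda r: r[1])            # factor-2 marginal means
    mc = group_mean(lambda r: (r[0], r[1]))    # cell means

    def rank(xs):
        # Average ranks (1-based), sharing rank across tied values.
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        ranks = [0.0] * len(xs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1              # mean of ranks i+1 .. j+1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    # Residuals with every effect stripped out.
    resid = [y - mc[(a, b)] for a, b, y in rows]
    # Re-add exactly one estimated effect before ranking.
    effects = {
        "f1": [m1[a] - grand for a, b, y in rows],
        "f2": [m2[b] - grand for a, b, y in rows],
        "f1:f2": [mc[(a, b)] - m1[a] - m2[b] + grand for a, b, y in rows],
    }
    return {name: rank([r + e for r, e in zip(resid, eff)])
            for name, eff in effects.items()}

# Toy 2x2 data: (method, posture, error count) -- hypothetical values.
data = [("A", "sit", 3), ("A", "stand", 7), ("B", "sit", 2), ("B", "stand", 9),
        ("A", "sit", 4), ("A", "stand", 8), ("B", "sit", 1), ("B", "stand", 10),
        ("A", "sit", 3), ("A", "stand", 6), ("B", "sit", 2), ("B", "stand", 11)]
ranks = art_aligned_ranks(data)
```

Each effect's rank column is then analyzed with an ordinary F-test, which is the point of the technique: after align-and-rank, no special software is needed beyond standard ANOVA.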