CHI 2012 EA Publication! Tabletops in Motion

To round out the old work that I have neglected to post over the last year, Fraser and I did some work using electromyography, motion capture, and a multi-touch tabletop to understand the extent of motor movements made during multi-touch tabletop usage in rehabilitation settings. The results of our experiment went into our CHI 2012 work in progress, “Tabletops in Motion: The Kinetics and Kinematics of Interactive Surface Physical Therapy”, found here, as well as our ICDVRAT 2012 paper discussed here.

Abstract:
Technology-based rehabilitation methods have shown promise for improving physical therapy programs, but much of the research is lacking quantitative analysis. We present a study conducted with healthy participants where we compared traditional “table-based” therapy methods with new technology-based methods. Using motion analysis and electromyography recordings, we assessed the kinetic and kinematic dimensions of participant motion during four activities. While technology-based methods are more enjoyable, our results indicate that it is the design of an activity that has a significant impact on the movements performed.

ICDVRAT 2012 Publication! User Perspectives on Multi-Touch Tabletop Therapy

Fraser was nice enough to go to Paris on my behalf. The paper reported on the second half of our CHI 2012 WIP experiment (http://dl.acm.org/citation.cfm?id=2223801). I forgot to post about it before, so here it is, better late than never!

Abstract:

Technology-based activities are becoming more popular in therapy programs, and direct-touch interactive tabletops seem particularly suited to many therapy tasks. To better understand the potential benefit of interactive tabletops in rehabilitation, we examined users’ attitudes as they performed rehabilitation activities on a multi-touch tabletop and a normal, non-interactive surface. This revealed the elements of multi-touch tabletops and their associated activities that contribute to their success in rehabilitation programs and identified improvements for future designs. We found that although the engaging and dynamic nature of the interactive tasks was preferred, many participants were heavily influenced by prior exposure to commercial interaction devices and expected very precise and responsive sensing. We discuss the implications of user expectations and experiences on the design of future activities and rehabilitation technologies.

CHI 2013 Paper Accepted

Back in December I found out that some of the work I had done last year at the UofA was accepted for publication and presentation at CHI 2013 in Paris, France. I’m super excited to talk about this work, as I was thrilled with the results, and of course, a stop in Paris means a stop at Disneyland Paris!

Here is the abstract of the work, as well as the CHI Video Preview which was a new component for submissions this year.

This work examines intermanual gesture transfer, i.e., learning a gesture with one hand and performing it with the other. Using a traditional retention and transfer paradigm from the motor learning literature, participants learned four gestures on a touchscreen. The study found that touchscreen gestures transfer, and do so symmetrically. Regardless of the hand used during training, gestures were performed with a comparable level of error and speed by the untrained hand, even after 24 hours. In addition, the form of a gesture, i.e., its length or curvature, was found to have no influence on transferability. These results have important implications for the design of stroke-based gestural interfaces: acquisition could occur with either hand and it is possible to interchange the hand used to perform gestures. The work concludes with a discussion of these implications and highlights how they can be applied to gesture learning and current gestural systems.

Also, if you are interested in other applications of motor learning to HCI, check out Fraser’s paper that will also be presented at CHI this year!

Surface and Pen-Based Gesture Research Study

*** Update ***
Fraser and I have recruited all the participants for our studies. We will be running another set of studies in 3 to 4 months, so please check back a little bit later for more information!
*******

Once again, Fraser Anderson and I are looking for participants for our research studies. We are currently running two user studies. Fraser’s experiment is concerned with how people learn pen-based gestures, and mine is concerned with how people learn gestures on a large touchscreen. We hope that the results will help us understand how generalizable gestures are and how people perform gestures in different contexts, and that they will help guide the design of gesture-based interfaces in the future.

Fraser and I are currently seeking right-handed University of Alberta students who are 18+ to participate. My study has the added caveat that potential participants must *not* be color blind. You can volunteer for one or both of our studies.

The studies take place in the Advanced Man-Machine Interface Lab in the Computing Science Center on the University of Alberta Campus. Your participation will be split over two days. On day 1, you will spend 1 hour learning a number of gestures. On day 2 (approximately 24 hours later), you will return for some follow-up tasks. Participation on day 2 takes less than 30 minutes. At the completion of the experiment, you will receive $15 cash. To be eligible to participate, you must be available for 1 full hour on day one and 30 minutes on day two.

If you are interested in participating, please email hci@ualberta.ca to set up an appointment. Unfortunately, everyone needs to have an appointment set up to participate – we cannot accommodate people who just show up at our lab. Thanks!
** These studies are being conducted by Michelle Annett and Fraser Anderson under the guidance of Dr. Walter Bischof and have been approved by the Research Ethics Board at the University of Alberta. **


Gesture Learning Research Studies

*** Update ***
Thanks to all those who have participated thus far! Both Fraser and I have recruited all the participants for our studies. We will be running another set of studies in two to three months, so please check back a little bit later for more information!
*************

Fraser Anderson and I are looking for participants for our two research studies. Each experiment is designed to evaluate how people learn pen-based gestures and apply touchscreen gestures in different contexts. The results of the experiments will help us understand how generalizable gestures are and how people perform gestures in different contexts, and will help guide the design of gesture-based interfaces in the future. You can volunteer for one or both of our studies.

We are currently seeking right-handed University of Alberta students who are 18+ to participate. Both studies will take place in the Advanced Man-Machine Interface Lab in the Computing Science Center on the University of Alberta campus. Your participation will be split over two days. On day 1, you will spend 1 hour learning a number of gestures. On day 2 (approximately 24 hours later), you will return for some follow-up tasks. Participation on day 2 takes less than 15 minutes. At the completion of the experiment, you will receive $15 cash.

To be eligible to participate, you must be available for 1 full hour on day one and 15 minutes on day two.

If you are interested in participating, please email hci@ualberta.ca to set up an appointment. Thanks!

** These studies are being conducted by Michelle Annett and Fraser Anderson under the guidance of Dr. Walter Bischof and have been approved by the Research Ethics Board at the University of Alberta. **
