AIR Touch + Publication!

After much hard work and valuable input from a number of my therapist and research colleagues, the first version of the AIR Touch software was completed in September 2009. So far, we have 12 different tasks that are fully customizable and meet a variety of patient needs. Each of the activities was created using Adobe Flex (which I personally think is awesome), and although they are rather simple graphically, they are very fun to interact with.

In November 2009, Fraser and I were lucky enough to travel to Australia and give a conference presentation about our preliminary work at OzCHI 2009. You can read our conference publication here or email me for a copy. The publication was co-authored by D. Goertzen, J. Halton, Q. Ranson, W.F. Bischof and P. Boulanger.

A few of the upper-extremity-based activities are shown here:

Version two of the software is finished, but I am waiting until I deploy it at the Glenrose before I begin writing about it and taking videos with my new Flip HD!

Multi-Touch TAR

After I finished my MSc degree in January, I spent the next few months working on a number of projects in the AMMI lab. One of the most fruitful was the design and development of the AIR Touch multi-touch system, built together with Fraser Anderson and a team of researchers and occupational therapists at the Glenrose Rehabilitation Hospital. The system comprises a 3-foot-by-2-foot multi-touch surface (manufactured by NOR_/D), a FireFly MV IR camera, an off-the-shelf NEC projector, a mirror, the openFrameworks software package, and a suite of rehabilitation-inspired multi-touch activities written in Adobe Flex. The lovely black 2×4 stand was built by Fraser and me in my garage (thanks, Dad, for the supplies!).

Version One of the system:

IR Camera Image (Top), Tracking Image (Bottom):
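
For readers curious what happens between those two images, here is a minimal threshold-and-blob sketch in Python with OpenCV. The real pipeline runs inside openFrameworks; the library choice, threshold values, and blob-size cutoff below are my own, purely for illustration (assumes OpenCV 4):

```python
# Minimal sketch of IR blob tracking. The real AIR Touch pipeline is built
# on openFrameworks; this only illustrates the threshold-and-contour idea
# behind the camera image / tracking image pair above.
import cv2

cap = cv2.VideoCapture(0)  # stand-in for the FireFly MV IR camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (11, 11), 0)
    # Fingertips touching the surface show up as bright IR blobs.
    _, mask = cv2.threshold(blur, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 50:  # ignore tiny noise blobs
            x, y = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(frame, (x, y), 10, (0, 255, 0), 2)  # touch point
    cv2.imshow("camera", frame)    # the "IR Camera Image"
    cv2.imshow("tracking", mask)   # the "Tracking Image"
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```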

We chose a very open solution, as opposed to the Microsoft Surface or a SMART Table, so we could modify or add to the hardware setup as needed. We are currently designing version two of the hardware, and I will post pictures when we are done!

Technology Assisted Rehabilitation

For my PhD research, I have been investigating the limitations and potential of integrating various technologies into rehabilitation programs. I have been working with a number of occupational and physical therapists at the Glenrose Rehabilitation Hospital in Edmonton, Alberta, Canada, to help increase patient enjoyment of, and compliance with, rehabilitation activities. So far, I have explored Wii peripherals, virtual reality, multi-touch surfaces, tangibles, fabric-based computing, and short-range RFID, and I have accumulated quite a collection of devices and prototyping products.

The majority of this blog will document my experiences with technology-assisted rehabilitation (TAR) and human-computer interaction (HCI). I will try to update it as often as possible, especially with YouTube videos and pictures. Any questions or comments can always be emailed to me.

I am also always looking for eager high school or undergraduate students to work with me as interns or for ideas about course projects. Please check out the WISEST and HIP programs at the University of Alberta.

The SNaP Framework – A Virtual Reality Tool for Spatial Navigation

Continuing the work I did for my independent study course with Dr. Walter F. Bischof, my thesis project was to create a software framework (the SNaP Framework, short for Spatial Navigation Framework) that enables psychologists to easily create, control, and deploy virtual-reality-based spatial navigation experiments. My main goal was to eliminate the hardware and usability issues inherent in current VR systems and to create a flexible system that is easy for novices to use. I also had a number of research goals that I wanted to investigate and achieve:

  • Eliminate issues in using VR for spatial navigation research
  • Simplify input and output peripheral usage
  • Enable novice specification and deployment
  • Decrease design and implementation time
  • Reduce the volume of incomparable results
  • Create environments with similar appearances
  • Include universal interaction metaphors and behavioral recording techniques

The software framework contains a number of components (Chapters 1 and 4 of thesis); a toy sketch of how the first few fit together follows the list:

  • Parameter File
    User-created XML file that specifies the spatial navigation experiment to be performed
  • VR Configuration Creator
    Python module that transforms each trial specified in a parameter file into a configuration file; determines if a block of trials needs to be run again due to poor performance
  • Configuration Files
    XML-based files that specify the environmental and protocol configurations for a given experimental trial
  • VR Launcher
    Python module that determines which deployment contexts and input devices are desired
  • VRPN Server
    Open source virtual reality server/client that transforms input devices into generic device types; the data streaming from the server is used to control participant movement
  • Virtools VR Player
    Virtools-provided component that renders virtual environments
  • Virtools Composition Files
    Virtools file containing a paradigm’s virtual world, including the custom scripts, 3D models, and universal modules used to generate and render the paradigm’s virtual implementation
  • Result Files
    Capture a participant’s performance (i.e., path traveled, camera frustum images, or overall experiment results)
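
To give a feel for how the first few components above fit together, here is a toy sketch in Python using only the standard library. Every tag, attribute, and file name is hypothetical; the real XML schemas and Python modules are described in Chapters 1 and 4 of the thesis:

```python
# Toy sketch of the parameter-file -> configuration-files step, using only
# the Python standard library. All element and attribute names are made up.
import xml.etree.ElementTree as ET

PARAMETER_XML = """
<experiment name="water_maze">
  <block id="training">
    <trial platform="NE" start="S"/>
    <trial platform="NE" start="W"/>
  </block>
</experiment>
"""

def create_configurations(parameter_xml):
    """Expand each <trial> in the parameter file into its own
    configuration tree (one XML configuration file per trial)."""
    experiment = ET.fromstring(parameter_xml)
    configs = []
    for block in experiment.iter("block"):
        for i, trial in enumerate(block.iter("trial")):
            config = ET.Element("configuration",
                                experiment=experiment.get("name"),
                                block=block.get("id"), trial=str(i))
            ET.SubElement(config, "environment",
                          platform=trial.get("platform"))
            ET.SubElement(config, "protocol", start=trial.get("start"))
            configs.append(config)
    return configs

for i, config in enumerate(create_configurations(PARAMETER_XML)):
    # One configuration file per trial; the real VR Configuration Creator
    # also inspects result files here to decide whether a block must be
    # re-run due to poor performance.
    ET.ElementTree(config).write(f"trial_{i}.xml")
```

From there, the real VR Launcher points the Virtools VR Player at each configuration file in turn, with participant input arriving through the VRPN server.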

There are a number of ways that different groups of users can use the SNaP Framework in their research. Novices can use the 3D Layout window to add, relocate, or retexture 3D elements; quickly specify the desired input and output peripherals; and easily deploy experiments using the provided batch scripts. Expert users can extend the XML parameter and configuration file schemas; design and integrate new 3D models; modify or add scripting or C++ SDK code; introduce new metrics; modify the VR Configuration Creator to handle new parameter and configuration files; establish new goal-monitoring algorithms, navigation metaphors, or aids; and implement new paradigms using the template environment (Chapter 5 of thesis).

To test the efficacy of the SNaP Framework, I implemented a number of popular spatial navigation paradigms with it (Chapter 3 of thesis). I also ran a pilot study with a small sample of participants; that study took place after my thesis was completed, so its results appear in my publications but not in the thesis.
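
To make one of those extension points concrete, a goal-monitoring rule for something like the Virtual Morris Water Maze (listed below) can reduce to a simple proximity test. This is purely my own illustration in Python; in the framework itself, monitoring lives in the Virtools scripts:

```python
# Illustrative goal-monitoring check: has the participant reached the
# hidden platform? Function name, units, and threshold are hypothetical.
import math

def reached_goal(position, goal, radius=1.5):
    """Return True once the participant is within `radius` units of the
    goal on the horizontal (x, z) plane."""
    dx, dz = position[0] - goal[0], position[1] - goal[1]
    return math.hypot(dx, dz) <= radius

# Polled once per frame with the tracked participant position:
print(reached_goal((0.5, 0.2), (1.0, 1.0)))  # True: ~0.94 units away
```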

Complex Maze

Bucket World

Scatter Hoarding Task

Cheng Task

Virtual Morris Water Maze

Analyzing the Relationships Between Users in Wiki Settings

During the summer of 2007, when I was working on the wiEGO project, I supervised my second WISEST student, Joyce Lam. Because there was a lot of work in the Software Engineering lab investigating wikis and blogs, we set Joyce up with a project analyzing the relationships that exist between wiki users. Using the Annoki Software Engineering wiki, she modified an existing wiki visualization program to help users compare the behavior (number of wiki page edits) of any two wiki contributors. As the first picture illustrates, a user of the Java-based visualization program can select two contributors to the Annoki wiki and then view an orbit graph (Image 2). In the main application panel, you can also choose to view a one-person orbit diagram or a two-person diagram. You can also zoom in and out of the graph, hide pages you don’t want to see, and view a number of wiki statistics on the right-hand side.

In an orbit graph, a central orb is surrounded by a number of concentric circles. The center orb holds the two wiki contributors selected for comparison. Each concentric circle contains a number of pages from the Annoki wiki, each represented by a colored dot. The color of a dot indicates which contributor edited that page (one, the other, or both). The dots closest to the center orb have been edited the most; those edited the least (if at all) are the farthest away.
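
To make the placement rule concrete, here is a small reconstruction of my own in Python (the actual program was written in Java, and the data below are made up):

```python
# Sketch of the orbit-graph placement rule: pages with more edits sit on
# rings closer to the centre, and colour encodes who edited the page.
# The edit counts and ring count are invented for illustration.

# page -> number of edits by each of the two selected contributors
edits = {
    "Main_Page":   {"alice": 12, "bob": 3},
    "Roadmap":     {"alice": 0,  "bob": 7},
    "Style_Guide": {"alice": 2,  "bob": 0},
}

NUM_RINGS = 5
max_edits = max(sum(e.values()) for e in edits.values())

for page, by_user in edits.items():
    total = sum(by_user.values())
    # Ring 0 is closest to the central orb (most-edited pages).
    ring = NUM_RINGS - 1 - round((NUM_RINGS - 1) * total / max_edits)
    editors = {u for u, n in by_user.items() if n > 0}
    colour = ("both" if len(editors) == 2
              else editors.pop() if editors else "neither")
    print(f"{page}: ring {ring}, colour {colour}")
```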

One can also select a specific wiki contributor and view all of the wiki pages that were edited by that contributor (via lines/edges connecting the dots). Although this project was never completely finished, both Joyce and I learned many interesting things about integrating wiki information with visualizations during her summer internship.