Category Archives: multitouch

Tim April defends MS thesis

Tim April recently defended his MS thesis [1]. Tim’s topic of exploration was multitouch surfaces and how interactions with these surfaces might be improved with the use of tangible user interfaces. Here’s a picture from the defense (for more pictures see Flickr):

Tim is currently a Security Researcher at Akamai Technologies.


References

[1] Tim April, “Comparing and Contrasting Manual Direct Touch Interaction with Tangible User Interfaces for Mapping Applications,” MS Thesis, University of New Hampshire, 2013.

Award of Excellence at 2012 Undergraduate Research Conference

Two of my undergraduate research assistants, Josh Clairmont and Shawn Bryan, won an Award of Excellence at the 2012 Undergraduate Research Conference. The URC is UNH’s annual event aimed at engaging undergraduate students in research.

Josh and Shawn created a tangible user interface for the Microsoft Surface multitouch table. Their interface allows users to play a game of air hockey on the Surface. Josh, a computer engineering senior, was in charge of creating the Arduino-based game controller. Shawn, a computer science senior, created the game on the Surface.

Here is a video introducing the work of Josh and Shawn:

Congratulations Josh and Shawn!

2011 opportunity for UNH CS students: multi-touch surface manipulation of geo-coded time series

When I think back to the recent BP oil spill in the Gulf of Mexico, the images that come to mind are of affected wildlife on beaches, idle fishing vessels, and a massive response involving thousands of people across multiple states.

How can such a massive response be managed? There is no single answer. However, one thing that can help is to make data about various aspects of the disaster, as well as the response effort, accessible to those conducting the response activities. This is the role of the Environmental Response Management Application (ERMA). ERMA is a web-based data visualization application. It visualizes geo-coded time series without requiring users to know how to access specialized databases or how to overlay data from these databases on virtual maps. ERMA was developed at UNH under the guidance of the Coastal Response Research Center (CRRC).

Nancy Kinner is the co-director of the CRRC. Building on her experience with ERMA, Nancy and I are interested in exploring how a multi-touch table could be used to access and manipulate geo-coded time series.

Seeking UNH CS student

To further our effort, we are seeking a UNH CS student interested in developing a user interface on a multi-touch table. The interface would allow a human operator to access remote databases, manipulate the data (e.g. by sending it to Matlab for processing), and display the results on a virtual map or a graph. This work will be part of a team effort, with two students working with Nancy on identifying data and manipulations of interest.

What should the user interface do?

The operator should be able to select data, e.g. from a website such as ERMA. Data types of interest include outputs from various sensors (temperature, pressure, accelerometers, etc.). Data manipulation will require some simple processing, such as setting beginning and end points for sensor readings. It will also require more complex processing of data, e.g. filtering.
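
To make these manipulation steps concrete, here is a minimal C++ sketch of the two operations just mentioned: trimming a sensor time series to a begin/end window, and smoothing it with a simple moving-average filter. The Sample struct and function names are placeholders of my own, not part of any existing codebase, and in the actual project this processing might well be handed off to Matlab instead:

```cpp
#include <vector>
#include <cstdio>

// Hypothetical record for one sensor reading in a geo-coded time series.
struct Sample {
    double t;     // timestamp (seconds)
    double value; // sensor output (e.g. temperature)
};

// Simple processing: keep only the samples between tBegin and tEnd.
std::vector<Sample> trim(const std::vector<Sample>& in, double tBegin, double tEnd) {
    std::vector<Sample> out;
    for (const Sample& s : in)
        if (s.t >= tBegin && s.t <= tEnd)
            out.push_back(s);
    return out;
}

// More complex processing: a moving-average filter of odd width w,
// one simple example of the filtering mentioned above.
std::vector<Sample> movingAverage(const std::vector<Sample>& in, int w) {
    std::vector<Sample> out;
    const int half = w / 2;
    for (int i = 0; i < (int)in.size(); ++i) {
        double sum = 0.0;
        int n = 0;
        for (int j = i - half; j <= i + half; ++j)
            if (j >= 0 && j < (int)in.size()) { sum += in[j].value; ++n; }
        out.push_back({in[i].t, sum / n});
    }
    return out;
}

int main() {
    std::vector<Sample> data = {{0, 20.1}, {1, 20.9}, {2, 19.8}, {3, 21.2}, {4, 20.4}};
    for (const Sample& s : movingAverage(trim(data, 1, 4), 3))
        std::printf("t=%.0f value=%.2f\n", s.t, s.value);
}
```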

What platform will be used?

The project will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

What are the terms of this job?

We are interested in hiring an undergraduate or graduate UNH CS student for the 2011-2012 academic year, with the possibility of extending the appointment for the summer of 2012 and beyond, pending satisfactory performance and the availability of funding. The student will work up to 20 hours/week during the academic year and up to 40 hours/week during the summer break.

What are the required skills? And what new skills will I acquire?

Work on this team project will require the object-oriented programming necessary to control the multi-touch table. You will explore the application of these skills to the design of surface user interfaces, as well as to experiments with human subjects – after all, we will have to systematically test your creation! Finally, you will interact with students and faculty from at least two other disciplines (civil/environmental and electrical/computer engineering), which means you will gain valuable experience working on multi-disciplinary teams.

Interested? Have questions, ideas, suggestions?
Email me.

2011 opportunities for UNH CS students: multi-touch surface interaction

I am seeking UNH CS students (individuals or teams) interested in developing a user interface on a multi-touch table. The interface would allow a human operator to control a fleet of unmanned aerial vehicles (UAVs). This project will be part of a collaborative effort with WPI on creating a fleet of UAVs. Students at WPI will focus on building the UAVs. Students at UNH will work on communication issues (with Professor Nicholas Kirsch) and on user interface issues (with me).

What should the user interface do?

The operator should be able to view and manipulate data sent out by the UAV fleet. Data types of interest include images, video, sounds and outputs from various sensors (temperature, pressure, accelerometers, etc.). Data manipulation will require some simple processing, such as setting beginning and end points for sounds, zooming images, etc. It will also require more complex processing of data, e.g. filtering.
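
As a rough sketch of how a Surface-side application might represent and dispatch these heterogeneous data types, here is a small C++ example. All of the type and function names are hypothetical, chosen only to illustrate the idea:

```cpp
#include <iostream>
#include <string>
#include <type_traits>
#include <variant>
#include <vector>

// Hypothetical payloads a UAV might send down.
struct Image   { std::string file; };
struct Sound   { double begin, end; };  // trimmed playback window
struct Reading { std::string sensor; double v; };

using UavData = std::variant<Image, Sound, Reading>;

// Dispatch each incoming item to the appropriate (simple) processing step.
void process(const UavData& d) {
    std::visit([](const auto& x) {
        using T = std::decay_t<decltype(x)>;
        if constexpr (std::is_same_v<T, Image>)
            std::cout << "display image " << x.file << "\n";
        else if constexpr (std::is_same_v<T, Sound>)
            std::cout << "play sound from " << x.begin << "s to " << x.end << "s\n";
        else
            std::cout << x.sensor << " = " << x.v << "\n";
    }, d);
}

int main() {
    std::vector<UavData> feed = {Image{"uav1.jpg"}, Sound{0.5, 3.0}, Reading{"temp", 21.3}};
    for (const auto& d : feed) process(d);
}
```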

What are the data sources?

Eventually, the data will come from UAVs. However, as a first step, data will be generated through games, similarly to work done by Jatin Matani and Trupti Telang. Thus, we might utilize cell phones to get images, webcams to get video, and Arduino boards to generate sensor data (e.g. temperature).
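
For a flavor of the Arduino side, here is a minimal Arduino (C++) sketch that reads an analog temperature sensor and streams readings over the serial port. The pin assignment and conversion constants assume a common TMP36-style sensor and are illustrative only:

```cpp
// Minimal Arduino sketch: report temperature over serial once per second.
// Assumes a TMP36-style analog sensor on pin A0 (illustrative only).
const int sensorPin = A0;

void setup() {
  Serial.begin(9600);  // serial link the table-side code could read
}

void loop() {
  int raw = analogRead(sensorPin);      // 0..1023 for 0..5 V
  float volts = raw * 5.0 / 1023.0;
  float tempC = (volts - 0.5) * 100.0;  // TMP36: 10 mV/deg C, 500 mV offset
  Serial.println(tempC);
  delay(1000);
}
```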

What platform will be used?

The project will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

Is this a job, a project, or something else?

CS students would be able to use this effort as a senior project (details to be worked out with appropriate CS faculty). An independent study might also be a possibility. Finally, I am interested in hiring students for academic-year and/or summer jobs.

Can CS and ECE students collaborate?

Collaboration is not a requirement. However, some aspects of this work might benefit from the involvement of one or more UNH ECE students. For example, ECE students could work on some of the data processing aspects of the project, as well as on creating data sources (e.g. deploying wireless sensor networks). I am actively recruiting ECE students for multi-touch projects, and you are welcome to talk to your friends in ECE.

What are the required skills? And what new skills will I acquire?

For CS students, work on this project will require the object-oriented programming necessary to control the multi-touch table. You will explore the application of these skills to the design of surface user interfaces, as well as to experiments with human subjects – after all, we will have to systematically test your creation!

Interested? Have questions, ideas, suggestions?
Email me.

2011 Senior Project topics: multi-touch surface interaction

I am seeking students (individuals or teams) for two senior projects. Both projects would leverage a multi-touch surface to create a natural user interface for pervasive computing applications.

Pervasive computing problems and ideas are often introduced using videos. An excellent example is the Microsoft Health Future Vision video (download, watch on YouTube).

Let’s focus on three themes from the video that are relevant to the senior projects: interactions with multi-touch interfaces, interactions with tangible user interfaces, and data manipulation/fusion. Multi-touch surfaces appear throughout the video: in Sabine’s home, in the doctor’s office, and in the hospital lobby. Several of the multi-touch interfaces, such as Sabine’s remote control and her virtual wallet (used in the lobby), are tangible interfaces. Finally, Dr. Kemp manipulates/fuses data when interacting with Alex (the patient in bed) and especially during the meeting with Sabine and Wei Yu.

The two senior projects will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

With all this in mind, here are the specifics on the two proposed projects.

Project 1: Mobile data fusion

This project will explore fusing data, such as images, video, sounds and outputs from various sensors (temperature, pressure, accelerometers, etc.). Data fusion will require some simple processing, such as setting beginning and end points for sounds, zooming images, etc. It will also require more complex digital signal processing of data, e.g. windowing and filtering (topics covered in ECE 714). Consequently, work on this project will focus on data processing as well as object-oriented programming that is necessary to control the multi-touch table.
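
As one concrete example of the DSP steps mentioned above, here is a minimal C++ sketch that applies a standard Hamming window to a block of samples, a typical first step before frequency-domain processing such as filtering. In the project itself this would more likely be done with library code:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

const double PI = 3.14159265358979323846;

// Apply a Hamming window, w[n] = 0.54 - 0.46*cos(2*pi*n/(N-1)),
// to one block of samples before frequency-domain processing.
std::vector<double> hammingWindow(const std::vector<double>& block) {
    const std::size_t N = block.size();
    std::vector<double> out(N);
    for (std::size_t n = 0; n < N; ++n) {
        double w = 0.54 - 0.46 * std::cos(2.0 * PI * n / (N - 1));
        out[n] = block[n] * w;
    }
    return out;
}

int main() {
    std::vector<double> block(8, 1.0);  // constant input shows the window shape
    for (double v : hammingWindow(block)) std::printf("%.3f\n", v);
}
```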

This project will be tied to a collaborative effort with WPI on creating a fleet of UAVs. Thus, eventually, the data to process and display on the multi-touch table will come from the UAVs. However, as a first step, data will be generated through games, similarly to work done by Jatin Matani and Trupti Telang.

Project 2: IR wallet

The Microsoft Surface uses infrared illumination and cameras to recognize interactions with its surface. It can also recognize 2D barcodes if they are visible in the IR part of the spectrum. The “IR wallet” project would result in a tangible user interface, similar to Sabine’s virtual wallet, that can display 2D barcodes in IR. These in turn will be picked up by the Microsoft Surface. Work on this project will focus on microcontroller-based design (e.g. with an Arduino board) and object-oriented programming for the Surface.
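
To give a flavor of the microcontroller side, here is a very rough Arduino (C++) sketch in which an 8-bit ID drives a bank of IR LEDs. To be clear, this is not the actual Surface tag format, whose printed dot geometry is defined by the Surface SDK; the pin assignments and the ID below are purely hypothetical:

```cpp
// Hypothetical "IR wallet" sketch: present an 8-bit ID on eight IR LEDs.
// Illustration only; real Microsoft Surface tags use a specific dot
// geometry defined by the Surface SDK, not a row of LEDs.
const int ledPins[8] = {2, 3, 4, 5, 6, 7, 8, 9};  // one IR LED per bit
byte walletId = 0b10110010;                       // the ID to display

void setup() {
  for (int i = 0; i < 8; ++i)
    pinMode(ledPins[i], OUTPUT);
}

void loop() {
  // Light each LED according to the corresponding bit of the ID.
  for (int i = 0; i < 8; ++i)
    digitalWrite(ledPins[i], bitRead(walletId, i) ? HIGH : LOW);
  // A button or serial command could change walletId to present a
  // different "card"; here the pattern is simply held steady.
}
```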

Interested? Have questions, ideas, suggestions? Email me.

Two posters at Ubicomp 2009

Our group presented two posters at last week’s Ubicomp 2009. Oskar Palinko and Michael Litchfield were on hand to talk about our multitouch table effort [1] (a great deal of the work for this poster was done by Ankit Singh). Zeljko Medenica introduced a driving simulator pilot study, done in collaboration with Tim Paek, that deals with using augmented reality for the user interface of a navigation device [2].

Oskar (center) and Mike (right)

Zeljko (center)

Oskar, Mike and I are working on expanding the multitouch study. We plan to start with an online study in which subjects will watch two videos, one in which a story is presented using the multitouch table and another with the same story presented using a simple slide show. Zeljko will head up the follow-on to the pilot study – take a look at the video below to see (roughly) what we’re planning to do.

Take a look at other pictures I took at Ubicomp 2009 on Flickr.

References

[1] Oskar Palinko, Ankit Singh, Michael A. Farrar, Michael Litchfield, Andrew L. Kun, “Towards Storytelling with Geotagged Photos on a Multitouch Display,” Conference Supplement, Ubicomp 2009.

[2] Zeljko Medenica, Oskar Palinko, Andrew L. Kun, Tim Paek, “Exploring In-Car Augmented Reality Navigation Aids: A Pilot Study,” Conference Supplement, Ubicomp 2009.