Lasting Impact Award for “Sensing Techniques for Mobile Interaction”

Last week I received a significant award for some of my early work in mobile sensing.

It was not that long ago really, that I would get strange glances from practical-minded people– those folks who would look at me with heads tilted downwards ever so slightly, eyebrows raised, and eyeballs askew– when I would mention how I was painting mobile devices with conductive epoxy and duct-taping accelerometers and infrared range-finders to them.

The dot-com bubble was still expanding, smartphones didn’t exist yet, and accelerometers were still far too expensive to reasonably consider on a device’s bill of materials. Many people still regarded the apex of handheld nirvana as the PalmPilot, although its luster was starting to fade.

And this Frankensteinian contraption of sensors, duct tape, and conductive epoxy was taking shape on my laboratory bench-top:

Sensing Pocket PC, circa 2000, with proximity range sensor, touch sensitivity, and tilt sensor

The Idea

I’d been dabbling in the area of sensor-enhanced mobile interaction for about a year, trying one idea here, another idea there, but the project had stubbornly refused to come together. For a long time I felt like it was basically a failure. But every so often my colleagues on the project– Jeff Pierce, Mike Sinclair, and Eric Horvitz– and I would come up with one new example, or another idea to try out, and slowly we populated a space of interesting new ways to use the sensors to make mobile devices smarter– or to be more honest about it, just a little bit less stupid– in how they responded to the physical environment, how the user was handling the device, or the orientation of the screen.

The latter led to the idea of using the accelerometer to automatically re-orient the display based on how the user was holding the device. The accelerometer gave us a constant signal of this-way-up, and at some point we realized it would make a great way to switch between portrait and landscape display formats without any need for buttons or menus, or indeed without even explicitly having to think about the interaction at all. The handheld, by being perceptive about it, could offload the decision from the user– hey, I need to look at this table in landscape— to the background of the interaction, so that the user could simply move the device to the desired orientation, and our sensors and our software would automatically optimize the display accordingly.
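The core of that mapping is simple: read the accelerometer's this-way-up signal and bucket the tilt angle into one of four display orientations. The sketch below is an illustrative reconstruction, not the original code; the axis conventions, function name, thresholds, and 45-degree boundaries are my own assumptions:

```python
import math

def orientation_from_gravity(ax, ay, az, flat_threshold=0.8):
    """Map a 3-axis accelerometer reading (in g) to a display orientation.

    Assumed axis convention: +x points right and +y points up when the
    device is held in portrait; +z points out of the screen. Returns
    None when the device is lying roughly flat on a surface.
    """
    if abs(az) > flat_threshold:
        # Gravity is mostly along z: the device is flat, so there is
        # no meaningful in-plane orientation to report.
        return None
    # Tilt angle within the screen plane, in degrees: 0 means upright.
    angle = math.degrees(math.atan2(ax, ay))
    if -45 <= angle < 45:
        return "portrait"
    elif 45 <= angle < 135:
        return "landscape-right"
    elif angle >= 135 or angle < -135:
        return "portrait-upside-down"
    else:
        return "landscape-left"
```

Feeding raw readings straight into a switch like this is deliberately naive; the subtleties described next are what made it usable.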

There were also some interesting subtleties to it. Just using the raw angle of the display, relative to gravity, was not that satisfactory. We built in some hysteresis so the display wouldn’t chatter back and forth between different orientations. We added special handling when you put the handheld down flat on a desk, or picked it back up, so that the screen wouldn’t accidentally flip to a different orientation because of this brief, incidental motion. We noticed that flipping the screen upside-down, which we initially thought wouldn’t be useful, was an effective way to quickly show the contents of the screen to someone seated across the table from you. And we also added some layers of logic in there so that other uses of the accelerometer could co-exist with automatic screen rotation.
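The hysteresis and flat-on-desk handling described above amount to a small state machine: a newly sensed orientation must persist for a moment before the display commits to it, and a flat reading holds the current orientation steady. A minimal sketch, with a made-up `RotationFilter` class and an illustrative half-second dwell time rather than the original system's values:

```python
import time

class RotationFilter:
    """Debounced screen rotation with hysteresis.

    A candidate orientation must persist for `dwell` seconds before the
    display flips. This suppresses chatter near the angular boundaries
    and ignores the brief, incidental tilt that happens when the device
    is set down flat or picked back up.
    """
    def __init__(self, dwell=0.5):
        self.dwell = dwell
        self.current = "portrait"
        self.candidate = None
        self.candidate_since = None

    def update(self, sensed, now=None):
        """Feed one sensed orientation (or None when the device is flat);
        return the orientation the display should show."""
        now = time.monotonic() if now is None else now
        if sensed is None or sensed == self.current:
            # Flat, or no change: drop any pending candidate and hold.
            self.candidate = None
            return self.current
        if sensed != self.candidate:
            # A new candidate orientation: start the dwell clock.
            self.candidate = sensed
            self.candidate_since = now
        elif now - self.candidate_since >= self.dwell:
            # Candidate held long enough: commit the flip.
            self.current = sensed
            self.candidate = None
        return self.current
```

For example, a momentary tilt while setting the device down never reaches the dwell threshold, so the screen stays put.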

Once we had this automatic screen rotation idea working well, I knew we had something. We worked furiously right up to the paper deadline, hammering out additional techniques, working out little kinks and details, figuring out how to convey the terrain we’d explored in the paper we were writing.

The reviewers all loved the paper, and it received a Best Paper Award at the conference. We had submitted it to the Association for Computing Machinery’s annual UIST Symposium– UIST 2000, the 13th Annual Symposium on User Interface Software and Technology, held in San Diego, California– because we knew the UIST community was ideally suited to evaluate this research. The paper had a novel combination of sensors. It was a systems paper– that is, it did not just propose a one-off technique but rather a suite of techniques that all used the sensors in a variety of creative ways that complemented one another. And UIST is a rigorously peer-reviewed single-track conference. It’s not the largest conference in the field of Human-Computer Interaction by a long shot– for many years it averaged about two hundred attendees– but as my Ph.D. advisor Randy Pausch (now known for “The Last Lecture”) would often say, “UIST is only 200 people, but it’s the right 200 people.”

This is the video, recorded back in the year 2000, that accompanied the paper. I think it’s stood the test of time pretty well– or at least a lot better than the hair on top of my head :-).

Sensing Techniques for Mobile Interaction on YouTube

The Award

Fast forward ten years, and the vast majority of handhelds and slates being produced today include accelerometers and other micro-electromechanical wonders. The cost of these sensors has dropped to essentially nothing. Increasingly, they’re included as a co-processor right on the die with other modules of mobile microprocessors. The day will soon come when it will be all but impossible to purchase a device without sensors directly integrated into the microscopic Manhattan of its silicon gates.

And our mobile screens all automatically rotate, like it or not 🙂

So, it was with great pleasure last week that I attended the 24th annual ACM UIST Symposium (UIST 2011), and received a Lasting Impact Award, presented to me by Stanford professor Dr. Scott Klemmer, for the contributions of our UIST 2000 paper “Sensing Techniques for Mobile Interaction.”

The inscription on the award reads:

Awarded for its scientific exploration of mobile interaction, investigating new interaction techniques for handheld mobile devices supported by hardware sensors, and laying the groundwork for new research and industrial applications.

UIST 2011 Lasting Impact Award

In the Meantime…

I remember demonstrating my prototype on-stage with Bill Gates at a media event here in Redmond, Washington in 2001. Gates spoke about the importance of sustaining R&D spending– in both the public and private sectors– and he used my demo as an example of up-and-coming research, but what I most strongly recall is lingering in the green room backstage with him and some other folks. It wasn’t the first time that I’d met Gates, but it was the first occasion where I chit-chatted with him a bit in a casual, unstructured context. I don’t remember what we talked about but I do remember his foot twitching, always in motion, driving the pedal of a vast invisible loom, weaving a sweeping landscape surmounted by the towering summits of his electronic dreams.

I remember my palms sweating, nervous about the demo, hoping that the sensors I’d duct-taped to my transmogrified Cassiopeia E-105 Pocket PC wouldn’t break off or drain the battery or go crazy with some unforeseen nuance of the stage lighting (yes, infrared proximity sensors most definitely have stage fright).

And then less than a week later came the 9/11 attacks. Suddenly spiffy little sensors for mobile devices didn’t seem so important any more. Many product groups, including Windows Mobile at the time, got excited about my demonstration but then the realities of a thousand other crushing demands and priorities rained down on the fragile bubble of technological wonderland I’d been able to cobble together with my prototype. The years stretched by and sensors still hadn’t become mainstream like I had expected them to be.

Then some laptops started shipping with accelerometers to automatically park the hard-disk when you dropped the laptop. I remember seeing digital cameras that would sense the orientation you snapped a picture in, so that you could view it properly when you downloaded it. And when the iPhone shipped in 2007, one of the coolest features on it was the embedded accelerometer, which enabled automatic screen rotation and tilt-based games.

A View to the Future

It took about five years longer than I expected, but we have finally reached an age where clever uses of sensors– both for obvious things like games, as well as for subtle and not-so-obvious things like counting footfalls while you are walking around with the device– abound.

And my take on all this?

We ain’t seen nothin’ yet.

Since my initial paper on sensing techniques for mobile interaction, every couple of years another idea has struck me. How about answering your phone, or cuing a voice-recognition mode, just by holding your phone to your ear? How about bumping devices together as a way to connect them? What of dual-screen devices that can sense the posture of the screens, and thereby support a breadth of automatically sensed functions? What about new types of motion gestures that combine multi-touch interaction with the physical gestures, or vibratory signals, afforded by these sensors?
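To pick one of those ideas, bump-to-connect reduces to spike detection plus timestamp matching: each device watches for a sharp jolt in its accelerometer, and two devices pair up if their jolts happened at nearly the same instant. A toy sketch under my own assumptions (the threshold, window, and function names are all hypothetical, and the actual exchange of timestamps between devices is omitted):

```python
import math

def detect_bumps(samples, threshold=2.5):
    """Return indices of accelerometer samples whose magnitude exceeds
    `threshold` (in g): a crude spike detector for bump gestures.
    `samples` is a list of (ax, ay, az) tuples; the threshold is an
    illustrative guess, not a tuned value.
    """
    return [i for i, (ax, ay, az) in enumerate(samples)
            if math.sqrt(ax * ax + ay * ay + az * az) > threshold]

def bumped_together(spike_time_a, spike_time_b, window=0.1):
    """Pair two devices if each reported a spike within `window` seconds
    of the other: the essence of a bump-to-connect handshake, with the
    transport for sharing timestamps (radio, server) left out.
    """
    return abs(spike_time_a - spike_time_b) <= window
```

In practice the matching would need synchronized clocks and some defense against coincidental jolts, but the shape of the idea is that small.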

And I’m sure there are many more. My children will never know a world where their devices are not sensitive to motion and proximity, to orientation and elevation and all the headings of the compass.

The problem is, the future is not so obvious until you’ve struck upon the right idea, until you’ve found the one gold nugget in acres and acres of tailings from the mine of your technological ambitions.

A final word of advice: if your aim is to find these nuggets– whether in research or in creative endeavors– what you need to do is dig as fast as you possibly can. Burrow deeper. Dig side-tunnels where no one has gone before. Risk collapse and explosion and yes, worst of all, complete failure and ignominious rejection of your diligently crafted masterpieces.

Above all else, fail faster.

Because sometimes those “failed” projects turn out to be the most rewarding of all.

***

This project would not have been possible without standing on the shoulders of many giants. Of course, there are my colleagues on the project– Jeff Pierce, who worked with me as a Microsoft Research Graduate Fellowship recipient at the time, and did most of the heavy lifting on the software infrastructure and contributed many of the ideas and nuances of the resulting techniques. Mike Sinclair, who first got me thinking about accelerometers and spent many, many hours helping me cobble together the sensing hardware. And Eric Horvitz, who helped to shape the broad strokes of the project and who was always an energetic sounding board for ideas.

With the passing of time that an award like this entails, one also reflects on how life has changed, and the people who are no longer there. I think of my advisor Randy Pausch, who in many ways has made my entire career possible, and his epic struggle with pancreatic cancer. I think of my first wife, Kerrie Exely, who died in 1997, and of her father, Bill, who also was claimed by cancer a couple of years ago.

Then there are the many scientists whose work I built upon in our exploration of sensing systems. Beverly Harrison’s explorations of embodied interactions. Albrecht Schmidt’s work on context sensing for mobile phones. Jun Rekimoto’s exploration of tilting user interfaces. Bill Buxton’s insights into background sensing. And many others cited in the original paper.

22 responses to “Lasting Impact Award for ‘Sensing Techniques for Mobile Interaction’”

  1. Congratulations! Well deserved!

  2. Thanks for sharing the story. Very interesting. Congratulations on the award!

  3. Congrats, Ken — well deserved indeed ! And thanks for this great writeup too !

  4. Congratulations — very well deserved!

  5. What a kid!
    Love from your dad

  6. Ken, congrats on the award. You don’t know me but I’ve been following since I discovered InkSeine. Regarding screen stitching for PPCs…are you aware of any WM applications available for experimentation or public use? I’m not a software guy, only an avid user of WM devices. My tablet pc went out of commission exactly 2 months after the warranty expired and I just haven’t had the heart or money to replace it. Instead I use multiple PPCs. Each one is dedicated to a specific context…home, work, and outside. But I would love to set them side by side and exchange files and share screens. I’ve seen a lot of research papers for apps but is anything available for the typical Joe like me?

    • Good question. I am honestly not sure. I would not be surprised if there was something like this in the Windows Phone 7 Marketplace. Or if not, there certainly should be. Hmmmm….

      There are certainly multiple utilities like this for tablets and PCs that let you have a shared clipboard and the like. You might hunt around for those and see if any have clients for Windows Mobile as well.

      Sorry to hear about your tablet PC. Actually, it’s funny, out of the dozens of tablet PCs that I have amassed over the years, I can only think of two that stopped working, and both were old Toshiba Porteges (the digitizers would get stripes where they stopped sensing the pen). Wish I could just mail one your way, but they all have Microsoft asset tags on them so I’d probably get in trouble for that 🙂

  7. Ha! Yes, please do not get yourself in any trouble 😉 but I appreciate the sentiment. I had not thought about the shared clipboard thing. Will let you know if I find something that works.

  8. Hi Ken,
    Just wondering about your thoughts with regards to the inking experience in Windows 8. I am currently using an Asus Eee Slate with Windows 7 which I love.

    • Hi Rami,

      I won’t be commenting on anything about Windows 8 until it is shipped. Sorry, but it just isn’t appropriate for me to comment here about an unreleased future Microsoft product.

      I was very excited however to see the preliminary release of Win 8 made available and I hope that everyone will try it out and give their feedback on it so we can deliver the best possible experience for slates (and other devices, of course) when the product does get released.

      Also glad to hear you are enjoying the Asus Eee Slate. I have two of those, actually, myself (both are being used in a project we currently have going looking at new ways that we might better support the “society of devices” — the likely future where we all have multiples of slates and other devices populating our homes and workspaces — to see what new and useful things we can come up with there).

      Ken

      • Would love to hear more about this project. I am constantly using multiple devices in the office and would love for them to work better together.

      • Hi Rami,

        Yes, it definitely will be a fun project, but it will be a while before it is ready for publication.

        However, we did do a similar project recently that illustrates some of what we are thinking: check out CodeSpace
        as written up on Engadget and lots of other places. http://www.engadget.com/2011/11/14/microsoft-outlines-code-space-looks-to-include-kinect-in-confer/

        I also recommend checking out some of Bill Buxton’s writings / talks about what he calls “Whereable” computing. Bill is a colleague of mine here and is involved with my new project a little bit, actually.

        Thanks
        Ken

  9. A great story about a great accomplishment.

    • Andruid,

      A belated thank you for your comment, and great to finally meet you in person at the CHI program committee meeting. Hope to chat with you again soon.

      Ken

  10. This is truly a great work!!

  11. This is really groundbreaking work at the time! Well deserved. And great write up too!

  12. Pingback: Invited Talk: WIPTTE 2015 Presentation of Sensing Techniques for Tablets, Pen, and Touch | The Past and Present Future
