I have to admit it: I feel as if I’m looking at the sunrise of what may be a whole new way of interacting with mobile devices.
When I think about it, the possibilities bathe my eyes in a golden glow, and the warmth drums against my skin.
My latest research peers out across this vivid horizon, to where I see touch — and mobile interaction with touchscreens in particular — evolving in the near future.
As a seasoned researcher, I see my job (in reality some strange admixture of interaction designer, innovator, and futurist) not as predicting the future, but as inventing it by extrapolating from a sort of visionary present that occupies my waking dreams.
I see things not as they are, but as they could be, through the lens afforded by a (usually optimistic) extrapolation from extant technologies, or those I know are likely to soon become more widely available.
With regard to touchscreen interaction, it has been clear to me for some time that the ability to sense the fingers as they approach the device — well before contact with the screen itself — is destined to become commonplace on commodity devices.
This is interesting for a number of reasons.
And no, the ability to do goofy gestures above the screen, waving at it frantically (as if it were a fancy-pants towel dispenser in a public restroom) in some dim hope of receiving an affirmative response, is not one of them.
In terms of human capabilities, one obviously cannot touch the screen of a mobile device without approaching it first.
But what often goes unrecognized is that one also must hold the device, typically in the non-preferred hand, as a precursor to touch. Hence, how you hold the device — the pattern of your grip and which hand you hold it in — is an additional detail of context that current mobile devices more or less wholly ignore.
So in this new work, my colleagues and I collectively refer to these two precursors of touch — approach and the need to grip the device — as pre-touch.
And it is my staunch belief that the ability to sense such pre-touch information could radically transform the mobile ‘touch’ interfaces that we all have come to take for granted.
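To make the idea concrete, here is a minimal sketch, in Kotlin, of what a pre-touch event might carry beyond an ordinary touch event, and how an interface might respond before the finger ever lands. The event fields and handler below are hypothetical illustrations of the concept — they are not the API of our research prototype.

```kotlin
// Hypothetical sketch: what a pre-touch event might expose beyond an
// ordinary touch event. All names here are illustrative, not the actual
// interface of the prototype described in this post.

enum class Hand { LEFT, RIGHT, UNKNOWN }

data class PreTouchEvent(
    val hoverX: Float,        // projected finger position (normalized 0..1)
    val hoverY: Float,
    val hoverHeightMm: Float, // estimated finger height above the glass
    val gripHand: Hand        // which hand is inferred to hold the device
)

// Fade in controls near the approaching finger, biased toward the
// thumb's reachable area for the inferred grip hand.
fun onPreTouch(e: PreTouchEvent) {
    if (e.hoverHeightMm < 30f) {
        val anchorX = when (e.gripHand) {
            Hand.LEFT -> minOf(e.hoverX, 0.4f)   // keep UI in left-thumb reach
            Hand.RIGHT -> maxOf(e.hoverX, 0.6f)  // keep UI in right-thumb reach
            Hand.UNKNOWN -> e.hoverX
        }
        println("Reveal controls near x=$anchorX, y=${e.hoverY}")
    }
}

fun main() {
    // A finger approaching from the right, device held in the right hand.
    onPreTouch(PreTouchEvent(hoverX = 0.8f, hoverY = 0.5f,
                             hoverHeightMm = 12f, gripHand = Hand.RIGHT))
}
```

The point of the sketch is simply that both precursors — the approach trajectory and the grip — arrive as context before the touch itself, so the interface can adapt in anticipation rather than in reaction.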
You can get a sense of these possibilities, all implemented on a fully functional mobile phone with pre-touch sensing capability, in our demo reel (see the video links at the end of this post).
The project received a lot of attention and coverage from many of the major tech blogs and other media outlets, for example:
- The Verge (“Microsoft’s hover gestures for Windows phones are magnificent”)
- SlashGear (“Smartphones next big thing: ‘Pre-Touch’”)
- Business Insider (“Apple should definitely copy Microsoft’s incredible finger-sensing smartphone technology”)
- Fast Company Design (and again in “8 Incredible Prototypes That Show The Future Of Human-Computer Interaction”)
But I rather liked the take that Silicon Angle offered, which quoted my concluding statement from the video above:
Taken as a whole, our exploration of pre-touch hints that the evolution of mobile touch may still be in its infancy – with many possibilities, unbounded by the flatland of the touchscreen, yet to explore.
And then responded as follows:
This is the moon-landing-esque conclusion Microsoft comes to after demonstrating its rather cool pre-touch mobile technology, i.e., a mobile phone that senses what your fingers are about to do.
While this evolution of touch has been coming in the research literature for at least a decade now, what exactly to do with above- and around-screen sensing (especially in a mobile setting) has been far from obvious. And that’s where I think our work on pre-touch sensing techniques for mobile interaction distinguishes itself, and in so doing identifies some very interesting use cases that have never been realized before.
The very best of these new techniques possess a quality that I love, namely that they have a certain surprising obviousness to them:
The techniques seem obvious — but only in retrospect.
And only after you’ve been surprised by the new idea or insight that lurks behind them.
If such an effort is indeed the first hint of a moonshot for touch, well, that’s a legacy for this project that I can live with.
UPDATE: The talk I gave at the CHI 2016 conference on this project is now available. Have a gander if you are so inclined.
Ken Hinckley, Seongkook Heo, Michel Pahud, Christian Holz, Hrvoje Benko, Abigail Sellen, Richard Banks, Kenton O’Hara, Gavin Smyth, and William Buxton. 2016. Pre-Touch Sensing for Mobile Interaction. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), San Jose, CA, May 7–12, 2016. ACM, New York, NY, USA, 2869–2881. http://dx.doi.org/10.1145/2858036.2858095
[PDF] [Talk slides PPTX] [video – MP4] [30 second preview – MP4] [Watch on YouTube]