It’s been a busy year, so I’ve been more than a little remiss in posting my Best Paper Award recipient from last year’s User Interface Software & Technology (UIST) symposium.
UIST is a great venue, particularly renowned for publishing cutting-edge innovations in devices, sensors, and hardware.
And software that makes clever use thereof.
The paper takes two long-standing research themes for me — pen (plus touch) interaction, and interesting new ways to use sensors — and smashes them together to produce the ultimate Frankenstein child of tablet computing:
And if you were to unpack this orange-gauntleted beast, here’s what you’d find:
But although the end-goal of the project is to explore the new possibilities afforded by sensor technology, in many ways, this paper kneads a well-worn old worry bead for me.
It’s all about the hand.
With little risk of exaggeration you could say that I’ve spent decades studying nothing but the hand. And how the hand is the window to your mind.
Or shall I say hands. How people coordinate their action. How people manipulate objects. How people hold things. How we engage with the world through the haptic sense, how we learn to articulate astoundingly skilled motions through our fingers without even being consciously aware that we’re doing anything at all.
I’ve constantly been staring at hands for over 20 years.
And yet I’m still constantly surprised.
People exhibit all sorts of manual behaviors, tics, and mannerisms, hiding in plain sight, that seemingly inhabit a strange shadow-world — the realm of the seen but unnoticed — because these behaviors are completely obvious yet somehow they still lurk just beneath conscious perception.
Nobody even notices them until some acute observer takes the trouble to point them out.
Take a behavior as simple as holding a pen in your hand.
You hold the pen to write, of course, but most people also tuck the pen between their fingers to momentarily stow it for later use. Other people do this in a different way, and instead palm the pen, in more of a power grip reminiscent of how you would grab a suitcase handle. Some people even interleave the two behaviors, based on what they are currently doing and whether or not they expect to use the pen again soon:
This seems very simple and obvious, at least in retrospect. But such behaviors have gone almost completely unnoticed in the literature, much less actively sensed by the tablets and pens that we use — or even leveraged to produce more natural user interfaces that can adapt to exactly how the user is currently holding and using their devices.
If we look deeper into these writing and tucking behaviors alone, a whole set of grips and postures of the hand emerge:
Looking even more deeply, once we have tablets that support a pen as well as full multi-touch, users naturally want to use their bare fingers on the screen in combination with the pen, so we see another range of manual behaviors that we call extension grips, based on placing one (or more) fingers on the screen while holding the pen:
People also exhibited more ways of using multiple fingers on the touchscreen than I expected:
So, it began to dawn on us that there was all this untapped richness in how people hold and manipulate the pen, write with it, and extend fingers to the screen when using pen and touch on tablets.
And that sensing this could enable some very interesting new possibilities for the user interfaces for stylus + tablet computing.
This is where our custom hardware came in.
On our pen, for example, we can sense subtle motions — using full 3D inertial sensors including accelerometer, gyroscope, and magnetometer — as well as sense how the user grips the pen — this time using a flexible capacitive substrate wrapped around the entire barrel of the pen.
These capabilities then give rise to sensor signals such as the following:
This makes various pen grips and motions stand out quite distinctly as states that we can identify using some simple gesture recognition techniques.
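To give a flavor of what "simple gesture recognition" over the grip sensor could look like, here is a minimal sketch of nearest-centroid matching against template capacitance images. This is my own illustration, not the paper's implementation: the grid size, grip names, and calibration-template scheme are all assumptions.

```python
import numpy as np

# Hypothetical grip classes and capacitance-image templates, as might be
# collected in a per-user calibration pass (names are illustrative).
GRIP_CLASSES = ["writing", "tuck", "palm"]

def classify_grip(cap_image, templates):
    """Nearest-centroid classification of a capacitance image.

    cap_image: 2D array of capacitance readings around the pen barrel.
    templates: dict mapping grip name -> template image of the same shape.
    Returns the best-matching grip name and its cosine similarity.
    """
    x = cap_image.ravel().astype(float)
    x /= (np.linalg.norm(x) + 1e-9)          # normalize out grip pressure
    best, best_sim = None, -np.inf
    for name, tmpl in templates.items():
        t = tmpl.ravel().astype(float)
        t /= (np.linalg.norm(t) + 1e-9)
        sim = float(x @ t)                   # cosine similarity in [−1, 1]
        if sim > best_sim:
            best, best_sim = name, sim
    return best, best_sim

# Toy usage with synthetic templates: a noisy "tuck" image should still
# match the "tuck" template most closely.
rng = np.random.default_rng(0)
templates = {name: rng.random((10, 30)) for name in GRIP_CLASSES}
sample = templates["tuck"] + 0.05 * rng.random((10, 30))
label, sim = classify_grip(sample, templates)
```

A real system would of course fuse this with the inertial signals and smooth decisions over time, but the core idea — grips leave distinctive capacitance "images" on the barrel — is this simple.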
Armed with these capabilities, we explored presenting a number of context-appropriate tools.
As the very simplest example, we can detect when you’re holding the pen in a grip (and posture) that indicates that you’re about to write. Why does this matter? Well, if the touchscreen responds when you plant your meaty palm on it, it causes no end of mischief in a touch-driven user interface. You’ll hit things by accident. Fire off gestures by mistake. Leave little “ink turds” (as we affectionately call them) on the screen if the application responds to touch by leaving an ink trace. But once we can sense it’s your palm, we can go a long ways towards solving these problems with pen-and-touch interaction.
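As a back-of-the-envelope illustration of how sensed grip can drive palm rejection, here's a toy heuristic. The threshold, state names, and the hover signal are assumptions of mine, not the paper's algorithm:

```python
def should_reject_touch(grip_state, contact_area_mm2, pen_hovering):
    """Heuristic palm rejection (illustrative sketch, not the paper's method).

    grip_state:       sensed pen grip, e.g. "writing" or "tuck".
    contact_area_mm2: area of the new touch contact.
    pen_hovering:     True if the pen tip is near the screen.
    """
    PALM_AREA_THRESHOLD = 300.0  # mm^2; assumed value, tune per device
    if grip_state == "writing" and pen_hovering:
        return True   # writing grip + pen near screen: the palm is coming
    if contact_area_mm2 > PALM_AREA_THRESHOLD:
        return True   # too large to be a deliberate fingertip
    return False
```

The point is that the grip sensor lets you reject the palm *before* it lands, rather than trying to undo accidental ink and gestures after the fact.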
To pull the next little rabbit out of my hat, if you tap the screen with the pen in hand, the pen tools (what else?) pop up:
But we can take this even further, such as to distinguish bare-handed touches — to support the standard panning and zooming behaviors — versus a pinch articulated with the pen-in-hand, which in this example brings up a magnifying glass particularly suited to detail work using the pen:
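Conceptually, this is just a dispatch on (touch gesture, sensed grip) pairs. A hypothetical sketch of that routing — the gesture and tool names are mine, chosen to mirror the examples above:

```python
def interpret_touch(gesture, pen_in_hand):
    """Route a touch gesture based on whether the grip sensor says the
    pen is held in the touching hand (illustrative names, not an API)."""
    if gesture == "tap" and pen_in_hand:
        return "pen_tools_palette"   # tap with pen in hand: pen tools pop up
    if gesture == "pinch" and pen_in_hand:
        return "magnifier"           # pinch with pen in hand: detail loupe
    if gesture in ("pinch", "drag"):
        return "pan_zoom"            # bare-handed touch: standard navigation
    return "ignore"
```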
Another really fun way to use the sensors — since we can sense the 3D orientation of the pen even when it is away from the screen — is to turn it into a digital airbrush:
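The geometry behind an airbrush like this is worth a quick sketch: model the spray as a cone from the pen tip and intersect it with the screen, so tilting the pen elongates and offsets the footprint just like a physical airbrush. The cone model, parameter names, and the approximation for the minor axis are my own, not taken from the paper:

```python
import math

def spray_footprint(altitude_deg, height_mm, cone_half_angle_deg=12.0):
    """Approximate airbrush spray ellipse from the pen's sensed orientation.

    altitude_deg: angle between the pen barrel and the tablet surface
                  (90 means the pen points straight down).
    height_mm:    height of the pen tip above the screen.
    Returns (offset_mm, major_mm, minor_mm): how far the spray center lands
    ahead of the tip along the tilt azimuth, plus the ellipse axes.
    """
    beta = math.radians(cone_half_angle_deg)
    theta = math.pi / 2 - math.radians(altitude_deg)  # tilt from the normal
    # The cone's two edge rays (in the tilt plane) hit the screen at:
    near = height_mm * math.tan(theta - beta)
    far = height_mm * math.tan(theta + beta)
    offset = (near + far) / 2.0          # center of the footprint
    major = far - near                   # elongated along the tilt azimuth
    minor = 2 * height_mm * math.tan(beta) / math.cos(theta)  # approximation
    return offset, major, minor
```

Held perpendicular, the footprint is a circle centered under the tip; as you tilt the pen toward the screen, the ellipse stretches and slides away along the tilt direction, which is exactly the behavior artists expect from a real airbrush.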
At any rate, it was a really fun project that garnered a best paper award, and a fair bit of press coverage (Gizmodo, Engadget, & named FastCo Design’s #2 User Interface innovation of 2014, among other coverage). It’s pretty hard to top that.
Unless maybe we do a lot more with all kinds of cool sensors on the tablet as well.
You might just want to stay tuned here. There’s all kinds of great stuff in the works, as always (grin).
Hinckley, K., Pahud, M., Benko, H., Irani, P., Guimbretiere, F., Gavriliu, M., Chen, X., Matulic, F., Buxton, B., Wilson, A., Sensing Techniques for Tablet+Stylus Interaction. In the 27th ACM Symposium on User Interface Software and Technology (UIST’14). Honolulu, Hawaii, Oct 5-8, 2014, pp. 605-614. http://dx.doi.org/10.1145/2642918.2647379
- [PDF] [video – WMV] [Watch on YouTube]
- Best Paper Award (UIST 2014)
- Named FastCo Design’s #2 User Interface Innovation of 2014.
- I also have a talk available about this work, which I presented at the WIPPTE workshop.