Category Archives: note taking

Paper: Sensing Techniques for Tablet+Stylus Interaction (Best Paper Award)

It’s been a busy year, so I’ve been more than a little remiss in posting about my Best Paper Award recipient from last year’s User Interface Software & Technology (UIST) symposium.

UIST is a great venue, particularly renowned for publishing cutting-edge innovations in devices, sensors, and hardware.

And software that makes clever uses thereof.

Title slide - sensing techniques for stylus + tablet interaction

Title slide from my talk on this project. We had a lot of help, fortunately. The picture illustrates a typical scenario in pen & tablet interaction — where the user interacts with touch, but the pen is still at the ready, in this case palmed in the user’s fist.

The paper takes two long-standing research themes for me — pen (plus touch) interaction, and interesting new ways to use sensors — and smashes them together to produce the ultimate Frankenstein child of tablet computing:

Stylus prototype augmented with sensors

Microsoft Research’s sensor pen. It’s covered in groovy orange shrink-wrap, too. What could be better than that? (The shrink wrap proved necessary to protect some delicate connections between our grip sensor and the embedded circuitry).

And if you were to unpack this orange-gauntleted beast, here’s what you’d find:

Sensor components inside the pen

Components of the sensor pen, including inertial sensors, a AAAA battery, a Wacom mini pen, and a flexible capacitive substrate that wraps around the barrel of the pen.

But although the end-goal of the project is to explore the new possibilities afforded by sensor technology, in many ways, this paper kneads a well-worn old worry bead for me.

It’s all about the hand.

With little risk of exaggeration you could say that I’ve spent decades studying nothing but the hand. And how the hand is the window to your mind.

Or shall I say hands. How people coordinate their action. How people manipulate objects. How people hold things. How we engage with the world through the haptic sense, how we learn to articulate astoundingly skilled motions through our fingers without even being consciously aware that we’re doing anything at all.

I’ve constantly been staring at hands for over 20 years.

And yet I’m still constantly surprised.

People exhibit all sorts of manual behaviors, tics, and mannerisms, hiding in plain sight, that seemingly inhabit a strange shadow-world — the realm of the seen but unnoticed — because these behaviors are completely obvious yet somehow they still lurk just beneath conscious perception.

Nobody even notices them until some acute observer takes the trouble to point them out.

For example:

Take a behavior as simple as holding a pen in your hand.

You hold the pen to write, of course, but most people also tuck the pen between their fingers to momentarily stow it for later use. Other people do this in a different way, and instead palm the pen, in more of a power grip reminiscent of how you would grab a suitcase handle. Some people even interleave the two behaviors, based on what they are currently doing and whether or not they expect to use the pen again soon:

Tuck and Palm Grips for temporarily stowing a pen

Illustration of tuck grip (left) vs. palm grip (right) methods of stowing the pen when it is temporarily not in use.

This seems very simple and obvious, at least in retrospect. But such behaviors have gone almost completely unnoticed in the literature, much less actively sensed by the tablets and pens that we use — or even leveraged to produce more natural user interfaces that can adapt to exactly how the user is currently holding and using their devices.

If we look deeper into these writing and tucking behaviors alone, a whole set of grips and postures of the hand emerge:

Core Pen Grips

A simple design space of common pen grips and poses (postures of the hand) in pen and touch computing with tablets.

Looking even more deeply, once we have tablets that support a pen as well as full multi-touch, users naturally want to use their bare fingers on the screen in combination with the pen, so we see another range of manual behaviors that we call extension grips, based on placing one (or more) fingers on the screen while holding the pen:

Single Finger Extension Grips for Touch Gestures with Pen-in-hand

Much richness in “extension” grips, where touch is used while the pen is still being held, can also be observed. Here we see various single-finger extension grips for the tuck vs. the palm style of stowing the pen.

People also exhibited more ways of using multiple fingers on the touchscreen than I expected:

Multiple Finger Extension Grips for Touch Gestures with Pen-in-hand

Likewise, people extend multiple fingers while holding the pen to pinch or otherwise interact with the touchscreen.

So, it began to dawn on us that there was all this untapped richness in terms of how people hold, manipulate, write on, and extend fingers when using pen and touch on tablets.

And that sensing this could enable some very interesting new possibilities for the user interfaces for stylus + tablet computing.

This is where our custom hardware came in.

On our pen, for example, we can sense subtle motions — using full 3D inertial sensors including accelerometer, gyroscope, and magnetometer — as well as sense how the user grips the pen — this time using a flexible capacitive substrate wrapped around the entire barrel of the pen.

These capabilities then give rise to sensor signals such as the following:

Grip and motion sensors on the stylus
Sensor signals for the pen’s capacitive grip sensor with the writing grip (left) vs. the tuck grip (middle). Exemplar motion signals are shown on the right.

This makes various pen grips and motions stand out quite distinctly as states that we can identify using some simple gesture recognition techniques.
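To make this concrete, here is a rough sketch of how such a grip recognizer might work, reducing the capacitive grip image to a couple of features and matching against per-grip templates. Everything here is illustrative (a toy nearest-centroid classifier with made-up sensor dimensions and template values), not the recognizer we actually built:

```python
# Hypothetical sketch: classifying pen grips from a capacitive grip image.
# Sensor dimensions, features, and template values are all illustrative.

def grip_features(grip_map):
    """Reduce a 2D capacitive image (rows x cols of 0..1 values) to a
    small feature vector: total contact area and the centroid of
    contact along the pen barrel (row index 0 = pen tip)."""
    total = sum(sum(row) for row in grip_map)
    if total == 0:
        return (0.0, 0.0)
    centroid = sum(r * sum(row) for r, row in enumerate(grip_map)) / total
    return (total, centroid)

def classify_grip(grip_map, templates):
    """Nearest-centroid classification against per-grip templates."""
    area, centroid = grip_features(grip_map)
    def dist(name):
        t_area, t_centroid = templates[name]
        return (area - t_area) ** 2 + (centroid - t_centroid) ** 2
    return min(templates, key=dist)

# Illustrative templates: (contact_area, centroid_row) per grip.
TEMPLATES = {
    "writing": (6.0, 2.0),   # fingers clustered near the tip
    "tuck":    (4.0, 5.0),   # light contact mid-barrel
    "palm":    (12.0, 6.0),  # large contact area, whole fist
}
```

In practice the inertial signals would be folded in as additional features, but even this caricature conveys the idea: distinct grips leave distinct fingerprints on the barrel.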

Armed with these capabilities, we explored presenting a number of context-appropriate tools.

As the very simplest example, we can detect when you’re holding the pen in a grip (and posture) that indicates that you’re about to write. Why does this matter? Well, if the touchscreen responds when you plant your meaty palm on it, it causes no end of mischief in a touch-driven user interface. You’ll hit things by accident. Fire off gestures by mistake. Leave little “ink turds” (as we affectionately call them) on the screen if the application responds to touch by leaving an ink trace. But once we can sense it’s your palm, we can go a long ways towards solving these problems with pen-and-touch interaction.
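A minimal sketch of that palm-rejection policy might look like the following. The threshold, field names, and grip labels are purely illustrative stand-ins, not values from our system:

```python
# Hypothetical palm-rejection policy: while the grip sensor reports a
# writing grip, large touch contacts are treated as the planted palm
# and suppressed. The threshold below is illustrative.

PALM_AREA_THRESHOLD = 500.0  # mm^2; a fingertip contact is far smaller

def accept_touch(touch_area_mm2, current_grip):
    """Return True if the touch event should reach the application."""
    if current_grip == "writing" and touch_area_mm2 >= PALM_AREA_THRESHOLD:
        return False  # almost certainly the palm; no ink turds today
    return True
```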

To pull the next little rabbit out of my hat, if you tap the screen with the pen in hand, the pen tools (what else?) pop up:

Pen tools appear

Tools specific to the pen appear when the user taps on the screen with the pen stowed in hand.

But we can take this even further, such as to distinguish bare-handed touches — to support the standard panning and zooming behaviors —  versus a pinch articulated with the pen-in-hand, which in this example brings up a magnifying glass particularly suited to detail work using the pen:

Pen Grip + Motion example: Full canvas zoom vs. Magnifier tool

A pinch multi-touch gesture with the left hand pans and zooms. But a pinch articulated with the pen-in-hand brings up a magnifier tool for doing fine editing work.
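The context-dependent dispatch behind these examples (pen tools on a pen-in-hand tap, magnifier on a pen-in-hand pinch) can be sketched in a few lines. The event names and actions below are hypothetical stand-ins for illustration, not our actual API:

```python
# Hypothetical dispatcher for context-appropriate tools: the same touch
# event maps to different actions depending on the sensed pen grip.

def dispatch(event, grip):
    """Map a touch event plus the sensed pen grip to an action."""
    pen_stowed = grip in ("tuck", "palm")   # pen held but not writing
    if event == "tap" and pen_stowed:
        return "show_pen_tools"             # pen tools pop up
    if event == "pinch":
        # Pen-in-hand pinch summons the magnifier for detail work;
        # a bare-handed pinch performs ordinary pan/zoom.
        return "magnifier" if pen_stowed else "pan_zoom"
    return "default_touch"
```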

Another really fun way to use the sensors — since we can sense the 3D orientation of the pen even when it is away from the screen — is to turn it into a digital airbrush:

Airbrush tool using the sensors

Airbrushing with a pen. Note that the conic section of the resulting “spray” depends on the 3D orientation of the pen — just as it would with a real airbrush.
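For the geometrically inclined, here is a hedged sketch of that conic section: a spray cone of half-angle alpha leaving the pen tip at height h above the tablet, tilted theta from the surface normal. The function and the simplification (extent along the tilt direction only) are mine, not the paper's:

```python
import math

# Hypothetical airbrush geometry: where the spray cone meets the tablet.
# We compute only the near/far edges of the spray along the tilt
# direction, measured from the point directly below the pen tip.

def spray_extent(h, theta, alpha):
    """Return (near, far) surface distances to the spray edges.
    Valid while theta + alpha < pi/2, i.e. the cone actually
    intersects the surface in a closed ellipse."""
    assert theta + alpha < math.pi / 2
    near = h * math.tan(theta - alpha)
    far = h * math.tan(theta + alpha)
    return near, far
```

A vertical pen (theta = 0) yields a symmetric extent, a circle centered under the tip; tilting the pen pushes the far edge out faster than the near edge, elongating the ellipse, just as with a physical airbrush.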

At any rate, it was a really fun project that garnered a Best Paper Award and a fair bit of press coverage (Gizmodo, Engadget, and FastCo Design, which named it the #2 User Interface innovation of 2014, among other coverage). It’s pretty hard to top that.

Unless maybe we do a lot more with all kinds of cool sensors on the tablet as well.

Hmmm…

You might just want to stay tuned here. There’s all kinds of great stuff in the works, as always (grin).


Hinckley, K., Pahud, M., Benko, H., Irani, P., Guimbretiere, F., Gavriliu, M., Chen, X., Matulic, F., Buxton, B., Wilson, A., Sensing Techniques for Tablet+Stylus Interaction. In the 27th ACM Symposium on User Interface Software and Technology (UIST’14), Honolulu, Hawaii, Oct 5-8, 2014, pp. 605-614. http://dx.doi.org/10.1145/2642918.2647379

Watch Context Sensing Techniques for Tablet+Stylus Interaction video on YouTube

Invited Talk: WIPTTE 2015 Presentation of Sensing Techniques for Tablets, Pen, and Touch

The organizers of WIPTTE 2015, the Workshop on the Impact of Pen and Touch Technology on Education, kindly invited me to speak about my recent work on sensing techniques for stylus + tablet interaction.

One of the key points that I emphasized:

To design technology to fully take advantage of human skills, it is critical to observe what people do with their hands when they are engaged in manual activities such as handwriting.

Notice my deliberate use of the plural, hands, as in both of ’em, in a division of labor that is a perfect example of cooperative bimanual action.

The power of crayon and touch.

My six-year-old daughter demonstrates the power of crayon and touch technology.

And of course I had my usual array of stupid sensor tricks to illustrate the many ways that sensing systems of the future embedded in tablets and pens could take advantage of such observations. Some of these possible uses for sensors probably seem fanciful, in this antiquated era of circa 2015.

But in eerily similar fashion, some of the earliest work that I did on sensors embedded in handheld devices also felt completely out-of-step with the times when I published it back in the year 2000. A time so backwards it already belongs to the last millennium for goodness sakes!

Now aspects of that work are embedded in practically every mobile device on the planet.

It was a fun talk, with an engaged audience of educators who are eager to see pen and tablet technology advance to better serve the educational needs of students all over the world. I have three kids of school age now so this stuff matters to me. And I love speaking to this audience because they always get so excited to see the pen and touch interaction concepts I have explored over the years, as well as the new technologies emerging from the dim fog that surrounds the leading frontiers of research.

Harold and the Purple Crayon book cover

I am a strong believer in the dictum that the best way to predict the future is to invent it.

And the pen may be the single greatest tool ever invented to harness the immense creative power of the human mind, and thereby to scrawl out–perhaps even in the just-in-time fashion of the famous book Harold and the Purple Crayon–the uncertain path that leads us forward.

                    * * *

Update: I have also made the original technical paper and demonstration video available now.

If you are an educator seeing impacts of pen, tablet, and touch technology in the classroom, then I strongly encourage you to start organizing and writing up your observations for next year’s workshop. The 2016 edition of the series (now renamed CPTTE) will be held at Brown University in Providence, Rhode Island, and chaired by none other than the esteemed Andries van Dam, who is my academic grandfather (i.e. my Ph.D. advisor’s mentor) and of course widely respected in computing circles throughout the world.

Hinckley, K., WIPTTE 2015 Invited Talk: Sensing Techniques for Tablet + Stylus Interaction. Workshop on the Impact of Pen and Touch Technology on Education, Redmond, WA, April 28th, 2015. [Slides (.pptx)] [Slides PDF]

 

Project: The Analog Keyboard: Text Input for Small Devices

With the big meaty man-thumbs that I sport, touchscreen typing–even on a full-size tablet computer–can be challenging for me.

Take it down to a phone, and I have to spend more time checking for typographical errors and embarrassing auto-miscorrections than I do actually typing in the text.

But typing on a watch?!?

I suppose you could cram an entire QWERTY layout, all those keys, into a tiny 1.6″ screen, but then typing would become an exercise in microsurgery, with the augmentation of a high-power microscope an absolute necessity.

But if you instead re-envision ‘typing’ in a much more direct, analog fashion, then it’s entirely possible. And in a highly natural and intuitive manner to boot.

Enter the Analog Keyboard Project.

Analog Watch Keyboard on Moto 360 (round screen)

Wolf Kienzle, a frequent collaborator of mine, just put out an exciting new build of our touchscreen handwriting technology optimized for watches running the Android Wear Platform, including the round Moto 360 device that everyone seems so excited about.

Get all the deets–and the download–from Wolf’s project page, available here.

This builds on the touchscreen writing prototype we first presented at the MobileHCI 2013 conference, where the work earned an Honorable Mention Award, but optimized in a number of ways to fit on the tiny screen (and small memory footprint) of current watches.

All you have to do is scrawl the letters that you want to type–in a fully natural manner, not in some inscrutable secret computer graffiti-code like in those dark days of the late 1990s–and the prototype is smart enough to transcribe your finger-writing to text.

It even works for numbers and common punctuation symbols like @ and #, indispensable tools for the propagation of internet memes and goofy cat videos these days.

Writing numbers and punctuation symbols on the Analog Keyboard

However, to fit the resource-constrained environment of the watch, the prototype currently only supports lowercase letters.

Because we all know that when it comes to the internet, UPPERCASE IS JUST FOR TROLLZ anyway.

Best of all, if you have an Android Wear device you can try it out for yourself. Just side-load the Analog Keyboard app onto your watch and once again you can write the analog way, the way real men did in the frontier days. Before everyone realized how cool digital watches were, and all we had to express our innermost desires was a jar of octopus ink and a sharpened bald eagle feather. Or something like that.

Y’know, the things that made America great.

Only now with more electrons.

You can rest easy, though, if these newfangled round watches like the Moto 360 are just a little bit too fashionable for you. As shown below, it works just fine on the more chunky square-faced designs such as the Samsung Gear Live as well.

Analog Keyboard on Samsung Gear Live watch

Check out the video embedded below, and if you have a supported Android Wear device, download the prototype and give it a try. I know Wolf would love to get your feedback on what it feels like to use the Analog Keyboard for texting on your watch.

Bring your timepiece into the 21st century.

You’ll be the envy of every digital watch nerd for miles around.

Besides: it’s clearly an idea whose time has come.

Kienzle, W., Hinckley, K., The Analog Keyboard Project. Handwriting keyboard download for Android Wear. Released October 2014. [Project Details and Download] [Watch demo on YouTube]

 

Watch Analog Keyboard video on YouTube

Paper: Writing Handwritten Messages on a Small Touchscreen

Here’s the final of our three papers at the MobileHCI 2013 conference. This was a particularly fun project, spearheaded by my colleague Wolf Kienzle, looking at a clever way to do handwriting input on a touchscreen using just your finger.

In general I’m a fan of using an actual stylus for handwriting, but in the context of mobile there are many “micro” note-taking tasks, akin to scrawling a note to yourself on a post-it, that wouldn’t justify unsheathing a pen even if your device had one.

The very cool thing about this approach is that it allows you to enter overlapping multi-stroke characters using the whole screen, and without resorting to something like Palm’s old Graffiti writing or full-on handwriting recognition.

Touchscreen-Writing-fullres

The interface also incorporates some nice fluid gestures for entering spaces between words, backspacing to delete previous strokes, or transitioning to a freeform drawing mode for inserting little sketches or smiley-faces into your instant messages, as seen above.
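One way to picture how such gestures might coexist with ink entry, as a purely hypothetical sketch (the gesture names, thresholds, and logic here are mine, not the paper's):

```python
# Hypothetical stroke interpreter: quick horizontal swipes become
# editing commands, anything else is treated as handwriting ink.
# The classification of "quick swipe" vs. ink is assumed to happen
# upstream; dx is the stroke's horizontal displacement.

def handle_stroke(dx, is_quick_swipe):
    """Interpret a finished finger stroke."""
    if is_quick_swipe:
        if dx > 0:
            return "insert_space"   # swipe right inserts a space
        return "backspace"          # swipe left deletes the last stroke
    return "add_ink_stroke"         # feed the stroke to the recognizer
```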

This paper also had the distinction of receiving an Honorable Mention Award for best paper at MobileHCI 2013. We’re glad the review committee liked our paper and saw its contributions as noteworthy, as it were (pun definitely intended).

Kienzle, W., Hinckley, K., Writing Handwritten Messages on a Small Touchscreen. In ACM 15th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2013), Munich, Germany, Aug. 27-30, 2013, pp. 179-182. Honorable Mention Award (Awarded to top 5% of all papers). [PDF] [video MP4] [Watch on YouTube – coming soon.]

Paper: Informal Information Gathering Techniques for Active Reading

This is my latest project, which I will present tomorrow (May 9th) at the CHI 2012 Conference on Human Factors in Computing Systems.

I’ll have a longer post up about this project after I return from the conference, but for now enjoy the video. I also link to the PDF of our short paper below which has a nice discussion of the motivation and design rationale for this work.

Above all else, I hope this work makes clear that there is still tons of room for innovation in how we interact with the e-readers and tablet computers of the future– as well as in terms of how we consume and manipulate content to produce new creative works.

Hinckley, K., Bi, X., Pahud, M., Buxton, B., Informal Information Gathering Techniques for Active Reading. 4pp Note. In Proc. CHI 2012 Conf. on Human Factors in Computing Systems, Austin, TX, May 5-10, 2012. [PDF]

[Watch Informal Information Gathering Techniques for Active Reading on YouTube]

Paper: Enhancing Naturalness of Pen-and-Tablet Drawing through Context Sensing

Sun, M., Cao, X., Song, H., Izadi, S., Benko, H., Guimbretiere, F., Ren, X., and Hinckley, K., Enhancing Naturalness of Pen-and-Tablet Drawing through Context Sensing. In Proc. ACM International Conference on Interactive Tabletops and Surfaces (ITS ’11), Kobe, Japan, November 13-16, 2011, pp. 212-221. [PDF] [video – WMV].

Watch Enhancing Naturalness of Pen through Context Sensing video on YouTube

The Fractured State of Reading and Publishing

The bad news: I dropped my Kindle this morning.

The good news: I caught it before it hit the floor.

The even worse news: In so doing, I slammed it against the corner of my desk, smashing the e-ink screen into a starburst of gray, black, and white-plaid shards:

The newly fractured landscape of my Kindle screen.

The man pictured in the screen saver offers his disapproval with a withering half-frown, a my-oh-my-what-have-thee-done expression as he finds himself trapped forever in this doomed terrain of shattered e-ink.

So, I guess it’s back to paper for me until my new Kindle arrives.

For a long time I never thought I would have any use for a Kindle. After all, who wants to read on a computer? And what about marking up the text, dogearing pages, or having more than one book open on my desk at a time?

Well, those behaviors are mostly my self-fueled obsessions when authoring original works of nonfiction. For recreational reading, the mechanisms for highlighting passages and bookmarking pages on the Kindle are, while somewhat clumsy and indirect, still good enough to get the job done.

And then there’s the instant gratification aspect.

This weekend I was up at my cabin, at 3000′ elevation and nestled deep in the alpine pinnacles of the Cascade Crest, and I decided that I wanted to read another one of the mystery anthologies edited by Ed Gorman and Martin H. Greenberg, because I recently read By Hook or By Crook on the recommendation of Kristine Kathryn Rusch and it was fantastic.

So I just brought up the book in the Kindle store, paged through the related reads, and within sixty seconds of the impulse I was reading Between the Dark and the Daylight.

But now I have to read the bloody thing on my smartphone until my new Kindle arrives.

And while I wait, it occurs to me that the fractured Kindle screen pictured above is a perfect image of the publishing industry and the entire state of reading these days. The old world has been shattered by feedback loops in technology and ongoing market forces that just keep reinforcing one another. Paper books ain’t going away soon, but I’ll probably live to see the day when they are uncommon for most titles. Bookstores will be relegated to specialty boutique status, like the camera and stationery stores populating the deserted shoals of strip-malls.

And you know what that smells like to me?

Opportunity.

The Courier was one example of how these shifts might spawn whole new experiences or categories of devices. The Amazon Tablet might well be another. But whatever the next hot gadget or gizmo is, rest assured, I feel like a technological wolf, scenting a long series of innovations-to-come in the shifting winds, and I’ll be looking to make a killing. 🙂 What of tablets with pen and multi-touch? What of Nicholas Chen’s Multi-Slate Reading System, a federation of cheap slates that you can scatter about your office like the glossy marketing brochures you get in the mail, tossed aside for the day when you may or may not read them? What of flexible, paper-like displays?

We’re still in the stone age here, folks, as far as e-readers are concerned. We’ll look back fondly on the Kindle and its ilk as the quaint auto-buggies that presaged a sleek, sophisticated, and nearly unrecognizable future.

That’s where I want to be, even if I have to cobble it together with clunky prototypes, Frankenstein monsters of acrylic and delrin etched out by the laser cutter of my dreams.

In the meantime, you could do a lot worse than to follow Kristine Kathryn Rusch and her husband, Dean Wesley Smith, as they talk about what this means for readers and writers and the publishing industry writ large.