
Book Chapter: Input/Output Devices and Interaction Techniques, Third Edition

Hinckley, K., Jacob, R., Ware, C., Wobbrock, J., and Wigdor, D., Input/Output Devices and Interaction Techniques. Appears as Chapter 21 in The Computing Handbook, Third Edition: Two-Volume Set, ed. by Tucker, A., Gonzalez, T., Topi, H., and Diaz-Herrera, J. Published by Chapman and Hall/CRC (Taylor & Francis), May 13, 2014. [PDF – Author’s Draft – may contain discrepancies]

Invited Talk: WIPTTE 2015 Presentation of Sensing Techniques for Tablets, Pen, and Touch

The organizers of WIPTTE 2015, the Workshop on the Impact of Pen and Touch Technology on Education, kindly invited me to speak about my recent work on sensing techniques for stylus + tablet interaction.

One of the key points that I emphasized:

To design technology to fully take advantage of human skills, it is critical to observe what people do with their hands when they are engaged in manual activities such as handwriting.

Notice my deliberate use of the plural, hands, as in both of ’em, in a division of labor that is a perfect example of cooperative bimanual action.

The power of crayon and touch.

My six-year-old daughter demonstrates the power of crayon and touch technology.

And of course I had my usual array of stupid sensor tricks to illustrate the many ways that sensing systems of the future embedded in tablets and pens could take advantage of such observations. Some of these possible uses for sensors probably seem fanciful, in this antiquated era of circa 2015.

But in eerily similar fashion, some of the earliest work that I did on sensors embedded in handheld devices also felt completely out-of-step with the times when I published it back in the year 2000. A time so backwards it already belongs to the last millennium for goodness sakes!

Now aspects of that work are embedded in practically every mobile device on the planet.

It was a fun talk, with an engaged audience of educators who are eager to see pen and tablet technology advance to better serve the educational needs of students all over the world. I have three kids of school age now so this stuff matters to me. And I love speaking to this audience because they always get so excited to see the pen and touch interaction concepts I have explored over the years, as well as the new technologies emerging from the dim fog that surrounds the leading frontiers of research.

I am a strong believer in the dictum that the best way to predict the future is to invent it.

And the pen may be the single greatest tool ever invented to harness the immense creative power of the human mind, and thereby to scrawl out–perhaps even in the just-in-time fashion of the famous book Harold and the Purple Crayon–the uncertain path that leads us forward.

If you are an educator seeing impacts of pen, tablet, and touch technology in the classroom, then I strongly encourage you to start organizing and writing up your observations for next year’s workshop. The 2016 edition of the series will be held at Brown University in Providence, Rhode Island, and chaired by none other than the esteemed Andries van Dam, who is my academic grandfather (i.e. my Ph.D. advisor’s mentor) and of course widely respected in computing circles throughout the world.

Hinckley, K., WIPTTE 2015 Invited Talk: Sensing Techniques for Tablet + Stylus Interaction. Workshop on the Impact of Pen and Touch Technology on Education, Redmond, WA, April 28th, 2015. [Slides (.pptx)] [Slides PDF]

Project: Bimanual In-Place Commands

Here’s another interesting loose end, this one from 2012, which describes a user interface known as “In-Place Commands” that Michel Pahud, Bill Buxton, and I developed for a range of direct-touch form factors, including everything from tablets and tabletops all the way up to electronic whiteboards à la the modern Microsoft Surface Hub devices of 2015.

Microsoft is currently running a Request for Proposals for Surface Hub research, by the way, so check it out if that sort of thing is at all up your alley. If your proposal is selected you’ll get a spiffy new Surface Hub and $25,000 to go along with it.

We’ve never written up a formal paper on our In-Place Commands work, in part because there is still much to do and we intend to pursue it further when the time is right. But in the meantime the following post and video documenting the work may be of interest to aficionados of efficient interaction on such devices. This also relates closely to the Finger Shadow and Accordion Menu explored in our Pen + Touch work, documented here and here, which collectively form a class of such techniques.

While we wouldn’t claim that any one of these represents the ultimate approach to command and control for direct input, in sum they illustrate many of the underlying issues, the rich set of capabilities we strive to support, and possible directions for future embellishments as well.

Knies, R. In-Place: Interacting with Large Displays. Reporting on research by Pahud, M., Hinckley, K., and Buxton, B. TechNet Inside Microsoft Research Blog Post, Oct 4th, 2012. [Author’s cached copy of post as PDF] [Video MP4] [Watch on YouTube]

In-Place Commands Screen Shot

The user can call up commands in-place, directly where they are working, by touching two fingers down and fanning out the available tool palettes. Many of the functions thus revealed act as click-through tools, where the user may simultaneously select and apply a tool — as the user is about to do for the line-drawing tool in the image above.
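
We never published the implementation, but for the curious, here is a minimal sketch of what the invocation logic might look like. The timing threshold, the returned action names, and the single-dispatch structure are all illustrative assumptions, not a description of our prototype:

```python
TWO_FINGER_WINDOW_MS = 150   # assumed max delay between the two contacts

class InPlaceCommands:
    """Sketch of the invocation state machine: two near-simultaneous
    touches fan out the tool palettes in place, and a subsequent touch
    on a revealed tool both selects and applies it (a click-through
    tool), with no round trip to a fixed toolbar."""

    def __init__(self):
        self.last_touch = None      # (timestamp_ms, x, y) of prior contact
        self.palette_origin = None  # where palettes are fanned out, if open

    def on_touch_down(self, t_ms, x, y):
        # Return the action a (hypothetical) UI layer should perform.
        if self.palette_origin is not None:
            # Palette is open: this touch selects a tool and immediately
            # begins applying it at the same location (click-through).
            return ("select_and_apply_tool_at", (x, y))
        if (self.last_touch is not None
                and t_ms - self.last_touch[0] < TWO_FINGER_WINDOW_MS):
            # A second finger landed quickly: summon the palettes right
            # where the hand already is.
            self.palette_origin = (self.last_touch[1], self.last_touch[2])
            return ("fan_out_palettes_at", self.palette_origin)
        self.last_touch = (t_ms, x, y)
        return ("begin_stroke_at", (x, y))   # ordinary one-finger inking
```

A real implementation would also track finger lift to dismiss the palettes and position each palette relative to the resting hand; the sketch only captures the core point that commands appear where the hands already are.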

Watch Bimanual In-Place Commands video on YouTube

Interacting with the Undead: A Crash Course on the “Inhuman Factors” of Computing

I did a far-ranging interview last week with Nora Young, the host of CBC Radio’s national technology and trend-watching show called Spark.

But the most critical and timely topic we ventured into was the burning question on everyone’s mind as All Hallows’ Eve rapidly approaches:

Can zombies use touchscreens?

This question treads (or shall we say, shambles) into the widely neglected area of Inhuman Factors, a branch of Human-Computer Interaction that studies technological affordances for the most disenfranchised and unembodied users of them all–the undead.

Fortunately for Nora, however, I am the world’s foremost authority on the topic.

And I was only too happy to speak to this glaring oversight in how we design today’s technologies, one that I have long campaigned to redress.

Needless to say, Zombie-Computer Interaction (ZCI) is an area rife with dire usability problems.

You can listen to the podcast and see how Nora sparked the discussion here.

But to clear up some common myths and misconceptions of ZCI, let me articulate seven critical design observations to keep in mind when designing technology for the undead:

  1.  Yes, zombies can use touchscreens–with appropriate design.
  2. Thus, like everything else in design, the correct answer is:
    “It Depends.”
  3. The corpse has to be fresh. Humans are essentially giant bags of water; touchscreens are sensitive to the capacitance induced by the moisture in our bodies. So long as the undead creature has recently departed the realm of the living, then, the capacitive touchscreens commonplace in today’s technology should respond appropriately.
  4. Results also may be acceptable if the zombie has fed on a sufficient quantity of brains in the last 24-36 hours.
  5. MOAR BRAINS! are better.
  6. Nonetheless, the water content of a motive corpse can be a significant barrier in day-to-day (or, to speak more precisely, night-to-night) interactions of the undead with tablets, smartphones, bank kiosks, and the like. In particular, touchscreens often completely fail to respond to mummies, ghasts, vampires, and the rarely-studied windigo of Algonquian legend–all due to the extreme desiccation of the corporeal form.
  7. Fortunately for these dried-up souls, the graveyard of devices-past is replete with resistive touchscreen technology such as the once-revered Palm Pilot handheld computer, as documented in the frightening and deeply disturbing Buxton Collection of Input Devices and Technologies. These devices respond successfully to the finger-taps of the desiccated undead because they sense contact pressure, not capacitance.

So let me recap the lessons:
Zombies can definitely use touchscreens; brains are good, MOAR BRAINS are better; and if you see a zombie sporting a Palm Pilot run like hell, because that sucker is damned hungry.

But naturally, the ground-breaking discussion on Zombie-Computer Interaction sparked by Nora’s provocation has triggered a flurry of follow-on questions from concerned citizens to my inbox:

What about ghosts? Can a ghost use a touchscreen?

A ghost is an unholy manifestation of non-corporeal form. Lacking an embodied form, a ghost therefore cannot use a touchscreen–their hand passes right through it. But ghosts can be sensed by light, such as laser rangefinders, or the depth-sensing technology of the Kinect camera for the Xbox.

However, ghosts frequently can and do leave behind traces of ectoplasmic goo, which can cause touchscreens to respond in a strange and highly erratic manner.

If you have ever made a typo on a touchscreen keyboard, or triggered Angry Birds by accident when you could swear you were reaching for some other icon–chances are that “ghost contact” was triggered by a disembodied spirit trying to communicate with you from the beyond.

If this happens to you, I highly recommend that you immediately stop what you are doing and install every touchscreen Ouija board app you can find so that you can open a suitable communication channel with the realm of the dead.

What about Cthulhu–H. P. Lovecraft’s terrifying cosmic deity that is part man, part loathsome alien form, and part giant squid? Can Cthulhu use a touchscreen?

Studies are inconclusive. Scott’s great expedition to the Transantarctic mountains–where records of Cthulhu are rumored to be hidden–vanished in the icy wastes, never to be heard from again. R. Carter et al. studied the literature extensively and promptly went insane.

Other researchers, including myself, have been understandably dissuaded from examining the issue further.

My opinion, unsupported by data, is that as a pan-dimensional being Cthulhu can touch whatever the hell he wants–when the stars are right and the lost city of R’lyeh rises once again from the slimy eons-deep vaults of the black Pacific.

A lot of PEOPLE are WORRIED about Lawyers. Can lawyers use touchscreens as well?

Sadly, it is widely believed (and backed up by scientific studies) that most lawyers have no soul.

Therefore the majority of lawyers cannot use a touchscreen at all.

This is why summons and lawsuits always arrive in paper form from a beady-eyed courier.


Other noteworthy challenges to conventional INHUMAN FACTORS design wisdom

I’ve also fielded a variety of questions and strongly-held opinions from the far and dark corners of the Twittersphere.

Needless to say, these are clearly highly disturbed individuals, so I recommend that you interact with them at your own risk.

All right. I think I’ve put this topic to rest.

But keep the questions coming.

And be careful tonight.

Be sure to post in the comments below, or tweet me after midnight @ken_hinckley and I’ll do my best to give you a scientifically rigorous (if not rigor-mortis-ish) response.

Project: The Analog Keyboard: Text Input for Small Devices

With the big meaty man-thumbs that I sport, touchscreen typing–even on a full-size tablet computer–can be challenging for me.

Take it down to a phone, and I have to spend more time checking for typographical errors and embarrassing auto-miscorrections than I do actually typing in the text.

But typing on a watch?!?

I suppose you could cram an entire QWERTY layout, all those keys, into a tiny 1.6″ screen, but then typing would become an exercise in microsurgery, with the aid of a high-power microscope an absolute necessity.

But if you instead re-envision ‘typing’ in a much more direct, analog fashion, then it’s entirely possible. And in a highly natural and intuitive manner to boot.

Enter the Analog Keyboard Project.

Analog Watch Keyboard on Moto 360 (round screen)

Wolf Kienzle, a frequent collaborator of mine, just put out an exciting new build of our touchscreen handwriting technology optimized for watches running the Android Wear Platform, including the round Moto 360 device that everyone seems so excited about.

Get all the deets–and the download–from Wolf’s project page, available here.

This builds on the touchscreen writing prototype we first presented at the MobileHCI 2013 conference, where the work earned an Honorable Mention Award, now optimized in a number of ways to fit the tiny screen (and small memory footprint) of current watches.

All you have to do is scrawl the letters that you want to type–in a fully natural manner, not in some inscrutable secret computer graffiti-code like in those dark days of the late 1990s–and the prototype is smart enough to transcribe your finger-writing to text.
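
Wolf’s recognizer is beyond the scope of this post, but to give a flavor of how finger-writing can be transcribed at all, here is a minimal template-matching sketch in the spirit of classic stroke recognizers. The resampling size, normalization, and distance measure are simplifying assumptions, not a description of the shipped prototype:

```python
import math

def resample(points, n=32):
    # Respace a finger stroke to n evenly spaced points so strokes of
    # different speeds and lengths can be compared point-by-point.
    total = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    if total == 0:
        return [points[0]] * n
    step, acc, prev, out = total / (n - 1), 0.0, points[0], [points[0]]
    for pt in points[1:]:
        d = math.dist(prev, pt)
        while d > 0 and acc + d >= step:
            t = (step - acc) / d
            prev = (prev[0] + t * (pt[0] - prev[0]),
                    prev[1] + t * (pt[1] - prev[1]))
            out.append(prev)
            d, acc = math.dist(prev, pt), 0.0
        acc += d
        prev = pt
    while len(out) < n:          # guard against floating-point shortfall
        out.append(points[-1])
    return out

def normalize(points):
    # Center on the centroid and scale to a unit box, so a letter can be
    # written anywhere on the watch face, at any size.
    xs, ys = zip(*points)
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / span, (y - cy) / span) for x, y in points]

def recognize(stroke, templates):
    # templates: {char: [example strokes]}; return the closest character.
    probe = normalize(resample(stroke))
    def dist_to(example):
        return sum(math.dist(a, b)
                   for a, b in zip(probe, normalize(resample(example))))
    return min(((c, e) for c, exs in templates.items() for e in exs),
               key=lambda ce: dist_to(ce[1]))[0]
```

In a full pipeline, each recognized character would simply be appended to the text field, which is what makes the whole screen usable as the writing surface.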

It even works for numbers and common punctuation symbols like @ and #, indispensable tools for the propagation of internet memes and goofy cat videos these days.

Writing numbers and punctuation symbols on the Analog Keyboard

However, to fit the resource-constrained environment of the watch, the prototype currently only supports lowercase letters.

Because we all know that when it comes to the internet, UPPERCASE IS JUST FOR TROLLZ anyway.

Best of all, if you have an Android Wear device you can try it out for yourself. Just side-load the Analog Keyboard app onto your watch and once again you can write the analog way, the way real men did in the frontier days. Before everyone realized how cool digital watches were, and all we had to express our innermost desires was a jar of octopus ink and a sharpened bald eagle feather. Or something like that.

Y’know, the things that made America great.

Only now with more electrons.

You can rest easy, though, if these newfangled round watches like the Moto 360 are just a little bit too fashionable for you. As shown below, it works just fine on the more chunky square-faced designs such as the Samsung Gear Live as well.

Analog Keyboard on Samsung Gear Live watch

Check out the video embedded below, and if you have a supported Android Wear device, download the prototype and give it a try. I know Wolf would love to get your feedback on what it feels like to use the Analog Keyboard for texting on your watch.

Bring your timepiece into the 21st century.

You’ll be the envy of every digital watch nerd for miles around.

Besides: it’s clearly an idea whose time has come.

Kienzle, W., Hinckley, K., The Analog Keyboard Project. Handwriting keyboard download for Android Wear. Released October 2014. [Project Details and Download] [Watch demo on YouTube]


Watch Analog Keyboard video on YouTube

Paper: Experimental Study of Stroke Shortcuts for a Touchscreen Keyboard with Gesture-Redundant Keys Removed

Text Entry on Touchscreen Keyboards: Less is More?

When we go from mechanical keyboards to touchscreens we inevitably lose something in the translation. Yet the proliferation of tablets has led to widespread use of graphical keyboards.

You can’t blame people for demanding more efficient text entry techniques. This is the 21st century, after all, and intuitively it seems like we should be able to do better.

While we can’t reproduce that distinctive smell of hot metal from mechanical keys clacking away at a typewriter ribbon, the presence of the touchscreen lets keyboard designers play lots of tricks in pursuit of faster typing performance. Since everything is just pixels on a display it’s easy to introduce non-standard key layouts. You can even slide your finger over the keys to shape-write entire words in a single swipe, as pioneered by Per Ola Kristensson and Shumin Zhai (their SHARK keyboard was the predecessor for Swype and related techniques).
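
To make the idea concrete, here is a deliberately simplified sketch of shape-writing: reduce the swipe to the ordered sequence of keys it passes over, then find lexicon words whose letters appear in that order. Real shape-writers like SHARK and Swype compare the full trace geometry against ideal word shapes; the toy key layout, nearest-key test, and tiny lexicon below are all illustrative assumptions:

```python
import math

# Toy QWERTY key centers, one unit apart; purely illustrative.
KEYS = {}
for letters, x0, y in [("qwertyuiop", 0.0, 0.0),
                       ("asdfghjkl", 0.5, 1.0),
                       ("zxcvbnm", 1.5, 2.0)]:
    for i, ch in enumerate(letters):
        KEYS[ch] = (x0 + i, y)

def key_sequence(swipe):
    # Collapse the swipe into the distinct keys the finger crosses, in order.
    seq = []
    for x, y in swipe:
        k = min(KEYS, key=lambda c: math.dist(KEYS[c], (x, y)))
        if not seq or seq[-1] != k:
            seq.append(k)
    return seq

def candidates(swipe, lexicon):
    # A word is a candidate if its letters occur, in order, within the
    # key sequence (the finger crosses other keys along the way).
    seq = key_sequence(swipe)
    def fits(word):
        it = iter(seq)
        return all(ch in it for ch in word)
    return [w for w in lexicon if fits(w)]

# Example: a swipe from 'c' through 'a' to 't' should surface "cat".
swipe = [KEYS["c"], KEYS["a"], KEYS["t"]]
print(candidates(swipe, ["cat", "dog", "cart"]))   # -> ['cat']
```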

While these types of tricks can yield substantial performance advantages, they also often demand a substantial investment in skill acquisition from the user before significant gains can be realized. In practice, this limits how many people will stick with a new technique long enough to realize such gains. The Dvorak keyboard offers a classic example of this: the balance of evidence suggests it’s slightly faster than QWERTY, but the high cost of switching to and learning the new layout just isn’t worth it.

In this work, we explored the performance impact of an alternative approach that builds on people’s existing touch-typing skills with the standard QWERTY layout.

And we do this in a manner that is so transparent, most people don’t even realize that anything is different at first glance.

Can you spot the difference?

Snap quiz time


What’s wrong with this keyboard?  Give it a quick once-over. It looks familiar, with the standard QWERTY layout, but do you notice anything unusual? Anything out of place?

Sure, the keys are arranged in a grid rather than the usual staggered key pattern, but that’s not the “key” difference (so to speak). That’s just an artifact of our quick ‘n’ dirty design of this research-prototype keyboard for touchscreen tablets.

Got it figured out?

All right. Pencils down.

Time to check your score. Give yourself:

  • One point if you noticed that there’s no space bar.
  • Two points if you noticed that there’s no Enter key, either.
  • Three points if the lack of a Backspace key gave you palpitations.
  • Four points and a feather in your cap if you caught the Shift key going AWOL as well.

Now, what if I also told you that removing four essential keys from this keyboard–rather than harming performance–actually helps you type faster?


All we ask of people coming to our touchscreen keyboard is to learn one new trick. After all, we have to make up for the summary removal of Space, Backspace, Shift, and Enter somehow. We accomplish this by augmenting the graphical touchscreen keyboard with stroke shortcuts, i.e. short straight-line finger swipes, as follows (a rough code sketch of this mapping appears after the list):

  • Swipe right, starting anywhere on the keyboard, to enter a Space.
  • Swipe left to Backspace.
  • Swipe upwards from any key to enter the corresponding shift-symbol. Swiping up on the a key, for example, enters an uppercase A; stroking up on the 1 key enters the ! symbol; and so on.
  • Swipe diagonally down and to the left for Enter.
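
As a rough illustration of how a keyboard might dispatch these four shortcuts, here is a sketch; the distance threshold, the angular sectors, and the `key_at` hit-testing function are assumptions for illustration, not the values from our prototype:

```python
import math

MIN_SWIPE_DIST = 40.0   # assumed travel, in pixels, before a touch is a swipe

def classify(x0, y0, x1, y1, key_at):
    """Map a touch from (x0, y0) down to (x1, y1) up onto an action.

    key_at(x, y) is a hypothetical hit-test returning the key under a
    point. Screen y grows downward, as is typical for touch events.
    """
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < MIN_SWIPE_DIST:
        return ("type", key_at(x0, y0))              # ordinary key tap

    angle = math.degrees(math.atan2(-dy, dx)) % 360  # 0 = right, CCW
    if angle < 30 or angle >= 330:
        return ("space", None)                       # swipe right, anywhere
    if 160 <= angle < 200:
        return ("backspace", None)                   # swipe left, anywhere
    if 60 <= angle < 120:
        return ("shift", key_at(x0, y0))             # swipe up from a key
    if 210 <= angle < 260:
        return ("enter", None)                       # swipe down and left
    return ("type", key_at(x0, y0))                  # ambiguous: treat as tap
```

Anything that does not travel far enough, or that heads in an ambiguous direction, simply falls back to an ordinary tap, which is part of why the design stays so transparent to tap-typists.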



In addition to possible time-motion efficiencies of the stroke shortcuts themselves, the introduction of these four gestures–and the elimination of the corresponding keys made redundant by the gestures–yields a graphical keyboard with a number of interesting properties:

  • Allowing the user to input stroke gestures for Space, Backspace, and Enter anywhere on the keyboard eliminates fine targeting motions as well as any round-trips necessary for a finger to acquire the corresponding keys.
  • Instead of requiring two separate keystrokes—one to tap Shift and another to tap the key to be shifted—the Shift gesture combines these into a single action: the starting point selects a key, while the stroke direction selects the Shift function itself.
  • Removing these four keys frees an entire row on the keyboard.
  • Almost all of the numeric, punctuation, and special symbols typically relegated to the secondary and tertiary graphical keyboards can then be fit in a logical manner into the freed-up space.
  • Hence, the full set of characters can fit on one keyboard while holding the key size, number of keys, and footprint constant.
  • By having only a primary keyboard, this approach affords an economy of design that simplifies the interface, while offering further potential performance gains via the elimination of keyboard switching costs—and the extra key layouts to learn.
  • Although the strokes might reduce round-trip costs, we expect articulating the stroke gesture itself to take longer than a tap. Thus, we need to test these tradeoffs empirically.


Our studies demonstrated that overall the removal of four keys—rather than coming at a cost—offers a net benefit.

Specifically, our experiments showed that a stroke keyboard with the gesture-redundant keys removed yielded a 16% performance advantage for input phrases containing mixed-case alphanumeric text and special symbols, without sacrificing error rate. We observed these performance advantages from the first block of trials onward.

Even in the case of entirely lowercase text—that is, in a context where we would not expect to observe a performance benefit because only the Space gesture offers any potential advantage—we found that our new design still performed as well as a standard graphical keyboard. Moreover, people learned the design with remarkable ease: 90% wanted to keep using the method, and 80% believed they typed faster than on their current touchscreen tablet keyboard.

Notably, our studies also revealed that it is necessary to remove the keys to achieve these benefits from the gestural stroke shortcuts. If both the stroke shortcuts and the keys remain in place, user hesitancy about which method to use undermines any potential benefit. Users, of course, also learn to use the gestural shortcuts much more quickly when they offer the only means of achieving a function.

Thus, in this context, less is definitely more in achieving faster performance for touchscreen QWERTY keyboard typing.

The full results are available in the technical paper linked below. The paper contributes a careful study of stroke-augmented keyboards, filling an important gap in the literature as well as demonstrating the efficacy of a specific design; shows that removing the gesture-redundant keys is a critical design choice; and demonstrates that stroke shortcuts can be effective in the context of multi-touch typing with both hands, even though previous studies with single-point stylus input had cast doubt on this approach.

Although our studies focus on the immediate end of the usability spectrum (as opposed to longitudinal studies over many input sessions), we believe the rapid returns demonstrated by our results illustrate the potential of this approach to improve touchscreen keyboard performance immediately, while also serving to complement other text-entry techniques such as shape-writing in the future.

Arif, A. S., Pahud, M., Hinckley, K., and Buxton, B., Experimental Study of Stroke Shortcuts for a Touchscreen Keyboard with Gesture-Redundant Keys Removed. In Proc. Graphics Interface 2014 (GI ’14), Montreal, Quebec, Canada, May 7-9, 2014. Canadian Information Processing Society, Toronto, Ont. Received the Michael A. J. Sweeney Award for Best Student Paper. [PDF] [Talk Slides (.pptx)] [Video .MP4] [Video .WMV]

Watch A Touchscreen Keyboard with Gesture-Redundant Keys Removed video on YouTube

Paper: Writing Handwritten Messages on a Small Touchscreen

Here’s the last of our three papers at the MobileHCI 2013 conference. This was a particularly fun project, spearheaded by my colleague Wolf Kienzle, looking at a clever way to do handwriting input on a touchscreen using just your finger.

In general I’m a fan of using an actual stylus for handwriting, but in the context of mobile there are many “micro” note-taking tasks, akin to scrawling a note to yourself on a post-it, that wouldn’t justify unsheathing a pen even if your device had one.

The very cool thing about this approach is that it allows you to enter overlapping multi-stroke characters using the whole screen, and without resorting to something like Palm’s old Graffiti writing or full-on handwriting recognition.
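
One simple way to allow overlapping characters (a hedged guess at the general approach, not Wolf’s actual algorithm) is to segment by timing: strokes that follow each other quickly belong to the same character, while a longer pause closes the character out. The 400 ms threshold below is an illustrative assumption:

```python
CHAR_PAUSE_MS = 400   # assumed pause that separates two characters

def segment_characters(strokes):
    """strokes: list of (start_ms, end_ms, points) in writing order.

    Groups multi-stroke letters (like 't' or 'x') written on top of one
    another, since position no longer separates characters when the
    whole screen is the writing surface.
    """
    chars, current = [], []
    for stroke in strokes:
        if current and stroke[0] - current[-1][1] > CHAR_PAUSE_MS:
            chars.append(current)   # long gap: previous character is done
            current = []
        current.append(stroke)
    if current:
        chars.append(current)
    return chars
```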


The interface also incorporates some nice fluid gestures for entering spaces between words, backspacing to delete previous strokes, or transitioning to a freeform drawing mode for inserting little sketches or smiley-faces into your instant messages, as seen above.

This paper also had the distinction of receiving an Honorable Mention Award for best paper at MobileHCI 2013. We’re glad the review committee liked our paper and saw its contributions as noteworthy, as it were (pun definitely intended).

Kienzle, W., Hinckley, K., Writing Handwritten Messages on a Small Touchscreen. In ACM 15th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2013), Munich, Germany, Aug. 27-30, 2013, pp. 179-182. Honorable Mention Award (Awarded to top 5% of all papers). [PDF] [video MP4] [Watch on YouTube – coming soon.]