Category Archives: Pointing Devices & Techniques

Book Chapter: Input/Output Devices and Interaction Techniques, Third Edition

Hinckley, K., Jacob, R., Ware, C., Wobbrock, J., and Wigdor, D., Input/Output Devices and Interaction Techniques. Appears as Chapter 21 in The Computing Handbook, Third Edition: Two-Volume Set, ed. by Tucker, A., Gonzalez, T., Topi, H., and Diaz-Herrera, J. Published by Chapman and Hall/CRC (Taylor & Francis), May 13, 2014. [PDF – Author’s Draft – may contain discrepancies]

Paper: LightRing: Always-Available 2D Input on Any Surface

In this modern world bristling with on-the-go mobile activity, the dream of an always-available pointing device has long been held as a sort of holy grail of ubiquitous computing.

Ubiquitous computing, as futurists use the term, refers to the once-farfetched vision where computing pervades everything, everywhere, in a sort of all-encompassing computational nirvana of socially-aware displays and sensors that can respond to our every whim and need.

From our shiny little phones.

To our dull beige desktop computers.

To the vast wall-spanning electronic whiteboards of a future largely yet to come.

How will we interact with all of these devices as we move about the daily routine of this rapidly approaching future? As we encounter computing in all its many forms, carried on our person as well as enmeshed in the digitally enhanced architecture of walls, desktops, and surfaces all around?

Enter LightRing, our early take on one possible future for ubiquitous interaction.

LightRing device on a supporting surface

By virtue of being a ring always worn on the finger, LightRing travels with us and is always present.

By virtue of some simple sensing and clever signal processing, LightRing can be supported in an extremely compact form-factor while providing a straightforward pointing modality for interacting with devices.

At present, we primarily consider LightRing as it would be configured to interact with a situated display, such as a desktop computer, or a presentation projected against a wall at some distance.

The user moves their index finger, angling left and right, or flexing up and down by bending at the knuckle. Simple stuff, I know.

But unlike a mouse, it’s not anchored to any particular computer.

It travels with you.

It’s a go-everywhere interaction modality.

Close-up of LightRing and hand angles inferred from sensors

Left: The degrees-of-freedom detected by the LightRing sensors. Right: Conceptual mapping of hand movement to the sensed degrees of freedom. LightRing then combines these to support 2D pointing at targets on a display, or other interactions.

LightRing then senses these finger movements – using a one-dimensional gyroscope to capture the left-right angling, and an infrared sensor-emitter pair to capture the proximity of the flexing finger joint – and combines them to support a cursor-control mode similar to how you would hold and move a mouse on a desktop.
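As a rough illustration – this is a sketch of the general idea, not the code from the paper, and the sensor names and gain constants are made up – the per-frame mapping from the two sensed values to relative cursor motion might look something like this:

```python
# Hypothetical sketch of LightRing-style cursor control (not the published code).
# Assumes two readings per frame:
#   yaw_rate_dps - 1D gyroscope yaw rate, degrees/second (left-right angling)
#   ir_proximity - IR sensor-emitter reading, grows as the knuckle flexes

def cursor_delta(yaw_rate_dps, ir_proximity, prev_ir, dt,
                 gain_x=4.0, gain_y=300.0):
    """Map the two sensed degrees of freedom to a relative (dx, dy) in pixels.

    Horizontal motion integrates the gyro's angular rate over the frame time;
    vertical motion differentiates the proximity signal, so both axes are
    purely relative. The gains are invented constants that a real device
    would tune empirically.
    """
    dx = gain_x * yaw_rate_dps * dt          # angular rate * time -> pixels
    dy = gain_y * (ir_proximity - prev_ir)   # change in flexion -> pixels
    return dx, dy

# Example frame at 100 Hz: finger angling right while flexing slightly down.
dx, dy = cursor_delta(yaw_rate_dps=50.0, ir_proximity=0.42,
                      prev_ir=0.40, dt=0.01)
```

Because both axes are deltas rather than absolute positions, the cursor behaves like a mouse: lift-and-reposition works naturally, and no fixed reference surface is needed.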

Except there’s no mouse at all.

And there needn’t even be a desktop, as you can see in the video embedded below.

LightRing just senses the movement of your finger.  You can make the pointing motions on a tabletop, sure, but you can just as easily do them on a wall. Or on your pocket. Or a handheld clipboard.

All the sensing is relative, so LightRing always knows how to interpret your motions to control a 2D cursor on a display. Once LightRing has been paired with a situated device, you can point at targets even if the display itself is beyond your physical reach. You can also sketch or handwrite characters with your finger – another scenario we have explored in depth on smartphones and even watches.

The trick to LightRing is that it can automatically, and very naturally, calibrate itself to your finger’s range of motion if you just swirl your finger. From that circular motion, LightRing works backwards from the sensor values to how your finger is moving, assuming it is constrained to (roughly) a 2D plane. And that, combined with a button-press or finger touch on the ring itself, is enough to provide an effective input device.
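To give a flavor of how a swirl could calibrate the device – again a simplified assumption on my part, not the algorithm from the paper – one minimal approach is to record the sensor trace during the swirl and use its extremes to normalize later readings into a common range:

```python
# Hypothetical calibration sketch (an assumption, not the published algorithm).
# During a circular "swirl", each frame yields a (gyro_angle, ir_proximity)
# pair. Because the fingertip traces a rough circle in a 2D plane, the
# extremes of each sensor trace bound the finger's comfortable range.
import math

def calibrate(swirl_samples):
    """Return per-axis (min, max) ranges observed during the swirl."""
    xs = [s[0] for s in swirl_samples]
    ys = [s[1] for s in swirl_samples]
    return (min(xs), max(xs)), (min(ys), max(ys))

def normalize(sample, ranges):
    """Map a raw (x, y) sensor pair into [-1, 1] using calibrated ranges."""
    (xmin, xmax), (ymin, ymax) = ranges
    nx = 2.0 * (sample[0] - xmin) / (xmax - xmin) - 1.0
    ny = 2.0 * (sample[1] - ymin) / (ymax - ymin) - 1.0
    return nx, ny

# A synthetic swirl: 64 points on a circle in sensor space, where the IR
# channel has a smaller, offset range than the gyro channel.
swirl = [(math.cos(t), 0.5 + 0.2 * math.sin(t))
         for t in [2 * math.pi * i / 64 for i in range(64)]]
ranges = calibrate(swirl)
```

After calibration, a reading at the center of the swirl normalizes to roughly (0, 0), so the two very different sensor channels can be treated uniformly when driving the cursor. A real device would also want to reject outliers and perhaps fit an ellipse rather than take raw extremes.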

The LightRing, as we have prototyped it now, is just one early step in the process. There’s a lot more we could do with this device, and many more practical problems that would need to be resolved to make it a useful adjunct to everyday devices–and to tap its full potential.

But my co-author Wolf Kienzle and I are working on it.

And hopefully, before too much longer now, we’ll have further updates on even more clever and fanciful stuff that we can do through this one tiny keyhole into this field of dreams, the verdant golden country of ubiquitous computing.


Kienzle, W., Hinckley, K., LightRing: Always-Available 2D Input on Any Surface. In the 27th ACM Symposium on User Interface Software and Technology (UIST 2014), Honolulu, Hawaii, Oct. 5-8, 2014, pp. 157-160. [PDF] [video.mp4 TBA] [Watch on YouTube]

Watch LightRing video on YouTube

Book Chapter: Input Technologies and Techniques, 2012 Edition

Hinckley, K., Wigdor, D., Input Technologies and Techniques. Chapter 9 in The Human-Computer Interaction Handbook – Fundamentals, Evolving Technologies and Emerging Applications, Third Edition, ed. by Jacko, J. Published by Taylor & Francis. To appear. [PDF of author’s manuscript – not final]

This is an extensive revision of the 2007 and 2002 editions of my book chapter, and with some heavy lifting from my new co-author Daniel Wigdor, it treats direct-touch input devices and techniques in much more depth. Lots of great new stuff. The book will be out in early 2012 or so from Taylor & Francis – keep an eye out for it!

Paper: Grips and Gestures on a Multi-Touch Pen

Song, H., Benko, H., Guimbretiere, F., Izadi, S., Cao, X., Hinckley, K., Grips and Gestures on a Multi-Touch Pen. In Proc. CHI 2011 Conf. on Human Factors in Computing Systems. [PDF] [video .WMV]

Paper: Design and Evaluation of Interaction Models for Multi-touch Mice

Benko, H., Izadi, S., Wilson, A. D., Cao, X., Rosenfeld, D., Hinckley, K., Design and Evaluation of Interaction Models for Multi-touch Mice. In Proc. Graphics Interface 2010, Ottawa, Ontario, Canada, May 31 – June 2, 2010, pp. 253-260. [PDF] [video .WMV]

Book Chapter: Input Technologies and Techniques (in Human-Computer Interaction Fundamentals)

Hinckley, K., Input Technologies and Techniques. Chapter 9 in Human-Computer Interaction Fundamentals (Human Factors and Ergonomics), ed. by Sears, A., and Jacko, J., CRC Press, Boca Raton, FL. Published March 2, 2009. Originally appeared as Chapter 9 in the Human-Computer Interaction Handbook, 2nd Edition. [PDF of author’s manuscript – not final]

Paper: An Exploration of Pen Rolling for Pen-Based Interaction

Bi, X., Moscovich, T., Ramos, G., Balakrishnan, R., and Hinckley, K. An Exploration of Pen Rolling for Pen-Based Interaction. In Proc. UIST 2008 Symp. on User Interface Software and Technology, Monterey, CA, October 19 – 22, 2008, pp. 191-200. [PDF] [video .WMV]

Paper: Starburst: A Target Expansion Algorithm for Non-Uniform Target Distributions

Baudisch, P., Zotov, A., Cutrell, E., and Hinckley, K. Starburst: A Target Expansion Algorithm for Non-Uniform Target Distributions. In Proc. AVI 2008 Working Conference on Advanced Visual Interfaces, Napoli, Italy, May 28 – 30, 2008, pp. 129-137. [PDF] [video .WMV]

Watch Starburst on YouTube

Book Chapter: Input Technologies and Techniques, 2007 Edition

Hinckley, K., Input Technologies and Techniques. Chapter 9 in The Human-Computer Interaction Handbook, 2nd Edition, ed. by Sears, A., and Jacko, J., CRC Press, Boca Raton, FL. Written in 2006. Published Sept 19, 2007. Also reprinted as Chapter 9 in Human-Computer Interaction Fundamentals. [PDF of author’s manuscript – not final]. See also the 2012 and 2002 editions.

Paper: Tumble! Splat! Helping Users Access and Manipulate Occluded Content in 2D Drawings

Ramos, G., Robertson, G., Czerwinski, M., Tan, D., Baudisch, P., Hinckley, K., and Agrawala, M. Tumble! Splat! Helping Users Access and Manipulate Occluded Content in 2D Drawings. In Proc. AVI 2006 Working Conf. on Advanced Visual Interfaces, Venezia, Italy, May 23 – 26, 2006, pp. 428-435. [PDF] [Splatter technique .AVI] [Tumbler technique .AVI]