
Book Chapter: Input/Output Devices and Interaction Techniques, Third Edition

Hinckley, K., Jacob, R., Ware, C., Wobbrock, J., and Wigdor, D., Input/Output Devices and Interaction Techniques. Appears as Chapter 21 in The Computing Handbook, Third Edition: Two-Volume Set, ed. by Tucker, A., Gonzalez, T., Topi, H., and Diaz-Herrera, J. Published by Chapman and Hall/CRC (Taylor & Francis), May 13, 2014. [PDF – Author’s Draft – may contain discrepancies]


Paper: Toward Compound Navigation Tasks on Mobiles via Spatial Manipulation

I have three papers coming out this week at MobileHCI 2013, the 15th International Conference on Human-Computer Interaction with Mobile Devices and Services, which convenes in Munich. It’s one of the great small conferences that focuses exclusively on mobile interaction, which of course is a long-standing interest of mine.

This post focuses on the first of those papers; right behind it will be short posts on the other two projects that my co-authors are presenting.

I’ve explored many directions for viewing and moving through information on small screens, often motivated by novel hardware sensors as well as basic insights about human motor and cognitive capabilities. And I also have a long history in three-dimensional (spatial) interaction, virtual environments, and the like. But despite doing this stuff for decades, every once in a while I still get surprised by experimental results.

That’s just part of what keeps this whole research gig fun and interesting. If all the answers were simple and obvious, there would be no point in doing the studies.

In this particular paper, my co-authors and I took a closer look at a long-standing spatial, or through-the-lens, metaphor for interaction, akin to navigating documents (or other information spaces) by looking through your mobile as if it were a camera viewfinder, and subjected it to experimental scrutiny.

While this basic idea of using your mobile as a viewport onto a larger virtual space has been around for a long time, it has not been carefully evaluated as a way of viewing virtually larger documents by physically moving a mobile device’s small screen. Nor have the potential advantages of the approach been fully articulated and realized.

This style of navigation (panning and zooming control) on mobile devices has great promise because it allows you to offload the navigation task itself to your nonpreferred hand, leaving your preferred hand free to do other things, whether carrying a bag of groceries or performing additional tasks such as annotation, selection, and tapping commands on top of the resulting views.

But, as our study also shows, the approach is not without its challenges: sensing the spatial position of the device, and devising an appropriate input mapping, are both hard problems that will need more progress before we can take full advantage of this way of moving through information on a mobile device. For the time being, at least, the traditional touch gestures of pinch-to-zoom and drag-to-pan still appear to offer the most efficient solution for general-purpose navigation tasks.
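To make the input-mapping challenge concrete, here is a minimal sketch (in Python) of one plausible mapping for this style of navigation. This is purely illustrative, not the mapping from our paper: device motion parallel to the virtual document pans the view, motion toward or away from it zooms, and all of the gain and clamp values are hypothetical choices made only to show the structure.

    from dataclasses import dataclass

    @dataclass
    class Viewport:
        x: float = 0.0      # pan offset, in document coordinates
        y: float = 0.0
        zoom: float = 1.0   # magnification factor

    PAN_GAIN = 2.0    # document units per mm of device motion (hypothetical)
    ZOOM_GAIN = 0.01  # zoom rate per mm of motion along z (hypothetical)

    def update_viewport(view, dx_mm, dy_mm, dz_mm):
        """Fold one frame of sensed device motion (in mm) into the viewport.

        Pan is divided by the current zoom so that a given physical motion
        produces a constant on-screen displacement at any magnification.
        """
        view.x += PAN_GAIN * dx_mm / view.zoom
        view.y += PAN_GAIN * dy_mm / view.zoom
        # Exponential mapping keeps zooming in and out symmetric; moving the
        # device toward the virtual document (negative dz) zooms in.
        view.zoom *= (1.0 + ZOOM_GAIN) ** (-dz_mm)
        view.zoom = max(0.25, min(view.zoom, 8.0))  # clamp to a sane range
        return view

Even a simple mapping like this hides real design decisions, such as the pan gain, the zoom rate, and how (or whether) to let the user clutch, which is exactly where the sensing and mapping difficulties noted above come into play.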

Pahud, M., Hinckley, K., Iqbal, S., Sellen, A., and Buxton, B., Toward Compound Navigation Tasks on Mobiles via Spatial Manipulation. In ACM 15th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2013), Munich, Germany, Aug. 27-30, 2013, pp. 113-122. [PDF] [video – MP4]

Watch Toward Compound Navigation on Mobiles via Spatial Manipulation on YouTube

Paper: Gradual Engagement between Digital Devices as a Function of Proximity: From Awareness to Progressive Reveal to Information Transfer

I collaborated on a nifty project with the fine folks from Saul Greenberg’s group at the University of Calgary exploring the emerging possibilities for devices to sense and respond to their digital ecology. When devices have fine-grained sensing of their spatial relationships to one another, as well as to the people in that space, it brings about new ways for users to interact with the resulting system of cooperating devices and displays.

This fine-grained sensing approach makes for an interesting contrast to what Nic Marquardt and I explored in GroupTogether, which intentionally took a more conservative approach to the sensing infrastructure, with the idea that sometimes one can still do a lot with very little (sensing).

Taken together, the two papers nicely bracket some possibilities for the future of cross-device interactions and intelligent environments.

This work really underscores that we are still largely in the dark ages when it comes to such digital ecologies. As new sensors and sensing systems make this kind of rich awareness of the surrounding devices and users possible, our devices, operating systems, and user experiences will grow to encompass these expanded horizons as well.
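To give a flavor of the staged structure the paper describes (awareness at a distance, progressive reveal as devices draw closer, and information transfer at close range), here is a purely illustrative Python sketch driven by sensed inter-device distance. The thresholds and names below are hypothetical values of my own choosing, not numbers from the paper.

    from enum import Enum

    class Engagement(Enum):
        DORMANT = 0    # too far away: show nothing
        AWARENESS = 1  # hint that a device with content is nearby
        REVEAL = 2     # progressively preview the transferable content
        TRANSFER = 3   # close enough to permit the actual exchange

    # Hypothetical distance thresholds, in meters, separating the stages.
    AWARENESS_RANGE = 3.0
    REVEAL_RANGE = 1.5
    TRANSFER_RANGE = 0.5

    def engagement_stage(distance_m):
        """Map a sensed device-to-device distance to a discrete stage."""
        if distance_m <= TRANSFER_RANGE:
            return Engagement.TRANSFER
        if distance_m <= REVEAL_RANGE:
            return Engagement.REVEAL
        if distance_m <= AWARENESS_RANGE:
            return Engagement.AWARENESS
        return Engagement.DORMANT

    def reveal_fraction(distance_m):
        """How much of the content preview to show: the reveal is continuous,
        ramping from 0 at the edge of the reveal zone to 1 at transfer range."""
        span = REVEAL_RANGE - TRANSFER_RANGE
        return max(0.0, min(1.0, (REVEAL_RANGE - distance_m) / span))

Note that the reveal is a continuous function of distance rather than a simple on/off switch; that continuity is what makes the engagement feel gradual rather than abrupt.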

The full citation and the link to our scientific paper are as follows:

Marquardt, N., Ballendat, T., Boring, S., Greenberg, S., and Hinckley, K., Gradual Engagement between Digital Devices as a Function of Proximity: From Awareness to Progressive Reveal to Information Transfer. In Proceedings of ACM Interactive Tabletops & Surfaces (ITS 2012), Boston, MA, USA, November 11-14, 2012. 10 pp. [PDF] [video – MP4]

Watch the Gradual Engagement via Proximity video on YouTube

Award: Lasting Impact Award

Lasting Impact Award, for Sensing Techniques for Mobile Interaction, UIST 2000. “Awarded for its scientific exploration of mobile interaction, investigating new interaction techniques for handheld mobile devices supported by hardware sensors, and laying the groundwork for new research and industrial applications.” Awarded to Ken Hinckley, Jeff Pierce, Mike Sinclair, and Eric Horvitz at the 24th ACM UIST, October 18, 2011 (Sponsored by the ACM, SIGCHI, and SIGGRAPH). Check out the original paper or watch the video appended below.

UIST 2011 Lasting Impact Award for "Sensing techniques for mobile interaction"

Watch Sensing Techniques for Mobile Interaction on YouTube

Book Chapter: Input Technologies and Techniques, 2012 Edition

Hinckley, K., and Wigdor, D., Input Technologies and Techniques. Chapter 9 in The Human-Computer Interaction Handbook – Fundamentals, Evolving Technologies and Emerging Applications, Third Edition, ed. by Jacko, J. Published by Taylor & Francis. To appear. [PDF of author’s manuscript – not final]

This is an extensive revision of the 2002 and 2007 editions of my book chapter, and with some heavy lifting from my new co-author Daniel Wigdor, it treats direct-touch input devices and techniques in much more depth. Lots of great new stuff. The book will be out in early 2012 or so from Taylor & Francis – keep an eye out for it!

Paper: Experimental Analysis of Touch-Screen Gesture Designs in Mobile Environments

Bragdon, A., Nelson-Brown, E., Li, Y., and Hinckley, K., Experimental Analysis of Touch-Screen Gesture Designs in Mobile Environments. In Proc. CHI 2011 Conf. on Human Factors in Computing Systems. [PDF]

Book Chapter: Input Technologies and Techniques (in Human-Computer Interaction Fundamentals)

Hinckley, K., Input Technologies and Techniques. Chapter 9 in Human-Computer Interaction Fundamentals (Human Factors and Ergonomics), ed. by Sears, A., and Jacko, J., CRC Press, Boca Raton, FL. Published March 2, 2009. Originally appeared as Chapter 9 in Human-Computer Interaction Handbook, 2nd Edition. [PDF of author’s manuscript – not final]