Tag Archives: transfer functions

Paper: Toward Compound Navigation Tasks on Mobiles via Spatial Manipulation

I have three papers coming out at MobileHCI 2013, the 15th International Conference on Human-Computer Interaction with Mobile Devices and Services, which convenes this week in Munich. It’s one of those great small conferences that focus exclusively on mobile interaction, which of course is a long-standing interest of mine.

This post focuses on the first of those papers, and right behind it will be short posts on the other two projects that my co-authors are presenting this week.

I’ve explored many directions for viewing and moving through information on small screens, often motivated by novel hardware sensors as well as basic insights about human motor and cognitive capabilities. And I also have a long history in three-dimensional (spatial) interaction, virtual environments, and the like. But despite doing this stuff for decades, every once in a while I still get surprised by experimental results.

That’s just part of what keeps this whole research gig fun and interesting. If all the answers were simple and obvious, there would be no point in doing the studies.

In this particular paper, my co-authors and I took a closer look at a long-standing spatial, or through-the-lens, metaphor for interaction, akin to navigating documents (or other information spaces) by looking through your mobile as if it were a camera viewfinder, and subjected it to experimental scrutiny.

While this basic idea of using your mobile as a viewport onto a larger virtual space has been around for a long time, it hasn’t been carefully evaluated as a way to view virtually larger documents by physically moving the device’s small screen. Nor have the potential advantages of the approach been fully articulated and realized.

This style of navigation (panning and zooming control) on mobile devices has great promise because it allows you to offload the navigation task itself to your nonpreferred hand, leaving your preferred hand free to do other things, whether carrying bags of groceries or performing additional tasks such as annotation, selection, and tapping commands on top of the resulting views.

But, as our study also shows, the approach is not without its challenges: sensing the spatial position of the device, and devising an appropriate input mapping (transfer function), are both difficult problems that will need more progress before we can fully take advantage of this way of moving through information on a mobile device. For the time being, at least, the traditional touch gestures of pinch-to-zoom and drag-to-pan still appear to offer the most efficient solution for general-purpose navigation tasks.
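To make the viewfinder metaphor concrete, here is a minimal sketch of one plausible transfer function: lateral device motion pans the viewport over the document, and in/out motion zooms it. The class name, gain values, and the exponential zoom mapping are all illustrative assumptions on my part, not the mapping evaluated in the paper.

```python
# Sketch of the viewfinder metaphor: sensed device motion drives a viewport
# over a larger virtual document. The specific mapping below (linear pan
# gains, z-motion mapped to exponential zoom) is an illustrative assumption,
# not the transfer function used in the paper.

class SpatialViewport:
    def __init__(self, pan_gain=1.5, zoom_gain=0.01):
        self.pan_gain = pan_gain    # document pixels per mm of lateral device motion
        self.zoom_gain = zoom_gain  # zoom sensitivity per mm of in/out motion
        self.x, self.y = 0.0, 0.0   # viewport center in document coordinates
        self.zoom = 1.0             # current magnification

    def update(self, dx_mm, dy_mm, dz_mm):
        """Apply one frame of sensed device displacement (in millimeters)."""
        # Lateral motion pans; dividing by zoom keeps the motor-space gain
        # constant regardless of magnification.
        self.x += self.pan_gain * dx_mm / self.zoom
        self.y += self.pan_gain * dy_mm / self.zoom
        # Moving the device in or out re-scales the view exponentially, so
        # equal physical motions produce equal multiplicative zoom steps.
        self.zoom *= 2.0 ** (-self.zoom_gain * dz_mm)
        return self.x, self.y, self.zoom
```

For example, `SpatialViewport().update(5.0, 0.0, -10.0)` pans slightly right while zooming in, as if pulling the device toward a detail of interest.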

Pahud, M., Hinckley, K., Iqbal, S., Sellen, A., and Buxton, B., Toward Compound Navigation Tasks on Mobiles via Spatial Manipulation. In ACM 15th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2013), Munich, Germany, Aug. 27-30, 2013, pp. 113-122. [PDF] [video – MP4]

Toward Compound Navigation on Mobiles via Spatial Manipulation on YouTube


Award: Lasting Impact Award

Lasting Impact Award, for Sensing Techniques for Mobile Interaction, UIST 2000. “Awarded for its scientific exploration of mobile interaction, investigating new interaction techniques for handheld mobile devices supported by hardware sensors, and laying the groundwork for new research and industrial applications.” Awarded to Ken Hinckley, Jeff Pierce, Mike Sinclair, and Eric Horvitz at the 24th ACM UIST, October 18, 2011 (sponsored by the ACM, SIGCHI, and SIGGRAPH). Check out the original paper or watch the video appended below.


Sensing Techniques for Mobile Interaction on YouTube

Paper: Sensor Synaesthesia: Touch in Motion, and Motion in Touch

Hinckley, K., and Song, H., Sensor Synaesthesia: Touch in Motion, and Motion in Touch. In Proc. CHI 2011 Conf. on Human Factors in Computing Systems. CHI 2011 Honorable Mention Award. [PDF] [video .WMV]

Watch Sensor Synaesthesia video on YouTube

Paper: Snap-And-Go: Helping Users Align Objects without the Modality of Traditional Snapping

Baudisch, P., Cutrell, E., Hinckley, K., and Eversole, A. Snap-And-Go: Helping Users Align Objects without the Modality of Traditional Snapping. In Proc. CHI 2005 Conf. on Human Factors in Computing Systems, Portland, OR, April 02 – 07, 2005, pp. 301-310. CHI 2005 Best Paper Nomination Award. [PDF] [demo on Patrick Baudisch’s Flash web page]

Unpublished Manuscript: Fundamental States of Interaction for Pen, Touch, and Other Novel Interaction Devices

Hinckley, K., Fundamental States of Interaction for Pen, Touch, and Other Novel Interaction Devices. Unpublished Manuscript, May 27, 2004, 5 pp. This is a white paper about fundamental constraints in designing interfaces for pen, touch, and mouse-based interaction that I wrote in 2004. Note: Parts of this material were revised and appear in my 2007 Input Technologies & Techniques book chapter. [PDF]

Paper: Mouse Ether: Accelerating the Acquisition of Targets across Multi-Monitor Displays

Baudisch, P., Cutrell, E., Hinckley, K., and Gruen, R., Mouse Ether: Accelerating the Acquisition of Targets across Multi-Monitor Displays. In Proc. CHI 2004 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, April 24 – 29, 2004, pp. 1379-1382. [PDF] [video .AVI]

Book Chapter: Input Technologies and Techniques, 1st Edition

Hinckley, K., Input Technologies and Techniques. Chapter 9 in The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications (Human Factors and Ergonomics), 1st Edition, ed. by Sears, A., and Jacko, J. Written in 2001. Published by Lawrence Erlbaum, Hillsdale, NJ, Sept 1, 2002, pp. 151-168. [PDF of author’s manuscript – not final]. See also the 2007 and 2012 editions.