I continue to believe that stylus input — annotations, sketches, mark-up, and gestures — will be an important aspect of interaction with slate computers in the future, particularly in combination with multi-modal pen+touch input. It also seems that every couple of years I stumble across an interesting new use or set of techniques for motion sensors, and this year proved to be no exception.
Thus, it should come as no surprise that my latest project has continued to push in this direction, exploring the possibilities for pen interaction when the physical stylus itself is augmented with inertial sensors including three-axis accelerometers, gyros, and magnetometers.
In recent years such sensors have become integrated into all manner of gadgets, including smart phones and tablets, and it is increasingly common for microprocessors to include such sensors directly on the die. Hence, in my view of the world, we are just at the cusp of sensor-rich stylus devices becoming commercially feasible, so it is only natural to consider how such sensors afford new interactions, gestures, or context-sensing techniques when integrated directly with an active (powered) stylus on pen-operated devices.
In collaboration with Xiang ‘Anthony’ Chen and Hrvoje Benko I recently published a paper exploring motion-sensing capabilities for electronic styluses, which takes a first look at some techniques for such a device. With some timely help from Tom Blank’s brilliant devices team at Microsoft Research, we built a custom stylus — fully wireless and powered by an AAAA battery — that integrates these sensors.
The techniques we explore range from very simple but clever things, such as reminding the user if they have left the pen behind — a common problem with pen-based devices — to fun new techniques that emulate physical media, such as the gesture of striking a loaded brush against one's finger, as one does when working in water media.
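To give a flavor of how the left-behind-pen reminder might work, here is a minimal sketch of one plausible heuristic: if the tablet's own motion sensors indicate it is being carried away while the stylus accelerometer has read essentially still (near 1 g, gravity only) for some time, fire a reminder. The thresholds, window length, and data structures are illustrative assumptions of mine, not details from the paper.

```python
from dataclasses import dataclass

# Illustrative constants — not values from the paper.
ACCEL_STILL_THRESHOLD = 0.05   # deviation from 1 g still counted as "at rest"
STILL_SAMPLES_REQUIRED = 50    # consecutive still stylus samples before alerting

@dataclass
class Sample:
    stylus_accel_mag: float    # magnitude of stylus acceleration, in g
    tablet_moving: bool        # tablet's motion sensors report movement

def left_behind(samples) -> bool:
    """Return True if the pen appears stationary while the tablet moves away."""
    still_run = 0
    for s in samples:
        if abs(s.stylus_accel_mag - 1.0) < ACCEL_STILL_THRESHOLD:
            still_run += 1         # stylus reads gravity only: sitting still
        else:
            still_run = 0          # stylus moved; reset the still counter
        if s.tablet_moving and still_run >= STILL_SAMPLES_REQUIRED:
            return True            # tablet walking away from a resting pen
    return False
```

In practice one would also want to gate this on pen proximity (e.g., the pen having left hover range of the digitizer), but the core signal — a resting stylus paired with a moving tablet — is the interesting part.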
Check out the video below for an overview of these and some of the other techniques we have come up with so far, or read more about it in the technical paper linked below.
We are continuing to work in this area, and have lots more ideas that go beyond what we were able to accomplish in this first stage of the project, so stay tuned for future developments along these lines.
Hinckley, K., Chen, X., and Benko, H. Motion and Context Sensing Techniques for Pen Computing. In Proc. Graphics Interface 2013 (GI '13), Regina, Saskatchewan, Canada, May 29-31, 2013. Canadian Information Processing Society, Toronto, Ont., Canada. [PDF] [video – MP4]