Humans communicate with each other through natural modalities. For example, we convey information through speech, sketching, gestures, and affective displays. Information conveyed through these modalities is easily processed and understood by humans. Computers, on the other hand, are far less capable of conveying and recognizing information communicated through these natural means. One of the goals of Intelligent User Interfaces is to allow humans to communicate with computers through means and modalities with which humans are already familiar.
In this tutorial, we will briefly review the existing work on intelligent user interfaces, and provide a hands-on treatment of sketch-based intelligent user interfaces.
Sketching is a natural mode of interaction used in a variety of settings. For example, people sketch during early design and brainstorming sessions to guide the thought process, and when communicating, we use sketching as an additional modality to convey ideas that cannot easily be put into words. The emergence of hardware such as PDAs and Tablet PCs has made it possible to capture freehand sketches, enabling the routine use of sketching as an additional human-computer interaction modality. Consequently, automatic recognition of hand-drawn content has also received increasing attention.
In this tutorial, we will cover the essentials of creating sketch-based interfaces including the issues of:
- Data collection
- Feature extraction
- Building recognizers for sketch recognition
- Building sketch-based interfaces
The tutorial will start with lectures on these issues.
The lectures will be followed by lab sessions in the afternoon. The goal of the lab sessions is to give participants hands-on experience by having them build a recognizer for a domain of their choice. The only prerequisite for the tutorial is a working knowledge of Java.
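To give a flavor of the feature-extraction step covered in the tutorial, the sketch below computes two simple stroke-level features from a sequence of pen points: total path length and straightness (the ratio of endpoint distance to path length), in the spirit of classic gesture features. This is an illustrative example only, not the tutorial's actual code; the class and method names (`StrokeFeatures`, `pathLength`, `straightness`) are hypothetical.

```java
import java.util.*;

public class StrokeFeatures {
    // A stroke is a sequence of (x, y) points sampled from the pen.
    // Total length of the polyline connecting consecutive points.
    public static double pathLength(double[][] points) {
        double len = 0;
        for (int i = 1; i < points.length; i++) {
            double dx = points[i][0] - points[i - 1][0];
            double dy = points[i][1] - points[i - 1][1];
            len += Math.sqrt(dx * dx + dy * dy);
        }
        return len;
    }

    // Endpoint distance divided by path length: close to 1 for a
    // straight stroke, close to 0 for a closed shape such as a circle.
    public static double straightness(double[][] points) {
        double dx = points[points.length - 1][0] - points[0][0];
        double dy = points[points.length - 1][1] - points[0][1];
        return Math.sqrt(dx * dx + dy * dy) / pathLength(points);
    }
}
```

Features of this kind, computed per stroke, form the input vectors on which a recognizer is trained.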