Thursday, April 3, 2008

Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes

Summary:

The authors present a simple yet robust method for recognizing single-stroke (sketch) gestures without using any complex machine learning algorithm. The idea behind the complete $1 recognizer is simple (a rough code sketch of these steps follows the list):

  • Capture different ways of drawing a shape.
  • Resample the data points so that the overall shape is preserved.
  • Rotate the resampled points by an indicative angle, which the authors define as the angle between the gesture's first point and its centroid. According to them, this indicative-angle rotation helps in finding the best match.
  • The resampled points are then compared against the available templates in the database.
  • The closest-matching template is the recognized result.
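As a rough illustration, here is a minimal Python sketch of the listed steps. It is not the authors' implementation: the actual $1 recognizer also scales gestures to a reference square and searches over small rotations for the best alignment, which are omitted here, and the 64-point resolution and helper names are my own choices.

```python
import math

N_POINTS = 64  # resampling resolution; an assumed choice, not taken from the summary

def path_length(points):
    """Total length of the stroke."""
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=N_POINTS):
    """Resample the stroke into n equidistant points so the overall shape is preserved."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    new_points = [pts[0]]
    D = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if D + d >= interval:
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_points.append(q)
            pts.insert(i, q)  # the new point starts the next segment
            D = 0.0
        else:
            D += d
        i += 1
    while len(new_points) < n:  # guard against floating-point shortfall
        new_points.append(pts[-1])
    return new_points[:n]

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(points), sum(ys) / len(points))

def rotate_to_zero(points):
    """Rotate so the indicative angle (centroid to first point) becomes zero."""
    cx, cy = centroid(points)
    angle = math.atan2(points[0][1] - cy, points[0][0] - cx)
    cos_a, sin_a = math.cos(-angle), math.sin(-angle)
    return [((x - cx) * cos_a - (y - cy) * sin_a + cx,
             (x - cx) * sin_a + (y - cy) * cos_a + cy) for x, y in points]

def translate_to_origin(points):
    cx, cy = centroid(points)
    return [(x - cx, y - cy) for x, y in points]

def normalize(points):
    return translate_to_origin(rotate_to_zero(resample(points)))

def path_distance(a, b):
    """Average point-to-point distance between two normalized strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(points, templates):
    """Compare the candidate against every stored template and return the closest one."""
    candidate = normalize(points)
    return min(templates, key=lambda t: path_distance(candidate, t["points"]))["name"]
```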

Since the sampled points depend on the user's drawing style, two similar-looking shapes may match different templates because of that variability. This is addressed by loading the database with templates for all the likely drawing styles.
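Concretely, supporting several drawing styles just means storing more than one template under the same gesture name. In terms of the sketch above, with hypothetical raw strokes standing in for captured pen or mouse input:

```python
# Hypothetical raw strokes; in practice these come from captured input points.
triangle_style_1 = [(0, 0), (50, 100), (100, 0), (0, 0)]
triangle_style_2 = [(100, 0), (0, 0), (50, 100), (100, 0)]  # same shape, drawn from another corner
line_stroke      = [(0, 0), (100, 5)]

templates = [
    {"name": "triangle", "points": normalize(triangle_style_1)},
    {"name": "triangle", "points": normalize(triangle_style_2)},
    {"name": "line",     "points": normalize(line_stroke)},
]

user_stroke = [(2, 1), (52, 98), (99, 3), (1, 0)]  # a rough triangle
print(recognize(user_stroke, templates))            # expected: "triangle"
```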

They compared their algorithm with the Rubine algorithm and reported better accuracy. With one template per gesture they obtained 97% accuracy, which improved to 99% with three templates per gesture.

Discussion:

I have the same thoughts as Brandon. Being an instance-based algorithm, it would not yield fast results if the database is too large. This means that when we present a gesture to the system, it does all the pre-processing and then searches through the whole data set. Even if we repeat the same gesture, it searches the complete data set again. It does not keep any record of the expected gesture, even when a gesture is repeated. In other words, it is an instance-based algorithm whose memory is erased after each search is completed. Maybe with some bookkeeping we could improve the algorithm (one possible reading of that idea is sketched below). Otherwise, it is a beautiful and simple algorithm.
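One hypothetical way to act on the bookkeeping idea, building on the sketch above: remember the few most recently matched templates and compare a new gesture against those first, accepting early if one is already close enough before falling back to the full scan. The RECENT_LIMIT and ACCEPT_DISTANCE values are assumptions and would need tuning (especially since the sketch does not scale gestures to a common size).

```python
RECENT_LIMIT = 5        # assumed size of the "recently matched" list
ACCEPT_DISTANCE = 20.0  # assumed early-accept threshold; needs tuning per application

recent = []  # most recently matched templates, newest first

def recognize_with_bookkeeping(points, templates):
    candidate = normalize(points)
    # Check recently matched templates first; accept early if one is close enough.
    for t in recent:
        if path_distance(candidate, t["points"]) < ACCEPT_DISTANCE:
            return t["name"]
    # Otherwise fall back to the full scan over the database.
    best = min(templates, key=lambda t: path_distance(candidate, t["points"]))
    if best in recent:
        recent.remove(best)
    recent.insert(0, best)
    del recent[RECENT_LIMIT:]
    return best["name"]
```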
