Citation
Wobbrock, Jacob O., Andrew D. Wilson, and Yang Li. "Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes." Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology. ACM, 2007.
Summary
The paper introduces a relatively simple algorithm for classifying gestures, intended for people who have little or no background in pattern recognition. The algorithm has three preprocessing steps and one step for training or classification, and training can be done from a single example per gesture. The three preprocessing steps are:
1) Resampling the input stroke to a set of 64 evenly spaced points.
2) Achieving rotational invariance. This is done by rotating the points so that the angle between the centroid and the first point (the "indicative angle") becomes zero.
3) Scaling the gesture to a reference square and translating it so that its centroid lies at a reference point (the origin).
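The three steps above can be sketched in Python as follows. The point count N = 64 follows the paper; the reference-square size, the function names, and the small numerical guards are my own choices, so this is an illustrative sketch rather than the paper's exact pseudocode.

```python
import math

N = 64          # number of resampled points (the paper uses 64)
SQUARE = 250.0  # side length of the reference square (arbitrary choice here)

def centroid(pts):
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def resample(pts, n=N):
    # Step 1: place n points evenly along the stroke's path length.
    pts = list(pts)
    path = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    step, acc, out = path / (n - 1), 0.0, [pts[0]]
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= step and d > 0:
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # q becomes the start of the next segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:       # guard against floating-point rounding
        out.append(pts[-1])
    return out[:n]

def rotate_to_zero(pts):
    # Step 2: rotate so the angle from the centroid to the first point is zero.
    cx, cy = centroid(pts)
    theta = math.atan2(pts[0][1] - cy, pts[0][0] - cx)
    c, s = math.cos(-theta), math.sin(-theta)
    return [((x - cx) * c - (y - cy) * s + cx,
             (x - cx) * s + (y - cy) * c + cy) for x, y in pts]

def scale_and_translate(pts, size=SQUARE):
    # Step 3: scale to a reference square, then move the centroid to the origin.
    xs, ys = [x for x, _ in pts], [y for _, y in pts]
    w = max(max(xs) - min(xs), 1e-6)
    h = max(max(ys) - min(ys), 1e-6)
    pts = [(x * size / w, y * size / h) for x, y in pts]
    cx, cy = centroid(pts)
    return [(x - cx, y - cy) for x, y in pts]

def preprocess(pts):
    return scale_and_translate(rotate_to_zero(resample(pts)))
```

After `preprocess`, every gesture, however drawn, is a 64-point path in a common position, scale, and orientation, which is what makes the simple point-to-point comparison in the next step possible.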
The last step differs between training and classification. During training, the preprocessed points are simply stored for later retrieval. During classification, the average distance between corresponding points of the candidate and each stored template is computed, and the best match among all classes is chosen based on this distance.
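A minimal sketch of the matching step, assuming the candidate and all templates have already been preprocessed to the same number of points. The names `path_distance` and `recognize` are mine; the paper additionally searches over small rotations before measuring distance and converts the distance into a [0..1] score, both omitted here for brevity.

```python
import math

def path_distance(a, b):
    # Average distance between corresponding points of two equal-length paths.
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(candidate, templates):
    # templates: dict mapping gesture name -> preprocessed point list.
    # Nearest-neighbor classification: pick the closest stored template.
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = path_distance(candidate, tmpl)
        if d < best_d:
            best, best_d = name, d
    return best, best_d
```

Because training just stores points and classification is a linear scan over templates, adding a new gesture class is as simple as appending one more entry to the dictionary.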
Discussion
The algorithm is called the $1 recognizer because it is very cheap and easy: it can be implemented in fewer than 100 lines of code. The idea behind eliminating rotational variance is also interesting and works well in most simple cases. Golden section search, which is related to hill climbing, is used to find the rotation around the indicative angle that best aligns the candidate with a template.
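A generic golden-section search over a unimodal function illustrates the idea; in the recognizer, `f` would be the candidate-to-template distance as a function of rotation angle. The interval bounds and tolerance below are illustrative values, not the paper's.

```python
import math

PHI = 0.5 * (math.sqrt(5) - 1)  # golden ratio conjugate, ~0.618

def golden_section_search(f, lo, hi, tol=math.radians(2)):
    # Minimize a unimodal function f over [lo, hi] by repeatedly
    # shrinking the interval around the smaller of two interior probes.
    x1 = PHI * lo + (1 - PHI) * hi
    x2 = (1 - PHI) * lo + PHI * hi
    f1, f2 = f(x1), f(x2)
    while abs(hi - lo) > tol:
        if f1 < f2:
            hi, x2, f2 = x2, x1, f1          # minimum lies in [lo, x2]
            x1 = PHI * lo + (1 - PHI) * hi
            f1 = f(x1)
        else:
            lo, x1, f1 = x1, x2, f2          # minimum lies in [x1, hi]
            x2 = (1 - PHI) * lo + PHI * hi
            f2 = f(x2)
    return (lo + hi) / 2
```

Each iteration reuses one of the two previous function evaluations, so only one new evaluation is needed per step, which keeps the angular search cheap even though the distance function itself requires touching all 64 points.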