The power of automatic feature selection: Rubine on steroids

  • Authors:
  • Rachel Blagojevic; Samuel Hsiao-Heng Chang; Beryl Plimmer

  • Affiliations:
  • University of Auckland, Auckland, New Zealand (all three authors)

  • Venue:
  • Proceedings of the Seventh Sketch-Based Interfaces and Modeling Symposium
  • Year:
  • 2010


Abstract

Digital ink features drive recognition engines. Intuitively, we understand that particular features are of more value for some problems than for others. Likewise, the inclusion of poor features may be detrimental to recognition success. Many different ink features have been proposed for ink recognition, and most work well in the context in which they are employed. However, given a new problem, it is not clear which of the already defined features will be most useful. We have assembled and categorized a comprehensive feature library and use it with attribute selection algorithms to choose the best features for a specified problem. To verify the effectiveness of this approach, the selected features are used to train a Rubine recognizer. We show that a set of complementary features is most effective: poor features adversely affect recognition, as do two or more aliases of a good feature. We have composed variants of the Rubine recognizer for three different datasets and compared these with Rubine's original feature set, a variant of it called InkRubine, and the $1 recognizer. The results show that feature selection can significantly improve recognition rates with this simple algorithm, verifying our hypothesis that the right combination of features for a problem is one key to recognition success.
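
To make the described pipeline concrete, the sketch below shows one plausible reading of it: rank a library of candidate ink features with an attribute-selection criterion, keep the top-ranked subset, and train a Rubine-style linear classifier on that subset. This is a minimal illustration under stated assumptions, not the authors' implementation; the helper names, the mutual-information criterion, and the toy data are all illustrative (the paper evaluates several selection algorithms on real stroke features).

```python
# Illustrative sketch only: attribute selection feeding a Rubine-style
# linear classifier. Names and data here are hypothetical.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def select_features(X, y, k):
    """Rank features by mutual information with the class label and
    return the indices of the k best (one possible selection criterion)."""
    scores = mutual_info_classif(X, y)
    return np.argsort(scores)[::-1][:k]

class RubineClassifier:
    """Rubine's linear classifier: per-class weights derived from the
    class mean feature vectors and a pooled covariance estimate."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        n_features = X.shape[1]
        cov = np.zeros((n_features, n_features))
        self.means_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            self.means_[c] = Xc.mean(axis=0)
            cov += (Xc - self.means_[c]).T @ (Xc - self.means_[c])
        cov /= len(X) - len(self.classes_)      # pooled covariance
        inv = np.linalg.pinv(cov)               # pseudo-inverse guards against singularity
        # Linear discriminant weights: w_c = S^{-1} mu_c, w_c0 = -1/2 mu_c . w_c
        self.w_ = {c: inv @ self.means_[c] for c in self.classes_}
        self.w0_ = {c: -0.5 * self.means_[c] @ self.w_[c] for c in self.classes_}
        return self

    def predict(self, X):
        # Each stroke is assigned the class with the highest linear score.
        scores = np.stack([X @ self.w_[c] + self.w0_[c] for c in self.classes_], axis=1)
        return self.classes_[scores.argmax(axis=1)]

# Toy usage: 200 strokes, 40 candidate ink features, 4 gesture classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))
y = rng.integers(0, 4, size=200)
best = select_features(X, y, k=10)          # attribute-selection step
clf = RubineClassifier().fit(X[:, best], y) # train on the selected subset only
print(clf.predict(X[:5, best]))
```

The design point the abstract argues for is visible in the shape of this pipeline: the classifier itself stays simple, and recognition quality is controlled by which columns of the feature matrix survive the selection step.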