Comment:
Chris
Summary:
This paper is quite different from the ones we read before. Instead of building a recognizer from extracted features, it uses a graph-based recognizer. Each symbol is represented as an ARG (attributed relational graph): each node stands for a primitive, and each edge for the relationship between two primitives. Each node has two attributes, primitive type and relative length; each edge has three attributes, number of intersections, angle, and position information. After converting a symbol into an ARG, the remaining task is matching this ARG against the best sample symbol. There are two steps. 1) Graph matching: find the corresponding nodes between the two graphs. This is a well-known NP-complete problem, and the authors provide four different ways to approximate the best match. 2) Measuring the matching score: the authors define six matching-score metrics along with their weights, which are obtained from an empirical study. The system goes through each sample and simply returns the best-N list. Results show about a 93% chance that the top-1 result is correct.
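To make the two steps above concrete, here is a minimal sketch of an ARG and a weighted matching score. The attribute names, the score weights, and the brute-force search are all my own illustrative assumptions, not the paper's actual metrics or its four approximation methods:

```python
from dataclasses import dataclass
from itertools import permutations

@dataclass(frozen=True)
class Node:
    primitive_type: str   # e.g. "line", "arc" (hypothetical labels)
    relative_length: float

@dataclass(frozen=True)
class Edge:
    intersections: int
    angle: float          # relative angle in degrees between the two primitives
    position: str         # coarse relative-position label (hypothetical)

def node_score(a: Node, b: Node) -> float:
    """Similarity of two primitives (1.0 = identical); weights are illustrative."""
    type_match = 1.0 if a.primitive_type == b.primitive_type else 0.0
    length_match = 1.0 - min(1.0, abs(a.relative_length - b.relative_length))
    return 0.6 * type_match + 0.4 * length_match

def edge_score(a: Edge, b: Edge) -> float:
    """Similarity of two relationships, averaged over the three attributes."""
    inter = 1.0 if a.intersections == b.intersections else 0.0
    ang = 1.0 - min(1.0, abs(a.angle - b.angle) / 180.0)
    pos = 1.0 if a.position == b.position else 0.0
    return (inter + ang + pos) / 3.0

def match_score(g1_nodes, g1_edges, g2_nodes, g2_edges, mapping):
    """Score one node correspondence (mapping: g1 node index -> g2 node index)."""
    s = sum(node_score(g1_nodes[i], g2_nodes[j]) for i, j in mapping.items())
    for (i, k), e1 in g1_edges.items():
        e2 = g2_edges.get((mapping[i], mapping[k]))
        if e2 is not None:
            s += edge_score(e1, e2)
    return s

def best_match(g1_nodes, g1_edges, g2_nodes, g2_edges):
    """Brute force over all node correspondences -- exponential, which is
    exactly why the paper resorts to approximation heuristics."""
    n = len(g1_nodes)
    best = (float("-inf"), None)
    for perm in permutations(range(len(g2_nodes)), n):
        m = dict(zip(range(n), perm))
        s = match_score(g1_nodes, g1_edges, g2_nodes, g2_edges, m)
        if s > best[0]:
            best = (s, m)
    return best
```

Matching a tiny two-primitive symbol against itself, `best_match` returns the identity correspondence with the maximal score; against a full sample library, the recognizer would rank samples by this score and return the top-N list.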
Comments:
This paper shows another important area of pattern recognition. For the graph-based approach there is one important issue to consider, one that carries high computational cost: graph matching, or graph isomorphism, which remains an NP-complete problem. But the graph-based approach has real advantages over statistical or syntactic approaches: the system does not depend on drawing orientation, stroke order, or scale (if relative values are used), and it can represent the components' relationships and topology very accurately. The structural method is very appealing to me, and I personally prefer this structural approach over statistical methods. Some papers also use a structural approach to recognize handwritten characters, which seems harder than the work presented in this paper.
All in all, this is a pretty nice paper, and it taught me a lot about modeling a problem structurally and using graph theory to solve it.