Title: Combining Multiple Classifiers: Diversify with Boosting and Combining by Stacking
Author: Nima Hatami, Reza Ebrahimpour
Citation: IJCSNS, Vol. 7, No. 1, pp. 127-131
Abstract: Combining multiple classifiers is one of the most important topics in pattern recognition. In this paper, the idea is to combine classifiers with different error types using a learnable combiner that is aware of each classifier's area of expertise, so that the variance of the estimation errors is reduced and the overall classification accuracy is improved. To obtain diverse base classifiers, we use boosting, in which the classifiers are trained on differently distributed training sets. To combine these diverse base classifiers while accounting for their areas of expertise, we use stacked generalization, which reduces the generalization error by training a second-layer classifier to learn the types of errors made by the first-layer classifiers. The proposed model is evaluated on the SATIMAGE data from the ELENA database. Experimental results show that the proposed model outperforms stacking and boosting alone, with higher classification accuracy.
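The two-stage pipeline the abstract describes — boosting to produce base classifiers with different error types, then a learnable second-layer combiner trained on their outputs — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy 1-D dataset, the decision-stump base learners, and the perceptron meta-learner are all assumptions made for the sake of a self-contained example.

```python
import math

# Toy 1-D dataset (an assumption, not the paper's SATIMAGE data): labels
# alternate over four intervals, so no single threshold classifier (stump)
# can be accurate on its own -- diverse base classifiers are needed.
X = [0.5, 1.0, 1.5, 2.5, 3.0, 3.5, 4.5, 6.0, 6.5, 7.0, 8.0, 8.5, 9.0, 9.5]
y = [-1,  -1,  -1,  +1,  +1,  +1,  +1,  -1,  -1,  -1,  +1,  +1,  +1,  +1]

def train_stump(X, y, w):
    """Return the (threshold, polarity) stump minimizing weighted error."""
    best = None
    for t in sorted(set(X)):
        for pol in (+1, -1):
            preds = [pol if x >= t else -pol for x in X]
            err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, t, pol)
    err, t, pol = best
    return (lambda x, t=t, pol=pol: pol if x >= t else -pol), err

# --- Stage 1: boosting for diversity -----------------------------------
# AdaBoost-style reweighting: each round up-weights the points the last
# stump misclassified, so the next stump makes *different* errors.
n = len(X)
w = [1.0 / n] * n
stumps = []
for _ in range(3):
    h, err = train_stump(X, y, w)
    err = max(err, 1e-10)
    alpha = 0.5 * math.log((1.0 - err) / err)
    stumps.append(h)
    w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
    total = sum(w)
    w = [wi / total for wi in w]

# --- Stage 2: stacking as the combiner ---------------------------------
# Instead of a fixed weighted vote, train a second-layer classifier
# (here a simple perceptron) on the base classifiers' outputs, so it
# learns which base classifier to trust in which region of the input.
Z = [[h(x) for h in stumps] for x in X]  # level-1 features = base outputs
meta_w = [0.0] * len(stumps)
meta_b = 0.0
for _ in range(50):  # perceptron training epochs
    for zi, yi in zip(Z, y):
        score = sum(wj * zj for wj, zj in zip(meta_w, zi)) + meta_b
        if (1 if score >= 0 else -1) != yi:
            meta_w = [wj + yi * zj for wj, zj in zip(meta_w, zi)]
            meta_b += yi

def stacked_predict(x):
    z = [h(x) for h in stumps]
    score = sum(wj * zj for wj, zj in zip(meta_w, z)) + meta_b
    return 1 if score >= 0 else -1

accuracy = sum(stacked_predict(xi) == yi for xi, yi in zip(X, y)) / n
print(f"stacked training accuracy: {accuracy:.2f}")
```

On this toy set the three boosted stumps' outputs are linearly separable, so the meta-perceptron fits the training data exactly; the point of the sketch is the structure — reweighted training sets at the first layer, base-classifier outputs as features at the second — not the numbers.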
Keywords: Combining classifiers, stacked generalization, boosting
URL: http://paper.ijcsns.org/07_book/200701/200701A18.pdf