
iLEARN: Intelligent LEARNing in real application systems

Classifier Ensemble with Diversity Learning
 
-- Learning to Diversify via Weighted Kernels for Classifier Ensemble

     Classifier ensembles generally should combine diverse component classifiers. However, it is difficult to establish a definitive connection between diversity measures and ensemble accuracy. Given a set of available component classifiers, how to combine them adaptively and diversely remains a major challenge in the literature. In this paper, we argue that diversity, not direct diversity on samples but diversity adapted to the data, is highly correlated with ensemble accuracy, and we propose a novel classifier ensemble technique, learning to diversify, which learns to adaptively combine classifiers by considering both accuracy and diversity. Specifically, our approach, Learning TO Diversify via Weighted Kernels (L2DWK), performs classifier combination by optimizing a direct yet simple criterion: maximizing ensemble accuracy and adaptive diversity simultaneously by minimizing a convex loss function. Given a diversity measure formulation, the diversity is calculated with weighted kernels (i.e., the diversity is measured on the component classifiers’ outputs, which are kernelled and weighted), and the kernel weights are learned automatically. We minimize this loss function by estimating the kernel weights jointly with the classifier weights, and we propose a self-training algorithm that carries out this convex optimization iteratively. Extensive experiments on 32 UCI classification benchmark datasets show that the proposed approach consistently outperforms state-of-the-art ensemble methods such as Bagging, AdaBoost, Random Forests, GASEN, Regularized Selective Ensemble, and Ensemble Pruning via Semi-Definite Programming.

    Datasets and the MATLAB source code of our method are available online.
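    For illustration, below is a minimal Python sketch of the alternating scheme described above: classifier weights and kernel weights are estimated in turn, both constrained to the probability simplex. The squared-error accuracy term, the per-sample RBF kernel, the projected-gradient updates, and all names here (l2dwk_sketch, lam, eta) are illustrative assumptions, not the paper's actual formulation; the real L2DWK objective and algorithm are given in the paper [1] and in the released MATLAB code.

import numpy as np

def project_simplex(v):
    """Euclidean projection of a vector onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def l2dwk_sketch(H, y, lam=0.5, eta=0.05, n_iter=200):
    """Alternating estimation of classifier weights w and kernel weights beta.

    H   : (n, m) matrix of m component classifiers' outputs on n samples.
    y   : (n,) labels in {-1, +1}.
    lam : trade-off between the accuracy term and the diversity term.

    Assumed objective: ||H w - y||^2 / n + lam * w' K(beta) w, where
    K(beta)_{st} = sum_i beta_i * k(H[i, s], H[i, t]) is a weighted kernel
    on the classifiers' outputs and both w and beta lie on the probability
    simplex. This only mirrors the alternating, simplex-constrained
    structure the description names; it is not the paper's exact loss.
    """
    n, m = H.shape
    w = np.full(m, 1.0 / m)
    beta = np.full(n, 1.0 / n)
    # Per-sample RBF kernel between every pair of classifier outputs: (n, m, m).
    Kx = np.exp(-(H[:, :, None] - H[:, None, :]) ** 2)
    for _ in range(n_iter):
        K = np.tensordot(beta, Kx, axes=1)              # (m, m) weighted kernel
        # Update w: gradient of the accuracy term plus the pairwise-agreement
        # penalty, followed by projection back onto the simplex.
        grad_w = 2.0 * H.T @ (H @ w - y) / n + 2.0 * lam * K @ w
        w = project_simplex(w - eta * grad_w)
        # Update beta: each sample's contribution to the ensemble agreement.
        grad_b = lam * np.einsum('s,ist,t->i', w, Kx, w)
        beta = project_simplex(beta - eta * grad_b)
    return w, beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.choice([-1.0, 1.0], size=200)
    # Simulated component classifiers: labels flipped with varying noise rates.
    H = np.stack([y * np.where(rng.random(200) < p, -1.0, 1.0)
                  for p in (0.1, 0.2, 0.3, 0.45, 0.45)], axis=1)
    w, beta = l2dwk_sketch(H, y)
    print("classifier weights:", np.round(w, 3))
    print("ensemble accuracy :", np.mean(np.sign(H @ w) == y))

    In the toy demo, the accuracy term should concentrate weight on the more accurate simulated classifiers, while the agreement penalty spreads some weight across components that disagree with each other; tuning lam trades off the two effects.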


-- Diversity-Based Classifier Ensemble with Sample Weighting

-- Classifier Ensemble with Diversity and Sparsity


Deep Learning with Plentiful Samples and Large-Scale Categories
 

 

References

[1] Xu-Cheng Yin, Chun Yang, and Hong-Wei Hao, “Learning to diversify via weighted kernels for classifier ensemble,” submitted to IEEE Trans. Pattern Analysis and Machine Intelligence (TPAMI), 2014. <Source Codes and Dataset>

[2] Xu-Cheng Yin, Kaizhu Huang, Chun Yang, and Hong-Wei Hao, “Convex ensemble learning with sparsity and diversity,” Information Fusion, vol. 20, pp. 49-59, 2014. <Paper Link>

[3] Xu-Cheng Yin, Kaizhu Huang, and Hong-Wei Hao, “A novel classifier ensemble method with sparsity and diversity,” Neurocomputing, vol. 134, pp. 214-221, 2014.

[4] Chun Yang, Xu-Cheng Yin, and Hong-Wei Hao, “Diversity-based ensemble with sample weight learning,” International Conference on Pattern Recognition (ICPR'14), accepted, 2014. 

[5] Xu-Cheng Yin, Kaizhu Huang, and Hong-Wei Hao, “DE2: Dynamic ensemble of ensembles for learning nonstationary data,” Neurocomputing, accepted, 2014. 

 

Funding and Sponsoring
 
The research project is partly supported by the National Natural Science Foundation of China (61175020).



 

