Computer Engineering and Applications ›› 2011, Vol. 47 ›› Issue (22): 130-132.

• Database, Signal and Information Processing •


A mutual-information evaluation function for feature subset selection

HONG Zhiyong1,2,WANG Tianqing3,LIU Cantao4   

  1. School of Information Science & Technology,Southwest Jiaotong University,Chengdu 610031,China
    2.School of Computer Science,Wuyi University,Jiangmen,Guangdong 529020,China
    3.School of Economics and Management,Wuyi University,Jiangmen,Guangdong 529020,China
    4.Information Center of the People’s Bank of China,Beijing 100037,China

  • Online:2011-08-01 Published:2011-08-01


Abstract: Conventional mutual-information-based feature selection methods seldom consider how features work together, and their computational complexity grows rapidly as the number of features increases. This paper therefore proposes a new mutual-information-based evaluation function for feature subset selection. Unlike other mutual-information-based feature selection algorithms, it takes the interaction among features into account, so it produces a better feature subset and improves classification accuracy while keeping the computational load limited. Experimental results show a 3%~5% increase in classification accuracy and a 25%~30% improvement in error reduction rate compared with the conventional MIFS method.

Key words: mutual information, feature selection, entropy
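
As background for the baseline the abstract compares against, the following is a minimal sketch of a greedy mutual-information selector in the spirit of the conventional MIFS criterion (relevance minus a β-weighted redundancy penalty). The function names, the toy data, and the β value are illustrative assumptions; this is not the evaluation function proposed in the paper.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """I(X;Y) in bits for two discrete sequences, from empirical counts."""
    n = len(x)
    joint, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
    return sum(
        (c / n) * np.log2((c / n) / ((px[a] / n) * (py[b] / n)))
        for (a, b), c in joint.items()
    )

def mifs_select(X, y, k, beta=0.5):
    """Greedy MIFS-style selection: each step adds the feature that maximizes
    relevance I(f; y) minus beta times its summed redundancy with the
    already-selected features (beta weights the redundancy penalty)."""
    selected = []
    remaining = list(range(X.shape[1]))
    relevance = [mutual_information(X[:, j], y) for j in remaining]
    while len(selected) < k and remaining:
        best = max(
            remaining,
            key=lambda j: relevance[j]
            - beta * sum(mutual_information(X[:, j], X[:, s]) for s in selected),
        )
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: column 0 predicts y perfectly, column 1 is a redundant copy,
# and column 2 is independent of both. A large beta penalizes the copy.
y = np.array([0, 0, 1, 1, 0, 0, 1, 1])
X = np.column_stack([y, y, np.array([0, 1, 0, 1, 0, 1, 0, 1])])
print(mifs_select(X, y, k=2, beta=1.5))  # [0, 2]: the redundant copy is skipped
```

Note that the per-step cost of this greedy scheme grows with the number of already-selected features, which illustrates the complexity concern the abstract raises for feature-rich data.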