Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (22): 54-64.DOI: 10.3778/j.issn.1002-8331.2203-0527

• Research Hotspots and Reviews •

Review of Feature Selection Methods Based on Kernel Statistical Independence Criteria

HU Zhenwei, WANG Tinghua, ZHOU Huiying   

  1. School of Mathematics and Computer Science, Gannan Normal University, Ganzhou, Jiangxi 341000, China
  • Online: 2022-11-15  Published: 2022-11-15


Abstract: The Hilbert-Schmidt independence criterion (HSIC) is a kernel-based measure of statistical independence with the advantages of simple computation, fast convergence, and low bias, and it is widely used in statistical analysis and machine learning. Feature selection is an effective dimensionality reduction technique that evaluates the importance of features and constructs an optimal feature subspace suited to the learning task. This paper systematically reviews HSIC-based feature selection methods, introducing their theoretical foundations, algorithmic models, and solution methods in detail, analyzing the advantages and disadvantages of HSIC-based feature selection, and outlining directions for future research.
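To make the idea concrete, the empirical (biased) HSIC estimate for kernel matrices K and L is tr(KHLH)/(n-1)^2, where H is the centering matrix, and a simple filter-style selector scores each feature by its HSIC with the target. The sketch below is illustrative only, with assumed function names and a Gaussian kernel; it is not the specific method of any algorithm surveyed in the paper.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for row vectors of X (illustrative choice)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy filter-style feature scoring: rank features by HSIC with the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X[:, 2] + 0.1 * rng.normal(size=100)  # only feature 2 drives the target
L = rbf_kernel(y[:, None])
scores = [hsic(rbf_kernel(X[:, [j]]), L) for j in range(X.shape[1])]
best = int(np.argmax(scores))  # the relevant feature should score highest
print(best)
```

A full HSIC-based selector would typically combine such per-feature (or subset-level) scores with a search strategy such as greedy forward selection or backward elimination, as discussed in the review.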

Key words: feature selection, Hilbert-Schmidt independence criterion(HSIC), kernel method, machine learning
