Conditional Dynamic Mutual Information-Based Feature Selection


  • Huawen Liu Department of Computer Science, Zhejiang Normal University, Jinhua
  • Yuchang Mo Department of Computer Science, Zhejiang Normal University, Jinhua
  • Jianmin Zhao Department of Computer Science, Zhejiang Normal University, Jinhua


Keywords: Pattern classification, feature selection, mutual information, data mining, pattern recognition


With the emergence of new techniques, data in many fields are growing ever larger, especially in dimensionality. The high dimensionality of data poses great challenges to traditional learning algorithms. In fact, many of the features in large volumes of data are redundant or noisy. Their presence not only degrades the performance of learning algorithms, but also confuses end-users in the post-analysis process. Thus, it is necessary to eliminate irrelevant features from data before they are fed into learning algorithms. Many endeavors have been made in this field, and many outstanding feature selection methods have been developed. Among the different evaluation criteria, mutual information has been widely used in feature selection because of its capability of quantifying the uncertainty of features in classification tasks. However, mutual information estimated on the whole dataset cannot exactly represent the correlation between features. To cope with this issue, in this paper we first re-estimate mutual information dynamically on identified instances, and then introduce a new feature selection method based on conditional mutual information. Performance evaluations on sixteen UCI datasets show that our proposed method achieves performance comparable to other well-established feature selection algorithms in most cases.
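To illustrate the general idea behind conditional mutual information-based feature selection, the sketch below implements a plain greedy forward search over discrete features: at each step it picks the feature that shares the most information with the class label, conditioned on the features already selected. This is a minimal illustration of the evaluation criterion only; it is not the authors' algorithm, and in particular it omits the paper's dynamic re-estimation of mutual information on identified instances. All function names and the toy data are our own.

```python
import numpy as np
from collections import Counter


def entropy(values):
    """Shannon entropy (in bits) of a sequence of hashable symbols."""
    n = len(values)
    counts = Counter(values)
    probs = np.array([c / n for c in counts.values()])
    return float(-np.sum(probs * np.log2(probs)))


def cond_mutual_info(x, y, z):
    """I(X; Y | Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) for discrete variables."""
    xz = list(zip(x, z))
    yz = list(zip(y, z))
    xyz = list(zip(x, y, z))
    return entropy(xz) + entropy(yz) - entropy(xyz) - entropy(z)


def select_features(X, y, k):
    """Greedy forward selection of k feature indices by conditional
    mutual information with the class, given the features chosen so far
    (joined into a single context variable)."""
    n, d = X.shape
    selected = []
    context = [0] * n  # trivial context before any feature is chosen
    for _ in range(k):
        scores = [
            -np.inf if j in selected
            else cond_mutual_info(X[:, j], y, context)
            for j in range(d)
        ]
        best = int(np.argmax(scores))
        selected.append(best)
        # Fold the chosen feature into the conditioning context.
        context = list(zip(context, X[:, best]))
    return selected


# Toy example: feature 0 copies the class, feature 1 is weakly related,
# feature 2 is constant (carries no information).
X = np.array([[0, 0, 5],
              [0, 1, 5],
              [1, 0, 5],
              [1, 1, 5],
              [0, 1, 5],
              [1, 0, 5]])
y = np.array([0, 0, 1, 1, 0, 1])
print(select_features(X, y, 2))  # the informative feature 0 is picked first
```

Note that a real implementation would need a reliable estimator of (conditional) mutual information for continuous features, e.g. after discretization, since the naive plug-in estimate above is only suitable for low-cardinality discrete data.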






How to Cite

Liu, H., Mo, Y., & Zhao, J. (2013). Conditional Dynamic Mutual Information-Based Feature Selection. COMPUTING AND INFORMATICS, 31(6), 1193–1216.
