

由粗到细的分层特征选择
Hierarchical Feature Selection from Coarse to Fine


刘浩阳 (Liu Haoyang) 1,2, 林耀进 (Lin Yaojin) 1,2 *, 刘景华 (Liu Jinghua) 3, 吴镒潾 (Wu Yilin) 1,2, 毛煜 (Mao Yu) 1,2, 李绍滋 (Li Shaozi) 4
Abstract: Classification learning that exploits the hierarchical structure among data categories arises widely in practical applications such as disease diagnosis and image annotation. However, the high dimensionality of the feature space burdens hierarchical classification learning with high time complexity and large storage cost. In addition, existing work assumes that the label granularity of the training set is sufficiently fine, which conflicts with practical hierarchical classification, where assigning fine-grained labels is costly and semantic ambiguity exists among category labels. To address these problems, a coarse-to-fine hierarchical feature selection algorithm is proposed. The algorithm selects representative features by considering intra-class consistency and the differences among sibling nodes, and it predicts the unknown fine-grained labels of training samples during feature selection. Experimental results on seven benchmark datasets show that the proposed algorithm outperforms several state-of-the-art comparison algorithms in classification performance and can handle the case where the label granularity is not sufficiently fine.
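The record does not reproduce the paper's formulation, but the keywords (hierarchical classification, sparse optimization, recursive regularization) and the Nie et al. entry on joint L2,1-norm minimization in the reference list point to the general family of node-wise row-sparse regression over a label tree. The sketch below only illustrates that family under stated assumptions (a toy label tree passed as a Python dict, one-hot sibling targets per node, a plain proximal-gradient solver); it does not implement the authors' intra-class consistency term, sibling-difference term, or coarse-to-fine prediction of missing fine-grained labels.

```python
# A minimal, illustrative sketch (NOT the authors' implementation): hierarchy-aware
# feature selection via L2,1-norm sparse regression. The hierarchy format, the
# per-node objective, and the proximal-gradient solver are assumptions made
# purely for illustration.
import numpy as np


def l21_prox(W, t):
    """Row-wise proximal operator of t * ||W||_{2,1} (group soft-thresholding)."""
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    shrink = np.maximum(0.0, 1.0 - t / np.maximum(row_norms, 1e-12))
    return W * shrink


def node_feature_weights(X, Y, lam=0.1, n_iter=500):
    """Solve min_W ||XW - Y||_F^2 + lam * ||W||_{2,1} by proximal gradient descent."""
    step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2 + 1e-12)  # 1 / Lipschitz constant
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ W - Y)
        W = l21_prox(W - step * grad, step * lam)
    return W


def hierarchical_feature_ranking(X, labels, hierarchy, lam=0.1):
    """Rank features by accumulating row norms of W over the internal nodes of an
    (assumed) label tree; each node discriminates among the leaf classes under it,
    which mimics selecting features that separate sibling classes."""
    scores = np.zeros(X.shape[1])
    for node, children in hierarchy.items():
        mask = np.isin(labels, children)
        if not mask.any():
            continue
        Xn = X[mask]
        # one-hot targets over this node's classes (sibling discrimination)
        Yn = (labels[mask][:, None] == np.asarray(children)[None, :]).astype(float)
        W = node_feature_weights(Xn, Yn, lam=lam)
        scores += np.linalg.norm(W, axis=1)  # larger row norm => more useful feature
    return np.argsort(-scores)  # feature indices, most important first


# Toy usage: 3 leaf classes, hypothetical tree where each internal node lists
# the leaf classes it must discriminate among.
rng = np.random.default_rng(0)
X = rng.standard_normal((90, 20))
y = np.repeat([0, 1, 2], 30)
hierarchy = {"root": [0, 1, 2], "parent_A": [0, 1]}
print(hierarchical_feature_ranking(X, y, hierarchy)[:5])
```

In this toy setting, features are scored by the accumulated L2 row norms of the per-node weight matrices; a faithful implementation would couple parent and child nodes through recursive regularization and jointly infer the missing fine-grained labels, rather than treating nodes independently.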
Source: 电子学报 (Acta Electronica Sinica), 2022, 50(11): 2778-2789 [Core collection]
DOI 10.12263/DZXB.20211263
Keywords: feature selection; hierarchical classification; label hierarchy; label granularity; recursive regularization; sparse optimization; globally optimal solution
Affiliations

1. School of Computer Science, Minnan Normal University, Zhangzhou 363000, Fujian, China
2. Key Laboratory of Data Science and Intelligence Application, Fujian Province University, Minnan Normal University, Zhangzhou 363000, Fujian, China
3. College of Computer Science and Technology, Huaqiao University, Xiamen 361021, Fujian, China
4. Department of Artificial Intelligence, Xiamen University, Xiamen 361005, Fujian, China

Language: Chinese
Document type: Research article
ISSN: 0372-2112
Subject: Automation technology; computer technology
Funding: National Natural Science Foundation of China (General Program); Natural Science Foundation of Fujian Province (Key Program)
Record number: CSCD:7362403

References (25 in total; the first 20 are listed below)

1. 王忠伟. A k-nearest neighbor search algorithm for high-dimensional big data based on LSH. 电子学报 (Acta Electronica Sinica), 2016, 44(4): 906-912. CSCD cited: 2
2. Deng J. ImageNet: a large-scale hierarchical image database. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2009: 248-255. CSCD cited: 121
3. Wang G Y. Granular computing: from granularity optimization to multi-granularity joint problem solving. Granular Computing, 2017, 2(3): 105-120. CSCD cited: 13
4. Yao J T. Granular computing: perspectives and challenges. IEEE Transactions on Cybernetics, 2013, 43(6): 1977-1989. CSCD cited: 42
5. Bargiela A. Toward a theory of granular computing for human-centered information processing. IEEE Transactions on Fuzzy Systems, 2008, 16(2): 320-330. CSCD cited: 19
6. 胡清华. A survey on hierarchical learning methods for large-scale classification tasks. 中国科学: 信息科学 (Scientia Sinica Informationis), 2018, 48(5): 487-500. CSCD cited: 11
7. Guo S X. Hierarchical classification with multipath selection based on granular computing. Artificial Intelligence Review, 2021, 54(3): 2067-2089. CSCD cited: 1
8. Silla C N. A survey of hierarchical classification across different application domains. Data Mining and Knowledge Discovery, 2011, 22(1/2): 31-72. CSCD cited: 35
9. Freeman C. Joint feature selection and hierarchical classifier design. Proceedings of the International Conference on Systems, Man, and Cybernetics, 2011: 1728-1734. CSCD cited: 1
10. Grimaudo L. Hierarchical learning for fine grained internet traffic classification. Proceedings of the International Wireless Communications and Mobile Computing Conference, 2012: 463-468. CSCD cited: 1
11. Song J. A method of the feature selection in hierarchical text classification based on the category discrimination and position information. IEEE Transactions on Engineering Management, 2015, 53(4): 555-569. CSCD cited: 2
12. Zhao H. A recursive regularization based feature selection framework for hierarchical classification. IEEE Transactions on Knowledge and Data Engineering, 2021, 33(7): 2833-2846. CSCD cited: 6
13. Tuo Q J. Hierarchical feature selection with subtree based graph regularization. Knowledge-Based Systems, 2018, 163(1): 996-1008. CSCD cited: 4
14. 白盛兴. Online streaming feature selection for large-scale hierarchical classification based on neighborhood rough sets. 模式识别与人工智能 (Pattern Recognition and Artificial Intelligence), 2019, 32(9): 811-820. CSCD cited: 9
15. Liu X X. Robust hierarchical feature selection driven by data and knowledge. Information Sciences, 2021, 551: 341-357. CSCD cited: 3
16. Kosmopoulos A. Evaluation measures for hierarchical classification: a unified view and novel approaches. Data Mining and Knowledge Discovery, 2015, 29(3): 820-865. CSCD cited: 3
17. 刘洪涛. A sparse ensemble method for multi-target regression based on label-specific features. 电子学报 (Acta Electronica Sinica), 2020, 48(5): 906-913. CSCD cited: 2
18. Argyriou A. Multi-task feature learning. Proceedings of the Annual Conference on Neural Information Processing Systems, 2006: 41-48. CSCD cited: 1
19. Gretton A. Measuring statistical dependence with Hilbert-Schmidt norms. Proceedings of the International Conference on Algorithmic Learning Theory, 2005: 63-77. CSCD cited: 1
20. Nie F P. Efficient and robust feature selection via joint L2,1-norms minimization. Proceedings of the Annual Conference on Neural Information Processing Systems, 2010: 1813-1821. CSCD cited: 1
Citing articles: 5 (the first 2 are listed below)

1. 折延宏. Incremental feature selection for hierarchically structured data. 计算机科学与探索 (Journal of Frontiers of Computer Science and Technology), 2023, 17(12): 2928-2941. CSCD cited: 0
2. 周正阳. An urban event prediction method based on a teacher-student spatio-temporal semi-supervised network. 电子学报 (Acta Electronica Sinica), 2023, 51(12): 3557-3571. CSCD cited: 0
