
Generalized Decision Attribute Reduction Based on Improved Discernibility Information Tree

DOI: 10.12677/CSA.2024.142022, PP. 215-223

Keywords: Discernibility Matrix, Improved Discernibility Information Tree, Multi-Hierarchical Decision Systems, Generalized Decision Attribute Reduction


Abstract:

Attribute reduction is an effective data dimensionality reduction method and is of great significance for processing high-dimensional data: by removing redundant attributes and retaining important ones, it obtains an attribute subset with the same expressive and classification ability as the original system. The discernibility matrix is an important way to obtain attribute reductions, but it contains a large amount of useless information. Inspired by the improved discernibility information tree, this paper combines the improved discernibility information tree with multi-hierarchical decision systems, studies the relationship between the improved discernibility information trees of different decision levels, and proposes a generalized decision attribute reduction algorithm based on the improved discernibility information tree. The proposed method not only realizes compressed storage of the non-empty elements of the discernibility matrix, but also effectively reduces time consumption. To verify the effectiveness of the algorithm, eight UCI datasets are selected and the algorithm is compared in terms of both reduction results and reduction efficiency; the experimental results confirm its feasibility and effectiveness.
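To make the two ingredients the abstract relies on concrete, the following is a minimal Python sketch, not the authors' implementation: it computes the non-empty elements of a classical discernibility matrix for a toy decision table and stores them compressed in a prefix-tree-style discernibility information tree. The toy table, the attribute names a1-a3, the Node class, and the frequency-ordered insertion are illustrative assumptions; the paper's improved tree and its extension to multi-hierarchical decision systems add further rules not shown here.

from itertools import combinations

# Toy decision table: each row is (condition-attribute values, decision value).
objects = [
    ({"a1": 0, "a2": 1, "a3": 0}, "yes"),
    ({"a1": 1, "a2": 1, "a3": 0}, "no"),
    ({"a1": 0, "a2": 0, "a3": 1}, "no"),
    ({"a1": 1, "a2": 0, "a3": 1}, "yes"),
]

def discernibility_elements(table):
    """Non-empty discernibility-matrix entries: for every pair of objects with
    different decisions, the set of condition attributes whose values differ."""
    elems = []
    for (u, du), (v, dv) in combinations(table, 2):
        if du != dv:
            diff = frozenset(a for a in u if u[a] != v[a])
            if diff:
                elems.append(diff)
    return elems

class Node:
    """A node of a simple discernibility information tree (prefix tree)."""
    def __init__(self, attr=None):
        self.attr = attr      # condition attribute stored at this node
        self.children = {}    # attribute name -> child Node
        self.count = 0        # number of matrix elements passing through this node

def build_tree(elements):
    """Insert each matrix element as a path ordered by global attribute frequency,
    so elements sharing frequent attributes share a common prefix (compressed storage)."""
    freq = {}
    for e in elements:
        for a in e:
            freq[a] = freq.get(a, 0) + 1
    root = Node()
    for e in elements:
        node = root
        for a in sorted(e, key=lambda x: (-freq[x], x)):  # most frequent attribute first
            node = node.children.setdefault(a, Node(a))
            node.count += 1
    return root

elems = discernibility_elements(objects)
tree = build_tree(elems)
print(len(elems), "non-empty elements,", len(tree.children), "branches at the root")

On this toy table the four non-empty elements collapse into two root branches, which illustrates why the tree can serve as a compressed substitute for scanning the full matrix during reduction.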

