A Fast Heuristic Attribute Reduction Algorithm for Generalized Decision Preservation

DOI: 10.12677/CSA.2024.142026, PP. 260-267

Keywords: Attribute Reduction, Rough Set, Heuristic Algorithms, Generalized Decision Preservation


Abstract:

Attribute reduction is one of the key concepts in rough set theory; it aims to obtain a minimal attribute subset that preserves the classification capability of the original information system. Generalized decision-preserving reduction is an attribute reduction method in rough set theory whose goal is to maintain the decision outcomes of a decision system, ensuring that no original decision is lost during reduction. This means the reduced system can still make correct decisions, and the validity and decision capability of the decision rules are preserved. Traditional heuristic attribute reduction algorithms for generalized decision preservation emphasize effectiveness, but their efficiency leaves room for improvement: when computing the generalized decision-preserving similarity degree, they traverse each object's equivalence class and decision class repeatedly, which causes a large amount of redundant computation. To overcome this problem, we introduce a hash table that stores each object's equivalence class and its generalized decision, so that the similarity degree for generalized decision preservation can be obtained directly for each object rather than by sequential traversal; on this basis, we propose a fast heuristic attribute reduction algorithm for generalized decision preservation. Finally, the effectiveness and efficiency of the proposed algorithm are validated on six UCI datasets.
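
The hash-table idea described above can be made concrete with a short sketch. The Python below is a minimal, hypothetical illustration (the function name, data layout, and toy data are assumptions, not taken from the paper): it keys a dictionary by each object's values on the chosen attribute subset, so every object's equivalence class and generalized decision are collected in one pass and then read back with constant-time lookups instead of repeated traversals.

```python
from collections import defaultdict

def generalized_decisions(objects, decisions, attr_subset):
    """Sketch: compute each object's generalized decision under attr_subset.

    objects     -- list of tuples of conditional attribute values, one per object
    decisions   -- list of decision values, aligned with objects
    attr_subset -- indices of the conditional attributes currently selected

    Returns a list whose i-th entry is the set of decision values occurring
    in object i's equivalence class, i.e. its generalized decision.
    """
    # Hash table: key = the object's values on attr_subset,
    # value = set of decision values seen in that equivalence class.
    table = defaultdict(set)
    keys = []
    for obj, d in zip(objects, decisions):
        key = tuple(obj[a] for a in attr_subset)
        keys.append(key)
        table[key].add(d)

    # One O(1) lookup per object, instead of re-scanning the data set.
    return [table[k] for k in keys]


# Toy usage (hypothetical data): objects 0 and 1 are indiscernible on both
# attributes but carry different decisions, so their generalized decision
# is the set {'yes', 'no'}.
objects = [(1, 0), (1, 0), (0, 1), (0, 1)]
decisions = ['yes', 'no', 'no', 'no']
print(generalized_decisions(objects, decisions, attr_subset=[0, 1]))
# -> [{'yes', 'no'}, {'yes', 'no'}, {'no'}, {'no'}]  (set ordering may vary)
```

A generalized decision-preserving similarity measure would then compare these per-object decision sets directly; the exact measure and the full heuristic search used in the paper are not reproduced here.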

