Big Data Based on T-Closeness Processing Breaks the Information Cocoon
DOI: https://doi.org/10.62051/ijcsit.v4n3.35

Keywords: t-closeness, Differential privacy, Information cocoon, Privacy protection

Abstract
Addressing the information cocoon phenomenon and the privacy-protection challenges of the big-data era, this paper examines how t-closeness and differential privacy can break information cocoons, strengthen data privacy, and promote information diversity. As an optimized data-anonymization technique, t-closeness limits the distribution of sensitive attributes within each equivalence class, preventing privacy leakage while reducing algorithmic bias and information homogeneity, and thus offers an effective way out of information cocoons. The paper reviews the state of t-closeness research in China and abroad, with particular attention to the fuzzy t-closeness method proposed by Chen Xiaoyu et al. [1], which applies fuzzification to reduce information loss and strengthen privacy protection. Differential privacy, another key privacy-protection technique, guarantees individual privacy by adding random noise to query results and is especially valuable for high-dimensional data. The union-tree-based LDP (local differential privacy) algorithm proposed by Cheng Siyuan et al. markedly improves processing efficiency and data quality.
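The two mechanisms the abstract names can be sketched in a few lines. The snippet below is an illustrative sketch, not the cited authors' implementations: for a categorical sensitive attribute it measures t-closeness via total variation distance (which coincides with the Earth Mover's Distance under a uniform ground distance), and it answers a numeric query under ε-differential privacy with the Laplace mechanism. All function names here are hypothetical.

```python
import math
import random
from collections import Counter

def t_closeness_distance(table_values, class_values):
    """Total variation distance between the sensitive-attribute
    distribution of one equivalence class and that of the whole table.
    For categorical attributes with a uniform ground distance this
    equals the Earth Mover's Distance used in the t-closeness model."""
    g, c = Counter(table_values), Counter(class_values)
    n_g, n_c = len(table_values), len(class_values)
    support = set(g) | set(c)
    return 0.5 * sum(abs(g[v] / n_g - c[v] / n_c) for v in support)

def satisfies_t_closeness(table_values, class_values, t):
    """An equivalence class satisfies t-closeness when its
    sensitive-attribute distribution is within distance t of the table's."""
    return t_closeness_distance(table_values, class_values) <= t

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """epsilon-differentially private numeric query answer:
    add Laplace noise with scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace distribution
    return true_answer - scale * math.copysign(math.log(1 - 2 * abs(u)), u)
```

A class whose distribution mirrors the table passes the check at small t, while a skewed class fails; shrinking ε in the Laplace mechanism adds more noise and thus stronger privacy at the cost of accuracy.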
References
[1] Chen Xiaoyu, Han Bin, Huang Shucheng, et al. Journal of Computer Application and Software, 2018, 35(09):317-322+333.
[2] Wu S H. Research on t-closeness Privacy Protection Model supported by rough sets and clustering [D]. Shanxi Normal University, 2015.
[3] Gong Qiyuan. Research on Data anonymity technology for Data publishing [D]. Southeast University, 2016.
[4] Lu Guoqing, Zhang Xiaojian, Ding Liping, et al. A Frequent Sequence Pattern Mining Method based on Differential Privacy [J]. Journal of Computer Research and Development, 2015, 52(12):2789-2801.
[5] Dwork C, McSherry F, Nissim K, et al. Calibrating noise to sensitivity in private data analysis [C]. Theory of Cryptography Conference (TCC), 2006: 265-284.
[6] Niu Ben, Li Fenghua, Chen Yahong, et al. Privacy protection method, device and electronic equipment for image recognition [P]. Beijing: CN202010346054.9, 2020-08-25.
[7] N. Li, T. Li and S. Venkatasubramanian, "t-closeness: Privacy Beyond k-Anonymity and l-Diversity," 2007 IEEE 23rd International Conference on Data Engineering, Istanbul, Turkey, 2007, pp. 106-115, doi: 10.1109/ICDE.2007.367856.
[8] Gangarde R, Sharma A, Pawar A. Enhanced Clustering Based OSN Privacy Preservation to Ensure k-Anonymity, t-closeness, l-Diversity, and Balanced Privacy Utility [J]. Computers, Materials & Continua, 2023, 75(1):2171-2190.
[9] Domingo-Ferrer J, Soria-Comas J. From t-closeness to differential privacy and vice versa in data anonymization [J]. Knowledge-Based Systems, 2015, 74: 151-158.
[10] Gao Ling, Jiang Yang, Ren Zhe, et al. A t-closeness Privacy Protection Method satisfying ε-differential privacy [P]. Shaanxi Province: CN201910875809.1, 2023-07-11.
License
Copyright (c) 2024 International Journal of Computer Science and Information Technology

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.