Indexed by:
Abstract:
Since its introduction in 2006, Hadoop has evolved dramatically and its ecosystem has flourished, now comprising more than 60 components, including HDFS and MapReduce. Hadoop's architecture makes it far superior to comparable products for large-scale data processing and analysis, and it has become the preferred framework for data analysis and processing across a wide range of industries. With the widespread adoption of Hadoop, however, concern is growing about the data trustworthiness of HDFS, in particular the plaintext storage of data and over-reliance on authentication mechanisms. To ensure the trustworthiness of Hadoop data while accounting for the performance requirements of a big data framework, this study focuses on the HDFS data trustworthiness problem and applies node classification, data encryption, trustworthiness measurement, and related measures to comprehensively enhance the trustworthiness of the HDFS file system, improve the existing HDFS design, and ensure that data storage and communication take place in a trustworthy environment. © 2021 IEEE.
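To make the "data encryption" measure named in the abstract concrete, below is a minimal Java sketch, not the paper's implementation, of encrypting file contents on the client before writing them to HDFS, so that blocks never rest on DataNodes in plaintext. It assumes a Hadoop client classpath, a reachable NameNode at the placeholder address hdfs://namenode:8020, and deliberately simplified key handling (a real deployment would obtain keys from a key management service).

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class EncryptedHdfsWrite {
    public static void main(String[] args) throws Exception {
        // Generate a one-off AES-256 key; a real system would fetch it
        // from a key management service instead of generating it inline.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();

        // AES-GCM with a random 12-byte IV (authenticated encryption).
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));

        byte[] plaintext = "sensitive record".getBytes(StandardCharsets.UTF_8);
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Write IV + ciphertext to HDFS; the NameNode and DataNodes only
        // ever see encrypted bytes. fs.defaultFS is a placeholder address.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");
        try (FileSystem fs = FileSystem.get(conf);
             OutputStream out = fs.create(new Path("/secure/record.bin"))) {
            out.write(iv);
            out.write(ciphertext);
        }
    }
}
```

A reader decrypting the file would read the 12-byte IV prefix first, then pass the remaining bytes to a Cipher initialized in DECRYPT_MODE with the same key and IV.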
Keyword:
Reprint Author's Address:
Email:
Source :
Year: 2021
Page: 962-966
Language: English
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 3
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
Affiliated Colleges: