HDFS

The Hadoop Distributed File System (HDFS) is a distributed file system that handles large data sets on commodity hardware (Ishengoma, 2013). It is used to scale a single Apache Hadoop cluster to hundreds, or even thousands, of nodes. HDFS is one of the major components of Apache Hadoop, alongside MapReduce and YARN.


From Wikipedia, the free encyclopedia
