Hi there, I've completed the HDFS part and am currently studying Spark. The whole HDFS, MapReduce, and YARN sections are done, but nowhere have you explained how to actually work in Hadoop: what to open, where to write the commands, code, scripts, etc. You just shared someone's Git link and asked us to follow the steps. Instead, you should have shown us how to install the software and where and how to write the code and commands. For example, I have a Windows 10 OS; when I open cmd to run a Hadoop or MapReduce command, you haven't told me where to go, which path to navigate to, or how to run the commands from there.