In this blog, we are going to set up a Hadoop multi-node cluster in a distributed environment.
So without wasting any time, let's get started.
Here are the steps you need to perform.
1. Download and install Hadoop on your local machine (single-node setup)
Download Hadoop 2.7.3 from http://hadoop.apache.org/releases.html
Use Java: jdk1.8.0_111
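The download-and-unpack portion of step 1 can be sketched as below. The mirror URL, the install path `/usr/local/hadoop`, and the JDK path are assumptions, so adjust them to your environment:

```shell
# Sketch: fetch and unpack Hadoop 2.7.3 (URL and paths are assumptions).
HADOOP_VERSION=2.7.3
TARBALL="hadoop-${HADOOP_VERSION}.tar.gz"
wget "https://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/${TARBALL}"
tar -xzf "$TARBALL"
sudo mv "hadoop-${HADOOP_VERSION}" /usr/local/hadoop

# Point Hadoop at the JDK mentioned above (adjust to your install path).
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_111
export PATH="$PATH:/usr/local/hadoop/bin"
hadoop version
```

Repeat this on every node (or rsync the unpacked directory) so all machines run the same Hadoop build.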
2. Download Apache Spark from http://spark.apache.org/downloads.html
Choose Spark release: 1.6.2
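Step 2 follows the same pattern. The pre-built package name (`-bin-hadoop2.6`, Spark 1.6's "Hadoop 2.6 and later" build) and the `/usr/local/spark` install path are assumptions; pick whichever pre-built package the downloads page offers for your Hadoop version:

```shell
# Sketch: fetch and unpack Spark 1.6.2 (package name and paths are assumptions).
SPARK_VERSION=1.6.2
PACKAGE="spark-${SPARK_VERSION}-bin-hadoop2.6"
wget "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${PACKAGE}.tgz"
tar -xzf "${PACKAGE}.tgz"
sudo mv "$PACKAGE" /usr/local/spark
export PATH="$PATH:/usr/local/spark/bin"
```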
1. Mapping the nodes
First of all, we have to edit the hosts file in the /etc/ folder on all nodes, specifying the IP address of each system followed by its host name.
# vi /etc/hosts
Enter the following lines in the /etc/hosts file:
192.168.1.xxx hadoop-master
192.168.1.xxx hadoop-slave-1
192.168.56.xxx hadoop-slave-2
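Once the hosts file is saved on every node, a quick way to confirm the mapping took effect is to resolve each hostname. This is a sketch; the hostnames match the entries above:

```shell
# Check that each cluster hostname resolves via /etc/hosts (run on every node).
for host in hadoop-master hadoop-slave-1 hadoop-slave-2; do
  getent hosts "$host" || echo "WARNING: $host does not resolve"
done
```

If any name prints a warning, re-check the corresponding /etc/hosts line on that node before moving on, since Hadoop daemons address each other by these hostnames.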