Installing Hadoop 2.7.2 on Mac OS X
I needed to install Hadoop on Mac OS X, so I followed the notes I had written up earlier when installing it on Ubuntu. After installing, however, running Hadoop produced the following warning:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
A quick search suggests this warning is harmless, but it bothered me, so I went through the steps below to get rid of it.
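Before and after the steps below, you can check exactly which native libraries Hadoop manages to load (assuming `hadoop` is on your PATH):

```shell
# Lists each native library (hadoop, zlib, snappy, lz4, bzip2, openssl)
# and whether it loaded. Before the fix, hadoop shows "false".
hadoop checknative -a
```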
1. Download Java 1.7 JDK and install it
2. Install protobuf 2.5.0 via Homebrew. Refer to link
-------------------------------------------------------------------------------------------------------------------------
About the App
- App name: protobuf
- App description: Protocol buffers (Google’s data interchange format)
- App website: https://github.com/google/protobuf/

Install the App
- Press Command+Space, type Terminal, and press the enter/return key.
- Run in the Terminal app:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" < /dev/null 2> /dev/null
and wait for the command to finish.
- Run:
brew install protobuf
Done! You can now use protobuf.

Source: http://macappstore.org/protobuf/
-------------------------------------------------------------------------------------------------------------------------
$ brew tap homebrew/versions
$ brew install protobuf250
$ brew link --force --overwrite protobuf250
$ protoc --version
libprotoc 2.5.0
3. Install cmake via Homebrew
$ brew install cmake
4. Install Hadoop 2.7.2 via Homebrew
$ brew install hadoop
5. Update Hadoop’s native libraries. Refer to link
$ wget http://apache.tt.co.kr/hadoop/common/hadoop-2.7.2/hadoop-2.7.2-src.tar.gz
$ tar xvf hadoop-2.7.2-src.tar.gz
$ cd hadoop-2.7.2-src
$ mvn package -Pdist,native -DskipTests -Dtar
$ mv hadoop-dist/target/hadoop-2.7.2/lib /usr/local/Cellar/hadoop/2.7.2/
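If the Maven build succeeds, it is worth sanity-checking that real Mach-O native libraries actually landed in the Cellar (paths assume the Homebrew layout used above):

```shell
# The rebuilt native Hadoop libraries should be .dylib files under lib/native.
ls /usr/local/Cellar/hadoop/2.7.2/lib/native
# "file" should report a Mach-O 64-bit dynamically linked shared library.
file /usr/local/Cellar/hadoop/2.7.2/lib/native/libhadoop.dylib
```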
6. Open hadoop-env.sh and add the variables below
# /usr/local/Cellar/hadoop/2.7.2/libexec/etc/hadoop/hadoop-env.sh
export HADOOP_HOME="/usr/local/Cellar/hadoop/2.7.2"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
7. Edit core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
8. Edit hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
9. Activate the Remote Login option (System Preferences > Sharing). Refer to this link
10. Set passphraseless ssh
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
$ ssh localhost
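One caveat: OpenSSH 7.0 and later (shipped with newer macOS releases) disable DSA keys by default, so if `ssh localhost` still prompts for a password after the steps above, an RSA key is the safer choice:

```shell
# Same passphraseless setup as above, but with an RSA key,
# which OpenSSH 7.0+ still accepts by default.
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
ssh localhost
```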
11. Format the filesystem
$ hdfs namenode -format
12. Start the NameNode and DataNode daemons
$ /usr/local/Cellar/hadoop/2.7.2/sbin/start-dfs.sh
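Once start-dfs.sh returns, a quick way to confirm the daemons are actually up is `jps`, plus a small HDFS round trip (the file names here are just examples):

```shell
# jps should list NameNode, DataNode, and SecondaryNameNode.
jps
# Smoke test: create a home directory and round-trip a file through HDFS.
hdfs dfs -mkdir -p /user/$(whoami)
echo "hello hdfs" > /tmp/hello.txt
hdfs dfs -put /tmp/hello.txt /user/$(whoami)/
hdfs dfs -cat /user/$(whoami)/hello.txt
```

The NameNode web UI should also be reachable at http://localhost:50070 on Hadoop 2.7.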
13. Stop the NameNode and DataNode daemons
$ /usr/local/Cellar/hadoop/2.7.2/sbin/stop-dfs.sh