Installing Spark on Ubuntu 16.04
1. Install the Java JDK
Install OpenJDK 8, since Oracle Java carries licensing-fee concerns:
sudo apt-get install default-jdk
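A quick sanity check confirms the JDK landed on the PATH. This is a sketch; `require` is a hypothetical helper, not a standard command:

```shell
# Hypothetical helper: fail with a message when a required tool is absent
require() {
    command -v "$1" >/dev/null 2>&1 || { echo "missing: $1" >&2; return 1; }
}

# After the apt-get install above, both checks should pass
if require java;  then java -version;  fi
if require javac; then javac -version; fi
```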
2. Install Scala
Spark is pre-built with Scala 2.11, except version 2.4.2, which is pre-built with Scala 2.12.
So install Ubuntu's default Scala 2.11.6:
sudo apt-get install scala
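To confirm which Scala actually landed, the version can be parsed out of `scala -version` (which reports on stderr). `scala_major_minor` is a hypothetical helper for this sketch:

```shell
# Hypothetical helper: read a "... version X.Y.Z ..." line on stdin, print "X.Y"
scala_major_minor() {
    sed -n 's/.*version \([0-9][0-9]*\.[0-9][0-9]*\).*/\1/p'
}

# scala -version prints to stderr, e.g. "Scala code runner version 2.11.6 -- ..."
if command -v scala >/dev/null 2>&1; then
    scala -version 2>&1 | scala_major_minor
fi
```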
3. Install SBT
echo "deb https://dl.bintray.com/sbt/debian /" | sudo tee -a /etc/apt/sources.list.d/sbt.list
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 2EE0EA64E40A89B84B2DF73499E82A75642AC823
sudo apt-get update
sudo apt-get install sbt
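Note that the `tee -a` above appends a duplicate line every time it is re-run. A hedged, idempotent alternative is sketched below; `add_apt_line` is a hypothetical helper, and writing the real file still needs root:

```shell
# Hypothetical helper: append a line to a file only if it is not already there
add_apt_line() {
    grep -qxF "$1" "$2" 2>/dev/null || echo "$1" >> "$2"
}

# Intended use (requires root to write under /etc/apt):
# add_apt_line 'deb https://dl.bintray.com/sbt/debian /' /etc/apt/sources.list.d/sbt.list
```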
4. Install Spark
https://spark.apache.org/downloads.html
wget http://apache.stu.edu.tw/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
mkdir spark
tar zxvf spark-2.4.3-bin-hadoop2.7.tgz -C spark
#mv spark-2.4.3-bin-hadoop2.7/ spark
sudo mv spark/ /usr/lib/
cd /usr/lib/spark/spark-2.4.3-bin-hadoop2.7/conf/
cp spark-env.sh.template spark-env.sh
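With `spark-env.sh` in place, per-machine settings can go inside it. An illustrative fragment, assuming a single-machine setup (the values are examples only, not requirements):

```shell
# Illustrative spark-env.sh entries -- adjust for your hardware
export SPARK_WORKER_MEMORY=1g   # memory each worker may use
export SPARK_WORKER_CORES=2     # cores each worker may use
export SPARK_LOCAL_IP=127.0.0.1 # bind address for a single-machine setup
```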
5. Set the environment variables
vi ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SCALA_HOME=/usr/share/scala
export SBT_HOME=/usr/share/sbt
export SPARK_HOME=/usr/lib/spark/spark-2.4.3-bin-hadoop2.7
export PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:$SBT_HOME/bin:$SPARK_HOME/bin:$SPARK_HOME/sbin
Reload so the variables take effect immediately:
source ~/.bashrc
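After reloading, the exported paths can be sanity-checked before launching anything. `check_home` is a hypothetical helper for this sketch:

```shell
# Hypothetical helper: report whether a *_HOME variable points at a real directory
check_home() {
    # $1 = variable name, $2 = its value
    [ -d "$2" ] && echo "$1 ok" || echo "$1 missing: $2"
}

check_home JAVA_HOME  "$JAVA_HOME"
check_home SCALA_HOME "$SCALA_HOME"
check_home SPARK_HOME "$SPARK_HOME"
```

If all three report `ok`, `spark-shell` should start from any directory.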