Installing Spark on Ubuntu
Environment:
Ubuntu 12.04
Hadoop 2.2.x
Spark 0.9
Scala 2.9.0 (scala-2.9.0.final.tgz)
Steps
1. Download Scala
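For example, something like the following (the download URL is an assumption; use whichever mirror you actually obtained the archive from):
# download and unpack Scala into /home/software
wget http://www.scala-lang.org/downloads/distrib/files/scala-2.9.0.final.tgz
tar -xzf scala-2.9.0.final.tgz -C /home/software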
2. Extract Scala, then edit /etc/profile and append the following:
export SCALA_HOME=/home/software/scala-2.9.0.final
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:/home/software/eclipse:$ANT_HOME/bin:$SQOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
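To apply the change in the current shell and confirm Scala is on the PATH:
# reload the profile and check the Scala version
source /etc/profile
scala -version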
3. Download Spark
Version: spark-0.9.0-incubating-bin-hadoop2.tgz
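A rough sketch of fetching and unpacking it into /opt/spark (the archive.apache.org URL is an assumption; sudo may be needed to write under /opt):
# download the prebuilt Hadoop 2 package and move it to /opt/spark
wget http://archive.apache.org/dist/spark/spark-0.9.0-incubating/spark-0.9.0-incubating-bin-hadoop2.tgz
tar -xzf spark-0.9.0-incubating-bin-hadoop2.tgz
sudo mv spark-0.9.0-incubating-bin-hadoop2 /opt/spark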
4. Edit /etc/profile and append:
export SPARK_HOME=/opt/spark
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:/home/software/eclipse:$ANT_HOME/bin:$SQOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
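As before, reload the profile and do a quick sanity check:
# SPARK_HOME should point at /opt/spark and spark-shell should resolve from the PATH
source /etc/profile
echo $SPARK_HOME
which spark-shell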
5. Go into conf/ and make the following changes:
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
export SCALA_HOME=/home/software/scala-2.9.0.final
export JAVA_HOME=/home/software/jdk1.7.0_55
export SPARK_MASTER_IP=172.16.2.104
export SPARK_WORKER_MEMORY=1000m
6. vim conf/slaves and list the worker hosts:
localhost
datanode1
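The worker listed here (datanode1) needs the same Scala and Spark installation and configuration at the same paths. A minimal sketch, assuming passwordless SSH to datanode1 and write permission on the target directories:
# copy the configured Spark and Scala directories to the worker
scp -r /opt/spark datanode1:/opt/
scp -r /home/software/scala-2.9.0.final datanode1:/home/software/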
7. Start/stop Spark
sbin/start-all.sh
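After starting, jps (shipped with the JDK) should show a Master process on this node and a Worker on each slave; the matching stop script lives alongside it:
# list running Java daemons (expect Master and Worker)
jps
# stop the cluster when you are done
sbin/stop-all.sh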
8. Browse the master UI
http://robinson-ubuntu:8080
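Replace robinson-ubuntu with your master's hostname or IP. Without a browser, a quick reachability check from the command line might look like:
# expect an HTTP 200 from the standalone master web UI
curl -s -o /dev/null -w "%{http_code}\n" http://172.16.2.104:8080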
9. Run an example in local mode
run-example org.apache.spark.examples.SparkPi local
10. Run an example against the standalone master
run-example org.apache.spark.examples.SparkPi spark://172.16.2.104:7077
11. Run another example against the standalone master
run-example org.apache.spark.examples.SparkLR spark://172.16.2.104:7077
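Besides the bundled examples, you can attach an interactive shell to the cluster; in Spark 0.9 the MASTER environment variable selects the master (exit with Ctrl+D):
# open a spark-shell connected to the running standalone cluster
MASTER=spark://172.16.2.104:7077 bin/spark-shell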