Spark cluster on Mesos
Node roles (ZooKeeper, the Mesos masters, and the Mesos agents are colocated on the same three machines):
zk1: 192.168.8.101
zk2: 192.168.8.102
zk3: 192.168.8.103
mesos-m1: 192.168.8.101
mesos-m2: 192.168.8.102
mesos-m3: 192.168.8.103
mesos-a1: 192.168.8.101
mesos-a2: 192.168.8.102
mesos-a3: 192.168.8.103
Append SPARK_HOME to /etc/profile and reload it:
cat >> /etc/profile <<HERE
export SPARK_HOME=/opt/spark
HERE
source /etc/profile
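A quick check that the variable took effect in the current shell:
root@router:~# echo $SPARK_HOME
/opt/spark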
The Spark distribution ships its client tools under bin/:
root@router:~# ls /opt/spark/bin/
beeline
pyspark
Note: make sure the hostname can be resolved successfully.
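If shell startup fails or hangs because the local hostname does not resolve, one minimal fix is an /etc/hosts entry; the name "router" and the address 192.168.8.254 below are this demo machine's values, so substitute your own:
echo "192.168.8.254 router" >> /etc/hosts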
pyspark
root@router:~# /opt/spark/bin/pyspark
Python 2.7.5 (default, Nov 20 2015, 02:00:19)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
16/08/02 17:55:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      (Spark ASCII-art banner)
Using Python version 2.7.5 (default, Nov 20 2015 02:00:19)
SparkSession available as 'spark'.
>>>
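A quick sanity check at the prompt: sum the integers 0 through 99 on the predefined SparkContext sc; the job should return 4950:
>>> sc.parallelize(range(100)).sum()
4950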
spark-shell
root@router:~# /opt/spark/bin/spark-shell
16/08/02 18:00:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/02 18:00:26 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://192.168.8.254:4040
Spark context available as 'sc' (master = local[*], app id = local-1470132026404).
Spark session available as 'spark'.
Welcome to
      (Spark ASCII-art banner)
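The same sanity check works in the Scala shell; summing 1 through 100 should return 5050.0:
scala> sc.parallelize(1 to 100).sum()
res0: Double = 5050.0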
3. Configure the Spark cluster
http://spark.apache.org/docs/latest/running-on-mesos.html
spark1: 192.168.8.101
spark2: 192.168.8.102
spark3: 192.168.8.103
Start the MesosClusterDispatcher, pointing it at the ZooKeeper-backed Mesos masters:
/opt/spark/sbin/start-mesos-dispatcher.sh \
  --master mesos://zk://192.168.8.101:2181,192.168.8.102:2181,192.168.8.103:2181/mesos
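Per the running-on-mesos docs linked above, spark-env.sh must also export MESOS_NATIVE_JAVA_LIBRARY pointing at libmesos.so. Once up, the dispatcher accepts submissions on port 7077 and serves a small web UI on port 8081 by default; a quick check on the node where it was started (the library path below is a common default, adjust to your Mesos install):
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
ss -tlnp | grep -E '7077|8081'
curl -s http://192.168.8.101:8081/ | head -n 5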
4. Submit a job to the Spark cluster
Here we simply reuse the example application bundled with Spark (a fuller sketch of the command follows below):
/opt/spark/bin/spark-submit \
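A minimal sketch of a cluster-mode submission through the dispatcher, assuming the dispatcher runs on 192.168.8.101 with its default port 7077; SparkPi is the bundled example class, while the jar URL, version, and trailing slice count are illustrative:
/opt/spark/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://192.168.8.101:7077 \
  --deploy-mode cluster \
  http://192.168.8.254:8000/spark-examples_2.11-2.0.0.jar \
  100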
Note: the application jar for the job must be given as an http://, hdfs://, or similar URI, so that every node in the cluster can fetch it.
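One simple way to satisfy this, as a sketch assuming the Spark 2.x layout where the bundled jars live under examples/jars: serve that directory over HTTP with Python's built-in server (the exact jar filename depends on your Spark and Scala versions):
cd /opt/spark/examples/jars
python -m SimpleHTTPServer 8000   # Python 2; on Python 3 use: python3 -m http.server 8000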