Spark 3.5.0 Standalone Deployment
1. Download Spark 3.5.0
https://spark.apache.org/downloads.html
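If you prefer to download on the command line, the release tarball can also be fetched from the Apache archive (URL assumed to follow the standard archive layout; adjust the mirror as needed):
wget https://archive.apache.org/dist/spark/spark-3.5.0/spark-3.5.0-bin-hadoop3.tgz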
2. Install the JDK
3. Install Hadoop
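A quick sanity check that both installs are on the PATH (output will vary with your versions):
java -version
hadoop version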
4. Extract the archive
mkdir /usr/spark
tar -zxvf spark-3.5.0-bin-hadoop3.tgz -C /usr/spark/
5. Configuration
1. Edit the cluster node configuration and add the worker nodes
cd /usr/spark/spark-3.5.0-bin-hadoop3/conf
mv workers.template workers
vi workers
# (Apache license header from workers.template omitted)
# A Spark Worker will be started on each of the machines listed below.
localhost
192.168.58.131
192.168.58.132
2. Configure the Java environment variable and the master address
mv spark-env.sh.template spark-env.sh
vi spark-env.sh
export JAVA_HOME=/usr/java/jdk8u392-b08
SPARK_MASTER_HOST=192.168.58.130
SPARK_MASTER_PORT=7077
3. Sync the configuration to all nodes (omitted here; see the sketch below)
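A minimal sketch of the sync, assuming the same install path on every node and using the worker addresses listed above:
scp -r /usr/spark/spark-3.5.0-bin-hadoop3/conf 192.168.58.131:/usr/spark/spark-3.5.0-bin-hadoop3/
scp -r /usr/spark/spark-3.5.0-bin-hadoop3/conf 192.168.58.132:/usr/spark/spark-3.5.0-bin-hadoop3/
If passwordless SSH from the master to the workers is not set up yet, ssh-keygen plus ssh-copy-id to each worker is the usual prerequisite, since start-all.sh logs in to the workers over SSH.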
6. Start the cluster
/usr/spark/spark-3.5.0-bin-hadoop3/sbin/start-all.sh
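To confirm the daemons came up, jps (shipped with the JDK) should show a Master process on this machine and a Worker process on each node listed in workers:
jps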
7. Access the Web UI
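By default the master's Web UI listens on port 8080, so for this setup it should be reachable at http://192.168.58.130:8080 (the port can differ if 8080 was already taken).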
8. Run a built-in example job (from the Spark installation directory)
bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://192.168.58.130:7077 ./examples/jars/spark-examples_2.12-3.5.0.jar 10
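If the job runs successfully, the driver output should end with a line like "Pi is roughly 3.14...", and the application appears as FINISHED on the master's Web UI.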
Author: 奇
Source: https://www.cnblogs.com/fanqisoft/p/17937845
Copyright: the copyright of this post is shared by the author and cnblogs. Reposting is welcome, provided the original link and this notice are retained; otherwise the author reserves the right to pursue legal liability.