PySpark: "does not exist in the JVM" error when initializing SparkContext
The error looks like this:
----> 1 sc = SparkContext(conf=conf)

/usr/local/lib/python3.6/site-packages/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    145         try:
    146             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
--> 147                           conf, jsc, profiler_cls)
    148         except:
    149             # If an error occurs, clean up in order to allow future SparkContext creation:

/usr/local/lib/python3.6/site-packages/pyspark/context.py in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
    222         # data via a socket.
    223         # scala's mangled names w/ $ in them require special treatment.
--> 224         self._encryption_enabled = self._jvm.PythonUtils.isEncryptionEnabled(self._jsc)
    225         os.environ["SPARK_AUTH_SOCKET_TIMEOUT"] = \
    226             str(self._jvm.PythonUtils.getPythonAuthSocketTimeout(self._jsc))

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in __getattr__(self, name)
   1529         else:
   1530             raise Py4JError(
-> 1531                 "{0}.{1} does not exist in the JVM".format(self._fqn, name))
   1532
   1533     def _get_args(self, args):

Py4JError: org.apache.spark.api.python.PythonUtils.isEncryptionEnabled does not exist in the JVM
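
This Py4JError typically means the pyspark/py4j that Python imports does not match the Spark build under SPARK_HOME (for example, a pip-installed pyspark shadowing the one shipped with the local Spark). A quick diagnostic sketch (not from the original post; the print labels are just illustrative):

import os
import pyspark
import py4j

# A version or path mismatch between these and the Spark under SPARK_HOME
# is a common cause of "PythonUtils.isEncryptionEnabled does not exist in the JVM".
print("SPARK_HOME   :", os.environ.get("SPARK_HOME"))
print("pyspark from :", pyspark.__file__, "version", pyspark.__version__)
print("py4j from    :", py4j.__file__, "version", py4j.__version__)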
My fix:
Add the following to ~/.bashrc. Note that the py4j-0.10.7-src.zip filename must match the zip that actually ships in $SPARK_HOME/python/lib for your Spark version:
export SPARK_HOME=/usr/local/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH
export PATH=$SPARK_HOME/bin:$SPARK_HOME/python:$PATH
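
After reloading the shell (e.g. source ~/.bashrc), SparkContext should initialize cleanly. A minimal smoke test, assuming a local master is enough for the check:

from pyspark import SparkConf, SparkContext

# If PYTHONPATH now points at the py4j bundled with SPARK_HOME,
# this starts a local context without raising the Py4JError.
conf = SparkConf().setMaster("local[*]").setAppName("jvm-check")
sc = SparkContext(conf=conf)
print("Spark version:", sc.version)
sc.stop()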