I recently installed pyspark, and it installed correctly. But when I run the following simple program in Python, I get an error.
>>> from pyspark import SparkContext
>>> sc = SparkContext()
>>> data = range(1, 1000)
>>> rdd = sc.parallelize(data)
>>> rdd.collect()
Running the last line produces an error whose key lines appear to be:
[Stage 0:> (0 + 0) / 4]18/01/15 14:36:32 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/pyspark/python/lib/pyspark.zip/pyspark/worker.py", line 123, in main
    ("%d.%d" % sys.version_info[:2], version))
Exception: Python in worker has different version 2.7 than that in driver 3.5, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
I have the following variables in my .bashrc:
export SPARK_HOME=/opt/spark
export PYTHONPATH=$SPARK_HOME/python3
I am using Python 3.
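For anyone debugging the same mismatch, it can help to first confirm which interpreters are involved. A minimal sketch (the diagnosis is mine, not from the original post): the driver is whatever interpreter you launched, while the workers fall back to whatever "python" resolves to on PATH when PYSPARK_PYTHON is unset.

import sys
import shutil

# Interpreter running the driver (the one you launched the shell with).
print("driver:", sys.version_info[:2], sys.executable)

# Interpreter the workers will use by default when PYSPARK_PYTHON is unset.
print("worker default:", shutil.which("python"))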
Incidentally, if you use PyCharm, you can add PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON to the Run/Debug Configuration, as shown in the images below.
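For example, in the Environment variables field of the Run/Debug Configuration you would add entries like the following (the interpreter path here is an assumption; use the Python 3 interpreter configured for your project):

PYSPARK_PYTHON=/usr/bin/python3
PYSPARK_DRIVER_PYTHON=/usr/bin/python3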
You should set the following environment variables in $SPARK_HOME/conf/spark-env.sh, pointing both at the Python 3 interpreter your driver uses:
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/python3
If spark-env.sh does not exist, you can rename spark-env.sh.template to spark-env.sh and add the lines there.
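If you would rather not touch the Spark config files, the same variables can also be set from the driver script itself, before the SparkContext is created. A minimal sketch, assuming you launch the script with the Python 3 interpreter you want the workers to use as well:

import os
import sys

# Point both workers and driver at the interpreter running this script,
# so both sides use the same Python version. Must happen before the
# SparkContext (and its JVM) is started.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

from pyspark import SparkContext

sc = SparkContext()
rdd = sc.parallelize(range(1, 1000))
print(rdd.collect()[:5])  # should now run without the version-mismatch error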