
SparkException: Only one SparkContext may be running in this JVM

One good answer on how to resolve "SparkException: Only one SparkContext may be running in this JVM" is collected below.

Submitting a simple Spark pipeline:

./bin/spark-submit --class com.example.ExamplePipeline --master local pipeline-1.0.0-SNAPSHOT.jar
...
17/01/11 12:34:24 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
com.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:72)
com.example.ExamplePipeline.exec(ExamplePipeline.java:115)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1702)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1641)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:755)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
    at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
    at com.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:72)
    at com.example.ExamplePipeline.exec(ExamplePipeline.java:115)
    at com.example.ExamplePipeline.main(ExamplePipeline.java:144)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/01/11 12:34:24 INFO ReceiverTracker: Sent stop signal to all 1 receivers
17/01/11 12:34:24 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook

It looks like another context is already running, so the job stops. I cannot find anything else running, though, and this used to work in the same environment.



1> T. Gawęda:

You can have only one SparkContext instance, unless you set spark.driver.allowMultipleContexts = true (which is not recommended: it is meant for tests, not for production).
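
For reference, that flag from the error message is set on the SparkConf before any context is built. A minimal sketch (the app name and master below are placeholders, not from the original pipeline), again suitable for tests only:

// Discouraged workaround named in the error message: allow more than one
// SparkContext in the same JVM. Intended for testing only.
SparkConf conf = new SparkConf()
        .setAppName("ExamplePipeline")
        .setMaster("local[*]")
        .set("spark.driver.allowMultipleContexts", "true");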

If you do this:

JavaStreamingContext ssc = new JavaStreamingContext(conf, window);

Spark will create a new SparkContext and then a StreamingContext that uses that newly created SparkContext. If a SparkContext was already created before the StreamingContext, this exception is thrown. As far as I can tell from the stack trace, you are using exactly this constructor.
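
In other words, a minimal sketch of the pattern that triggers the exception (conf and window stand in for whatever the pipeline already builds):

// A SparkContext already exists somewhere in this JVM,
// created earlier by the application:
JavaSparkContext sparkContext = new JavaSparkContext(conf);

// This constructor then tries to create a second SparkContext internally
// and fails with "Only one SparkContext may be running in this JVM":
JavaStreamingContext ssc = new JavaStreamingContext(conf, window);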

To avoid this exception, you can instead run:

JavaStreamingContext ssc = new JavaStreamingContext(sparkContext, window);
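
Put together, a minimal self-contained sketch of that fix (the class name, socket source, and window length are illustrative assumptions, not taken from the original pipeline): create one SparkContext and reuse it for the StreamingContext instead of passing a SparkConf.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class ExamplePipeline {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("ExamplePipeline")
                .setMaster("local[*]");

        // Create the single SparkContext for this JVM once...
        JavaSparkContext sparkContext = new JavaSparkContext(conf);

        // ...and hand it to the StreamingContext instead of a SparkConf,
        // so no second SparkContext is created.
        Duration window = new Duration(10_000);
        JavaStreamingContext ssc = new JavaStreamingContext(sparkContext, window);

        // Illustrative source and output operation so the streaming job is valid.
        ssc.socketTextStream("localhost", 9999).print();

        ssc.start();
        ssc.awaitTermination();
    }
}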
