I have a dockerized Spark instance and I submit Spark jobs to it from my Spring Boot application via SJS (Spark Job Server). Everything has worked fine so far, but now, when a job is submitted, the following exception appears in the SJS logs:
Uncaught exception while reverting partial writes to file /tmp/spark-f6f7e14c-0d89-40b0-b2d8-262278b619db/blockmgr-1eaf0f4d-8451-4eda-a2e3-ca3acafab871/09/temp_shuffle_c8e20306-d111-49b2-b025-d47ba7cac723
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.FileChannelImpl.truncate(FileChannelImpl.java:371)
    at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(BlockObjectWriter.scala:191)
    at org.apache.spark.util.collection.ExternalSorter$$anonfun$stop$2.apply(ExternalSorter.scala:807)
    at org.apache.spark.util.collection.ExternalSorter$$anonfun$stop$2.apply(ExternalSorter.scala:806)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at org.apache.spark.util.collection.ExternalSorter.stop(ExternalSorter.scala:806)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.stop(SortShuffleWriter.scala:94)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:76)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
I am running Docker for Mac 1.13 beta and there is plenty of free space on disk. Any information related to this would be appreciated.
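For reference, the job is submitted from the Spring Boot side through the Spark Job Server REST API, roughly like the sketch below. The host, app name, class path, context name, and config body are illustrative placeholders, not the exact values from my setup:

    import org.springframework.http.ResponseEntity;
    import org.springframework.web.client.RestTemplate;

    public class SjsSubmitSketch {

        public static void main(String[] args) {
            RestTemplate rest = new RestTemplate();

            // Illustrative values only; the real app name, job class, and
            // context name come from the application's configuration.
            String sjsUrl = "http://spark-jobserver:8090/jobs"
                    + "?appName=my-app"
                    + "&classPath=com.example.jobs.MyJob"
                    + "&context=my-context"
                    + "&sync=false";

            // Spark Job Server takes the job's input config (HOCON) in the
            // POST body; the key/value here is just a placeholder.
            ResponseEntity<String> response =
                    rest.postForEntity(sjsUrl, "input.param = 42", String.class);

            System.out.println(response.getStatusCode());
            System.out.println(response.getBody());
        }
    }

The submission itself returns a job id as expected; the exception above only shows up afterwards in the SJS logs while the shuffle stage is running.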