
Exception while deleting Spark temp dir on Windows 7 64-bit


I am trying to run the unit tests of a Spark job on Windows 7 64-bit. I have:

HADOOP_HOME=D:/winutils

winutils path = D:/winutils/bin/winutils.exe

I ran the following commands:

winutils ls \tmp\hive
winutils chmod -R 777  \tmp\hive
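If the HADOOP_HOME environment variable is not picked up by the test JVM, an alternative is to set the `hadoop.home.dir` system property programmatically before the Spark/Hive session is created. A minimal sketch (the `D:/winutils` path is taken from this question; the class name is illustrative, not part of any Spark API):

```java
public class HadoopHomeSetup {
    // Point Hadoop's shell-utility lookup (winutils.exe) at the local
    // winutils installation. Must run before SparkContext/SessionState
    // is initialized.
    public static void configure(String winutilsHome) {
        System.setProperty("hadoop.home.dir", winutilsHome);
    }

    public static void main(String[] args) {
        configure("D:/winutils");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

In a test suite this would typically go in a `@BeforeClass`/`beforeAll` hook so it runs once before any Spark code touches the filesystem.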

However, when I run my tests, I get the following error:

Running com.dnb.trade.ui.ingest.spark.utils.ExperiencesUtilTest
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec
17/01/24 15:37:53 INFO Remoting: Remoting shut down
17/01/24 15:37:53 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\415387\AppData\Local\Temp\spark-b1672cf6-989f-4890-93a0-c945ff147554
java.io.IOException: Failed to delete: C:\Users\415387\AppData\Local\Temp\spark-b1672cf6-989f-4890-93a0-c945ff147554
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:929)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at .....

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=786m; support was removed in 8.0

Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:525)
        ... 28 more
Caused by: java.io.IOException: Access is denied
        at java.io.WinNTFileSystem.createFileExclusively(Native Method)

I have tried changing the permissions manually, but I get the same error every time.
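The `Access is denied` from `WinNTFileSystem` is typically caused by files under the temp dir that are read-only or still held open on Windows. As an illustration of the first case (this is not Spark's actual `Utils.deleteRecursively` code), a recursive delete that clears the read-only attribute before deleting can be sketched like this:

```java
import java.io.File;

public class ForceDelete {
    // Recursively delete a file or directory, clearing the read-only
    // flag first -- one common cause of "Access is denied" on Windows.
    // Note: this cannot help when another process still holds a lock.
    public static boolean deleteRecursively(File f) {
        f.setWritable(true); // clear read-only attribute before touching children
        if (f.isDirectory()) {
            File[] children = f.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
        }
        return f.delete();
    }
}
```

If the files are locked by a live process (e.g. a Hive session not yet shut down), no permission change will fix it; the lock has to be released first.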

Please help.



1> 小智..:

The problem is in the ShutdownHook, which tries to delete the temp files but fails. Although you cannot fix the underlying issue, you can hide the exception by adding the following two lines to the log4j.properties file in %SPARK_HOME%\conf. If that file does not exist, copy log4j.properties.template and rename it.

log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR

Out of sight, out of mind.
