I am trying to run unit tests for a Spark job on Windows 7 64-bit. I have:
HADOOP_HOME=D:/winutils
winutils path = D:/winutils/bin/winutils.exe
I ran the following commands:
winutils ls \tmp\hive
winutils chmod -R 777 \tmp\hive
However, when I run my tests, I get the following error:
Running com.dnb.trade.ui.ingest.spark.utils.ExperiencesUtilTest
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec
17/01/24 15:37:53 INFO Remoting: Remoting shut down
17/01/24 15:37:53 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\415387\AppData\Local\Temp\spark-b1672cf6-989f-4890-93a0-c945ff147554
java.io.IOException: Failed to delete: C:\Users\415387\AppData\Local\Temp\spark-b1672cf6-989f-4890-93a0-c945ff147554
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:929)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at .....
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=786m; support was removed in 8.0
Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:525)
        ... 28 more
Caused by: java.io.IOException: Access is denied
        at java.io.WinNTFileSystem.createFileExclusively(Native Method)
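The "Failed to delete" error comes from Spark's recursive temp-dir cleanup: children are deleted first, then the directory, and a single failed delete aborts with an IOException. The sketch below is a hypothetical simplification of that logic (not Spark's actual Utils.deleteRecursively code); on Windows, File.delete() returns false whenever another process or an unclosed stream still holds a handle on the file, which is what turns into the "Access is denied" cause above.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class DeleteDemo {
    // Simplified stand-in for Spark's recursive delete: remove children
    // first, then the entry itself; fail loudly if any delete fails.
    static void deleteRecursively(File f) throws IOException {
        if (f.isDirectory()) {
            File[] children = f.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
        }
        // On Windows, delete() returns false while a handle is still open
        // on the file; that is the case the ShutdownHook runs into.
        if (!f.delete()) {
            throw new IOException("Failed to delete: " + f);
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small temp tree, then clean it up recursively.
        File dir = Files.createTempDirectory("spark-demo-").toFile();
        File inner = new File(dir, "part-00000");
        Files.write(inner.toPath(), "data".getBytes());
        deleteRecursively(dir);
        System.out.println("deleted: " + !dir.exists());
    }
}
```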
I have tried changing the permissions manually, but I get the same error every time.
Please help.
The problem lies in the ShutdownHook, which tries to delete the temp files on exit but fails. Although you cannot fix the failure itself, you can hide the exception by adding the following two lines to the log4j.properties file in %SPARK_HOME%\conf. If that file does not exist, copy log4j.properties.template and rename it.
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
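Those two lines work because log4j (like most Java logging frameworks) lets you set a level per named logger: OFF suppresses everything from ShutdownHookManager, and ERROR keeps only errors from SparkEnv. The sketch below illustrates the same idea with java.util.logging instead of log4j, purely as a self-contained analogy (it does not require Spark or log4j on the classpath):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class SilenceDemo {
    public static void main(String[] args) {
        // Grab the logger by the same dotted name the config line targets.
        Logger noisy = Logger.getLogger("org.apache.spark.util.ShutdownHookManager");

        // Analogous to "...ShutdownHookManager=OFF" in log4j.properties:
        // records from this logger are dropped before reaching any handler.
        noisy.setLevel(Level.OFF);
        noisy.severe("Exception while deleting Spark temp dir"); // suppressed

        System.out.println("logger level: " + noisy.getLevel());
    }
}
```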
Out of sight, out of mind.