I am trying to do this in my unit tests:
val sConf = new SparkConf()
  .setAppName("RandomAppName")
  .setMaster("local")
val sc = new SparkContext(sConf)
val sqlContext = new TestHiveContext(sc)  // tried new HiveContext(sc) as well
But I get:
[scalatest] Exception encountered when invoking run on a nested suite - java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient *** ABORTED ***
[scalatest]   java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
[scalatest]   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
[scalatest]   at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:120)
[scalatest]   at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
[scalatest]   at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
[scalatest]   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
[scalatest]   at org.apache.spark.sql.hive.test.TestHiveContext.<init>(TestHive.scala:72)
[scalatest]   at mypackage.NewHiveTest.beforeAll(NewHiveTest.scala:48)
[scalatest]   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[scalatest]   at mypackage.NewHiveTest.beforeAll(NewHiveTest.scala:35)
[scalatest]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[scalatest]   at mypackage.NewHiveTest.run(NewHiveTest.scala:35)
[scalatest]   at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1491)
The code works fine when I run it with spark-submit, but not in the unit tests. How can I fix the unit test problem?
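For reference, here is roughly how the test is wired up. This is a minimal sketch assuming ScalaTest's FunSuite with BeforeAndAfterAll; the class name NewHiveTest and the package mypackage are taken from the stack trace above, everything else is assumed:

package mypackage

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.test.TestHiveContext
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class NewHiveTest extends FunSuite with BeforeAndAfterAll {

  // created once for the whole suite in beforeAll, stopped in afterAll
  @transient private var sc: SparkContext = _
  @transient private var sqlContext: TestHiveContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    val sConf = new SparkConf()
      .setAppName("RandomAppName")
      .setMaster("local")
    sc = new SparkContext(sConf)
    sqlContext = new TestHiveContext(sc)  // this is where the HiveMetaStoreClient error is thrown
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
    super.afterAll()
  }

  test("a Hive query") {
    // queries against sqlContext go here
  }
}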