I am creating a SQLContext in a Scala program built with sbt. This is my build.sbt:
```
name := "sampleScalaProject"

version := "1.0"

scalaVersion := "2.11.7"

//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"
```
This is the test program:
```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {
  def main(args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
}
```
I get the following error:
```
Error:(8, 26) overloaded method constructor SQLContext with alternatives:
  (sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext
  (sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
 cannot be applied to (org.apache.spark.SparkContext.type)
    val sqlcontexttest = new SQLContext(sc)
```
Can anyone tell me what the problem is? I am new to Scala and Spark programming.
You need to `new` your `SparkContext` — e.g. `val sc = new SparkContext(conf)`, where `conf` is a `SparkConf` — and that should fix it. As written, `val sc = SparkContext` assigns the `SparkContext` companion object (of type `SparkContext.type`), which is why neither `SQLContext` constructor accepts it.
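The root cause can be illustrated without Spark at all: in Scala, writing a class name without `new` refers to its companion object, whose type is `ClassName.type`, not an instance of the class — exactly what the compiler error reports. A minimal plain-Scala sketch (the `Context` class here is hypothetical, just standing in for `SparkContext`):

```scala
// A class and its companion object. Referencing `Context` bare yields the
// companion object (type `Context.type`); only `new Context(...)` yields
// an instance of the class.
class Context(val name: String)
object Context // companion object, of type Context.type

object Demo {
  def main(args: Array[String]): Unit = {
    // val bad: Context = Context  // does not compile: found Context.type
    val good = new Context("ok")   // an actual Context instance
    println(good.name)
  }
}
```

In the question's program the same pattern applies: `new SQLContext(sc)` needs `sc` to be a `SparkContext` instance, so `sc` must itself be constructed with `new`.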