
How do I create a SQLContext in Spark using Scala?


I am writing a Scala program, built with sbt, that creates a SQLContext. Here is my build.sbt:

name := "sampleScalaProject"

version := "1.0"

scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"  

Here is the test program:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main (args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
} 

I get the following error:

Error:(8, 26) overloaded method constructor SQLContext with alternatives:
  (sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext 
  (sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
 cannot be applied to (org.apache.spark.SparkContext.type)
    val sqlcontexttest = new SQLContext(sc)  

Can anyone tell me what the problem is? I am new to Scala and Spark programming.



1> Justin Pihon:

You need to `new` your SparkContext; that should fix it. As written, `val sc = SparkContext` assigns the SparkContext companion object (of type `SparkContext.type`) rather than an instance, which is why neither constructor overload of SQLContext matches.
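
For reference, a minimal sketch of the corrected program. The app name and the local[*] master are assumptions for local testing; adjust them for your cluster:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]): Unit = {
    // Build a SparkConf and instantiate the SparkContext with `new`,
    // instead of referencing the companion object.
    val conf = new SparkConf()
      .setAppName("sampleScalaProject") // assumed app name
      .setMaster("local[*]")            // assumed local master for testing
    val sc = new SparkContext(conf)
    val sqlcontext = new SQLContext(sc)
  }
}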
