
Scala - Filtering a DataFrame with "endsWith"


Given a DataFrame:

 val df = sc.parallelize(List(("Mike","1986","1976"), ("Andre","1980","1966"), ("Pedro","1989","2000")))
      .toDF("info", "year1", "year2")
df.show

 +-----+-----+-----+
 | info|year1|year2|
 +-----+-----+-----+
 | Mike| 1986| 1976|
 |Andre| 1980| 1966|
 |Pedro| 1989| 2000|
 +-----+-----+-----+

I am trying to filter for rows in which any value of df ends in 6, but I get an exception. I tried:

  val filtered = df.filter(df.col("*").endsWith("6"))
  org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object, tree: ResolvedStar(info#20, year1#21, year2#22)

I also tried this:

val filtered = df.select(df.col("*")).filter(_ endsWith("6"))
error: missing parameter type for expanded function ((x$1) => x$1.endsWith("6"))

How can I solve this? Thanks.



1> eliasah:

I am not entirely sure what you are trying to do, but based on my understanding:

val df = sc.parallelize(List(("Mike","1986","1976"), ("Andre","1980","1966"), ("Pedro","1989","2000"))).toDF("info", "year1", "year2")
df.show 
# +-----+-----+-----+
# | info|year1|year2|
# +-----+-----+-----+
# | Mike| 1986| 1976|
# |Andre| 1980| 1966|
# |Pedro| 1989| 2000|
# +-----+-----+-----+

val conditions = df.columns.map(df(_).endsWith("6")).reduce(_ or _)
df.withColumn("condition", conditions).filter($"condition" === true).drop("condition").show
# +-----+-----+-----+
# | info|year1|year2|
# +-----+-----+-----+
# |Andre| 1980| 1966|
# | Mike| 1986| 1976|
# +-----+-----+-----+
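
Since filter also accepts a Column expression directly, the helper "condition" column is not strictly necessary. A minimal equivalent sketch under the same spark-shell session (the name anyEndsIn6 is just illustrative):

 val anyEndsIn6 = df.columns          // all column names of df
   .map(df(_).endsWith("6"))          // per-column "ends with 6" condition
   .reduce(_ or _)                    // combine the conditions with logical OR
 df.filter(anyEndsIn6).show           // keeps the Mike and Andre rows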
