Spark wordcount compile error -- reduceByKey is not a member of RDD
Attempting to build the standalone Scala app from http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source, this line:

val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)

fails with the compile error:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
Resolution: import the implicit conversions from SparkContext:

import org.apache.spark.SparkContext._

Spark uses the 'pimp my library' pattern to add methods such as reduceByKey to RDDs of specific element types (here, pair RDDs of type (K, V)). The compiler can only find reduceByKey once the implicit conversion to PairRDDFunctions is in scope. If curious, see SparkContext:1296.
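Putting the fix together, a minimal standalone word count might look like the sketch below. The master URL "local", the app name, and the input path "README.md" are placeholder assumptions, not part of the original post:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._  // brings the implicit RDD-to-PairRDDFunctions conversion into scope

object WordCount {
  def main(args: Array[String]): Unit = {
    // "local" master and app name are placeholders for illustration
    val sc = new SparkContext("local", "WordCount")
    val textFile = sc.textFile("README.md")  // hypothetical input file
    val wordCounts = textFile
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)  // resolves now that the implicit conversion is imported
    wordCounts.collect().foreach(println)
    sc.stop()
  }
}

Note that in Spark 1.3 and later these implicit conversions were moved to the RDD companion object, so the explicit import is no longer required there; this error and fix apply to older Spark versions such as the one the quick-start guide above targeted.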