import org.apache.spark.{SparkConf, SparkContext}

object RddBasedOnCollections {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("RddBasedOnCollections")
    val sc = new SparkContext(conf)
    val numbers = 1 to 100                // a Scala Range of 1..100
    val rdd = sc.parallelize(numbers)     // create an RDD from the local collection
    val sum = rdd.reduce(_ + _)           // sum all elements
    println("1+2+3+...+99+100 = " + sum)  // prints 5050
    sc.stop()
  }
}
Run the program and check the result.
Problems encountered when creating a Maven-based Spark project in IDEA

Problem 1
Error:scalac: bad symbolic reference. A signature in package.class refers to type compileTimeOnly
in package scala.annotation which is not available.
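This "bad symbolic reference" error typically points to a Scala version mismatch: the Scala SDK configured in IDEA (or the `scala-library` dependency) is older than the Scala version the Spark artifact was compiled against. A sketch of the relevant pom.xml fragment follows; the version numbers are illustrative assumptions, not values from the original text. The key point is that the `_2.xx` suffix of the Spark artifact must match the `scala-library` major version, and the Scala SDK selected in IDEA's project settings should match as well.

```xml
<!-- Sketch: keep scala-library and the Spark artifact's Scala suffix in sync. -->
<!-- Version numbers below are illustrative assumptions. -->
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.12</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <!-- the _2.11 suffix must match the scala-library major version above -->
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.8</version>
  </dependency>
</dependencies>
```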
Problem 2

java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
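When running with master "local" inside IDEA, the driver lives in the IDE-launched JVM, so its heap is governed by the JVM options rather than spark-submit's --driver-memory. A commonly cited workaround (a sketch under that assumption, not a prescription from the original text) is to raise the heap in the Run Configuration's VM options, e.g. -Xmx1g. An alternative sometimes used for local experiments only is to set spark.testing.memory in code before the SparkContext is created:

```scala
import org.apache.spark.SparkConf

// Sketch: spark.testing.memory overrides Spark's minimum-memory check.
// This is a test-only escape hatch, suitable only for local experiments;
// for anything real, raise the driver JVM heap (e.g. -Xmx1g) instead.
val conf = new SparkConf()
  .setMaster("local")
  .setAppName("RddBasedOnCollections")
  .set("spark.testing.memory", "2147480000") // ~2 GB
```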