Today I pulled a project from GitHub and it kept failing to build. I then wrote a small test program of my own, and it hit the exact same error. It turned out the Scala version was wrong.

import org.apache.spark.{SparkConf, SparkContext}

object AppConf {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("AppConf").setMaster("local[4]")
    val sc = new SparkContext(conf)
    sc.parallelize(Seq(1, 2, 3, 4, 5, 6))
      .mapPartitions { iter =>
        // use map instead of foreach: foreach would exhaust the iterator,
        // so the downstream collect() would see an empty partition
        iter.map { item => println(item); item }
      }
      .collect()
    sc.stop()
  }
}

The error reported when building the project:

Error:scalac: Error: object ByteRef does not have a member create
scala.reflect.internal.FatalError: object ByteRef does not have a member create
    at scala.reflect.internal.Definitions$DefinitionsClass.scala$reflect$internal$Definitions$DefinitionsClass$$fatalMissingSymbol(Definitions.scala:1186)
    at scala.reflect.internal.Definitions$DefinitionsClass.getMember(Definitions.scala:1203)
    at scala.reflect.internal.Definitions$DefinitionsClass.getMemberMethod(Definitions.scala:1238)
    at scala.tools.nsc.transform.LambdaLift$$anonfun$scala$tools$nsc$transform$LambdaLift$$refCreateMethod$1.apply(LambdaLift.scala:41)
    at scala.tools.nsc.transform.LambdaLift$$anonfun$scala$tools$nsc$transform$LambdaLift$$refCreateMethod$1.apply(LambdaLift.scala:41)
    at scala.reflect.internal.util.Collections$$anonfun$mapFrom$1.apply(Collections.scala:182)
    at scala.reflect.internal.util.Collections$$anonfun$mapFrom$1.apply(Collections.scala:182)
    at scala.collection.immutable.List.map(List.scala:273)
    at scala.reflect.internal.util.Collections$class.mapFrom(Collections.scala:182)
    at scala.reflect.internal.SymbolTable.mapFrom(SymbolTable.scala:16)
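
The cause: the create factory methods on the scala.runtime Ref classes (ByteRef, IntRef, ...) only exist in Scala 2.11, so a 2.11 compiler looks for ByteRef.create but finds the Scala 2.10 scala-library that spark-core_2.10 pulls in transitively. A quick way to confirm which scala-library is actually on the classpath is to print the runtime version (a minimal sketch; VersionCheck is just an illustrative name):

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints something like "version 2.10.5"; if this does not match the
    // compiler version, mismatch errors like the ByteRef one above can follow
    println(scala.util.Properties.versionString)
  }
}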

The project uses Spark 1.6.2, whose spark-core_2.10 artifact depends on Scala 2.10, while my local Scala was 2.11.8. Changing the Scala version to 2.10 solved the problem:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
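
To keep the compiler and the runtime library in lockstep, the Scala version can also be pinned explicitly in the same pom.xml. A minimal sketch, assuming a Maven build; 2.10.6 is just one concrete 2.10.x release, and the scala.version property name is a convention, not a requirement:

<properties>
    <scala.version>2.10.6</scala.version>
</properties>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>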

Takeaway: lesson learned. Next time, first check whether the Scala version the project expects matches the one installed locally.
