Scala: Spark exception when project includes Akka


I have a project that uses Spark, and I want to use Akka in it. The project worked fine before, but when I added this to build.sbt:

libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.3"

and tried to run the project, I got this error:

[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.ClassNotFoundException: scala.Function0
[error] java.lang.ClassNotFoundException: scala.Function0
[error]     at sbt.classpath.ClasspathFilter.loadClass(ClassLoaders.scala:63)
[error]     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[error]     at java.lang.Class.forName0(Native Method)
[error]     at java.lang.Class.forName(Class.java:348)
[error]     at com.twitter.chill.KryoBase$$anonfun$1.apply(KryoBase.scala:41)
[error]     at com.twitter.chill.KryoBase$$anonfun$1.apply(KryoBase.scala:41)
[error]     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[error]     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[error]     at scala.collection.immutable.Range.foreach(Range.scala:160)
[error]     at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
[error]     at scala.collection.AbstractTraversable.map(Traversable.scala:104)
[error]     at com.twitter.chill.KryoBase.<init>(KryoBase.scala:41)

Any ideas?

With a build.sbt like this:

scalaVersion := "2.11.11"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.3"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

both a HelloWorld in Akka and a WordCount in Spark work fine. Akka is published for Scala 2.10–2.12, but Spark only for 2.10–2.11, so scalaVersion must be pinned to a version both support. It's hard to say more without code samples and your full build.sbt.
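One hint as to why pinning scalaVersion helps: the class the stack trace cannot find, scala.Function0, is not from Akka or Spark at all; it lives in scala-library itself, which suggests the Scala binary version on Spark's classpath doesn't match the project's. A quick standalone check (plain Scala, no Spark or Akka needed) shows that every zero-argument function literal is an instance of that class when a matching scala-library is present:

```scala
// scala.Function0 is part of the Scala standard library: any () => A
// literal is an instance of it. Spark's Kryo serializer (via chill)
// loads it reflectively, which fails when scala-library versions clash.
object Function0Check {
  def main(args: Array[String]): Unit = {
    val thunk: () => Int = () => 42
    // Succeeds whenever a compatible scala-library is on the classpath.
    val cls = Class.forName("scala.Function0")
    println(cls.getName)          // scala.Function0
    println(cls.isInstance(thunk)) // true
  }
}
```

If this reflective lookup fails inside Spark but the snippet above works in isolation, the mismatch is between the Scala version sbt compiles with and the one the %% dependencies were published for.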

