
I wrote an example with Spark Maven support in IntelliJ IDEA. The Spark version is 2.0.0, the Hadoop version is 2.7.3, and the Scala version is 2.11.8. The environment in the system and in the IDE is the same version. The application then fails with this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
    at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:149)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:92)
    at com.spark.test.WordCountTest.main(WordCountTest.java:25)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)

Saurabh Srivastava
Victor

2 Answers


Your Spark 2.0.0 build is compiled against Scala 2.10, so you have to add Scala 2.10 as framework support.

Haibo Yan

Update your pom.xml to use Scala 2.11.8.
See the Spark download documentation for version compatibility.
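A minimal pom.xml sketch of what this could look like, assuming the standard spark-core artifact from Maven Central (Spark artifact IDs carry the Scala binary version as a suffix, so the Spark dependency and the Scala library must agree):

```xml
<properties>
  <!-- Pin the Scala version used by the project -->
  <scala.version>2.11.8</scala.version>
</properties>

<dependencies>
  <!-- Spark 2.0.0 built for Scala 2.11: note the _2.11 suffix -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
  <!-- Keep the Scala library in sync with the Spark build -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
</dependencies>
```

The NoSuchMethodError on scala.Predef$.$scope() is the typical symptom of mixing Scala binary versions, since that method is only present in some Scala releases; making the _2.11 suffix and the scala-library version match should remove it.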

Mangaldeep Pannu