
I am using Cassandra 3.2.1 with Spark, and I have included all the required jars. When I try to connect to Cassandra from Java through Spark, I get the following error:

```
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Lscala/collection/immutable/StringOps;
    at akka.util.Duration$.<init>(Duration.scala:76)
    at akka.util.Duration$.<clinit>(Duration.scala)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:120)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:426)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:103)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:98)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:142)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at spark.Sample.run(Sample.java:13)
    at spark.Sample.main(Sample.java:23)
```

Any idea what is causing this, or what I am missing?

See the jars and my sample code in the image below; I don't know where I am making a mistake.


asked by Vijaya (edited by jub0bs)
  • Specify the version of Spark, and also the versions of the additional jar files added to the Driver and Executor classpath. [This link](http://stackoverflow.com/questions/34641995/nosuchmethoderror-from-spark-cassandra-connector-with-assembled-jar/34645885#34645885) may also help you. – Sumit Jan 28 '16 at 06:02
  • I have updated the question. In the link that you mention, it says "Spark 1.4 is the stable release". So is this a problem with Spark 1.3? – Vijaya Jan 28 '16 at 06:44
  • I have now replaced all of the jars with the ones you specified in the link. In addition, I included the scala-library-2.10 jar and some other jars, but I am still getting the same error. – Vijaya Jan 28 '16 at 07:31
  • There is no problem with Spark 1.3; it is actually a version compatibility issue between Spark and the Cassandra driver. See the [Version Compatibility section](https://github.com/datastax/spark-cassandra-connector) here. – Sumit Jan 28 '16 at 07:59
  • OK, thanks. May I know which version of Scala I should use with Spark 1.5 and Cassandra 3.2.1? – Vijaya Jan 28 '16 at 09:31
  • The Scala version should be 2.10.5, but I am not sure whether the Cassandra drivers for Spark 1.5 are fully compatible and available. You can take your chances; if it doesn't work, you will have to live with Spark 1.4. – Sumit Jan 28 '16 at 09:35
  • I have attached an image with the jars and my code to the question. Can you please verify it and let me know if there is any mistake? – Vijaya Jan 28 '16 at 09:51
  • The versions seem to be fine, but it looks like you are using Spark 1.5, and the Cassandra driver for Spark 1.5 is still in progress. In a nutshell, you should use Spark 1.4 or below to connect to Cassandra. Needless to say, use the compatible versions as listed [here](https://github.com/datastax/spark-cassandra-connector). – Sumit Jan 28 '16 at 10:48
  • I am using Spark 1.4 only; you can see that in the image as well. And I have checked the compatibility versions too. – Vijaya Jan 28 '16 at 10:51
  • At the command level I am using Spark master (Spark 1.6), which I built using Maven. Could this be causing the problem? – Vijaya Jan 29 '16 at 05:14
  • I would suggest using Spark 1.4 with the appropriate drivers. – Sumit Jan 29 '16 at 06:36
  • Did you find any mistake in the image (including the jar versions and the Java code)? – Vijaya Jan 29 '16 at 07:14
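
For reference, the compatibility constraints discussed in the comments (Spark 1.4, Scala 2.10, a matching connector) can be pinned in one place with Maven instead of hand-managing jars. This is only a sketch: the exact artifact versions below follow the connector's compatibility table for the Spark 1.4 line and should be checked against your environment. The `NoSuchMethodError` on `scala.Predef$.augmentString` typically means some jar on the classpath was compiled against a different Scala major version (e.g. 2.9 vs 2.10), which is exactly what consistent `_2.10` artifacts avoid.

```xml
<!-- Hypothetical pom.xml fragment: all artifacts share the same Scala
     binary version (_2.10), so no 2.9-compiled jar can leak in. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.1</version>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.4.0</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.5</version>
  </dependency>
</dependencies>
```

Building against Spark master (1.6) while running with 1.4-era jars, as mentioned above, would mix Scala binary versions and reproduce this error.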

0 Answers