I have created a library in Scala and I deploy it locally. Its build.sbt is documented below:
ThisBuild / name := "sqlparser"
ThisBuild / version := "0.1"
ThisBuild / organization := "xx.xx.xxxxxx.xxxxxx"
scalaVersion := "2.12.13"
idePackagePrefix := Some("xx.xx.xxxxxxx.xxxxxx.sqlparser")
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % Test
publishTo := Some(Resolver.file("file", new File(Path.userHome.absolutePath + "/.ivy2/local/"))(Resolver.ivyStylePatterns))
publishMavenStyle := false
The command I used to publish:
sbt publish
This works like a charm.
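As a sanity check, the published artifact should end up under the ivy-style layout that the `publishTo` resolver above describes. This is just a sketch of where I would expect the jar to land (the organization segment mirrors the redacted value above; the exact path depends on the real organization name):

```shell
# Ivy-style pattern: <org>/<module>/<revision>/jars/<artifact>.jar
# The module name carries the Scala binary-version suffix because of %%.
ls ~/.ivy2/local/xx.xx.xxxxxx.xxxxxx/sqlparser_2.12/0.1/jars/
```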
Now there is a Spark (Scala) based project that includes the package described above. Its build.sbt is as follows:
name := "brandinfoproducer"
version := "0.1"
scalaVersion := "2.12.13"
idePackagePrefix := Some("xx.xx.xxxxxxx.xxxxxx.brandinfoproducer")
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % Test
libraryDependencies += "com.google.guava" % "guava" % "30.1-jre"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.1.0"
resolvers += Resolver.file("file", new File(Path.userHome.absolutePath+"/.ivy2/local/"))(Resolver.ivyStylePatterns)
libraryDependencies += "xx.xx.xxxxxxx.xxxxxx" %% "sqlparser" % "0.1"
Compile and package command:
sbt clean compile package
It compiles and builds perfectly. I then execute the following command to start a Spark job:
/home/user_home/hdp26_c4000_stg/spark2/bin/spark-submit \
--name "XXX_XXXXXXXX_XXXXXXXXXX_XXX" \
--driver-java-options "-Dspring.profiles.active=stg -Dsource=MEDIA -Dspark.executor.memory=500m -Dspark.driver.memory=1g -Djava.security.krb5.conf=${KRB5_CONFIG}" \
--master local[*] \
/home/user_home/brandinfoproducer/target/scala-2.12/brandinfoproducer_2.12-0.1.jar
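For completeness, the contents of the application jar passed to spark-submit can be inspected to see which classes actually got packaged into it (the path matches the command above; `jar` here is the JDK's archive tool):

```shell
# List the jar's entries and filter for the library's classes.
# An empty result would mean the sqlparser classes are not inside this jar.
jar tf /home/user_home/brandinfoproducer/target/scala-2.12/brandinfoproducer_2.12-0.1.jar \
  | grep -i sqlparser
```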
I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: xx/xx/xxxxxx/xxxxxx/sqlparser/SQLParser$
at xx.xx.xxxxxxx.xxxxxx.brandinfoproducer.BrandInfoProducer$.main(BrandInfoProducer.scala:9)
at xx.xx.xxxxxxx.xxxxxx.brandinfoproducer.BrandInfoProducer.main(BrandInfoProducer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: xx.xx.xxxxxxx.xxxxxx.sqlparser.SQLParser$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 11 more
The problem is crystal clear, but the solution is not. I have been trying to work it out, but have not been able to find a proper fix.
I would appreciate a solution. Also, being a new user of Scala and sbt, I may have some obvious glaring oddity in my build.sbt; please let me know if so.