I'm getting started with the sbt build tool for a Spark project. My build.sbt is as follows:
name := "spark-first"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0.cloudera1" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.1.0.cloudera1" % "provided",
  "org.apache.spark" %% "spark-hive" % "2.0.0" % "provided",
  "com.github.nscala-time" %% "nscala-time" % "1.6.0",
  "com.typesafe" % "config" % "1.3.0",
  "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test",
  "org.apache.spark" %% "spark-hive" % "2.1.0" % "test",
  "io.netty" % "netty" % "3.6.2.Final",
  "com.google.guava" % "guava" % "14.0.1",
  "org.apache.commons" % "commons-lang3" % "3.8",
  "com.typesafe.play" %% "play-json" % "2.7.1"
)
resolvers ++= Seq(
  "cloudera" at "http://repository.cloudera.com/artifactory/cloudera-repos/"
)
parallelExecution in Test := false
fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
From Cmd I changed into the project folder and ran 'sbt package'. This creates a target directory under the project folder containing the jar spark-first_2.11-1.0.jar. However, the dependency jars, such as play-functional_2.11-2.7.1.jar and play-json_2.11-2.7.1.jar, end up only in the Ivy cache at C:\Users\sparkuser\.ivy2\cache\com.typesafe.play (under play-functional_2.11\jars and play-json_2.11\jars). I haven't had any luck finding a resource that explains how to automatically place play-functional_2.11-2.7.1.jar and play-json_2.11-2.7.1.jar in the same target directory as spark-first_2.11-1.0.jar. What changes are required to make that happen?
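For reference, this is the kind of approach I was imagining, but haven't verified: a custom sbt task that copies the runtime dependency jars next to the packaged jar (the task name copyDeps is my own invention, and I'm not sure this is the idiomatic way):

```scala
// Sketch (untested): copy every runtime dependency jar into
// target/scala-2.11, next to spark-first_2.11-1.0.jar.
lazy val copyDeps = taskKey[Unit]("Copy dependency jars into the target directory")

copyDeps := {
  // Destination: the same directory 'sbt package' writes the project jar to.
  val dest = target.value / ("scala-" + scalaBinaryVersion.value)
  IO.createDirectory(dest)
  // All jars on the runtime classpath (includes "provided" only at compile time).
  val jars = (dependencyClasspath in Runtime).value.files.filter(_.getName.endsWith(".jar"))
  jars.foreach(jar => IO.copyFile(jar, dest / jar.getName))
}
```

With something like this, running 'sbt package copyDeps' would presumably leave both the project jar and its dependencies under target\scala-2.11. I've also seen plugins like sbt-assembly (single fat jar) and sbt-pack mentioned for this kind of thing, but I haven't tried them.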
Thanks in advance for any help!