
I have different functions in my Scala Spark program (simplified code):

def func1(spark: SparkSession, inputpath: String, calc_slot_udf: UserDefinedFunction): DataFrame = {
    import spark.implicits._

    val df = spark.read.parquet(inputpath)

    val result = df.withColumn("time_slot", calc_slot_udf($"tt"))

    result
}

def func2(spark: SparkSession, df: DataFrame, c: String): DataFrame = {
    import spark.implicits._

    val result = df.filter($"country" === c)

    result
}

In my main function I create the SparkSession spark and then pass it to the other functions. However, since they use filter($"...") and similar syntax, I have to repeat import spark.implicits._ inside each function.

Is there any way to avoid repeating import spark.implicits._ inside each function?
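One possible sketch of a workaround (not part of the original question): since Spark 2.0, every Dataset/DataFrame carries a reference to its session via df.sparkSession, so the implicits can be imported from the DataFrame itself without passing the SparkSession around; alternatively, using col("...") from org.apache.spark.sql.functions avoids the $ interpolator (and thus the implicits) entirely. The function names below are illustrative.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Option 1: import the implicits from the DataFrame's own session
// (Dataset.sparkSession is available since Spark 2.0), so no extra
// SparkSession parameter is needed just for the $ syntax.
def filterByCountry(df: DataFrame, c: String): DataFrame = {
  import df.sparkSession.implicits._
  df.filter($"country" === c)
}

// Option 2: use col(...) instead of $"...": one regular import at the
// top of the file replaces the per-function implicits imports.
def filterByCountryNoImplicits(df: DataFrame, c: String): DataFrame =
  df.filter(col("country") === c)
```

Option 2 removes the repeated import altogether; Option 1 keeps the $ syntax while still needing one import line per function, but without threading spark through every signature.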

Markus
  • Duplicate question here: https://stackoverflow.com/questions/45724290/workaround-for-importing-spark-implicits-everywhere also this one is similar: https://stackoverflow.com/questions/39151189/importing-spark-implicits-in-scala – ecoe Sep 25 '18 at 17:25

0 Answers