I have different functions in my Scala Spark program (simplified code):
def func1(spark: SparkSession, inputpath: String, calc_slot_udf: UserDefinedFunction): DataFrame = {
  import spark.implicits._
  val df = spark.read.parquet(inputpath)
  val result = df.withColumn("time_slot", calc_slot_udf($"tt"))
  result
}
def func2(spark: SparkSession, df: DataFrame, c: String): DataFrame = {
  import spark.implicits._
  val result = df.filter($"country" === c)
  result
}
Also, I have a main function where I create the SparkSession `spark` and then pass `spark` to the other functions. However, since I use `filter($"...")` and other similar column syntax, I have to write `import spark.implicits._` inside each function.
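For reference, the main entry point looks roughly like this (a minimal sketch; the app name, input path, and the UDF body are placeholders, not my real code):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object Main {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("example").getOrCreate()
    // Hypothetical UDF just for illustration: maps a Unix timestamp
    // to an hour-of-day slot
    val calc_slot_udf = udf((tt: Long) => (tt / 3600) % 24)
    // spark is passed explicitly so the helper can do
    // `import spark.implicits._` internally
    val result = func1(spark, "/path/to/input", calc_slot_udf)
    result.show()
  }
}
```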
Is there any way to avoid repeating `import spark.implicits._` inside each function?