I have partitioned data in HDFS. At some point I decide to update it. The algorithm is:
- Read the new data from a kafka topic.
- Find out new data's partition names.
- Load the data from partitions with these names that is in the HDFS.
- Merge the HDFS data with the new data.
- Overwrite partitions that are already on disk.
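As a minimal, Spark-free sketch of steps 2-3, assuming the directory layout that partitionBy("date", "key") produces (basePath/date=<date>/key=<key>); the helper name and the (date, key) pairs are hypothetical stand-ins for the distinct partition values collected from the Kafka batch:

```scala
// Derive the HDFS partition paths to load, given the new batch's
// distinct (date, key) partition values. Assumes the default layout
// written by .partitionBy("date", "key"): <basePath>/date=.../key=...
def partitionPaths(basePath: String,
                   partitionValues: Seq[(String, String)]): Seq[String] =
  partitionValues.map { case (date, key) =>
    s"$basePath/date=$date/key=$key"
  }
```

These paths can then be fed to a read of the existing data, so only the partitions touched by the new batch get loaded and merged.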
The problem is: what if the new data contains partitions that don't exist on disk yet? In that case they don't get written. https://stackoverflow.com/a/49691528/10681828 <- the solution there, for example, doesn't write new partitions.
The picture above describes the situation. Think of the left disk as the partitions that are already in HDFS, and of the right disk as the partitions we just received from Kafka.
Some of the partitions of the right disk intersect with the existing ones; the others don't. And this code:
spark.conf.set("spark.sql.sources.partitionOverwriteMode","dynamic")
dataFrame
.write
.mode(SaveMode.Overwrite)
.partitionBy("date", "key")
.option("header", "true")
.format(format)
.save(path)
is not able to write the blue part of the picture to disk.
So, how do I resolve this issue? Please provide code. I am looking for something performant.
An example for those who don't understand:
Suppose we have this data in the HDFS:
- PartitionA has data "1"
- PartitionB has data "1"
Now we receive this new data:
- PartitionB has data "2"
- PartitionC has data "1"
So, partitions A and B are in HDFS, and partitions B and C are the new ones. Since B is already in HDFS, we update it; and I want C to be written as well. So the end result should look like this:
- PartitionA has data "1"
- PartitionB has data "2"
- PartitionC has data "1"
But if I use the code from above, I get this:
- PartitionA has data "1"
- PartitionB has data "2"
Because the dynamic partition overwrite feature introduced in Spark 2.3 is not able to create PartitionC.
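Expressed over plain Scala maps, the merge semantics I'm after look like this (PartitionA/B/C stand in for partition keys, the strings for their data; new partitions win on conflict, untouched old partitions survive):

```scala
// Desired merge: incoming partitions overwrite matching existing ones,
// existing partitions without a match are kept, brand-new ones are added.
val existing = Map("PartitionA" -> "1", "PartitionB" -> "1")
val incoming = Map("PartitionB" -> "2", "PartitionC" -> "1")
val merged   = existing ++ incoming // right-hand side wins on duplicate keys
```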
Update: It turns out that if you use Hive tables instead, this will work. But if you use pure Spark it doesn't... So, I guess Hive's overwrite and Spark's overwrite work differently.
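For reference, the Hive-table variant I mean looks roughly like this; it is a sketch, assuming a SparkSession built with .enableHiveSupport(), and the table name my_db.events is hypothetical (a Hive table partitioned by date and key; spark and dataFrame are as in the snippet above):

```scala
// Hedged sketch: write through a Hive table instead of a raw path.
// DataFrameWriter.insertInto resolves columns by position against the
// existing table definition, so the partition columns must come last.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
dataFrame
  .write
  .mode(SaveMode.Overwrite)
  .insertInto("my_db.events")
```

No test is included here since running it requires a live Spark/Hive environment.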