spark.streaming.concurrentJobs is an undocumented setting that, as I understand it, is used to add parallelism to the streaming scheduler, so that multiple micro-batches from the same Kafka topic can be processed concurrently.
My question is whether this means multiple threads will run at the executor level. For example, we generally assume that everything inside "foreachPartition" runs on a single thread, so we do not bother with thread-safe locking. But if we set spark.streaming.concurrentJobs > 1, do we need to pay attention to thread safety, since multiple threads could then operate on the same partition's data concurrently?
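To make the concern concrete, here is a minimal standalone sketch (plain Python threading, not actual Spark code) of the situation being asked about: if tasks from two micro-batch jobs can overlap in time inside one executor JVM, state that is local to a single foreachPartition invocation is still touched by only one thread, but any state shared across tasks (for example a singleton connection pool or a static counter) would need synchronization. The function and variable names below are hypothetical stand-ins, not Spark APIs.

```python
import threading

shared_counter = 0                 # stands in for executor-wide shared state
counter_lock = threading.Lock()    # protects it when jobs overlap

def process_partition(records):
    """Stand-in for the function passed to foreachPartition.

    Local variables here are safe: each task invocation runs this body
    on a single thread. Only the shared, executor-wide state needs a lock.
    """
    global shared_counter
    local_count = 0                # task-local: no lock needed
    for _ in records:
        local_count += 1
    with counter_lock:             # shared across concurrent jobs: lock needed
        shared_counter += local_count

# Simulate two micro-batch jobs whose tasks overlap in time:
jobs = [threading.Thread(target=process_partition, args=(range(1000),))
        for _ in range(2)]
for t in jobs:
    t.start()
for t in jobs:
    t.join()

print(shared_counter)  # 2000: the lock keeps the shared update consistent
```

The point of the sketch is the distinction between the two kinds of state, which is what the thread-safety question hinges on.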