Kafka

How to upgrade your Spark Streaming application with a new checkpoint!

With working code. Sometimes in life, we need to make breaking changes that require us to create a new checkpoint. Some example scenarios:

- You are making a code/application change that alters the processing logic
- A major Spark version upgrade, from Spark 2.x to Spark 3.x
- The previous deployment was wrong, and you want to reprocess from a certain point

There are plenty of scenarios where you want to control precisely which data (Kafka offsets) gets processed; the sketch below shows one way to pin those offsets.
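A minimal sketch of pinning the starting offsets when deploying with a fresh checkpoint, assuming Spark Structured Streaming with the spark-sql-kafka package on the classpath; the broker address, topic name, offsets, and paths are illustrative placeholders, not values from the post.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("new-checkpoint-demo").getOrCreate()

# startingOffsets pins exactly which Kafka offsets processing begins from.
# Spark honors it only on the first run, i.e. when the checkpoint directory
# is empty -- which is the point of deploying with a brand-new checkpoint.
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .option("startingOffsets", '{"events": {"0": 1500, "1": 2300}}')
    .load()
)

query = (
    stream.writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events_v2")  # new checkpoint path
    .start()
)
```

Once the first micro-batch commits, the offsets live in the checkpoint, and startingOffsets is ignored on any subsequent restart.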

Continue reading

Using Spark Streaming to merge/upsert data into a Delta Lake with working code

This blog discusses how to read from a streaming source and merge/upsert the data into a Delta Lake. We will also optimize/cluster the data of the Delta table. In the end, we will show how to start a streaming pipeline with the previous target table as the source. Overall, the process works as follows: we read data from a streaming source and use the special function foreachBatch; a sketch of that pattern follows.
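A minimal sketch of the foreachBatch merge pattern, assuming the delta-spark package is installed; the rate source, the /tmp paths, and the id join key are illustrative stand-ins for whatever the post's working code uses.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("merge-demo").getOrCreate()

# One-time setup (assumed): create an empty target table keyed on "id".
(spark.range(0)
    .selectExpr("id", "CAST(NULL AS TIMESTAMP) AS updated_at")
    .write.format("delta").mode("ignore").save("/tmp/delta/target"))

def upsert_to_delta(micro_batch_df, batch_id):
    # Merge each micro-batch into the target: update matching ids, insert the rest.
    target = DeltaTable.forPath(spark, "/tmp/delta/target")
    (target.alias("t")
        .merge(micro_batch_df.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

# Placeholder streaming source; the post reads from a real source instead.
source_df = (
    spark.readStream.format("rate").option("rowsPerSecond", 5).load()
    .selectExpr("value AS id", "timestamp AS updated_at")
)

query = (
    source_df.writeStream
    .foreachBatch(upsert_to_delta)
    .option("checkpointLocation", "/tmp/checkpoints/merge")
    .start()
)
```

Because foreachBatch hands each micro-batch to ordinary batch code, the same DeltaTable handle can also run compaction via the Delta OPTIMIZE API between merges, and the finished target table can itself be read back as a stream for the next pipeline.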

Continue reading