checkpoint

How to write your first Spark application with Stream-Stream Joins, with working code

Have you been waiting to try Streaming but cannot take the plunge? In a single blog, we will teach you everything you need to know about Streaming Joins, and we will give you working code you can use for your next Streaming pipeline. The steps involved:

- Create a fake dataset at scale
- Set a baseline using traditional SQL
- Define temporary Streaming views
- Inner joins with optional watermarking
- Left joins with watermarking
- The cold start edge case: withEventTimeOrder
- Cleanup

What is Stream-Stream Join?
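In Spark Structured Streaming, a stream-stream join matches rows across two unbounded inputs, so Spark must buffer past rows as state; watermarks plus a time-range join condition bound how much state it keeps. As a preview of what the full post builds up to, here is a minimal PySpark sketch of a watermarked inner join. The rate sources, column names, and checkpoint path are hypothetical stand-ins for real streams such as Kafka topics.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream_stream_join_sketch").getOrCreate()

# Rate sources stand in for real streams (e.g., Kafka topics).
impressions = (
    spark.readStream.format("rate").option("rowsPerSecond", 10).load()
    .select(F.col("value").alias("ad_id"), F.col("timestamp").alias("impression_time"))
    .withWatermark("impression_time", "10 minutes")
)

clicks = (
    spark.readStream.format("rate").option("rowsPerSecond", 5).load()
    .select(F.col("value").alias("ad_id"), F.col("timestamp").alias("click_time"))
    .withWatermark("click_time", "20 minutes")
)

# The time-range condition is what lets Spark expire old join state.
joined = impressions.alias("i").join(
    clicks.alias("c"),
    F.expr(
        "i.ad_id = c.ad_id AND "
        "c.click_time BETWEEN i.impression_time "
        "AND i.impression_time + interval 1 hour"
    ),
    "inner",
)

query = (
    joined.writeStream.format("console")
    .option("checkpointLocation", "/tmp/checkpoints/join_sketch")  # hypothetical path
    .start()
)
query.awaitTermination()
```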

Continue reading

Dive Deep into Spark Streaming Checkpoint

From Beginner to Pro: A Comprehensive Guide to Understanding the Spark Streaming Checkpoint

Spark is a distributed computing framework that processes large datasets in parallel across a cluster of computers. When running a Spark job, it is not uncommon to encounter failures due to issues such as network or hardware faults, software bugs, or insufficient memory. One way to address these failures is to re-run the entire job from the beginning, which is time-consuming and inefficient.
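The remedy the post dives into is the checkpoint. As a minimal sketch, assuming hypothetical output and checkpoint paths: the checkpointLocation option is what lets a restarted query resume from its last committed progress instead of reprocessing everything from the beginning.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("checkpoint_sketch").getOrCreate()

stream = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

query = (
    stream.writeStream
    .format("parquet")
    .option("path", "/tmp/output/rate_stream")                     # hypothetical path
    # Offsets and sink commits are tracked here; on restart, the query
    # picks up from the last committed batch rather than from scratch.
    .option("checkpointLocation", "/tmp/checkpoints/rate_stream")  # hypothetical path
    .start()
)
query.awaitTermination()
```

If the driver dies mid-stream, restarting the same code with the same checkpointLocation resumes processing; deleting the checkpoint directory is what would force a full reprocess.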

Continue reading

How to upgrade your Spark Streaming application with a new checkpoint!

With working code.

Sometimes in life, we need to make breaking changes that require us to create a new checkpoint. Some example scenarios:

- You are making a code/application change that alters the processing logic
- A major Spark version upgrade, from Spark 2.x to Spark 3.x
- The previous deployment was wrong, and you want to reprocess from a certain point

There are plenty of scenarios where you want to control precisely which data (Kafka offsets) gets processed.
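A minimal sketch of that idea, assuming a hypothetical Kafka topic, broker, and checkpoint path: when a breaking change forces a fresh checkpoint, the startingOffsets option pins exactly which Kafka offsets the new deployment begins from (it only applies when the checkpoint directory is new or empty).

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("new_checkpoint_sketch").getOrCreate()

# Pin the starting point per topic-partition; -2 means earliest, -1 means latest.
starting_offsets = '{"orders": {"0": 42, "1": -2}}'  # hypothetical topic/offsets

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .option("startingOffsets", starting_offsets)
    .load()
)

query = (
    stream.writeStream
    .format("console")
    # Fresh checkpoint path: the old checkpoint is abandoned after the
    # breaking change, so startingOffsets takes effect on first start.
    .option("checkpointLocation", "/tmp/checkpoints/orders_v2")  # hypothetical path
    .start()
)
query.awaitTermination()
```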

Continue reading