How does back pressure property work in Spark Streaming?

I have a CustomReceiver that receives a single event (a String). The received event is used during the Spark application's runtime to read data from a NoSQL store and apply transformations. When the processing time for each batch was observed to be greater than the batch interval, I set this property:

spark.streaming.backpressure.enabled=true

After that I expected the CustomReceiver not to trigger and receive the event while a batch was taking longer than the batch window to process, but that didn't happen and a backlog of batches kept building up. Am I missing something here?
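For reference, a minimal sketch of how such settings are usually applied (the rate caps `spark.streaming.backpressure.initialRate` and `spark.streaming.receiver.maxRate` are additional knobs not mentioned above, and the values are illustrative):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("backpressure-demo")
  // Let the PID-based rate controller adjust the ingestion rate
  .set("spark.streaming.backpressure.enabled", "true")
  // Optional: cap the rate of the very first batch, before any
  // processing-time feedback exists (records per second, per receiver)
  .set("spark.streaming.backpressure.initialRate", "100")
  // Optional hard ceiling per receiver, independent of backpressure
  .set("spark.streaming.receiver.maxRate", "1000")

val ssc = new StreamingContext(conf, Seconds(10)) // 10s batch interval
```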

Janijania answered 25/1, 2017 at 0:19 Comment(4)
Have you read vanwilgenburg.wordpress.com/2015/10/06/…?Other
Does that mean the existing PIDEstimator wouldn't help with the above problem? And there isn't a way I can plug in my own estimator to stop receiving a new event while a batch is still processing?Janijania
You receive one event per batch?Advisement
Yes, and that event is used later at application runtime to fetch data from NoSQL.Janijania
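A hedged sketch of a receiver along the lines described in the comments above (the class and the event value are illustrative, not the asker's actual code). Backpressure throttles records as they pass through the receiver's `store(...)` calls, so a receiver that emits a single trigger event per batch gives the rate limiter almost nothing to slow down:

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Illustrative stand-in for the asker's CustomReceiver: it pushes one
// String trigger event at a time. Backpressure rate limits are applied
// to store() calls, so a one-event-per-batch receiver is effectively
// never throttled and batches keep queuing if processing falls behind.
class CustomReceiver extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  override def onStart(): Unit = {
    new Thread("custom-receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  override def onStop(): Unit = () // nothing to clean up in this sketch

  private def receive(): Unit = {
    while (!isStopped()) {
      store("trigger-event") // single event handed to Spark
      Thread.sleep(10000)    // roughly one event per batch interval
    }
  }
}
```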

Try checking this article and this one.
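If the defaults of the PID-based rate estimator mentioned in the comments don't suit the workload, its parameters can also be tuned through configuration; a hedged sketch of those settings (the values shown are illustrative, not recommendations):

```scala
import org.apache.spark.SparkConf

// Tuning knobs for the built-in PID rate estimator used by backpressure.
val conf = new SparkConf()
  .set("spark.streaming.backpressure.enabled", "true")
  .set("spark.streaming.backpressure.pid.proportional", "1.0") // weight of the current error
  .set("spark.streaming.backpressure.pid.integral", "0.2")     // weight of the accumulated error
  .set("spark.streaming.backpressure.pid.derived", "0.0")      // weight of the change in error
  .set("spark.streaming.backpressure.pid.minRate", "100")      // floor on the estimated rate (records/s)
```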

Crary answered 8/6, 2018 at 14:17 Comment(0)
