Elastic Scaling (Dynamic Allocation)

Dynamic Allocation in Spark Streaming makes streaming applications adaptive by scaling the number of executors up and down to match load variations. It actively manages resources (executors): it releases executors when the processing time is short compared to the batch interval (scale down), so resources are not wasted, and requests new executors when the processing time grows too long (scale up).

Note
It is a work in progress in Spark Streaming and should be available in Spark 2.0.

The motivation is to control the number of executors required to process input records when the input rate increases to the point where the processing time could exceed the batch interval.
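
The following Scala snippet is a simplified sketch of that scale-up/scale-down decision, not the actual Spark Streaming implementation; the ratio thresholds and the method and object names are illustrative assumptions only.

```scala
import scala.concurrent.duration._

// A hypothetical, simplified sketch of the scaling decision: compare the
// average batch processing time to the batch interval and decide whether
// to request or release executors.
object ScalingDecisionSketch {

  def decide(batchInterval: FiniteDuration, avgProcessingTime: FiniteDuration): String = {
    val ratio = avgProcessingTime.toMillis.toDouble / batchInterval.toMillis

    if (ratio > 0.9) {
      // Processing time is close to (or above) the batch interval:
      // ask the cluster manager for more executors (scale up).
      "scale up"
    } else if (ratio < 0.3) {
      // Processing finishes well within the batch interval:
      // release an executor so resources are not wasted (scale down).
      "scale down"
    } else {
      // Processing time is comfortably within bounds: do nothing.
      "no change"
    }
  }
}
```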

Configuration

  • spark.streaming.dynamicAllocation.enabled controls whether dynamic allocation is enabled (true) or not (false) — see the sketch below.

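A minimal sketch of setting the property when creating a StreamingContext; the application name and the 1-second batch interval are illustrative assumptions only.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("ElasticScalingDemo")                         // hypothetical app name
  .set("spark.streaming.dynamicAllocation.enabled", "true") // enable dynamic allocation

// 1-second batch interval, chosen for illustration
val ssc = new StreamingContext(conf, Seconds(1))
```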
results matching ""

    No results matching ""