# PairDStreamFunctions

`PairDStreamFunctions` is a collection of operators available on DStreams of (key, value) pairs (through an implicit conversion).

Operator | Description |
---|---|
`reduceByKeyAndWindow` | Reduces the values for each key over a sliding window; the `invReduceFunc` variants create a `ReducedWindowedDStream` (see below) |

## reduceByKeyAndWindow Operators

```scala
reduceByKeyAndWindow(
  reduceFunc: (V, V) => V,
  windowDuration: Duration): DStream[(K, V)]

reduceByKeyAndWindow(
  reduceFunc: (V, V) => V,
  windowDuration: Duration,
  slideDuration: Duration): DStream[(K, V)]

reduceByKeyAndWindow(
  reduceFunc: (V, V) => V,
  windowDuration: Duration,
  slideDuration: Duration,
  numPartitions: Int): DStream[(K, V)]

reduceByKeyAndWindow(
  reduceFunc: (V, V) => V,
  windowDuration: Duration,
  slideDuration: Duration,
  partitioner: Partitioner): DStream[(K, V)]

reduceByKeyAndWindow(
  reduceFunc: (V, V) => V,
  invReduceFunc: (V, V) => V,
  windowDuration: Duration,
  slideDuration: Duration = self.slideDuration,
  numPartitions: Int = ssc.sc.defaultParallelism,
  filterFunc: ((K, V)) => Boolean = null): DStream[(K, V)]

reduceByKeyAndWindow(
  reduceFunc: (V, V) => V,
  invReduceFunc: (V, V) => V,
  windowDuration: Duration,
  slideDuration: Duration,
  partitioner: Partitioner,
  filterFunc: ((K, V)) => Boolean): DStream[(K, V)]
```

`reduceByKeyAndWindow` (in the variants that take `invReduceFunc`) returns a `ReducedWindowedDStream` with the input `reduceFunc`, `invReduceFunc` and `filterFunc` functions cleaned up.
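
As a quick illustration, here is a minimal, self-contained sketch of the two most common overloads. It is not taken from the Spark sources; the socket source, host and port, durations and checkpoint directory are made-up values for the example. Note that the `invReduceFunc`-based overloads require checkpointing to be enabled, and both `windowDuration` and `slideDuration` must be multiples of the batch interval.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WindowedWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("WindowedWordCount")
    val ssc = new StreamingContext(conf, Seconds(10)) // batch interval
    ssc.checkpoint("_checkpoints") // required by the invReduceFunc-based overloads

    // (word, 1) pairs -- the implicit conversion to PairDStreamFunctions applies here
    val wordCounts = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))

    // Basic overload: re-reduces every batch in the 30-second window on each 10-second slide.
    val full = wordCounts.reduceByKeyAndWindow(
      (a: Int, b: Int) => a + b, Seconds(30), Seconds(10))

    // Incremental overload: reduces only the batches entering the window and
    // "inverse-reduces" the batches leaving it (this is the ReducedWindowedDStream path).
    val incremental = wordCounts.reduceByKeyAndWindow(
      (a: Int, b: Int) => a + b,
      (a: Int, b: Int) => a - b,
      Seconds(30), Seconds(10))

    full.print()
    incremental.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The incremental form pays off for long windows with short slides, since each slide only touches the batches that enter and leave the window instead of recomputing the whole window.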
Tip | Enable DEBUG logging level for `org.apache.spark.streaming.dstream.ReducedWindowedDStream` to see the window, slide and zero times as well as the current and previous windows in the logs. |
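
For example, assuming the log4j 1.x-style `conf/log4j.properties` file that Spark 2.x and earlier use (adjust for your own logging setup), a single line enables the logger:

```
log4j.logger.org.apache.spark.streaming.dstream.ReducedWindowedDStream=DEBUG
```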