A FetchFailedException may be thrown while a task runs when ShuffleBlockFetcherIterator fails to fetch shuffle blocks.
FetchFailedException carries the BlockManagerId of the executor the shuffle blocks were requested from, the shuffleId, mapId and reduceId of the failed fetch, and an error message (with an optional cause).

When a FetchFailedException is reported, TaskRunner catches it and notifies ExecutorBackend that the task has failed (with the task's id and TaskState.FAILED task state).
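The failure path can be illustrated with a small, self-contained sketch. The names below (FetchFailedException, ExecutorBackend, TaskRunner-like runTask, TaskState.FAILED) mirror Spark's internals, but the definitions are simplified stand-ins written for this example, not Spark's actual classes.

```scala
// Simplified, self-contained model of the failure path described above.
object FetchFailedFlowSketch extends App {

  object TaskState extends Enumeration {
    val RUNNING, FAILED, FINISHED = Value
  }

  // Stand-in for org.apache.spark.shuffle.FetchFailedException: it records which
  // block manager, shuffle, map output and reduce partition the failed fetch was for.
  case class FetchFailedException(
      bmAddress: String, // BlockManagerId of the executor the blocks were requested from
      shuffleId: Int,
      mapId: Int,
      reduceId: Int,
      message: String) extends Exception(message)

  // Stand-in for ExecutorBackend, which receives task status updates.
  trait ExecutorBackend {
    def statusUpdate(taskId: Long, state: TaskState.Value, reason: String): Unit
  }

  // Stand-in for TaskRunner.run: the fetch failure is caught (not rethrown) and
  // reported to the backend with the task id and the FAILED task state, so the
  // scheduler can re-run the map stage that produced the missing shuffle blocks.
  def runTask(taskId: Long, backend: ExecutorBackend)(body: => Unit): Unit =
    try body catch {
      case ffe: FetchFailedException =>
        backend.statusUpdate(taskId, TaskState.FAILED, ffe.getMessage)
    }

  val backend = new ExecutorBackend {
    def statusUpdate(taskId: Long, state: TaskState.Value, reason: String): Unit =
      println(s"task $taskId -> $state ($reason)")
  }

  runTask(7L, backend) {
    // Simulates ShuffleBlockFetcherIterator failing to fetch a shuffle block.
    throw FetchFailedException("BlockManagerId(exec-1, host-1, 7337)",
      shuffleId = 0, mapId = 3, reduceId = 5,
      message = "Failed to fetch shuffle block shuffle_0_3_5")
  }
}
```

In Spark itself the reported failure reason tells the scheduler which map output is missing, which is what triggers re-execution of the corresponding map stage.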
The root cause is usually that the executor hosting the shuffle blocks was lost, typically because:

1. An OutOfMemoryError was thrown (aka OOMed), or some other unhandled exception crashed the executor.
2. The cluster manager that manages the workers with the executors of your Spark application, e.g. YARN, enforces the container memory limits and eventually decides to kill the executor due to excessive memory usage.
A solution is usually to tune the memory of your Spark application.
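As a starting point, the settings below are the ones usually adjusted first, e.g. pasted into spark-shell or set on a SparkConf before creating the SparkContext. The configuration keys are standard Spark properties (spark.executor.memoryOverhead requires Spark 2.3+); the values are purely illustrative and have to be sized for your workload and cluster.

```scala
import org.apache.spark.SparkConf

// Illustrative starting values only; tune them for your workload and cluster.
val conf = new SparkConf()
  .setAppName("shuffle-heavy-job")               // hypothetical application name
  .set("spark.executor.memory", "8g")            // JVM heap per executor
  .set("spark.executor.memoryOverhead", "2g")    // off-heap headroom counted against the container limit
  .set("spark.memory.fraction", "0.6")           // share of the heap for execution and storage memory
```

Raising spark.executor.memoryOverhead is often what stops the cluster manager from killing executors for exceeding the container limit, since that limit covers heap plus off-heap usage.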
CAUTION: FIXME Image with the call to ExecutorBackend.