Question: When a node in a Spark cluster crashes, how does Spark ensure the job still completes?
Select one:
a. Spark keeps track of the sequence of transformations that have been applied to the data, and then replays the transformations necessary to reconstruct the data lost on the crashed node.
b. Spark writes the result of every RDD transformation to disk. When a crash occurs, Spark loads the last stored RDD and performs only the final transformation to bring the machine back to its current state.
c
Sparks keeps a log of all updates and then replays the updates from the log to get back to the state before the node crash and then completes the rest of the job.
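Option (a) describes Spark's actual mechanism: RDDs record their lineage (the chain of transformations that produced them), and lost partitions are recomputed by replaying that chain rather than restored from disk checkpoints or an update log. The idea can be sketched in plain Python (this is an illustrative toy, not Spark's real API; the class and method names here are invented for the sketch):

```python
# Minimal sketch of lineage-based recovery: each "RDD" remembers its parent
# and the transformation that produced it, so lost data can be recomputed
# by replaying the chain from the original source.

class LineageRDD:
    def __init__(self, data=None, parent=None, transform=None):
        self.parent = parent        # upstream RDD in the lineage chain
        self.transform = transform  # function applied to the parent's data
        self._data = data           # None until computed (or after a "crash")

    def map(self, fn):
        # Record the transformation lazily; nothing is computed yet.
        return LineageRDD(parent=self, transform=lambda xs: [fn(x) for x in xs])

    def filter(self, pred):
        return LineageRDD(parent=self, transform=lambda xs: [x for x in xs if pred(x)])

    def compute(self):
        # Replay the lineage chain from the source if the data is missing.
        if self._data is None:
            self._data = self.transform(self.parent.compute())
        return self._data

source = LineageRDD(data=[1, 2, 3, 4, 5])
result = source.map(lambda x: x * 10).filter(lambda x: x > 20)
print(result.compute())   # [30, 40, 50]

# Simulate a node crash: the computed data is lost...
result._data = None
# ...but the lineage is intact, so the result is simply recomputed.
print(result.compute())   # [30, 40, 50]
```

In real Spark you can inspect this recorded chain with `rdd.toDebugString()`; recomputation of lost partitions happens automatically on failure.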
