We have a long-running Spark application in a MapR cluster that reads data from MapR Streams, performs some processing, and writes the results to MapR-DB. We want to send alerts in the form of emails in case the Spark application fails or is re-run. Is there any way in MapR to send such alerts in two cases:
1. Spark application fails.
2. Spark application is processing data slowly.
For the second point, we are thinking of using the approach discussed in Monitoring Consumers. Is there any other way in Spark to monitor consumers?
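The consumer-monitoring approach boils down to measuring consumer lag: the gap between the latest offset in each partition and the offset the consumer group has committed. A minimal sketch of that check is below; the offsets are hard-coded for illustration (in practice they would come from the Kafka consumer API, which MapR Streams supports), and the threshold value is an assumption, not a MapR setting.

```python
def compute_lag(end_offsets, committed_offsets):
    """Per-partition lag: latest offset minus last committed offset.

    Partitions with no committed offset are treated as fully behind.
    """
    return {p: end_offsets[p] - committed_offsets.get(p, 0)
            for p in end_offsets}

def is_falling_behind(end_offsets, committed_offsets, threshold):
    """True if any partition's lag exceeds the given threshold."""
    return any(lag > threshold
               for lag in compute_lag(end_offsets, committed_offsets).values())

if __name__ == "__main__":
    # Illustrative numbers only; a real check would fetch these
    # from the consumer's end_offsets() and committed() calls.
    latest = {0: 1200, 1: 900}
    committed = {0: 1150, 1: 300}
    print(compute_lag(latest, committed))             # {0: 50, 1: 600}
    print(is_falling_behind(latest, committed, 500))  # True
```

Running such a check on a schedule (e.g. from cron) and emailing when `is_falling_behind` returns true would cover the "processing slowly" case without touching the Spark application itself.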
What are the ways in which we can get an alert for case 1, i.e. when the application crashes or is re-run?
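One option for case 1, assuming the application runs on YARN, is to poll the ResourceManager REST API (`/ws/v1/cluster/apps/{appid}`) for the application's state and email when it ends unsuccessfully. A minimal sketch; the RM address, sender/recipient addresses, and SMTP host are placeholders you would replace for your cluster:

```python
import json
import smtplib
import urllib.request
from email.message import EmailMessage

RM_URL = "http://resourcemanager:8088"  # placeholder RM address

def app_report(app_id):
    """Fetch the application report from the YARN RM REST API."""
    with urllib.request.urlopen(f"{RM_URL}/ws/v1/cluster/apps/{app_id}") as r:
        return json.load(r)["app"]

def needs_alert(app):
    """Alert when the app ended in anything other than SUCCEEDED."""
    if app.get("state") in ("FAILED", "KILLED"):
        return True
    return (app.get("state") == "FINISHED"
            and app.get("finalStatus") != "SUCCEEDED")

def send_alert(app):
    """Email the raw app report; SMTP details are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = f"Spark app {app.get('id')} ended: {app.get('finalStatus')}"
    msg["From"] = "alerts@example.com"
    msg["To"] = "oncall@example.com"
    msg.set_content(json.dumps(app, indent=2))
    with smtplib.SMTP("localhost") as s:
        s.send_message(msg)
```

Another option, if you launch via a script, is simply to check the exit code of `spark-submit` and mail on non-zero; detecting a re-run could be done by comparing the current YARN application id against the last one seen.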