Spark study notes: SparkListener
Spark exposes the SparkListener API, which lets you monitor the current state of a running Spark application as it executes. Reference: "SparkListener usage and custom event handling" (SparkListener监听使用方式及自定义的事件处理动作).
Writing MySparkAppListener
```scala
package com.bigdata.spark

import org.apache.spark.internal.Logging
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd, SparkListenerApplicationStart}

class MySparkAppListener extends SparkListener with Logging {

  // fired when the application starts
  override def onApplicationStart(applicationStart: SparkListenerApplicationStart): Unit = {
    val appId = applicationStart.appId
    logInfo("spark job start => " + appId.get)
  }

  // fired when the application ends
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = {
    logInfo("spark job end => " + applicationEnd.time)
  }
}
```
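Besides registering the listener through configuration, it can also be attached programmatically via `SparkContext.addSparkListener`. A minimal sketch (not from the original article; assumes a local `SparkSession` and the `MySparkAppListener` class above):

```scala
import org.apache.spark.sql.SparkSession

object ListenerDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local")
      .appName("listener demo")
      .getOrCreate()

    // attach the listener to the underlying SparkContext at runtime;
    // unlike spark.extraListeners (which needs a zero-arg constructor),
    // this lets you pass constructor arguments to the listener
    spark.sparkContext.addSparkListener(new MySparkAppListener)

    spark.range(10).count()
    spark.stop()
  }
}
```

Note that a listener added this way is registered after the application has already started, so it may miss the `onApplicationStart` event that `spark.extraListeners` would deliver.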
Setting the spark.extraListeners parameter
```scala
import org.apache.spark.sql.SparkSession

val sparkSession = SparkSession.builder()
  .master("local")
  .config("spark.extraListeners", "com.bigdata.spark.MySparkAppListener")
  .appName("spark session example")
  .getOrCreate()
```
After running the job, the corresponding messages show up in the log:
```
21/12/27 23:13:46 INFO MySparkAppListener: spark job start => local-1640618026361
21/12/27 23:13:48 INFO MySparkAppListener: spark job end => 1640618028287
```
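When the application is launched with spark-submit instead of being configured in code, the same setting can be passed on the command line. A sketch; the main class `com.bigdata.spark.MyApp` and jar name `my-app.jar` are placeholders:

```shell
spark-submit \
  --class com.bigdata.spark.MyApp \
  --conf spark.extraListeners=com.bigdata.spark.MySparkAppListener \
  my-app.jar
```

The listener class must be on the driver's classpath (e.g. bundled into the application jar) and must have a zero-argument constructor so Spark can instantiate it.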
SparkListener also provides callbacks for many other events:
```scala
abstract class SparkListener extends SparkListenerInterface {
  // fired when a stage completes
  override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = { }

  // fired when a stage is submitted
  override def onStageSubmitted(stageSubmitted: SparkListenerStageSubmitted): Unit = { }

  // fired when a task starts
  override def onTaskStart(taskStart: SparkListenerTaskStart): Unit = { }

  // fired when a task begins fetching its result
  override def onTaskGettingResult(taskGettingResult: SparkListenerTaskGettingResult): Unit = { }

  // fired when a task ends
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = { }

  // fired when a job starts
  override def onJobStart(jobStart: SparkListenerJobStart): Unit = { }

  // fired when a job ends
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = { }

  // fired when environment properties are updated
  override def onEnvironmentUpdate(environmentUpdate: SparkListenerEnvironmentUpdate): Unit = { }

  // fired when a block manager is added
  override def onBlockManagerAdded(blockManagerAdded: SparkListenerBlockManagerAdded): Unit = { }

  // fired when a block manager is removed
  override def onBlockManagerRemoved(
      blockManagerRemoved: SparkListenerBlockManagerRemoved): Unit = { }

  // fired when an RDD is unpersisted
  override def onUnpersistRDD(unpersistRDD: SparkListenerUnpersistRDD): Unit = { }

  // fired when the application starts
  override def onApplicationStart(applicationStart: SparkListenerApplicationStart): Unit = { }

  // fired when the application ends
  // (the remaining callbacks are likewise triggered at the point their names
  // describe, so they are not annotated one by one)
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = { }

  override def onExecutorMetricsUpdate(
      executorMetricsUpdate: SparkListenerExecutorMetricsUpdate): Unit = { }

  override def onExecutorAdded(executorAdded: SparkListenerExecutorAdded): Unit = { }

  override def onExecutorRemoved(executorRemoved: SparkListenerExecutorRemoved): Unit = { }

  override def onExecutorBlacklisted(
      executorBlacklisted: SparkListenerExecutorBlacklisted): Unit = { }

  override def onExecutorUnblacklisted(
      executorUnblacklisted: SparkListenerExecutorUnblacklisted): Unit = { }

  override def onNodeBlacklisted(nodeBlacklisted: SparkListenerNodeBlacklisted): Unit = { }

  override def onNodeUnblacklisted(nodeUnblacklisted: SparkListenerNodeUnblacklisted): Unit = { }

  override def onBlockUpdated(blockUpdated: SparkListenerBlockUpdated): Unit = { }

  override def onOtherEvent(event: SparkListenerEvent): Unit = { }
}
```
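As an illustration of the task- and job-level callbacks above, here is a hypothetical listener (not from the original article) that sums the wall-clock time of all tasks and reports the total when each job finishes; `taskEnd.taskInfo.duration` is the task's elapsed time in milliseconds:

```scala
package com.bigdata.spark

import java.util.concurrent.atomic.AtomicLong

import org.apache.spark.internal.Logging
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerTaskEnd}

// example listener: accumulates per-task run time and logs the
// cumulative total whenever a job completes
class TaskTimeListener extends SparkListener with Logging {

  private val totalTaskTimeMs = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskInfo.duration = task finish time - launch time, in milliseconds
    totalTaskTimeMs.addAndGet(taskEnd.taskInfo.duration)
  }

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    logInfo(s"job ${jobEnd.jobId} done, cumulative task time = ${totalTaskTimeMs.get()} ms")
  }
}
```

It can be registered exactly like `MySparkAppListener`, either through `spark.extraListeners` or `sparkContext.addSparkListener`.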
This article is published only on cnblogs (博客园) and tonglin0325's blog. Author: tonglin0325. When reposting, please cite the original link: https://www.cnblogs.com/tonglin0325/p/6817137.html