Configuring Logback Logging and Email Alerts for Flink
Flink officially recommends Logback over the default Log4j as the logging framework. We had been running on Log4j and recently switched to Logback; the setup is slightly involved, so this post walks through the configuration.
First, add the Logback dependencies to the project POM: logback-core, logback-classic, and log4j-over-slf4j. The last one is needed because Flink depends on Hadoop, and Hadoop writes its logs directly through Log4j, so log4j-over-slf4j bridges those calls to SLF4J and, in turn, Logback:
<properties>
    <logback.version>1.2.3</logback.version>
    <log4j-over-slf4j.version>1.7.25</log4j-over-slf4j.version>
</properties>

<dependencies>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>log4j-over-slf4j</artifactId>
        <version>${log4j-over-slf4j.version}</version>
    </dependency>
</dependencies>
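To make the bridging concrete, here is a minimal sketch (the class name is made up for illustration): code that still calls the Log4j 1.x API, as Hadoop does internally, is silently rerouted to SLF4J by log4j-over-slf4j and therefore ends up in Logback.

import org.apache.log4j.Logger;

public class BridgeCheck {

    // This looks like a plain Log4j 1.x logger, but with log4j-over-slf4j on the
    // classpath the org.apache.log4j classes are re-implemented on top of SLF4J,
    // so the event is actually handled by Logback.
    private static final Logger LOG = Logger.getLogger(BridgeCheck.class);

    public static void main(String[] args) {
        LOG.info("Written through the Log4j API, rendered by Logback");
    }
}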
Then, in every Flink dependency, exclude the transitive log4j dependency as well as the slf4j-log4j12 binding (which routes SLF4J calls to Log4j). Otherwise Flink will keep logging through Log4j and the configuration below will have no effect:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
    <exclusions>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_${scala.bin.version}</artifactId>
    <version>${flink.version}</version>
    <exclusions>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>
On the cluster side, delete the log4j and slf4j-log4j12 JARs from the lib directory of the Flink installation, and drop in the logback-core, logback-classic, and log4j-over-slf4j JARs instead.
Next, edit the default Logback configuration files under Flink's conf directory. There are three of them: logback.xml covers the JobManager and TaskManager logs and applies to both standalone and YARN deployments; logback-console.xml covers the Flink command-line client and only applies in local mode; logback-yarn.xml is used in YARN session mode.
We edit logback.xml and switch it to a daily rolling file appender (RollingFileAppender), so the log doesn't keep growing inside a single file. Here ${log.file} is the log path that Flink's startup scripts pass in as a system property. The configuration looks like this:
<appender name="file" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${log.file}</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${log.file}.%d{yyyy-MM-dd}.%i</fileNamePattern>
<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
<maxFileSize>64MB</maxFileSize>
</timeBasedFileNamingAndTriggeringPolicy>
<maxHistory>10</maxHistory>
</rollingPolicy>
<encoder>
<charset>UTF-8</charset>
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{60} %X{sourceThread} - %msg%n</pattern>
</encoder>
</appender>
We can also use Logback's SMTPAppender to mail log events at a chosen level. If you don't run a dedicated log monitoring stack (such as ELK), this makes a decent substitute:
<appender name="email" class="ch.qos.logback.classic.net.SMTPAppender">
<smtpHost>smtp.163.com</smtpHost>
<smtpPort>25</smtpPort>
<username>some_username</username>
<password>some_password</password>
<to>[email protected]</to>
<from>[email protected]</from>
<subject>Flink Error Log</subject>
<asynchronousSending>false</asynchronousSending>
<SSL>false</SSL>
<STARTTLS>true</STARTTLS>
<layout class="ch.qos.logback.classic.html.HTMLLayout">
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{60} %X{sourceThread} - %msg%n</pattern>
</layout>
<filter class="ch.qos.logback.classic.filter.LevelFilter">
<level>ERROR</level>
</filter>
<cyclicBufferTracker class="ch.qos.logback.core.spi.CyclicBufferTracker">
<bufferSize>50</bufferSize>
</cyclicBufferTracker>
</appender>
You also need to place the mail.jar that SMTPAppender depends on for sending email into the lib directory. Finally, don't forget to reference the new appenders under the <root> tag:
<root level="INFO">
    <appender-ref ref="file"/>
    <appender-ref ref="email"/>
</root>
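To check the whole chain end to end, a quick way is to log an ERROR from inside a running job and confirm that it shows up both in the rolled log file and in the mailbox. Below is a minimal sketch, with the job and class names made up for illustration; note that the error is logged inside an operator, so it lands in the TaskManager's log and the alert mail is sent from that JVM.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LogbackSmokeTest {

    private static final Logger LOG = LoggerFactory.getLogger(LogbackSmokeTest.class);

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("a", "b", "c")
           .map((MapFunction<String, String>) value -> {
               // An ERROR event: it should appear in ${log.file} and, via the
               // "email" appender, trigger an alert mail.
               LOG.error("Test error from element {}", value);
               return value;
           })
           .print();

        env.execute("logback-smoke-test");
    }
}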