
Spark Streaming Real-Time Stream Processing Project Notes: Collecting Logs Generated by Log4j with Flume

Architecture topology diagram

(Architecture topology diagram; image not preserved in this copy.)

Looking up the Flume and Log4j integration in the official documentation

(Screenshots of the Log4j appender section of the official Flume documentation; images not preserved in this copy.)

POM dependencies

<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>

<dependency>
  <groupId>org.apache.flume.flume-ng-clients</groupId>
  <artifactId>flume-ng-log4jappender</artifactId>
  <version>1.6.0</version>
</dependency>
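
The appender version here (1.6.0) is normally kept in line with the Flume version installed on the server. The flume-ng-log4jappender artifact should also bring flume-ng-sdk and Avro onto the application's runtime classpath transitively; a quick way to confirm what actually lands there is the standard Maven dependency report:

mvn dependency:tree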
           

Application

import org.apache.log4j.Logger;

/**
 * Emits one log line per second; the log4j.properties below routes each
 * event both to the console and to the Flume agent via the Log4jAppender.
 */
public class LoggerGenerator {

    private static Logger logger = Logger.getLogger(LoggerGenerator.class.getName());

    public static void main(String[] args) throws InterruptedException {

        int index = 0;
        while (true) {
            Thread.sleep(1000);
            logger.info("value : " + index++);
        }
    }
}
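
A quick way to run the generator during development (a sketch; it assumes LoggerGenerator sits in the default package and that the log4j.properties shown below is on the classpath, e.g. under src/main/resources):

mvn compile exec:java -Dexec.mainClass=LoggerGenerator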
           

log4j.properties

log4j.rootLogger = INFO,stdout,flume

log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = %d{yyyy-MM-dd HH:mm:ss,SSS} [%t] [%c] [%p] -%m%n

log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = hadoop
log4j.appender.flume.Port = 44444
log4j.appender.flume.UnsafeMode = true
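
With this ConversionPattern, each tick of the generator should print a console line along the lines of the following (timestamp illustrative); UnsafeMode = true keeps the application running even when the Flume agent at hadoop:44444 is not yet reachable:

2024-01-01 12:00:00,000 [main] [LoggerGenerator] [INFO] -value : 0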
           

Start the Flume agent

flume-ng agent --name a1 --conf $FLUME_HOME/conf --conf-file $FLUME_HOME/conf/avro-memory-logger-demo.conf -Dflume.root.logger=INFO,console
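
The avro-memory-logger-demo.conf file itself is not reproduced in these notes; below is a minimal sketch of what it presumably contains, matching the appender settings above: an Avro source on hadoop:44444, a memory channel, and a logger sink that prints the received events to the agent's console.

# assumed contents of avro-memory-logger-demo.conf
a1.sources = avro-source
a1.channels = memory-channel
a1.sinks = logger-sink

# Avro source listening where the Log4jAppender sends events
a1.sources.avro-source.type = avro
a1.sources.avro-source.bind = hadoop
a1.sources.avro-source.port = 44444

# buffer events in memory
a1.channels.memory-channel.type = memory

# logger sink prints each received event in the agent's console output
a1.sinks.logger-sink.type = logger

# wire source and sink to the channel
a1.sources.avro-source.channels = memory-channel
a1.sinks.logger-sink.channel = memory-channel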
           
