LOG4J has been a very popular logging library in the Java world for years. LOG4J2 is even better. In August 2015 the Log4j development team officially announced end of life for Log4j 1.x and urged users to start using Log4j 2. While upgrading from log4j to log4j2 is easy and can be done without changing existing code, real-life applications have a lot of dependencies which still come pre-configured to use the old log4j, so switching to the new library is not always easy. Here we discuss how to start using log4j2 on Spark. Spark 2.1 still ships with the old log4j binding for the slf4j library, and though switching is possible, it is not documented and requires manual steps on an installed Spark system.
The following steps are required to switch Spark to log4j2:

1. Download the log4j2 jars and unpack them to a location accessible by Spark on each Spark node
wget http://apache.mirrors.lucidnetworks.net/logging/log4j/2.8.2/apache-log4j-2.8.2-bin.tar.gz
tar xzf apache-log4j-2.8.2-bin.tar.gz
cd apache-log4j-2.8.2-bin
mkdir -p /usr/local/spark/extra_jars
for f in log4j-1.2-api-2.8.2.jar log4j-api-2.8.2.jar log4j-api-scala_2.11-2.8.2.jar \
         log4j-core-2.8.2.jar log4j-slf4j-impl-2.8.2.jar
do
  cp $f /usr/local/spark/extra_jars/
done
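Before moving on it is worth confirming that every jar actually landed in the target directory. The fragment below is a self-contained sketch of such a presence check: it simulates the extra_jars directory with a temporary directory (so it can be run anywhere, without a Spark installation) and then verifies each expected jar is present.

```shell
# Simulate the extra_jars directory from step 1 (stand-in for the real path)
extra=$(mktemp -d)
jars="log4j-1.2-api-2.8.2.jar log4j-api-2.8.2.jar log4j-api-scala_2.11-2.8.2.jar log4j-core-2.8.2.jar log4j-slf4j-impl-2.8.2.jar"
for f in $jars; do touch "$extra/$f"; done   # stand-in for the cp loop above

# Count jars that failed to arrive; on a real node replace $extra with
# /usr/local/spark/extra_jars and drop the touch loop
missing=0
for f in $jars; do [ -f "$extra/$f" ] || missing=$((missing+1)); done
echo "missing jars: $missing"
```

On a correctly prepared node the check should report zero missing jars.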
2. Add the new jars to the SPARK_CLASSPATH variable using the Spark configuration script
vi /usr/local/spark/conf/spark-env.sh
-----
SPARK_CLASSPATH=/usr/local/spark/extra_jars/log4j-1.2-api-2.8.2.jar
SPARK_CLASSPATH=/usr/local/spark/extra_jars/log4j-api-2.8.2.jar:$SPARK_CLASSPATH
SPARK_CLASSPATH=/usr/local/spark/extra_jars/log4j-api-scala_2.11-2.8.2.jar:$SPARK_CLASSPATH
SPARK_CLASSPATH=/usr/local/spark/extra_jars/log4j-core-2.8.2.jar:$SPARK_CLASSPATH
SPARK_CLASSPATH=/usr/local/spark/extra_jars/log4j-slf4j-impl-2.8.2.jar:$SPARK_CLASSPATH
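To see what the chain of prepends in spark-env.sh actually builds, the fragment below assembles the same SPARK_CLASSPATH and prints it one entry per line. It can be run anywhere; the /usr/local/spark prefix is only the path assumed in this article.

```shell
# Assemble SPARK_CLASSPATH the same way spark-env.sh does (path is an assumption)
DIR=/usr/local/spark/extra_jars
SPARK_CLASSPATH=$DIR/log4j-1.2-api-2.8.2.jar
SPARK_CLASSPATH=$DIR/log4j-api-2.8.2.jar:$SPARK_CLASSPATH
SPARK_CLASSPATH=$DIR/log4j-api-scala_2.11-2.8.2.jar:$SPARK_CLASSPATH
SPARK_CLASSPATH=$DIR/log4j-core-2.8.2.jar:$SPARK_CLASSPATH
SPARK_CLASSPATH=$DIR/log4j-slf4j-impl-2.8.2.jar:$SPARK_CLASSPATH

# Print one classpath entry per line to confirm all five jars are wired in
echo "$SPARK_CLASSPATH" | tr ':' '\n'
```

Note that each assignment prepends, so the last jar added (log4j-slf4j-impl) ends up first on the classpath.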
3. Remove the old log4j jars from the standard Spark classpath
cd /usr/local/spark/jars
mv log4j-1.2.17.jar log4j-1.2.17.jar.bak
mv slf4j-log4j12-1.7.16.jar slf4j-log4j12-1.7.16.jar.bak
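A quick way to convince yourself the old binding is really gone is to list any remaining log4j 1.x jars while ignoring the .bak backups. The sketch below is self-contained: it simulates the jars directory in a temporary location, applies the renames from step 3, and checks the result (on a real node you would run the same ls/grep against /usr/local/spark/jars).

```shell
# Simulate the Spark jars directory (file names taken from step 3)
jars=$(mktemp -d)
touch "$jars/log4j-1.2.17.jar" "$jars/slf4j-log4j12-1.7.16.jar" "$jars/spark-core_2.11-2.1.0.jar"

# Apply the renames from step 3
for f in log4j-1.2.17.jar slf4j-log4j12-1.7.16.jar; do
  mv "$jars/$f" "$jars/$f.bak"
done

# List any *active* jar that still binds slf4j to log4j 1.x; .bak files do not
# match because the patterns are anchored on the .jar suffix
ls "$jars" | grep -E 'log4j-1\.2\.17\.jar$|slf4j-log4j12.*\.jar$' \
  || echo "old log4j binding removed"
```

An empty grep result (and hence the fallback message) means only log4j2 jars remain active on the default classpath.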
4. Restart Spark services.
That’s it! Now Spark logs using the log4j2 library.
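One caveat: log4j2 does not read Spark’s old conf/log4j.properties, and without a configuration of its own it falls back to ERROR-level console logging only. A minimal log4j2.properties sketch is shown below; the pattern and levels are just an example, and it needs to be placed somewhere on the Spark classpath (for instance in /usr/local/spark/conf, assuming that directory is on the classpath of your installation).

```properties
# Minimal log4j2.properties example (levels and pattern are assumptions,
# adjust to taste)
status = warn

appender.console.type = Console
appender.console.name = console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

rootLogger.level = info
rootLogger.appenderRef.console.ref = console
```

With this in place Spark’s own log output should look much like before, but now flows through the log4j2 core.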