Thursday, 5 February 2009

Compression Added

I received a code contribution for the TimeAndSizeRollingAppender from Eduardo Simioni, providing two new features:
  1. Backup log files can be compressed.
  2. The scavenger thread that deletes older files can be configured not to run, thereby conserving resources whilst allowing a virtually unlimited number of backups.
I've extended the contribution to support both GZIP and ZIP via configuration. Obviously you can use only one compression algorithm at a time. New source and binaries available in the usual place.
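
For illustration, here's a rough sketch of what a log4j.properties configuration for the appender might look like. The appender class name, MaxRollFileCount and ScavengeInterval appear elsewhere on this page; the compression property name and value shown here are assumptions for the example, so check the Javadoc for the exact names and accepted values:

    # Sketch only -- property names other than MaxRollFileCount and
    # ScavengeInterval are illustrative; see the Javadoc for the real ones.
    log4j.rootLogger=INFO, ROLL
    log4j.appender.ROLL=org.apache.log4j.appender.TimeAndSizeRollingAppender
    log4j.appender.ROLL.File=logs/app.log
    # Keep at most 10 backup files; the scavenger deletes any extras
    log4j.appender.ROLL.MaxRollFileCount=10
    # Hypothetical compression property; one algorithm (GZIP or ZIP) at a time
    log4j.appender.ROLL.CompressionAlgorithm=GZIP
    log4j.appender.ROLL.layout=org.apache.log4j.PatternLayout
    log4j.appender.ROLL.layout.ConversionPattern=%d %-5p %c - %m%n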

2 comments:

Anonymous said...

Hi Simon,
I think your Appender is very interesting; in my case I need to delete all but a fixed number of daily rolled logs.

I have some simple questions about the download:

1) do I need to merge your jar (log4j-rolling-appender-20090204-1437.jar) with the original log4j jar (log4j-1.2.15.jar)?

2) do I need to do anything else for the integration? I've simply added the MaxRollFileCount and ScavengeInterval=5 properties ... my code refers to the Logger class and gets the root Logger.

Thanks for your help and best regards,
Alessandro

Simon said...

The TimeAndSizeRollingAppender will keep at most MaxRollFileCount backup files and delete any extras. See the Javadoc at http://www.simonsite.org.uk/javadoc/org/apache/log4j/appender/TimeAndSizeRollingAppender.html for a sample configuration and more details.

You can either place both JARs in your CLASSPATH or merge the contents of the two JARs; it's up to you. I think it's preferable to simply add both JARs to the CLASSPATH.
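
For example, adding both JARs to the classpath at launch might look like this (a sketch assuming a Unix shell and a hypothetical main class com.example.Main):

    java -cp log4j-1.2.15.jar:log4j-rolling-appender-20090204-1437.jar:. com.example.Main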

Note that ScavengeInterval is set in milliseconds, so if you want a 5-second delay you'll need to set the property to 5000.
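
As a sketch, the two properties from your configuration would then look something like this (the appender name ROLL and the backup count of 7 are just examples):

    # Keep at most 7 daily rolled backups
    log4j.appender.ROLL.MaxRollFileCount=7
    # Scavenge every 5 seconds (the value is in milliseconds)
    log4j.appender.ROLL.ScavengeInterval=5000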