Saturday 20 December 2008

TimeAndSizeRollingAppender Updated

Thanks to Eduardo and Nick, who pointed out a couple of bugs in the TimeAndSizeRollingAppender. The first was a start-up bug that meant the writer was not being initialised with the size of an existing log file. The second was an NPE that occurred when a filename was configured without a path. Both are fixed and the new code is available for download at http://www.simonsite.org.uk/resources/lib/log4j-rolling-appender.jar.
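For reference, a minimal log4j.properties configuration for the appender looks something like the sketch below. Treat the parameter names as illustrative rather than authoritative; check the Javadoc shipped with the JAR for the exact set supported by your version. Note the File setting with no directory component, which is the case the NPE fix addresses.

```properties
# Sketch of a TimeAndSizeRollingAppender configuration (verify parameter
# names against the Javadoc for your version of the appender).
log4j.rootLogger=INFO, R

log4j.appender.R=uk.org.simonsite.log4j.appender.TimeAndSizeRollingAppender
# Filename with no path -- previously this could trigger the NPE fixed above
log4j.appender.R.File=app.log
log4j.appender.R.DatePattern=.yyyy-MM-dd
log4j.appender.R.MaxFileSize=10MB
log4j.appender.R.MaxRollFileCount=10
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d %-5p [%t] %c - %m%n
```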

5 comments:

Anonymous said...

Simon, I am rather new to log4j...
How would I edit your code to rename files to something like "foo.2007-01-01.0001.log" rather than "foo.log.2007-01-01.1"?
In which file would I start looking?
Mike

Simon said...

As of the time of this post the design has a hard-coded dependency that requires the root filename to be suffixed with a date pattern followed by a counter. Providing the flexibility to specify the format of backup filenames would otherwise require some sort of simple DSL.

For example, an additional appender configuration parameter would be needed to specify a second pattern, say "FDCS" for File, Date, Counter, Suffix. Any ordering of those characters would specify your backup filename format. But then what about the number of characters in the counter, whether a suffix is needed at all, or any number of other concerns? Numbers might be needed (e.g. "3C" for a three-digit counter), the presence or absence of format characters takes on meaning, and so on. In short, additional complexity would be needed in the implementation.
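To make the idea concrete, here is a minimal sketch of such a pattern scheme. The class and its behaviour are entirely hypothetical; nothing like it exists in the actual appender, and it deliberately ignores the counter-width and optional-suffix complications mentioned above.

```java
import java.text.SimpleDateFormat;
import java.util.Date;

// Hypothetical sketch of a backup-filename DSL: each character in the
// pattern selects a component -- F = root filename, D = date, C = counter,
// S = suffix -- and the selected components are joined with dots.
public final class BackupFilenameFormatter {

  private final String pattern; // e.g. "FDCS" or "FSDC"

  public BackupFilenameFormatter(String pattern) {
    this.pattern = pattern;
  }

  public String format(String rootName, String suffix, Date date, int counter) {
    SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd");
    StringBuilder name = new StringBuilder();
    for (int i = 0; i < pattern.length(); i++) {
      if (name.length() > 0) {
        name.append('.');
      }
      switch (pattern.charAt(i)) {
        case 'F': name.append(rootName); break;
        case 'D': name.append(dateFormat.format(date)); break;
        case 'C': name.append(counter); break;
        case 'S': name.append(suffix); break;
        default:
          throw new IllegalArgumentException(
              "Unknown format character: " + pattern.charAt(i));
      }
    }
    return name.toString();
  }
}
```

With this scheme, the pattern "FDCS" would turn Mike's example into "foo.2007-01-01.1.log"; getting the zero-padded "0001" counter would need the "3C"-style width extension described above.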

For now, you can customise the scavenging and file rollover code. In the current version, you'll need to modify LogFileScavenger::logFileList() and AbstractRoller::prepareBackupFile().

Simon

Unknown said...

Hi Simon,

I want to keep a few of the latest log files unzipped (probably the last 5, or all of today's). Could you point me to which class I should look at changing?
Currently it is harder to grep/view the logs if they have just been rolled and zipped.

Alok

Simon said...

Alok,

That's a good point. The class to modify is the LogFileCompressor. This class puts Files into a FIFO queue to await compression. If you add a check for a minimum number of entries in the queue before the LogFileCompressor takes any for compression, you will quite easily achieve something like the effect you're looking for.
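A rough sketch of that check is below. The class and method names are hypothetical (the real LogFileCompressor is structured differently); it only illustrates the idea of refusing to release files for compression until more than a minimum number are queued.

```java
import java.io.File;
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch: rolled files are queued FIFO for compression, but a
// file is only handed out once more than minUncompressed files are waiting,
// so the newest few backups stay readable as plain text.
public final class DeferredCompressionQueue {

  private final Deque<File> queue = new ArrayDeque<File>();
  private final int minUncompressed;

  public DeferredCompressionQueue(int minUncompressed) {
    this.minUncompressed = minUncompressed;
  }

  public synchronized void offer(File rolledFile) {
    queue.addLast(rolledFile); // newest rolled file goes to the tail
  }

  // Returns the oldest file due for compression, or null while the queue
  // holds no more than the minimum number of uncompressed files.
  public synchronized File pollForCompression() {
    if (queue.size() <= minUncompressed) {
      return null;
    }
    return queue.removeFirst();
  }
}
```

With minUncompressed set to 5, the five most recently rolled files always remain uncompressed, which is roughly the effect Alok describes.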

I'll look at adding in this feature myself to make it configurable.

Simon said...

I've added Alok's suggestion; it was pretty easy to do. See my home page for a new JAR and new Javadoc.