Python has a pretty robust logging facility. We’ve been using some common logging code across several of our projects, including django-workspace. Up until now, we’d been bothered by duplicate log file entries. Using this code:
import logging
import logging.handlers

LOG_FILENAME = 'foo.log'
logging.basicConfig(filename=LOG_FILENAME, level=logging.DEBUG,
                    datefmt='%Y-%m-%d %H:%M:%S',
                    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
rotatingFileHandler = logging.handlers.TimedRotatingFileHandler(
    filename=LOG_FILENAME, when='midnight')
rotatingFileHandler.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
rotatingFileHandler.setFormatter(formatter)
logging.getLogger('').addHandler(rotatingFileHandler)
We would get duplicate lines for every entry, and we couldn’t figure out why.
Well, it turns out that if you supply a filename to the basicConfig call, it sets up its own internal file handler for you. So the addHandler call was actually adding a second handler pointed at the same file, which is where the duplicate lines were coming from. Just a little tip for anyone else out there wrestling with this.