Configuring Logging Output¶
You can register “destinations” to handle logging output. A destination is a callable that takes a message dictionary. For example, if we want each message to be encoded in JSON and written on a new line on stdout:
import json, sys
from eliot import add_destinations

def stdout(message):
    sys.stdout.write(json.dumps(message) + "\n")

add_destinations(stdout)
Up to 1000 messages will be buffered in memory until the first set of destinations is added, at which point those messages will be delivered to the newly added destinations.
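To make the buffering behavior concrete, here is a minimal standard-library sketch of the idea: messages accumulate in a bounded buffer until the first destination is registered, then the backlog is flushed. The class name and the drop-oldest policy at the 1000-message limit are assumptions for illustration, not Eliot's actual implementation.

```python
from collections import deque

class BufferingDestinations:
    """Sketch (not Eliot's real code) of buffering messages until
    the first destination is added, then flushing the backlog."""

    def __init__(self):
        # Bounded buffer; dropping the oldest past 1000 is an assumption.
        self._buffer = deque(maxlen=1000)
        self._destinations = []

    def send(self, message):
        if self._destinations:
            for dest in self._destinations:
                dest(message)
        else:
            # No destinations yet: hold the message in memory.
            self._buffer.append(message)

    def add(self, *destinations):
        first = not self._destinations
        self._destinations.extend(destinations)
        if first:
            # Deliver everything buffered so far, in order.
            while self._buffer:
                msg = self._buffer.popleft()
                for dest in self._destinations:
                    dest(msg)

# Usage: messages sent before any destination exists are not lost.
router = BufferingDestinations()
router.send({"n": 0})
router.send({"n": 1})
received = []
router.add(received.append)
router.send({"n": 2})
```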
Outputting to Files¶
You can create a destination that logs to a file by calling eliot.FileDestination(file=yourfile). Each Eliot message will be encoded in JSON and written on a new line.
As a shorthand you can call eliot.to_file, which will create the destination and then add it. For example:
from eliot import to_file
to_file(open("eliot.log", "ab"))
Note
This destination is blocking: if writing to a file takes a long time, your code will not be able to proceed until the write is done.
If you’re using Twisted you can wrap an eliot.FileDestination with a non-blocking eliot.logwriter.ThreadedWriter. This allows you to log to a file without blocking the Twisted reactor.
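The idea behind a threaded writer can be sketched with only the standard library: the calling thread merely enqueues messages, and a background thread performs the potentially slow file writes. This is an illustration of the technique, not the eliot.logwriter.ThreadedWriter implementation (which integrates with the Twisted reactor as a service).

```python
import io
import json
import queue
import threading

class ThreadedFileDestination:
    """Sketch of a non-blocking destination: callers enqueue messages,
    a background thread does the file I/O."""

    _STOP = object()  # sentinel telling the worker to exit

    def __init__(self, f):
        self._file = f
        self._queue = queue.Queue()
        self._thread = threading.Thread(target=self._worker, daemon=True)
        self._thread.start()

    def __call__(self, message):
        # Fast: just hand the message to the worker thread.
        self._queue.put(message)

    def _worker(self):
        while True:
            item = self._queue.get()
            if item is self._STOP:
                break
            self._file.write(json.dumps(item) + "\n")

    def stop(self):
        # Flush remaining messages and wait for the worker to finish.
        self._queue.put(self._STOP)
        self._thread.join()
        self._file.flush()

# Usage: log to an in-memory file without blocking the caller.
out = io.StringIO()
dest = ThreadedFileDestination(out)
dest({"message_type": "example"})
dest.stop()
```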
Adding Fields to All Messages¶
Sometimes you want to add a field to all messages output by your process, regardless of destination.
For example, if you’re aggregating logs from multiple processes into a central location, you might want to include a process_id field that records the name and process ID of your process in every log message. Use the eliot.add_global_fields API to do so, e.g.:
import os, sys
from eliot import add_global_fields
add_global_fields(process_id="%s:%d" % (sys.argv[0], os.getpid()))
You should call add_global_fields before add_destinations to ensure all messages get the global fields.
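The effect of global fields can be sketched as a wrapper that merges them into every message before it reaches a destination. This is an illustration of the concept, not Eliot's internals, and the merge order when a message carries a conflicting key is an assumption here:

```python
def with_global_fields(global_fields, destination):
    """Sketch: deliver every message to `destination` with
    `global_fields` merged in. Message keys winning over global
    keys on conflict is an assumption for illustration."""
    def wrapped(message):
        combined = dict(global_fields)
        combined.update(message)
        destination(combined)
    return wrapped

# Usage: every message the destination sees carries process_id.
seen = []
dest = with_global_fields({"process_id": "myapp:1234"}, seen.append)
dest({"action_type": "example"})
```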