python-logging-rabbitmq

Send logs to RabbitMQ from Python/Django.

Overview

Logging handler that ships logs to RabbitMQ. Compatible with Django.

Installation

Install using pip.

pip install python_logging_rabbitmq

Versions

  • >= 2.x requires pika == 0.13
  • <= 1.1.1 requires pika <= 0.10

Handlers

This package has two built-in handlers that you can import as follows:

from python_logging_rabbitmq import RabbitMQHandler

or (thanks to @wallezhang)

from python_logging_rabbitmq import RabbitMQHandlerOneWay

  • RabbitMQHandler: Basic handler for sending logs to RabbitMQ. Every record is delivered directly to RabbitMQ using the configured exchange.
  • RabbitMQHandlerOneWay: High-throughput handler. It keeps an internal queue where logs are stored temporarily, and a background thread delivers them to RabbitMQ using the configured exchange. Your app doesn't need to wait until each log is delivered; note that if the main thread dies you might lose logs.
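
A minimal usage sketch of the one-way handler (it is assumed here to accept the same host/port arguments as RabbitMQHandler; the short sleep simply gives the background thread time to flush before the process exits):

import logging
import time
from python_logging_rabbitmq import RabbitMQHandlerOneWay

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

# Records are queued internally and delivered by a background thread.
logger.addHandler(RabbitMQHandlerOneWay(host='localhost', port=5672))

logger.debug('test debug')

# Give the background thread a moment to deliver pending logs before exiting.
time.sleep(2)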

Standalone Python

To use the handler with plain Python, first create a logger for your app, then create an instance of the handler and add it to that logger.

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

rabbit = RabbitMQHandler(host='localhost', port=5672)
logger.addHandler(rabbit)

logger.debug('test debug')

As a result, a message similar to the following will be sent to RabbitMQ:

{
	"relativeCreated":280.61580657958984,
	"process":13105,
	"args":[],
	"module":"test",
	"funcName":"<module>",
	"host":"albertomr86-laptop",
	"exc_text":null,
	"name":"myapp",
	"thread":140032818181888,
	"created":1482290387.454017,
	"threadName":"MainThread",
	"msecs":454.01692390441895,
	"filename":"test.py",
	"levelno":10,
	"processName":"MainProcess",
	"pathname":"test.py",
	"lineno":11,
	"msg":"test debug",
	"exc_info":null,
	"levelname":"DEBUG"
}

Sending logs

By default, logs are sent to RabbitMQ through the exchange 'log', which should be of type topic. The routing key is built by concatenating the logger name and the log level. For example:

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
logger.addHandler(RabbitMQHandler(host='localhost', port=5672))

logger.info('test info')
logger.debug('test debug')
logger.warning('test warning')

The messages will be sent using the following routing keys:

  • myapp.INFO
  • myapp.DEBUG
  • myapp.WARNING

For an explanation of topics and routing keys, see https://www.rabbitmq.com/tutorials/tutorial-five-python.html
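
As an illustration, a separate consumer can bind a queue to the same topic exchange to receive these logs. A minimal sketch using the pika 1.x consumer API (it assumes the default exchange name 'log' and a broker on localhost):

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost', port=5672))
channel = connection.channel()

# The handler publishes to a topic exchange named 'log' by default.
channel.exchange_declare(exchange='log', exchange_type='topic')

# Anonymous, exclusive queue bound to every log produced by the 'myapp' logger.
result = channel.queue_declare(queue='', exclusive=True)
channel.queue_bind(exchange='log', queue=result.method.queue, routing_key='myapp.*')

def on_log(ch, method, properties, body):
	print(method.routing_key, body)

channel.basic_consume(queue=result.method.queue, on_message_callback=on_log, auto_ack=True)
channel.start_consuming()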

When creating the handler, you can specify different parameters to connect to RabbitMQ or to configure the handler's behavior.

Overriding routing-key creation

If you wish to override the routing-key format entirely, you can pass a routing_key_formatter function that takes a LogRecord and returns the routing key. For example:

RabbitMQHandler(
	host='localhost',
	port=5672,
	routing_key_formatter=lambda r: (
		'some_exchange_prefix.{}'.format(r.levelname.lower())
	)
)

Configuration

These are the configuration parameters accepted by the handler:

  • host: RabbitMQ server hostname or IP address. Default: localhost.
  • port: RabbitMQ server port. Default: 5672.
  • username: Username for authentication. Default: None.
  • password: Password for the given username. Default: None.
  • exchange: Name of the exchange used to publish the logs. This exchange is expected to be of type topic. Default: log.
  • declare_exchange: Whether or not to declare the exchange. Default: False.
  • routing_key_format: Template that controls how messages are routed to the queues. Default: {name}.{level}.
  • routing_key_formatter: Function that customizes how the routing key is built. Default: None.
  • connection_params: Extra parameters for the RabbitMQ connection. Default: None.
  • formatter: Custom formatter for the logs. Default: python_logging_rabbitmq.JSONFormatter.
  • close_after_emit: Close the active connection after each log is sent; a new connection is opened for the next log. Default: False.
  • fields: Dict of fields added to every log sent to RabbitMQ. Useful when you want the same fields in every log without passing them each time. Default: None.
  • fields_under_root: When True, each key in 'fields' is added as a top-level entry in the log; otherwise they are nested under the key 'fields'. Default: True.
  • message_headers: Dictionary of headers to be published with the message. Default: None.
  • record_fields: Set of attributes to preserve from the record object. Default: None.
  • exclude_record_fields: Set of attributes to exclude from the record object. Default: None.
  • heartbeat: Lower bound for the heartbeat timeout. Default: 60.
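
For instance, a handler that declares the exchange, attaches message headers, and trims a few noisy record attributes could be configured as below (a sketch; the header name and the excluded fields are illustrative only):

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	exchange='log',
	declare_exchange=True,
	message_headers={'x-source': 'myapp'},
	exclude_record_fields={'relativeCreated', 'msecs', 'levelno'},
	heartbeat=60
)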

Examples

RabbitMQ Connection

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	username='guest',
	password='guest',
	connection_params={
		'virtual_host': '/',
		'connection_attempts': 3,
		'socket_timeout': 5000
	}
)

Custom fields

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	fields={
		'source': 'MyApp',
		'env': 'production'
	},
	fields_under_root=True
)

Custom formatter

By default, python_logging_rabbitmq uses a custom JSONFormatter, but if you prefer to format the message yourself you can do it as follows:

import logging
from python_logging_rabbitmq import RabbitMQHandler

FORMAT = '%(asctime)-15s %(message)s'
formatter = logging.Formatter(fmt=FORMAT)
rabbit = RabbitMQHandler(formatter=formatter)

For a custom JSON formatter, take a look at https://github.com/madzak/python-json-logger
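
For example, the JsonFormatter from that package can be passed straight to the handler. A minimal sketch, assuming python-json-logger is installed:

from pythonjsonlogger import jsonlogger
from python_logging_rabbitmq import RabbitMQHandler

# Emit the listed record attributes as JSON instead of the built-in JSONFormatter.
formatter = jsonlogger.JsonFormatter('%(name)s %(levelname)s %(asctime)s %(message)s')
rabbit = RabbitMQHandler(host='localhost', port=5672, formatter=formatter)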

Django

To use it with Django, add the handler to the LOGGING config.

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}
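
With this in place, any code that logs through the 'myapp' logger (or one of its children) ships its records to RabbitMQ. A hypothetical view, for illustration:

import logging

from django.http import HttpResponse

logger = logging.getLogger('myapp')

def index(request):
	# This record is routed to the 'rabbit' handler declared in LOGGING above.
	logger.info('index view hit')
	return HttpResponse('ok')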

Configuration

Just as with standalone Python, you can configure the handler directly when declaring it in the config:

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'port': 5672,
			'username': 'guest',
			'password': 'guest',
			'exchange': 'log',
			'declare_exchange': False,
			'connection_params': {
				'virtual_host': '/',
				'connection_attempts': 3,
				'socket_timeout': 5000
			},
			'fields': {
				'source': 'MainAPI',
				'env': 'production'
			},
			'fields_under_root': True
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Custom formatter

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'standard': {
			'format': '%(levelname)-8s [%(asctime)s]: %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'standard'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

JSON formatter

pip install python-json-logger

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'json': {
			'()': 'pythonjsonlogger.jsonlogger.JsonFormatter',
			'fmt': '%(name)s %(levelname)s %(asctime)s %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'json'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Releases

  • 1.1.1 (Mar 10, 2019): Removed direct dependency on Django. Integration with Travis CI. Configuration for tests. Using pipenv.
  • 1.0.9 (May 04, 2018): Fixed exchange_type parameter in channel.exchange_declare (thanks to @cklos).
  • 1.0.8 (Mar 21, 2018): Allowing message headers (thanks to @merretbuurman).
  • 1.0.7 (May 15, 2017): Added support to customize the routing_key (thanks to @hansyulian).
  • 1.0.6 (Mar 30, 2017): Fixed compatibility with python3 in RabbitMQHandlerOneWay (by @sactre).
  • 1.0.5 (Mar 28, 2017): Explicit local imports.
  • 1.0.4 (Mar 16, 2017): Added new handler RabbitMQHandlerOneWay (by @wallezhang).
  • 1.0.3 (Mar 14, 2017): Added config parameter close_after_emit.
  • 1.0.2 (Dec 21, 2016): Minor fixes.
  • 1.0.1 (Dec 21, 2016): Minor fixes.
  • 1.0.0 (Dec 21, 2016): Initial release.

What's next?

  • Let's talk about tests.
  • Issues, pull requests, and suggestions are welcome.
  • Fork and improve it. Free for all.

Similar efforts

Comments
  • TypeError: unexpected kwargs: {'heartbeat_interval': 0}

    I always get this error.

    The error was raised from line 101 in handler.py, but I think it is because of line 62:
    self.connection_params.update(dict(host=host, port=port, heartbeat_interval=0))

    Just go through the Pika documentation: connection_params does not accept heartbeat_interval.

    bug 
    opened by raj-kiran-p 7
  • fix: handle thread shutdown

    Introduce two events (stopping, stopped) to interlock with the worker thread and cause a graceful shutdown.

    Add a 10-second timeout to the Queue get; this means that a graceful shutdown will not be instantaneous.

    Switch to del on the Pika blocking channels.

    opened by donbowman 4
  • RabbitMQ server closes the connection because it is not receiving heartbeats

    Hi Albert, similar to issue https://github.com/pika/pika/issues/1104. After digging into Pika and RabbitMQ, I find that with BlockedConnection, pika will not automatically send out the heartbeat. The heartbeat event will only be handled/sent in "start_consuming" and "process_data_events". For a consumer, we will use "start_consuming", so there will not be such an issue. But for a producer, normally we won't call "process_data_events" specifically; it will only be called when we call "basic_publish". Let's say we set "heartbeat" to 20s: if we don't log any message within 3x10s, the server would close the connection. (Different versions of RabbitMQ might have different behaviors; some might take 3x20s.) I didn't see anyone report this issue or talk about it on the internet, so I'm not sure if my understanding is correct. Looking forward to your response. Thanks in advance.

    bug wip 
    opened by yuanli-cn 4
  • Standalone not working

    Hello everybody,

    I'm trying to implement your lib in my python app. We're not using Django and we have this error raised:

    Traceback (most recent call last):
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/test.py", line 1, in <module>
        import DarwinLogger
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/DarwinLogger.py", line 4, in <module>
        from python_logging_rabbitmq import RabbitMQHandlerOneWay
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/__init__.py", line 2, in <module>
        from .formatters import JSONFormatter  # noqa: F401
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/formatters.py", line 5, in <module>
        from django.core.serializers.json import DjangoJSONEncoder
    ImportError: No module named django.core.serializers.json

    I simply followed the "standalone" part of the readme. Is this normal? Or am I doing something wrong?

    Thx !

    bug 
    opened by Travincebarker 2
  • wait for logs to be sent in RabbitMQHandlerOneWay before exiting python?

    Hi,

    Thank you for your great package.

    Is there any way to wait for logs to be sent in RabbitMQHandlerOneWay before exiting python? A naive method could be to wait a few seconds (time.sleep(2)), but maybe there is a better method.

    Thanks a lot.

    enhancement planning 
    opened by BenjaminSchmitt 2
  • Unconfigurable Routing Key Format

    I need to be able to change the routing key format in my system, so I would prefer that in python_logging_rabbitmq/handlers.py this line:

    line 115:

        routing_key = "{name}.{level}".format(name=record.name, level=record.levelname)

    be changed to:

    line 14:

        ROUTING_KEY_FORMAT = "{name}.{level}"

    line 115:

        routing_key = self.ROUTING_KEY_FORMAT.format(name=record.name, level=record.levelname)

    so it will be configurable. Thank you!

    enhancement 
    opened by hansyulian 2
  • ImportError: No module named 'compat'

    When I use the library I see an Exception:

      File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/__init__.py", line 2, in <module>
        from .formatters import JSONFormatter  # noqa: F401
      File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/formatters.py", line 4, in <module>
        from compat import json
    ImportError: No module named 'compat'

    Is something wrong in __init__?

    Regards and thank you for your library.

    bug 
    opened by sactre 2
  • Add content_type in pika.BasicProperties parameters

    https://github.com/albertomr86/python-logging-rabbitmq/blob/5d3ce4cc0b86b7303a2097d6acb46972d334e213/python_logging_rabbitmq/handlers.py#L164 The safest way to work is to add content_type = 'STRING', but it could also be exposed as a parameter of the class method.

    wip 
    opened by TopperBG 1
  • Fix in publish(): the body is already formatted.

    In emit(), the record is formatted and then queued. The worker takes the record to be published from the queue. In publish(), that record was formatted again (a second time).

    Try a simple app like this:

    import time
    import logging
    from python_logging_rabbitmq import RabbitMQHandlerOneWay

    logger = logging.getLogger('myapp')
    logger.setLevel(logging.DEBUG)

    rabbit = RabbitMQHandlerOneWay(host='localhost', port=5672)
    logger.addHandler(rabbit)

    logger.debug('test debug')
    time.sleep(3)

    Error:

      File "python-logging-rabbitmq/python_logging_rabbitmq/formatters.py", line 22, in format
        data = record.__dict__.copy()
    AttributeError: 'str' object has no attribute '__dict__'

    opened by ghost 1
  • Returning batch of changes to upstream

    Hi, I'm pleased to say that we've been using your library in our project and it turned out very helpful. We've made some changes to fit our needs and thought to return them to upstream; you may find them useful. In summary, we've:

    • Updated .gitignore to include a broader range of Python/Vim-related files
    • Made some stylistic tweaks; sorted imports, PEP8-ified some comments
    • Added a routing_key_formatter option which allows passing a lambda that overrides routing-key creation
    • Added support for serialization of Django's requests (this means that Rabbit handlers can handle errors logged to django.requests)
    • Added record_fields and exclude_record_fields options which allow including/excluding specified LogRecord attributes (sometimes fields such as levelno are just not helpful)
    • Imported DjangoJSONEncoder into the JSON formatter in order to handle a broader range of objects (such as Decimal)
    • Updated README
    opened by IwoHerka 1
  • call of channel.exchange_declare modified

    According to the Pika source at: https://github.com/pika/pika/blob/master/pika/channel.py#L658 the channel.exchange_declare method has no argument 'type', the corresponding argument is 'exchange_type'.

    opened by cklos 1
  • fix: only mark task done when a task was dequeued

    task_done will fail if we mark a task as having finished when no task was dequeued. Since this can only happen after a task was retrieved from the queue, move the finally into an inner try so that we know task_done will work.

    Fixes #29 for the most part -- it does not address the leak regarding messages still in the queue when is_stopping is set.

    opened by klarose 0
  • Call queue.task_done() only after a successful get()

    queue.task_done() should be called only when an item was actually returned by get(). If get() raises an Empty exception, task_done() should not be called.

    Also, close the Pika connection only if it was actually opened.

    wip 
    opened by kmorwath 1
  • self.queue.task_done() can be called when no message was dequeued, because the finally block executes anyway, leading to a ValueError exception

    The changes in version 2.2 for fix #25 in python_logging_rabbitmq/handlers_oneway.py may have introduced an issue. Before, the Queue.Empty exception was never raised because record, routing_key = self.queue.get() had no timeout. Now, when the exception is raised because no message arrives within 10s, the exception handler calls "continue", but the "finally" block is executed anyway, so queue.task_done() could be called more times than put() and that will lead to a ValueError exception.

    queue.task_done() should be called in an inner "try..finally" block only after a message has actually been dequeued, for example:

    record, routing_key = self.queue.get(block=True, timeout=10)
    try:
        # Actually got a message: try to send it ...
        ...
    finally:
        queue.task_done()

    Moreover when is_stopping is set the loop is exited before queue.task_done() is called, and messages still in the queue are not processed. If on the other side of the queue something attempts to call queue.join() it could never return.

    opened by kmorwath 0
  • `ujson` does not support the `cls` parameter of `.dumps(cls=SomeEncoder)`

    As per https://github.com/esnme/ultrajson/issues/124

    If you have a package that requires ujson, it is automatically picked up by compat.py and used in JSONFormatter thereafter. Unfortunately, ujson is not fully compatible with the built-in json.dumps and it does not understand the cls parameter.

    opened by EivV 1
  • SSL configuration isn't working automatically

    As a workaround I initialize the following:

    SSLOptions(ssl.SSLContext(protocol=ssl.PROTOCOL_TLSv1_2))

    and pass it as connection_params under ssl_options.

    Without a workaround I get a connection reset error.

    bug wip 
    opened by Ghost93 2
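
A sketch of the workaround described in the last issue above, passing TLS settings through connection_params (this assumes a pika version that provides pika.SSLOptions; the hostname and port are placeholders):

import ssl

import pika
from python_logging_rabbitmq import RabbitMQHandler

# Build an SSL context and hand it to pika through the handler's connection_params.
context = ssl.SSLContext(protocol=ssl.PROTOCOL_TLSv1_2)
rabbit = RabbitMQHandler(
	host='rabbitmq.example.com',  # placeholder TLS-enabled broker
	port=5671,
	connection_params={'ssl_options': pika.SSLOptions(context)}
)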