A pytest plugin that enables you to test your code that relies on a running PostgreSQL database

Overview

Logo: https://raw.githubusercontent.com/ClearcodeHQ/pytest-postgresql/master/logo.png

pytest-postgresql


What is this?

This is a pytest plugin that enables you to test code that relies on a running PostgreSQL database. It allows you to specify fixtures for the PostgreSQL process and client.

How to use

Warning

Tested on PostgreSQL versions >= 10. See tests for more details.

Install with:

pip install pytest-postgresql

You will also need to install psycopg. See its installation instructions.
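For example, if you prefer a prebuilt package over building from source, psycopg's binary extra is a common choice (check psycopg's documentation for the variant that fits your environment):

pip install "psycopg[binary]"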

The plugin contains three fixtures:

  • postgresql - a client fixture with function scope. After each test it closes all leftover connections and drops the test database from PostgreSQL, ensuring repeatability. This fixture returns an already connected psycopg connection.
  • postgresql_proc - a session-scoped fixture that starts a PostgreSQL instance on its first use and stops it at the end of the test session.
  • postgresql_noproc - a no-process fixture that connects to an already running PostgreSQL instance, for example in dockerized test environments or on CI services that provide PostgreSQL.

Simply include one of these fixtures in your tests' fixture list.

You can also create additional postgresql client and process fixtures if you need to:

from pytest_postgresql import factories

postgresql_my_proc = factories.postgresql_proc(
    port=None, unixsocketdir='/var/run')
postgresql_my = factories.postgresql('postgresql_my_proc')

Note

Each PostgreSQL process fixture can be configured in a different way than the others through the fixture factory arguments.
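For instance, here is a minimal sketch (fixture names, ports and paths are illustrative) of two process fixtures configured independently, each with its own client fixture:

from pytest_postgresql import factories

# Two process fixtures, each configured differently (values are illustrative).
postgresql_proc_a = factories.postgresql_proc(port=None, unixsocketdir='/var/run')
postgresql_proc_b = factories.postgresql_proc(port=5433, dbname='other_tests')

# Each client fixture is bound to its own process fixture.
postgresql_a = factories.postgresql('postgresql_proc_a')
postgresql_b = factories.postgresql('postgresql_proc_b')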

Sample test

def test_example_postgres(postgresql):
    """Check main postgresql fixture."""
    cur = postgresql.cursor()
    cur.execute("CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);")
    postgresql.commit()
    cur.close()

If you want the database fixture to be automatically populated with your schema, there are two ways:

  1. client fixture specific
  2. process fixture specific

Both accept the same set of possible loaders:

  • sql file path
  • loading function import path (string)
  • actual loading function

The loading function will receive host, port, user, dbname and password kwargs and has to establish the database connection itself. Inside it you can run SQL files or even trigger your database migrations programmatically.
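For instance, a minimal sketch of such a loading function, assuming psycopg2 is used for the connection (the table it creates is illustrative):

import psycopg2


def load_schema(**kwargs):
    """Receive host, port, user, dbname and password kwargs and load a schema."""
    db_connection = psycopg2.connect(**kwargs)
    with db_connection.cursor() as cur:
        cur.execute("CREATE TABLE users (id serial PRIMARY KEY, email varchar);")
        db_connection.commit()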

Client-specific loading loads the database for each test:

postgresql_my_with_schema = factories.postgresql(
    'postgresql_my_proc',
    load=["schemafile.sql", "otherschema.sql", "import.path.to.function", "import.path.to:otherfunction", load_this]
)

Warning

This way, the database will still be dropped each time.

The process fixture performs the load once per test session and loads the data into a template database. The client fixture then creates the test database from that template for each test, which significantly speeds up the tests.

postgresql_my_proc = factories.postgresql_proc(
    load=["schemafile.sql", "otherschema.sql", "import.path.to.function", "import.path.to:otherfunction", load_this]
)
The same loaders can be passed on the command line:

pytest --postgresql-load=path.to.loading_function --postgresql-load=path.to.other:loading_function --postgresql-load=path/to/file.sql

The loading_function from the example will receive the host, port, user, dbname and password kwargs described above, and has to commit the changes it makes.

Connecting to already existing postgresql database

Some projects use already running PostgreSQL servers (e.g. on Docker instances). In order to connect to them, use the postgresql_noproc fixture.

postgresql_external = factories.postgresql('postgresql_noproc')

By default the postgresql_noproc fixture connects to a PostgreSQL instance on port 5432. Standard configuration options apply to it.

These are the configuration options that work on all levels with the postgresql_noproc fixture:

Configuration

You can define your settings in three ways: fixture factory argument, command line option, and pytest.ini configuration option. You can pick whichever you prefer, but remember that these settings are handled in the following order:

  • Fixture factory argument
  • Command line option
  • Configuration option in your pytest.ini file
Configuration options

| PostgreSQL option | Fixture factory argument | Command line option | pytest.ini option | Noop process fixture | Default |
|---|---|---|---|---|---|
| Path to executable | executable | --postgresql-exec | postgresql_exec | - | /usr/lib/postgresql/13/bin/pg_ctl |
| Host | host | --postgresql-host | postgresql_host | yes | 127.0.0.1 |
| Port | port | --postgresql-port | postgresql_port | yes (5432) | random |
| PostgreSQL user | user | --postgresql-user | postgresql_user | yes | postgres |
| Password | password | --postgresql-password | postgresql_password | yes | (empty) |
| Starting parameters (extra pg_ctl arguments) | startparams | --postgresql-startparams | postgresql_startparams | - | -w |
| Postgres exe extra arguments (passed via pg_ctl's -o argument) | postgres_options | --postgresql-postgres-options | postgresql_postgres_options | - | (empty) |
| Log filename's prefix | logsprefix | --postgresql-logsprefix | postgresql_logsprefix | - | (empty) |
| Location for unixsockets | unixsocketdir | --postgresql-unixsocketdir | postgresql_unixsocketdir | - | $TMPDIR |
| Database name | dbname | --postgresql-dbname | postgresql_dbname | yes (with xdist, an index is appended to the name, resulting in test0, test1 per worker) | test |
| Default schema: SQL file paths or import paths to functions that will load it (list of values) | load | --postgresql-load | postgresql_load | yes | (empty) |
| PostgreSQL connection options | options | --postgresql-options | postgresql_options | yes | (empty) |

Example usage:

  • pass it as an argument in your own fixture

    postgresql_proc = factories.postgresql_proc(
        port=8888)
  • use --postgresql-port command line option when you run your tests

    py.test tests --postgresql-port=8888
    
  • specify your port as postgresql_port in your pytest.ini file.

    To do so, put a line like the following under the [pytest] section of your pytest.ini:

    [pytest]
    postgresql_port = 8888

Examples

Populating database for tests

With SQLAlchemy

This example shows how to populate the database and create an SQLAlchemy ORM connection.

The sample below is a simplified session fixture from the pyramid_fullauth tests:

import pyramid_basemodel
import pytest
import transaction
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.pool import NullPool
from zope.sqlalchemy import register


@pytest.fixture
def db_session(postgresql):
    """Session for SQLAlchemy."""
    from pyramid_fullauth.models import Base

    connection = f'postgresql+psycopg2://{postgresql.info.user}:@{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}'

    engine = create_engine(connection, echo=False, poolclass=NullPool)
    pyramid_basemodel.Session = scoped_session(sessionmaker())
    register(pyramid_basemodel.Session)
    pyramid_basemodel.bind_engine(
        engine, pyramid_basemodel.Session, should_create=True, should_drop=True)

    yield pyramid_basemodel.Session

    transaction.commit()
    Base.metadata.drop_all(engine)


@pytest.fixture
def user(db_session):
    """Test user fixture."""
    from pyramid_fullauth.models import User
    from tests.tools import DEFAULT_USER

    new_user = User(**DEFAULT_USER)
    db_session.add(new_user)
    transaction.commit()
    return new_user


def test_remove_last_admin(db_session, user):
    """
    Sample test checks internal login, but shows usage in tests with SQLAlchemy
    """
    user = db_session.merge(user)
    user.is_admin = True
    transaction.commit()
    user = db_session.merge(user)

    with pytest.raises(AttributeError):
        user.is_admin = False

Note

See the original code in pyramid_fullauth's conftest file. Depending on your needs, the in-between code can run alembic migrations (in the case of a SQLAlchemy stack) or any other setup code.

Maintaining database state outside of the fixtures

It is possible, and other libraries already do this in their tests, to maintain database state with the use of pytest-postgresql's database managing functionality.

To do so, import DatabaseJanitor and use its init and drop methods:

import psycopg2
import pytest
from pytest_postgresql.janitor import DatabaseJanitor

@pytest.fixture
def database(postgresql_proc):
    # variable definition

    janitor = DatabaseJanitor(
        postgresql_proc.user,
        postgresql_proc.host,
        postgresql_proc.port,
        "my_test_database",
        postgresql_proc.version,
        password="secret_password,
    ):
    janitor.init()
    yield psycopg2.connect(
        dbname="my_test_database",
        user=postgresql_proc.user,
        password="secret_password",
        host=postgresql_proc.host,
        port=postgresql_proc.port,
    )
    janitor.drop()

or use it as a context manager:

import psycopg2
import pytest
from pytest_postgresql.janitor import DatabaseJanitor

@pytest.fixture
def database(postgresql_proc):
    # variable definition

    with DatabaseJanitor(
        postgresql_proc.user,
        postgresql_proc.host,
        postgresql_proc.port,
        "my_test_database",
        postgresql_proc.version,
        password="secret_password,
    ):
        yield psycopg2.connect(
            dbname="my_test_database",
            user=postgresql_proc.user,
            password="secret_password",
            host=postgresql_proc.host,
            port=postgresql_proc.port,
        )

Note

DatabaseJanitor manages the state of the database, but you'll have to create the connection to use in test code yourself.

You can optionally pass in a recognized postgresql ISOLATION_LEVEL for additional control.
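A minimal sketch of that, assuming the keyword argument is named isolation_level and accepts one of psycopg2's isolation-level constants (connection details below are illustrative):

from psycopg2.extensions import ISOLATION_LEVEL_SERIALIZABLE
from pytest_postgresql.janitor import DatabaseJanitor

# Illustrative values; isolation_level is the optional argument mentioned above.
janitor = DatabaseJanitor(
    "postgres",
    "localhost",
    5432,
    "my_test_database",
    "13.0",
    password="secret_password",
    isolation_level=ISOLATION_LEVEL_SERIALIZABLE,
)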

Note

See DatabaseJanitor usage in python's warehouse test code https://github.com/pypa/warehouse/blob/5d15bfe/tests/conftest.py#L127

Connecting to PostgreSQL (in Docker)

To connect to PostgreSQL running in Docker and run tests against it, use the noproc fixtures.

docker run --name some-postgres -e POSTGRES_PASSWORD=mysecretpassword -d postgres

This will start PostgreSQL in a Docker container; using a locally installed PostgreSQL is not much different.

In your tests, make sure that they all use the postgresql_noproc fixture, like this:

from pytest_postgresql import factories

postgresql_in_docker = factories.postgresql_noproc()
postgresql = factories.postgresql("postgresql_in_docker", dbname="test")


def test_postgres_docker(postgresql):
    """Run test."""
    cur = postgresql.cursor()
    cur.execute("CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);")
    postgresql.commit()
    cur.close()

And run tests:

pytest --postgresql-host=172.17.0.2 --postgresql-password=mysecretpassword

Using a common database initialisation between tests

If you've got several tests that require common initialisation, you can define a load and pass it to your custom postgresql process fixture:

import psycopg2
from psycopg2.extensions import connection

from pytest_postgresql import factories


def load_database(**kwargs):
    db_connection: connection = psycopg2.connect(**kwargs)
    with db_connection.cursor() as cur:
        cur.execute("CREATE TABLE stories (id serial PRIMARY KEY, name varchar);")
        cur.execute(
            "INSERT INTO stories (name) VALUES"
            "('Silmarillion'), ('Star Wars'), ('The Expanse'), ('Battlestar Galactica')"
        )
        db_connection.commit()

postgresql_proc = factories.postgresql_proc(
    load=[load_database],
)

postgresql = factories.postgresql(
    "postgresql_proc",
)

You can also define your own database name by passing the same dbname value to both factories.
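A minimal sketch of that, reusing load_database from the example above (the database name is illustrative):

postgresql_proc = factories.postgresql_proc(
    dbname="stories_test",
    load=[load_database],
)

postgresql = factories.postgresql(
    "postgresql_proc",
    dbname="stories_test",
)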

The way this works is that the process fixture populates a template database, which the client fixture then automatically uses to create a fresh test database for each test. Fast, clean, and with no dangling transactions that could be accidentally rolled back.

The same approach works with the noproc fixture when connecting to an already running PostgreSQL instance, whether it runs in a Docker container, remotely or locally.
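A minimal sketch of that, again reusing load_database (host, port and password are placeholders for your running instance):

postgresql_noproc = factories.postgresql_noproc(
    host="localhost",
    port=5432,
    password="mysecretpassword",
    load=[load_database],
)

postgresql = factories.postgresql("postgresql_noproc")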
