Anomaly detection on SQL data warehouses and databases

Overview


With CueObserve, you can run anomaly detection on data in your SQL data warehouses and databases.

Getting Started

Install via Docker

docker run -p 3000:80 cuebook/cueobserve

Now visit http://localhost:3000 in your browser.

How it works

You write a SQL GROUP BY query, map its columns as dimensions and measures, and save it as a virtual Dataset.

Dataset SQL

Dataset Schema Map
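
For illustration, a hypothetical Dataset might look like the sketch below. The table, columns, and SQL dialect are assumptions for this example, not taken from the CueObserve documentation.

    # Hypothetical Dataset: the SQL and the schema map below are illustrative only.
    DATASET_SQL = """
    SELECT
        DATE_TRUNC('hour', created_at) AS order_hour,  -- timestamp column
        city,                                          -- dimension
        COUNT(*)    AS orders,                         -- measure
        SUM(amount) AS revenue                         -- measure
    FROM orders
    GROUP BY 1, 2
    """

    # Schema map: which columns act as the timestamp, dimensions, and measures.
    SCHEMA_MAP = {
        "timestamp": "order_hour",
        "dimensions": ["city"],
        "measures": ["orders", "revenue"],
    }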

You then define one or more anomaly detection jobs on the dataset.

Anomaly Definition

When an anomaly detection job runs, CueObserve does the following (a minimal code sketch follows the list):

  1. Executes the SQL GROUP BY query on your data warehouse and stores the result as a Pandas dataframe.
  2. Generates one or more timeseries from the dataframe, as defined in your anomaly detection job.
  3. Generates a forecast for each timeseries using Prophet.
  4. Creates a visual card for each timeseries. Marks the card as an anomaly if the last data point is anomalous.
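
A minimal sketch of this flow, reusing the hypothetical orders dataset above; the column names, the synthetic data, and the use of the standalone prophet package are assumptions for illustration, not CueObserve's actual code.

    import pandas as pd
    from prophet import Prophet  # assumption: the standalone "prophet" package

    # Step 1: in CueObserve the dataframe comes from the dataset's SQL query;
    # here a tiny synthetic hourly series stands in for the query result.
    df = pd.DataFrame({
        "order_hour": pd.date_range("2021-10-01", periods=72, freq="H").tolist() * 2,
        "city": ["Bangalore"] * 72 + ["Mumbai"] * 72,
        "orders": list(range(72)) + list(range(100, 172)),
    })

    def last_point_is_anomalous(ts: pd.DataFrame) -> bool:
        """Fit Prophet on one timeseries (columns: ds, y) and flag the last
        point if it falls outside the 95% forecast interval."""
        model = Prophet(interval_width=0.95)
        model.fit(ts)
        forecast = model.predict(ts[["ds"]])
        last_actual = ts["y"].iloc[-1]
        band = forecast.iloc[-1]
        return not (band["yhat_lower"] <= last_actual <= band["yhat_upper"])

    # Step 2: one timeseries per dimension value (here: per city).
    for city, group in df.groupby("city"):
        ts = group.rename(columns={"order_hour": "ds", "orders": "y"})[["ds", "y"]]
        # Steps 3 and 4: forecast each timeseries and mark its card if the last point is anomalous.
        print(city, "anomaly" if last_point_is_anomalous(ts) else "ok")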

Features

  • Automated SQL to timeseries transformation.
  • Run anomaly detection on the aggregate metric or break it down by any dimension.
  • In-built scheduler. CueObserve uses Celery as the executor and celery-beat as the scheduler (a minimal configuration sketch follows this list).
  • Slack alerts when anomalies are detected. (coming soon)
  • Monitoring. Slack alert when a job fails. CueObserve maintains detailed logs. (coming soon)
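
A minimal sketch of how a Celery beat schedule can drive periodic detection jobs. The task path, broker URL, and arguments are hypothetical placeholders, not CueObserve's actual settings.

    from celery import Celery
    from celery.schedules import crontab

    # Assumed broker URL; CueObserve's real broker configuration may differ.
    app = Celery("cueobserve", broker="redis://localhost:6379/0")

    app.conf.beat_schedule = {
        "run-anomaly-detection-hourly": {
            "task": "ops.tasks.runAnomalyDetectionJob",  # hypothetical task path
            "schedule": crontab(minute=0),               # every hour, on the hour
            "args": (1,),                                # hypothetical anomaly definition id
        },
    }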

Limitations

  • Currently supports only Prophet for timeseries forecasting.
  • Not built for real-time anomaly detection on streaming data.

Support

For general help using CueObserve, read the documentation or go to GitHub Discussions.

To report a bug or request a feature, open an issue.

Contributing

We'd love contributions to CueObserve. Before you contribute, please first discuss the change you wish to make via an issue or a discussion. Contributors are expected to adhere to our code of conduct.

Comments
  • hourly error

    Discussed in https://github.com/cuebook/CueObserve/discussions/75

    Originally posted by jithendra945 August 5, 2021: I'm getting this error when I try to run an anomaly definition.

    I don't understand why it has None in it.

    bug observe 
    opened by sachinkbansal 8
  • ClickHouse as data source support

    Describe the solution you'd like: Add ClickHouse as a supported data source.

    I will wait until https://github.com/cuebook/CueObserve/issues/52 is resolved and then try to implement a PR.

    enhancement connection 
    opened by Slach 4
  • Support SQL Server Data Source

    Is your feature request related to a problem? Please describe. I'd love to give CueObserve a try but our warehouse is currently in MS SQL Server.

    Describe the solution you'd like: Add SQL Server as a supported data source.

    Additional context I'd be interested in making the necessary pull request, but I'd like some high level advice on what might be needed.

    Is it as simple as adding the necessary sqlserver.py in https://github.com/cuebook/CueObserve/tree/main/api/dbConnections?
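
    A hypothetical sketch of such a connector, assuming pyodbc and a pandas read; the function name and the params dict are assumptions about the dbConnections interface, not actual CueObserve code.

        import pandas as pd
        import pyodbc

        def fetch_dataframe(params: dict, sql: str) -> pd.DataFrame:
            """Run a query against SQL Server and return the result as a DataFrame."""
            conn_str = (
                "DRIVER={ODBC Driver 17 for SQL Server};"
                f"SERVER={params['host']},{params.get('port', 1433)};"
                f"DATABASE={params['database']};"
                f"UID={params['user']};PWD={params['password']}"
            )
            conn = pyodbc.connect(conn_str)
            try:
                return pd.read_sql(sql, conn)
            finally:
                conn.close()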

    enhancement connection 
    opened by adam133 4
  • Scheduled tasks (6 or more) get stuck indefinitely in the Celery queue

    Describe the bug: CueObserve scheduled tasks get stuck in the Celery queue and never complete (no error is thrown).

    To Reproduce Steps to reproduce the behavior:

    1. Create 6 different anomaly definitions.
    2. Create a 5-minute cron interval and wait 5 minutes for the scheduled tasks to start.

    Those tasks do not complete even after 1 day.

    For debugging, I tried executing 5 scheduled tasks and 1 scheduled task separately (with different cron intervals) and they work fine, but when 6 scheduled tasks share the same cron interval they get stuck and never finish.

    Expected behavior: It should complete the 6 accepted tasks and pull the next tasks.

    Thanks

    bug observe 
    opened by itsmesrds 3
  • Docker Network Mode and port mapping

    Is your feature request related to a problem? Please describe. The provided docker-compose files set everything to network mode "host". For various reasons this setting does not work in my environment.

    Describe the solution you'd like

    • Ability to use port mapping within the docker-compose files.
    • Ability to use Docker Swarm.
    observe 
    opened by adrien-ferrara 2
  • docker-compose start error


    got the result:

        /usr/local/lib/python2.7/dist-packages/paramiko/transport.py:33: CryptographyDeprecationWarning: Python 2 is no longer supported by the Python core team. Support for it is now deprecated in cryptography, and will be removed in the next release.
          from cryptography.hazmat.backends import default_backend
        Traceback (most recent call last):
          File "/usr/local/bin/docker-compose", line 7, in <module>
            from compose.cli.main import main
          File "/usr/local/lib/python2.7/dist-packages/compose/cli/main.py", line 24, in <module>
            from ..config import ConfigurationError
          File "/usr/local/lib/python2.7/dist-packages/compose/config/__init__.py", line 6, in <module>
            from .config import ConfigurationError
          File "/usr/local/lib/python2.7/dist-packages/compose/config/config.py", line 51, in <module>
            from .validation import match_named_volumes
          File "/usr/local/lib/python2.7/dist-packages/compose/config/validation.py", line 12, in <module>
            from jsonschema import Draft4Validator
          File "/usr/local/lib/python2.7/dist-packages/jsonschema/__init__.py", line 21, in <module>
            from jsonschema._types import TypeChecker
          File "/usr/local/lib/python2.7/dist-packages/jsonschema/_types.py", line 3, in <module>
            from pyrsistent import pmap
          File "/usr/local/lib/python2.7/dist-packages/pyrsistent/__init__.py", line 3, in <module>
            from pyrsistent._pmap import pmap, m, PMap
          File "/usr/local/lib/python2.7/dist-packages/pyrsistent/_pmap.py", line 98
            ) from e
              ^
        SyntaxError: invalid syntax

    opened by wdongdongde 1
  • Support generic rest end point for notification

    Discussed in https://github.com/cuebook/CueObserve/discussions/129

    Originally posted by pjpringle August 30, 2021: Not everyone has Slack, especially in workplace environments. Provide support for plugging in REST calls to notify of anomalies.

    • [x] Image bytes included in response JSON
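
    A minimal sketch of what such a generic REST notification could look like; the webhook URL and payload shape are assumptions, not CueObserve's actual notification contract.

        import base64
        import requests

        def notify_anomaly(webhook_url: str, title: str, image_bytes: bytes) -> None:
            """POST an anomaly notification, including the card image, to a generic endpoint."""
            payload = {
                "title": title,
                "image_base64": base64.b64encode(image_bytes).decode("ascii"),
            }
            requests.post(webhook_url, json=payload, timeout=10)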
    enhancement observe 
    opened by sachinkbansal 1
  • Failing task due to incomplete data

    Sometimes the anomaly detection job or the RCA job fails because a few buckets have too little data; in that case we don't even show the results for the other buckets.

        # current behaviour: the whole run is marked as failed if any bucket's task failed
        if not allTasksSucceeded:
            runStatusObj.status = ANOMALY_DETECTION_ERROR

    We have added such checks to the code, but I think this is not a good approach because the user is not aware of the reason for the failure. It would be better to skip the failing bucket and show the results for the rest of the buckets (a rough sketch of that behaviour follows).
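
    A rough sketch of the suggested behaviour; the per-bucket result mapping and the status constants are illustrative assumptions, not CueObserve's actual models.

        # Illustrative only: skip failing buckets and fail the run only when
        # no bucket produced a result. Constants and names are assumptions.
        ANOMALY_DETECTION_SUCCESS = "SUCCESS"
        ANOMALY_DETECTION_ERROR = "ANOMALY_DETECTION_ERROR"

        def summarize_run(bucket_results: dict) -> tuple:
            """Return (status, successes, failures) instead of failing the whole run."""
            successes, failures = [], []
            for bucket, result in bucket_results.items():
                if result.get("success"):
                    successes.append(result)
                else:
                    failures.append({"bucket": bucket, "error": result.get("error")})
            status = ANOMALY_DETECTION_SUCCESS if successes else ANOMALY_DETECTION_ERROR
            return status, successes, failures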

    opened by AakarSharmaHME 3
  • Different Priority alerts on different slack channel

    Is your feature request related to a problem? Please describe. There are different kinds of datasets: anomalies on some are P0 alerts, while on others they are not that urgent. Even within the same dataset, some thresholds need immediate attention while others do not.

    Describe the solution you'd like: While configuring the Slack settings, we should be able to configure different priorities and their Slack channel IDs. Then, while creating an anomaly definition, we should specify its priority, just like we do for schedules.

    opened by AakarSharmaHME 5
  • Docker Image Build fails on Dev compose

    Describe the bug: Failed to build the Docker images using docker-compose -f docker-compose-dev.yml up -d. The build fails with the error Service 'cueo-backend' failed to build: Build failed.

    To Reproduce Steps to reproduce the behavior:

    1. Git clone this repo
    2. Run docker-compose -f docker-compose-dev.yml up -d

    Expected behavior: The CueObserve component images build successfully and CueObserve starts running on localhost:3000.

    Additional context: Will add the entire log in the file or snippet below.

    observe 
    opened by shreyas-dev 4
  • pyarrow not installed

    Discussed in https://github.com/cuebook/CueObserve/discussions/187

    Originally posted by ravi-jam October 16, 2021: Hi folks,

    My team just deployed CueObserve to EC2. I got this error when we tried to add a dataset:

        {"log":"2021-10-16 06:15:14,232 [dbConnections.bigquery:63] ERROR Can't connect to db with this credentials The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.\n","stream":"stderr","time":"2021-10-16T06:15:14.237031452Z"}

    Any help would be appreciated.

    bug 
    opened by sachinkbansal 0
  • Handle scenario where dataset's last data point is complete and must not be ignored

    Discussed in https://github.com/cuebook/CueObserve/discussions/172

    Originally posted by PhilippeDo October 8, 2021: I now get this error when I try to use Prophet:

    {"d2b9be2a-6d27-4e61-a577-3d99892e94e0": {"dimVal": null, "error": "{\"message\": \"single positional indexer is out-of-bounds\", \"stackTrace\": \"Traceback (most recent call last):\\n File \\\"/code/ops/tasks/anomalyDetection.py\\\", line 56, in anomalyService\\n result = detect(df, granularity, detectionRuleType, anomalyDef)\\n File \\\"/code/ops/tasks/anomalyDetection.py\\\", line 29, in detect\\n return prophetDetect(df, granularity)\\n File \\\"/code/ops/tasks/detectionTypes/prophet.py\\\", line 47, in prophetDetect\\n lastISO = df.iloc[-1][\\\"ds\\\"]\\n File \\\"/opt/venv/lib/python3.7/site-packages/pandas/core/indexing.py\\\", line 895, in __getitem__\\n return self._getitem_axis(maybe_callable, axis=axis)\\n File \\\"/opt/venv/lib/python3.7/site-packages/pandas/core/indexing.py\\\", line 1501, in _getitem_axis\\n self._validate_integer(key, axis)\\n File \\\"/opt/venv/lib/python3.7/site-packages/pandas/core/indexing.py\\\", line 1444, in _validate_integer\\n raise IndexError(\\\"single positional indexer is out-of-bounds\\\")\\nIndexError: single positional indexer is out-of-bounds\\n\"}", "success": false}}

    I used the following in the DATASET section:

    select DATE_FORMAT(date, '%Y-%m-%d %H') as Date,
    pagelt from pageloadtime2
    

    and then when I run it, the dataset looks like the screenshot below: a minimal dataset without dimensions, with only a measure and a timestamp.

    (dataset screenshot)
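
    A hedged sketch of guarding the failing iloc access from the traceback above; the names mirror the traceback, and the check itself is only an illustration, not CueObserve's actual fix.

        # Illustrative guard: prophetDetect's df.iloc[-1]["ds"] raises
        # "single positional indexer is out-of-bounds" when df is empty,
        # so fail early with a clearer message instead.
        import pandas as pd

        def last_timestamp(df: pd.DataFrame):
            if df.empty:
                raise ValueError(
                    "Timeseries is empty after preparation; check the dataset "
                    "query, timestamp mapping, and granularity."
                )
            return df.iloc[-1]["ds"]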

    enhancement anomalyDefinition 
    opened by sachinkbansal 0
Releases: v0.3.2