Logica is a logic programming language that compiles to StandardSQL and runs on Google BigQuery.

Overview

Logica: language of Big Data

Logica is an open source declarative logic programming language for data manipulation. Logica is a successor to Yedalog, a language created earlier at Google.

Why?

Logica is for engineers, data scientists and other specialists who want to use logic programming syntax when writing queries and pipelines to run on BigQuery.

Logica compiles to StandardSQL and gives you access to the power of the BigQuery engine with the convenience of logic programming syntax. This is useful because BigQuery is orders of magnitude more powerful than state-of-the-art native logic programming engines.

We encourage you to try Logica, especially if

  • you already use logic programming and need more computational power, or
  • you use SQL, but feel unsatisfied about its readability, or
  • you want to learn logic programming and apply it to processing of Big Data.

In the future we plan to support more SQL dialects and engines.

I have not heard of logic programming. What is it?

Logic programming is a declarative programming paradigm where the program is written as a set of logical statements.

Logic programming has been developed in academia since the late 1960s. Prolog and Datalog are the most prominent examples of logic programming languages. Logica is a language of the Datalog family.

Datalog and relational databases start from the same idea: think of data as relations and think of data manipulation as a sequence of operations over these relations. But Datalog and SQL differ in how these operations are described. Datalog is inspired by the mathematical syntax of first-order logic, while SQL follows the syntax of natural language.

SQL was modeled on natural language to give people without formal training in computer programming or mathematics access to databases. This convenience becomes costly when the logic that you want to express is non-trivial. There are many examples of hard-to-read SQL queries that correspond to simple logic programs.
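
For illustration, a relationship that takes a self-join in SQL reads as a single rule in a Datalog-style language. The sketch below uses hypothetical Parent and Grandparent predicates; the corresponding SQL would join the parent table with itself and spell out the join condition.

# A hedged sketch with hypothetical predicates: g is a grandparent of c
# if g is a parent of some b who is in turn a parent of c.
Grandparent(grandparent: g, grandchild: c) :-
  Parent(parent: g, child: b),
  Parent(parent: b, child: c);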

How does Logica work?

Logica compiles the logic program into a SQL expression, so it can be executed on BigQuery, a state-of-the-art SQL engine.

Among database theoreticians, Datalog and SQL are known to be equivalent, and indeed the conversion from Datalog to SQL and back is often straightforward. However, there are a few nuances, for example how to treat disjunction and negation. In Logica we tried to make choices that keep the structure of the resulting SQL as easy to understand as possible, thus empowering users to write programs that execute efficiently.
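
As a hedged illustration of the flavor of this mapping, using hypothetical Employee and Guest predicates: giving a predicate two rules expresses disjunction, which a Datalog-to-SQL translation typically renders as a UNION of two SELECT statements, while the conjunction of conditions inside a single rule corresponds to joins and WHERE clauses.

# Hedged sketch, hypothetical predicates: two rules for Visitor express "or".
Visitor(person:) :- Employee(person:);
Visitor(person:) :- Guest(person:);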

Why is it called Logica?

Logica stands for Logic with aggregation.
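
Aggregation is built into the syntax of predicate heads. As a hedged sketch, assuming a hypothetical Employee predicate with a department column, the rule below counts employees per department using the "value? += expression" aggregating syntax, the same mechanism as the mentions? += 1 column in the news example further down.

# Hedged sketch over a hypothetical Employee predicate:
# num_employees? += 1 adds 1 for every employee in each department.
NumEmployees(department:, num_employees? += 1) distinct :-
  Employee(department:);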

How to learn?

Learn the basics of Logica with the CoLab tutorial located in the tutorial folder. See examples of using Logica in the examples folder.

The tutorial and examples show how to access Logica from CoLab. You can also install the Logica command-line tool.

Prerequisites

To run Logica programs on BigQuery you will need a Google Cloud Project. Once you have a project, you can run Logica programs in CoLab by providing your project id.

To run Logica locally you need Python3.

To run Logica predicates from the command line you will need bq, the BigQuery command-line tool. To get it, install the Google Cloud SDK.

Installation

A Google Cloud Project is the only thing you need to run Logica in CoLab; see the Hello World example.

You can install the Logica command with pip as follows.

# Install.
python3 -m pip install logica
# Run:
# To see usage message.
python3 -m logica
# To print SQL for HelloWorld program.
python3 -m logica - print Greet <<<'Greet(greeting: "Hello world!")'

If your PATH includes Python's bin folder then you will also be able to run it simply as

logica - print Greet <<<'Greet(greeting: "Hello world!")'

Alternatively, you can clone GitHub repository:

git clone https://github.com/evgskv/logica
cd logica
./logica - print Greet <<<'Greet(greeting: "Hello world!")'

Code samples

Here are a couple of examples of what Logica code looks like.

Prime numbers

Find prime numbers less than 30.

Program primes.l:

# Define natural numbers from 1 to 29.
N(x) :- x in Range(30);
# Define primes.
Prime(prime: x) :-
  N(x),
  x > 1,
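  # Negation: there is no other y > 1 that divides x.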
  ~(
    N(y),
    y > 1,
    y != x,
    x % y == 0
  );

Running primes.l

$ logica primes.l run Prime
+-------+
| prime |
+-------+
|     2 |
|     3 |
|     5 |
|     7 |
|    11 |
|    13 |
|    17 |
|    19 |
|    23 |
|    29 |
+-------+

News mentions

Who was mentioned in the news the most in 2020? Let's query the GDELT Project dataset.

Program mentions.l:

@OrderBy(Mentions, "mentions desc");
@Limit(Mentions, 10);
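# Count, for each person, the 2020 GDELT Global Knowledge Graph records mentioning them.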
Mentions(person:, mentions? += 1) distinct :-
  gdelt-bq.gdeltv2.gkg(persons:, date:),
  Substr(ToString(date), 0, 4) == "2020",
  the_persons == Split(persons, ";"),
  person in the_persons;

Running mentions.l

$ logica mentions.l run Mentions
+----------------+----------+
|     person     | mentions |
+----------------+----------+
| donald trump   |  3624228 |
| joe biden      |  1591320 |
| los angeles    |  1221998 |
| george floyd   |   923472 |
| boris johnson  |   845955 |
| barack obama   |   541672 |
| vladimir putin |   486428 |
| bernie sanders |   409224 |
| andrew cuomo   |   375594 |
| nancy pelosi   |   375373 |
+----------------+----------+

Note that the city of Los Angeles appears in this table due to a known misclassification issue in the GDELT data analysis.

Feedback

Feel free to create GitHub issues for bugs and feature requests.

Your questions and comments are welcome in our GitHub discussions!


Unless otherwise noted, the Logica source files are distributed under the Apache 2.0 license found in the LICENSE file.

This is not an officially supported Google product.
