SatExtractor

Build, deploy and extract satellite public constellations with one command line.
Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Contributing
  5. License

About The Project

  • tldr: SatExtractor gets all revisits in a date range, for a given GeoJSON region, from any public satellite constellation and stores them in a cloud-friendly format.

The sheer volume of satellite imagery makes it difficult to create datasets for training models quickly and reliably. Existing methods for extracting satellite images are slow to process and impose user quotas that restrict access.

Therefore, we created SatExtractor, an open-source extraction tool that performs worldwide dataset extractions using serverless providers such as Google Cloud Platform or AWS, built on an existing common standard: STAC.

The tool scales horizontally as needed, extracting revisits and storing them in Zarr format so they can be easily consumed by deep learning models.

It is fully configurable using Hydra.

(back to top)

Getting Started

SatExtractor needs a cloud provider to work. Before you start using it, you'll need to create and configure a cloud provider account.

We provide an implementation for Google Cloud, but SatExtractor is designed to be easily extended to other providers.

Structure

The package follows a modular, configurable approach. It is essentially a pipeline of six steps, each implemented as a separate module.

  • Builder: contains the logic to build the container that will run the extraction.

    More info: SatExtractor is based on a Docker container. The Dockerfile in the root directory builds the core package; a reference to the provider-specific extraction logic must be added to it explicitly (see the gcp example in the providers/gcp directory).

    This is done by setting the ENV PROVIDER variable to point to the provider directory. In the default Dockerfile it is set to gcp: ENV PROVIDER providers/gcp.

  • Stac: converts a public constellation to the STAC standard.
    More info: if the original constellation is not already in the STAC standard, it must be converted. To do so, you have to implement a constellation-specific STAC converter; Sentinel 2 and Landsat 7/8 examples can be found in src/satextractor/stac. The function that is actually called to perform the conversion to STAC is set in the stac Hydra config file (conf/stac/gcp.yaml); see the configuration sketch after this list.
  • Tiler: creates tiles of the given region to perform the extraction.
    More info: the Tiler splits the region into UTM tiles using the SentinelHub splitter, and there will be one extraction task per tile. The tiler config can be found in conf/tiler/utm.yaml, where the tile size can be specified (also sketched after this list). Note that these tiles are not the actual patches that are later stored in your cloud provider; they are just the unit from which the (smaller) patches are extracted.
  • Scheduler: decides how those tiles are scheduled by creating extraction tasks.

    More info: the Scheduler takes the resulting tiles from the Tiler and creates the actual patches (also called tiles) to be extracted.

    For example, if the Tiler split the region into 10000x10000 tiles, the scheduler can be set to extract smaller patches of, say, 1000x1000 from each tile. The scheduler also calculates the intersection between the patches and the constellation's STAC assets. At the end, you'll have an object called ExtractionTask with the information needed to extract one revisit, one band and one tile split into multiple patches. This ExtractionTask is sent to the cloud provider to perform the actual extraction.

    The scheduler config can be found in conf/scheduler/utm.yaml.

  • Preparer: prepares the file structure in cloud storage.

    More info: the Preparer creates the cloud file structure, i.e. the Zarr groups and arrays needed to later store the extracted patches.

    The gcp preparer config can be found in conf/preparer/gcp.yaml.

  • Deployer: deploys the extraction tasks created by the scheduler to perform the extraction.
    More info: the Deployer sends one message per ExtractionTask to the cloud provider to perform the actual extraction. It works by publishing messages to a Pub/Sub queue that the extraction service is subscribed to. When a new message (ExtractionTask) arrives, it is automatically processed on the cloud with autoscaling. The gcp deployer config can be found in conf/deployer/gcp.yaml.
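
To make the Hydra wiring more concrete, here is a minimal sketch of two of the config files mentioned above. The _target_ key is standard Hydra, and the stac target path is taken from the project's own logs; the tiler key name is an assumption, so check the shipped files for the real schema.

    # conf/stac/gcp.yaml (sketch) -- selects the function that performs the STAC conversion.
    # The target below appears in the project's logs; other keys may exist.
    _target_: satextractor.stac.gcp_region_to_item_collection

    # conf/tiler/utm.yaml (sketch) -- controls the size of the UTM tiles.
    # "bbox_size" is an assumed key name, not necessarily the real one.
    bbox_size: 10000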

All the steps are optional, and the user decides which to run in the main config file; a sketch of how this might look follows below.
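
For example, the task selection in the main config might look like the sketch below. The step identifiers are guesses derived from the module names above (only the stac step name is confirmed by the project's logs), so check the shipped config.yaml for the real ones.

    # Sketch only: step identifiers are assumptions derived from the module names.
    tasks:
      - build      # Builder: build the extraction container (usually run only once)
      - stac       # Stac: convert the constellation metadata to STAC
      - tile       # Tiler: split the region into UTM tiles
      - schedule   # Scheduler: create the ExtractionTasks
      - prepare    # Preparer: create the Zarr groups/arrays in cloud storage
      - deploy     # Deployer: publish the ExtractionTasks to the Pub/Sub queue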

Prerequisites

To run SatExtractor, we recommend using a virtual environment. A cloud provider user should already have been created and configured.

Installation

  1. Clone the repo
    git clone https://github.com/FrontierDevelopmentLab/sat-extractor
  2. Install python packages
    pip install .

(back to top)

Usage

🔴 🔴 🔴

- WARNING:
Running SatExtractor will use your billable cloud provider services.
We strongly recommend testing it with a small region to check that everything works as expected.
Make sure you run all your cloud provider services in the same region to avoid extra costs.

🔴 🔴 🔴

Once a cloud provider user is set up and the package is installed, you'll need to grab the GeoJSON region you want (you can get it from the super-cool tool geojson.io) and change the config files.

  1. Save the region as .geojson and store it in the outputs folder (you can change your output dir in the config.yaml)
  2. Open the config.yaml and you'll see something like this:

[screenshot of config.yaml]

The important thing here is to set the dataset_name, define the start_date and end_date for your revisits, choose your constellations, and select the tasks to be run (you'll probably want to run the build task only once and then comment it out).
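
Since the screenshot is not reproduced here, the sketch below gives a rough idea of what those fields might look like. The key names follow the terms used in this README (dataset_name, start_date, end_date, constellations, tasks); the exact schema, value formats and constellation identifiers are assumptions, so compare against the config.yaml shipped in the repo.

    # Illustrative sketch only -- not the authoritative schema.
    dataset_name: my_region                  # name of the extraction archive
    start_date: 2020-01-01                   # first date of the revisit range
    end_date: 2020-12-31                     # last date of the revisit range
    constellations: [sentinel-2, landsat-8]  # assumed identifiers
    tasks: [build, stac, tile, schedule, prepare, deploy]  # as sketched in the Structure section

After the first successful run you would typically remove or comment out the build entry, as noted above.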

Important: token.json contains the credentials needed to access your cloud provider. In this example it contains the GCP credentials. You'll need to provide it.

  1. Open the cloud provider .yaml config and add your account info there, as in the default provided file. (Optional) You can choose different configurations by changing the module configs: builder, stac, tiler, scheduler, preparer, etc. There you can change things like patch_size and chunk_size (see the sketch after this list).

  2. Run python src/satextractor/cli.py and enjoy!
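
For orientation, the sketch below shows the kind of values step 1 refers to: cloud account details plus a couple of the module-level knobs mentioned above. All key names and their placement are assumptions made for illustration; the default files shipped under conf/ are authoritative.

    # Hypothetical sketch -- key names are assumptions, not the actual schema.
    # Cloud account info (the provider .yaml referred to in step 1):
    project: my-gcp-project          # assumed: your GCP project id
    bucket: my-extraction-bucket     # assumed: where the Zarr archives are written
    credentials: token.json          # the credentials file mentioned above

    # Module-level knobs mentioned above (their exact files and keys may differ):
    scheduler:
      patch_size: 1000               # size of the stored patches
    preparer:
      chunk_size: 1000               # chunking of the stored Zarr arrays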

(back to top)

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the BSD 2-Clause License. See LICENSE.txt for more information.

(back to top)

Acknowledgments

This work is the result of the 2021 ESA Frontier Development Lab World Food Embeddings team. We are grateful to all organisers, mentors and sponsors for providing us with this opportunity. We thank Google Cloud for providing the computing and storage resources to complete this work.

Comments
  • Dockerfile path not found

    In the gcp builder, the Dockerfile path is set to dockerfile_path = Path(__file__).parents[3], which returns '/home/fran/miniconda3/envs/sat-extractor/lib/python3.9'.

    It should be changed to pass the path as a parameter.

    bug 
    opened by frandorr 3
  • Loosen version locking

    I think it might be necessary to loosen the dependency version locking in setup.py a bit. Quite difficult to e.g. pip install prefect[google] sat-extractor because of the conflicts.

    opened by carderne 3
  • DLQ, Pyarrow, backoff

    Without pyarrow installed, I got the following error:

    2021-11-05 16:10:20.389 | INFO     | __main__:stac:33 - using satextractor.stac.gcp_region_to_item_collection stac creator.
    Error executing job with overrides: []
    Traceback (most recent call last):
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
        return _target_(*args, **kwargs)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 47, in gcp_region_to_item_collection
        df = get_sentinel_2_assets_df(client, region, start_date, end_date)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 168, in get_sentinel_2_assets_df
        dfs.append(query_job.to_dataframe())
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/job/query.py", line 1644, in to_dataframe
        return query_result.to_dataframe(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1938, in to_dataframe
        record_batch = self.to_arrow(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1713, in to_arrow
        raise ValueError(_NO_PYARROW_ERROR)
    ValueError: The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "sat-extractor/src/satextractor/cli.py", line 185, in main
        stac(cfg)
      File "sat-extractor/src/satextractor/cli.py", line 44, in stac
        item_collection = hydra.utils.call(
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 180, in instantiate
        return instantiate_node(config, *args, recursive=_recursive_, convert=_convert_)
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 249, in instantiate_node
        return _call_target(_target_, *args, **kwargs)
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 64, in _call_target
        raise type(e)(
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
        return _target_(*args, **kwargs)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 47, in gcp_region_to_item_collection
        df = get_sentinel_2_assets_df(client, region, start_date, end_date)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 168, in get_sentinel_2_assets_df
        dfs.append(query_job.to_dataframe())
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/job/query.py", line 1644, in to_dataframe
        return query_result.to_dataframe(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1938, in to_dataframe
        record_batch = self.to_arrow(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1713, in to_arrow
        raise ValueError(_NO_PYARROW_ERROR)
    ValueError: Error instantiating 'satextractor.stac.stac.gcp_region_to_item_collection' : The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.
    
    
    opened by carderne 3
  • Fix lazy scheduler

    This is much smaller than it looks, basically just:

    1. Add back support for passing item_collection as an already-loaded ItemCollection.
    2. Rename item to it on L132 to avoid shadowing item from the main loop.
    opened by carderne 2
  • satextractor.schedler causing circular import

    The first line of src/satextractor/scheduler/__init__.py imports gcp_scheduler: https://github.com/FrontierDevelopmentLab/sat-extractor/blob/d50713143c12c62b56927b526826fedfb52e77b4/src/satextractor/scheduler/__init__.py#L1-L2

    Which itself imports from the same __init__.py again, causing a circular loop: https://github.com/FrontierDevelopmentLab/sat-extractor/blob/d50713143c12c62b56927b526826fedfb52e77b4/src/satextractor/scheduler/gcp_scheduler.py#L8

    opened by carderne 2
  • Get DLQ working

    @frandorr @Lkruitwagen

    From here, the service account for PubSub looks like this:

    PUBSUB_SERVICE_ACCOUNT="service-${project-number}@gcp-sa-pubsub.iam.gserviceaccount.com"
    

    i.e. instead of using the service account from the token.json.

    From CLI:

    PROJ_NUMBER=$(gcloud projects list \
    --filter="$(gcloud config get-value project)" \
    --format="value(PROJECT_NUMBER)")
    
    PUBSUB_SERVICE_ACCOUNT="service-${PROJ_NUMBER}@gcp-sa-pubsub.iam.gserviceaccount.com"
    

    And then bind the account as already done:

    gcloud pubsub topics add-iam-policy-binding "$DLQ_TOPIC" \
      --member="serviceAccount:$PUBSUB_SERVICE_ACCOUNT" \
      --role=roles/pubsub.publisher
    
    gcloud pubsub subscriptions add-iam-policy-binding "$MAIN_SUBSCRIPTION" \
      --member="serviceAccount:$PUBSUB_SERVICE_ACCOUNT" \
      --role=roles/pubsub.subscriber
    
    opened by carderne 2
  • GCP dead-letter-queue not properly being created

    When the Pub/Sub Cloud Run subscription is created, it also creates a DLQ, but it doesn't assign the correct roles and permissions and doesn't create the topic for the DLQ: [screenshot]

    As the DLQ doesn't exist, the extraction task messages that fail will loop forever in the main queue, restarting the Cloud Run service until the messages are manually purged.

    We should add these permissions and create the DLQ topic automatically to avoid an infinite loop.

    bug 
    opened by frandorr 2
  • mask and percentiles

    These don't seem to be used for anything?

    https://github.com/FrontierDevelopmentLab/sat-extractor/blob/7a0821360ffdab8403563ca651b5bd43ecca3dc4/src/satextractor/preparer/preparer.py#L35-L51

    opened by carderne 1
  • Succeeds but get error

    @frandorr just sharing this here from last week

    [2021-10-29 13:17:48,746][grpc._plugin_wrapping][ERROR] - AuthMetadataPluginCallback "<google.auth.transport.grpc.AuthMetadataPlugin object at 0x7fc4d86812b0>" raised exception!
    Traceback (most recent call last):
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/grpc/_plugin_wrapping.py", line 89, in __call__
        self._metadata_plugin(
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/transport/grpc.py", line 101, in __call__
        callback(self._get_authorization_headers(context), None)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/transport/grpc.py", line 87, in _get_authorization_headers
        self._credentials.before_request(
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/credentials.py", line 134, in before_request
        self.apply(headers)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/credentials.py", line 110, in apply
        _helpers.from_bytes(token or self.token)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/_helpers.py", line 129, in from_bytes
        raise ValueError("{0!r} could not be converted to unicode".format(value))
    ValueError: None could not be converted to unicode
    
    opened by carderne 1
  • Only works with pip install -e .

    This:

    cd sat-extractor
    pyenv global 3.9.7
    mkvirtualenv test
    pip install .
    python ./src/satextractor/cli.py
    

    Fails with:

    ImportError: Encountered error: `No module named 'satextractor.builder'` when
    loading module 'satextractor.builder.gcp_builder.build_gcp'
    

    However, installing with pip install -e . (what I did initially, which is why I didn't notice this) works fine.

    Maybe because when using a non-editable install, cli.py gets confused about whether it should be looking for modules in its directory or in the somewhere-else/site-packages/satextractor directory...

    opened by carderne 1
  • Feat/append data

    • add overwrite to main config
    • create a hash string for the config spec (for unique date-range & constellation combination)
    • if the extraction archive already exists, resize any existing data and masks for the union of the new timeseries. Overwrite the timeseries with the union prior to extraction.
    opened by Lkruitwagen 0
  • For consideration: Bring main extract function into package

    @Lkruitwagen @frandorr

    Moving all the extract_patches logic into the main package under satextractor.extractor, leaving only the HTTP and BQ stuff in the Flask app.

    Seems cleaner, and gives the option to do the extractions in-process if wanted! Could make it easier to try out sat-extractor on a local machine with a local output directory, no need to spin up PubSub, CloudRun etc.

    opened by carderne 0
  • Store bands info

    Current implementation doesn't store the bands for each constellation. It would be nice to have that info stored. Some ideas:

    • Store a simple metadata json at constellation level (easiest)
    • At the end of each extraction create a STAC catalog that contains metadata info (maybe better but would take longer to implement)
    • Store the info at array level, something like xarray coordinates.
    enhancement 
    opened by frandorr 0
Releases (v0.3.3)
  • v0.3.3(Dec 7, 2021)

    What's Changed

    • fix COG reader by @Lkruitwagen in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/27
    • Remove download for jp2. Now using same function for geotiff and jp2 with rio. Remove gdal by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/28

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.2...v0.3.3

    Source code(tar.gz)
    Source code(zip)
  • v0.3.2(Dec 7, 2021)

    What's Changed

    • Remove COG downloader and use same as jp2k because it was buggy by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/26

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.1...v0.3.2

    Source code(tar.gz)
    Source code(zip)
  • v0.3.1(Dec 6, 2021)

    What's Changed

    • Fix Dockerfile path regression by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/24
    • Fix rescaling bug by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/25

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.0...v0.3.1

    Source code(tar.gz)
    Source code(zip)
  • v0.3.0(Dec 2, 2021)

    What's Changed

    • Make tile IDs globally unique by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/17
    • Improve Band common_names and Tile properties by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/21
    • Add task id to vsimem to avoid multiple tasks using the same in-memory file by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/22

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.1.1...v0.3.0

    Source code(tar.gz)
    Source code(zip)
  • v0.1.1(Nov 18, 2021)

    What's Changed

    • Feat/typos+readme by @Lkruitwagen in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/7
    • README clarifications, add pyarrow, specify platform in gcloud run by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/8
    • DLQ, Pyarrow, backoff by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/11
    • Fix constellations bug. by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/12
    • Compatible deps; refactor build_gcp; explicit Dockerfile; ItemCollection as object by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/14
    • Deployer returns job_id so callers can track monitor tables by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/15
    • Fix jp2 lossy compression bug

    New Contributors

    • @Lkruitwagen made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/7

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.1.0...v0.1.1

    Source code(tar.gz)
    Source code(zip)
  • v0.1.0(Oct 28, 2021)

    What's Changed

    • Small README and config improvements by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/2
    • small README modif by @rramosp in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/3
    • Add init file fixes #4 by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/5

    New Contributors

    • @carderne made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/2
    • @rramosp made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/3
    • @frandorr made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/5

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/commits/v0.1.0

    Source code(tar.gz)
    Source code(zip)
Owner
Frontier Development Lab. For more projects, see: https://gitlab.com/frontierdevelopmentlab