Threat Intel Platform for T-POTs

Overview

GreedyBear


The goal of the project is to extract data about the attacks detected by a T-Pot, or a cluster of them, and to generate feeds that can be used to prevent and detect attacks.

Official announcement here.

Feeds

Public feeds

There are public feeds provided by The Honeynet Project at greedybear.honeynet.org. Example

Please do not make too many requests to extract the feeds, or you will be banned.

If you want to stay updated regularly, please download the feeds only once every 10 minutes (this is the interval between internal updates).

Available feeds

The feeds are reachable through the following URL:

    https://<greedybear_site>/api/feeds/<feed_type>/<attack_type>/<age>.<format>

The available feed_type values are:

  • log4j: attacks detected by the Log4Pot honeypot
  • cowrie: attacks detected by the Cowrie honeypot
  • all: get all types at once

The available attack_type values are:

  • scanner: IP addresses captured by the honeypots while performing attacks
  • payload_request: IP addresses and domains extracted from payloads that would have been executed if a specific attack had been successful
  • all: get all types at once

The available age values are:

  • recent: the most recent IOCs, seen during the last 3 days
  • persistent: IOCs that were seen regularly by the honeypots. This feed starts empty when no prior data has been collected and grows over time.

The available format values are:

  • txt: plain text (just one line for each IOC)
  • csv: CSV-like file (just one line for each IOC)
  • json: JSON file with additional information regarding the IOCs
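
As an illustration, here is a minimal sketch of a feed consumer written in Python. The requests library, the specific feed chosen, and the parsing logic are assumptions made for this example, not part of GreedyBear itself:

    # Minimal sketch of a GreedyBear feed consumer (hypothetical example).
    # URL pattern, per the documentation above:
    #   https://<greedybear_site>/api/feeds/<feed_type>/<attack_type>/<age>.<format>
    import requests

    FEED_URL = "https://greedybear.honeynet.org/api/feeds/log4j/scanner/recent.txt"

    def fetch_recent_log4j_scanners() -> list:
        """Download the recent Log4Pot scanner feed (txt format: one IOC per line)."""
        response = requests.get(FEED_URL, timeout=30)
        response.raise_for_status()
        return [line.strip() for line in response.text.splitlines() if line.strip()]

    if __name__ == "__main__":
        iocs = fetch_recent_log4j_scanners()
        print(f"Downloaded {len(iocs)} IOCs")

Remember the note above: against the public instance, poll at most once every 10 minutes.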

Run GreedyBear on your environment

The tool was created not only to provide feeds from The Honeynet Project's cluster of T-Pots.

If you manage one or more T-Pots of your own, you can get the code of this application and run GreedyBear in your own environment. In this way, you are able to provide new feeds of your own.

Comments
  • Added Basic Testcases

    Added Basic Testcases

    Description

    Added Testcases for Views and Models

    Related issues

    Fixes #21

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [ ] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by uzaxirr 11
  • Create authenticated enrichment service

    Create authenticated enrichment service

    We could provide a service that can be queried via an API key. In this way, it would be possible to check whether an IOC is in GreedyBear's database without having to download and manage all of its feeds.

    It would be a simple enrichment service.

    We would need:

    • a basic GUI (#11) to allow people to register and get an API key.
    • limit API usage to avoid abuse.
    • allow different kinds of API usage limits
    • create a new API endpoint (#17)
    • integrate it in IntelOwl (https://github.com/intelowlproject/IntelOwl/issues/817)
    opened by mlodic 9
  • Create feeds for other honeypot types

    Create feeds for other honeypot types

    GreedyBear works by extracting the data from the T-Pot logs generated by the honeypots.

    For the first alpha release we integrated only Log4Pot + Cowrie.

    We should also integrate all the other honeypots available in T-Pot. Glutton should be the first.

    opened by mlodic 8
  • Fixes #17: Added API for Enrichment

    Fixes #17: Added API for Enrichment

    Description

    Added an enrichment endpoint to get the details of an observable by its name. Endpoint: /api/enrichment?query=<observable_name> (a hedged usage sketch is appended after this item).

    Please ignore the unrelated changes in settings.py regarding env vars. They were made because of #23; I'll revert them when the PR is good to go.

    Added fake data to the DB through the admin panel for testing purposes

    Related issues

    Fixes and Closes #17

    Type of change

    Please delete options that are not relevant.

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Screenshots

    API Response

    For a record that exists in the DB

    Screenshot from 2022-01-05 21-41-55

    For a record that does not exist in the DB

    Screenshot from 2022-01-05 21-42-07

    Details of the searched observable in DB

    Screenshot from 2022-01-02 23-24-40

    All Records in DB

    Screenshot from 2022-01-02 23-24-27

    opened by uzaxirr 7
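
    The endpoint described in this PR is still under review, so the following is only a hedged sketch of how a client might query it. The /api/enrichment?query= path comes from the PR description above, while the authentication header and the response shape are assumptions:

        # Hypothetical client for the proposed enrichment endpoint.
        # The endpoint path is taken from the PR above; the "Token" auth scheme
        # and the API-key handling are assumptions, not the project's actual API.
        import requests

        GREEDYBEAR_URL = "https://greedybear.honeynet.org"  # or your own instance
        API_KEY = "your-api-key"  # hypothetical key, issued once registration (#11) exists

        def lookup_observable(observable: str) -> dict:
            """Ask GreedyBear whether an IOC is present in its database."""
            response = requests.get(
                f"{GREEDYBEAR_URL}/api/enrichment",
                params={"query": observable},
                headers={"Authorization": f"Token {API_KEY}"},  # assumed auth scheme
                timeout=10,
            )
            response.raise_for_status()
            return response.json()

        print(lookup_observable("1.2.3.4"))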
  • Configured Read the Docs

    Configured Read the Docs

    Description

    Configured Read the Docs

    Changes I have done :

    • added the .readthedocs.yaml file
    • made some changes to docs/source/conf.py
    • added the documentation link in the README

    Things to complete :

    I created only the empty .md files in docs but haven't added any documentation to them; the OpenAPI and ReDoc docs still need to be added.

    Related issues

    This PR partially solves issue #27

    Type of change

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [x] The tests gave 0 errors.
    • [x] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by yaswanthsaivendra 4
  • Added elasticsearch container for development

    Added elasticsearch container for development

    Description

    Added elasticsearch container for development

    Related issues

    closes #23

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by devmrfitz 4
  • Elasticsearch installation error

    Elasticsearch installation error

    I'm encountering an error while setting up GreedyBear locally, after running the docker-compose -p greedybear up command. It originates from settings.py, where the Elasticsearch client is initialized. The ELASTIC_ENDPOINT variable in my env file is empty. Screenshot from 2022-01-02 19-51-22

    opened by uzaxirr 4
  • Updated feeds view to make use of DRF and added durin authentication

    Updated feeds view to make use of DRF and added durin authentication

    Description

    • Made changes to feeds View to make use of DRF
    • Added token authentication using django-rest-durin.

    Related issues

    This PR solves #26 issue.

    Type of change

    • [ ] New feature (non-breaking change which adds functionality).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)
    opened by yaswanthsaivendra 3
  • Rate limiting for admin and API

    Rate limiting for admin and API

    Description

    Rate limiting for admin and API

    Related issues

    #31

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [X] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by devmrfitz 2
  • Add CONTRIBUTING.md file

    Add CONTRIBUTING.md file

    Can we please add, or refer to, a URL containing the guidelines for future contributors? I see nothing mentioned about it in the README or docs for this repo.

    opened by ManishShah120 2
  • Integrate GreedyBear inside T-Pot installation

    Integrate GreedyBear inside T-Pot installation

    This would require that all of these issues were solved first:

    • #11 , #12 , #10 , #21 , #27

    Plus, we would need to work with the T-Pot team to properly integrate the project there. The goal is to reduce the complexity of the overall application to allow for an easy integration.

    opened by mlodic 2
  • Allow to do customized feeds lookups

    Allow to do customized feeds lookups

    We could add more ways to extract data feeds from GreedyBear, other than the free "recent" and "persistent" ones.

    These new ways must be protected with authentication to avoid abuse.

    We could give the users the chance to:

    • download the data extracted in the last X hours (customization of "recent")
    • download the data that was seen more than X times in the last X days (customization of "persistent")
    opened by mlodic 0
  • Filter IP addresses from known scanners

    Filter IP addresses from known scanners

    We should periodically download this batch of data: https://raw.githubusercontent.com/stamparm/maltrail/master/trails/static/mass_scanner.txt and add those IPs to whitelists to reduce the number of false positives (a minimal fetch sketch is appended after this item).

    opened by mlodic 0
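
    A minimal sketch of what this periodic task could look like follows. The URL is the one referenced above; the function name and the plain-set return value are assumptions, not the project's actual implementation:

        # Hedged sketch: fetch the maltrail mass-scanner list and build a whitelist set.
        import requests

        MASS_SCANNERS_URL = (
            "https://raw.githubusercontent.com/stamparm/maltrail/master/"
            "trails/static/mass_scanner.txt"
        )

        def fetch_mass_scanner_whitelist() -> set:
            """Return the set of known mass-scanner IPs to exclude from the feeds."""
            response = requests.get(MASS_SCANNERS_URL, timeout=30)
            response.raise_for_status()
            whitelist = set()
            for line in response.text.splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue  # skip blank lines and comments
                whitelist.add(line.split()[0])  # entries may carry trailing comments
            return whitelist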
  • Add the chance to select which honeypot we want to extract data from

    Add the chance to select which honeypot we want to extract data from

    Right now there is no way to do that: GreedyBear automatically extracts data from all the configured honeypots.

    We should allow the app administrator to enable/disable honeypot extraction from the Django admin. That way we can also filter out logs which state that the honeypot is not running (a hypothetical model sketch is appended after this item).

    opened by mlodic 0
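
    Purely as an illustration of this proposal, a per-honeypot toggle editable from the Django admin might be modeled as follows. The model, field names, and admin options are invented for the sketch and are not GreedyBear's actual schema:

        # Hypothetical models.py / admin.py snippet for an enable/disable toggle.
        from django.contrib import admin
        from django.db import models

        class Honeypot(models.Model):
            """One row per honeypot type (e.g. 'log4j', 'cowrie'); invented for this sketch."""
            name = models.CharField(max_length=64, unique=True)
            active = models.BooleanField(default=True)  # extraction runs only when True

            def __str__(self):
                return self.name

        @admin.register(Honeypot)
        class HoneypotAdmin(admin.ModelAdmin):
            list_display = ("name", "active")
            list_editable = ("active",)  # toggle extraction directly from the changelist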
Releases (v1.0.2)
Owner
The Honeynet Project