Threat Intel Platform for T-POTs

Overview

GreedyBear


The goal of the project is to extract data about the attacks detected by a T-POT (or a cluster of them) and to generate feeds that can be used to prevent and detect attacks.

Official announcement here.

Feeds

Public feeds

There are public feeds provided by The Honeynet Project at greedybear.honeynet.org. Example

Please do not perform too many requests to extract feeds or you will be banned.

If you want to be updated regularly, please download the feeds only once every 10 minutes (this is the time between each internal update).

Available feeds

The feeds are reachable through the following URL:

https://<greedybear_site>/api/feeds/<feed_type>/<attack_type>/<age>.<format>

The available feed_type values are:

  • log4j: attacks detected by the Log4Pot honeypot
  • cowrie: attacks detected by the Cowrie honeypot
  • all: get all types at once

The available attack_type values are:

  • scanner: IP addresses captured by the honeypots while performing attacks
  • payload_request: IP addresses and domains extracted from payloads that would have been executed after a specific attack had been successful
  • all: get all types at once

The available age values are:

  • recent: IOCs seen in the last 3 days
  • persistent: IOCs that were seen regularly by the honeypots. This feed starts empty when no prior data has been collected and grows over time.

The available format values are:

  • txt: plain text (one line per IOC)
  • csv: CSV-like file (one line per IOC)
  • json: JSON file with additional information regarding the IOCs
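
For example, here is a minimal Python sketch (illustrative only, not part of the official documentation) that builds one of these URLs and downloads a feed from the public instance, respecting the 10-minute update interval mentioned above:

    # Minimal sketch: download a GreedyBear feed from the public instance.
    # Follows the /api/feeds/<feed_type>/<attack_type>/<age>.<format> pattern above.
    import requests

    BASE_URL = "https://greedybear.honeynet.org/api/feeds"

    def download_feed(feed_type="all", attack_type="all", age="recent", fmt="txt"):
        """Fetch a single feed; call this at most once every 10 minutes."""
        url = f"{BASE_URL}/{feed_type}/{attack_type}/{age}.{fmt}"
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return response.text

    # In the txt format, each line is one IOC.
    for ioc in download_feed("cowrie", "scanner", "recent", "txt").splitlines():
        print(ioc)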

Run GreedyBear in your environment

The tool was created not only to provide feeds from The Honeynet Project's cluster of T-POTs.

If you manage one or more T-POTs of your own, you can get the code of this application and run GreedyBear in your own environment. In this way, you can provide new feeds of your own.

Comments
  • Added Basic Testcases

    Description

    Added Testcases for Views and Models

    Related issues

    Fixes #21

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [ ] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by uzaxirr 11
  • Create authenticated enrichment service

    We could provide a service that could be queried via API key. In this way, it would be possible to check whether an IOC is in the GreedyBear database without having to download and manage all the feeds from GreedyBear.

    It would be a simple enrichment service.

    We would need:

    • a basic GUI (#11) to allow people to register and get an API key
    • limit API usage to avoid abuse
    • allow different kinds of API usage limits
    • create a new API endpoint (#17)
    • integrate it in IntelOwl (https://github.com/intelowlproject/IntelOwl/issues/817)
    opened by mlodic 9
  • Create feeds for other honeypot types

    GreedyBear works by extracting data from the T-Pot logs generated by the honeypots.

    As a first alpha release we integrated just Log4Pot + Cowrie.

    We should also integrate all the other honeypots available in T-Pot. Glutton should be the first.

    opened by mlodic 8
  • Fixes #17: Added API for Enrichment

    Description

    Added an enrichment endpoint to get details of an observable by its name. Endpoint: /api/enrichment?query=<observable_name>

    Please ignore the vague changes in settings.py regarding env vars; I did that because of #23 and I'll revert them when my PR is good to go.

    Added fake data to the DB through the admin panel for testing purposes.
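
    A rough sketch (illustrative only, not part of this PR) of how the endpoint could be queried from Python, assuming a locally running instance:

        # Rough sketch: query the enrichment endpoint described above.
        # The base URL is an assumption for a local development instance.
        import requests

        GREEDYBEAR_URL = "http://localhost"  # assumption: local dev instance

        def enrich(observable_name):
            """Ask GreedyBear whether an observable (IP or domain) is known."""
            response = requests.get(
                f"{GREEDYBEAR_URL}/api/enrichment",
                params={"query": observable_name},
                timeout=10,
            )
            response.raise_for_status()
            return response.json()

        print(enrich("1.2.3.4"))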

    Related issues

    Fixes and Closes #17

    Type of change

    Please delete options that are not relevant.

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Screenshots

    API Response

    For a record that exists in the DB

    Screenshot from 2022-01-05 21-41-55

    For a record that does not exist in the DB

    Screenshot from 2022-01-05 21-42-07

    Details of the searched observable in DB

    Screenshot from 2022-01-02 23-24-40

    All Records in DB

    Screenshot from 2022-01-02 23-24-27

    opened by uzaxirr 7
  • Configured Read the Docs

    Description

    Configured Read the Docs

    Changes I have done:

    • added .readthedocs.yaml file
    • made some changes to docs/source/conf.py
    • added documentation link in readme

    Things to complete:

    I created only the empty md files in docs but haven't added any documentation to them; I still need to add docs for openapi and redoc.

    Related issues

    This PR partially solves issue #27

    Type of change

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [x] The tests gave 0 errors.
    • [x] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by yaswanthsaivendra 4
  • Added elasticsearch container for development

    Description

    Added elasticsearch container for development

    Related issues

    closes #23

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by devmrfitz 4
  • Elasticsearch installation error

    I'm encountering an error while setting up GreedyBear locally, after running the docker-compose -p greedybear up command. It originates from settings.py, where the Elasticsearch client is being initialized. The ELASTIC_ENDPOINT variable in my env file is empty. Screenshot from 2022-01-02 19-51-22

    opened by uzaxirr 4
  • updated feeds view to make use of DRF and added durin authentication

    Description

    • Made changes to the feeds view to make use of DRF
    • Added token authentication via django-rest-durin (see the sketch below)
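
    A minimal sketch (not the actual GreedyBear implementation) of what a DRF feeds view protected with durin token authentication could look like; the durin import is assumed from django-rest-durin, and the class and field names are illustrative only:

        # Minimal sketch: a DRF view protected with durin token authentication.
        # This is NOT the actual GreedyBear code; names are illustrative only.
        from durin.auth import TokenAuthentication  # assumed django-rest-durin class
        from rest_framework.permissions import IsAuthenticated
        from rest_framework.response import Response
        from rest_framework.views import APIView

        class FeedsView(APIView):
            authentication_classes = [TokenAuthentication]
            permission_classes = [IsAuthenticated]

            def get(self, request, feed_type, attack_type, age, format_):
                iocs = []  # the real view would query IOCs from the database here
                return Response({"iocs": iocs})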

    Related issues

    This PR solves issue #26.

    Type of change

    • [ ] New feature (non-breaking change which adds functionality).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)
    opened by yaswanthsaivendra 3
  • Rate limiting for admin and API

    Description

    Rate limiting for admin and API

    Related issues

    #31

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [X] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review
    opened by devmrfitz 2
  • Add CONTRIBUTING.md file

    Can we please add, or refer to, the URL containing the guidelines for future contributors? I see nothing mentioned about it in the readme or docs for this repo.

    opened by ManishShah120 2
  • Integrate GreedyBear inside T-Pot installation

    This would require that all of these issues were solved first:

    • #11 , #12 , #10 , #21 , #27

    Plus, we would need to work with the T-Pot team to properly integrate the project there. The goal is to reduce the complexity of the overall application to allow an easy integration.

    opened by mlodic 2
  • Allow to do customized feeds lookups

    We could add more ways to extract data feeds from GB, other than the free "recent" and "persistent" ones.

    These new ways must be protected with authentication to avoid abuse.

    We could give the users the chance to:

    • download the data extracted in the last X hours (customization of "recent")
    • download the data that was seen more than X times in the last X days (customization of "persistent")
    opened by mlodic 0
  • Filter IP addresses from known scanners

    We should periodically download this batch of data: https://raw.githubusercontent.com/stamparm/maltrail/master/trails/static/mass_scanner.txt and add those IPs to a whitelist to reduce the number of false positives, as in the sketch below.
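
    A rough, illustrative sketch of how that list could be fetched and turned into a whitelist set (function names are made up; the file format is assumed to be one entry per line, with optional # comments):

        # Rough sketch: fetch the maltrail mass-scanner list and build an IP whitelist.
        # Assumes one entry per line, with optional trailing "# comment" parts.
        import requests

        MASS_SCANNERS_URL = (
            "https://raw.githubusercontent.com/stamparm/maltrail/"
            "master/trails/static/mass_scanner.txt"
        )

        def fetch_scanner_whitelist():
            response = requests.get(MASS_SCANNERS_URL, timeout=30)
            response.raise_for_status()
            whitelist = set()
            for line in response.text.splitlines():
                line = line.split("#")[0].strip()  # drop comments and whitespace
                if line:
                    whitelist.add(line)
            return whitelist

        print(len(fetch_scanner_whitelist()), "known scanner IPs loaded")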

    opened by mlodic 0
  • Add the chance to select which honeypot we want to extract data from

    Right now there is no way to do that: GreedyBear automatically extracts data from all the configured honeypots.

    We should allow the app administrator to enable/disable honeypot extraction from the Django admin. That way, we can also filter out logs which state that the honeypot is not running.

    opened by mlodic 0
Releases: v1.0.2

Owner: The Honeynet Project