Harvis

Harvis is designed to automate your C2 Infrastructure, currently using Mythic C2.

📌 What is it?

Harvis is a Python tool that helps you create multiple hauls for a specific operation during a red team engagement. It can automatically create your C2 machine and redirector machine, set up SSL, .htaccess configuration, firewall rules, and more. Harvis also automates the operator's job of identifying burned domains/redirectors that may be caught during the operation. After identifying the burned domains, it offers the possibility of rotating the infrastructure, setting up different redirectors and attributing a different domain.

📌 How?

🔨 Harvis uses the Digital Ocean API to automate creation and deletion of droplets.

🔨 The Namecheap API is used to set DNS records for the redirectors.

🔨 APIVoid is used to constantly verify the state of the redirectors and check whether anything is blacklisted.
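
For a sense of what the first point involves, here is a minimal sketch (not Harvis's actual code) of creating a droplet through the DigitalOcean v2 API with the requests library. The droplet name, region, size and image values are placeholders.

import requests

DIGITAL_OCEAN_TOKEN = "<your-token>"  # same value you would place in config.py

def create_droplet(name, region="nyc1", size="s-1vcpu-1gb", image="ubuntu-20-04-x64"):
    """Create a droplet via the DigitalOcean v2 API and return its ID."""
    resp = requests.post(
        "https://api.digitalocean.com/v2/droplets",
        headers={"Authorization": f"Bearer {DIGITAL_OCEAN_TOKEN}"},
        json={"name": name, "region": region, "size": size, "image": image},
    )
    resp.raise_for_status()
    return resp.json()["droplet"]["id"]

droplet_id = create_droplet("harvis-redirector-short")
print(f"Created droplet {droplet_id}")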

Features

Harvis has several features to help you organize your available domains and redirector/C2 machines.

  • Namecheap Interaction - Harvis enables you to buy domains directly through the command line, as long as you have credits in your account.
  • Multiple Hauls - You can create as many hauls as you need, each one with its own specific configuration.
  • Multiple C2 Profiles by Haul - Harvis allows you to create multiple C2 profiles per haul: you could have two HTTP listeners in one of the C2s and three in the other, each of them listening on different ports.
  • Multiple Redirector Configuration - Each redirector can have a different .htaccess configuration, defined by the operator.
  • Customizable Firewall Rules - Harvis comes with default firewall rules for the redirectors and C2's, however, this feature is easily customizable.
  • Priority System - Each haul has a queue system, so a new redirector will replace a blacklisted one using the domains in the queue (see the sketch after this list).
  • Automatic Rotation - Automatically replace your droplets: Harvis identifies any blacklisted redirector and prints the results to the operator. It then allows the operator to create a temporary droplet to replace the blacklisted one. It does not configure the migration of any active agents, since how an agent is migrated/spawned to connect to the new domain tends to be a very engagement-specific decision. After creating the temporary droplet, it allows you to migrate any active agents and kill the older redirector.
  • Backup and Recovery - If the script somehow crashes, all the information is saved in the backup.py file. Restarting the script will recover your infrastructure exactly as it was.
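
To make the queue idea concrete, the sketch below shows how a per-haul queue of backup domains could drive a rotation: the burned domain is dropped and the next one in line is promoted. This is an illustration of the concept, not code taken from Harvis.

from collections import deque

# Hypothetical per-haul backup queues (extra domains beyond the first act as backups).
haul_domains = {"short": deque(["domain1.com", "domain6.com", "domain7.com"])}

def rotate_haul(haul):
    """Drop the blacklisted domain and promote the next one in the queue."""
    burned = haul_domains[haul].popleft()
    if not haul_domains[haul]:
        raise RuntimeError(f"No backup domains left for haul '{haul}'")
    replacement = haul_domains[haul][0]
    print(f"[{haul}] {burned} is blacklisted; promoting {replacement}")
    return replacement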

Installation

git clone https://github.com/thiagomayllart/Harvis/
cd Harvis
pip3 install -r requirements

Running

python3 harvis.py

Configuring API Keys

Harvis can only be used with the proper API keys from Digital Ocean, Namecheap, and APIVoid.

These API keys should be added to the config.py file on the respective lines. Don't forget to set your Namecheap username in the namecheap_username variable:

digital_ocean_token = ""
...
namecheap_key = ""
...
apivoid_key = ""
...
namecheap_username = ""
...

For more information regarding these API Keys, visit:

https://app.apivoid.com/
https://ap.www.namecheap.com/
https://cloud.digitalocean.com/

First Run

Before running the script, you should apply some modifications to the config file, which describes the configuration of your infrastructure:

  1. Modify the "names" variable.

This variable holds the name of each haul you want in your infrastructure. These names should also be used in the next variables. You can have as many hauls as you want. You can also specify which domains should already be assigned to each haul. If you don't specify the domains, you will be asked to move domains to each haul on the first run. If you don't have any available domains in your Namecheap account, you can buy one directly through Harvis. It is also possible to have more than one domain in each haul: the first one will be used by the redirector and the others will already be in the backup list for further infrastructure rotations. Example:

names = {"short":[],"long":[],"exploitation":[],"testing":[]}
names = {"short":["domain1.com"],"long":["domain2.com","domain3.com"],"exploitation":["domain4.com"],"testing":["domain5.com"]}
  2. Modify the "config_htaccess_dic" variable:

You can customize your .htaccess rules for each haul in this variable. This variable is a dictionary, so remember to add an .htaccess configuration for each haul you added previously in the "names" variable, like:

config_htaccess_dic = \
    {"short":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
""","long":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
""","exploitation":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
""","testing":"""
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/({1})/?$ [NC]
RewriteRule ^.*$ https://{2}%{REQUEST_URI} [P]
RewriteRule ^.*$ http://{3}? [L,R=302]
"""
                       }

Note the presence of the fields {1}, {2} and {3}. If you customize this variable, do not remove them. {1} holds the parameters your Mythic agent will use to communicate with the C2 (it also allows customization). {2} is the IP address of your C2. {3} is the location the redirector will redirect non-matching requests to (it also allows customization).
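
As an illustration only (not Harvis's actual code), the placeholders can be filled with plain string replacement; str.format is a poor fit here because Apache variables such as %{REQUEST_URI} contain braces of their own. The import path and the example values below are assumptions.

from config import config_htaccess_dic  # assumes the dictionary lives in config.py

template = config_htaccess_dic["short"]
htaccess = (template
            .replace("{1}", "data|index")        # {1}: agent GET|POST URI parameters
            .replace("{2}", "203.0.113.10")      # {2}: C2 droplet IP (example address)
            .replace("{3}", "www.example.com"))  # {3}: destination for non-matching requests
print(htaccess)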

  3. Modify the "agent_profiles" variable:

This variable describes the HTTP parameters that your agent will use to communicate with your C2. Mythic allows setting these parameters during the creation of the agent, so these values should match the ones you configure for the agent. The first one is the GET parameter and the other one is the POST parameter. Also, remember once again to add a configuration for each haul you created previously:

agent_profiles = {"short":{"URI":"data|index"},"long":{"URI":"q|id"}... ...
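
For reference, a complete (hypothetical) dictionary covering the four example hauls could look like the line below; the "exploitation" and "testing" values are placeholders, so use whatever GET|POST parameters you actually configure in your Mythic agents.

agent_profiles = {"short":{"URI":"data|index"},"long":{"URI":"q|id"},"exploitation":{"URI":"page|token"},"testing":{"URI":"view|session"}}
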
  4. Modify "domain_front_redirector" variable:

This variable holds the domain to which your redirectors will redirect anyone who tries to access them. Add a configuration for each haul you created previously:

domain_front_redirector = {"short":"www.example.com","long":"www.example2.com"... ... ...
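
Again for reference, a complete (hypothetical) example covering the four hauls:

domain_front_redirector = {"short":"www.example.com","long":"www.example2.com","exploitation":"www.example3.com","testing":"www.example4.com"}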

  5. Modify "c2_profiles" variable:

With the "c2_profiles" variable you can create different listener profiles for each haul you created. The format is exactly the same as the JSON you will find under Configuring a C2 Profile in Mythic. You can also have profiles other than HTTP; however, depending on the protocol used, it may be necessary to change the firewall rules on the C2 or the redirector (explained further below).

Example:


c2_profiles = {"short":[{"name":"HTTP","config":"""{
  "instances": [
  {
    "ServerHeaders": {
      "Server": "NetDNA-cache/2.2",
      "Cache-Control": "max-age=0, no-cache",
      "Pragma": "no-cache",
      "Connection": "keep-alive",
      "Content-Type": "application/javascript; charset=utf-8"
    },
    "port": 443,
    "key_path": "privkey.pem",
    "cert_path": "cert.pem",
    "debug": true
    }
  ]
}"""}],"long":[{"name":"HTTP","config":"""{
  "instances": [
  {
    "ServerHeaders": {
      "Server": "NetDNA-cache/2.2",
      "Cache-Control": "max-age=0, no-cache",
      "Pragma": "no-cache",
      "Connection": "keep-alive",
      "Content-Type": "application/javascript; charset=utf-8"
    },
    "port": 443,
    "key_path": "privkey.pem",
    "cert_path": "cert.pem",
    "debug": true
    }
  ]
}"""}] }
  6. Modify the "check_infra_state" variable:

This variable holds the interval, in seconds, between each verification of blacklisted domains via APIVoid.
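
As a rough illustration of what this interval drives (an assumption-heavy sketch, not Harvis's internal loop), the script can be pictured as polling APIVoid's Domain Blacklist API every check_infra_state seconds. The endpoint path and response field names below should be verified against APIVoid's documentation for your account.

import time
import requests

APIVOID_KEY = "<your-apivoid-key>"   # same value as apivoid_key in config.py
check_infra_state = 3600             # hypothetical value: check once per hour

def is_blacklisted(domain):
    """Ask APIVoid whether any blacklist engine currently flags the domain."""
    resp = requests.get(
        "https://endpoint.apivoid.com/domainbl/v1/pay-as-you-go/",
        params={"key": APIVOID_KEY, "host": domain},
    )
    resp.raise_for_status()
    report = resp.json().get("data", {}).get("report", {})
    return report.get("blacklists", {}).get("detections", 0) > 0

while True:
    for domain in ["domain1.com", "domain2.com"]:   # hypothetical active redirector domains
        if is_blacklisted(domain):
            print(f"[!] {domain} appears on at least one blacklist")
    time.sleep(check_infra_state)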

  7. Modify "ip_allowed_to_connect_c2":

Replace it with the IP you will be using as the proxy to connect to your Mythic C2 panel. You can use your public IP, but it is not recommended.

  8. Replace "username":

Replace it with the username you will be using during the engagement. This variable is used to tag each droplet created in Digital Ocean, making it easier to distinguish the droplets of each operator. It also prevents the tool from erasing another user's droplets (in case you are sharing the same API keys).

  9. Modify Firewall Rules (OPTIONAL):

If you want to modify the firewall rules for the C2/redirector, you will find them respectively at:

C2: C2_setup.py, function firewall_rules
Redirector: redirect_setup.py, function firewall_rules
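
If you do customize them, the rules are in essence just firewall commands applied to the freshly created droplets. The snippet below is a purely illustrative example of the kind of ruleset you might use for a redirector when running commands locally on the droplet; it is not the shipped default.

import subprocess

# Illustrative only: allow management SSH plus the web ports the .htaccess rules expect.
redirector_rules = [
    "ufw default deny incoming",
    "ufw default allow outgoing",
    "ufw allow 22/tcp",    # SSH for management
    "ufw allow 80/tcp",    # HTTP
    "ufw allow 443/tcp",   # HTTPS listener
    "ufw --force enable",
]

for rule in redirector_rules:
    subprocess.run(rule.split(), check=True)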

Important

Harvis whitelists the IP of the machine you deploy it from, allowing that IP to access your Mythic C2 panel. It is highly recommended to use a VPC to deploy Harvis.

Owner
Thiago Mayllart
- Information Security Researcher / RedTeam