Ankou: Guiding Grey-box Fuzzing towards Combinatorial Difference

Ankou

Ankou is a source-based grey-box fuzzer. It goes beyond simple branch coverage and uses a richer fitness function that considers combinations of branches exercised during program execution. The details of the technique can be found in our paper "Ankou: Guiding Grey-box Fuzzing towards Combinatorial Difference", published at ICSE 2020.

Dependencies

Go

Ankou is written solely in Go and thus requires its installation. Be sure to configure the GOPATH environment variable, for example to the ~/go directory.
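
For example, a minimal setup might look like this (the ~/go path is just the conventional default; adjust to taste):

export GOPATH=$HOME/go   # Go workspace used by go get / go build
mkdir -p $GOPATH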

AFL

Ankou relies on AFL instrumentation: fuzzed targets need to be compiled using afl-gcc or afl-clang. To install AFL:

wget http://lcamtuf.coredump.cx/afl/releases/afl-latest.tgz
tar xf afl-latest.tgz
cd afl-2.52b
make
# The last command is optional, but you'll need to provide the absolute path to
# the AFL compiler in the configure step below if you don't install it.
sudo make install

GDB

For the triaging, gdb is required, and ASLR needs to be deactivated:

echo 0 | sudo tee /proc/sys/kernel/randomize_va_space

Note that when using docker containers, this needs to be run on the host.
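
To verify that ASLR is indeed disabled, the following should print 0:

cat /proc/sys/kernel/randomize_va_space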

Installation

Once Go and AFL are installed, you can get Ankou by:

go get github.com/SoftSec-KAIST/Ankou   # Clone Ankou and its dependencies
go build github.com/SoftSec-KAIST/Ankou # Compile Ankou
Note: If getting Ankou from another location, the cloning and setup need to be done manually:
mkdir -p $GOPATH/src/github.com/SoftSec-KAIST
cd $GOPATH/src/github.com/SoftSec-KAIST
git clone REPO  # By default REPO is https://github.com/SoftSec-KAIST/Ankou
cd Ankou
go get .    # Get dependencies
go build .  # Compile

Usage

Now we are ready to fuzz. We first need to compile the target we want with afl-gcc or afl-clang. Let's take the classical starting example for fuzzing, binutils:

wget https://mirror.ibcp.fr/pub/gnu/binutils/binutils-2.33.1.tar.xz
tar xf binutils-2.33.1.tar.xz
cd binutils-2.33.1
CC=afl-gcc CXX=afl-g++ ./configure --prefix=`pwd`/install
make -j
make install

Now we are ready to run Ankou:

cd install/bin
mkdir seeds; cp elfedit seeds/ # Put anything in the seeds folder.
go run github.com/SoftSec-KAIST/Ankou -app ./readelf -args "-a @@" -i seeds -o out
# Or use the binary we compiled above:
/path/to/Ankou -app ./readelf -args "-a @@" -i seeds -o out

Evaluation Reproduction

Once Ankou is installed, in order to reproduce the Ankou evaluation:

  1. Compile the 24 packages mentioned in the paper, at the same versions or commits, using afl-gcc. The source of every package, at the version used in the Ankou evaluation, can be found at https://github.com/SoftSec-KAIST/Ankou-Benchmark. Additionally, this repository includes the seeds used to initialize the evaluation fuzzing campaigns.
  2. Run the produced subjects with the commands found in benchmark/configuration.json. benchmark/rq1_rq3.json contains only the 24 subjects used for Research Questions 1 and 3 of the paper.
  3. Analyze the Ankou output directory for results. Crashes are listed in $OUTPUT_DIR/crashes-* and found seeds in $OUTPUT_DIR/seeds-*. Statistics of the fuzzing campaign can be found in the CSV files of the $OUTPUT_DIR/status* directory. The edge_n value of receiver.csv represents the branch coverage, and the execN column of seed_manager.csv represents the total number of test cases executed so far; divide it by the time column to obtain the throughput (see the sketch below).
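
As a sketch, the two statistics described in step 3 can be extracted as follows. This assumes the CSV files carry a header row with the column names mentioned above; if they do not, fall back to positional indices as in the cflow example below.

cd $OUTPUT_DIR/status*
python - <<'EOF'
import csv
# Branch coverage: last edge_n value of receiver.csv.
with open('receiver.csv') as f:
    print('branch coverage:', list(csv.DictReader(f))[-1]['edge_n'])
# Throughput: total executions (execN) divided by elapsed time.
with open('seed_manager.csv') as f:
    last = list(csv.DictReader(f))[-1]
    print('throughput:', float(last['execN']) / float(last['time']))
EOF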

There are too many programs in our benchmark to cover them all here, so we will use only one package in this example: cflow.

  1. Compilation.
git clone https://github.com/SoftSec-KAIST/Ankou-Benchmark
cd Ankou-Benchmark
tar xf seeds.tar.xz
cd sources
tar xf cflow-1.6.tar.xz
cd cflow-1.6
CC=afl-gcc CXX=afl-g++ ./configure --prefix=`pwd`/build
make -j
make install
cd ../../..
  2. Preparation of the fuzzing campaign.
mkdir fuzzrun
cp Ankou-Benchmark/sources/cflow-1.6/build/bin/cflow fuzzrun
cp -r Ankou-Benchmark/seeds/cflow fuzzrun/seeds
  3. Run the campaign. The command below starts a 24-hour fuzzing campaign. The '-dur' option can be adjusted, or Ankou can be interrupted earlier. With this version of cflow, initialized with these seeds, a crash should be found in less than an hour.
cd fuzzrun
go run github.com/SoftSec-KAIST/Ankou -app cflow -args "-o /dev/null @@" \
    -i seeds -threads 1 -o cflow_out -dur 24h
  4. Results analysis.
cd cflow_out/status_*
# Print the final branch coverage:
python -c "print(open('receiver.csv').readlines()[-1].split(',')[0])"
# Print the overall throughput:
python -c "last = open('seed_manager.csv').readlines()[-1].split(','); print(float(last[5])/int(last[6]))"
# Print effectiveness of the dynamic PCA (see RQ2):
python -c "last = open('receiver.csv').readlines()[-1].split(','); print('{}%'.format(100-100*float(last[2])/float(last[1])))"

Safe Stack Hash Triaging

Once the environment is set up, the triage scripts work in two steps:

  1. Run the binary on the crashing input to produce a core file. Using ulimit -c unlimited ensures the core is dumped (see the example after step 2).
  2. Use the scripts in the triage folder of this repository:
cd $GOPATH/src/github.com/SoftSec-KAIST/Ankou/triage
gdb -x triage.py -x triage.gdb -batch -c /path/to/core /path/to/binary
cat hash.txt # The stack hashes are found in this text file.
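
As an illustration of step 1, reproducing a crash might look like the following; the binary and crashing-input paths are placeholders for your own target and one of the inputs saved under $OUTPUT_DIR/crashes-*:

ulimit -c unlimited                       # make sure a core file is dumped on crash
/path/to/binary /path/to/crashing_input   # reproduce the crash; the resulting core file is then passed to the gdb command above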