
apple-neuralhash-attack

Demonstrates iterative FGSM on Apple's NeuralHash model.

TL;DR: It is possible to apply noise to CSAM images and make them look like regular images to the NeuralHash model. The noise does degrade the CSAM image (see samples), but this was achieved without tuning the learning rate, and more refined attacks are available as well.

Example

Here is an example that uses a Grumpy Cat image in place of a CSAM image. The attack adds noise to the Grumpy Cat image and makes the model see it as a Doge image.

As a result, both of these images have the same neural hash of 11d9b097ac960bd2c6c131fa, computed via ONNX Runtime using the script from AsuharietYgvar/AppleNeuralHash2ONNX.

[Images: doge and adv_cat (the adversarial Grumpy Cat)]
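
For reference, a minimal sketch of how such a hash can be computed with ONNX Runtime is shown below. It follows the general approach of the AsuharietYgvar/AppleNeuralHash2ONNX script, but the preprocessing details (input size, normalization, seed file layout) are assumptions and may differ slightly from that script.

import numpy as np
import onnxruntime
from PIL import Image

def neural_hash(image_path, model_path="model.onnx",
                seed_path="neuralhash_128x96_seed1.dat"):
    # Assumption: the seed file stores a 96x128 float32 projection matrix
    # after a 128-byte header.
    seed = np.frombuffer(open(seed_path, "rb").read()[128:], dtype=np.float32)
    seed = seed.reshape([96, 128])

    # Resize to the model's 360x360 input and scale pixels to [-1, 1].
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.array(img).astype(np.float32) / 255.0 * 2.0 - 1.0
    arr = arr.transpose(2, 0, 1).reshape([1, 3, 360, 360])

    # Run the network, project its 128-d output through the seed matrix,
    # and binarize the result to get the 96-bit hash as hex.
    session = onnxruntime.InferenceSession(model_path)
    output = session.run(None, {session.get_inputs()[0].name: arr})[0]
    bits = seed.dot(output.flatten())
    bit_string = "".join("1" if b >= 0 else "0" for b in bits)
    return "{:0{}x}".format(int(bit_string, 2), len(bit_string) // 4)

print(neural_hash("samples/doge.png"))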

More generally, because the attack optimizes the model output, the adversarial image will generate largely the same hash as the good image, regardless of the seed.
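
Conceptually, the attack loop looks like the sketch below: compute the model output for the good image as a fixed target, then repeatedly step the bad image against the sign of the gradient of an L2 loss between its output and that target. This sketch assumes model is a TensorFlow 2 callable that maps a (1, 3, 360, 360) image tensor to the model's output vector; the actual nnhash_attack.py drives the frozen model.pb graph and may differ in details.

import tensorflow as tf

def iterative_fgsm(model, bad_img, good_img, lr=3e-1, steps=2000):
    # Target output: what the model produces for the good image.
    target = tf.stop_gradient(model(good_img))
    adv = tf.Variable(bad_img)

    for _ in range(steps):
        with tf.GradientTape() as tape:
            # L2 loss between the adversarial output and the target output.
            loss = tf.reduce_sum(tf.square(model(adv) - target))
        grad = tape.gradient(loss, adv)
        # FGSM-style step: move against the sign of the gradient and keep
        # pixel values within the model's [-1, 1] input range.
        adv.assign(tf.clip_by_value(adv - lr * tf.sign(grad), -1.0, 1.0))
    return adv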

Instructions

Get ONNX model

Obtain the ONNX model from AsuharietYgvar/AppleNeuralHash2ONNX. You should have a path to a model.onnx file.

Convert ONNX model to TF model

Next, convert the ONNX model to a TensorFlow model. First install the onnx_tf library from onnx/onnx-tensorflow, then run the following:

python3 convert.py -o /path/to/model.onnx

This will save a TensorFlow model to the current directory as model.pb.
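
For reference, the core of such a conversion with the onnx_tf backend typically looks like the snippet below (convert.py may differ, and newer onnx-tensorflow releases export a SavedModel directory rather than a single .pb file):

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("/path/to/model.onnx")  # exported NeuralHash model
tf_rep = prepare(onnx_model)                   # build the TensorFlow representation
tf_rep.export_graph("model.pb")                # write the graph used by the attack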

Run adversarial attack

Finally, run the adversarial attack with the following:

python3 nnhash_attack.py --seed /path/to/neuralhash_128x96_seed1.dat

Other arguments:

-m           Path to TensorFlow model (defaults to "model.pb")
--good       Path to good image (defaults to "samples/doge.png")
--bad        Path to bad image (defaults to "samples/grumpy_cat.png")
--lr         Learning rate (defaults to 3e-1)
--save_every Interval, in iterations, between image saves (defaults to 2000)

This will save generated images to samples/iteration_{i}.png.
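
For example, an invocation that spells out every default explicitly looks like this:

python3 nnhash_attack.py \
    --seed /path/to/neuralhash_128x96_seed1.dat \
    -m model.pb \
    --good samples/doge.png \
    --bad samples/grumpy_cat.png \
    --lr 3e-1 \
    --save_every 2000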

Note that the hash similarity may decrease initially before increasing again.

Also, for the sample images with default parameters, the two hashes became identical after 28000 iterations.

Terminal output:

# Some Tensorflow boilerplate...
Iteration #2000: L2-loss=134688, Hash Similarity=0.2916666666666667
Good Hash: 11d9b097ac960bd2c6c131fa
Bad Hash : 20f1089728150af2ca2de49a
Saving image to samples/iteration2000.png...
Iteration #4000: L2-loss=32605, Hash Similarity=0.41666666666666677
Good Hash: 11d9b097ac960bd2c6c131fa
Bad Hash : 20d9b097ac170ad2cfe170da
Saving image to samples/iteration4000.png...
Iteration #6000: L2-loss=18547, Hash Similarity=0.4166666666666667
Good Hash: 11d9b097ac960bd2c6c131fa
Bad Hash : 20d9b097ac170ad2c7c1f0de
Saving image to samples/iteration6000.png...

Credit

Lim Swee Kiat