Spectrum

Spectrum is an AI that uses deep learning to generate rap song lyrics.

View Demo
Report Bug
Request Feature
Open In Colab

About The Project

Spectrum is an AI that uses deep learning to generate rap song lyrics.

Built With

This project is built using Python, TensorFlow, and Flask.

Getting Started

Installation

# clone the repo
git clone https://github.com/YigitGunduc/Spectrum.git

# install requirements
pip install -r requirements.txt

Training

# navigate to the Spectrum/AI folder 
cd Spectrum/AI

# run train.py; all flags are optional
python3 train.py [-h] [--epochs EPOCHS] [--save_at SAVE_AT] [--verbose VERBOSE]
                 [--rnn_neurons RNN_NEURONS] [--embed_dim EMBED_DIM]
                 [--dropout DROPOUT] [--num_layers NUM_LAYERS]
                 [--learning_rate LEARNING_RATE]

All arguments are optional; if you omit them, the model constructs itself with the default parameters.
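The flags above can be sketched with argparse. This is an illustrative guess at how train.py might wire up its command line, not the project's actual source; the defaults below mirror the hyperparameters listed later in this README.

```python
import argparse

def build_parser():
    # hypothetical CLI for train.py; defaults taken from the
    # Hyperparameters section of this README
    parser = argparse.ArgumentParser(description="Train the Spectrum lyrics model")
    parser.add_argument("--epochs", type=int, default=30)
    parser.add_argument("--save_at", type=int, default=5,
                        help="save a checkpoint every N epochs")
    parser.add_argument("--verbose", type=int, default=1)
    parser.add_argument("--rnn_neurons", type=int, default=256)
    parser.add_argument("--embed_dim", type=int, default=64)
    parser.add_argument("--dropout", type=float, default=0.3)
    parser.add_argument("--num_layers", type=int, default=2)
    parser.add_argument("--learning_rate", type=float, default=1e-4)
    return parser

# override a couple of flags, leave the rest at their defaults
args = build_parser().parse_args(["--epochs", "10", "--rnn_neurons", "512"])
print(args.epochs, args.rnn_neurons, args.dropout)
```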

Generating Text from Trained Model

Call eval.py from the command line with the seed text as an argument:

python3 eval.py --seed SEEDTEXT

or

from model import Generator

# initialize the model with the default parameters
model = Generator()

# load the pre-trained weights
model.load_weights('../models/model-5-epochs-256-neurons.h5')

# generate 1000 characters starting from the seed text
generatedText = model.predict(start_seed=SEED, gen_size=1000)

print(generatedText)

  • If you tweaked the model's hyperparameters while training, initialize the Generator with those same parameters before loading the weights.
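The generation step above boils down to a character-level sampling loop: starting from the seed, repeatedly sample the next character from the model's predicted distribution and feed it back in. A minimal, self-contained sketch of that loop (the uniform `toy_model` stands in for the real RNN and is not part of the project):

```python
import random

VOCAB = list("abcdefgh ")

def toy_model(context):
    # a real model would return next-character probabilities from the RNN;
    # this stand-in returns a uniform distribution over a tiny vocabulary
    return [1.0 / len(VOCAB)] * len(VOCAB)

def generate(seed, gen_size=20, rng=None):
    rng = rng or random.Random(0)
    text = seed
    for _ in range(gen_size):
        probs = toy_model(text)
        # sample one character according to the predicted probabilities
        next_char = rng.choices(VOCAB, weights=probs, k=1)[0]
        text += next_char
    return text

print(generate("today", gen_size=10))
```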

Running the Web-App Locally

# navigate to the Spectrum folder 
cd Spectrum

# run app.py
python3 app.py

# check out http://0.0.0.0:8080

API

Spectrum has a free web API; you can send requests to it as shown below:

import requests 

response = requests.get("https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT")
# raw response
print(response.json())
# cleaned-up response
print(response.json()["lyrics"])
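Seed text containing spaces or punctuation should be URL-encoded before being placed in the path. A small sketch using only the standard library (the URL shape is taken from the example above):

```python
import urllib.parse

seed = "hello world"
# percent-encode the seed so it is safe to embed in the URL path
encoded = urllib.parse.quote(seed, safe="")
url = f"https://spectrumapp.herokuapp.com/api/generate/{encoded}"
print(url)
```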

Hyperparameters

epochs = 30 
batch size = 128
number of layers = 2 (hidden) + 1 (output)
number of RNN units = 256
dropout prob = 0.3
embedding dimensions = 64
optimizer = Adam
loss = sparse categorical crossentropy

These are the best hyperparameters I have found, but handle them with care: this model can overfit or underfit quite easily. In my experiments, GRUs performed better than LSTMs.

Info about model

>>> from model import Generator
>>> model = Generator()
>>> model.load_weights('../models/model-5-epochs-256-neurons.h5')
>>> model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (1, None, 64)             6400      
_________________________________________________________________
gru (GRU)                    (1, None, 256)            247296    
_________________________________________________________________
gru_1 (GRU)                  (1, None, 256)            394752    
_________________________________________________________________
dense (Dense)                (1, None, 100)            25700     
=================================================================
Total params: 674,148
Trainable params: 674,148
Non-trainable params: 0
_________________________________________________________________

>>> model.hyperparams()
Hyper Parameters
+--------------------------+
|rnn_neurons   |        256|
|embed_dim     |         64|
|learning_rate |     0.0001|
|dropout       |        0.3|
|num_layers    |          2|
+--------------------------+
>>>
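The parameter counts in the summary above can be reproduced by hand. A quick sanity check in plain Python, assuming a 100-character vocabulary and Keras-style GRU layers (with reset_after=True, a GRU has 3 * units * (input_dim + units + 2) parameters):

```python
vocab_size = 100
embed_dim = 64
units = 256

embedding = vocab_size * embed_dim             # embedding lookup table
gru = 3 * units * (embed_dim + units + 2)      # first GRU, fed by the embedding
gru_1 = 3 * units * (units + units + 2)        # second GRU, fed by the first
dense = units * vocab_size + vocab_size        # output projection + biases

total = embedding + gru + gru_1 + dense
print(embedding, gru, gru_1, dense, total)
```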

Roadmap

See the open issues for a list of proposed features (and known issues).

Results

WARNING: The texts below contain offensive language; stop reading here if that is a concern. All of the texts below were generated by Spectrum.

Seed : today

Prediction : 

If that don't, yeah
Weint off the music
It's like a fired-enother foar fool straight for an exactly
Nigga why I id my Door Merican muthafucka

Ng answered by need for blazy hard
The family wish fans dishes rolled up
How better just wanna die
Match all about the moment in I glory
Fire is that attention is the flop and pipe those peokin' distriors
Bitch I been hard and I'm like the Scales me and we're going to school like all-off of the allegit to get the bitches
Yeah kinda too legit back into highin'
A year have it would plobably want

And we all bustin' the conscious in the cusfuckers won't ha
Quite warkie and it's blow, and what? I cannot love him,
Alugal Superman, and the revolution likes migh
I ain't still not I uest the neighborhoo
Powers all too bad show, you crite your bac
When I say way too fathom
If you wanna revell, money, where your face we'll blin
Pulf me very, yo, they pull out for taught nothin' off
I pass a with a nigga hang some, pleas
Fuck me now, it's a

======================================================================
Seed : hello

Prediction : 

hellow motherfucker
You wanna talk on the pockets on Harlotom
I'm legit some more than Volumon
Ridicalab knowledge is blessin' some of your honierby man
We just bust the Flud joke with shoulders on the Statue
Lecock it on everybody want your dices to speak
While she speak cents look back to Pops
He was a nigga when I got behind pictures any Lil Sanvanas
Used to in her lady yaught they never had a bitch
He'll break the jird little rappers kill your children is

I'm prayin' back to ready for that bitch just finished And mised to the gamr
Every eyes on and about that getting common
I'm going to attractived with its
I just went by the crowd get the promise to buy the money-a star big down
Can one sall 'em in me tryna get them days that's how I can break the top
Well, that's hug her hands he screaming like a fucking hip-hop but put a Blidze like rhymin'
Yeah I slack like a Job let your cops got a generres
These West of it today flamping this
Black Kuttle crib, said "Ju Conlie, hold up, fuck the

======================================================================
Seed : bestfriend

Prediction : 

bestfriend
Too much time we tonight
The way I know is a please have no self-back when I be for the fucking weed and a game
What the fuck we wanna be working on the streets make it like a stay down the world is from the head of the real brain
Chain don't come back to the grass
My dick is the one to tell you I'm the fuck
So see me we gon' be fans when you had to hear the window you come to the dick when a little cooleng and I was calling what the fuck is it good as the crown
And I'm representing you finally waitin' in your girl
This is the corner with my brother
I'm just a damn door and the real motherfuckers come got the point my shit is the money on the world

I get it then the conscious that's why I cripp
I might take my own shit so let me have a bad bitch
I'm just had and make the fuck is in the single of the window
I think I ain't got the world is all my gone be mine
They ain't like the half the best between my words
And I'm changing with the heads of the speech
Fuck a bunch of best of a fuck

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.
