

Spectrum

Spectrum is an AI that uses deep learning to generate rap song lyrics.

View Demo
Report Bug
Request Feature
Open In Colab

About The Project

Spectrum is an AI that uses deep learning to generate rap song lyrics.

Built With

This project is built with Python, TensorFlow, and Flask.

Getting Started

Installation

# clone the repo
git clone https://github.com/YigitGunduc/Spectrum.git

# install requirements (requirements.txt sits at the repo root)
pip install -r Spectrum/requirements.txt

Training

# navigate to the Spectrum/AI folder 
cd Spectrum/AI

# run train.py, optionally passing training and model arguments
python3 train.py [-h] [--epochs EPOCHS] [--save_at SAVE_AT] [--verbose VERBOSE]
                 [--rnn_neurons RNN_NEURONS] [--embed_dim EMBED_DIM]
                 [--dropout DROPOUT] [--num_layers NUM_LAYERS]
                 [--learning_rate LEARNING_RATE]

All arguments are optional; if you leave them out, the model is constructed with the default parameters. An example invocation is shown below.
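
For example, the following command trains with the settings listed in the Hyperparameters section further below (an illustrative invocation; it uses only the flags shown in the usage above):

python3 train.py --epochs 30 --rnn_neurons 256 --embed_dim 64 --dropout 0.3 --num_layers 2 --learning_rate 0.0001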

Generating Text from a Trained Model

Call eval.py from the command line with seed text as an argument

python3 eval.py --seed SEEDTEXT

or

from model import Generator

model = Generator()

model.load_weights('../models/model-5-epochs-256-neurons.h5')

SEED = "hello"  # your seed text

generatedText = model.predict(start_seed=SEED, gen_size=1000)

print(generatedText)

  • If you tweaked the model's hyperparameters while training, initialize the Generator with those same parameters before loading the weights, as sketched below.
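
For example, if you trained with 512 RNN units and a 128-dimensional embedding, pass those values back when re-creating the model. The keyword names below are an assumption that mirrors the train.py flags and the model.hyperparams() output; check the Generator constructor in Spectrum/AI for the exact signature.

from model import Generator

# assumed keyword arguments, mirroring the training flags (hypothetical values)
model = Generator(rnn_neurons=512, embed_dim=128, dropout=0.3, num_layers=2)

# the path below is a placeholder for your own checkpoint
model.load_weights('../models/your-custom-model.h5')

generatedText = model.predict(start_seed="hello", gen_size=1000)
print(generatedText)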

Running the Web-App Locally

# navigate to the Spectrum folder 
cd Spectrum

# run app.py
python3 app.py

# then open http://0.0.0.0:8080 in your browser

API

Spectrum has a free web API; you can send requests to it as shown below.

import requests 

response = requests.get("https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT")

# raw response
print(response.json())

# cleaned-up response
print(response.json()["lyrics"])
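
The same endpoint can also be called from the shell; replace SEEDTEXT with your own seed word:

curl https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT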

Hyperparameters

epochs = 30 
batch size = 128
number of layers = 2 (hidden) + 1 (output)
number of RNN units = 256
dropout prob = 0.3
embedding dimensions = 64
optimizer = Adam
loss = sparse categorical crossentropy

These hyperparameters are the best I have found, but be careful when tuning them: this model can overfit or underfit quite easily. In my experiments, GRUs perform better than LSTMs.
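
As a rough illustration, a TensorFlow model matching these hyperparameters would look something like the sketch below (two GRU layers of 256 units, a 64-dimensional embedding, dropout 0.3, Adam, sparse categorical crossentropy). The vocabulary size of 100 is inferred from the Dense output shape in the model summary below; the actual architecture lives in Spectrum/AI and may differ in detail.

import tensorflow as tf

VOCAB_SIZE = 100   # inferred from the Dense layer's output shape in the summary below
BATCH_SIZE = 128   # training batch size from the list above

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64, batch_input_shape=[BATCH_SIZE, None]),
    tf.keras.layers.GRU(256, return_sequences=True, stateful=True, dropout=0.3),
    tf.keras.layers.GRU(256, return_sequences=True, stateful=True, dropout=0.3),
    tf.keras.layers.Dense(VOCAB_SIZE),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)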

Info about model

>>> from model import Generator
>>> model = Generator()
>>> model.load_weights('../models/model-5-epochs-256-neurons.h5')
>>> model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (1, None, 64)             6400      
_________________________________________________________________
gru (GRU)                    (1, None, 256)            247296    
_________________________________________________________________
gru_1 (GRU)                  (1, None, 256)            394752    
_________________________________________________________________
dense (Dense)                (1, None, 100)            25700     
=================================================================
Total params: 674,148
Trainable params: 674,148
Non-trainable params: 0
_________________________________________________________________

>>> model.hyperparams()
Hyper Parameters
+--------------------------+
|rnn_neurons   |        256|
|embed_dim     |         64|
|learning_rate |     0.0001|
|dropout       |        0.3|
|num_layers    |          2|
+--------------------------+
>>>

Roadmap

See the open issues for a list of proposed features (and known issues).

Results

WARNING: There is some offensive language ahead; please stop reading here if you would rather not see it. The texts below were generated by Spectrum.

Seed : today

Prediction : 

If that don't, yeah
Weint off the music
It's like a fired-enother foar fool straight for an exactly
Nigga why I id my Door Merican muthafucka

Ng answered by need for blazy hard
The family wish fans dishes rolled up
How better just wanna die
Match all about the moment in I glory
Fire is that attention is the flop and pipe those peokin' distriors
Bitch I been hard and I'm like the Scales me and we're going to school like all-off of the allegit to get the bitches
Yeah kinda too legit back into highin'
A year have it would plobably want

And we all bustin' the conscious in the cusfuckers won't ha
Quite warkie and it's blow, and what? I cannot love him,
Alugal Superman, and the revolution likes migh
I ain't still not I uest the neighborhoo
Powers all too bad show, you crite your bac
When I say way too fathom
If you wanna revell, money, where your face we'll blin
Pulf me very, yo, they pull out for taught nothin' off
I pass a with a nigga hang some, pleas
Fuck me now, it's a

======================================================================
Seed : hello

Prediction : 

hellow motherfucker
You wanna talk on the pockets on Harlotom
I'm legit some more than Volumon
Ridicalab knowledge is blessin' some of your honierby man
We just bust the Flud joke with shoulders on the Statue
Lecock it on everybody want your dices to speak
While she speak cents look back to Pops
He was a nigga when I got behind pictures any Lil Sanvanas
Used to in her lady yaught they never had a bitch
He'll break the jird little rappers kill your children is

I'm prayin' back to ready for that bitch just finished And mised to the gamr
Every eyes on and about that getting common
I'm going to attractived with its
I just went by the crowd get the promise to buy the money-a star big down
Can one sall 'em in me tryna get them days that's how I can break the top
Well, that's hug her hands he screaming like a fucking hip-hop but put a Blidze like rhymin'
Yeah I slack like a Job let your cops got a generres
These West of it today flamping this
Black Kuttle crib, said "Ju Conlie, hold up, fuck the

======================================================================
Seed : bestfriend

Prediction : 

bestfriend
Too much time we tonight
The way I know is a please have no self-back when I be for the fucking weed and a game
What the fuck we wanna be working on the streets make it like a stay down the world is from the head of the real brain
Chain don't come back to the grass
My dick is the one to tell you I'm the fuck
So see me we gon' be fans when you had to hear the window you come to the dick when a little cooleng and I was calling what the fuck is it good as the crown
And I'm representing you finally waitin' in your girl
This is the corner with my brother
I'm just a damn door and the real motherfuckers come got the point my shit is the money on the world

I get it then the conscious that's why I cripp
I might take my own shit so let me have a bad bitch
I'm just had and make the fuck is in the single of the window
I think I ain't got the world is all my gone be mine
They ain't like the half the best between my words
And I'm changing with the heads of the speech
Fuck a bunch of best of a fuck

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.
