
Pyrics

Pyrics is a tool to scrape lyrics, get rhymes, and generate relevant lyrics with rhymes.

./test/run.py provides the full functionality in the terminal:

cmd>> python run.py

or

cmd>> conda activate
conda>> python run.py

Author: DK Xie

Installation

pip install Pyrics

Initialization

Way 1: No existing data

from Pyrics import Pyrics as prc

prc = prc.Pyrics()

Way 2: Existing data

from Pyrics import Pyrics as prc

prc = prc.Pyrics(path)

If the data folder 'lyrics' already exists, the folder structure is:

-lyrics
   -rhymes

Lyric data (.csv) in the lyrics folder, e.g.:

bands songs lyrics
0 Little Richard Tutti Frutti Wop-bop-a-loo-mop alop-bom-bom
1 Little Richard Tutti Frutti Tutti frutti, oh rutti
2 Little Richard Tutti Frutti Tutti frutti, woo!
3 Little Richard Tutti Frutti Tutti frutti, oh rutti
4 Little Richard Tutti Frutti Tutti frutti, oh rutti
5 Little Richard Tutti Frutti Tutti frutti, oh rutti

Rhyme data (.csv) in the rhymes folder, e.g.:

bands songs lyrics rhymes
0 Little Richard Tutti Frutti Wop-bop-a-loo-mop alop-bom-bom om
1 Little Richard Tutti Frutti Tutti frutti, oh rutti i
2 Little Richard Tutti Frutti Tutti frutti, woo! u
3 Little Richard Tutti Frutti Tutti frutti, oh rutti i
4 Little Richard Tutti Frutti Tutti frutti, oh rutti i
5 Little Richard Tutti Frutti Tutti frutti, oh rutti i
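
For reference, a minimal sketch of reading this data back with pandas; the file name little_richard.csv is hypothetical, and only the folder layout and column names come from the examples above.

import pandas as pd

# Hypothetical file name; only the folder layout and columns are taken from above.
lyrics_df = pd.read_csv('lyrics/little_richard.csv')         # columns: bands, songs, lyrics
rhymes_df = pd.read_csv('lyrics/rhymes/little_richard.csv')  # columns: bands, songs, lyrics, rhymes

print(lyrics_df.head())
print(rhymes_df.head())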

Generate Datasets

artists = 'Led Zeppelin'

1. Download Lyrics

# default iters_num = 1e20
download_lyrics(artists=artists, iters_num=5, delay_time=10, fluctuate_rate=5)
bands songs lyrics
0 Led Zeppelin Good Times Bad Times In the days of my youth, I was told what it means to be a man
1 Led Zeppelin Good Times Bad Times And now I've reached that age, I've tried to do all those things the best I can
2 Led Zeppelin Good Times Bad Times No matter how I try, I find my way into the same old jam
3 Led Zeppelin Good Times Bad Times Good times, bad times, you know I've had my share
4 Led Zeppelin Good Times Bad Times When my woman left home with a brown eyed man
5 Led Zeppelin Good Times Bad Times But I still don't seem to care
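
The delay_time and fluctuate_rate arguments suggest the scraper paces its requests with a randomized pause. A minimal sketch of that idea, assuming delay_time is a base pause in seconds and fluctuate_rate an added random jitter (both assumptions, not the package's documented behavior):

import random
import time

def polite_sleep(delay_time=10, fluctuate_rate=5):
    # Sleep for the base delay plus a random jitter of up to fluctuate_rate
    # seconds, so requests are not sent at a fixed, detectable interval.
    time.sleep(delay_time + random.uniform(0, fluctuate_rate))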

2. Generate Lyrics Data with Rhymes

generate_rhymes(artist_search=artists)
bands songs lyrics rhymes
0 Led Zeppelin Good Times Bad Times In the days of my youth, I was told what it means to be a man ?n
1 Led Zeppelin Good Times Bad Times And now I've reached that age, I've tried to do all those things the best I can ?n
2 Led Zeppelin Good Times Bad Times No matter how I try, I find my way into the same old jam ?m
3 Led Zeppelin Good Times Bad Times Good times, bad times, you know I've had my share ?r
4 Led Zeppelin Good Times Bad Times When my woman left home with a brown eyed man ?n
5 Led Zeppelin Good Times Bad Times But I still don't seem to care ?r
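
The rhymes column appears to encode the ending sound of each line, with '?' standing in for a sound that could not be resolved. A rough illustration of extracting a naive rhyme key from the last word of a line (an assumption about the idea, not the package's actual algorithm):

import re

def naive_rhyme_key(line):
    # Take the last alphabetic word and keep everything from its final
    # vowel onward as a crude rhyme key.
    words = re.findall(r"[a-z']+", line.lower())
    if not words:
        return ''
    last = words[-1]
    match = re.search(r'[aeiouy][^aeiouy]*$', last)
    return match.group(0) if match else last[-1]

print(naive_rhyme_key('I was told what it means to be a man'))  # -> 'an'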

Basic Functions

lyrics = 'Don\'t you cry tonight'
artists = ['Guns N\' Roses', 'the doors','led zeppelin']

1. Get Rhymes of Lyrics

get_rhymes(lyrics)

=>'a?t'

2. Get Lyrics with the Same Rhymes

get_rhymes_lyrics(lyrics=lyrics, artists=artists, length=5, exclude=True, same=False, print_lyrics=False)
bands songs lyrics rhymes
0 Led Zeppelin Heartbreaker Abuse my love a thousand times a?mz
1 The Doors Peace Frog (She came) The women are crying a???
2 Guns N' Roses Oh My God Well, this is better than a good compromise a?z
3 Guns N' Roses Don't Cry Talk to me softly, there's something in your eyes a?z
4 The Doors Someday Soon But you're going to die a?
5 The Doors Break On Through (To The Other Side) She get high a?
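
Conceptually this filters the stored rhyme data for rows whose rhyme key matches the input's. A rough pandas sketch of that filtering step, using a tiny made-up frame with the columns shown above (illustrative only, not the package's implementation):

import pandas as pd

# Tiny made-up frame with the columns shown above.
df = pd.DataFrame({
    'bands': ['The Doors', 'Led Zeppelin'],
    'songs': ['Someday Soon', 'Heartbreaker'],
    'lyrics': ["But you're going to die", 'Abuse my love a thousand times'],
    'rhymes': ['a?', 'a?mz'],
})

def lines_with_rhyme(rhymes_df, rhyme_key, length=5):
    # Keep only rows whose rhyme key matches and return at most `length` lines.
    return rhymes_df[rhymes_df['rhymes'] == rhyme_key].head(length)

print(lines_with_rhyme(df, 'a?'))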

3. Get Relevant Lyrics Containing the Input Lyrics

lyrics = 'I love you'
get_relevant_lyrics(lyrics=lyrics, artists=artists, length=5)
bands songs lyrics rhymes
18 The Doors Ships w/Sails Well, you asked how much I love you u
32 Led Zeppelin Darlene And I love you, Yes I do u
31 Led Zeppelin Darlene Cause I love you, Darlene in
12 The Doors Hello, I Love You Hello, I love you u
3 Guns N' Roses Prostitute Where would you go if I told you I love you and then walked away? Oh, yeah ?
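
The underlying idea can be approximated as a case-insensitive substring match on the lyrics column. A rough pandas sketch (illustrative only; relevant_lines is a made-up helper, not part of the package):

import pandas as pd

def relevant_lines(lyrics_df, phrase, length=5):
    # Case-insensitive, literal (non-regex) substring match on the lyrics
    # column, returning at most `length` matching rows.
    mask = lyrics_df['lyrics'].str.contains(phrase, case=False, regex=False, na=False)
    return lyrics_df[mask].head(length)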

Core Function: Generate Song Lyrics

Randomly combines lyrics to generate songs with rhymes.

keyword = 'baby'
artists = ['Guns N\' Roses', 'the doors','led zeppelin']

1. Generate Song Lyrics Containing the Keyword, with Rhymes

generate_lyrics(inputs=keyword, artists=artists, paragraph_length=4, lyrics_length=16, relevant=True, same=False, print_lyrics=True)
I gotta tell you baby
Save our city
Think of me as just a dream
Changes fill my time, baby, that's alright with me


Tell me, baby, what's my name
Alright, okay, alright, okay!
All that amounts to is love that you fed by perversion and pain
In fact, they look so strange


Tell you, pretty baby
Changes fill my time, baby, that's alright with me
I'm from South Philadelphia
It'd be enough, but just my luck, I fell in love and maybe


When the levee breaks, baby you've got to move, you got to move now
I see you walking around
People talking all around
Break it down
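
Conceptually, a song is assembled by grouping stored lines that share a rhyme key and sampling a paragraph from one group at a time. A very rough sketch of that combination step, assuming a DataFrame with lyrics and rhymes columns as shown above (illustrative only, not the package's algorithm):

import random

def build_song(rhymes_df, paragraph_length=4, lyrics_length=16):
    # Group lines by rhyme key, then repeatedly pick a group and sample
    # paragraph_length lines from it until lyrics_length lines are used.
    groups = {key: group['lyrics'].tolist()
              for key, group in rhymes_df.groupby('rhymes')
              if len(group) >= paragraph_length}
    paragraphs = []
    for _ in range(lyrics_length // paragraph_length):
        if not groups:
            break
        key = random.choice(list(groups))
        paragraphs.append('\n'.join(random.sample(groups[key], paragraph_length)))
    return '\n\n'.join(paragraphs)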

2. Generate Song Lyrics with the Same Rhymes as the Input

generate_lyrics(inputs=keyword, artists=artists, paragraph_length=4, lyrics_length=16, relevant=False, same=False, print_lyrics=True)
She just puts around, being lazy
Somebody, somebody
Oh baby, baby, I like your honey and it sure likes me
If I could teach my hands to see


Noon burned gold into our hair
Gonna love you, baby, here I come again
Lost cells
But now, could you blow it all on a million-dollar bet


I never wanted you to be someone afraid to know themselves
And now I can't get back again
Please, Mr. Fireman, won't you ring your bell?
Craze, baby, the rainbow's end


Down in the pits you go no lower
You don't have to go, oh, oh, oh, oh
To a strange night of stone
Oh yeah, oh yeah, oh, oh, oh