mayhewsw / get_size.sh
Created October 11, 2020 00:52
Get Download Sizes of All Wikipedias
# Print "<lang> <bytes>" for each language code listed in langs.txt,
# using a HEAD request to read Content-Length from the dump server.
DUMP_DATE="20201001"
while read -r lng; do
  URL="https://dumps.wikimedia.org/${lng}wiki/${DUMP_DATE}/${lng}wiki-${DUMP_DATE}-pages-articles.xml.bz2"
  curl -sI "$URL" | grep -i Content-Length | awk -v l="$lng" '{print l " " $2}'
done < langs.txt
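The same Content-Length probe can be sketched in Python with the standard library. This is a hedged illustration, not part of the gist: `dump_url` and `download_size` are names introduced here, and `langs.txt` and the dump date mirror the shell script above.

```python
import urllib.request

DUMP_DATE = "20201001"

def dump_url(lng, dump_date=DUMP_DATE):
    """Build the pages-articles dump URL for a language code."""
    return (f"https://dumps.wikimedia.org/{lng}wiki/{dump_date}/"
            f"{lng}wiki-{dump_date}-pages-articles.xml.bz2")

def download_size(lng):
    """Issue a HEAD request and return Content-Length in bytes."""
    req = urllib.request.Request(dump_url(lng), method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])
```

Like the `curl -sI` call, this never downloads the dump itself; only the response headers cross the wire.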
W4ngatang / download_glue_data.py
Last active October 31, 2024 02:08
Script for downloading data of the GLUE benchmark (gluebenchmark.com)
''' Script for downloading all GLUE data.
Note: for legal reasons, we are unable to host MRPC.
You can either use the version hosted by the SentEval team, which is already tokenized, or download the original data from https://download.microsoft.com/download/D/4/6/D46FF87A-F6B9-4252-AA8B-3604ED519838/MSRParaphraseCorpus.msi and extract it manually.
On Windows, run the .msi file directly. On Mac and Linux, use an external tool such as 'cabextract' (see below for an example).
Then rename and place the specific files in a folder (see below for an example).
mkdir MRPC
cabextract MSRParaphraseCorpus.msi -d MRPC
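For the tasks the script can fetch directly, the download step amounts to retrieving a URL into a data directory. A minimal sketch, assuming nothing beyond the standard library; `download_file` is a name introduced here for illustration, not a function from the gist's script:

```python
import os
import urllib.request

def download_file(url, data_dir, filename):
    """Download url into data_dir/filename, creating the directory
    if needed, and return the local path."""
    os.makedirs(data_dir, exist_ok=True)
    path = os.path.join(data_dir, filename)
    urllib.request.urlretrieve(url, path)
    return path
```

The MRPC case above remains the exception: its .msi archive must be fetched and unpacked (e.g. with cabextract) outside this helper.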