After a few hours of searching, I've finally found a convenient way to download a large number of files in a bulk/multi-threaded/parallel manner, while still being able to specify each saved file's name.
Many thanks to Diego Torres Milano
Each line of the input file, dl_data.txt, holds the desired filename and the source URL, separated by a single space:
bird_4345_543.jpg https://example.com/pictures/5351/image.jpg
bird_4345_544.jpg https://example.com/5352/pictures/image.jpg
bird_12950_3912.jpg https://example.com/6593/pictures/image.jpg
...
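As a small optional safeguard of my own (not part of the original recipe), you can verify that every line really has exactly two space-separated fields before kicking off the downloads:

awk 'NF != 2 { printf "malformed line %d: %s\n", NR, $0 }' dl_data.txt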
function mywget()
{
    # $1 is one full line from dl_data.txt: "<filename> <url>"
    IFS=' ' read -r -a myarray <<< "$1"
    # Save the URL (second field) under Birds/ with the given name (first field)
    wget -O "Birds/${myarray[0]}" "${myarray[1]}"
}
export -f mywget   # make mywget visible to the bash -c subshells that xargs spawns
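Note that wget will not create the Birds directory by itself, so create it first. A quick single-line sanity check, using the first sample line above (the example.com URLs are placeholders, so expect the actual fetch to fail):

mkdir -p Birds
mywget 'bird_4345_543.jpg https://example.com/pictures/5351/image.jpg'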
xargs -P 5 -n 1 -I {} bash -c "mywget '{}'" < "dl_data.txt"

What each part does:

IFS=' ' read -r -a myarray <<< "$1" - Splits the line ($1) into an array on the space delimiter; -r keeps any backslashes literal
wget -O "Birds/${myarray[0]}" "${myarray[1]}" - The download statement, saving the file's URL (second field) to our specified location under Birds/ (first field)
xargs -P 5 -n 1 -I {} bash -c "mywget '{}'" < "dl_data.txt":
-P 5 - Run at most 5 wget processes simultaneously
-n 1 - Read 1 line of the input file at a time
-I {} - Replace each {} in the command with the line just read
bash -c "mywget '{}'" - Invoke the exported function in a child bash shell, passing the line as its argument
"dl_data.txt" - The input file, redirected to xargs on standard input