curl -s https://api.github.com/users/milanboers/repos | grep \"clone_url\" | awk '{print $2}' | sed -e 's/"//g' -e 's/,//g' | xargs -n1 git clone
Hi all, I wrote a simple script to clone all the repositories from GitHub, either from a user or a whole organization. Please check it out; I'm sure you'll find it useful.
Nice script, though! Keep in mind that it only fetches 100 repositories per page and doesn't loop over the pages, so accounts with more than 100 repositories will only get the first page. Hope that makes sense.
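One way around the 100-per-page cap is to keep requesting pages until an empty one comes back. A minimal sketch, assuming `curl`, `jq`, and `git` are installed (the function names here are made up for illustration):

```shell
#!/bin/sh
# Sketch: clone every public repo of a user, paging past the 100-per-page cap.

# Build the paged REST API URL for a user's repositories.
repos_url() {
  printf 'https://api.github.com/users/%s/repos?per_page=100&page=%s' "$1" "$2"
}

# Loop over pages until an empty one comes back, cloning each repo on the way.
clone_all() {
  user=$1
  page=1
  while :; do
    urls=$(curl -s "$(repos_url "$user" "$page")" | jq -r '.[].clone_url')
    [ -z "$urls" ] && break            # empty page: no more repositories
    echo "$urls" | xargs -n1 git clone
    page=$((page + 1))
  done
}

# Run it with e.g.: clone_all milanboers
```

Using `jq -r '.[].clone_url'` instead of grep/awk/sed also avoids the quote-stripping gymnastics of the original one-liner.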
I finally switched to 'gickup' in a Docker container for this (highly recommended!), which automagically downloads everything you have and anything you have access to.
I encountered the same issue and made this helper; it has a bunch of other useful utilities for managing your GitHub projects too.
((page_count = public_repos / 100 + 1))
+10 for this smart move: you fetch the repo count from the API and then use it for pagination to get past the crawl limit.
Will try your tool at some point; no ongoing tasks for that at the moment. But still, good job! 👍
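Expanded slightly, that page-count arithmetic looks like the following sketch (the `public_repos` value here is a made-up example; in practice it would come from `curl -s https://api.github.com/users/USER | jq .public_repos`):

```shell
#!/bin/sh
# Sketch: derive the number of API pages from the user's public repo count.
public_repos=137                            # example value, normally fetched from the API
page_count=$((public_repos / 100 + 1))      # integer division, rounded up
echo "$page_count"                          # prints 2
```

Note that when `public_repos` is an exact multiple of 100 this over-counts by one page; the extra request simply returns an empty list, so it's harmless.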