@milanboers
Last active September 12, 2024 11:52
Clone all repositories of a GitHub user
curl -s https://api.github.com/users/milanboers/repos | grep "clone_url" | awk '{print $2}' | sed -e 's/"//g' -e 's/,//g' | xargs -n1 git clone
@mazunki
Copy link

mazunki commented Apr 27, 2020

After some searching and frustrations of my own not being able to do this easily, I built out a project that allows for public and private repo cloning of both personal and organization repos. Perfect for grabbing everything you'd need:

https://github.com/Justintime50/github-archive

Nice! I may fork this over the next day, or do some pull requests adding the option to automatically replace dotfiles. Who wants that, anyway?

@hcanning
Copy link

curl -s 'https://api.github.com/users/jdoe/repos?page=1&per_page=100' | grep "clone_url" | awk '{print $2}' | sed -e 's/"//g' -e 's/,//g' | xargs -n1 git clone

curl -s 'https://api.github.com/users/jdoe/repos?page=2&per_page=100' | grep "clone_url" | awk '{print $2}' | sed -e 's/"//g' -e 's/,//g' | xargs -n1 git clone

curl -s 'https://api.github.com/users/jdoe/repos?page=3&per_page=100' | grep "clone_url" | awk '{print $2}' | sed -e 's/"//g' -e 's/,//g' | xargs -n1 git clone

Worked for me. It seems the max you can fetch per request is 100, so you have to page through.
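The per-page commands above can be folded into a loop that keeps fetching pages until the API returns an empty one. This is just a sketch, not a tested tool: `jdoe` is a placeholder username, and the URL extraction is the same grep/awk pipeline used in the gist.

```shell
# Sketch: page through /users/:user/repos until a page comes back empty.
# "jdoe" is a placeholder username; per_page=100 is the API maximum.

# Pull the clone URLs out of one page of the JSON response.
extract_clone_urls() {
  grep '"clone_url"' | awk '{print $2}' | tr -d '",'
}

clone_all() {
  user=$1
  page=1
  while :; do
    urls=$(curl -s "https://api.github.com/users/$user/repos?page=$page&per_page=100" \
      | extract_clone_urls)
    [ -z "$urls" ] && break            # empty page: no more repos
    printf '%s\n' "$urls" | xargs -n1 git clone
    page=$((page + 1))
  done
}

# clone_all jdoe
```

The `clone_all jdoe` call is left commented out so the sketch is safe to paste and inspect first.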

@tecno14
Copy link

tecno14 commented Oct 18, 2020

Here's a simple script to clone all of a GitHub user's repos:
https://github.com/tecno14/clone_all

@tg12
Copy link

tg12 commented Oct 21, 2020

Every one of those commands gave me "fatal: You must specify a repository to clone." 👎

@acetousk
Copy link

Thanks! I wanted to clone all repos (including private ones) over SSH:

  • Generate a Personal Access Token with repo access: https://github.com/settings/tokens

* You may need to handle pagination
curl -s -H "Authorization: token YOUR_TOKEN_HERE" "https://api.github.com/user/repos?per_page=100" | jq -r ".[].ssh_url" | xargs -L1 git clone

This worked for me first try. Thank you @anaggh
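If the token can see more than 100 repos, the one-liner above also needs paging. A hedged sketch, assuming `jq` is installed; `YOUR_TOKEN_HERE` is a placeholder for a real token:

```shell
# Sketch: clone every repo the token can see (public + private) over SSH,
# paging past the 100-per-page limit. The token argument is a placeholder.
clone_all_private() {
  token=$1
  page=1
  while :; do
    urls=$(curl -s -H "Authorization: token $token" \
      "https://api.github.com/user/repos?per_page=100&page=$page" \
      | jq -r '.[].ssh_url')
    [ -z "$urls" ] && break            # empty page: done
    printf '%s\n' "$urls" | xargs -L1 git clone
    page=$((page + 1))
  done
}

# clone_all_private YOUR_TOKEN_HERE
```

The invocation is commented out so nothing is cloned until you supply your own token.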

@simonmeggle
Copy link

If you only want to clone repos of a user matching a certain regex in the repo name, use this jq filter:

curl -s https://api.github.com/users/<USER>/repos | jq -r '.[] | select( .name | test("<REGEX>")).ssh_url' | xargs -L1 git clone

@simonmeggle
Copy link

Add | select(.fork == false) to filter out all repos which are forks (when you only want to clone the author's own work - and not the whole Linux kernel :-P ):

curl -s https://api.github.com/users/<USER>/repos | jq -r '.[] | select( .name | test("<REGEX>")) | select(.fork == false).ssh_url' | xargs -L1 git clone

@mxcl
Copy link

mxcl commented Jun 6, 2021

USERNAME=''
brew install jq gh
curl -s "https://api.github.com/users/$USERNAME/repos?per_page=100" | jq -r ".[].name" | xargs -n1 gh repo clone

Clone all your repos:

brew install jq gh
gh repo list --json name | jq '.[].name' | xargs -n1 gh repo clone
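One caveat worth noting: `gh repo list` only returns 30 repositories by default, so the pipe above may silently miss some. A sketch that raises `--limit` (1000 here is an arbitrary ceiling, not a documented maximum), assuming `gh` and `jq` are installed and `gh auth login` has been run:

```shell
# Sketch: clone all of your repos with the GitHub CLI, raising the default
# listing limit of 30. Assumes gh and jq are installed and gh is authenticated.
clone_all_mine() {
  gh repo list --limit 1000 --json nameWithOwner \
    | jq -r '.[].nameWithOwner' \
    | xargs -n1 gh repo clone
}

# clone_all_mine
```

Using `nameWithOwner` avoids ambiguity when `gh repo clone` resolves the name.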

@ksaadDE
Copy link

ksaadDE commented Jun 12, 2021

It doesn't work here, but the commands seem to be correct. Any ideas? I'll experiment a bit.

I've seen someone post a Node.js app above; let me explain why a small bash command using pipes should always be the first and best option. :-)

If you write a larger Node.js app with no added benefit (the same functionality in 100+ more lines), you waste time.
Most of the time the output should exceed the input; no magic equation.

Some exceptions might be GUI stuff and the like, where the user gets a big advantage compared to the bash command.

@mazunki
Copy link

mazunki commented Jun 13, 2021

@ksaadDE I love this: if len(input)>len(output) then: you_fucked_up = true;

@ksaadDE
Copy link

ksaadDE commented Jun 13, 2021

@mazunki
you_fucked_up = (input > output)

;-)

@ksaadDE
Copy link

ksaadDE commented Jun 14, 2021

@mazunki Did some input > output (or something like that xD)

https://github.com/ksaadDE/usefullshellscripts/blob/main/githubrepodownloader.sh

Feedback is appreciated.

@vinceskahan
Copy link

Clone all your repos:

brew install jq gh
gh repo list --json name | jq '.[].name' | xargs -n1 gh repo clone

THANKS @mxcl !!!

The 'gh' way is far (!!!!!!) easier to use if you have a mix of public and private repos. The curl way seems to work for public repos, but I never found an incantation that would work with private ones. Not sure if that's because I use 2FA and a YubiKey on GitHub, but 'gh' supports it quite nicely. Just run 'gh auth login' first and it'll prompt you.

@ksaadDE
Copy link

ksaadDE commented Jun 18, 2021

Clone all your repos:

brew install jq gh
gh repo list --json name | jq '.[].name' | xargs -n1 gh repo clone

THANKS @mxcl !!!

The 'gh' way is far (!!!!!!) easier to use if you have a mix of public and private repos. The curl way seems to work for public repos, but I never found an incantation that would work with private ones. Not sure if that's because I use 2FA and a YubiKey on GitHub, but 'gh' supports it quite nicely. Just run 'gh auth login' first and it'll prompt you.

Why not use a token? =) (BTW, I don't know whether tokens only work when 2FA is active; I remember it's possible, because the token doesn't trigger 2FA auth.)
My script posted above is far easier if you use a token.
Just add the token to a .token file (read-only for the current user) and start my script.

@2KAbhishek
Copy link

Wrote a simple menu-based script for this issue:
https://github.com/2kabhishek/ghpm

Check it out if you need a quick, reusable solution

@ksaadDE
Copy link

ksaadDE commented Jul 7, 2021

Wrote a simple menu-based script for this issue:
https://github.com/2kabhishek/ghpm

Check it out if you need a quick, reusable solution

Why not pages=500, or count the repos and adjust it to i+1?
Apart from that, awesome code, especially that you're using SSH for cloning.

@2KAbhishek
Copy link

Wrote a simple menu-based script for this issue:
https://github.com/2kabhishek/ghpm
Check it out if you need a quick, reusable solution

Why not pages=500, or count the repos and adjust it to i+1?
Apart from that, awesome code, especially that you're using SSH for cloning.

The GitHub API has a limit of 100 repos per page, but I've added a fix for that. Now it can clone all of a user's repos in one go, no matter the count. I also added filtering for user repos only, so we don't get repos from orgs.

@ksaadDE
Copy link

ksaadDE commented Jul 20, 2021

Wrote a simple menu-based script for this issue:
https://github.com/2kabhishek/ghpm
Check it out if you need a quick, reusable solution

Why not pages=500, or count the repos and adjust it to i+1?
Apart from that, awesome code, especially that you're using SSH for cloning.

The GitHub API has a limit of 100 repos per page, but I've added a fix for that. Now it can clone all of a user's repos in one go, no matter the count. I also added filtering for user repos only, so we don't get repos from orgs.

Awesome!
I still have fewer than 100 repos, so I wasn't able to check whether the limit is 100.
But I guess you read the docs.

@Justintime50
Copy link

@BitesizedLion GitHub Archive works as intended and will simply log an error when attempting to clone/pull a repo that has been DMCA'd. This is intentional, since GitHub won't allow access to the repo once that occurs. For more details, see the discussion on this issue

@tecno14
Copy link

tecno14 commented Oct 15, 2021

this one doesn't work without modification, and doesn't paginate

Yep, it needs modification to set the username.
But sorry, what did you mean by 'doesn't paginate'?

@worstname
Copy link

This worked for me; replace the username:

curl -s "https://api.github.com/users/raysan5/repos?per_page=100" | jq -r ".[].clone_url" | xargs -L1 git clone

Hopefully:

  • you can easily install packages like jq, curl, and xargs and not have a deathmatch with your computer
  • the user doesn't have useless forks that are of no interest to you
  • GitHub doesn't change the API by the time you read this or the comments above

Further I pray you don't have to try to install a Python or JS script (javascript script lol). God help those that do.

@gabrie30
Copy link

gabrie30 commented Mar 8, 2022

If you would prefer a CLI that can also clone all repos from a GitHub organization, you can download ghorg, then run...

# clone all user repos
ghorg clone <user> --clone-type=user --token=<PAT>

# clone all org repos
ghorg clone <org> --token=<PAT>

@zedorgan
Copy link

zedorgan commented Oct 21, 2022

Install jq.
Clone only original repos, not forks:
curl -s "https://api.github.com/users/USER/repos?per_page=100" | jq -r '.[] | select(.fork == false) | .clone_url' | xargs -L1 git clone

@Danipulok
Copy link

Hi all,
I wrote a simple script to clone all repositories from GitHub, either for a user or a whole organization. Please check it out; I'm sure you'll find it useful.

@ksaadDE
Copy link

ksaadDE commented Feb 1, 2023

Hi all I wrote a simple script to clone all the repositories from GitHub, either from a user or whole organization. Please check it out, I'm sure you'll find it useful

Nice script, though! Keep in mind that you only fetch 100 at a time and don't loop over the pages. So if you have more than 100 repos, you'll only see the first page. Hope that makes sense.

@vinceskahan
Copy link

I finally switched to 'gickup' in a Docker container for this (highly recommended!), which automagically downloads everything you have and anything you have access to.

@2KAbhishek
Copy link

Hi all I wrote a simple script to clone all the repositories from GitHub, either from a user or whole organization. Please check it out, I'm sure you'll find it useful

Nice script, though! Keep in mind that you only fetch 100 at a time and don't loop over the pages. So if you have more than 100 repos, you'll only see the first page. Hope that makes sense.

I encountered the same issue and made this helper; it has a bunch of other useful utilities for managing your GitHub projects too.

@ksaadDE
Copy link

ksaadDE commented Feb 5, 2023

I encountered the same issue and made this helper; it has a bunch of other useful utilities for managing your GitHub projects too.

((page_count = public_repos / 100 + 1))

+10 for this smart move: you fetch the repo count from the API and then use it for pagination to extend the crawl limit.

I'll try your tool at some point; no ongoing tasks for that at the moment. But still, good job! 👍
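The quoted arithmetic can be checked in isolation. A minimal sketch of the page-count calculation; note that integer division truncates, so an exact multiple of 100 yields one extra (empty) page, which is harmless since the extra request just returns an empty array:

```shell
# How many 100-repo pages are needed for a given public_repos count.
# Integer division truncates, so we add 1; exact multiples of 100 cost one
# extra request that comes back empty.
pages_needed() {
  echo $(( $1 / 100 + 1 ))
}
```

For example, 250 repos need 3 pages, and 5 repos need 1.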
