@milanboers
Last active September 12, 2024 11:52
Clone all repositories of a Github user
curl -s https://api.github.com/users/milanboers/repos | grep '"clone_url"' | awk '{print $2}' | sed -e 's/"//g' -e 's/,//g' | xargs -n1 git clone
tecno14 commented Oct 15, 2021

> this one doesn't work without modification, and doesn't paginate

Yep, it needs a modification to set the username.
But sorry, what did you mean by "doesn't paginate"?
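For context: the API returns at most 100 repos per request (30 by default), so a user with more repos needs the `page` parameter. A rough sketch of a loop that keeps fetching pages until one comes back empty (the function name and username are placeholders, not part of the one-liner above):

```shell
# Keep requesting pages until the API returns an empty page.
clone_all_pages() {
  user="$1" page=1
  while :; do
    urls=$(curl -s "https://api.github.com/users/$user/repos?per_page=100&page=$page" \
      | grep '"clone_url"' | awk '{print $2}' | tr -d '",')
    [ -z "$urls" ] && break
    echo "$urls" | xargs -n1 git clone
    page=$((page + 1))
  done
}

# usage: clone_all_pages milanboers
```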

@worstname

This worked for me (replace the username):

curl -s "https://api.github.com/users/raysan5/repos?per_page=100" | jq -r ".[].clone_url" | xargs -L1 git clone

Hopefully:

  • you can easily install packages like jq, curl, and xargs and not have a deathmatch with your computer
  • the user doesn't have useless forks that are of no interest to you
  • github doesn't change the api by the time you read this or above comments

Further I pray you don't have to try to install a Python or JS script (javascript script lol). God help those that do.


gabrie30 commented Mar 8, 2022

If you would prefer a CLI that can also clone all repos from a GitHub organization, you can download ghorg, then run:

# clone all user repos
ghorg clone <user> --clone-type=user --token=<PAT>

# clone all org repos
ghorg clone <org> --token=<PAT>


zedorgan commented Oct 21, 2022

Install jq. This clones only the original repos, not forks:
curl -s "https://api.github.com/users/USER/repos?per_page=100" | jq -r '.[] | select(.fork == false) | .clone_url' | xargs -L1 git clone

@Danipulok

Hi all!
I wrote a simple script to clone all the repositories from GitHub, either for a user or a whole organization. Please check it out; I'm sure you'll find it useful.


ksaadDE commented Feb 1, 2023

> Hi all I wrote a simple script to clone all the repositories from GitHub, either from a user or whole organization. Please check it out, I'm sure you'll find it useful

Nice script, though! Keep in mind that you only fetch 100 repos per page and don't loop over the pages, so for a user with more than 100 repos you'll never see anything beyond the first page. Hope that makes sense.

@vinceskahan

I finally switched to 'gickup' in a Docker container for this (highly recommend!), which automagically downloads everything you have and anything you have access to.

@2KAbhishek

> Hi all I wrote a simple script to clone all the repositories from GitHub, either from a user or whole organization. Please check it out, I'm sure you'll find it useful

> Nice script though! Keep in mind that you just crawl 100 each and you don't "loop" over that. So if you have > 100 you will only see 100 per page and no more pages. Hope it makes sense.

I encountered the same issue and made this helper; it has a bunch of other useful utilities for managing your GitHub projects, too.


ksaadDE commented Feb 5, 2023

> I encountered the same issue and made this helper has a bunch of other useful utilities too for managing your github projects

((page_count = public_repos / 100 + 1))

+10 for this smart move: you fetch the repo count from the API and then use it for pagination, which removes the crawl limit.

Will try your tool at some time, atm no ongoing tasks for that. But still good job! 👍
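For anyone who wants the same trick without installing the helper, here is a minimal sketch of that page-count approach (assuming jq is installed; the function names are made up for illustration, and nothing runs until you call clone_all yourself):

```shell
# Sketch: compute the number of 100-repo pages from the user's
# public_repos count, then fetch each page.
page_count_for() {
  # public_repos / 100 + 1; at most one extra (empty) page, which is harmless
  echo $(( $1 / 100 + 1 ))
}

clone_all() {
  user="$1"
  repos=$(curl -s "https://api.github.com/users/$user" | jq -r '.public_repos')
  for page in $(seq 1 "$(page_count_for "$repos")"); do
    curl -s "https://api.github.com/users/$user/repos?per_page=100&page=$page" \
      | jq -r '.[].clone_url' \
      | xargs -L1 git clone
  done
}

# usage: clone_all milanboers
```

Note this only sees public repos; for private ones you'd need an authenticated request.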
