IF there are 100 or fewer repos, simply run:
curl -u $YOURUSERNAME -s "https://api.github.com/orgs/$ORGNAME/repos?per_page=100" | ruby -e 'require "json"; JSON.load(STDIN.read).each { |repo| %x[git clone #{repo["ssh_url"]}] }'
(per https://gist.github.com/caniszczyk/3856584 )
and wait.
(Note: when prompted for a password, GitHub does not mean your web password; it means a personal access token (OAuth token). All command-line password prompts against the API actually expect a token. See https://github.com/blog/1509-personal-api-tokens to generate one.)
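You can also skip the interactive prompt entirely: curl accepts a user:token pair in its -u flag. A minimal sketch, assuming $GITHUB_TOKEN is a variable you set to hold the token you generated:

```shell
# Pass the token inline rather than typing it at the password prompt.
# $GITHUB_TOKEN is an assumed variable holding your personal API token.
curl -u "$YOURUSERNAME:$GITHUB_TOKEN" -s \
  "https://api.github.com/orgs/$ORGNAME/repos?per_page=100"
```

Keep in mind the token then ends up in your shell history, so prefer reading it from a file or environment on shared machines.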
IF you have more than 100 repos to pull, this method stops at 100 and won't proceed. Instead, run the following command multiple times, updating the page number each time.
$YOURUSERNAME is your GitHub username and $ORGNAME is the org you are backing up:
curl -u $YOURUSERNAME -s "https://api.github.com/orgs/$ORGNAME/repos?per_page=100&page=1" | grep -e '"ssh_url"' | cut -d \" -f 4 | xargs -L1 git clone
then increment the page number at &page=1 and run again, until a run clones nothing new.
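If you'd rather not re-run and edit the command by hand, the same pipeline can be wrapped in a loop that stops when a page comes back empty. A sketch, assuming the same $YOURUSERNAME and $ORGNAME variables (you will be prompted for the token once per page unless you embed it in the -u value):

```shell
# Fetch pages of repos until the API returns no more ssh_url entries.
page=1
while : ; do
  urls=$(curl -u "$YOURUSERNAME" -s \
    "https://api.github.com/orgs/$ORGNAME/repos?per_page=100&page=$page" \
    | grep -e '"ssh_url"' | cut -d \" -f 4)
  # An empty page means we have walked past the last repo.
  [ -z "$urls" ] && break
  printf '%s\n' "$urls" | xargs -L1 git clone
  page=$((page + 1))
done
```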
If this is a user account rather than an organisation account, change orgs/$ORGNAME
to users/$USERNAME
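For reference, a sketch of the user-account form of the same command; the users/ endpoint is GitHub's per-user repo listing, and everything after the URL is unchanged:

```shell
# Same pipeline against the per-user endpoint instead of the org endpoint.
curl -u $YOURUSERNAME -s "https://api.github.com/users/$USERNAME/repos?per_page=100&page=1" \
  | grep -e '"ssh_url"' | cut -d \" -f 4 | xargs -L1 git clone
```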
per https://stackoverflow.com/questions/19576742/how-to-clone-all-repos-at-once-from-github