import requests
import re

def commitCount(u, r):
    """ Return the total number of commits of the u/r GitHub repo """
    return re.search('\d+$', requests.get('https://api.github.com/repos/{}/{}/commits?per_page=1'.format(u, r)).links['last']['url']).group()

def latestCommitInfo(u, r):
    """ Get info about the latest commit of a GitHub repo """
    response = requests.get('https://api.github.com/repos/{}/{}/commits?per_page=1'.format(u, r))
    commit = response.json()[0]
    commit['number'] = re.search('\d+$', response.links['last']['url']).group()
    return commit
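A quick usage sketch of the functions above (the owner/repo pair is only an example; any public repo works):

# Illustrative calls, not part of the gist itself.
print(commitCount('octocat', 'Hello-World'))          # total commit count, returned as a string
latest = latestCommitInfo('octocat', 'Hello-World')
print(latest['sha'], latest['number'], latest['commit']['message'])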
curl -I -k "https://api.github.com/repos/:owner/:repo/commits?per_page=1" | sed -n '/^[Ll]ink:/ s/.*"next".*page=\([0-9]*\).*"last".*/\1/p'
### And that's all!
# I saw many people fighting with finding the first commit SHA or similar fancy tricks.
# Here we just rely on the GitHub API: ask for commits at 1 per page and parse the last page number from the Link header of the reply (whose body only holds the latest commit!).
# So this is robust and bandwidth-efficient. :)
# If one wants the commit count for a specific SHA, just use:
curl -I -k "https://api.github.com/repos/:owner/:repo/commits?per_page=1&sha=:sha" | sed -n '/^[Ll]ink:/ s/.*"next".*page=\([0-9]*\).*"last".*/\1/p'
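For reference, a minimal sketch of what the sed expression (and response.links above) extracts; the Link header below is made up, with a fictitious repository id and page count:

import re

# Made-up example of the Link header GitHub returns for /commits?per_page=1.
# With one commit per page, the page number of the rel="last" link is the commit count.
link_header = (
    '<https://api.github.com/repositories/123456/commits?per_page=1&page=2>; rel="next", '
    '<https://api.github.com/repositories/123456/commits?per_page=1&page=4321>; rel="last"'
)
print(re.search(r'page=(\d+)>; rel="last"', link_header).group(1))  # -> 4321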
@codsane: nice fork, but I find your regex a bit complex actually. How about
re.search('\d+$', requests.get('https://api.github.com/repos/{}/{}/commits?per_page=1'.format(u, r)).links['last']['url']).group()
?
@0penBrain Ahh thank you so much! That is indeed much better. response.links
is also great to know about, thank you!
Thanks, forked and adapted to my needs for Dart here: https://gist.github.com/Agondev/9a1290149b96074abfbf1ecc6ad90589
This post was very helpful for me. Thanks a lot!!
I've been working on reading stats via the REST API, especially to get the total commit count and a subtotal for a given period.
By adding the query options "since=" & "until=", the subtotal commit count can easily be obtained. But a "KeyError" is raised when there are no commits at all. So I added error handling like:
while True:
    try:
        n_commits = re.search(...)
        break
    except:
        n_commits = 0
        break
Is there a better way to handle errors/exceptions for 0 commits?
@deniskim82: you can store the request response in a variable and check it, e.g.:
res = requests.get('https://api.github.com/repos/{}/{}/commits?per_page=1'.format(u, r))
if 'last' in res.links:  # the 'last' link is absent when there is no pagination
    return re.search('\d+$', res.links['last']['url']).group()
return None
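Putting the two ideas together, here is a hedged sketch (the function name, the fallback behaviour, and the timestamp example are mine, not from the gist) that accepts the since/until filters and returns 0 instead of raising when the 'last' link is missing:

import re
import requests

def commit_count(u, r, since=None, until=None):
    """Count commits of the u/r repo, optionally restricted to a time window."""
    params = {'per_page': 1}
    if since:
        params['since'] = since   # ISO 8601 timestamp, e.g. '2023-01-01T00:00:00Z'
    if until:
        params['until'] = until
    res = requests.get('https://api.github.com/repos/{}/{}/commits'.format(u, r), params=params)
    if 'last' in res.links:
        return int(re.search(r'\d+$', res.links['last']['url']).group())
    # No 'last' link: either the window holds 0 or 1 commits (a single page),
    # or the request failed (e.g. an empty repository).
    return len(res.json()) if res.ok else 0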