Reads in a list of URLs (from a CSV file), writes any redirected URLs to a new CSV file, and prints out missing (404) URLs to the console
import csv
import urllib2

# Read URLs from urls.csv (one per row), follow each one, write any that
# redirect to urls-output.csv as "original,redirected" pairs, and print
# any that return a 404.
with open('urls.csv', 'rb') as f:
    reader = csv.reader(f)
    with open('urls-output.csv', 'w') as g:
        writer = csv.writer(g, delimiter=',')
        for row in reader:
            url = row[0]
            try:
                r = urllib2.urlopen(url)
            except urllib2.HTTPError as e:
                r = e  # HTTPError carries .code and .geturl(); a bare URLError doesn't
            if r.code == 200 and url != r.geturl():
                writer.writerow([url, r.geturl()])  # URL was redirected
            elif r.code == 404:
                print("Not found: " + r.geturl())   # URL no longer exists
I use this to update mappings between Ghost (blog) and Disqus (comments). They seem to get out of whack, either because Disqus maps to the GUID version of the URL when I preview a draft, or because I change the title after it's been published and Disqus can't handle that.
After downloading a URL map file from Disqus, rename the file to "urls.csv". It should be a single-column file with one URL per row.
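For instance, a trimmed-down urls.csv might look something like this (placeholder URLs, just to show the shape of the input, not rows from a real Disqus export):

https://example.com/old-post-title/
https://example.com/another-post/
https://example.com/a-deleted-post/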
Copy this Python script into the same directory and run it, and it should create a "urls-output.csv" file with two columns. The first column is the original URL and the second is the redirected URL, like this:
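(These rows are hypothetical placeholders continuing the sample input above, not real output.)

https://example.com/old-post-title/,https://example.com/old-post-title-renamed/
https://example.com/another-post/,https://example.com/another-post-updated/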
URLs that no longer exist don't show up in the output file at all; when a URL returns a 404, the script prints it to the console instead. If a URL returns a 200 OK but wasn't redirected (the final URL is the same as the original), the script just moves on to the next one without writing to the file or printing anything to the console.