@alucard001
Created May 27, 2019 04:49
Google Indexing API V3 Working example with Python 3
from oauth2client.service_account import ServiceAccountCredentials
import httplib2
import json
import pandas as pd

# https://developers.google.com/search/apis/indexing-api/v3/prereqs#header_2
JSON_KEY_FILE = "json_key_file_downloaded_after_creating_your_google_service_account_see_above_details_on_how_to_do.json"
SCOPES = ["https://www.googleapis.com/auth/indexing"]

credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
http = credentials.authorize(httplib2.Http())

def indexURL(urls, http):
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    for u in urls:
        content = {
            "url": u.strip(),
            "type": "URL_UPDATED",
        }
        json_ctn = json.dumps(content)

        response, content = http.request(ENDPOINT, method="POST", body=json_ctn)
        result = json.loads(content.decode())

        # For debugging purposes only
        if "error" in result:
            print("Error({} - {}): {}".format(
                result["error"]["code"],
                result["error"]["status"],
                result["error"]["message"]))
        else:
            print("urlNotificationMetadata.url: {}".format(
                result["urlNotificationMetadata"]["url"]))
            print("urlNotificationMetadata.latestUpdate.url: {}".format(
                result["urlNotificationMetadata"]["latestUpdate"]["url"]))
            print("urlNotificationMetadata.latestUpdate.type: {}".format(
                result["urlNotificationMetadata"]["latestUpdate"]["type"]))
            print("urlNotificationMetadata.latestUpdate.notifyTime: {}".format(
                result["urlNotificationMetadata"]["latestUpdate"]["notifyTime"]))

"""
my_data.csv has 2 columns: URL and date.
I just need the URL column.
"""
csv = pd.read_csv("my_data.csv")
csv[["URL"]].apply(lambda x: indexURL(x, http))
@iAMido

iAMido commented Jul 6, 2022

Thank you @alucard001 it seems to work well.

I do have a question: at the end I receive these lines as part of the output:

URL None
dtype: object

Is that OK?

Appreciate your help

@nafatrue

nafatrue commented Jul 8, 2022

@geraldoam

Thank you so much!

@jrgallibot

More of my websites:

http://happinesspig.com, http://premiumpig.com, http://3apig.com, http://wow.esdlife.com

How can I determine whether the indexing is complete or not?

@ZAKhan123

Hi,
I did all the steps you mentioned in your YouTube video (https://www.youtube.com/watch?v=Wwo8YbUHlj8), but it's still not working for me. I'm getting "Error(403 - PERMISSION_DENIED): Permission denied. Failed to verify the URL ownership".


Thanks. :-)

@alucard001
Author

@ZAKhan123 Please make sure that you own those URLs.
By "own" I mean you can verify your URL ownership in Google Search Console.

If you do not "own" a URL, you cannot index it.

Hope it helps.

@umeshbedi

umeshbedi commented Dec 9, 2022

I have submitted lots of URLs for our website https://allpng.in/

But after checking in Google Search Console, I got the message "URL is not on Google".

@Nabeelahmed55

Hi Amit,
I followed all the steps, but I hit this error and even tried hard to resolve it myself. I am using Python 3.11.1.

Traceback (most recent call last):
  File "D:\Tutorials\indexing.py", line 11, in <module>
    credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
FileNotFoundError: [Errno 2] No such file or directory: 'apidetails'

any help will be highly appreciated.
Thank you

@jrgallibot

jrgallibot commented Dec 12, 2022 via email

@confusedkuala

[error screenshot]

Hello, thank you for this code. It worked well for 200 URLs but showed the above error for the rest; I had 500 URLs in the CSV. Can you please help me with this?

@alucard001
Author

@confusedkuala The error message already tells you: there is a daily quota for indexing URLs. So please continue the indexing the next day.

@alucard001
Author

Hi Amit, I followed all the steps, but I found this error and even tried hard to resolve it myself. i am using Python 3.11.1 version. Traceback (most recent call last): File "D:\Tutorials\indexing.py", line 11, in credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES) FileNotFoundError: [Errno 2] No such file or directory: 'apidetails'

any help will be highly appreciated. Thank you

It says it cannot find your JSON_KEY_FILE; please make sure you are using the correct path.
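One way to catch a wrong path early is a small check before building the credentials, so the failure names the missing file explicitly. A minimal sketch (the `resolve_key_file` helper and its filename are hypothetical, not part of the gist):

```python
import os

def resolve_key_file(filename, base_dir=None):
    """Return the absolute path to the service-account key, failing fast
    with a clear message instead of a bare FileNotFoundError later on."""
    base_dir = base_dir or os.getcwd()
    path = os.path.abspath(os.path.join(base_dir, filename))
    if not os.path.isfile(path):
        raise FileNotFoundError("Service account key not found: {}".format(path))
    return path
```

Passing the resolved path to `ServiceAccountCredentials.from_json_keyfile_name` then makes the script independent of which directory you run it from.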

@alucard001
Author

I have submitted lots of URLs for our website https://allpng.in/

But after checking in Google Search Console, I got the message "URL is not on Google".

Have you checked that URL today? Using this tool does not get your URL indexed immediately; it is up to Google to decide when to add your URL to their index.

So you can think of this tool as a "reminder" that notifies Google to add your URL to its index.
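To see what Google has recorded for a URL after notifying it, the Indexing API also exposes a read-only metadata endpoint (`getMetadata`) that returns the latest notification it received. Note that it reports notifications, not whether the page is actually indexed; actual index status is still checked in Search Console. A minimal sketch that reuses the authorized `http` object from the gist (`check_notification_status` is a hypothetical helper name):

```python
import json
from urllib.parse import quote

METADATA_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications/metadata"

def build_metadata_request(url):
    """Build the GET request URL; the target URL must be percent-encoded."""
    return "{}?url={}".format(METADATA_ENDPOINT, quote(url, safe=""))

def check_notification_status(url, http):
    """Fetch the latest notifications the API has on record for this URL.

    `http` is the authorized httplib2.Http instance from the gist above.
    """
    response, content = http.request(build_metadata_request(url), method="GET")
    return json.loads(content.decode())
```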

@confusedkuala

@confusedkuala The error message already told you: there is a quota for indexing URL. So please do the indexing next day.

Then how do I submit 1000 URLs in one query? Help me with this.

@alucard001
Author

@confusedkuala The error message already told you: there is a quota for indexing URL. So please do the indexing next day.

then how do i submit 1000 urls in one query. Help me with this

You don't. You split it over 5 days, or you can check whether there are ways to increase the quota through your Google Cloud console.
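The splitting can be scripted rather than done by hand. A minimal sketch, assuming the default publish quota of 200 requests per day (your project's actual quota is shown in the Google Cloud console); `chunk` is a hypothetical helper, not part of the gist:

```python
def chunk(urls, size=200):
    """Split a list of URLs into batches of at most `size` entries."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# Hypothetical usage: run one batch per day against indexURL from the gist.
# for day, batch in enumerate(chunk(all_urls), start=1):
#     print("Day {}: {} URLs".format(day, len(batch)))
#     indexURL(batch, http)
```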

@confusedkuala

@confusedkuala The error message already told you: there is a quota for indexing URL. So please do the indexing next day.

then how do i submit 1000 urls in one query. Help me with this

You don't. You split it into 5 days, or you can check if there are ways to increase quota through your Google cloud console.

Okay, thank you so much.

@confusedkuala

I am getting this error quite a few times. I checked the URL, and everything seems right.

[error screenshot]

@confusedkuala

Can you please help me with this error?

[error screenshot]

@OracleAPIindex

Hello, help here please: my code gave this error.

[error screenshot]

@RCC2024

RCC2024 commented Nov 28, 2023

@weirdbozer

@alucard001
Please help! Why is it not working? What is the issue?

[error screenshot]

@gplhegde

gplhegde commented Feb 6, 2024

@weirdbozer I found a newer version of this, https://pypi.org/project/gsc-bulk-indexer/. You can try it out.

@Jameswilliam1122

I want to remove 404 URLs with the help of this code. Please help me: how can I do that?

Which lines of the code would need to be edited?
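The publish endpoint also accepts the notification type URL_DELETED for pages that have been removed, so one approach is a variant of indexURL that sends that type instead of URL_UPDATED. A minimal sketch (`deleteURL` and `build_delete_notification` are hypothetical helpers, not part of the gist):

```python
import json

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_delete_notification(url):
    """JSON body telling Google the URL has been removed."""
    return json.dumps({"url": url.strip(), "type": "URL_DELETED"})

def deleteURL(urls, http):
    """Same flow as indexURL in the gist, but notifying a deletion.

    `http` is the authorized httplib2.Http instance from the gist above.
    """
    for u in urls:
        response, content = http.request(
            ENDPOINT, method="POST", body=build_delete_notification(u))
        print(json.loads(content.decode()))
```

The same daily quota applies to URL_DELETED notifications as to URL_UPDATED ones, since both go through the same publish endpoint.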

@vaiyrian

I solved this problem after a lot of trial and error.
If you need help with this script, please contact me on Telegram.

https://t.me/ordmoong

@rus779

rus779 commented Jul 14, 2024

I see that a lot of people have issues with this code. I've created a script that checks the index status of your URLs and submits the non-indexed ones to Google Search; please consider using it: https://github.com/rus779/GSearch-Index-Submit
