from oauth2client.service_account import ServiceAccountCredentials
import httplib2
import json
import pandas as pd

# https://developers.google.com/search/apis/indexing-api/v3/prereqs#header_2
JSON_KEY_FILE = "json_key_file_downloaded_after_creating_your_google_service_account_see_above_details_on_how_to_do.json"
SCOPES = ["https://www.googleapis.com/auth/indexing"]

credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
http = credentials.authorize(httplib2.Http())

def indexURL(urls, http):
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    for u in urls:
        content = {}
        content['url'] = u.strip()
        content['type'] = "URL_UPDATED"
        json_ctn = json.dumps(content)

        response, content = http.request(ENDPOINT, method="POST", body=json_ctn)
        result = json.loads(content.decode())

        # For debugging purposes only
        if "error" in result:
            print("Error({} - {}): {}".format(result["error"]["code"], result["error"]["status"], result["error"]["message"]))
        else:
            print("urlNotificationMetadata.url: {}".format(result["urlNotificationMetadata"]["url"]))
            print("urlNotificationMetadata.latestUpdate.url: {}".format(result["urlNotificationMetadata"]["latestUpdate"]["url"]))
            print("urlNotificationMetadata.latestUpdate.type: {}".format(result["urlNotificationMetadata"]["latestUpdate"]["type"]))
            print("urlNotificationMetadata.latestUpdate.notifyTime: {}".format(result["urlNotificationMetadata"]["latestUpdate"]["notifyTime"]))

"""
my_data.csv has 2 columns: URL and date.
I just need the URL column.
"""
csv = pd.read_csv("my_data.csv")
csv[["URL"]].apply(lambda x: indexURL(x, http))
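One thing worth noting: the gist posts the JSON body without an explicit Content-Type header, while the Indexing API documentation shows requests sent as application/json. A minimal sketch of building one notification payload (the `make_notification` helper name is mine, not part of the gist):

```python
import json

def make_notification(url, kind="URL_UPDATED"):
    """Build the JSON body for one Indexing API publish call.

    kind is "URL_UPDATED" or "URL_DELETED", per the Indexing API docs.
    """
    return json.dumps({"url": url.strip(), "type": kind})

body = make_notification(" https://example.com/page-1 ")

# The request itself would then be sent with an explicit JSON header:
# http.request(ENDPOINT, method="POST", body=body,
#              headers={"Content-Type": "application/json"})
print(body)
```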
More of my websites:
http://happinesspig.com, http://premiumpig.com, http://3apig.com, http://wow.esdlife.com
How can I determine whether indexing has completed or not?
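For what it's worth, the Indexing API also exposes a read-only metadata endpoint (`GET /v3/urlNotifications/metadata?url=...`) that returns the most recent notifications Google received for a URL. Note it reports notification history, not whether the page is actually indexed; for that, use URL Inspection in Search Console. A sketch of building the request URL, assuming the authorized `http` object from the gist:

```python
from urllib.parse import quote

METADATA_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications/metadata"

def metadata_request_url(page_url):
    # The url query parameter must be percent-encoded, including ":" and "/".
    return "{}?url={}".format(METADATA_ENDPOINT, quote(page_url, safe=""))

url = metadata_request_url("https://example.com/page-1")
# response, content = http.request(url, method="GET")  # authorized http from the gist
print(url)
```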
@ZAKhan123 Please make sure that you own those URLs.
By "own" I mean you can verify your URL ownership in Google Search Console.
If you do not "own" your URL, you cannot index it.
Hope it helps.
I have submitted lots of URLs for our website https://allpng.in/
But after checking in Google Search Console, I got the message 'URL is not on Google', like this.
Hi Amit,
I followed all the steps, but I got this error and even tried hard to resolve it myself. I am using Python 3.11.1.

Traceback (most recent call last):
  File "D:\Tutorials\indexing.py", line 11, in <module>
    credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
FileNotFoundError: [Errno 2] No such file or directory: 'apidetails'

Any help will be highly appreciated.
Thank you
@confusedkuala The error message already told you: there is a quota for indexing URLs. So please continue indexing the next day.
Regarding the FileNotFoundError above: it said it cannot find your JSON_KEY_FILE. Please make sure you are using the correct path.
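If it helps, a quick way to debug that FileNotFoundError is to print where Python is actually looking before building the credentials. A small sketch (the helper name is mine):

```python
import os
import sys

def check_key_file(path):
    # Resolve the path relative to the current working directory so you can
    # see exactly where Python is looking for the key file.
    full_path = os.path.abspath(path)
    exists = os.path.isfile(full_path)
    print("Looking for: {} -> {}".format(full_path, "found" if exists else "NOT FOUND"))
    return exists

# Demonstration with a file that certainly exists (the interpreter itself)
# and one that does not:
found = check_key_file(sys.executable)
missing = check_key_file("no_such_key_file.json")
```

Run `check_key_file(JSON_KEY_FILE)` before calling `from_json_keyfile_name`, and either use an absolute path or run the script from the folder that contains the key file.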
I have submitted lots of URLs for our website https://allpng.in/
But after checking in Google Search Console, I got the message 'URL is not on Google', like this.
Have you checked that URL today? When you use this tool, your URL does NOT get indexed immediately. It is up to Google to decide when to add your URL to their index.
So you can think of this tool as a "reminder" that notifies Google to add your URL to its index.
@confusedkuala The error message already told you: there is a quota for indexing URLs. So please continue indexing the next day.
Then how do I submit 1,000 URLs in one query? Help me with this.
You don't. You split it across 5 days, or you can check whether there are ways to increase the quota through your Google Cloud console.
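The arithmetic behind "5 days": the default Indexing API quota is (at the time of writing) 200 publish requests per project per day, and 1,000 / 200 = 5. A sketch of splitting the URL list into daily batches (`daily_batches` is my own helper, not part of the gist):

```python
DAILY_QUOTA = 200  # default Indexing API publish quota per project per day

def daily_batches(urls, quota=DAILY_QUOTA):
    # Split the URL list into consecutive chunks of at most `quota` items,
    # one chunk to submit per day.
    return [urls[i:i + quota] for i in range(0, len(urls), quota)]

urls = ["https://example.com/page-{}".format(i) for i in range(1000)]
batches = daily_batches(urls)
print(len(batches), len(batches[0]))  # 5 batches of 200 URLs each
```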
@confusedkuala The error message already told you: there is a quota for indexing URL. So please do the indexing next day.
then how do i submit 1000 urls in one query. Help me with this
You don't. You split it into 5 days, or you can check if there are ways to increase quota through your Google cloud console.
Okay, thank you so much.
@alucard001
Please help! Why is it not working? What is the issue?
@weirdbozer found a newer version of this: https://pypi.org/project/gsc-bulk-indexer/. You can try it out.
I want to remove 404 URLs with the help of this code.
Please help: how can I do that?
Which line of the code would need to be edited?
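Per the Indexing API docs, removal is signalled by sending type "URL_DELETED" instead of "URL_UPDATED" to the same publish endpoint, so in the gist the relevant line is `content['type'] = "URL_UPDATED"` inside `indexURL`. A hedged sketch of the deletion payload (`deletion_payload` is a name I made up):

```python
import json

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def deletion_payload(url):
    # Same publish endpoint as the gist; only the notification type changes.
    return json.dumps({"url": url.strip(), "type": "URL_DELETED"})

payload = deletion_payload("https://example.com/removed-page")
# response, content = http.request(ENDPOINT, method="POST", body=payload)
print(payload)
```

This tells Google the page has been removed; the page itself should also actually return a 404 or 410, since Google will recrawl to verify.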
I solved this problem after a lot of trial and error.
If you need help with this script, please contact me on Telegram.
I see that a lot of people have issues with this code. I've created a script that checks the index status of your URLs and submits non-indexed ones to Google Search. Please consider using it: https://github.com/rus779/GSearch-Index-Submit
Thank you so much!