• Automate Google indexing URL submission

    Posted by Binu Antony on June 29, 2025 at 2:49 pm

    I want to eliminate the process of manually submitting URLs in Google Search Console (GSC) for indexing. This becomes tedious for websites that need bulk URL submissions.

    I need a solution I can set up using WordPress code snippets and the Google Indexing API. Please help with the respective steps.

    Hasan replied 3 weeks, 4 days ago · 2 Members · 3 Replies
  • 3 Replies
  • Hasan

    Administrator
    June 30, 2025 at 7:44 am

    Hi Binu, why are you submitting the URLs manually anyway?

    Google will crawl your website anyway. Did you add it to Google Search Console?

    In my experience, these simple tricks don't affect your traffic; what really matters is the content of the web page.

    Read the note on the Rank Math website:

    https://rankmath.com/wordpress/plugin/instant-indexing/

    They state it clearly:

    Note: Google recommends that you use the Indexing API ONLY for Job Posting and Live Streaming websites. However, it works on any type of website and many of our users have seen great results already. Please proceed with caution.

    I myself don't recommend this.

  • Binu Antony

    Member
    June 30, 2025 at 8:26 am

    @hasan This is for bigger, code-based websites, not necessarily WordPress, that publish hundreds of URLs daily through programmatic SEO.
    Most of their URLs end up in a not-indexed status.
    Not indexed means not ranking.

    • Hasan

      Administrator
      June 30, 2025 at 8:57 am

      First, check if your posts have the following meta tag:

      <meta name="robots" content="noindex, nofollow">

      and check the robots.txt file.

      Also check the HTTP headers, just in case.

      Sometimes a server sends an X-Robots-Tag: noindex header. You can check this in Chrome DevTools → Network → your post → Headers.
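      If you want to check a batch of URLs in one go, here is a rough sketch in Python using the requests library. The URL list is just a placeholder, and the meta-tag check is a naive string search rather than a real HTML parse:

      import requests

      # Placeholder URLs - replace with the pages you want to audit
      urls = [
          "https://yourdomain.com/news/article-1",
          "https://yourdomain.com/news/article-2",
      ]

      for url in urls:
          resp = requests.get(url, timeout=10)

          # 1. Check the X-Robots-Tag HTTP header
          header = resp.headers.get("X-Robots-Tag", "")
          if "noindex" in header.lower():
              print(f"{url}: blocked by X-Robots-Tag header ({header})")

          # 2. Naive check for a robots meta tag containing noindex
          html = resp.text.lower()
          if 'name="robots"' in html and "noindex" in html:
              print(f"{url}: robots meta tag may contain noindex")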

      Second, is it a new website? What is the DA?

      Because publishing 50–100 posts/day can trigger Google's "low quality content" filter.


      Third, this is from the Google docs:

      The Indexing API allows site owners to directly notify Google when their job posting or livestreaming video pages are added or removed. This allows Google to schedule pages for a fresh crawl, which can lead to higher quality user traffic. The Indexing API can only be used to crawl pages with either JobPosting or BroadcastEvent embedded in a VideoObject. For websites with many short-lived pages like job postings or livestream videos, the Indexing API keeps content fresh in search results because it allows updates to be pushed individually.


      If you still want to do it at your own risk, here is a sample Python script:

      from googleapiclient.discovery import build
      from oauth2client.service_account import ServiceAccountCredentials

      # Load credentials
      SCOPES = ["https://www.googleapis.com/auth/indexing"]
      JSON_KEY_FILE = "service-account.json"  # replace with your key file

      credentials = ServiceAccountCredentials.from_json_keyfile_name(
          JSON_KEY_FILE, scopes=SCOPES)

      service = build('indexing', 'v3', credentials=credentials)

      def notify_url(url):
          body = {
              "url": url,
              "type": "URL_UPDATED"
          }
          response = service.urlNotifications().publish(body=body).execute()
          print(response)

      # List of URLs to reindex
      urls = [
          "https://yourdomain.com/news/article-1",
          "https://yourdomain.com/news/article-2"
      ]

      for url in urls:
          notify_url(url)
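      For this to work, the Indexing API must be enabled in your Google Cloud project and the service account's email address must be added as an Owner of the property in Search Console. Install the dependencies with pip install google-api-python-client oauth2client before running the script.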
