• Hasan

    Administrator
    June 30, 2025 at 8:57 am

    First, check if your posts have the following meta tag:

    <meta name="robots" content="noindex, nofollow">

    and check your robots.txt file for Disallow rules that might block those URLs.
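
    If you want to check a post from a script instead of by hand, here is a rough sketch using the requests library (the URL is just a placeholder for one of your posts, and the string match is deliberately crude):

    import requests

    URL = "https://yourdomain.com/news/article-1"  # placeholder: one of your post URLs

    # Crude check for a robots meta tag in the page source;
    # view the source manually if this matches.
    html = requests.get(URL, timeout=10).text.lower()
    if 'name="robots"' in html and "noindex" in html:
        print("Possible noindex meta tag - check the page source to confirm")
    else:
        print("No obvious noindex meta tag")

    # Print robots.txt so you can look for Disallow rules
    print(requests.get("https://yourdomain.com/robots.txt", timeout=10).text)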

    Also check the HTTP headers, just in case.

    Sometimes a server sends an X-Robots-Tag: noindex header. You can check this in Chrome DevTools → Network → your post → Headers.
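
    Or from a script, if that is easier than DevTools (again, the URL is a placeholder):

    import requests

    URL = "https://yourdomain.com/news/article-1"  # placeholder: one of your post URLs

    # A HEAD request is enough to read the headers;
    # switch to requests.get() if the server blocks HEAD.
    resp = requests.head(URL, allow_redirects=True, timeout=10)
    tag = resp.headers.get("X-Robots-Tag")
    print(f"X-Robots-Tag: {tag}" if tag else "No X-Robots-Tag header on this URL")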

    Second, is it a new website? What is its DA (Domain Authority)?

    I ask because publishing 50–100 posts/day can trigger Google’s “low quality content” filter.


    Third, this is from Google’s Indexing API documentation:

    The Indexing API allows site owners to directly notify Google when their job posting or livestreaming video pages are added or removed. This allows Google to schedule pages for a fresh crawl, which can lead to higher quality user traffic. The Indexing API can only be used to crawl pages with either JobPosting or BroadcastEvent embedded in a VideoObject. For websites with many short-lived pages like job postings or livestream videos, the Indexing API keeps content fresh in search results because it allows updates to be pushed individually.


    If you still want to do it at your own risk, here is a sample Python script:

    from googleapiclient.discovery import build
    from oauth2client.service_account import ServiceAccountCredentials

    # Load the service account credentials with the Indexing API scope
    SCOPES = ["https://www.googleapis.com/auth/indexing"]
    JSON_KEY_FILE = "service-account.json"  # replace with your key file

    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        JSON_KEY_FILE, scopes=SCOPES)
    service = build("indexing", "v3", credentials=credentials)

    def notify_url(url):
        # Tell Google the URL was added/updated so it can schedule a recrawl
        body = {
            "url": url,
            "type": "URL_UPDATED",
        }
        response = service.urlNotifications().publish(body=body).execute()
        print(response)

    # List of URLs to reindex (placeholders: use your own post URLs)
    urls = [
        "https://yourdomain.com/news/article-1",
        "https://yourdomain.com/news/article-2",
    ]

    for url in urls:
        notify_url(url)
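
    One more thing: the service account’s email address must be added as an Owner of your property in Search Console, or every call fails with a permission error. And last I checked, the default quota is about 200 publish requests per day, so you cannot push hundreds of posts at once anyway.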