Google Merchant Center Website Unreachable: How to Fix It (2026)

A "website unreachable" error from Google Merchant Center means Google's crawler cannot access your website when it tries to verify product information or check compliance. This can happen for a variety of reasons — from server downtime to crawl blocking — and it can prevent your products from appearing in Google Shopping or trigger an account suspension.

This guide walks through all the common causes of website unreachable errors in Google Merchant Center and how to diagnose and fix each one.

What "Website Unreachable" Means in Google Merchant Center

When Google Merchant Center shows a "website unreachable" status, it means that Googlebot (Google's web crawler) attempted to visit your website or specific product pages and failed to load them. This can happen at the account level (Google can't reach your website at all) or at the product level (certain product pages return errors).

Common HTTP status codes that trigger this error include:

- 404 Not Found: the product page no longer exists at the submitted URL
- 403 Forbidden: the server refused the crawler's request
- 5xx errors (500, 502, 503): server errors or downtime
- Timeouts and DNS failures, which produce no status code at all

Common Causes of Website Unreachable Errors

1. Server Downtime or Instability

If your web server was down or experiencing issues at the time Google's crawler visited, all pages would return server errors. This can happen due to:

- Hosting outages or hardware failures
- Traffic spikes that exhaust server resources (CPU, memory, connection limits)
- Misconfigured deployments or maintenance windows
- Shared hosting plans hitting resource caps

If the downtime was temporary and your site is now accessible, Google's next crawl may resolve the issue automatically. However, if the downtime was extended or repeated, Google may have flagged your account.

2. DNS Configuration Issues

If your domain's DNS records are misconfigured or have propagation issues (common after domain transfers or nameserver changes), Google may not be able to resolve your domain to an IP address. This results in connection failures that look like "website unreachable."

DNS issues can also occur if your domain registration has expired — your registrar may take down the domain's DNS records, making the site inaccessible.
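As a quick diagnostic, you can check whether a domain resolves at all using Python's standard library. This is a sketch, not Google's own check, and the domain names below are placeholders:

```python
import socket

def resolve_domain(domain):
    """Return the IP addresses a domain resolves to, or an empty list
    if DNS resolution fails entirely (the failure mode that surfaces
    as "website unreachable")."""
    try:
        results = socket.getaddrinfo(domain, 443, proto=socket.IPPROTO_TCP)
        # Each result ends with an (address, port, ...) tuple;
        # collect the unique addresses.
        return sorted({r[4][0] for r in results})
    except socket.gaierror:
        return []

print(resolve_domain("yourdomain.com"))  # should list one or more IPs
print(resolve_domain("expired-or-broken-domain.invalid"))  # [] on DNS failure
```

If this returns an empty list for your own domain, fix DNS before looking at anything else: no crawler can reach a site it cannot resolve.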

3. Googlebot Being Blocked by robots.txt

If your robots.txt file has rules that block Googlebot from crawling your website or product pages, Google's crawler will report those pages as unreachable (specifically, as "blocked by robots.txt"). This is one of the most common technical mistakes merchants make.

A common configuration that causes this issue is a blanket disallow:

User-agent: *
Disallow: /

This blocks ALL crawlers from ALL pages.

4. Firewall or Security Blocking Google's IP Addresses

Some web hosting providers, CDNs, or security plugins (like Wordfence for WordPress) block requests from certain IP ranges. If Google's crawler IPs are included in a blocklist, the crawler gets a connection refused or timeout error.

This often happens accidentally after a security incident, when overly aggressive IP blocking rules are put in place.

5. Password Protection or Login Requirements

If your website (or specific pages Google is trying to crawl) requires a password to access, Google's crawler cannot view the content. This sometimes happens with:

- Staging or development sites accidentally left password-protected after launch
- Storefronts set to "coming soon" or maintenance mode
- Members-only or wholesale stores that require login to view products

6. Product Pages Moved or Deleted (404 Errors)

If you've changed product URLs, deleted products, or reorganized your store, the URLs that were previously submitted in your Merchant Center feed may now return 404 errors. Google's crawler visits those URLs to verify product information — when it gets a 404, it reports those products as unreachable.

7. Server-Side Blocking Based on User Agent

Some server configurations block requests that identify as bots or crawlers. Since Googlebot identifies itself clearly in its user agent string ("Googlebot"), a rule that blocks bot user agents will prevent Google from reaching your pages.
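A simple way to detect user-agent filtering is to request the same URL twice with different User-Agent headers and compare the results. This sketch uses only the standard library; the product URL in the comments is a placeholder:

```python
import urllib.request
import urllib.error

def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for a given
    User-Agent, or None if the connection itself fails."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # 403, 404, 5xx, etc.
    except urllib.error.URLError:
        return None     # DNS failure, refused connection, timeout

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

# Example (placeholder URL):
# print(fetch_status("https://yourdomain.com/product", BROWSER_UA))
# print(fetch_status("https://yourdomain.com/product", GOOGLEBOT_UA))
# If the browser UA gets 200 but the Googlebot UA gets 403 or None,
# the server is filtering requests by user agent.
```

Note that some firewalls block by IP rather than user agent, so a matching pair of 200s here does not fully rule out crawler blocking.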

How to Diagnose the Cause

Step 1: Visit Your Website Manually

First, verify your website is accessible from a regular browser. Try visiting the homepage and several product pages. If they load normally, the issue may be specific to Google's crawler rather than general server downtime.

Step 2: Check Your robots.txt

Visit https://yourdomain.com/robots.txt. Review the rules carefully. Look for any Disallow rules that might block Googlebot or all user agents (User-agent: *) from your product pages or site root.

A correctly configured robots.txt for an ecommerce site typically looks like:

User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow: /cart/
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml

There should be no Disallow: / or blanket disallow rules for your main content pages.
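You can also sanity-check robots.txt rules programmatically with Python's built-in parser. The rules below mirror the example above; the product and admin paths are hypothetical:

```python
import urllib.robotparser

# In practice you would point the parser at your live file with
# rp.set_url("https://yourdomain.com/robots.txt") and rp.read();
# here we parse the rules inline for illustration.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow: /cart/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/products/blue-widget"))  # True
print(rp.can_fetch("Googlebot", "/admin/settings"))        # False
```

Running your actual product URLs through can_fetch is a fast way to confirm that no rule accidentally covers them.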

Step 3: Use Google Search Console

Google Search Console's URL Inspection Tool lets you test if Google can crawl a specific URL. Enter your product page URLs and see what status Google gets when it tries to fetch them. The tool will show you the HTTP status code and any rendering issues.

Step 4: Check for Firewall Rules

If you use a security plugin, CDN (like Cloudflare), or your hosting provider's firewall, check the settings for any rules that might block crawlers or specific IP ranges. Google publishes its crawler IP addresses — verify they're not blocked.

Step 5: Verify Product Page URLs

In Google Merchant Center, download your product feed and check the URLs (link attribute) for products that are showing as unreachable. Manually visit those URLs to verify they return 200 OK status codes.
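Checking URLs one by one is tedious for a large feed. This sketch reads the link column from a tab-delimited feed file and reports each URL's status; the filename and column layout are assumptions based on the standard Merchant Center feed format:

```python
import csv
import urllib.request
import urllib.error

def check_feed_urls(feed_path):
    """Read the 'link' column from a tab-delimited feed file and
    return {url: HTTP status} for every product URL. A status of
    None means the connection itself failed."""
    results = {}
    with open(feed_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            url = row["link"]
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    results[url] = resp.status
            except urllib.error.HTTPError as e:
                results[url] = e.code  # e.g. 404 for a deleted product
            except urllib.error.URLError:
                results[url] = None    # DNS failure, refused connection, timeout
    return results

# Example usage (assuming a tab-delimited feed saved as products.txt):
# for url, status in check_feed_urls("products.txt").items():
#     if status != 200:
#         print(status, url)
```

Any URL that does not come back as 200 needs either a fix on the site or an updated link value in the feed.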

How to Fix Website Unreachable Issues

Fixing Server Downtime

If server instability is the cause, work with your hosting provider to:

- Identify the cause of the outages (resource limits, hardware, configuration)
- Upgrade your hosting plan if traffic regularly exceeds your server's capacity
- Set up uptime monitoring so you're alerted to future downtime quickly

Fixing robots.txt Blocking

Edit your robots.txt file to ensure Google can access your product pages and landing pages. Remove any overly broad Disallow rules. If you need to block Google from certain admin or private areas, use specific path-based disallow rules rather than blocking everything.

Fixing DNS Issues

Verify your domain's DNS records are correctly configured using a DNS lookup tool. Check that your domain registration hasn't expired. After making DNS changes, allow 24-48 hours for propagation before testing again.

Fixing Firewall Blocking

If Cloudflare or another security layer is blocking Googlebot:

- Review the firewall event logs for blocked requests from Google's crawler
- Remove or adjust any rules that block bot user agents or Google's IP ranges
- Add an allowlist rule for verified Googlebot traffic
- Disable "I'm Under Attack" or similar challenge modes that crawlers cannot pass
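When reviewing firewall logs, it helps to distinguish real Googlebot traffic from impostors spoofing the user agent. Google's documented method is a reverse-DNS check, sketched here with the standard library:

```python
import socket

def is_verified_googlebot(ip):
    """Google's documented verification: the IP's reverse-DNS (PTR)
    hostname must end in googlebot.com or google.com, and a forward
    lookup of that hostname must return the original IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward = {r[4][0] for r in socket.getaddrinfo(hostname, None)}
    except OSError:
        return False
    return ip in forward

# A random or spoofed IP fails the check:
print(is_verified_googlebot("127.0.0.1"))  # False
```

Blocking fake "Googlebot" traffic is fine; the goal is to make sure no rule catches IPs that pass this verification.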

Fixing Product 404 Errors

For product pages that have moved:

- Set up 301 redirects from the old URLs to the new ones
- Update the link attribute in your Merchant Center feed to the new URLs
- Resubmit the feed and confirm the new URLs return 200 OK
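You can confirm a redirect is actually in place without a browser. This sketch stops urllib from following redirects so the 301 response itself is visible; the URLs in the usage comment are placeholders:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Prevent urllib from following redirects so the 301 itself
    can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_redirect(url):
    """Fetch a URL without following redirects.
    Returns (status_code, Location header or None)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, None  # served directly, no redirect
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# Example (hypothetical URLs): a correctly moved product page should
# return (301, "https://yourdomain.com/new-product-url").
# print(check_redirect("https://yourdomain.com/old-product-url"))
```

Prefer permanent 301 redirects over temporary 302s here, since the move is permanent and you want Google to pick up the new URL.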

Fixing Password Protection

Remove password protection from all pages that should be publicly accessible. Product pages, category pages, and all landing pages linked from your Shopping ads must be publicly accessible without any login.

After Fixing: Submitting Your Appeal

Once you've resolved the website unreachable issues, verify the fix thoroughly, then submit a reinstatement appeal. In your appeal, explain:

- What caused the crawl failures (downtime, robots.txt rules, firewall blocking, etc.)
- The specific fixes you made
- How you verified the site is now fully accessible to Googlebot

For detailed appeal writing guidance, see our Google Merchant Center appeal guide.

Frequently Asked Questions

How quickly does Google re-check my site after I fix the unreachable issue?

Google's crawler doesn't visit on a fixed schedule. After submitting a reinstatement appeal, Google typically reviews within 1-3 weeks. You can also use Google Search Console's URL Inspection Tool to manually request re-indexing of specific pages and confirm they're now accessible.

My website is accessible now — do I still need to appeal?

If your Merchant Center account is suspended (not just products disapproved), yes — you need to submit an appeal even if the issue is now resolved. Google won't automatically reinstate suspended accounts without a formal appeal submission.

Can robots.txt cause a Merchant Center suspension?

Yes. If your robots.txt blocks Googlebot from accessing your product pages or key website content, Google cannot verify your site meets Shopping policies. This can cause both product disapprovals and account-level suspension, categorized as "website unreachable" or "crawl issues."

Need Help Getting Reinstated?

GMCSuspension.com provides professional Google Merchant Center technical audits and reinstatement services, including diagnosing complex crawlability and server issues.
