
How We Fixed the Google Crawlbot Issue: A Troubleshooting Guide

June 11, 2021

Learn how our team identified and fixed an issue that prevented the Google Crawlbot from accessing our website, by investigating logs, using browser developer tools, and analyzing our server configuration.

Table of Contents

  1. Identifying the Errors in Google Cloud Log
  2. Inspecting the Issue with Google Chrome Dev Tools
  3. Analyzing Google's Rendering Issues
  4. Implementing Axios onError Request and Response
  5. Investigating Firebase Issues
  6. Fixing the Nuxt Proxy Configuration
  7. Final Resolution and Lessons Learned

Identifying the Errors in Google Cloud Log

Our first step was to examine our Google Cloud logs for errors related to the website. We found several log entries indicating that the Google Crawlbot was unable to access our content.
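The original post doesn't show exactly how we queried the logs; the sketch below, using the @google-cloud/logging Node.js client, illustrates the kind of query involved. The filter and the status threshold are assumptions for a typical hosting setup, not our actual filter.

```js
// A minimal sketch (not our exact query) using the @google-cloud/logging client.
const { Logging } = require('@google-cloud/logging')

async function findCrawlerErrors() {
  const logging = new Logging() // uses the default project and credentials

  // Look for requests whose User-Agent contains "Googlebot" and that failed server-side.
  const [entries] = await logging.getEntries({
    filter: 'httpRequest.userAgent:"Googlebot" AND httpRequest.status>=500',
    orderBy: 'timestamp desc',
    pageSize: 20,
  })

  for (const entry of entries) {
    const req = entry.metadata.httpRequest || {}
    console.log(entry.metadata.timestamp, req.status, req.requestUrl)
  }
}

findCrawlerErrors().catch(console.error)
```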

Inspecting the Issue with Google Chrome Dev Tools

We opened Google Chrome's Developer Tools and added a custom device to the device toolbar using the Google Crawlbot User-Agent string. Emulating the crawler this way showed that our website was not being rendered properly for it.
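The exact string we entered isn't recorded in the post, but a commonly used Googlebot desktop User-Agent string, which can be pasted into the custom device's User-Agent field, looks like this:

```
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
```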

Analyzing Google's Rendering Issues

After identifying the rendering issues, we consulted Google's official documentation and resources to better understand the problem, which confirmed that our website was not being crawled as expected.

Implementing Axios onError Request and Response

We added console logging for Axios request and response errors, via the Axios onError hook, to our application. The extra output gave us more information about the failing requests.
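The plugin itself isn't included in the original post; below is a minimal sketch of what such logging can look like, assuming the @nuxtjs/axios module (which exposes the onRequest and onError hooks). The file name and log format are illustrative, not taken from our actual code.

```js
// plugins/axios-logging.js – a minimal sketch, assuming the @nuxtjs/axios module.
export default function ({ $axios }) {
  // Log every outgoing request so crawler-triggered SSR requests show up in the logs.
  $axios.onRequest((config) => {
    console.log('[axios] request:', config.method, config.url)
  })

  // Log both the failing request and whatever response (if any) came back.
  $axios.onError((error) => {
    console.log('[axios] error for:', error.config && error.config.url)
    if (error.response) {
      console.log('[axios] response status:', error.response.status)
      console.log('[axios] response data:', error.response.data)
    } else {
      console.log('[axios] no response received:', error.message)
    }
  })
}
```

For the hooks to run, the plugin also has to be registered in the plugins array of nuxt.config.js.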

Investigating Firebase Issues

At this point, we suspected that the issue might be related to our Firebase setup, so we followed Firebase's debugging guidance and continued to investigate.
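The post doesn't spell out our Firebase setup. For context, a Nuxt SSR site on Firebase is commonly served by a Cloud Function that Firebase Hosting rewrites traffic to, and logging the incoming User-Agent there is one quick way to check whether crawler requests reach the renderer at all. The sketch below illustrates that idea under those assumptions; the function name and configuration are hypothetical, not our actual code.

```js
// functions/index.js – a rough sketch of a "Hosting rewrites to a Nuxt SSR function" setup.
const functions = require('firebase-functions')
const { Nuxt } = require('nuxt')

const nuxt = new Nuxt({ dev: false, buildDir: '.nuxt' })

exports.nuxtssr = functions.https.onRequest(async (req, res) => {
  // Logging the User-Agent makes it easy to see in the function logs
  // whether Googlebot requests are reaching the SSR renderer at all.
  console.log('UA:', req.headers['user-agent'], 'path:', req.path)
  await nuxt.ready()
  nuxt.render(req, res)
})
```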

Fixing the Nuxt Proxy Configuration

While debugging, we discovered that the ignoreHeaders option was not taking effect in our Nuxt.js setup because of an incorrect proxy configuration. Correcting the proxy configuration resolved that part of the problem.
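Our real configuration isn't reproduced in the post. The sketch below shows the relevant pieces, assuming the @nuxtjs/axios and @nuxtjs/proxy modules: in @nuxtjs/axios the header-ignoring option is spelled proxyHeadersIgnore, and it only takes effect when proxying is actually enabled and the proxy rules match the requests. The target URL and header names are placeholders.

```js
// nuxt.config.js – a minimal sketch, assuming the @nuxtjs/axios and @nuxtjs/proxy modules.
export default {
  modules: ['@nuxtjs/axios', '@nuxtjs/proxy'],

  axios: {
    // Without proxy: true, the header-ignore option below has no effect.
    proxy: true,
    // Headers listed here are stripped before the request is proxied.
    proxyHeadersIgnore: ['accept', 'host', 'cf-ray', 'cf-connecting-ip'],
  },

  proxy: {
    // Forward /api/** to the backend; the ignore option only applies to
    // requests that actually go through this proxy.
    '/api/': {
      target: 'https://backend.example.com',
      pathRewrite: { '^/api/': '/' },
    },
  },
}
```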

Final Resolution and Lessons Learned

After several days of investigation and debugging, Firebase informed us that they had resolved some issues on their end. With the problems fixed, the Google Crawlbot was finally able to access and crawl our website properly. This experience taught us the importance of thoroughly investigating and debugging website issues, as well as the value of patience and persistence.
