In this episode of SEO Fairy Tales, Jason Stevens, who works in SEO at Google, shares a story about a product team that reached out for an audit. During the audit, the team noticed that the site was showing “no information available for this site” snippets for many of its ranking queries, which confused users and hurt traffic. Digging deeper, they found that crawl requests for the entire site had dropped to zero, and they traced the drop to a disallow directive in the site’s robots.txt file. Once the directive was removed, crawling resumed and the issue was resolved.
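The episode doesn’t quote the exact rule involved, but a hypothetical robots.txt sketch shows how a single blanket disallow can stop all crawling of a site, producing exactly the symptoms described:

```text
# A blanket rule like this blocks all compliant crawlers from every URL on the site,
# which drives crawl requests to zero and leads to “no information available” snippets:
User-agent: *
Disallow: /

# The fix is to remove the blanket rule, or scope it to paths that
# genuinely should not be crawled, e.g.:
User-agent: *
Disallow: /private/
```

Because search engines may still index a blocked URL from external links, the page can keep ranking while showing no snippet, which is why the problem surfaced as confusing search results rather than an outright disappearance from search.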