Custom robots.txt For Blogger
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.123456789t.blogspot.com/sitemap.xml
Key Points (a fuller example follows this list):
- Disallow Sections: Blocks crawling of admin areas, label (tag) pages, and internal search URLs to avoid duplicate content and irrelevant sections.
- Allow Specific Paths: Ensures that search engines can still crawl blog posts and other essential pages.
- Static Resources: Allows crawling of CSS, JS, and image directories necessary for proper page rendering.
- Sitemap: Points to your sitemap for better search engine understanding of your site structure.
- Crawl Delay: Asks bots to slow their request rate to prevent server overload (Googlebot ignores this directive, but many other crawlers honor it).
- Specific Bots: Optional directives to block known problematic bots if needed.
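Putting these points together, a fuller robots.txt for a Blogger blog might look like the sketch below. The bot names (ExampleBot, BadBot), the static-resource patterns, and the crawl-delay value are placeholders rather than settings from this post, and the blogspot address should be replaced with your own blog's URL.

# Default rules for all crawlers: block Blogger's internal search results
# (which also covers /search/label/ tag pages) and allow everything else,
# including the CSS, JS, and image files needed to render pages
User-agent: *
Disallow: /search
Allow: /
Allow: /*.css$
Allow: /*.js$
Allow: /*.jpg$
Allow: /*.png$

# Ask for a slower crawl rate from bots that honor it (Googlebot ignores Crawl-delay)
User-agent: ExampleBot
Crawl-delay: 10

# Block a known problematic bot entirely (placeholder name)
User-agent: BadBot
Disallow: /

# Point crawlers at the sitemap so they can discover the site structure
Sitemap: https://www.123456789t.blogspot.com/sitemap.xml

If you use something like this, it can usually be pasted into Blogger under Settings, in the Crawlers and indexing section, after enabling the custom robots.txt option; you can then confirm it is live by opening yourblog.blogspot.com/robots.txt in a browser.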