If you searched for “How to Create a Perfect Custom Robots.txt File in Blogger?” or “How to Create a Robots.txt File for My Website?”, you are in the right place to get your answer and solve the issue.
Most people who blog want more visitors. Some achieve that quickly, but most do not, usually because of low-quality writing, poor use of long-tail and other good keywords, and, most importantly, spammy backlinks. Before getting to the topic, check that your website has enough articles with high-quality content, and remove any spammy backlinks pointing at your site; you can find them with the SEMrush Backlink Audit tool or the Ahrefs Site Explorer. Once all of that is in order, you can create a custom robots.txt file. Proper indexation depends on a proper robots.txt file.
Robots.txt is a plain text file containing a few directives that tell search engines how to crawl and index your website. It can be created in Notepad and requires no coding knowledge; basic familiarity is enough to manage it. The robots.txt file matters a great deal from an SEO point of view: using it incorrectly can hurt your SEO and lower your website's search rankings.
There are a few important things to know before using a robots.txt file. The file must be named exactly “robots.txt” (not Robots.txt or robots.TXT, since the name is case sensitive), and it is placed in the top-level directory of the site. With a robots.txt file you can block certain pages and posts from being crawled and indexed by search engines, so use it carefully: blocking the wrong URL can hurt your website's SEO. We are here to help; just follow the method below to create a proper robots.txt file.
Let’s get into it.
Steps to Create a Perfect Custom Robots.txt File in Blogger
Step 1: Go to Blogger, log in with your account, open your dashboard, and navigate to Settings.
Step 2: Click Search Preferences. You will see a list of crawling and indexing options.
Step 3: Click the Edit button next to Custom robots.txt. An empty box will appear where you will paste the code.
Here is the search-engine-optimised robots.txt code that we use, given below. It is a sensible default robots.txt for any Blogger blog.
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.website.com/feeds/posts/default?orderby=updated
*Important Note: In the Sitemap URL, replace “website” with your own domain.
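If you want to sanity-check what these directives actually do before saving them, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The post and search URLs are hypothetical examples, not part of the tutorial:

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules locally instead of fetching them over the network.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Ordinary crawlers may fetch posts but not /search result pages.
print(parser.can_fetch("Googlebot", "https://www.example.com/2024/01/post.html"))  # True
print(parser.can_fetch("Googlebot", "https://www.example.com/search?q=seo"))       # False

# The AdSense crawler (Mediapartners-Google) has an empty Disallow,
# which means it is allowed everywhere, so your ads still work.
print(parser.can_fetch("Mediapartners-Google", "https://www.example.com/search"))  # True
```

In other words, the code blocks only Blogger's label/search result pages (which are thin, duplicate content) while leaving every post and page crawlable.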
Step 4: Paste the above code into the empty box and click the Save Changes button.
How to Check Your Robots.txt File?
You can view your robots.txt file by taking the full URL of your homepage and appending /robots.txt.
Example – https://website.com/robots.txt
Replace “website” with your own domain.
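The same “homepage plus /robots.txt” rule can be expressed as a tiny helper, shown here as a sketch using Python's standard library. The domain is a placeholder, exactly as in the example above:

```python
from urllib.parse import urlsplit

def robots_txt_url(page_url: str) -> str:
    """Return the robots.txt URL for the site that serves page_url.

    robots.txt always lives at the root of the host, so any page URL
    on the site resolves to the same robots.txt location.
    """
    parts = urlsplit(page_url)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

print(robots_txt_url("https://website.com/2024/01/my-post.html"))
# https://website.com/robots.txt
```

Open the resulting URL in your browser to confirm the file you saved in Blogger is being served.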
Final Step [Not Necessary for Bloggers]
How to Inform Google About the Updated Robots.txt?
Google can take time to pick up changes to a robots.txt file on its own, so you can tell it about your update manually via the URL below, and also check there whether your changes have taken effect.
https://www.google.com/webmasters/tools/robots-testing-tool?
*Important Note: You must verify your blog in Google Webmaster Tools in order to access the robots.txt testing tool.
***