I lost 50% of my traffic in one day! - Fixing robots.txt files
Robots.txt files are critical for search engines to crawl and index your site, and they need to be configured correctly for your pages to be crawled properly. An incorrectly set up robots.txt file can negatively impact your site's rankings and position in search results (see the example at the end of this description). I experienced an issue where I lost 50% of my traffic in a single day, all due to a robots.txt problem. In this video, you will find out what the issue was, why it happened, and how I solved it.

Resource mentioned in the video: https://developers.google.com/search/docs/advanced/robots/submit-updated-robots-txt

---------------- Affiliate Links ----------------
Hosting: Hostinger - https://blog-sprout.com/recommends/hostinger
Ezoic: https://blog-sprout.com/recommends/ezoic
Theme: https://blog-sprout.com/recommends/generatepress
Ubersuggest: https://neilpatel.com/ubersuggest/
Canva: https://blog-sprout.com/recommends/canva
Tailwind: Get 1 month free! - https://blog-sprout.com/recommends/tailwind
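
Quick illustration (a hypothetical example, not the exact file from my site): a single stray directive can block crawlers from your entire site,

User-agent: *
Disallow: /

while a correctly scoped file only blocks what you intend and still points crawlers to your sitemap:

User-agent: *
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap.xml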