Hello Bhavya,
It's quite interesting that you're asking about this! Great question – there are nine important factors, and I've listed each of them below.
- Sitemap – Create the sitemap file without errors and submit it via GSC (see the example sitemap sketch after this list).
- Indexing – Important pages must be allowed to be indexed.
- Crawling – This is the process by which Googlebot finds new and recently updated pages to add to the Google index, so we need to fix duplicate-content issues and broken links.
- Robots.txt – Important pages need to be allowed; only private or sensitive files should be disallowed.
- Mobile Friendly – Google has rolled out mobile-first indexing, so this is very important.
- Internal Links – Every internal link should point to a live page, with none broken.
- HTTPS – It is one of the ranking signals; more than 70% of business owners have already moved to HTTPS.
- Page Speed – It impacts the UX, so we need to give it high priority.
- Recrawling via GSC – If the above eight factors have been fixed without a single mistake, we can ask the search engine to recrawl the site.
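To make the Sitemap point concrete, here is a minimal sketch of a sitemap file; note that example.com and the date are placeholders, not taken from a real site.

Example sitemap.xml:
---------------
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>

Once this file is uploaded to the site's root, you can submit its URL under the Sitemaps report in GSC.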
I hope this makes sense. If you need any further clarification, please ask your questions here.
Hello Bhavya,
Okay, sure, let me explain it simply.
Robots.txt tells search engines which pages of a site should be crawled and which should not. If you don't want a particular page on your site to be crawled, you simply define it with a Disallow rule.
This is the format for creating a robots.txt file:
Basic format:
---------------
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
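For example, a minimal robots.txt that lets all crawlers visit everything except a hypothetical /admin/ folder (the path is only an illustration) would look like this:

Example:
---------------
User-agent: *
Disallow: /admin/

Here, the asterisk applies the rule to every crawler, and all pages outside /admin/ remain crawlable.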
We should also pay attention to the following points:
- The robots.txt file must be placed in a website's top-level directory (root).
- Robots.txt is case sensitive: the file should be named “robots.txt” (not Robots.txt, robots.TXT).
- If you have a subdomain, you need to create a separate robots.txt file for it, like this: blog.abcd.com/robots.txt and abcd.com/robots.txt (see the illustration below).
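To illustrate the subdomain point with the same example domain: each robots.txt file only applies to the exact host it is served from.

Example:
---------------
https://abcd.com/robots.txt – rules for abcd.com only
https://blog.abcd.com/robots.txt – rules for blog.abcd.com only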
This is the way to check a robots.txt file: https://domain.com/robots.txt
I hope this is clear.