In SEO, after you have finished your on-page optimization, you must get the website indexed (listed) with the search engines. A site must be indexed before it can even begin to rank, so this step is very important. While many people focus only on the main page (index page) of the website, you want all of your pages included so you can spread your focus across your main keywords and the other long-tail keywords in your niche. You want to rank naturally instead of cramming everything onto one page.
The easiest way to tell the search engines about all your pages is to use a sitemap. This can be a text (.txt) file, an HTML (.html) file, or an XML (.xml) file. Most website hosts have scripts that automatically generate this sitemap for you and tell the search engines about all the inner pages on your website. This way you can optimize your website and business by targeting each category in your shop or department with its own web page, and you avoid spamming by making your keyword every other word in your text.
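As a rough illustration, an XML sitemap is just a list of URL entries, one per page. The sketch below uses hypothetical placeholder URLs and optional hint fields; a generator script would produce something similar for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engines to find -->
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- Inner category pages get their own entries, so each can rank on its own -->
  <url>
    <loc>https://www.example.com/shop/widgets.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

The `changefreq` and `priority` fields are only hints to the spiders; the `loc` entries are what actually tell the search engines which pages exist.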
There are times when you do not want certain pages to show up in the search results, whether for privacy or for any other reason. You handle this with the robots.txt file. This file tells the search engine spiders how to behave on your website: which pages to include in the crawl and index, and which to exclude. It can also direct the spiders to your sitemap so they get a list of all the pages on the website, giving your site maximum exposure. This matters because most websites rely on a homepage menu to send visitors to specific product or category pages. The sitemap helps those specific pages rank in their own right, rather than as mere subpages, even though they often discuss the product in more detail than your homepage does.
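A minimal robots.txt tying these ideas together might look like the sketch below. The excluded paths are hypothetical placeholders; substitute whatever private areas your own site has:

```text
# Apply these rules to all crawlers
User-agent: *

# Keep private pages out of the crawl and index
Disallow: /private/
Disallow: /checkout/

# Point the spiders at the sitemap so they find every inner page
Sitemap: https://www.example.com/sitemap.xml
```

Note that `Disallow` only asks well-behaved spiders not to crawl those paths; it is not a security mechanism, so truly sensitive pages need real access controls.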
Be sure to include both of these files so your website gets maximum visibility.