Mon Dec 28 18:58:52 EET 2009

A few little SEO tips on indexing a website / webpage in a better fashion (through usage of robots.txt and sitemap.xml)

As I've mentioned in my previous post, it's vitally important to have your website W3C compliant.
I'm trying to learn a bit more about SEO these days and therefore I found a website dedicated to
Search Engine Optimization discussing the importance of having a robots.txt file on each and every hosted
domain, as well as the huge importance of generating a sitemap for every website out there on the Net.
More about robots.txt can be learned from the robotstxt.org website.
Here is a straightforward robots.txt I chose to use to enable indexing of the website without any restrictions:
# Allow all
User-agent:  *
Disallow:
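For comparison, a robots.txt that keeps crawlers out of a certain directory (the /cgi-bin/ path below is just an illustrative example) would instead look like this:
# Block a single directory for all crawlers
User-agent: *
Disallow: /cgi-bin/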

It was kind of handy for me to learn more about sitemaps on sitemaps.org
In a few words, sitemaps are used to better map the content of a website, as every SEO out there knows.
Here is a sample from the sitemap.xml I've generated for pc-freak.net's entry page.
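The entry for the front page, following the sitemaps.org protocol, looks roughly like this (the lastmod, changefreq and priority values here are illustrative placeholders rather than the exact ones from my file):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://pc-freak.net/</loc>
    <lastmod>2009-12-28</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>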
There are plenty of both online and standalone sitemap.xml generators on the net.
I chose to use an online service to generate my sitemap.xml, after which I modified
the entries in the generated file manually.
I also gave google-sitemapgen from the BSD ports tree a try, however I wasn't able to use that software to generate any sitemaps.
One of the most used and probably most renowned projects for sitemap generation is Google's Sitemap Generator; check in Google for further info.
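If no generator does the job, a basic sitemap.xml can also be put together with a few lines of scripting; here is a minimal Python sketch (the list of URLs in it is, of course, just an assumed example) that writes a protocol-compliant file:
#!/usr/bin/env python
# Minimal sitemap.xml generator sketch - the URLs below are placeholders
from datetime import date

urls = ["http://pc-freak.net/", "http://pc-freak.net/blog/"]

today = date.today().isoformat()
entries = ""
for u in urls:
    entries += ("  <url>\n"
                "    <loc>%s</loc>\n"
                "    <lastmod>%s</lastmod>\n"
                "  </url>\n" % (u, today))

sitemap = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           '%s</urlset>\n' % entries)

# Write the result to sitemap.xml; upload it to the web root afterwards
open("sitemap.xml", "w").write(sitemap)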
Another handy tip that helps indexing in search engines is including:
Sitemap: http://pc-freak.net/sitemap.xml

in the robots.txt file, so after the inclusion robots.txt
would look like this:

# Allow all
User-agent:  *
Disallow:
Sitemap: http://pc-freak.net/sitemap.xml