Maybe it’s a fad, but I like subdomains in URLs. My Debian package MaraDNS makes it quite easy to serve subdomains from your own server.

One of the good things about this is that you can make better use of Google’s site: operator.

However, I think there must be some downsides. For example, if I wanted to boycott MSN search and add:

echo -e "User-agent: msnbot\nDisallow: /" >> robots.txt

I would need to do it for all my websites:

hendry@bilbo:/web$ for i in *; do echo -e "User-agent: msnbot\nDisallow: /" >> "$i/robots.txt"; done

Ok, it doesn’t seem that hard after all… :)
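For the record, a slightly more careful version of that one-liner might guard against appending the same rule twice if you run it again later. Here is a sketch (the /tmp/web-demo path and the example.com site names are hypothetical, standing in for my real /web layout):

```shell
#!/bin/sh
# Simulate a /web layout with two hypothetical vhost directories
mkdir -p /tmp/web-demo/example.com /tmp/web-demo/blog.example.com
cd /tmp/web-demo

# Append the msnbot rule to each site's robots.txt,
# skipping any site that already has a msnbot section
for site in */; do
    if ! grep -q '^User-agent: msnbot' "${site}robots.txt" 2>/dev/null; then
        printf 'User-agent: msnbot\nDisallow: /\n' >> "${site}robots.txt"
    fi
done

cat example.com/robots.txt
```

Using printf instead of echo -e also avoids surprises on shells where echo doesn’t interpret \n.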
