
Sunday, 23 November 2008

Whoops

I was programming a mini-site that spawns about 90,000 pages out of the blue, and to test it I put it on a subdomain on a server. I was just starting to normalize the database and program the functions layer, adding some MooTools Ajax stuff, and it was starting to look good.

At some point I checked the SeoQuake stats bar and suddenly saw "3 indexed pages", and I went "****". I used search-engine-friendly URL rewrites with a tag from a domain database table as the category, but since my data is all test data, half of the tags don't match. So if GoogleBot decides to go for it, I end up with 90,000 screwed-up pages full of old, incorrect data in the Google index, all with the wrong URLs.

Like /sex/news.google.com/

I don't think Google will appreciate that :) So I put the entire subdomain on Disallow in robots.txt; I heard that works. I hope so, otherwise I had better erase the subdomain.
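
For what it's worth, a blanket Disallow for the test subdomain would look something like this (a minimal sketch, assuming the file sits at the root of the subdomain):

User-agent: *
Disallow: /

That tells well-behaved crawlers to skip every URL on that host; pages that are already indexed should drop out over time, though removal isn't instant.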
