Posted - January 14 2011 : 11:01:24
| I thought this might be of interest to new website owners and maybe experienced ones who had never looked at this.
Our visitor numbers have more than doubled in the past 12 months which is obviously good. But because of this and the customisation we have done to vpasp I have spent a disproportionate amount of time looking after our server. By this I mean making sure it doesn’t crash.
The problem had been that with the increase in traffic, the server was coming under more pressure, and so I would have to restart the WWW service maybe 4 or 5 times a day.
At first I thought it was vpasp, MSSQL or our hosts. So I checked the code (not a fun job – I wouldn’t recommend it unless you want to be really bored), made sure MSSQL was fully updated, and got our hosts to check the server. I made sure CSS files were compressed, and I even purchased a second server and moved all images and static files onto it to reduce the calls to the main server, freeing the main server’s resources to just process database requests.
Everything I did helped at first, but then after a few days it went back to how it had been. On one day, when we had sent a newsletter out, I had to restart services nearly every hour. I was literally tearing my hair out (and if you know me I haven’t got much as it is).
Then I read on a forum about bots, spiders and robots.txt files. So I created a robots.txt file, blocking various spiders with it outright and forcing a crawl delay on others.
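For anyone who hasn’t made one, robots.txt just sits in the root of the site. Something like the sketch below – the bot names here are made-up examples, so substitute the user agents you actually see in your own logs:

```
# Tell a badly behaved spider to stay out entirely (example name)
User-agent: HungryBot
Disallow: /

# Ask another bot to wait 10 seconds between requests
# (not all bots honour Crawl-delay)
User-agent: SlowBot
Crawl-delay: 10

# Everyone else may crawl normally
User-agent: *
Disallow:
```

Bear in mind, as I found out the hard way, this is purely voluntary – a bot only obeys it if it chooses to.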
Did this work – like hell did it; in fact I would say it made things even worse. It was as if the spiders were saying “Don’t like us? Well tough, we’re going to attack your site even more now”. Basically they just ignored the robots.txt file. Although I should be fair and say that only a few ignored it – but they were the worst offenders in the first place.
So I went to the ISAPI_Rewrite forum and found a posting about pushing bad bots to a trap page and completely blocking them from the site.
What I’ve done is block various bots; they can only go to a static page that has the word “thanks” on it. The page is only 299 bytes, so serving it costs next to nothing.
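The rules look something like this – I’m using mod_rewrite-style syntax as in ISAPI_Rewrite 3, and the bot names and page name below are placeholders, not my actual list:

```apache
RewriteEngine On
# Match example bad-bot user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (HungryBot|GreedyCrawler|ScraperX) [NC]
# Send every request from them to the tiny static trap page and stop
RewriteRule .* /thanks.html [L]
```

Unlike robots.txt, this happens on the server before the page is built, so the bot has no choice in the matter.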
Now comes the interesting bit.
I’ve been monitoring this file, and it is being accessed by one bot every 10 seconds – even though that bot has been blocked by the robots.txt file. And that is only one bot. Overall, bots are being forced to this page every 2.6 seconds, which works out at roughly 33,000 requests a day. This shows just how much strain they were putting on our server.
Finally I feel we have solved the problem of the server crashing. It wasn’t the server, vpasp, MSSQL or human visitors – it was an artificial menace.
So if you are having problems with your site speed, don’t just look at the software or the hardware. Look at your logs and see if you are getting excessive visits from bad bots. If you pay extra for bandwidth, blocking them will save you money; but more importantly, human visitors will get a site that is up more often.
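If you want a quick way to check your own logs, something like this rough sketch would do it. It assumes IIS’s default W3C extended log format with the user agent as the 10th column – check the #Fields line in your own logs and adjust the index; the sample lines here are invented for illustration:

```python
from collections import Counter

def top_user_agents(log_lines, limit=5):
    """Count requests per user agent in W3C extended (IIS) log lines.

    Assumes cs(User-Agent) is the 10th whitespace-separated field
    (index 9); adjust to match the #Fields line in your own logs.
    """
    counts = Counter()
    for line in log_lines:
        if line.startswith("#"):   # skip W3C header directives
            continue
        fields = line.split()
        if len(fields) > 9:
            counts[fields[9]] += 1
    return counts.most_common(limit)

# Invented sample lines, just to show the field layout assumed above
sample = [
    "#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query "
    "s-port cs-username c-ip cs(User-Agent) sc-status",
    "2011-01-14 11:01:24 10.0.0.1 GET /default.asp - 80 - 1.2.3.4 HungryBot/1.0 200",
    "2011-01-14 11:01:25 10.0.0.1 GET /default.asp - 80 - 5.6.7.8 HungryBot/1.0 200",
]
print(top_user_agents(sample))  # → [('HungryBot/1.0', 2)]
```

Any user agent near the top of that list hitting you every few seconds is a candidate for the trap page.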
Hope this helps someone.