DreamHost 500 Internal Server Error

I’ve been getting a bunch of 500 Internal Server Errors on my website, especially when adding images to my Gallery; it takes about 3-4 tries before an upload goes through. DreamHost is apparently working through a recent slew of problems and is currently trying to fix all the issues: DreamHost Status.

If you unfortunately hit this problem, try refreshing the page a few times and it should eventually load. I apologize in advance for the inconvenience.

Looking at the error.log, I see:

[Sun Apr 15 19:33:12 2007] [error] [client xxx.xxx.xxx.xxx] Premature end of script headers: /dh/cgi-system/php5.cgi
[Sun Apr 15 19:33:12 2007] [error] [client xxx.xxx.xxx.xxx] File does not exist: /home/krunk4ever/krunk4ever.com/internal_error.html

I tried reverting to PHP 4.x, but got a similar error:

[Sun Apr 15 22:30:15 2007] [error] [client xxx.xxx.xxx.xxx] Premature end of script headers: /dh/cgi-system/php.cgi
[Sun Apr 15 22:30:15 2007] [error] [client xxx.xxx.xxx.xxx] File does not exist: /home/krunk4ever/krunk4ever.com/internal_error.html
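
The first line means the php.cgi process died (or was killed) before it ever printed its HTTP headers, which Apache reports as a 500. The second line just means Apache then went looking for a custom 500 page at internal_error.html and didn’t find one. That second error, at least, is easy to silence by creating the page or pointing Apache at one from .htaccess. A minimal sketch (internal_error.html is simply the name DreamHost’s config expects, going by the log; ErrorDocument accepts any path):

# .htaccess: serve a friendlier page when a script dies with a 500
ErrorDocument 500 /internal_error.html

The 500 itself is another story, of course.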

Oh well, guess I can only wait for them to fix the problems.


Update

So DreamHost got back to me:

It appears some scripts on your site are being killed by our process watcher program. This keeps the CPU/Memory load balanced for all users on the server. Unfortunately, while we can’t change the settings on that, there are ways to reduce the CPU and Memory footprint of your site. One is to make sure you’ve disabled hotlinking of your images. More often, the culprit is robot “crawlers”, such as googlebot, which like to get caught in loops on some sites, causing the load to jump on the server.

You might find the following articles in our wiki useful in helping you reduce the load your sites put on the server:

Finding Causes of Heavy Usage
CPU Resources FAQ
Bots spiders and crawlers

Please feel free to let me know if you have any other questions.
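
For reference, hotlink protection on Apache usually comes down to a few mod_rewrite rules in .htaccess. A rough sketch, with example.com standing in for your own domain and the extension list just an example:

# .htaccess: return 403 for image requests referred from other sites
RewriteEngine On
# let through direct requests and clients that send no Referer header
RewriteCond %{HTTP_REFERER} !^$
# let through requests referred from this site itself
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# anything else asking for an image gets a 403
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

The [F] flag sends 403 Forbidden and [NC] makes the matches case-insensitive.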

I’m not exactly sure what changed; I’ve already enabled caching for WordPress and full acceleration on Gallery, and hotlinking was disabled a long time ago. I went ahead and followed one of the suggestions, delaying crawlers by adding a robots.txt:

User-agent: *
Crawl-Delay: 60
Disallow:

This asks every crawler to wait 60 seconds between requests while still allowing the whole site to be crawled (an empty Disallow blocks nothing). However, the wiki does state: “Note: this is relatively worthless as it has been confirmed that Googlebot ignores it.”
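
Since the bots that honor Crawl-Delay aren’t the ones causing trouble anyway, a blunter fallback is to deny the misbehaving crawlers outright from .htaccess once they show up in the access log. A sketch, with the user-agent substrings purely as placeholders:

# .htaccess: refuse requests from crawlers that ignore robots.txt
# the substrings below are examples; pull real ones from your access log
SetEnvIfNoCase User-Agent "BadBot" block_bot
SetEnvIfNoCase User-Agent "GreedyCrawler" block_bot
Order Allow,Deny
Allow from all
Deny from env=block_bot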

We’ll see if it gets any better.
