
Geeklog Forums

Website shut down for utilizing excessive amounts of system resources in Geeklog script


Status: offline

rvicker

Forum User
Newbie
Registered: 12/03/08
Posts: 3
My web hoster shut down my account on a shared server tonight because it was "utilizing excessive amounts of system resources". They included the following:
Top Process %CPU 35.0 [php]
Top Process %CPU 27.0 [php]
Top Process %CPU 0.3 httpd [fromthefarm.#####.com] [/submit.php?type=calendar&mode=&month=02&day=22&year=2009&h]
This site has been the target of comment spam, and I have updated to the latest Geeklog, added the CAPTCHA plugin, and been blocking the repeat IPs. They want to know what I am going to do to share the server nicely. Other than disabling the site, what is there to do? Thanks

Richard.bkk

Anonymous
I would say install the Bad Behavior 2 plugin.... It did work wonders for me, and it will probably do the same for you....

Which provider do you have? The processor use is not extremely high...

Status: offline

1000ideen

Forum User
Full Member
Registered: 08/04/03
Posts: 1298
The Bad Behavior plugin, Spam-X, and CAPTCHAs are a good idea. Personally, I have had similar attacks, and the spammer could not insert any spam because I had calendar entries locked down to members only.

Anyway, it is annoying if you get thousands of GET requests, e.g. 7000 on submit.php in 2 hours or so.

Status: offline

rvicker

Forum User
Newbie
Registered: 12/03/08
Posts: 3
I'm using Lunarpages. I have everything but comments requiring a login, and I am the only person who can log in. Spam-X has so far prevented any comments from actually getting posted. I get an email for all comments, and even when the spammers are attacking there are only a couple of dozen a day. So how the heck could there be excessive usage?

Their FAQs mostly blame bad scripts and say that they can't identify the script for their users. So have there been any reports of a problem with Geeklog itself (bad code) causing issues like going into a tight loop on invalid data? Given the low volume (less than 10,000 requests per month), either they misread something or something is chasing its tail.

Status: offline

richard.bkk

Forum User
Junior
Registered: 09/27/08
Posts: 21
Hmm, Rvicker,

Forget the idea of bad code in Geeklog; Geeklog is very stable and rock solid. Some very popular websites are based on Geeklog and have had no problems.

Take, for instance, a website like http://www.groklaw.net/; it has the number of visitors per day that all webmasters dream of…

In the beginning we tested lots of scripts. Basically, we wanted a website where people could post on the forum without needing to log in…. It also needed to be able to handle articles. To make things short, we found that Geeklog was one of the best at what we wanted, and since that early version the quality of the PHP code has become flawless.

On the spam-fighting side, Geeklog is one of the best CMSes available, and with some help from a few plugins you can make it completely spam free….

Status: offline

Laugh

Site Admin
Admin
Registered: 09/27/05
Posts: 1468
Location: Canada
You should look at your logs to see what is happening and find out more information about the excessive amount of system resources. Did it just happen that one night?

It could have been some scraper bot that hammered your site. I get bots every now and then that make thousands of requests in a very short time.
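If you have access to the raw Apache access log (the file name varies by host, and this assumes the usual common/combined log format, where the client IP is the first field), a quick, rough way to spot such a bot is to count requests per IP, for example:

awk '{print $1}' access_log | sort | uniq -c | sort -rn | head -20

Any address at the top of that list with thousands of hits in a short time is a good candidate for blocking.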

You could install the GUS plugin (it will take away some of your resources) to help track where these requests are coming from and then hopefully ban them.


One of the Geeklog Core Developers.

Status: offline

rvicker

Forum User
Newbie
Registered: 12/03/08
Posts: 3
The logs show the top URL as /submit.php, followed by /, but with only 32 and 31 hits respectively so far this month.

Status: offline

Dirk

Site Admin
Admin
Registered: 01/12/02
Posts: 13073
Location: Stuttgart, Germany
Keep in mind that when you get DoS-like amounts of requests, there isn't really much Geeklog (or any other web application) can do. These things are best detected and blocked by the webserver or even the hardware (e.g. the router). For Geeklog to detect excessive amounts of requests, it would have to run a PHP script and make database requests in the first place - and if you do a lot of them, you may very well exceed your limits. Or, to put it another way: if you want to prevent excessive load, doing it in a PHP script is too late (and would only add to the problem).

If you can, try to figure out the IP addresses the requests are coming from and block them in your webserver (Deny from <ipaddress> in Apache). In really bad cases, you may have to ask your hosting service for help.
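For example, a minimal Apache 2.2-style .htaccess snippet for blocking a couple of offending addresses might look like this (the addresses below are just placeholders):

Order Allow,Deny
Allow from all
Deny from 192.0.2.10
Deny from 198.51.100.0/24

On Apache 2.4 and later, the same effect is achieved with the Require directives (e.g. Require not ip inside a RequireAll block) instead of Order/Allow/Deny.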

We sometimes have cases here on geeklog.net where someone is trying to download the entire(!) website, i.e. some script sends a request for each and every URL. Each request by itself is perfectly valid, but the amount of all requests can be a problem. I usually block those temporarily with a "Deny from".

We've also had one or two cases of DoS attacks where I had to ask our hoster for help. They blocked them at the network level somehow and things went back to normal for the site.

HTH

bye, Dirk

Status: offline

1000ideen

Forum User
Full Member
Registered: 08/04/03
Posts: 1298
It can't be just the 32 and 31 hits; it must be more if the hoster tells you to look into it.

@Laugh: Does this attack hit /submit.php? I'm trying to get an idea of what is going on and to write a feature request, either here at GL or with the author of Bad Behavior, to help stop those attacks or at least minimize the impact. BB isn't bad, but so far it does not stop these silly requests, though I'm sure that WordPress users etc. face similar problems.

Status: offline

Laugh

Site Admin
Admin
Registered: 09/27/05
Posts: 1468
Location: Canada
Quote by: 1000ideen

It can't be just the 32 and 31 hits; it must be more if the hoster tells you to look into it.

@Laugh: Does this attack hit /submit.php? I'm trying to get an idea of what is going on and to write a feature request, either here at GL or with the author of Bad Behavior, to help stop those attacks or at least minimize the impact. BB isn't bad, but so far it does not stop these silly requests, though I'm sure that WordPress users etc. face similar problems.

This is something I would like to see and have mentioned before, but which plugin should handle it? Spam-X, Bad Behavior, GUS, BAN? Maybe all of these plugins should be combined at some point.

What I would like to see is this: if some IP makes a predefined number of requests in a predefined time period, it is blocked, either permanently or for a predefined amount of time. Of course, we do not want to block any search bots like Google (not that they request pages that fast).
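As a very rough sketch of that idea (this is not an existing Geeklog, Spam-X, Bad Behavior, GUS or BAN feature; the limits, the JSON state file and the user-agent check below are all made up for illustration), something along these lines could sit at the top of a PHP entry point:

<?php
// Hypothetical sketch only: block an IP that makes more than $maxRequests
// requests within $window seconds, for $banTime seconds, while letting a
// known search bot through. The limits and file name below are made up.

$maxRequests = 100;                        // predefined number of requests
$window      = 60;                         // predefined time period (seconds)
$banTime     = 3600;                       // how long the block lasts (seconds)
$stateFile   = '/tmp/ip_request_log.json'; // assumed storage location

$ip  = $_SERVER['REMOTE_ADDR'];
$ua  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$now = time();

// Skip the check for a well-known crawler (a real check should verify via
// reverse DNS rather than trusting the user-agent string).
$isKnownBot = (stripos($ua, 'googlebot') !== false);

if (!$isKnownBot) {
    // Load previous state (request timestamps and active bans) if it exists.
    $state = array('hits' => array(), 'bans' => array());
    if (is_readable($stateFile)) {
        $decoded = json_decode(file_get_contents($stateFile), true);
        if (is_array($decoded)) {
            $state = $decoded;
        }
    }

    // Refuse requests from IPs that are currently banned.
    if (isset($state['bans'][$ip]) && $state['bans'][$ip] > $now) {
        header('HTTP/1.1 403 Forbidden');
        exit('Too many requests - temporarily blocked.');
    }

    // Record this request and forget anything older than the time window.
    if (!isset($state['hits'][$ip])) {
        $state['hits'][$ip] = array();
    }
    $state['hits'][$ip][] = $now;
    $state['hits'][$ip] = array_values(array_filter(
        $state['hits'][$ip],
        function ($t) use ($now, $window) { return $t > $now - $window; }
    ));

    // Ban the IP once it exceeds the limit.
    if (count($state['hits'][$ip]) > $maxRequests) {
        $state['bans'][$ip] = $now + $banTime;
    }

    file_put_contents($stateFile, json_encode($state));
}

A real implementation would need proper locking or a database table instead of a flat file, and a more reliable search-bot check, but the counting-and-banning logic would look roughly like this.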

To answer the question of what they attack: it is more like what Dirk says, they are just downloading the entire site. A couple of days ago I had some IP make over 8000 requests.

I do get bots trying to create comment, forum, and link spam, but CAPTCHA takes care of those things.
One of the Geeklog Core Developers.
