[Rejected] Full mybb_sessions table causes a DoS to the website
#1
The problem:
A crawler (Seekport) started scraping the website at one URL per second (or faster). Every request it made, on each and every page, added an entry to the mybb_sessions table. This was fine for a while, but eventually the sessions table filled up, at which point no more sessions could be stored.

This caused MyBB to output an error page, while also attempting to send multiple error emails for every page request once it could handle no more sessions.

This stopped the website from being accessible to the public (although I could still access the admin backend).

The fix:
I initially accessed the database directly and purged the whole sessions table. Doing so removed the problem and the site was up and running again. (Later I realised I probably could have used the built-in task that is meant for clearing old sessions.)
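
As a rough illustration of the fix, here is a sketch using sqlite3 as a self-contained stand-in for the real MySQL mybb_sessions table (column names follow MyBB's schema, but the data here is made up). The commented-out line is the blunt "purge everything" fix; the active query is the gentler time-based cleanup the session-clearing task performs in spirit:

```python
import sqlite3, time

# sqlite3 stand-in for the MySQL mybb_sessions table (illustrative data).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE mybb_sessions (sid TEXT PRIMARY KEY, uid INTEGER, time INTEGER)")
now = int(time.time())
rows = [("bot%d" % i, 0, now - i) for i in range(5)] + [("admin", 1, now)]
db.executemany("INSERT INTO mybb_sessions VALUES (?, ?, ?)", rows)

# The blunt fix: purge everything.
# db.execute("DELETE FROM mybb_sessions")

# Gentler variant: drop only sessions older than a cutoff (here 3 seconds),
# similar in spirit to MyBB's own old-session cleanup task.
cutoff = now - 3
db.execute("DELETE FROM mybb_sessions WHERE time < ?", (cutoff,))
db.commit()
remaining = db.execute("SELECT COUNT(*) FROM mybb_sessions").fetchone()[0]
print(remaining)  # sessions left after the cleanup
```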

I just thought I would mention it, as there might well be a way to either slow down or block consecutive attempts after a given threshold. While it might be possible to enlarge the table to hold more data, that won't stop further incidents, just prolong the inevitable.
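
To illustrate the "slow down or block after a threshold" idea, here is a minimal per-IP sliding-window rate limiter. The names (WINDOW, LIMIT, allow_request) are illustrative, not part of MyBB; a real deployment would more likely do this at the web server or firewall level:

```python
import time
from collections import defaultdict, deque

WINDOW = 10.0   # seconds the window covers
LIMIT = 5       # max requests per IP inside one window

_hits = defaultdict(deque)  # ip -> timestamps of recent allowed requests

def allow_request(ip, now=None):
    now = time.monotonic() if now is None else now
    q = _hits[ip]
    # Drop hits that have fallen out of the window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    if len(q) >= LIMIT:
        return False  # over threshold: refuse before touching the sessions table
    q.append(now)
    return True

# Simulated crawler hammering one URL per second for 30 seconds:
results = [allow_request("203.0.113.9", now=float(t)) for t in range(30)]
print(results.count(True))  # how many of the 30 requests got through
```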
#2
Had issues with Seekport a while back.  Not sure there's a single solution to this problem.  We have a number of mechanisms to manage it, depending on admin skill levels, which vary on our board.  Some admins handle routine ACP tasks, but would never go on the host.

As you mention, the ACP remains accessible with a full sessions table.  Last year I cobbled together a task that clears all guest sessions, which admins can enable and run as required.  Bad bot and other uid 0 data is cleared from the table.  This was particularly critical when we had a MEMORY-type sessions table.
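
The guest-session clearing described above can be sketched like this, again with sqlite3 standing in for MySQL (guests and unauthenticated bots are stored with uid = 0 in MyBB; the sample rows are made up):

```python
import sqlite3

# sqlite3 stand-in for mybb_sessions; uid 0 marks guests and bots.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE mybb_sessions (sid TEXT PRIMARY KEY, uid INTEGER)")
db.executemany("INSERT INTO mybb_sessions VALUES (?, ?)",
               [("guest1", 0), ("bot1", 0), ("member", 42)])

# The task's core action: drop every guest/bot session.
db.execute("DELETE FROM mybb_sessions WHERE uid = 0")
db.commit()
left = [r[0] for r in db.execute("SELECT sid FROM mybb_sessions")]
print(left)  # only logged-in members' sessions remain
```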

This doesn't address the root cause, but it can buy you time.  I also have a number of SQL queries I run frequently to identify bot-like or threatening behaviour.  Additional detail is available in the web server logs, but everything you need to make an action decision is usually in the sessions table.  You'll want to convert the binary IP addresses and epoch timestamps to human-readable values, but this shouldn't be a problem for most.
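
For the conversion step mentioned above, Python's standard library covers both columns. The packed IP and epoch below are sample values, not taken from a real table:

```python
import socket
from datetime import datetime, timezone

# Sample values: a packed binary IPv4 (as a binary column would hold it)
# and a Unix epoch timestamp.
packed_ip = socket.inet_aton("198.51.100.7")
epoch = 1700000000

ip = socket.inet_ntoa(packed_ip)  # back to dotted-quad text
when = datetime.fromtimestamp(epoch, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
print(ip, when)  # → 198.51.100.7 2023-11-14 22:13:20
```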

Other approaches to this would be interesting to hear - without showing anyone's hand, so to speak. :)

cheers...
#3
Adding crawlers via the ACP's Configuration → Spiders / Bots → Add New Bot with a matching user agent string should limit mybb_sessions to one saved record per bot.
devilshakerz.com/pgp (DF3A 34D9 A627 42E5 BC6A 6750 1F2F B8AA 28FF E1BC) ▪ keybase.io/devilshakerz

