Detect and Avoid Potential Security Threats Using Server Side Tracking

Date Published: 12/03/2009 23:30

Since the launch of Dev Explorer just over two weeks ago, I have been closely monitoring my site traffic to see where my users come from, what they come for and how long they stay. In this article I will reflect on using server-side traffic monitoring to detect and block users who visit your site with malicious intent. Any site which permits user input is a potential target for hackers, whether they are attempting to execute scripts on or from your site or to access your personal files or database. Carefully watching your site traffic can help prevent these attacks.

Methods of Prevention

Before even considering using tracking to help identify malicious users, make sure you follow the rather obvious pointers below.

  • Form Validation - Simple and crucial, you should never open a form to the world without extensive validation. You'd be amazed to see the things hackers can do with a poorly validated form.
  • Human Recognition Methods - If information is going to be piped straight out onto your site without your authorisation ALWAYS use a method of human recognition. This could be a simple question or image containing letters and numbers, as long as it is in place and works effectively.
  • Query String Validation - Any query string values used in your code must be rigorously validated first, just like form inputs.

If the above are in place you are at the very least moderately safe from the evils that lurk on the world wide web. Always take extra precautions, however; if you aren't sure you are fully secure, double check.
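To make the first and third points concrete, below is a minimal sketch of query string and form validation in Python. The field names, limits and checks are hypothetical examples; real validation should be tailored to each form and, for anything touching the database, combined with parameterised queries.

    # A minimal sketch of query string and form validation.
    # The field names ("article_id", "comment") and limits are hypothetical;
    # adapt them to your own forms and URLs.

    import re

    def validate_article_id(raw_value):
        """Only accept a plain positive integer for a query string ID."""
        if raw_value is None or not raw_value.isdigit():
            return None  # reject anything that isn't purely digits
        return int(raw_value)

    def validate_comment(raw_value, max_length=2000):
        """Reject empty, oversized or script-looking comment submissions."""
        if not raw_value or len(raw_value) > max_length:
            return None
        if re.search(r"<\s*script", raw_value, re.IGNORECASE):
            return None  # crude check for embedded script tags
        return raw_value.strip()

    if __name__ == "__main__":
        print(validate_article_id("42"))                       # 42
        print(validate_article_id("42; DROP TABLE users"))     # None
        print(validate_comment("<script>alert(1)</script>"))   # None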

Methods of Tracking

For general site usage tracking I use Google Analytics, which allows me to see in-depth data about my users such as browser and operating system preferences, geographical location and the search terms used to find my site. I've also used Google Analytics on projects in the workplace, giving the client an insight into the usage of their site. Although extensive, I often feel the need to question its accuracy, and it is not suitable for detailed traffic monitoring.

To oversee site usage properly you can simply insert various details from each HTTP request into a database. This is the most reliable way to track every hit on a given page. Client-side solutions such as Google Analytics or ClickTale will only let you monitor users with JavaScript enabled, whereas a server-side solution will monitor everyone. Whether the hit comes from a real user, a search engine spider or a spam bot, this method will log it in your database. The information you store should consist of the following seven items (a sketch of such a logging routine follows the list).

  1. Page Accessed - Knowing which page they accessed is crucial since it may highlight vulnerabilities in your security.
  2. IP Address - Storing this will allow you to block unwanted visitors.
  3. HTTP Method - The method used can highlight what they are trying to achieve. If the same IP address makes multiple POST requests to the same script, chances are they are trying to get around your form validation.
  4. User Agent - This will identify what technology the visitor used to access your site and enables you to detect search engine spiders more easily.
  5. Query String - Keeping track of the query string used for the request means you will be able to detect if someone has attempted to hack your site using the URL.
  6. Referer - This is the link they clicked (if they clicked one) to reach your site. Monitoring this will show you who comes from where and help you keep track of your back links.
  7. Time of Visit - Always make sure you track when they visited; this way you can see if the number of requests from a suspicious IP address is plausible for a human being (e.g. 3 requests a second is unlikely to have originated from a real person).
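As a rough illustration of the logging itself, here is a minimal sketch in Python which stores the seven items above in a SQLite table. The table name, column names and the way the request details are gathered are assumptions; in practice you would pull the values from your framework's request object (or $_SERVER in PHP) and write to whatever database your site already uses.

    # A minimal sketch of server-side request logging, assuming a SQLite
    # database and a dictionary of request details supplied by the caller.
    # Table and column names are hypothetical.

    import sqlite3
    from datetime import datetime, timezone

    SCHEMA = """
    CREATE TABLE IF NOT EXISTS request_log (
        page         TEXT,
        ip_address   TEXT,
        http_method  TEXT,
        user_agent   TEXT,
        query_string TEXT,
        referer      TEXT,
        visited_at   TEXT
    )
    """

    def log_request(db_path, details):
        """Insert one HTTP request's details into the log table."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(SCHEMA)
            conn.execute(
                "INSERT INTO request_log VALUES (?, ?, ?, ?, ?, ?, ?)",
                (
                    details.get("page"),
                    details.get("ip_address"),
                    details.get("http_method"),
                    details.get("user_agent"),
                    details.get("query_string"),
                    details.get("referer"),
                    datetime.now(timezone.utc).isoformat(),
                ),
            )

    # Example call, e.g. from the top of a page handler:
    # log_request("traffic.db", {
    #     "page": "/articles/view",
    #     "ip_address": "203.0.113.7",
    #     "http_method": "GET",
    #     "user_agent": "Mozilla/5.0 ...",
    #     "query_string": "id=42",
    #     "referer": "https://www.google.com/",
    # })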

All of the information above is made available whenever someone accesses a web page and is not personal information. Your browser submits it by default to help webmasters serve you the most relevant pages; some sites, for example, use your IP address to determine which localised version of a website to display. Taking note of this information will allow you to keep track of who or what is using your site and for what purpose. Most of the entries in your logs will be perfectly innocent and can be ignored. Every now and then, however, one will come along which could potentially be a threat.

What to Look Out For

The main things to look out for are requests in your logs which look out of place. These could be requests which submit odd values in the query string, possibly probing whether validation is in place between the URL and the database. I spent many hours at work fixing a client's website after hackers managed to execute an UPDATE SQL query through an insecure query string. The UPDATE query appended a small JavaScript reference to every string in the database, causing users to be redirected to a page prompting them to download malicious software. This kind of attack is common and can easily be avoided with proper validation. It only takes one redirected user to download the malicious software for the hack to have been worthwhile for the culprit.

Very frequent requests from one IP address over a short period of time can indicate an attempted brute force attack. Here on Dev Explorer an IP address originating from India attempted to brute force past my CAPTCHA human validation on an article comment form. I noticed this when my log showed 45 consecutive requests from one source, all within a matter of seconds. Their intent is unknown as the attacker never managed to beat the form validation, but I have since blocked the offending IP address to avoid it happening again.
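Once the requests are in a database, spotting this sort of burst is a simple query. The sketch below, which assumes the request_log table from the earlier example, flags any IP address that exceeds an arbitrary threshold of requests within a single minute; the threshold is purely illustrative.

    # A minimal sketch of flagging suspiciously frequent visitors, assuming
    # the request_log table from the earlier logging sketch.

    import sqlite3

    def suspicious_ips(db_path, per_minute_threshold=30):
        """Return IP addresses that exceeded the threshold in any one minute."""
        query = """
            SELECT ip_address,
                   substr(visited_at, 1, 16) AS minute,  -- e.g. 2009-12-03T23:30
                   COUNT(*) AS hits
            FROM request_log
            GROUP BY ip_address, minute
            HAVING COUNT(*) > ?
            ORDER BY hits DESC
        """
        with sqlite3.connect(db_path) as conn:
            return conn.execute(query, (per_minute_threshold,)).fetchall()

    # for ip, minute, hits in suspicious_ips("traffic.db"):
    #     print(f"{ip} made {hits} requests during {minute}")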

These attacks may put not only your site at risk but also your users. Not all suspicious-looking requests are what they seem, however. Only block a visitor if their actions are obviously malicious. There are many feed readers and other automated bots out there which are perfectly harmless and can in fact help your site thrive.
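When you do decide to block an address, one straightforward approach is a small blocklist table checked at the top of every request. The sketch below assumes the same SQLite database as the earlier examples; a firewall rule or web server deny list would achieve the same result with less overhead.

    # A minimal sketch of a blocklist alongside request_log, assuming the
    # same SQLite database as the earlier examples.

    import sqlite3

    BLOCKLIST_SCHEMA = "CREATE TABLE IF NOT EXISTS blocked_ips (ip_address TEXT PRIMARY KEY)"

    def block_ip(db_path, ip_address):
        """Add an IP address to the blocklist."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(BLOCKLIST_SCHEMA)
            conn.execute("INSERT OR IGNORE INTO blocked_ips VALUES (?)", (ip_address,))

    def is_blocked(db_path, ip_address):
        """Check an incoming request's IP before serving the page."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(BLOCKLIST_SCHEMA)
            row = conn.execute(
                "SELECT 1 FROM blocked_ips WHERE ip_address = ?", (ip_address,)
            ).fetchone()
        return row is not None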

Conclusion

If the above seems painfully obvious to you then you are safe, since you have considered these factors before. Security on the web is a very difficult thing to judge; what appears safe could in fact carry a small hole waiting to be taken advantage of. Using detailed server-side tracking simply allows you to get a better picture of what your site goes through. You will be able to see when the different spiders visit and gain an insight which may not previously have been available. I would recommend it wherever possible; even if you are not there to monitor it, if something does go wrong at least you will have a detailed log to look back on.
