The Application Hacker's Toolkit
To perform a thorough assessment of any web application you should become familiar with the tools introduced in this section. The toolset we recommend is based on the experience and personal preferences of the course authors. The number one rule when assembling a toolkit is to find a selection you are comfortable with and use it; however, the tools we recommend should provide everything you need and a good basis against which to benchmark others.
Assessment Proxies
A web application assessment proxy (a.k.a. a Man-in-the-Middle proxy) is perhaps the most important tool in your arsenal. This type of proxy is designed to sit between a web browser and the web application, allowing requests and responses to be examined and modified in real time.
There are many tools available to perform this task; whichever you choose should support the following:
- Permit the trapping and modification of requests over both HTTP and HTTPS
- Offer some form of work flow management (e.g. a list of application components you have accessed and manipulated)
- A fuzzer component (more about this later)
- A regex engine to allow automatic modification of request data and extraction of user defined values
- A web spider
- NTLM/Digest/Basic authentication support
- The ability to resubmit requests
We recommend Burp Suite based upon its rich feature set and flexibility. Other recommended tools include Web Scarab, @stake Web proxy and Paros Proxy.
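To make the concept concrete, the sketch below shows the core of what an assessment proxy does: accept a browser request, expose it for inspection, forward it upstream, and relay the response. This is a minimal illustration for plain HTTP only; real assessment proxies such as Burp Suite also handle HTTPS, request trapping and workflow management. The class name is illustrative, not taken from any tool.

```python
# Minimal sketch of an intercepting HTTP proxy (plain HTTP only).
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import Request, urlopen

class InterceptingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # When the browser is configured to use this server as its HTTP
        # proxy, self.path holds the full request URL.
        print(f"[intercepted] GET {self.path}")
        headers = {k: v for k, v in self.headers.items()
                   if k.lower() not in ("proxy-connection", "connection")}
        try:
            upstream = urlopen(Request(self.path, headers=headers), timeout=10)
            status, resp_headers, body = upstream.status, upstream.headers, upstream.read()
        except HTTPError as err:
            # Error responses (404, 500, ...) are still relayed verbatim.
            status, resp_headers, body = err.code, err.headers, err.read()
        # A real assessment proxy would pause here and let the tester view
        # and modify the request/response before forwarding it.
        self.send_response(status)
        for name, value in resp_headers.items():
            if name.lower() not in ("transfer-encoding", "connection"):
                self.send_header(name, value)
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), InterceptingProxy).serve_forever()
```

Pointing a browser's HTTP proxy setting at 127.0.0.1:8080 will route its traffic through the `do_GET` handler, where each request can be logged or altered.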
Burp Suite
Burp Suite by PortSwigger (🔗 http://portswigger.net/suite) is a man-in-the-middle proxy specifically designed for performing application assessments. In this module you will be using Burp Suite to map applications, discover vulnerabilities and exploit them. Key features include:
- Ability to “passively” spider an application in a non-intrusive manner, with all requests originating from the browser.
- One-click transfer of interesting requests between tools e.g. from Burp Proxy request history, or the Burp spider results tree.
- Detailed analysis and rendering of requests and responses.
- Extensibility via IBurpExtender interface, which allows third-party code to extend the functionality of Burp Suite. Data processed by one tool can be used in arbitrary ways to affect the behavior and results of other tools.
- Centrally configured settings for downstream proxies, web and proxy authentication, and logging.
- Tools can run in a single tabbed window, or be detached in individual windows.
- All tools and suite configuration is optionally persistent across program loads.
- Runs on both Linux and Windows.
Burp Suite Demonstration
At this point of the course the instructor will provide a brief tour of Burp Suite’s key functionality. For further information see the extensive help documentation that ships with Burp Suite.
Burp Proxy
The Burp Proxy allows requests and responses to be intercepted and manipulated on the fly. Regular expressions are also supported, allowing values to be dynamically replaced or extracted.
Burp Spider
The Burp Spider represents discovered components in a tree view, providing a method to track your workflow and launch attacks against each component.
Burp Intruder
Burp Intruder is used to submit large numbers of requests using one of several fuzzing options to discover and exploit vulnerabilities. It has many possible uses, from discovering SQL injection flaws to attacking authentication forms.
Burp Repeater
The Burp Repeater allows requests to be replayed and manipulated. This feature is useful when confirming and exploiting vulnerabilities.
Scanners
There are many HTTP-based scanners available, from basic Perl scripts to commercial GUI-driven applications. Web application/HTTP scanners are known by a number of different names depending on their specific focus. The term “CGI Scanner” is often used to describe a scanner designed to assess web server build security, whereas the term “Application Scanner” (or similar) suggests the focus is on application code flaws such as SQL Injection. All scanners are designed to discover vulnerabilities with little to no intervention from the user. Automated scanners can be powerful tools if used correctly and within the appropriate context. A scanner should not replace manual testing, but should be considered one tool in your arsenal.
CGI Scanners
CGI Scanners are used to discover web server components that are vulnerable to known published vulnerabilities. Most good scanners will also highlight areas of interest such as administrative interfaces and potentially sensitive components. The term “CGI Scanner” is historical and takes its name from “Common Gateway Interface” web server technology that once dominated web applications.
Nikto
Nikto is an open source CGI Scanner written in Perl and is available from 🔗 http://www.cirt.net. Nikto is widely considered to be one of the best CGI-Scanners available due to its extensive vulnerability database. Several other CGI scanners such as Wikto also implement the Nikto vulnerability database.
Wikto
Wikto implements the Nikto vulnerability database and adds a number of features such as a GUI Interface, Google hacking, and directory/file guessing features. Wikto is a Windows only tool and is available from 🔗 http://www.sensepost.com.
Forced Browsing Tools
Forced browsing is an attack that aims to discover web application components that are hidden and/or not specifically linked from the application. The aim of forced browsing is to discover sensitive components such as:
- Backup Files and Archives
- Source Code
- Administrative Interfaces
- Components with inadequate access controls such as pages and scripts that would normally only be accessible once authenticated
- Outdated components that are no longer maintained and potentially insecure
Wikto’s Backend
Wikto provides a feature called “Backend” to perform forced browsing. The Backend feature is based on a word list of common directory and file names, and attempts each file using a list of potential file extensions. There are a number of other tools available to perform this function, including “Directory Buster”, available from the OWASP project website (🔗 www.owasp.org); however, Wikto provides a richer feature set and superior performance (Wikto also provides some directory-guessing capabilities).
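The technique itself is straightforward to sketch: combine every directory, file name and extension from the word lists into candidate URLs, request each one, and keep anything that does not return a 404. The host, word lists and extensions below are placeholders for illustration; this is the general approach, not Wikto's implementation.

```python
# Sketch of wordlist-based forced browsing.
from itertools import product
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

def candidate_urls(base_url, directories, filenames, extensions):
    """Generate every guessable URL from the supplied word lists."""
    for d, f, ext in product(directories, filenames, extensions):
        yield f"{base_url}/{d}/{f}{ext}"

def force_browse(base_url, directories, filenames, extensions):
    """Yield (url, status) for every guessed path that does not 404."""
    for url in candidate_urls(base_url, directories, filenames, extensions):
        try:
            status = urlopen(url, timeout=5).status
        except HTTPError as err:
            status = err.code
        except URLError:
            continue  # host unreachable / connection refused
        if status != 404:
            yield url, status

# Example usage (hypothetical target):
# for url, status in force_browse("http://target.example",
#                                 ["admin", "backup", "old"],
#                                 ["index", "login", "config"],
#                                 [".asp", ".bak", ".zip", ".txt"]):
#     print(status, url)
```

Note that some servers return 200 or 302 for every path; in practice the baseline response for a known-missing resource should be checked first.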
Application Scanners
Application scanners are designed to discover application code vulnerabilities such as SQL Injection and Cross-Site Scripting. Typically an application scanner will begin by spidering the website to discover content and will then attempt to manipulate each query string and form parameter in order to discover vulnerabilities.
Unfortunately automated scanners cannot apply the same logic as a consultant and as such do not follow functional paths correctly. For example, consider a registration form that guides the user through multiple steps before successfully registering the user in a database. An automated scanner may fail to complete the registration process by entering invalid registration data or by failing to complete a Captcha challenge. In this case the scanner would fail to detect any vulnerabilities within the registration process beyond the point of failure. Another common issue arises where the web application requires authentication and logs the user out upon each attack attempt; most scanners do not correctly handle the re-authentication process and miss vulnerabilities as a result.
Web application assessments should be performed by a consultant or developer with the relevant application testing skills. Application scanners provide a method of saving time and discovering the “low hanging fruit” vulnerabilities, which are the most likely to be exploited. The following scanners have been tested and are considered the best by Sec-1 Ltd consultants.
Paros Proxy
Paros Proxy (🔗 www.parosproxy.org) is a free Man-In-The-Middle proxy, written in Java, with an application scanning feature.
Note: a Captcha is a challenge-response authentication process where the user is typically asked to enter characters from an image (🔗 CAPTCHA).
ANSA – Vulnerability Assessment Scanner
The Sec-1 scanner is an online scanning service with a web application scanning component. The scanning system is constantly benchmarked against the best in the industry to ensure it offers leading detection rates and functionality. As well as web application scanning, the Sec-1 scanner also performs port scanning and Nessus vulnerability scanning. Scans can be scheduled to occur at regular intervals, and results offer detailed technical information to allow consultants to further investigate each discovered vulnerability.
WatchFire AppScan & SPI Dynamics WebInspect
AppScan from WatchFire and WebInspect from SPI Dynamics are commercially licensed vulnerability scanners for Windows platforms. There is little to separate the two scanners in terms of functionality; both offer a comprehensive list of scanning features and comparable detection rates.
Both scanners perform effectively as automated scanners, but are susceptible to the same flaws as all automated testing systems: they cannot improvise, they cannot complete forms with the same precision as a real-life user, and Captcha authentication systems would prevent either scanner from authenticating.
Encoding/Decoding Tools
As we have already seen, web applications implement a number of encoding standards for request and response data. When testing an application you will frequently need to encode and decode data, so a tool to perform this function is essential. The “Encoder” utility from 🔗 http://www.woany.co.uk supports most of the major encoding standards and provides a handy graphical utility for use under Windows.
Encoder supports the following:
- Base64 encode/decode
- HTML encode/decode
- URL encode/decode
- UTF-7 encode
- UTF-7 encode (URL encoded)
- HEX encode/decode
- MD5 encode
- Tiger digest
- RIPEMD128
- RIPEMD160
- RIPEMD256
- RIPEMD320
- SHA-1
- SHA-256
- SHA-384
- SHA-512
- Length
- Lower case
- Upper case
Burp Suite also includes an encoding utility, but it is less intuitive than Encoder.
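Most of the transformations listed above are also available programmatically, which is useful when encoding payloads in bulk. A short sketch using only the Python standard library (the sample string is arbitrary):

```python
# Common encode/decode operations from the Python standard library.
import base64, hashlib, html, urllib.parse

value = "O'Reilly & Sons <admin>"

b64  = base64.b64encode(value.encode()).decode()   # Base64 encode
url  = urllib.parse.quote(value)                   # URL encode
htm  = html.escape(value)                          # HTML encode
hexs = value.encode().hex()                        # HEX encode
md5  = hashlib.md5(value.encode()).hexdigest()     # MD5 digest
sha1 = hashlib.sha1(value.encode()).hexdigest()    # SHA-1 digest

# The reversible encodings round-trip back to the original value:
assert base64.b64decode(b64).decode() == value
assert urllib.parse.unquote(url) == value
assert html.unescape(htm) == value
assert bytes.fromhex(hexs).decode() == value
```

The digests (MD5, SHA-1 and friends) are one-way and have no corresponding decode step, a point that matters in the next section.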
Password Hash Cracking
During your application assessment endeavors you may encounter passwords stored using one of several password-hashing algorithms, such as MD5 or SHA-1. Hashing is a one-way process: it is not possible to directly reverse a hash back to its plain text equivalent.
There are three possible solutions to this problem. The first, and perhaps most common, is to take a word list and perform the hashing calculation on each word; if any of the resulting hashes match the one you are attempting to crack, you have found the plain text value. A similar technique, known as a “brute force attack”, performs the same process but uses all possible character combinations until the plain text value is found. The brute force method can take a very long time to complete, depending on the hash algorithm and chosen character set. A third option uses pre-computed hash tables known as Rainbow Tables (🔗 http://en.wikipedia.org/wiki/Rainbow_table); to use this method you will need to generate or download rainbow tables for the hashing algorithm in question. For MD5 hashes the size of the tables varies dramatically depending on the character set.
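The word list approach described above can be sketched in a few lines: hash each candidate and compare it against the target. The target hash and word list below are fabricated for illustration (the target is simply the MD5 of "letmein").

```python
# Sketch of a dictionary attack against an unsalted hash.
import hashlib

def dictionary_attack(target_hash, wordlist, algorithm="md5"):
    """Return the plain text whose hash matches target_hash, or None."""
    for word in wordlist:
        if hashlib.new(algorithm, word.encode()).hexdigest() == target_hash:
            return word
    return None

target = hashlib.md5(b"letmein").hexdigest()
found = dictionary_attack(target, ["password", "123456", "letmein", "qwerty"])
print(found)  # → letmein
```

A brute force attack is the same loop driven by a character-set generator instead of a word list, which is why its running time grows exponentially with password length.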
Cain & Abel
Cain & Abel is a Windows-based tool designed for intercepting and cracking many different types of authentication schemes. Part of its functionality is to perform dictionary, brute force and cryptanalysis (rainbow table) attacks against common hashing algorithms, including MD5 and SHA-1.
Cain & Abel can be downloaded at 🔗 http://www.oxid.it
Application Mapping
Mapping the application and fingerprinting server side technology is a key first step when planning an attack against an application. In this section we introduce the techniques required to fingerprint the web server (presentation server) and discover application technologies implemented as part of the application.
HTTP Fingerprinting
HTTP fingerprinting can be used to identify the web server hosting the application. Once the type and version of the web server have been determined, this information can be used to search for known flaws in the product. Web sites such as secunia.com and securityfocus.com provide searchable databases of known security vulnerabilities within a given off-the-shelf product.
Banner Grabbing
Many web servers such as Apache and Microsoft IIS will respond with their type and version within the “Server” response header of most HTTP requests. For example, using netcat (🔗 http://www.vulnwatch.org/netcat) to connect to 🔗 www.microsoft.com and issuing the following request will reveal the server type and version:
Request
C:\Users\garyoleary>nc www.microsoft.com 80
GET / HTTP/1.0
Host: www.microsoft.com
Response
HTTP/1.1 302 Found
Cache-Control: private
Content-Type: text/html; charset=utf-8
Location: /en/us/default.aspx
Server: Microsoft-IIS/7.0
X-AspNet-Version: 2.0.50727
Providing Microsoft has not purposefully manipulated the server header, we can deduce from the server response that they are hosting 🔗 www.microsoft.com on Microsoft IIS 7.0.
Although the server header will provide accurate information in the vast majority of cases, it is possible to manipulate the response to be purposefully misleading or brief.
For example, Apache servers are frequently configured to display “Apache” within the server header instead of the more verbose response including the version number, “Apache/2.0.49”.
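The netcat session above is easy to script when you have many hosts to check. A minimal banner-grabbing sketch over a raw socket (plain HTTP on port 80; the commented-out target host is just an example):

```python
# Sketch of HTTP banner grabbing over a raw socket.
import socket

def parse_server_header(response):
    """Extract the value of the Server header, or None if absent."""
    for line in response.split("\r\n"):
        if line.lower().startswith("server:"):
            return line.split(":", 1)[1].strip()
    return None

def grab_server_header(host, port=80):
    request = f"HEAD / HTTP/1.0\r\nHost: {host}\r\n\r\n".encode()
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(request)
        response = sock.recv(4096).decode(errors="replace")
    return parse_server_header(response)

# Example usage:
# print(grab_server_header("www.microsoft.com"))
```

As noted above, the result is only as honest as the server configuration; a missing or terse Server header is itself a data point.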
HTTPrint
HTTPrint fingerprints web servers by analyzing their responses to various request types. The aim of HTTPrint is to detect the actual web server type and version even if the server response header has been manipulated.
HTTPrint is available for multiple platforms and can be downloaded here: 🔗 http://www.net-square.com
Identifying Server Side Technology
File Extensions
It is usually possible to determine which server-side technology a specific application implements by observing the file extensions used by the server-side scripts. The following table maps file extensions to their respective technologies:
File Extension   Technology
.asp             Microsoft Active Server Pages. Most probably an IIS server; it is possible, though rare, to find .asp files hosted on an Apache server.
.aspx            Microsoft .NET
.pl              Perl
.cfm             Cold Fusion
.jsp             Java Server Pages
.py              Python
.dll             Compiled Microsoft Windows code
.nsf             Lotus Notes
.ntf             Lotus Notes
.php             PHP
.d2w             WebSphere
.do              Web-based Java program run by a web server, such as IBM WebSphere; typically mapped to the Struts controller
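When cataloguing hundreds of discovered URLs, this mapping can be applied mechanically. A small sketch; the dictionary simply restates the table above, and the guesses remain heuristics rather than guarantees:

```python
# Guess server-side technology from a URL's file extension.
import os
from urllib.parse import urlparse

EXTENSION_TECH = {
    ".asp":  "Microsoft Active Server Pages (usually IIS)",
    ".aspx": "Microsoft .NET",
    ".pl":   "Perl",
    ".cfm":  "Cold Fusion",
    ".jsp":  "Java Server Pages",
    ".py":   "Python",
    ".dll":  "Compiled Microsoft Windows code",
    ".nsf":  "Lotus Notes",
    ".ntf":  "Lotus Notes",
    ".php":  "PHP",
    ".d2w":  "WebSphere",
    ".do":   "Java, typically mapped to the Struts controller",
}

def guess_technology(url):
    # Extract the path (ignoring any query string) and check its extension.
    path = urlparse(url).path
    ext = os.path.splitext(path)[1].lower()
    return EXTENSION_TECH.get(ext, "unknown")

print(guess_technology("http://example.com/login.aspx"))  # → Microsoft .NET
```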
Cookies
Application frameworks such as Microsoft .NET and The Java Platform provide session/state management capabilities via dynamically generated cookies, often referred to as Session IDs.
Each framework generates a cookie name that is unique to that platform, which can therefore be used to fingerprint the server-side environment. The following table provides example cookies and their respective frameworks.
Cookie              Platform
ASPSESSIONID        Microsoft IIS / ASP
ASP.NET_SessionId   Microsoft ASP.NET
CFID/CFTOKEN        Cold Fusion
PHPSESSID           PHP
NAME_session_id     Ruby on Rails, where NAME is the name of the Rails project
JSESSIONID          The Java Platform
ORASSO              Oracle Single Sign-On / Apache-based Oracle application server
Spidering
Web spidering is performed to discover website content linked from a given start point and serves as a method of mapping and cataloguing application components. There are two methods of spidering available when performing an application assessment:
Automatic
An automatic web spidering tool will take a start point, such as the application home page, and attempt to automatically discover all linked components. The spider will work through each page, parsing and following links until no further links can be found within a given scope (i.e. an allowed domain).
The main disadvantage of spidering in this way is that some components cannot be correctly parsed without user intervention. For example, a registration page with a Captcha authentication image cannot be correctly followed without user intervention. Many automatic spiders also run into problems with Ajax components and JavaScript navigation.
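A minimal sketch of the automatic approach, using only the Python standard library, makes both the technique and its limitations concrete: it follows anchor links breadth-first within one allowed domain, and exhibits exactly the weaknesses noted above (no JavaScript execution, no form or Captcha handling).

```python
# Sketch of an automatic web spider restricted to one domain.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def spider(start_url, max_pages=50):
    """Breadth-first crawl; returns the set of discovered in-scope URLs."""
    scope = urlparse(start_url).netloc          # allowed domain
    queue, seen = deque([start_url]), {start_url}
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            page = urlopen(url, timeout=10).read().decode(errors="replace")
        except OSError:
            continue                            # unreachable or non-text page
        parser = LinkExtractor()
        parser.feed(page)
        for link in parser.links:
            absolute = urljoin(url, link)       # resolve relative links
            if urlparse(absolute).netloc == scope and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen
```

Links generated by JavaScript or submitted via forms never appear in the `<a>` tags this parser reads, which is precisely why the user-directed approach described next exists.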
User Directed
User Directed Spidering can be performed using tools that combine a Man-In-The-Middle web proxy with a spidering tool, two such tools are Burp Suite and WebScarab.
In this case the user navigates around the application while tracking his or her progress through the proxy server. The proxy server is configured to add each visited link to the spider queue and to continue normal spidering operation for each link it receives. This approach allows multistage forms, Ajax components and other complex systems to be correctly identified during the spidering process.
Burp Suite allows you to identify application components without enabling the spider: all discovered links are recorded, but none are automatically followed. This is useful when assessing sensitive applications, such as administrative interfaces, where automatically following every link could cause damage (e.g. some links may cause users or site components to be deleted).
Forced Browsing
The term “Forced Browsing” is sometimes used when describing CGI scanning, since both are designed to find hidden, administrative and default components that may be of interest to an attacker. When mapping an application we can use forced browsing in an attempt to discover hidden functionality, such as Content Management Systems and components that are not intended to be accessed by anonymous Internet users.
The Wikto tool has a feature named “Backend” that will attempt to guess directory names using a word list; once directories are discovered, each is checked for common file names combined with a list of common extensions.
Before running a Wikto scan, any directory names you have learned from spidering should be added to the directory list in an attempt to discover hidden subdirectories and files. Each discovered resource should then be accessed via your man-in-the-middle proxy and subjected to further spidering.