Enumerating Content and Functionality

In a typical application, the majority of the content and functionality can be identified via manual browsing. The basic approach is to walk through the application starting from the main initial page, following every link, and navigating through all multistage functions (such as user registration or password resetting). If the application contains a “site map,” this can provide a useful starting point for enumerating content.
However, to perform a rigorous inspection of the enumerated content, and to obtain a comprehensive record of everything identified, you must employ more advanced techniques than simple browsing.
Web Spidering
Various tools can perform automated spidering of websites. These tools work by requesting a web page, parsing it for links to other content, requesting these links, and continuing recursively until no new content is discovered.
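As a rough illustration of this basic loop, the following is a minimal sketch in Python using only the standard library; the start URL is a placeholder, and real spiders add throttling, cookie and session handling, and more careful scope rules than the simple same-host check used here.

# A self-contained sketch of the basic spidering loop, assuming a
# start URL on the target host (http://example.com/ is a placeholder).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect href/src attribute values from a parsed HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)


def spider(start_url):
    """Request pages, parse each for links, and recurse until no new
    content on the same host is discovered."""
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as response:
                body = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable, non-HTTP, or error response; skip it
        parser = LinkExtractor()
        parser.feed(body)
        for link in parser.links:
            absolute = urljoin(url, link).split("#")[0]  # drop fragments
            if urlparse(absolute).netloc == host:  # stay within scope
                queue.append(absolute)
    return seen


if __name__ == "__main__":
    for url in sorted(spider("http://example.com/")):
        print(url)
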
Building on this basic function, web application spiders attempt to achieve a higher level of coverage by also parsing HTML forms and submitting these back to the application using various preset or random values. This can enable them to walk through multistage functionality and to follow forms-based navigation (such as where drop-down lists are used as content menus). Some tools also parse client-side JavaScript to extract URLs pointing to further content.
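A sketch of the form-submission step might look like the following, assuming the page body has already been fetched (for example, by the spider above). This is illustrative only, not how any of the named tools implement it: the preset value "test" is arbitrary, and real tools vary field values and handle nested markup more carefully than this flat parser does.

# Extract each form's action, method, and named fields, then
# replay the form with preset values to discover further content.
from html.parser import HTMLParser
from urllib.parse import urlencode, urljoin
from urllib.request import urlopen


class FormExtractor(HTMLParser):
    """Record each form's action, method, and named input fields."""

    def __init__(self):
        super().__init__()
        self.forms = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self.forms.append({"action": attrs.get("action", ""),
                               "method": attrs.get("method", "get").lower(),
                               "fields": []})
        elif tag in ("input", "select", "textarea") and self.forms:
            # Simplification: fields are attributed to the last form seen.
            name = attrs.get("name")
            if name:
                self.forms[-1]["fields"].append(name)


def submit_forms(page_url, html):
    """Submit every form on a page with preset values and return the
    responses, which can in turn be parsed for further links."""
    parser = FormExtractor()
    parser.feed(html)
    responses = []
    for form in parser.forms:
        target = urljoin(page_url, form["action"])
        data = urlencode({field: "test" for field in form["fields"]})
        try:
            if form["method"] == "post":
                responses.append(urlopen(target, data=data.encode()))
            else:
                # Sketch only: ignores any query string already on target.
                responses.append(urlopen(target + "?" + data))
        except OSError:
            continue  # submission failed; move on to the next form
    return responses
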
Numerous free tools are available that do a decent job of enumerating application content and functionality, including Burp Suite, WebScarab, Zed Attack Proxy, and CAT (see Chapter 20 for more details).
