RAWR - Rapid Assessment of Web Resources [beta]

Started by al14s, March 20, 2013, 03:52:44 PM


al14s

Hey all,

   Just wanted to suggest the addition of a python web enumeration app to the repo.

It's hosted on bitbucket:    https://bitbucket.org/al14s/rawr

Git:    https://bitbucket.org/al14s/rawr.git

I just released it at CarolinaCon:   http://www.youtube.com/watch?v=YuQXUCWmkxA


From the Wiki:

RAWR - Rapid Assessment of Web Resources
by: Adam Byers (@al14s)

Requirements:

    nmap - at least 6.00 - required for SSL strength assessment
    python - tested with 2.7
    phantomJS - tested with 1.8.1

Installation:

    Run './rawr.py --check-install'. This checks for nmap, then downloads the latest default password list, IP-to-country list, and phantomJS (from code.google.com).

OS Compatibility (tested with):

    Backtrack 5r3 x86 / NMap 6.01
    Ubuntu 12.04 LTS x64 / NMap 5.21 (nmap caused errors due to absence of ssl-enum-ciphers.nse - see the quick check below this list)
    MacOSX 10.8.2 / NMap 6.25
    Windows 7 Ultimate / NMap 6.25
    Windows XP Pro SP3 / NMap 6.25
    Windows 8 x64 / NMap 6.25
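
    To check whether your nmap build includes that script (plain nmap/shell usage,
    nothing RAWR-specific; the scripts path may differ by OS):

        ls /usr/share/nmap/scripts/ | grep ssl-enum-ciphers
        nmap -p 443 --script ssl-enum-ciphers <host>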

Browser Compatibility:

    Supported - Firefox (tested w/ 14)
    Tested/working on - Safari (tested w/ 5), Chrome (tested w/ 22), IE (tested w/ 7-9)

  Usage: ./rawr.py [-n <range> (-p <ports|all|fuzzdb>)|-x <xml file>]
                     [-u] [-e] [--sslv] [-z] [-d <dir>]

   INPUT OPTIONS:
    -i      Target an input list.  [NMap format] [can't be used with -n]
    -n      Target the specified range or host.  [NMap format]
    -p      Specify port(s) to scan.   [default is '80,443,8080,8088']
    -t      Set a custom NMap scan timing.   [default is 4]
    -x      An xml file or dir containing xml files from which to pull hosts.
    --sslv  Assess the SSL security of each target.  [considered intrusive]

   OUTPUT OPTIONS:
    -d      Logging Directory [default is './log_[date]_[time]_rawr']
    -e      Exclude default username/password data from output.
    -z      Compress log folder when finished.

   OTHER:
    -b      Use Bing to gather external hostnames. (good for shared hosting)
    -h      Show this info + summary + examples.
    -u      Check for newer version of IpToCountry.csv and defpass.csv.
    -U      Force update of IpToCountry.csv and defpass.csv.

    --check-install  Check for newer IpToCountry.csv and defpass.csv,
                     check for presence of NMap and its version,
                     check for presence of phantomJS (prompts before installing it).

    --force-install  Force update - IpToCountry.csv, defpass.csv, phantomJS.
                     Check for presence of NMap and its version.

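   EXAMPLES (illustrative combinations of the flags above; ranges, ports,
             and file names are placeholders):

    ./rawr.py -n 192.168.1.0/24 -z
        scan a range on the default ports, compress the log folder when done

    ./rawr.py -x previous_scan.xml --sslv
        re-use existing NMap xml output and add the (intrusive) SSL assessment

    ./rawr.py -i targets.txt -p 80,443,8443 -e -b
        scan hosts from an input list on custom ports, exclude default
        username/password data, and use Bing to gather external hostnames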

   SUMMARY:

         Uses NMap scan data (by running a scan or from previous xml output)
             to target web services for enumeration.  Visits each host on each
             port with an identified web service and gathers all of the data. 


         Output: 
              All NMap output formats (xml uses local copy of nmap.xsl)
              CSV worksheet containing all collected info.
              HTML report  (searchable, jQuery-driven, standalone)
              Images folder  (contains screenshots of the web interfaces)
              Cookies folder
              SSL Certificates folder
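
          Illustrative layout of a finished run (the folder name follows the -d
          default; the entries are just the items listed above - exact file names
          will vary):

               log_[date]_[time]_rawr/
                   NMap output  (.nmap / .gnmap / .xml, with the local nmap.xsl)
                   CSV worksheet
                   HTML report
                   images/        (screenshots of the web interfaces)
                   cookies/
                   SSL certificates/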

         Usage diagram (flow):

              SCAN  ->  [.xml data]  ->  ENUMERATE  ->  OUTPUT
              (LOG options set where results land; SUPPLEMENT data feeds ENUMERATE)

              LOG         ./log_[dt]_rawr/,  -z .tar file,  -d log directory
              SCAN        -x nmap xml or dir of xml files,
                          -i use an input list for NMap,
                          -n nmap <range>  (-p <ports>, -t <timing>)
              SUPPLEMENT  IpToCountry.csv & defpass.csv
                          (-u|-U to update from the SF page, -e to exclude defpass,
                           -b use Bing for DNS)
              ENUMERATE   web service enumeration (screenshot, record server data),
                          --sslv [intrusive] SSL assessment
              OUTPUT      CSV worksheet, HTML report, NMap output,
                          Cookies, Screenshots, SSL certs

   
Best regards,
   al14s

c0ncealed

A lot of thought, preparation, and testing has gone into this tool so far, and I believe that will show once you get the chance to use it to map out the web resources of your target environment.
We just released it at CarolinaCon 9, and we already have a good list of features to work toward over the next few months.

Please give it a run and let us know what you think!
Good or bad, please keep it constructive so that we can keep improving this tool for the security community!

Thanks!
@c0ncealed

al14s


There's a new version out...     0.1.1

If anyone is working with it at the moment, I suggest pulling it down.  Lots of bugfixes.    ;D

New Features:
     - now pulls robots.txt
     - self-updating function fully operational


Regards,
al14s

ostendali

Quote from: al14s on March 25, 2013, 03:54:39 PM

There's a new version out...     0.1.1

Thanks for the heads-up about the update...
Stay tuned! ;)

al14s

More updates (available only in the al14s/dev branch until I can solidify the way it parses Nessus files):


  • use '-o' to send an additional 'OPTIONS' request - retrieves available methods (a quick sketch of what that looks like follows this list)
  • now accepts Nessus .xml result files as input - need feedback, as I don't really use Nessus.  :-\ (would be helpful if anyone has a link to sample reports, including service versioning of web services and/or SSL cert enum)
  • I've set a socket timeout equal to the timeout specified in the user vars section of the script...  this should prevent the initial call from (for example) drowning in a pot of honey.
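
  A minimal Python 2.7 sketch of those two pieces (an illustration of the general
  technique, not RAWR's actual code; host, port, and timeout are placeholders):

      import socket
      import httplib

      # Global socket timeout (like the value from the user vars section) so a
      # slow or sticky host can't hang the initial call indefinitely.
      socket.setdefaulttimeout(10)

      # An OPTIONS request; the 'Allow' header lists the methods the server
      # claims to support, e.g. "GET, HEAD, POST, OPTIONS".
      conn = httplib.HTTPConnection("example.com", 80)
      conn.request("OPTIONS", "/")
      print conn.getresponse().getheader("allow")
      conn.close()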

Plans:

  • adding the option to make an additional call using HTTP 1.0 in hopes of obtaining more info from the earlier, 'chattier' version.
  • cleaning up the info in the HTML report's 'i' tab

al14s

Hello all - here's an update on the latest revision of RAWR

I've added spidering, diagrams, new outputs (sqlite/json/etc), and lots of bugfixes.

It can nom NMap|Nessus|Nexpose|Qualys XML files as well as Metasploit csv dumps.
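
For the XML formats, input presumably goes through the same -x flag documented above
(file and directory names here are just placeholders):

     ./rawr.py -x nessus_export.xml
     ./rawr.py -x ./scan_results/     (a directory of xml files also works, per -x)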

Check out the DerbyCon talk:  http://www.youtube.com/watch?v=SnXUzZSNwpU


Here's the diagram it generates for google.com - the full-size screenshot is at:
http://a.fsdn.com/con/app/proj/rawr-webenum/screenshots/74.125.225.102_20130927-041612_443_diagram.png
