Paul Seekamp (@nullenc0de)
Sep 11 · 11 tweets · 3 min read
How I just gained access to 22 unauthorized endpoints across 116 websites (260k endpoints) in about 10 minutes. Use what you're comfy with.

👇
Once I have my list of sites, I load them into my Burp Suite site map via Firefox. There are many ways to do it. If it's not thousands of sites, I use the Bulk Open extension. There are others.
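If you'd rather not install an extension, a rough shell alternative works too (a sketch, assuming your list lives in a file called sites.txt and Firefox is already running and proxied through Burp):

command:
while read -r url; do firefox "$url"; sleep 2; done < sites.txt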
I had Burp crawl a bit for me. You can be as thorough as you want in this phase; for the sake of the demo, you can just let Burp do its thing.
Once that finishes, open up GAP (github.com/xnl-h4ck3r/GAP…). Make sure you set "Prefix with selected Targets".
Select your whole site map and send it to the GAP extension.
Once GAP finishes, copy out your links and feed them into a file that you will run FFUF against. In my case, I filtered for "/api", "/admin" and "/user". Those are usually juicy.
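For that filtering step, something like this does the trick (a sketch, assuming you pasted GAP's links into a file called gap-links.txt; with "Prefix with selected Targets" on, those links are already full URLs):

command:
grep -Ei '/api|/admin|/user' gap-links.txt | sort -u > GAPoutput.txt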
We want to run FFUF against all of the URLs that GAP output for you, and make sure you save the whole request/response.

command:
ffuf -w "GAPoutput.txt" -u "FUZZ" -noninteractive -o "/tmp/results.json" -od "/tmp/bodies/" -of json
Then we want to feed this FFUF data to ffufPostprocessing (github.com/Damian89/ffufP…). It does a decent job of sanitizing the 260k+ endpoints.

command:
ffufPostprocessing -result-file "/tmp/results.json" -bodies-folder "/tmp/bodies/" -delete-bodies -overwrite-result-file
Finally, have httpx read through all the results and only keep the endpoints whose responses have more than 60 lines in them.

jq -r '.results[].url' "/tmp/results.json" | httpx -title -sc -lc -nc -silent | sed 's/[][]//g' | awk '$NF > 60' | egrep ' 200| 301| 302'
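Same pipeline, just split out per stage with comments so it's easier to follow:

jq -r '.results[].url' "/tmp/results.json" |   # pull every URL out of the ffuf results
  httpx -title -sc -lc -nc -silent |           # probe each one: page title, status code, line count, no color, no banner
  sed 's/[][]//g' |                            # strip the square brackets httpx puts around its fields
  awk '$NF > 60' |                             # keep rows whose last field is over 60 (the line-count cut)
  egrep ' 200| 301| 302'                       # and only 200/301/302 responses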
Once this finished, I got about 50 endpoints back, which I manually visited. I was able to whittle 260k endpoints down to about 50 with interesting information in them. 22 of those were sensitive in nature.
Added bonus: check them for leaked tokens/API keys.

command:
jq -r '.results[].url' "/tmp/results.json" | nuclei -tags tokens

