# massurl
massurl is a simple tool that parses the output of tools like gau, extracts
the parameters from each URL, removes duplicates, and does it all very
quickly. Because web scraping tools' outputs can get very large very fast, it
is nice to have a tool that parses them and outputs something clean and easy
to read.
## How to use?
Simply clone the git repository and run `make`, which builds the binary
*massurl*. You can then pipe the output of any command that prints URLs into
it, or pass the name of a file to read URLs from. It expects one URL per
line. It has several parameters:
```sh
usage: massurl [-r] [-v] [-o outfile] [-p payloads] [-n minparamnum] input_file
```
You can specify an output file with `-o`, which massurl will write instead of
stdout. You can also give it a list of payloads with `-p`, which massurl will
automatically substitute as the value of each parameter. If you are testing
for reflected values in parameters, you can put a pseudorandom value in each
parameter with the `-r` flag. Finally, `-n` sets the minimum number of
parameters a URL must have to be output; this is zero by default, but I
recommend using 1.
## How fast is it?
The tool stores the URLs in a binary search tree and keeps it balanced with
the red-black self-balancing algorithm, so inserts and duplicate lookups stay
fast (logarithmic in the number of URLs) even on very large inputs.
## Contributing
This is a very simple project, so you shouldn't have trouble reading the code
and fixing any bugs you encounter. If you do, feel free to send a PR. Or, if
you can't seem to fix it yourself, don't be shy about opening an issue!