View Issue Details

ID: 0000876
Project: Kali Linux
Category: New Tool Requests
View Status: public
Last Update: 2020-03-18 17:52
Reporter: jnieto
Assigned To: (none)
Priority: normal
Severity: feature
Reproducibility: have not tried
Status: closed
Resolution: suspended
Summary: 0000876: Parsero - The tool to audit the Robots.txt automatically
Description

Parsero is a free script written in Python which reads the Robots.txt file of a web server and looks at the Disallow entries, then requests each one in order to check whether it is available or not. This tool is really fast and helps you out when you are auditing a website, trying to locate files or directories on a web server which have been hidden from the search engines by the web administrators. I haven't found any similar tool that achieves this goal.
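The technique the description outlines (fetch robots.txt, collect the Disallow entries, then request each path to see whether it actually responds) can be sketched in a few lines of Python. This is a minimal illustration of the idea, not Parsero's actual code; the function names here are invented for the example:

```python
from urllib.parse import urljoin
from urllib.request import Request, urlopen


def parse_disallows(robots_txt: str) -> list[str]:
    """Collect the paths named in Disallow: lines of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths


def check_disallows(base_url: str) -> dict[str, int]:
    """Fetch robots.txt and report the HTTP status code of each Disallow entry."""
    robots = urlopen(urljoin(base_url, "/robots.txt")).read().decode("utf-8", "replace")
    statuses = {}
    for path in parse_disallows(robots):
        req = Request(urljoin(base_url, path), method="HEAD")
        try:
            statuses[path] = urlopen(req).status
        except Exception as exc:  # 4xx/5xx responses raise HTTPError
            statuses[path] = getattr(exc, "code", 0)
    return statuses
```

An entry that comes back with `HTTP 200` is exactly the interesting case the report describes: a path the administrator hid from search engines but left reachable.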

I would be really happy if you would consider trying this tool.

You can get the latest version here:

https://github.com/behindthefirewalls/Parsero

More info:

http://www.behindthefirewalls.com/2013/12/parsero-tool-to-audit-robotstxt.html

Please let me know your feedback, and whether you would like a walkthrough with real examples.

Additional Information

Python3 and urllib3 are needed.

Attached Files
Parsero_v06_Kali_1.PNG (120,996 bytes)   
Parsero_v06_Kali_2.PNG (280,760 bytes)
Parsero_v071.png (215,788 bytes)   

Activities

jnieto


2014-02-10 12:41

reporter   ~0001510

Just because the administrator writes a robots.txt, it doesn't mean that the files or directories listed in this file will not be indexed by Bing, Google, Yahoo...

Now Parsero is capable of searching in Bing to locate content indexed without the web administrator's authorization!

jnieto


2014-05-09 08:42

reporter   ~0001768

Update: Sometimes the links indexed by Bing are available and sometimes they are not, so Parsero now checks each one (on the first page of Bing results) in order to print out whether it is available or not.
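The Bing step described above amounts to issuing a `site:` query per Disallow entry and then probing the returned links with the same status check as before. A sketch of the query construction, assuming this general approach (the helper name is hypothetical, not Parsero's actual API):

```python
from urllib.parse import quote_plus


def bing_query_url(domain: str, disallow_path: str) -> str:
    """Build a Bing search URL that looks for an indexed Disallow path on a domain.

    Uses Bing's `site:` operator to restrict results to the audited domain.
    """
    query = f"site:{domain} {disallow_path}"
    return "https://www.bing.com/search?q=" + quote_plus(query)
```

Each link scraped from the results page would then be requested to decide whether the indexed content is still reachable, which matches the "available or not" check in the note above.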

g0tmi1k


2018-01-29 15:08

administrator   ~0008447

To help speed up the process of evaluating the tool, please make sure to include the following information (the more information you include, the more beneficial it will be for us):

  • [Name] - The name of the tool
  • [Version] - What version of the tool should be added?
    --- If it uses source control (such as git), please make sure there is a release to match (e.g. git tag)
  • [Homepage] - Where can the tool be found online? Where to go to get more information?
  • [Download] - Where to go to get the tool?
  • [Author] - Who made the tool?
  • [Licence] - How is the software distributed? What conditions does it come with?
  • [Description] - What is the tool about? What does it do?
  • [Dependencies] - What is needed for the tool to work?
  • [Similar tools] - What other tools are out there?
  • [How to install] - How do you compile it?
  • [How to use] - What are some basic commands/functions to demonstrate it?
jnieto


2018-01-29 21:08

reporter   ~0008554

  • [Name] - Parsero
  • [Version] - 0.81
  • [Homepage] - https://github.com/behindthefirewalls/Parsero
  • [Download] - https://github.com/behindthefirewalls/Parsero or https://pypi.python.org/pypi/parsero/0.81
  • [Author] - Javier Nieto (www.behindthefirewalls.com)
  • [Licence] - GNU General Public License v2.0
  • [Description] - Parsero is a free script written in Python which reads the Robots.txt file of a web server and looks at the Disallow entries. Then it checks the HTTP status code of each Disallow entry in order to check automatically if these directories are available or not.
  • [Dependencies] - Python3, Pip3, beautifulsoup4, urllib3.
  • [Similar tools] - Nothing similar (AFAIK)
  • [How to install]
    • sudo pip3 install parsero, or sudo python3 setup.py install (from GitHub)
  • [How to use]

$ parsero -h

usage: parsero.py [-h] [-u URL] [-o] [-sb] [-f FILE]

optional arguments:
  -h, --help  show this help message and exit
  -u URL      Type the URL which will be analyzed
  -o          Show only the "HTTP 200" status code
  -sb         Search in Bing indexed Disallows
  -f FILE     Scan a list of domains from a file

Example
=======

root@kali:~# parsero -u www.example.com -sb

g0tmi1k


2018-01-30 10:20

administrator   ~0008569

Please could you git tag your release?

g0tmi1k


2020-03-18 17:52

administrator   ~0012459

No upstream update since 2014.

Issue History

Date Modified Username Field Change
2014-01-08 21:48 jnieto New Issue
2014-02-10 12:39 jnieto File Added: Parsero_v06_Kali_1.PNG
2014-02-10 12:39 jnieto File Added: Parsero_v06_Kali_2.PNG
2014-02-10 12:41 jnieto Note Added: 0001510
2014-05-09 08:42 jnieto Note Added: 0001768
2014-05-09 08:43 jnieto File Added: Parsero_v071.png
2014-05-12 17:16 xploitx Issue cloned: 0001195
2014-06-04 15:19 karkassa Issue cloned: 0001382
2018-01-29 15:08 g0tmi1k Note Added: 0008447
2018-01-29 21:08 jnieto Note Added: 0008554
2018-01-30 10:20 g0tmi1k Note Added: 0008569
2018-05-08 08:56 g0tmi1k Summary Parsero: The tool to audit the Robots.txt automatically => Parsero - The tool to audit the Robots.txt automatically
2019-12-09 13:30 g0tmi1k Severity minor => feature
2020-03-18 17:52 g0tmi1k Status new => closed
2020-03-18 17:52 g0tmi1k Resolution open => suspended
2020-03-18 17:52 g0tmi1k Note Added: 0012459