2018-05-24 15:30 UTC

View Issue Details
ID: 0000876
Project: Kali Linux
Category: [All Projects] New Tool Requests
View Status: public
Last Update: 2018-05-08 08:56
Reporter: jnieto
Assigned To:
Priority: normal
Severity: minor
Reproducibility: have not tried
Status: new
Resolution: open
Product Version:
Target Version:
Fixed in Version:
Summary: 0000876: Parsero - The tool to audit the Robots.txt automatically
Description: Parsero is a free script written in Python which reads the Robots.txt file of a web server and looks at the Disallow entries in order to connect to them. Parsero prints whether each one is available or not. This tool is really fast and helps you out when you are auditing a website and trying to locate files or directories on a web server which have been hidden from the search engines by the web administrators. I haven't found any similar tool that achieves this goal.
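The workflow described above can be sketched in a few lines of Python. This is a hypothetical illustration of the technique, not Parsero's actual implementation; the function names are made up for the sketch:

```python
# Hypothetical sketch of the audit loop the description outlines:
# read robots.txt, pull out the Disallow entries, then list each
# disallowed URL so its availability can be checked.
from urllib.parse import urljoin
from urllib.request import urlopen


def parse_disallows(robots_txt):
    """Return the non-empty Disallow paths found in robots.txt content."""
    paths = []
    for line in robots_txt.splitlines():
        if line.strip().lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # a bare "Disallow:" entry blocks nothing
                paths.append(path)
    return paths


def audit(base_url):
    """Print every disallowed URL on base_url's web server."""
    with urlopen(urljoin(base_url, "/robots.txt")) as resp:
        robots_txt = resp.read().decode("utf-8", errors="replace")
    for path in parse_disallows(robots_txt):
        print(urljoin(base_url, path))
```

Parsero adds HTTP status checking and Bing lookups on top of this basic loop.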

I would be really happy if you would consider trying this tool.

You can get the latest version here:

https://github.com/behindthefirewalls/Parsero

More info:

http://www.behindthefirewalls.com/2013/12/parsero-tool-to-audit-robotstxt.html

Please let me know your feedback, and whether you would like a real walkthrough with real examples.
Additional Information: Python3 and urllib3 are needed.
Attached Files

Relationships

Notes

~0001510

jnieto (reporter)

Just because the administrator writes a robots.txt, it doesn't mean that the files or directories listed in this file will not be indexed by Bing, Google, Yahoo...

Now Parsero is capable of searching Bing to locate content that was indexed without the web administrator's authorization!

~0001768

jnieto (reporter)

Update: Sometimes the links indexed by Bing are available and sometimes they are not. Parsero now checks each one (on the first page of Bing results) in order to print out whether it is available or not.
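The availability check this update describes can be sketched as follows. This is a hypothetical illustration (not Parsero's code); it requests each link and reports the HTTP status it answers with:

```python
# Hypothetical sketch of checking whether an indexed link is available:
# send a HEAD request and report the status code the server returns.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def link_status(url, timeout=5):
    """Return the HTTP status code for url, or None if unreachable."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code   # server answered, but with an error status
    except URLError:
        return None     # DNS failure, refused connection, etc.


def available(url):
    """A link counts as available when it answers HTTP 200."""
    return link_status(url) == 200
```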

~0008447

g0tmi1k (administrator)

To help speed up the process of evaluating the tool, please make sure to include the following information (the more information you include, the more beneficial it will be for us):

- [Name] - The name of the tool
- [Version] - What version of the tool should be added?
--- If it uses source control (such as git), please make sure there is a release to match (e.g. git tag)
- [Homepage] - Where can the tool be found online? Where to go to get more information?
- [Download] - Where to go to get the tool?
- [Author] - Who made the tool?
- [Licence] - How is the software distributed? What conditions does it come with?
- [Description] - What is the tool about? What does it do?
- [Dependencies] - What is needed for the tool to work?
- [Similar tools] - What other tools are out there?
- [How to install] - How do you compile it?
- [How to use] - What are some basic commands/functions to demonstrate it?

~0008554

jnieto (reporter)

- [Name] - Parsero
- [Version] - 0.81
- [Homepage] - https://github.com/behindthefirewalls/Parsero
- [Download] - https://github.com/behindthefirewalls/Parsero or https://pypi.python.org/pypi/parsero/0.81
- [Author] - Javier Nieto (www.behindthefirewalls.com)
- [Licence] - GNU General Public License v2.0
- [Description] - Parsero is a free script written in Python which reads the Robots.txt file of a web server and looks at the Disallow entries. Then it checks the HTTP status code of each Disallow entry in order to check automatically if these directories are available or not.
- [Dependencies] - Python3, Pip3, beautifulsoup4, urllib3.
- [Similar tools] - Nothing similar (AFAIK)
- [How to install]
 - sudo pip3 install parsero, or sudo python3 setup.py install (from the GitHub checkout)
- [How to use] - See the help output and example below:

$ parsero -h

usage: parsero.py [-h] [-u URL] [-o] [-sb] [-f FILE]

optional arguments:
  -h, --help  show this help message and exit
  -u URL      Type the URL which will be analyzed
  -o          Show only the "HTTP 200" status code
  -sb         Search in Bing indexed Disallows
  -f FILE     Scan a list of domains from a file

Example
=======

root@kali:~# parsero -u www.example.com -sb

~0008569

g0tmi1k (administrator)

Please could you git tag your release?

Issue History
Date Modified Username Field Change
2014-01-08 21:48 jnieto New Issue
2014-02-10 12:39 jnieto File Added: Parsero_v06_Kali_1.PNG
2014-02-10 12:39 jnieto File Added: Parsero_v06_Kali_2.PNG
2014-02-10 12:41 jnieto Note Added: 0001510
2014-05-09 08:42 jnieto Note Added: 0001768
2014-05-09 08:43 jnieto File Added: Parsero_v071.png
2014-05-12 17:16 xploitx Issue cloned: 0001195
2014-06-04 15:19 karkassa Issue cloned: 0001382
2018-01-29 15:08 g0tmi1k Note Added: 0008447
2018-01-29 21:08 jnieto Note Added: 0008554
2018-01-30 10:20 g0tmi1k Note Added: 0008569
2018-05-08 08:56 g0tmi1k Summary Parsero: The tool to audit the Robots.txt automatically => Parsero - The tool to audit the Robots.txt automatically