Habra-Vulnerability Finder
It was evening, there was nothing to do... So I decided to surprise the habrovites. But how? Banal vulnerability hunting on well-known sites doesn't interest anyone anymore. Well then, time to switch on the imagination... After all, users have sites of their own! Their sturdiness urgently needed checking!
How it all happened:
First I needed an action plan. In essence, it came down to scraping as many user pages as possible, then pulling the sites listed on those pages, and finally feeding the whole lot to a scanner. Something like this…
How does it work?
Point one: scraping the pages
It's not all as simple as you'd like, mind you... The /users page doesn't list nearly as many users as one would want. So I decided to walk the site recursively: visit a user, pick up more users from their followers, then those users' followers, and so on...
Out comes my favorite Python, imagination switches on, and this gets written:
import urllib2, re

# login of the user to start from (a placeholder in the original)
start = "first user"

# Download the first user's followers page and parse the first victims' names from it.
# The original regex was lost in the page markup; the pattern below is an assumed reconstruction.
page = urllib2.urlopen("http://habrahabr.ru/users/"+start+"/subscription/followers/").read()
names = re.findall('href="/users/([^/"]+)/"', page)

# A recursive function that, by calling itself, walks the site looking for new victims
def going(names):
    for name in names:
        page = urllib2.urlopen("http://habrahabr.ru/users/"+name+"/subscription/followers/").read()
        names = re.findall('href="/users/([^/"]+)/"', page)
        # ...and writes the new victims to a file
        base = open('habrs.txt', 'a')
        writed = 0
        for item in names:
            base.write(item+"\r\n")
            writed = writed + 1
        base.close()
        print 'Saved habrs\'s: '+str(writed)
        going(names)

going(names)
After it had worked for a while, a decent number of habrovites had been scraped (about 10k). Obviously, most of these entries repeat themselves... So what to do?
Point two: deduplication
Being of a unique mindset, I decided it would be unfair to dump everything on Python, so PHP's array_unique came to mind. No sooner said than done.
The whole script boils down to opening the file, deduplicating it with a single function call, and writing it back.
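The script itself isn't shown here, so this is only a minimal sketch of the idea: habrs.txt is what the parser above wrote, habrs2.txt is what the next script reads.

<?php
// A sketch of the deduplication step (not the original script):
// read the scraped logins, drop duplicates with array_unique(), write them back.
$users  = file('habrs.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$unique = array_unique($users);
file_put_contents('habrs2.txt', implode("\r\n", $unique) . "\r\n");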
Everything was run from the console and worked fine.
It came out to a little under 2k unique users... Now I had to go and find their sites.
Point three: finding the sites
Python was pulled out again, and this got written. It's all very simple: walk the file line by line, open each user's profile page, and look for a site. If we find one, we stash it in a file.
import urllib2, re

sites = ""
users = open("habrs2.txt")    # the deduplicated list of logins
for user in users:
    user = user.rstrip("\r\n")
    page = urllib2.urlopen("http://habrahabr.ru/users/"+user+"/").read()
    # Look for a personal site link on the profile page. The pattern is an
    # assumed reconstruction -- the original regex was lost in the markup.
    found = re.findall('<a href="(http[^"]+)" class="url[^"]*"[^>]*>(.*?)</a>', page)
    if len(found) > 0:
        for site, fake in found:
            sites += site+"\r\n"

with open("sites.txt", "a") as f:
    f.write(sites)
A decent number of sites came out, and even the ones that weren't duplicates looked pretty indigestible.
They needed cleaning up...
Point four: cleaning the sites
Here preg_match in PHP was used, roughly along the lines of the sketch below.
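That script isn't shown either, so take this as a rough sketch with an assumed regex: it reads sites.txt from the previous step and writes sites2.txt for the scanner.

<?php
// A sketch of the cleanup pass; the regex is an assumption, not the original pattern.
// Keep only the bare host name of each collected URL.
$raw   = file('sites.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$clean = array();
foreach ($raw as $url) {
    if (preg_match('#^(?:https?://)?(?:www\.)?([a-z0-9][a-z0-9.-]*\.[a-z]{2,})#i', $url, $m)) {
        $clean[] = strtolower($m[1]);
    }
}
file_put_contents('sites2.txt', implode("\r\n", array_unique($clean)) . "\r\n");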
The result is a tidy list, à la:
yandex.ru
google.com
yahoo.com
Now all of this had to be hooked up to a scanner, the output collected and stashed away in files...
Point five: off we go with the scanner
A small Python wrapper was written to drive the console.
I'm writing it from memory, because the script was written right in the working environment (BackTrack r3 in a virtual machine), and after the happy discovery of a huge pile of vulnerabilities the whole thing was shut down without saving (the VM, that is), so the script itself did not survive...
The idea is to run the nikto.pl Perl script against each site for 60 seconds and write its output to a file named after the site (for ease of further processing).
import os, time

sites = open("sites2.txt")
for site in sites:
    site = site.strip()
    # launch nikto in the background, duplicating its output into <site>.txt
    os.system("perl nikto.pl -h "+site+" | tee "+site+".txt &")
    # remember the PID of the perl process that is now running
    os.system("pidof perl | tee perlID.txt")
    # give it 60 seconds per site, then kill it and move on
    time.sleep(60)
    pid = open("perlID.txt").read().strip()
    os.system("kill "+pid)    # BackTrack is Linux, so a plain kill
After the scanner had done its work, files of the form site.txt held whatever vulnerabilities nikto had managed to find quickly (within its 60 seconds).
Point six: debriefing
After it had run for a while (I left it going overnight), I decided to see what kinds of holes had turned up...
Since the habra-community is fairly savvy in this area (information security, that is), some reports contained none of the eye-pleasing '+' findings at all...
Still, among the roughly three hundred sites, some "leaky" ones were found.
Vulnerability rating
1) All kinds of open directories and files that should not have been accessible. (~40 sites)
2) Various spots potentially vulnerable to SQL injection and XSS. (~20 sites)
3) Insecure data-exchange protocols, passwordless memcached, and assorted holes in admin areas (no access checks on files, or an empty password treated as a valid one). (~10 sites)
4) Outright mistakes (or oversights): no password on the admin panel at all (the script flagged as many as 12 files and invited me to go see what was interesting in there). (2 sites)
5) One error found in a single instance, and probably due to an oversight: direct access to phpMyAdmin with the root login and no password (default settings). No other vulnerabilities were found on that site, so I believe it was simply an oversight.
Not all users were checked, not all sites, and far from all of the possibilities were explored!
I hope this post turns out to be useful to someone (maybe some administrator will go and double-check something on his own site).
The administrators of the vulnerable sites have been notified. I hope this leaves only positive thoughts in your head!