Getting a readable list of the DIR-615 router's Internet connections

I live in a large family with many computers, and their Internet access goes through a D-Link DIR-615 router. The device's web interface has a page that has always intrigued me: a table of all current Internet connections passing through the router. Looking at this huge number of entries, I kept wondering what my household members were up to, and whether their machines had become victims of some botnet. Every entry in this connection table contains only the IP addresses of computers on the local network and on the Internet. Setting aside the ethical side of the question, in this article I will describe my script-kiddie way of converting the list of connections into a human-readable form.

The list of connections looks like this:

[screenshot: the router's Internet sessions table]

Of course, you could sit and painstakingly copy the value of the 'Internet' field into some reverse DNS lookup service, but that is a terribly tedious task. So first let's find a way to get the list, at least in this raw form, into our program. My router, hardware revision E4, no longer has telnet, unlike older models, and no DD-WRT can be flashed onto it to get the connection list in any decent way. Therefore, we will fetch the data straight from the web interface. To access that page, we first need to log in to the router. Let's see how this process happens when connecting through a browser; for this we use a Mozilla Firefox plugin called HTTPFox.
After turning on HTTP request recording in the plugin, we try to log in to the router and look at what happened. Note that the router sits at 192.168.0.1 on the local network.

[screenshot: HTTPFox capture of the login request]

We see that authorization is simply a POST request to login.cgi. The request body consists mostly of various variations of our username and password. They look strange only at first glance: a script on the login page encodes the values of the text fields in base64 before submitting, and that encoding script is embedded right in the login page. But here's the snag: the page's function sometimes gave a different result than the urlsafe_b64encode function from Python's base64 library; the difference was that the page script sometimes changed the trailing '=' to 'A'. So I simply lifted both the headers and the POST data from HTTPFox verbatim and wrote a login function. From here on, the Python httplib module is used for all HTTP communication with the router.
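For comparison, this is what the standard library's encoding produces. Note how it differs from the login_name=YWRtaW4A value seen in the capture: Python yields canonical '=' padding, while the page's script replaced it with 'A'. The credentials 'admin' and 'fakepwd' here are placeholders, not real ones:

```python
import base64

# Encode placeholder credentials the same way the login page's
# script does (URL-safe base64 of the text-field values).
login = base64.urlsafe_b64encode(b'admin')       # b'YWRtaW4='
password = base64.urlsafe_b64encode(b'fakepwd')  # b'ZmFrZXB3ZA=='
```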
import httplib
from HTMLParser import HTMLParser

def LoginRouter():
	headers = {
	'(Request-Line)':	'POST /login.cgi HTTP/1.1',
	'Host':	'192.168.0.1',
	'User-Agent':	'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:18.0) Gecko/20100101 Firefox/18.0',
	'Accept':	'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
	'Accept-Language':	'ru-RU,ru;q=0.8,en-US;q=0.5,en;q=0.3',
	'Accept-Encoding':	'gzip, deflate',
	'Referer':	'http://192.168.0.1/login_auth.asp',
	'Connection':	'keep-alive',
	'Content-Type':	'application/x-www-form-urlencoded',
	'Content-Length': '144',
	}
	post_data = 'html_response_page=login_fail.asp&login_name=YWRtaW4A&login_pass=ZmFrZXB3ZA%3D%3D&graph_id=f1b37&login_n=admin&log_pass=ZmFrZXB3ZA%3D%3D&graph_code=&login=Login'
	conn = httplib.HTTPConnection('192.168.0.1:80')
	conn.request("POST", "/login.cgi", post_data, headers)
	response = conn.getresponse()

No cookies come back in the response, so apparently the router remembers us by IP address, which actually simplifies everything.
Next, looking at the source of the Internet sessions page, we see that all the data we need is stored in the value attribute of an input element, and there is only one such element on the page. So the parser will be simple: it just takes the value attribute of the input tag. I didn't want to install any third-party libraries, so I deliberately passed on lxml, knowing full well it would be cleaner and simpler, and used the standard Python HTMLParser module instead. Using it means creating a class inherited from HTMLParser and defining methods on it that the parser calls when it encounters certain content. We need only the handle_data and handle_starttag methods, which are called when the parser finds data and the start of a tag, respectively. The data looks like a string of the form "TCP/7767/EST/OUT/192.168.0.100/52751/64.12.30.3/5190/52751", which is easy to slice into a convenient dict. Getting the page is a plain GET request; if the response page contains the string "function redirect()", we are not authorized on the router.
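Before the full parser, here is that slicing step in isolation, a minimal sketch on the sample string above:

```python
# Cut one raw connection entry into a readable dict by zipping
# the slash-separated values with their field names.
keywords = ['Protocol', 'Time out', 'State', 'Direction', 'Local IP',
            'Local Port', 'Destination IP', 'Destination Port', 'NAT']
raw = 'TCP/7767/EST/OUT/192.168.0.100/52751/64.12.30.3/5190/52751'
entry = dict(zip(keywords, raw.split('/')))

print(entry['Destination IP'])  # 64.12.30.3
print(entry['Direction'])       # OUT
```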

def GetInternetSessionsPage():
	class ConnectionsParser(HTMLParser):
		def __init__(self):
			HTMLParser.__init__(self)
			self.saved_data = []
		def handle_starttag(self, tag, attrs):
			if tag == 'input':
				attr = dict(attrs)
				connections_str = attr['value']
				connections_list = connections_str.split(',')
				for connection_entry in connections_list:
					data_entry_keywords = ['Protocol', 'Time out', 'State', 'Direction', 'Local IP', 'Local Port', 'Destination IP', 'Destination Port', 'NAT']
					data_entry_values = connection_entry.split('/')
					data_entry  = dict(zip(data_entry_keywords, data_entry_values))
					self.saved_data.append(data_entry)
	conn = httplib.HTTPConnection('192.168.0.1:80')
	conn.request("GET", "/internet_sessions.asp")
	response = conn.getresponse()
	if response.status == httplib.OK:
		response_text = response.read()
		if response_text.find('function redirect()') >= 0:
			return None
		parser = ConnectionsParser()
		parser.feed(response_text)
		return parser.saved_data


Now it only remains to turn an IP address into a domain name. Google turns up plenty of reverse-lookup sites. On the first one I find an API section, and learn that the API is paid, though the web interface can be used as much as you like. Fine, we'll do it the ugly way. On this site everything turned out to be simple: the desired IP address is just appended to the URL, and a results page comes back. We connect to the assembled address and parse the result with the same HTMLParser. It did come out a bit clumsy: in the parser's data handler we save the five data fields that follow the line "Resolve Host:", because that is just how the results table on this site is laid out.

def GetHostNameByIP(address):
	class DNSParser(HTMLParser):
		NUM_DATA_TO_SAVE = 5
		def __init__(self):
			HTMLParser.__init__(self)
			self.next_data_save = 0
			self.saved_data = []
		def handle_data(self, data):
			if data.find('Resolve Host:') >= 0:
				self.next_data_save = self.NUM_DATA_TO_SAVE
			if self.next_data_save>0:
				self.saved_data.append(data)
				self.next_data_save -= 1
	conn = httplib.HTTPConnection('domaintz.com:80')
	conn.request("GET", "/tools/overview/"  + address)
	response = conn.getresponse()
	if response.status == httplib.OK:
		parser = DNSParser()
		parser.feed(response.read())
		return parser.saved_data
	else:
		return 'Error ' + str(response.status)
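As an aside: if scraping a third-party site feels too fragile, the standard socket module can do a plain reverse lookup by itself, though without the geolocation info the site adds. A minimal sketch, falling back to the bare IP when there is no PTR record:

```python
import socket

def ReverseLookup(address):
	# Reverse DNS via the standard library; return the address
	# unchanged if the lookup fails.
	try:
		return socket.gethostbyaddr(address)[0]
	except (socket.herror, socket.gaierror):
		return address
```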


Next, we write a simple function that fetches the list of connections; if it fails, it logs in to the router and tries again. It then runs each destination address through reverse DNS. We also bolt on a simple filter so that only connections from a specific local network address are displayed.

def LookupRouterConnections(looking_ip = None):
	sessions = GetInternetSessionsPage()
	if sessions is None:
		LoginRouter()
		sessions = GetInternetSessionsPage()
		if sessions is None:
			print 'Cannot log in to router, check login/password'
			return
	for entry in sessions:
		if 'Destination IP' in entry:
			if looking_ip is not None and entry.get('Local IP') != looking_ip:
				continue
			print '{0} : {1}'.format(entry['Local IP'], GetHostNameByIP(entry['Destination IP']))


We try:

LookupRouterConnections('192.168.0.100')
192.168.0.100 : ['Resolve Host:', '212-36-249-250.rdtc.ru', ' (212.36.249.250)', 'IP Location:', 'Russian Federation, Novokuznetsk, Regional Digital Telecommunication Company (212.36.249.250)']
192.168.0.100 : ['Resolve Host:', 'Debian-60-squeeze-64-minimal', ' (5.9.145.232)', 'IP Location:', 'Germany, RIPE Network Coordination Center (5.9.145.232)']
192.168.0.100 : ['Resolve Host:', 'server15033.teamviewer.com', ' (178.255.155.21)', 'IP Location:', 'Italy, ANEXIA Internetdienstleistungs GmbH (178.255.155.21)']
192.168.0.100 : ['Resolve Host:', 'cm-04.lux.valve.net', ' (146.66.152.15)', 'IP Location:', 'Luxembourg, Valve Corporation (146.66.152.15)']


and so on, for many more records.

Sources used:

http://ru.wikipedia.org/wiki/HTTP
http://www.python.org/doc/
