A little dive inside a hacked site

    It's no secret that most sites these days are not hacked manually. There is a large army of bots that look for vulnerabilities in site scripts, brute-force CMS admin panels and FTP/SSH accounts, then upload small loader scripts or backdoors, use them to inject dozens of managed "agents" into the site's scripts, and also scatter web shells, spam mailers and other malicious PHP (and sometimes Perl) scripts across randomly chosen writable directories. From the inside, an infected site looks something like this (a fragment of an AI-BOLIT scanner report):

    The infection pattern (the number, composition and purpose of malicious scripts) varies. In this case, the infection statistics are as follows:

    • 41 backdoor inserts
    • 5 WSO web shells
    • 4 scripts that inject malicious code into .php files
    • 7 spam mailers based on PHP's mail() function
    • 2 spam mailers working through SMTP
    • 1 standalone backdoor
    • 1 script that injects malicious code into WordPress/Joomla scripts

    Among this "malware" there are all sorts of interesting specimens. But today is not about them. It is more interesting to analyze not the static malicious code in the files, but the process of working with the "malware" in dynamics: what requests the command centers send to the embedded backdoors, in what format, with what intensity, with what parameters, and so on. Besides, static analysis works poorly against modern malware, because some scripts contain no payload at all: it arrives only at the moment of the request.

    Take a popular backdoor:

    It consists of one or two lines and is used to receive and execute arbitrary PHP code. The payload "flies in" with the parameters of a POST request and is executed immediately. Naturally, the payload code is not saved anywhere, so dynamic analysis usually runs into a lack of data: there are no logs containing the request body with the code under investigation.
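    Since the original PHP one-liner is not reproduced here, the following Python sketch illustrates the scheme from the analyst's side. The parameter name "code" and the payload are hypothetical, but the base64-over-POST pattern is the one described above:

```python
import base64
from urllib.parse import urlencode, parse_qs

# Hypothetical payload the command center wants executed on the site
# (the parameter name "code" is illustrative, not taken from a real C2).
php_payload = b'echo md5("ping");'

# The C2 wraps the code in base64 and sends it as an ordinary POST body.
post_body = urlencode({"code": base64.b64encode(php_payload).decode()})

# On the server, a one-line backdoor does the reverse: pulls the parameter
# out of $_POST, decodes it and feeds it to eval(). Reproducing the decode
# step here shows why nothing is ever written to disk.
recovered = base64.b64decode(parse_qs(post_body)["code"][0])
assert recovered == php_payload
```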

    To analyze the communication of a command center with its "agents", you need to log HTTP requests to the malicious script, and for that you need to configure logging of POST request bodies to a file in advance, which is practically never done on shared hosting. On dedicated servers, POST data is not logged by default either: it saves server resources and there is normally no need for it. But that is not all. The second problem with analyzing the hack and infection of a site is how late the owner of the infected site turns to specialists.
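    On Apache, request bodies can be captured with modules such as mod_dumpio or mod_security. For a stack fronted by a Python application, a tiny piece of middleware is enough; the sketch below (class name and log path are illustrative, and it assumes a WSGI app rather than plain PHP) shows the kind of logging the article says is missing by default:

```python
import io

class PostBodyLogger:
    """Minimal WSGI middleware that copies every POST body to a log file.

    A sketch of the logging described above; path and name are illustrative.
    """
    def __init__(self, app, log_path="/var/log/post-bodies.log"):
        self.app = app
        self.log_path = log_path

    def __call__(self, environ, start_response):
        if environ.get("REQUEST_METHOD") == "POST":
            length = int(environ.get("CONTENT_LENGTH") or 0)
            body = environ["wsgi.input"].read(length)
            with open(self.log_path, "ab") as log:
                log.write(b"%s %s\n%s\n\n" % (
                    environ["REQUEST_METHOD"].encode(),
                    environ.get("PATH_INFO", "").encode(),
                    body))
            # Replace the consumed stream so the wrapped app can re-read it.
            environ["wsgi.input"] = io.BytesIO(body)
        return self.app(environ, start_response)
```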
    Almost always, "patients" get in touch 2-3 weeks after the first malicious script appears, that is, when the uploaded code has already become entrenched, has "lain low" and is being actively exploited by the attackers, and the hosting provider has blocked the site for spam mailings, phishing pages or attacks on third-party resources. Incidentally, the code "lies low" deliberately, to cover the traces of the hack and not immediately arouse the site owner's suspicion. After two weeks, log rotation does its dirty work, erasing the information about how the malicious code was uploaded to the site, and the implanted malware starts delivering its payload: attacking other resources, uploading doorway pages to the site, injecting redirect code, sending tons of spam emails with phishing content, and so on.

    But from time to time it is still possible to configure logging on an infected site and capture the internals of the requests to the malicious scripts. So what is hidden from prying eyes?

    Quite typical for such an infection is the injection of a redirect into the root .htaccess, leading to a pharma affiliate (selling Viagra and the like), a wap-click affiliate (subscribing visitors to paid media content via SMS) or a malicious resource (carrying out drive-by attacks or pushing a trojan under the guise of a Flash Player or antivirus update).

    The redirect in this case is implemented as follows: PHP code wrapped in base64 is transmitted in a POST request, executed via the backdoor on the hacked site, and registers its own 404 error handler that redirects visitors to the attacker's site. A single missing image, script or .css file on any page of the site is enough for the redirect to fire for a visitor. The domains that visitors are redirected to change periodically.
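    One way to catch this variant of the redirect during cleanup is to check .htaccess for an ErrorDocument 404 directive pointing at an external URL. The checker below is a hypothetical sketch, not part of the toolkit described above:

```python
import re

# Flags "ErrorDocument 404 http://..." lines: a 404 handler on a foreign
# domain is a strong sign of the redirect scheme described above.
SUSPICIOUS_404 = re.compile(
    r"^\s*ErrorDocument\s+404\s+(https?://\S+)",
    re.IGNORECASE | re.MULTILINE)

def find_404_redirects(htaccess_text):
    """Return the external URLs that the 404 handler redirects to."""
    return SUSPICIOUS_404.findall(htaccess_text)
```

    A clean result does not rule out the redirect: the 404 handler can also be registered from PHP, so the site scripts have to be checked as well.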

    Another example is a log of requests to the embedded backdoors and uploaded spam-mailing scripts:

    Here, too, all data is transmitted base64-encoded via POST and COOKIE variables. Moreover, the executable fragments are double-wrapped in base64 in order to bypass WAFs and web scanners that know about base64 and can decode it. Decoded, the request to the backdoor looks like this:
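    Unwrapping such requests is mechanical. A helper like the following (a sketch, with an arbitrary layer limit) peels nested base64 until the data stops decoding cleanly:

```python
import base64
import binascii

def peel_base64(data, max_layers=5):
    """Strip nested base64 layers; stop when decoding fails validation."""
    layers = 0
    while layers < max_layers:
        try:
            decoded = base64.b64decode(data, validate=True)
        except (binascii.Error, ValueError):
            break
        data, layers = decoded, layers + 1
    return data, layers

# A double-wrapped fragment, like the ones seen in the logged requests.
wrapped = base64.b64encode(base64.b64encode(b'eval($_POST["x"]);'))
payload, depth = peel_base64(wrapped)
```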

    The payload performs directory traversal and code injection: it searches the accessible directories for WordPress files and does one of two things, either injecting malicious content into them or restoring the original file contents (this is how the command center deploys and removes malware on demand or on a schedule). To make the modified scripts harder to find, the modification date (mtime) of each file is set to match that of one of the original WordPress scripts. In addition, the files are made read-only so that inexperienced webmasters cannot edit them (and many are genuinely puzzled by this).
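    The mtime trick has a weak spot: on Linux, utime()/touch can rewrite mtime, but the very same call updates the inode change time (ctime) to "now". A file whose ctime is far newer than its mtime is therefore worth a look. A hypothetical heuristic, with an arbitrary threshold:

```python
import os
import tempfile
import time

def mtime_looks_forged(path, tolerance=3600):
    """True if ctime is much newer than mtime, hinting at a rolled-back
    modification date. The tolerance is an illustrative threshold: normal
    operations (chmod, rename) also bump ctime, so expect false positives."""
    st = os.stat(path)
    return st.st_ctime - st.st_mtime > tolerance

# Simulate the attacker's trick: write a file, then backdate its mtime.
fd, path = tempfile.mkstemp(suffix=".php")
os.write(fd, b"<?php /* injected */")
os.close(fd)
week_ago = time.time() - 7 * 24 * 3600
os.utime(path, (week_ago, week_ago))  # rolls back atime/mtime, not ctime
```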

    As for the other "payload", spam mailing, the content is likewise double-wrapped in base64 and passed in the POST parameters of the request to the spam mailer. And from time to time test messages with service information are sent:

    An interesting observation: if you remove all the malicious scripts from the site, then after a few unsuccessful requests the communication with the "agents" stops. That is, the command center does not immediately try to re-hack the site and upload new backdoors, apparently for the same reason: to hide the process of the initial backdoor upload. But if even one backdoor is left behind during cleanup, the whole "bundle" of hacker shells, backdoors and spam mailers will be downloaded through it all over again.
    Therefore, the site must always be cleaned very thoroughly.
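    A quick way to double-check the result of a cleanup is to grep the document root for the constructs this family of backdoors relies on. The signature list below is a small illustrative subset; real scanners such as AI-BOLIT use far more patterns:

```python
import os
import re

# A few constructs typical of PHP backdoors (illustrative, not exhaustive).
SIGNATURES = re.compile(
    rb"eval\s*\(\s*base64_decode"
    rb"|eval\s*\(\s*\$_(?:POST|GET|REQUEST)"
    rb"|assert\s*\(\s*\$_(?:POST|GET|REQUEST)")

def scan_tree(root):
    """Yield the paths of PHP files containing a suspicious construct."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith((".php", ".phtml")):
                path = os.path.join(dirpath, name)
                with open(path, "rb") as fh:
                    if SIGNATURES.search(fh.read()):
                        yield path
```

    Anything this finds deserves manual review, while a clean run proves nothing by itself: even light obfuscation defeats naive signatures.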
