I recently noticed some junk spiders hammering my server. They are quite obnoxious: they waste a large amount of server resources, crawl nonstop with no regard for server performance, and completely ignore the robots protocol.
Fortunately, they can be blocked with URL-rewrite (pseudo-static) rules. The methods are described below.
Windows Server 2008 with IIS 7 / IIS 7.5: add the rule to Web.config (if the URL Rewrite module is not installed, install it first):
<rule name="Block spider">
  <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
  <conditions>
    <add input="{HTTP_USER_AGENT}" pattern="Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot" ignoreCase="true" />
  </conditions>
  <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Forbidden" />
</rule>
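For context, a `<rule>` on its own does nothing: the URL Rewrite module reads it from the `<rewrite>/<rules>` section of Web.config. A minimal sketch of where the rule sits (site-level file, pattern abridged to a few names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Requests whose URL is NOT robots.txt (negate="true")
             and whose user agent matches the pattern get a 403. -->
        <rule name="Block spider">
          <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}"
                 pattern="Webdup|AcoonBot|AhrefsBot|MJ12bot"
                 ignoreCase="true" />
          </conditions>
          <action type="CustomResponse" statusCode="403"
                  statusReason="Forbidden" statusDescription="Forbidden" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```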
Linux with Apache: add the following code to the rules file .htaccess (if it doesn't exist, create the .htaccess file manually in the site root):
<IfModule mod_rewrite.c>
RewriteEngine On
#Block spider
RewriteCond %{HTTP_USER_AGENT} "Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
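The logic of the rule above is: if the user agent matches any name in the list (case-insensitively, per `[NC]`) and the URL is not robots.txt, return 403 (`[F]`). A small Python sketch of that decision, with the bot list abridged to a few names:

```python
import re

# Abridged alternation pattern from the .htaccess rule above.
BAD_BOTS = re.compile(
    r"Webdup|AcoonBot|AhrefsBot|Ezooms|MJ12bot|curl|Wget",
    re.IGNORECASE,  # mirrors the [NC] (no-case) flag
)

def is_blocked(user_agent: str, path: str) -> bool:
    """Return True if the request would get a 403 under the rule above."""
    # The RewriteRule negates robots.txt, so even blocked bots
    # may still read the robots file.
    if path == "/robots.txt":
        return False
    return bool(BAD_BOTS.search(user_agent or ""))

print(is_blocked("Mozilla/5.0 (compatible; MJ12bot/v1.4.8)", "/index.html"))   # True
print(is_blocked("Mozilla/5.0 (compatible; MJ12bot/v1.4.8)", "/robots.txt"))   # False
print(is_blocked("Mozilla/5.0 (Windows NT 10.0) Firefox/120.0", "/index.html"))  # False
```

After deploying the real rule, you can verify it from the command line by spoofing a blocked user agent, e.g. `curl -A "MJ12bot" -I https://your-site/`, which should come back with HTTP 403.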
Windows 2003 with IIS 6.0: add the following code to the rules file httpd.conf or httpd.ini (enable custom rewriting with ISAPI_Rewrite 3.1, or its free edition, via "ISAPI filter custom settings" in the server or virtual-host control panel):
#Block spider
RewriteCond %{HTTP_USER_AGENT} (Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu) [NC]
RewriteRule !(^/robots.txt$) - [F]
Note: the rules above target obscure or unwanted spiders by default. To block other spiders, add or substitute their names following the same pattern; the user-agent names of the major search-engine spiders are easy to find with a Baidu search.
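When extending the list, it helps to check the combined pattern against a sample user-agent string before deploying it. A quick sketch (the bot list is abridged, and "SomeNewBot" is a placeholder name standing in for whatever spider you want to add):

```python
import re

# Abridged base list from the rules above.
bad_bots = ["Webdup", "AhrefsBot", "MJ12bot", "Ezooms"]
bad_bots.append("SomeNewBot")  # add a new spider following the same pattern

# Build the same kind of alternation the rewrite rules use;
# re.escape guards against regex metacharacters in bot names.
pattern = re.compile("|".join(map(re.escape, bad_bots)), re.IGNORECASE)

print(bool(pattern.search("Mozilla/5.0 (compatible; SomeNewBot/1.0)")))  # True
print(bool(pattern.search("Mozilla/5.0 (compatible; Googlebot/2.1)")))   # False
```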