.NET site configuration to block access from unwanted spiders (crawlers)

1  <rule name="Block spider">
2           <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
3           <conditions>
4             <add input="{HTTP_USER_AGENT}" pattern="Apache-HttpClient|SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms| EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot| Jorgee|SWEBot|spbot|TurnitinBot-Agent|curl|perl|Python|Wget|Xenu|ZmEu|facebookexternalhit"   ignoreCase="true" />
5           </conditions>
6           <action type="AbortRequest" />
7       </rule>

Noting this down for reference; a quick test confirms it works.
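For context, this rule belongs inside the <rewrite>/<rules> section of web.config and requires the IIS URL Rewrite module to be installed on the server. The sketch below shows that placement with an abbreviated User-Agent pattern; the surrounding elements are standard web.config structure, but the exact layout of your own configuration file may differ.

<!-- Minimal web.config sketch showing where the blocking rule is placed.
     Assumes the IIS URL Rewrite module is installed; the bot list is abbreviated here. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block spider">
          <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="SemrushBot|MJ12bot|AhrefsBot" ignoreCase="true" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

One way to verify: request the site with a spoofed User-Agent, e.g. curl -A "SemrushBot" http://your-site/ should have its connection dropped, while a normal browser User-Agent still receives the page.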
