
Warning in the crawl log: the file reached the maximum download limit ("The file reached the maximum download limit. Check that the full text of the document can be meaningfully crawled.")

Virus-BeautyCode

I found this on an overseas site (original article: Maximum File Size for Crawling). The steps to change the limit are as follows:

   Maximum File Size for Crawling

By default, Search Services can crawl and filter a file of up to 16 megabytes (MB); only the first 16 MB of a file is ever crawled. Once this limit is reached, SharePoint Portal Server writes a warning to the gatherer log: "The file reached the maximum download limit. Check that the full text of the document can be meaningfully crawled."

 

To raise the 16 MB limit, you must add a new registry entry named MaxDownloadSize. To do this, follow these steps:

 

1. Start Registry Editor (Regedit.exe).

2. Locate the following key in the registry:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager

3. On the Edit menu, point to New, click DWORD Value, and name the new value MaxDownloadSize.

4. Double-click MaxDownloadSize, select Decimal as the base, and type the maximum size (in MB) of files that the gatherer should download.

5. Restart the server.

6. Start Full Crawl.
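If you prefer to script the change rather than use Regedit, the same registry value can be set programmatically. Below is a minimal sketch in Python using the standard winreg module; the 64 MB value is only an example, and the script assumes it runs on the index server with administrative rights.

    import winreg

    # Minimal sketch (example values): set the MaxDownloadSize DWORD under the
    # MOSS 2007 Gathering Manager key. Run on the index server as an administrator.
    KEY_PATH = r"SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager"
    MAX_DOWNLOAD_SIZE_MB = 64  # example only; pick the largest file size you need crawled

    # Open the existing key with write access and create/overwrite the DWORD value.
    # (On 64-bit Windows with a 32-bit Python you may also need winreg.KEY_WOW64_64KEY.)
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE | winreg.KEY_QUERY_VALUE) as key:
        winreg.SetValueEx(key, "MaxDownloadSize", 0, winreg.REG_DWORD, MAX_DOWNLOAD_SIZE_MB)
        value, _ = winreg.QueryValueEx(key, "MaxDownloadSize")
        print(f"MaxDownloadSize is now {value} MB")

After the script runs you still need to restart the server and start a full crawl, as in steps 5 and 6 above, for the new limit to take effect.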

 

NOTE: Increasing the file size limit may cause timeout exceptions, because the crawler can time out if a file takes too long to crawl and index due to its size. To increase the timeout value, follow these steps:

 

1. In Central Administration, on the Application Management tab, in the Search section, click Manage search service.

2. On the Manage Search Service page, in the Farm-Level Search Settings section, click Farm-level search settings.

3. In the Timeout Settings section, increase the Connection time and Request acknowledgement time values.
