Using XML/HTTP Data Sources with Solr Data Import
Reference: http://wiki.apache.org/solr/DataImportHandler
DataImportHandler can index data fetched over HTTP via a dataSource, including REST/XML services and RSS/ATOM feeds.
As of version 1.4, URLDataSource is the recommended type.
An example configuration:
- <dataSource name="b" type="HttpDataSource" baseUrl="http://host:port/" encoding="UTF-8" connectionTimeout="5000" readTimeout="10000"/>
- <!-- or in Solr 1.4-->
- <dataSource name="a" type="URLDataSource" baseUrl="http://host:port/" encoding="UTF-8" connectionTimeout="5000" readTimeout="10000"/>
baseUrl (optional): use it when the host/port differs between Dev/QA/Prod environments; this attribute isolates those changes to solrconfig.xml.
encoding (optional): by default the encoding given in the HTTP response header is used; set this attribute to override the server's default encoding.
connectionTimeout (optional): defaults to 5000 ms.
readTimeout (optional): defaults to 10000 ms.
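Because baseUrl keeps environment-specific host/port values in one place, the dataSource definition usually lives next to where the DataImportHandler itself is registered. A minimal sketch of that registration in solrconfig.xml (the handler path /dataimport and the config file name below are the usual conventions, not mandated values):

```xml
<!-- Sketch: registering DataImportHandler in solrconfig.xml.
     "/dataimport" and "data-config.xml" are conventional names;
     adjust them to your setup. -->
<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data-config.xml</str>
  </lst>
</requestHandler>
```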
Below is a sample data-config.xml that indexes the Slashdot RSS feed:
- <dataConfig>
- <dataSource type="HttpDataSource" />
- <document>
- <entity name="slashdot"
- pk="link"
- url="http://rss.slashdot.org/Slashdot/slashdot"
- processor="XPathEntityProcessor"
- forEach="/RDF/channel | /RDF/item"
- transformer="DateFormatTransformer">
- <field column="source" xpath="/RDF/channel/title" commonField="true" />
- <field column="source-link" xpath="/RDF/channel/link" commonField="true" />
- <field column="subject" xpath="/RDF/channel/subject" commonField="true" />
- <field column="title" xpath="/RDF/item/title" />
- <field column="link" xpath="/RDF/item/link" />
- <field column="description" xpath="/RDF/item/description" />
- <field column="creator" xpath="/RDF/item/creator" />
- <field column="item-subject" xpath="/RDF/item/subject" />
- <field column="slash-department" xpath="/RDF/item/department" />
- <field column="slash-section" xpath="/RDF/item/section" />
- <field column="slash-comments" xpath="/RDF/item/comments" />
- <field column="date" xpath="/RDF/item/date" dateTimeFormat="yyyy-MM-dd'T'HH:mm:ss" />
- </entity>
- </document>
- </dataConfig>
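The example above hard-codes an absolute url on the entity. When a named dataSource with a baseUrl is defined (as in the earlier configuration), the entity can reference it by name and give a relative url, which is resolved against baseUrl. A sketch of that wiring; the feed path and field names below are placeholders:

```xml
<dataConfig>
  <dataSource name="a" type="URLDataSource"
              baseUrl="http://host:port/" encoding="UTF-8"/>
  <document>
    <!-- url is relative, so it is resolved against baseUrl;
         "feeds/news.xml" is a placeholder path -->
    <entity name="feed"
            dataSource="a"
            url="feeds/news.xml"
            processor="XPathEntityProcessor"
            forEach="/rss/channel/item">
      <field column="title" xpath="/rss/channel/item/title"/>
      <field column="link"  xpath="/rss/channel/item/link"/>
    </entity>
  </document>
</dataConfig>
```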
The following data-config.xml indexes the pages-articles.xml.bz2 file from http://dumps.wikimedia.org/enwiki/20100312/ . Download the file from that page, uncompress it, and place it under conf/data; uncompressed, the file is about 1.50 GB.
- <dataConfig>
- <dataSource type="FileDataSource" encoding="UTF-8" />
- <document>
- <entity name="page"
- processor="XPathEntityProcessor"
- stream="true"
- forEach="/mediawiki/page"
- url="/data/enwiki-20100312-pages-articles.xml"
- transformer="RegexTransformer,DateFormatTransformer"
- >
- <field column="id" xpath="/mediawiki/page/id" />
- <field column="title" xpath="/mediawiki/page/title" />
- <field column="revision" xpath="/mediawiki/page/revision/id" />
- <field column="user" xpath="/mediawiki/page/revision/contributor/username" />
- <field column="userId" xpath="/mediawiki/page/revision/contributor/id" />
- <field column="text" xpath="/mediawiki/page/revision/text" />
- <field column="timestamp" xpath="/mediawiki/page/revision/timestamp" dateTimeFormat="yyyy-MM-dd'T'HH:mm:ss'Z'" />
- <field column="$skipDoc" regex="^#REDIRECT .*" replaceWith="true" sourceColName="text"/>
- </entity>
- </document>
- </dataConfig>
The corresponding schema.xml fields are:
- <field name="id" type="integer" indexed="true" stored="true" required="true"/>
- <field name="title" type="string" indexed="true" stored="false"/>
- <field name="revision" type="sint" indexed="true" stored="true"/>
- <field name="user" type="string" indexed="true" stored="true"/>
- <field name="userId" type="integer" indexed="true" stored="true"/>
- <field name="text" type="text" indexed="true" stored="false"/>
- <field name="timestamp" type="date" indexed="true" stored="true"/>
- <field name="titleText" type="text" indexed="true" stored="true"/>
- ...
- <uniqueKey>id</uniqueKey>
- <copyField source="title" dest="titleText"/>
Note: the only EntityProcessor that currently supports delta import is SqlEntityProcessor; XPathEntityProcessor does not implement it yet. If you need delta import over XML sources, you would have to implement the corresponding methods of EntityProcessor.java yourself.
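For comparison, delta import with SqlEntityProcessor is driven by the deltaQuery and deltaImportQuery attributes. A sketch of the pattern; the table and column names (item, ID, last_modified) are hypothetical:

```xml
<!-- Sketch of delta import with SqlEntityProcessor.
     deltaQuery finds changed primary keys since the last run;
     deltaImportQuery re-fetches each changed row by key. -->
<entity name="item" pk="ID"
        query="SELECT * FROM item"
        deltaQuery="SELECT ID FROM item
                    WHERE last_modified &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT * FROM item
                          WHERE ID='${dataimporter.delta.ID}'">
</entity>
```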