Parallel Tokenization with spaCy

When using spaCy, it felt much slower than NLTK, so I kept looking for a way to parallelize it, and fortunately there is one. Below is an example of how to use spaCy's parallel tokenization via nlp.pipe:

import spacy

nlp = spacy.load("en")

docs = [
    "Our dream was to bring to Shanghai a tribute event dedicated to China which tells our history and visio.",
    "It was not simply a fashion show, but something that we created especially with love and passion for China and all the people around the world who loves Dolce & Gabbana"
]

# nlp.pipe streams the texts in batches; n_threads was the spaCy 2.0 way
# to process them with multiple threads
for doc in nlp.pipe(docs, batch_size=100, n_threads=3):
    print(list(doc))
    print("*" * 50)
