Want to Grow Your Vocabulary? --- How to Scrape Shanbay Words
A one-article guide to scraping the Shanbay word list
Contents
1. Page Analysis
2. Code Implementation
3. Results
I happened to open Shanbay's "Must-Know Python Vocabulary" word list. Since the page was already open, why not try scraping it!
Shanbay Python must-know vocabulary URL: https://www.shanbay.com/wordlist/110521/232414/
1. Page Analysis
Once we open the site, past experience with scraping tells us this page is particularly easy to crawl.
A quick look shows that we only need the word and its meaning. First, inspect the page source: each English word sits inside a <strong> tag, and its Chinese meaning sits inside a <td class="span10"> cell.
Next, let's parse each of them out.
With the analysis done, we can implement it in code:
etree_obj = etree.HTML(html)
word_list = etree_obj.xpath('//strong/text()')                  # English words
explain_list = etree_obj.xpath('//td[@class="span10"]/text()')  # Chinese meanings
item_zip = zip(word_list, explain_list)
for item in item_zip:
    items.append(item)
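To make the two XPath expressions concrete, here is a minimal, self-contained sketch run against a hand-written HTML fragment that mimics the assumed layout of the word list (the fragment and the sample words are illustrations, not the real page source):

from lxml import etree

# Hand-written fragment imitating the assumed table layout:
# English word in <strong>, Chinese meaning in <td class="span10">.
sample_html = """
<table>
  <tr>
    <td class="span2"><strong>abandon</strong></td>
    <td class="span10">v. 放弃, 抛弃</td>
  </tr>
  <tr>
    <td class="span2"><strong>variable</strong></td>
    <td class="span10">n. 变量</td>
  </tr>
</table>
"""

etree_obj = etree.HTML(sample_html)
words = etree_obj.xpath('//strong/text()')                  # ['abandon', 'variable']
meanings = etree_obj.xpath('//td[@class="span10"]/text()')  # ['v. 放弃, 抛弃', 'n. 变量']
print(list(zip(words, meanings)))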
With the content analysis done, the next step is pagination. Since this word list only has three pages, the simplest approach is to build the URLs directly:
base_url = "https://www.shanbay.com/wordlist/110521/232414/?page={}"
for i in range(1, 4):
    url = base_url.format(i)
    print(url)
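Hardcoding three pages is fine here, but if the list ever grows, a slightly more defensive loop can stop on its own. The following is only a sketch under the assumption that a page returning no <strong> matches marks the end of the list:

import requests
from lxml import etree

base_url = "https://www.shanbay.com/wordlist/110521/232414/?page={}"
headers = {"User-Agent": "Mozilla/5.0"}

page = 1
while True:
    html = requests.get(base_url.format(page), headers=headers).content.decode("utf-8")
    words = etree.HTML(html).xpath('//strong/text()')
    if not words:  # assumption: an empty page means we have passed the last page
        break
    print("page", page, "->", len(words), "words")
    page += 1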
2. Code Implementation
# encoding: utf-8
'''
@author 李运辰
@create 2020-11-08
@software: Pycharm
@file: 作业:爬扇贝Python必背词汇.py
@Version: 1.0
'''
import csv
import requests
from lxml import etree

"""
https://www.shanbay.com/wordlist/110521/232414/?page=1
https://www.shanbay.com/wordlist/110521/232414/?page=2
https://www.shanbay.com/wordlist/110521/232414/?page=3

//strong                  # English word
//td[@class="span10"]     # Chinese meaning
"""

base_url = "https://www.shanbay.com/wordlist/110521/232414/?page={}"

headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.110 Safari/537.36',
}

items = []


def parse_url(url):
    """Request the URL and return the decoded response body."""
    response = requests.get(url=url, headers=headers)
    return response.content.decode("utf-8")


def parse_html(html):
    """Parse the HTML with XPath and collect (word, meaning) pairs."""
    etree_obj = etree.HTML(html)
    word_list = etree_obj.xpath('//strong/text()')
    explain_list = etree_obj.xpath('//td[@class="span10"]/text()')
    item_zip = zip(word_list, explain_list)
    for item in item_zip:
        items.append(item)


def save():
    """Save the collected pairs to a CSV file."""
    # newline="" keeps csv.writer from inserting blank rows on Windows
    with open("./shanbei.csv", "a", encoding="utf-8", newline="") as file:
        writer = csv.writer(file)
        for item in items:
            writer.writerow(item)


def start():
    """Start the crawler: fetch all three pages, parse them, then save."""
    for i in range(1, 4):
        url = base_url.format(i)
        html = parse_url(url)
        parse_html(html)
    save()


if __name__ == '__main__':
    start()
3. Results
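Running the script produces shanbei.csv with one (word, meaning) pair per row. A quick way to spot-check the output, as a minimal sketch assuming the file was written by the save() function above:

import csv

# Read back the scraped file to verify the (word, meaning) pairs.
with open("./shanbei.csv", encoding="utf-8", newline="") as file:
    for row in csv.reader(file):
        if row:  # skip blank rows, if any
            print(row[0], "->", row[1])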
That wraps up the article!
Article source: buwenbuhuo.blog.csdn.net/
Only by enduring solitude can you reach the summit.
Gitee: https://gitee.com/lyc96/projects