Storing Scraped Data: JSON, CSV, MySQL, MongoDB

Data collected by a crawler needs to be stored properly.

Storing as JSON

import json

data = [{'name': '张三', 'age': 25}]
# ensure_ascii=False keeps non-ASCII text (e.g. Chinese) readable in the output file
with open('data.json', 'w', encoding='utf-8') as f:
    json.dump(data, f, ensure_ascii=False, indent=2)
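
The title also promises CSV. A minimal sketch with the standard-library csv module (the data.csv filename and the field names are illustrative choices, not from the original):

```python
import csv

data = [{'name': '张三', 'age': 25}, {'name': '李四', 'age': 30}]

# newline='' avoids blank rows on Windows; utf-8-sig helps Excel detect the encoding
with open('data.csv', 'w', newline='', encoding='utf-8-sig') as f:
    writer = csv.DictWriter(f, fieldnames=['name', 'age'])
    writer.writeheader()    # first row: column names
    writer.writerows(data)  # one row per dict
```

DictWriter maps each dict to a row by key, so new fields only require updating fieldnames.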

Storing in MySQL

import pymysql

conn = pymysql.connect(host='localhost', user='root', password='', database='test')
try:
    with conn.cursor() as cursor:
        # Use a parameterized query instead of string literals to avoid SQL injection
        cursor.execute("INSERT INTO users (name) VALUES (%s)", ('test',))
    conn.commit()
finally:
    conn.close()
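
The pymysql snippet needs a running MySQL server. The same parameterized-insert pattern can be tried locally with the standard-library sqlite3 module (table and column names here are illustrative; note sqlite3 uses ? placeholders where pymysql uses %s):

```python
import sqlite3

conn = sqlite3.connect(':memory:')  # in-memory database, handy for quick local tests
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")

rows = [('张三', 25), ('李四', 30)]
# executemany runs one parameterized statement for every tuple in the list
conn.executemany("INSERT INTO users (name, age) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 2
conn.close()
```

Batch inserts via executemany are also the usual approach for scraped data in MySQL, since one statement per page of results is much faster than one per record.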

Choosing the right storage backend makes data management far more efficient!
