Celery Basic Tutorial (5): Daemonization
I. Starting in daemon mode
Reference: https://blog.csdn.net/p571912102/article/details/82735052
The directory layout is as follows:
```
.
├── config.py
├── main.py
├── test
│   └── tasks.py
└── test2
    ├── __init__.py
    └── tasks.py
```
Each application gets its own folder.

The async task module in each folder must be named tasks.py.
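For orientation, here is a minimal sketch of what config.py, main.py, and test/tasks.py could look like with this layout. The app instance name `celery_app`, the Redis URLs, and the body of the `test` task are assumptions for illustration, not code taken from the original project:

```python
# celery_tasks/config.py -- a minimal sketch; the exact Redis URLs are assumptions
# (the redis-cli session below uses db 15 as the result backend)
broker_url = 'redis://127.0.0.1:6379/14'
result_backend = 'redis://127.0.0.1:6379/15'


# celery_tasks/main.py -- creates the app and autodiscovers a tasks.py in each package
from celery import Celery

celery_app = Celery('celery_tasks')
celery_app.config_from_object('celery_tasks.config')
celery_app.autodiscover_tasks(['celery_tasks.test', 'celery_tasks.test2'])


# celery_tasks/test/tasks.py -- a hypothetical task named "test"
from celery_tasks.main import celery_app

@celery_app.task
def test():
    print('test task executed')
    return 'test'
```

Note that autodiscover_tasks looks for a module named tasks in each listed package by default, which is why the file name tasks.py matters here.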
Import the task wherever you need to call it:
```python
from celery_tasks.test.tasks import test
from celery_tasks.test2.tasks import test as test2

test_id = test.delay()
test2_id = test2.delay()
print(test_id)
print(test2_id)
```
Calling an async task with delay() immediately returns a task id; the execution result can later be looked up in the result_backend by that id.
```
/home/python/.virtualenvs/kol_site_py3/bin/python /home/python/projects/supervisor/supervisor/celery_tasks/test.py
a6e13745-c05b-496d-bbbe-2b636f84009c
d92d50b4-0ba1-4b05-9e96-eeb92a854929

Process finished with exit code
```
```
127.0.0.1:6379[15]> keys *
 1) "celery-task-meta-2a9c0a4b-5b40-4121-9986-a8430fc6b235"
 2) "celery-task-meta-0f16e227-393f-48ea-b41b-3419df84528e"
 3) "celery-task-meta-fbf31a20-6eee-4298-8a91-214d2e5c9399"
 4) "celery-task-meta-61f012c0-bde1-4344-9e1c-b5e8a7b93902"
 5) "celery-task-meta-074a659f-d76f-4818-8516-f098d1b900ed"
 6) "celery-task-meta-8a89c4db-f2e2-484b-94ee-e1af9911c69f"
 7) "celery-task-meta-0012966d-e8fd-483b-b8ac-d160d65c8221"
 8) "celery-task-meta-f97a452d-3812-4950-bfd9-02ff9e69a4b2"
 9) "celery-task-meta-4bebe710-7725-43f5-b0f7-9a35b57ba3b1"
10) "celery-task-meta-4b1cca23-31c3-4c82-a99f-bbe306846191"
11) "celery-task-meta-4cdf3a68-7df4-4bdf-8f54-abe6be83df3a"
12) "celery-task-meta-d92d50b4-0ba1-4b05-9e96-eeb92a854929"
13) "celery-task-meta-17265693-ba36-4f6c-80c8-d89a52f549f7"
14) "celery-task-meta-d62bbf16-6469-40a7-bc25-61b553014d76"
15) "celery-task-meta-4cca0f47-2f2d-45e6-8341-52264e50d969"
16) "celery-task-meta-1fd1e52a-00e1-486a-a224-36bd0fbb5d4a"
17) "celery-task-meta-af3b9536-91a6-4ae3-ab9b-59755bfb4883"
18) "celery-task-meta-b5710e2a-1905-44fd-8b11-4d7057113291"
19) "celery-task-meta-bebeb902-cce1-4edb-bdac-734ed6dc16ae"
20) "celery-task-meta-2771b961-694f-4727-9b19-07928834475e"
21) "celery-task-meta-8c683476-5cec-4933-8370-73793d656e23"
22) "celery-task-meta-6c8e6763-a416-4c02-9689-a0bb38bf26a6"
23) "celery-task-meta-7a4edb71-b13b-4f0f-b882-408716bb3ba9"
24) "celery-task-meta-4e368ca3-f686-4215-aed7-f0c6463cfac9"
25) "celery-task-meta-757f196d-c377-4f38-982d-700fa4f45c6b"
26) "celery-task-meta-094ea32e-5cf8-41c5-bf63-fb629e0e1e67"
27) "celery-task-meta-2e1f2188-0806-41f1-8eb8-4a0f73ec2aca"
28) "celery-task-meta-fd7e8fea-c738-4d49-b13d-c5d782eeaa96"
29) "celery-task-meta-e476f036-7192-4687-b9b7-c6a06556b4c3"
30) "celery-task-meta-2463c15f-5903-4381-8646-1b2aa6418ca0"
31) "celery-task-meta-a6e13745-c05b-496d-bbbe-2b636f84009c"
32) "celery-task-meta-f4f2d940-3e16-4d78-a0c4-3766eb91c908"
33) "celery-task-meta-5a1eaba8-0675-4e82-aedc-fee801ff31ef"
127.0.0.1:6379[15]>
```
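To read one of those results back programmatically instead of querying Redis directly, you can wrap the task id in an AsyncResult. This is a minimal sketch; `celery_app` is the assumed name of the app instance defined in main.py:

```python
# a minimal sketch of fetching a stored result by task id
# (the id is taken from the example output above)
from celery_tasks.main import celery_app

result = celery_app.AsyncResult('a6e13745-c05b-496d-bbbe-2b636f84009c')
print(result.status)   # PENDING / STARTED / SUCCESS / FAILURE ...
print(result.result)   # the task's return value once it has finished
```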
How to start Celery:
```bash
# Finally, run the main module from a terminal:
celery -A <application package name> worker -l info

# For our current project, run this from the backend project root:
celery -A celery_tasks.main worker -l info

# Daemon mode:
celery multi start w1 -A celery_tasks.main -l info --logfile=./celerylog.log

# To stop or restart, replace "start" with "stop" / "restart" respectively
```
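Spelled out, the stop and restart variants look like this (a sketch reusing the same node name; celery multi also supports stopwait, which waits for running tasks to finish before shutting the worker down):

```bash
# stop the daemonized worker node w1
celery multi stop w1 -A celery_tasks.main --logfile=./celerylog.log

# or wait for currently executing tasks to complete before stopping
celery multi stopwait w1 -A celery_tasks.main --logfile=./celerylog.log

# restart node w1
celery multi restart w1 -A celery_tasks.main -l info --logfile=./celerylog.log
```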
II. Running Celery as a system daemon
1. Controlling Celery with systemd
Usage: systemctl {start|stop|restart|status} celery.service
Configuration file: /etc/celery/celery.conf
Celery service file: /etc/systemd/system/celery.service
Celery beat service file: /etc/systemd/system/celerybeat.service
Service file: /etc/systemd/system/celery.service
```ini
[Unit]
Description=Celery Service
After=network.target

[Service]
Type=forking
User=celery
Group=celery
EnvironmentFile=/etc/celery/celery.conf
WorkingDirectory=/app/celery
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
    -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
    --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES} \
    --pidfile=${CELERYD_PID_FILE}'
ExecReload=/bin/sh -c '${CELERY_BIN} multi restart ${CELERYD_NODES} \
    -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
    --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'

[Install]
WantedBy=multi-user.target
```
Application configuration file: /etc/celery/celery.conf
```bash
# Name of nodes to start
# here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/python3.6.5/bin/celery"

# App instance to use
# comment out this line if you don't use an app
CELERY_APP="tasks"
# or fully qualified:
#CELERY_APP="proj.tasks:app"

# How to call manage.py
CELERYD_MULTI="multi"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# - %n will be replaced with the first part of the nodename.
# - %I will be replaced with the current child process index
#   and is important when using the prefork pool to avoid race conditions.
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_LOG_LEVEL="INFO"

# you may wish to add these options for Celery Beat
CELERYBEAT_PID_FILE="/var/run/celery/beat.pid"
CELERYBEAT_LOG_FILE="/var/log/celery/beat.log"
```
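The unit file runs the workers as the celery user and writes pid and log files under /var/run/celery and /var/log/celery, so those must exist and be writable before the service starts. A possible preparation sketch (assuming the celery user does not exist yet):

```bash
# create an unprivileged user/group for the workers (skip if it already exists)
sudo groupadd celery
sudo useradd -g celery celery

# create the pid and log directories referenced in /etc/celery/celery.conf
sudo mkdir -p /var/run/celery /var/log/celery
sudo chown celery:celery /var/run/celery /var/log/celery
# note: /var/run is usually a tmpfs, so on many distros you may prefer a
# tmpfiles.d entry (or RuntimeDirectory=) to recreate the pid directory on boot
```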
Celery Beat service file: celerybeat.service
This is an example systemd unit file for Celery Beat:
/etc/systemd/system/celerybeat.service
```ini
[Unit]
Description=Celery Beat Service
After=network.target

[Service]
Type=simple
User=celery
Group=celery
EnvironmentFile=/etc/celery/celery.conf
WorkingDirectory=/opt/celery
ExecStart=/bin/sh -c '${CELERY_BIN} beat \
    -A ${CELERY_APP} --pidfile=${CELERYBEAT_PID_FILE} \
    --logfile=${CELERYBEAT_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL}'

[Install]
WantedBy=multi-user.target
```
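Once both unit files and /etc/celery/celery.conf are in place, the services are managed with the usual systemd commands, for example:

```bash
# pick up the new unit files
sudo systemctl daemon-reload

# start on boot and start now
sudo systemctl enable celery.service celerybeat.service
sudo systemctl start celery.service celerybeat.service

# check state and follow the logs
sudo systemctl status celery.service
sudo journalctl -u celery.service -f
```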
This article is from cnblogs (博客园), author: 秋华. Please cite the original link when reposting: https://www.cnblogs.com/qiu-hua/p/12706534.html