celery 5.3.6 error: ValueError: not enough values to unpack (expected 3, got 0)
Error when running the Celery task script
Running python run_task.py fails. The Celery worker log and the client script error are shown below.
# celery -A tasks worker --loglevel=INFO

 -------------- celery@DESKTOP-BQAR0JR v5.3.6 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.19045-SP0 2023-12-06 11:01:05
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x1a7532b6340
- ** ---------- .> transport:   redis://192.168.1.105:6379/0
- ** ---------- .> results:     redis://192.168.1.105/0
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . tasks.add

[2023-12-06 11:01:05,903: WARNING/MainProcess] c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine whether broker connection retries are made during startup in Celery 6.0 and above. If you wish to retain the existing behavior for retrying connections on startup, you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-12-06 11:01:06,037: INFO/MainProcess] Connected to redis://192.168.1.105:6379/0
[2023-12-06 11:01:06,039: WARNING/MainProcess] c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning
[2023-12-06 11:01:06,864: INFO/SpawnPoolWorker-7] child process 12484 calling self.run()
[2023-12-06 11:01:06,867: INFO/SpawnPoolWorker-5] child process 21516 calling self.run()
[2023-12-06 11:01:07,680: INFO/MainProcess] mingle: all alone
[2023-12-06 11:01:08,304: INFO/MainProcess] celery@DESKTOP-BQAR0JR ready.
[2023-12-06 11:01:44,627: INFO/MainProcess] Task tasks.add[2e88f4bd-aebb-4f80-a0dc-c34ea27a4a22] received
[2023-12-06 11:01:44,806: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
billiard.einfo.RemoteTraceback:
"""
Traceback (most recent call last):
  File "c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\billiard\pool.py", line 361, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\celery\app\trace.py", line 664, in fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
"""

worker: Hitting Ctrl+C again will terminate all running tasks!
The client script additionally reports: AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'
The script that was executed:
# coding=utf-8
from tasks import add

result = add.delay(4, 4)
print('Is task ready: %s' % result.ready())
run_result = result.get(timeout=1)
print('Task Result: %s' % run_result)
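The post does not include tasks.py itself; the sketch below is an assumed reconstruction based on the worker banner above (app named tasks, transport and results both pointing at Redis on 192.168.1.105). It also shows why the backend= argument matters: result.ready() and result.get() in the client script only work when the app that defines the task has a result backend configured, otherwise the client raises the DisabledBackend AttributeError quoted above.

# tasks.py -- assumed reconstruction, URLs taken from the worker banner
from celery import Celery

app = Celery(
    'tasks',
    broker='redis://192.168.1.105:6379/0',   # the "transport" line in the banner
    backend='redis://192.168.1.105:6379/0',  # the "results" line; without a backend,
                                             # result.get()/result.ready() fail with DisabledBackend
)

@app.task
def add(x, y):
    return x + y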
Solution
This problem shows up when running Celery 4.x/5.x on Windows 10, because Celery's default prefork worker pool does not work reliably on Windows. Installing eventlet and starting the worker with the eventlet pool resolves it.
# Install the dependency
pip install eventlet

# Start the worker with the eventlet pool
celery -A <mymodule> worker -l info -P eventlet
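As an illustration, with the module name used in this post (tasks) and the same log level as the earlier worker invocation, the command would be:

celery -A tasks worker --loglevel=INFO -P eventlet

The -P/--pool option swaps the default prefork execution pool for eventlet's green-thread pool, which avoids the Windows-specific unpacking error in fast_trace_task.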