Using Celery with Flask

Celery is an asynchronous task framework. Some business logic needs to be handled asynchronously: for example, a user uploads a photo of clothing and we return recognition results for it, such as whether it contains a top and a bottom, automatic label/category classification, and anchor-point information.

This processing is time-consuming and needs a GPU, so the upload endpoint returns OK right away and the client polls another endpoint for the result later; the work in between is handed off to Celery.

The Celery task file task.py sits in the same directory as app.py / main.py. The Redis address inside it points to a Docker container I started, which exposes port 6379; I use database 8.
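
For reference, such a container can be started with something like the following (the container name and image tag are my own choice, not from the post):

docker run -d --name redis -p 6379:6379 redis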

Celery writes each task's result into Redis keyed by task_id and reads it back by task_id on its own, so once Redis is configured you are done; you never have to fetch anything from Redis yourself.
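
If you ever do want to peek, the Redis result backend stores each result under a key of the form celery-task-meta-<task_id>, which you can inspect from redis-cli (db 8 as configured above):

redis-cli -n 8 keys 'celery-task-meta-*'
redis-cli -n 8 get celery-task-meta-<task_id>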

import connexion
from celery import Celery

app = connexion.FlaskApp(__name__, specification_dir='.')

application = app.app
# Celery configuration db 8
application.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/8'
application.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/8'


def make_celery(app):
    celery = Celery(app.name, backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task
    class ContextTask(TaskBase):
        abstract = True
        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)
    celery.Task = ContextTask
    return celery

celery = make_celery(app.app)


@celery.task(bind=True)
def calling_api(self, url):

    # Retry skeleton for tasks that hit external services:
    # try:
    #     length = len(url)
    # except SomeNetworkException as e:
    #     print("maybe do some cleanup here....")
    #     self.retry(exc=e)

    result = [url, len(url), "helloworld"]
    return result

The task function above is deliberately self-contained: it takes a url and returns [url, len(url), "helloworld"], about as simple as summing its input arguments.
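
If the task did real work, say calling the recognition service over the network, a retry-enabled variant could sit next to calling_api in task.py. This is only a sketch: requests, the timeout values, and calling_api_with_retry are placeholders, not part of the original code.

import requests

@celery.task(bind=True, max_retries=3, default_retry_delay=10)
def calling_api_with_retry(self, url):
    # Same idea as calling_api, but re-queued up to 3 times on network errors.
    try:
        resp = requests.get(url, timeout=5)
        resp.raise_for_status()
    except requests.RequestException as exc:
        # retry() keeps the same task_id, so the polling endpoint is unaffected.
        raise self.retry(exc=exc)
    return [url, len(resp.content), "helloworld"]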

Elsewhere I wrote a simple_test.py:

# coding=utf-8

import copy

from flask import Flask, request, render_template, session, flash, redirect, \
    url_for, jsonify

RESULT = {
    'code': 0,
    'message': 'Success',
    'data': {}
}

from ...task import calling_api


def send_data():
    result = copy.deepcopy(RESULT)

    ret = request.get_json()
    print("request.get_json():", ret)
    url = ret.get("url")
    if url is not None:
        print("url:", url)
        # outcome = calling_api.apply_async(args=[url], countdown=60)
        outcome = calling_api.apply_async(args=[url])
        # outcome = calling_api.delay(url)
        print("outcome:", outcome)
        print("outcome type:", type(outcome))
        print("outcome dict:", type(outcome.__dict__()))
       
        outcome_dict = {}
        outcome['task_id'] = str(outcome)
        result['data'] = outcome_dict

    else:
        result['code'] = 1
        result['message'] = "failed"

    return jsonify(result)


def get_data(task_id):
    result = copy.deepcopy(RESULT)
    task = calling_api.AsyncResult(task_id)

    task_id = "{0}".format(task.id)
    print(type(task_id), task_id)

    if task.status == 'SUCCESS':
        print("ok")
        result_dict = {}
        result_dict["status"] = task.status
        result_dict["ready"] = task.ready()
        result_dict["result"] = task.result
        result_dict["state"] = task.state
        result_dict["id"] = task.id
        result['data'] = result_dict
    else:
        result['code'] = 1

    return jsonify(result)

 

Task-submission endpoint: it just takes a url from the request, calls outcome = calling_api.apply_async(args=[url]), and returns the task_id; every url you submit gets its own task_id.

Result endpoint: task = calling_api.AsyncResult(task_id) looks the task up; check its status and return the result once it has succeeded.
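
For quick debugging outside the web layer, the same two steps can be exercised from a Python shell in the task.py directory (assuming the worker and Redis are already running; the URL is just an example):

from task import calling_api

outcome = calling_api.apply_async(args=["http://s3.xxxx.com/some.jpg"])
print(outcome.id)               # the task_id handed back to the client
print(outcome.status)           # PENDING until the worker finishes, then SUCCESS
print(outcome.get(timeout=10))  # blocks until the result lands in Redis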

 

Commands to run:

In the directory containing task.py, start a worker:
celery -A task.celery worker -l info
# celery -A task.celery worker -l info --logfile=./celerylog.log
The worker started this way is not daemonized, so it cannot simply be pushed into the background with a -d flag; keep it running in its own terminal or put it under a process manager.

Open another terminal, go to the directory containing app.py / main.py, and start the web app:
uwsgi --http :20000 -w app.main
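
The post never shows the main.py that uwsgi loads via -w app.main. One plausible minimal sketch, assuming main.py reuses the connexion app created in task.py and a spec file named swagger.yml (both assumptions; the real package layout is not shown):

# main.py - sketch only; spec file name and wiring are guesses
from task import app            # the connexion.FlaskApp created in task.py

app.add_api('swagger.yml')      # spec operationIds point at simple_test.send_data / get_data

application = app.app           # WSGI callable that uwsgi picks up

if __name__ == '__main__':
    app.run(port=20000)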

Testing:
My routes are:
Create a task:
POST http://localhost:20000/testing
request (application/json): {"url": "xxxxx.jpg"}
response: "task_id": "44057af4-3e14-468a-a36d-8c31e3665bce"

Fetch the result with the returned task_id:
GET http://localhost:20000/testing/{task_id}
response:
{
    "id": "44057af4-3e14-468a-a36d-8c31e3665bce",
    "ready": true,
    "result": [
        "http://s3.xxxx.com/ba7571a95d64eaa69a49912f26816e2f.jpg",
        60,
        "helloworld"
    ],
    "state": "SUCCESS",
    "status": "SUCCESS"
}
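
The same round trip with curl (host, port, and payload copied from the routes above):

curl -X POST -H "Content-Type: application/json" -d '{"url": "xxxxx.jpg"}' http://localhost:20000/testing
curl http://localhost:20000/testing/44057af4-3e14-468a-a36d-8c31e3665bce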

  

 
