
From Blocking to Non-Blocking in Python

Getting Started With Async Features in Python

https://realpython.com/python-async-features/

A server built on synchronous interfaces cannot handle hundreds or thousands of connections in a short period of time.

Building a Synchronous Web Server

A web server’s basic unit of work is, more or less, the same as batch processing. The server will get some input, process it, and create the output. Written as a synchronous program, this would create a working web server.

It would also be an absolutely terrible web server.

Why? In this case, one unit of work (input, process, output) is not the only purpose. The real purpose is to handle hundreds or even thousands of units of work as quickly as possible. This can happen over long periods of time, and several work units may even arrive all at once.

Can a synchronous web server be made better? Sure, you could optimize the execution steps so that all the work coming in is handled as quickly as possible. Unfortunately, there are limitations to this approach. The result could be a web server that doesn’t respond fast enough, can’t handle enough work, or even one that times out when work gets stacked up.
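
To make this concrete, here is a minimal sketch (not from the article) of such a synchronous server using Python's standard socket module; the names handle_request and serve_forever are hypothetical. While one client's unit of work is being processed, every other client simply waits:

import socket


def handle_request(conn):
    # One unit of work: read the input, process it, write the output
    conn.recv(1024)
    conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    conn.close()


def serve_forever(host="127.0.0.1", port=8080):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen()
    while True:
        conn, _addr = server.accept()  # Blocks until a client connects
        handle_request(conn)           # Blocks all other clients until this one is done


if __name__ == "__main__":
    serve_forever()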

 

Synchronous Programming

This is purely synchronous code: every item in the queue is processed, in order, by the first task; there is no concurrent execution across multiple tasks.

import queue


def task(name, work_queue):
    if work_queue.empty():
        print(f"Task {name} nothing to do")
    else:
        while not work_queue.empty():
            # Pull a count off the queue and "process" it by counting up to it
            count = work_queue.get()
            total = 0
            print(f"Task {name} running")
            for x in range(count):
                total += 1
            print(f"Task {name} total: {total}")


def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = queue.Queue()

    # Put some work in the queue
    for work in [15, 10, 5, 2]:
        work_queue.put(work)

    # Create some synchronous tasks
    tasks = [(task, "One", work_queue), (task, "Two", work_queue)]

    # Run the tasks
    for t, n, q in tasks:
        t(n, q)


if __name__ == "__main__":
    main()

 

Simple Cooperative Concurrency

Using yield inside task turns it into a generator function. The task can now suspend before its work is finished and hand control over so another task can run.

import queue


def task(name, queue):
    while not queue.empty():
        count = queue.get()
        total = 0
        print(f"Task {name} running")
        for x in range(count):
            total += 1
            yield  # Hand control back to the scheduling loop after each step
        print(f"Task {name} total: {total}")


def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = queue.Queue()

    # Put some work in the queue
    for work in [15, 10, 5, 2]:
        work_queue.put(work)

    # Create some tasks
    tasks = [task("One", work_queue), task("Two", work_queue)]

    # Run the tasks
    done = False
    while not done:
        for t in tasks:
            try:
                next(t)
            except StopIteration:
                tasks.remove(t)
            if len(tasks) == 0:
                done = True


if __name__ == "__main__":
    main()

 

Cooperative Concurrency With Blocking Calls

Turning task into a generator with yield only achieves concurrency; by itself it does not make the program any faster.

For example, if a blocking sleep is added inside task, control cannot be handed to another task while one task is sleeping: the whole program just waits.

import time
import queue
from codetiming import Timer


def task(name, queue):
    timer = Timer(text=f"Task {name} elapsed time: {{:.1f}}")
    while not queue.empty():
        delay = queue.get()
        print(f"Task {name} running")
        timer.start()
        time.sleep(delay)  # Blocking call: nothing else runs while this task sleeps
        timer.stop()
        yield


def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = queue.Queue()

    # Put some work in the queue
    for work in [15, 10, 5, 2]:
        work_queue.put(work)

    tasks = [task("One", work_queue), task("Two", work_queue)]

    # Run the tasks
    done = False
    with Timer(text="\nTotal elapsed time: {:.1f}"):
        while not done:
            for t in tasks:
                try:
                    next(t)
                except StopIteration:
                    tasks.remove(t)
                if len(tasks) == 0:
                    done = True


if __name__ == "__main__":
    main()

 

Cooperative Concurrency With Non-Blocking Calls

The asyncio library provides an asynchronous sleep: while one task is awaiting its sleep, control is yielded so other tasks can run. The total elapsed time therefore approaches the longest single delay rather than the sum of all delays.

import asyncio
from codetiming import Timer


async def task(name, work_queue):
    timer = Timer(text=f"Task {name} elapsed time: {{:.1f}}")
    while not work_queue.empty():
        delay = await work_queue.get()
        print(f"Task {name} running")
        timer.start()
        await asyncio.sleep(delay)  # Non-blocking: the event loop runs other tasks while waiting
        timer.stop()


async def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = asyncio.Queue()

    # Put some work in the queue
    for work in [15, 10, 5, 2]:
        await work_queue.put(work)

    # Run the tasks
    with Timer(text="\nTotal elapsed time: {:.1f}"):
        await asyncio.gather(
            asyncio.create_task(task("One", work_queue)),
            asyncio.create_task(task("Two", work_queue)),
        )


if __name__ == "__main__":
    asyncio.run(main())

 

Synchronous (Blocking) HTTP Calls

Uses the requests library to make synchronous (blocking) HTTP requests. This is inefficient: while one request is in flight, nothing else runs.

import queue
import requests
from codetiming import Timer


def task(name, work_queue):
    timer = Timer(text=f"Task {name} elapsed time: {{:.1f}}")
    with requests.Session() as session:
        while not work_queue.empty():
            url = work_queue.get()
            print(f"Task {name} getting URL: {url}")
            timer.start()
            session.get(url)  # Blocking HTTP call: the other task waits until this returns
            timer.stop()
            yield


def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = queue.Queue()

    # Put some work in the queue
    for url in [
        "http://google.com",
        "http://yahoo.com",
        "http://linkedin.com",
        "http://apple.com",
        "http://microsoft.com",
        "http://facebook.com",
        "http://twitter.com",
    ]:
        work_queue.put(url)

    tasks = [task("One", work_queue), task("Two", work_queue)]

    # Run the tasks
    done = False
    with Timer(text="\nTotal elapsed time: {:.1f}"):
        while not done:
            for t in tasks:
                try:
                    next(t)
                except StopIteration:
                    tasks.remove(t)
                if len(tasks) == 0:
                    done = True


if __name__ == "__main__":
    main()

 

Asynchronous (Non-Blocking) HTTP Calls

Uses an asynchronous web client (aiohttp) together with asyncio to issue non-blocking, concurrent requests. This is far more efficient: the requests from both tasks overlap in time.

import asyncio
import aiohttp
from codetiming import Timer


async def task(name, work_queue):
    timer = Timer(text=f"Task {name} elapsed time: {{:.1f}}")
    async with aiohttp.ClientSession() as session:
        while not work_queue.empty():
            url = await work_queue.get()
            print(f"Task {name} getting URL: {url}")
            timer.start()
            # Non-blocking HTTP call: other tasks run while this request is in flight
            async with session.get(url) as response:
                await response.text()
            timer.stop()


async def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = asyncio.Queue()

    # Put some work in the queue
    for url in [
        "http://google.com",
        "http://yahoo.com",
        "http://linkedin.com",
        "http://apple.com",
        "http://microsoft.com",
        "http://facebook.com",
        "http://twitter.com",
    ]:
        await work_queue.put(url)

    # Run the tasks
    with Timer(text="\nTotal elapsed time: {:.1f}"):
        await asyncio.gather(
            asyncio.create_task(task("One", work_queue)),
            asyncio.create_task(task("Two", work_queue)),
        )


if __name__ == "__main__":
    asyncio.run(main())

 
