A test of AsyncController (after URL rewriting, does reading the static page file content need to be asynchronous?)

I ran this test while working on static-page caching for a website.

MVC project

Four actions were prepared, in two groups: one group reads an HTML page file from the local disk, the other simply delays for 2 seconds.

using System.IO;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Mvc;

public class TestController : Controller
{
    // Group 1: read the HTML file synchronously
    public ActionResult Article(string name)
    {
        string path = @"I:\c#\nn.html";
        using (StreamReader reader = new StreamReader(path))
        {
            return Content(reader.ReadToEnd());
        }
    }

    // Group 1: read the same HTML file asynchronously
    public async Task<ActionResult> Article2(string name)
    {
        string path = @"I:\c#\nn.html";
        using (StreamReader reader = new StreamReader(path))
        {
            return Content(await reader.ReadToEndAsync());
        }
    }

    // Group 2: block the worker thread for 2 seconds
    public ActionResult Index1()
    {
        Thread.Sleep(2000);

        return Content("synchronized");
    }

    // Group 2: await a 2-second delay without holding the worker thread
    public async Task<ActionResult> Index2()
    {
        await Task.Delay(2000);

        return Content("asynchronized");
    }
}

Console program with the test code:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Net;
using System.Threading;

class Program
{
    static void Main(string[] args)
    {
        var syncUrl = "http://localhost:61771/Test/Article";
        var asyncUrl = "http://localhost:61771/Test/Article2";

        var syncUrl2 = "http://localhost:61771/Test/Index1";
        var asyncUrl2 = "http://localhost:61771/Test/Index2";
        var count = 20;

        int i = 0;
        while (true)
        {
            Console.WriteLine();

            Benchmark(asyncUrl, count);
            Benchmark(syncUrl, count);
            Benchmark(asyncUrl2, count);
            Benchmark(syncUrl2, count);

            i++;
            if (Console.ReadKey().Key == ConsoleKey.C)
            {
                break;
            }
        }

        Console.ReadKey();
    }
    static void Benchmark(string url, int count)
    {
        var stopwatch = new Stopwatch();
        stopwatch.Start();

        var threads = new List<Thread>();
        var countdown = new CountdownEvent(count);
        for (int i = 0; i < count; i++)
        {
            threads.Add(new Thread(() =>
            {
                using (var client = new WebClient())
                {
                    client.DownloadData(url);
                    countdown.Signal();
                }
            }));
        }

        for (int i = 0; i < count; i++)
        {
            threads[i].Start();
        }

        countdown.Wait(); // block until every request has completed

        stopwatch.Stop();

        Console.WriteLine(string.Format("{0} costs {1} ms", url, stopwatch.ElapsedMilliseconds.ToString()));
    }
}
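
One client-side detail that is easy to overlook in a benchmark like this: by default a .NET console client only opens two concurrent HTTP connections per host (ServicePointManager.DefaultConnectionLimit defaults to 2), so most of the parallel WebClient calls queue up on the client instead of reaching IIS at the same time, which can distort the timings and contribute to timeouts at higher counts. A minimal sketch of the tweak, assuming it is applied once at the top of Main before any request is made:

static void Main(string[] args)
{
    // Raise the per-host connection limit before the first request; the default of 2
    // would otherwise serialize most of the concurrent WebClient calls on the client side.
    ServicePointManager.DefaultConnectionLimit = 1000;

    // ... the rest of the benchmark stays the same ...
}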

Test results

Environment: local test on a laptop.
Results:

  1. count = 20

  2. count = 100

  3. count = 200
  4. count = 500
     During this run WebClient threw request-timeout warnings, so the client code was adjusted as follows:

     public class NewWebClient : WebClient
     {
         private int _timeout;

         public NewWebClient()
         {
             this._timeout = 60000;
         }

         public NewWebClient(int timeout)
         {
             this._timeout = timeout;
         }

         // Apply the configured timeout to every request this client creates
         protected override WebRequest GetWebRequest(Uri address)
         {
             var result = base.GetWebRequest(address);
             result.Timeout = this._timeout;
             return result;
         }
     }

     static void Benchmark(string url, int count)
     {
         var stopwatch = new Stopwatch();
         stopwatch.Start();

         var threads = new List<Thread>();
         var countdown = new CountdownEvent(count);
         for (int i = 0; i < count; i++)
         {
             threads.Add(new Thread(() =>
             {
                 // 30-minute timeout so slow responses are measured instead of aborted
                 using (var client = new NewWebClient(30 * 60 * 1000))
                 {
                     client.DownloadData(url);
                     countdown.Signal();
                 }
             }));
         }

         for (int i = 0; i < count; i++)
         {
             threads[i].Start();
         }

         countdown.Wait(); // block until every request has completed

         stopwatch.Stop();

         Console.WriteLine(string.Format("{0} costs {1} ms", url, stopwatch.ElapsedMilliseconds.ToString()));
     }

  5. count = 1000

  6. count = 1500

Summary

According to material I have read in the past, an AsyncController should add some extra processing time to each individual request because of thread switching, but it keeps time-consuming operations from occupying worker threads, so IIS can respond to more requests in the same amount of time, i.e. higher throughput.
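
If you want to observe the worker-thread effect directly instead of inferring it from timings, one option (not part of the original test) is a small diagnostic action that reports how many thread-pool threads are free while the benchmark is hammering Index1 or Index2; the action name PoolStats is made up for illustration:

    // Hypothetical diagnostic action: polling it during the benchmark should show the
    // synchronous 2-second sleep holding worker threads while the awaited delay does not.
    public ActionResult PoolStats()
    {
        int workerThreads, iocpThreads;
        ThreadPool.GetAvailableThreads(out workerThreads, out iocpThreads);
        return Content(string.Format("available worker threads: {0}, IOCP threads: {1}",
            workerThreads, iocpThreads));
    }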

The results for the second group of actions (the 2-second delay) do indeed show this effect.

But for the first group, which reads a local file, not only was the processing time of a single request clearly higher for the async action than for the sync one, the total time to handle the same number of requests was also higher for async... Is there some limit on concurrent reads of a disk file? To be confirmed.
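
One factor that may be worth checking here: the StreamReader(path) constructor opens the underlying FileStream without the asynchronous I/O option, so ReadToEndAsync still ends up doing the blocking read on a thread-pool thread rather than using overlapped I/O. A minimal sketch of a variant that requests asynchronous I/O explicitly (the action name Article2b is made up; everything else mirrors Article2):

    // Hypothetical variant of Article2: open the FileStream with useAsync: true so that
    // ReadToEndAsync can use overlapped I/O instead of a blocking read on a pool thread.
    public async Task<ActionResult> Article2b(string name)
    {
        string path = @"I:\c#\nn.html";
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read,
            FileShare.Read, bufferSize: 4096, useAsync: true))
        using (var reader = new StreamReader(stream))
        {
            return Content(await reader.ReadToEndAsync());
        }
    }

Whether that explains the whole gap would need re-measuring; a small local file is usually served from the OS cache, where the overhead of the async state machine can easily dominate the actual I/O.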

UrlRewrite

  1. Once handler mapping (MapRequestHandler) has completed, the URL can no longer be rewritten, so the rewrite is usually performed in BeginRequest.
  2. URLs in IIS are case-insensitive.
  3. For ///////home//////index//////////////////, request.Url.PathAndQuery is still /home/index/.
  4. Requesting a dynamic page with a trailing slash (e.g. /index.html/) still works (though the styles may break), but requesting a static page that way returns a 404.

Additional test

    protected void Application_BeginRequest()
    {
        var context = HttpContext.Current;
        var request = context.Request;
        if (request.RequestType == "GET") // only handle GET requests (filter out the APIs)
        {
            string regularUrl = request.Url.PathAndQuery;
            if (regularUrl.StartsWith("/Test/Article3"))
            {
                // Rewrite the request so ASP.NET serves the static file directly
                context.RewritePath("/Project_Readme.html");
            }
            else if (regularUrl.StartsWith("/Test/Article4"))
            {
                // Read the static file ourselves and write it into the response
                string path = context.Server.MapPath("~/Project_Readme.html");
                if (File.Exists(path))
                {
                    using (StreamReader reader = new StreamReader(path))
                    {
                        context.Response.Write(reader.ReadToEnd());
                        context.Response.End();
                    }
                }
            }
        }
    }
  1. count = 20

  2. count = 100

  3. count = 1000
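
A side note on the Application_BeginRequest handler above: in the classic ASP.NET pipeline, Response.End stops the request by throwing a ThreadAbortException, which adds some cost of its own under load. A sketch of the same Article4 branch using HttpApplication.CompleteRequest instead, assuming the goal is simply to skip the rest of the pipeline:

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        var context = app.Context;
        if (context.Request.RequestType == "GET" &&
            context.Request.Url.PathAndQuery.StartsWith("/Test/Article4"))
        {
            string path = context.Server.MapPath("~/Project_Readme.html");
            if (File.Exists(path))
            {
                context.Response.WriteFile(path);  // stream the static file into the response
                app.CompleteRequest();             // end the request without a ThreadAbortException
            }
        }
    }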

Test on another computer (A8 CPU)

  1. count = 20
  2. count = 500

Test against a server?

  1. count = 20

Because the static page used was obtained by doing a "save as" of the 360导航 homepage, it contained far too much content, so when the page was accessed over the network, bandwidth and traffic became a major constraint. After switching to a page of a more appropriate size, the speed improved noticeably, although directly requesting the static page's path no longer showed an obvious improvement...

posted @ 2017-04-21 08:02  随心~