Theano 2.1.5 - Basics: Printing Theano Graphs

From: http://deeplearning.net/software/theano/tutorial/printing_drawing.html

Printing/Drawing Theano graphs

    Theano provides the functions theano.printing.pprint() and theano.printing.debugprint() to print a graph to the terminal before or after compilation. pprint() is more compact and math-like, while debugprint() is more verbose. Theano also provides pydotprint() that creates an image of the function. See printing – Graph Printing and Symbolic Print Statement for more details.

Note: when printing a Theano function, the output can sometimes be hard to read. To make it simpler, you can disable some Theano optimizations by using the Theano flag optimizer_excluding=fusion:inplace. Do not use this flag when doing real work, as it will make the graph slower and use more memory.
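
For example, a minimal sketch of setting that flag (an assumption here: the flag is passed through the THEANO_FLAGS environment variable, which is only read when theano is first imported):

>>> import os
>>> os.environ["THEANO_FLAGS"] = "optimizer_excluding=fusion:inplace"  # must be set before importing theano
>>> import theano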

    Consider the logistic regression example:

>>> import numpy
>>> import theano
>>> import theano.tensor as T
>>> rng = numpy.random
>>> # Training data
>>> N = 400
>>> feats = 784
>>> D = (rng.randn(N, feats).astype(theano.config.floatX), rng.randint(size=N,low=0, high=2).astype(theano.config.floatX))
>>> training_steps = 10000
>>> # Declare Theano symbolic variables
>>> x = T.matrix("x")
>>> y = T.vector("y")
>>> w = theano.shared(rng.randn(feats).astype(theano.config.floatX), name="w")
>>> b = theano.shared(numpy.asarray(0., dtype=theano.config.floatX), name="b")
>>> x.tag.test_value = D[0]
>>> y.tag.test_value = D[1]
>>> # Construct Theano expression graph
>>> p_1 = 1 / (1 + T.exp(-T.dot(x, w)-b)) # Probability of having a one
>>> prediction = p_1 > 0.5 # The prediction that is done: 0 or 1
>>> # Compute gradients
>>> xent = -y*T.log(p_1) - (1-y)*T.log(1-p_1) # Cross-entropy
>>> cost = xent.mean() + 0.01*(w**2).sum() # The cost to optimize
>>> gw,gb = T.grad(cost, [w,b])
>>> # Training and prediction function
>>> train = theano.function(inputs=[x,y], outputs=[prediction, xent], updates=[[w, w-0.01*gw], [b, b-0.01*gb]], name = "train")
>>> predict = theano.function(inputs=[x], outputs=prediction, name = "predict")
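
For completeness, a minimal sketch of how these two functions would be used (this loop mirrors the standard logistic regression tutorial and is not needed for the printing examples below):

>>> for i in range(training_steps):
...     pred, err = train(D[0], D[1])   # one gradient step on the whole training set
>>> predictions = predict(D[0])         # numpy array of 0/1 predictions, one per example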

Pretty printing:

>>> theano.printing.pprint(prediction) 
'gt((TensorConstant{1} / (TensorConstant{1} + exp(((-(x \\dot w)) - b)))),
TensorConstant{0.5})'

Debug printing

The pre-compilation graph:

>>> theano.printing.debugprint(prediction) 
    Elemwise{gt,no_inplace} [@A] ''
    |Elemwise{true_div,no_inplace} [@B] ''
    | |DimShuffle{x} [@C] ''
    | | |TensorConstant{1} [@D]
    | |Elemwise{add,no_inplace} [@E] ''
    |   |DimShuffle{x} [@F] ''
    |   | |TensorConstant{1} [@D]
    |   |Elemwise{exp,no_inplace} [@G] ''
    |     |Elemwise{sub,no_inplace} [@H] ''
    |       |Elemwise{neg,no_inplace} [@I] ''
    |       | |dot [@J] ''
    |       |   |x [@K]
    |       |   |w [@L]
    |       |DimShuffle{x} [@M] ''
    |         |b [@N]
    |DimShuffle{x} [@O] ''
      |TensorConstant{0.5} [@P]

The post-compilation graph:

>>> theano.printing.debugprint(predict) 
    Elemwise{Composite{GT(scalar_sigmoid((-((-i0) - i1))), i2)}} [@A] ''   4
     |CGemv{inplace} [@B] ''   3
     | |Alloc [@C] ''   2
     | | |TensorConstant{0.0} [@D]
     | | |Shape_i{0} [@E] ''   1
     | |   |x [@F]
     | |TensorConstant{1.0} [@G]
     | |x [@F]
     | |w [@H]
     | |TensorConstant{0.0} [@D]
     |InplaceDimShuffle{x} [@I] ''   0
     | |b [@J]
     |TensorConstant{(1,) of 0.5} [@K]
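
debugprint() also accepts optional arguments; for instance, a hedged sketch (assuming your Theano version supports the print_type keyword) that additionally shows the type of each variable in the graph:

>>> theano.printing.debugprint(predict, print_type=True)  # output omitted; each node also shows its TensorType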

Picture printing of graphs

The pre-compilation graph

>>> theano.printing.pydotprint(prediction, outfile="pics/logreg_pydotprint_prediction.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_prediction.png



[Image: logreg_pydotprint_prediction.png]


The post-compilation graph

>>> theano.printing.pydotprint(predict, outfile="pics/logreg_pydotprint_predict.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_predict.png
[Image: logreg_pydotprint_predict.png]

The optimized training graph:

>>> theano.printing.pydotprint(train, outfile="pics/logreg_pydotprint_train.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_train.png

[Image: logreg_pydotprint_train.png]

References:

[1] Official documentation: http://deeplearning.net/software/theano/tutorial/printing_drawing.html


