Printing/Drawing Theano graphs

Theano provides two functions (theano.pp() and theano.printing.debugprint()) to print a graph to the terminal before or after compilation. The two functions render expression graphs differently: pp() is more compact and math-like, while debugprint() is more verbose. Theano also provides theano.printing.pydotprint(), which creates a PNG image of the function's graph. You can read about them in printing – Graph Printing and Symbolic Print Statement.
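
For a quick feel for the difference between the two text renderings, here is a minimal sketch (the variables a, b and out below are illustrative and are not part of the tutorial script):

# Sketch only: compare pp() and debugprint() on a tiny expression.
import theano
import theano.tensor as T

a = T.dvector('a')
b = T.dvector('b')
out = T.exp(a + b).sum()

print(theano.pp(out))            # compact, math-like string
theano.printing.debugprint(out)  # verbose tree, one line per node and input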

Note

Printed graphs of Theano functions can sometimes be hard to read. To make them easier to follow, you can disable some Theano optimizations by using the Theano flag: optimizer_excluding=fusion:inplace. Do not use this during real job execution, as it will make the computation slower and use more memory.
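
The flag can be set in the THEANO_FLAGS environment variable before launching the script. An equivalent per-function exclusion can also be requested through the compilation mode, as in this sketch (not part of the tutorial script):

# Sketch only: compile one function with the 'fusion' and 'inplace'
# optimization tags excluded so that its printed graph stays readable.
import theano
import theano.tensor as T
from theano.compile.mode import get_default_mode

v = T.dvector('v')
readable_mode = get_default_mode().excluding('fusion', 'inplace')
f = theano.function([v], T.exp(v) + 1, mode=readable_mode)
theano.printing.debugprint(f)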

Consider again the logistic regression example, but notice the additional printing instructions. The output in the sections that follow depicts the pre- and post-compilation graphs.

import theano
import theano.tensor as T

import numpy

import os

rng = numpy.random

N = 400
feats = 784
D = (rng.randn(N, feats).astype(theano.config.floatX),
     rng.randint(size=N, low=0, high=2).astype(theano.config.floatX))
training_steps = 10000

# Declare Theano symbolic variables
x = T.matrix("x")
y = T.vector("y")
w = theano.shared(rng.randn(feats).astype(theano.config.floatX), name="w")
b = theano.shared(numpy.asarray(0., dtype=theano.config.floatX), name="b")
x.tag.test_value = D[0]
y.tag.test_value = D[1]
#print "Initial model:"
#print w.get_value(), b.get_value()


# Construct Theano expression graph
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b)) # Probability of having a one
prediction = p_1 > 0.5 # The prediction that is done: 0 or 1
xent = -y * T.log(p_1) - (1 - y) * T.log(1 - p_1) # Cross-entropy
cost = xent.mean() + 0.01 * (w ** 2).sum() # The cost to optimize
gw,gb = T.grad(cost, [w, b])

# Compile expressions to functions
train = theano.function(
            inputs=[x, y],
            outputs=[prediction, xent],
            updates=[(w, w - 0.01 * gw), (b, b - 0.01 * gb)],
            name="train")
predict = theano.function(inputs=[x], outputs=prediction,
            name="predict")

if any(node.op.__class__.__name__ in ['Gemv', 'CGemv'] for node in
       train.maker.fgraph.toposort()):
    print('Used the cpu')
elif any(node.op.__class__.__name__ == 'GpuGemm' for node in
         train.maker.fgraph.toposort()):
    print('Used the gpu')
else:
    print('ERROR, not able to tell if theano used the cpu or the gpu')
    print(train.maker.fgraph.toposort())


for i in range(training_steps):
    pred, err = train(D[0], D[1])
#print "Final model:"
#print w.get_value(), b.get_value()

print "target values for D"
print D[1]

print "prediction on D"
print predict(D[0])


# Print the picture graphs
# after compilation
if not os.path.exists('pics'):
    os.mkdir('pics')
theano.printing.pydotprint(predict,
                           outfile="pics/logreg_pydotprint_predic.png",
                           var_with_name_simple=True)
# before compilation
theano.printing.pydotprint_variables(prediction,
                           outfile="pics/logreg_pydotprint_prediction.png",
                           var_with_name_simple=True)
theano.printing.pydotprint(train,
                           outfile="pics/logreg_pydotprint_train.png",
                           var_with_name_simple=True)

Pretty Printing

theano.printing.pprint(variable)

>>> theano.printing.pprint(prediction)  # (pre-compilation)
gt((TensorConstant{1} / (TensorConstant{1} + exp(((-(x \\dot w)) - b)))),TensorConstant{0.5})

Debug Printing

theano.printing.debugprint({fct, variable, list of variables})

>>> theano.printing.debugprint(prediction)  # (pre-compilation)
Elemwise{gt,no_inplace} [@181772236] ''
 |Elemwise{true_div,no_inplace} [@181746668] ''
 | |InplaceDimShuffle{x} [@181746412] ''
 | | |TensorConstant{1} [@181745836]
 | |Elemwise{add,no_inplace} [@181745644] ''
 | | |InplaceDimShuffle{x} [@181745420] ''
 | | | |TensorConstant{1} [@181744844]
 | | |Elemwise{exp,no_inplace} [@181744652] ''
 | | | |Elemwise{sub,no_inplace} [@181744012] ''
 | | | | |Elemwise{neg,no_inplace} [@181730764] ''
 | | | | | |dot [@181729676] ''
 | | | | | | |x [@181563948]
 | | | | | | |w [@181729964]
 | | | | |InplaceDimShuffle{x} [@181743788] ''
 | | | | | |b [@181730156]
 |InplaceDimShuffle{x} [@181771788] ''
 | |TensorConstant{0.5} [@181771148]
>>> theano.printing.debugprint(predict)  # (post-compilation)
Elemwise{Composite{neg,{sub,{{scalar_sigmoid,GT},neg}}}} [@183160204] ''   2
 |dot [@183018796] ''   1
 | |x [@183000780]
 | |w [@183000812]
 |InplaceDimShuffle{x} [@183133580] ''   0
 | |b [@183000876]
 |TensorConstant{[ 0.5]} [@183084108]

Picture Printing

>>> theano.printing.pydotprint_variables(prediction)  # (pre-compilation)
[image: logreg_pydotprint_prediction2.png — graph of the prediction variable before compilation]

Notice that pydotprint() requires Graphviz and Python’s pydot.
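
If you are not sure whether pydot is available in your environment, a quick check such as this sketch (not part of the tutorial) avoids a confusing failure later:

# Sketch only: verify that pydot can be imported before calling pydotprint().
try:
    import pydot
    print('pydot is available')
except ImportError:
    print('pydot is missing; install Graphviz and pydot to use pydotprint()')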

>>> theano.printing.pydotprint(predict)  # (post-compilation)
[image: logreg_pydotprint_predic2.png — graph of the predict function after compilation]
>>> theano.printing.pydotprint(train) # This is a small train example!
[image: logreg_pydotprint_train2.png — graph of the train function after compilation]