This module contains auxiliary Ops used during the compilation phase, as well as a class (FromFunctionOp) and a decorator (as_op()) that help build new Ops more rapidly.
Build a basic Theano Op around a function.
Since the resulting Op is very basic and is missing most of the optional functionality, some optimizations may not apply. If needed, you can supply an infer_shape function that computes the shapes of the outputs given the shapes of the inputs.
The gradient is also undefined in the resulting Op, and Theano will raise an error if you attempt to get the gradient of a graph containing this Op.
This op is used only internally by Theano.
Only the AddDestroyHandler optimizer tries to insert it into the graph.
This Op is declared as destructive even though it does not destroy anything; it returns a view. This is used to prevent destruction of the output variables of a Theano function.
There is a mechanism in Theano that should prevent this, but OutputGuard adds a safeguard: an optimization that runs before the add_destroy_handler phase could bypass that mechanism by applying in-place optimizations.
TODO: find a current full explanation.
Change the input’s broadcastable fields in some predetermined way.
For example, Rebroadcast((0, True), (1, False))(x) would make x broadcastable along axis 0 and not broadcastable along axis 1.
.. note:: Works inplace and works for CudaNdarrayType.
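A minimal usage sketch (assuming Rebroadcast is importable from this module; its location has varied across Theano versions):

    import theano.tensor as T
    from theano.compile.ops import Rebroadcast

    x = T.matrix('x')              # broadcastable == (False, False)
    y = Rebroadcast((0, True))(x)  # declare axis 0 as broadcastable
    assert y.broadcastable == (True, False)
    # At runtime, x must actually have size 1 along axis 0,
    # since the Op only changes the broadcastable flag.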
L{Op} to return the shape of a matrix.
@note: Non-differentiable.
L{Op} to return one dimension (shape[i]) of the shape of a matrix.
@note: Non-differentiable.
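Neither Op is normally constructed by hand; both enter the graph through var.shape. A minimal illustration:

    import theano
    import theano.tensor as T

    x = T.matrix('x')
    # x.shape is computed by the Shape Op; during optimization,
    # x.shape[0] is typically rewritten into a single Shape_i node.
    f = theano.function([x], [x.shape, x.shape[0]])
    print(f([[1, 2], [3, 4]]))  # [array([2, 2]), array(2)]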
L{Op} that puts into the graph the user-provided shape.
If this Op remains in the final graph, we assert that the shape is correct. For that to happen, the output of this Op must actually be used in the graph, which is usually not the case if we only take the shape of the output. Other optimizations may also interfere with this.
@note: Maybe in the future we will never do the assert!
@note: We currently don't support specifying partial shape information.
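A minimal sketch, assuming the usual entry point theano.tensor.specify_shape (an instance of this Op):

    import theano
    import theano.tensor as T

    x = T.matrix('x')
    y = T.specify_shape(x, (2, 2))  # assert at runtime that x has shape (2, 2)
    f = theano.function([x], (y * 2).sum())
    f([[1., 2.], [3., 4.]])  # OK
    # f([[1., 2., 3.]]) raises an error, provided the assertion
    # survives in the final graph (see the caveat above).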
Returns an inplace view of the input. Used internally by Theano.
Decorator that converts a function into a basic Theano op that will call the supplied function as its implementation.
It takes an optional infer_shape parameter that should be a callable with this signature:

    def infer_shape(node, input_shapes):
        ...
        return output_shapes
Here input_shapes and output_shapes are lists of tuples that represent the shape of the corresponding inputs/outputs.
This should not be used when performance is a concern since the very basic nature of the resulting Op may interfere with certain graph optimizations.
Example usage:

    @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
           otypes=[theano.tensor.fmatrix])
    def numpy_dot(a, b):
        return numpy.dot(a, b)
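When shape information helps the optimizer, the optional infer_shape callable can be passed at decoration time; a minimal sketch (the helper infer_shape_numpy_dot is illustrative, not part of Theano):

    import numpy
    import theano
    from theano.compile.ops import as_op

    def infer_shape_numpy_dot(node, input_shapes):
        ashp, bshp = input_shapes
        # (m, k) dot (k, n) -> (m, n)
        return [ashp[:-1] + bshp[-1:]]

    @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
           otypes=[theano.tensor.fmatrix],
           infer_shape=infer_shape_numpy_dot)
    def numpy_dot(a, b):
        return numpy.dot(a, b)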
Tell DeepCopyOp how to generate C code for a Theano Type
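The register_*_c_code helpers in this section all follow the same pattern: pass the Theano Type class itself (not an instance), a C code template, and a version number used for caching. A minimal, hypothetical sketch (MyType and my_type_copy are assumptions, not real Theano names; %(iname)s, %(oname)s and %(fail)s are substituted by Theano with the input/output C variable names and the failure code):

    from theano.compile.ops import register_deep_copy_op_c_code

    register_deep_copy_op_c_code(
        MyType,  # hypothetical Theano Type class
        """
        Py_XDECREF(%(oname)s);
        %(oname)s = my_type_copy(%(iname)s);  /* hypothetical C helper */
        if (!%(oname)s)
            %(fail)s;
        """,
        version=1)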
Tell Rebroadcast how to generate C code for a Theano Type
Tell Shape Op how to generate C code for a Theano Type
Tell Shape_i how to generate C code for a Theano Type
Tell SpecifyShape how to generate C code for a Theano Type
Tell ViewOp how to generate C code for a Theano Type
Equivalent of var.shape[i], but applies the shape feature optimization when possible.
This is useful in optimizations that need the shape: it removes the need for the later shape_feature optimization that would otherwise have to convert var.shape[i], which speeds up optimization and avoids Equilibrium max-iteration problems.
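A minimal sketch of calling it from inside a local optimizer (the optimizer itself is illustrative, and the exact shape_i signature may vary across Theano versions):

    from theano.compile.ops import shape_i

    def local_example_opt(node):
        x = node.inputs[0]
        # Reuse the FunctionGraph's ShapeFeature (when one is attached)
        # instead of inserting a fresh Shape/Subtensor pair that later
        # optimizations would have to rewrite into Shape_i anyway.
        n_rows = shape_i(x, 0, fgraph=node.fgraph)
        ...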