pymc.SymbolicRandomVariable.pullback
- SymbolicRandomVariable.pullback(inputs, outputs, output_grads)
Construct a graph for the vector-Jacobian product (pullback).
Given a function \(f\) implemented by this Op with inputs \(x\) and outputs \(y = f(x)\), the pullback computes \(\bar{x} = \bar{y}^T J\) where \(J\) is the Jacobian \(\frac{\partial f}{\partial x}\) and \(\bar{y}\) are the cotangent vectors (upstream gradients).
This is the core method for reverse-mode automatic differentiation.
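The computation \(\bar{x} = \bar{y}^T J\) can be illustrated with a minimal NumPy sketch for a toy linear map, where the Jacobian is known in closed form. This is not this Op's implementation, just the arithmetic the pullback contract describes:

```python
import numpy as np

# Toy function f(x) = A @ x, whose Jacobian J is simply A.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

def f(x):
    return A @ x

x = np.array([1.0, 1.0])
y = f(x)                            # outputs, shape (3,)
y_bar = np.array([1.0, 0.0, 1.0])   # cotangent vector w.r.t. the output

# Pullback: x_bar = y_bar^T J. Here J = A, so this is a single matvec.
x_bar = y_bar @ A
print(x_bar)  # [6. 8.]
```

Note that the pullback never materializes \(J\) for a general Op; it only needs a graph that computes the product \(\bar{y}^T J\), which is what makes reverse mode efficient when outputs are few and inputs are many.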
If an output is not differentiable with respect to an input, return a variable of type DisconnectedType for that input. If the gradient is not implemented for some input, return a variable of type NullType (see pytensor.gradient.grad_not_implemented() and pytensor.gradient.grad_undefined()).
- Parameters:
- inputs
Sequence[Variable] The input variables of the Apply node using this Op.
- outputs
Sequence[Variable] The output variables of the Apply node using this Op.
- output_grads
Sequence[Variable] The cotangent vectors (gradients w.r.t. each output).
- Returns:
- input_cotangents
list of Variable The cotangent vectors w.r.t. each input, one Variable per input.