pymc.SymbolicRandomVariable.pullback#

SymbolicRandomVariable.pullback(inputs, outputs, output_grads)#

Construct a graph for the vector-Jacobian product (pullback).

Given a function \(f\) implemented by this Op with inputs \(x\) and outputs \(y = f(x)\), the pullback computes \(\bar{x} = \bar{y}^T J\), where \(J\) is the Jacobian \(\frac{\partial f}{\partial x}\) and \(\bar{y}\) is the vector of cotangents (upstream gradients).

This is the core method for reverse-mode automatic differentiation.
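As a concrete illustration, here is a plain-NumPy sketch (not PyMC/PyTensor code; the two-output function and its Jacobian are made up for the example) of a pullback that contracts the cotangent vector with the analytic Jacobian:

```python
import numpy as np

def f(x):
    # Illustrative function f : R^2 -> R^2
    return np.array([x[0] * x[1], np.sin(x[0])])

def f_pullback(x, y_bar):
    # Analytic Jacobian J = df/dx evaluated at x
    J = np.array([
        [x[1], x[0]],          # row for y0 = x0 * x1
        [np.cos(x[0]), 0.0],   # row for y1 = sin(x0)
    ])
    # Vector-Jacobian product: x_bar = y_bar^T J
    return y_bar @ J

x = np.array([2.0, 3.0])
y_bar = np.array([1.0, 0.5])
x_bar = f_pullback(x, y_bar)
# x_bar == [3 + 0.5*cos(2), 2]
```

Note that the pullback never materializes \(J\) for a real Op; it builds a graph computing the product directly, which is what makes reverse mode efficient for many-input, few-output functions.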

If the output is not differentiable with respect to an input, return a variable of type DisconnectedType for that input. If the gradient is not implemented for some input, return a variable of type NullType (see pytensor.gradient.grad_not_implemented() and pytensor.gradient.grad_undefined()).
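For instance, a gather operation `y = x[idx]` is differentiable w.r.t. `x` but not w.r.t. its integer index input. The NumPy sketch below illustrates that contract; the `Disconnected` class is a hypothetical stand-in for PyTensor's DisconnectedType (a real Op would return a DisconnectedType variable or use the helpers named above):

```python
import numpy as np

class Disconnected:
    """Hypothetical sentinel standing in for PyTensor's DisconnectedType."""

def gather_pullback(x, idx, y_bar):
    # Pullback of y = x[idx]: scatter-add the cotangent back
    # into a zero array of x's shape.
    x_bar = np.zeros_like(x)
    np.add.at(x_bar, idx, y_bar)
    # idx is integer-typed, so the output is not differentiable
    # w.r.t. it: signal that with a disconnected marker.
    return [x_bar, Disconnected()]

x = np.array([1.0, 2.0, 3.0, 4.0])
idx = np.array([0, 0, 2])
y_bar = np.ones(3)
x_bar, idx_bar = gather_pullback(x, idx, y_bar)
# x_bar == [2, 0, 1, 0]; idx_bar is a Disconnected marker
```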

Parameters:
inputs : Sequence[Variable]

The input variables of the Apply node using this Op.

outputs : Sequence[Variable]

The output variables of the Apply node using this Op.

output_grads : Sequence[Variable]

The cotangent vectors (gradients w.r.t. each output).

Returns:
input_cotangents : list of Variable

The cotangent vectors w.r.t. each input. One Variable per input.
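Because each pullback call returns one cotangent per input, calling it once per output with basis cotangents recovers the full Jacobian row by row. A NumPy sketch with an illustrative two-output function (not the PyMC API):

```python
import numpy as np

def f_pullback(x, y_bar):
    # VJP of the illustrative function f(x) = (x0 * x1, sin(x0))
    J = np.array([[x[1], x[0]], [np.cos(x[0]), 0.0]])
    return y_bar @ J

x = np.array([2.0, 3.0])
# Row i of the Jacobian is the pullback of the i-th basis cotangent e_i
J_rows = [f_pullback(x, e) for e in np.eye(2)]
J = np.stack(J_rows)
# J == [[3, 2], [cos(2), 0]]
```

This is also a handy way to check a hand-written pullback against a finite-difference Jacobian.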