pub struct Tape<V: Differentiable> { /* private fields */ }
Reverse-mode AD tape.
The tape records operations performed on TrackedTensor values and
enables gradient computation via Tape::pullback, or Hessian-vector
product (HVP) computation via Tape::hvp.
Create leaf values with Tape::leaf, perform operations using
AD-aware functions (e.g., tracked_einsum), then call
Tape::pullback on the scalar loss to compute gradients.
Tape is cheaply cloneable (internally reference-counted). Multiple
clones refer to the same underlying tape.
§Examples
use chainrules::Tape;
use tenferro_einsum::tracked_einsum;
use tenferro_tensor::{MemoryOrder, Tensor};
use tenferro_device::LogicalMemorySpace;
let tape = Tape::<Tensor<f64>>::new();
let a = tape.leaf(Tensor::ones(
    &[2, 3],
    LogicalMemorySpace::MainMemory,
    MemoryOrder::ColumnMajor,
));
let b = tape.leaf(Tensor::ones(
    &[3, 4],
    LogicalMemorySpace::MainMemory,
    MemoryOrder::ColumnMajor,
));
let c = tracked_einsum("ij,jk->ik", &[&a, &b]).unwrap();
let loss = tracked_einsum("ij,ij->", &[&c, &c]).unwrap();
let grads = tape.pullback(&loss).unwrap();
let _ga = grads.get(a.node_id().unwrap()).unwrap();
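Because the tape is reference-counted, clones share one recording; a minimal sketch using only the API shown above:
use chainrules::Tape;
let tape = Tape::<f64>::new();
let handle = tape.clone(); // cheap clone; same underlying tape
let x = handle.leaf(2.0);
// Pullback through the original handle sees the leaf created via the clone.
let grads = tape.pullback(&x).unwrap();
let _gx = grads.get(x.node_id().unwrap()).unwrap();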
Implementations§
impl<V: Differentiable> Tape<V>
pub fn leaf(&self, _value: V) -> TrackedTensor<V>
Creates a leaf value requiring gradient on this tape.
The returned TrackedTensor is connected to this tape and
will participate in gradient computation via Tape::pullback.
§Examples
use chainrules::Tape;
let tape = Tape::<f64>::new();
let x = tape.leaf(3.14);
assert!(x.requires_grad());
pub fn leaf_with_tangent(
    &self,
    _value: V,
    _tangent: V::Tangent,
) -> AdResult<TrackedTensor<V>>
Creates a leaf value with a tangent for HVP computation.
The tangent defines the perturbation direction v used in forward-over-reverse Hessian-vector product computation.
§Errors
Returns AutodiffError::TangentShapeMismatch if the tangent's shape does not match the value's shape.
§Examples
use chainrules::Tape;
let tape = Tape::<f64>::new();
let x = tape.leaf_with_tangent(3.14, 1.0).unwrap();
assert!(x.requires_grad());
assert!(x.has_tangent());
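A sketch of the mismatch case, reusing the Tensor setup from the struct-level example: a [2, 3] value paired with a [3, 2] tangent should be rejected:
use chainrules::Tape;
use tenferro_tensor::{MemoryOrder, Tensor};
use tenferro_device::LogicalMemorySpace;
let tape = Tape::<Tensor<f64>>::new();
// Value shape [2, 3] vs. tangent shape [3, 2]: expect TangentShapeMismatch.
let result = tape.leaf_with_tangent(
    Tensor::ones(&[2, 3], LogicalMemorySpace::MainMemory, MemoryOrder::ColumnMajor),
    Tensor::ones(&[3, 2], LogicalMemorySpace::MainMemory, MemoryOrder::ColumnMajor),
);
assert!(result.is_err());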
pub fn pullback(&self, _loss: &TrackedTensor<V>) -> AdResult<Gradients<V>>
Runs reverse-mode pullback from a scalar loss value.
§Errors
Returns AutodiffError::NonScalarLoss for non-scalar losses.
Returns AutodiffError::MissingNode if the loss is not connected
to this tape.
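A sketch of the non-scalar case, reusing the Tensor setup from the struct-level example:
use chainrules::Tape;
use tenferro_tensor::{MemoryOrder, Tensor};
use tenferro_device::LogicalMemorySpace;
let tape = Tape::<Tensor<f64>>::new();
// A [2, 3] leaf is not a scalar, so pullback should return NonScalarLoss.
let m = tape.leaf(Tensor::ones(
    &[2, 3],
    LogicalMemorySpace::MainMemory,
    MemoryOrder::ColumnMajor,
));
assert!(tape.pullback(&m).is_err());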
§Examples
use chainrules::Tape;
let tape = Tape::<f64>::new();
let x = tape.leaf(2.0);
// x is itself a scalar, so it can serve directly as the loss.
let grads = tape.pullback(&x).unwrap();
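To read the result back, reusing node_id and Gradients::get from the struct-level example (the loss here is x itself, so its gradient with respect to x is 1):
// d(loss)/dx = 1 since loss == x.
let _gx = grads.get(x.node_id().unwrap()).unwrap();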
pub fn hvp(&self, _loss: &TrackedTensor<V>) -> AdResult<HvpResult<V>>
Computes gradient and Hessian-vector product via forward-over-reverse.
Leaf values with tangents (created via Tape::leaf_with_tangent)
define the direction v. The function runs pullback through the tape,
propagating both cotangents and cotangent-tangents at each node.
Returns both the gradient (in HvpResult::gradients) and the
Hessian-vector product H·v (in HvpResult::hvp).
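In symbols, with scalar loss f and direction v: H·v = d/dε ∇f(x + ε·v) evaluated at ε = 0. The tangents supply v; the reverse pass supplies ∇f.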
§Errors
Returns AutodiffError::NonScalarLoss for non-scalar losses.
Returns AutodiffError::HvpNotSupported if any ReverseRule on the tape
does not implement pullback_with_tangents.
§Examples
use chainrules::Tape;
use tenferro_einsum::tracked_einsum;
use tenferro_tensor::{MemoryOrder, Tensor};
use tenferro_device::LogicalMemorySpace;
let tape = Tape::<Tensor<f64>>::new();
let x = tape.leaf_with_tangent(
    Tensor::ones(&[3], LogicalMemorySpace::MainMemory, MemoryOrder::ColumnMajor),
    Tensor::ones(&[3], LogicalMemorySpace::MainMemory, MemoryOrder::ColumnMajor),
).unwrap();
let loss = tracked_einsum("i,i->", &[&x, &x]).unwrap();
let result = tape.hvp(&loss).unwrap();
let _grad = result.gradients;
let _hv = result.hvp;
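For this example the loss is Σᵢ xᵢ², so the gradient is 2·x and the Hessian is 2·I. With x = v = ones, result.gradients should hold a vector of twos, and result.hvp should equal 2·v, also a vector of twos.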