Scalar

toydl.core.scalar.context.Context dataclass

The Context class is used by ScalarFunction to store information during the forward pass.

save_for_backward

save_for_backward(*values: Any) -> None

Store the given values if they need to be used during backpropagation.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `values` | `Any` | the values that should be saved for backward | `()` |
Source code in toydl/core/scalar/context.py
def save_for_backward(self, *values: Any) -> None:
    """Store the given `values` if they need to be used during backpropagation.

    :param values: the values that should be saved for backward
    """
    if self.no_grad:
        return
    self.saved_values = values
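To illustrate the `no_grad` guard above, here is a minimal, self-contained sketch of a Context-like holder (a hypothetical stand-in, not toydl's actual class): saving is a no-op when gradients are disabled.

```python
from dataclasses import dataclass, field
from typing import Any, Tuple

@dataclass
class Ctx:
    """Hypothetical minimal Context: a no_grad flag plus saved values."""
    no_grad: bool = False
    saved_values: Tuple[Any, ...] = field(default_factory=tuple)

    def save_for_backward(self, *values: Any) -> None:
        # When gradients are disabled, skip saving entirely.
        if self.no_grad:
            return
        self.saved_values = values

ctx = Ctx()
ctx.save_for_backward(2.0, 3.0)
print(ctx.saved_values)      # (2.0, 3.0)

frozen = Ctx(no_grad=True)
frozen.save_for_backward(2.0)
print(frozen.saved_values)   # () -- nothing was stored
```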

toydl.core.scalar.scalar.Add

Bases: ScalarFunction

Addition function: f(x, y) = x + y

toydl.core.scalar.scalar.EQ

Bases: ScalarFunction

Equal function: f(x, y) = 1.0 if x is equal to y else 0.0

toydl.core.scalar.scalar.Exp

Bases: ScalarFunction

Exp function

toydl.core.scalar.scalar.Inv

Bases: ScalarFunction

Inverse function

toydl.core.scalar.scalar.LT

Bases: ScalarFunction

Less-than function: f(x, y) = 1.0 if x is less than y else 0.0

toydl.core.scalar.scalar.Log

Bases: ScalarFunction

Log function: f(x) = log(x)

toydl.core.scalar.scalar.Mul

Bases: ScalarFunction

Multiplication function

toydl.core.scalar.scalar.Neg

Bases: ScalarFunction

Negation function

toydl.core.scalar.scalar.ReLU

Bases: ScalarFunction

ReLU function

toydl.core.scalar.scalar.Scalar

Scalar(
    v: float,
    history: ScalarHistory = ScalarHistory(),
    name: Optional[str] = None,
)

A reimplementation of scalar values for auto-differentiation tracking. Scalar variables behave as closely as possible to standard Python numbers while also tracking the operations that led to the number's creation. They can only be manipulated by ScalarFunction.

Source code in toydl/core/scalar/scalar.py
def __init__(
    self,
    v: float,
    history: ScalarHistory = ScalarHistory(),
    name: Optional[str] = None,
):
    global _var_count
    _var_count += 1
    self._unique_id: int = _var_count
    self.data: float = float(v)
    self.history: ScalarHistory = history
    self.derivative: Optional[float] = None
    if name is not None:
        self.name = name
    else:
        self.name = str(self.unique_id)
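The constructor above assigns each variable a unique id from a module-level counter and falls back to that id as the name. A stripped-down sketch of just that bookkeeping (hypothetical; it drops the `history` and `derivative` machinery):

```python
from typing import Optional

_var_count = 0  # module-level counter, as in the source above

class MiniScalar:
    """Hypothetical minimal Scalar: only data, unique id, and name."""
    def __init__(self, v: float, name: Optional[str] = None):
        global _var_count
        _var_count += 1
        self.unique_id: int = _var_count
        self.data: float = float(v)
        # Default name is the stringified unique id.
        self.name: str = name if name is not None else str(self.unique_id)

a = MiniScalar(3)
b = MiniScalar(4.5, name="b")
print(a.data, a.name)  # 3.0 1
print(b.data, b.name)  # 4.5 b
```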

accumulate_derivative

accumulate_derivative(x: Any) -> None

Add x to the derivative accumulated on this variable. Should only be called during auto-differentiation on leaf variables.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Any` | value to be accumulated | *required* |
Source code in toydl/core/scalar/scalar.py
def accumulate_derivative(self, x: Any) -> None:
    """
    Add `x` to the derivative accumulated on this variable.
    Should only be called during auto-differentiation on leaf variables.

    :param x: value to be accumulated
    """
    assert self.is_leaf(), "Only leaf variables can have derivatives."
    if self.derivative is None:
        self.derivative = 0.0
    self.derivative += x
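Accumulation matters because a leaf can feed several downstream operations, each contributing a partial derivative. A self-contained sketch of just that pattern (hypothetical, mirroring the method above):

```python
from typing import Optional

class Leaf:
    """Hypothetical leaf variable with only the derivative-accumulation logic."""
    def __init__(self):
        self.derivative: Optional[float] = None

    def accumulate_derivative(self, x: float) -> None:
        # First contribution initializes the accumulator from 0.0.
        if self.derivative is None:
            self.derivative = 0.0
        self.derivative += x

v = Leaf()
v.accumulate_derivative(1.5)  # contribution from one use of v
v.accumulate_derivative(0.5)  # contribution from another use of v
print(v.derivative)  # 2.0
```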

backward

backward(d_output: Optional[float] = None) -> None

Calls autodiff to fill in the derivatives for the history of this object.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `d_output` | `Optional[float]` | starting derivative to backpropagate through the model (typically left out, and assumed to be 1.0) | `None` |

Source code in toydl/core/scalar/scalar.py
def backward(self, d_output: Optional[float] = None) -> None:
    """
    Calls autodiff to fill in the derivatives for the history of this object.

    Args:
        d_output (number, opt): starting derivative to backpropagate through the model
                               (typically left out, and assumed to be 1.0).
    """
    if d_output is None:
        d_output = 1.0
    backpropagate(self, d_output)

is_leaf

is_leaf() -> bool

True if this variable was created by the user (no last_fn)

Source code in toydl/core/scalar/scalar.py
def is_leaf(self) -> bool:
    """True if this variable was created by the user (no `last_fn`)"""
    return self.history is not None and self.history.last_fn is None

requires_grad_

requires_grad_(flag: bool = True)

Set the requires_grad flag to `flag` on this variable.

Ensures that operations on this variable will trigger backpropagation.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `flag` | `bool` | whether to require grad | `True` |
Source code in toydl/core/scalar/scalar.py
def requires_grad_(self, flag: bool = True):
    """
    Set the requires_grad flag to `flag` on variable.

    Ensures that operations on this variable will trigger
    backpropagation.

    :param flag: whether to require grad
    """
    if flag:
        self.history = ScalarHistory()
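The interplay between `requires_grad_` and `is_leaf` can be seen in a tiny sketch (hypothetical names, assuming a `History` with a `last_fn` slot as in the source above): attaching a fresh, empty history turns a plain constant into a tracked leaf.

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class History:
    """Hypothetical stand-in for ScalarHistory."""
    last_fn: Optional[Any] = None

class Var:
    """Hypothetical variable showing the is_leaf / requires_grad_ interplay."""
    def __init__(self, history: Optional[History] = None):
        self.history = history

    def is_leaf(self) -> bool:
        # Leaf: has a history, but no function produced it.
        return self.history is not None and self.history.last_fn is None

    def requires_grad_(self, flag: bool = True) -> None:
        if flag:
            self.history = History()

v = Var()            # constant: no history at all
print(v.is_leaf())   # False
v.requires_grad_()   # attach an empty history -> tracked leaf
print(v.is_leaf())   # True
```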

toydl.core.scalar.scalar.ScalarFunction

A wrapper for a mathematical function that processes and produces Scalar variables.

This is a static class and is never instantiated. We use a class here to group together the forward and backward code.
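The forward/backward pairing can be sketched with a hypothetical minimal multiplication function (not toydl's actual implementation; `Ctx` here stands in for the Context described above): forward saves its inputs, and backward reads them back to apply the chain rule.

```python
from typing import Any, Tuple

class Ctx:
    """Hypothetical minimal context (see Context above)."""
    def __init__(self):
        self.saved_values: Tuple[Any, ...] = ()
    def save_for_backward(self, *values: Any) -> None:
        self.saved_values = values

class MulSketch:
    """Hypothetical forward/backward pair in the ScalarFunction style."""
    @staticmethod
    def forward(ctx: Ctx, x: float, y: float) -> float:
        ctx.save_for_backward(x, y)  # backward needs both inputs
        return x * y

    @staticmethod
    def backward(ctx: Ctx, d_output: float) -> Tuple[float, float]:
        x, y = ctx.saved_values
        # d(x*y)/dx = y and d(x*y)/dy = x, each scaled by the upstream derivative.
        return d_output * y, d_output * x

ctx = Ctx()
out = MulSketch.forward(ctx, 3.0, 4.0)
print(out)                          # 12.0
print(MulSketch.backward(ctx, 1.0)) # (4.0, 3.0)
```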

toydl.core.scalar.scalar.Sigmoid

Bases: ScalarFunction

Sigmoid function
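As a numeric aside, the sigmoid σ(x) = 1 / (1 + e^(-x)) has the convenient derivative σ'(x) = σ(x)(1 - σ(x)), which is what makes its backward pass cheap. A small hypothetical helper (not toydl's implementation) demonstrating both, using the standard numerically stable split by sign:

```python
import math

def sigmoid(x: float) -> float:
    """Hypothetical numerically stable sigmoid."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    # For large negative x, exp(-x) overflows; use the equivalent form instead.
    z = math.exp(x)
    return z / (1.0 + z)

s = sigmoid(0.0)
print(s)              # 0.5
print(s * (1.0 - s))  # derivative at 0: 0.25
```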