Note that `w` is processed only once during gradient computation: this preservation of sharing is guaranteed universally by horde-ad for the `crev` tool, with no action required from the user. When computing symbolic derivative programs, however, the user has to mark shared values explicitly using `tlet`, which requires a more specific type for the objective function, as shown below.
```hs
fooLet :: (RealFloatH (target a), LetTensor target)
       => (target a, target a, target a) -> target a
fooLet (x, y, z) =
  tlet (x * sin y) $ \w ->
    atan2H z w + z * w
```
The symbolic derivative program (here presented with additional formatting) can be obtained using the `revArtifactAdapt` tool.

A quick inspection of the derivative program reveals that computations are not repeated, thanks to sharing. A concrete value of the symbolic derivative can be obtained by interpreting the derivative program in the context of the operations supplied by the horde-ad library. The value should be the same as when evaluating `fooLet` with `crev` on the concrete input, as before. A shorthand that creates the symbolic derivative program and evaluates it at a given input is called `rev`; it is used in exactly the same way as `crev`, but with potentially better performance.
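As a rough sketch of this equivalence (the plain `Double` input triple and its automatic adaptation are assumptions carried over from the earlier examples, which are not shown in this excerpt):

```hs
-- Sketch only: assumes the tuple-of-Double adapters used in the earlier
-- `crev` example; the exact wrappers required by the current API may differ.
gradConcrete = crev fooLet (1.1, 2.2, 3.3)  -- direct, non-symbolic evaluation
gradSymbolic = rev  fooLet (1.1, 2.2, 3.3)  -- builds and interprets the symbolic derivative program
-- The two gradients should agree up to floating-point rounding.
```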
# WIP: The examples below are outdated and will be replaced soon using a new API
<!--
## Computing Jacobians
-- TODO: we can have vector/matrix/tensor codomains, but not pair codomains
-- until #68 is done;
-- perhaps a vector codomain example, with a 1000x3 Jacobian, would make sense?
-- 2 years later: actually, we can now have TKProduct codomains.
Now let's consider a function from `R^n` to `R^m`. We don't want the gradient, but instead the Jacobian.