When `foo` is instantiated to matrices, which is again a trivial example because the arithmetic operations work element-wise on the arrays, the gradient is:
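
The concrete invocation and its printed result are elided in this excerpt; the following sketch shows how the gradient would be requested with `cgrad`, assuming `foo` takes the same triple of matrices (`threeSimpleMatrices`) used in the surrounding examples:

```hs
-- A sketch, not verbatim library output: `threeSimpleMatrices` is the
-- input triple from the earlier examples; `ssum0` sums all elements into
-- a rank-0 array and `kfromS` converts that to a scalar, as `cgrad`
-- requires a scalar codomain.
>>> cgrad (kfromS . ssum0 . foo) threeSimpleMatrices
```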

The most general symbolic gradient program can then be obtained using the `vjpArtifact` tool. We use `fooLet` without `ssum0` this time, because the `vjp` family of tools by convention permits non-scalar codomains (but expects an incoming cotangent argument to compensate, visible in the code as `dret`).
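
The `artifact` referenced below could be created along these lines; this is a sketch, assuming `vjpArtifact` takes the objective function together with a representative input that fixes the argument shapes:

```hs
-- Assumed invocation: build the symbolic gradient artifact for `fooLet`
-- at the shapes of `threeSimpleMatrices`; the exact signature may differ.
>>> let artifact = vjpArtifact fooLet threeSimpleMatrices
```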

With additional formatting, the gradient program below looks like an ordinary functional program, with many nested pairs and projections representing the tuples present in the objective function. A quick inspection of the gradient code reveals that computations are not repeated, thanks to the sharing mechanism, as promised.

```hs
>>> printArtifactPretty artifact
...
, (m4 * m5) * dret + m4 * dret)
```

A concrete value of the symbolic gradient at the same input as before can be obtained by interpreting the gradient program in the context of the operations supplied by the horde-ad library. The value is the same as for `fooLet` evaluated by `cgrad` on the same input, as long as the incoming cotangent argument consists of ones in all array cells, denoted here by `srepl 1`:
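
The interpreting call itself is elided in this excerpt; a minimal sketch follows, in which the entry-point name `vjpInterpretArtifact` and its argument order are assumptions rather than confirmed horde-ad API:

```hs
-- Hypothetical sketch: interpret the gradient artifact at the concrete
-- input, with a cotangent of all ones (`srepl 1`); the function name and
-- argument order are assumed, not taken from the library documentation.
>>> vjpInterpretArtifact artifact threeSimpleMatrices (srepl 1)
```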

A shorthand that creates the symbolic derivative program, simplifies it and interprets it with a given input on the default CPU backend is called `grad`. It is used exactly like `cgrad`, but often with much better performance:

```hs
>>> grad (kfromS . ssum0 . fooLet) threeSimpleMatrices
```