Generalize the QuickCheck gradient vs derivative vs perturbation test and apply it to other neural networks. As a first step, factoring out the part inside conjoin would already be beneficial (a sketch follows below). The generalized tools could then be used as QuickCheck properties in large standalone neural network tests and also in CrossTesting (without QuickCheck) in place of the ad-hoc comparison of reverse and forward derivatives.
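A minimal sketch of what the factored-out property could look like, assuming hypothetical helpers for the objective, the reverse-mode gradient and the forward derivative (the real horde-ad API and parameter representation may differ):

```haskell
import Test.QuickCheck (Property, conjoin, counterexample)

-- | Check that the directional derivative recovered from the reverse-mode
-- gradient agrees, up to eps, with the forward derivative and with a
-- finite-difference perturbation of the objective.
gradVsDerivVsPerturbation
  :: ([Double] -> Double)              -- ^ objective function
  -> ([Double] -> [Double])            -- ^ reverse-mode gradient (assumed helper)
  -> ([Double] -> [Double] -> Double)  -- ^ forward derivative (assumed helper)
  -> Double                            -- ^ comparison epsilon
  -> [Double]                          -- ^ parameters
  -> [Double]                          -- ^ perturbation direction
  -> Property
gradVsDerivVsPerturbation f grad fwd eps params ds =
  conjoin
    [ counterexample "reverse gradient vs forward derivative" $
        abs (gradDotDs - fwdD) < eps
    , counterexample "forward derivative vs perturbation" $
        abs (fwdD - finiteDiff) < eps
    ]
 where
  -- Directional derivative recovered from the reverse-mode gradient.
  gradDotDs  = sum (zipWith (*) (grad params) ds)
  -- Directional derivative from forward mode.
  fwdD       = fwd params ds
  -- Crude finite-difference approximation of the same directional derivative.
  h          = 1e-6
  finiteDiff = (f (zipWith (\p d -> p + h * d) params ds) - f params) / h
```

A property of this shape could be instantiated per network inside a QuickCheck `forAll`, or called directly (with fixed inputs and no QuickCheck) from CrossTesting.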
The test could also use a cleanup, in particular by using the EqEpsilon machinery for comparing values up to epsilon, which will require generalizing EqEpsilon. That should also make it possible to uncomment the "Gradient is a linear function" part, which requires comparing arrays.
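One possible shape for the generalization is a type class covering both scalars and containers, so that whole gradients (lists are used below only for simplicity) can be compared up to epsilon; the class and function names here are illustrative and not necessarily what the existing EqEpsilon module uses:

```haskell
import Test.Tasty.HUnit (Assertion, assertBool)

class EqEpsilon a where
  eqEpsilon :: Double -> a -> a -> Bool

-- Scalars are compared by absolute difference.
instance EqEpsilon Double where
  eqEpsilon eps x y = abs (x - y) <= eps

-- Arrays (lists here, for simplicity) are equal up to epsilon when they
-- have the same length and agree element-wise.
instance EqEpsilon a => EqEpsilon [a] where
  eqEpsilon eps xs ys =
    length xs == length ys && and (zipWith (eqEpsilon eps) xs ys)

assertEqualUpToEpsilon :: (EqEpsilon a, Show a) => Double -> a -> a -> Assertion
assertEqualUpToEpsilon eps expected actual =
  assertBool ("expected " ++ show expected ++ " to be within " ++ show eps
              ++ " of " ++ show actual)
             (eqEpsilon eps expected actual)
```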