Welcome to NNGeometry’s documentation!¶
NNGeometry is a library built on top of PyTorch that provides tools to easily manipulate and study properties of Fisher Information Matrices and tangent kernels.
You can start by looking at the quick start example below. Convinced? Then install NNGeometry, try the tutorials or explore the API reference.
Warning
NNGeometry is under development; as such, core components may change between versions.
Quick example¶
Computing the Fisher Information Matrix on a given PyTorch model using a KFAC representation, and then computing its trace is as simple as:
>>> F_kfac = FIM(model=model,
...              loader=loader,
...              representation=PMatKFAC,
...              n_output=10,
...              variant='classif_logits',
...              device='cuda')
>>> print(F_kfac.trace())
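The reason a KFAC representation makes operations such as the trace cheap is that each layer's Fisher block is stored as a Kronecker product of two small factors, and the trace of a Kronecker product factorizes into the product of the factors' traces. A minimal NumPy sketch of that identity (the factor sizes and variable names below are illustrative, not NNGeometry's internals):

```python
import numpy as np

# KFAC approximates a layer's Fisher block as a Kronecker product A (x) B
# of two small factors. The trace then never requires forming the dense
# block, because trace(A (x) B) = trace(A) * trace(B).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); A = A @ A.T   # small symmetric factor
B = rng.standard_normal((3, 3)); B = B @ B.T   # small symmetric factor

dense = np.kron(A, B)                          # full 12x12 block, for checking only
assert np.isclose(np.trace(dense), np.trace(A) * np.trace(B))
print(np.trace(A) * np.trace(B))
```

The same factorization underlies other cheap KFAC operations, such as matrix-vector products computed factor by factor.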
If we instead want a nngeometry.object.pspace.PMatBlockDiag representation, we can simply replace representation=PMatKFAC with representation=PMatBlockDiag in the call above.
This example is further detailed in Quick example. Other available parameter space representations are listed in Parameter space representations.
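The practical trade-off between representations is mostly memory: for a fully connected layer with d_in inputs and d_out outputs (d = d_in * d_out weights, ignoring bias), a block-diagonal representation stores a dense d x d block, whereas a KFAC representation stores two factors of sizes d_in x d_in and d_out x d_out. A back-of-the-envelope comparison (the layer sizes below are made up for illustration):

```python
# Rough storage count for one fully connected layer's Fisher block,
# ignoring bias terms. Sizes are illustrative only.
d_in, d_out = 784, 256
d = d_in * d_out                      # number of weights in the layer

dense_entries = d * d                 # dense block-diagonal block: d x d
kfac_entries = d_in**2 + d_out**2     # KFAC: two small factors

print(dense_entries // kfac_entries)  # savings ratio for this layer
```

For layers of realistic width the dense block quickly becomes infeasible to store, which is why Kronecker-factored representations are the usual choice for whole-network Fisher matrices.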
More examples¶
More notebook examples can be found at https://github.com/tfjgeorge/nngeometry/tree/master/examples