Welcome to NNGeometry’s documentation!¶
NNGeometry is a library built on top of PyTorch that provides tools to easily manipulate and study properties of Fisher Information Matrices and tangent kernels.
You can start by looking at the quick start example below. Convinced? Then install NNGeometry, try the tutorials or explore the API reference.
NNGeometry is under development; as such, core components may change between versions.
Computing the Fisher Information Matrix on a given PyTorch model using a KFAC representation, and then computing its trace is as simple as:
>>> F_kfac = FIM(model=model,
...              loader=loader,
...              representation=PMatKFAC,
...              n_output=10,
...              variant='classif_logits',
...              device='cuda')
>>> print(F_kfac.trace())
If we instead wanted to choose a nngeometry.object.pspace.PMatBlockDiag representation, we can simply pass representation=PMatBlockDiag in the call above.
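To make concrete what a Fisher Information Matrix and its trace are, here is a minimal sketch using only NumPy, independent of NNGeometry's API: it forms the exact Fisher matrix of a tiny logistic-regression model, F = E_x[p(1-p) x xᵀ] for binary classification, and takes its trace. All names here (w, X) are illustrative and not part of NNGeometry.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))        # 100 inputs, 5 features
w = rng.normal(size=5)               # model parameters

p = 1.0 / (1.0 + np.exp(-X @ w))     # predicted probabilities

# Per-sample Fisher contribution is p(1-p) x x^T; average over the dataset
F = (X * (p * (1 - p))[:, None]).T @ X / len(X)

trace_direct = np.trace(F)
# Equivalent without materializing F: mean of p(1-p) * ||x||^2
trace_fast = np.mean(p * (1 - p) * np.sum(X**2, axis=1))
```

Representations such as PMatKFAC exploit exactly this kind of structure to estimate quantities like the trace without ever storing the full (and usually enormous) matrix F.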
More notebook examples can be found at https://github.com/tfjgeorge/nngeometry/tree/master/examples