# DiffMatic.jl


Symbolic differentiation of vector/matrix/tensor expressions in Julia

## Example

Create a matrix and a vector:

```julia
using DiffMatic

@matrix A
@vector x
```

Create an expression:

```julia
expr = x' * A * x
```

The variable `expr` now contains an internal representation of the expression `x' * A * x`.

Compute the gradient and the Hessian with respect to the vector `x`:

```julia
g = gradient(expr, x)
H = hessian(expr, x)
```

Convert the gradient and the Hessian to standard notation using `to_std`:

```julia
to_std(g) # "Aᵀx + Ax"
to_std(H) # "Aᵀ + A"
```
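These standard-notation results can be sanity-checked numerically with plain Julia and finite differences, independently of DiffMatic. The sketch below uses a random `A` and `x` (illustrative assumptions, not part of the package API):

```julia
using LinearAlgebra

A = randn(3, 3)
x = randn(3)

f(v) = v' * A * v                     # the scalar expression x'Ax

# Central finite differences for the gradient of f at x.
h = 1e-6
e(i) = [j == i ? 1.0 : 0.0 for j in 1:3]
g_fd = [(f(x + h * e(i)) - f(x - h * e(i))) / (2h) for i in 1:3]

# Analytic results matching the standard notation above.
g = A' * x + A * x                    # "Aᵀx + Ax"
H = A' + A                            # "Aᵀ + A"
```

Here `g_fd` agrees with `g` up to finite-difference error, and `H` is symmetric, as `Aᵀ + A` must be.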

Jacobians can be computed with `jacobian`:

```julia
to_std(jacobian(A * x, x)) # "A"
```

The function `derivative` can be used to compute arbitrary derivatives:

```julia
to_std(derivative(tr(A), A)) # "I"
```
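The trace result can likewise be checked entry-by-entry with finite differences in plain Julia (a sketch with a random `A`, no DiffMatic required):

```julia
using LinearAlgebra

A = randn(3, 3)
h = 1e-6
D = zeros(3, 3)
for i in 1:3, j in 1:3
    E = zeros(3, 3)
    E[i, j] = 1.0
    # Central difference of tr at A in the direction of the (i, j) basis matrix.
    D[i, j] = (tr(A + h * E) - tr(A - h * E)) / (2h)
end
```

Since `tr` is linear, `D` matches the identity matrix up to roundoff.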

The function `to_std` will throw an exception when given an expression that cannot be converted to standard notation.

## Supported functions and operators

`+`, `-`, `'`, `*`, `^`, `abs`, `sin`, `cos`, `log`

- Element-wise operations `sin.`, `cos.`, `abs.`, `.*`, `.^` and `log.` are supported.
- Vector 1-norms and 2-norms can be computed with `LinearAlgebra.norm(..., 1)` and `LinearAlgebra.norm(..., 2)`.
- Sums of vectors can be computed with `sum`.
- Matrix traces can be computed with `LinearAlgebra.tr`.
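As a point of reference for the norm support above, the vector 2-norm has gradient `x / ‖x‖₂`, which a short pure-Julia finite-difference check confirms (random `x` is an illustrative assumption; DiffMatic is not needed here):

```julia
using LinearAlgebra

x = randn(4)
h = 1e-6
e(i) = [j == i ? 1.0 : 0.0 for j in 1:4]

# Finite-difference gradient of the vector 2-norm at x.
g_fd = [(norm(x + h * e(i), 2) - norm(x - h * e(i), 2)) / (2h) for i in 1:4]

# Analytic gradient of the 2-norm: x / ‖x‖₂.
g = x / norm(x, 2)
```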

## Installation

Install from the Julia General registry:

```julia
using Pkg; Pkg.add("DiffMatic")
```

## Acknowledgements

The implementation is based on the ideas presented in

S. Laue, M. Mitterreiter, and J. Giesen. Computing Higher Order Derivatives of Matrix and Tensor Expressions, NeurIPS 2018.
