
Go To ForwardDiff.jl's Documentation

Warning: Please read this issue before attempting nested differentiation with ForwardDiff.jl.

ForwardDiff.jl

ForwardDiff.jl implements methods to take derivatives, gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object, really) using forward mode automatic differentiation (AD).

While performance can vary depending on the functions you evaluate, the algorithms implemented by ForwardDiff.jl generally outperform non-AD algorithms in both speed and accuracy.
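As a rough illustration of the accuracy side of that claim, compare the AD derivative of sin against a hand-rolled finite difference (a hedged sketch; the helper fdiff below is defined here only for comparison and is not part of ForwardDiff.jl):

julia> import ForwardDiff

julia> fdiff(f, x; h = 1e-8) = (f(x + h) - f(x)) / h;  # naive forward finite difference, for comparison only

julia> ForwardDiff.derivative(sin, 1.0) - cos(1.0)  # AD recovers cos(1.0) exactly
0.0

julia> fdiff(sin, 1.0) - cos(1.0);  # finite differencing is typically off by roughly 1e-8 here (truncation + roundoff)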

Here's a simple example showing the package in action:

julia> import ForwardDiff

julia> f(x::Vector) = sum(sin, x) + prod(tan, x) * sum(sqrt, x);

julia> x = rand(5) # small size for example's sake
5-element Array{Float64,1}:
 0.986403
 0.140913
 0.294963
 0.837125
 0.650451

julia> g = ForwardDiff.gradient(f); # g = ∇f

julia> g(x)
5-element Array{Float64,1}:
 1.01358
 2.50014
 1.72574
 1.10139
 1.2445

julia> ForwardDiff.hessian(f, x)
5x5 Array{Float64,2}:
 0.585111  3.48083  1.7706    0.994057  1.03257
 3.48083   1.06079  5.79299   3.25245   3.37871
 1.7706    5.79299  0.423981  1.65416   1.71818
 0.994057  3.25245  1.65416   0.251396  0.964566
 1.03257   3.37871  1.71818   0.964566  0.140689

News
