Solvers
The three basic verification methods are "reachability", "optimization", and "search". These are further divided into the five categories listed below. Note that all of the optimization methods use the JuMP.jl library.
Reachability-Based Methods
These methods perform exact or approximate reachability analysis to determine the output set corresponding to a given input set. In addition, MaxSens, which computes lower and upper bounds for each layer, is called within other solver types in the form of get_bounds.
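As a minimal sketch of the common reachability workflow (the network file path and the specific input/output sets are illustrative assumptions, not an example shipped with the package):

using NeuralVerification

# Pose a reachability query: does every input in [-1, 1] map into {y : y ≤ 100}?
nnet    = read_nnet("path/to/network.nnet")   # placeholder path
A_in    = reshape([1.0, -1.0], 2, 1)
input   = HPolytope(A_in, [1.0, 1.0])         # the interval [-1, 1] as an HPolytope
output  = HPolytope(ones(1, 1), [100.0])      # the half-space y ≤ 100
problem = Problem(nnet, input, output)

result = solve(ExactReach(), problem)         # returns a ReachabilityResult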
ExactReach
NeuralVerification.ExactReach — Type
ExactReach performs exact reachability analysis to compute the output reachable set for a network.
Problem requirement
- Network: any depth, ReLU activation
- Input: HPolytope
- Output: AbstractPolytope
Return
ReachabilityResult
Method
Exact reachability analysis.
Property
Sound and complete.
Reference
W. Xiang, H.-D. Tran, and T. T. Johnson, "Reachable Set Computation and Safety Verification for Neural Networks with ReLU Activations," ArXiv Preprint ArXiv:1712.08163, 2017.
Ai2
NeuralVerification.Ai2 — Type
Ai2{T}
Ai2 performs over-approximated reachability analysis to compute the over-approximated output reachable set for a network. T can be Hyperrectangle, Zonotope, or HPolytope, and determines the amount of over-approximation (and hence also the performance tradeoff). The original implementation (from [1]) uses Zonotopes, so we consider this the "benchmark" case. The HPolytope case is more precise but slower, and the opposite is true of the Hyperrectangle case.
Note that initializing Ai2() defaults to Ai2{Zonotope}. The following aliases also exist for convenience:
const Ai2h = Ai2{HPolytope}
const Ai2z = Ai2{Zonotope}
const Box = Ai2{Hyperrectangle}
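For instance, a quick sketch of choosing among the aliases (only the constructors above are assumed):

# Trade precision for speed by picking the abstract domain:
fast_solver    = Box()    # Ai2{Hyperrectangle}: coarsest, fastest
default_solver = Ai2()    # Ai2{Zonotope}: the benchmark setting
precise_solver = Ai2h()   # Ai2{HPolytope}: tightest, slowest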
Problem requirement
- Network: any depth, ReLU activation (more activations to be supported in the future)
- Input: AbstractPolytope
- Output: AbstractPolytope
Return
ReachabilityResult
Method
Reachability analysis using split and join.
Property
Sound but not complete.
Reference
[1] T. Gehr, M. Mirman, D. Drachsler-Cohen, P. Tsankov, S. Chaudhuri, and M. Vechev, "Ai2: Safety and Robustness Certification of Neural Networks with Abstract Interpretation," in 2018 IEEE Symposium on Security and Privacy (SP), 2018.
Note
Efficient over-approximation of intersections and unions involving zonotopes relies on Theorem 3.1 of
[2] Singh, G., Gehr, T., Mirman, M., Püschel, M., & Vechev, M. (2018). Fast and effective robustness certification. In Advances in Neural Information Processing Systems (pp. 10802-10813).
MaxSens
NeuralVerification.MaxSens — Type
MaxSens(resolution::Float64, tight::Bool)
MaxSens performs over-approximated reachability analysis to compute the over-approximated output reachable set for a network.
Problem requirement
- Network: any depth, any activation that is monotone
- Input: Hyperrectangle or HPolytope
- Output: AbstractPolytope
Return
ReachabilityResult
Method
First partition the input space into small grid cells according to resolution. Then use interval arithmetic to compute the reachable set for each cell. Two versions of interval arithmetic are implemented, selected by the tight flag. Default resolution is 1.0. Default tight = false.
Property
Sound but not complete.
Reference
W. Xiang, H.-D. Tran, and T. T. Johnson, "Output Reachable Set Estimation and Verification for Multi-Layer Neural Networks," ArXiv Preprint ArXiv:1708.03322, 2017.
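A construction sketch (positional arguments follow the signature above; the chosen values are illustrative):

# A finer grid gives tighter bounds but more cells to propagate.
solver = MaxSens(0.3, true)   # resolution = 0.3, tight = true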
ReluVal
NeuralVerification.ReluVal — Type
ReluVal(max_iter::Int64, tree_search::Symbol)
ReluVal combines symbolic reachability analysis with iterative interval refinement to minimize over-approximation of the reachable set.
Problem requirement
- Network: any depth, ReLU activation
- Input: Hyperrectangle
- Output: LazySet
Return
CounterExampleResult
Method
Symbolic reachability analysis and iterative interval refinement (search).
- max_iter: default 10.
- tree_search: default :DFS (depth-first search).
Property
Sound but not complete.
Reference
S. Wang, K. Pei, J. Whitehouse, J. Yang, and S. Jana, "Formal Security Analysis of Neural Networks Using Symbolic Intervals," ArXiv Preprint ArXiv:1804.10829, 2018.
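A construction sketch (positional arguments follow the signature above; the iteration budget is illustrative, and only the documented :DFS search mode is assumed):

# Allow more interval-refinement iterations than the default 10.
solver = ReluVal(50, :DFS)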
Neurify
The docstring for Neurify is missing from this build. Neurify extends ReluVal's symbolic interval analysis with symbolic linear relaxation and directed constraint refinement; see S. Wang, K. Pei, J. Whitehouse, J. Yang, and S. Jana, "Efficient Formal Safety Analysis of Neural Networks," in Advances in Neural Information Processing Systems, 2018.
FastLin
NeuralVerification.FastLin — Type
FastLin(maxIter::Int64, ϵ0::Float64, accuracy::Float64)
FastLin combines reachability analysis with binary search to find maximum allowable disturbance.
Problem requirement
- Network: any depth, ReLU activation
- Input: hypercube
- Output: AbstractPolytope
Return
AdversarialResult
Method
Reachability analysis by network approximation and binary search.
- maxIter is the maximum number of search iterations; default 10.
- ϵ0 is the initial search radius; default 100.0.
- accuracy is the stopping criterion; default 0.1.
Property
Sound but not complete.
Reference
T.-W. Weng, H. Zhang, H. Chen, Z. Song, C.-J. Hsieh, D. Boning, I. S. Dhillon, and L. Daniel, "Towards Fast Computation of Certified Robustness for ReLU Networks," ArXiv Preprint ArXiv:1804.09699, 2018.
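A construction sketch (positional arguments follow the signature above; the values shown are the documented defaults):

solver = FastLin(10, 100.0, 0.1)   # maxIter, ϵ0, accuracy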
FastLip
NeuralVerification.FastLip — Type
FastLip(maxIter::Int64, ϵ0::Float64, accuracy::Float64)
FastLip adds Lipschitz estimation on top of FastLin.
Problem requirement
- Network: any depth, ReLU activation
- Input: hypercube
- Output: halfspace
Return
AdversarialResult
Method
Lipschitz estimation + FastLin. All arguments are passed through to FastLin.
- maxIter is the maximum number of search iterations; default 10.
- ϵ0 is the initial search radius; default 100.0.
- accuracy is the stopping criterion; default 0.1.
Property
Sound but not complete.
Reference
T.-W. Weng, H. Zhang, H. Chen, Z. Song, C.-J. Hsieh, D. Boning, I. S. Dhillon, and L. Daniel, "Towards Fast Computation of Certified Robustness for ReLU Networks," ArXiv Preprint ArXiv:1804.09699, 2018.
DLV
NeuralVerification.DLV — Type
DLV(ϵ::Float64)
DLV searches layer by layer for counterexamples in hidden layers.
Problem requirement
- Network: any depth, any activation (currently only ReLU is supported)
- Input: Hyperrectangle
- Output: AbstractPolytope
Return
CounterExampleResult
Method
The following operations are performed layer by layer. For layer i:
- determine a reachable set from the reachable set in layer i-1
- determine a search tree in the reachable set by refining the search tree in layer i-1
- verify
  - True -> continue to layer i+1
  - False -> counterexample
The argument ϵ is the resolution of the initial search tree. Default 1.0.
Property
Sound but not complete.
Reference
X. Huang, M. Kwiatkowska, S. Wang, and M. Wu, "Safety Verification of Deep Neural Networks," in International Conference on Computer Aided Verification (CAV), 2017.
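A construction sketch (the ϵ value is illustrative):

solver = DLV(0.5)   # a finer initial search-tree resolution than the default 1.0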
Optimization-Based Methods
Example
using NeuralVerification

nnet = read_nnet("../../examples/networks/small_nnet.nnet")
input = Hyperrectangle([0.0], [0.5])     # center 0.0, radius 0.5
output = HPolytope(ones(1, 1), [102.5])  # the half-space y ≤ 102.5
problem = Problem(nnet, input, output)
# set the JuMP solver with `optimizer` keyword or use default:
solver = MIPVerify(optimizer = GLPKSolverMIP())
solve(solver, problem)
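The call returns a result object whose status field reports the outcome; a sketch of inspecting it (the :holds/:violated symbols are assumed here as the package's conventional outcomes):

result = solve(solver, problem)
if result.status == :holds
    println("property verified")
elseif result.status == :violated
    println("property violated")
else
    println("inconclusive")
end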
NSVerify
NeuralVerification.NSVerify — Type
NSVerify(optimizer, m::Float64)
NSVerify finds counterexamples using mixed integer linear programming.
Problem requirement
- Network: any depth, ReLU activation
- Input: Hyperrectangle or HPolytope
- Output: PolytopeComplement
Return
CounterExampleResult
Method
MILP encoding (using m). No presolve. Default optimizer is GLPKSolverMIP(). Default m is 1000.0 (should be large enough to avoid approximation error).
Property
Sound and complete.
Reference
A. Lomuscio and L. Maganti, "An Approach to Reachability Analysis for Feed-Forward ReLU Neural Networks," ArXiv Preprint ArXiv:1706.07351, 2017.
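To illustrate the encoding, here is a minimal sketch of the standard big-M formulation of a single ReLU, written against the legacy JuMP/MathProgBase API that GLPKSolverMIP() belongs to (this is not code from the package itself):

using JuMP, GLPKMathProgInterface

model = Model(solver = GLPKSolverMIP())
M = 1000.0                           # plays the role of the solver's m

@variable(model, -1.0 <= x <= 1.0)   # pre-activation value (bounds assumed known)
@variable(model, z >= 0)             # post-activation value z = max(x, 0)
@variable(model, δ, Bin)             # δ = 1 iff the unit is active

@constraint(model, z >= x)                 # z never undershoots x
@constraint(model, z <= x + M * (1 - δ))   # active case:   δ = 1 forces z = x
@constraint(model, z <= M * δ)             # inactive case: δ = 0 forces z = 0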
MIPVerify
NeuralVerification.MIPVerify — Type
MIPVerify(optimizer)
MIPVerify computes maximum allowable disturbance using mixed integer linear programming.
Problem requirement
- Network: any depth, ReLU activation
- Input: Hyperrectangle
- Output: PolytopeComplement
Return
AdversarialResult
Method
MILP encoding. Presolve is used to compute tight node-wise bounds first. Default optimizer is GLPKSolverMIP().
Property
Sound and complete.
Reference
V. Tjeng, K. Xiao, and R. Tedrake, "Evaluating Robustness of Neural Networks with Mixed Integer Programming," ArXiv Preprint ArXiv:1711.07356, 2017.
ILP
NeuralVerification.ILP — Type
ILP(optimizer, max_iter)
ILP iteratively solves a linearized primal optimization to compute the maximum allowable disturbance, iteratively adding linear constraints to the problem.
Problem requirement
- Network: any depth, ReLU activation
- Input: Hyperrectangle
- Output: PolytopeComplement
Return
AdversarialResult
Method
Iteratively solve a linear encoding of the problem, considering only the linear piece of the network that has the same activation pattern as the reference input. Default optimizer is GLPKSolverMIP(). Both an iterative and a non-iterative method are provided for solving the LP problem. Default iterative is true.
Property
Sound but not complete.
Reference
O. Bastani, Y. Ioannou, L. Lampropoulos, D. Vytiniotis, A. Nori, and A. Criminisi, "Measuring Neural Net Robustness with Constraints," in Advances in Neural Information Processing Systems, 2016.
Duality
NeuralVerification.Duality — Type
Duality(optimizer)
Duality uses Lagrangian relaxation to compute over-approximated bounds for a network.
Problem requirement
- Network: any depth, any activation function that is monotone
- Input: Hyperrectangle
- Output: halfspace
Return
BasicResult
Method
Lagrangian relaxation. Default optimizer is GLPKSolverMIP().
Property
Sound but not complete.
Reference
K. Dvijotham, R. Stanforth, S. Gowal, T. Mann, and P. Kohli, "A Dual Approach to Scalable Verification of Deep Networks," ArXiv Preprint ArXiv:1803.06567, 2018.
ConvDual
NeuralVerification.ConvDual — Type
ConvDual uses convex relaxation to compute over-approximated bounds for a network.
Problem requirement
- Network: any depth, ReLU activation
- Input: hypercube
- Output: halfspace
Return
BasicResult
Method
Convex relaxation with duality.
Property
Sound but not complete.
Reference
E. Wong and J. Z. Kolter, "Provable Defenses against Adversarial Examples via the Convex Outer Adversarial Polytope," ArXiv Preprint ArXiv:1711.00851, 2017.
Certify
NeuralVerification.Certify — Type
Certify(optimizer)
Certify uses semidefinite programming to compute over-approximated certificates for a neural network with only one hidden layer.
Problem requirement
- Network: one hidden layer, any activation that is differentiable almost everywhere and whose derivative is bounded between 0 and 1
- Input: hypercube
- Output: halfspace
Return
BasicResult
Method
Semidefinite programming. Default optimizer is SCSSolver().
Property
Sound but not complete.
Reference
A. Raghunathan, J. Steinhardt, and P. Liang, "Certified Defenses against Adversarial Examples," ArXiv Preprint ArXiv:1801.09344, 2018.
Sherlock
NeuralVerification.Sherlock — Type
Sherlock(optimizer, ϵ::Float64)
Sherlock combines local and global search to estimate the range of the output node.
Problem requirement
- Network: any depth, ReLU activation, single output
- Input: HPolytope or Hyperrectangle
- Output: hyperrectangle (1d interval)
Return
CounterExampleResult
or ReachabilityResult
Method
Local search: solve a linear program to find local optima on a line segment of the piecewise-linear network. Global search: solve a feasibility problem using an MILP encoding (by default calling NSVerify).
- optimizer: default GLPKSolverMIP().
- ϵ is the margin for global search; default 0.1.
Property
Sound but not complete.
Reference
S. Dutta, S. Jha, S. Sankaranarayanan, and A. Tiwari, "Output Range Analysis for Deep Neural Networks," ArXiv Preprint ArXiv:1709.09130, 2017.
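A construction sketch (positional arguments follow the signature above; the ϵ value is illustrative):

# A smaller margin ϵ yields a tighter output-range estimate at higher cost.
solver = Sherlock(GLPKSolverMIP(), 0.01)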
BaB
NeuralVerification.BaB — Type
BaB(optimizer, ϵ::Float64)
BaB uses branch and bound to estimate the range of the output node.
Problem requirement
- Network: any depth, ReLU activation, single output
- Input: Hyperrectangle
- Output: hyperrectangle (1d interval)
Return
CounterExampleResult
or ReachabilityResult
Method
Branch and bound. For branching, it uses iterative interval refinement. For bounding, it computes concrete bounds by sampling and approximate bounds by optimization.
- optimizer: default GLPKSolverMIP().
- ϵ is the desired accuracy for termination; default 0.1.
Property
Sound and complete.
Reference
R. Bunel, I. Turkaslan, P. H. S. Torr, P. Kohli, and M. P. Kumar, "A Unified View of Piecewise Linear Neural Network Verification," ArXiv Preprint ArXiv:1711.00455, 2017.
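A sketch of posing a range-estimation query for BaB (the interval bounds are illustrative, and nnet stands for a network loaded as in the earlier example; Hyperrectangle's low/high keyword constructor comes from LazySets):

input   = Hyperrectangle(low = [-1.0], high = [1.0])      # input box
output  = Hyperrectangle(low = [-100.0], high = [100.0])  # candidate 1-d output range
problem = Problem(nnet, input, output)
result  = solve(BaB(GLPKSolverMIP(), 0.1), problem)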
Planet
NeuralVerification.Planet — Type
Planet(optimizer, eager::Bool)
Planet integrates a SAT solver (PicoSAT.jl) to find an activation pattern that maps a feasible input to an infeasible output.
Problem requirement
- Network: any depth, ReLU activation
- Input: Hyperrectangle or bounded HPolytope
- Output: PolytopeComplement
Return
BasicResult
Method
Binary search over activations (0/1) with pruning by optimization. Our implementation is non-eager.
- optimizer: default GLPKSolverMIP().
- eager: default false.
Property
Sound and complete.
Reference
R. Ehlers, "Formal Verification of Piece-Wise Linear Feed-Forward Neural Networks," in International Symposium on Automated Technology for Verification and Analysis, 2017.
Reluplex
NeuralVerification.Reluplex — Type
Reluplex(optimizer, eager::Bool)
Reluplex uses binary tree search to find an activation pattern that maps a feasible input to an infeasible output.
Problem requirement
- Network: any depth, ReLU activation
- Input: Hyperrectangle
- Output: PolytopeComplement
Return
CounterExampleResult
Method
Binary tree search over activations (0/1) with pruning by optimization.
Property
Sound and complete.
Reference
G. Katz, C. Barrett, D. L. Dill, K. Julian, and M. J. Kochenderfer, "Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks," in International Conference on Computer Aided Verification, 2017.