JuMP is a domain-specific modeling language for mathematical programming embedded in Julia. It currently supports a number of open-source and commercial solvers (COIN Clp, COIN Cbc, GNU GLPK, and Gurobi) via a generic solver-independent interface provided by the MathProgBase package. One of the best features of JuMP is its speed: benchmarking has shown that it can create problems at speeds similar to special-purpose modeling languages such as AMPL, while maintaining the expressiveness of a generic high-level programming language.

If you are familiar with Julia you can get started quickly by using the package manager to install JuMP:

```
julia> Pkg.add("JuMP")
```

And a solver, e.g.:

```
julia> Pkg.add("Clp") # Will install Cbc as well
```

Then read the *Quick Start Guide* and/or see a *Simple Example*. We also
have details of the functions and types defined by JuMP.
If you are new to Julia or want more details, read on to the next section.

This guide will briefly walk you through installing Julia, JuMP, and a solver of your choice.

At the time of writing, the latest release of Julia is version `0.2`, which is the version required by JuMP. You can easily build from source on OS X and Linux, but the binaries will work well for most people.

Download links and more detailed instructions are available on the Julia website.

Once you’ve installed Julia, installing JuMP is simple. Julia has a git-based package system. To use it, open Julia in interactive mode (i.e. `julia` at the command line) and use the package manager:

```
julia> Pkg.add("JuMP")
```

This command checks METADATA.jl to determine what the most recent version of JuMP is and then downloads it from its repository on GitHub.

Solver support in Julia is currently provided by solver-specific packages, each a very thin wrapper around the solver's C interface that exposes a standard interface JuMP can call. If you are interested in providing an interface to your solver, please get in touch. We currently have interfaces for COIN-OR, Gurobi, GNU GLPK, and CPLEX.

Support for Clp and Cbc is provided via CoinMP and the Cbc.jl and Clp.jl packages. You can install these solvers through the package manager:

```
julia> Pkg.add("Cbc")
julia> Pkg.add("Clp") # Clp depends on Cbc, so installing Clp first
                      # will install both.
```

Regarding the CoinMP binary itself, installation will differ by platform:

- Linux - The only option is to build from source, which will happen automatically.
- OS X - A binary is downloaded via the Homebrew.jl package.
- Windows - A binary is downloaded; the Visual Studio 2012 redistributable is required if not already installed. **Only 32-bit versions of Julia are supported by the COIN solvers at this time.** The 32-bit version of Julia can be used on 64-bit Windows with no issues.

Clp and Cbc, if available, are the default choice of solver in JuMP.

Gurobi is an excellent high-performance commercial solver. It supports quadratic objectives and constraints, and is currently the only solver supported by Julia/JuMP with that functionality. Install Gurobi as you normally would and then add the Gurobi.jl package:

```
julia> Pkg.add("Gurobi")
```

Warning

If you are using 64-bit Gurobi, you must use 64-bit Julia (and similarly with 32-bit Gurobi).

The Gurobi package README contains examples of how to use Gurobi within JuMP.

Installing GLPK is a bit more involved than can be covered here - see the documentation for more information.

This quick start guide will introduce the main concepts of JuMP.
If you are familiar with another modeling language embedded in a high-level
language such as PuLP (Python) or a solver-specific interface you will find
most of this familiar, with the exception of *macros*. A deep understanding
of macros is not essential, but if you would like to know more please see
the Julia documentation.
If you are coming from an AMPL or similar background, you may find some of
the concepts novel but the general appearance will still be familiar.

**Models** are Julia objects. They are created by calling the constructor:

```
m = Model()
```

All variables and constraints are associated with a `Model` object. For
a list of all functions related to `Model`, including how to change the
default solver and set solver parameters, see *Models*.

**Variables** are also Julia objects, and are defined using the `@defVar`
macro. The first argument will always be the `Model` to associate this
variable with. In the examples below we assume `m` is already defined.
The second argument is an expression that declares the variable name and
optionally allows specification of lower and upper bounds. For example:

```
@defVar(m, x ) # No bounds
@defVar(m, x >= lb ) # Lower bound only (note: 'lb <= x' is not valid)
@defVar(m, x <= ub ) # Upper bound only
@defVar(m, lb <= x <= ub ) # Lower and upper bounds
```

All these variations introduce a new variable `x` in the local scope.
The names of your variables must be valid Julia variable names.
For information about common operations on variables, e.g. changing their
bounds, see the *Variables* section.

**Integer** and **binary** restrictions can optionally be specified with a
third argument, `Int` or `Bin`.
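For example (a sketch using the same `@defVar` syntax as above; assumes `m` is already defined):

```
@defVar(m, x, Int )             # Integer variable, no bounds
@defVar(m, y, Bin )             # Binary (0-1) variable
@defVar(m, 0 <= z <= 10, Int )  # Bounded integer variable
```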

To create arrays of variables we append brackets to the variable name. For example:

```
@defVar(m, x[1:M,1:N] >= 0 )
```

will create an `M` by `N` array of variables. Both ranges and arbitrary
iterable sets are supported as index sets. Currently we only support ranges
of the form `a:b` where `a` is an explicit integer, not a variable.
Using ranges will generally be faster than using arbitrary symbols. You can
mix both ranges and lists of symbols, as in the following example:

```
s = ["Green", "Blue"]
@defVar(m, x[-10:10,s], Int )
# e.g. x[-4, "Green"]
```

Finally, bounds can depend on variable indices:

```
@defVar(m, x[i=1:10] >= i )
```

JuMP allows users to describe linear expressions in natural notation. There are two ways to do so. The first is very similar to other modeling languages and has no restrictions. The second uses Julia's powerful metaprogramming features to get excellent performance even for large problems, but has some restrictions on how it can be used.

To add constraints and set the objective in the first way, use the `addConstraint()` and `setObjective()`
functions, e.g.:

```
setObjective(m, :Max, 5x + 22y + (x+y)/2) # or :Min
addConstraint(m, y + z == 4) # Other options: <= and >=
```

The second way is visually very similar, and uses the `@addConstraint` and `@setObjective`
macros, e.g.:

```
@addConstraint(m, x[i] - s[i] <= 0)
@setObjective(m, Max, sum{x[i], i=1:numLocation} )
```

Note

The `sense` passed to `setObjective` must be a symbol type: `:Min` or `:Max`.
The `@setObjective` macro accepts `:Min` and `:Max`, as well as `Min` and `Max` (without the colon) directly.
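For example, the following three statements are equivalent (a sketch; assumes `m`, `x`, and `y` are already defined):

```
setObjective(m, :Max, 5x + 3y)   # function form: the symbol is required
@setObjective(m, Max, 5x + 3y)   # macro form: bare Max is accepted
@setObjective(m, :Max, 5x + 3y)  # ...as is the symbol form
```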

There is one key restriction on the form of the expression in the second case:
*if there is a product between coefficients and variables, the variables must appear last*.
That is, Coefficient times Variable is good, but Variable times Coefficient is bad:

```
@addConstraint(m, x[i]*5 >= 2) # Causes an error
@addConstraint(m, 5*x[i] >= 2) # No problems
```

However, division by constants is supported.
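For example (a sketch; assumes `m` and the variables are already defined):

```
@addConstraint(m, x[i]/2 + 5*y/3 <= 2)  # Division by constants is fine
```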

You may have noticed the special `sum{}` operator above. It is available only in
the macro forms (`@addConstraint` and `@setObjective`). The syntax is of the form:

```
sum{expression, i = I1, j = I2, ...}
```

which is equivalent to:

```
a = AffExpr() # Create a new empty affine expression
for i = I1
    for j = I2
        ...
        a += expression
        ...
    end
end
```

You can also put a condition in:

```
sum{expression, i = I1, j = I2, ...; cond}
```

which is equivalent to:

```
a = AffExpr()
for i = I1
    for j = I2
        ...
        if cond
            a += expression
        end
        ...
    end
end
```
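For example, a condition can be used to skip the diagonal terms of a square array of variables (a sketch; assumes `m` is defined and `x` is indexed over `1:N` twice):

```
@addConstraint(m, sum{x[i,j], i=1:N, j=1:N; i != j} <= 1)
```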

In this section we will construct a simple model and explain every step along the way.
There are more complex examples in the `JuMP/examples/` folder. Here is the code we will walk through:

```
using JuMP
m = Model()
@defVar(m, 0 <= x <= 2 )
@defVar(m, 0 <= y <= 30 )
@setObjective(m, Max, 5x + 3*y )
@addConstraint(m, 1x + 5y <= 3.0 )
print(m)
status = solve(m)
println("Objective value: ", getObjectiveValue(m))
println("x = ", getValue(x))
println("y = ", getValue(y))
```

Once JuMP is *installed*, to use JuMP in your
programs, you just need to say:

```
using JuMP
```

Models are created with the `Model()` function:

```
m = Model()
```

Note

Your model doesn’t have to be called m - it’s just a name.

There are a few options for defining a variable, depending on whether you want
to have lower bounds, upper bounds, both bounds, or even no bounds. The following
commands will create two variables, `x` and `y`, with both lower and upper bounds.
Note that the first argument is our model variable `m`. These variables are associated
with this model and cannot be used in another model:

```
@defVar(m, 0 <= x <= 2 )
@defVar(m, 0 <= y <= 30 )
```

Next we’ll set our objective. Note again the `m`, so we know which model’s
objective we are setting! The objective sense, `Max` or `Min`, should
be provided as the second argument. Note also that we don’t have a multiplication `*`
symbol between 5 and our variable `x` - Julia is smart enough to not need it!
Feel free to stick with `*` if it makes you feel more comfortable, as we have
done with `3*y`:

```
@setObjective(m, Max, 5x + 3*y )
```

Adding constraints is a lot like setting the objective. Here we create a
less-than-or-equal-to constraint using `<=`, but we can also create equality
constraints using `==` and greater-than-or-equal-to constraints with `>=`:

```
@addConstraint(m, 1x + 5y <= 3.0 )
```

If you want to see what your model looks like in a human-readable format,
the `print` function is defined for models.

```
print(m)
```

Models are solved with the `solve()` function. This function will not raise
an error if your model is infeasible - instead it will return a flag. In this
case, the model is feasible so the value of `status` will be `:Optimal`,
where `:` again denotes a symbol. The possible values of `status`
are described in the MathProgBase documentation.

```
status = solve(m)
```

Finally, we can access the results of our optimization. Getting the objective value is simple:

```
println("Objective value: ", getObjectiveValue(m))
```

To get the value from a variable, we call the `getValue()` function. If `x`
is not a single variable, but instead a range of variables, `getValue()` will
return a list. In this case, however, it will just return a single value.

```
println("x = ", getValue(x))
println("y = ", getValue(y))
```

`Model` is a type defined by JuMP. All variables and constraints are
associated with a `Model` object. It has a constructor that has no
required arguments:

```
m = Model()
```

The constructor also accepts an optional keyword argument, `solver`,
which can be used to change the default solver behavior.

`solver` must be an `AbstractMathProgSolver` object, which is constructed as follows:

```
solver = solvername(Option1=Value1, Option2=Value2, ...)
```

where `solvername` is one of the supported LP solvers (`ClpSolver`, `GLPKSolverLP`, and `GurobiSolver`) or MIP solvers (`CbcSolver`, `GLPKSolverMIP`, and `GurobiSolver`). To use these objects, the corresponding modules (`Clp`, `Cbc`, `GLPKMathProgInterface`, and `Gurobi`) must be first loaded. All options are solver-dependent; see corresponding solver packages for more information.

Note

Be sure that the solver provided supports the problem class of the model. For example `ClpSolver` and `GLPKSolverLP` support only linear programming problems. `CbcSolver` and `GLPKSolverMIP` support only mixed-integer programming problems. `GurobiSolver` supports both classes as well as problems with quadratic objectives and/or constraints.

As an example, we can create a `Model` object that will use GLPK’s
exact solver for LPs as follows:

```
m = Model(solver = GLPKSolverLP(method=:Exact))
```

**General**

- `getNumVars(m::Model)` - returns the number of variables associated with the `Model m`.
- `getNumConstraints(m::Model)` - returns the number of constraints associated with the `Model m`.
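A small sketch of these accessors:

```
m = Model()
@defVar(m, x[1:3] >= 0)
@addConstraint(m, x[1] + x[2] + x[3] <= 1)
println(getNumVars(m))        # number of variables (here, 3)
println(getNumConstraints(m)) # number of constraints (here, 1)
```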

**Objective**

- `getObjective(m::Model)` - returns the objective function as a `QuadExpr`.
- `setObjective(m::Model, sense::Symbol, a::AffExpr)`, `setObjective(m::Model, sense::Symbol, q::QuadExpr)` - sets the objective function to `a` or `q` respectively, with the given objective sense, which must be either `:Min` or `:Max`.
- `getObjectiveSense(m::Model)` - returns the objective sense, either `:Min` or `:Max`.
- `setObjectiveSense(m::Model, newSense::Symbol)` - sets the objective sense (`newSense` is either `:Min` or `:Max`).
- `getObjectiveValue(m::Model)` - returns the objective value after a call to `solve`.

**Output**

- `writeLP(m::Model, filename::String)` - write the model to `filename` in the LP file format.
- `writeMPS(m::Model, filename::String)` - write the model to `filename` in the MPS file format.
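For example, to write a model to disk in both formats (the filenames here are arbitrary):

```
writeLP(m, "mymodel.lp")   # LP file format
writeMPS(m, "mymodel.mps") # MPS file format
```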

Quadratic objectives are supported by JuMP, but currently the only supported
solver is `Gurobi`. Another limitation is that the `@setObjective` macro
**does not yet support quadratic terms**; instead, use the (slower)
`setObjective` function:

```
m = Model()
@defVar(m, 0 <= x <= 2 )
@defVar(m, 0 <= y <= 30 )
setObjective(m, :Min, x*x + 2x*y + y*y ) # Cannot use macro
@addConstraint(m, x + y >= 1 )
print(m)
status = solve(m)
```

Variables, also known as columns or decision variables, are the results of the optimization.

The primary way to create variables is with the `@defVar` macro.
The first argument will always be a `Model`. In the examples below we assume
`m` is already defined. The second argument is an expression that declares
the variable name and optionally allows specification of lower and upper bounds.
Adding variables “column-wise”, e.g., as in column generation, is supported as well;
see the syntax discussed in the *Problem Modification* section.

```
@defVar(m, x ) # No bounds
@defVar(m, x >= lb ) # Lower bound only (note: 'lb <= x' is not valid)
@defVar(m, x <= ub ) # Upper bound only
@defVar(m, lb <= x <= ub ) # Lower and upper bounds
```

All these variations create a new local variable, in this case `x`.
The names of your variables must be valid Julia variable names.
Integer and binary restrictions can optionally be specified with a third argument, `Int` or `Bin`.

To create arrays of variables we append brackets to the variable name.

```
@defVar(m, x[1:M,1:N] >= 0 )
```

will create an `M` by `N` array of variables. Both ranges and arbitrary
iterable sets are supported as index sets. Currently we only support ranges
of the form `a:b` where `a` is an explicit integer, not a variable. Using
ranges will generally be faster than using arbitrary symbols. You can mix both
ranges and lists of symbols, as in the following example:

```
s = ["Green","Blue"]
@defVar(m, x[-10:10,s] , Int)
x[-4,"Green"]
```

Bounds can depend on variable indices:

```
@defVar(m, x[i=1:10] >= i )
```

Finally, variables can be constructed manually, one-by-one:

```
x = Variable(m::Model, lower::Number, upper::Number, category::Int, name::String)
x = Variable(m::Model, lower::Number, upper::Number, category::Int)
```

but this is not considered idiomatic JuMP code.

**Bounds**

- `setLower(x::Variable, lower::Number)`, `getLower(x::Variable)` - Set/get the lower bound of a variable.
- `setUpper(x::Variable, upper::Number)`, `getUpper(x::Variable)` - Set/get the upper bound of a variable.
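A short sketch of these bound accessors (assumes `m` is already defined):

```
@defVar(m, 0 <= x <= 10)
setUpper(x, 5.0)  # Tighten the upper bound to 5
getLower(x)       # Returns the current lower bound, 0.0
```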

**Values**

- `getValue(x)` - Get the value of this variable in the solution. If `x` is a single variable, this will simply return a number. If `x` is indexable then it will return an indexable dictionary of values.
- `setValue(x,v)` - Provide an initial value `v` for this variable that can be used by supporting MILP solvers. If `v` is `NaN`, the solver may attempt to fill in this value to construct a feasible solution.
- `getDual(x)` - Get the reduced cost of this variable in the solution. Similar behavior to `getValue` for indexable variables.
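For example, a sketch combining these functions (the `setValue` hint only affects supporting MILP solvers, and `getDual` is unavailable for MIPs):

```
setValue(x, 1.0)     # Suggest an initial value before solving
status = solve(m)
println(getValue(x)) # Value of x in the solution
println(getDual(x))  # Reduced cost of x (LPs only)
```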

**Names**

Variables (in the sense of columns) can have internal names (different from the Julia variable name) that can be used for writing models to file. This feature is disabled for performance reasons, but will be added if there is demand or a special use case.

- `setName(x::Variable, newName)`, `getName(x::Variable)` - Set/get the variable's internal name.

`AffExpr` is an affine expression type defined by JuMP. It has three fields:
a vector of coefficients, a vector of variables, and a constant. Apart from
a default constructor that takes no arguments, it also has a full constructor that
can be useful if you want to manually build an affine expression:

```
aff = AffExpr([3.0, 4.0], [x, z], 2.0) # 3x + 4z + 2
```

Note that the coefficients must be floating-point numbers. The matching
constraint type for `AffExpr` is `LinearConstraint`, which is defined by an
`AffExpr` and a lower and upper bound. If a solver interface does not
support range constraints, such a constraint will be automatically translated
into two constraints at solve time. Constructing constraints manually is not
expected usage and will not add the constraint to a model automatically.
See below for the correct methods.

There is also a quadratic expression type, `QuadExpr`, which likewise provides
a default constructor that takes no arguments and a full constructor. It has
four fields: two vectors of variables, a vector of coefficients, and the
affine part of the expression. This is best explained by example:

```
aff = AffExpr([3.0, 4.0], [x, z], 2.0) # 3x + 4z + 2
quad = QuadExpr([x,y],[x,z],[3.0,4.0],aff) # 3x^2 + 4yz + 3x + 4z + 2
```

The corresponding constraint is `QuadConstraint`, which is expected to
be a convex quadratic constraint.

- `@addConstraint(m::Model, con)` - efficient way to add linear constraints. Uses macros and thus does not yet support quadratic constraints.
- `addConstraint(m::Model, con)` - general way to add linear and quadratic constraints.
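For instance, a convex quadratic constraint must currently use the function form, while a linear constraint can use either (a sketch; assumes `m`, `x`, and `y` are already defined):

```
@addConstraint(m, 2x + y <= 1)      # Linear: macro form works
addConstraint(m, x*x + y*y <= 1.0)  # Quadratic: use the function form
```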

In order to manipulate constraints after creation, it is necessary to maintain
a reference. For linear constraints both `@addConstraint` and `addConstraint`
return an object of type `ConstraintRef{LinearConstraint}`. To facilitate
storing these we provide a convenience macro, e.g.:

```
@defConstrRef constraintName[1:3]
```

This macro behaves like `@defVar`. You can then iterate over constraints and store
the references in this structure, e.g.:

```
@defVar(m, x[1:5] >= 0)
@defConstrRef myCons[1:5]
for i = 1:5
    myCons[i] = @addConstraint(m, x[i] >= i)
end
```

To obtain the dual of a constraint, call `getDual` on the constraint reference:

```
println(getDual(myCons[1]))
```

Dual information is unavailable for MIPs and has not yet been implemented for quadratic constraints.

It can be useful to modify models after they have been created and solved, for example when we are solving many similar models in succession or generating the model dynamically (e.g. column generation). Additionally it is sometimes desirable for the solver to re-start from the last solution to reduce running times for successive solves (“hot-start”). Where available, JuMP exposes this functionality.

Some solvers do not expose the ability to modify a model after creation; the model must be constructed from scratch each time. JuMP will use the solver's problem modification interface where possible, and will still work even when the solver lacks this functionality by passing the complete problem to the solver every time.

Changing the problem class, e.g. between an LP and a MILP, will trigger a fresh model construction. This restriction exists partially to support the varying capabilities of different solvers.

Modifying and re-solving a MILP currently triggers a fresh model construction, but JuMP will provide the last solution as a "warm start" if the solver supports it.

As before, variables can be added using the `@defVar` macro. To effectively remove a variable,
one can fix it at zero by setting its bounds, e.g.:

```
setLower(x, 0.0)
setUpper(x, 0.0)
```

Bound changes take effect immediately in JuMP, but they are not transmitted
to the solver until `solve` is called again.

To add variables that appear in existing constraints, e.g. in column generation,
there is an alternative form of the `@defVar` macro:

```
@defVar(m, x, objcoef, constrrefs, values)
@defVar(m, x >= lb, objcoef, constrrefs, values)
@defVar(m, x <= ub, objcoef, constrrefs, values)
@defVar(m, lb <= x <= ub, objcoef, constrrefs, values)
@defVar(m, lb <= x <= ub, Int, objcoef, constrrefs, values) # Types are supported
```

where `objcoef` is the coefficient of the variable in the new problem,
`constrrefs` is a vector of `ConstraintRef`, and `values` is a vector
of numbers. To give an example, consider the following code snippet:

```
m = Model()
@defVar(m, 0 <= x <= 1)
@defVar(m, 0 <= y <= 1)
@setObjective(m, Max, 5x + 1y)
con = @addConstraint(m, x + y <= 1)
solve(m) # x = 1, y = 0
@defVar(m, 0 <= z <= 1, 10.0, [con], [1.0])
# The constraint is now x + y + z <= 1
# The objective is now 5x + 1y + 10z
solve(m) # z = 1
```

In some situations you may be adding all variables in this way. To do so, first define a set of empty constraints, e.g.

```
m = Model()
con = @addConstraint(m, 0 <= 1)
@setObjective(m, Max, 0)
@defVar(m, 0 <= x <= 1, 5, [con], [1.0])
@defVar(m, 0 <= y <= 1, 1, [con], [1.0])
@defVar(m, 0 <= z <= 1, 10, [con], [1.0])
solve(m)
```

JuMP does not currently support changing constraint coefficients. For less-than and greater-than constraints, the right-hand side can be changed, e.g.:

```
mycon = @addConstraint(m, x + y <= 4)
solve(m)
chgConstrRHS(mycon, 3) # Now x + y <= 3
solve(m) # Hot-start for LPs
```

To change the objective, simply call `@setObjective` again - the previous objective
function and sense will be replaced.
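For example (a sketch reusing variables from the earlier examples):

```
@setObjective(m, Max, 5x + 3y)
solve(m)
@setObjective(m, Min, x + y)  # Replaces both the function and the sense
solve(m)
```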

Many solvers offer the ability to modify the solve process. Examples include changing branching decisions in branch-and-bound, adding custom cuts, providing solvers with integer solutions, or adding new constraints only when they are violated by the current solution (lazy constraints).

Solver-independent modeling languages do not, in general, offer callbacks that will work with any solver. However, JuMP does provide limited support for this functionality. Currently we have cross-solver support for adding "lazy constraints" with the Gurobi, CPLEX, and GLPK solvers.

Lazy constraints are useful when the full set of constraints is too large to explicitly include in the initial formulation. When a MIP solver reaches a new solution, for example with a heuristic or by solving a problem at a node in the branch-and-bound tree, it will give the user the chance to provide constraint(s) that would make the current solution infeasible. For some more information about lazy constraints, see this blog post by Paul Rubin.

There are three important steps to providing a lazy constraint callback. First,
write a function that analyzes the current solution and takes a
single argument, e.g. `function myLazyCutGenerator(cb)`, where `cb` is a reference
to the callback management code inside JuMP. Next, inside your function, do
whatever analysis of the solution you need to generate the new
constraint, then add it to the model with the JuMP function
`addLazyConstraint(cb, myconstraint)` or the macro version
`@addLazyConstraint(cb, myconstraint)` (same limitations as `@addConstraint`).
Finally, notify JuMP that this function should be used for lazy constraint
generation with `setlazycallback(m, myLazyCutGenerator)`
before calling `solve(m)`.

The following is a simple example to make this more clear. In this two-dimensional problem we have a set of box constraints explicitly provided and a set of two lazy constraints we can add on the fly. The solution without the lazy constraints will be either (0,2) or (2,2), and the final solution will be (1,2):

```
using JuMP
using Gurobi
# We will use Gurobi, which requires that we manually set the attribute
# LazyConstraints to 1 if we use lazy constraint generation
m = Model(solver=GurobiSolver(LazyConstraints=1))
# Define our variables to be inside a box, and integer
@defVar(m, 0 <= x <= 2, Int)
@defVar(m, 0 <= y <= 2, Int)
@setObjective(m, Max, y)
# We now define our callback function that takes one argument,
# the callback handle. Note that we can access m, x, and y because
# this function is defined inside the same scope
function corners(cb)
    x_val = getValue(x)
    y_val = getValue(y)
    println("In callback function, x=$x_val, y=$y_val")
    # We have two constraints, one cutting off the top
    # left corner and one cutting off the top right corner, e.g.
    # (0,2) +---+---+ (2,2)
    #       |xx/ \xx|
    #       |x/   \x|
    #       |/     \|
    #       +       +
    #       |       |
    #       |       |
    #       |       |
    # (0,0) +---+---+ (2,0)
    # Allow for some imprecision in the solution
    TOL = 1e-6
    # Check top left, allowing some tolerance
    if y_val - x_val > 1 + TOL
        # Cut off this solution
        println("Solution was in top left, cut it off")
        # Use the original variables, but cb instead of m
        @addLazyConstraint(cb, y - x <= 1)
    # Check top right
    elseif y_val + x_val > 3 + TOL
        # Cut off this solution
        println("Solution was in top right, cut it off")
        # Use the original variables, but cb instead of m
        @addLazyConstraint(cb, y + x <= 3)
    end
end # End of callback function
# Tell JuMP/Gurobi to use our callback function
setlazycallback(m, corners)
# Solve the problem
solve(m)
# Print our final solution
println("Final solution: [ $(getValue(x)), $(getValue(y)) ]")
```

The code should print something like (amongst the output from Gurobi):

```
In callback function, x=2.0, y=2.0
Solution was in top right, cut it off
In callback function, x=0.0, y=2.0
Solution was in top left, cut it off
In callback function, x=1.0, y=2.0
Final solution: [ 1.0, 2.0 ]
```

This code can also be found in `/JuMP/examples/simplelazy.jl`.

In the above example the callback function is defined in the same scope as the model and variable definitions, allowing us to access them. If we defined the function in some other scope, or even another file, we would not be able to access them directly. The proposed solution to this design problem is to separate the logic of analyzing the current solution values from the callback itself. This has many benefits, including the ability to write unit tests for the constraint generation logic to check its correctness. The callback function passed to JuMP is then simply a stub that extracts the current solution and any other relevant information and passes it to the constraint generation logic. To apply this to our previous example, consider the following code:

```
using JuMP
using Gurobi
using Base.Test
function cornerChecker(x_val, y_val)
    # This function does not depend on the model, and could
    # be written anywhere. Instead, it returns a tuple of
    # values (newcut, x_coeff, y_coeff, rhs) where newcut is a
    # boolean if a cut is needed, x_coeff is the coefficient
    # on the x variable, y_coeff is the coefficient on
    # the y variable, and rhs is the right hand side
    TOL = 1e-6
    if y_val - x_val > 1 + TOL
        return (true, -1.0, 1.0, 1.0) # Top left
    elseif y_val + x_val > 3 + TOL
        return (true, 1.0, 1.0, 3.0)  # Top right
    else
        return (false, 0.0, 0.0, 0.0) # No cut
    end
end

# A unit test for the cornerChecker function
function test_cornerChecker()
    # Test the four corners - only two should produce cuts
    newcut, x_coeff, y_coeff, rhs = cornerChecker(0, 0)
    @test !newcut

    newcut, x_coeff, y_coeff, rhs = cornerChecker(2, 0)
    @test !newcut

    newcut, x_coeff, y_coeff, rhs = cornerChecker(0, 2)
    @test newcut
    @test x_coeff == -1.0
    @test y_coeff == 1.0
    @test rhs == 1.0

    newcut, x_coeff, y_coeff, rhs = cornerChecker(2, 2)
    @test newcut
    @test x_coeff == 1.0
    @test y_coeff == 1.0
    @test rhs == 3.0
end

function solveProblem()
    m = Model(solver=GurobiSolver(LazyConstraints=1))
    @defVar(m, 0 <= x <= 2, Int)
    @defVar(m, 0 <= y <= 2, Int)
    @setObjective(m, Max, y)

    # Note that the callback is now a stub that passes off
    # the work to the "algorithm"
    function corners(cb)
        x_val = getValue(x)
        y_val = getValue(y)
        println("In callback function, x=$x_val, y=$y_val")
        newcut, x_coeff, y_coeff, rhs = cornerChecker(x_val, y_val)
        if newcut
            @addLazyConstraint(cb, x_coeff*x + y_coeff*y <= rhs)
        end
    end # End of callback function

    setlazycallback(m, corners)
    solve(m)
    println("Final solution: [ $(getValue(x)), $(getValue(y)) ]")
end
# Run tests
test_cornerChecker()
# Solve it
solveProblem()
```

This code can also be found in `/JuMP/examples/simplelazy2.jl`.