I am a full-time consultant and provide services related to the design, implementation and deployment of mathematical programming, optimization and data-science applications. I also teach courses and workshops. Usually I cannot blog about the projects I am doing, but there are many technical notes I'd like to share, not least so that I have an easy way to search for and find them again myself. You can reach me at erwin@amsterdamoptimization.com.

Monday, January 23, 2012

Really appreciating the features of a modeling system requires quite an investment. When comparing several systems, a “least common denominator” model is also often used. Some issues are more specialized, such as:

Saturday, January 21, 2012

I have a question: in GAMS, is there something similar to a "struct" in C? I need to define a network optimization problem: a node as the structure, with incoming arcs as one attribute and outgoing arcs as another.

GAMS does not have that and most likely you don’t “need” that anyway. A compact way to describe a network is shown here:
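As an illustration, here is a minimal sketch of such a compact representation (the identifiers are my own, not from an actual model): nodes are a one-dimensional set, arcs are a two-dimensional set over node pairs, and the “incoming” and “outgoing” attributes fall out of the index positions.

```gams
set i 'nodes' /n1*n4/;
alias (i,j);
set a(i,j) 'arcs' / n1.n2, n1.n3, n2.n4, n3.n4 /;

positive variable f(i,j) 'flow on arc (i,j)';
equation flowbal(i) 'flow balance at node i';

* sum(a(j,i),...) collects the incoming arcs of node i and
* sum(a(i,j),...) the outgoing arcs; no record type is needed.
* (A real model would add supply/demand terms at source/sink nodes.)
flowbal(i)..  sum(a(j,i), f(j,i)) =e= sum(a(i,j), f(i,j));
```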

Monday, January 16, 2012

Dinkelbach’s Algorithm as an Efficient Method for Solving a Class of MINLP Models for Large-Scale Cyclic Scheduling Problems [link]

by Prof. Grossmann et al. investigates some mixed-integer linear fractional programming problems. It compares some standard MINLP codes against an implementation of Dinkelbach’s algorithm and concludes that this algorithm performs very well. The algorithm consists of solving a series of MIP problems, each time with a different weighted objective function. This method is so simple it can be coded in GAMS in just a few lines.

If the problem is something like:

then we need to solve a series of MIP problems:

for different values of λ.
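In generic notation (the post’s images of the formulas are not reproduced here), the standard Dinkelbach setup is a linear fractional program

$$\max_x \; \frac{c^T x + \alpha}{d^T x + \beta} \quad \text{s.t. } Ax \le b,\; x_j \text{ integer for } j \in J,$$

with $d^T x + \beta > 0$ on the feasible region, and the parametric subproblem for a fixed $\lambda$ is the MIP

$$\max_x \; c^T x + \alpha - \lambda\,(d^T x + \beta) \quad \text{s.t. } Ax \le b,\; x_j \text{ integer for } j \in J.$$

After solving the subproblem for a point $x_k$, $\lambda$ is updated to $\lambda_{k+1} = (c^T x_k + \alpha)/(d^T x_k + \beta)$; the algorithm stops when the optimal value of the parametric subproblem is (close to) zero.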

I wanted to see if I could use this on a (very) large mixed-integer linear fractional programming problem. Of course, I first tried it on some very small examples. The first test is a continuous problem:

For our very large problem (approx. 1 million rows) this method was even more successful. All NLP solvers I have access to had trouble with the sheer size of the problem, even though the NLP relaxations are linearly constrained. But the MIP problems generated inside Dinkelbach’s algorithm turned out to be large but easy to solve. In addition, the algorithm converged in about 5 major iterations.
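A sketch of how such a loop might be coded in GAMS (model and symbol names here are hypothetical, not the code actually used):

```gams
scalar lambda /0/, delta /inf/, tol /1e-6/;
set iter 'major iterations' /it1*it20/;

* fracmip is assumed to be a MIP whose objective equation reads
*    obj.. z =e= numer - lambda*denom;
* with numer and denom defined by linear equations. Changing the
* scalar lambda changes the objective at the next solve.
loop(iter$(abs(delta) > tol),
   solve fracmip using mip maximizing z;
   delta  = numer.l - lambda*denom.l;
   lambda = numer.l / denom.l;
);
```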

Tuesday, January 10, 2012

where the rest of the model was linear (after applying some reformulations). As the model contains both integer and binary variables, it has now become an MINLP.

For several reasons this is actually not a very good expression:

- the expression generates a lot of nonlinear variables and nonlinear nonzero elements in the matrix

- as some z(i)’s can be zero, it is difficult to protect the expression against division by zero (although the way we ran it, a linear model preceded this model, so our starting point was reasonable and no division by zero occurred)

A better formulation would be:

Here we only have two nonlinear variables: the rest of the variables now appear linearly in the constraints. In addition, although we cannot bound each z(i) away from zero, we can introduce a nonzero lower bound on w.
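In generic terms (the post’s actual expressions are images and not reproduced here), the pattern behind such a reformulation is to replace a ratio by a product: instead of writing $y = f(x)/\sum_i z_i$, introduce an auxiliary variable $w$ with

$$w = \sum_i z_i, \qquad y \cdot w = f(x), \qquad w \ge \varepsilon > 0,$$

so the only nonlinear term is the product $y \cdot w$, involving just the two nonlinear variables $y$ and $w$, and the division disappears.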

Although we are adding extra constraints and variables to the MINLP model, this actually makes the model easier to solve and more robust.

PS1. The model is now becoming somewhat complex as we try to deal with different types of decisions in the same model. We call this an “integrated model” rather than just messy!

PS2. In some cases fractions can be reformulated using a “fractional programming” technique. In practice for larger models this often turns out to be a difficult reformulation to implement.

Question: a comment below suggests using (2D?) bisection. Is that really a good idea? Are there any references for such an approach applied in a similar situation?