Branch and bound (BB) algorithm

For details, see the Wikipedia article.

In short, as a list:

  • imagine we are trying to find the minimum of some function f(x) over a range x ∈ [x1; x2]; we do not want to simply iterate over every value of x - we want something faster
  • the BB algorithm first splits the [x1; x2] range into a number of sub-ranges
  • then the BB algorithm estimates a lower and an upper bound on the minimal value of f(x) for each sub-range; this is the step where the speed-up happens - instead of iterating over all the values within a sub-range, we can get away with only two evaluations of f(x) per sub-range
  • all the sub-ranges are compared, and the one with the lowest upper bound on the minimal f(x) value is iterated to find the global minimum of f(x); instead of iterating over that final sub-range, it could also be split into more sub-ranges and the BB algorithm applied once again (see the sketch after this list)
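
To make the steps above concrete, here is a minimal Python sketch. The bounding rule (two endpoint evaluations combined with an assumed Lipschitz constant), the function name and all parameters are just one illustrative choice, not the only way to implement BB.

import math

def branch_and_bound_min(f, x1, x2, lipschitz, n_subranges=10, depth=3, n_final=100):
    """Minimise f over [x1, x2] following the steps listed above.

    `lipschitz` is an assumed Lipschitz constant for f; it is one possible way
    to turn the two endpoint evaluations of a sub-range into a lower bound.
    Names and default values here are illustrative.
    """
    width = (x2 - x1) / n_subranges

    # Split the range and bound each sub-range with two evaluations of f.
    bounds = []
    for i in range(n_subranges):
        a = x1 + i * width
        b = a + width
        fa, fb = f(a), f(b)
        upper = min(fa, fb)                      # the minimum is at most the best endpoint
        lower = min(fa, fb) - lipschitz * width  # Lipschitz-based lower bound
        bounds.append((upper, lower, a, b))

    # Sub-ranges whose lower bound exceeds the best upper bound cannot contain
    # the minimum; discarding them is the pruning that gives BB its speed.
    best_upper = min(u for u, _, _, _ in bounds)
    survivors = [sr for sr in bounds if sr[1] <= best_upper]

    # Take the most promising survivor (lowest upper bound).
    _, _, a, b = min(survivors, key=lambda sr: sr[0])

    if depth > 1:
        # Split the winning sub-range again instead of scanning it point by point.
        return branch_and_bound_min(f, a, b, lipschitz, n_subranges, depth - 1, n_final)

    # Final sub-range: fall back to a plain scan over n_final points.
    xs = [a + (b - a) * k / (n_final - 1) for k in range(n_final)]
    return min(xs, key=f)

# Example: a bumpy function on [0, 10]; 1.1 bounds |d/dx (sin(x) + 0.1*x)|.
x_min = branch_and_bound_min(lambda x: math.sin(x) + 0.1 * x, 0.0, 10.0, lipschitz=1.1)
print(x_min)  # close to 4.61, where sin(x) + 0.1*x attains its minimum on [0, 10]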

The BB algorithm looks like a classical trade-off case: you either get speed (when sub-ranges are coarse, i.e. include many data points) or precision (e.g. when each sub-range holds only 3 data points, so that you can hardly miss a local minimum).

I have no proof of this, but I think that applying the BB algorithm to a single set of data points with varying criteria for sub-range sizes could give higher precision (by reducing the chance of missing a local minimum) while keeping most of the speed advantage of the coarse sub-ranges.
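
One way to read this idea, reusing the branch_and_bound_min sketch above: run the search over the same range with several different sub-range granularities and keep the best answer, so that a local minimum missed by a coarse split can still be caught by a finer one. This wrapper and its parameters are purely a hypothetical sketch, not a validated scheme.

def multi_granularity_min(f, x1, x2, lipschitz, granularities=(5, 10, 50)):
    # Try the same range with coarse and fine splits; the coarse runs stay cheap,
    # the fine runs reduce the chance of overlooking a narrow local minimum.
    candidates = [branch_and_bound_min(f, x1, x2, lipschitz, n_subranges=n)
                  for n in granularities]
    return min(candidates, key=f)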
