Comments on OR in an OB World: Benders Decomposition Then and Now

Paul A. Rubin (2020-01-30 15:40):
If the M values are "reasonably small", you do not need to use the combinatorial form of Benders from Codato & Fischetti; you can use the original form of Benders. That should eliminate your concern about putting the original objective function in the subproblem in constraint form. As far as decomposing so as to reduce the number of binary variables in any one problem, I don't see any easy fix. You could try asking on OR Stack Exchange (a link to which is in the right margin near the top of the post).

Anonymous (2020-01-30 15:10):
Thank you, Dr. Rubin, for the response. As for the first criterion you mentioned, the model is too large, with 7 million binary variables in one instance, and the single model does exceed the solver's memory capacity. However, I cannot find an intuitive way to decompose the model. Here is the model, using the notation from the paper by Codato and Fischetti on combinatorial BD:

min d^T y
s.t.
A x <= a
B z <= b
G z + H x = g
C x + D y >= c
E z + F y >= e
x, z binary, y >= 0

As suggested by Codato et al., I could separate the part with continuous variables and disjunctive constraints into a subproblem.
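Since the master here is purely binary, a combinatorial Benders cut for a rejected master solution x* has a simple closed form: at least one variable in the chosen support of x* must flip. A minimal pure-Python sketch follows (the function name and the dict representation are illustrative, not taken from Codato & Fischetti's implementation):

```python
def combinatorial_benders_cut(x_star, support):
    """Build a combinatorial Benders (no-good) cut excluding the binary
    assignment x_star on the index set `support`.

    Returns (coeffs, rhs) encoding the linear cut
        sum_i coeffs[i] * x[i] >= rhs,
    i.e.  sum_{i: x*_i = 0} x_i  +  sum_{i: x*_i = 1} (1 - x_i)  >=  1
    with the constant terms folded into rhs.  The cut is violated by
    x_star itself and satisfied by any binary assignment differing from
    it in at least one index of `support`.
    """
    coeffs, rhs = {}, 1
    for i in support:
        if x_star[i] >= 0.5:   # x*_i = 1: contributes (1 - x_i)
            coeffs[i] = -1
            rhs -= 1
        else:                  # x*_i = 0: contributes x_i
            coeffs[i] = 1
    return coeffs, rhs
```

Evaluated at x* the left-hand side equals rhs - 1, so each cut excludes exactly that one assignment, which is one source of the cut-proliferation concern with this approach.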
It would still leave three issues. First, the master problem would still contain a large number of binary variables, and it may be difficult to get an integer solution from a solver; in particular, the constraints B z <= b constitute a VRP with multiple precedence and coordination constraints. Second, the objective function would need to be introduced into the slave problem as a constraint of the form d^T y <= UB - eps. This raises the issue of finding a sufficiently small value of the parameter eps that does not cut off any feasible master problem solution; I have observed multiple integer feasible solutions with objective values very close to UB. The final issue relates to the CB cuts added to the master problem in each iteration. Cutting off one solution at a time can introduce a large number of constraints into the master. The master involves, for instance, choosing 100 "x" variables to equal 1 out of 6 million, so I am afraid the number of integer feasible solutions in the master could be too large to cut off with CB cuts.

Regarding the second criterion, the big M is not quite infinity and can be bounded by a reasonably small value. However, solving the problem as a single model still does not work for large instances.

Paul A. Rubin (2020-01-30 13:53):
In general, I do not know any way to look at a problem and decide whether Benders is the "right" way to model it. Generally speaking, if there would be only one subproblem and a single model would not exceed the solver's memory capacity, I would probably try a single model first. If the problem decomposes into several/many subproblems, maybe I would try Benders first.

You mentioned using big M constraints, which to me is a special case.
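For concreteness, a disjunctive constraint of the kind being discussed is typically linearized with a big M along these lines (a generic illustration, not the commenter's actual model):

```latex
% Enforce the implication  z = 1 \Rightarrow x \le u  with binary z:
x \le u + M\,(1 - z), \qquad z \in \{0, 1\}.
% With z = 1 this reads x \le u; with z = 0 it must be non-binding,
% which requires M \ge x^{\max} - u.  Any slack beyond that weakens
% the LP relaxation, which is why "not quite infinity" values hurt.
```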
If I know valid choices for the M parameters that are "reasonably small", I would be inclined to try a single model. On the other hand, if I find myself using "not quite infinity" values for M in order to be sure they are valid, I would definitely try Benders, since it gets rid of the M values.

The bottom line, though, is that these are just personal tendencies. I don't think there is a way to decide with any high degree of certainty just by "looking" at the model.

Zulqarnain (2020-01-29 19:35):
Dear Dr. Rubin,

I am thinking about using BD for a MIP model. It has two types of binary variables, connected to each other through a constraint. The model also has continuous variables (part of disjunctive constraints, modeled with big M) which in turn depend on both sets of binary variables. The objective function involves minimizing a function of the continuous variables. Is there a way to find out beforehand whether BD will be a good approach to try? The number of integer variables in the model runs into the tens of millions. An alternate question could be: are there problems for which BD does not work as well? How can one answer that question without lengthy experiments, just by "looking" at a MIP model?

Paul A. Rubin (2019-09-07 10:44):
CPLEX can detect a potential incumbent solution via heuristics, even at nodes with fractional solutions. In both the "legacy" and "generic" callback systems, lazy constraint callbacks are definitely triggered in those cases, and I would assume that would be true in automatic Benders as well. So I'm fairly sure Benders cuts can be added at nodes with fractional solutions. The only way I can think of to avoid that would be to turn off all heuristics.

sachin Jayaswal (2019-09-07 07:16):
Dear Prof. Rubin,

Could you please let me know whether CPLEX's automatic Benders adds Benders cuts only at integer nodes of the branch-and-bound tree, or also at fractional nodes?

Is there a way to control the nodes (integer versus fractional) at which Benders cuts are added?

Regards,

Sachin

Paul A. Rubin (2018-12-08 19:46):
No, start solutions do not change the algorithm (provided that either the starting solution is feasible or that it is tested in the subproblem and the subproblem is given a chance to reject it).

Matthias (2018-12-08 19:40):
I am trying to implement BD with a start solution and regularization, by simply adding a quadratic term to the objective function. When using a start solution to initialize my master problem, however, the algorithm overestimates the final cost. Do you change the algorithm in any way when using start solutions?

Paul A. Rubin (2018-09-23 11:05):
The upper bound in a min problem is the objective value of the best known feasible solution.
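To see how that bound behaves, here is a toy version of the classical Benders loop (all data and names are invented for illustration; the master is solved by enumeration in place of a MIP solver, and the subproblem dual is available in closed form). The upper bound moves only when the master proposes a solution whose true cost beats the incumbent, which is exactly what stalls in the situation described:

```python
def toy_benders(f=0.5, d=2.0, y_vals=(0, 1, 2), tol=1e-9):
    """Classical (iterative) Benders loop on a toy min problem:

        min_y  f*y + Q(y),  y in y_vals,
        Q(y) = min { x : x >= d - y, x >= 0 } = max(0, d - y).

    The subproblem's optimal dual is u* = 1 if d - y > 0, else 0, giving
    the optimality cut  eta >= u*(d - y).  Returns (best_y, ub, iterations).
    """
    cuts = []                    # stored dual values u; each cut: eta >= u*(d - y)
    ub, best_y, iters = float("inf"), None, 0
    while True:
        iters += 1
        # Master: enumerate y, with eta pushed up to the tightest cut.
        y, lb = None, float("inf")
        for cand in y_vals:
            eta = max([u * (d - cand) for u in cuts] + [0.0])
            if f * cand + eta < lb:
                y, lb = cand, f * cand + eta     # lb = master optimum
        # Subproblem: true recourse cost at the proposed y.
        q = max(0.0, d - y)
        if f * y + q < ub:                       # new incumbent -> new UB
            ub, best_y = f * y + q, y
        if ub - lb <= tol:
            return best_y, ub, iters
        cuts.append(1.0 if d - y > 0 else 0.0)   # add the optimality cut
```

With the default data this converges in two iterations: the first master solution y = 0 sets UB = 2, the resulting cut steers the master to y = 2, and the bounds meet at 1.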
You can look for solver options that emphasize finding feasible solutions (perhaps by applying heuristics at more nodes), or you can try designing and running a heuristic before starting Benders decomposition to find a good starting solution (which you would pass to the solver).

Anonymous (2018-09-23 10:54):
It's minimizing. I am working on a scheduling problem; for example, the upper bound doesn't change for 100 iterations. What should I do to make it faster?

Paul A. Rubin (2018-09-22 15:33):
There are a lot of possible reasons, and possibly more than one applies. (Also, it's hard to answer a question like this without knowing whether you are maximizing or minimizing.)

Anonymous (2018-09-22 15:21):
Dear Prof. Rubin, hi. The upper bound in Benders decomposition doesn't change, or changes only after a long time. Do you know why?

Paul A. Rubin (2018-09-20 09:04):
Sorry, I somehow did not get a notification of this comment being posted. If the subproblem contains integer variables and is unbounded, then its LP relaxation is unbounded. So if you know the subproblem is unbounded, you can relax it and get a cut the usual way. The catch is that the LP relaxation can be unbounded without the original subproblem being unbounded. Unfortunately, I'm not aware of what research (if any) there is on that case (or of how you would know that the subproblem is truly unbounded).

Anonymous (2018-08-24 13:50):
Hi Prof. Rubin,

What if there exist integer variables in the subproblems? Is it possible to obtain an extreme ray if the subproblem is unbounded, as in an LP? If not, how does one generate a Benders cut in this case? Thank you very much.

Paul A. Rubin (2018-02-27 13:40):
Yes, the format/derivation of the cuts is the same, and no, there is no danger that cuts derived from a feasible but suboptimal solution to the current master will cut off the optimal solution. The proof of validity of either type of cut does not depend on whether the proposed incumbent is optimal (in the current master) or not.

Anonymous (2018-02-27 04:51):
Dear Prof. Rubin,

I really like your articles, which helped me a lot in understanding how the modern BD with callbacks works. However, I still have a small doubt.

My question is: will the cuts (both feasibility and optimality) used in the "old" BD and in the "modern" way with callbacks be in the same format? Should we somehow alter the cuts in order to use them in the modern approach with a single B&C tree? To be specific, in the modern BD with callbacks, if we generate the same-format cuts for a merely feasible master solution, is it possible that these cuts could somehow cut off the optimal solution in the single B&C tree?

Thanks a lot for your articles, and looking forward to your reply.

Best regards!

Paul A. Rubin (2017-07-12 16:14):
It's hard to diagnose a slow-moving bound. Sometimes it reflects a "loose" formulation, but sometimes it just happens, and there is not much you can do to tighten the formulation. You might do a little searching to see if someone has developed facet-defining cuts for a problem similar to yours, and if so, try adding them. There will also likely be some solver parameters you can try adjusting. (Again, I'm out of touch with Xpress, but I know both CPLEX and Gurobi are tunable, so likely Xpress is too.)

If the out-of-memory error is due to the tree occupying too much memory, you can always reduce the memory footprint by switching to depth-first search (which may extend the solution time considerably). There may also be a parameter that lets you adjust when backtracking occurs, so that you can shift the search a bit toward depth-first (and away from breadth-first) without going totally depth-first.
The less often you backtrack, the less memory you use (maybe).

sedaaağ (2017-07-11 20:03):
Thanks for your reply, Professor. I've managed to implement a lazy constraint callback, to search in one tree and add a combinatorial cut when it's needed. It solves much faster than before (without callbacks and combinatorial cuts) for small problems, but it still has trouble reaching the optimal value for medium- and large-scale problems. With the one-tree approach it gives a memory error, since it spends a long time finding a new incumbent after the 20th integer solution. I used a time-limit criterion to prevent this situation and mixed classical Benders with the one-tree approach before finding the optimal value. In that case, it ended up stuck around a 30% gap and kept giving similar values for the lower bound in every iteration. Do you have any suggestion or diagnosis for this situation?

Paul A. Rubin (2017-06-25 14:23):
Yes, the upper and lower bounds come from the master problem. If some candidate solution survives the callback (no cut generated), it becomes the new master problem incumbent. As long as at least one incumbent has been found, the best (most recent) one will give you the upper bound, and that bound will be valid -- the objective value of the optimal solution will not be any larger than the objective value of that incumbent. If you have not yet found an incumbent, I believe getObjValue will throw an exception. No, you will not get premature termination.

Jyotirmoy Dalal (2017-06-24 03:00):
Dear Prof. Rubin,

In a lazy constraint callback based BD implementation for a minimization problem, suppose I want to stop CPLEX early using an optimality gap criterion (setting epgap) and, on termination, extract the UB and LB values using the cplex.getObjValue() and cplex.getBestObjValue() functions. Am I not then extracting the UB and LB of the MP? It may be possible that the overall problem has a different (higher) UB, so wouldn't that be an earlier termination than I intend? If so, what would be the correct way to early-stop a lazy constraint callback based BD by specifying the epgap parameter? Is there any mechanism to tell CPLEX to continue in a situation where the MP's gap is smaller than epgap but the overall problem's gap is still bigger than epgap? Please advise.

Paul A. Rubin (2017-05-03 15:16):
It's been quite a few years since I used Xpress, and I've forgotten what callbacks they support. If they have something equivalent to CPLEX's lazy constraint callback, then using it does not stop the solver when an integer-feasible candidate is found; it just pauses the solver.

Do you need to do this, as opposed to solving to optimality, adding a constraint, and repeating (the "traditional" way)? No, it's your choice. I think the traditional method is sometimes more efficient. (I suspect that occurs when the master problem is "easy" to solve and the subproblem is "hard".) My subjective, anecdotal, unsupported-by-rigorous-analysis opinion is that the callback ("one tree") method is _usually_ more efficient, and so I default to using it.

You mentioned a "big M" method (in the subproblem?).
If you are struggling to improve performance, you might consider using "combinatorial Benders cuts" in lieu of the big M method. Again, sometimes it's faster, sometimes not, depending on how tight your values for M are.

Lastly, you're very welcome. Thanks for the kind words. :-)

sedaaağ (2017-05-02 10:06):
Hi Professor Rubin,

I'm applying Benders decomposition in Xpress-IVE for a crew scheduling problem. I have lots of consecutive constraints holding binary and integer variables. Due to the structure of the constraints:
1) I am always getting feasibility cuts.
2) I'm using the big-M method to get a ray from the dual subproblem. I've tried using C++, but the getray function didn't work for my case.
3) I'm trying to use callbacks.
I wonder whether, with a callback, I need to stop solving the master problem for every integer feasible solution found and solve the dual with that feasible solution, rather than waiting for the optimal solution of the master problem.
Thanks for this great blog; it's been so good for knowledge.

Regards

Paul A. Rubin (2016-07-24 14:12):
Alysson, nice to hear from you! Although this reminds me of the first time I discovered that a friend of mine had become a grandfather. Makes me feel old! I'm four years retired, so finding time for this is a bit easier than it was back in '03. Once you're a full professor, with less pressure to publish all the time, you may find some slack in your schedule (particularly if you're good at ducking committee work).

A. (2016-07-24 13:22):
(Disclaimer: I didn't read all the comments.)

Prof. Rubin was such a big help when I was doing my PhD, 10 or 12 years ago. Now I get pointed to this post by a PhD student of mine.

Summary: 1) Being an academic is fun. 2) Now, being a PhD supervisor and a "true" doctor, I really can't understand how he finds time for this. 3) I still owe Dr. Rubin, and now Doctor Ceriola, a beer.

( https://groups.google.com/forum/#!search/benders$20decomposition$20alysson$20rubin/sci.op-research/sGhIyksVDdk/G-U1kQZSiv0J )

Lê Thái Hòa (2016-05-13 14:04):
Thank you very much, I will read the reference you suggest. As I said earlier, I mostly read the Geoffrion paper on Generalized Benders Decomposition, but it is not very clear to me.

Best regards,

Hòa