An overview of Large Neighborhood Search (LNS) heuristics for solving complex Mixed-Integer Linear Programs (MILP), exploring core concepts, 'destroy and repair' methods, and popular frameworks like RINS and Local Branching.
Large Neighborhood Search (LNS) is a heuristic algorithm for solving large-scale optimization problems. It combines local search with CP/MIP (constraint programming / mixed-integer programming).

1. Find a feasible solution using constraint programming.
2. Then repeat: destroy part of the current solution, re-optimize (repair) the destroyed part, and keep the result if it improves the incumbent.

This works because the destroyed part defines a neighborhood that is small enough for an exact method to search effectively, yet large enough to escape the local optima that trap small-step local search.

The algorithm is almost technology independent: MIP can replace CP here.

The neighborhood can be defined as follows (there are many options): fix a subset of the variables to their values in the incumbent solution and leave the remaining variables free.

If we use the strategy mentioned above, the LNS algorithm can be rephrased as:

repeat:
- select a subset of variables and fix them at their incumbent values
- solve the restricted problem over the free variables
- accept the new solution if it improves the incumbent
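The loop above can be sketched in a few lines of Python. As a stand-in for a CP/MIP solver, this toy example brute-forces the freed variables of a small 0/1 knapsack; the instance data (`values`, `weights`, `capacity`) and the parameter `k` (number of freed variables) are illustrative assumptions, not from the original text.

```python
import itertools
import random

# Toy 0/1 knapsack instance (hypothetical data for illustration).
values = [10, 13, 7, 8, 12, 4, 9, 6]
weights = [5, 6, 3, 4, 6, 2, 5, 3]
capacity = 15

def objective(x):
    """Total value of the selection, or -inf if it exceeds capacity."""
    if sum(w for w, xi in zip(weights, x) if xi) > capacity:
        return float("-inf")
    return sum(v for v, xi in zip(values, x) if xi)

def repair(x, free):
    """Exhaustively re-optimize the freed variables
    (a stand-in for handing the restricted problem to a CP/MIP solver)."""
    best = list(x)
    for bits in itertools.product([0, 1], repeat=len(free)):
        cand = list(x)
        for j, b in zip(free, bits):
            cand[j] = b
        if objective(cand) > objective(best):
            best = cand
    return best

def lns(iters=50, k=3, seed=0):
    rng = random.Random(seed)
    x = [0] * len(values)  # trivial feasible starting solution
    for _ in range(iters):
        free = rng.sample(range(len(values)), k)  # destroy: free k variables
        cand = repair(x, free)                    # repair: solve the sub-problem
        if objective(cand) > objective(x):        # accept only improvements
            x = cand
    return x, objective(x)

best_x, best_val = lns()
```

In a real implementation, `repair` would be a call to a CP or MIP solver on the restricted model, which is exactly why the algorithm is nearly technology independent.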
Mixed-Integer Linear Programming (MILP) is a powerful mathematical framework used to model and solve complex optimization problems across operations research, supply chain management, scheduling, and engineering. While modern commercial solvers (such as Gurobi, CPLEX, and FICO Xpress) are incredibly sophisticated, finding the exact optimal solution for large-scale, real-world MILP instances is often $\mathcal{NP}$-hard and computationally intractable within a reasonable timeframe.
To deal with this complexity, researchers and practitioners often employ primal heuristics—methods designed to find “good” or “near-optimal” feasible solutions quickly, even if optimality cannot be mathematically proven. One class of highly successful heuristics in this domain is Large Neighborhood Search (LNS).
Large Neighborhood Search (LNS) is an iterative metaheuristic that explores the solution space by partially destroying a known feasible solution and then rebuilding (or repairing) it in hopes of discovering a better one.
While standard local search methods operate on “small” neighborhoods (e.g., flipping a single binary variable or swapping two items), LNS explores significantly larger neighborhoods. Because the neighborhood is so large, exploring it exhaustively is usually impossible. Instead, LNS relies on exact optimization methods (often by solving a smaller, restricted version of the original MILP) to navigate this neighborhood and find the best possible repair.
At its algorithmic core, LNS operates through two alternating steps applied to an incumbent solution $x^*$:

1. Destroy: relax (un-fix) a selected subset of the variables of $x^*$, keeping the rest fixed at their incumbent values.
2. Repair: re-optimize the freed variables, typically by solving the resulting restricted sub-MILP.

If the “repaired” solution is strictly better than the incumbent $x^*$, it becomes the new incumbent, and the process repeats.
When applying LNS generally, destroy and repair operators are often highly problem-specific. However, over the past two decades, several generic MIP-based LNS heuristics have been developed to operate automatically inside commercial solvers without requiring the user to write specific operators.
Two prominent, general-purpose LNS strategies for MILP include Local Branching and Relaxation Induced Neighborhood Search (RINS).
Introduced by Fischetti and Lodi (2003), Local Branching explores the neighborhood of an incumbent solution by adding linear constraints (local branching cuts) to the original problem.
Given an incumbent binary solution $x^*$, a “local branching constraint” restricts the search space so that any new solution can differ from $x^*$ in at most $k$ binary variables:
\[\sum_{j \in B: x_j^* = 1} (1 - x_j) + \sum_{j \in B: x_j^* = 0} x_j \le k\]
where $B$ is the index set of the binary variables. The solver explores this reduced neighborhood (known as the $k$-opt neighborhood) as a sub-MILP. If a better solution is found, the incumbent is updated.
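The left-hand side of this constraint is simply the Hamming distance between a candidate binary vector and the incumbent. A minimal sketch (the function names and example vectors are illustrative, not from the original):

```python
def local_branching_lhs(x_star, x):
    """Left-hand side of the local branching cut: counts the positions
    where the binary vector x differs from the incumbent x_star."""
    return sum((1 - xj) if xs == 1 else xj for xs, xj in zip(x_star, x))

def satisfies_cut(x_star, x, k):
    """True iff x lies in the k-opt neighborhood of x_star."""
    return local_branching_lhs(x_star, x) <= k

x_star = [1, 0, 1, 1, 0]
x = [1, 1, 1, 0, 0]  # differs from x_star in positions 1 and 3
```

In a solver, the same expression is added as a linear constraint over the decision variables rather than evaluated on fixed vectors, which is what makes the $k$-opt neighborhood searchable by branch-and-bound.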
Designed by Danna, Rothberg, and Le Pape (2005) and heavily utilized by IBM CPLEX and Gurobi, RINS is a highly effective generic heuristic.
RINS works by comparing the current best incumbent integer solution ($x^*$) against the optimal solution of the continuous LP relaxation at the current node of the branch-and-bound tree ($x^{LP}$). Every variable that takes the same value in both solutions is fixed at that value; the remaining variables are left free.
A sub-MIP is formulated on the free variables and handed to the solver with a strict node or time limit. RINS leverages the intuition that variables showing consensus between the continuous relaxation and the integer incumbent are highly likely to maintain those values in an optimal or near-optimal solution.
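The RINS fixing rule can be sketched as a small helper that splits the variable indices into a fixed set and a free set. The function name, tolerance, and example vectors below are illustrative assumptions:

```python
def rins_fixing(x_star, x_lp, tol=1e-6):
    """Split variables per the RINS rule: fix index j wherever the
    incumbent and the node LP relaxation agree (within a tolerance);
    all other indices stay free for the sub-MIP."""
    fixed, free = {}, []
    for j, (xi, xl) in enumerate(zip(x_star, x_lp)):
        if abs(xi - xl) <= tol:
            fixed[j] = xi
        else:
            free.append(j)
    return fixed, free

x_star = [1, 0, 1, 0]           # incumbent (integral)
x_lp = [1.0, 0.3, 1.0, 0.7]     # node LP relaxation (fractional)
fixed, free = rins_fixing(x_star, x_lp)
```

Here variables 0 and 2 show consensus and would be fixed, while variables 1 and 3 remain free in the sub-MIP.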
Modern LNS implementations often utilize adaptive machine learning techniques: Adaptive LNS (ALNS) maintains a pool of destroy/repair operators and learns, from each operator's recent success, which one to apply next, while more recent work trains models to predict which variables to free.
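One simple adaptive scheme, in the spirit of ALNS, is roulette-wheel operator selection with reward-based weight updates. The class, operator names, and decay parameter below are a hypothetical sketch, not a specific library's API:

```python
import random

class AdaptiveSelector:
    """Roulette-wheel selection of destroy operators, with weights
    updated from observed rewards (a simplified ALNS-style sketch)."""

    def __init__(self, operators, decay=0.8, seed=0):
        self.operators = operators
        self.weights = {op: 1.0 for op in operators}
        self.decay = decay
        self.rng = random.Random(seed)

    def pick(self):
        """Sample an operator with probability proportional to its weight."""
        total = sum(self.weights.values())
        r = self.rng.uniform(0, total)
        acc = 0.0
        for op, w in self.weights.items():
            acc += w
            if r <= acc:
                return op
        return self.operators[-1]

    def reward(self, op, score):
        # Exponential smoothing: operators that recently produced
        # improvements get picked more often in later iterations.
        self.weights[op] = self.decay * self.weights[op] + (1 - self.decay) * score

sel = AdaptiveSelector(["random_free", "related_free", "worst_free"])
op = sel.pick()
sel.reward(op, score=5.0)  # e.g., a large reward when a new incumbent is found
```

The reward would typically be tiered (new incumbent > improvement > accepted move), so the selector gradually concentrates on the operators that work for the instance at hand.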
Pros:
- Quickly finds high-quality feasible solutions on instances where exact methods stall.
- Largely technology independent: any CP or MIP solver can serve as the repair engine.
- Scales well, since each iteration solves only a small restricted sub-problem.

Cons:
- Provides no optimality guarantee or dual bound on its own.
- Performance is sensitive to the choice of destroy operator and neighborhood size.
- Requires an initial feasible solution, which can itself be hard to obtain.
Large Neighborhood Search offers an elegant compromise between heuristic speed and exact algorithmic rigor. By systematically destroying and repairing portions of a solution through sub-MIP formulations, LNS algorithms like RINS and Local Branching have fundamentally elevated the primal performance of commercial MILP optimization software today.