Solvor all your optimization needs.
| Category | Solvors | Use Case |
|---|---|---|
| Linear/Integer | solve_lp, solve_lp_interior, solve_milp, solve_cg, solve_bp | Resource allocation, cutting stock |
| Constraint | solve_sat, Model | Sudoku, puzzles, and that one config problem that's been bugging you |
| Combinatorial | solve_knapsack, solve_bin_pack, solve_job_shop, solve_vrptw | Packing, scheduling, routing |
| Local Search | anneal, tabu_search, lns, alns | TSP, combinatorial optimization |
| Population | evolve, differential_evolution, particle_swarm | Global search, nature-inspired |
| Gradient | gradient_descent, momentum, rmsprop, adam | ML, curve fitting |
| Quasi-Newton | bfgs, lbfgs | Fast convergence, smooth functions |
| Derivative-Free | nelder_mead, powell, bayesian_opt | Black-box, expensive functions |
| Pathfinding | bfs, dfs, dijkstra, astar, astar_grid, bellman_ford, floyd_warshall | Shortest paths, graph traversal |
| Graph | max_flow, min_cost_flow, kruskal, prim, topological_sort, strongly_connected_components, pagerank, louvain, articulation_points, bridges, kcore_decomposition | Flow, MST, dependency, centrality, community detection |
| Assignment | solve_assignment, solve_hungarian, network_simplex | Matching, min-cost flow |
| Exact Cover | solve_exact_cover | N-Queens, tiling puzzles |
| Utilities | FenwickTree, UnionFind | Data structures for algorithms |
```bash
uv add solvor
```

```python
from solvor import solve_lp, solve_tsp, dijkstra, solve_hungarian
# Linear Programming
result = solve_lp(c=[1, 2], A=[[1, 1], [2, 1]], b=[4, 5])
print(result.solution) # optimal x
# TSP with tabu search
distances = [[0, 10, 15], [10, 0, 20], [15, 20, 0]]
result = solve_tsp(distances)
print(result.solution) # best tour found
# Shortest path
graph = {'A': [('B', 1), ('C', 4)], 'B': [('C', 2)], 'C': []}
result = dijkstra('A', 'C', lambda n: graph.get(n, []))
print(result.solution) # ['A', 'B', 'C']
# Assignment
costs = [[10, 5], [3, 9]]
result = solve_hungarian(costs)
print(result.solution) # [1, 0] - row 0 gets col 1, row 1 gets col 0
```

## Linear & Integer Programming
For resource allocation, blending, production planning. Finds the exact optimum for linear objectives with linear constraints.
```python
# minimize 2x + 3y subject to x + y >= 4, x <= 3
result = solve_lp(
    c=[2, 3],
    A=[[-1, -1], [1, 0]], # constraints as Ax <= b
    b=[-4, 3]
)
```

When some variables must be integers. Diet problems, scheduling with discrete slots, set covering.

```python
# same as above, but x must be integer
result = solve_milp(c=[2, 3], A=[[-1, -1], [1, 0]], b=[-4, 3], integers=[0])
```

Column generation for problems with exponentially many variables. Cutting stock, bin packing, vehicle routing, crew scheduling.

```python
# Cutting stock: minimize rolls to cut required pieces
result = solve_cg(
    demands=[97, 610, 395, 211],
    roll_width=100,
    piece_sizes=[45, 36, 31, 14],
)
print(result.objective) # 454 rolls
```

Branch-and-price for optimal integer solutions. Combines column generation with branch-and-bound for proven optimality.

```python
# Cutting stock with guaranteed integer optimality
result = solve_bp(
    demands=[97, 610, 395, 211],
    roll_width=100,
    piece_sizes=[45, 36, 31, 14],
)
print(result.objective) # Integer optimal
```

## Constraint Programming
For "is this configuration valid?" problems. Dependencies, exclusions, implications - anything that boils down to boolean constraints.
```python
# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
result = solve_sat([[1, 2], [-1, 3], [-2, -3]])
print(result.solution) # {1: True, 2: False, 3: True}
```

Constraint programming for puzzles and scheduling with "all different", arithmetic, and logical constraints. Sudoku, N-Queens, timetabling.

```python
from solvor import Model

m = Model()
cells = [[m.int_var(1, 9, f'c{i}{j}') for j in range(9)] for i in range(9)]
# All different in each row
for row in cells:
    m.add(m.all_different(row))
result = m.solve()
```

## Metaheuristics
Simulated annealing: sometimes you have to go downhill to find a higher peak.

```python
result = anneal(
    initial=initial_solution,
    objective_fn=cost_function,
    neighbors=random_neighbor,
    temperature=1000,
    cooling=0.9995
)
```

Greedy local search with a "don't go back there" list. Simple but surprisingly effective.

```python
result = tabu_search(
    initial=initial_solution,
    objective_fn=cost_function,
    neighbors=get_neighbors, # returns [(move, solution), ...]
    cooldown=10
)
```

Large Neighborhood Search: destroy part of your solution and rebuild it better. ALNS learns which operators work best.

```python
result = lns(initial, objective_fn, destroy_ops, repair_ops, max_iter=1000)
result = alns(initial, objective_fn, destroy_ops, repair_ops, max_iter=1000) # adaptive
```

Population-based global search. Let a swarm of candidates explore for you. DE and PSO work on continuous spaces.

```python
result = evolve(objective_fn=fitness, population=pop, crossover=cx, mutate=mut)
result = differential_evolution(objective_fn, bounds=[(0, 10)] * n, population_size=50)
result = particle_swarm(objective_fn, bounds=[(0, 10)] * n, n_particles=30)
```

## Continuous Optimization
gradient_descent, momentum, rmsprop, adam - follow the slope. Adam adapts learning rates per parameter.

```python
def grad_fn(x):
    return [2 * x[0], 2 * x[1]] # gradient of x^2 + y^2
result = adam(grad_fn, x0=[5.0, 5.0])
```

bfgs, lbfgs - approximate the Hessian for faster convergence. L-BFGS uses limited memory.

```python
result = bfgs(objective_fn, grad_fn, x0=[5.0, 5.0])
result = lbfgs(objective_fn, grad_fn, x0=[5.0, 5.0], memory=10)
```

nelder_mead, powell - no gradients needed. Good for noisy, non-smooth, or "I have no idea what this function looks like" situations.

```python
result = nelder_mead(objective_fn, x0=[5.0, 5.0])
result = powell(objective_fn, x0=[5.0, 5.0])
```

bayesian_opt is for when each evaluation is expensive. It builds a surrogate model to sample intelligently - perfect for hyperparameter tuning or when your simulation takes 10 minutes per run.

```python
result = bayesian_opt(expensive_fn, bounds=[(0, 1), (0, 1)], max_iter=30)
```

## Pathfinding
Unweighted graph traversal. BFS finds shortest paths (fewest edges), DFS explores depth-first. Both work with any state type and neighbor function.
```python
# Find shortest path in a grid
def neighbors(pos):
    x, y = pos
    return [(x+1, y), (x-1, y), (x, y+1), (x, y-1)]
result = bfs(start=(0, 0), goal=(5, 5), neighbors=neighbors)
print(result.solution) # path from start to goal
```
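dfs is called the same way; a minimal sketch, assuming dfs mirrors the bfs signature above (the text only promises that both share the neighbor-function interface):

```python
# Hedged sketch: assumes dfs(start, goal, neighbors) mirrors the bfs call.
# DFS returns a path, but not necessarily one with the fewest edges.
result = dfs(start=(0, 0), goal=(5, 5), neighbors=neighbors)
```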
Weighted shortest paths. Classic algorithm for when edges have costs. Stops early when the goal is found.

```python
def neighbors(node):
    return graph[node] # returns [(neighbor, cost), ...]
result = dijkstra(start='A', goal='Z', neighbors=neighbors)
```

A* with heuristics. Faster than Dijkstra when you have a good distance estimate. astar_grid handles 2D grids with built-in heuristics.

```python
# Grid pathfinding with obstacles
grid = [[0, 0, 1, 0], [0, 0, 1, 0], [0, 0, 0, 0]]
result = astar_grid(grid, start=(0, 0), goal=(2, 3))
```
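For general graphs, astar is listed alongside dijkstra; a hedged sketch, assuming it takes dijkstra's arguments plus a heuristic callable (the parameter name is an assumption, not documented here):

```python
# Hypothetical sketch: assumes astar extends dijkstra's interface with a heuristic.
def weighted_neighbors(pos):
    x, y = pos
    return [((x+1, y), 1), ((x-1, y), 1), ((x, y+1), 1), ((x, y-1), 1)] # unit-cost moves

def manhattan(pos):
    return abs(pos[0] - 2) + abs(pos[1] - 3) # estimated distance to goal (2, 3)

result = astar(start=(0, 0), goal=(2, 3), neighbors=weighted_neighbors, heuristic=manhattan)
```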
bellman_ford handles negative edge weights. Slower than Dijkstra, but it detects negative cycles, which is useful when costs can go negative.

```python
result = bellman_ford(start=0, edges=[(0, 1, 5), (1, 2, -3), ...], n_nodes=4)
```

All-pairs shortest paths. O(n³), but it gives you every shortest path at once. Worth it when you need the full picture.

```python
result = floyd_warshall(n_nodes=4, edges=[(0, 1, 3), (1, 2, 1), ...])
# result.solution[i][j] = shortest distance from i to j
```

## Network Flow & MST
"How much can I push through this network?" Assigning workers to tasks, finding bottlenecks.
```python
graph = {
    's': [('a', 10, 0), ('b', 5, 0)],
    'a': [('b', 15, 0), ('t', 10, 0)],
    'b': [('t', 10, 0)],
    't': []
}
result = max_flow(graph, 's', 't')
```

"What's the cheapest way to route X units?" Use min_cost_flow for simplicity, network_simplex for large instances.

```python
# network_simplex for large min-cost flow
arcs = [(0, 1, 10, 2), (0, 2, 5, 3), (1, 2, 15, 1)] # (from, to, cap, cost)
supplies = [10, 0, -10] # positive = source, negative = sink
result = network_simplex(n_nodes=3, arcs=arcs, supplies=supplies)
```
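For the simpler min_cost_flow, a hedged sketch: the call below assumes it shares max_flow's (graph, source, sink) shape and reads each arc's third tuple element as its cost, neither of which this page actually specifies:

```python
# Hypothetical: assumes min_cost_flow takes max_flow's graph format,
# node: [(neighbor, capacity, cost), ...], and the same call shape.
graph = {'s': [('a', 10, 2), ('b', 5, 3)], 'a': [('t', 10, 1)], 'b': [('t', 10, 1)], 't': []}
result = min_cost_flow(graph, 's', 't')
```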
Minimum spanning tree. Connect all nodes with minimum total edge weight. Kruskal sorts edges, Prim grows from a start node.

```python
edges = [(0, 1, 4), (0, 2, 3), (1, 2, 2)] # (u, v, weight)
result = kruskal(n_nodes=3, edges=edges)
# result.solution = [(1, 2, 2), (0, 2, 3)] - MST edges
```
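prim works on the same data; a hedged sketch, assuming prim(n_nodes, edges) matches kruskal's signature (not confirmed here):

```python
# Hypothetical: assumes prim mirrors kruskal's call; a start-node argument,
# if supported, would be an extra parameter not shown on this page.
result = prim(n_nodes=3, edges=edges) # same edge list as the kruskal example
```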
## Assignment

Optimal one-to-one matching. solve_hungarian is a direct O(n³) algorithm for the assignment problem.

```python
costs = [
    [10, 5, 13],
    [3, 9, 18],
    [10, 6, 12]
]
result = solve_hungarian(costs)
# result.solution[i] = column assigned to row i
# result.objective = total cost
# For maximization
result = solve_hungarian(costs, minimize=False)
```
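solve_assignment appears alongside solve_hungarian in the tables; the one-liner below assumes it accepts the same cost matrix, which is a guess rather than documented behavior:

```python
# Hypothetical: assumes solve_assignment takes a cost matrix like solve_hungarian.
result = solve_assignment(costs)
```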
For "place these pieces without overlap" problems. Sudoku, pentomino tiling, N-Queens, or any puzzle where you stare at a grid wondering why nothing fits.
# Tiling problem: cover all columns with non-overlapping rows
matrix = [
    [1, 1, 0, 0], # row 0 covers columns 0, 1
    [0, 1, 1, 0], # row 1 covers columns 1, 2
    [0, 0, 1, 1], # row 2 covers columns 2, 3
    [1, 0, 0, 1], # row 3 covers columns 0, 3
]
result = solve_exact_cover(matrix)
# result.solution = (0, 2) or (1, 3) - rows that cover all columns exactly once
```

## Combinatorial Solvers
The classic "what fits in your bag" problem. Select items to maximize value within capacity.
```python
values = [60, 100, 120]
weights = [10, 20, 30]
result = solve_knapsack(values, weights, capacity=50)
# result.solution = (1, 2) - indices of selected items
```

Bin packing heuristics. Minimize bins needed for items.

```python
items = [4, 8, 1, 4, 2, 1]
result = solve_bin_pack(items, bin_capacity=10)
# result.solution = (1, 0, 0, 1, 0, 0) - bin index for each item
# result.objective = 2 (number of bins)
```

Job shop scheduling. Minimize makespan for jobs on machines.

```python
# jobs[i] = [(machine, duration), ...] - operations for job i
jobs = [[(0, 3), (1, 2)], [(1, 2), (0, 4)]]
result = solve_job_shop(jobs, n_machines=2)
```

Vehicle Routing with Time Windows. Serve customers with capacity and time constraints.

```python
from solvor import Customer, solve_vrptw
customers = [
    Customer(1, 10, 10, demand=10, tw_start=0, tw_end=50, service_time=5),
    Customer(2, 20, 20, demand=10, tw_start=0, tw_end=50, service_time=5),
    Customer(3, 30, 30, demand=10, tw_start=0, tw_end=50, service_time=5),
]
result = solve_vrptw(customers, vehicles=2, vehicle_capacity=30)
```

All solvors return a consistent Result dataclass:

```python
Result(
    solution,    # best solution found
    objective,   # objective value
    iterations,  # solver iterations (pivots, generations, etc.)
    evaluations, # function evaluations
    status,      # OPTIMAL, FEASIBLE, INFEASIBLE, UNBOUNDED, MAX_ITER
    error,       # error message if failed (None on success)
    solutions,   # multiple solutions when solution_limit > 1
)
```
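A typical pattern: check for errors, then read the fields. This sketch reuses the solve_lp call from the quickstart and touches only the documented fields:

```python
from solvor import solve_lp

result = solve_lp(c=[1, 2], A=[[1, 1], [2, 1]], b=[4, 5])
print(result.status)        # e.g. OPTIMAL
if result.error is None:    # error is None on success
    print(result.solution)  # best solution found
    print(result.objective) # objective value
```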
| Problem | Solvor |
|---|---|
| Linear constraints, continuous | solve_lp, solve_lp_interior |
| Linear constraints, integers | solve_milp |
| Cutting stock, crew scheduling | solve_cg, solve_bp |
| Boolean satisfiability | solve_sat |
| Discrete vars, complex constraints | Model |
| Knapsack, subset selection | solve_knapsack |
| Bin packing | solve_bin_pack |
| Job shop scheduling | solve_job_shop |
| Vehicle routing | solve_vrptw |
| Combinatorial, local search | tabu_search, anneal |
| Combinatorial, destroy-repair | lns, alns |
| Global search, continuous | differential_evolution, particle_swarm |
| Global search, discrete | evolve |
| Smooth, differentiable, fast | bfgs, lbfgs |
| Smooth, differentiable, ML | adam, rmsprop |
| Non-smooth, no gradients | nelder_mead, powell |
| Expensive black-box | bayesian_opt |
| Shortest path, unweighted | bfs, dfs |
| Shortest path, weighted | dijkstra, astar |
| Shortest path, negative weights | bellman_ford |
| All-pairs shortest paths | floyd_warshall |
| Minimum spanning tree | kruskal, prim |
| Dependency ordering | topological_sort |
| Circular dependencies | strongly_connected_components |
| Node importance | pagerank |
| Community detection | louvain |
| Single points of failure | articulation_points, bridges |
| Core vs periphery | kcore_decomposition |
| Maximum flow | max_flow |
| Min-cost flow | min_cost_flow, network_simplex |
| Assignment, matching | solve_hungarian, solve_assignment |
| Exact cover, tiling | solve_exact_cover |
Have your cake and eat it too. solvOR gives you both:
- Readable Python - Every algorithm is implemented in clear, documented Python you can study and modify
- Rust Performance - Optional drop-in Rust backends with 5-60x speedup for compute-heavy algorithms
```python
from solvor import floyd_warshall
# "auto" (default) - Uses Rust if available, falls back to Python
result = floyd_warshall(n_nodes, edges)
# "python" - Force pure Python (for learning, debugging, or extending)
result = floyd_warshall(n_nodes, edges, backend="python")
```

Pre-built wheels include Rust extensions for Linux, macOS, and Windows. If Rust isn't available, solvOR falls back to Python automatically.
See the Performance docs for which algorithms have Rust backends and benchmarks.
- Readable first: every algorithm in clear Python you can actually understand
- Performance when needed: optional Rust backends, zero-copy where possible
- Consistent: same Result format, same minimize/maximize convention
- Practical: solves real problems (and AoC puzzles)
Full docs at solvOR.ai: getting started, algorithm reference, cookbook with worked examples, and troubleshooting.
See CONTRIBUTING.md for development setup and guidelines.
Apache 2.0 License, free for personal and commercial use.
A little bit of history...
I learned about solvers back in 2011, working with some great minds at a startup in Paris. I started writing Python myself around 2018, always as a hobby, and in 2024 I got back into solvers for an energy management system (EMS) and wrote a few tools (simplex, milp, genetic) myself, mainly to improve my understanding.

Over time this toolbox grew larger and larger, so I decided to publish it on GitHub so others can use it and improve it even further. Since I am learning Rust, I will eventually replace some performance-critical operations with high-performance Rust implementations. But since I work on this project (and others) in my spare time, what and when is uncertain. The name solvOR is a mix of solver(s) and OR (Operations Research).
Disclaimer: I am not a professional software engineer, so there are probably some obvious improvements possible. If so, let me know or create a PR!