
In mathematics, a **constraint** is a condition of an optimization problem that the solution must satisfy. There are several types of constraints—primarily equality constraints, inequality constraints, and integer constraints. The set of candidate solutions that satisfy all constraints is called the feasible set.^{[1]}

The following is a simple optimization problem:

min *f*(**x**) = *x*_{1}^{2} + *x*_{2}^{4}

subject to

*x*_{1} ≥ 1

and

*x*_{2} = 1,

where **x** denotes the vector (*x*_{1}, *x*_{2}).

In this example, the first line defines the function to be minimized (called the objective function, loss function, or cost function). The second and third lines define two constraints, the first of which is an inequality constraint and the second of which is an equality constraint. These two constraints are hard constraints, meaning that it is required that they be satisfied; they define the feasible set of candidate solutions.

Without the constraints, the solution would be (0, 0), where *f*(**x**) has the lowest value. But this solution does not satisfy the constraints. The solution of the constrained optimization problem stated above is **x** = (1, 1), which is the point with the smallest value of *f*(**x**) that satisfies the two constraints.
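The distinction between the unconstrained and constrained minima can be sketched numerically. The snippet below is a minimal illustration, not a serious solver: it assumes the example problem reads "minimize *f*(**x**) = *x*_{1}^{2} + *x*_{2}^{4} subject to *x*_{1} ≥ 1 and *x*_{2} = 1", and runs a coarse grid search, keeping only candidates that lie in the feasible set.

```python
# Minimal sketch (assumed problem: minimize f(x) = x1^2 + x2^4
# subject to x1 >= 1 and x2 = 1) via a coarse grid search.

def f(x1, x2):
    return x1 ** 2 + x2 ** 4          # objective (cost) function

def feasible(x1, x2):
    return x1 >= 1 and x2 == 1        # hard constraints define the feasible set

# Candidate grid: x1, x2 in {-2.0, -1.9, ..., 2.0}
grid = [round(-2 + 0.1 * i, 1) for i in range(41)]
candidates = [(a, b) for a in grid for b in grid]

# The unconstrained minimum is at (0, 0) ...
best_unconstrained = min(candidates, key=lambda p: f(*p))
# ... but the constrained minimum must lie in the feasible set.
best_constrained = min((p for p in candidates if feasible(*p)),
                       key=lambda p: f(*p))

print(best_unconstrained)  # (0.0, 0.0)
print(best_constrained)    # (1.0, 1.0)
```

In practice one would use a real constrained-optimization routine rather than a grid, but the grid makes the role of the feasible set explicit.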

- If an inequality constraint holds with *equality* at the optimal point, the constraint is said to be **binding**, as the point *cannot* be varied in the direction of the constraint even though doing so would improve the value of the objective function.
- If an inequality constraint holds as a *strict inequality* at the optimal point (that is, does not hold with equality), the constraint is said to be **non-binding**, as the point *could* be varied in the direction of the constraint, although it would not be optimal to do so. Under certain conditions, as for example in convex optimization, if a constraint is non-binding, the optimization problem would have the same solution even in the absence of that constraint.
- If a constraint is not satisfied at a given point, the point is said to be **infeasible**.
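The binding/non-binding distinction can be checked mechanically: write each inequality constraint in the standard form *g*(**x**) ≥ 0 and evaluate its slack at the optimum. The toy problem below is invented for illustration (it adds a hypothetical extra constraint *x*_{1} ≤ 5 so that one constraint of each kind appears).

```python
# Toy check (hypothetical problem: minimize x1^2 + x2^4 subject to
# x1 >= 1 and x1 <= 5, with optimum x* = (1, 1)): a constraint written
# as g(x) >= 0 is binding at x* when g(x*) == 0, non-binding when g(x*) > 0.
x_star = (1.0, 1.0)

inequalities = {
    "x1 >= 1": lambda x: x[0] - 1,   # standard form: x1 - 1 >= 0
    "x1 <= 5": lambda x: 5 - x[0],   # standard form: 5 - x1 >= 0
}

for name, g in inequalities.items():
    slack = g(x_star)
    status = "binding" if abs(slack) < 1e-9 else "non-binding"
    print(f"{name}: slack = {slack}, {status}")
```

Here *x*_{1} ≥ 1 is binding (zero slack at the optimum), while the invented *x*_{1} ≤ 5 is non-binding and could be dropped without changing the solution.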

If the problem mandates that the constraints be satisfied, as in the above discussion, the constraints are sometimes referred to as *hard constraints*. However, in some problems, called flexible constraint satisfaction problems, it is preferred but not required that certain constraints be satisfied; such non-mandatory constraints are known as *soft constraints*. Soft constraints arise in, for example, preference-based planning. In a MAX-CSP problem, a number of constraints are allowed to be violated, and the quality of a solution is measured by the number of satisfied constraints.
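The MAX-CSP idea above can be sketched in a few lines: rather than rejecting any assignment that violates a constraint, score each assignment by how many constraints it satisfies. The variables and constraints below are made up purely for illustration.

```python
# Sketch of MAX-CSP-style scoring: constraint violations are allowed,
# and solution quality is the number of satisfied constraints.
# All variable names and constraints here are invented for illustration.

constraints = [
    lambda a: a["x"] != a["y"],      # x and y must differ
    lambda a: a["x"] + a["y"] < 5,   # sum must stay small
    lambda a: a["y"] % 2 == 0,       # y must be even
]

def score(assignment):
    """MAX-CSP quality measure: count of satisfied constraints."""
    return sum(1 for c in constraints if c(assignment))

print(score({"x": 1, "y": 2}))  # satisfies all three -> 3
print(score({"x": 2, "y": 3}))  # satisfies only the first -> 1
```

A MAX-CSP solver then searches for the assignment maximizing this score, instead of insisting on a fully consistent one.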

Global constraints^{[2]} are constraints representing a specific relation on a number of variables, taken altogether. Some of them, such as the `alldifferent` constraint, can be rewritten as a conjunction of atomic constraints in a simpler language: the `alldifferent` constraint holds on *n* variables *x*_{1}, ..., *x*_{n}, and is satisfied if the variables take values which are pairwise different. It is semantically equivalent to the conjunction of inequalities *x*_{i} ≠ *x*_{j} for *i* < *j*. Other global constraints extend the expressivity of the constraint framework. In this case, they usually capture a typical structure of combinatorial problems. For instance, the `regular` constraint expresses that a sequence of variables is accepted by a deterministic finite automaton.
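The equivalence between `alldifferent` and its decomposition into pairwise inequalities can be sketched directly (hashable values assumed):

```python
# Sketch: the alldifferent global constraint versus its decomposition
# into the conjunction of pairwise inequalities x_i != x_j for i < j.
from itertools import combinations

def alldifferent(values):
    # Global view: n values are pairwise different iff no value repeats.
    return len(set(values)) == len(values)

def pairwise_decomposition(values):
    # Equivalent conjunction of atomic constraints x_i != x_j (i < j).
    return all(a != b for a, b in combinations(values, 2))

for vals in [(1, 2, 3), (1, 2, 2), ()]:
    assert alldifferent(vals) == pairwise_decomposition(vals)
print(alldifferent((1, 2, 3)), alldifferent((1, 2, 2)))  # True False
```

The point of treating `alldifferent` as one global constraint, rather than as its *n*(*n* − 1)/2 atomic pieces, is that a solver can propagate it much more effectively in one step.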

Global constraints are used^{[3]} to simplify the modeling of constraint satisfaction problems, to extend the expressivity of constraint languages, and also to improve constraint resolution: by considering the variables together, infeasible situations can be detected earlier in the solving process. Many of the global constraints are referenced in an online catalog.
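The `regular` constraint mentioned above can be illustrated with a minimal acceptance check: a sequence of variable values satisfies the constraint exactly when a given deterministic finite automaton accepts it. The automaton below (states, alphabet, and language all invented for illustration) accepts sequences of 0s and 1s with no two consecutive 1s.

```python
# Sketch of the regular global constraint: the sequence of values taken
# by the variables must be accepted by a deterministic finite automaton.
# This hypothetical DFA rejects any sequence with two consecutive 1s.
transitions = {
    ("ok", 0): "ok",
    ("ok", 1): "seen1",
    ("seen1", 0): "ok",
    # ("seen1", 1) is undefined: two consecutive 1s -> reject.
}
start, accepting = "ok", {"ok", "seen1"}

def regular(sequence):
    state = start
    for symbol in sequence:
        state = transitions.get((state, symbol))
        if state is None:        # dead state: constraint violated
            return False
    return state in accepting

print(regular([0, 1, 0, 1]))  # True
print(regular([1, 1, 0]))     # False
```

In a constraint solver the automaton is given as part of the model, and propagation prunes variable domains so that only accepted sequences remain reachable.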