Assumptions of the LP Model

Let's unpack the four key assumptions of Linear Programming (LP) clearly and practically, with examples so they stick.


1. Linearity

We already discussed this, but here’s the full picture:

Meaning:

  • The effect of each decision variable on the objective function and constraints is directly proportional and constant.

  • If you double a decision variable, its contribution to the total also doubles.

Mathematically:


Z = c_1x_1 + c_2x_2 + \dots + c_nx_n

subject to constraints of the form

a_{11}x_1 + a_{12}x_2 + \dots + a_{1n}x_n \leq b_1

Example:
If producing a chair always gives $50 profit and uses 4 hours of labor, making 2 chairs gives $100 profit and 8 labor hours — exactly double.

If violated:
If making the 100th chair is harder than the first (due to fatigue), the profit/labor relationship is no longer constant → non-linear problem.
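The proportionality and additivity in this section can be checked numerically. A quick sketch in plain Python, using the chair numbers from the text ($50 profit, 4 labor hours per chair) plus an illustrative second product:

```python
# A linear profit function: each variable's contribution is proportional
# and additive. The $50 / 4-hour chair figures come from the text; the
# table coefficients are illustrative assumptions.
def profit(chairs, tables):
    return 50 * chairs + 80 * tables

def labor_hours(chairs, tables):
    return 4 * chairs + 6 * tables

# Doubling a decision variable exactly doubles its contribution:
assert profit(2, 0) == 2 * profit(1, 0)            # $100 vs 2 * $50
assert labor_hours(2, 0) == 2 * labor_hours(1, 0)  # 8 h vs 2 * 4 h
```

If fatigue made the 100th chair cost more labor than the first, `labor_hours` would need a non-linear term and these equalities would fail.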


2. Divisibility

Meaning:

  • Decision variables can take any fractional value, not just whole numbers.

  • The model assumes production, allocation, or usage can be split into smaller parts without restriction.

Example:

  • LP allows you to produce 2.5 chairs or invest $3,742.56 in a project, even if in reality chairs must be whole and money is in cents.

  • This is okay in theory because LP models often represent large-scale averages where fractions make sense.

If violated:

  • If variables must be whole numbers (like people, trucks, or discrete items), we need Integer Programming, not standard LP.
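Divisibility shows up naturally in solver output: an LP optimum often lands on a fractional value. A small sketch using SciPy's `linprog` (assuming SciPy is available; the coefficients are illustrative, not from the text):

```python
# A tiny LP whose optimum is fractional. linprog minimizes, so we negate
# the (illustrative) profits: $50 per chair, $80 per table, 25 labor hours,
# with chairs using 4 hours each and tables 6.
from scipy.optimize import linprog

res = linprog(c=[-50, -80],          # maximize 50*x1 + 80*x2
              A_ub=[[4, 6]],         # 4*x1 + 6*x2 <= 25 labor hours
              b_ub=[25],
              bounds=[(0, None), (0, None)])

print(res.x)  # optimal plan is fractional: x2 = 25/6 ≈ 4.17 tables
```

Standard LP happily reports 25/6 of a table; if whole units are required, the same model must be re-solved as an integer program.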


3. Certainty

Meaning:

  • All coefficients in the objective function and constraints are known with complete certainty and remain constant during the planning period.

  • No randomness or probability — the model assumes the environment is stable.

Example:

  • If you say each chair uses 4 hours of labor, that is exactly 4 hours for every chair — not 3.8 today and 4.2 tomorrow.

  • If the profit is $50 per chair, that stays $50 for the entire decision-making period.

If violated:

  • If market prices, material usage, or demand change unpredictably, we need stochastic programming or robust optimization.
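A quick arithmetic sketch of why certainty matters: even a small drift in one "fixed" coefficient changes the feasible plan. The 4 h/chair and drifted 4.2 h figures echo the example above; the 40-hour capacity is an illustrative assumption:

```python
# The planned labor coefficient (4 h/chair) vs. a drifted one (4.2 h)
# changes how many chairs fit in the same labor budget.
labor_available = 40.0

for hours_per_chair in (4.0, 4.2):
    max_chairs = labor_available / hours_per_chair
    print(f"{hours_per_chair} h/chair -> at most {max_chairs:.2f} chairs")
```

With 4.0 h/chair the budget allows exactly 10 chairs; at 4.2 h it allows only about 9.52, so a plan optimized under the first coefficient is infeasible under the second.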


4. Non-Negativity

Meaning:

  • Decision variables can’t be negative.

  • You can’t produce -5 chairs or use -10 hours of labor — that makes no physical or logical sense.

Mathematically:


x_i \geq 0 \quad \text{for all } i

Example:
If x_1 = chairs produced, then x_1 \geq 0. Even if the unconstrained math said the "best" value of x_1 was negative, the non-negativity restriction forces it to be at least zero.
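Solvers bake this restriction in. A minimal sketch with SciPy's `linprog` (assuming SciPy is available), whose default variable bounds are (0, None), i.e. x ≥ 0:

```python
# Minimizing Z = x would be unbounded below without non-negativity;
# with linprog's default bounds (0, None), the optimum is pinned at 0.
from scipy.optimize import linprog

res = linprog(c=[1])  # minimize x, subject only to the default x >= 0
print(res.x)          # [0.]
```

Dropping the bound (e.g. `bounds=[(None, None)]`) would make this problem unbounded, which is exactly why LP models declare non-negativity explicitly.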


Quick Real-Life Analogy for All Four

Imagine you run a bakery:

  1. Linearity:

    • 1 cake always uses 2 cups of flour and earns $10 profit. Double cakes → double flour, double profit.

  2. Divisibility:

    • You can “mathematically” make 2.5 cakes in the model (even if real life doesn’t allow half-cakes).

  3. Certainty:

    • Flour cost, cake price, and baking time are fixed and known ahead of time.

  4. Non-Negativity:

    • You can’t make negative cakes or use negative flour.
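The bakery analogy fits in a one-variable LP. A sketch with SciPy's `linprog` (assuming SciPy is available), using the analogy's numbers ($10 profit and 2 cups of flour per cake) plus an assumed 15-cup flour supply:

```python
# Bakery LP: maximize 10 * cakes subject to 2 * cakes <= 15 cups of flour.
# Linearity: fixed coefficients. Divisibility: fractional answer allowed.
# Certainty: coefficients known. Non-negativity: default bounds give x >= 0.
from scipy.optimize import linprog

res = linprog(c=[-10],     # linprog minimizes, so negate the profit
              A_ub=[[2]],  # flour usage per cake
              b_ub=[15])   # flour available

print(res.x)  # 7.5 cakes -- a perfectly legal LP answer, half-cake and all
```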

