Identify key stakeholders and their needs, and the project's position within the organization (urgent or not)
Deliver value incrementally
Monitor results for business value
Measure progress
Subdivide tasks to find the MVP (minimum viable product = core value loop)
Make data-driven decisions
| Term | Focus | Audience | Scope & Detail | Example Outcome |
| --- | --- | --- | --- | --- |
| POC (Proof of Concept) | Feasibility (“Can it be built?”) | Internal (team/stakeholders) | Very limited; tests the core idea | Basic script showing a technology works |
| Prototype (Design & Usability) | “How will it look and feel?” | Internal + early testers | More visual/functional mockup | Clickable wireframe or demo model |
| MVP (Minimum Viable Product) | Market validation (“Will users pay for it?”) | External (early customers) | Functional product with core features | Basic app released to test demand |
| MBI (Minimum Business Increment) | Business value (“Does it deliver measurable business impact?”) | External (paying customers / broader market) | Smallest piece of work that quickly realizes business value | A shipped increment that independently drives revenue, retention, cost savings, or another key business metric |
Deliver Value Incrementally
Deliverable:
Kanban board: The tool shows the stages of work required to deliver value incrementally.
Backlog: Created by the product owner, this is a prioritized list of work items the project team draws from to do their work.
Lessons learned, PDCA, and agile practices.
Keep business value in check
Ensure the delivered value remains aligned with business goals.
Financial gain - increased sales, revenue, or profit
Improvement - better efficiency, quality, working conditions, or infrastructure
New customers and opportunities - gain in market share
First to market - prestige for the organization and competitive advantage
Social - impact on a wider community or cause
Technological - improved processes or a stronger digital infrastructure or presence
Project Charter
Business Case - provides the justification for a project, program, or portfolio; it is essential for ensuring the business value of the project work.
Release Planning - product roadmap, iterations, sprints - allows the product owner and team to decide how much needs to be developed and how long it will take to produce a releasable product, based on business goals.
Subdivide tasks to find the MVP (sub-tasks)
MVP: minimum viable products are the foundation of a prototype that allows early testing and increases the potential of a project.
Story mapping - Kanban board, customer validation
MoSCoW method - separates user requirements into 4 categories (a sketch of MoSCoW-driven backlog filtering follows this list):
Must have - non-negotiable attribute
Should have - important but not essential
Could have - desirable, time and budget permitting
Won’t have - not in the budget or timeline; “nice to have” but with no real value for now
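As a minimal illustration of carving the MVP out of a MoSCoW-tagged backlog (the items, tags, and estimates below are hypothetical), filtering on the must-have category yields the core value loop:

Code
# Hypothetical backlog tagged with MoSCoW categories
backlog <- data.frame(
  item     = c("User login", "Checkout flow", "Dark mode",
               "Export to PDF", "Gamified badges"),
  moscow   = c("Must", "Must", "Could", "Should", "Wont"),
  estimate = c(5, 8, 3, 5, 13)  # story points
)

# The MVP scope is the set of must-have items (the core value loop)
mvp <- backlog[backlog$moscow == "Must", ]
print(mvp)
cat("MVP size:", sum(mvp$estimate), "story points\n")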
MBI - Facebook’s “Like” Button
They are post-validation (after MVP/proof of demand)
They are independently deployable and provide end-to-end value (no “half-finished” experience).
They tie directly to measurable business outcomes (revenue, cost savings, retention, efficiency).
They are intentionally small to enable fast feedback and low risk.
Measure progress - the best tools capture both quantitative and qualitative data
Define value from the customer
Determine value expectations
Set targets and baselines
Determine metrics that communicate progress
Select one or more means of collecting metric data
Collect data at regular intervals
TOOLS:
WBS, Kanban
Burndown charts for predicting when all the work will be completed.
EVM (earned value management): tracks project performance against the baseline (cost, quality, time, scope, and resources); see the EVM sketch after this list.
Reporting and tracking tools (PMIS, Microsoft Project, etc.; cumulative flow diagrams, velocity charts, etc.)
Retrospectives - after an iteration or phase, actively measure progress.
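The standard EVM formulas (CV = EV - AC, SV = EV - PV, CPI = EV / AC, SPI = EV / PV, EAC = BAC / CPI) are easy to sanity-check in R; the status-date figures below are made up for illustration:

Code
# Hypothetical status-date figures (all values in $)
BAC <- 100000  # budget at completion
PV  <- 40000   # planned value (work scheduled to date)
EV  <- 35000   # earned value (work completed, valued at budget)
AC  <- 42000   # actual cost to date

CV  <- EV - AC    # cost variance (negative = over budget)
SV  <- EV - PV    # schedule variance (negative = behind schedule)
CPI <- EV / AC    # cost performance index (<1 = over budget)
SPI <- EV / PV    # schedule performance index (<1 = behind)
EAC <- BAC / CPI  # estimate at completion if current efficiency holds

cat(sprintf("CV = %.0f, SV = %.0f, CPI = %.2f, SPI = %.2f, EAC = %.0f\n",
            CV, SV, CPI, SPI, EAC))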
Make data-driven decisions (with a focus on business outcomes)
Data from different sources (internal and external; cost and profit; quality; customer and supplier; etc.)
Tools:
schedule data
release planning
quality metrics
work performance data
risk register
requirements traceability matrix
Product roadmap
Questionnaires
Question 1 (Risk Analysis – Quantitative) Your project has three key risks with the following data after Monte Carlo simulation (10,000 iterations):
Risk A: Probability 40%, EMV = –$80,000
Risk B: Probability 25%, EMV = –$120,000
Risk C: Probability 15%, EMV = +$50,000 (upside opportunity). The contingency reserve is currently set at $150,000. The sponsor asks you to recommend the most appropriate adjustment. What should you do?
A) Increase contingency reserve to $200,000 to cover the P50 value
B) Recommend $170,000 based on aggregated expected monetary value
C) Set reserve at $0 because upside opportunity offsets threats
D) Perform sensitivity analysis before deciding on reserve
Answer: B
EMV (expected monetary value) = probability × impact ($); the figures in the question are already probability-weighted.
Threats: A (-$80,000) + B (-$120,000) = -$200,000
Opportunity: C (+$50,000)
Net aggregated EMV = -$200,000 + $50,000 = -$150,000
The net EMV happens to match the current reserve exactly, but the upside only offsets the threats in the scenarios where the opportunity is actually realized, which the Monte Carlo output makes visible. Recommending roughly $170,000, the aggregated threat EMV with only partial credit for the opportunity, is therefore the prudent adjustment, which is why option B is correct.
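A quick check of the aggregation in R (since the EMV figures are already probability-weighted, they sum directly):

Code
# EMV values from the question (already probability-weighted)
emv <- c(risk_A = -80000, risk_B = -120000, risk_C = 50000)

threats     <- sum(emv[emv < 0])  # -200,000
opportunity <- sum(emv[emv > 0])  # +50,000
net_emv     <- sum(emv)           # -150,000

cat("Threats:", threats, "| Opportunity:", opportunity,
    "| Net EMV:", net_emv, "\n")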
Question 2 (Risk Response – Threats) During risk response planning for a high-priority technical risk, the team identifies a vendor who can deliver a proven alternative component that reduces probability from 70% to 10%, but adds $45,000 to the budget. The cost of impact if the risk occurs is estimated at $300,000. What is the most appropriate strategy?
A) Accept the risk because the cost of mitigation exceeds 15% of impact
B) Mitigate by contracting the vendor (secondary risk created)
C) Transfer the risk fully to the vendor via warranty clause
D) Avoid the risk by redesigning the component in-house
Answer: B
The mitigation reduces the risk's EMV from 0.70 × $300,000 = $210,000 to 0.10 × $300,000 = $30,000, a $180,000 reduction for a $45,000 cost, so mitigating is clearly cost-effective despite the secondary risk it introduces.
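The cost-benefit check behind that choice, in R:

Code
p_before <- 0.70
p_after  <- 0.10
impact   <- 300000
mitigation_cost <- 45000

emv_reduction <- (p_before - p_after) * impact   # 180,000
net_benefit   <- emv_reduction - mitigation_cost # 135,000

cat("EMV reduction:", emv_reduction,
    "| Net benefit of mitigating:", net_benefit, "\n")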
Question 3 (Monitor and Close Risks) In the monthly risk review meeting, you notice that a previously low-priority risk has triggered and is now impacting the critical path. The risk owner has not updated the risk register in two months. What is your BEST immediate action?
A) Escalate to the project sponsor for additional funding
B) Update the risk register, reassess probability/impact, and trigger the response plan
C) Close the risk as “realized” and document lessons learned
D) Issue a change request to extend the schedule
Answer: B
Question 4 (Monte Carlo Simulation – Details and Outputs) You are performing a Monte Carlo simulation on your project’s schedule using 5,000 iterations. The simulation assumes triangular distributions for activity durations: optimistic (O), most likely (ML), and pessimistic (P). After running the simulation, the output shows a mean project duration of 120 days, with P10 at 105 days, P50 at 118 days, and P90 at 140 days. Stakeholders request a contingency reserve to achieve an 80% confidence level in meeting the 130-day target. Based on the simulation results, what is the recommended contingency reserve in days?
Formula reminder: Contingency reserve = (Target confidence level duration) - (Baseline/mean duration), adjusted from simulation percentiles.
A) 10 days (P90 - mean)
B) 12 days (P80 interpolated ≈ 130 days - mean)
C) 20 days (P90 - P50)
D) 25 days ((P90 - P10) / 2)
Answer: B
Contingency reserve = P80 duration (≈130 days) - P50 baseline (118 days) = 12 days
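More generally, the reserve can be read straight off the simulated duration vector. A minimal sketch with a hypothetical 10-activity critical path (triangular durations sampled by inverse CDF; these made-up inputs will not reproduce the question's percentiles):

Code
set.seed(42)
n <- 10000

# Inverse-CDF sampler for a triangular(min = a, mode = m, max = b)
rtri <- function(k, a, m, b) {
  u <- runif(k)
  f <- (m - a) / (b - a)
  ifelse(u < f,
         a + sqrt(u * (b - a) * (m - a)),
         b - sqrt((1 - u) * (b - a) * (b - m)))
}

# Hypothetical project: 10 independent activities on the critical path
acts <- matrix(rtri(n * 10, a = 8, m = 12, b = 22), nrow = n)
dur  <- rowSums(acts)

p50 <- quantile(dur, 0.50)  # baseline (median) duration
p80 <- quantile(dur, 0.80)  # duration at 80% confidence
cat(sprintf("P50 = %.0f, P80 = %.0f, reserve = %.0f days\n",
            p50, p80, p80 - p50))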
Question 5 (Decision Tree Analysis – Branching and EMV) Your team is evaluating two vendor options for a critical component using decision tree analysis. Vendor A costs $100,000 upfront with a 60% chance of on-time delivery (value +$500,000) and 40% chance of delay (impact -$200,000). Vendor B costs $150,000 upfront with an 80% chance of on-time delivery (same +$500,000 value) and 20% chance of delay (same -$200,000 impact). Calculate the net EMV for each path and recommend the better option.
Formula: Net Path Value = (Probability × Outcome) - Initial Cost; Overall EMV = Sum of net path values.
A) Choose Vendor A: EMV = +$60,000
B) Choose Vendor B: EMV = +$90,000
C) Choose Vendor A: EMV = +$140,000
D) Choose Vendor B: EMV = +$210,000
Answer: D
Vendor A: 0.6 × $500,000 + 0.4 × (-$200,000) - $100,000 = $300,000 - $80,000 - $100,000 = +$120,000
Vendor B: 0.8 × $500,000 + 0.2 × (-$200,000) - $150,000 = $400,000 - $40,000 - $150,000 = +$210,000
Vendor B's net EMV is higher, so choose Vendor B.
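The same rollback in R, with values straight from the question:

Code
# Net path value = (probability-weighted outcomes) - upfront cost
net_emv <- function(cost, p_ontime, win = 500000, delay = -200000) {
  p_ontime * win + (1 - p_ontime) * delay - cost
}

emv_A <- net_emv(cost = 100000, p_ontime = 0.60)  # +120,000
emv_B <- net_emv(cost = 150000, p_ontime = 0.80)  # +210,000

cat("Vendor A:", emv_A, "| Vendor B:", emv_B, "\n")
cat("Choose:", ifelse(emv_B > emv_A, "Vendor B", "Vendor A"), "\n")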
Question 6 (Sensitivity Analysis – Tornado Diagram and Formulas) In quantitative risk analysis, you create a tornado diagram to show sensitivity of the project’s NPV to key variables. The baseline NPV is $1,200,000. Variables include: Cost overrun (range -20% to +30%, sensitivity impact ±$400,000), Revenue delay (range -10% to +15%, impact ±$250,000), and Market demand (range -15% to +20%, impact ±$150,000). Which variable should be prioritized for further mitigation, and why?
Formula: Sensitivity = (Max impact - Min impact) / Baseline, but prioritize by widest bar in tornado (absolute impact range).
A) Cost overrun: widest range (±$400,000)
B) Revenue delay: moderate range but higher percentage sensitivity
C) Market demand: narrowest range, least priority
D) All equal; perform Monte Carlo next
Answer: A
In a tornado diagram the widest bar (largest absolute impact range) drives the most output uncertainty, so cost overrun (±$400,000) is prioritized first.
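A minimal tornado-style chart of the stated impact ranges, widest bar on top:

Code
# Absolute NPV impact ranges from the question (±$)
impacts <- c("Cost overrun"  = 400000,
             "Revenue delay" = 250000,
             "Market demand" = 150000)
impacts <- sort(impacts)  # widest bar ends up on top when horizontal

# Stacked negative/positive segments give bars from -impact to +impact
barplot(rbind(-impacts, impacts), beside = FALSE, horiz = TRUE,
        col = c("#d9534f", "#5bc0de"), las = 1,
        main = "Tornado diagram: NPV sensitivity",
        xlab = "Impact on NPV ($)")
abline(v = 0)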
Question 7 (Monte Carlo Simulation – Distributions and Correlations) During Monte Carlo setup for cost risk analysis, you model three correlated risks: Material cost (normal distribution, mean $50,000, SD $10,000), Labor cost (lognormal, mean $80,000, SD $15,000), and Exchange rate fluctuation (uniform, $0.90-$1.10). Risks have a +0.7 correlation between material and labor. After 10,000 iterations, the simulation outputs a P75 total cost of $160,000 against a baseline of $130,000. What is the primary reason to include correlations in the model?
A) To reduce iteration count for faster computation
B) To accurately reflect real-world dependencies, avoiding under/overestimation of variance
C) To convert all distributions to triangular for simplicity
D) To eliminate the need for sensitivity analysis
Answer: B
Positively correlated cost risks tend to move together, which widens the tails of the total-cost distribution; ignoring the +0.7 correlation would understate the variance and therefore the P75.
Code
set.seed(2026)  # for reproducibility
n <- 10000

# ─── 1. Generate correlated Material and Labor ────────────────────────
# Correlation matrix (material ↔ labor = 0.7, exchange independent)
rho <- 0.7
cor_mat <- matrix(c(1.0, rho, 0,
                    rho, 1.0, 0,
                    0,   0,   1.0), nrow = 3, byrow = TRUE)

# Cholesky decomposition: L %*% t(L) = cor_mat
L <- t(chol(cor_mat))

# Independent standard normals (n rows × 3 columns)
Z <- matrix(rnorm(n * 3), nrow = n, ncol = 3)

# Correlated standard normals: each row u = L %*% z, so Cov(u) = L %*% t(L) = cor_mat
# (note: Z %*% L would yield the wrong covariance; t(L) is required here)
U <- Z %*% t(L)

# ─── 2. Transform to target distributions ─────────────────────────────
# Material ~ Normal(50,000, 10,000)
material <- 50000 + 10000 * U[, 1]

# Labor: lognormal with *real-space* mean = 80,000 and sd = 15,000
m <- 80000
s <- 15000
sigma <- sqrt(log(1 + (s / m)^2))  # sd of log(labor)
mu <- log(m) - 0.5 * sigma^2       # mean of log(labor)
labor <- exp(mu + sigma * U[, 2])

# Exchange rate: independent Uniform(0.90, 1.10)
exchange <- runif(n, 0.90, 1.10)

# ─── 3. Total cost ────────────────────────────────────────────────────
total_cost <- (material + labor) * exchange

# ─── 4. Results ───────────────────────────────────────────────────────
cat("Monte Carlo results (", n, " iterations):\n\n", sep = "")
Monte Carlo results (10000 iterations):
Code
cat("P75 total cost:", format(round(quantile(total_cost, 0.75)), big.mark =","), "\n")
Min. 1st Qu. Median Mean 3rd Qu. Max.
54302 114870 128749 129782 143906 236489
Code
# Optional: histogram
hist(total_cost, breaks = 80, col = "#aacdff", border = "white",
     main = "Total Cost Distribution - 10,000 iterations",
     xlab = "Total Cost ($)", las = 1)
abline(v = quantile(total_cost, 0.75), col = "red", lwd = 2, lty = 2)
legend("topright", legend = paste("P75 =", P75), col = "red", lty = 2)
Question 8 (Decision Tree Analysis – Multi-Stage with Formulas) In a multi-stage decision tree for a product launch, the first decision is to invest $200,000 in R&D with 70% success probability (leading to market test) or abandon (EMV $0). If successful, market test costs $100,000 with outcomes: High demand (50%, +$1,000,000), Medium (30%, +$400,000), Low (20%, -$50,000). Calculate the overall EMV and decide if to proceed with R&D.
Formula: Roll back from end nodes: EMV_node = Sum (P × Outcome) - Cost_at_node; compare to alternatives.
A) Proceed: Overall EMV = +$210,000
B) Abandon: EMV = $0 (better)
C) Proceed: Overall EMV = +$385,000
D) Proceed: Overall EMV = +$157,000
Answer: D
EMV at the market-test node = 0.5 × $1,000,000 + 0.3 × $400,000 + 0.2 × (-$50,000) - $100,000 = $510,000
EMV of the R&D branch = 0.7 × $510,000 + 0.3 × $0 - $200,000 = $357,000 - $200,000 = +$157,000
Since +$157,000 > $0 (abandon), proceed with R&D.
cat("Decision: Proceed with R&D (positive EMV)\n")
Decision: Proceed with R&D (positive EMV)
Code
# Draw tree (probs on chance edges, costs/benefits on actions if non-zero)
# Note: `dt` is the decision-tree object built earlier in the notebook (construction not shown)
dt$draw(border = TRUE)
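For a self-contained check of the rollback in plain base R (independent of whatever package built `dt`):

Code
# Chance node after successful R&D, net of the $100k market-test cost
test_outcomes <- c(high = 1000000, medium = 400000, low = -50000)
test_probs    <- c(high = 0.5,     medium = 0.3,    low = 0.2)
test_node_emv <- sum(test_probs * test_outcomes) - 100000  # 510,000

# R&D decision branch (70% success; failure yields 0), net of $200k R&D cost
rd_emv <- 0.7 * test_node_emv + 0.3 * 0 - 200000           # 157,000

cat("Market-test node EMV:", test_node_emv, "\n")
cat("R&D branch EMV:", rd_emv, "->",
    ifelse(rd_emv > 0, "Proceed", "Abandon"), "\n")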
Question 9 (PERT vs Triangular Distributions in Monte Carlo) You are modeling activity duration risk using both PERT and triangular distributions for Monte Carlo simulation (10,000 iterations each). For one critical path activity:
Optimistic = 8 days, Most Likely = 12 days, Pessimistic = 22 days
PERT expected duration = (O + 4ML + P) / 6 = 13 days
Triangular expected duration = (O + ML + P) / 3 = 14 days
The simulation using PERT shows a project P80 duration of 145 days; using triangular shows P80 of 152 days. Assuming the same inputs and correlations, what is the most likely reason for the difference in P80 output?
A) Triangular distribution has higher variance due to equal weighting of extremes
B) PERT assumes beta distribution with lower standard deviation
C) Triangular distribution is inappropriate for schedule risk
D) Monte Carlo iteration count should be increased to 50,000
Answer: A
The triangular distribution weights the pessimistic tail as heavily as the mode, giving both a higher mean (14 vs 13 days) and a larger standard deviation than the PERT/beta distribution, which pushes the P80 out from 145 to 152 days. (B states the same fact from the other side, but A names the direct cause of the difference.)
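The closed-form moments make the variance gap concrete; the PERT sd below uses the common (P - O)/6 approximation:

Code
O <- 8; ML <- 12; P <- 22

# Means
pert_mean <- (O + 4 * ML + P) / 6   # 13 days
tri_mean  <- (O + ML + P) / 3       # 14 days

# Standard deviations
pert_sd <- (P - O) / 6                             # ≈ 2.33 (PERT/beta approximation)
tri_sd  <- sqrt((O^2 + ML^2 + P^2 -
                 O * ML - O * P - ML * P) / 18)    # ≈ 2.94 (exact triangular)

cat(sprintf("PERT: mean %.1f, sd %.2f | Triangular: mean %.1f, sd %.2f\n",
            pert_mean, pert_sd, tri_mean, tri_sd))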
Question 10 (Decision Tree – Expected Value of Perfect Information – EVPI) A project faces a key uncertain event with two outcomes: Favorable (60% probability, project NPV +$800,000) or Unfavorable (40%, NPV –$300,000). Without information, the best decision is to proceed (EMV = 0.6×800k + 0.4×(–300k) = +$360,000). A market study can perfectly predict the outcome at a cost of $80,000. What is the Expected Value of Perfect Information (EVPI)?
Formula: EVPI = Expected value with perfect information – Expected value without information
A) $80,000
B) $120,000
C) $200,000
D) $280,000
Answer: B
EMV without information = 0.6 × $800k + 0.4 × (-$300k) = +$360,000
EV with perfect information (EVwPI) = 0.6 × $800k + 0.4 × $0 = +$480,000 (with perfect foresight you decline to proceed in the unfavorable case, capping it at $0)
EVPI = EVwPI - EMV without information = $480,000 - $360,000 = $120,000
Since the study costs $80,000, which is less than the $120,000 EVPI, it is worth commissioning.
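The same arithmetic in R:

Code
p_fav <- 0.6; npv_fav <- 800000
p_unf <- 0.4; npv_unf <- -300000
study_cost <- 80000

emv_without <- p_fav * npv_fav + p_unf * npv_unf  # 360,000
# With perfect information you proceed only on the favorable signal
ev_with     <- p_fav * npv_fav + p_unf * 0        # 480,000
evpi        <- ev_with - emv_without              # 120,000

cat("EVPI:", evpi, "| Study cost:", study_cost, "->",
    ifelse(evpi > study_cost, "Buy the study", "Skip it"), "\n")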
Question 11 (Monte Carlo – Latin Hypercube vs Simple Monte Carlo) Your risk analyst proposes switching from simple (random) Monte Carlo sampling to Latin Hypercube Sampling (LHS) for a cost risk model with 15 input variables. After running both with 5,000 iterations, LHS produces a tighter confidence interval for the P90 total cost estimate. What is the primary advantage of LHS in this context?
A) It reduces computation time significantly
B) It provides better coverage of the input probability space with fewer iterations
C) It eliminates the need to model correlations
D) It automatically converts all distributions to normal
Answer: B
Latin Hypercube Sampling (LHS) is a stratified sampling technique that ensures each input distribution is sampled evenly across its range, even with fewer iterations. This leads to:
Lower variance in output estimates
Tighter confidence intervals
More stable P90/P10 estimates compared with simple random (Monte Carlo) sampling
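A minimal base-R sketch of one-dimensional LHS versus simple random sampling: stratify the unit interval, jitter within each stratum, then push through the inverse CDF. Repeating the experiment shows the LHS-based P90 estimate is noticeably more stable:

Code
set.seed(7)
n <- 200  # small n makes the stability difference visible

lhs_normal <- function(n, mean = 0, sd = 1) {
  # One stratified draw per equal-probability stratum, in random order
  u <- (sample(n) - runif(n)) / n
  qnorm(u, mean, sd)
}

# Repeat the experiment: how stable is the P90 estimate across runs?
reps <- 500
p90_mc  <- replicate(reps, quantile(rnorm(n), 0.90))
p90_lhs <- replicate(reps, quantile(lhs_normal(n), 0.90))

cat(sprintf("P90 spread (sd) - simple MC: %.3f | LHS: %.3f\n",
            sd(p90_mc), sd(p90_lhs)))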
Question 12 (Sensitivity Analysis – Correlation Coefficients and Partial Rank Correlation) In a Monte Carlo simulation output, you review a sensitivity tornado diagram based on Spearman rank correlation coefficients between input variables and project NPV. The top three ranked inputs are:
Input X: SRC = +0.68
Input Y: SRC = –0.55
Input Z: SRC = +0.42
Later, partial rank correlation coefficients (PRCC) are calculated to control for confounding:
X (controlling for others): PRCC = +0.65
Y: PRCC = –0.12
Z: PRCC = +0.38
Which input should be prioritized for risk response planning, and why?
A) Input X – highest absolute SRC and stable PRCC
B) Input Z – highest PRCC after controlling for others
C) Input Y – large negative SRC indicates strong threat
D) None – PRCC differences indicate multicollinearity issues
Answer: A
Spearman Rank Correlation (SRC) shows raw correlation with the output.
Partial Rank Correlation Coefficient (PRCC) removes the confounding effect of other variables (controls for multicollinearity).
Input X remains very strong even after controlling (PRCC = +0.65) → it has an independent, significant effect.
Input Y drops dramatically (from –0.55 to –0.12) → much of its apparent effect was due to correlation with other variables.
Input Z stays moderate.
Prioritize Input X: it has the strongest independent influence on NPV.
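A sketch of how SRC and PRCC are computed, on synthetic data built so that one input (Y here) is mostly a shadow of another (X), mirroring the question's pattern; PRCC correlates the residuals of rank-transformed regressions, stripping out the other inputs' influence:

Code
set.seed(11)
n <- 2000
x <- rnorm(n)
z <- rnorm(n)
y <- 0.8 * x + 0.6 * rnorm(n)               # Y is largely a shadow of X
npv <- 2.0 * x + 0.8 * z + 0.1 * y + rnorm(n)

inputs <- data.frame(X = x, Y = y, Z = z)

# Spearman rank correlations (raw sensitivity)
src <- sapply(inputs, cor, y = npv, method = "spearman")

# Partial rank correlation: residualize ranks on the other inputs
prcc_one <- function(name) {
  r      <- data.frame(lapply(inputs, rank), out = rank(npv))
  others <- setdiff(names(inputs), name)
  f      <- function(lhs) reformulate(others, lhs)
  cor(resid(lm(f(name), r)), resid(lm(f("out"), r)))
}
prcc <- sapply(names(inputs), prcc_one)

print(round(rbind(SRC = src, PRCC = prcc), 2))
# Y's raw SRC is inflated by its link to X; its PRCC collapses toward 0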