1. Expert judgement
If the project team is experienced in delivering the type of work in the scope, they can use their specialised knowledge to estimate costs. The accuracy of this method relies on the skill of the project team and a tightly defined project scope.
2. Analogous estimate
This technique uses historical data from similar projects or business as usual tasks to create cost estimates. Adjustments are made for known differences between the old and new projects or tasks. This method is usually used in the early phases of a project.
3. Three point estimate
This technique calculates a project’s costs as the average of the most likely (a realistic estimate), optimistic (best case estimate) and pessimistic (worst case estimate) cost projections. For example, say you usually outsource web development for $10,000 (realistic estimate). A friend told you they used a contractor who charged $7,500 (optimistic estimate), but you’re concerned labour shortages might inflate costs to $15,000 (pessimistic estimate). Adding the three figures together and dividing by 3 gives you a 3-point cost estimate for your web development of about $10,833.
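The arithmetic above can be checked in a few lines of Python. The PERT weighting mentioned later among the deterministic estimates is included for comparison; the function names are illustrative only.

```python
def three_point_estimate(optimistic, most_likely, pessimistic):
    # Simple three-point average: (O + M + P) / 3
    return (optimistic + most_likely + pessimistic) / 3

def pert_estimate(optimistic, most_likely, pessimistic):
    # PERT variant weights the most likely value: (O + 4M + P) / 6
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Web development example from the text.
three_point = three_point_estimate(7_500, 10_000, 15_000)  # ~10,833
pert = pert_estimate(7_500, 10_000, 15_000)                # ~10,417
```

Note that PERT pulls the estimate closer to the most likely value, which is why it is listed separately from the plain 3-point average.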
4. Parametric estimate
Parametric estimating uses statistical modelling to derive cost estimates. It uses historical data on key cost drivers to estimate the cost of different projects. For example, if it takes one person an hour to build 20 components, then parametric estimating suggests 100 components will take one person five hours to build, or five people one hour, and so on.
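The linear scaling in the example above can be sketched directly (function name and parameters are illustrative):

```python
def parametric_estimate(units, units_per_person_hour, people=1):
    # Person-hours scale linearly with the number of units,
    # and the elapsed time divides across the people assigned.
    person_hours = units / units_per_person_hour
    return person_hours / people

hours_one_person = parametric_estimate(100, 20)      # 5.0 hours
hours_five_people = parametric_estimate(100, 20, 5)  # 1.0 hour
```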
5. Bottom-up estimate
Bottom-up estimating uses the estimates from individual work packages to calculate the overall cost estimate for the project. It’s generally more accurate because it analyses costs at a granular level. It’s a good idea to get the team members or vendors responsible for the task to make the estimate as they’ll have a better idea of how long it will take. The downside to this method is it can be time-consuming and resource-intensive, especially in large, complex projects.
Source:
Any estimate with deterministic contingency is known as a Deterministic Estimate.
Business risk analysis is a process that studies inherent risk to a project's success.
Without taking risks, rewards are less likely. However, too much risk can lead to failure.
Optimism Bias*
People have a systematic bias towards overconfidence.
Thus, cost distribution approaches that rely on expert judgement to set several values (minimum, most likely and maximum) produce distributions that are too tight, and assessments of extreme values on the threat side are even weaker.
Methods exist to reduce bias when assessing uncertain quantities, but they are not widely embraced, especially in the engineering and construction industry.
Risk analysis facilitates a balancing act between taking risks and mitigating them.
It is both an art and a science.
Benefit:
Encourages project stakeholders to recognise that every item of work has its own unique risk profile, and to quantify opportunities & threats so that an appropriate Management Reserve (Probabilistic Contingency) is provided to enhance the rate of success.
* https://www.cmaanet.org/sites/default/files/resource/An-Overview-of-Correlation.pdf
Any estimate with probabilistic contingency is known as a Risk-Based Estimate.
They demystify Quantitative Financial Risk Analysis.
• Make this powerful technique easy to understand & use.
• Provide pertinent and fit for purpose features and charts.
• Intrinsic correlation. Practical & simple. No need for complex coefficient inputs.
• Designed to be hands-on training software for Project Management Institutes.
• Comprehensive yet simple & user-friendly so any estimator or project manager can use them routinely on minor to major projects.
• Provide risk-based estimate as well as deterministic estimates (Simplistic, 3 Point and PERT) to compare and make a prudent decision on management reserve to be provided.
All projects, irrespective of dollar value.
Contemporary best practice is to generate an estimate expressed as a range of values associated with their Levels of Confidence (LoC/P-Levels/P values). Each value depicts probability of cost overrun/underrun for making a decision on Management Reserve.
This technique is called Quantitative Risk Analysis.
Quantitative Risk Analysis is vital for making a prudent decision on:
1) Competitive bid pricing in low bid environment
Real world scenario:
Construction company 'ABC' had a policy of submitting its bid price with a 13% profit margin (5% towards corporate overheads & 8% towards actual profit) on estimated total construction cost. Being in a low bid environment, the estimation team had a propensity to ignore certain threats in the cost estimate. Eventually, 'ABC' achieved only 9% profit against the anticipated 13%.
The erosion of 4% of profit margin (actual profit falling from 8% to 4%) translated to a 50% reduction in the company's profitability.
As a knee-jerk reaction, 'ABC' began applying a 17% off-top margin (13% plus 4% standard deterministic contingency) to future bids and as a result became uncompetitive in most cases. Interestingly, they were still uncertain of achieving 8% actual profit on the tenders won.
2) Public Sector budget forecasting and control
A tale of two City Councils in Melbourne:
Council 'A' provides 10% standard contingency in individual project budgets. This provision works reasonably well for them at the program level. Addressing 'MAIMS' could potentially effect further cost savings.
Neighbouring Council 'B' includes 15% standard contingency in individual project budgets. Even then, some project budgets are under-allocated and the Council has issues at the program level.
The likely reason is that Council 'B' is complacent due to the high standard contingency of 15% and, as a result, does not hone its skills to make accurate estimates. The presence of MAIMS may be further aggravating inefficient use of ratepayers' dollars.
MAIMS (Money Allocated Is Money Spent)
Applying standard deterministic contingency results in either under- or over-allocation of money to projects. The idea that the sum of shortfalls and savings may be close to zero does not work in the real world and results in inefficient use of public funds.
Managers' Key Performance Indicators (KPIs) are linked to their ability to spend the budget, which leads to diminished or zero savings.
On the other hand, managers of under-allocated projects will be busy writing reports seeking additional funds and will struggle to finish projects on time and to the original scope.
High precision risk analytic tools that can quantify opportunity and threat accurately at item level are required for making a prudent decision on appropriate Management Reserve.
PMax uses Monte Carlo Simulation to model risk and provides P value of each item.
CMB's Financial Performance Plan addresses MAIMS.
'P' represents percentile.
In PMax, terms P-Level, P value and Level of Confidence (LoC) are interchangeable.
* For administrative purposes, the Department requires cost estimates for projects seeking Commonwealth funding to be presented as both a P50 and a P90 project estimate defined as follows:
P50 - P50 represents the project cost with sufficient funding to provide a 50% level of confidence in the outcome; there is a 50% likelihood that the final project cost will not exceed the funding provided.
P90 - P90 represents the project cost with sufficient funding to provide a 90% level of confidence in the outcomes; there is a 90% likelihood that the final project cost will not exceed the funding provided. In other words, it represents a conservative position; a funding allocation that has only a 10% chance of being exceeded.
Management Reserve is: 'preferred P value - P50'
For example: If P90 is preferred, then
Management Reserve is: P90 - P50
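A minimal sketch of how P values and the Management Reserve fall out of simulated cost outcomes. The normally distributed sample costs below are purely hypothetical; any Monte Carlo output would do.

```python
import random

random.seed(1)
# Hypothetical simulated total project costs, as a Monte Carlo run would produce.
samples = sorted(random.gauss(1_000_000, 80_000) for _ in range(100_000))

def p_value(sorted_samples, level):
    # Cost at a given Level of Confidence: the value that this fraction
    # of simulated outcomes does not exceed.
    return sorted_samples[int(level * (len(sorted_samples) - 1))]

p50 = p_value(samples, 0.50)
p90 = p_value(samples, 0.90)
management_reserve = p90 - p50  # reserve held on top of the P50 allocation
```

The gap between P90 and P50 widens as the cost distribution widens, so riskier projects automatically attract a larger reserve.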
* The Project Management Body of Knowledge (PMBOK) says that reserves may be used, reduced, or eliminated over time. Not everyone agrees with reducing unused reserves. Project managers should determine whether reserves will be reduced or eliminated during the project, how this will occur, and when. Include this information in the Cost Management Plan (PMax's Financial Performance Plan).
If you track management reserves in your organization for multiple projects, you will discover what makes sense. Adjust the reserve estimates accordingly for future projects.
Reserves should be used when risks occur (e.g., threats occur resulting in issues). Reserves should not be used for gold plating a project. Use change control where appropriate.
* Source: https://projectriskcoach.com/how-to-determine-project-budget-reserves/
It is important to note that Management Reserves do not provide for Scope Increases.
Scope Increase requests are to be made explicitly. These requests can be accommodated if savings are realised through the financial performance plan.
* A rational human desire to meet targets and avoid censure due to overruns means that artificially reducing or deleting management reserves in cost estimates in the hope that this will reduce cost 'adding' and encourage competitive final costs tends to have exactly the opposite effect.
A better way to ensure competitive outcomes is to encourage open and honest cost estimating with full declaration of management reserves (calculated on the basis of risk analysis) and to encourage very good front end definition before development of the authorization estimate.
If 'Padding' is hidden, project cost control is degraded (which adds greater risk) and since it is hidden, it is more likely to be spent. All of this tends to lead to less cost competitive project outcomes.
* Source:
MAIMS* (Money Allocated is Money Spent)
It is the financial analogue of Parkinson’s Law and is a major contributor to cost overruns or higher than necessary expenditures in the delivery of a program.
One tell-tale sign that this effect is in full play is in multi-project programs where the final cost performance index is at 1.00 for a large number of the individual projects. This is not the result of perfect management, but rather the wilful consumption of any underrun that may have existed. The MAIMS principle effectively makes any potential savings from underruns unavailable to cover overruns elsewhere in the program.
Typical project cost analysis assumes an 'ideal' project or program, where savings on one element are made available to other elements. The presence of MAIMS in program or project contexts drives an alternative strategy for establishing budgets and dynamically managing Management Reserve pools.
CMB's Financial Performance Plan:
Allocate P50 amount to each project.
A Management Reserves pool is set aside and monies are released as and when required.
Ideally, across the overall portfolio of projects, the amounts allocated at P50 should be adequate. In practice, however, this is rarely achievable.
The Management Reserves pool helps ensure that a 90% probability of under-run is achieved on the overall portfolio of projects. Forecasts are updated periodically to assess savings and to decide on accommodating Scope Increase requests or including additional projects. The November and March forecasts will be of greater significance.
PMax has the unique ability to produce P50 & 'P90 or other preferred P-Level'/ values for each line-item. 'Cost Monitoring' sheet is provided for this purpose.
* https://www.cmaanet.org/sites/default/files/resource/An-Overview-of-Correlation.pdf
They are comprehensive yet simple & user-friendly so any estimator or project manager can use them routinely on minor to major projects. Designed to become a standard tool across Private and Public sectors.
The PMax Algorithm delivers game-changing precision of 0.2% in triangular distribution by simulating 1 million trials (10x the precision of a 10,000-trial run) for accurate cost estimation and budget forecasting.
Best outcome is possible using the Synergy of Deterministic & Probabilistic Analyses:
Though deterministic analysis can have significant disadvantages, it is required to provide input to probabilistic analysis.
PMax Risk Analytics provide deterministic estimates along with probabilistic estimates for comparing and making decision on appropriate Management Reserve.
PMax Risk Analytic Products are suitable for any industry where financial risks persist due to uncertainty.
Triangular Distribution is simple, cost-efficient, effective and practical to implement.
It requires just three inputs: Lowest, Most likely and Highest values for each Domain.
These lowest, most likely and highest values assigned are easy to define and explain.
They provide reasonable and representable data for modelling risk.
Triangular Distribution is suited for judgmental data estimates.
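Sampling from a triangular distribution needs only the three inputs just described. A short sketch using the Python standard library; the 50:100:150 range is the illustrative ratio this document uses elsewhere.

```python
import random

random.seed(42)
low, mode, high = 50, 100, 150  # Lowest, Most Likely, Highest
draws = [random.triangular(low, high, mode) for _ in range(200_000)]

# The mean of a triangular distribution is (low + mode + high) / 3, i.e. 100 here.
sample_mean = sum(draws) / len(draws)
```

Every draw is guaranteed to fall inside [Lowest, Highest], which is part of what makes the distribution easy to explain to estimators.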
The criticism that 'Triangular Distribution with Correlation'* overstates risk has been addressed by PMax through implementation of 'Triangular Distribution with Intrinsic Correlation'. As well, this approach ensures consistent results.
Furthermore, PMax provides option to choose lower P-Levels.
Correlation* & issues:
· Is it really required?
· Should correlation be calculated at item level, section level, etc.?
· If a default value is opted for, what value should be adopted?
· Some researchers are of the opinion that a correlation coefficient of 1 (i.e. 100%) can magnify P90 by up to 5%
For more info on Triangular Distribution and Correlation, please read:
For 'Sources of Correlation', please read (Page 6 - Table 1):
https://www.cmaanet.org/sites/default/files/resource/An-Overview-of-Correlation.pdf
Correlation must be incorporated into the analysis. However, it should be in the form of Intrinsic Correlation so risk is not unduly magnified.
PMax adopts a practical approach that does not require correlation coefficient inputs.
As well, this method maintains the integrity of the analysis.
Correlation between 'Quantity & Rate'; 'Rate components' such as labour, material, fuel, etc. is addressed by aggregation and segregation techniques.
Aggregation: Labour, materials, fuel, etc. that have complex correlation are aggregated. An escalation formula linked to periodic price escalation would suffice.
Segregation: Used when productivity-based direct correlation exists.
Quantity is segmented into parts if the Rate is sensitive to the volume of work.
It is worth noting that this correlation to volume of work is complex and variable.
A Rate (with inherent correlation applied) is provided for each part based on past experience.
No extra effort is needed as this is standard practice for developing a Bill of Quantities.
For example:
Providing and laying 300mm dia RCC pipe
a) (>0 to x) metres
b) (>x to y) metres
c) (>y to z) metres, etc.
PMax refers to this practical method of correlating variables as 'Intrinsic Correlation'.
For more info on 'Correlation in Quantitative Risk Analysis' please read the article below:
https://broadleaf.com.au/resource-material/correlation-in-quantitative-risk-analysis/
Specifically the excerpts below:
In relation to modelling based on a list of line-item costs in a budget estimate, there are usually underlying uncertainties that affect several costs in a similar way, such as market conditions, design detail, weather or project duration. Line-by-line modelling of this kind is not recommended.
In practice, simulation software often overrides any selected correlation coefficients to force them into values that are mathematically consistent with one another. These enforced changes are artificial and have no basis in the analysis. Accepting this fix undermines the integrity of the analysis.
PMax Risk-Based Cost Estimator & PMax Budget Forecaster are the flagship products.
No claim is made that these are the most appropriate or ultimate cost estimation and budgeting tools, but they are certainly an advancement on current conventional deterministic processes, yielding more reliable financial results.
Intent, transparency and continuous improvement lead to outstanding performance.
*Estimating and managing the costs in the planning/design and construction phases, for both contractors and owners - has been a challenge for decades.
PMax Risk-Based Cost Estimator is for Contractors whose concerns are – profit, consequences of loss, impacts to reputation/future work
PMax Budget Forecaster is for Owners whose concerns are – meeting budget and schedule, maintaining public credibility
*Source:
*Monte Carlo Simulation is a computer-based process involving repeated random sampling from probability distributions of the inputs, which are specified as a range of values, to obtain the expected value of a random variable — cost estimate or budget forecast.
Each range of values is known as a 'Domain'. This provides an opportunity to quantify the uncertainty involved in quantities (Domain1) and rates (Domain2) for each line item.
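A minimal sketch of a two-Domain Monte Carlo trial, assuming independent triangular draws for quantity and rate on each line item. The ranges are hypothetical and this is a generic illustration, not PMax's actual algorithm.

```python
import random

random.seed(7)
# Hypothetical line items: (quantity range, rate range), each as (low, most likely, high).
items = [
    ((100, 120, 160), (125, 150, 225)),   # Domain1: quantity, Domain2: rate
    ((40, 50, 70),    (800, 900, 1200)),
]

def one_trial(items):
    # One Monte Carlo trial: draw quantity and rate for every line item,
    # multiply to get the amount, and sum across the estimate.
    total = 0.0
    for (ql, qm, qh), (rl, rm, rh) in items:
        q = random.triangular(ql, qh, qm)
        r = random.triangular(rl, rh, rm)
        total += q * r
    return total

trials = sorted(one_trial(items) for _ in range(50_000))
p50 = trials[len(trials) // 2]  # median simulated total cost
```

Repeating the trial many times yields the distribution of total cost from which the P-levels are read off.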
Monte Carlo Simulation is probably the most easily usable form of probability analysis.
** Given the right Monte Carlo Simulation tools and skills, any size project can take advantage of the advancements of information availability and technology to yield powerful results.
All stakeholders can in some way participate in the risk management process and the Monte Carlo Simulation.
Source:
* https://epress.lib.uts.edu.au/journals/index.php/opm/article/view/4112/4550
** https://www.pmi.org/learning/library/monte-carlo-simulation-cost-estimating-6195
* In determining how many iterations to run in a simulation, there are generally two opposing pressures: the precision required and the computation time available.
There is a strict mathematical relationship between sample size and statistical error. The larger the sample, the smaller the statistical error, and the more confident an analyst can be that a simulation has converged to the 'true' underlying distribution.
Statistical error is a verifiable source of error in any simulation, and it is a consequence of randomness and a function of the size of the sample.
It should be borne in mind that in Monte Carlo Simulation (MCS), the final outcome is an approximation of the correct value with respective error bounds, and the correct value is within those error bounds.
*Source:
PMax refers to statistical error or error bounds as 'variance' in the final outcome.
PMax achieves a phenomenal 0.2% variance by running 1 million iterations in MCS.
For example: the outcome (an approximation of the correct value) lies between 99.9 and 100.1 when the correct value is 100.
Due to the high precision achieved by the innovative PMax Algorithm, all projects in a portfolio can maintain the preferred Level of Confidence throughout their project life cycles (PLCs).
* In the theory of Monte Carlo methods, variance reduction is a procedure used to increase the precision of the estimates that can be obtained for a given simulation or computational effort. Every output random variable from the simulation is associated with a variance which limits the precision of the simulation results.
** PMax achieves 10x higher precision compared to software that runs 10,000 iterations.
This precision is possible by performing 100x the iterations (i.e. an increase from 10,000 to 1 million).
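The square-root relationship behind these figures can be stated directly: Monte Carlo statistical error shrinks as 1/sqrt(N), so 100x the iterations buys 10x the precision.

```python
import math

def precision_gain(n_old, n_new):
    # Standard error scales as 1 / sqrt(N), so precision improves
    # by the square root of the increase in iteration count.
    return math.sqrt(n_new / n_old)

gain = precision_gain(10_000, 1_000_000)  # 10.0
```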
PMax takes < 7 minutes** which is achieved by our innovative PMax Algorithm.
(for minor projects having up to 230 line items)
Source:
* https://en.wikipedia.org/wiki/Variance_reduction
** Using Dell Inspiron 7506 - i7 or similar
Works on lower-configuration computers as well; however, execution time may vary.
For major projects having up to 2032 line items, the time taken is < 40 minutes.
* Traditional deterministic cost-estimating methods, while well accepted, can overestimate or underestimate costs and provide very limited information regarding risks that may occur. Risk-based cost-estimating methods build on a deterministic cost base and add consideration of variability and potential risk events to give information that is relevant to risk identification, characterization, and management.
They also give more information to manage the budget (owners) and to secure a project in a competitive bidding environment (contractors), as well as inform strategies to manage disputes and claims in construction (owners and contractors).
More relevant information gives more options to manage risk. The earlier such information is available, the sooner that strategies and management actions can be implemented to avoid problems and achieve good results.
In particular, such information helps owners by highlighting budget issues early, allowing good decisions to be made regarding expected bid results, and helps contractors to decide if they can be competitive given the owner’s budget and in competition with other contractors. Subsequent to winning a bid, strategies for cost and claims management are informed by better cost and risk information.
*Source:
Risk-Based Cost Estimator is for Private Sector (Contract Bidders).
Deterministic quantities and rates, calculated in the regular fashion, form the basis for the Most Likely Quantities and Most Likely Rates of line-items.
Subsequently, opportunities and threats are assessed and fed into PMax Risk-Based Cost Estimator as Lowest, Most Likely and Highest values. A risk-based cost estimate is then generated, giving insight into the risk involved at various levels of confidence.
PMax Budget Forecaster is for Public Sector (Owners) to prepare accurate budgets.
Triangular Distribution is chosen as it is ideally suited for judgmental data estimates.
It requires just three inputs: The Lowest/Minimum, Most Likely, Highest/Maximum for each Domain.
Domain1 (Quantity Range): The Lowest, Most Likely and Highest should be arrived at upon due deliberation and entered as user-defined values rather than by applying the pre-defined percentage ranges provided.
Domain2 (Rate Range): The Lowest, Most Likely and Highest should be arrived at upon due deliberation and entered as user-defined values rather than by applying the pre-defined percentage ranges provided. They can also be auto-filled from the Rate Database, if available.
Typically, item range input values are expected to fall within 50% of the Most Likely value on either side, i.e. on both the opportunity and threat sides. Eg.: 50:100:150 (a 1:2:3 ratio).
However, in the construction industry, certain items may suffer greater than 50% variability on the threat side. To cater for such scenarios, 'Auto Range Limit Protection' triggers only when the Highest > 2x Most Likely. The purpose of 'Auto Range Limit Protection' is to contain the adverse effect of the 'low probability & high impact zone'.
It is provided as a default option and can be opted out of.
When the option is active, if during simulation the Highest is found to be > 2x Most Likely, the program will internally auto-adjust the Most Likely and Lowest, keeping the Highest intact. The resulting values will be in a 1:2:4 ratio (i.e. 50:100:200).
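The adjustment described above can be sketched as follows. This illustrates the stated 1:2:4 rule only; it is not PMax's actual implementation.

```python
def auto_range_limit(low, mode, high):
    # Triggers only when Highest exceeds 2x Most Likely.
    if high > 2 * mode:
        mode = high / 2  # keep Highest intact, pull Most Likely up
        low = high / 4   # resulting Lowest : Most Likely : Highest = 1 : 2 : 4
    return low, mode, high

auto_range_limit(50, 100, 300)  # adjusted to (75.0, 150.0, 300), a 1:2:4 ratio
auto_range_limit(50, 100, 150)  # unchanged: 150 <= 2 x 100
```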
Most Likely Amount is the product of Most Likely Quantity and Most Likely Rate.
Column 'T' of the 'Data Input' sheet shows Most Likely Amount of line-items.
Whenever Highest > 2x Most Likely in either or both Domains of a line-item, the font colour of corresponding field/cell changes.
Data validation feature of PMax checks and prevents the program from proceeding further
if the '(Lowest/Minimum) ≤ Most Likely ≤ (Highest /Maximum)' data entry rule is violated.
Line-items with errors will be highlighted for easy identification.
After correcting the error/s, click the 'Generate Estimate' button.
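The data entry rule reduces to a single chained comparison; a sketch (not PMax's actual code):

```python
def valid_range(low, mode, high):
    # Data entry rule: Lowest <= Most Likely <= Highest.
    return low <= mode <= high

ok = valid_range(100, 120, 160)   # True
bad = valid_range(120, 100, 160)  # False: Lowest exceeds Most Likely
```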
A living document can be continually edited and updated.
PMax flagship products are designed to be living documents.
Updates can be made at various stages till the bid price/budget is finalised.
Periodically during project execution, the domain values can be edited to reflect the actual value confirmed upon completing a line-item (by entering constant values in that domain).
For example: the Quantity range was (100, 120, 160) cu.m at Bid/Budget Stage.
If the actual quantity is 135 cu.m, enter 135, 135, 135 in the quantity range.
Repeat the process for each line-item whose actual quantity is confirmed.
Similarly, if the Rate range was ($125, $150, $225) at Bid/Budget Stage and the actual rate is $180, enter 180, 180, 180 in the rate range.
Repeat the process for each line-item whose actual rate is confirmed.
Generate new estimate to obtain updated estimate/forecast.
Cost monitoring sheet can also be periodically updated.
Cost Impact Analysis is also known as Sensitivity Analysis.
Tornado chart is used to depict what matters most at a glance.
The items listed at the top of the chart have the greatest effect on the outcome.
By looking at the bigger picture, the quality of your decisions improves and you are able to manage uncertainty better.
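One common way such a tornado ranking is built is a one-at-a-time swing analysis: each item is swung across its range while the others stay at Most Likely, and items are sorted by the size of the resulting swing. The item names and ranges below are hypothetical, and this is a generic construction rather than PMax's specific method.

```python
# Hypothetical line items: (Lowest, Most Likely, Highest) amounts.
items = {
    "Earthworks": (50_000, 80_000, 140_000),
    "Drainage":   (30_000, 35_000, 42_000),
    "Paving":     (90_000, 100_000, 115_000),
}

def swings(items):
    base = sum(mode for _, mode, _ in items.values())
    out = []
    for name, (low, mode, high) in items.items():
        # Swing one item from Lowest to Highest, others held at Most Likely.
        swing = (base - mode + high) - (base - mode + low)
        out.append((name, swing))
    return sorted(out, key=lambda pair: -pair[1])  # largest swing first

ranking = swings(items)  # Earthworks tops the tornado with a 90,000 swing
```

Plotting each swing as a horizontal bar, widest at the top, produces the familiar tornado shape.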
Outturn Cost: The escalated cost is known as the outturn cost.
PMax has provision to accommodate 11 periods of escalation in the 'Simulator Sheet'.
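Escalation to outturn cost is simple compounding across periods. A sketch with hypothetical rates; this is a generic illustration, not PMax's escalation formula.

```python
def outturn_cost(base_cost, period_rates):
    # Compound the base cost through successive escalation periods
    # (rates expressed as decimal fractions per period).
    cost = base_cost
    for rate in period_rates:
        cost *= (1 + rate)
    return cost

# Three hypothetical periods of escalation on a $1,000,000 base.
escalated = outturn_cost(1_000_000, [0.03, 0.03, 0.025])  # ~1,087,422.50
```

PMax's 'Simulator Sheet' allows up to 11 such periods to be applied.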
Additionally, PMax allows for Variations & Extra Claims in 'Provisional OnCosts'.
These can be entered in 'Estimate Sheet' which appears upon completion of simulation.
* Variations are typically instructed to the Contractor and are clearly identified and evaluated following the rules provided for in the Contract.
Extra claims, however, require the Contractor to comply with detailed Contractual and procedural requirements, which places the burden of proof on the Contractor if the remedy sought is to be maximised.
Source:
* https://contractbites.com/the-difference-between-a-variation-and-a-claim/
Copyright © 2015 - 2024 PMax - All Rights Reserved.