Formulae for Scoreboard Calculations
The most important type of scoreboard is the Favorite View based scoreboard. Favorite View based scoreboards use a Favorite View for their data selection and drill-down settings. The Favorite View defines Key filtering, AUM, measures to include, time range, Calculated measures, and so on. It also defines how the pivot sheet looks when you drill down from a scoreboard.
The following table describes the different Favorite View based formulae for calculating scoreboards in M3 DMP; a brief calculation sketch follows the table:
Formula | Description |
---|---|
1 – Difference versus Base | This formula detects hits inside or outside an interval for 1 minus the difference versus base of two measures. It is used for calculating Forecast Percentage Accuracy. 1-(([M1] - [M2]) / [M1]) >= 0 AND 1-(([M1] - [M2]) / [M1]) <= 0 |
Change Interval | This formula is used to calculate the Change Interval of a single measure compared to the measure without changes. (M(Pr+Poff)*F1 <<=>>= (M(Pr) + C(Pr)) <<=>>= M(Pr+Poff)*F2) |
Compare Measures | This formula is used to calculate and compare two measures against each other. (M2(Pr-Poff)*F1 <<=>>= M1(Pr) <<=>>= M2(Pr-Poff)*F2) |
Compare Periods | This formula is used to calculate and compare values of one measure in different periods. (M(Pr-Poff)*F1 <<=>>= M(Pr) <<=>>= M(Pr-Poff)*F2) |
Difference | This formula is used to calculate the difference between two measures inside or outside a value interval. (L1 <<=>>= M1(Pr) - M2(Pr-Poff) <<=>>= L2) |
Difference vs. Base | This formula is used to calculate the Difference vs. Base of two measures. (L1 <<=>>= (M1(Pr) - M2(Pr-Poff))/M1(Pr) <<=>>= L2) |
Difference vs. Subtrahend | This formula is used to calculate the Difference vs. Subtrahend of two measures. (L1 <<=>>= (M1(Pr) - M2(Pr-Poff))/M2(Pr-Poff) <<=>>= L2) |
Difference vs. Sum | This formula is used to calculate the Difference vs. Sum of two measures inside or outside a value interval. (L1 <<=>>= (M1(Pr) - M2(Pr-Poff))/(M1(Pr) + M2(Pr-Poff)) <<=>>= L2) |
Forecast Alarm 1 | This formula is used to calculate Forecast Alarm type 1. See About Forecast Alarms. (F1*MAD <<=>>= Abs(M1(Pr) - M2(Pr-Poff)) <<=>>= F2*MAD) |
Forecast Alarm 2 | This formula is used to calculate Forecast Alarm type 2. See About Forecast Alarms. (F1*MAD <<=>>= Avg(n, M1(Pr) - M2(Pr-Poff)) <<=>>= F2*MAD) |
Interval | This formula is used to calculate a measure inside or outside a given value interval. (L1 <<=>>= M(Pr) <<=>>= L2) |
Square Difference | This formula is used to calculate the Square Difference between two measures inside or outside a value interval. (L1 <<=>>= (M1(Pr) - M2(Pr-Poff))^2 <<=>>= L2) |
Square Root Difference | This formula is used to calculate the Square Root Difference between two measures inside or outside a value interval. (L1 <<=>>= Sqr((M1(Pr) - M2(Pr-Poff))^2) <<=>>= L2) |
Sum | This formula is used to calculate the sum of two measures inside or outside a value interval. (L1 <<=>>= M1(Pr) + M2(Pr-Poff) <<=>>= L2) |
Tracking Signal | This is a special formula: the sum of the differences between two measures over a number of periods, divided by the average of the absolute differences for the same number of periods. The formula detects whether the tracking signal lies outside approximately the standard deviation. Sum(2,([M1] - [M2])) / Avg(2,Abs(([M1] - [M2]))) >= 0 AND Sum(2,([M1] - [M2])) / Avg(2,Abs(([M1] - [M2]))) <= 0 |
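To make the interval notation concrete, the following minimal Python sketch shows how two of the formulae above could be evaluated once the measure values have been retrieved. The function names, parameter names, and limit values are illustrative assumptions, not part of the M3 DMP API, and the `<<=>>=` placeholders are rendered here as a plain inclusive interval check.

```python
# A minimal sketch, assuming measure values are plain floats already pulled
# from a Favorite View. Names and limits are illustrative, not M3 DMP APIs.

def difference_vs_base(m1: float, m2: float, l1: float, l2: float) -> bool:
    """Difference vs. Base: hit when (M1 - M2) / M1 falls inside [L1, L2]."""
    ratio = (m1 - m2) / m1
    return l1 <= ratio <= l2

def change_interval(m_ref: float, m_pr: float, c_pr: float,
                    f1: float, f2: float) -> bool:
    """Change Interval: hit when the changed measure M(Pr) + C(Pr) lies
    between F1 and F2 times the reference measure M(Pr+Poff)."""
    return m_ref * f1 <= (m_pr + c_pr) <= m_ref * f2

# Example: forecast 120, actual 100, hit when the relative difference
# versus base stays within +/- 10 %.
print(difference_vs_base(120.0, 100.0, -0.10, 0.10))  # False: 16.7 % is outside
```

The sketch only evaluates the inclusive interval itself; whether a hit counts as inside or outside the interval is part of the scoreboard definition.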
Formula Templates
In addition to the formulae mentioned above, M3 DMP includes a number of Key Performance Indicators (KPIs) that can easily be defined from the set of predefined formulae described below; a brief calculation sketch follows the table:
Formula | Description |
---|---|
Forecast Percentage Accuracy | Forecast Accuracy is a measure of how close the actual quantity is to the forecasted quantity and can vary in content depending on the type of business and the data that is available. The mathematical calculation is generally (forecast sales - actual sales) / forecast sales. |
Forecast Percentage Error | Forecast error is the converse of accuracy: Error (%) = 1 – Accuracy (%). Accuracy is normally constrained to be between 0 and 100%, meaning that an error above 100% corresponds to 0% accuracy and an error close to 0% corresponds to increasing forecast accuracy. |
Forecast Error | Forecast error is the difference between the actual quantity and the forecasted quantity. |
MFE - Mean Forecast Error | The Mean Forecast Error is the Average of the Sum of signed Forecast errors. (Identical with BIAS). |
BIAS - Mean Forecast Error | Forecast Bias is the Average of the Sum of signed Forecast errors. (Identical with MFE). |
MAD - Mean Absolute Deviation | The absolute average deviation. On its own, this figure does not say very much; it is recommended to use MAPE, MFE, or BIAS instead. |
MAPE - Mean Absolute Percentage Error | MAPE, the mean absolute percentage error, is a decent cross-sectional measure for evaluating divisional or corporate performance across many SKUs. It is used to measure SKU-level forecast error in most supply chains. |
MSE - Mean Squared Error | MSE is a better measure for calculating safety stock and other inventory planning parameters. The MSE has simple statistical properties and hence can be used to interpret the service levels as the number of standard deviations in a standard normal distribution. |
RMSE - Root Mean Squared Error | RMSE is a good longitudinal measure across time. Use RMSE to compare error over time for the same SKU, as it is also typically used to set safety stock planning. |
Tracking Signal | The Tracking Signal is used for accumulating errors over time to detect when the basic pattern has changed. |
Accuracy Signal | The Accuracy Signal is used for practical follow-up of forecast quality: (Forecast – Actual) / (Forecast + Actual). In addition to measuring forecast accuracy, the Accuracy Signal also provides information about how to improve the forecast. |
Theil's U-statistic | Theil's U-statistic provides a relative comparison of two measures, including weighting of error severity. If U = 1, the naive method is as good as the current forecasting method. If U < 1, the forecasting method is better than the naive method. If U > 1, the naive method is better than the forecasting method, and there is no need to waste time applying further forecasting methods. |
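As a rough illustration of how the KPI templates above relate to each other, the following Python sketch computes them from paired lists of per-period forecast and actual quantities. The function names and the exact MAPE and Theil's U variants shown are assumptions for illustration only and are not the M3 DMP formula templates themselves.

```python
# A minimal sketch, assuming paired per-period forecast and actual quantities.
# Names and exact variants are illustrative, not the M3 DMP implementations.
from math import sqrt

def forecast_kpis(forecast: list[float], actual: list[float]) -> dict[str, float]:
    errors = [f - a for f, a in zip(forecast, actual)]          # signed forecast errors
    n = len(errors)
    mfe = sum(errors) / n                                       # MFE / BIAS
    mad = sum(abs(e) for e in errors) / n                       # Mean Absolute Deviation
    mape = sum(abs(e) / a for e, a in zip(errors, actual)) / n  # error relative to actual
    mse = sum(e * e for e in errors) / n                        # Mean Squared Error
    rmse = sqrt(mse)                                            # Root Mean Squared Error
    tracking = sum(errors) / mad                                # accumulated error vs. MAD
    return {"MFE/BIAS": mfe, "MAD": mad, "MAPE": mape,
            "MSE": mse, "RMSE": rmse, "Tracking signal": tracking}

def theils_u(forecast: list[float], actual: list[float]) -> float:
    """One common variant: RMSE of the forecast divided by the RMSE of the
    naive forecast (previous period's actual); U < 1 favours the method."""
    num = sum((f - a) ** 2 for f, a in zip(forecast[1:], actual[1:]))
    den = sum((prev - a) ** 2 for prev, a in zip(actual, actual[1:]))
    return sqrt(num / den)

forecast = [100.0, 110.0, 120.0, 130.0]
actual = [95.0, 115.0, 118.0, 140.0]
print(forecast_kpis(forecast, actual))
print(theils_u(forecast, actual))  # ~0.38, i.e. better than the naive method
```

The sketch treats each KPI as a single number over the whole horizon; in a scoreboard, the number of periods and the measures being compared come from the Favorite View definition.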