
Forecast accuracy assessment guide: how to evaluate your forecasts

A simplified time series forecasting procedure comprises three main steps:

  1. data assessment and preparation (discussed in the previous article <link>),
  2. prediction calculation,
  3. prediction assessment.

To achieve this, we need a metric to measure the accuracy of our forecasts. Without this measurement, we cannot determine the reliability of our predictions—and using unreliable forecasts for planning poses risks a planner would like to avoid.

This also matters for model choice. To choose the best model, you can rely on the statistical features of the data and their compatibility with the model (more about that in the next article), but to find out whether it actually works you need to test it and properly assess the result. In other words, you need evidence that the model is good enough to make an appropriate decision.

So, how do you choose appropriate accuracy measures? In this article, I will describe the usual metrics used to assess a forecast. I will also introduce our custom method – the BiModal Prediction Score.

Common forecast accuracy metrics and formulas

Forecast accuracy is the factor based on which we choose the right model for a given time series. Choosing the right metrics is key to the whole process. Below, I present the most widely used metrics for assessing a prediction.

Legend:

  • yi – prediction value
  • xi – true value
  • n – number of forecasted periods
  • T – number of periods in the training set
  • h – the forecast horizon
  • Scale-independent – whether the metric is suitable for comparing different forecasts
  • Direction of deviation – whether the metric shows if the forecasts are over- or underestimated

1. Mean absolute error (MAE)

Equation: MAE = (1/n) · Σ |yi − xi|

  • Scale-independent: no
  • Direction of deviation: no
  • Remarks and recommendations: average volume error, good for stable time series.

MAE quantifies the average absolute difference between predicted values and actual outcomes [1].
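As a quick sketch (the data and function name here are my own, purely illustrative), MAE in plain Python:

```python
# Minimal MAE sketch: average absolute difference between predictions and actuals.
def mae(predictions, actuals):
    return sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(actuals)

# Illustrative data: absolute errors of 2, 2 and 10 units average to 14/3 ≈ 4.67.
print(mae([102, 98, 110], [100, 100, 100]))
```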

2. Symmetric Mean Absolute Percentage Error (SMAPE)

Equation: SMAPE = (100%/n) · Σ |yi − xi| / ((|xi| + |yi|) / 2)

  • Scale-independent: yes
  • Direction of deviation: no
  • Remarks and recommendations: precision indicator, good for comparison between time series.

The "symmetric" aspect means it treats over-predictions and under-predictions equally by dividing by the average of the actual and forecast values (instead of just the actual values, as in MAPE) [1].
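A minimal sketch of this symmetric variant, expressed in percent (the example data is my own):

```python
# SMAPE sketch: absolute error divided by the average of |actual| and |forecast|.
def smape(predictions, actuals):
    return (100 / len(actuals)) * sum(
        abs(p - a) / ((abs(a) + abs(p)) / 2) for p, a in zip(predictions, actuals)
    )

# One period: an error of 10 against an average level of 105 gives about 9.52%.
print(smape([110], [100]))
```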

3. Mean Absolute Scaled Error (MASE)

Equation: MASE = [(1/n) · Σ |yi − xi|] / [(1/(T−1)) · Σt=2..T |xt − xt−1|]

  • Scale-independent: yes
  • Direction of deviation: no

  • Remarks and recommendations: the numerator is the MAE of the forecast, the denominator is the MAE of a naive forecast; good for time series with high variability.

MASE was proposed by Hyndman and Koehler [2] as a more robust and comparable alternative to metrics like MAPE and MAE.
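A hedged sketch of that ratio (the training series and forecasts below are illustrative):

```python
# MASE sketch: forecast MAE divided by the in-sample MAE of a one-step naive forecast.
def mase(predictions, actuals, training):
    forecast_mae = sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(actuals)
    naive_errors = [abs(training[t] - training[t - 1]) for t in range(1, len(training))]
    naive_mae = sum(naive_errors) / len(naive_errors)
    return forecast_mae / naive_mae

# Values below 1 mean the forecast beats the naive method on the training data.
print(mase([104, 106], [106, 103], [100, 102, 101, 105]))
```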

4. Root Mean Square Error (RMSE)

Equation: RMSE = √((1/n) · Σ (yi − xi)²)

  • Scale-independent: no
  • Direction of deviation: no
  • Remarks and recommendations: more importance is given to the larger differences, recommended for high-value products

RMSE is a measure of the average magnitude of the errors between predicted/forecasted values and actual observed values. It gives higher weight to large errors because errors are squared before averaging, making it sensitive to outliers [1].
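A minimal sketch (illustrative data of my own) showing the squaring step:

```python
# RMSE sketch: squaring the errors before averaging penalizes large deviations.
def rmse(predictions, actuals):
    mse = sum((p - a) ** 2 for p, a in zip(predictions, actuals)) / len(actuals)
    return mse ** 0.5

print(rmse([103, 97], [100, 100]))  # two errors of 3 units -> 3.0
```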

5. Root Mean Squared Scaled Error (RMSSE)

Equation: RMSSE = √([(1/n) · Σ (yi − xi)²] / [(1/(T−1)) · Σt=2..T (xt − xt−1)²])

  • Scale-independent: yes
  • Direction of deviation: no
  • Remarks and recommendations: compares the MSE of the forecast to the MSE of a one-step naive forecast; similar to RMSE, but appropriately scaled

Uses squared errors and their root to strongly penalize large forecast mistakes and provide a more robust comparison across series of different scales [3].
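A sketch of the scaled variant (training series and forecasts below are my own illustrative data):

```python
# RMSSE sketch: forecast MSE scaled by the MSE of a one-step naive forecast
# on the training data, then square-rooted.
def rmsse(predictions, actuals, training):
    mse = sum((p - a) ** 2 for p, a in zip(predictions, actuals)) / len(actuals)
    naive_mse = sum(
        (training[t] - training[t - 1]) ** 2 for t in range(1, len(training))
    ) / (len(training) - 1)
    return (mse / naive_mse) ** 0.5

print(rmsse([13, 12], [12, 14], [10, 12, 11]))  # -> 1.0, on par with the naive forecast
```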

6. Forecast Bias

Equation: Bias = (1/n) · Σ (yi − xi)

  • Scale-independent: no
  • Direction of deviation: yes
  • Remarks and recommendations: indicates if the prediction in general is too high or too low, good to assess short-term predictions

Forecast bias reflects the systematic error in predictions, indicating the direction of the error rather than just its magnitude [1].
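As a minimal sketch (data is illustrative), bias as the mean signed error:

```python
# Forecast bias sketch: mean signed error. Positive means systematic
# overforecasting, negative means systematic underforecasting.
def forecast_bias(predictions, actuals):
    return sum(p - a for p, a in zip(predictions, actuals)) / len(actuals)

# Errors +5, -5 and +4 partly cancel, leaving a bias of 4/3.
print(forecast_bias([105, 95, 104], [100, 100, 100]))
```

Note how the cancellation of signed errors is exactly why bias alone can look deceptively good.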

There's a high chance you have encountered such a summary. In addition to the numerical forecast errors, I would also like to include two more qualitative ones, which can also be quantified.

Forecast evaluation and statistical compatibility

If the forecast seems incompatible with the historical data, it signals that something could have gone wrong in the process. For example, if the data show no significant trend, the average of the forecast should not differ significantly from the historical average. Such incompatibilities may arise when the model is too simple or when the data are significantly influenced by explanatory variables. This is a sign to use multivariate forecasting.

Forecast errors and prediction intervals

If the resulting intervals are too wide, consider switching the forecast to a different method. Prediction intervals are calculated from residuals on the test set, so at the testing stage you can also simply use MAE or bias. Assessment of prediction intervals happens at the very end of the process.
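To make the idea concrete, here is a rough sketch of a prediction interval built from test-set residuals, assuming roughly normal residuals (1.96 is the usual 95% normal quantile; the data is illustrative):

```python
# Rough 95% prediction interval: forecast +/- z * standard deviation of residuals.
def prediction_interval(forecast, residuals, z=1.96):
    n = len(residuals)
    mean = sum(residuals) / n
    std = (sum((r - mean) ** 2 for r in residuals) / n) ** 0.5
    return forecast - z * std, forecast + z * std

low, high = prediction_interval(100.0, [2.0, -1.0, 0.5, -1.5])
print(low, high)  # a band of roughly +/- 2.68 around the forecast
```

If that band is wider than the business can act on, this alone can disqualify the method.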

Measuring forecast accuracy: which method to choose?

In general, this decision is not an easy one. There are different goals to pursue, and each metric is better suited to specific purposes. This is why there is no perfect metric that works for everything.

The same applies to models for time series data; there is no one-size-fits-all model. However, when it comes to choosing a model, we have options. Different models can be employed, as they all ultimately aim to achieve the same outcome: making predictions.

Measuring forecast accuracy: my recommendation

On the other hand, the accuracy metric should be universal, as it facilitates the comparison of the values between different time series. At the same time, it should bring proper meaning to the gathered data. Due to that, the choice of one metric is difficult at best; personally, I believe it to be an impossible endeavor.

So, the most plausible way to ensure that the measurement brings valuable insights is to focus on more than one goal, and create a score incorporating both appropriate accuracy indicators and business relevance (currently, the trend is to combine the forecast bias and MAE formulas).

Evaluating forecast accuracy with BiModal Forecasting

As we recognize the advantages of combining accuracy metrics and business context, we have developed the BiModal Prediction Score.

This score includes:

  • the Symmetric Mean Absolute Percentage Error (SMAPE),
  • a statistical compatibility assessment,
  • and a factor for the width of prediction intervals.

We intentionally do not include bias, as it can create a misleading impression of accuracy in long-term forecasts—the positive and negative deviations can offset each other.

This approach works fairly well for short-term forecasts. Over a three-month period, if overestimations and underestimations balance out, your forecast remains stable. That said, the method can also be beneficial for long-term forecasts if applied within specific time buckets.

A guide on demand forecast assessment: the conclusion

Each error metric has flaws – therefore, we should use hybrids to ensure both accuracy and business relevance. This, along with knowledge of the data features, will ensure that the correct model is chosen. The result is the elimination of both principal and statistical errors and, at the same time, high accuracy of your predictions.

The next post will touch upon some of the most popular forecasting models. Let's stay in touch!


Works Cited: 

[1] Makridakis, S., Wheelwright, S. C., & Hyndman, R. J. (1998). Forecasting: Methods and Applications (3rd ed.). Wiley, pp. 40–45.

[2] Hyndman, R. J., & Koehler, A. B. (2006). Another Look at Measures of Forecast Accuracy. International Journal of Forecasting, 22(4), 679–688.

[3] Guangyu Wu. (2022, September 16). MASE, RMSSE Metrics. Retrieved from https://guangyuwu.wordpress.com/2022/09/16/mase-rmsse-metrics/

