2024-04-01

This vignette briefly explains how to use the adam() function and the related auto.adam() from the smooth package. It does not aim to cover all aspects of the function, but focuses on the main ones.

ADAM stands for Augmented Dynamic Adaptive Model. It is a model that underlies ETS, ARIMA and regression, connecting them in a unified framework. The underlying model for ADAM is a Single Source of Error state space model, which is explained in detail separately in an online textbook.

The main philosophy of the adam() function is to be agnostic of the provided data. This means that it will work with ts, msts, zoo, xts, data.frame, numeric and other classes of data. The specification of seasonality in the model is done using a separate lags parameter, so you are not obliged to transform the existing data into something specific and can use it as is. If you provide a matrix, a data.frame, a data.table, or any other multivariate structure, then the function will use the first column for the response variable and the others for the explanatory ones. One thing that is currently assumed in the function is that the data is measured at a regular frequency. If this is not the case, you will need to introduce missing values manually.
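As a small sketch of that behaviour (the data frame and its column names here are invented purely for illustration):

# A hypothetical two-column data.frame: the first column (sales) is used
# as the response variable, the second (temperature) as an explanatory one
exampleData <- data.frame(sales=rnorm(120, 1000, 100),
                          temperature=rnorm(120, 20, 5))
testModel <- adam(exampleData, "ANN", h=12, holdout=TRUE)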

In order to run the experiments in this vignette, we need to load the following packages:

require(greybox)
require(smooth)

First and foremost, ADAM implements the ETS model, although in a more flexible way than in Hyndman et al. (2008): it supports different distributions for the error term, which are regulated via the distribution parameter. By default, the additive error model relies on the Normal distribution, while the multiplicative error one assumes the Inverse Gaussian. If you want to reproduce the classical ETS, you need to specify distribution="dnorm". Here is an example of ADAM ETS(MMM) with the Normal distribution on the AirPassengers data:

testModel <- adam(AirPassengers, "MMM", lags=c(1,12), distribution="dnorm",
h=12, holdout=TRUE)
summary(testModel)
#>
#> Model estimated using adam() function: ETS(MMM)
#> Response variable: AirPassengers
#> Distribution used in the estimation: Normal
#> Loss function type: likelihood; Loss function value: 468.8889
#> Coefficients:
#>             Estimate Std. Error Lower 2.5% Upper 97.5%
#> alpha         0.8451     0.0841     0.6784      1.0000 *
#> beta          0.0205     0.0264     0.0000      0.0725
#> gamma         0.0000     0.0381     0.0000      0.0753
#> level       120.3078    13.1685    94.2236    146.3289 *
#> trend         1.0017     0.0100     0.9818      1.0216 *
#> seasonal_1    0.9150     0.0081     0.9007      0.9381 *
#> seasonal_2    0.9001     0.0081     0.8859      0.9233 *
#> seasonal_3    1.0299     0.0097     1.0156      1.0530 *
#> seasonal_4    0.9867     0.0084     0.9725      1.0099 *
#> seasonal_5    0.9856     0.0072     0.9714      1.0088 *
#> seasonal_6    1.1172     0.0095     1.1030      1.1404 *
#> seasonal_7    1.2344     0.0117     1.2202      1.2576 *
#> seasonal_8    1.2251     0.0106     1.2108      1.2482 *
#> seasonal_9    1.0663     0.0093     1.0521      1.0895 *
#> seasonal_10   0.9261     0.0089     0.9119      0.9493 *
#> seasonal_11   0.8046     0.0077     0.7904      0.8278 *
#>
#> Error standard deviation: 0.0375
#> Sample size: 132
#> Number of estimated parameters: 17
#> Number of degrees of freedom: 115
#> Information criteria:
#>       AIC      AICc       BIC      BICc
#>  971.7778  977.1463 1020.7855 1033.8919
plot(forecast(testModel,h=12,interval="prediction"))

You might notice that the summary contains more than what is reported by other smooth functions. It also produces standard errors for the estimated parameters, based on the Fisher Information calculation. Note that this is computationally expensive, so if you have a model with more than 30 variables, the calculation of standard errors might take plenty of time. As for the default print() method, it produces a shorter summary of the model, without the standard errors (similar to what es() does):
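If you need those quantities for further calculations, the standard accessor methods should help (assuming here that the generic vcov() and confint() methods, which smooth implements for adam models, suit your purpose):

# Covariance matrix of the estimated parameters (this triggers the same
# computationally expensive Fisher Information calculation as summary())
vcov(testModel)
# Confidence intervals for the parameters
confint(testModel, level=0.95)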

testModel
#> Time elapsed: 0.12 seconds
#> Model estimated using adam() function: ETS(MMM)
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 468.8889
#> Persistence vector g:
#>  alpha   beta  gamma
#> 0.8451 0.0205 0.0000
#>
#> Sample size: 132
#> Number of estimated parameters: 17
#> Number of degrees of freedom: 115
#> Information criteria:
#>       AIC      AICc       BIC      BICc
#>  971.7778  977.1463 1020.7855 1033.8919
#>
#> Forecast errors:
#> ME: -4.589; MAE: 15.524; RMSE: 21.552
#> sCE: -20.977%; Asymmetry: -12.8%; sMAE: 5.914%; sMSE: 0.674%
#> MASE: 0.645; RMSSE: 0.688; rMAE: 0.204; rRMSE: 0.209

Also, note that prediction intervals in the case of multiplicative error models are approximate. It is advisable to use simulations instead (which is slower, but more accurate):

plot(forecast(testModel,h=18,interval="simulated"))

If you want to do residual diagnostics, it is recommended to use the plot function, something like this (you can select which of the plots to produce):

par(mfcol=c(3,4))
plot(testModel,which=c(1:11))
par(mfcol=c(1,1))
plot(testModel,which=12)

By default, ADAM estimates models via maximising the likelihood function. But there is also a loss parameter, which allows selecting from a list of already implemented loss functions (again, see the documentation for adam() for the full list) or using a function written by the user. Here is how to do the latter on the example of BJsales:

lossFunction <- function(actual, fitted, B){
    return(sum(abs(actual-fitted)^3))
}
testModel <- adam(BJsales, "AAN", silent=FALSE, loss=lossFunction,
h=12, holdout=TRUE)
testModel
#> Time elapsed: 0.02 seconds
#> Model estimated using adam() function: ETS(AAN)
#> Distribution assumed in the model: Normal
#> Loss function type: custom; Loss function value: 599.2251
#> Persistence vector g:
#>  alpha   beta
#> 1.0000 0.2266
#>
#> Sample size: 138
#> Number of estimated parameters: 4
#> Number of degrees of freedom: 134
#> Information criteria are unavailable for the chosen loss & distribution.
#>
#> Forecast errors:
#> ME: 3.016; MAE: 3.129; RMSE: 3.867
#> sCE: 15.922%; Asymmetry: 91.7%; sMAE: 1.377%; sMSE: 0.029%
#> MASE: 2.627; RMSSE: 2.521; rMAE: 1.01; rRMSE: 1.009

Note that you need to have the parameters actual, fitted and B in the function, which correspond to the vector of actual values, the vector of fitted values on each iteration, and the vector of the parameters being optimised.
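Because B is available inside the loss, you can also make the loss depend on the parameters themselves. Here is an illustrative sketch (the ridge-style penalty and its weight of 0.1 are invented for demonstration and are not a built-in option):

# MSE with a small penalty on the optimised parameters in B
penalisedLoss <- function(actual, fitted, B){
    return(sum((actual-fitted)^2) + 0.1*sum(B^2))
}
testModelPenalised <- adam(BJsales, "AAN", loss=penalisedLoss,
                           h=12, holdout=TRUE)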

The loss and distribution parameters are independent, so in the example above we have assumed that the error term follows the Normal distribution, but we have estimated its parameters using a non-conventional loss, because we can. Some of the distributions assume that there is an additional parameter, which can either be estimated or provided by the user. These include the Asymmetric Laplace (distribution="dalaplace") with alpha, the Generalised Normal and Log-Generalised Normal (distribution=c("dgnorm","dlgnorm")) with shape, and Student's t (distribution="dt") with nu:

testModel <- adam(BJsales, "MMN", silent=FALSE, distribution="dgnorm", shape=3,
h=12, holdout=TRUE)
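Similarly, a sketch with the Asymmetric Laplace and a fixed alpha (the value 0.05 is chosen here purely for illustration) would be:

# ETS(MMN) with the Asymmetric Laplace distribution and fixed alpha
testModel <- adam(BJsales, "MMN", silent=TRUE, distribution="dalaplace", alpha=0.05,
                  h=12, holdout=TRUE)

If alpha is not provided in such cases, the function will estimate it together with the other parameters.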

The model selection in ADAM ETS relies on information criteria and works correctly only for loss="likelihood". There are several options for how to select the model; see them in the description of the function (?adam). The default one uses a branch-and-bound algorithm, similar to the one used in es(), but only considers additive trend models (the multiplicative trend ones are less stable and need more attention from a forecaster):

testModel <- adam(AirPassengers, "ZXZ", lags=c(1,12), silent=FALSE,
h=12, holdout=TRUE)
#> Forming the pool of models based on... ANN , ANA , MNM , MAM , Estimation progress: 71%... 86%... 100%... Done!
testModel
#> Time elapsed: 0.56 seconds
#> Model estimated using adam() function: ETS(MAM)
#> Distribution assumed in the model: Gamma
#> Loss function type: likelihood; Loss function value: 466.5937
#> Persistence vector g:
#>  alpha   beta  gamma
#> 0.7764 0.0000 0.0000
#>
#> Sample size: 132
#> Number of estimated parameters: 17
#> Number of degrees of freedom: 115
#> Information criteria:
#>       AIC      AICc       BIC      BICc
#>  967.1875  972.5559 1016.1951 1029.3016
#>
#> Forecast errors:
#> ME: 10.678; MAE: 21.661; RMSE: 26.648
#> sCE: 48.815%; Asymmetry: 66.5%; sMAE: 8.252%; sMSE: 1.031%
#> MASE: 0.899; RMSSE: 0.85; rMAE: 0.285; rRMSE: 0.259

Note that the function produces point forecasts if h>0, but it won't generate prediction intervals. This is why you need to use the forecast() method (as shown in the first example in this vignette).

Similarly to es(), the function supports the combination of models, but it saves all the tested models in the output for potential reuse. Here is how it works:

testModel <- adam(AirPassengers, "CXC", lags=c(1,12),
h=12, holdout=TRUE)
testForecast <- forecast(testModel,h=18,interval="semiparametric", level=c(0.9,0.95))
testForecast
#>          Point forecast Lower bound (5%) Lower bound (2.5%) Upper bound (95%)
#> Jan 1960       412.7963         389.9166           385.6619          436.2219
#> Feb 1960       407.2683         378.6459           373.3673          436.7623
#> Mar 1960       468.6856         429.4083           422.2202          509.3979
#> Apr 1960       451.6047         410.7938           403.3531          494.0283
#> May 1960       452.4222         410.7801           403.1952          495.7416
#> Jun 1960       514.3279         465.1691           456.2333          565.5463
#> Jul 1960       571.2106         514.8504           504.6241          630.0119
#> Aug 1960       569.9350         512.9405           502.6071          629.4331
#> Sep 1960       497.4044         447.9023           438.9249          549.0698
#> Oct 1960       433.9085         390.4578           382.5806          479.2706
#> Nov 1960       378.7129         340.7720           333.8938          418.3236
#> Dec 1960       427.5252         383.8846           375.9820          473.1243
#> Jan 1961       435.1441         388.7155           380.3307          483.7554
#> Feb 1961       429.2176         381.0829           372.4180          479.7370
#> Mar 1961       493.8320         434.4863           423.8542          556.3402
#> Apr 1961       475.7266         416.0207           405.3583          538.7638
#> May 1961       476.4809         416.3052           405.5640          540.0364
#> Jun 1961       541.5579         473.1998           460.9976          613.7535
#>          Upper bound (97.5%)
#> Jan 1960            440.8401
#> Feb 1960            442.6214
#> Mar 1960            517.5417
#> Apr 1960            502.5429
#> May 1960            504.4435
#> Jun 1960            575.8535
#> Jul 1960            641.8638
#> Aug 1960            641.4335
#> Sep 1960            559.4879
#> Oct 1960            488.4205
#> Nov 1960            426.3137
#> Dec 1960            482.3312
#> Jan 1961            493.5936
#> Feb 1961            489.9897
#> Mar 1961            569.0779
#> Apr 1961            551.6443
#> May 1961            553.0280
#> Jun 1961            628.5107
plot(testForecast)

The function supports vectors for the levels in case you want to produce several. In fact, it also supports the side parameter for the prediction interval, so you can extract specific quantiles without a hassle:

forecast(testModel,h=18,interval="semiparametric", level=c(0.9,0.95,0.99), side="upper")
#>          Point forecast Upper bound (90%) Upper bound (95%) Upper bound (99%)
#> Jan 1960       412.7963          430.9374          436.2219          446.2504
#> Feb 1960       407.2683          430.0713          436.7623          449.4989
#> Mar 1960       468.6856          500.1145          509.3979          527.1183
#> Apr 1960       451.6047          484.3306          494.0283          512.5643
#> May 1960       452.4222          485.8329          495.7416          514.6875
#> Jun 1960       514.3279          553.8151          565.5463          587.9929
#> Jul 1960       571.2106          616.5282          630.0119          655.8281
#> Aug 1960       569.9350          615.7827          629.4331          655.5753
#> Sep 1960       497.4044          537.2186          549.0698          571.7643
#> Oct 1960       433.9085          468.8628          479.2706          499.2034
#> Nov 1960       378.7129          409.2353          418.3236          435.7297
#> Dec 1960       427.5252          462.6545          473.1243          493.1839
#> Jan 1961       435.1441          472.5745          483.7554          505.1975
#> Feb 1961       429.2176          468.0933          479.7370          502.0912
#> Mar 1961       493.8320          541.8896          556.3402          584.1282
#> Apr 1961       475.7266          524.1617          538.7638          566.8736
#> May 1961       476.4809          525.3098          540.0364          568.3902
#> Jun 1961       541.5579          597.0253          613.7535          645.9606

A brand new thing in the function is the possibility to use several frequencies (double / triple / quadruple / ... seasonal models). In order to show how it works, we will generate an artificial time series, inspired by half-hourly electricity demand, using the sim.gum() function:

ordersGUM <- c(1,1,1)
lagsGUM <- c(1,48,336)
initialGUM1 <- -25381.7
initialGUM2 <- c(23955.09, 24248.75, 24848.54, 25012.63, 24634.14, 24548.22, 24544.63, 24572.77,
24498.33, 24250.94, 24545.44, 25005.92, 26164.65, 27038.55, 28262.16, 28619.83,
28892.19, 28575.07, 28837.87, 28695.12, 28623.02, 28679.42, 28682.16, 28683.40,
28647.97, 28374.42, 28261.56, 28199.69, 28341.69, 28314.12, 28252.46, 28491.20,
28647.98, 28761.28, 28560.11, 28059.95, 27719.22, 27530.23, 27315.47, 27028.83,
26933.75, 26961.91, 27372.44, 27362.18, 27271.31, 26365.97, 25570.88, 25058.01)
initialGUM3 <- c(23920.16, 23026.43, 22812.23, 23169.52, 23332.56, 23129.27, 22941.20, 22692.40,
22607.53, 22427.79, 22227.64, 22580.72, 23871.99, 25758.34, 28092.21, 30220.46,
31786.51, 32699.80, 33225.72, 33788.82, 33892.25, 34112.97, 34231.06, 34449.53,
34423.61, 34333.93, 34085.28, 33948.46, 33791.81, 33736.17, 33536.61, 33633.48,
33798.09, 33918.13, 33871.41, 33403.75, 32706.46, 31929.96, 31400.48, 30798.24,
29958.04, 30020.36, 29822.62, 30414.88, 30100.74, 29833.49, 28302.29, 26906.72,
26378.64, 25382.11, 25108.30, 25407.07, 25469.06, 25291.89, 25054.11, 24802.21,
24681.89, 24366.97, 24134.74, 24304.08, 25253.99, 26950.23, 29080.48, 31076.33,
32453.20, 33232.81, 33661.61, 33991.21, 34017.02, 34164.47, 34398.01, 34655.21,
34746.83, 34596.60, 34396.54, 34236.31, 34153.32, 34102.62, 33970.92, 34016.13,
34237.27, 34430.08, 34379.39, 33944.06, 33154.67, 32418.62, 31781.90, 31208.69,
30662.59, 30230.67, 30062.80, 30421.11, 30710.54, 30239.27, 28949.56, 27506.96,
26891.75, 25946.24, 25599.88, 25921.47, 26023.51, 25826.29, 25548.72, 25405.78,
25210.45, 25046.38, 24759.76, 24957.54, 25815.10, 27568.98, 29765.24, 31728.25,
32987.51, 33633.74, 34021.09, 34407.19, 34464.65, 34540.67, 34644.56, 34756.59,
34743.81, 34630.05, 34506.39, 34319.61, 34110.96, 33961.19, 33876.04, 33969.95,
34220.96, 34444.66, 34474.57, 34018.83, 33307.40, 32718.90, 32115.27, 31663.53,
30903.82, 31013.83, 31025.04, 31106.81, 30681.74, 30245.70, 29055.49, 27582.68,
26974.67, 25993.83, 25701.93, 25940.87, 26098.63, 25771.85, 25468.41, 25315.74,
25131.87, 24913.15, 24641.53, 24807.15, 25760.85, 27386.39, 29570.03, 31634.00,
32911.26, 33603.94, 34020.90, 34297.65, 34308.37, 34504.71, 34586.78, 34725.81,
34765.47, 34619.92, 34478.54, 34285.00, 34071.90, 33986.48, 33756.85, 33799.37,
33987.95, 34047.32, 33924.48, 33580.82, 32905.87, 32293.86, 31670.02, 31092.57,
30639.73, 30245.42, 30281.61, 30484.33, 30349.51, 29889.23, 28570.31, 27185.55,
26521.85, 25543.84, 25187.82, 25371.59, 25410.07, 25077.67, 24741.93, 24554.62,
24427.19, 24127.21, 23887.55, 24028.40, 24981.34, 26652.32, 28808.00, 30847.09,
32304.13, 33059.02, 33562.51, 33878.96, 33976.68, 34172.61, 34274.50, 34328.71,
34370.12, 34095.69, 33797.46, 33522.96, 33169.94, 32883.32, 32586.24, 32380.84,
32425.30, 32532.69, 32444.24, 32132.49, 31582.39, 30926.58, 30347.73, 29518.04,
29070.95, 28586.20, 28416.94, 28598.76, 28529.75, 28424.68, 27588.76, 26604.13,
26101.63, 25003.82, 24576.66, 24634.66, 24586.21, 24224.92, 23858.42, 23577.32,
23272.28, 22772.00, 22215.13, 21987.29, 21948.95, 22310.79, 22853.79, 24226.06,
25772.55, 27266.27, 28045.65, 28606.14, 28793.51, 28755.83, 28613.74, 28376.47,
27900.76, 27682.75, 27089.10, 26481.80, 26062.94, 25717.46, 25500.27, 25171.05,
25223.12, 25634.63, 26306.31, 26822.46, 26787.57, 26571.18, 26405.21, 26148.41,
25704.47, 25473.10, 25265.97, 26006.94, 26408.68, 26592.04, 26224.64, 25407.27,
25090.35, 23930.21, 23534.13, 23585.75, 23556.93, 23230.25, 22880.24, 22525.52,
22236.71, 21715.08, 21051.17, 20689.40, 20099.18, 19939.71, 19722.69, 20421.58,
21542.03, 22962.69, 23848.69, 24958.84, 25938.72, 26316.56, 26742.61, 26990.79,
27116.94, 27168.78, 26464.41, 25703.23, 25103.56, 24891.27, 24715.27, 24436.51,
24327.31, 24473.02, 24893.89, 25304.13, 25591.77, 25653.00, 25897.55, 25859.32,
25918.32, 25984.63, 26232.01, 26810.86, 27209.70, 26863.50, 25734.54, 24456.96)
y <- sim.gum(orders=ordersGUM, lags=lagsGUM, nsim=1, frequency=336, obs=3360,
measurement=rep(1,3), transition=diag(3), persistence=c(0.045,0.162,0.375),
initial=cbind(initialGUM1,initialGUM2,initialGUM3))$data

We can then apply ADAM to this data:

testModel <- adam(y, "MMdM", lags=c(1,48,336), initial="backcasting",
silent=FALSE, h=336, holdout=TRUE)
testModel
#> Time elapsed: 0.8 seconds
#> Model estimated using adam() function: ETS(MMdM)[48, 336]
#> Distribution assumed in the model: Gamma
#> Loss function type: likelihood; Loss function value: 22193.97
#> Persistence vector g:
#>  alpha   beta gamma1 gamma2
#> 0.9825 0.0000 0.0175 0.0175
#> Damping parameter: 0.9388
#> Sample size: 3024
#> Number of estimated parameters: 6
#> Number of degrees of freedom: 3018
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 44399.94 44399.96 44436.02 44436.13
#>
#> Forecast errors:
#> ME: 41.303; MAE: 723.168; RMSE: 954.172
#> sCE: 46.478%; Asymmetry: 8.6%; sMAE: 2.422%; sMSE: 0.102%
#> MASE: 0.984; RMSSE: 0.924; rMAE: 0.107; rRMSE: 0.115

Note that the more lags you have, the more initial seasonal components the function will need to estimate, which is a difficult task. This is why we used initial="backcasting" in the example above: it speeds up the estimation by reducing the number of parameters to estimate. Still, the optimiser might not get close to the optimal value, so we can help it. First, we can give more time for the calculation, increasing the number of iterations via maxeval (the default value is 40 iterations for each estimated parameter, i.e. 40 × 5 = 200 in our case):

testModel <- adam(y, "MMdM", lags=c(1,48,336), initial="backcasting",
silent=FALSE, h=336, holdout=TRUE, maxeval=10000)
testModel
#> Time elapsed: 2.02 seconds
#> Model estimated using adam() function: ETS(MMdM)[48, 336]
#> Distribution assumed in the model: Gamma
#> Loss function type: likelihood; Loss function value: 22193.97
#> Persistence vector g:
#>  alpha   beta gamma1 gamma2
#> 0.9825 0.0000 0.0175 0.0175
#> Damping parameter: 0.9464
#> Sample size: 3024
#> Number of estimated parameters: 6
#> Number of degrees of freedom: 3018
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 44399.94 44399.96 44436.02 44436.13
#>
#> Forecast errors:
#> ME: 41.303; MAE: 723.168; RMSE: 954.172
#> sCE: 46.478%; Asymmetry: 8.6%; sMAE: 2.422%; sMSE: 0.102%
#> MASE: 0.984; RMSSE: 0.924; rMAE: 0.107; rRMSE: 0.115

This will take more time, but will typically lead to more refined parameters. You can control other parameters of the optimiser as well, such as algorithm, xtol_rel, print_level and others, which are explained in the documentation for the nloptr function from the nloptr package (run nloptr.print.options() for details). Second, we can give a different set of initial parameters to the optimiser; have a look at what the function saves:

testModel$B

and use this as a starting point for the reestimation (e.g. with a different algorithm):

testModel <- adam(y, "MMdM", lags=c(1,48,336), initial="backcasting",
silent=FALSE, h=336, holdout=TRUE, B=testModel$B)
testModel
#> Time elapsed: 0.67 seconds
#> Model estimated using adam() function: ETS(MMdM)[48, 336]
#> Distribution assumed in the model: Gamma
#> Loss function type: likelihood; Loss function value: 22193.97
#> Persistence vector g:
#>  alpha   beta gamma1 gamma2
#> 0.9825 0.0000 0.0175 0.0175
#> Damping parameter: 0.9537
#> Sample size: 3024
#> Number of estimated parameters: 6
#> Number of degrees of freedom: 3018
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 44399.94 44399.96 44436.02 44436.13
#>
#> Forecast errors:
#> ME: 41.303; MAE: 723.168; RMSE: 954.172
#> sCE: 46.478%; Asymmetry: 8.6%; sMAE: 2.422%; sMSE: 0.102%
#> MASE: 0.984; RMSSE: 0.924; rMAE: 0.107; rRMSE: 0.115

If you are ready to wait, you can switch to initial="optimal", which in our case will take much more time because of the number of estimated parameters: 389 for the chosen model. The estimation process in this case might take 20 to 30 times longer than in the example above.
In addition, you can specify some parts of the initial state vector or some parts of the persistence vector. Here is an example:

testModel <- adam(y, "MMdM", lags=c(1,48,336), initial="backcasting",
silent=TRUE, h=336, holdout=TRUE, persistence=list(beta=0.1))
testModel
#> Time elapsed: 0.57 seconds
#> Model estimated using adam() function: ETS(MMdM)[48, 336]
#> Distribution assumed in the model: Gamma
#> Loss function type: likelihood; Loss function value: 21765.29
#> Persistence vector g:
#>  alpha   beta gamma1 gamma2
#> 0.9344 0.1000 0.0638 0.0656
#> Damping parameter: 0.4112
#> Sample size: 3024
#> Number of estimated parameters: 5
#> Number of degrees of freedom: 3019
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 43540.58 43540.60 43570.65 43570.73
#>
#> Forecast errors:
#> ME: -18.126; MAE: 692.908; RMSE: 915.669
#> sCE: -20.397%; Asymmetry: 0.1%; sMAE: 2.321%; sMSE: 0.094%
#> MASE: 0.943; RMSSE: 0.887; rMAE: 0.103; rRMSE: 0.11

The function also handles intermittent data (data with zeroes) and data with missing values. This is partially covered in the vignette on the oes() function.
Here is a simple example:

testModel <- adam(rpois(120,0.5), "MNN", silent=FALSE, h=12, holdout=TRUE,
occurrence="odds-ratio")
testModel
#> Time elapsed: 0.04 seconds
#> Model estimated using adam() function: iETS(MNN)[O]
#> Occurrence model type: Odds ratio
#> Distribution assumed in the model: Mixture of Bernoulli and Gamma
#> Loss function type: likelihood; Loss function value: 39.789
#> Persistence vector g:
#>  alpha
#> 0.7545
#>
#> Sample size: 108
#> Number of estimated parameters: 5
#> Number of degrees of freedom: 103
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 232.9765 233.2073 246.3872 237.5632
#>
#> Forecast errors:
#> Asymmetry: -31.604%; sMSE: 38.824%; rRMSE: 0.794; sPIS: -79.727%; sCE: 31.189%

Finally, adam() is faster than the es() function, because its code is more efficient and it uses a different optimisation algorithm with more finely tuned parameters by default. Let's compare:

adamModel <- adam(AirPassengers, "CCC", h=12, holdout=TRUE)
esModel <- es(AirPassengers, "CCC", h=12, holdout=TRUE)
"adam:"
#> [1] "adam:"
adamModel
#> Time elapsed: 2.25 seconds
#> Model estimated: ETS(CCC)
#> Loss function type: likelihood
#>
#> Number of models combined: 30
#> Sample size: 132
#> Average number of estimated parameters: 20.6558
#> Average number of degrees of freedom: 111.3442
#>
#> Forecast errors:
#> ME: 2.537; MAE: 15.688; RMSE: 21.994
#> sCE: 11.599%; sMAE: 5.976%; sMSE: 0.702%
#> MASE: 0.651; RMSSE: 0.702; rMAE: 0.206; rRMSE: 0.214
"es():"
#> [1] "es():"
esModel
#> Time elapsed: 2.14 seconds
#> Model estimated: ETS(CCC)
#> Loss function type: likelihood
#>
#> Number of models combined: 30
#> Sample size: 132
#> Average number of estimated parameters: 21.1644
#> Average number of degrees of freedom: 110.8356
#>
#> Forecast errors:
#> ME: 4.754; MAE: 17.114; RMSE: 22.996
#> sCE: 21.733%; sMAE: 6.52%; sMSE: 0.767%
#> MASE: 0.711; RMSSE: 0.734; rMAE: 0.225; rRMSE: 0.223

ADAM ARIMA

As mentioned above, ADAM does not only contain ETS; it also contains the ARIMA model, which is regulated via the orders parameter. If you want a pure ARIMA, you need to switch off ETS, which is done via model="NNN":

testModel <- adam(BJsales, "NNN", silent=FALSE, orders=c(0,2,2),
h=12, holdout=TRUE)
testModel
#> Time elapsed: 0.03 seconds
#> Model estimated using adam() function: ARIMA(0,2,2)
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 240.5944
#> ARMA parameters of the model:
#> MA:
#> theta1[1] theta2[1]
#>   -0.7484   -0.0165
#>
#> Sample size: 138
#> Number of estimated parameters: 5
#> Number of degrees of freedom: 133
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 491.1888 491.6434 505.8251 506.9449
#>
#> Forecast errors:
#> ME: 2.961; MAE: 3.087; RMSE: 3.812
#> sCE: 15.63%; Asymmetry: 90.2%; sMAE: 1.358%; sMSE: 0.028%
#> MASE: 2.591; RMSSE: 2.485; rMAE: 0.996; rRMSE: 0.995

Given that both models are implemented in the same framework, they can be compared using information criteria. The functionality of ADAM ARIMA is similar to that of the msarima function in the smooth package, although there are several differences. First, changing the distribution parameter allows switching between additive / multiplicative models.
For example, distribution="dlnorm" will create an ARIMA equivalent to the one applied to the logarithms of the data:

testModel <- adam(AirPassengers, "NNN", silent=FALSE, lags=c(1,12),
orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)), distribution="dlnorm",
h=12, holdout=TRUE)
testModel
#> Time elapsed: 0.37 seconds
#> Model estimated using adam() function: SARIMA(1,1,2)[1](1,1,2)[12]
#> Distribution assumed in the model: Log-Normal
#> Loss function type: likelihood; Loss function value: 617.5117
#> ARMA parameters of the model:
#> AR:
#>  phi1[1] phi1[12]
#>   0.8286   0.4681
#> MA:
#> theta1[1] theta2[1] theta1[12] theta2[12]
#>   -0.5840    0.0896    -0.1938    -0.1226
#>
#> Sample size: 132
#> Number of estimated parameters: 33
#> Number of degrees of freedom: 99
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 1301.024 1323.921 1396.156 1452.059
#>
#> Forecast errors:
#> ME: -94.42; MAE: 94.42; RMSE: 100.143
#> sCE: -431.648%; Asymmetry: -100%; sMAE: 35.971%; sMSE: 14.555%
#> MASE: 3.92; RMSSE: 3.196; rMAE: 1.242; rRMSE: 0.972

Second, if you want a model with an intercept / drift, you can do it using the constant parameter:

testModel <- adam(AirPassengers, "NNN", silent=FALSE, lags=c(1,12), constant=TRUE,
orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)), distribution="dnorm",
h=12, holdout=TRUE)
testModel
#> Time elapsed: 0.34 seconds
#> Model estimated using adam() function: SARIMA(1,1,2)[1](1,1,2)[12] with drift
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 493.1328
#> ARMA parameters of the model:
#> AR:
#>  phi1[1] phi1[12]
#>  -0.8840  -0.2784
#> MA:
#> theta1[1] theta2[1] theta1[12] theta2[12]
#>    0.8752   -0.1018    -0.1000     0.0965
#>
#> Sample size: 132
#> Number of estimated parameters: 34
#> Number of degrees of freedom: 98
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 1054.265 1078.802 1152.281 1212.183
#>
#> Forecast errors:
#> ME: -6.83; MAE: 13.958; RMSE: 17.923
#> sCE: -31.226%; Asymmetry: -49.2%; sMAE: 5.317%; sMSE: 0.466%
#> MASE: 0.58; RMSSE: 0.572; rMAE: 0.184; rRMSE: 0.174

If the model contains non-zero differences, then the constant acts as a drift. Third, you can specify the parameters of ARIMA via the arma parameter in the following manner:

testModel <- adam(AirPassengers, "NNN", silent=FALSE, lags=c(1,12),
orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)), distribution="dnorm",
arma=list(ar=c(0.1,0.1), ma=c(-0.96, 0.03, -0.12, 0.03)),
h=12, holdout=TRUE)
testModel
#> Time elapsed: 0.18 seconds
#> Model estimated using adam() function: SARIMA(1,1,2)[1](1,1,2)[12]
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 534.8804
#> ARMA parameters of the model:
#> AR:
#>  phi1[1] phi1[12]
#>      0.1      0.1
#> MA:
#> theta1[1] theta2[1] theta1[12] theta2[12]
#>     -0.96      0.03      -0.12       0.03
#>
#> Sample size: 132
#> Number of estimated parameters: 27
#> Number of degrees of freedom: 105
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 1123.761 1138.299 1201.596 1237.091
#>
#> Forecast errors:
#> ME: 9.575; MAE: 17.082; RMSE: 19.148
#> sCE: 43.773%; Asymmetry: 61.9%; sMAE: 6.508%; sMSE: 0.532%
#> MASE: 0.709; RMSSE: 0.611; rMAE: 0.225; rRMSE: 0.186

Finally, the initials for the states can also be provided, although getting the correct ones might be a challenging task (you also need to know how many of them to provide; checking testModel$initial might help):

testModel <- adam(AirPassengers, "NNN", silent=FALSE, lags=c(1,12),
orders=list(ar=c(1,1),i=c(1,1),ma=c(2,0)), distribution="dnorm",
initial=list(arima=AirPassengers[1:24]),
h=12, holdout=TRUE)
testModel
#> Time elapsed: 0.28 seconds
#> Model estimated using adam() function: SARIMA(1,1,2)[1](1,1,0)[12]
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 492.2311
#> ARMA parameters of the model:
#> AR:
#>  phi1[1] phi1[12]
#>   0.1018   0.0950
#> MA:
#> theta1[1] theta2[1]
#>   -0.3302    0.0588
#>
#> Sample size: 132
#> Number of estimated parameters: 31
#> Number of degrees of freedom: 101
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 1046.462 1066.302 1135.829 1184.266
#>
#> Forecast errors:
#> ME: -21.3; MAE: 21.948; RMSE: 27.37
#> sCE: -97.376%; Asymmetry: -95%; sMAE: 8.361%; sMSE: 1.087%
#> MASE: 0.911; RMSSE: 0.874; rMAE: 0.289; rRMSE: 0.266

If you work with an ADAM ARIMA model, then there is no such thing as "usual" bounds for the parameters, so the function will use bounds="admissible", checking the AR / MA polynomials in order to make sure that the model is stationary and invertible (aka stable).

Similarly to ETS, you can use different distributions and losses for the estimation. Note that the order selection for ARIMA is done in the auto.adam() function, not in adam()! However, if you set orders=list(..., select=TRUE) in adam(), it will call auto.adam() and do the selection.
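A sketch of that in-function selection (the maximum orders here are chosen arbitrarily for illustration) would be:

# Select ARIMA orders up to ARIMA(3,2,3); this hands the job over to auto.adam()
testModel <- adam(BJsales, "NNN", silent=TRUE,
                  orders=list(ar=3, i=2, ma=3, select=TRUE),
                  h=12, holdout=TRUE)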

Finally, ARIMA is typically slower than ETS, mainly because its initial states are more difficult to estimate due to the increased complexity of the model. If you want to speed things up, use initial="backcasting" and reduce the number of iterations via the maxeval parameter.
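Putting these two tips together, a faster (if potentially less refined) SARIMA estimation could be sketched as follows (maxeval=500 is an arbitrary illustrative value):

# Backcast the initial states and cap the optimiser at 500 iterations
testModel <- adam(AirPassengers, "NNN", lags=c(1,12),
                  orders=list(ar=c(1,1), i=c(1,1), ma=c(2,2)),
                  initial="backcasting", maxeval=500,
                  h=12, holdout=TRUE)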

ADAM ETSX / ARIMAX / ETSX+ARIMA

Another important feature of ADAM is the introduction of explanatory variables. Unlike es(), adam() expects a matrix for data and can work with a formula. If the latter is not provided, then it will use all the explanatory variables. Here is a brief example:

BJData <- cbind(BJsales,BJsales.lead)
testModel <- adam(BJData, "AAN", h=18, silent=FALSE)

If you work with a data.frame or similar structures, then you can use them directly: ADAM will extract the response variable either by assuming that it is in the first column or from the provided formula (if you specify one via the formula parameter). Here is an example where we create a matrix with lags and leads of an explanatory variable:

BJData <- cbind(as.data.frame(BJsales),as.data.frame(xregExpander(BJsales.lead,c(-7:7))))
colnames(BJData)[1] <- "y"
testModel <- adam(BJData, "ANN", h=18, silent=FALSE, holdout=TRUE, formula=y~xLag1+xLag2+xLag3)
testModel
#> Time elapsed: 0.13 seconds
#> Model estimated using adam() function: ETSX(ANN)
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 211.0843
#> Persistence vector g (excluding xreg):
#> alpha
#>     1
#>
#> Sample size: 132
#> Number of estimated parameters: 6
#> Number of degrees of freedom: 126
#> Information criteria:
#>      AIC     AICc      BIC     BICc
#> 434.1687 434.8407 451.4655 453.1061
#>
#> Forecast errors:
#> ME: 0.758; MAE: 1.304; RMSE: 1.79
#> sCE: 6.042%; Asymmetry: 45.1%; sMAE: 0.577%; sMSE: 0.006%
#> MASE: 1.069; RMSSE: 1.145; rMAE: 0.582; rRMSE: 0.713
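If you do not know in advance which of the lags and leads are useful, the regressors parameter can help: with regressors="select" the function selects the explanatory variables based on information criteria. Here is a sketch, reusing the BJData matrix created above:

# Let adam() select the useful lags / leads of the explanatory variable
testModel <- adam(BJData, "ANN", h=18, silent=TRUE, holdout=TRUE,
                  regressors="select")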