# AR(p), MA(q), and ARMA(p,q) Processes

**AR(p):** An autoregressive process of order $p$ can be defined in the following way:

$$X_t = \sum_{j=1}^{p} \phi_j X_{t-j} + \varepsilon_t$$

For autoregressive models, it is assumed that the dependent value $X$ at time $t$ can be predicted using previous values at specific lags, where $p$ refers to the number of lags; in that sense the process is typically used to model a time series that exhibits longer-term dependence on past data. As an example, the sales increase at Christmas is predicted from the increase in sales at previous Christmases (a 12-month lag) rather than from last month or the current quarter (Q4 of that specific year). AR can be fitted by inspecting the partial ACF (PACF) plot and taking $p$ as the maximum lag after which the partial autocorrelation values become statistically indistinguishable from zero at a chosen significance level.

**MA(q):** A moving average process of order $q$ can be defined in the following way:

$$X_t = \varepsilon_t + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j}$$

A moving average process predicts a point in a time series from a linear combination of recent noise terms, with an emphasis on representing short trends immediately preceding the point whose value we want to predict. As an example, MA can model market behaviour like "mean reversion", where the assumption is that a stock's price tends to move back toward its average price over time. MA can be fitted by looking at where the ACF drops to zero: the sample autocorrelations should be significant up to lag $q$ and statistically insignificant for all lags beyond $q$, so $q$ is determined as the last lag with a significant ACF value.

**ARMA(p,q):** An autoregressive moving-average model of orders $p$ and $q$ can be defined in the following way:

$$X_t = \sum_{j=1}^{p} \phi_j X_{t-j} + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j} + \varepsilon_t$$

ARMA tries to capture both short-term dynamics and longer-term cycles by combining an autoregressive process and a moving average process to predict the value at time $t$.
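The identification rules above (PACF cuts off at lag $p$ for an AR process, ACF cuts off at lag $q$ for an MA process) can be checked numerically. The sketch below simulates a hypothetical AR(2) and a hypothetical MA(2) with made-up coefficients and computes the sample ACF and PACF with plain numpy (the PACF via the Durbin-Levinson recursion); it is an illustration, not a fitting routine.

```python
import numpy as np

rng = np.random.default_rng(0)

def acf(x, nlags):
    """Sample autocorrelation for lags 0..nlags."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

def pacf(x, nlags):
    """Partial autocorrelation via the Durbin-Levinson recursion."""
    r = acf(x, nlags)
    vals = [1.0]
    phi_prev = np.array([])          # phi_{k-1,1..k-1}
    for k in range(1, nlags + 1):
        if k == 1:
            phi_k = np.array([r[1]])
        else:
            num = r[k] - phi_prev @ r[1:k][::-1]
            den = 1.0 - phi_prev @ r[1:k]
            phi_kk = num / den
            phi_k = np.append(phi_prev - phi_kk * phi_prev[::-1], phi_kk)
        vals.append(phi_k[-1])       # PACF at lag k is phi_{k,k}
        phi_prev = phi_k
    return np.array(vals)

n = 20000
eps = rng.standard_normal(n)

# Hypothetical AR(2): X_t = 0.6 X_{t-1} + 0.3 X_{t-2} + eps_t
x_ar = np.zeros(n)
for t in range(2, n):
    x_ar[t] = 0.6 * x_ar[t - 1] + 0.3 * x_ar[t - 2] + eps[t]

# Hypothetical MA(2): X_t = eps_t + 0.7 eps_{t-1} + 0.4 eps_{t-2}
x_ma = eps.copy()
x_ma[1:] += 0.7 * eps[:-1]
x_ma[2:] += 0.4 * eps[:-2]

band = 2 / np.sqrt(n)  # approximate 95% significance bound
print("AR(2) PACF:", np.round(pacf(x_ar, 5), 3))  # cuts off after lag 2
print("MA(2) ACF :", np.round(acf(x_ma, 5), 3))   # cuts off after lag 2
```

For the AR(2) series the PACF at lag 2 estimates $\phi_2 = 0.3$ and the values at lags 3+ fall inside the significance band; for the MA(2) series the ACF is significant at lags 1 and 2 only. This is exactly the cutoff pattern used to read $p$ and $q$ off the plots.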
An example of ARMA is when stock prices are affected by fundamental information while also, for instance, tending to move back toward their average price over time due to the behaviour of market participants. For ARMA, fitting starts with checking stationarity; if the series is not stationary, it should be stationarized iteratively (e.g. by differencing). Then we compute both the ACF and the PACF: in the ACF we look for the lag beyond which the values settle inside the significance bounds, and in the PACF we look for the lag after which all remaining correlation is explained. In this way we find the orders $p$ and $q$ to fit the model.

(b) An MA(q) process is said to be invertible to an AR($\infty$) process under the condition that the roots of its moving average polynomial lie outside the unit circle, which for an MA(1) reduces to $|\theta| < 1$: we need the coefficients of the equivalent AR process to decay to zero as we move back in time. In contrast, when $|\theta| > 1$, the effect of past observations grows with their distance in time. An AR(p) process is invertible by definition; therefore, an ARMA(p,q) process is invertible if its MA(q) part is invertible.
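The invertibility claim can be made concrete for the MA(1) case. Inverting $X_t = (1 + \theta B)\varepsilon_t$ gives the AR($\infty$) representation $X_t = \sum_{j\ge 1} \pi_j X_{t-j} + \varepsilon_t$ with $\pi_j = -(-\theta)^j$, so the AR weights decay geometrically exactly when $|\theta| < 1$. The sketch below uses a hypothetical $\theta = 0.5$ to show the decay and to recover the innovations from a truncated AR($\infty$) sum.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.5   # hypothetical, invertible: |theta| < 1
n = 5000

# Simulate MA(1): X_t = eps_t + theta * eps_{t-1}  (eps_{-1} taken as 0)
eps = rng.standard_normal(n)
x = eps.copy()
x[1:] += theta * eps[:-1]

# AR(infinity) weights pi_j = -(-theta)^j, truncated at lag J
J = 30
pi = np.array([-(-theta) ** j for j in range(1, J + 1)])
print("pi_1..pi_5:", np.round(pi[:5], 4))  # geometric decay for |theta| < 1

# With |theta| > 1 the same weights explode instead of decaying:
pi_bad = np.array([-(-2.0) ** j for j in range(1, 6)])
print("non-invertible weights:", pi_bad)

# Recover the innovations: eps_t ~= X_t - sum_{j=1}^{J} pi_j X_{t-j}
eps_hat = np.array([x[t] - pi @ x[t - J:t][::-1] for t in range(J, n)])
err = np.max(np.abs(eps_hat - eps[J:]))
print("max recovery error:", err)
```

Because $|\theta| < 1$, the truncation error is on the order of $\theta^{J+1}$, so the recovered innovations match the true ones almost exactly; with $|\theta| > 1$ the weights grow without bound and the truncated sum is meaningless, which is the practical content of the invertibility condition.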