
ASSIGNMENT -II

FORECASTING TECHNIQUES

Forecasting is the process of making statements about events whose actual outcomes (typically) have not yet been observed. A commonplace example might be estimation of the expected value for some variable of interest at some specified future date. Prediction is a similar, but more general, term. Both might refer to formal statistical methods employing time series, cross-sectional or longitudinal data, or alternatively to less formal judgmental methods.

METHODS:
TIME SERIES METHODS:
Time series methods use historical data as the basis of estimating future outcomes.

A rolling forecast is a projection into the future based on past performance, routinely updated on a regular schedule to incorporate new data.[4] Common time series methods include:

- Moving average
- Weighted moving average
- Exponential smoothing
- Autoregressive moving average (ARMA)
- Autoregressive integrated moving average (ARIMA), e.g. Box-Jenkins
- Extrapolation
- Linear prediction
- Trend estimation
- Growth curve

MOVING AVERAGE:
In statistics, a moving average, also called a rolling average, rolling mean or running average, is a type of finite impulse response filter used to analyze a set of data points by creating a series of averages of different subsets of the full data set. Given a series of numbers and a fixed subset size, the moving average is obtained by first taking the average of the first subset. The fixed subset size is then shifted forward, creating a new subset of numbers, which is averaged. This process is repeated over the entire data series. The plot line connecting all the (fixed-size) averages is the moving average. Thus, a moving average is not a single number but a set of numbers, each of which is the average of the corresponding subset of a larger set of data points. A moving average may also use unequal weights for each data value in the subset to emphasize particular values.

SIMPLE MOVING AVERAGE:
A simple moving average (SMA) is the unweighted mean of the previous n data points. For example, a 10-day simple moving average of closing prices is the mean of the previous 10 days' closing prices. If those prices are p_M, p_{M-1}, ..., p_{M-9}, then the formula is

    SMA = (p_M + p_{M-1} + ... + p_{M-9}) / 10

When calculating successive values, a new value comes into the sum and an old value drops out, so a full summation each time is unnecessary:

    SMA_new = SMA_prev + (p_new - p_old) / n

where p_new is the value entering the window and p_old is the value leaving it.
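As an illustration, here is a minimal Python sketch of an SMA using the incremental update above (the function name and example prices are our own, not from the source):

```python
def simple_moving_average(values, n):
    """Return the list of n-point simple moving averages of `values`,
    using the incremental update SMA_new = SMA_prev + (p_new - p_old) / n."""
    if len(values) < n:
        return []
    sma = sum(values[:n]) / n          # full summation for the first window only
    result = [sma]
    for i in range(n, len(values)):
        sma += (values[i] - values[i - n]) / n   # add newest, drop oldest
        result.append(sma)
    return result

# Example: 3-day SMA of closing prices
prices = [22, 24, 23, 26, 28, 27]
print(simple_moving_average(prices, 3))   # [23.0, 24.33..., 25.66..., 27.0]
```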

In technical analysis there are various popular values for n, such as 10, 40, or 200 days. The period selected depends on the kind of movement being studied: short, intermediate, or long term. In any case, moving average levels are interpreted as support in a rising market or resistance in a falling market.

CUMULATIVE MOVING AVERAGE:
The cumulative moving average is also frequently called a running average or a long running average, although the term running average is also used as a synonym for a moving average. The term cumulative moving average (or simply cumulative average) is used here since it is more descriptive and unambiguous. In some data acquisition systems, the data arrive in an ordered stream and the statistician would like the average of all of the data up to the current data point. For example, an investor may want the average price of all of the stock transactions for a particular stock up until the current time. As each new transaction occurs, the average price at the time of the transaction can be calculated for all of the transactions up to that point using the cumulative average. This is typically an unweighted average of the sequence of i values x_1, ..., x_i up to the current time:

    CA_i = (x_1 + x_2 + ... + x_i) / i

The brute force method would be to store all of the data, then recompute the sum and divide by the number of data points every time a new point arrived. However, it is possible to simply update the cumulative average as a new value x_{i+1} becomes available, using the formula

    CA_{i+1} = CA_i + (x_{i+1} - CA_i) / (i + 1)

where CA_0 can be taken to be 0.
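A minimal Python sketch of this streaming update (the class name RunningAverage is ours, for illustration):

```python
class RunningAverage:
    """Cumulative moving average updated in O(1) per new value,
    using CA_{i+1} = CA_i + (x_{i+1} - CA_i) / (i + 1)."""
    def __init__(self):
        self.count = 0
        self.average = 0.0   # CA_0 = 0

    def update(self, x):
        self.count += 1
        self.average += (x - self.average) / self.count
        return self.average

ra = RunningAverage()
for price in [10.0, 12.0, 11.0, 13.0]:
    print(ra.update(price))   # 10.0, 11.0, 11.0, 11.5
```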

AUTOREGRESSIVE INTEGRATED MOVING AVERAGE:
In statistics and econometrics, and in particular in time series analysis, an autoregressive integrated moving average (ARIMA) model is a generalization of an autoregressive moving average (ARMA) model. These models are fitted to time series data either to better understand the data or to predict future points in the series (forecasting). They are applied in some cases where the data show evidence of non-stationarity, where an initial differencing step (corresponding to the "integrated" part of the model) can be applied to remove the non-stationarity.

DEFINITION:
Given a time series of data X_t, where t is an integer index and the X_t are real numbers, an ARMA(p, q) model is given by

    (1 - φ_1 L - ... - φ_p L^p) X_t = (1 + θ_1 L + ... + θ_q L^q) ε_t

where L is the lag operator, the φ_i are the parameters of the autoregressive part of the model, the θ_i are the parameters of the moving average part, and the ε_t are error terms. The error terms are generally assumed to be independent, identically distributed variables sampled from a normal distribution with zero mean.

Assume now that the autoregressive polynomial (1 - φ_1 L - ... - φ_p L^p) has a unit root of multiplicity d. Then it can be rewritten as:

    (1 - φ_1 L - ... - φ_p L^p) = (1 - φ'_1 L - ... - φ'_{p-d} L^{p-d}) (1 - L)^d

An ARIMA(p, d, q) process expresses this polynomial factorisation property, and is given by:

    (1 - φ_1 L - ... - φ_p L^p) (1 - L)^d X_t = (1 + θ_1 L + ... + θ_q L^q) ε_t

and thus can be thought of as a particular case of an ARMA(p+d, q) process whose autoregressive polynomial has d unit roots. For this reason, no ARIMA model with d > 0 is wide-sense stationary.
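As a hedged illustration, fitting and forecasting with an ARIMA(p, d, q) model might look like the following Python sketch, assuming the statsmodels library is available (the data series here is invented for the example):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA  # assumes statsmodels is installed

# Invented example data: a random walk with drift (non-stationary, so d = 1 is natural)
rng = np.random.default_rng(0)
series = np.cumsum(0.5 + rng.normal(size=200))

# Fit an ARIMA(1, 1, 1): one AR term, one differencing step, one MA term
model = ARIMA(series, order=(1, 1, 1))
result = model.fit()

print(result.summary())          # estimated coefficients and diagnostics
print(result.forecast(steps=5))  # forecasts for the next 5 points
```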

LINEAR PREDICTION:
Linear prediction is a mathematical operation where future values of a discrete-time signal are estimated as a linear function of previous samples. In digital signal processing, linear prediction is often called linear predictive coding (LPC) and can thus be viewed as a subset of filter theory. In system analysis (a subfield of mathematics), linear prediction can be viewed as a part of mathematical modelling or optimization. The most common representation is

    x̂(n) = a_1 x(n-1) + a_2 x(n-2) + ... + a_p x(n-p)

where x̂(n) is the predicted signal value, the x(n-i) are the previous observed values, and the a_i are the predictor coefficients. The error generated by this estimate is

    e(n) = x(n) - x̂(n)

where x(n) is the true signal value. These equations are valid for all types of (one-dimensional) linear prediction. The differences are found in the way the parameters a_i are chosen. For multi-dimensional signals the error metric is often defined as

    e(n) = ||x(n) - x̂(n)||

where ||·|| is a suitably chosen vector norm.
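A minimal sketch of estimating the predictor coefficients a_i by least squares over an observed signal, in Python with NumPy (the function name and the test signal are our own illustration, not a prescribed method from the source):

```python
import numpy as np

def lp_coefficients(x, p):
    """Estimate linear prediction coefficients a_1..a_p by least squares,
    minimizing sum over n of (x(n) - sum_i a_i * x(n-i))^2."""
    # Each row holds the p previous samples [x(n-1), ..., x(n-p)] of one target x(n)
    rows = [x[n - p:n][::-1] for n in range(p, len(x))]
    A = np.array(rows)
    b = np.array(x[p:])
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a

# Example: a noisy AR(2) signal; the recovered coefficients approximate (1.5, -0.7)
rng = np.random.default_rng(1)
x = [0.0, 0.0]
for _ in range(500):
    x.append(1.5 * x[-1] - 0.7 * x[-2] + 0.1 * rng.normal())
print(lp_coefficients(x, 2))
```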

CAUSAL / ECONOMETRIC METHODS

Some forecasting methods use the assumption that it is possible to identify the underlying factors that might influence the variable that is being forecast. For example, sales of umbrellas might be associated with weather conditions. If the causes are understood, projections of the influencing variables can be made and used in the forecast.

- Regression analysis using linear regression or non-linear regression
- Econometrics
- Autoregressive moving average with exogenous inputs (ARMAX)

REGRESSION ANALYSIS:
In statistics, regression analysis includes any techniques for modeling and analyzing several variables when the focus is on the relationship between a dependent variable and one or more independent variables. More specifically, regression analysis helps us understand how the typical value of the dependent variable changes when any one of the independent variables is varied while the other independent variables are held fixed. Most commonly, regression analysis estimates the conditional expectation of the dependent variable given the independent variables, that is, the average value of the dependent variable when the independent variables are held fixed. Less commonly, the focus is on a quantile or other location parameter of the conditional distribution of the dependent variable given the independent variables. In all cases, the estimation target is a function of the independent variables called the regression function. In regression analysis, it is also of interest to characterize the variation of the dependent variable around the regression function, which can be described by a probability distribution. Classical assumptions for regression analysis include:

- The sample must be representative of the population for the inference prediction.
- The error is assumed to be a random variable with a mean of zero conditional on the explanatory variables.
- The variables are error-free. If this is not so, modeling may be done using errors-in-variables model techniques.
- The predictors must be linearly independent, i.e. it must not be possible to express any predictor as a linear combination of the others. See multicollinearity.
- The errors are uncorrelated, that is, the variance-covariance matrix of the errors is diagonal and each non-zero element is the variance of the error.
- The variance of the error is constant across observations (homoscedasticity). If not, weighted least squares or other methods might be used.

Regression models involve the following variables:


- The unknown parameters, denoted as β; this may be a scalar or a vector.
- The independent variables, X.
- The dependent variable, Y.

A regression model relates Y to a function of X and β.

The approximation is usually formalized as E(Y | X) = f(X, β). To carry out regression analysis, the form of the function f must be specified. Sometimes the form of this function is based on knowledge about the relationship between Y and X that does not rely on the data. If no such knowledge is available, a flexible or convenient form for f is chosen.

GENERAL LINEAR MODEL


In the more general multiple regression model, there are p independent variables:

    y_i = β_1 x_{i1} + β_2 x_{i2} + ... + β_p x_{ip} + ε_i

The least squares parameter estimates are obtained from p normal equations. The residual can be written as

    ε_i = y_i - β̂_1 x_{i1} - ... - β̂_p x_{ip}

The normal equations are

    Σ_{i=1}^{n} Σ_{k=1}^{p} x_{ij} x_{ik} β̂_k = Σ_{i=1}^{n} x_{ij} y_i,   for j = 1, ..., p

Note that in the normal equations depicted above there is no β_0: the model contains no separate intercept term (an intercept can be accommodated by making one column of X constant). In matrix notation, the normal equations for k responses (usually k = 1) are written as

    (X^T X) B̂ = X^T Y

with the generalized inverse (^-) solution

    B̂ = (X^T X)^- X^T Y

where B̂ is p × k, (X^T X)^- is p × p, X^T is p × n, and Y is n × k.
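A short Python/NumPy sketch of solving the normal equations for a multiple regression (data and names are invented for illustration; the Moore-Penrose pseudo-inverse stands in for the generalized inverse):

```python
import numpy as np

# Invented data: n = 100 observations of p = 3 predictors (no intercept column)
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=100)

# Normal equations: (X^T X) beta_hat = X^T y
# np.linalg.pinv gives a generalized (Moore-Penrose) inverse, as in the text
beta_hat = np.linalg.pinv(X.T @ X) @ X.T @ y
print(beta_hat)   # close to [2.0, -1.0, 0.5]
```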

AUTOREGRESSIVE MOVING AVERAGE MODEL:
In statistics and signal processing, autoregressive moving average (ARMA) models, sometimes called Box-Jenkins models after the iterative Box-Jenkins methodology usually used to estimate them, are typically applied to autocorrelated time series data.

Given a time series of data X_t, the ARMA model is a tool for understanding and, perhaps, predicting future values in this series. The model consists of two parts, an autoregressive (AR) part and a moving average (MA) part. The model is usually referred to as the ARMA(p, q) model, where p is the order of the autoregressive part and q is the order of the moving average part. The notation ARMA(p, q) refers to the model with p autoregressive terms and q moving average terms; this model contains the AR(p) and MA(q) models:

    X_t = c + ε_t + φ_1 X_{t-1} + ... + φ_p X_{t-p} + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}
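To make the two parts concrete, here is a small Python sketch simulating an ARMA(1, 1) process directly from the recurrence above (all parameter values are invented for illustration):

```python
import numpy as np

def simulate_arma11(n, c=0.0, phi=0.6, theta=0.3, sigma=1.0, seed=3):
    """Simulate X_t = c + eps_t + phi * X_{t-1} + theta * eps_{t-1}."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(scale=sigma, size=n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = c + eps[t] + phi * x[t - 1] + theta * eps[t - 1]
    return x

series = simulate_arma11(500)
# Lag-1 autocorrelation should be noticeably positive for phi = 0.6
print(np.corrcoef(series[:-1], series[1:])[0, 1])
```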

ALTERNATIVE NOTATION

Some authors, including Box, Jenkins & Reinsel (1994), use a different convention for the autoregression coefficients. This allows all the polynomials involving the lag operator to appear in a similar form throughout. Thus the ARMA model would be written as

    (1 + φ_1 L + ... + φ_p L^p) X_t = (1 + θ_1 L + ... + θ_q L^q) ε_t

with the signs of the autoregressive coefficients flipped relative to the convention used above.

APPLICATIONS

ARMA is appropriate when a system is a function of a series of unobserved shocks (the MA part) as well as its own behavior (the AR part). For example, stock prices may be shocked by fundamental information as well as exhibiting technical trending and mean-reversion effects due to market participants. When looking at long-term data, econometricians tend to opt for an AR(p) model for simplicity.

JUDGMENTAL METHODS:
Judgmental forecasting methods incorporate intuitive judgements, opinions and subjective probability estimates.

- Composite forecasts
- Surveys
- Delphi method
- Scenario building
- Technology forecasting
- Forecast by analogy

DELPHI METHOD:
The Delphi method is a systematic, interactive forecasting method which relies on a panel of experts. The experts answer questionnaires in two or more rounds. After each round, a facilitator provides an anonymous summary of the experts' forecasts from the previous round, as well as the reasons they provided for their judgments. Experts are thus encouraged to revise their earlier answers in light of the replies of other members of their panel. It is believed that during this process the range of the answers will decrease and the group will converge towards the "correct" answer. Finally, the process is stopped after a pre-defined stop criterion (e.g. number of rounds, achievement of consensus, stability of results), and the mean or median scores of the final rounds determine the results.[1]

USE IN FORECASTING:
The first applications of the Delphi method were in the field of science and technology forecasting. The objective of the method was to combine expert opinions on the likelihood and expected development time of a particular technology into a single indicator. One of the first such reports, prepared in 1964 by Gordon and Helmer, assessed the direction of long-term trends in science and technology development, covering such topics as scientific breakthroughs, population control, automation, space progress, war prevention and weapon systems. Other technology forecasts dealt with vehicle-highway systems, industrial robots, intelligent internet, broadband connections, and technology in education.

Later the Delphi method was applied in other areas, especially those related to public policy issues, such as economic trends, health and education. It was also applied successfully and with high accuracy in business forecasting. For example, in one case reported by Basu and Schroeder (1977), the Delphi method predicted the sales of a new product during the first two years with an inaccuracy of 3-4% compared with actual sales. Quantitative methods produced errors of 10-15%, and traditional unstructured forecast methods had errors of about 20%. The Delphi method has also been used as a tool to implement multi-stakeholder approaches for participative policy-making in developing countries. The governments of Latin America and the Caribbean have successfully used the Delphi method as an open-ended public-private sector approach to identify the most urgent challenges for their regional ICT-for-development eLAC Action Plans.[6] As a result, governments have widely acknowledged the value of collective intelligence from the civil society, academic and private sector participants of the Delphi, especially in a field of rapid change such as technology policies. In this sense, the Delphi method can contribute to a general appreciation of participative policy-making.

SCENARIO ANALYSIS:
Scenario analysis is a process of analyzing possible future events by considering alternative possible outcomes (scenarios). The analysis is designed to allow improved decision-making through consideration of outcomes and their implications. Scenario analysis can also be used to illuminate "wild cards." For example, analysis of the possibility of the earth being struck by a large celestial object (a meteor) suggests that whilst the probability is low, the damage inflicted is so high that the event is much more important (threatening) than the low probability (in any one year) alone would suggest. However, this possibility is usually disregarded by organizations using scenario analysis to develop a strategic plan, since it has such overarching repercussions.
FINANCIAL APPLICATIONS

For example, in economics and finance, a financial institution might attempt to forecast several possible scenarios for the economy (e.g. rapid growth, moderate growth, slow growth), and it might also attempt to forecast financial market returns (for bonds, stocks and cash) in each of those scenarios. It might consider sub-sets of each of the possibilities. It might further seek to determine correlations and assign probabilities to the scenarios (and sub-sets, if any). Then it will be in a position to consider how to distribute assets between asset types (i.e. asset allocation); the institution can also calculate the scenario-weighted expected return, a figure which indicates the overall attractiveness of the financial environment. It may also perform stress testing, using adverse scenarios.
GEO-POLITICAL APPLICATIONS

In politics or geo-politics, scenario analysis involves modelling the possible alternative paths of a social or political environment and possibly diplomatic and war risks. For example, in the recent Iraq War, the Pentagon certainly had to model alternative possibilities that might arise in the war situation and had to position materiel and troops accordingly.

ARTIFICIAL INTELLIGENCE METHODS:


- Artificial neural networks
- Support vector machines

ARTIFICIAL NEURAL NETWORK:
An artificial neural network (ANN), usually called a "neural network" (NN), is a mathematical model or computational model that is inspired by the structure and/or functional aspects of biological neural networks. It consists of an interconnected group of artificial neurons and processes information using a connectionist approach to computation. In most cases an ANN is an adaptive system that changes its structure based on external or internal information that flows through the network during the learning phase. Modern neural networks are non-linear statistical data modeling tools. They are usually used to model complex relationships between inputs and outputs or to find patterns in data.
MODELS

Neural network models in artificial intelligence are usually referred to as artificial neural networks (ANNs); these are essentially simple mathematical models defining a function f : X → Y or a distribution over X or over both X and Y, but sometimes the models are also intimately associated with a particular learning algorithm or learning rule. A common use of the phrase "ANN model" really means the definition of a class of such functions (where members of the class are obtained by varying parameters, connection weights, or specifics of the architecture such as the number of neurons or their connectivity).

REAL LIFE APPLICATIONS:

The tasks to which artificial neural networks are applied tend to fall within the following broad categories:

- Function approximation, or regression analysis, including time series prediction, fitness approximation and modeling (see the sketch below).
- Classification, including pattern and sequence recognition, novelty detection and sequential decision making.
- Data processing, including filtering, clustering, blind source separation and compression.
- Robotics, including directing manipulators and computer numerical control.
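As an illustration of the function-approximation use just listed, here is a minimal one-hidden-layer network trained by gradient descent in Python/NumPy (architecture, data and step size are all invented for this sketch, not taken from the source):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy regression task: learn y = sin(x) on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# One hidden layer of 16 tanh units, linear output
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.01

for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                              # gradient of 0.5 * MSE w.r.t. pred
    # Backward pass (chain rule through the linear and tanh layers)
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    # Gradient descent update of the connection weights
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float(np.mean(err**2)))
```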

OTHER METHODS

- Simulation
- Prediction market
- Probabilistic forecasting and ensemble forecasting
- Reference class forecasting

SIMULATION:
Simulation is the imitation of some real thing, state of affairs, or process. The act of simulating something generally entails representing certain key characteristics or behaviours of a selected physical or abstract system. Simulation is used in many contexts, including the modeling of natural or human systems in order to gain insight into their functioning.[1] Other contexts include simulation of technology for performance optimization, safety engineering, testing, training and education. Simulation can be used to show the eventual real effects of alternative conditions and courses of action. Simulation is also used when the real system cannot be engaged: it may not be accessible, it may be dangerous or unacceptable to engage, or it may simply not exist.
FLIGHT SIMULATION:

Flight Simulation Training Devices (FSTDs) are used to train pilots on the ground. In comparison to training in an actual aircraft, simulation-based training allows for the training of maneuvers or situations that may be impractical (or even dangerous) to perform in the aircraft, while keeping the pilot and instructor in a relatively low-risk environment on the ground. For example, electrical system failures, instrument failures, hydraulic system failures, and even flight control failures can be simulated without risk to the pilots or an aircraft. Instructors can also provide students with a higher concentration of training tasks in a given period of time than is usually possible in the aircraft. For example, conducting multiple instrument approaches in the actual aircraft may require significant time spent repositioning the aircraft, while in a simulation, as soon as one approach has been completed, the instructor can immediately pre-position the simulated aircraft to an ideal (or less than ideal) location from which to begin the next approach.

PREDICTION MARKET:
Prediction markets (also known as predictive markets, information markets, decision markets, idea futures, event derivatives, or virtual markets) are speculative markets created for the purpose of making predictions. Assets are created whose final cash value is tied to a particular event (e.g., will the next US president be a Republican?) or parameter (e.g., total sales next quarter). The current market prices can then be interpreted as predictions of the probability of the event or the expected value of the parameter. Prediction markets are thus structured as betting exchanges, without any risk for the bookmaker. People who buy low and sell high are rewarded for improving the market prediction, while those who buy high and sell low are punished for degrading the market prediction. Evidence so far suggests that prediction markets are at least as accurate as other institutions predicting the same events with a similar pool of participants.

PROBABILISTIC FORECASTING:
Probabilistic forecasting is a technique for weather forecasting which relies on different methods to establish the probability of an event's occurrence or magnitude. This differs substantially from giving definite information on the occurrence or magnitude of the same event, the technique used in deterministic forecasting. Both techniques try to predict events, but information on the uncertainty of the prediction is present only in the probabilistic forecast. The probability information is typically derived by using several numerical model runs with slightly varying initial conditions; this technique is usually referred to as ensemble forecasting, carried out by an Ensemble Prediction System (EPS). An EPS does not produce a full forecast probability distribution over all possible events, and it is possible to use purely statistical or hybrid statistical/numerical methods to do this.[1] For example, temperature can take on a theoretically infinite number of possible values (events) from zero to infinity; a statistical method would produce a distribution assigning a probability value to every possible temperature, with implausibly high or low temperatures receiving close-to-zero probability values.
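A toy Python sketch of the ensemble idea: run a simple (invented) stand-in for a numerical model many times from slightly perturbed initial conditions and read event probabilities off the ensemble (everything here is illustrative, not an actual EPS):

```python
import numpy as np

rng = np.random.default_rng(5)

def toy_model(temp0, days=3):
    """Invented stand-in for a numerical weather model: daily temperature
    evolves with a small drift plus random day-to-day variation."""
    temp = temp0
    for _ in range(days):
        temp += -0.2 + rng.normal(scale=1.5)
    return temp

# Ensemble: perturb the (uncertain) initial temperature measurement of 20 C
members = [toy_model(20.0 + rng.normal(scale=0.5)) for _ in range(1000)]

# Probabilistic forecast: probability that the 3-day temperature exceeds 22 C
prob = np.mean([m > 22.0 for m in members])
print(f"P(T > 22 C after 3 days) = {prob:.2f}")
```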

EXAMPLES:
Canada was one of the first countries to broadcast probabilistic forecasts, by giving the chance of precipitation as a percentage. As an example of fully probabilistic forecasts, distribution forecasts of rainfall amounts produced by purely statistical methods have recently been developed whose performance is competitive with hybrid EPS/statistical forecasts of daily rainfall amounts.[5]
