This demo illustrates how the JMSL Library can be used to build an application that uses a feed-forward neural network to forecast a time series from historical data. The time series available in this demo are champagne sales per month, sun spot frequency per year, and the price of crude oil, but the same analysis could apply to many other kinds of cyclical time series, such as hurricane frequency, El Niño intensity, or securities prices. The forecast is compared to the observed values and to an ARMA(2,1) model.
The neural network forecasting technique can generate forecasts for data with very challenging and complex characteristics, including noisy data, seasonal data, short time series, categorical data, and numerous possible, but unknown, interactions.
The neural network incorporates several features. For inputs, the historical data are lagged 12 times to pick up the yearly cycle in the Sun Spots and Champagne Sales data. There is a single hidden layer consisting of 8 perceptrons, each using a logistic activation function. Finally, there is a single output node using a linear activation function. The network is trained using an EpochTrainer, a two-stage trainer that uses a QuasiNewtonTrainer for each stage, with 30 epochs.
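The demo itself loads a previously trained network (see below), but as a rough sketch, a network with this architecture could be assembled from the JMSL neural classes along the following lines. This is illustrative only and assumes the layer-construction and activation calls of the com.imsl.datamining.neural package; consult the API documentation for the exact signatures.

import com.imsl.datamining.neural.*;

// 12 lagged inputs, one hidden layer of 8 perceptrons, and a single output node.
FeedForwardNetwork network = new FeedForwardNetwork();
network.getInputLayer().createInputs(12);
HiddenLayer hiddenLayer = network.createHiddenLayer();
hiddenLayer.createPerceptrons(8);
network.getOutputLayer().createPerceptrons(1);
network.linkAll(); // fully connect each layer to the next

// Logistic activation in the hidden layer, linear activation at the output node.
for (Perceptron p : hiddenLayer.getPerceptrons()) {
    p.setActivation(Activation.LOGISTIC);
}
for (Perceptron p : network.getOutputLayer().getPerceptrons()) {
    p.setActivation(Activation.LINEAR);
}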
Use the JCheckBoxes to select which additional lines to overlay: an ARMA forecast, a neural net forecast, and the actual future data values.
Different data sets can be selected from the Data menu.
com.imsl.stat.ARMA - this class is used to predict future values of the time series.
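A minimal sketch of how an ARMA(2,1) forecast might be produced with this class, assuming a historical series z and a forecast horizon nForecast (both names are placeholders); check the ARMA class documentation for the exact layout of the returned matrix:

import com.imsl.stat.ARMA;

// Fit an ARMA(2,1) model to the historical series z, then forecast
// nForecast steps past the end of the series.
ARMA arma = new ARMA(2, 1, z);
arma.compute();
double[][] forecast = arma.forecast(nForecast);
// forecast[i][0] is the i+1 step-ahead forecast from the end of the series.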
com.imsl.datamining.neural.FeedForwardNetwork - this is the main class used to build, train, and forecast with a feed-forward neural network. The setWeights() method is used to load weights from a previously trained network. Since training for these data sets takes over a minute, training was done in advance and the results were stored in data files on disk.
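For example, stored weights might be reloaded along these lines; the file name and storage format shown here are illustrative only, not the demo's actual format:

import java.io.FileInputStream;
import java.io.IOException;
import java.io.ObjectInputStream;

// Restore weights saved by an earlier training run and install them in the
// network built above. The weights are assumed to be a serialized double[].
try (ObjectInputStream in = new ObjectInputStream(
        new FileInputStream("champagneWeights.ser"))) {  // hypothetical file name
    double[] weights = (double[]) in.readObject();
    network.setWeights(weights);
} catch (IOException | ClassNotFoundException e) {
    e.printStackTrace();
}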
com.imsl.datamining.neural.Perceptron - Perceptrons are the nodes of the network that make up the input layer, each hidden layer, and the output layer. A perceptron's activation function is configured using the setActivation method.
com.imsl.datamining.neural.QuasiNewtonTrainer and com.imsl.datamining.neural.EpochTrainer - although the training code has been commented out in the demo, these classes are used to train the network.
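A hedged sketch of what that training code might look like, assuming the two-stage EpochTrainer constructor and the train() method of the Trainer interface; network is the FeedForwardNetwork built earlier, and xData and yData are the lagged inputs and corresponding targets described later in this section:

import com.imsl.datamining.neural.EpochTrainer;
import com.imsl.datamining.neural.QuasiNewtonTrainer;

// Two-stage epoch training: a QuasiNewtonTrainer for each stage, 30 epochs.
QuasiNewtonTrainer stage1 = new QuasiNewtonTrainer();
QuasiNewtonTrainer stage2 = new QuasiNewtonTrainer();
EpochTrainer trainer = new EpochTrainer(stage1, stage2);
trainer.setNumberOfEpochs(30);
trainer.train(network, xData, yData);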
The chart is a standard line chart with symbols. Each of the four lines that can be shown on the graph is a basic Data object; the lines are differentiated by color, line style, and marker type.
The code for each data series is shown below:
// Historical observations: blue line with hollow circle markers.
data = new Data(axis, xsub, ysub);
data.setTitle("Historical data ");
data.setDataType(Data.DATA_TYPE_LINE | Data.DATA_TYPE_MARKER);
data.setMarkerType(Data.MARKER_TYPE_HOLLOW_CIRCLE);
data.setMarkerColor(Color.BLUE);
data.setLineColor(Color.BLUE);

// Actual future values: dotted blue line.
realData = new Data(axis, x2, y2);
realData.setTitle("Actual outcome ");
realData.setDataType(Data.DATA_TYPE_LINE);
realData.setLineColor(Color.BLUE);
realData.setLineDashPattern(Data.DASH_PATTERN_DOT);

// ARMA forecast: solid red line; the first actual-outcome point is
// prepended so the forecast joins the observed series.
forecastData = new Data(axis, prepend(x2[0], x1), prepend(y2[0], tmp));
forecastData.setTitle("ARMA Forecast");
forecastData.setDataType(Data.DATA_TYPE_LINE);
forecastData.setLineColor(Color.RED);

// Neural network forecast: solid magenta line, also joined to the observed series.
nnData = new Data(axis, prepend(x2[0], x1), prepend(y2[0], sol));
nnData.setTitle("FFNN Forecast");
nnData.setDataType(Data.DATA_TYPE_LINE);
nnData.setLineColor(Color.MAGENTA);
This example forecasts a single time series using a feed-forward neural network. To train the network, however, a series of 12 inputs and the following data point as the output are actually used. This input series is created by a process known as lagging. For multidimensional data, the TimeSeriesFilter class can be used for this construction, but for one-dimensional data, as in this case, a separate method was created:
// Returns a matrix whose row i contains the n consecutive values of a
// starting at a[i]; row i is then used to predict a[i+n].
public double[][] lagData(double[] a, int n) {
    int len = a.length;
    double[][] b = new double[len - n][n];
    for (int i = 0; i < len - n; i++) {
        for (int j = 0; j < n; j++) {
            b[i][j] = a[i + j];
        }
    }
    return b;
}
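As a usage sketch (the variable names here are hypothetical, not taken from the demo source), the lagged matrix can be paired with targets taken one step past each 12-point window:

// Build training pairs from a raw series: each row of xData holds 12
// consecutive observations, and yData holds the observation that follows.
double[][] xData = lagData(timeSeries, 12);
double[][] yData = new double[xData.length][1];
for (int i = 0; i < yData.length; i++) {
    yData[i][0] = timeSeries[i + 12];
}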
ForecastNet.java | This is the main class; it extends JFrameChart. The chart and GUI elements are all initialized in the constructor.
Two alternatives are available to run this demo:
1) Use the source code in your development environment as you would any other Java code. More information is available in the How To.
2) An executable jar file containing all of the demos referenced in this guide is included in the jmsl/lib directory. On Windows, you may double-click the file to run it if files with a ".jar" extension are registered with javaw.exe. Alternatively, for both Windows and UNIX environments, the jar file may be executed from the command line using java -jar gallery.jar.
A list of buttons, one for each demo, is created. Demos can be filtered by area (Math, Stat, Finance, Charting) by choosing the appropriate selection in the JComboBox. To run the Additional Demos, select Quick Start in the JComboBox.