This post demonstrates how to predict the stock market using a recurrent neural network (RNN), specifically a long short-term memory (LSTM) network. The implementation is in TensorFlow.
Introduction
Financial time series are time-stamped sequential data that traditional feed-forward neural networks do not handle well. Recurrent neural networks (RNNs) address this by feeding output neurons back into the input, providing a memory of previous states. This turned out to be a huge success, especially in natural language processing. Later on, long short-term memory (LSTM) and gated recurrent unit (GRU) cells were designed to alleviate the so-called vanishing/exploding gradient problems in the back-propagation phase of RNNs.
In this post, I will build an RNN model with an LSTM or GRU cell to predict the prices of the S&P 500. To be specific, a supervised machine learning model will be calibrated to predict tomorrow's S&P 500 close price from the prices of the previous 19 business days (a sliding window of 20 days in total). This is a regression problem; yet the code can easily be adapted to a classification problem such as simply predicting tomorrow's market direction. The code is located on GitHub.
Data Processing
The index data is downloaded from Yahoo Finance. It contains daily Open, High, Low, Close, and Volume information since 2006. I use three of them, namely High, Low, and Close, believing that the Average True Range contains relevant information about current market conditions. The data is first grouped into sliding windows of 20 business days and then split 90%/10% into train/test sets.
An important step in working with stock price data is normalization. Some people use sklearn.preprocessing.MinMaxScaler. To me, a natural choice is to divide by yesterday's close, which gives the (intraday high/low) returns quoted in newspapers.
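As a rough sketch of this normalization (the spx DataFrame name and its Yahoo Finance column names are assumptions on my part), dividing each day's High, Low, and Close by the previous day's Close yields the return-like ratios used below:

import pandas as pd

def normalize_by_prev_close(spx: pd.DataFrame) -> pd.DataFrame:
    # spx: daily OHLCV frame from Yahoo Finance with 'High', 'Low', 'Close' columns (assumed)
    prev_close = spx['Close'].shift(1)                              # yesterday's close
    ratios = spx[['High', 'Low', 'Close']].div(prev_close, axis=0)  # today's prices / yesterday's close
    return ratios.dropna()                                          # first row has no previous close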
n_window_size = 20  # 20 business days; use first 19 to predict the 20th
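Building on that constant, a minimal sketch of the sliding-window construction and the 90%/10% split might look like this (the function and array names here are illustrative, not the exact ones in the repository):

import numpy as np

def make_windows(ratios, n_window_size=20, train_frac=0.9):
    # ratios: normalized (High, Low, Close) values from the previous step (assumed)
    values = ratios.values                                   # shape (n_days, 3)
    windows = np.array([values[i:i + n_window_size]
                        for i in range(len(values) - n_window_size + 1)])
    X = windows[:, :-1, :]                                   # first 19 days of each window
    y = windows[:, -1, 2:3]                                  # the 20th day's Close ratio
    n_train = int(train_frac * len(X))                       # 90% train, 10% test
    return X[:n_train], y[:n_train], X[n_train:], y[n_train:]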
To feed batches into TensorFlow, we need a helper function called get_next_batch, which I adapted almost verbatim from [3].
# generate next batch: randomly shuffle the training set, then draw batch_size samples without replacement
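The helper is roughly along the following lines (a sketch in the spirit of [3], assuming X_train and y_train are the NumPy arrays produced above):

import numpy as np

perm_array = np.arange(len(X_train))    # permutation over training samples
np.random.shuffle(perm_array)
index_in_epoch = 0

def get_next_batch(batch_size):
    # draw batch_size training samples without replacement; reshuffle when an epoch ends
    global index_in_epoch
    start = index_in_epoch
    index_in_epoch += batch_size
    if index_in_epoch > len(X_train):
        np.random.shuffle(perm_array)   # new epoch: reshuffle the indices
        start = 0
        index_in_epoch = batch_size
    end = index_in_epoch
    return X_train[perm_array[start:end]], y_train[perm_array[start:end]]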
Building Graph
I prefer using TensorFlow directly to gain full control over the graph. An alternative is to use Keras (https://en.wikipedia.org/wiki/Keras). Since we use 19 business days to predict the next business day, the number of time steps is 19. The input size is three because every day we use the High, Low, and Close prices. TensorFlow already provides layer structures for basic RNN cells, as below.
n_steps = n_window_size - 1  # 20 business days, X has 19 days
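Filling in the rest of the graph, a sketch with a basic LSTM cell in TensorFlow 1.x style could look like the following (the hidden-layer size and learning rate are illustrative assumptions; swap in tf.nn.rnn_cell.GRUCell for a GRU):

import tensorflow as tf                                     # TensorFlow 1.x API

n_steps = n_window_size - 1                                 # 19 input days
n_inputs = 3                                                # High, Low, Close
n_neurons = 200                                             # hidden units per cell (assumed)
n_outputs = 1                                               # next day's normalized close

X = tf.placeholder(tf.float32, [None, n_steps, n_inputs])
y = tf.placeholder(tf.float32, [None, n_outputs])

cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=n_neurons)    # LSTM cell; a GRUCell also works
rnn_outputs, states = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)
last_output = rnn_outputs[:, -1, :]                         # output at the final time step
outputs = tf.layers.dense(last_output, n_outputs)           # linear read-out layer

loss = tf.reduce_mean(tf.square(outputs - y))               # MSE loss for regression
training_op = tf.train.AdamOptimizer(learning_rate=0.001).minimize(loss)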
Running Graph
To run the graph, start a TensorFlow session and feed in the batches. The trained model is also saved for later use.
with tf.Session() as sess:
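Filling in that block, the training loop could run roughly as follows (the iteration count, batch size, and checkpoint path are illustrative assumptions):

n_iterations = 10000
batch_size = 50
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for iteration in range(n_iterations):
        X_batch, y_batch = get_next_batch(batch_size)
        sess.run(training_op, feed_dict={X: X_batch, y: y_batch})
        if iteration % 1000 == 0:
            mse = loss.eval(feed_dict={X: X_batch, y: y_batch})
            print(iteration, 'train MSE:', mse)
    y_test_pred = sess.run(outputs, feed_dict={X: X_test})   # predicted normalized closes
    saver.save(sess, './rnn_spx_model')                      # checkpoint for later use (path assumed)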
Since we are predicting next-day prices, the predicted returns need to be converted back to prices and compared with the actual closes.
y_train_actual = spx['Close'].iloc[n_window_size:n_train_set_size+n_window_size]  # (2736,)
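Concretely, since each target is tomorrow's close divided by today's close, multiplying a predicted ratio by today's actual close recovers a price forecast. A sketch for the test set (the slice offsets mirror the training slice above and are an assumption):

test_start = n_train_set_size + n_window_size
n_test = len(y_test_pred)
y_test_actual = spx['Close'].iloc[test_start:test_start + n_test].values             # actual next-day closes
prev_close_test = spx['Close'].shift(1).iloc[test_start:test_start + n_test].values  # today's closes
y_test_price_pred = y_test_pred.ravel() * prev_close_test                            # predicted next-day closes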
By now the RNN model is trained and ready to predict. The next step would be to feed this forecasting signal into a backtest. Stay tuned.
Reference
[1] Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, Inc., 2017.
[2] Lilian Weng, Predict Stock Prices Using RNN.
[3] Raoul Malm, NY Stock Price Prediction RNN LSTM GRU.
DISCLAIMER: This post is for the purpose of research and backtest only. The author doesn't promise any future profits and doesn't take responsibility for any trading losses.