Importing Dependencies

from IPython.core.debugger import set_trace

import pandas as pd
import numpy as np
import os
import matplotlib.pyplot as plt
import datetime

plt.style.use(style="seaborn")
%matplotlib inline

Importing the Stock Price Dataset

df = pd.read_csv("MSFT-1Y-Hourly.csv", sep=",")

Data Exploration

df.isna().any()
date        False
open        False
high        False
low         False
close       False
volume      False
average     False
barCount    False
dtype: bool
df.set_index("date", drop=True, inplace=True)
df.info()
<class 'pandas.core.frame.DataFrame'>
Index: 1753 entries, 2019-08-07 14:30:00 to 2020-08-05 15:00:00
Data columns (total 7 columns):
 #   Column    Non-Null Count  Dtype  
---  ------    --------------  -----  
 0   open      1753 non-null   float64
 1   high      1753 non-null   float64
 2   low       1753 non-null   float64
 3   close     1753 non-null   float64
 4   volume    1753 non-null   int64  
 5   average   1753 non-null   float64
 6   barCount  1753 non-null   int64  
dtypes: float64(5), int64(2)
memory usage: 109.6+ KB
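
Note that info() reports a plain Index of strings rather than a DatetimeIndex, because the date column was read in as text. If a true time axis is wanted for slicing and plotting, the dates could be parsed at read time instead; a sketch using the same file:

df = pd.read_csv("MSFT-1Y-Hourly.csv", parse_dates=["date"], index_col="date")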
df
open high low close volume average barCount
date
2019-08-07 14:30:00 133.80 133.83 131.82 132.89 35647 132.701 17523
2019-08-07 15:00:00 132.87 135.20 132.64 134.75 48757 134.043 26974
2019-08-07 16:00:00 134.74 134.92 133.52 133.88 28977 134.147 17853
2019-08-07 17:00:00 133.89 134.06 133.07 133.90 21670 133.618 13497
2019-08-07 18:00:00 133.89 135.24 133.83 134.83 22648 134.653 12602
... ... ... ... ... ... ... ...
2020-08-04 18:00:00 212.13 213.15 211.76 212.99 27210 212.572 13294
2020-08-04 19:00:00 213.01 213.21 211.13 211.97 32968 212.084 18230
2020-08-04 20:00:00 211.97 213.37 211.25 213.35 51797 212.261 28991
2020-08-05 14:30:00 214.89 215.00 212.07 214.19 44975 213.543 20045
2020-08-05 15:00:00 214.21 214.40 213.22 213.64 10402 213.800 5254

1753 rows × 7 columns

df = df[["close"]]
df.describe()
close
count 1753.000000
mean 164.330610
std 23.125225
min 132.670000
25% 143.320000
50% 159.750000
75% 183.390000
max 216.540000

Visualizing Microsoft's close price

plt.figure(1, figsize=(16, 8))
plt.plot(df.close)
[<matplotlib.lines.Line2D at 0x1c76637e460>]

Adding some Features

df["returns"] = df.close.pct_change()
df.head()
close returns
date
2019-08-07 14:30:00 132.89 NaN
2019-08-07 15:00:00 134.75 0.013997
2019-08-07 16:00:00 133.88 -0.006456
2019-08-07 17:00:00 133.90 0.000149
2019-08-07 18:00:00 134.83 0.006945

Setting the Log returns as a Feature

df["log_returns"] = np.log(1 + df["returns"])
df.head()
close returns log_returns
date
2019-08-07 14:30:00 132.89 NaN NaN
2019-08-07 15:00:00 134.75 0.013997 0.013899
2019-08-07 16:00:00 133.88 -0.006456 -0.006477
2019-08-07 17:00:00 133.90 0.000149 0.000149
2019-08-07 18:00:00 134.83 0.006945 0.006921
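
As a quick sanity check (a sketch using the columns already in df), the same values can be computed directly from the price ratio, since 1 + r_t equals close_t / close_{t-1}:

# log(1 + pct_change) is the same as log(close_t / close_{t-1})
direct_log_returns = np.log(df.close / df.close.shift(1))
assert np.allclose(direct_log_returns.dropna(), df.log_returns.dropna())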

Plotting the Log returns

plt.figure(figsize=(16, 8))
plt.plot(df.log_returns)
[<matplotlib.lines.Line2D at 0x1c76806f280>]
df.dropna(inplace=True)
X = df[["close", "log_returns"]].values
X[:10]
array([[ 1.34750000e+02,  1.38994914e-02],
       [ 1.33880000e+02, -6.47733345e-03],
       [ 1.33900000e+02,  1.49376354e-04],
       [ 1.34830000e+02,  6.92147295e-03],
       [ 1.35480000e+02,  4.80930230e-03],
       [ 1.35280000e+02, -1.47732336e-03],
       [ 1.36850000e+02,  1.15387309e-02],
       [ 1.37810000e+02,  6.99048940e-03],
       [ 1.38250000e+02,  3.18771552e-03],
       [ 1.38330000e+02,  5.78494484e-04]])

Model Building

from sklearn.preprocessing import MinMaxScaler

Scaling the Features

scaler = MinMaxScaler(feature_range=(0, 1)).fit(X)
X_scaled = scaler.transform(X)
X_scaled[:10]
array([[0.02480029, 0.6063657 ],
       [0.01442709, 0.50534309],
       [0.01466555, 0.53819647],
       [0.02575414, 0.57177063],
       [0.03350423, 0.56129908],
       [0.03111959, 0.53013175],
       [0.04983904, 0.59466171],
       [0.06128532, 0.5721128 ],
       [0.06653154, 0.55325971],
       [0.06748539, 0.54032392]])
y = [x[0] for x in X_scaled]  # target series: the scaled close price
y[:10]
[0.024800286157148133,
 0.014427089543340932,
 0.014665553833313805,
 0.025754143317038736,
 0.033504232741147,
 0.03111958984142138,
 0.049839036604268694,
 0.06128532252295238,
 0.06653153690234914,
 0.06748539406223952]

Splitting the data into training & test data

split = int(len(X_scaled) * 0.8)
print(split)
1401
X_train = X_scaled[:split]
X_test = X_scaled[split : len(X_scaled)]
y_train = y[:split]
y_test = y[split : len(y)]
assert len(X_train) == len(y_train)
assert len(X_test) == len(y_test)

Labeling:

We want to predict the stock price at a future time: the price at time t+1 relative to the price at time t.

Since we're using an LSTM architecture, we know it has memory, and that memory is controlled by setting a time step: essentially, how many steps into the past we want the LSTM to look.

The time step determines how many steps back in time the backpropagation algorithm uses when computing the gradients for the weight updates during training.

So we can write a small routine that builds both the time-step windows and the output variable at t+1, as in the loop below.

n = 3  # time steps: how many past bars each sample sees
Xtrain = []
Xtest = []
ytrain = []
ytest = []
# Slide a window of n rows over the scaled data: the window is the input,
# the scaled close of the next row is the target
for i in range(n, len(X_train)):
    Xtrain.append(X_train[i - n : i, : X_train.shape[1]])
    ytrain.append(y_train[i])
for i in range(n, len(X_test)):
    Xtest.append(X_test[i - n : i, : X_test.shape[1]])
    ytest.append(y_test[i])
df.head()
close returns log_returns
date
2019-08-07 15:00:00 134.75 0.013997 0.013899
2019-08-07 16:00:00 133.88 -0.006456 -0.006477
2019-08-07 17:00:00 133.90 0.000149 0.000149
2019-08-07 18:00:00 134.83 0.006945 0.006921
2019-08-07 19:00:00 135.48 0.004821 0.004809
Xtrain[0]
array([[0.02480029, 0.6063657 ],
       [0.01442709, 0.50534309],
       [0.01466555, 0.53819647]])
ytrain[0]
0.025754143317038736
val = np.array(ytrain[0])
val = np.c_[val, np.zeros(val.shape)]
scaler.inverse_transform(val)
array([[ 1.34830000e+02, -1.08407857e-01]])
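
Because the scaler was fitted on two features (close and log_returns), a single scaled close value has to be padded with a dummy column before inverse_transform can be applied, which is what the cell above does. Since the same trick is reused below for the model's predictions, it could be wrapped in a small helper; this is a sketch, and the name inverse_close is ours:

def inverse_close(scaled_values, scaler):
    # Invert the MinMax scaling for the close column only: pad a dummy
    # zero column in place of log_returns, then keep the first column.
    scaled_values = np.array(scaled_values).reshape(-1, 1)
    padded = np.c_[scaled_values, np.zeros(len(scaled_values))]
    return scaler.inverse_transform(padded)[:, 0]

inverse_close(ytrain[0], scaler)  # should give the same ~134.83 as above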

In an LSTM network, the input to each LSTM layer needs to carry the following dimensions:

The number of observations

The time steps

The features

Therefore, compared to a classical network, we need to add a temporal dimension: (number of observations, number of time steps, number of features per step).

Xtrain, ytrain = np.array(Xtrain), np.array(ytrain)
Xtrain = np.reshape(Xtrain, (Xtrain.shape[0], Xtrain.shape[1], Xtrain.shape[2]))  # (observations, time steps, features)

Xtest, ytest = np.array(Xtest), np.array(ytest)
Xtest = np.reshape(Xtest, (Xtest.shape[0], Xtest.shape[1], Xtest.shape[2]))
print(Xtrain.shape)
print(ytrain.shape)
print('-------')
print(Xtest.shape)
print(ytest.shape)
(1398, 3, 2)
(1398,)
-------
(348, 3, 2)
(348,)

LSTM Model

from keras.models import Sequential
from keras.layers import LSTM, Dense
model = Sequential()
model.add(LSTM(4, input_shape=(Xtrain.shape[1], Xtrain.shape[2])))  # 4 LSTM units; input shape is (time steps, features)
model.add(Dense(1))  # a single output: the scaled close at t+1
model.compile(loss="mean_squared_error", optimizer="adam")
model.fit(Xtrain, ytrain, epochs=100, validation_data=(Xtest, ytest), batch_size=16, verbose=1)
Epoch 1/100
88/88 [==============================] - 10s 31ms/step - loss: 0.1529 - val_loss: 0.5727
Epoch 2/100
88/88 [==============================] - 1s 11ms/step - loss: 0.0524 - val_loss: 0.2905
Epoch 3/100
88/88 [==============================] - 1s 10ms/step - loss: 0.0328 - val_loss: 0.2107
Epoch 4/100
88/88 [==============================] - 1s 10ms/step - loss: 0.0282 - val_loss: 0.1800
Epoch 5/100
88/88 [==============================] - 1s 12ms/step - loss: 0.0237 - val_loss: 0.1393
Epoch 6/100
88/88 [==============================] - 1s 10ms/step - loss: 0.0190 - val_loss: 0.1030
Epoch 7/100
88/88 [==============================] - 1s 11ms/step - loss: 0.0137 - val_loss: 0.0664
Epoch 8/100
88/88 [==============================] - 1s 9ms/step - loss: 0.0084 - val_loss: 0.0296
Epoch 9/100
88/88 [==============================] - 1s 9ms/step - loss: 0.0038 - val_loss: 0.0085
Epoch 10/100
88/88 [==============================] - 1s 8ms/step - loss: 0.0012 - val_loss: 0.0011
Epoch 11/100
88/88 [==============================] - 1s 7ms/step - loss: 5.1356e-04 - val_loss: 4.4004e-04
Epoch 12/100
88/88 [==============================] - 1s 7ms/step - loss: 4.4835e-04 - val_loss: 4.5397e-04
Epoch 13/100
88/88 [==============================] - 1s 8ms/step - loss: 4.4176e-04 - val_loss: 4.4671e-04
Epoch 14/100
88/88 [==============================] - 1s 7ms/step - loss: 4.3327e-04 - val_loss: 4.5926e-04
Epoch 15/100
88/88 [==============================] - 1s 6ms/step - loss: 4.2333e-04 - val_loss: 4.7426e-04
Epoch 16/100
88/88 [==============================] - 1s 7ms/step - loss: 4.1812e-04 - val_loss: 5.4229e-04
Epoch 17/100
88/88 [==============================] - 1s 6ms/step - loss: 4.1389e-04 - val_loss: 5.3970e-04
Epoch 18/100
88/88 [==============================] - 1s 6ms/step - loss: 4.0356e-04 - val_loss: 5.6159e-04
Epoch 19/100
88/88 [==============================] - 1s 7ms/step - loss: 3.9753e-04 - val_loss: 6.4844e-04
Epoch 20/100
88/88 [==============================] - 1s 7ms/step - loss: 3.9038e-04 - val_loss: 7.9734e-04
Epoch 21/100
88/88 [==============================] - 1s 6ms/step - loss: 3.8707e-04 - val_loss: 7.7441e-04
Epoch 22/100
88/88 [==============================] - 1s 7ms/step - loss: 3.7957e-04 - val_loss: 8.1438e-04
Epoch 23/100
88/88 [==============================] - 1s 6ms/step - loss: 3.7526e-04 - val_loss: 0.0011
Epoch 24/100
88/88 [==============================] - 1s 7ms/step - loss: 3.6494e-04 - val_loss: 9.8834e-04
Epoch 25/100
88/88 [==============================] - 1s 6ms/step - loss: 3.6789e-04 - val_loss: 0.0012
Epoch 26/100
88/88 [==============================] - 1s 7ms/step - loss: 3.5950e-04 - val_loss: 0.0012
Epoch 27/100
88/88 [==============================] - 1s 6ms/step - loss: 3.5627e-04 - val_loss: 0.0013
Epoch 28/100
88/88 [==============================] - 1s 7ms/step - loss: 3.4922e-04 - val_loss: 0.0014
Epoch 29/100
88/88 [==============================] - 1s 7ms/step - loss: 3.4514e-04 - val_loss: 0.0014
Epoch 30/100
88/88 [==============================] - 1s 7ms/step - loss: 3.4433e-04 - val_loss: 0.0017
Epoch 31/100
88/88 [==============================] - 1s 6ms/step - loss: 3.3925e-04 - val_loss: 0.0015
Epoch 32/100
88/88 [==============================] - 1s 7ms/step - loss: 3.4124e-04 - val_loss: 0.0017
Epoch 33/100
88/88 [==============================] - 1s 6ms/step - loss: 3.3934e-04 - val_loss: 0.0022
Epoch 34/100
88/88 [==============================] - 1s 7ms/step - loss: 3.3443e-04 - val_loss: 0.0022
Epoch 35/100
88/88 [==============================] - 1s 6ms/step - loss: 3.3363e-04 - val_loss: 0.0021
Epoch 36/100
88/88 [==============================] - 1s 7ms/step - loss: 3.3237e-04 - val_loss: 0.0022
Epoch 37/100
88/88 [==============================] - 1s 7ms/step - loss: 3.3179e-04 - val_loss: 0.0028
Epoch 38/100
88/88 [==============================] - 1s 6ms/step - loss: 3.3473e-04 - val_loss: 0.0022
Epoch 39/100
88/88 [==============================] - 1s 6ms/step - loss: 3.2801e-04 - val_loss: 0.0026
Epoch 40/100
88/88 [==============================] - 1s 6ms/step - loss: 3.3199e-04 - val_loss: 0.0026
Epoch 41/100
88/88 [==============================] - 1s 6ms/step - loss: 3.2491e-04 - val_loss: 0.0026
Epoch 42/100
88/88 [==============================] - 1s 6ms/step - loss: 3.2698e-04 - val_loss: 0.0028
Epoch 43/100
88/88 [==============================] - 1s 7ms/step - loss: 3.2235e-04 - val_loss: 0.0026
Epoch 44/100
88/88 [==============================] - 1s 6ms/step - loss: 3.1769e-04 - val_loss: 0.0024
Epoch 45/100
88/88 [==============================] - 1s 7ms/step - loss: 3.2646e-04 - val_loss: 0.0024
Epoch 46/100
88/88 [==============================] - 1s 6ms/step - loss: 3.2070e-04 - val_loss: 0.0026
Epoch 47/100
88/88 [==============================] - 1s 9ms/step - loss: 3.1651e-04 - val_loss: 0.0030
Epoch 48/100
88/88 [==============================] - 1s 7ms/step - loss: 3.1200e-04 - val_loss: 0.0025
Epoch 49/100
88/88 [==============================] - 1s 7ms/step - loss: 3.1635e-04 - val_loss: 0.0027
Epoch 50/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0964e-04 - val_loss: 0.0032
Epoch 51/100
88/88 [==============================] - 1s 7ms/step - loss: 3.1014e-04 - val_loss: 0.0034
Epoch 52/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0779e-04 - val_loss: 0.0030
Epoch 53/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0799e-04 - val_loss: 0.0029
Epoch 54/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0336e-04 - val_loss: 0.0026
Epoch 55/100
88/88 [==============================] - 1s 7ms/step - loss: 3.1482e-04 - val_loss: 0.0023
Epoch 56/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0991e-04 - val_loss: 0.0028
Epoch 57/100
88/88 [==============================] - 1s 6ms/step - loss: 3.0377e-04 - val_loss: 0.0026
Epoch 58/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0418e-04 - val_loss: 0.0029
Epoch 59/100
88/88 [==============================] - 1s 8ms/step - loss: 3.0124e-04 - val_loss: 0.0029
Epoch 60/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9761e-04 - val_loss: 0.0025
Epoch 61/100
88/88 [==============================] - 1s 10ms/step - loss: 2.9867e-04 - val_loss: 0.0027
Epoch 62/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0046e-04 - val_loss: 0.0029
Epoch 63/100
88/88 [==============================] - 1s 8ms/step - loss: 2.9729e-04 - val_loss: 0.0030
Epoch 64/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0713e-04 - val_loss: 0.0027
Epoch 65/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9885e-04 - val_loss: 0.0029
Epoch 66/100
88/88 [==============================] - 1s 6ms/step - loss: 2.9568e-04 - val_loss: 0.0025
Epoch 67/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0262e-04 - val_loss: 0.0033
Epoch 68/100
88/88 [==============================] - 1s 8ms/step - loss: 2.9262e-04 - val_loss: 0.0024
Epoch 69/100
88/88 [==============================] - 1s 8ms/step - loss: 2.9898e-04 - val_loss: 0.0020
Epoch 70/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9041e-04 - val_loss: 0.0034
Epoch 71/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9213e-04 - val_loss: 0.0025
Epoch 72/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9524e-04 - val_loss: 0.0024
Epoch 73/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9529e-04 - val_loss: 0.0026
Epoch 74/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8955e-04 - val_loss: 0.0028
Epoch 75/100
88/88 [==============================] - 1s 7ms/step - loss: 2.9428e-04 - val_loss: 0.0041
Epoch 76/100
88/88 [==============================] - 1s 7ms/step - loss: 3.0405e-04 - val_loss: 0.0028
Epoch 77/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8589e-04 - val_loss: 0.0024
Epoch 78/100
88/88 [==============================] - 1s 6ms/step - loss: 2.8484e-04 - val_loss: 0.0024
Epoch 79/100
88/88 [==============================] - 1s 6ms/step - loss: 2.8276e-04 - val_loss: 0.0027
Epoch 80/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8379e-04 - val_loss: 0.0025
Epoch 81/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8411e-04 - val_loss: 0.0031
Epoch 82/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8286e-04 - val_loss: 0.0032
Epoch 83/100
88/88 [==============================] - 1s 8ms/step - loss: 2.8796e-04 - val_loss: 0.0024
Epoch 84/100
88/88 [==============================] - 1s 6ms/step - loss: 2.8584e-04 - val_loss: 0.0034
Epoch 85/100
88/88 [==============================] - 1s 8ms/step - loss: 2.8866e-04 - val_loss: 0.0025
Epoch 86/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8129e-04 - val_loss: 0.0027
Epoch 87/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8106e-04 - val_loss: 0.0023
Epoch 88/100
88/88 [==============================] - 1s 7ms/step - loss: 2.7909e-04 - val_loss: 0.0025
Epoch 89/100
88/88 [==============================] - 1s 6ms/step - loss: 2.8744e-04 - val_loss: 0.0024
Epoch 90/100
88/88 [==============================] - 1s 7ms/step - loss: 2.7966e-04 - val_loss: 0.0028
Epoch 91/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8353e-04 - val_loss: 0.0025
Epoch 92/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8781e-04 - val_loss: 0.0020
Epoch 93/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8071e-04 - val_loss: 0.0023
Epoch 94/100
88/88 [==============================] - 1s 7ms/step - loss: 2.7348e-04 - val_loss: 0.0020
Epoch 95/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8587e-04 - val_loss: 0.0024
Epoch 96/100
88/88 [==============================] - 1s 7ms/step - loss: 2.7434e-04 - val_loss: 0.0026
Epoch 97/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8425e-04 - val_loss: 0.0020
Epoch 98/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8123e-04 - val_loss: 0.0024
Epoch 99/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8880e-04 - val_loss: 0.0026
Epoch 100/100
88/88 [==============================] - 1s 7ms/step - loss: 2.8185e-04 - val_loss: 0.0021
<keras.callbacks.History at 0x1c774090820>
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm (LSTM)                  (None, 4)                 112       
_________________________________________________________________
dense (Dense)                (None, 1)                 5         
=================================================================
Total params: 117
Trainable params: 117
Non-trainable params: 0
_________________________________________________________________
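
The 112 parameters of the LSTM layer follow from the standard count 4 × (features × units + units × units + units) = 4 × (2 × 4 + 4 × 4 + 4) = 112, one weight set per gate (input, forget, cell, output); the Dense layer adds 4 weights + 1 bias = 5.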

Model Prediction

trainPredict = model.predict(Xtrain)
testPredict = model.predict(Xtest)
# Pad a dummy log_returns column so the 2-feature scaler can invert the scaling,
# then keep only the close column
trainPredict = np.c_[trainPredict, np.zeros(trainPredict.shape)]
testPredict = np.c_[testPredict, np.zeros(testPredict.shape)]
trainPredict = scaler.inverse_transform(trainPredict)
trainPredict = [x[0] for x in trainPredict]

testPredict = scaler.inverse_transform(testPredict)
testPredict = [x[0] for x in testPredict]
print(trainPredict[:5])
print(testPredict[:5])
# Plot the train & test predictions (see the sketch below)
[134.6962815105915, 134.8327444097027, 135.48487375501543, 135.70029776528477, 136.69200200550256]
[183.2663044089079, 183.02248115599156, 182.80488287508487, 182.00673590004445, 179.3152119332552]
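
The comment above points at plotting the train and test predictions; a minimal sketch, assuming the variables defined so far (df, n, split, trainPredict, testPredict) and using positional x-coordinates so the string index doesn't matter:

plt.figure(figsize=(16, 8))
plt.plot(np.arange(len(df)), df.close, label="Actual close")
plt.plot(np.arange(n, split), trainPredict, label="Train predictions")          # rows n .. split-1
plt.plot(np.arange(split + n, len(df)), testPredict, label="Test predictions")  # rows split+n .. end
plt.legend()
plt.show()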
from sklearn.metrics import mean_squared_error
trainScore = mean_squared_error([x[0][0] for x in Xtrain], trainPredict, squared=False)
print("Train Score : %.2f RMSE" % (trainScore))

testScore = mean_squared_error([x[0][0] for x in Xtest], testPredict, squared=False)
print("Test Score : %.2f RMSE" % (testScore))
Train Score : 156.33 RMSE
Test Score : 195.62 RMSE
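
Note that the scores above compare the scaled close taken from the start of each input window against predictions that have already been inverse-transformed back to price units, so the RMSE mixes scales and targets. A sketch of an evaluation with both sides in dollars, aligned on the same t+1 targets and reusing the zero-padding trick (not the original evaluation):

# Invert the scaling of the true targets so both sides of the RMSE are in price units
ytrain_actual = scaler.inverse_transform(np.c_[ytrain, np.zeros(len(ytrain))])[:, 0]
ytest_actual = scaler.inverse_transform(np.c_[ytest, np.zeros(len(ytest))])[:, 0]

trainScore = mean_squared_error(ytrain_actual, trainPredict, squared=False)
print("Train Score : %.2f RMSE" % trainScore)

testScore = mean_squared_error(ytest_actual, testPredict, squared=False)
print("Test Score : %.2f RMSE" % testScore)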