Stock Market Prediction Using Multi-Layer Perceptrons With TensorFlow

This post is part of a series on artificial neural networks (ANN) in TensorFlow and Python.

  1. Stock Market Prediction Using Multi-Layer Perceptrons With TensorFlow
  2. Stock Market Prediction in Python Part 2
  3. Visualizing Neural Network Performance on High-Dimensional Data
  4. Image Classification Using Convolutional Neural Networks in TensorFlow

In this post a multi-layer perceptron (MLP) class based on the TensorFlow library is discussed. The class is then applied to the problem of performing stock prediction given historical data. Note: This post is not meant to characterize how stock prediction is actually done; it is intended to demonstrate the TensorFlow library and MLPs.

Data Setup

The data used in this post consists of historical stock data for Yahoo Inc. over the period of the 12th of April 1996 to the 19th of April 2016, and can be downloaded as a CSV file from the provided link. Note: YHOO is no longer traded, as the company’s name has changed; feel free to use the provided link or historical data from another company.

To pre-process the data for the neural network, first transform the dates into integer values using LibreOffice’s DATEVALUE function. A screenshot of the transformed data is shown below:
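If you prefer to do this step in Python rather than in LibreOffice, a small sketch of the same date-to-integer conversion is shown below. The `datevalue` helper is a hypothetical stand-in for the spreadsheet function; it assumes LibreOffice's default day-count epoch of 30 December 1899.

```python
import datetime

def datevalue(date_str):
    """Convert an ISO date string (e.g. "1996-04-12") to a serial day
    count, mimicking LibreOffice's DATEVALUE function, which counts
    days from the epoch 1899-12-30 (an assumption of this sketch)."""
    epoch = datetime.date(1899, 12, 30)
    d = datetime.date.fromisoformat(date_str)
    return (d - epoch).days
```

Applying this helper to the date column of the downloaded CSV yields the same monotonically increasing integers the network is trained on.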

Continue reading “Stock Market Prediction Using Multi-Layer Perceptrons With TensorFlow”


Multi-Layer Perceptrons and Back-Propagation; a Derivation and Implementation in Python

Artificial neural networks have regained popularity in machine learning circles with recent advances in deep learning. Deep learning techniques trace their origins back to the concept of back-propagation in multi-layer perceptron (MLP) networks, the topic of this post.

Multi-Layer Perceptron Networks for Regression

An MLP network consists of layers of artificial neurons connected by weighted edges. Neurons are denoted n_{ij} for the j-th neuron in the i-th layer, counting layers from left to right and neurons from top to bottom. Inputs are fed into the leftmost layer and propagate through the network along weighted edges until reaching the final, or output, layer. An example of an MLP network can be seen below in Figure 1. Continue reading “Multi-Layer Perceptrons and Back-Propagation; a Derivation and Implementation in Python”
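The layer-by-layer propagation described above can be sketched in a few lines of NumPy. This is an illustrative forward pass only, not the implementation from the post; the tanh activation and the (n_out, n_in) weight layout are assumptions made for this sketch.

```python
import numpy as np

def forward(x, weights, biases):
    """Propagate an input vector x through an MLP, layer by layer.
    weights[i] has shape (n_out, n_in) for layer i; tanh is used
    as the activation purely for illustration."""
    a = x
    for W, b in zip(weights, biases):
        # Each layer computes a weighted sum of the previous layer's
        # outputs plus a bias, then applies the activation.
        a = np.tanh(W @ a + b)
    return a

# Example: a 3-input, 4-hidden, 2-output network with random weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [np.zeros(4), np.zeros(2)]
output = forward(np.ones(3), weights, biases)
```

Back-propagation, derived in the full post, then adjusts each weight matrix by propagating the output error backwards through these same layers.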