Multi-Layer Perceptron Networks in Theano and TensorFlow: An Implementation and Benchmark

A past blog post explored using multi-layer perceptrons (MLPs) to predict stock prices using TensorFlow and Python. This post introduces another common library used for artificial neural networks (ANNs) and other numerical purposes: Theano. An MLP Python class implemented using Theano is presented, and then the performance of the class is compared with the TFANN class in a benchmark.

Continue reading

Image Classification Using Convolutional Neural Networks in TensorFlow

This blog post introduces a type of neural network called a convolutional neural network (CNN) using Python and TensorFlow. A brief introduction to CNNs is given and a helper class for building CNNs in Python and TensorFlow is provided. The source code from this post is available here on GitHub.
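The core operation of a CNN is the convolution, which slides a small kernel over an image and computes weighted sums. As a rough illustration (a minimal pure-Python sketch, not the TensorFlow helper class from the post):

```python
# Minimal 2D convolution (valid padding, stride 1) in pure Python.
# Illustrative sketch of the core CNN operation only; the post's helper
# class builds these layers with TensorFlow instead.

def conv2d(image, kernel):
    """Convolve a 2D image with a 2D kernel (cross-correlation form)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            s = sum(image[r + i][c + j] * kernel[i][j]
                    for i in range(kh) for j in range(kw))
            row.append(s)
        out.append(row)
    return out

# A 3x3 image with a 2x2 kernel yields a 2x2 feature map.
image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
kernel = [[1, 0],
          [0, -1]]
print(conv2d(image, kernel))  # -> [[-4, -4], [-4, -4]]
```

In a real CNN, many such kernels are learned from data, each producing one feature map.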

Continue reading

Visualizing Neural Network Performance on High-Dimensional Data

This post presents a short script that plots neural network performance on high-dimensional binary data using Matplotlib in Python. Binary vectors, or vectors containing only 0s and 1s, can be useful for representing categorical data or discrete phenomena. The code in this post is available on GitHub.

Continue reading

Stock Market Prediction in Python Part 2

This post revisits the problem of predicting stock prices based on historical stock data using TensorFlow that was explored in a previous post. In the previous post, stock price was predicted solely based on the date. First, the date was converted to a numerical value in LibreOffice, then the resulting integer value was read into a matrix using numpy. As stated in the post, this method was not meant to be indicative of how actual stock prediction is done. This post aims to slightly improve upon the previous model and explore new features in TensorFlow and Anaconda Python. The corresponding source code is available here.

Note: See a later post Visualizing Neural Network Performance on High-Dimensional Data for code to help visualize neural network learning and performance.

Continue reading

Stock Market Prediction Using Multi-Layer Perceptrons With TensorFlow

In this post a multi-layer perceptron (MLP) class based on the TensorFlow library is discussed. The class is then applied to the problem of performing stock prediction given historical data. Note: This post is not meant to characterize how stock prediction is actually done; it is intended to demonstrate the TensorFlow library and MLPs. Update: See part 2 of this series for more examples of using Python and TensorFlow for performing stock prediction. Update 2: See a later post Visualizing Neural Network Performance on High-Dimensional Data for code to help visualize neural network learning and performance.

Data Setup

The data used in this post was collected from finance.yahoo.com. It consists of historical stock data for Yahoo Inc. over the period from the 12th of April 1996 to the 19th of April 2016. The data can be downloaded as a CSV file from the provided link. To pre-process the data for the neural network, first transform the dates into integer values using LibreOffice's DATEVALUE function. A screenshot of the transformed data can be seen as follows:
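The same transform can also be performed directly in Python with the standard library instead of a spreadsheet (a sketch assuming LibreOffice's default epoch of 1899-12-30; the exact offset does not matter as long as the integer encoding is consistent):

```python
from datetime import date

# Convert date strings to integer day counts, mimicking a spreadsheet
# DATEVALUE-style transform. The epoch assumed here is LibreOffice's
# default (1899-12-30); any consistent integer encoding would work
# equally well as an input feature for the model.
EPOCH = date(1899, 12, 30)

def datevalue(s):
    """Map an ISO date string 'YYYY-MM-DD' to an integer day count."""
    y, m, d = map(int, s.split("-"))
    return (date(y, m, d) - EPOCH).days

print(datevalue("1996-04-12"))
print(datevalue("2016-04-19"))
```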

Continue reading

Multi-Layer Perceptrons and Back-Propagation; a Derivation and Implementation in Python

Artificial neural networks have regained popularity in machine learning circles with recent advances in deep learning. Deep learning techniques trace their origins back to the concept of back-propagation in multi-layer perceptron (MLP) networks, the topic of this post.

Multi-Layer Perceptron Networks for Regression

An MLP network consists of layers of artificial neurons connected by weighted edges. Neurons are denoted n_{ij} for the j-th neuron in the i-th layer of the MLP, numbered from left to right and top to bottom. Inputs are fed into the leftmost layer and propagate through the network along weighted edges until reaching the final, or output, layer. An example of an MLP network can be seen below in Figure 1.

Continue reading
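Forward propagation through such a network can be sketched in a few lines (a minimal pure-Python illustration with sigmoid activations and arbitrary example weights; the post itself derives the full back-propagation update):

```python
import math

# Forward propagation through a small MLP: each layer computes a weighted
# sum of its inputs plus a bias, then applies a sigmoid activation.
# Layer sizes and weights below are arbitrary illustrative values.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, layers):
    """Propagate input vector x through a list of (weights, biases) layers.

    weights[j][k] is the edge weight from input k to neuron j."""
    for weights, biases in layers:
        x = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# A 2-input network: one hidden layer of 2 neurons, 1 output neuron.
layers = [
    ([[0.5, -0.5], [0.3, 0.8]], [0.1, -0.1]),  # hidden layer
    ([[1.0, -1.0]], [0.0]),                    # output layer
]
print(forward([1.0, 0.0], layers))
```

Training consists of adjusting the weights and biases so the output matches a target, which is what back-propagation computes the gradients for.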

Eigenfaces versus Fisherfaces on the Faces94 Database with Scikit-Learn

In this post, two basic facial recognition techniques will be compared on the Faces94 database. Images from the Faces94 database are 180 by 200 pixels in resolution and were taken as the subjects were speaking to induce variations in the images. In order to train a classifier with the images, the raw pixel information is extracted, converted to grayscale, and flattened into vectors of dimension 180 \times 200 = 36000. For this experiment, 12 subjects from the database will be used, with 20 image files per subject. Each subject is confined to a unique directory that contains only those 20 image files.

Continue reading
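The flattening step described above can be sketched as follows (a pure-Python illustration using a tiny image in place of the 180 by 200 originals):

```python
# Flatten a 2D grayscale image (rows of pixel intensities) into a single
# feature vector, as done before training the classifier. A tiny 2x3
# "image" stands in for the 180x200 Faces94 images, which flatten to
# vectors of length 180 * 200 = 36000.

def flatten_image(image):
    """Concatenate the rows of a 2D image into one flat vector."""
    return [pixel for row in image for pixel in row]

image = [[0, 128, 255],
         [64, 32, 16]]
vec = flatten_image(image)
print(vec)        # -> [0, 128, 255, 64, 32, 16]
print(len(vec))   # -> 6
```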