This project explores deep artificial neural networks and their use with Google's open-source library TensorFlow. We begin by laying the theoretical foundations of these networks, covering their motivation, the techniques they employ and some mathematical aspects of their training. Special attention is paid to various regularisation methods, which are applied later on.

After that, we delve into the computational approach, explaining TensorFlow's operating principles and the concepts needed to use it, namely the computational graph, variables and execution sessions. Through a first example of a deep network, we illustrate the theoretical and TensorFlow-related elements described earlier, applying them to the problem of classifying flowers of the Iris genus by species.
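As an illustration of those three concepts, a minimal sketch in the TensorFlow 1.x API is given below; the shapes and layer are illustrative (a single linear layer on the four Iris measurements), not the project's actual network.

```python
import tensorflow as tf  # TensorFlow 1.x, where graphs and sessions are explicit

# 1) Build the computational graph: nothing is computed yet, only described.
x = tf.placeholder(tf.float32, shape=[None, 4], name="features")  # the 4 Iris measurements
W = tf.Variable(tf.zeros([4, 3]), name="weights")                 # 2) variables hold trainable state
b = tf.Variable(tf.zeros([3]), name="bias")
logits = tf.matmul(x, W) + b                                       # one linear layer as illustration

# 3) Run the graph inside an execution session, which owns the variables' values.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(logits, feed_dict={x: [[5.1, 3.5, 1.4, 0.2]]})
    print(out)
```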

We then pave the way for the problem of image classification: we discuss several higher-level TensorFlow wrappers (focusing on Slim, a library developed within Google itself and used in the last part of the project), describe the basic principles of convolutional networks and introduce the MNIST problem (automatic handwritten digit recognition), outlining its history and current state of the art.
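To show the kind of conciseness Slim brings, here is a minimal sketch of a small convolutional network written with tf.contrib.slim; the layer sizes and scope names are illustrative only and do not correspond to any of the project's three networks.

```python
import tensorflow as tf

slim = tf.contrib.slim  # TF-Slim, bundled with TensorFlow 1.x


def tiny_convnet(images):
    """images: batch of 28x28 grayscale digits, shape [batch, 28, 28, 1]."""
    net = slim.conv2d(images, 32, [5, 5], scope="conv1")   # convolution + ReLU in one call
    net = slim.max_pool2d(net, [2, 2], scope="pool1")
    net = slim.flatten(net)
    logits = slim.fully_connected(net, 10, activation_fn=None, scope="logits")
    return logits
```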

Finally, we create three convolutional networks to tackle MNIST, detailing how such a task is approached with TensorFlow and the workflow followed. All three networks reach over 98% classification accuracy, rising to 99.52% in the case of the best one. We conclude with a discussion of the results obtained, relating the structure of each network to its performance and training cost.