This document discusses hand gesture recognition using an artificial neural network. It aims to classify hand gestures into five categories (showing one to five fingers) using a supervised feed-forward neural network trained with the backpropagation algorithm. The objective is to facilitate communication for deaf people by automatically translating hand gestures into text. The system requires software libraries such as pandas, NumPy, and Matplotlib, as well as hardware with a quad-core processor and 16 GB of RAM. It also explains key neural-network concepts such as neurons, weights, biases, and activation functions, along with the advantages of neural networks in handling large datasets and inferring unseen relationships.
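To make the described approach concrete, the following is a minimal NumPy sketch of a supervised feed-forward network with one hidden layer, trained by backpropagation for a five-class problem. The layer sizes, learning rate, activation choices, and the random stand-in data are all assumptions for illustration, not the document's actual configuration or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: flattened gesture feature vectors in, 5 classes out
# (classes 0..4 standing for one..five extended fingers).
n_in, n_hidden, n_out = 64, 32, 5

# Weights initialised to small random values, biases to zero.
W1 = rng.normal(0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out)); b2 = np.zeros(n_out)

def forward(X):
    h = np.tanh(X @ W1 + b1)                          # hidden activations
    z = h @ W2 + b2
    z = z - z.max(axis=1, keepdims=True)              # numerical stability
    p = np.exp(z); p /= p.sum(axis=1, keepdims=True)  # softmax class probabilities
    return h, p

# Toy data standing in for gesture features and their labels.
X = rng.normal(size=(200, n_in))
y = rng.integers(0, n_out, 200)
Y = np.eye(n_out)[y]                                  # one-hot targets

lr, losses = 0.5, []
for _ in range(300):
    h, p = forward(X)
    losses.append(-np.mean(np.sum(Y * np.log(p + 1e-12), axis=1)))
    # Backpropagation: gradient of cross-entropy w.r.t. softmax inputs is p - Y.
    dz = (p - Y) / len(X)
    dW2 = h.T @ dz; db2 = dz.sum(axis=0)
    dh = (dz @ W2.T) * (1 - h**2)                     # tanh derivative
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Gradient-descent weight updates.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

On the random toy data the training loss falls steadily from roughly ln(5), showing the update rule is working; a real system would instead train on labelled gesture images and map the predicted class to text output.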