This document presents a system for sign language recognition using neural networks. The system aims to recognize hand gestures in real time and translate them into English words or sentences. A convolutional neural network (CNN) extracts features from captured images of hand gestures and classifies them, achieving a classification accuracy of 92.4%. The system could help mute individuals communicate by translating sign language into text that others can read, and could also assist blind individuals by converting the translated text to speech.
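The document does not specify the network architecture, so as a minimal sketch of the CNN pipeline it describes (convolutional feature extraction followed by classification), the following NumPy code runs one convolution layer with ReLU, max pooling, and a softmax classifier over a hypothetical 28x28 grayscale gesture image. The filter count, image size, and 26 output classes (one per fingerspelled letter) are illustrative assumptions, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernels):
    # Valid convolution: (H, W) image, (n, k, k) kernels -> (n, H-k+1, W-k+1).
    n, k, _ = kernels.shape
    H, W = image.shape
    out = np.empty((n, H - k + 1, W - k + 1))
    for f in range(n):
        for i in range(H - k + 1):
            for j in range(W - k + 1):
                out[f, i, j] = np.sum(image[i:i + k, j:j + k] * kernels[f])
    return out

def max_pool(maps, size=2):
    # Non-overlapping max pooling over each feature map.
    n, H, W = maps.shape
    H2, W2 = H // size, W // size
    windows = maps[:, :H2 * size, :W2 * size].reshape(n, H2, size, W2, size)
    return windows.max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical 28x28 grayscale hand-gesture image and 8 random 3x3 filters.
image = rng.random((28, 28))
kernels = rng.standard_normal((8, 3, 3))

features = np.maximum(conv2d(image, kernels), 0.0)  # ReLU activation
pooled = max_pool(features)                         # shape (8, 13, 13)
flat = pooled.reshape(-1)                           # flatten for the classifier

n_classes = 26  # assumption: one class per fingerspelled letter
W_fc = rng.standard_normal((n_classes, flat.size)) * 0.01
probs = softmax(W_fc @ flat)       # class probabilities, sum to 1
predicted = int(np.argmax(probs))  # index of the recognized gesture class
```

In a trained system the kernels and the fully connected weights `W_fc` would be learned by backpropagation on labeled gesture images; the index `predicted` would then map to an English word or letter for display or speech output.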