This paper proposes a neural conversational model built on the sequence-to-sequence framework: it reads a question token by token and generates an answer token by token. Because the model is trained end-to-end, it requires few hand-crafted rules and can produce simple conversations given enough training data. Training uses backpropagation to minimize the cross-entropy loss on the target reply (equivalently, to maximize its log-likelihood). At inference time the model decodes greedily, though beam search could also be used. The model was evaluated by having human raters compare it against CleverBot on 200 questions, and it was able to hold basic conversations and answer questions without hand-crafted rules.
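The greedy decoding step mentioned above can be sketched in a few lines: at each step, pick the single highest-scoring next token and stop at end-of-sequence. The sketch below is illustrative only; `score_fn` stands in for the trained seq2seq decoder (a hypothetical interface, not the paper's implementation), and the toy `echo_scorer` simply reproduces the source sequence so the loop can run standalone.

```python
# Minimal sketch of greedy decoding for a sequence-to-sequence model.
# `score_fn` is a hypothetical stand-in for the trained decoder: given the
# source tokens and the tokens decoded so far, it returns a score per token.

EOS = "<eos>"
VOCAB = ["hello", "world", EOS]

def greedy_decode(score_fn, source, vocab, max_len=10):
    """Repeatedly pick the highest-scoring next token until EOS or max_len."""
    output = []
    while len(output) < max_len:
        scores = score_fn(source, output)
        next_token = max(vocab, key=lambda tok: scores[tok])
        if next_token == EOS:
            break
        output.append(next_token)
    return output

def echo_scorer(source, decoded):
    # Toy scorer: echoes the source sequence, then emits EOS.
    target = source[len(decoded)] if len(decoded) < len(source) else EOS
    return {tok: (1.0 if tok == target else 0.0) for tok in VOCAB}

print(greedy_decode(echo_scorer, ["hello", "world"], VOCAB))
```

Beam search would generalize this loop by keeping the top-k partial sequences at each step instead of a single one, trading compute for (usually) higher-likelihood outputs.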