We envision that computers will understand natural language and anticipate what help we need to complete tasks via conversational interactions. This talk focuses on context-aware understanding at two levels: 1) word-level contexts within sentences and 2) sentence-level contexts within dialogues. Word-level contexts capture both semantic and syntactic relations, which benefit sense representation learning and knowledge-guided language understanding. Sentence-level contexts, in turn, can significantly affect dialogue-level performance: this talk investigates how misunderstanding a single-turn utterance degrades the success rate of an end-to-end reinforcement learning based dialogue system. Finally, we highlight challenges and recent trends driven by deep learning and intelligent assistants.