References (order of appearance)
• Cho, W. I., Kim, S. M., & Kim, N. S. (2019). Investigating an Effective Character-level Embedding in Korean Sentence Classification. arXiv preprint arXiv:1905.13656.
• Cho, W. I., Moon, Y. K., Kang, W. H., & Kim, N. S. (2018). Extracting Arguments from Korean Question and Command: An Annotated Corpus for Structured Paraphrasing. arXiv preprint arXiv:1810.04631.
• 조원익, 문영기, 김종인, 김남수, "담화 성분을 활용한 지시 발화의 키 프레이즈 추출: 한국어 병렬 코퍼스 구축 및 데이터 증강 방법론" [Key-phrase extraction from directive utterances using discourse components: Korean parallel corpus construction and data augmentation methodology], 제31회 한글 및 한국어 정보처리 학술대회 (The 31st Annual Conference on Human and Language Technology), 2019, pp. 241-245. (in Korean)
• Schuster, M., & Paliwal, K. K. (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45(11), 2673-2681.
• Lin, Z., Feng, M., Santos, C. N. D., Yu, M., Xiang, B., Zhou, B., & Bengio, Y. (2017). A structured self-attentive sentence embedding. arXiv preprint arXiv:1703.03130.
• Chollet, F. (2015). Keras.
• Cho, W. I., Cho, J., Kang, W. H., & Kim, N. S. (2019). Disambiguating Speech Intention via Audio-Text Co-attention Framework: A Case of Prosody-semantics Interface. arXiv preprint arXiv:1910.09275.