This post reviews recent advances in machine translation, especially neural machine translation (NMT).
Prerequisites: this post assumes some prior knowledge of machine learning, artificial neural networks, CNNs, RNNs (LSTM, GRU), the encoder-decoder architecture, sequence-to-sequence models, etc. Continue reading
This post summarizes the fundamental approaches to representing a word in NLP. I find that writing a concept down and reviewing it helps me understand it better.
Basically, this post tries to answer the question: how should a word in a sentence be represented for the various applications of NLP?
Here is a link to a literature review I wrote on the rare-word problem in neural machine translation and the approaches to it. It briefly introduces neural machine translation and then focuses on current solutions to the rare-word problem in NMT. Feel free to leave comments on it.
This post summarizes and reorganizes the key points of the Caffe Tutorial for later use and quick reference. Continue reading