Category: NLP

Why some linguistics is necessary for NLP

Disclaimer: I’m by no means an expert in linguistics, and what follows is a personal opinion based on my own research. Feel free to correct me. After sitting through NLP classes for the last 3 weekends, my classmates remarked that it felt like they had gone through 3 adult English classes. It didn’t help that the workshops are designed

Crossing the language barrier with NLP

One of the biggest open problems in NLP is the unavailability of non-English datasets. Dealing with a low-resource/low-data setting can be quite frustrating when it seems impossible to transfer the success we have seen on various English NLP tasks. In fact, there are voices within the NLP community advocating research and focus on low-resource

Transfer learning and beyond

Transfer learning has proven to be useful in NLP in recent years. In what many have called the “ImageNet moment”, large pretrained language models such as BERT, GPT, and GPT-2 have sprung out of the big research labs, and they have been extended in various ways to achieve further state-of-the-art results

Recent Advances in Abstractive Summarization Using Deep Learning Part 2

This post is a continuation of the previous post here. We continue to track the recent progress and trends in abstractive summarization in 2018. The earlier efforts in abstractive summarization focused on problems related to natural language generation rather than the summarization task itself. Some problems that were tackled: unfactual information (copy mechanism)