Note: I had stopped writing posts in 2017. Slowly getting back to it starting late 2024, mostly for AI.

NLG and the range of tasks within it

Jan 23, 2024 | Concepts

The first 10 minutes of this Stanford CS224N lecture explain NLU, NLG, and the tasks within them, and I found it helpful. Natural Language Understanding (NLU) is a subset of NLP (processing) that uses syntactic and semantic analysis of text and speech to determine the meaning of a sentence. Natural Language Generation (NLG) is another subset of NLP, focusing on producing a human-language text response from some data input.

Historically, NLG systems were rule-based, but deep learning techniques have taken over in the recent past, and ChatGPT is the shining example of the stunning results. ChatGPT is a very general-purpose NLG system that can accomplish many NLG tasks. These tasks span a spectrum, from highly specific at one end to open-ended at the other.

Examples of NLG tasks that are quite specific and low on entropy: translation (the scope of valid model outputs for translating a phrase from one language into another is pretty small, since the point is to keep the phrase's semantics the same) and summarization. A good example of an open-ended, high-entropy task is story or poem generation, where a vast number of model outputs can be considered valid.
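To make "low on entropy" concrete, here is a minimal sketch (mine, not from the lecture, and with made-up probabilities) that computes the Shannon entropy of two toy next-word distributions: a translation-like case where nearly all the probability mass sits on one output, and a story-opening case where many outputs are valid.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a token -> probability mapping."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Made-up, purely illustrative distributions over the model's next output.
# Translating "merci" into English: almost all mass on a single answer.
translation = {"thanks": 0.90, "thank you": 0.09, "merci": 0.01}

# First word of a story: mass spread across many equally valid choices.
story = {"Once": 0.20, "The": 0.15, "It": 0.12, "She": 0.12,
         "He": 0.11, "In": 0.10, "A": 0.10, "Long": 0.10}

print(f"translation entropy: {entropy(translation):.2f} bits")  # ~0.52 bits
print(f"story entropy:       {entropy(story):.2f} bits")        # ~2.96 bits
```

The exact numbers don't matter; the point is that the flatter the distribution over valid outputs, the higher the entropy, and the more open-ended the task.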

Some tasks fall somewhere in the middle of this specific-vs-open spectrum. A good example is chatbots that aim to have a conversation. That's because for each input there are quite a few ways for the bot to respond: there is a big output distribution space for the model to explore when the task is a conversation.
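One standard way this spectrum shows up in practice (my illustration, not something the lecture covers) is in the decoding temperature: dividing the model's logits by a temperature before the softmax makes sampling nearly deterministic at low temperatures (suited to specific tasks like translation) and much more exploratory at high temperatures (suited to open-ended tasks like stories or conversation). A self-contained sketch over hypothetical logits:

```python
import math
import random

def temperature_probs(logits, temperature):
    """Softmax over (logit / temperature) for a token -> logit mapping."""
    scaled = {tok: l / temperature for tok, l in logits.items()}
    max_l = max(scaled.values())  # subtract the max for numerical stability
    exps = {tok: math.exp(l - max_l) for tok, l in scaled.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

def sample(logits, temperature):
    """Draw one token from the temperature-adjusted distribution."""
    probs = temperature_probs(logits, temperature)
    return random.choices(list(probs), weights=list(probs.values()))[0]

# Hypothetical logits for a chatbot's next token (illustrative only).
logits = {"Sure": 2.0, "Happy": 1.5, "Of": 1.2, "Well": 0.8, "Hmm": 0.3}

for t in (0.2, 1.0, 2.0):
    p_top = temperature_probs(logits, t)["Sure"]
    print(f"T={t}: P('Sure') = {p_top:.2f}")
# T=0.2: ~0.91 (nearly deterministic); T=1.0: ~0.39; T=2.0: ~0.29 (flatter)
```

A conversational bot typically sits between the extremes: enough sampling temperature that replies vary, but not so much that they wander off.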