Natural Language Processing (NLP) has a long history of development. It is usually broken down into smaller problems such as text classification, Named Entity Recognition (NER), and summarization in order to solve concrete challenges.
For each of these smaller problems, we build a separate small model, and we often have to prepare a sufficiently large training dataset for it.
For example, to use text classification to detect when a guest asks about check-in time, we need to create a list of similar questions for the intent check_in_time
in the following format (using Rasa NLU syntax):
nlu:
- intent: check_in_time
  examples: |
    - When can I check-in?
    - What time am I allowed to check-in?
    - Can you tell me the check-in time?
- intent: pool
  examples: |
    - Is the pool available for guests to use?
    - Could you let me know if there is a swimming pool here?
...
Then we feed this file to an intent classification model for training.
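As a rough sketch, assuming Rasa 3.x and its Python training API (the paths config.yml, data/nlu.yml, and models/ are placeholders), this training step could look like the following:

from rasa.model_training import train_nlu

# Train an NLU-only model from the intent examples above.
# Roughly equivalent CLI: rasa train nlu --nlu data/nlu.yml --config config.yml --out models/
model_path = train_nlu(
    config="config.yml",      # NLU pipeline configuration
    nlu_data="data/nlu.yml",  # the file with the intents shown above
    output="models/",         # where the packaged model is written
)
print(f"Trained NLU model saved to: {model_path}")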
With many intents, this file grows larger and training takes longer, and every time we add new intents or training phrases, we have to retrain the model.
With the rise of Large Language Models like ChatGPT, these NLP problems become easier to tackle. With a zero-shot prompt, we just put the guest's question and the list of intents into the prompt, without any labeled examples:
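For instance, here is a minimal sketch of such a zero-shot call using the OpenAI Python client (the model name, intent list, and prompt wording are illustrative assumptions, not a fixed recipe):

from openai import OpenAI  # assumes the openai Python package is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

intents = ["check_in_time", "pool", "breakfast", "parking", "other"]
question = "When can I check-in?"

# Zero-shot: only the question and the candidate intents, no labeled examples.
prompt = (
    "You are an intent classifier for a hotel chatbot.\n"
    f"Possible intents: {', '.join(intents)}\n"
    f"Guest question: {question}\n"
    "Answer with exactly one intent from the list."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-capable model works here
    messages=[{"role": "user", "content": prompt}],
    temperature=0,
)
print(response.choices[0].message.content)  # e.g. "check_in_time"

Because the model picks the label directly from the list, adding a new intent only means adding its name to the prompt, with no retraining step.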
