
N-gram-based Language Models learned using an Encoder-Decoder Model

Data Science: Asked by Akshit Singh on March 8, 2021

I have been going through an n-gram-based language model learned using an encoder-decoder model for email smart compose.

The program outputs only one prediction for a given input.
I want to know how to get multiple predictions from the same model.

Here is the link to the notebook: https://nbviewer.jupyter.org/github/PrithivirajDamodaran/NLP-Experiments/blob/master/Gmail_style_smart_compose_with_char_ngram_based_language_model.ipynb

Here, for the input sequence “hi there”, the predicted sequence is “, how are you today?”.
But it may be possible that I have multiple sentences starting with “hi there” in my training dataset. So how do I get all of those?
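
One standard way to obtain several candidate completions from a decoder like this, rather than greedily taking the single most probable character at each step, is beam search. Below is a minimal sketch; next_log_probs is a hypothetical stand-in for the trained decoder and is not a function from the linked notebook.

import math
import heapq

def next_log_probs(seq):
    # Hypothetical stand-in for the trained decoder: given the characters
    # generated so far, return log-probabilities for each next character.
    vocab = {"a": 0.5, "b": 0.3, "<eos>": 0.2}
    return {ch: math.log(p) for ch, p in vocab.items()}

def beam_search(beam_width=3, max_len=20):
    beams = [(0.0, [])]  # (cumulative log-prob, characters so far)
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            for ch, lp in next_log_probs(seq).items():
                if ch == "<eos>":
                    finished.append((score + lp, seq))
                else:
                    candidates.append((score + lp, seq + [ch]))
        # Keep only the beam_width highest-scoring partial sequences.
        beams = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
        if not beams:
            break
    finished.extend(beams)
    ranked = sorted(finished, key=lambda c: c[0], reverse=True)
    return ["".join(seq) for _, seq in ranked[:beam_width]]

print(beam_search())  # several completions instead of a single argmax one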

One Answer

For your case, I feel that a skip-gram model would fit your business problem, since it can predict a sequence of words rather than a single word. I recommend training a skip-gram model on your corpus.
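As a minimal sketch of that suggestion, assuming the gensim library (its sg=1 flag selects skip-gram training) and a hypothetical tokenized emails corpus:

from gensim.models import Word2Vec

# Hypothetical tokenized corpus; replace with sentences from your emails.
emails = [
    ["hi", "there", "how", "are", "you", "today"],
    ["hi", "there", "hope", "you", "are", "doing", "well"],
]

# sg=1 selects skip-gram training (sg=0 would be CBOW).
model = Word2Vec(sentences=emails, vector_size=100, window=5,
                 min_count=1, sg=1)

# Words most likely to appear in the context of "there".
print(model.wv.most_similar("there"))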

Answered by Gaurav Koradiya on March 8, 2021
