For example, when building neural networks that learn to translate between languages, the translated sentence is unlikely to contain the same number of words as the input sentence. Conversely, a single output label such as "positive" might apply to an entire sentence (which is itself a sequence of words). Similarly, we might build a single embedding from a sequence of words (the document) for the purposes of document comparison.
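As a minimal sketch of the last case, one simple way to collapse a sequence of words into a single document embedding is to mean-pool the individual word vectors and then compare documents with cosine similarity. The tiny hand-written embedding table below is hypothetical; a real system would use learned vectors (e.g. from word2vec or a trained network).

```python
import math

# Hypothetical toy word-embedding table (3-dimensional vectors);
# real systems would use learned, high-dimensional embeddings.
EMBEDDINGS = {
    "good": [0.9, 0.1, 0.0],
    "great": [0.8, 0.2, 0.1],
    "bad": [-0.7, 0.1, 0.2],
}

def embed_document(words, dim=3):
    """Mean-pool word vectors into a single document embedding."""
    vecs = [EMBEDDINGS[w] for w in words if w in EMBEDDINGS]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

doc1 = embed_document(["good", "great"])
doc2 = embed_document(["bad"])
doc3 = embed_document(["great", "good"])

# Documents with similar vocabulary score higher than dissimilar ones.
print(cosine(doc1, doc3) > cosine(doc1, doc2))  # True
```

Mean pooling discards word order, which is often acceptable for coarse document comparison but not for tasks like translation, where order matters.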