This enum contains an operation for each token that describes how to merge the
tokens back into their detokenized form.
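The idea can be sketched with a minimal detokenizer that walks the tokens and applies one merge operation per token. This is an illustrative sketch only; the enum constants and the `detokenize` helper below are assumptions for illustration, not the actual OpenNLP API.

```java
// Illustrative sketch; the real OpenNLP detokenizer API differs in detail.
enum Operation { MERGE_TO_LEFT, MERGE_TO_RIGHT, NO_OPERATION }

class DetokenizerSketch {
    // Joins tokens into a string, suppressing the space before a token
    // marked MERGE_TO_LEFT and after a token marked MERGE_TO_RIGHT.
    static String detokenize(String[] tokens, Operation[] ops) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < tokens.length; i++) {
            if (i > 0
                    && ops[i] != Operation.MERGE_TO_LEFT
                    && ops[i - 1] != Operation.MERGE_TO_RIGHT) {
                sb.append(' ');
            }
            sb.append(tokens[i]);
        }
        return sb.toString();
    }
}
```

For example, the tokens `He said " hi " .` with the operations `NO_OPERATION, NO_OPERATION, MERGE_TO_RIGHT, NO_OPERATION, MERGE_TO_LEFT, MERGE_TO_LEFT` produce `He said "hi".`.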

Package opennlp.tools.tokenize Description

Contains classes related to finding tokens or words in a string. All
tokenizers implement the Tokenizer interface. Currently there are the
learnable TokenizerME, the WhitespaceTokenizer, and
the SimpleTokenizer, which is a character-class tokenizer.
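A character-class tokenizer splits a string wherever the character class (letter, digit, or other) changes, and drops whitespace. The sketch below shows that behavior using only the standard library; it is an assumption-laden illustration in the spirit of SimpleTokenizer, not the OpenNLP implementation itself.

```java
// Illustrative character-class tokenizer; the real SimpleTokenizer
// implementation in OpenNLP differs in detail.
import java.util.ArrayList;
import java.util.List;

class CharClassTokenizerSketch {
    // 0 = whitespace, 1 = letter, 2 = digit, 3 = other (e.g. punctuation)
    private static int charClass(char c) {
        if (Character.isWhitespace(c)) return 0;
        if (Character.isLetter(c)) return 1;
        if (Character.isDigit(c)) return 2;
        return 3;
    }

    // Emits a new token whenever the character class changes,
    // skipping whitespace entirely.
    static String[] tokenize(String s) {
        List<String> tokens = new ArrayList<>();
        int start = -1; // start index of the current token, -1 if none
        for (int i = 0; i <= s.length(); i++) {
            int cls = (i < s.length()) ? charClass(s.charAt(i)) : 0;
            if (start >= 0 && (cls == 0 || cls != charClass(s.charAt(start)))) {
                tokens.add(s.substring(start, i));
                start = -1;
            }
            if (start < 0 && cls != 0) start = i;
        }
        return tokens.toArray(new String[0]);
    }
}
```

For example, `"Hello, world 42!"` tokenizes to `Hello`, `,`, `world`, `42`, `!` — the comma and exclamation mark become separate tokens because their character class differs from the adjacent letters and digits.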