tokenizer

Version 0.1.0.0 revision 0 uploaded by lev_135.

Package meta

Synopsis
Check uniqueness and tokenize safely
Description

Provides reasonably fast uniqueness checking for a set of tokens specified by a subset of regular expressions. See the README for more info.

WARNING: this package is not yet thoroughly tested. Bugs are likely.
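To illustrate the kind of question the package answers (without reproducing its actual API, which is not shown on this page): a token set is "unique" when no input string can be split into tokens in two different ways. The sketch below checks this for plain literal strings, a simpler setting than the regex subset the package supports, using the classic Sardinas–Patterson algorithm. All names here are ours, not the package's.

```haskell
import Data.List (nub, stripPrefix)
import Data.Maybe (mapMaybe)

-- | Check whether a set of literal-string tokens is uniquely decodable,
-- i.e. no string can be tokenized in two different ways.
-- (Sardinas-Patterson algorithm; an illustration only, not this package's API.)
uniquelyDecodable :: [String] -> Bool
uniquelyDecodable tokens = go (dangling code code) []
  where
    code = nub tokens

    -- Proper dangling suffixes: nonempty w with u ++ w == v,
    -- taken in both directions over the two sets.
    dangling as bs = nub . filter (not . null) $
      mapMaybe (uncurry stripPrefix)
        ([(a, b) | a <- as, b <- bs] ++ [(b, a) | a <- as, b <- bs])

    -- The code is ambiguous iff some dangling suffix is itself a token.
    -- All suffix sets are suffixes of codewords, hence finite, so we
    -- iterate until a fixpoint or an offending suffix is found.
    go current seen
      | any (`elem` code) current = False
      | otherwise =
          let seen' = current ++ seen
              next  = filter (`notElem` seen') (dangling code current)
          in null next || go next seen'

main :: IO ()
main = do
  print (uniquelyDecodable ["0", "01", "011"])  -- unambiguous
  print (uniquelyDecodable ["0", "01", "10"])   -- "010" splits two ways
```

For example, `["0", "01", "10"]` fails the check because `"010"` can be read as `"0" ++ "10"` or `"01" ++ "0"`. The package generalizes this idea to tokens given as (a subset of) regular expressions, where the same pairwise-suffix reasoning no longer works directly on strings.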

Author
Lev135
Bug reports
n/a
Category
text
Copyright
Lev Dvorkin (c) 2022
Homepage
https://github.com/Lev135/tokenizer
Maintainer
lev_135@mail.ru
Package URL
n/a
Stability
n/a

Components