Metadata-Version: 1.1
Name: tokenizer_xm
Version: 0.2
Summary: Tokenization with options to exclude contractions, lemmatize, and stem.
Home-page: https://github.com/ALaughingHorse/tokenizer_xm
Author: Xiao Ma
Author-email: Marshalma0923@gmail.com
License: MIT
Download-URL: https://github.com/ALaughingHorse/tokenizer_xm/archive/v_02.tar.gz
Description: UNKNOWN
Keywords: text preprocessing, tokenize, NLP
Platform: UNKNOWN
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Build Tools
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
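
The long description above is empty, so as an illustration of the kind of pipeline the Summary names (tokenizing with options to exclude contractions, lemmatize, and stem), here is a minimal stdlib-only sketch. The function names, option names, contraction list, and suffix-stripping rule are all assumptions for illustration — they are not tokenizer_xm's actual API, and the naive suffix stripper is a crude stand-in for a real stemmer:

```python
import re

# Illustrative contraction list; a real implementation would use a
# much larger set (this is NOT tokenizer_xm's actual data).
CONTRACTIONS = {"don't", "can't", "it's", "i'm", "won't"}

def naive_stem(token):
    # Strip a few common English suffixes; a crude stand-in for a
    # real stemmer such as the Porter stemmer.
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def tokenize(text, exclude_contractions=True, stem=False):
    # Lowercase, then match word runs, keeping apostrophes inside words
    # so contractions survive as single tokens.
    tokens = re.findall(r"[a-z]+(?:'[a-z]+)?", text.lower())
    if exclude_contractions:
        tokens = [t for t in tokens if t not in CONTRACTIONS]
    if stem:
        tokens = [naive_stem(t) for t in tokens]
    return tokens

print(tokenize("I don't like running dogs", stem=True))
```

Contraction exclusion runs before stemming so that apostrophe-bearing tokens are filtered while still intact, rather than after a stemmer has mangled them.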
