nlp-chinese_text_classification

Here I build a Chinese text classifier, using the jieba package for tokenization and scikit-learn (sklearn) for the machine-learning classification algorithms.
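
As a rough illustration of this pipeline, here is a minimal sketch that tokenizes Chinese text with jieba and classifies it with scikit-learn. The toy texts, labels, and the choice of TF-IDF features with a multinomial naive Bayes classifier are my own assumptions, not the repository's actual data or settings.

```python
# Minimal sketch: jieba tokenization + scikit-learn classification.
# The toy corpus and model choices below are illustrative assumptions.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

def jieba_tokenize(text):
    # Segment a Chinese string into a list of words with jieba.
    return jieba.lcut(text)

# Tiny hypothetical training set.
train_texts = ["我喜欢看足球比赛", "今天股市大涨", "这部电影非常好看"]
train_labels = ["sports", "finance", "entertainment"]

# TF-IDF features over jieba tokens, then a naive Bayes classifier.
model = make_pipeline(
    TfidfVectorizer(tokenizer=jieba_tokenize, token_pattern=None),
    MultinomialNB(),
)
model.fit(train_texts, train_labels)

# On this toy data, a sentence mentioning a match should lean toward "sports".
print(model.predict(["昨晚的比赛谁赢了"]))
```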

I wrote my code starting from the basic code at http://blog.sina.com.cn/s/blog_7e5f32ff0102w9ll.html and am adding more parts for the text classification (more to be added):

  1. Chinese tokenization
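
Below is a minimal sketch of the Chinese tokenization step, assuming jieba's default dictionary; the sample sentence and stopword set are illustrative assumptions only.

```python
# Minimal sketch of Chinese tokenization with jieba (default dictionary).
import jieba

STOPWORDS = {"的", "了", "是"}  # hypothetical stopword set

def tokenize(text):
    # jieba.lcut returns the segmented words as a list of strings;
    # drop whitespace tokens and stopwords before feature extraction.
    return [w for w in jieba.lcut(text) if w.strip() and w not in STOPWORDS]

print(tokenize("自然语言处理是人工智能的一个重要方向"))
# Exact segmentation depends on jieba's dictionary and version.
```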
