Project author: dat821168

Project description:
Textclassifiers: Collection of Text Classification/Document Classification/Sentence Classification/Sentiment Analysis models for PyTorch
Language: Python
Repository: git://github.com/dat821168/textclassifiers.git
Created: 2020-05-06T10:14:13Z
Project community: https://github.com/dat821168/textclassifiers

License:

textclassifiers

Textclassifiers: a collection of Text Classification / Document Classification / Sentence Classification / Sentiment Analysis models for PyTorch.

Install dependencies:

pip3 install -r requirements.txt

Run the code

Train

python3 run.py --mode train --config configs/fasttext_config.yaml
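The --config flag points run.py at a YAML file describing the model and training setup. Below is a minimal sketch of reading such a file with PyYAML; the keys shown in the comment are assumptions for illustration, not this repository's actual schema:

```python
# Sketch: load a YAML training config (assumes PyYAML is installed).
# The example keys below are hypothetical, not this repo's actual schema.
import yaml

with open("configs/fasttext_config.yaml", encoding="utf-8") as f:
    config = yaml.safe_load(f)  # parses the YAML file into a Python dict

# A config of this kind might carry entries such as:
#   model: fasttext
#   embed_dim: 100
#   lr: 0.001
#   epochs: 10
print(config)
```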

Results

The overall model performance on the two test sets (Query Well-formedness and AG News) is summarized below.

**Note:** The model parameter configurations used for these tests are saved in ./examples/.

| Model | Query Well-formedness Accuracy | Query Well-formedness F1 | AG News Accuracy | AG News F1 |
| --- | --- | --- | --- | --- |
| FastText [1] | 66.33% | 66.20% | .% | .% |
| TextRNN | 69.35% | 68.98% | .% | .% |
| TextCNN [2] | 68.08% | 67.72% | .% | .% |
| RCNN [3] | 68.00% | 67.72% | .% | .% |
| LSTM + Attention [4] | 67.27% | 66.70% | .% | .% |
| Transformer [5] | 68.31% | 67.78% | .% | .% |
| BERT [6] | .% | .% | .% | .% |
| HAN [7] | .% | .% | .% | .% |
| DNN | .% | .% | .% | .% |
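Among these models, FastText [1] is the simplest: it averages the embeddings of a sentence's tokens and passes the pooled vector through a single linear layer. Here is a minimal PyTorch sketch of that idea; the class name and hyperparameters are illustrative, not this repository's actual code:

```python
# Sketch of the FastText classifier from [1]: mean-pooled token embeddings
# followed by a single linear layer. Names and sizes are illustrative.
import torch
import torch.nn as nn

class FastTextClassifier(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int, num_classes: int):
        super().__init__()
        # EmbeddingBag with mode="mean" embeds and averages in one step
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer token indices
        pooled = self.embedding(token_ids)  # (batch, embed_dim)
        return self.fc(pooled)              # (batch, num_classes) logits

model = FastTextClassifier(vocab_size=30000, embed_dim=100, num_classes=4)
logits = model(torch.randint(0, 30000, (8, 32)))  # dummy batch: 8 sequences of 32 tokens
```

Training such a model is ordinary cross-entropy over the logits; despite its simplicity, the table above shows it within roughly three points of the recurrent and convolutional variants on the Query Well-formedness task.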

Model Releases

References

[1] Joulin, Armand, Edouard Grave, Piotr Bojanowski, and Tomas Mikolov. “Bag of Tricks for Efficient Text Classification.” Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics (EACL), Volume 2, Short Papers. 2017.

[2] Kim, Yoon. “Convolutional Neural Networks for Sentence Classification.” Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014.

[3] Lai, Siwei, et al. “Recurrent convolutional neural networks for text classification.” Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence (AAAI). 2015.

[4] Du, Changshun, and Lei Huang. “Text classification research with attention-based recurrent neural networks.” International Journal of Computers Communications & Control 13.1 (2018): 50-61.

[5] Vaswani, Ashish, et al. “Attention is all you need.” Proceedings of the 31st International Conference on Neural Information Processing Systems. Curran Associates Inc., 2017.

[6] Devlin, Jacob, et al. “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.” Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019.

[7] Yang, Zichao, et al. “Hierarchical attention networks for document classification.” Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT). 2016.