[Day86] NLP Framework
Today I looked into several frameworks you can use for natural language processing, from AllenNLP with its impressive demos to Hugging Face, which is all the rage these days. Below I'm sharing some resources worth knowing about.
1. Top NLP Libraries to Use 2020
towardsdatascience.com/top-nlp-libraries-to-use-2020-4f700cdb841f
Covers AllenNLP, Fast.ai, spaCy, NLTK, TorchText, Hugging Face, Gensim, OpenNMT, ParlAI, and DeepPavlov.
2. AllenNLP
allennlp.org
AllenNLP is a free, open-source natural language processing platform for building state-of-the-art models.
3. allennlp (github)
github.com/allenai/allennlp
An open-source NLP research library, built on PyTorch.
4. GLUE
gluebenchmark.com
The General Language Understanding Evaluation (GLUE) benchmark is a collection of resources for training, evaluating, and analyzing natural language understanding systems.
5. GLUE Baselines
github.com/nyu-mll/GLUE-baselines
[DEPRECATED] Repo for exploring multi-task learning approaches to learning sentence representations.
6. fairseq documentation
fairseq.readthedocs.io/en/latest/
Official documentation for fairseq, Facebook AI Research's sequence-to-sequence toolkit.
7. fairseq (github)
github.com/pytorch/fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
8. Welcome to fastai
docs.fast.ai
fastai is a deep learning library which provides practitioners with high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches.
9. fastai (github)
github.com/fastai/fastai
The fastai deep learning library, plus lessons and tutorials.
10. tensor2tensor (github)
github.com/tensorflow/tensor2tensor
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
11. trax (github)
github.com/google/trax
Trax: Deep Learning with Clear Code and Speed.
12. spaCy
spacy.io
spaCy is a free open-source library for Natural Language Processing in Python. It features NER, POS tagging, dependency parsing, word vectors and more.
13. spaCy (github)
github.com/explosion/spaCy
💫 Industrial-strength Natural Language Processing (NLP) with Python and Cython.
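To get a quick feel for spaCy, here is a minimal sketch (assuming spaCy is installed; `spacy.blank("en")` builds a tokenizer-only English pipeline, so no pretrained model download is needed):

```python
import spacy

# Blank English pipeline: rule-based tokenizer only, no trained
# components, so nothing has to be downloaded first.
nlp = spacy.blank("en")
doc = nlp("spaCy is an industrial-strength NLP library.")

# A Doc is a sequence of Token objects with attributes like .text.
tokens = [token.text for token in doc]
print(tokens)
```

Swapping `spacy.blank("en")` for a downloaded model like `en_core_web_sm` is what unlocks the NER, POS tagging, and dependency parsing mentioned above.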
14. Natural Language Toolkit (NLTK)
www.nltk.org
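As a quick taste of NLTK, here is a sketch using the Penn Treebank tokenizer, chosen because it needs no downloaded corpora (many other NLTK components require an `nltk.download` step first):

```python
from nltk.tokenize import TreebankWordTokenizer

# The Treebank tokenizer uses regex rules (no data files),
# splitting contractions and separating punctuation.
tokenizer = TreebankWordTokenizer()
tokens = tokenizer.tokenize("NLTK doesn't need a download for this tokenizer.")
print(tokens)
```

Note how the contraction "doesn't" is split into "does" and "n't", a Penn Treebank convention.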
15. torchtext (github)
github.com/pytorch/text
Data loaders and abstractions for text and NLP.
16. KoNLPy: Korean NLP in Python
konlpy.org
KoNLPy (pronounced “ko en el PIE”) is a Python package for natural language processing (NLP) of the Korean language.
17. konlpy (github)
github.com/konlpy/konlpy
Python package for Korean natural language processing.
18. Huggingface Transformers
huggingface.co/transformers
Documentation for 🤗 Transformers, a library of state-of-the-art pretrained NLP models for PyTorch and TensorFlow 2.0.
19. transformers (github)
github.com/huggingface/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
20. huggingface Pretrained models
huggingface.co/transformers/pretrained_models.html
A reference list of the pretrained architectures and checkpoints shipped with 🤗 Transformers, with their specs (layer counts, hidden sizes, attention heads, parameter counts).
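Any checkpoint name from that pretrained-models list can be loaded by string. A minimal sketch using `bert-base-uncased` (assumes `transformers` is installed; the tokenizer's vocabulary files are fetched from huggingface.co on first use):

```python
from transformers import AutoTokenizer

# AutoTokenizer resolves the checkpoint name to the right tokenizer
# class and downloads only its small vocabulary files, not the weights.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# bert-base-uncased lowercases input and applies WordPiece splitting.
tokens = tokenizer.tokenize("Hello world!")
print(tokens)
```

The same name works with `AutoModel.from_pretrained(...)` to pull the actual weights, which is what makes swapping between the listed checkpoints so convenient.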