[[MachineLearning]] >  [[Transformer]] > [[NaturalLanguageProcessing]]

- [[XLNet]]
- [[ALBERT]] (A Lite BERT)
- [[DistilBERT]]

* Per-language models [#v2128ca6]

- 🇩🇪 German BERT https://deepset.ai/german-bert
- 🇫🇷 CamemBERT https://arxiv.org/abs/1911.03894, FlauBERT https://arxiv.org/abs/1912.05372
- 🇮🇹 AlBERTo http://ceur-ws.org/Vol-2481/paper57.pdf
- 🇳🇱 RobBERT https://arxiv.org/abs/2001.06286
- 🇷🇺 RuBERT https://arxiv.org/abs/1905.07213
- 🇪🇸 BETO https://github.com/dccuchile/beto 
- 🇳🇱 BERTje https://arxiv.org/abs/1912.09582
- 🇧🇷 Portuguese BERT https://github.com/neuralmind-ai/portuguese-bert
- 🇩🇪 dbmdz German BERT https://github.com/dbmdz/berts/blob/master/README.md
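
The checkpoints above are all pre-trained with BERT's masked-language-model objective. As a minimal illustration of that objective, here is a plain-Python sketch of the masking rule described in the BERT paper: about 15% of positions are selected as prediction targets, and of those, 80% are replaced with [MASK], 10% with a random vocabulary token, and 10% are left unchanged. The function name and signature are illustrative, not from any library.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """Sketch of BERT-style masking (illustrative, not a library API).

    Selects ~mask_prob of the positions as prediction targets; of those,
    80% become "[MASK]", 10% a random token from vocab, 10% stay as-is.
    Returns (masked_tokens, target_positions).
    """
    rng = rng or random.Random()
    masked = list(tokens)
    targets = []
    for i in range(len(tokens)):
        if rng.random() < mask_prob:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                masked[i] = "[MASK]"          # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(vocab)  # 10%: replace with a random token
            # else: 10%: keep the original token (still a prediction target)
    return masked, targets
```

During pre-training the model is then trained to predict the original token at each target position; keeping 10% of targets unchanged discourages the model from assuming a position is only informative when it reads [MASK].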

* Links [#g08228b5]


- Introduction to Natural Language Processing, Part 3: Transfer Learning in NLP with BERT
--  https://www.ogis-ri.co.jp/otc/hiroba/technical/similar-document-search/part3.html

- A Survey of Pre-trained Language Models for Vision & Language
--  https://speakerdeck.com/kyoun/a-survey-of-pre-trained-language-models-for-vision-and-language/

- A thorough walkthrough of the BERT paper, the "king" of NLP
--  https://qiita.com/omiita/items/72998858efc19a368e50

- Notes: digging into what BERT actually does
--  https://cookie-box.hatenablog.com/entry/2020/09/20/010910
