- BERT Japanese Pretrained Model - KUROHASHI-MURAWAKI LAB (Juman++)
- NICT BERT Japanese Pre-trained Models (MeCab + jumandic)
- NWJC-BERT: A Contrastive Analysis of Similarity Judgments on Polysemous Words by Humans and Contextualized Word Embeddings (model scheduled for release during FY2020)
- GitHub - akirakubo/bert-japanese-aozora: Japanese BERT trained on Aozora Bunko and Wikipedia, pre-tokenized by MeCab with UniDic, SudachiPy
- bert/multilingual.md at master · google-research/bert · GitHub (official BERT multilingual models)
- GitHub - yoheikikuta/bert-japanese: BERT with SentencePiece for Japanese text. (SentencePiece)
- GitHub - hottolink/hottoSNS-bert: hottoSNS-BERT: a sentence embedding model trained on a large-scale SNS corpus (SentencePiece)
- GitHub - laboroai/Laboro-BERT-Japanese: Laboro BERT Japanese: Japanese BERT Pre-Trained With Web-Corpus (SentencePiece)
See https://github.com/himkt/awesome-bert-japanese for more details.