NB BERT-base is a general BERT-base model built on the large digital collection at the National Library of Norway. The model shares its architecture with the multilingual cased BERT model and is trained on a wide variety of Norwegian text (both Bokmål and Nynorsk) from the last 200 years.
Version 1.1 of the model is general-purpose and should be fine-tuned for any particular downstream task.
NB BERT-base was produced and released by the AI-lab at the National Library of Norway and is among the best-performing models for Norwegian and the other Scandinavian languages to date.
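As a minimal sketch of how the pretrained model might be loaded for masked-language-model inference with the Hugging Face `transformers` library (the model identifier `NbAiLab/nb-bert-base` is an assumption; check the model page for the published name):

```python
# Sketch: masked-token prediction with NB BERT-base via transformers.
# The repository id "NbAiLab/nb-bert-base" is assumed, not confirmed by this card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="NbAiLab/nb-bert-base")

# BERT-style models use the [MASK] token for the position to predict.
predictions = fill_mask("Nasjonalbiblioteket ligger i [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

For downstream tasks (classification, NER, and so on), the same checkpoint would instead be loaded through a task-specific head, e.g. `AutoModelForSequenceClassification.from_pretrained(...)`, and fine-tuned on labeled data as the card recommends.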
Extended metadata
resourceCommonInfo:
resourceType: languageDescription
identificationInfo:
resourceName: NB BERT-base
description: NB BERT-base is a general BERT-base model built on the large digital collection at the National Library of Norway. The model shares its architecture with the multilingual cased BERT model and is trained on a wide variety of Norwegian text (both Bokmål and Nynorsk) from the last 200 years.
Version 1.1 of the model is general-purpose and should be fine-tuned for any particular downstream task.
NB BERT-base was produced and released by the AI-lab at the National Library of Norway and is among the best-performing models for Norwegian and the other Scandinavian languages to date.