Mar 11, 2024 · We're on a journey to advance and democratize artificial intelligence through open source and open science.
Mar 4, 2024 · This model is a fine-tune checkpoint of DistilBERT-base-uncased, fine-tuned using (a second step of) knowledge distillation on SQuAD v1.1.
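A minimal sketch of how such a distilled checkpoint is typically queried, assuming the `transformers` library and the canonical `distilbert-base-uncased-distilled-squad` model id (the snippet itself does not name the repo):

```python
from transformers import pipeline

# Extractive question answering backed by the DistilBERT checkpoint
# distilled on SQuAD v1.1 (weights are downloaded on first use).
qa = pipeline(
    "question-answering",
    model="distilbert-base-uncased-distilled-squad",
)

result = qa(
    question="What was DistilBERT distilled from?",
    context="DistilBERT is a small, fast Transformer model "
            "trained by distilling BERT base.",
)
print(result["answer"], result["score"])
```

The pipeline returns a dict with `answer`, `score`, `start`, and `end` keys for the extracted span.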
DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. It has 40% fewer parameters than google-bert/bert-base-uncased, ...
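The 40% figure follows from the widely reported approximate parameter counts of the two models (roughly 66M for DistilBERT versus 110M for BERT-base):

```python
# Approximate parameter counts, in millions, widely reported for the two models.
bert_base_params = 110
distilbert_params = 66

# Fraction of parameters removed by distillation.
reduction = 1 - distilbert_params / bert_base_params
print(f"{reduction:.0%}")  # → 40%
```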
Jan 4, 2024 · Model Description: This model is a fine-tune checkpoint of DistilBERT-base-uncased, fine-tuned on SST-2. This model reaches an accuracy of 91.3 ...
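A sketch of using this SST-2 sentiment checkpoint, assuming the `transformers` library and the canonical `distilbert-base-uncased-finetuned-sst-2-english` model id:

```python
from transformers import pipeline

# Binary sentiment analysis with the SST-2 fine-tuned DistilBERT checkpoint.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# The pipeline returns a list with one dict per input text.
result = classifier("This distilled model is remarkably fast.")[0]
print(result["label"], result["score"])
```

Labels for this checkpoint are `POSITIVE` and `NEGATIVE`, with `score` being the softmax probability of the predicted label.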
Apr 20, 2023 · It's smaller and faster than BERT and other BERT-based models. DistilBERT-base-uncased fine-tuned on the emotion dataset using HuggingFace ...
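A sketch of querying an emotion-classification fine-tune through the `text-classification` pipeline; the model id below (`bhadresh-savani/distilbert-base-uncased-emotion`) is an assumption for illustration, since the snippet does not name a specific repo:

```python
from transformers import pipeline

# Emotion classification with a DistilBERT fine-tune on the emotion dataset.
# NOTE: the model id is an assumption -- the snippet does not give a repo name.
classifier = pipeline(
    "text-classification",
    model="bhadresh-savani/distilbert-base-uncased-emotion",
)

result = classifier("I am thrilled with how small and fast this model is.")[0]
print(result["label"], result["score"])
```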
Jan 18, 2024 · This model is a version of distilbert-base-uncased fine-tuned with knowledge distillation on the clinc_oos dataset. The model is used in Chapter ...
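Since clinc_oos is an intent-detection dataset, this checkpoint is used through the `text-classification` pipeline. The model id below (`transformersbook/distilbert-base-uncased-finetuned-clinc`) is an assumption for illustration; the snippet does not name the repo:

```python
from transformers import pipeline

# Intent detection on CLINC-style utterances with the distilled fine-tune.
# NOTE: the model id is an assumption -- the snippet does not give a repo name.
intent = pipeline(
    "text-classification",
    model="transformersbook/distilbert-base-uncased-finetuned-clinc",
)

result = intent("Please transfer $100 from my checking to my savings account.")[0]
print(result["label"], result["score"])
```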