# BERT NER: Fine-tuned Named Entity Recognition Model
- **Model:** ELHACHYMI/bert-ner
- **Base model:** bert-base-uncased
- **Task:** Token Classification / Named Entity Recognition (NER)
- **Dataset:** CoNLL-2003 (English)
## Model Overview
This model is a fine-tuned version of BERT Base Uncased on the CoNLL-2003 Named Entity Recognition (NER) dataset.
It predicts the following entity types:
- **PER**: Person
- **ORG**: Organization
- **LOC**: Location
- **MISC**: Miscellaneous
- **O**: Outside any entity
The model is suitable for information extraction, document understanding, entity detection in chatbots, and structured text processing.
## Labels
The model uses the standard IOB2 tagging scheme, in which the first token of an entity receives a `B-` tag and any subsequent tokens of the same entity receive `I-` tags (for example, "Bill Gates" is tagged `B-PER I-PER`):
| ID | Label |
|---|---|
| 0 | O |
| 1 | B-PER |
| 2 | I-PER |
| 3 | B-ORG |
| 4 | I-ORG |
| 5 | B-LOC |
| 6 | I-LOC |
| 7 | B-MISC |
| 8 | I-MISC |
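
As a quick sanity check, the same mapping can be read from the model's hosted config via the standard `id2label` attribute. This is a minimal sketch; it assumes the repo's `config.json` defines `id2label` as shown in the table above:

```python
from transformers import AutoConfig

# Load the hosted config and print the id-to-label mapping.
# Assumes the repo's config.json carries the table above.
config = AutoConfig.from_pretrained("ELHACHYMI/bert-ner")
for idx in sorted(config.id2label):
    print(idx, config.id2label[idx])
```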
## How to Load the Model

### Using the Hugging Face Pipeline
```python
from transformers import pipeline

# "simple" aggregation merges subword tokens into whole-entity spans.
ner = pipeline("ner", model="ELHACHYMI/bert-ner", aggregation_strategy="simple")

text = "Bill Gates founded Microsoft in the United States."
print(ner(text))
```
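
If you need raw per-token IOB2 labels rather than aggregated spans (for example, to post-process the tags yourself), the model can also be loaded with the lower-level token-classification classes. A minimal sketch, using only standard `transformers` APIs:

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("ELHACHYMI/bert-ner")
model = AutoModelForTokenClassification.from_pretrained("ELHACHYMI/bert-ner")

text = "Bill Gates founded Microsoft in the United States."
inputs = tokenizer(text, return_tensors="pt")

# Run a forward pass and take the highest-scoring label per token.
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)
predicted_ids = logits.argmax(dim=-1)[0]

# Map WordPiece tokens back to label strings. Special tokens such as
# [CLS] and [SEP] also appear here and are normally predicted as O.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predicted_ids):
    print(f"{token}\t{model.config.id2label[pred.item()]}")
```

Note that words split into multiple WordPiece subwords receive one label per subword here; the pipeline's `aggregation_strategy="simple"` above handles that merging for you.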