Post 39
Meet **GLinker** — an ultra-fast, modular, **zero-shot entity linking** framework 🚀
When we introduced the **GLiNER bi-encoder** in 2024, it enabled efficient zero-shot NER across hundreds of entity types. But that was just the beginning. Our bigger goal was always clear: **linking text to millions of entities dynamically, without retraining**.
In other words: **true entity linking at scale** ⚡
This unlocks powerful applications:
▪️ More precise search with real-world entity disambiguation
▪️ Knowledge graph construction across diverse document collections
▪️ Wikification — turning raw text into richly linked, navigable knowledge
After nearly two years of research + engineering, this vision is now real.
We’re excited to release **GLinker** — a **production-ready**, zero-shot entity linking system powered by our novel **GLiNER bi-encoder**. It efficiently detects entity spans of any length and matches them directly to entity descriptions — **no retraining required**.
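For a feel of what the underlying bi-encoder does, here is a minimal sketch using the open-source `gliner` Python package. The model ID and threshold are illustrative assumptions (pick any released bi-encoder checkpoint from the collection linked below); GLinker adds the linking/disambiguation against entity descriptions on top of this step.

```python
# Minimal zero-shot span detection with a GLiNER bi-encoder checkpoint.
# Model ID and threshold are illustrative assumptions, not fixed defaults;
# GLinker layers entity linking/disambiguation on top of this detection step.
from gliner import GLiNER  # pip install gliner

model = GLiNER.from_pretrained("knowledgator/gliner-bi-small-v1.0")

text = "Apple unveiled the Vision Pro headset at WWDC in Cupertino."
labels = ["company", "product", "event", "location"]

# Returns a list of dicts with "text", "label", "score", "start", "end"
entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(f'{ent["text"]} -> {ent["label"]} ({ent["score"]:.2f})')
```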
**Why GLinker?**
▪️ Production-ready: multi-layer caching (Redis → Elasticsearch → PostgreSQL); see the sketch after this list
▪️ Research-friendly: fully configurable YAML pipelines
▪️ High performance: precomputed embeddings for bi-encoder models
▪️ Scalable by design: DAG-based execution + efficient batch processing
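To make the multi-layer caching point concrete, here is a rough read-through cache sketch. This is our own simplification, not GLinker's actual code, and every connection detail, index, and table name below is a placeholder: each lookup tries Redis first, falls back to Elasticsearch, then PostgreSQL, and back-fills the hot layer on a miss.

```python
# Illustrative read-through cache sketch (not GLinker's actual code).
# All hosts, index names, and table names are hypothetical placeholders.
import json
import redis                              # pip install redis
import psycopg2                           # pip install psycopg2-binary
from elasticsearch import Elasticsearch   # pip install elasticsearch

r = redis.Redis(host="localhost", port=6379)
es = Elasticsearch("http://localhost:9200")
pg = psycopg2.connect(dbname="entities", user="postgres")

def get_entity(entity_id: str) -> dict | None:
    # 1. Hot layer: Redis
    cached = r.get(f"entity:{entity_id}")
    if cached:
        return json.loads(cached)
    # 2. Warm layer: Elasticsearch
    try:
        doc = es.get(index="entities", id=entity_id)["_source"]
    except Exception:
        doc = None
    if doc is None:
        # 3. Cold layer: PostgreSQL as the source of truth
        with pg.cursor() as cur:
            cur.execute("SELECT payload FROM entities WHERE id = %s", (entity_id,))
            row = cur.fetchone()
            doc = row[0] if row else None
    if doc is not None:
        # Back-fill the hot layer so the next lookup is cheap
        r.setex(f"entity:{entity_id}", 3600, json.dumps(doc))
    return doc
```

The layer ordering mirrors latency: the in-memory store absorbs repeated lookups, the search index handles fuzzier retrieval, and the relational store remains the durable source of truth.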
GLinker transforms raw text into **structured, disambiguated entity mentions**, bridging unstructured language with large, evolving knowledge bases.
🔗 Explore more:
GitHub: https://github.com/Knowledgator/GLinker
Report: https://github.com/Knowledgator/GLinker/blob/main/papers/GLiNER_bi_Encoder_paper.pdf
Linking models: https://huggingface.co/collections/knowledgator/gliner-linker
Bi-encoder models: https://huggingface.co/collections/knowledgator/gliner-bi-encoder