Emergent Semantics Beyond Token Embeddings: Transformer LMs with Frozen Visual Unicode Representations (arXiv:2507.04886, published Jul 7, 2025)
MoCa: Modality-aware Continual Pre-training Makes Better Bidirectional Multimodal Embeddings (arXiv:2506.23115, published Jun 29, 2025)