About Pleias
Pleias is a French private AI lab founded in 2024 by Pierre-Carl Langlais, Ivan Yamshchikov, and Anastasia Stasenko. Based in Paris, it focuses on small, energy-efficient language models trained exclusively on open data. The lab coordinated the release of Common Corpus, one of the largest open multilingual datasets for LLM pre-training, and ships the Pleias 1.0 model family (up to roughly 3B parameters) as well as Pleias-RAG-350m and Pleias-RAG-1B, small reasoning models optimised for retrieval-augmented generation with built-in source citations. All models are released openly on Hugging Face for auditable, document-centric use cases.

Anthropic
Cohere
AI21 Labs

