European AI Infrastructure

Multilingual AI
from scratch

Building a new frontier in AI: an industry-disrupting large language model engineered for seamless, nuanced, and truly multilingual understanding.

Why TerraNex is Different

Core AI infrastructure for European sovereignty. European data. European compute. European control.


True Multilingual Core

Custom-trained LLMs optimized for European language performance. TerraNex thinks and reasons across 36 languages from its foundational architecture, preserving nuance and cultural context.


Advanced Tokenization

Our byte-level BPE tokenizer produces 23% fewer tokens than Qwen3's and 15% fewer than DeepSeek's, addressing structural biases in existing multilingual models.


Green AI

Training runs on Europe's supercomputers, powered by 100% hydroelectric energy, delivering enterprise-scale AI with a commitment to sustainability.

News

From R&D to training a European foundation model — our journey so far.

Phase 3 — 2026

€4.45M in Compute Secured

Phase 3 begins — training a next-generation European foundation model designed to outperform conventional transformers in compute efficiency, long-context stability, and energy consumption per token.

Phase 2 — 2025

Tokenizer Achieves Best-in-Class Efficiency

23% fewer tokens than Qwen3, 15% fewer than DeepSeek, 58% single-token coverage, and the lowest fragmentation score across all 36 supported European languages. Fewer tokens mean lower inference cost and stronger unit economics.

Phase 1 — 2025

R&D Architecture & Model Development Complete

In partnership with RISE Research Institutes of Sweden, we completed model architecture design and tokenizer R&D, using cluster access allocated through a RISE piggyback grant.

Building European AI Infrastructure

Based in Stockholm, TerraNex is building core AI infrastructure for European sovereignty. Having completed our R&D phase and built a best-in-class multilingual tokenizer, we're now training a next-generation foundation model designed to outperform conventional transformer systems.

Where existing models treat multilingualism as translation, we embed it at the architectural level, creating AI that reasons, understands context, and communicates with native-level nuance across 36 European languages. Most models scale by adding parameters; we scale by improving how compute is used.

Backed by €4.45M in compute and partnered with RISE Research Institutes of Sweden, EuroHPC Joint Undertaking, and Mimer AI Factory, we train entirely within European infrastructure on 100% hydroelectric energy.

€4.45M
Compute Secured
36
European Languages
100%
Hydroelectric Energy
4
Research Partners

Ready to Connect?

Whether you're an enterprise looking for multilingual AI or a researcher interested in collaboration, we'd love to hear from you.