Apertus 70B: Truly Open - Swiss LLM by ETH, EPFL and CSCS

35 denysvitali 6 9/2/2025, 8:14:51 PM huggingface.co ↗

Comments (6)

denysvitali · 5h ago
Report: https://github.com/swiss-ai/apertus-tech-report/raw/refs/hea...

Key features

Fully open model: open weights + open data + full training details including all data and training recipes

Massively Multilingual: 1811 natively supported languages

Compliant: Apertus is trained while respecting opt-out consent of data owners (even retrospectively), and avoiding memorization of training data

Bromeo · 4h ago
Looks like the performance is pretty decent, somewhere around Llama 3.1 for general knowledge (Table 17) but still a bit behind in Code and Reasoning (Table 18). Llama 3.1 was released about one year ago.
lastdong · 4h ago
In my opinion, we need more models trained on fully traceable and clean data instead of closed models that we later find out were trained on Reddit and Facebook discussion threads.
SilverElfin · 5h ago
Apparently a project of https://www.swiss-ai.org/
nickpsecurity · 5h ago
Upvoting to encourage discussion of these differentiators:

"Apertus is a 70B and 8B parameter language model designed to push the boundaries of fully-open multilingual and transparent models. The model supports over 1000 languages and long context, it uses only fully compliant and open training data, and achieves comparable performance to models trained behind closed doors."

"pretrained on 15T tokens with a staged curriculum of web, code and math data"

"open weights + open data + full training details including all data and training recipes"

"Apertus is trained while respecting opt-out consent of data owners (even retrospectively), and avoiding memorization of training data"

titaniumrain · 4h ago
seems DOA