Multiverse Computing Releases Free Compressed HyperNova 60B Model, Challenging AI Giants
Spanish startup Multiverse Computing has released a compressed version of its HyperNova 60B AI model for free on Hugging Face, positioning the company as a challenger to larger U.S. and European AI providers. Using its quantum-inspired CompactifAI compression, Multiverse reduced the model to roughly 32GB, about 50% smaller than the gpt-oss-120B source model, while retaining about 95% of the original accuracy. The HyperNova 60B 2602 build claims 45% faster inference and a 60% smaller memory footprint than comparable models, along with improved multilingual support and enhanced tool-calling and agentic coding capabilities.

Multiverse says the model compares favorably on efficiency benchmarks with rivals such as Mistral AI, and it emphasizes European technological sovereignty, backed by regional government support and a €215 million Series B. The company is reportedly pursuing a €500 million funding round that could lift its valuation above €1.5 billion, and has claimed unconfirmed ARR of around €100 million. Its roadmap includes specialized industry models in late 2025, compression toolkits in early 2026, multimodal compressed models by Q3 2026, and further open-source releases throughout 2026. Market implications include broader access for SMEs, edge deployment possibilities, and adoption in regulated sectors where data sovereignty matters.

Keywords: compressed AI model, HyperNova 60B, CompactifAI, model compression, Hugging Face, European AI sovereignty.
Neutral
The release of a free compressed model lowers technical and cost barriers for AI deployment, which can indirectly benefit crypto projects and infrastructure that rely on AI tools (analytics, trading bots, on-chain data processing). For traders, the announcement is neutral overall: it does not directly alter token fundamentals or fiat liquidity, but it improves tooling and reduces costs for teams building crypto services.

Short term: modest positive sentiment for AI-enabled crypto services and developer communities, possibly increasing interest in related infrastructure tokens; no immediate price catalyst is expected for major cryptocurrencies.

Long term: widespread adoption of efficient, cheaper models could accelerate the development of AI-driven trading products, risk models, and decentralized applications, improving productivity and potentially driving demand for compute- or AI-infrastructure tokens.

Historical parallels: previous open-source model releases (e.g., LLaMA derivatives, Mistral releases) boosted developer activity and ecosystem tooling without directly driving major crypto market moves. The impact on crypto markets is therefore supportive for infrastructure and tooling niches but neutral for broad market direction.