One of Europe’s most prominent AI startups has released two AI models so small that it has named them after the brains of a fly and a chicken.
Multiverse Computing claims they are the smallest high-performing models in the world, able to handle chat, speech, and, in one case, even reasoning.
The purpose of these new tiny models is to be embedded in Internet of Things devices, as well as to run locally on smartphones, tablets, and PCs.
“We can compress the model so much that they can fit on devices,” founder Román Orús told TechCrunch. “You can run them on premises, directly on your iPhone, or on your Apple Watch.”
As we’ve previously covered, Multiverse Computing is a well-funded European AI startup with around 100 employees in offices around the world. It was co-founded by Román Orús, a top European professor of physics and quantum computing; quantum computing specialist Samuel Mugel; and Enrique Lizaso Olmos, a former deputy CEO of Unnim Bank.
It raised €189 million (approximately $215 million) in June, largely, it says, on the strength of its model compression technique, CompactifAI. (Since it was established in 2019, it has raised around $250 million, Orús said.)
CompactifAI is a quantum-inspired compression algorithm that shrinks existing AI models without sacrificing their performance, according to Orús.
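Multiverse has not published CompactifAI’s internals; “quantum-inspired” usually refers to tensor-network factorizations of weight matrices. As a minimal sketch of the underlying idea, and not the company’s actual algorithm, here is a truncated-SVD (low-rank) compression of a single dense layer in NumPy. The matrix size, the rank, and the spectral decay are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense layer's weights: 512 x 512 = 262,144 parameters.
n = 512
W = rng.standard_normal((n, n)) / np.sqrt(n)

# Trained weights often have fast-decaying spectra; emulate that
# here by damping the smaller singular values.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
s *= np.exp(-np.arange(n) / 20.0)
W = (U * s) @ Vt

# Compress: keep only the top-r singular triplets.
r = 64
A = U[:, :r] * s[:r]   # n x r factor
B = Vt[:r, :]          # r x n factor
compressed_params = A.size + B.size  # 4x fewer parameters than W

# The layer now computes x @ A @ B instead of x @ W.
x = rng.standard_normal((1, n))
rel_err = np.linalg.norm(x @ W - x @ A @ B) / np.linalg.norm(x @ W)
print(compressed_params, W.size, float(rel_err))
```

Real tensor-network methods factor weights into chains of small tensors (matrix product operators) and typically fine-tune afterward to recover accuracy, but the trade-off sketched here is the same: far fewer parameters in exchange for a small approximation error.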
“We have a compression technique that is not the typical compression technique that people from computer science or machine learning use, because we come from quantum physics,” he said. “It’s a more subtle and more sophisticated compression algorithm.”
The company has already released a long list of compressed versions of open source models, especially popular small models such as Llama 4 Scout and Mistral Small 3.1. It just launched compressed versions of OpenAI’s two newly released open models. It has also slimmed down some very large models, offering a DeepSeek R1 Slim, for example.
But since it is in the business of making models small, it has put extra focus on making the smallest possible models that still perform well.
Its two new models are so small that they can bring chat AI capabilities to just about any IoT device, and they work without an internet connection, the company says. It jokingly calls this family its model zoo, because it names the products after animal brains of comparable size.
One model it calls SuperFly, a slimmed-down version of Hugging Face’s open source model SmolLM2-135. The original has 135 million parameters and was developed for on-device use. SuperFly has 94 million parameters, which Orús likens to the size of a fly’s brain. “It’s like a fly, but a little smarter,” he said.
SuperFly is designed to be trained on very restricted data, such as a device’s own operations. Multiverse envisions it embedded in home appliances, allowing users to operate them with voice commands such as “start quick wash” for a washing machine. Users could also ask troubleshooting questions. With very little processing power (an Arduino-class chip, for example), the model can handle a voice interface, as the company showed TechCrunch in a live demo.
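Multiverse hasn’t described how the appliance demo works internally, but it illustrates why a tightly restricted command domain needs so little model capacity. As an illustrative toy, and not Multiverse’s system, here is a keyword-overlap intent router with invented intent names:

```python
# Toy command router for a washing-machine voice interface.
# Illustrative only: the intents and keywords are invented.

INTENTS = {
    "quick_wash": {"start", "quick", "wash"},
    "delicate_cycle": {"delicate", "cycle"},
    "stop": {"stop", "cancel"},
}

def route(utterance: str) -> str:
    """Pick the intent whose keywords overlap the utterance most."""
    words = set(utterance.lower().split())
    best = max(INTENTS, key=lambda intent: len(INTENTS[intent] & words))
    # Fall back to "unknown" when nothing matched at all.
    return best if INTENTS[best] & words else "unknown"

print(route("start a quick wash"))       # quick_wash
print(route("please stop the machine"))  # stop
```

A real on-device model replaces the hand-written keyword sets with learned speech understanding, but the output space stays just as narrow, which is what lets the model itself stay tiny.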
The other model is named ChickBrain. At 3.2 billion parameters it is much larger, but it is also far more capable and has reasoning capabilities. It is a slimmed-down version of Meta’s Llama 3.1 8B model, Multiverse says. Still, it is small enough to run on a MacBook, with no internet connection required.
Even more importantly, Orús said, ChickBrain actually slightly outperforms the original on several standard benchmarks, including the language-skills benchmark MMLU-Pro, the math benchmarks MATH 500 and GSM8K, and the general-knowledge benchmark GPQA Diamond.
Here are the results of Multiverse’s internal tests on those benchmarks. The company did not offer benchmark results for SuperFly, but it is not targeting SuperFly at use cases that require reasoning.

It is important to note that Multiverse is not claiming its model zoo will beat the largest state-of-the-art models on such benchmarks; the zoo’s performance wouldn’t even land it on a leaderboard. The point is that its technique can shrink a model’s size without a performance hit, the company says.
Orús says the company is already in talks with all the major device and appliance makers. “We are talking with Apple. We are talking with Samsung, Sony, and also with HP, obviously. HP came in as an investor in the last round,” he said. That round was led by the European VC firm Bullhound Capital, with HP Tech Ventures and several others participating.
The startup also provides compression for other forms of machine learning, such as image recognition, and counts BASF, Ally, Moody’s, and Bosch among its customers.
In addition to selling its models directly to major device manufacturers, Multiverse offers its compressed models to any developer through an API hosted on AWS, often at lower per-token fees than competitors charge.

