Samsung takes on OpenAI and Google with tiny, 7 million-parameter AI model

The new model from Samsung AI Labs, named the Tiny Recursive Model (TRM), has taken on models that are thousands of times bigger than it is.

The biggest takeaway from TRM is that instead of building bigger models, one can build smarter, more efficient ones that think recursively. (Image: FreePik)

It is generally accepted that when it comes to large language models (LLMs), the bigger, the better. This is the key reason why the likes of OpenAI, Google, and Anthropic have been pouring billions of dollars into their AI models. That view was first shaken by the arrival of Chinese AI startup DeepSeek, which claimed to have built a superlative AI model at a fraction of the cost spent by big tech. Now, another entrant seems to have stunned the tech world. Samsung, with a small team at its AI lab in Montreal, has introduced the Tiny Recursive Model (TRM), which runs contrary to the view that performance scales with more parameters.

Reportedly, Samsung’s TRM is 10,000x smaller than most prominent AI models. The model has only seven million parameters, yet it has shown reasoning results on par with systems thousands of times bigger. The makers have released it as open source on GitHub, along with a companion paper on arXiv. Samsung describes it as a model built around a recursive process that allows the network to improve its answers over time.

Despite its small parameter count, TRM has reportedly outdone DeepSeek R1 (671 billion parameters), Gemini 2.5 Pro, and o3-mini on ARC-AGI, a benchmark developed to test genuine reasoning. In terms of performance, TRM obtained 44.6 per cent accuracy on ARC-AGI-1; 7.8 per cent on ARC-AGI-2, a benchmark where most LLMs score under 5 per cent; and 87 per cent on Sudoku-Extreme, a benchmark designed to assess human-like problem-solving. Interestingly, all of this comes from a model that can be run on a laptop.

What is a recursive approach to model building?

Instead of building a massive model with a huge network, Samsung has built TRM around recursion. In simple words, recursion here means repeatedly asking: “Is my answer good? If not, can I make it better?” The makers claim that TRM offers an answer, looks back at it, and then refines it. The model does this a few times until it is satisfied with its output.
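To make the idea concrete, here is a minimal, self-contained Python sketch of that answer-check-refine loop. The toy task (nudging a guess towards a target value) and the function names are illustrative stand-ins chosen for this article, not code from Samsung’s release.

```python
# Illustrative sketch of the "propose, look back, refine" loop described above.
# The toy task and the function names are hypothetical, not Samsung's actual code.

def propose_answer(target):
    # First rough attempt: start from zero.
    return 0.0

def good_enough(target, answer, tol=1e-3):
    # "Is my answer good?" -- here, simply: is it close enough to the target?
    return abs(target - answer) < tol

def refine(target, answer):
    # "If not, can I make it better?" -- move partway towards the target.
    return answer + 0.5 * (target - answer)

def solve_recursively(target, max_rounds=20):
    answer = propose_answer(target)
    for _ in range(max_rounds):
        if good_enough(target, answer):
            break
        answer = refine(target, answer)
    return answer

print(solve_recursively(7.0))  # converges towards 7.0 through repeated refinement
```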

According to the research paper, an older system named the Hierarchical Reasoning Model (HRM) also attempted recursive reasoning. However, it was complicated: it used two networks working at different frequencies, depended on heavy mathematical assumptions and biological analogies, and was hard to interpret. TRM simplifies this process, as it uses only one small network, does not rely on complex theories, and learns solely from data by trial and improvement.
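As a rough illustration of that difference, the sketch below shows what a single small network, reused both to update a latent “scratchpad” and to revise the answer, might look like in PyTorch. The dimensions, module names, and loop counts are assumptions made for this example, not details taken from the released TRM code.

```python
import torch
import torch.nn as nn

class TinyRecursiveSketch(nn.Module):
    """Loose sketch of one small, shared network used recursively.

    This is an assumption-laden illustration, not Samsung's implementation:
    a single network alternates between refreshing a latent state z (the
    model's "scratchpad") and revising the current answer y.
    """

    def __init__(self, dim=64):
        super().__init__()
        # One small network, in contrast to HRM's two networks.
        self.net = nn.Sequential(
            nn.Linear(3 * dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x, refine_steps=3, inner_steps=6):
        y = torch.zeros_like(x)  # current answer
        z = torch.zeros_like(x)  # latent reasoning state
        for _ in range(refine_steps):
            for _ in range(inner_steps):
                # Re-read the question and current answer, update the scratchpad.
                z = self.net(torch.cat([x, y, z], dim=-1))
            # Use the improved scratchpad to revise the answer.
            y = y + self.net(torch.cat([x, y, z], dim=-1))
        return y

model = TinyRecursiveSketch()
x = torch.randn(1, 64)   # a toy "question" embedding
print(model(x).shape)    # torch.Size([1, 64])
```

The point of the sketch is the reuse: one tiny network applied over and over to improve its own answer, rather than a larger or more elaborate architecture.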

The biggest takeaway here is that instead of building bigger models, one can build smarter, more efficient ones that think recursively. TRM shows that reasoning can come from repetition and refinement, not just from size or power.
