This is an archive article published on June 19, 2023

Meet Orca, Microsoft’s new 13 billion parameter AI model that can imitate GPT-4

Orca, a 13-billion-parameter AI model, can imitate and learn from large language models like GPT-4.

Orca, a small language model (Credit: Bing Image generator)

Microsoft, in partnership with OpenAI, has been steadily implementing AI capabilities in its products and services while also building smaller, case-specific models. Microsoft Research has unveiled a new AI model called Orca, which learns by imitating large language models. According to the research paper, Orca is designed to overcome the limitations of smaller models by imitating the reasoning processes of large foundation models like GPT-4.

Language models like Orca can be optimized for specific tasks and trained with the help of large language models like GPT-4. Due to its smaller size, Orca requires fewer computing resources to run. Researchers can tailor such models to their own requirements and run them independently, without relying on a large data center.
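To give a rough sense of what "fewer computing resources" means in practice, a 13-billion-parameter model can be loaded and queried on a single workstation-class GPU setup with off-the-shelf tooling. The sketch below uses the Hugging Face transformers library; the checkpoint name is a placeholder assumption, since Orca's weights were not publicly released at the time this article was published.

```python
# Minimal sketch: running a ~13B-parameter model locally with Hugging Face transformers.
# "orca-like-13b" is a hypothetical checkpoint name, not an actual released model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "orca-like-13b"  # placeholder for a small instruction-tuned model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,   # half precision keeps a 13B model within a single large GPU
    device_map="auto",           # spread layers across available GPUs/CPU automatically
)

prompt = "Explain step by step why the sky appears blue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```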

According to the research paper, Orca is a 13-billion-parameter AI model built on Vicuna that can imitate and learn from large language models like GPT-4. It learns explanations, step-by-step thought processes, and other complex instructions from GPT-4, which is rumoured to have over one trillion parameters.
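The paper frames this as training the smaller "student" model on prompts paired with the larger model's step-by-step explanations, rather than on short answers alone. Below is a minimal sketch of that idea, assuming a collection of (instruction, GPT-4 explanation) pairs has already been gathered; the model name, sample data, and hyperparameters are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of explanation-based imitation: fine-tune a small causal LM on
# (instruction, teacher explanation) pairs collected from a larger model.
# Model name, data, and hyperparameters are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "a-small-open-llm"  # placeholder for a Vicuna-style student model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Imitation data: prompts plus the teacher model's step-by-step explanations.
pairs = [
    {"instruction": "Why does ice float on water?",
     "explanation": "Step 1: Water expands as it freezes, lowering its density. Step 2: ..."},
]

def to_training_text(pair):
    # The student is trained to reproduce the full explanation, not just a short answer.
    return f"### Instruction:\n{pair['instruction']}\n\n### Response:\n{pair['explanation']}"

encodings = [
    tokenizer(to_training_text(p), return_tensors="pt", truncation=True, max_length=1024)
    for p in pairs
]

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for enc in encodings:
        input_ids = enc["input_ids"]
        # Standard causal-LM objective: the model predicts each next token of the explanation.
        loss = model(input_ids=input_ids, labels=input_ids).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```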


Microsoft is utilising large-scale, diverse imitation data to promote progressive learning with Orca, which has already surpassed Vicuna by more than 100% on complex zero-shot reasoning benchmarks like Big-Bench Hard (BBH). The new model is also claimed to outperform conventional instruction-tuned AI models by 42% on AGIEval.

In terms of reasoning, despite being a smaller model, Orca is said to be on par with ChatGPT on benchmarks like BBH. Additionally, it demonstrates competitive performance on academic examinations such as SAT, LSAT, GRE, and GMAT, although it falls behind GPT-4.

The Microsoft research team states that Orca can learn from step-by-step explanations created by humans or by more advanced language models, and that this approach is expected to further improve the model's skills and capabilities.
