
“Distillation” Is Shaking Up The AI Industry


Copyright: Airam Dato-on

 

Paradigm Shift

We’ve recently written about AI advancements and their surging popularity, particularly generative AI like ChatGPT, driving demand for data centers at levels not seen in decades.  That surge pushed tech investors to put $39.6 billion into data center development in 2024, twelve times the amount invested in 2016.

A recent development, however, has stirred things up, especially the assumption that billions of dollars must be spent to advance AI.  DeepSeek, an open-source large language model developed by a Chinese AI research lab, was released and performed on par with OpenAI’s models, yet it reportedly operates for just a fraction of the cost of Western AI models.  OpenAI, however, is investigating whether DeepSeek used distillation of OpenAI’s models to develop its systems.

Copyright: cottonbro studio

 

What Is “Distillation”?

According to Labelbox, model distillation (or knowledge distillation) is a machine learning technique for transferring knowledge from a large model to a smaller one.  Distillation bridges the gap between the computational demand and cost of training large models and the need to maintain performance.  In essence, the large model first learns from an enormous amount of raw data, a process that typically takes months of training and a huge sum of money, then passes that knowledge on to a smaller counterpart that can be deployed in real-world applications for far less time and money.
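The transfer described above can be illustrated with a minimal sketch of the classic soft-label distillation loss: the student model is trained to match the teacher’s temperature-softened output distribution rather than hard labels. The logits and temperature below are illustrative assumptions, not values from any real model, and this is a simplified sketch of one common formulation, not a description of how DeepSeek or OpenAI train their systems.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between the teacher's softened outputs (the 'soft
    labels') and the student's: the core training signal in knowledge
    distillation. In practice this is often scaled by T**2 and mixed
    with a standard cross-entropy term on ground-truth labels."""
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's predictions
    return float(np.sum(p * np.log(p / q)))

# Hypothetical logits over three classes.
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
print(distillation_loss(teacher, student))  # small positive number
```

A perfectly matched student drives this loss to zero; during training, the student’s weights are updated by gradient descent to minimize it, which is how the teacher’s “dark knowledge” about the relative plausibility of wrong answers gets compressed into the smaller model.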

Distillation has been around for some time and has been used by other AI developers, but not with the same degree of success as DeepSeek.  The Chinese developer has said that, aside from its own models, it also distilled from open-source models released by Meta Platforms and Alibaba.

OpenAI’s terms of service, however, prohibit using its models to develop competing applications.  While OpenAI has banned accounts suspected of distillation during its investigation, US President Donald Trump’s AI czar, David Sacks, called out DeepSeek for distilling from OpenAI’s models.  Sacks added that US AI companies should take measures to protect their models or make them more difficult to distill.

Copyright: Darlene Anderson

 

How Does Distillation Affect AI Investments?

On the back of DeepSeek’s success, distillation might give tech giants cause to reexamine their business models, and investors cause to question the sums they pour into AI advancement.  Is it worth being a pioneer or industry leader when the same results can be replicated by smaller rivals at lower cost?  Can an advantage still exist for tech companies that seek huge investments to blaze a trail when others are quick to follow and build upon the leader’s achievements?

A recent Wall Street Journal article notes that tech executives expect distillation to produce more high-quality models.  The same article mentions Anthropic CEO Dario Amodei blogging that DeepSeek’s R1 model “is not a unique breakthrough or something that fundamentally changes the economics” of advanced AI systems.  This is an expected development as the costs of AI operations continue to fall and models move toward open source.

Perhaps that’s where the advantage for tech leaders and investors lies: the opportunity to break new ground, and the understanding that you’re seeking answers in unexplored spaces while the rest limit themselves to iterating within the same technological confines.  Established tech giants continue to enjoy the prestige of their AI models being more widely used in Silicon Valley, despite DeepSeek’s economic advantage, and the expectation of being first to bring new advancements and developments to the digital world.

And maybe, just maybe, in that space between the pursuit of new AI breakthroughs and lower-cost AI models lie solutions to help meet the increasing demand for data centers and computing power.   

Copyright: panumas nikhomkhai

 

Featured Image Copyright: Matheus Bertelli
Top Image Copyright: Airam Dato-on

 

 
