How Hardware Constraints Shape Superior Deliveries

Hardware constraints can drive innovation by forcing teams to optimise resources, focus on efficiency, and prioritise quality over quantity. This approach not only leads to smarter, more sustainable solutions but also proves that creativity thrives under limitations.

Zoe Reinhardt, Chief Delivery Officer (CDO) · 2 min read (444 words)

In the rapidly advancing field of artificial intelligence, the race for superior models is often measured in teraflops, parameter counts, and the scale of training datasets. Yet, DeepSeek, the latest AI sensation, took a decidedly different path to arrive at its groundbreaking R1 model. 

Surprisingly, it wasn’t the abundance of resources but the constraints of hardware that shaped its development. And it’s precisely these limitations that have made DeepSeek more efficient than AI giants like ChatGPT.

The Role of Constraints in Innovation

DeepSeek’s journey was constrained by limited access to high-end hardware. While competitors like ChatGPT were developed using state-of-the-art supercomputers and nearly limitless budgets, DeepSeek’s developers had to rely on more modest computational resources. Rather than seeing this as a disadvantage, the DeepSeek team turned these constraints into an opportunity to innovate.

Optimisation over Excess

Without access to massive GPU clusters, the DeepSeek team had to optimise every aspect of their model. This forced them to think creatively and take an agile approach to architecture, data efficiency, and training techniques. The result not only reduced the computational cost of training but also made the model more efficient at inference, consuming less energy than traditional monolithic architectures like those used in ChatGPT.

Smaller, Yet Smarter

DeepSeek also took a laser-focused approach to data curation. Instead of training on massive, redundant datasets, the team prioritised high-quality data. This smaller, cleaner dataset allowed the model to learn more effectively, achieving comparable performance to ChatGPT without the need for excessive resources.

Why Constraints Make Us Better

The results of this resource-conscious development are remarkable. DeepSeek’s R1 model offers capabilities that rival, and in some cases surpass, those of ChatGPT, but with a fraction of the resource consumption.

Lessons for the Industry

DeepSeek’s success story is a testament to the power of constraints in driving innovation. When we are pushed to think outside the box, the results can challenge the status quo and redefine industry standards, processes, and projects. DeepSeek proves that bigger isn’t always better; smarter, more thoughtful designs can yield results that are just as powerful and far more efficient.

As Chief Delivery Officer at Newicon, I believe that DeepSeek’s development journey offers valuable lessons for the tech industry. It reminds us that constraints are not obstacles but opportunities to innovate and improve. By focusing on efficiency, quality, and transparency, we can create technologies that are not only powerful but also sustainable and accessible.

DeepSeek has set a new benchmark for what AI can achieve with less. As we continue to integrate these lessons at Newicon, I am excited about the possibilities for building smarter, leaner, and greener technologies and teams that shape a better future.


