
Cost of AI compute is a barrier to AI startups?

CMS Digitalbytes

10 July 2018

At the House of Lords AI evidence meeting yesterday there was a heated debate as to whether the cost of AI compute is a barrier to AI startups. A derivatives data company with a turnover in the millions expressed the view that accessing the supercomputing power needed to develop its products is too expensive.

A well-known consultancy firm with an AI development team pointed out that a lot can be done even with laptop computers. Even so, researchers from OpenAI have found that the amount of compute used in the largest AI models has doubled every 3.5 months since 2012, driven by the increased complexity of AI models and the nature of the data used. By comparison, the number of transistors per square inch on integrated circuits doubles only every 18 months.
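A back-of-the-envelope sketch shows why the gap between those two doubling rates matters. The calculation below is illustrative only: the 72-month window (roughly 2012 to mid-2018) is my assumption, not a figure from the session.

```python
# Illustrative comparison of two exponential growth rates.
# Assumption: a 72-month window, roughly 2012 to mid-2018.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Factor by which a quantity grows if it doubles every
    `doubling_period_months` months."""
    return 2 ** (months / doubling_period_months)

months = 72  # assumed window

ai_compute = growth_factor(months, 3.5)   # AI-compute doubling period (OpenAI)
transistors = growth_factor(months, 18.0)  # transistor-density doubling period

print(f"AI training compute: ~{ai_compute:,.0f}x growth over {months} months")
print(f"Transistor density:  ~{transistors:.0f}x growth over the same period")
```

Over six years, an 18-month doubling period yields only a 16-fold increase, while a 3.5-month doubling period compounds to a factor of well over a million, which is why demand for AI compute so quickly outruns improvements in general-purpose hardware.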

Lord Clement-Jones was asked whether the UK has a data centre policy that takes Brexit into account, and there was much discussion of AI hardware accelerators and whether they will be the solution to the increasing demand for AI compute.

The House of Lords was gathering evidence about infrastructure and AI, both in terms of how AI can help improve infrastructure in our country, such as water and power utilities, road networks, and cities, and in terms of what infrastructure is needed to enable our businesses to realise the potential of AI.

Will Cavendish from Arup spoke clearly about how AI improves asset management, the discovery of new materials, infrastructure design, flood risk management and other infrastructure-related problems. Antony Walker from techUK spoke about the Institution of Civil Engineers' Project 13, which includes the use of AI to aid infrastructure. Tomas Romero from Wipro and Marko Balabanovic from the Digital Catapult gave lots of detail about AI as a tool for growth of the economy.

"OpenAI yesterday released an analysis on the largest AI models since 2012. Researchers discovered that the amount of compute used for training doubles every 3.5 months. By comparison, the number of transistors per square inch on integrated circuits only doubles every 18 months, a.k.a Moore’s Law."

AI Doubling Its Compute Every 3.5 Months

The content above was originally posted on CMS DigitalBytes - CMS lawyers sharing comment and commentary on all things tech.

Authors

Dr. Rachel Free
Partner
London