Is The Future Of AI Around the Corner? Computing Power Suppliers Are Shifting Gears
You're reading Entrepreneur South Africa, an international franchise of Entrepreneur Media.
Artificial Intelligence (AI) is changing everything from the content of our Facebook feeds to the diagnosis and treatment of medical conditions.
According to McKinsey, it has the potential to create an additional $13 trillion in global economic output by 2030. Governments and start-ups alike are scrambling to ensure they are in a position to enjoy the economic benefits that AI will bring.
However, despite the apparent potential, there’s one significant bottleneck — the supply of computational power required to develop and drive AI products and solutions.
At present, cloud-based providers of computing resources are striving to keep up with the pace of development in power-hungry AI. Here, we look at the challenges faced by cloud-based computing, and some potential solutions.
Challenge 1 – Supply and Demand
AI relies on data, and lots of it. As such, the computational demand of AI is growing rapidly — one report estimates that the amount of compute used by AI currently has a doubling time of roughly three and a half months.
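To put that doubling time in perspective, a quick compound-growth calculation (using the roughly three-and-a-half-month figure cited above as an assumption) shows why providers are struggling to keep pace:

```python
# Rough compound-growth sketch: if compute demand doubles every
# 3.5 months (the figure cited above), it grows by 2^(12/3.5),
# i.e. roughly tenfold, every year.
doubling_months = 3.5  # assumption taken from the report cited above
yearly_factor = 2 ** (12 / doubling_months)
print(f"~{yearly_factor:.1f}x growth per year")  # ~10.8x
```

In other words, demand multiplies by more than ten every year, far faster than data centre capacity has historically been built out.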
As things stand, AI developers are dependent on computing capacity from the $247 billion cloud computing industry. This industry is dominated by four corporate IT firms – Amazon, Microsoft, Google, and IBM, collectively known as the ‘Big Four’. These companies rely on their vast centralised data centres to keep the world's cloud computing services running.
In an attempt to meet the growing demand for AI computing services, investment in data centres is also growing at a startling pace. Computing firms spent $27 billion in the first quarter of 2018 alone, with most of that expenditure thought to be directed into developing data centres. Compare this with the $74 billion spent across the whole of 2017, and the pace of growth is evident.
The question is, how long can the computing firms continue to keep up with demand using the traditional data centre model?
In an attempt to stem demand, the big IT firms are raising their prices.
With AI services requiring massive computational power before they can go to market, the rising cost of computing risks stifling innovation, particularly for smaller developers.
Challenge 2 – Environmental Sustainability
If the only way to meet demand is to build more data centres, then this means more electricity-hungry machines. It’s reported that the data centre industry accounts for 2% of all CO2 emissions globally, more than the airline industry.
Just this month, the US Department of Energy reported that data centres account for around 2% of the country’s overall energy consumption.
While owners are investigating green energy alternatives, the fact remains that more data centres will result in higher energy consumption.
Challenge 3 – A single point of failure
Amazon famously brought down a number of large websites last year when an employee accidentally took more servers offline than intended. That event sparked a domino effect that was felt globally.
It’s natural that single points of failure raise the risk of an incident having a more substantial impact. With cloud data provided by just four companies from a limited number of data centres, that risk is always present.
A Quantum of Solace?
Chinese marketplace Alibaba is also in the business of cloud computing, albeit not yet one of the ‘Big Four’. However, its representatives have clearly stated that the company has market leader Amazon firmly in its sights.
Earlier this year, Alibaba launched its first cloud quantum computer, capable of processing 11 quantum bits (qubits). A classical bit can only hold one value, 0 or 1, at any given moment. A qubit can exist in a superposition of both states at once, so a register of n qubits can represent 2^n states simultaneously — which is why even modest qubit counts promise enormous processing capacity.
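A rough way to see why qubit counts matter: simulating an n-qubit state on a classical machine means tracking on the order of 2^n amplitudes, a number that explodes quickly. A minimal sketch:

```python
# The classical memory needed to describe an n-qubit quantum state
# grows as 2^n amplitudes, doubling with every qubit added.
for n in (11, 30, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes")
# 11 qubits -> 2,048 amplitudes
# 30 qubits -> 1,073,741,824 amplitudes
# 50 qubits -> 1,125,899,906,842,624 amplitudes
```

So while Alibaba’s 11-qubit machine is still small, each additional qubit doubles the state space it can explore.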
Alibaba has pledged to continue development in this area, having already invested $15.5 billion at the end of last year. IBM is also firmly invested in quantum computing, having launched its own quantum computer last year. Quantum computers could ultimately do away with the need for centralised data centres.
From Cloud Computing to Distributed Computing
While some commentators have predicted a wait of five to ten years for quantum computing solutions to cope with demand, a few start-ups are working to meet the requirements of cloud computing on a shorter timeline. One start-up has devised a scalable solution, which it says will be up and running as early as 2019.
Tatau, which calls itself the “Uber of Computing,” has designed a platform that essentially creates a global supercomputer by harnessing existing, under-used GPU computing capacity.
By utilising a resource that already exists, the company claims it can offer cloud computing that is cheaper, more environmentally friendly, and more scalable than current solutions. Moreover, a decentralised model doesn’t have a single point of failure, reducing the risk of downtime or hacks.
Tatau’s decentralised network taps into the computing capacity that sits outside of data centres. The company has designed a blockchain-driven marketplace where owners of GPU hardware can sell under-utilised computing power to buyers. By utilising latent capacity, this solution gives hardware owners better returns on their investment, and gives AI developers access to reliable, cost-effective compute that was previously out of reach.
The Future for AI Development
Demand for AI services is only growing, and the computing sector must find a way to meet it. Given the challenges inherent in the current data centre model, it seems likely that quantum or distributed computing could ultimately take off.
The question will be, are the current ‘Big Four’ market players ready to compete on a different playing field?