Here, we go through the challenges faced by cloud-based computing, plus some potential solutions.
You’re reading Entrepreneur South Africa, a global franchise of Entrepreneur Media.
Artificial Intelligence (AI) is changing everything, from the information we see in our Facebook feeds to the diagnosis and treatment of medical conditions.
According to McKinsey, it has the potential to add another $13 trillion to global economic output by 2030. Governments and start-ups alike are scrambling to make sure they can capture the economic benefits that AI provides.
However, despite this apparent potential, there's one significant bottleneck – the supply of computational power necessary to develop and run AI products and services.
At the moment, cloud-based providers of computing resources are struggling to keep up with the pace of development in power-hungry AI.
Challenge 1 – Supply and Demand
AI depends on data, and a lot of it. As such, the computational demand of AI keeps growing – one report estimates that the amount of compute used by AI currently has a doubling time of just three and a half months.
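To put that figure in perspective, a quick back-of-the-envelope calculation (a sketch, assuming the 3.5-month doubling time simply holds steady) shows how fast such exponential growth compounds:

```python
# Back-of-the-envelope: how fast does demand grow if the compute
# used by AI doubles every 3.5 months?

DOUBLING_TIME_MONTHS = 3.5  # figure cited in the report above

def growth_factor(months: float) -> float:
    """Multiplier on compute demand after the given number of months."""
    return 2 ** (months / DOUBLING_TIME_MONTHS)

for years in (1, 2, 5):
    factor = growth_factor(years * 12)
    print(f"After {years} year(s): ~{factor:,.0f}x the compute")
```

At that rate, demand multiplies by roughly eleven every year – which is why data-centre spending figures like those below keep climbing.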
As things stand, AI developers rely on computing capacity from the $247 billion cloud computing industry. This industry is dominated by four corporate IT firms – Amazon, Microsoft, Google, and IBM – collectively referred to as the 'Big Four'. These businesses depend on their vast centralised data centres to keep the world's cloud computing services running.
In an effort to meet the growing demand for AI computing services, investment in data centres is also growing at a startling pace. Computing firms spent $27 billion in the first quarter of 2018 alone, with the majority of that expenditure believed to have been directed into developing data centres. Compare this with the $74 billion spent across the whole of 2017, and the pace of growth is evident.
The question is, how long can the computing firms keep up with demand using the traditional data-centre model?
In an attempt to stem demand, the big IT firms are raising their prices.
With AI services requiring massive computational power before they can go to market, the rising cost of computing risks stifling innovation, particularly for smaller developers.
Challenge 2 – Environmental Sustainability
If the only way to meet demand is to build more data centres, then this means more electricity-hungry machines. It's reported that 2% of all global CO2 emissions come from the data-centre industry – more than from the airline industry.
Just this month, the United States Department of Energy reported that data centres in the country accounted for about 2% of overall energy consumption.
While owners are investigating green energy alternatives, the fact remains that more data centres will lead to higher energy consumption.
Challenge 3 – A Single Point of Failure
Amazon famously brought down several large websites last year when an employee accidentally took more servers offline than intended. That event sparked a domino effect that was felt globally.
It's natural that single points of failure increase the risk of an incident having a far more substantial impact. With cloud services supplied by just four companies from a limited number of data centres, that risk is always present.
A Quantum of Solace?
Chinese marketplace Alibaba is also in the business of cloud computing, albeit not yet one of the 'Big Four'. However, its representatives have clearly stated that the company has market leader Amazon firmly in its sights.
Earlier this year, Alibaba launched its first cloud quantum computer, capable of processing 11 quantum bits (qubits). A conventional computer chip is binary, meaning each bit can hold only a value of 0 or 1 at any given moment. A qubit can exist in a superposition of both at once, meaning even a modest number of qubits can represent an enormous number of states simultaneously.
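To illustrate why qubit counts matter (a simplified sketch – real quantum speed-ups depend on the algorithm, not raw state counts), an n-qubit register is described by 2^n amplitudes, so even an 11-qubit machine spans thousands of basis states:

```python
# Simplified illustration: an n-qubit register is described by
# 2**n complex amplitudes, versus a single value for n classical bits.

def state_space(n_qubits: int) -> int:
    """Number of basis states an n-qubit register can superpose."""
    return 2 ** n_qubits

print(state_space(11))  # Alibaba's 11-qubit machine: 2048 basis states
print(state_space(50))  # often cited as the scale where classical
                        # simulation becomes impractical
```

Each additional qubit doubles the state space, which is why growth in qubit counts is watched so closely.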
Alibaba has pledged to continue development in this area, having already invested $15.5 billion by the end of last year. IBM is also firmly committed to quantum computing, having launched its own quantum computer last year. Quantum computers could ultimately eliminate the need for centralised data centres.
From Cloud Computing to Distributed Computing
Although some commentators have predicted a wait of five to ten years before quantum computing solutions can cope with demand, a few start-ups are working to meet the requirements of cloud computing on a shorter timeline. One start-up has devised a scalable solution, which it says will be ready to go as soon as 2019.
Tatau, which calls itself the "Uber of Computing," has designed a platform that essentially creates a global supercomputer by harnessing existing GPU computing capacity.
By utilising a resource that already exists, the company claims it can offer cloud computing that's cheaper, greener, and more scalable than current solutions. Moreover, a decentralised model has no single point of failure, reducing the risk of downtime or hacks.
Tatau's decentralised network taps into the computing capacity that sits outside data centres. The company has designed a blockchain-driven marketplace where owners of GPU hardware can sell under-utilised computing capacity to buyers. By utilising this latent capacity, the solution offers hardware owners better returns on their investment, and gives AI developers access to reliable, cost-effective compute that was previously unavailable.
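The marketplace idea can be sketched in a few lines. This is a purely hypothetical illustration of matching a compute buyer with GPU sellers by price – the `Offer` structure and matching rule are assumptions for the example, not Tatau's actual protocol:

```python
# Hypothetical sketch of a compute marketplace: fill a buy order
# from the cheapest GPU offers first. Illustrative only.

from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    gpu_hours: float       # idle capacity on offer
    price_per_hour: float  # asking price in some unit of account

def match_order(offers: list[Offer], hours_needed: float) -> list[tuple[str, float]]:
    """Allocate the requested hours, taking the cheapest offers first."""
    allocation = []
    for offer in sorted(offers, key=lambda o: o.price_per_hour):
        if hours_needed <= 0:
            break
        take = min(offer.gpu_hours, hours_needed)
        allocation.append((offer.seller, take))
        hours_needed -= take
    return allocation

offers = [
    Offer("rig-a", 10, 0.30),
    Offer("rig-b", 50, 0.25),
    Offer("rig-c", 20, 0.40),
]
print(match_order(offers, 55))  # cheapest capacity is consumed first
```

In a real blockchain-driven marketplace, the matching, payment, and proof that work was actually done would all need on-chain or cryptographic enforcement; this sketch covers only the allocation step.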
The Future of AI Development
Given the growing demand for AI services, the computing sector must find a way to meet the need for computing capacity. Given the challenges inherent in today's data-centre model, it seems likely that quantum or distributed computing could ultimately take off.
The question is: are the current 'Big Four' market players prepared to compete on a different playing field?