Five Reasons Why Datacenters are Going Hyperscale
You're reading Entrepreneur India, an international franchise of Entrepreneur Media.
To start with, we need to understand what ‘Hyperscale’ means. Public cloud infrastructure and datacenters are generally very large in scale, but they are not always ‘Hyperscale’. Companies like Google, Microsoft and Amazon have redefined the meaning of ‘scale’ by creating enormous complexes covering hundreds of acres and housing hundreds of thousands of computing resources. These ‘Hyperscale’ datacenters have been able to provide unmatched scalability to globally consumed applications, often with hundreds of millions of users.
So what is driving the rapid growth of Hyperscale datacenters? Here are five major reasons.
1. Cloud & Big Data: The rapid evolution of cloud computing and Big Data has played a significant role. According to a recent Cisco survey, by 2021 datacenters will store 1.3 zettabytes of data, big data will consume 30% of datacenter storage, and cloud traffic will make up 95% of all datacenter operations worldwide. The survey also predicts that more than 50% of all datacenter servers will have Hyperscale architecture by 2021.
2. IoT and Consumer Devices: The growth of IoT and consumer ‘things’ has led to large amounts of unstructured data being generated every day. According to Statista, nearly 125,000 petabytes of consumer IP traffic are generated every month, a figure expected to double by 2021.
The growth of consumer-generated data is in addition to enormous data generated daily from millions of sensors, biometric devices, wearables, enterprise applications, industrial devices, geospatial applications, healthcare devices, gaming apps, etc. While powerful intelligence can be drawn from this data, the challenge lies in building scalable storage, compute and network resources to manage, transform and process this data. This is where Hyperscale datacenters give companies the much-needed performance, availability and scalability to implement advanced analytics (including AI, machine learning, predictive analytics) on top of consumer / IoT data.
3. Containerization: Containers, which make cloud workloads independent of the underlying platform or VM, are a relatively new concept, with tools like Docker and Kubernetes taking the lead in this space. Cloud-native computing enables the development of distributed, scale-out applications on the cloud. Together, containers and cloud-native architectures allow applications to scale massively, requiring huge amounts of compute and storage.
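To make the scale-out model concrete, here is a minimal sketch of a Kubernetes Deployment, the standard way such containerized workloads are declared. The service name, container image and resource figures are illustrative assumptions, not taken from the article:

```yaml
# Illustrative sketch only: a Kubernetes Deployment that runs a containerized
# workload as identical replicas, which the platform spreads across servers.
# The name "web" and image "nginx:1.25" are hypothetical examples.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3              # scale out horizontally by raising this number
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: web
        image: nginx:1.25  # any containerized service image
        resources:
          requests:
            cpu: "250m"    # per-replica compute request
            memory: "128Mi"
```

Because each replica is a self-contained container rather than a full VM, the orchestrator can schedule thousands of copies across a datacenter's machines, which is exactly the kind of elastic, high-density demand that Hyperscale facilities are built to absorb.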
4. Energy Efficiency: Hyperscale allows datacenters and cloud infrastructure to be designed for maximum power efficiency. In many cases, cloud infrastructure leaders such as Google, Amazon, Microsoft, and IBM have built their datacenters around renewable energy sources.
5. Cost Benefits: With Hyperscale, there are obvious, direct and significant improvements in performance and efficiency. Organizations that effectively leverage Hyperscale infrastructure for extremely high-density, high-volume business processes will see major benefits: capital cost savings, reduced support costs, savings on power and space, and lower administrative overhead.
At present, there are nearly 400 Hyperscale datacenters across the world, with the US accounting for almost half of them. Emerging markets like India are seeing a lot of demand, but account for less than 3% of all Hyperscale datacenters. This is a significant opportunity for leading cloud services and infrastructure providers like Netmagic and others to invest in Hyperscale datacenters and provide cost-effective, scalable and highly available services. Large organizations, especially those that handle consumer data (e.g. banking, healthcare, online retail, aviation), must leverage Hyperscale datacenters to mine massive data sets, build powerful insights and discover new opportunities to enhance business performance.