Hot Take: Data Center Trends at Gartner IOCS
I just returned from my fourth trip to Las Vegas this year. While each has been an amazing experience (starting with taking customers to see U2:UV powered by WEKA at Sphere), Vegas can wear on you. But the Gartner IT Infrastructure, Operations & Cloud Strategies Conference (a mouthful, so henceforth just “IOCS”) was an incredibly energizing and informative way to end my Vegas mini-residency in 2023.
If you aren’t familiar with this conference, it focuses on what is driving cutting-edge trends and strategies in IT, both on-premises and in the cloud. It is designed for IT professionals to get valuable insights from industry experts through sessions and workshops, offering practical guidance for optimizing business operations in a dynamic digital landscape.
IOCS had over 5,000 attendees this year, and while it is not a high-volume conference, it is focused on quality – the quality of interactions with customers and prospects and the quality of the information shared by analysts. Below are some takeaways from the many quality interactions I had at the show.
The plethora of conversations we had in WEKA’s booth and around the conference underscored that last year’s reticence to start new large IT projects has well and truly passed. We heard from organizations of all sizes about the new initiatives they are kicking off or already have underway, with AI, analytics, and big data being major focus areas.
The timing for many of them seems to be “now.” In fact, more than once a day someone would ask if we could kick off a PoC with them as soon as possible! There was also a noticeably stronger federal presence this year compared to last, highlighting that this investment trend also extends to the U.S. Federal Government.
GenAI was a hot topic in conversations across the show and in multiple sessions, including WEKA’s own roundtable discussion on how to understand and manage its diverse IO patterns.
While some attendees had active GenAI projects, most were still learning, exploring, and searching for best practices. The session covering the brand new Hype Cycle for Generative AI was packed with attendees looking to understand how disruptive GenAI will be and what they can do to harness it.
Some key takeaways from this session delivered by lead author Arun Chandrasekaran are:
GenAI is rapidly evolving – and chaotic. Arun noted that practitioners need to be prepared for the fact that “the models you will use in six months or next year will not be the same as the one you use today.” My takeaway: be flexible with model selection for your initial projects – choose what works now, but be ready to change as your needs and the models evolve.
Infrastructure is core to AI success. Arun reinforced that “the biggest gating factor for the growth of AI today is infrastructure” – everything from the shortage of compute to train models on to the surrounding infrastructure.
At WEKA, we are seeing this from our customers as well, where many are concerned about getting the most out of their GPUs by driving down idle times and reducing data stalls to make the most efficient use of these high-value resources.
AI models are only as good as the data they are trained on. Gathering enough quality data to power GenAI continues to be challenging. Arun pointed out that “while AI models have been evolving at a pretty rapid pace, enterprise data engineering is unfortunately still stuck in the 20th century.”
This is reflected in what we’re seeing at WEKA: the emergence of data platforms as an effective tool to help organizations adapt to our new AI-driven, digital world by enabling data to flow more readily to the resources that need it, like GPUs.
Nearly everyone I talked to was actively evaluating AI projects and/or working on pilots. What was interesting to me was how focused they were on the success of those pilot efforts – they were looking for significant performance even in small deployments.
As one analyst confirmed for me, “Often the order is to do the pilot and then build a plan.” Some customers were frustrated that certain vendors would not engage with them unless their projects involved multiple petabytes of data. To be successful, vendors will need to deliver on small projects and then be able to scale easily and rapidly as the bigger AI project plan takes shape.
The fact that modern workflows span numerous locations came up repeatedly. Customers cited workloads covering a variety of configurations – data center to cloud, edge to cloud, and edge to edge – and noted that these are becoming increasingly common. They gave many reasons for these configurations, from managing data locality to the belief that certain portions of workloads require more security than they could get in the cloud. In the WEKA booth, attendees asked for help implementing modern data architectures and global namespaces to enable more of these workflows.
Gartner research supports the importance of full hybrid cloud deployments, predicting that by 2027, 60% of IT leaders will implement hybrid cloud file deployments. And to help organizations make sense of the market, Gartner has launched a new Magic Quadrant for Distributed Hybrid Infrastructure.
This is just a quick overview of some of the quality conversations we had across the three days in Vegas; there were many more than I have the time or space to summarize. The show covered a wide range of topics, including edge computing, security, optimizing costs and value, diversity, equity, and inclusion, and sustainability.
Hope to see you there next year!