At the SDDC Virtual Symposium, Scality’s Greg DiFraia, Cumulus Networks’ Partho Mishra, Mellanox Technologies’ John Kim, HPE’s Chris Tinker, and moderator Hal Woods (Datera) participated in a panel discussion called the Mega Talk. The conversation focused on “digital transformation” and how enterprises like the companies represented are helping usher in a new era of software defined data centers.
First up, Scality’s DiFraia spoke about how his organization is looking to help its customers store and analyze the overwhelming amount of unstructured data out there these days.
Datera’s Woods agreed, saying, “It’s all about the data. That’s a common thing. We hear that all the time. And there’s an explosion in data and much of it’s unstructured.”
DiFraia pointed out that while he sees a doubling in the amount of traditional workloads (NAS, server/client applications, data protection, backup, etc.), there’s an even more massive tsunami of new data and applications coming.
“These are all these cloud native, … AI, ML-driven, web, social mobile platforms that are massively distributed and have massive scale but also have tremendous value to the business,” he said. “What we see is when we talk to IT, they’re actually [being] asked to keep the business running with traditional applications while innovat[ing] and extend[ing] these new services.”
Scality has been tackling this problem by delegating workloads to various nearby regions while using a common API and metadata service to preserve flexibility in how customers access, analyze, and use their data.
“We’re a piece of the puzzle, and I think this is when you start to get into the software defined use cases and deliverables,” he said. “There’s block, there’s object, there’s multicloud, there’s hardware, there’s networking, there’s distribution, there’s orchestration, there’s eventing, there’s runbooks—there’s a lot of things.”
Working together with these customers is the best answer when it comes to this digital transformation. “Because if we execute well on unstructured data,” DiFraia said, “but we don’t help connect the dots on the network, it doesn’t solve the problem. And I think when we’re talking about partnerships and how really valuable they are for customers, that’s essentially what for me, from an execution perspective, is absolutely critical.”
Cumulus Networks’ Mishra agreed. His customers look to his company to provide them with networking APIs that they can use in the overall process, without relying on human intervention, which can introduce mistakes.
“So that, for example, if you have a VM that migrates across the network, you don’t have to have someone then go in and manually stitch up a VLAN appropriately to mirror its movement,” he said. “And we find that where that hook is missing, that’s where we see more mistakes. So I think it’s going to be extremely important for us as a technology industry to be able to provide that kind of end to end orchestration capability across all of our components.”
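In spirit, the hook Mishra describes is an event handler that re-provisions the network when a VM moves. The sketch below is purely illustrative—every name in it is hypothetical, not a real Cumulus Networks API—but it shows the shape of the automation: the VLAN follows the VM to its new switch port, and the stale entry is cleaned up, with no human in the loop.

```python
# Hypothetical sketch of automated VLAN "stitching" on VM migration.
# All class and function names here are illustrative assumptions,
# not any vendor's actual API.

from dataclasses import dataclass, field

@dataclass
class Switch:
    """Toy model of a top-of-rack switch's port-to-VLAN table."""
    name: str
    port_vlans: dict = field(default_factory=dict)  # port name -> VLAN id

    def assign_vlan(self, port: str, vlan: int) -> None:
        self.port_vlans[port] = vlan

    def clear_port(self, port: str) -> None:
        self.port_vlans.pop(port, None)

def on_vm_migrated(vm_vlan: int,
                   old: Switch, old_port: str,
                   new: Switch, new_port: str) -> None:
    """Event handler: mirror the VM's VLAN to its new location and
    remove the stale entry -- the work a human would otherwise do by hand."""
    new.assign_vlan(new_port, vm_vlan)
    old.clear_port(old_port)

# Example: a VM on VLAN 42 migrates from leaf1 port swp3 to leaf2 port swp7.
leaf1 = Switch("leaf1", {"swp3": 42})
leaf2 = Switch("leaf2")
on_vm_migrated(42, leaf1, "swp3", leaf2, "swp7")
print(leaf2.port_vlans)  # {'swp7': 42}
print(leaf1.port_vlans)  # {}
```

The point of wiring this as an event handler, rather than a runbook step, is exactly the one Mishra makes: when the hook is missing, a person has to remember to do the stitching, and that is where mistakes creep in.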
There’s still room for people, of course, as Mellanox Technologies’ John Kim noted. “I think the humans still want to make the high level rules; they want to set the policy.”
Datera’s Hal Woods asked Chris Tinker (HPE) a little bit about InfoSight, an AI engine that helps manage all the information involved in new software defined systems.
“In any large data center,” said Woods, “there’s a lot of equipment, it’s generating arguably millions of points of telemetry a day. In our system, we do that analytics, we try to feed that back to the customer, but we’re one piece of that puzzle. We feed into higher level tools, that sort of thing.”
Tinker responded that partnerships are key, here, as well.
“We can’t make everything; we work with our partners [on] best-in-breed technologies to bring the solution to market,” he said.
InfoSight takes HPE’s storage and server portfolio, collects the actual telemetry data, and then feeds that into a global learning apparatus. This lets the company do several key things, said Tinker.
All of the data is then put into HPE’s API-driven framework that can be put into an automatic process to better manage customer needs.
All this automation, said Woods, lets customers decouple the purchasing of the software from the purchasing of the hardware.
While companies like Cumulus Networks initially thought they might just charge customers based on consumption, said Mishra, they ended up with a subscription model since many customers were still looking at costs closely associated with hardware.
In addition, Mishra continued, the support his customers expect requires an incredibly tight degree of partnership between the software company and its hardware partners.
“I would say that one of the learnings for these last several years has been our ability to put those processes together to make it such that for the customers, the experience is not very different than it would be in a traditional model,” he said.
Ultimately, the panel wanted to make it clear that the software defined data center isn’t coming soon; it’s already here.
“It’s not something two years from now; it’s something you can deploy today and realize that value. It gives you the developer velocity, the operational agility, automation, performance, flexibility, scale, [and] all of those things that have been that promise of software defined that we’ve lived [with] for many years, but all of the pieces are coming together and partners like HPE are able to deliver that value.”
Be sure to watch the entire, wide-ranging panel discussion: