Many organizations are moving away from public cloud data storage toward software-defined hybrid cloud environments. The software-defined data center (SDDC) is the new norm: infrastructure is virtualized and delivered as a service, with configurations managed by intelligent software rather than defined by individual hardware devices.
At the SDDC Symposium earlier this month, Scality’s General Manager Greg DiFraia spoke about the performance and flexibility of data storage built to take advantage of these types of private, third-party, and hybrid clouds.
While we’re still seeing traditional workloads like user files, NAS, and client-server applications, said DiFraia, newer data workloads, such as mobile web apps, artificial intelligence, machine learning, and big data analytics, are growing ten times as fast.
Today’s customers need the choice and flexibility to decide where and why workloads should live, said DiFraia, with no barriers to extending those services. “This is something that, again, is coming up in the majority of our customer conversations today,” he said. “[Customers are saying], I’ve got transformation, I have traditional workloads, I’ve got a traditional data center, I’ve got multi-cloud. How can you help us really provide a platform and a strategy that help us execute at enterprise-class scale at a low cost to serve?”
At the same time, said DiFraia, we’re seeing the emergence of edge computing, where processing and analysis happen closer to where the data is generated. “These next-gen applications and distributed workloads are everywhere,” he said.
“So we’re seeing things like Azure Stack, we’re seeing investment in Stack Edge with Microsoft, we’re seeing Outposts with AWS,” DiFraia continued. What Scality does, then, is give its customers a common storage approach across all of these environments, whether on-premises, third-party, or public cloud, by providing a single global namespace.
This gives customers a common platform, ubiquitous across locations, so they can access and analyze their data in any way, from anywhere. “Because we’re talking about hundreds or even thousands of applications,” said DiFraia. “We’re talking about multiple sites, multiple clouds, and I need to be able to do this at, again, large scale … with enterprise-class service.”
Scality offers this level of reliable, low-cost data storage with Scality RING, its flagship software platform: distributed file and object storage software that runs on standard x86 hardware. “Again, [it] can live in a data center or across multiple data centers, in either an active[-active] or an active-passive manner,” said DiFraia.
Zenko, he said, is Scality’s multi-cloud controller software, which extends that storage namespace seamlessly across third-party storage as well as the customer’s clouds of choice. “We’re also doing this in a way that does not provide lock-in,” said DiFraia. “We do not manipulate any of the data.”
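To make the namespace idea concrete: Zenko exposes an S3-compatible API, so a standard S3 client can write into the unified namespace regardless of which cloud ultimately backs the bucket. The snippet below is a minimal sketch using boto3; the endpoint URL, credentials, and location name are hypothetical placeholders rather than Scality-published values, and the available location names depend on how an administrator has configured the deployment.

```python
import boto3

# Point a standard S3 client at a Zenko endpoint instead of AWS itself.
# The endpoint URL and credentials here are placeholders for illustration.
s3 = boto3.client(
    "s3",
    endpoint_url="https://zenko.example.com",   # hypothetical Zenko endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# A bucket's location constraint can map it to a backing cloud location.
# The name "aws-us-east-1" is assumed; an administrator defines these.
s3.create_bucket(
    Bucket="analytics-data",
    CreateBucketConfiguration={"LocationConstraint": "aws-us-east-1"},
)

# Applications keep making plain S3 calls; the controller decides where the
# data physically lives.
s3.put_object(Bucket="analytics-data", Key="logs/2019-10-01.json", Body=b"{}")
```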
The ability to store data across providers also lets companies choose the most cost-effective services. This kind of intelligent data management makes it practical to renegotiate a cloud contract with whichever provider offers the better deal. “[For example], if I was writing data to AWS and to Azure, and we’ll say I’m redoing my ELA, and that ELA allowed me to renegotiate my cloud contract with Azure, and we’ll say that Azure now provides me with a 30% benefit over AWS,” said DiFraia.
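In the renegotiation scenario DiFraia sketches, a shared namespace would let data drift toward the cheaper backend without touching application code. Here is a hedged continuation of the sketch above, assuming an admin-defined Azure-backed location named "azure-east" and that the deployment allows server-side copies between locations (otherwise a simple get/put loop would accomplish the same thing):

```python
# Continuing the sketch above: copy objects from the AWS-backed bucket into a
# bucket whose (assumed) location constraint points at Azure Blob storage.
s3.create_bucket(
    Bucket="analytics-data-azure",
    CreateBucketConfiguration={"LocationConstraint": "azure-east"},  # assumed name
)

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="analytics-data"):
    for obj in page.get("Contents", []):
        # Copy through the S3 API; the application code and the namespace stay
        # the same, only the backing cloud changes.
        s3.copy_object(
            Bucket="analytics-data-azure",
            Key=obj["Key"],
            CopySource={"Bucket": "analytics-data", "Key": obj["Key"]},
        )
```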
Scality is focused on unstructured data, DiFraia said in closing. “We’re focused on doing this not only in the data center but across multiple clouds and driving this in a very programmatic way. Our goal is really to give freedom and control to people who create value with data, right? We want to help you store it. We want to help you protect it. We’re going to help you distribute that information, but we also want to provide value from that information.”
Be sure to watch the full video, embedded below, to learn more about Scality’s perspective on the multi-cloud world: