Measuring Storage TCO: Have We Got It the Wrong Way Around?
- By Winston Thomas
- November 08, 2023
For a long time, storage has been a TCO game. Along with its cousin ROI, TCO helps you determine how much you need to invest. It is also a measure that CFOs, CIOs or COOs understand.
The challenge with TCO in a data-driven world is that storage needs are growing unpredictably fast, and companies feel the impact well beyond the numbers.
For example, few would have thought about the impact of giving quick access to cold data sitting on tape. But that is exactly what today’s AI models demand.
Storage also impacts other business goals. Take sustainability, for example. Where and how you store your bits and bytes directly impacts your energy usage and carbon footprint in more ways than one.
So, should we still rely on TCO when deciding what storage to procure?
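For context, a conventional storage TCO calculation boils down to upfront capital cost plus a few years of operating costs. The sketch below is a minimal, hypothetical model of that arithmetic (the function, parameter names and every figure are illustrative assumptions, not any vendor's actual pricing), and it is exactly the kind of number the rest of this article argues is incomplete.

```python
# A minimal sketch of a conventional storage TCO estimate.
# All figures are hypothetical and purely illustrative.

def storage_tco(capex, annual_power_kwh, power_cost_per_kwh,
                annual_support, annual_admin_hours, admin_hourly_rate,
                years=5):
    """Total cost of ownership over a planning horizon."""
    annual_opex = (annual_power_kwh * power_cost_per_kwh
                   + annual_support
                   + annual_admin_hours * admin_hourly_rate)
    return capex + annual_opex * years

# Example: a 1PB array amortized over five years (made-up numbers).
tco = storage_tco(capex=500_000, annual_power_kwh=40_000,
                  power_cost_per_kwh=0.20, annual_support=50_000,
                  annual_admin_hours=300, admin_hourly_rate=80, years=5)
print(f"5-year TCO: ${tco:,.0f}")              # 5-year TCO: $910,000
print(f"Cost per usable TB: ${tco / 1000:,.0f}")
```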
How data is impacting storage
To answer this question, Mark Jobbins, vice president and field chief technology officer for Asia Pacific & Japan at Pure Storage, feels we must first shift attitudes about TCO.
“I mean, we do talk to customers that are past focusing on that dollar value and performance stats. But as you rightly asked me, the question [of TCO] is only part of the picture,” he explains.
Beyond numbers, Jobbins sees companies starting to look at the flexibility and simplicity of the underlying infrastructure that supports their digitalization efforts as part of the calculations.
"Because you need to be able to respond in a far more agile way. So, those basic numbers on dollars per performance only form part of the picture," he adds.
Nor will a TCO number show how companies can untangle the data silos that make managing storage onerous. Addressing this is becoming more urgent as more applications and AI models need to ingest data scattered across disparate locations.
Jobbins urges companies to simplify their storage infrastructure, not just the data layer. “So, having a simple platform where we can hold that data shared between the applications becomes even more important.”
Data latency is another major factor that decision-makers miss in TCO calculations.
Users and customers have no patience for delays or downtime, and keeping essential data online around the clock is becoming the typical ask. But meeting it means higher maintenance, investment in failover, and keeping physical storage closer to customers.
"Fundamentally, we're now living in a world where people don't accept downtime or accepting downtime is incredibly hard. And it has a huge impact on the business, from a productivity and service perspective and an expectation perspective," says Jobbins.
Consolidating data silos and reducing data latency is not easy, especially when many companies must first overcome substantial technical debt built up over the years. But it is a journey they will ultimately have to undertake, says Jobbins.
GenAI and sustainability muddy the picture
Two new developments are throwing what Jobbins describes as “curveballs.”
First is GenAI. The demand is new, and companies are only now getting to grips with its storage requirements. It does not help that companies see potential rewards in GenAI but need time and resources to experiment and define the right use cases.
One major storage issue with GenAI lies in model training, which becomes more critical because the technology can pass off hallucinations as fact. "You're only going to get meaningful responses if you've fed enough information into the model initially," says Jobbins.
It also puts pressure on storage administrators. We are no longer talking about cold, low-use or unused data traditionally parked on tape or other slower but cheaper media. That data now needs to be available for intensive AI model training and retraining (to avoid model drift), which in turn changes the TCO picture.
"The opportunities [of GenAI] are staggering, but so are challenges associated with something like generative AI around the computational needs for running the applications," says Jobbins.
At the opposite end lies the cost of sustainability non-compliance. Companies track their carbon footprint against sustainability goals, while government regulators and investors scrutinize them for compliance.
The exploding growth of data in the past few years means storage hardware can have a material impact on the company's top and bottom lines for the long term. This is often not captured easily in TCO calculations beyond immediate energy savings.
“Being able to run data centers from a power cooling and space perspective adds a whole new dimension for organizations,” says Jobbins.
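To make that concrete, here is a hypothetical extension of the earlier TCO sketch that folds in power, cooling overhead (PUE), a carbon price and floor space. Every figure is an illustrative assumption, not vendor or regulator data, but it shows how a lower-capex option can end up costing more once energy and sustainability terms enter the calculation.

```python
# An illustrative extension of the earlier TCO sketch that folds in power,
# cooling overhead (PUE), carbon pricing and floor space.
# Every figure below is a hypothetical assumption, not vendor data.

def sustainability_adjusted_tco(base_tco, it_power_kwh_per_year, pue,
                                electricity_cost_per_kwh,
                                grid_kg_co2e_per_kwh, carbon_price_per_tonne,
                                rack_units, cost_per_rack_unit_per_year,
                                years=5):
    facility_kwh = it_power_kwh_per_year * pue            # IT load plus cooling overhead
    energy_cost = facility_kwh * electricity_cost_per_kwh * years
    carbon_cost = (facility_kwh * grid_kg_co2e_per_kwh / 1000   # tonnes CO2e per year
                   * carbon_price_per_tonne * years)
    space_cost = rack_units * cost_per_rack_unit_per_year * years
    return base_tco + energy_cost + carbon_cost + space_cost

# Hypothetical comparison: a dense all-flash array vs. an HDD estate of equal capacity.
flash = sustainability_adjusted_tco(base_tco=600_000, it_power_kwh_per_year=30_000,
                                    pue=1.4, electricity_cost_per_kwh=0.20,
                                    grid_kg_co2e_per_kwh=0.4, carbon_price_per_tonne=80,
                                    rack_units=6, cost_per_rack_unit_per_year=2_000)
hdd = sustainability_adjusted_tco(base_tco=400_000, it_power_kwh_per_year=90_000,
                                  pue=1.6, electricity_cost_per_kwh=0.20,
                                  grid_kg_co2e_per_kwh=0.4, carbon_price_per_tonne=80,
                                  rack_units=30, cost_per_rack_unit_per_year=2_000)
print(f"All-flash, 5-year adjusted TCO: ${flash:,.0f}")   # roughly $708,720
print(f"HDD estate, 5-year adjusted TCO: ${hdd:,.0f}")    # roughly $867,040
```

In this made-up scenario, the option with the higher sticker price ends up cheaper over five years once electricity, carbon and rack space are counted, which is precisely the dimension a narrow TCO figure misses.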
Thinking differently about storage
While many of these issues have only recently crept into Board of Directors (BoD) discussions, the solutions are already here.
For example, the cost of flash has come down significantly. Flash storage promises high-speed responses, bringing latency down to microseconds. And because flash is non-volatile, the data remains even when the power is off.
Pure Storage, which has staked its entire future on all-flash, sees its flash products differently. Besides adding a 75TB DirectFlash Module (DFM), it also plans to introduce a 300TB module.
A "smaller" 1PB FlashArray//E configuration and FlashArray//X and FlashArray//C R4 update with Intel Sapphire Rapids controllers offer a significant performance alternative for workloads requiring “cheaper” hard drives with less storage.
All these products put high performance at better prices within reach of companies and data center operators looking to build high-density configurations.
However, TCO calculations miss how these modules are being built and managed. For example, Pure DFMs fail far less frequently than HDDs and SSDs, explains Jobbins.
This means lower operating costs and lets companies reallocate budget to other investments. The design also reduces power and cooling needs, helping data center operators work within increasingly strict constraints on expansion, such as Singapore's moratorium.
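As a rough illustration of why failure rates feed into operating costs, the sketch below turns an annualized failure rate (AFR) into expected drive replacements over a service life. The drive counts, AFR values and costs are hypothetical assumptions for illustration only, not measured figures for DFMs, HDDs or SSDs.

```python
# A hypothetical back-of-the-envelope comparison of expected drive replacements
# over a service life, driven by an annualized failure rate (AFR).
# All counts and costs below are illustrative assumptions, not published figures.

def expected_replacement_cost(drive_count, afr, years,
                              swap_labour_cost, rebuild_risk_cost):
    """Expected number of failures multiplied by the cost of handling each one."""
    expected_failures = drive_count * afr * years
    return expected_failures * (swap_labour_cost + rebuild_risk_cost)

hdd_estate = expected_replacement_cost(drive_count=400, afr=0.015, years=5,
                                       swap_labour_cost=150, rebuild_risk_cost=500)
flash_modules = expected_replacement_cost(drive_count=40, afr=0.003, years=5,
                                          swap_labour_cost=150, rebuild_risk_cost=500)
print(f"HDD estate, expected replacement cost:   ${hdd_estate:,.0f}")     # $19,500
print(f"Flash modules, expected replacement cost: ${flash_modules:,.0f}") # $390
```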
Ultimately, these developments allow IT decision-makers "to think differently" about storage. Instead of relying purely on the TCO figure, they can look at storage from a long-term planning point of view.
"If you're still stuck on a traditional storage platform, you're going to hit a roadblock very quickly [when it comes to long-term planning]. The tech debt is just going to blow out even further. So thinking differently becomes more important,” says Jobbins.
Pure is also changing the business model for buying storage. For example, its Evergreen//One Storage as a Service (STaaS) and Evergreen//Flex come with the industry's first paid power and rack commitment, along with a power and space efficiency guarantee. No Data Migration and Zero Data Loss guarantees also allow companies to mitigate costs from unplanned outages.
It means you can keep upgrading the hardware and continue to meet new data demands without forklift upgrades or data migrations. No more repurchasing or relicensing headaches, and you can focus on the business case.
“People want a little bit more choice,” says Jobbins. He notes that companies are becoming very conscious about the performance profiles of their data, the issues with data sovereignty, and increasing data egress costs.
"Quite often, they want to get [their data] on-premises or in a private cloud environment where they get all the cloud-like benefits, such as non-disruptive upgrades, continual innovation, etc.," explains Jobbins.
He notes that Pure Storage's subscription plans "gives them a consumption-based model to perhaps start small but scale quickly if need be."
Storage security will matter more than TCO soon
A new area that is becoming more significant than TCO figures is security.
For some time, the discussion about security stayed within data circles. However, protecting this valuable asset is now becoming a concern for storage administrators. After all, they now sit on the front lines against malware and threats that target backups, as many ransomware strains do.
With threats growing in number and sophistication, there is a higher chance an attacker will eventually succeed. When the security game switches from "if" to "when," fast data restoration becomes a critical lifeline.
"When I started, we used to backup data to tape. It took a long time to do, and honestly, we sort of prayed that you never needed to restore because it would be complex," says Jobbins.
Having lots of replicated data spread across locations and silos further complicates restoration. A corrupted copy may also introduce biases and errors if it is fed into a machine learning model.
“The whole mindset around actually protecting the data and being able to recover that data has fundamentally changed. You can't have outages for days and weeks. Businesses just simply won’t survive,” says Jobbins.
This is why the company introduced the first STaaS Ransomware Recovery SLA in its Evergreen//One subscription. It guarantees clean arrays, recovery plans, data transfer and onsite staff.
“It's one of those things I just don't think people focus on enough,” says Jobbins.
So, are TCO figures still important? As Jobbins says, they are a start, but decision-makers need to know there's more behind those numbers that may matter more to business success.
Winston Thomas is the editor-in-chief of CDOTrends. He's a singularity believer, a blockchain enthusiast, and believes we already live in a metaverse. You can reach him at [email protected].
Image credit: iStockphoto/photosaint