IBM Brings Choice Back Into Hybrid Clouds
- By Winston Thomas
- September 20, 2021

Hybrid cloud now has a new meaning. A decade ago, it was a necessary compromise for companies that couldn’t shed their legacy baggage but wanted a lifeline to the cloud-native world.
But this was before containerization took off in a big way and edge computing became more than a propeller head’s fad. Before local regulators wanted your most sensitive data kept onshore, in the same country where you operate. And before companies realized that having a cloud player’s data center onshore is not just an advantage but a necessity.
The data latency argument is also growing louder. IoT and machine learning applications demand that data be nearby for fast processing and immediate insights. Likewise, smart manufacturing, smart city, and autonomous vehicle applications assume the data is always close at hand, not in a cloud data center thousands of miles away.
“Customers are facing data residency and data governance issues, but would also like to leverage cloud services, like AI and machine learning, and so on. Today, they struggle to do that because those services are largely available on a [public] cloud,” says Raymond Wong, general manager for ASEAN, IBM Cloud.
The new case for hybrid cloud
Data residency and latency challenges have ushered in a new era of edge IT and micro data centers. Some processing now happens right next to where the data is created. But this also means the hybrid cloud no longer follows the traditional hub-and-spoke configuration and can be far more complicated. And with Gartner’s Top Strategic Technology Trends for 2021 report predicting that by 2025, 75% of enterprise-generated data will be created and processed outside traditional centralized data centers, it will only get more complex.
Data workflows are also changing the way we architect infrastructure. In the past, companies may have been fine with data egressing and ingressing through public cloud data centers. But with the pandemic-led growth in data, many realized they needed to bring the cloud to the data.
“The other pain point is the cost of moving data can be prohibitive. So what you will see is that many enterprises are limiting themselves to the possibilities,” Wong adds.
Companies are beginning to realize the need for multicloud deployments. What started as an expensive means to avoid vendor lock-in and remain compliant is now a necessary approach for companies operating in multiple locations with different data privacy laws and different cloud application vendors.
Lastly, data democratization and the rise of Kubernetes have made managing data and applications across clouds a complex affair. DevOps teams are adopting infrastructure as code (IaC), which lets developers manage cloud data center resources programmatically, but it also makes providing a consistent infrastructure across environments vital, as the sketch below illustrates.
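To make the IaC idea concrete, here is a minimal sketch in Python of the declare-then-reconcile pattern that IaC tools follow: the desired infrastructure is written down as data, and a plan step computes what must change to bring the actual environment in line. The resource names and fields here are hypothetical, invented purely for illustration; this is not any particular tool’s API.

```python
# Minimal illustration of infrastructure as code (IaC): the desired state
# is declared as plain data, and a "plan" step computes the actions needed
# to reconcile the real environment toward it. All names are hypothetical.

desired = {
    "web-cluster": {"type": "kubernetes", "nodes": 3, "region": "ap-southeast-1"},
    "orders-db":   {"type": "database",   "nodes": 1, "region": "ap-southeast-1"},
}

actual = {
    "web-cluster": {"type": "kubernetes", "nodes": 2, "region": "ap-southeast-1"},
    "old-cache":   {"type": "cache",      "nodes": 1, "region": "ap-southeast-1"},
}

def plan(desired, actual):
    """Return the actions that would make 'actual' match 'desired'."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(f"create {name}: {spec}")
        elif actual[name] != spec:
            actions.append(f"update {name}: {actual[name]} -> {spec}")
    for name in actual:
        if name not in desired:
            actions.append(f"delete {name}")
    return actions

for action in plan(desired, actual):
    print(action)
```

Because the declaration, not a series of manual steps, is the source of truth, every environment built from it comes out the same, which is the consistency the paragraph above alludes to.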
How IBM Cloud Satellite changes the value proposition
A thought leadership paper by the IBM Institute for Business Value, “Next-generation hybrid cloud empowers next-generation business,” noted that only 20% of workloads have moved to the cloud, even though 90% of companies claim to have embraced it. The remaining 80% of workloads, often the mission-critical ones, stay on-premises.
Why? For one thing, every company’s hybrid cloud journey is unique. While the challenges may be similar, the path to hybrid depends on a company’s business needs and current infrastructure, notes Wong.
Many also fear losing control over their data, especially across clouds and between edge applications. Fears over data portability likewise see many companies keeping their mission-critical data and apps on-premises, even if it means sacrificing agility.
“This is what IBM Cloud Satellite looks to address,” says Wong.
IBM Cloud Satellite offers a distributed cloud answer: it brings IBM Cloud services, such as security controls, compliance, and cloud management, to any location, whether another public cloud, an on-premises data center, or the edge. Think of it as bringing the cloud to where the data resides.
Wong notes that this approach makes sense in today’s DevOps environment. “So, for example, we've won deals most recently where the customers have a distributed development team, some sitting in India, some in the Philippines, some in Singapore, and some in China. All of them are taking care of different projects, and each decides what they want to use. Managing all these individually can be challenging.”
Another advantage is that companies can now run workloads where they matter most. Think of AI and ML applications that need to run near where data is created to process it quickly and develop real-time insights.
“With Cloud Satellite, they can now put workloads in a data center of choice, in a public cloud, at the edge, or the customers' data centers,” says Wong.
This workload portability lets companies explore new AI-driven use cases and businesses, such as smart manufacturing, autonomous vehicles, disease management, and smart city applications.
“So that's the beauty of IBM Cloud Satellite, right? What we need is just an x86 instance that is running Red Hat Enterprise Linux. So where can you find this? Practically everywhere. So if the client has, for example, a lock-in contract for infrastructure with one cloud provider but wants to use IBM Cloud services, he or she can deploy cloud services like Red Hat OpenShift on IBM Cloud and any application that supports it, via IBM Cloud Satellite onto those contracted infrastructures,” Wong explains.
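Wong’s point is that the entry requirement is deliberately modest. Purely as an illustration, and not IBM’s actual onboarding tooling, here is a hedged Python sketch of the kind of pre-flight check an operator might run on a candidate host to confirm it matches the profile he describes: x86 hardware running Red Hat Enterprise Linux.

```python
# Illustrative pre-flight check for a candidate host, based on Wong's
# description: an x86 instance running Red Hat Enterprise Linux.
# This is a sketch, not IBM's actual tooling; the real host requirements
# (CPU, RAM, disk, network) are documented by IBM Cloud.
import platform

def read_os_release(path="/etc/os-release"):
    """Parse the standard Linux os-release file into a dict."""
    info = {}
    try:
        with open(path) as f:
            for line in f:
                if "=" in line:
                    key, _, value = line.strip().partition("=")
                    info[key] = value.strip('"')
    except FileNotFoundError:
        pass  # not a Linux host, or os-release is missing
    return info

def looks_ready():
    os_info = read_os_release()
    is_x86 = platform.machine() in ("x86_64", "amd64")
    is_rhel = os_info.get("ID") == "rhel"
    print(f"architecture: {platform.machine()} (x86: {is_x86})")
    print(f"os: {os_info.get('PRETTY_NAME', 'unknown')} (RHEL: {is_rhel})")
    return is_x86 and is_rhel

if __name__ == "__main__":
    if looks_ready():
        print("candidate host matches the basic profile")
    else:
        print("host does not meet the basic profile")
```

In practice, hosts that meet the requirements are attached to a Satellite location through IBM’s own tooling, after which services such as Red Hat OpenShift on IBM Cloud can be deployed onto them, as Wong describes.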
And this advantage will only grow as agility and infrastructure flexibility become more important in an innovation-driven market.
“Essentially, we've tried to empower the customer so that they have a choice,” Wong concludes.
Winston Thomas
Winston Thomas is the editor-in-chief of CDOTrends. He likes to piece together the weird and wonderful tech puzzle for readers and identify groundbreaking business models led by tech while waiting for the singularity.