Cleveland, Aug. 14, 2023 (GLOBE NEWSWIRE) -- Michael Cantor, CIO at Park Place Technologies, is advising that emerging concerns are raising questions about the long-term viability of relying solely on cloud services for data and compute.
“In recent years, the public cloud has experienced an unprecedented surge in adoption, becoming the de facto compute solution for businesses across industries,” Cantor said in an article for TechRadarPro. “The potential for lower costs, increased flexibility, and seamless scalability propelled a rapid movement to cloud-based computing, with born-in-the-cloud digital natives and heritage brands alike migrating their workloads to leverage its transformative capabilities. This was accelerated further as the world grappled with the global pandemic, when the cloud emerged as a lifeline for remote work, digital collaboration, and business continuity.”
“However,” Cantor added, “the cloud may not be the answer to every problem and has some of its own issues that need resolution in any company that is utilizing the cloud.”
Cantor cited research revealing that cutting cloud costs has overtaken security as the main concern for businesses embracing the technology. He said many organizations have adopted the cloud without building a cost-management competency in their infrastructure groups, and the resulting rise in unmanaged expenses is driving the interest in cutting costs.
“The initial pull of lower costs and increased flexibility has been overshadowed by unexpected price components, such as the cost to extract data from the platform,” he said. “The increasing complexity of cloud infrastructure and the difficulty of predicting usage patterns have made it challenging for businesses to manage their cloud costs effectively.”
In response to these issues, many businesses are opting to repatriate some data sets to regain control over long-term costs. However, while repatriation can help companies save money, it can be a complex process that requires significant investment in infrastructure and expertise.
Is it time to repatriate your data?
Cantor lists several benefits of repatriation.
- “One frequently cited benefit is that by bringing data back in-house, companies can ensure better control over data, reduce the risk of data breaches, and meet compliance requirements more easily. This has to be evaluated on a case-by-case basis – these benefits may indeed accrue from having the infrastructure on-premises, or the location of the data may not be the root cause at all. Security management in the cloud is not the same as on-premises, so a lack of the required skill set in the security team may be the actual root issue.
- “Another driver of data repatriation right now is cost, which businesses have found can quickly spiral out of control and is increasingly difficult to budget and plan for effectively. A big reason for this is that cloud providers often charge extra fees for services such as data egress – the cost associated with moving data out of the cloud storage platforms where it is normally held. Firms have seen egress charges rise as they look to do more with their data, such as mining archives for business intelligence purposes or to train AI engines. But data egress is one of the costs of cloud computing that companies without a dedicated cloud team may not factor into projects, and these charges can quickly mount. In the most extreme cases, egress bill shock can make a cloud project so expensive that it’s no longer viable, significantly impacting a company’s bottom line.
- “Another factor driving data repatriation is data security and compliance. Many businesses operating in more regulated industries, such as finance, healthcare and telecoms, deal with sensitive data that must be stored and managed in a highly secure and compliant manner. While cloud providers offer robust security and compliance capabilities, some businesses may feel more comfortable managing their data in-house, where they have more control.
- “Finally, enhanced performance. Although the cloud provides theoretically limitless scalability, it still loses some speed due to internet connectivity and virtualisation overhead. For certain use cases or for higher scales of data, workloads, or concurrency requirements, faster performance is essential. Some real-time analytics workloads, like machine learning-based AI, can be sensitive to latency. Analytics applications can leverage caching and other network optimisation methods to reduce latency. However, one of the easiest and most pragmatic fixes is to shorten the communication path. This means that unless the data was born on the cloud, bring the analytics back in-house.”
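To illustrate how the per-gigabyte egress fees described above can mount as data use grows, here is a minimal sketch of a tiered cost estimator. The tier sizes and rates are hypothetical placeholders, not any provider's actual pricing:

```python
# Rough egress-cost estimator. The tiered rates below are
# HYPOTHETICAL placeholders, not any cloud provider's real pricing.
TIERS = [
    (10_240, 0.09),        # first ~10 TB (in GB) at an assumed $0.09/GB
    (40_960, 0.085),       # next ~40 TB at an assumed $0.085/GB
    (float("inf"), 0.07),  # everything beyond, at an assumed $0.07/GB
]

def monthly_egress_cost(gb_moved: float) -> float:
    """Estimate the monthly egress bill for gb_moved gigabytes."""
    cost, remaining = 0.0, gb_moved
    for tier_size, rate in TIERS:
        portion = min(remaining, tier_size)
        cost += portion * rate
        remaining -= portion
        if remaining <= 0:
            break
    return round(cost, 2)

# Pulling a 50 TB archive out of cloud storage each month,
# e.g. for BI mining or AI training:
print(monthly_egress_cost(50 * 1024))  # → 4403.2
```

Even at these modest assumed rates, a recurring 50 TB monthly extraction adds thousands of dollars per month – the kind of line item that, per Cantor, teams without a dedicated cloud competency often fail to budget for.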
Addressing the challenges head on
While Cantor said it’s true there are many benefits to repatriation, the reality is that it’s not without its challenges, which may put it out of reach for some small and medium-sized businesses.
“It can be a complex and time-consuming process, requiring businesses to assess their data needs, migrate their data, and ensure that, once back on premises, it is properly secured and managed, including backup and disaster recovery,” Cantor said. “Another challenge of data repatriation is the potential impact on business agility. Cloud infrastructure is highly flexible and scalable, allowing businesses to quickly spin up new resources and adjust capacity as needed. By bringing their data back on-premises, businesses may lose some of this flexibility, which could impact their ability to respond quickly to changing business needs.”
Ultimately, Cantor concludes, “As with any major infrastructure decision, repatriation is not one that businesses should take lightly. They need to carefully evaluate their options, weigh the costs and benefits, and develop a clear strategy that aligns with their business objectives and regulatory requirements. This may involve working with cloud providers to identify the best workloads to bring back in-house, assessing their data management and security capabilities, and implementing the right tools and processes to manage their data effectively.”
About Michael Cantor and Park Place Technologies
As Chief Information Officer, Michael leads the delivery of technology initiatives to improve Park Place’s internal and customer-facing capabilities while ensuring the globalization and security of Park Place’s systems as the company continues to expand.
Michael comes to Park Place with over 25 years of experience in the IT industry. Most recently Michael served as CIO for Cardinal Health at Home, where he was responsible for all segment-specific IT initiatives and the creation of a new digital front-end for the B2C portion of the business. Prior to Cardinal Health, Michael was the CIO at Matrix Medical Network, based in Scottsdale, Arizona. At Matrix, Michael drove the digital transformation of the business through the growth of Matrix’s IT capability and implementation of key IT projects that enabled the company to grow revenue at a 3x rate over three years.
Michael is a Georgia State University alumnus with a BBA in Computer Information Systems.
Park Place Technologies is a global data center and networking optimization firm. We help more than 21,000 clients optimize data center budgets, productivity, performance, and sustainability so they can think bigger – and act faster. From procurement to decommissioning, our comprehensive portfolio of services and products helps IT teams optimize IT lifecycle management. This frees time and spend so they can focus on transforming their businesses for the future.
Park Place’s industry-leading and award-winning services portfolio includes Park Place Hardware Maintenance™, Park Place Professional Services™, ParkView Managed Services™, Entuity Software™ and Curvature Hardware sales. For more information, visit www.parkplacetechnologies.com. Park Place is a portfolio company of Charlesbank Capital Partners and GTCR.