Mission Critical Computing: 8 Reasons Why CIOs Should Keep Data On-Premises

Chief Information Officers have bought into the promise of the cloud and implemented a “Cloud First” strategy.

Lower total cost for data. More security. Less staffing headaches.

Research and advisory firm Gartner projects end-user spending on public cloud services to grow 20.7% to a total of $591.8 billion in 2023.

But all the hype appears to be fading.

Since 2021, IT decision makers have been reversing course and repatriating their organizations’ data back on premises. A survey by 451 Research found that 48% of 600 companies had migrated data back from AWS, Microsoft Azure or Google Cloud Platform to another location. In total, 86% opted to move those IT operations back to their own data centers.

Most (96%) of the 139 IT decision makers who repatriated their apps and workloads said cost efficiency was the #1 benefit of their efforts, according to a Dell survey. Four in ten (40%) listed security and compliance concerns as the main reason for bringing their data back on premises.

“I learned a long time ago not to fall blindly in love with any technology, including cloud computing,” said David Linthicum, a Cloud Strategy Officer at Deloitte, who writes for InfoWorld.

In a recent article, “2023 Could Be the Year of Public Cloud Repatriation,” Linthicum challenges the strong pro-cloud bias in the marketplace. “The cloud bills are higher than expected,” he writes, “because lifted-and-shifted applications can’t take advantage of native capabilities such as auto-scaling, security, and storage management that allow workloads to function efficiently.” In addition, organizations applying big data analytics in the cloud find that the costs of data functions such as search, compute and egress are significant and rising.

Recognizing these issues, Swiss Vault set out to address the data storage problem with products that provide a practical and more sustainable alternative.

Swiss Vault, based in Princeton, New Jersey, creates a unique data-storage platform for customers that demand mission-critical computing. The company’s products help organizations in various fields—ranging from genomics research to earth telemetry—that need to rapidly capture, store, access and retrieve large volumes of data. (See Swissvault.global for product launches.)

For CIOs who have decided to move their most valuable asset onto the cloud, here are eight reasons to reconsider before it’s too late:

  1. Disaster Recovery/Business Continuity

Swiss Vault stores data in a unique way that self-monitors and self-heals, supporting business continuity with less time, energy, space and investment, and with less impact on the environment.

“CIOs are struggling to manage the exponential growth in data,” said Swiss Vault CEO Bhupinder Bhullar, who developed the technology to address the challenges he faced managing a burgeoning volume of human genome sequence data. “Our solution allows CIOs to dramatically cut their costs and easily expand (scale-up) their data storage, so their most valuable asset remains closer on premises.”

By securing data on premises and moving away from the cloud, the data owner removes the risks that come with holding data on someone else’s machines, in particular the risk to client data privacy and the leverage cloud providers gain by controlling your data. Moving on premises puts the data under your control and your responsibility. Swiss Vault empowers IT professionals seeking self-custody of their data by making long-term, self-hosted storage solutions its number one priority. Cutting out the middleman between you and your data pays off in cost, in security, and in removing exposure to third-party failures such as the loss of customer data.

  2. Lower Cost of Data

A typical converged server rack weighs 800 kg and consumes 18,000-50,000 watts. It generates excessive noise from continuous operation during its three-to-five-year useful lifespan, after which the servers are refreshed. The size, weight and power (SWaP), as well as regular equipment replacement, are all high-cost drivers.

Swiss Vault’s data file system reduces the volume of data storage by up to 50% while increasing the resiliency of the data. (See #8.) The software automatically backs up and distributes data across the network to improve resiliency and availability while making efficient use of storage capacity. Its self-healing ability allows servers to be used until end of life, removing the need to schedule a server refresh every three to five years. The hardware dramatically reduces space and energy consumption while maintaining archival integrity for decades. As a result, setup and operational costs decrease.

  3. Lower Capital Costs

Swiss Vault offers its hardware and software on a subscription basis to help CIOs lower their capital costs. Turning the investment into an operational expense gives organizations more favorable tax treatment and frees up capital in the short term, which supports a strong data growth trajectory.

  4. Improved Security

Cybercrime is expected to cost $8 trillion this year, according to Cybersecurity Ventures.

Any methodology that mitigates data loss will benefit the industry.

Doug Fortune, Swiss Vault’s CTO, explains how the company’s unique version of erasure coding achieves a higher level of robustness: “Once uploaded into the network, the software distributes fragments of the data onto different servers. Even with simultaneous loss or theft of multiple servers, clients will have access to their data, and the system will automatically rebuild the lost data chunks.”
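To make the idea concrete, here is a minimal Python sketch of the shard-and-rebuild pattern. It is not Swiss Vault’s implementation: it uses a single XOR parity shard, so it survives the loss of only one shard, whereas the Reed-Solomon-style erasure codes Fortune describes tolerate several simultaneous losses. The shard count and sample payload are arbitrary.

```python
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_shards(data: bytes, k: int) -> list:
    """Split data into k equal data shards plus one XOR parity shard."""
    shard_len = -(-len(data) // k)                    # ceiling division
    padded = data.ljust(k * shard_len, b"\x00")
    shards = [padded[i * shard_len:(i + 1) * shard_len] for i in range(k)]
    return shards + [reduce(xor_bytes, shards)]       # k data + 1 parity

def rebuild(shards: list) -> list:
    """Regenerate at most one missing shard (marked None) from the survivors."""
    missing = [i for i, s in enumerate(shards) if s is None]
    assert len(missing) <= 1, "single-parity toy: only one lost shard is recoverable"
    if missing:
        shards[missing[0]] = reduce(xor_bytes, [s for s in shards if s is not None])
    return shards

# Demo: place shards on four "servers", lose one, and recover the original data.
original = b"mission-critical telemetry payload"
placed = make_shards(original, k=4)
placed[2] = None                                      # server 2 fails or is stolen
restored = b"".join(rebuild(placed)[:4]).rstrip(b"\x00")
print("recovered intact:", restored == original)      # True
```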

  5. Rapid Capacity Expansion

Swiss Vault’s hardware and software platform allows 14 times more stored data per conventional server volume (m³). Expanding one’s data capacity can be as simple as adding more servers with hard drives to the network.

  6. Support for Limited Staffing

Data centers routinely experience the loss of disks and servers, file corruption, bit rot or other errors that prevent data retrieval. Replacing disks and servers, and applying other fixes, is labor intensive.

Swiss Vault’s software system compensates for the demand on talent and skill by automatically rebuilding lost data on the fly to available space, without delays or interruptions. This automated file restore also enables large data migrations onto new servers, saving valuable time and removing headaches for staff.

  7. Customized Optimization

Swiss Vault’s unique version of erasure coding allows the system administrator to specify any desired level of robustness and, optionally, define and redefine custom levels for each file, file type, file class and directory.

Operators can set the parameters to the desired level of data robustness and data availability, optimized for the number of nodes as well as the available space, energy and labor, resulting in significant cost savings and higher productivity.

Cloud providers typically offer limited options that may not meet an organization’s needs, or they charge a premium when customers are allowed to customize the trade-off between data overhead and availability.
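As a rough sketch of how such parameters play out, the snippet below computes the storage overhead and fault tolerance implied by a few hypothetical (k data shards, m parity shards) profiles. The profile names and numbers are illustrative assumptions, not Swiss Vault defaults; the general relationship is that k + m shards tolerate the loss of any m of them at a raw-capacity overhead of (k + m) / k.

```python
# Hypothetical per-class erasure-coding profiles (illustrative values only).
# k = data shards, m = parity shards; any m shards can be lost without data loss,
# and raw capacity used is (k + m) / k times the logical data size.
profiles = {
    "hot-project-data":  {"k": 8,  "m": 4},   # favors availability, more overhead
    "cold-archive":      {"k": 16, "m": 4},   # leaner overhead for archival data
    "scratch-telemetry": {"k": 10, "m": 2},   # minimal protection, lowest cost
}

for name, p in profiles.items():
    overhead = (p["k"] + p["m"]) / p["k"]
    print(f"{name:17s}  k={p['k']:2d} m={p['m']}  "
          f"survives {p['m']} simultaneous losses  "
          f"overhead = {overhead:.2f}x")
```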

  8. Energy Efficiency

Swiss Vault’s data storage system is 10x more energy efficient (Watts/PB).

While a 1 PB RAID file-management configuration with two backup servers would require 3 PB of hardware and infrastructure, the Swiss Vault software needs an overhead of only 1.5 PB while delivering higher levels of data robustness and availability. The lower disk volume reduces the energy per PB of data under management.

The Swiss Vault hardware system also has a much smaller energy and space footprint: 750 watts/PB versus 7,200 watts/PB or more for a standard hyperconverged server system. (See website for product release dates.)
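The comparison boils down to two ratios. Here is a quick back-of-envelope check using only the figures quoted in this section; no other numbers are assumed.

```python
# Back-of-envelope ratios using the figures cited above.

# Hardware needed to protect 1 PB of primary data
raid_plus_two_backups_pb = 3.0      # 1 PB primary + 2 full backup copies
swiss_vault_footprint_pb = 1.5      # erasure-coded overhead described above

# Energy per petabyte under management
hyperconverged_watts_per_pb = 7200  # standard hyperconverged server system
swiss_vault_watts_per_pb    = 750   # Swiss Vault hardware

print(f"Hardware reduction: {raid_plus_two_backups_pb / swiss_vault_footprint_pb:.1f}x "
      "less raw capacity for the same protected data")
print(f"Energy reduction:   {hyperconverged_watts_per_pb / swiss_vault_watts_per_pb:.1f}x "
      "fewer watts per petabyte")
```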

“Our innovative data-storage approach is a fraction of the cost and simplifies data operations, allowing organizations to manage the exacting demands of mission critical computing on-premises,” Bhullar said.

For more solutions to keep your vital data secure on premises at a fraction of the cost, contact Swiss Vault today.