Cloud Tiering: How On-Prem Data Centers Can Benefit from Moving Data to the Cloud

Companies using trusted NetApp Fabric-Attached Storage (FAS) SSD drives or All-Flash FAS (AFF) arrays are generally happy with their on-prem data centers. They have powerful, in-house storage systems that can meet all of their applications’ performance demands. But there are several situations where that model needs to adapt and embrace the technology that has totally transformed the way storage is managed—the cloud.


Why would companies committed to on-prem storage start storing data in the cloud? What are the benefits of doing so? In this post we’ll look at three main drivers for adopting the use of cloud storage and offer a way that NetApp can make that happen for FAS and AFF storage users: our new Cloud Tiering service.

Why Move to the Cloud?

There are a number of reasons why a company with an on-prem data center might consider moving some data to the cloud. Three major reasons include a directive to move towards a cloud strategy, solving for diminishing on-prem storage system space, and finding new ways to shift some of the company’s IT spending from a CAPEX to an OPEX model.


In this section we’ll look at each of these points in detail.

1. Company Decision to Move Towards a Cloud Strategy

It’s been decided: you’re moving to the cloud, and your company’s CEO wants to get started as soon as possible. Whether the goal is cloud-first, a hybrid cloud strategy, or a gradual transition to an all-cloud architecture, it’s going to be up to the IT team to craft the entire migration strategy from the ground up. Migrating applications to the cloud involves massive changes to the application layer and multiple stakeholders throughout the organization, and any misstep can have serious consequences for the business, so this step needs to be careful and well thought out. How can you get a foot in the cloud before all those high-risk steps take place?


Leveraging cloud data storage for infrequently-used data is one way that a company can start to experiment and begin to carry out a cloud strategy. In effect, it is a low-risk first step in adopting a hybrid architecture, which is the most popular approach for established enterprise companies beginning their cloud journeys.


As an example, consider how a cloud data migration might play out for an eLearning company. Let’s assume this company runs 20 disparate applications. With a CIO decision to adopt a cloud-first strategy, the IT team has its work cut out for it transitioning all of those workloads to the cloud. By offloading infrequently-used data to the cloud, the team can gain experience with using a cloud service provider (CSP) for storage while focusing on the next step: moving the application layer.


Data usage also factors into this use case. Since online courses are prepared in advance and usually consumed during the following semester, data in use now has to be available in the immediate future. Additionally, after the semester ends, the course material still needs to be available although it isn’t (and may never be) in immediate use. Storing that data in the cloud is going to require an easy way to bring it back up to performant storage for use.

2. The Data Center Crunch

Your organization has always had a lot of data to contend with, but lately there’s more of it than ever. You’re nowhere near your AFF storage systems’ refresh cycle, and yet something is becoming painfully obvious: you’re running out of room to store data.

But running out of data center space faster than you planned may be only one of your problems. With data growing faster than ever before, IT teams need to find ways to store it efficiently. For AFF systems, that means housing information that isn’t necessarily going to take full advantage of the powerhouse storage system’s benefits. Keeping it stored in your data center ensures its availability, but there’s no reason all of that data should remain on the performance tier.


You could try to manually tier this data to a less-performant storage system on-prem, but that isn’t going to be easy: your IT team would have to configure not only the transfer process but also every change to the application layer needed to work with data split across two different systems. That means costly investments of time and money that could be better spent elsewhere.


As an example, let’s see how this situation might play out for a major healthcare provider. In their vertical, data can grow by as much as 1 TB per day, which is too fast to leverage data center storage systems cost-effectively. Once patients are discharged, their records won’t be in frequent use, but that data still needs to be retained for future visits and to meet healthcare compliance regulations. However cold this data may be, it needs to be available and accessible at any time. The cloud offers a way to store this data at low cost while keeping it ready for use as soon as it’s needed.

3. Trading CAPEX for OPEX

OPEX spending is more attractive to enterprises than CAPEX spending because it gives them more financial flexibility and reporting benefits. Storage has traditionally been a capital expenditure: it is purchased as an asset. With the cloud’s pay-as-you-go model, storage can now be reported as an operating expense.


Take the example of a media company with a huge library of video, audio, images, and other media files. That kind of use case requires not only a high-demand staging environment for video and image shoots, with editing processes that can take a week or more, but also a place to store all that media when it isn’t in use, in a way that keeps it accessible in the future.


Relying only on on-prem storage, that media company has to carry the costs not only of the machines it buys to store that data, but also of the building that houses them. Those assets depreciate over time, and their costs can only be deducted gradually over multiple years. If the media company is very new, it may not have a budget for long-term capital expenditures at all. With the cloud’s alternative, the company only pays for the resources it uses, and its storage can be reported as an operating expense the same way its electric bills are, reducing taxable income in the current year and improving end-of-year reports.

Introducing NetApp Cloud Tiering

Cloud Tiering is a new service from NetApp that provides data tiering capabilities for on-prem FAS and AFF storage systems.

What Is Data Tiering?

Data tiering is a method for moving data between different storage systems or formats so that the data is located in the tier that offers the most cost and performance benefits at a given time. For example: backup data is best stored on inexpensive, less-performant storage until there is a need for it, such as a restore operation.
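To make the idea concrete, a simple age-based tiering policy might classify data by how recently it was accessed. The Python sketch below is a hypothetical illustration (the 30-day threshold and the `classify_tier` function are our own inventions, not part of any NetApp API); Cloud Tiering applies a similar idea automatically, at the block level.

```python
import time
from pathlib import Path

# Hypothetical policy: data untouched for 30 days is considered "cold".
COLD_AFTER_SECONDS = 30 * 24 * 3600

def classify_tier(path: Path, now: float = None) -> str:
    """Return 'performance' for recently accessed data, 'capacity' for cold data."""
    now = time.time() if now is None else now
    age = now - path.stat().st_atime  # seconds since last access
    return "capacity" if age > COLD_AFTER_SECONDS else "performance"
```

A real tiering service tracks access patterns continuously rather than scanning file timestamps, but the decision it makes is the same: keep hot data on the performance tier, move cold data to cheaper storage.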


With Cloud Tiering, high-performance storage space can be freed up and dedicated to more workloads while infrequently-used data is automatically moved to lower-cost storage in the cloud, leveraging object storage on Amazon S3, Azure Blob, or IBM Cloud Object Storage as a capacity tier. Using smart auto-tiering technology, that data is seamlessly transferred back to the high-performance FAS and AFF storage systems as soon as it’s needed.
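Conceptually, that tier-out and recall cycle looks like the sketch below. This is an illustrative Python model, not NetApp code: the in-memory `CapacityTier` class stands in for a cloud object store such as an S3 bucket, while the actual service moves cold blocks transparently inside the storage system.

```python
class CapacityTier:
    """Stand-in for a cloud object store (e.g., an S3 bucket) used as a capacity tier."""
    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects.pop(key)

def tier_out(performance_tier, capacity_tier, key):
    """Move a cold block from the performance tier down to the capacity tier."""
    capacity_tier.put(key, performance_tier.pop(key))

def tier_back(performance_tier, capacity_tier, key):
    """Recall a block to the performance tier when an application reads it again."""
    performance_tier[key] = capacity_tier.get(key)
```

The key point the model captures is that the movement is symmetric and invisible to the application: data flows down when it goes cold and back up on demand.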


Cloud Tiering doesn’t require any changes to your applications and processes; there is absolutely no need to rearchitect or refactor applications for use in the cloud. We like to call this the “lift and DON’T shift” approach to the cloud. It also frees up valuable AFF or FAS storage space for more workloads: you’re able to do more with less and can expect to expand your storage system capacity by up to 20x. You’ll also lower TCO and gain the flexibility to switch from CAPEX to the cloud’s OPEX model. On average, Cloud Tiering allows users to move about 80% of their data to low-cost storage in the public cloud.

Users of Cloud Tiering have a choice of two license models: a consumption-based PAYGO license or an upfront, term-based BYOL license. Users can also mix and match PAYGO and BYOL licenses.

To learn more, visit our new Cloud Tiering service page.

Conclusion

Companies with on-prem data centers have a lot of reasons to take that first step into the cloud. Whether it’s that management has sent down word that it’s time to develop a cloud strategy, your critical AFF systems just don’t have enough space to house all of the data you’ve accumulated, or that you need to find a way to trade a pricey CAPEX spending model for a cloud-driven OPEX one, Cloud Tiering offers an easy way to achieve all of those goals at once.


To try out Cloud Tiering for your data center, register for a preview of Cloud Tiering here.
