
Cloud Tiering for Heavy Industry: Cutting Costs for Major Manufacturers

As Industry 4.0 picks up pace, success in big manufacturing depends on one crucial thing: the ability to collect and extract value from data. When analyzed properly, data can inform decisions on the factory floor, increasing operational efficiency, raising product quality, and, ultimately, ensuring the company maintains its competitive edge. None of this can be achieved, however, without tending to one of the most basic building blocks of data management: storage.

Storage is a particularly tricky issue when massive volumes of data are involved. Heavy industry companies produce a lot of data, which is often stored primarily in on-premises data centers. These can be cumbersome and expensive to maintain, and they severely limit the pace of innovation. In addition, generating data quickly with no differentiation between active data and infrequently used data can result in degraded storage performance and extremely high costs.

One of the ways to handle these challenges is by extending on-premises data centers to the cloud. This can instantly increase capacity, enable scalability, and cut expenses by storing cold data in low-cost storage tiers.

In this blog post we explore some of the major storage challenges manufacturers encounter. We’ll then introduce NetApp’s Cloud Tiering solution and demonstrate how it can address these challenges through a specific use case with an industrial giant.

Heavy Industry Data: Why It’s So Valuable and How It Can Easily Slip Away

Heavy industries depend on large-scale infrastructure, heavy equipment, and huge machinery, all typically involved in multiple complex processes. These big manufacturers span different sectors, including transportation, energy, and automotive.

Most of the big industrial players have an expanding repository of historical data, with multiple interconnected devices that generate more data by the day. This data is an extremely valuable asset in manufacturing: analyzed properly, it can shed light on manufacturing processes, throughput, energy efficiency, yield, and quality, all while lowering production costs. In this way, manufacturers can use data to shorten product life cycles and time to market, creating higher-quality products faster.

The problem is that you cannot extract value from data without an infrastructure to support it. Without established data-handling practices and optimized storage, data will continue to pile up and gather dust, clogging up the system without providing any value. This pain point was made very clear in a case study of an oil-exploration company detailed in McKinsey’s 2015 report, “Manufacturing’s next act.” Although the company collected over 30,000 data points from each of its drilling rigs, it utilized only 1% of the data collected. The rest of the information was lost due to problems with data storage, transfer, and architecture. To understand how this could happen, let’s take a closer look at the storage challenges manufacturers face.

Big Manufacturing Storage Challenges

With big manufacturers collecting so much data at such a fast pace, on-premises storage can often run out of space. Without a mechanism to identify which data is “cold” and which must be available immediately for operations, data centers get bogged down by mostly inactive data. In other words, manufacturers often keep data that is almost never used on high-performance storage, where the cost of long-term storage far outweighs any performance benefit gained.

Just as some data is almost never used, other data must be available 24/7. The majority of big manufacturers have multiple data centers dispersed across remote sites, and some of the data collected should be available anytime, anywhere, to employees across the world. As long as all of this data is lumped together in one undifferentiated dataset, manufacturers risk degraded storage performance and unnecessarily high storage costs.

Another issue is that when enterprises own on-premises storage, scaling up capacity takes time due to a typically long procurement cycle, a bottleneck that can slow innovation significantly. Besides the time involved, scaling up on-premises high-performance capacity is also very expensive. Such situations can arise, for instance, during mergers or acquisitions, when a company may have to double its on-premises storage capacity overnight.

In search of a better way to surmount these difficulties, manufacturers have begun moving to the cloud. Manufacturing was one of the segments with the fastest-growing cloud spend in 2018–2019. One 2019 study found that while 43% of manufacturers still use traditional data centers for their principal IT architecture, 20% have turned to a cloud service provider to handle workloads they can no longer manage on-premises. This shift is projected to continue: in a survey conducted in 2019, 71% of respondents said they believe cloud adoption will considerably increase in manufacturing in the coming years.

However, in turning to the cloud, companies are often faced with the difficult dilemma of abandoning their existing systems altogether or setting up a hybrid solution of some kind. Ideally, most companies wish to maintain their existing infrastructure while getting the most out of what the cloud has to offer. With NetApp’s Cloud Tiering service, manufacturers can continue to use their high-performing in-house storage and at the same time access the numerous advantages of the cloud.

Extending Data Centers to the Cloud with Cloud Tiering

Cloud Tiering leverages NetApp’s FabricPool technology to extend on-premises data centers built on NetApp AFF models and SSD-backed FAS storage to the cloud with zero effort. The service identifies data that is infrequently used and automatically transfers it from on-premises storage to more economical cloud object storage (see the Cloud Tiering architecture in action here). When the data is needed again, it is seamlessly moved back to the high-performance tier on-prem.
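
To make that flow concrete, here is a minimal Python sketch of the tier-and-recall lifecycle. It is purely conceptual: the block model, class, and cooling threshold are assumptions invented for this post, not FabricPool’s actual implementation, which operates on storage blocks inside ONTAP.

```python
import time
from dataclasses import dataclass

COOLING_PERIOD_DAYS = 31  # assumed threshold; real policies use configurable cooling days

@dataclass
class Block:
    block_id: str
    last_read: float  # epoch seconds of the most recent access

class TieringSketch:
    """Toy model of a performance tier backed by a cloud object store."""

    def __init__(self, blocks):
        self.performance_tier = {b.block_id: b for b in blocks}
        self.capacity_tier = {}  # stands in for S3 / Azure Blob / Google Cloud Storage

    def _is_cold(self, block, now):
        return (now - block.last_read) > COOLING_PERIOD_DAYS * 86400

    def tier_cold_blocks(self, now=None):
        """Background scan: move blocks that have cooled past the threshold."""
        now = now or time.time()
        cold_ids = [bid for bid, b in self.performance_tier.items() if self._is_cold(b, now)]
        for bid in cold_ids:
            self.capacity_tier[bid] = self.performance_tier.pop(bid)

    def read(self, block_id, now=None):
        """Reads are transparent: tiered data is recalled to the performance tier."""
        now = now or time.time()
        if block_id in self.capacity_tier:  # a cold read triggers a recall
            self.performance_tier[block_id] = self.capacity_tier.pop(block_id)
        block = self.performance_tier[block_id]
        block.last_read = now
        return block
```

The key property the sketch captures is that applications only ever read through the performance tier; the cold tier stays invisible to them, which is why tiering requires no change at the application layer.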

In this way, companies can maintain their on-premises storage, workflows, and processes, all the while benefiting from what the cloud has to offer:

  • On-premises storage is freed up and optimized for the best performance.
  • Capacity can be scaled up almost instantly, up to 50x, as needed, with users paying only for the data that is tiered.
  • Tiering policies can target specific types of data, such as cold Snapshot data, all cold data, or entire volumes (see the sketch after this list).
  • With an average of 80% of data transferred to low-cost storage in the cloud, the data center footprint can be significantly reduced, lowering costs dramatically.
  • The service makes no change to the application layer, operating silently in the background with no impact on performance or processes.
  • The service can be used with AWS, Azure, or Google Cloud, or in multicloud deployments across all three platforms.
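
As a rough illustration of how those tiering policies differ, the sketch below models each one as a predicate over hypothetical block metadata. The policy names match FabricPool’s real ones (“snapshot-only”, “auto”, and “all”), but the block structure, helper function, and cooling-day defaults are simplified assumptions for this post, not NetApp’s implementation.

```python
from datetime import datetime, timedelta

def days_cold(last_read: datetime, now: datetime) -> float:
    """Days since a block was last read."""
    return (now - last_read).total_seconds() / 86400

# Each policy decides whether a block is eligible for tiering to object storage.
POLICIES = {
    # Tier only cold blocks that belong solely to Snapshot copies.
    "snapshot-only": lambda blk, now, cooling_days=2:
        blk["snapshot_only"] and days_cold(blk["last_read"], now) >= cooling_days,
    # Tier any cold block, whether in Snapshot copies or the active file system.
    "auto": lambda blk, now, cooling_days=31:
        days_cold(blk["last_read"], now) >= cooling_days,
    # Tier the whole volume regardless of temperature.
    "all": lambda blk, now, cooling_days=0: True,
}

# Usage: a Snapshot-only block that has cooled for three days qualifies under
# the snapshot-only policy with a two-day threshold (the setup Festo uses below).
now = datetime(2020, 6, 1)
block = {"snapshot_only": True, "last_read": now - timedelta(days=3)}
print(POLICIES["snapshot-only"](block, now))  # True
```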

Festo: Optimizing Operations of an Industrial Giant with Cloud Tiering

Based in Esslingen am Neckar, Germany, Festo is a multinational industrial control and automation company that serves customers in over 35 verticals. As an engineering-driven enterprise, Festo develops, manufactures, and sells pneumatic and electrical control and drive solutions for process and factory automation.

Cloud Tiering wasn’t the first step Festo took into the cloud. The industrial giant had a strong cloud-first strategy in place and was already using Cloud Volumes ONTAP for its SAP workloads on Microsoft Azure. It was after purchasing a number of AFF arrays that the company’s IT department realized NetApp’s Cloud Tiering service could significantly optimize their in-house storage by extending it to the public cloud.

Using Cloud Tiering’s snapshot-only policy, Festo automatically identifies Snapshot data that hasn’t been accessed for two days and tiers it from the company’s AFF arrays to Amazon S3 object storage. Initially, tiering was enabled on about 200 volumes, and the service is currently being extended to additional volumes.

The Cloud Tiering service provided Festo with considerable benefits:

  • Storage capacity was optimized, using up much less on-prem storage space than before.
  • Data storage costs were considerably reduced.
  • Festo is exploring the possibility of tiering full volumes of cold data, which will automatically be transferred between AFF arrays and Amazon S3.

You can read the full Festo success story here.

Conclusion

Heavy industry comes with huge demands. The infrastructure is larger, the machinery bigger, and the processes more numerous and complex. Not surprisingly, these giant manufacturers generate huge volumes of data. In the age of Industry 4.0, big manufacturers can no longer afford to merely collect data and hope for the best. They must find ways to handle data wisely. This starts with storage.

With NetApp’s Cloud Tiering service, manufacturers can instantly infuse their physical data centers with cloud storage capabilities on AWS, Azure, and Google Cloud. Cloud Tiering ensures that data resides in the correct tier and is transferred only when needed. As a result, storage operates optimally and expenses reflect what’s being done with the data, rather than how much data happened to accumulate on-premises.

Start using Cloud Tiering in your NetApp data center today.


Oded Berman, Product Evangelist
