
To Migrate or Not to Migrate? Legacy Apps and Line-of-Business Applications in the Cloud


November 6, 2019

Topics: Azure NetApp Files

With the fast pace of cloud adoption, cloud service providers have attempted to keep up by offering a robust portfolio of services for enterprises. To cite a recent statistic: 83% of workloads are expected to move to the cloud by 2020, according to a survey conducted among key stakeholders and influencers across a swathe of industries. But many are wondering: Should I migrate my HPC workloads to Azure?

Moving legacy workloads that have long resided on premises requires caution and forethought, and it can be a nerve-wracking process for even the most pragmatic of executives and cloud architects. When we’re talking about the pillar applications at a given organization, such as databases, line-of-business applications, and analytics software, the alarms go off and the red tape seems to thicken.

Our intention in writing this blog is to quell concerns about moving legacy apps, line-of-business applications, and high-performance computing applications to the cloud, particularly those HPC workloads that require powerful NFS shares. To address these needs, Azure NetApp Files (ANF) offers a fully managed cloud file share service in Azure with multiprotocol support and enterprise-class data management features.

How One Oil and Gas Company Answered the Cloud Dilemma

Consider the case of an oil and gas company that’s running large-scale HPC systems and analytics applications alongside legacy applications and databases. The cloud holds a great deal of appeal for these applications, offering such benefits as unlimited scalability, agility, performance, and capex to opex conversion.

Once the decision is made to migrate these applications to the cloud, and to Azure in particular, it falls to the company’s cloud architect to plan the migration. The cloud architect would be tasked with creating a cloud architecture that ensures on-premises-like performance and availability, along with security both during the migration and on an ongoing basis. The main aspects they’d consider are: how to wrestle legacy apps and HPC workloads into the cloud; how to ensure the security, availability, and performance of LoB and database applications; and how to maintain the pace of DevOps once those workloads are hosted in the cloud.

Migrating Legacy Applications to the Cloud

Legacy applications, especially those that are Linux-based and have NFS file share requirements, are the most challenging of the lot. Traditionally, organizations have taken a DIY approach to setting up NFS shares in the cloud, but more recently, cloud services like Azure NetApp Files have eliminated the need to architect custom file shares, instead offering managed, multiprotocol file share infrastructure.

An architect would have to invest a considerable amount of time in designing a DIY system that really solves for high availability, resilience, and security. Operating procedures for ongoing system management and maintenance would have to be carefully defined and implemented. These added layers of complexity deviate from the organization’s original stipulation: migration to the cloud must be carried out with minimal overhead.

How to Deal With Big Data Analytics in a New Environment

A typical analytics application in the oil and gas sector handles petabytes of data on a daily basis, and those applications access that data from file shares. Architects have to plan for the storage capacity needs of those applications, taking into account future scale-up, before implementation can even begin. (We also recommend using a cloud monitoring tool, like Cloud Insights, to monitor your applications before and after cloud migration.) The cloud environment has to be the right size before migration, with the potential to grow at any given moment. Once migrated, the storage should also be capable of meeting the required performance levels of analytics applications so that the company can continue to gain business insights from the data at the same speed as on premises.
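To make that right-sizing exercise concrete, here is a back-of-the-envelope sketch in Python. All the input figures are illustrative assumptions, not measurements from any real workload; plug in your own ingest rate, retention window, and growth headroom.

```python
# A back-of-the-envelope sizing sketch for an analytics file share.
# All inputs below are illustrative assumptions, not real workload figures.

daily_ingest_tb = 1.5    # new data landed per day (assumed)
retention_days = 30      # how long data stays on the hot share (assumed)
growth_headroom = 1.4    # 40% headroom for future scale-up (assumed)

required_capacity_tb = daily_ingest_tb * retention_days * growth_headroom
print(f"Provision at least {required_capacity_tb:.0f} TB of share capacity")
# -> Provision at least 63 TB of share capacity
```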

Production Databases Without Disruption

Databases such as SQL Server, Oracle, and MySQL are used by line-of-business applications to store transactional data and valuable system information. The ability to read and write to those databases in near real time is of prime importance. A slow response rate could result in delayed decisions and, ultimately, lost business. That potential chain of events places a great deal of pressure on the organization’s data management team to select the right storage solution: one that offers the required throughput and IOPS with consistent uptime.

DevOps, Accelerated

Innovation is key to staying ahead of the game for all organizations, regardless of industry. As organizations implement better DevOps practices, improving how they build software, they must also find quicker ways to deploy their applications. The cloud has been a major accelerant for these teams, and a well-oiled DevOps practice is at the heart of the matter.

Moving your applications to the cloud takes time and care, but a precise approach to migration means that your DevOps teams can increase the agility of their efforts through ANF’s snapshot and volume restore capabilities. The cloud offers many ways to improve DevOps work and holds a lot of promise for enterprises seeking to grow at the speed of technology.

Addressing Cloud Migration Concerns

For the oil and gas company, ANF is an all-in-one solution. It’s a managed cloud service that gives them the ability to choose the appropriate protocols, as well as capacity and performance characteristics, according to their requirements and their particular blend of applications. It’s the freedom to choose. And it can serve workloads ranging from databases to high-performance computing. But how does it work?

  1. You can migrate without rearchitecting. With Azure NetApp Files, there is no need to rearchitect legacy applications to work around unusable NFS file shares. Since ANF supports both NFS v3 and NFS v4.1, your organization avoids the overhead of a self-managed NFS solution.

    It also supports familiar access control mechanisms such as ACLs, so there’s a minimal learning curve involved. In addition to NFS v3 and NFS v4.1, ANF supports SMB, which gives you even more freedom to choose.

  2. You can migrate data at scale. ANF enables easy replication of required datasets from on premises to Azure using NetApp’s Cloud Sync service, and it even keeps them synchronized. This eases the burden of the costly data transfers often associated with other data migration services. ANF also offers premium performance for analytics workloads and integrates quickly with other data services, such as Azure HDInsight and Azure Blob storage. In addition, a single ANF volume can be scaled up to 100 TB to accommodate data growth requirements.

  3. You have the freedom to choose among performance levels. ANF offers three storage service levels (Standard, Premium, and Ultra) supporting throughput of 16 MB/s, 64 MB/s, and 128 MB/s, respectively, per terabyte of provisioned storage; a 10 TB Premium volume, for example, can deliver up to 640 MB/s. While migrated production databases can benefit from the Premium and Ultra tiers, log files and test databases can live on the Standard tier to balance cost and performance. All three tiers offer the high availability needed for production workloads, and provisioning is straightforward and can be done directly from the Azure portal, minimizing the learning curve (see the provisioning sketch after this list).

  4. You can get your applications to market faster. ANF’s backup and efficient restore capability makes it easy to spin up test and development environments from existing data sets through DevOps processes. Snapshot copies are incremental, minimizing both space consumption and the time needed to create them. Multiple copies of the data can be restored from ANF volume Snapshots so that development can continue in parallel environments, which is a boon to organizations seeking to speed up development and lower costs (a workflow sketched after this list).

  5. You can easily manage your cloud infrastructure from a single pane. No additional contracts are required to start using the service, and it can be provisioned like any other Azure service from the Azure portal. Organizations can easily integrate the service into their existing Azure environment management processes using REST APIs, PowerShell, or the Azure CLI. Support is handled directly by the Azure support team.
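To make the provisioning flow from points 3 and 5 concrete, here is a minimal sketch using the azure-mgmt-netapp Python SDK. All resource names (resource group, NetApp account, pool, volume, subnet) are hypothetical placeholders, the NetApp account is assumed to already exist, and exact model fields can vary between SDK versions, so check the SDK reference before adapting it.

```python
# A minimal sketch of provisioning an ANF capacity pool and NFS volume
# with the azure-mgmt-netapp SDK. All resource names are hypothetical
# placeholders; an existing NetApp account ("demo-anf-account") is assumed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.netapp import NetAppManagementClient
from azure.mgmt.netapp.models import CapacityPool, Volume

TIB = 1024 ** 4  # ANF sizes are expressed in bytes

client = NetAppManagementClient(DefaultAzureCredential(), "<subscription-id>")

# 1. A capacity pool fixes the service level (Standard, Premium, or Ultra).
pool = client.pools.begin_create_or_update(
    "demo-rg", "demo-anf-account", "demo-pool",
    CapacityPool(location="westeurope", size=4 * TIB, service_level="Premium"),
).result()

# 2. Volumes carved from the pool inherit its service level; the protocol
#    (NFSv3, NFSv4.1, or SMB) is chosen per volume.
volume = client.volumes.begin_create_or_update(
    "demo-rg", "demo-anf-account", "demo-pool", "demo-volume",
    Volume(
        location="westeurope",
        usage_threshold=2 * TIB,       # volume quota, in bytes
        creation_token="demo-volume",  # becomes the NFS export path
        protocol_types=["NFSv3"],
        subnet_id="<id-of-a-subnet-delegated-to-Microsoft.NetApp/volumes>",
    ),
).result()
print(f"Export path token: {volume.creation_token}")
```

The same operations are available from the Azure portal, ARM templates, PowerShell, and the Azure CLI (the az netappfiles command group), so the sketch above is one option among several.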
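Similarly, the snapshot-driven dev/test workflow from point 4 can be sketched with the same SDK: take a Snapshot of a production volume, then create a new, writable volume from it for a parallel environment. Again, all names are placeholders and the exact call signatures should be verified against the SDK version in use.

```python
# A hedged sketch of the snapshot-and-clone workflow described in point 4,
# using the azure-mgmt-netapp SDK; all resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.netapp import NetAppManagementClient
from azure.mgmt.netapp.models import Snapshot, Volume

client = NetAppManagementClient(DefaultAzureCredential(), "<subscription-id>")

# 1. Snapshot the production volume (near-instant and incremental).
snap = client.snapshots.begin_create(
    "demo-rg", "demo-anf-account", "demo-pool", "prod-volume",
    "nightly-snap", Snapshot(location="westeurope"),
).result()

# 2. Create a writable dev/test volume from that Snapshot by passing its
#    resource id as snapshot_id on the new volume.
dev_volume = client.volumes.begin_create_or_update(
    "demo-rg", "demo-anf-account", "demo-pool", "dev-volume",
    Volume(
        location="westeurope",
        usage_threshold=2 * 1024 ** 4,  # quota in bytes
        creation_token="dev-volume",
        protocol_types=["NFSv3"],
        subnet_id="<id-of-a-subnet-delegated-to-Microsoft.NetApp/volumes>",
        snapshot_id=snap.id,            # clone source
    ),
).result()
```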

Fast-Tracked Migration to Azure

Moving to the cloud is one of the most significant decisions that an enterprise can make; after all, it redefines not only operations and workflows but also IT expenditure models, shifting spending from capex to opex. Your cloud journey will not be entirely worry-free, but ANF can make it smoother through its myriad capabilities.

As evident from the challenges companies face while migrating their file-service-based workloads to Azure, architects and migration teams require a versatile solution to meet the multi-dimensional requirements of each use case. For example, big data workloads can benefit from scalability, whereas performance and high availability are decisive factors for database workloads. Managing multiple storage solutions adds considerable overhead for IT teams in the long run. The ideal solution would be a service that meets all these requirements.

Azure NetApp Files delivers the enterprise-class storage management capabilities characteristic of on-premises deployments, along with the flexibility and agility offered by the cloud. ANF is quick and easy to deploy and highly scalable by design. Combined with near bare-metal performance, it is an all-encompassing solution for getting your applications running in Azure in no time. Designed to meet the needs of a range of applications with flexibility and ease, it’s the cloud service of choice for organizations seeking to grow consistently, at the pace of technology.
