Enterprise Workloads

Typical questions you need to ask when running enterprise workloads in the cloud:

  • Can I guarantee business continuity, with zero downtime and no data loss?
  • How do I comply with data security, data protection, and disaster recovery regulations?
  • Will I need to rewrite applications when moving them to the cloud?
  • Can I ensure performance and SLAs?
  • How do I sync data between on-premises and cloud environments?

Whether you are a cloud newcomer, born in the cloud, or working in a hybrid or multicloud environment, you need a solution that enables seamless management and protection of your enterprise workload data, with tools that help you move and sync data easily, quickly, and securely.

Get started now with Cloud Volumes Service or Cloud Volumes ONTAP.

Key Considerations

Does your workload require high performance?


Performance and Guaranteed SLAs

Enterprise workloads demand the highest levels of performance, backed by guaranteed SLAs that ensure the IOPS your applications require.

Is minimizing storage costs a key consideration?

Storage Efficiencies

Leverage efficient data snapshots, thin provisioning, data compression, deduplication, data tiering, and cloning, which can reduce storage expenses by as much as 70%.

Is protecting your workload data a key requirement?

Data Protection and High Availability

Recovery from data loss can be achieved with efficient data snapshots, disaster recovery, and high availability. These must be easy to configure and cost-effective, and must support seamless failover, failback, restore, and recovery processes that meet your SLAs.

Choose the offering that best suits your Enterprise Workloads


Cloud Volumes Service

Fully managed cloud storage service delivered by NetApp


Cloud Volumes ONTAP

A customer-managed solution providing full ONTAP capabilities


Cloud Volumes Service

NetApp’s fully managed Cloud Volumes Service provides enterprise-grade data management and data protection capabilities, with highly available, high-performance storage for enterprise workloads.

3 Steps for a Successful POC

  • Sign up for the Cloud Volumes Service trial period
    • Sign up for the Cloud Volumes Service trial period on AWS or register for our Preview Program on Azure or GCP.
    • Enter your information and NetApp will evaluate your requirements and provide you with access.
    • Spin up your first 100 TB of storage in the cloud.
  • Create your first 100 TB in the cloud
    • Log into your NetApp account on NetApp Cloud Central.
    • Name the volume you want to create.
    • Decide which protocol you want to expose.
  • Set a snapshot schedule

    Keep your data protected by scheduling your snapshots.

    • Follow the steps above to create your volume in the cloud.
    • In the snapshots menu, decide how many snapshots you’d like to keep.
    • Set the time at which the snapshots are to be created.
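
    The schedule above maps naturally to an API payload. Here is a minimal sketch; the field names (dailySchedule, snapshotsToKeep) are illustrative assumptions, not the documented Cloud Volumes Service API, and are shown only to make the knobs concrete.

    ```python
    # Hypothetical sketch of a snapshot-policy request body.
    # Field names are illustrative assumptions, not the documented API.

    def build_snapshot_policy(keep: int, hour: int, minute: int = 0) -> dict:
        """Return a daily snapshot policy: retain `keep` copies,
        taken each day at hour:minute."""
        if not 0 <= hour <= 23:
            raise ValueError("hour must be 0-23")
        return {
            "dailySchedule": {
                "snapshotsToKeep": keep,  # number of snapshots retained
                "hour": hour,             # time of day the snapshot runs
                "minute": minute,
            }
        }

    # Keep 7 daily snapshots, taken at 02:00:
    policy = build_snapshot_policy(keep=7, hour=2)
    ```

    The same structure extends to hourly or weekly schedules by adding parallel schedule objects.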


  • Spin up 100 TB of storage in just seconds
  • Utilize Cloud Sync technology to replicate data to the cloud quickly and efficiently (“lift & shift” data)
  • Leverage cloning technology

    This will create thin, RW data copies without performance penalties. Such copies can be used as testing environments without compromising production volumes.

  • Use API calls to automate your allocations
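
    Automating an allocation amounts to posting a JSON body to the service’s REST API. A minimal sketch follows; the base URL, header name, endpoint path, and field names (quotaInBytes, protocolTypes) are assumptions for illustration — consult the service’s API reference for the real values.

    ```python
    import json

    # Placeholder endpoint and credential; substitute real values from
    # your account. These are assumptions, not documented constants.
    API_BASE = "https://cv.example.com/v2"
    API_KEY = "YOUR-API-KEY"

    def volume_request(name: str, size_tib: int, protocol: str = "nfsv3") -> dict:
        """Build the JSON body for a create-volume call."""
        return {
            "name": name,
            "quotaInBytes": size_tib * 1024**4,  # TiB -> bytes
            "protocolTypes": [protocol],
        }

    body = volume_request("poc-vol-01", size_tib=100)
    print(json.dumps(body))
    # To submit, POST the body with your API key header, e.g.:
    # requests.post(f"{API_BASE}/Volumes", json=body,
    #               headers={"api-key": API_KEY})
    ```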

Cloud Volumes ONTAP

Cloud Volumes ONTAP provides enterprise-grade data management you control, with data protection capabilities, highly-available storage, and storage efficiencies in the cloud of your choice.

3 Steps for a Successful POC

  • Start your free Cloud Volumes ONTAP trial
    • Sign up or log in to NetApp Cloud Central.
    • Click on Cloud Volumes ONTAP “Start Free Trial” and follow the wizard.
    • Create your first Cloud Volumes ONTAP HA environment.
  • Test Cloud Volumes ONTAP HA Resiliency
    • Copy data to a Cloud Volumes ONTAP storage volume.
    • Shut down the Cloud Volumes ONTAP primary node and verify that the data you copied is accessible.
    • Restart the primary node to resume high-availability status.
  • Test capacity reductions with storage efficiencies

Watch your storage footprint shrink and benefit from an average of 50% cost savings.

    • Thin provisioning: Provision a volume which is larger than the aggregate. Despite the overprovisioning, you will not see an increase in the storage footprint.
    • Deduplication: Copy a file multiple times and you’ll see that the used capacity remains the same, with no increase in storage footprint with repeating blocks.
    • Compression: Save a large text file and note how it is compressed substantially, reducing the overall storage footprint.
    • Data tiering: Check the volume capacity and see how “cold” data is located on object storage.
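
    The deduplication check above can be sketched as a script. This version runs against a local temp directory as a stand-in (an ordinary filesystem will not dedupe; on a Cloud Volumes ONTAP mount you would instead compare df output before and after the copies), and reports the logical size clients see versus the unique data a deduplicating system actually has to store.

    ```python
    import hashlib
    import os
    import shutil
    import tempfile

    def dedup_check(mount: str, copies: int = 5, size_mib: int = 10):
        """Write one file of repeating blocks, copy it `copies` times,
        and return (logical_bytes, unique_bytes): what clients see vs.
        what a deduplicating filesystem needs to store."""
        base = os.path.join(mount, "base.dat")
        with open(base, "wb") as f:
            f.write(b"\0" * (size_mib * 1024 * 1024))
        for i in range(copies):
            shutil.copy(base, os.path.join(mount, f"copy{i}.dat"))

        digests, logical = set(), 0
        for name in os.listdir(mount):
            with open(os.path.join(mount, name), "rb") as f:
                data = f.read()
            logical += len(data)
            digests.add(hashlib.sha256(data).hexdigest())
        # Identical copies dedupe down to a single stored instance.
        unique = len(digests) * size_mib * 1024 * 1024
        return logical, unique

    with tempfile.TemporaryDirectory() as d:
        logical, unique = dedup_check(d)
        # 6 identical files: 60 MiB logical, ~10 MiB of unique blocks
    ```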


  • Create your HA environment with the Multiple Availability Zone deployment model to protect against Availability Zone failure.
  • Select the Automatic Capacity Management feature

    This will increase the aggregate size automatically when required, and will delete empty aggregates to save storage costs. You can control the thresholds!

  • Leverage the “Schedule Downtime” option to turn your environment on and off in order to save compute and license costs
  • Tier your cold data, DR environment, and snapshots to Azure Blob Storage or Amazon S3

    This will reduce storage costs for infrequently-used data to as low as $0.03 per GB per month.

  • Change your volumes’ underlying storage types non-disruptively

    This will move volumes between costly, high-performance storage (e.g. GP2) and cheaper, lower-performance storage (e.g. ST1) according to your SLAs.

  • Utilize NetApp replication technology to “lift & shift” data to the cloud quickly and efficiently
  • Use cloning technology
    This will create thin, RW data copies without performance or capacity penalties that can be used as testing environments.
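
    As a rough illustration of the tiering economics mentioned above, the arithmetic below uses the $0.03/GB-month object storage figure from this page; the $0.10/GB-month block-storage rate is an illustrative assumption, not a quoted price.

    ```python
    # Back-of-the-envelope tiering savings. The $0.03/GB-month object
    # storage rate comes from the text above; the $0.10/GB-month block
    # storage rate is an illustrative assumption.

    def monthly_cost(gb: float, cold_fraction: float,
                     hot_rate: float = 0.10, cold_rate: float = 0.03) -> float:
        """Monthly storage cost when `cold_fraction` of the data is
        tiered to object storage and the rest stays on block storage."""
        return gb * ((1 - cold_fraction) * hot_rate + cold_fraction * cold_rate)

    # 10 TB with 80% of the data cold:
    untiered = monthly_cost(10_000, 0.0)  # ~$1,000/month, all block storage
    tiered = monthly_cost(10_000, 0.8)    # ~$440/month with tiering
    ```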