Cloud Volumes ONTAP

Big data on NFS repositories

Optimized data management for any type of analytics workload.

Start Free Trial


Heavy storage investments and complicated management tasks

Big data frameworks such as Hadoop, Storm, and Hive provide a way to process enormous volumes of data. Sizing a big data cluster that runs on a dynamic, ever-growing data set is challenging: native block storage imposes capacity limits, HDFS adds a 3x replication factor, and object storage charges for every operations call. Together, these add cost and operational burden to an already complicated environment.


Run analytics on NFS repositories

Cloud Volumes ONTAP adds enhanced data management capabilities to native cloud storage. Together with the NetApp In-Place Analytics Module, big data frameworks can access data stored in Cloud Volumes ONTAP directly over NFS. NAS-based data repositories eliminate the need for additional cloud storage to hold multiple replicas of datasets, and avoid making extremely large numbers of API calls to object storage, enabling significant savings and simplified operations.

How it works


  • 1

    Create NFS volumes for big data

  • 2

    Install the NetApp In-Place Analytics Module (NIPAM)

  • 3

    Mount the NFS volumes to the big data framework
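
The three steps above can be sketched roughly as follows. This is an illustrative sketch only: the NFS server address, export name, mount path, and package location are all assumptions, and the exact `core-site.xml` properties and class names for your NIPAM version should be taken from the NIPAM documentation.

```shell
# Step 1 (assumed addresses): an NFS volume exported by Cloud Volumes ONTAP,
# e.g. 10.0.0.5:/bigdata_vol, mounted on each cluster node for POSIX access.
sudo mkdir -p /mnt/bigdata
sudo mount -t nfs -o vers=3 10.0.0.5:/bigdata_vol /mnt/bigdata

# Step 2: install NIPAM on the Hadoop nodes and register its NFS file system
# implementation in core-site.xml (property snippet shown for illustration):
#
#   <property>
#     <name>fs.nfs.impl</name>
#     <value>org.apache.hadoop.fs.nfs.NFSv3FileSystem</value>
#   </property>

# Step 3: once configured, jobs can address the volume with an nfs:// URI
# instead of hdfs://, so data is analyzed in place with no HDFS copy:
hadoop fs -ls nfs://10.0.0.5:2049/
```

Because the framework reads the data where it lives, no 3x HDFS replica set is created and no per-request object storage charges are incurred.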

  • Cost

    No need to worry about big data over-consuming cloud resources

  • Performance

    Safely run variations of file-based data analytics using zero-penalty clones

  • Protection

    Peace of mind that your data is always protected everywhere

  • Mobility

    Seamlessly move data to the cloud and back


Get block and file storage for the price of object storage.

See Full Pricing

How to get started

Select cloud to get started with