Cloud Volumes ONTAP

Big data on NFS repositories

Optimized data management for any type of analytics workload.

Start Free Trial

CHALLENGE

Lots of investments in storage and complicated management tasks

Big data frameworks such as Hadoop, Storm, and Hive provide a way to process enormous volumes of data. Sizing a big data cluster for a dynamic, ever-growing data set is challenging: you have to deal with native block storage capacity limits, account for the HDFS 3x replication factor, or absorb per-operation charges for calls to object storage. These factors add cost and operational burden to an already complicated environment.
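To illustrate the replication overhead mentioned above, here is a minimal sketch (the function name and figures are ours, chosen for illustration) of how HDFS's default 3x replication factor inflates the raw block storage a dataset actually consumes:

```python
# Illustrative capacity math for HDFS replication (default factor is 3).
def raw_capacity_tb(dataset_tb: float, replication_factor: int = 3) -> float:
    """Raw block storage needed to hold a dataset under HDFS replication."""
    return dataset_tb * replication_factor

# A 100 TB dataset consumes 300 TB of raw block storage at the default factor.
print(raw_capacity_tb(100))  # 300.0
```

An NFS-backed repository stores a single copy of the data, so the same 100 TB dataset needs only 100 TB of capacity (plus whatever protection scheme the storage layer itself provides).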

SOLUTION

Run analytics on NFS repositories

Cloud Volumes ONTAP adds enhanced data management capabilities on top of native cloud storage. Together with the NetApp In-Place Analytics Module, big data frameworks can access data stored in Cloud Volumes ONTAP directly over NFS. NAS-based data repositories eliminate the need for additional cloud storage to hold multiple replicas of a dataset, and avoid issuing extremely large numbers of API calls to object storage, delivering significant savings and simpler operations.

How it works


  • 1

    Create NFS volumes for big data

  • 2

    Install the NetApp In-Place Analytics Module (NIPAM)

  • 3

    Mount NFS volumes to the big data framework
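The final step above can be sketched as a standard NFS mount on a cluster node. This is a hedged ops sketch, not a definitive procedure: the export host, volume path, and mount point below are placeholders, and the exact NIPAM configuration should be taken from NetApp's documentation.

```shell
# Sketch: mount a Cloud Volumes ONTAP NFS export on a big data cluster node.
# "cvo-host", "/bigdata_vol", and "/mnt/bigdata" are illustrative placeholders.
sudo mkdir -p /mnt/bigdata
sudo mount -t nfs cvo-host:/bigdata_vol /mnt/bigdata

# With NIPAM installed, the framework reads and writes this volume over NFS
# instead of replicating data into HDFS block storage.
```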

Benefits
  • Cost

    No need to worry about big data over-consuming cloud resources

  • Performance

    Safely run variations of file-based data analytics using zero-penalty clones

  • Protection

    Peace of mind that your data is always protected everywhere

  • Portability

    Seamlessly move data to the cloud and back

Pricing

Get block and file storage for the price of object storage.

See Full Pricing

How to get started

Select cloud to get started with