Big data frameworks, such as Hadoop, Storm, and Hive, provide a way to process enormous volumes of data. Sizing a big data cluster that runs against a dynamic, ever-growing data set can be challenging, as can dealing with native block storage capacity limits, the HDFS 3x replication factor, and per-operation charges for calls to object storage. All of these add cost and operational burden to an already complicated environment.
Cloud Volumes ONTAP offers enhanced data management capabilities on top of native cloud storage. Together with the NetApp In-Place Analytics Module, big data frameworks can easily access data stored in Cloud Volumes ONTAP over NFS. A NAS-based data repository eliminates the need to provision additional cloud storage for multiple replicas of a dataset, or to make extremely large numbers of API calls to object storage, allowing significant savings and simplified operations.
Create NFS volumes for big data
Install the NetApp In-Place Analytics Module (NIPAM)
Mount NFS volumes to the big data framework
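Once the NFS volume is mounted, analytics jobs address the data like any other file system path. The following is a minimal sketch, assuming a hypothetical mount point /mnt/cvo_data that is available on every worker node; with NIPAM configured, a job could target the NFS share directly, but a plain file:// path keeps the example self-contained.

```python
# Minimal PySpark sketch: run a word count over log files stored on a
# Cloud Volumes ONTAP NFS export. The mount point /mnt/cvo_data and the
# logs/*.log layout are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cvo-nfs-wordcount").getOrCreate()

# Read raw text files from the NFS-backed path shared by all executors.
lines = spark.read.text("file:///mnt/cvo_data/logs/*.log")

# Classic word count: split each line into words, then count occurrences.
counts = (lines.rdd
          .flatMap(lambda row: row.value.split())
          .map(lambda word: (word, 1))
          .reduceByKey(lambda a, b: a + b))

# Print the ten most frequent words.
for word, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(word, n)

spark.stop()
```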
No need to worry about big data over-consuming cloud resources
Safely run variations of file-based data analytics using zero-penalty clones (see the sketch below)
Peace of mind that your data is always protected everywhere
Seamlessly move data to the cloud and back
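As a concrete illustration of zero-penalty clones, the sketch below creates a FlexClone of a dataset volume through the ONTAP REST API, so a variant analytics run can work on a writable copy without duplicating storage. The host, credentials, SVM, and volume names are hypothetical, and the request body reflects the ONTAP 9.6+ REST API; verify the field names against your ONTAP version's API reference before relying on this.

```python
# Sketch: clone an analytics dataset volume via the ONTAP REST API.
import requests

ONTAP_HOST = "cluster.example.com"  # hypothetical management LIF
AUTH = ("admin", "password")        # hypothetical credentials

clone_body = {
    "name": "bigdata_vol_clone",                   # name of the new clone
    "svm": {"name": "svm_analytics"},              # owning SVM
    "clone": {
        "is_flexclone": True,
        "parent_volume": {"name": "bigdata_vol"},  # source dataset volume
    },
}

resp = requests.post(
    f"https://{ONTAP_HOST}/api/storage/volumes",
    json=clone_body,
    auth=AUTH,
    verify=False,  # lab only; use proper TLS verification in production
)
resp.raise_for_status()
print("Clone request accepted:", resp.json())
```

Because a FlexClone shares blocks with its parent until data is changed, the clone consumes capacity only for the deltas the experimental run writes, which is what makes iterating on variations of the same dataset effectively free.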