Bringing Data Fabric to Life with Support for Hybrid Multi-Cloud and DevOps
The NetApp Fabric Orchestrator is an extensible cloud service that helps customers create and manage their data in a multi-cloud world. Data is the biggest asset driving innovation and differentiation for every company in the world, but data cannot deliver its full value in silos. It needs to be integrated and orchestrated with the applications, services, and workloads that unleash its true value.
The Fabric Orchestrator discovers your data, applications, and services by securely connecting to public, private, and on-premises providers including ONTAP systems, Cloud Data Services, NetApp HCI, and the NetApp Kubernetes Service. All discovered assets sit behind a unified Data Fabric API and a single user interface.
Once your Data Fabric has been discovered, you can invite your teams to collaborate on your data: organize it, protect it, secure it, and orchestrate and integrate it, all through a rich metadata tagging and labeling solution. Start automating your organization's best practices and compliance rules across your Data Fabric with Fabric Policies and Fabric Flows.
Easily scale processes and policies across an entire data estate.
Apply access controls automatically to new datasets based on established policy.
Organize using metadata with simple concepts like tags and labels.
Automate the proximity of data based on usage patterns and applications.
Enforce data deletion policies across all applications without relying on busy admins to remember to remove data.
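To make the label-driven automation above concrete, here is a minimal sketch of how policies might be matched against a dataset's labels. The names (`Policy`, `actions_for`) and the label/action vocabulary are hypothetical illustrations, not part of any published Fabric Orchestrator API.

```python
from dataclasses import dataclass, field

# Hypothetical model: a policy fires when a dataset carries all of its labels.
@dataclass
class Policy:
    name: str
    match_labels: set                               # labels a dataset must carry
    actions: list = field(default_factory=list)     # e.g. ["apply-acl"]

def actions_for(dataset_labels: set, policies: list) -> list:
    """Collect the actions of every policy whose labels the dataset carries."""
    triggered = []
    for policy in policies:
        if policy.match_labels <= dataset_labels:   # subset test: all labels present
            triggered.extend(policy.actions)
    return triggered

policies = [
    Policy("pii-controls", {"pii"}, ["apply-acl"]),
    Policy("retention-90d", {"pii", "eu"}, ["schedule-deletion"]),
]

# A new EU dataset tagged "pii" picks up both policies automatically.
print(actions_for({"pii", "eu", "finance"}, policies))
# → ['apply-acl', 'schedule-deletion']
```

In this sketch, a new dataset never needs manual wiring: tagging it correctly is enough for access controls and deletion schedules to attach themselves.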
Fabric Orchestrator provides a unified UI and API for all your Cloud Volumes and Azure NetApp Files. It helps you monitor your usage and subscriptions. Just link your cloud services and it will discover them. Use a single service account to control your volumes but share control via Teams and Workspaces.
Tap into ONTAP APIs in any location to automate routine tasks in a simple way, or create whole workflows using Fabric Flows. Override or augment the Fabric Orchestrator’s built-in actions for volumes and snapshots based on Workspaces.
Because the Fabric Orchestrator is an extensible platform for orchestrating NetApp Cloud Data Services, on-premises systems, and third-party services, it grows in capability along with your Data Fabric. Fabric Orchestrator comes with a Data Fabric API, an SDK for custom providers and services, and an SDK for Fabric Flows extensions.
Set rules and apply uniform settings to Fabric Objects. You can set and forget, allowing your Data Fabric to manage itself. Policies can run powerful workflows by configuring pre- and post-actions that are powered by Fabric Flows.
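The pre- and post-action chaining can be sketched as a simple wrapper: run an optional pre-action Flow, then the policy action, then an optional post-action Flow. The function and hook names here are assumptions for illustration, not the product's actual API.

```python
# Illustrative sketch of pre-/post-action chaining around a policy action.
def run_policy(action, pre=None, post=None):
    """Run a policy action, wrapped by optional pre- and post-action Fabric Flows."""
    log = []
    if pre:
        log.append(pre())      # e.g. quiesce an application before a snapshot
    log.append(action())
    if post:
        log.append(post())     # e.g. notify a team channel after completion
    return log

steps = run_policy(
    action=lambda: "snapshot volume vol1",
    pre=lambda: "flow: quiesce app",
    post=lambda: "flow: notify #storage-ops",
)
print(steps)
# → ['flow: quiesce app', 'snapshot volume vol1', 'flow: notify #storage-ops']
```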
Link credentials to workspaces to control access to external resources. Invite internal users or external partners to teams to grant access to workspaces with role-based access controls. Create shared Fabric Flows that can be scheduled and delegated to teams to run.
When you add an object like a volume to your Data Fabric, it becomes a citizen with an ancestry, hierarchy, and space of its own that can be tracked, traced, and logged at any point in its lifecycle. All actions in the Data Fabric, whether performed manually or automated via the API, can be traced for compliance and security purposes.
Configure backup and snapshot policies for volumes and applications. Automatically apply data protection based on labels, workspaces and other Data Fabric objects. Protect data from malicious or accidental deletion by 'circuit breaking' actions.
Fabric Orchestrator introduces the concept of Data Sync, an extensible workflow for data movement. Data can be moved, cloned, replicated, and synchronized across your Data Fabric using simple actions that can be automated. Our initial data movers are Cloud Sync, SnapMirror, and Cloud Backup Service.
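One way a Data Sync workflow could choose among those movers is by inspecting the endpoints of the transfer. The selection rules below are purely an assumption for illustration; the real routing logic is not specified here.

```python
# Hedged sketch: a possible mover-selection rule for a Data Sync workflow.
def pick_mover(source: str, destination: str) -> str:
    """Choose a data mover for a source/destination pair (illustrative rules only)."""
    ontap = {"ontap", "cloud-volumes"}
    if source in ontap and destination in ontap:
        return "SnapMirror"            # native replication between ONTAP endpoints
    if destination == "object-store":
        return "Cloud Backup Service"  # backups land in object storage
    return "Cloud Sync"                # general-purpose sync for everything else

print(pick_mover("ontap", "cloud-volumes"))   # SnapMirror
print(pick_mover("ontap", "object-store"))    # Cloud Backup Service
print(pick_mover("nfs-share", "s3-bucket"))   # Cloud Sync
```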
Any object in the fabric can have metadata, including Tags and Labels, which are a fundamental enabler for organizing, finding, and annotating data so that policies and automations can do the heavy lifting for you.
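As a minimal sketch of what tag-based lookup enables, the snippet below filters fabric objects by a tag. The object shapes and tag names are hypothetical, not a documented schema.

```python
# Hypothetical fabric objects, each carrying a set of tags.
objects = [
    {"name": "vol-designs", "tags": {"team:artists", "site:us-west"}},
    {"name": "vol-builds",  "tags": {"team:devs", "site:asia"}},
    {"name": "vol-media",   "tags": {"team:artists", "site:asia"}},
]

def find_by_tag(objs, tag):
    """Return the names of all objects carrying the given tag."""
    return [o["name"] for o in objs if tag in o["tags"]]

print(find_by_tag(objects, "team:artists"))
# → ['vol-designs', 'vol-media']
```

The same lookup is what lets a policy or automation target "everything an artist team touches" without anyone maintaining an explicit list.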
Fabric Flows are user customizable visual workflows that control services, manage data, apply configurations and react to external and internal events without needing an engineering degree to customize.
Fabric Orchestrator manages your library of Fabric Flows that you build over time to implement Fabric Policy actions or repetitive operations that can save your team a whole lot of time.
When the Fabric Advisor finds an opportunity to optimize your Data Fabric for cost, speed, compliance, protection, security, or other reasons, it will let you know and give you the controls to react. You can also let your Fabric Policies react for you.
Design Fabric Flows and export their controls to easily share the power of custom-built actions with a simple link. Easy buttons enable developers and operations to integrate your Data Fabric workflows and actions into external systems and services.
Storage and cloud architects can define data workflows to follow best practices and share the capability of running those workflows instead of direct access to sensitive systems.
Get context-sensitive help in a single place. Query NetApp's AI/ML-enabled knowledge base, get support, view how-to videos, and take a guided tour of a platform that understands your intentions.
Extend Fabric Orchestrator's visibility and control across clouds and into on-premises environments via public APIs or secure northbound-only tunnels. Internal credentials stay on premises, and cloud credentials and access tokens are securely stored as part of your Data Fabric.
Hiring knowledge workers is getting harder in many places, so teams are becoming increasingly distributed. Fabric Orchestrator helps companies stay productive by orchestrating the data in their Data Fabric based on their internal workflows.
A large creative studio has a distributed workforce of digital artists and developers working together on a project with hundreds of thousands of large files and build artifacts. While one team works out of the Asia office, the U.S. teams are offline.
When the U.S. teams come online, they need all the latest changes to the design files and build artifacts available locally on their high performance on-premises storage systems, but they are located across the U.S. and working on subsets of the data.
Synchronizing all the increasingly large data sets with every office is slow and costly. While the data synchronizes, the artists’ and developers’ productivity suffers and the project grinds to a halt.
By applying the Fabric Orchestrator’s features (Fabric Flows, Labels, Teams, and Fabric Policies) to their data, the IT operations team automates the process of matching each team with the data it needs, based on what the team has recently been working on, which workspaces it belongs to, and the labels applied to the data.
The DevOps team is also able to back up build artifacts like container images to the cloud and restore only the latest images to each development team’s local repositories.
Artists no longer have to wait for their design files to show up because of unnecessary data syncing—and the developers’ software builds and container image pulls run faster because the assets and build artifacts are local instead of in a distant repository.
The CEO is happy because productivity stays up even when the team grows or a new location is added, and the finance team is happy because the data sync costs went down and they save money on storage systems. The IT operations team is empowered and now wants to automate most of their day-to-day IT tasks.
As the sprawl of applications and data moves outside the traditional data center, businesses find it increasingly difficult to keep track of exactly where their business data lives, let alone ensure it is protected and regulatory-compliant.
A bank in Europe runs several large applications across multiple clouds and employs a team of data scientists to help them derive the most business value from the data they create and collect.
The data scientists copy data between clouds to run AI/ML processes and then send the results back to their on-premises systems. The bank has no way to control or track the data as it moves between clouds, potentially outside of EMEA, and cannot ensure that compliance or data protection practices are being followed.
The bank decides to use Cloud Volumes universally because of its excellent performance and the simplicity of the unified CV API, provided by the Fabric Orchestrator. By linking Fabric Orchestrator to their public cloud accounts, and their internal NetApp HCI systems to NetApp Kubernetes Service and Cloud Volumes, they automatically track what’s being created, copied, and moved.
They can see a complete Fabric History and audit log for user-initiated and automated actions. They use the collaboration capabilities of the Fabric Orchestrator to consolidate cloud access to a few service accounts, and they apply roles to teams rather than individuals.
They set up a single data protection policy that automatically protects any data labeled "production". And they organize their sensitive data (GDPR) into labeled workspaces so they can set Fabric Policies that "circuit break" any API requests that try to move any original or derived sensitive data to non-compliant locations.
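The "circuit break" check described above can be sketched as a guard that rejects moves of sensitive data to non-compliant regions. The region names, label, and function are hypothetical illustrations, not actual Fabric Policy configuration.

```python
# Illustrative "circuit breaker": reject moves of GDPR-labeled data
# to regions outside an allowed set. All names here are assumptions.
COMPLIANT_REGIONS = {"eu-west-1", "eu-central-1"}

def check_move(labels: set, destination_region: str) -> bool:
    """Return False (break the circuit) for non-compliant moves of sensitive data."""
    if "gdpr" in labels and destination_region not in COMPLIANT_REGIONS:
        return False   # the API request is rejected before any data moves
    return True

print(check_move({"gdpr"}, "us-east-1"))   # False — blocked
print(check_move({"gdpr"}, "eu-west-1"))   # True  — allowed
print(check_move({"public"}, "us-east-1")) # True  — not sensitive, allowed
```

Because the guard runs on labels rather than on named datasets, derived copies inherit the protection simply by inheriting the label.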
Using the Fabric Orchestrator with CVS not only improved the performance of their applications and data, but it also gave them full visibility and traceability of their data.
The data scientists are happy because the Fabric Orchestrator makes it simpler for them to find and move data, and they know they can't mess up because their Fabric Policies will hit the brakes.
The CSO and the compliance officer are happy because they have fewer things to worry about, and the CEO is happy because the company is now making the best use of their intellectual workforce’s time, with quicker results. The data scientists are now eager to use Fabric Flows to automate their data pipelines.