
Avoiding Data Leaking: Syncing, Securing and Monitoring Amazon S3 Buckets

The security of sensitive data both at rest and in transit is critical. Recently there have been far too many incidents where data within Amazon S3 buckets has been left open to the public because simple security procedures were not followed.

Administrators seem to have either forgotten or overlooked the fact that cloud services such as AWS work on a shared responsibility model: it is the customer's responsibility to use the security features provided by AWS correctly.

It is also the company's responsibility to know how and where its data is stored, and to provide appropriate policies and training.

The Shared Responsibility Model gives a good template for what a company needs to do to protect the data in its Amazon S3 buckets. So what happens if that isn't done properly?

In this article we will look at the recent Verizon breach and how data was transferred to a third party where it was stored within an insecure bucket.

We will discuss why this happened, how bucket security can be improved through configuration and monitoring, and how data transfers can be better secured and managed with tools such as NetApp's Cloud Sync service.

Verizon's recent breach exposed the names, addresses, and account information of millions of customers. We can assume that the security policies in place to secure the Amazon S3 bucket were misconfigured and allowed public access. Let's look in more detail at what could have been done to prevent this breach.

When it comes to the kind of breach that took place at Verizon, it is likely that there were simply not enough precautions taken in securing, monitoring, and managing the company’s data in Amazon S3. Luckily, Amazon provides security features to help in such situations.

To make sure that the same mistakes aren’t made with your data, there are three major factors to keep in mind:

1) carefully managing and monitoring data
2) securing your data at rest in the bucket
3) securing your data when it is in transit either to or from Amazon S3.

1. Managing and Monitoring Data

When we store data of any type, one of the challenges is data life cycle management.

Companies need a comprehensive approach to managing their data which involves policies and procedures as well as applications and monitoring. The cloud generally makes life easier when performing a lot of management tasks, but it’s not difficult to slip up at times. Here are some key considerations to keep in mind:

  • Implement change management procedures.
  • Know where your data is and be able to perform audits on it.
  • Monitor for configuration changes.
  • Correctly tag and name all of your cloud inventory.
  • Create multiple Amazon S3 buckets to separate data with different sensitivities.
  • Carefully consider bucket permissions from the start, as fixing issues later can be difficult (a minimal setup sketch follows this list).
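
To make the last three points concrete, here is a minimal boto3 sketch, assuming a hypothetical bucket name, tags, and region, that creates a bucket for a single data classification, blocks all public access, and tags it so it shows up in inventory and audit reports.

```python
# Hypothetical example: one bucket per data classification, locked down and
# tagged from the start. Bucket name, region, and tag values are placeholders.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

bucket = "example-corp-customer-data"

# Buckets in us-east-1 need no CreateBucketConfiguration.
s3.create_bucket(Bucket=bucket)

# Block every form of public access at the bucket level.
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Tag the bucket so ownership and sensitivity are visible in audits.
s3.put_bucket_tagging(
    Bucket=bucket,
    Tagging={
        "TagSet": [
            {"Key": "owner", "Value": "data-platform-team"},
            {"Key": "classification", "Value": "confidential"},
        ]
    },
)
```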

Change management as defined in frameworks such as ITIL (Information Technology Infrastructure Library) is a process designed to understand and minimise risks while making IT changes. This applies as much to cloud computing as it does to on-premises infrastructure.

As well as planning for intentional configuration changes, it is important to monitor for unintentional or accidental changes. Services such as Amazon CloudWatch allow you to monitor bucket and object events within Amazon S3 and issue alerts on them using Simple Notification Service (SNS). For more detailed processing of alerts, you can also use Amazon Simple Queue Service (SQS) or AWS Lambda.
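As a concrete example, the sketch below uses S3's own event notification feature (rather than CloudWatch directly) to publish a message to an SNS topic whenever objects are created or deleted. The bucket name and topic ARN are placeholders, and the topic's access policy must already allow S3 to publish to it.

```python
# A minimal sketch: notify an SNS topic on object creation and deletion.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="example-corp-customer-data",
    NotificationConfiguration={
        "TopicConfigurations": [
            {
                "TopicArn": "arn:aws:sns:us-east-1:123456789012:s3-object-events",
                "Events": ["s3:ObjectCreated:*", "s3:ObjectRemoved:*"],
            }
        ]
    },
)
```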

2. Security of At-Rest Data Within Amazon S3

Data stored within Amazon S3 is, by default, secure. Only the bucket owner has access. When you do make permission changes, these are applied through bucket policies and user policies.

Bucket Policies

These apply to an Amazon S3 bucket and control permissions for the bucket and everything in it. They consist of a JSON document describing the resources, actions, effects, and principals (accounts or users) that have access to the bucket and its objects.

As well as providing access to the bucket, they can also control objects placed within a bucket. For instance, you can define what types of objects are placed within a bucket, though these checks are rudimentary and can only check the file extension, not the actual MIME type.
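
As an illustration, the following boto3 sketch applies a bucket policy that grants a single external account read-only access to objects while denying any request that does not use HTTPS. The account ID and bucket name are placeholders, not values from a real configuration.

```python
# Sketch of a bucket policy: read access for one trusted account, HTTPS required.
import json
import boto3

s3 = boto3.client("s3")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowPartnerRead",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:root"},
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-corp-customer-data/*",
        },
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-corp-customer-data",
                "arn:aws:s3:::example-corp-customer-data/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        },
    ],
}

s3.put_bucket_policy(
    Bucket="example-corp-customer-data",
    Policy=json.dumps(policy),
)
```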

User Policies

These are similar to bucket policies but are applied directly to a user within their IAM user policies. Like bucket policies, they consist of a JSON document, but they only require the resources, actions, and effects that the user has access to.

The principals section is not required as the policy is attached directly to the user. User policies also allow more granularity in terms of specifying what a user has access to, as you can allow access to specific folders within a bucket, not just the bucket itself.
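
For example, a user policy along these lines restricts one IAM user to a single folder (key prefix) within a bucket. The user name, bucket name, prefix, and policy name are all placeholders for illustration.

```python
# Sketch of an IAM user policy scoped to the reports/ prefix only.
import json
import boto3

iam = boto3.client("iam")

user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Allow listing only keys under the reports/ prefix.
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::example-corp-customer-data",
            "Condition": {"StringLike": {"s3:prefix": ["reports/*"]}},
        },
        {
            # Allow reading and writing objects only under reports/.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::example-corp-customer-data/reports/*",
        },
    ],
}

iam.put_user_policy(
    UserName="analyst-jane",
    PolicyName="reports-folder-access",
    PolicyDocument=json.dumps(user_policy),
)
```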

Lifecycle Policies

Lifecycle policies are great for automating the archival or deletion of old data. They allow you to move data between the various Amazon S3 storage classes, into Amazon Glacier, or even delete it altogether. This is handy if, for example, you have policies or legal requirements defining the maximum length of time you can keep data.

When managing the lifecycle of your files, keep files with different expiration policies apart, either in separate folders or buckets. It’s very easy to mix up data and have critical information archived or worse, deleted. It’s worth noting here that if you use versioning on your bucket you can specify different lifecycle policies for the current versions of objects and previous versions of those objects.
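
As an illustration, the boto3 sketch below applies a lifecycle rule to a single prefix: objects are transitioned to Amazon Glacier after 90 days and expired after a year, and on a versioned bucket, noncurrent versions are removed after 30 days. The bucket name, prefix, and timings are placeholders.

```python
# Sketch of a lifecycle rule: archive to Glacier, then expire, scoped to one prefix.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-corp-customer-data",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                # Move to Glacier after 90 days, delete after a year.
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
                # On versioned buckets, clean up old versions sooner.
                "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
            }
        ]
    },
)
```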

3. Security of In-Transit Data to and from Amazon S3

Securing and managing your data while in transit to and from the cloud is just as important as the secure storage of data. There are a number of options for securing your connection to AWS.

For example, you can set up a VPN connection to an AWS-managed virtual private gateway, allowing you to connect securely into your VPC; you can also use services such as AWS Direct Connect to set up a dedicated, private connection from your on-premises data center to AWS that does not traverse the public internet.

These services provide secure transfer but they do not manage the transfer of data. To solve this problem you need a third-party tool such as NetApp Cloud Sync. NetApp Cloud Sync allows you to securely sync the contents of a directory from on-premises storage directly to Amazon S3.

Cloud Sync is easy to set up and manage, while ensuring that your data remains within your security boundaries. Its continuous two-way sync ensures that changes made to your local filesystem will be applied in the cloud also. This is a great way to automate the management of files you want to upload and to be certain that what’s in the cloud is meant to be there.

Summary

Data security has always been important, and as companies move to the cloud they need to ensure that they maintain and update their current policies to keep them in line with their cloud usage.

The risk of security breaches is a major concern when running any internet-based service, and companies need to be prepared for what to do when breaches occur. In July 2017 alone, more than 143 million records were leaked, affecting both large and small companies.

Companies need to secure their entire data lifecycle from creation to deletion, at rest and in transit. They need to be able to monitor and audit their data and access to it, and Cloud Sync can help you do that.

Want to get started? Try out Cloud Volumes ONTAP today with a 30-day free trial.
