Easy SFTP Setup with AWS Transfer Family

Enrico Portolan
5 min read · Oct 14, 2021

In my Solutions Engineer journey, I’ve been working with different customers and use cases. One recurring use case is customers who want to read or upload reports (application logs, for example) to an SFTP server.

Since the SFTP server will be accessed by external users (your customers), you need to make sure the right permissions are in place so users cannot access data they were not approved for. Traditionally, you would need a server to host the service, an authentication mechanism, and someone to operate the SFTP server itself.

The question is: does AWS offer a solution for this? Of course 🥳! The solution is called AWS Transfer Family.

If you prefer to follow the tutorial in video form, I published it on YouTube.


Introducing AWS Transfer for SFTP

From AWS Documentation:

AWS Transfer Family is a fully managed AWS service that you can use to transfer files into and out of Amazon Simple Storage Service (Amazon S3) storage or Amazon Elastic File System (Amazon EFS) file systems over the following protocols: SFTP, FTPS and FTP.

So forget about spinning up a new EC2 instance and security groups; you now have a fully managed AWS service that can do that for you. It’s not cheap, though. The pricing is as follows:

  • $0.30 per hour + $0.04 per gigabyte (GB) transferred (both upload and download)

So the endpoint alone costs $0.30 × 24 hours × 30 days = $216 per month, before any data transfer charges.

AWS Transfer for SFTP — Walkthrough

Let’s go through the steps needed to create the SFTP server. I will use an S3 bucket as the “backend” of the SFTP server.

Step 1: Create an S3 bucket from the AWS console. The S3 bucket is where the files will live. In the next steps, I’ll show how to set up granular permissions on paths within the bucket.
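
If you prefer to script this step instead of clicking through the console, here is a minimal sketch using boto3 (the bucket name and region below are placeholders, pick your own):

import boto3

# Create the bucket that will back the SFTP server.
s3 = boto3.client("s3", region_name="eu-west-1")

s3.create_bucket(
    Bucket="my-sftp-backend-bucket",  # placeholder bucket name
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},  # required outside us-east-1
)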

Step 2: Navigate to the AWS Transfer Family service and select Create server.

You can select three different hostname options:

  1. None: AWS will create a DNS record such as {random-123456}.server.transfer.{region}.amazonaws.com.
  2. Route 53 Alias: the service will create an alias on Route 53 so you can use a more friendly name and a subdomain of your Hosted zone for the SFTP server. This is my go-to option because I want my customers to have a DNS record with my domain.
  3. Other DNS: similar to option 2, but you will use a third-party DNS service.

The next step is to select the identity provider to use: service managed or custom. With the service-managed option, users are managed within the service itself, meaning you store each user’s public SSH key (either one you generate or one the user provides).

If using the custom identity provider option, users authenticate with a username and password managed through a third-party identity provider (such as Microsoft AD) that you integrate yourself.

Step 3: choose the domain to use. In our example, we will use S3.

Transfer Family — choose a domain

Step 4: provide the logging role to give AWS Transfer permission to write logs to CloudWatch.

Final Step: Hit Create Server Button! 🏋️‍♀️
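
If you’d rather create the server programmatically, the sketch below does roughly what the console wizard does with the choices discussed above (public endpoint, no custom hostname, service-managed users, S3 domain); the logging-role ARN is a placeholder:

import boto3

transfer = boto3.client("transfer")

# Public SFTP endpoint, service-managed users, S3 as the storage domain.
response = transfer.create_server(
    EndpointType="PUBLIC",
    IdentityProviderType="SERVICE_MANAGED",
    Protocols=["SFTP"],
    Domain="S3",
    LoggingRole="arn:aws:iam::123456789012:role/sftp-logging-role",  # placeholder ARN
)

print(response["ServerId"])  # something like s-1234567890abcdef0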

Configure IAM Roles with S3

So far, we have created an S3 bucket and the SFTP server that will sit in front of it. Depending on your use case, you may want to restrict users to reading objects only, uploading files only (PutObject), or both. The following IAM policy grants both read and write permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadWriteS3",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::{your-bucket-name}"
      ]
    },
    {
      "Sid": "",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion",
        "s3:GetObjectVersion",
        "s3:GetObjectAcl",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::{your-bucket-name}/*"
      ]
    }
  ]
}
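
One detail that is easy to miss: besides the permissions above, the role needs a trust relationship that allows transfer.amazonaws.com to assume it on behalf of your users. Here is a sketch of creating such a role with boto3; the role and policy names are placeholders, and it assumes you saved the JSON above locally as sftp-s3-read-write.json:

import boto3

iam = boto3.client("iam")

# Trust policy: let the AWS Transfer service assume this role.
trust_policy = """{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "transfer.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}"""

iam.create_role(
    RoleName="sftp-s3-access",  # placeholder role name
    AssumeRolePolicyDocument=trust_policy,
)

# Attach the read/write policy shown above as an inline policy.
with open("sftp-s3-read-write.json") as f:
    iam.put_role_policy(
        RoleName="sftp-s3-access",
        PolicyName="sftp-s3-read-write",  # placeholder policy name
        PolicyDocument=f.read(),
    )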

At this point, we need to give credentials to the users to access the SFTP server. Click the Add User button:

To create a user, you need to specify the username, the home directory (S3 bucket), the IAM role created above, and, if desired, a scope-down policy (the Policy section). With the scope-down policy, you can restrict the user to a specific S3 bucket and optional directory for more granular permissions. Find below an IAM policy that allows access only to the user’s subfolder:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListingOfUserFolder",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::${transfer:HomeBucket}"
      ],
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "${transfer:HomeFolder}/*",
            "${transfer:HomeFolder}"
          ]
        }
      }
    },
    {
      "Sid": "HomeDirObjectAccess",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:GetObjectVersion"
      ],
      "Resource": "arn:aws:s3:::${transfer:HomeDirectory}*"
    }
  ]
}

Notice the ${transfer:HomeBucket}, ${transfer:HomeFolder}, and ${transfer:HomeDirectory} placeholders in the IAM policy. These will be replaced by the S3 bucket name and the home folder specified during user creation.

The last step is to add the user’s public key. They will need the corresponding private key to connect to the SFTP server. Note that with service-managed users the SSH key is the only credential, so if you don’t add a public key the user won’t be able to authenticate at all.
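
For reference, the same user can be created from the API; every identifier in this sketch (server ID, username, role ARN, bucket path, file names) is a placeholder you would replace with your own values:

import boto3

transfer = boto3.client("transfer")

# The user's public SSH key (generated with ssh-keygen) and the
# scope-down policy shown above, both loaded from local files.
with open("id_rsa.pub") as f:
    public_key = f.read()
with open("scope-down-policy.json") as f:
    scope_down_policy = f.read()

transfer.create_user(
    ServerId="s-1234567890abcdef0",  # placeholder server ID
    UserName="customer-a",           # placeholder username
    Role="arn:aws:iam::123456789012:role/sftp-s3-access",  # placeholder role ARN
    HomeDirectory="/my-sftp-backend-bucket/customer-a",    # bucket + home folder
    Policy=scope_down_policy,
    SshPublicKeyBody=public_key,
)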

UPDATE 2021: AWS Transfer for SFTP recently added support for Logical Home Directories, which give you “chroot” and “symlink” like capabilities. For full details, see https://aws.amazon.com/blogs/storage/simplify-your-aws-sftp-structure-with-chroot-and-logical-directories/
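
If you want to try logical directories from the API, boto3’s create_user accepts HomeDirectoryType and HomeDirectoryMappings; the sketch below (placeholders again) chroots the user into a bucket prefix so they only ever see /:

import boto3

transfer = boto3.client("transfer")

# LOGICAL home directory: the user sees "/" mapped to a bucket prefix.
transfer.create_user(
    ServerId="s-1234567890abcdef0",  # placeholder server ID
    UserName="customer-b",           # placeholder username
    Role="arn:aws:iam::123456789012:role/sftp-s3-access",  # placeholder role ARN
    HomeDirectoryType="LOGICAL",
    HomeDirectoryMappings=[
        {"Entry": "/", "Target": "/my-sftp-backend-bucket/customer-b"}
    ],
    SshPublicKeyBody=open("id_rsa.pub").read(),  # the user's public key
)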

It’s time to test the connection! Use your favourite SFTP Client (CyberDuck anyone?) and try to connect. If everything is configured correctly, you should see the files in the S3 Bucket.
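
If you prefer to script the check, here is a small example using the paramiko library (pip install paramiko); the hostname, username, and key path are placeholders for your own endpoint and credentials:

import paramiko

host = "s-1234567890abcdef0.server.transfer.eu-west-1.amazonaws.com"  # placeholder endpoint
key = paramiko.RSAKey.from_private_key_file("/home/me/.ssh/id_rsa")   # placeholder key path

# Open an SFTP session authenticated with the user's private key.
transport = paramiko.Transport((host, 22))
transport.connect(username="customer-a", pkey=key)
sftp = paramiko.SFTPClient.from_transport(transport)

print(sftp.listdir("."))              # should list the objects in the user's home folder
sftp.put("report.csv", "report.csv")  # upload a local file to the bucket

sftp.close()
transport.close()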

Conclusion

AWS Transfer Family is a great tool that greatly simplifies the process of creating and managing an SFTP server. Since it’s integrated with S3 and IAM, I find user and file management straightforward. Using the “restricted folder” policy, you can create different users with different permissions in the same S3 bucket.
