
I had to set up secure FTP to Azure Blob Storage using popular FTP clients (like FileZilla, for example). After doing a lot of research, I came across a link that says:

Deployed in a worker role, the code creates an FTP server that can accept connections from all popular FTP clients (like FileZilla, for example) for command and control of your blob storage account.

Following the instructions in the link, I implemented the same and deployed the worker role to the Azure production environment successfully. But I am still not able to connect to the FTP host server (specified in my configuration file) using FileZilla. I don't know what I did wrong or whether I missed anything.

Massimiliano Kraus
techV
    But.. why? There are already two very good _FTP-style_ Azure Storage clients out there: http://storageexplorer.com/ and https://azurestorageexplorer.codeplex.com/ – evilSnobu Oct 12 '16 at 07:16
  • @evilSnobu thanks!!! ...so you mean to say I don't need an FTP setup and can do so using Azure Storage Explorer, and can also upload and download the blob files. – techV Oct 12 '16 at 07:20
  • That's exactly right. – evilSnobu Oct 12 '16 at 07:25
  • yes.. I got it. I had spent almost two days figuring out a solution for this secure FTP connectivity. Totally appreciated.. thanks buddy. – techV Oct 12 '16 at 07:31
  • @evilSnobu hey.. one thing: what if we need to give access to our clients without telling them sensitive info like the account key? Because in order to use Storage Explorer, they need that info to connect. – techV Oct 12 '16 at 07:42
  • 1
    Do take a look at storageexplorer.com. It lets you connect to your storage account using a `Shared Access Signature` which doesn't include the account key. – Gaurav Mantri Oct 12 '16 at 07:44
  • @GauravMantri thanks... can you please also tell me where I will find or get the SAS URI? When I try to connect to an individual blob, it asks me for a SAS URI. – techV Oct 12 '16 at 08:03
  • 1
    You would need to create a SAS URI either on a blob or the blob container (depending on what you're trying to do). You can create a SAS URI using this tool itself or programmatically. I would highly recommend reading https://azure.microsoft.com/en-in/documentation/articles/storage-dotnet-shared-access-signature-part-1/ to learn more about SAS. HTH. – Gaurav Mantri Oct 12 '16 at 08:06
  • @GauravMantri great.. sure i will go through this. thanks for your time :) – techV Oct 12 '16 at 08:10
  • @evilSnobu Would appreciate if you could put your comments as an answer. – Gaurav Mantri Oct 12 '16 at 08:11
  • **See also:** https://stackoverflow.com/questions/13195871/windows-azure-and-sftp – dreftymac Oct 08 '19 at 21:41
  • I do not consider the storage explorer to be a very good FTP-style transfer tool. I have not had much luck with the retry/resume capability for large multi-GB files. – JJ_Coder4Hire Oct 22 '20 at 02:02

2 Answers


If you are okay with a little programming in Node.js, you can host an FTP server backed directly by Azure Blob Storage.

You can use nodeftpd combined with azure-storage-fs. nodeftpd is an FTP server written in Node.js that supports third-party file system managers; azure-storage-fs is a file system manager designed to be used with nodeftpd that talks to Azure Blob Storage directly.

The file system manager integration code is clearly documented in the README.md of azure-storage-fs, but you will need to write your own authentication code.
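A minimal wiring sketch of that integration, assuming the `ftpd` (nodeftpd) and `azure-storage-fs` npm packages and following the pattern shown in the azure-storage-fs README. The environment variable names and the container name are placeholders, and the password check below stands in for whatever authentication logic you write yourself; verify the callback signatures against the current README before relying on this:

```javascript
const ftpd = require('ftpd');
const azureStorageFs = require('azure-storage-fs');

const server = new ftpd.FtpServer('0.0.0.0', {
  getInitialCwd: () => '/',
  getRoot: () => '/'
});

server.on('client:connected', (connection) => {
  let username;

  connection.on('command:user', (user, success, failure) => {
    username = user;
    user ? success() : failure();
  });

  connection.on('command:pass', (password, success, failure) => {
    // Placeholder authentication -- replace with your own check.
    if (password === process.env.FTP_PASSWORD) {
      // Hand nodeftpd a file system manager backed by Azure Blob Storage.
      success(username, azureStorageFs.blob(
        process.env.AZURE_STORAGE_ACCOUNT,     // storage account name
        process.env.AZURE_STORAGE_ACCESS_KEY,  // account key
        'my-container'                         // placeholder container name
      ));
    } else {
      failure();
    }
  });
});

server.listen(21);
```

Because the file system manager is passed per connection, you can hand different users different containers (or different SAS-scoped views) from the same server.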

Compulim

But why?

There are already two very good FTP-style Azure Storage clients out there:
http://storageexplorer.com and http://azurestorageexplorer.codeplex.com

Both of them, as @Gaurav well pointed out, can use a Shared Access Signature (SAS) to connect to Azure Storage without exposing the account key. You can then use a different SAS for each customer if you're building a multi-tenant service, although, if you think about it, that's not a very sound separation boundary.

Use a SAS

I would use a separate storage account for every customer. That way, if a storage account is compromised, it affects only one customer. The following limit applies:

From https://azure.microsoft.com/en-us/documentation/articles/storage-scalability-targets/:

Scalability targets for blobs, queues, tables, and files

Number of storage accounts per subscription: 200

This includes both Standard and Premium storage accounts. If you require more than 200 storage accounts, make a request through Azure Support. The Azure Storage team will review your business case and may approve up to 250 storage accounts.

evilSnobu
    Often SFTP servers are used for vendors to send data feeds, and you can't just tell them to use Storage Explorer because the feed is automated from an existing system. – Eric Grover Jun 25 '17 at 02:48
  • 1
    In that scenario your best bet would be an FTP server in a VM (or Cloud Service) with an Azure File share as storage, so you can treat the compute part more or less as stateless. Service Fabric is also a good option with its Reliable Services programming model. – evilSnobu Jun 25 '17 at 09:46
  • fine, just spin up a Linux VM with vsftpd or something. OP never mentions automation once, however he does mention FileZilla twice. – evilSnobu Aug 29 '18 at 16:45
  • I might be wrong, but the storage explorer doesn't have all of the capabilities of an FTP client, such as: (1) download an entire container, (2) download all of the blobs from within a container (they are shown paginated, so at most you could select all the blobs from one page and download only those) – sports Feb 01 '19 at 12:57
  • Regarding sound separation boundary: recommend separate subscription for each client--most scalable, easy to automate, easy enough to manage, easy to transfer all client resources to a client owner if necessary (change directory, transfer billing ownership). – lightmotive Nov 22 '19 at 16:23