
I have a worker role that pulls data down from Blob Storage in OnStart(). Currently I'm testing this by uploading a test.txt file and then downloading it to a local directory. This is working fine.
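For reference, the current download is roughly like this (a simplified sketch assuming the classic Microsoft.WindowsAzure.StorageClient API; the connection string setting, container name, and local path are placeholders):

// Minimal sketch of the current single-file download in OnStart().
// Assumes the classic Microsoft.WindowsAzure.StorageClient API; the
// "StorageConnectionString" setting, container name, and local path
// are placeholders.
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("mycontainer");

        // Download test.txt from the container into a local directory.
        var blob = container.GetBlobReference("test.txt");
        using (var file = File.Create(@"C:\local\test.txt"))
        {
            blob.DownloadToStream(file);
        }

        return base.OnStart();
    }
}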

I now would like to upload a folder to blob storage. This folder contains a batch script as well as several executables that the batch script calls.

What is the recommended way to accomplish this? I think zipping the folder and uploading the *.zip file would be easy... but then, once I download it locally for the worker role to handle, how would I unzip it without any third party libraries?

If there are better options, I'm open to any suggestions. Thanks for the help here - this community has been a huge help for me as I ramp up :)

RobVious
  • Why not just upload the individual files? Storage is cheap. – paparazzo Jun 14 '12 at 18:44
  • I should have noted that there is also a project folder that contains hundreds of files and other folders. I'd like to be able to download all contents of the container while retaining folders, filenames, etc. Is this possible? – RobVious Jun 14 '12 at 18:57
  • Please define pull data down. You need these for worker role or you need to download these files to the client? Or something else? – paparazzo Jun 14 '12 at 19:24
  • I need these files for the worker role. The easiest thing for me is to use CloudXplorer to drop the project folder into the Container... but I can't quite figure out how to download the folder as a folder from blob storage, for the worker role to use. It's just downloaded as a bytestream, I believe. – RobVious Jun 14 '12 at 19:33

6 Answers


You can use upload-batch:

az storage blob upload-batch --destination ContainerName --account-name YourAccountName --destination-path DirectoryInBlob --source /path/to/your/data

This copies all files found in the source directory to the destination path in the blob container.

Either a SAS token (via --sas-token) or an account key has to be specified.

It also works smoothly with a service principal.

kap

Log in to the Azure CLI using az login.

  1. To upload a file, use az storage blob upload additional-params

  2. To upload a folder, use az storage blob upload-batch additional-params

Refer here for the complete commands.

Shubham Gupta

With the release of the AzCopy tool you can do this in one line. Details here.

Cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"

.\AzCopy /Source:C:\Users\folderName\ `
/Dest:https://storageAccountName.blob.core.windows.net/storageContainerName `
/DestKey:yourStorageAccountKey /S
SqlWorldWide

You could use Azure Storage Explorer https://azure.microsoft.com/en-us/features/storage-explorer/ to upload a folder. It uses AzCopy in the background, but it's nice to have a user interface.

Raj

You can take two approaches: Single-file (e.g. zip) or multi-file (with each file in its own blob). Here's my take on it, then a note about unzipping:

Single zip file

This is a very easy way to maintain a grouped set of files, like an apache install, or a set of static resources. Downloading to local storage from a blob is extremely simple. And, a zip file can handle any level of nested directories.

Downside: To update a single file, you'd need to create a new zip; no way to simply upload one modified asset.

Individual blobs

Separate blobs are great when you need to update individual files quickly without worrying about other files. Also, you can link directly to these blobs, whether public or (with a Shared Access Signature) private, and embed the links in web pages, etc. Look at my answer here, as well as @Sandrino's, for examples of this. Oh, and if you're planning on exposing blobs via CDN, they'll need to be in individual blobs.

Downside: No absolute mapping to nested directories. Blob storage is arranged by account\container\blob. While you can simulate nested folders, you'd need to do some work to map individual files. To download individual blobs, you'd need to grab the container and call ListBlobs() to enumerate the individual blob names.
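To make that concrete, here's a rough sketch of a flat listing plus per-blob download (again assuming the classic Microsoft.WindowsAzure.StorageClient API; the container name and local root are whatever you use):

// Rough sketch: download every blob in a container, recreating the
// "folder" structure implied by the blob names (e.g. "scripts/run.bat").
// Assumes the classic Microsoft.WindowsAzure.StorageClient API.
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class BlobDownloader
{
    public static void DownloadContainer(CloudStorageAccount account,
                                         string containerName,
                                         string localRoot)
    {
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference(containerName);

        // A flat listing returns every blob, including those in "subfolders".
        var options = new BlobRequestOptions { UseFlatBlobListing = true };
        foreach (var item in container.ListBlobs(options))
        {
            var blob = (CloudBlob)item;

            // Blob names use '/' as the virtual folder separator.
            var localPath = Path.Combine(
                localRoot, blob.Name.Replace('/', Path.DirectorySeparatorChar));
            Directory.CreateDirectory(Path.GetDirectoryName(localPath));

            blob.DownloadToFile(localPath);
        }
    }
}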

How to unzip

The Eclipse project provides a vbs script which is trivial to use. From a Visual Studio project (or really any script), I'd consider downloading something like 7zip, which is free and trivial to install. Then just download the zip from blob storage to local storage (in the proper folder), and pass it to 7zip.
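Shelling out to 7zip from the worker role could look something like this sketch (the 7z.exe location and both paths are assumptions/placeholders, not anything Azure-specific):

// Rough sketch: shell out to 7-Zip to extract the archive downloaded
// from blob storage. The 7z.exe install location and both paths are
// placeholder assumptions.
using System.Diagnostics;

public static class ZipHelper
{
    public static void ExtractWithSevenZip(string zipPath, string targetDir)
    {
        // Assumed default install location after running the 7-Zip MSI.
        const string sevenZipExe = @"C:\Program Files\7-Zip\7z.exe";

        // "x" extracts with full paths; -o sets the output folder; -y answers yes to prompts.
        var psi = new ProcessStartInfo(sevenZipExe,
            string.Format("x \"{0}\" -o\"{1}\" -y", zipPath, targetDir))
        {
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            process.WaitForExit();   // block until extraction completes
        }
    }
}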

I hope this provides you with enough guidance to make the correct decision. If it were me and I was storing a build (like tomcat), I'd keep the entire directory structure in a zip. That gives me assurance that I haven't broken something by modifying just a single file. And... I can keep a running history of tomcat versions easily, with multiple zips (in separate blobs).

David Makogon
  • Very cool. I think I'll try to zip everything up then. This means that I'll need three files in the container: the build archive, the 7zip executable, and a batch script that tells the worker role to install 7zip, unzip the build, and then run the deployment. Does that sound about right? Cheers :) – RobVious Jun 14 '12 at 20:11
  • Sounds about right. 7zip doesn't necessarily have to go in the same container. Have as many containers as you like. The batch script should be part of your Windows Azure deployment. Since 7zip provides an MSI, that can be executed from a startup script. You'll still need to download 7zip to local storage to install it. Easy to do with a public blob, or download it direct from sourceforge. If in private blob, you'd need a way to get to access keys, or have shared access signature to the blob or the container. For now, I'd start simple and go with public blob to store 7zip msi. – David Makogon Jun 14 '12 at 21:13

Use Azure Storage Explorer. After installing it, log in, select your blob container, and upload folders like a normal upload process.

I ran into the same issue and tried to solve it by uploading a zipped folder, but there is no option to unzip the folder within Azure.