
Azure Hack: Use AzCopy on Azure Cloud Shell to Upload Large Files to Storage Accounts

by Uche Nebed, November 14th, 2019

Too Long; Didn't Read

I needed to get a 40GB Virtual Machine image to a storage account in Azure so I could create a Virtual Machine from it. Azure Cloud Shell made this possible, but its storage is limited: AzCopy keeps a cache directory named .azcopy in the home directory that needs storage roughly equal to the size of the file being copied, so large copies fail even if you manage to get the file into the clouddrive directory. The fix is to increase the file share quota, download the file into clouddrive, relocate the .azcopy cache onto the file share, and then upload the file to a blob container either as a local file or via its file-share URL.

A little backstory: I ran into an interesting problem today. I needed to get a 40GB Virtual Machine image to a storage account in Azure so I could create a Virtual Machine from it.

Traditionally, I would download the image to my computer and then upload it to a storage account in Azure. 😞 Unfortunately I had poor bandwidth, so this would've taken me almost the whole day.

💡 Azure Cloud Shell to the rescue. Here's what I thought could be easily achieved:

  • download the file directly to Azure Cloud Shell
  • upload it directly to Azure Blob Storage from the command line

Here's what I did not know:

  • Azure Cloud Shell by default has very limited storage capacity
  • Azure Cloud Shell creates a file share with a 6GB quota for each user session
  • This file share contains a 5GB Linux image, which is the OS
  • The file share storage is mounted to a directory called clouddrive
  • AzCopy has a directory named .azcopy in the home directory
  • The .azcopy directory is used as a cache directory that needs storage roughly equal to what you are trying to copy (I don't know why; I saw a few pointers though - https://github.com/Azure/azure-storage-fuse/issues/130 )
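You can see these limits for yourself from the Cloud Shell prompt. A minimal check, assuming the default Cloud Shell layout where the file share is mounted at ~/clouddrive (the .azcopy directory only exists once AzCopy has run at least once):

df -h $HOME          # the small disk backing the home directory
df -h ~/clouddrive   # the mounted file share with the 6GB default quota
du -sh ~/.azcopy     # how much space the AzCopy cache is currently using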

So, here is what it took to achieve what I had set out to do.

Step 1: Increase quota on File Share

A default Storage Account exists that contains this File Share. You would need to locate this Storage Account and increase the File Share quota for your session.
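You can do this in the portal, or with the Azure CLI from Cloud Shell itself. A hedged sketch, assuming you raise the quota to 100 GiB (the share and account names below are placeholders, and you may also need to pass an account key or connection string):

# locate the Cloud Shell storage account (usually in a resource group named cloud-shell-storage-<region>)
az storage account list -o table
# raise the quota on the Cloud Shell file share to 100 GiB
az storage share update --name <file-share-name> --account-name <storage-account-name> --quota 100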

Step 2: Download the file to the clouddrive directory

Navigate to the clouddrive directory:

cd clouddrive

Download the file there:

wget https://website-to-get-file

Now the file downloads directly to the file share, and you can view it from the file share in the storage account.
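If you'd rather confirm this from the command line than from the portal, the Azure CLI can list the share contents; the share and account names below are placeholders, and you may need to supply an account key:

az storage file list --share-name <file-share-name> --account-name <storage-account-name> -o table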

Step 3: Provide storage for AzCopy caching

As I mentioned earlier, AzCopy has a directory named .azcopy in the home directory that it uses for caching, and because of the storage limitation you cannot copy large files, even if you manage to get them to the clouddrive directory.

The error you would get looks like the one in the image below.

The workaround was simple:

  • move the entire .azcopy directory to the clouddrive directory.
  • create a symbolic link for .azcopy in the home directory, pointing to the .azcopy in the clouddrive directory.

cd ~
cp -r .azcopy clouddrive             # copy the cache directory onto the file share
rm -rf .azcopy                       # remove the original from the home directory
ln -s clouddrive/.azcopy .azcopy     # point ~/.azcopy at the copy on the file share
    

Now AzCopy has enough storage to perform the copy operation.
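With the cache relocated, the upload itself is a single AzCopy command. A minimal sketch, assuming the image was downloaded as image.vhd into clouddrive and that you have a container URL with a SAS token (the account, container, and token below are placeholders):

cd ~/clouddrive
azcopy copy './image.vhd' 'https://<storage-account>.blob.core.windows.net/<container>/image.vhd?<SAS-token>'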

PS: The downloaded file is stored on a file share, so we can move it to Azure Blob Storage in one of two ways: 1. treat the file as a local file (via the clouddrive mount) and upload it to a blob container, or 2. get an access URL to the file on the file share and use that to copy it to a blob container.
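The first option is the local-file upload sketched above. The second can also be done with AzCopy as a service-side copy from the file share to the blob container; again, every name and SAS token below is a placeholder:

azcopy copy 'https://<storage-account>.file.core.windows.net/<file-share>/image.vhd?<SAS-token>' 'https://<storage-account>.blob.core.windows.net/<container>/image.vhd?<SAS-token>'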

You can read more about Azure Cloud Shell here - https://docs.microsoft.com/en-us/azure/cloud-shell/overview

The End.