A little backstory: I ran into an interesting problem today. I needed to get a 40 GB virtual machine image into a storage account in Azure so I could create a virtual machine from it.
Traditionally, I would download the image to my computer and then upload it to a storage account in Azure. 😞 Unfortunately, I had poor bandwidth, so this would've taken me almost the whole day.
💡 Azure Cloud Shell to the rescue. Here's what I thought would be easily achieved: download the image from within Cloud Shell (so my local bandwidth wouldn't matter), then use AzCopy to move it into the target storage account.
Here's what I did not know: Cloud Shell persists files through an Azure File Share mounted at clouddrive, that file share has a size quota, and AzCopy keeps a cache directory named .azcopy in the home directory, which has very limited storage.
So, to achieve what I had assumed would be easy, here's what I did.
Step 1: Increase the quota on the File Share
A default storage account is created for Cloud Shell that contains this file share. You need to locate this storage account and increase the file share quota for your session.
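If you'd rather do this from the command line, the same quota bump can be done with the Azure CLI. This is a minimal sketch; the share name, account name, and the 100 GiB quota are placeholder assumptions, so swap in the values from your own Cloud Shell session:
# raise the quota on the file share backing Cloud Shell (names are placeholders)
az storage share update --name <your-cloudshell-share> --account-name <your-storage-account> --quota 100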
Step 2: Download the file to the clouddrive directory
Navigate to the clouddrive directory:
cd clouddrive
Then download the file:
wget https://website-to-get-file
Now the file downloads directly to the file share, and you can view it from the file share in the storage account.
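To double-check that the file actually landed on the share (and not just on the machine's local disk), you can list the share's contents with the Azure CLI. The names here are placeholders again:
# list the files on the Cloud Shell file share
az storage file list --share-name <your-cloudshell-share> --account-name <your-storage-account> --output table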
Step 3: Provide storage for AzCopy caching
As I mentioned earlier, AzCopy has a directory named .azcopy in the home directory that it uses for caching, and because of the home directory's storage limitation you cannot copy large files, even if you manage to get them into the clouddrive directory.
The error you would get looks like the one in the image below.
The workaround was simple: move the .azcopy cache onto the file share and leave a symlink behind in the home directory.
# go back to the home directory
cd ~
# copy the cache directory onto the file share
cp -r .azcopy clouddrive
# remove the original from the home directory
rm -rf .azcopy
# symlink ~/.azcopy to the copy on the file share
ln -s clouddrive/.azcopy .azcopy
Now AzCopy has enough storage to perform the copy operation.
PS: The downloaded file is stored on a file share, so we can move it to Azure Blob storage in one of two ways:
1. Treat it as a local file (via the clouddrive mount) and upload it to the blob container.
2. Get an access URL (a SAS URL) for the file on the file share and use that to copy it to the blob container.
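For reference, here's what both options could look like with AzCopy. This is a sketch under assumptions: the image name, storage account, container, share, and SAS tokens are all placeholders.
# Option 1: upload the file from the clouddrive mount as a local file
azcopy copy "$HOME/clouddrive/image.vhd" "https://<account>.blob.core.windows.net/<container>/image.vhd?<SAS>"
# Option 2: service-side copy straight from the file share URL to the blob container
azcopy copy "https://<account>.file.core.windows.net/<share>/image.vhd?<SAS>" "https://<account>.blob.core.windows.net/<container>/image.vhd?<SAS>"
Option 2 has the advantage that the copy happens entirely inside Azure, so it doesn't depend on the Cloud Shell machine at all.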
You can read more about Azure Cloud Shell here: https://docs.microsoft.com/en-us/azure/cloud-shell/overview
The End.