Migrate Amazon S3 buckets to Azure Blob Storage

The latest version of AzCopy (version 10) adds a new feature that lets you migrate Amazon S3 buckets to Azure Blob Storage. In this blog post, I will show you how to copy objects, folders, and buckets from Amazon Web Services (AWS) S3 to Azure Blob Storage using the AzCopy command-line utility. This makes it easy to migrate S3 storage to Azure or to create a simple backup of your AWS S3 bucket on Azure.

AzCopy uses the Put Block from URL API, which copies files directly from AWS to Azure. Because the data is transferred server side, the copy does not consume much of your own bandwidth, and you can copy even large objects or entire buckets from S3 to Azure.

Configure access and authorize AzCopy with Azure and AWS

First, you will need to install AzCopy on your machine. After that, you will need to authorize AzCopy with Microsoft Azure and AWS. To authorize with AWS S3, you need an AWS access key and a secret access key; once you have them, set the following environment variables:

OS         Command
Windows    set AWS_ACCESS_KEY_ID=
           set AWS_SECRET_ACCESS_KEY=
Linux      export AWS_ACCESS_KEY_ID=
           export AWS_SECRET_ACCESS_KEY=
macOS      export AWS_ACCESS_KEY_ID=
           export AWS_SECRET_ACCESS_KEY=
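
Note that the variables only apply to the shell session in which you set them, so set them in the same session from which you will run the azcopy commands below. As a minimal sketch on Linux or macOS (the key values are placeholders):

export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"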

Copy an AWS S3 object to Azure blob

You can copy a single object using the following command:

azcopy cp "https://s3.amazonaws.com/tomsbucket/tomsobject" "https://tomsstorageaccount.blob.core.windows.net/tomscontainer/tomsblob"

Copy and migrate an Amazon S3 folder to Azure

You can copy a folder from an Amazon S3 bucket to Azure Blob Storage:

azcopy cp "https://s3.amazonaws.com/tomsbucket/tomsfolder" "https://tomsstorageaccount.blob.core.windows.net/tomscontainer/tomsfolder" --recursive=true

Copy an Amazon S3 bucket to Azure blob storage

You can also copy one or multiple Amazon S3 buckets to Azure:

azcopy cp "https://s3.amazonaws.com/tomsbucket" "https://tomsstorageaccount.blob.core.windows.net/tomscontainer" --recursive=true

I hope this gives you a quick idea of how you can migrate data from Amazon AWS S3 storage to Azure using AzCopy. If you want to know more, check out the official Microsoft Docs about how to copy data from Amazon S3 buckets by using AzCopy.



Synchronize Folder with Azure Blob Storage using AzCopy

With AzCopy v10, the team added a new function to sync folders with Azure Blob Storage. This is great if you have a local folder on a server or even on a client device that you want to keep synchronized with Azure Blob Storage. The sync will not only upload new or changed files; with the "--delete-destination" parameter, you can also let AzCopy remove files on Azure Blob Storage that were deleted locally, and vice versa.

First, make sure you install and set up AzCopy.

Sync Folder with Azure Blob Storage

You can use the following command to sync a local folder with Azure Blob Storage. The command only syncs changed and new files; it compares file names and last-modified timestamps.

 
azcopy sync "C:\Temp\images" "https://tomsaccount.blob.core.windows.net/images" --recursive

As mentioned, if you set the "--delete-destination" parameter to "true", AzCopy deletes files at the destination without prompting. If you want to check which files would be removed before AzCopy deletes them, set the --delete-destination flag to "prompt".
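
As a sketch with the same example paths as above, a sync that asks before deleting anything on the destination would look like this:

azcopy sync "C:\Temp\images" "https://tomsaccount.blob.core.windows.net/images" --recursive --delete-destination=prompt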

To make sure you do not accidentally delete data, enable the soft delete feature before you use the --delete-destination parameter.

As a test, I deleted the file "3.jpg" locally and ran azcopy sync again; the file "3.jpg" was then removed from Azure Blob Storage as well.

Sync to a local folder

To sync Azure Blob Storage to a local folder, you can use the following command.

 
azcopy sync "https://tomsaccount.blob.core.windows.net/images" "C:\Temp\images" --recursive

As of today, the sync feature only supports syncing between local folders and Azure Blob Storage. Syncing with AWS S3 or from storage account to storage account is currently not supported.

I hope this gives you a quick overview of how you can sync a folder with Azure Blob Storage. If you want to know more, check out the Microsoft Docs about how to transfer data using AzCopy. If you have any questions, please let me know in the comments.



How to Install AzCopy for Azure Storage

AzCopy is a command-line tool to manage and copy blobs or files to or from a storage account. It also allows you to sync storage accounts and move files from Amazon S3 to Azure storage. In this blog post, I will cover how to install AzCopy on Windows, Linux, and macOS, and how to update the version in the Azure Cloud Shell.

AzCopy v10 is now generally available to all customers and provides higher throughput and more efficient data movement compared to the earlier version of AzCopy (v8). Version 10 also adds additional functionality, such as syncing with Blob storage, and much more.

Install AzCopy

You can get the latest version of AzCopy from here: Get started with AzCopy

Install AzCopy on Windows

To install AzCopy on Windows, you can run the following PowerShell script, or you can download the zip file and run AzCopy from wherever you want. The script below also adds the AzCopy folder location to your user PATH so that you can run the AzCopy command from anywhere.

 
#Download AzCopy
Invoke-WebRequest -Uri "https://aka.ms/downloadazcopy-v10-windows" -OutFile AzCopy.zip -UseBasicParsing
 
#Curl.exe option (Windows 10 Spring 2018 Update (or later))
curl.exe -L -o AzCopy.zip https://aka.ms/downloadazcopy-v10-windows
 
#Expand Archive
Expand-Archive ./AzCopy.zip ./AzCopy -Force
 
#Move AzCopy to the destination you want to store it
Get-ChildItem ./AzCopy/*/azcopy.exe | Move-Item -Destination "C:\Users\thmaure\AzCopy\AzCopy.exe"
 
#Add your AzCopy path to the Windows environment PATH (C:\Users\thmaure\AzCopy in this example), e.g., using PowerShell:
$userenv = [System.Environment]::GetEnvironmentVariable("Path", "User")
[System.Environment]::SetEnvironmentVariable("PATH", $userenv + ";C:\Users\thmaure\AzCopy", "User")

Install AzCopy on Linux

To install AzCopy on Linux, you can run the following shell script, or you can download the tar file and run AzCopy from wherever you want. This script puts the AzCopy executable into the /usr/bin folder so that you can run it from anywhere.

 
#Download AzCopy
wget https://aka.ms/downloadazcopy-v10-linux
 
#Expand Archive
tar -xvf downloadazcopy-v10-linux
 
#(Optional) Remove existing AzCopy version
sudo rm /usr/bin/azcopy
 
#Move AzCopy to the destination you want to store it
sudo cp ./azcopy_linux_amd64_*/azcopy /usr/bin/
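
On either platform, a quick way to verify the installation is to open a new shell and print the AzCopy version:

azcopy --version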

Authorize with Azure Storage

When you start working with Azure Storage, you have two options to authorize against the storage service: you can provide authorization credentials by using Azure Active Directory (Azure AD), or you can use a shared access signature (SAS) token.

Which method you can use also depends on the storage service:

Storage type                             Supported method
Blob storage                             Azure AD and SAS
Blob storage (hierarchical namespace)    Azure AD
File storage                             SAS only

Authenticate using Azure AD

To authenticate with AzCopy using Azure AD, you can use the following command:

 
azcopy login
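
If your account is in a specific Azure AD tenant, you can also pass the tenant ID to the login command; the ID below is just a placeholder:

azcopy login --tenant-id "<your-tenant-id>"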

Authenticate using SAS token

To authenticate with AzCopy using a SAS token, you can use the following command as an example:

 
azcopy cp "C:\local\path" "https://account.blob.core.windows.net/mycontainer1/?sv=2018-03-28&ss=bjqt&srt=sco&sp=rwddgcup&se=2019-05-01T05:01:17Z&st=2019-04-30T21:01:17Z&spr=https&sig=MGCXiyEzbtttkr3ewJIh2AR8KrghSy1DGM9ovN734bQF4%3D" --recursive=true

To make things easier, you can use Azure PowerShell to generate the SAS token for you. I wrote a blog post on ITOPSTALK.com about how you can do that. You can get the SAS token using the following Azure PowerShell commands. If you are running Linux or macOS, you can find out how to install PowerShell 6 in this blog post.

 
#Sign in to Azure and list your subscriptions
Connect-AzAccount
Get-AzSubscription
 
$subscriptionId = "yourSubscriptionId"
$storageAccountRG = "demo-azcopy-rg"
$storageAccountName = "tomsaccount"
$storageContainerName = "images"
$localPath = "C:\temp\images"
 
#Select the subscription that contains the storage account
Select-AzSubscription -SubscriptionId $subscriptionId
 
#Get the storage account key and build a storage context
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $storageAccountRG -AccountName $storageAccountName).Value[0]
 
$destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
 
#Generate a container SAS URI that is valid for one hour
$containerSASURI = New-AzStorageContainerSASToken -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -Name $storageContainerName -Permission rw
 
#Copy the local folder to the container using the SAS URI
azcopy copy $localPath $containerSASURI --recursive

To learn more about SAS tokens, check out Using shared access signatures (SAS).

I hope this helps you to install AzCopy and configure it. If you have any questions, feel free to leave a comment.



Azure Storage Explorer

Microsoft quietly released Azure Storage Explorer 1.0.0

Microsoft quietly released Azure Storage Explorer 1.0.0 back in April. There was not a lot of noise about it, but it is great that this tool finally reached version 1.0. Azure Storage Explorer is a standalone app that enables you to easily work with Azure Storage data on Windows, macOS, and Linux. This works with Azure as well as Microsoft Azure Stack.

Azure Storage Explorer is an easy-to-use tool for managing Azure Storage:

  • Access multiple accounts and subscriptions across Azure, Azure Stack, and the sovereign clouds
  • Create, delete, view, and edit storage resources
  • View and edit Blob, Queue, Table, File, Cosmos DB storage and Data Lake Storage
  • Obtain shared access signature (SAS) keys
  • Available for Windows, Mac, and Linux

Version 1.0.0 brings some highly requested new features. Especially the shared account store with Visual Studio 2017 and the improved Azure Stack integration are very welcome.

  • Enhanced authentication that allows Storage Explorer to use the same account store as Visual Studio 2017. To use this feature, you will need to re-login to your accounts and re-set your filtered subscriptions.
  • For Azure Stack accounts backed by AAD, Storage Explorer will now retrieve Azure Stack subscriptions when ‘Target Azure Stack’ is enabled. You no longer need to create a custom login environment.
  • Several shortcuts were added to enable faster navigation. These include toggling various panels and moving between editors. See the View menu for more details.
  • Storage Explorer feedback now lives on GitHub. You can reach our issues page by clicking the Feedback button in the bottom left or by going to https://github.com/Microsoft/AzureStorageExplorer/issues. Feel free to make suggestions, report issues, ask questions, or leave any other form of feedback.
  • If you are running into SSL certificate issues and are unable to find the offending certificate, you can now launch Storage Explorer from the command line with the --ignore-certificate-errors flag. When launched with this flag, Storage Explorer will ignore SSL certificate errors.
  • There is now a ‘Download’ option in the context menu for blob and file items.
  • Improved accessibility and screen reader support. If you rely on accessibility features, see our accessibility documentation for more information.
  • Storage Explorer now uses Electron 1.8.3.