17.11.2018, 20:18   #1
Blog bot
yetanotherdynamicsaxblog: Setup a cloud storage for database copy operations
Source: http://yetanotherdynamicsaxblog.blog...base-copy.html
==============

This post will show you how quickly and easily you can set up cloud storage and then copy the database between your environments. Having said that, we are waiting for this feature in LCS, and eventually there will be tooling that does this for us in a fully managed way. However, while we are waiting, we can set this up ourselves.

Set up the Storage Account

You will (obviously) need an Azure subscription for this to work. All of the steps below can be completed with a PowerShell script, so advanced users will probably write that up. Here, though, I will show how you can easily get this done by clicking around in the portal; even manually, you can set this all up in a matter of minutes.

Start by opening the Azure Portal and navigating to "Storage Accounts", where you will create yourself a new one.

You will need to choose a Resource Group, or create a new one. I typically have a Resource Group I put "DynOps" stuff in, like this Storage Account.

I want to make this a cheap account, so I tweak the settings to save money: locally redundant storage (LRS) only, and the Cool access tier. Perhaps the most important setting is the region. You will want to choose the same region as the VMs you are using; you get better performance and save a little money (not much, but still).

Oh, and also worth mentioning: the account name must be globally unique. There are a few naming rules for this, but simply put, you will probably prefix it with some company name abbreviation. If you happen to pick a name that is already taken, the portal won't let you submit the form.
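
For reference, here is roughly what the same setup looks like scripted. This is a minimal sketch (not from the original post) using the Az PowerShell module; the resource group name, account name and region are placeholders you would replace with your own.

# Minimal sketch, assuming the Az module is installed and you are signed in (Connect-AzAccount)
# 'DynOps', 'uniqueaccountname' and 'westeurope' are placeholder values
New-AzResourceGroup -Name 'DynOps' -Location 'westeurope'

New-AzStorageAccount -ResourceGroupName 'DynOps' `
    -Name 'uniqueaccountname' `
    -Location 'westeurope' `
    -SkuName 'Standard_LRS' `
    -Kind 'StorageV2' `
    -AccessTier 'Cool'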

It only takes a few minutes for Azure to spin up the new account, so sit back, relax and take a sip of that cold coffee you forgot to enjoy while it was still warm.

The next thing you'll do is open the newly created Storage Account, scroll down the list of things you can do with it, and locate "Blobs". Create yourself a new blob container and give it a name, for example "backups" or just "blob". Take note of the name, as you will need it later.
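
If you script this step instead, creating the container with the Az module could look like the sketch below, reusing the placeholder names from above.

# Get the account context and create the blob container (placeholder names)
$account = Get-AzStorageAccount -ResourceGroupName 'DynOps' -Name 'uniqueaccountname'
New-AzStorageContainer -Name 'backups' -Context $account.Context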

Then you will want to get the Access key. The Access key needs to be kept as secret as possible, since it basically grants access to everything you put into this Storage Account. If you later worry that the key has been compromised, you can regenerate it, but then your own routines will have to be updated as well. There are other ways to secure usage of the Storage Account, but for the sake of simplicity I am using the Access key in this example.
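
Scripted, fetching (and, if ever needed, regenerating) the key could look like this sketch, again with the placeholder names from above.

# List the account keys and grab the value of the first one
$keys = Get-AzStorageAccountKey -ResourceGroupName 'DynOps' -Name 'uniqueaccountname'
$keys[0].Value

# If the key is compromised, regenerate it - and update anything that uses it
New-AzStorageAccountKey -ResourceGroupName 'DynOps' -Name 'uniqueaccountname' -KeyName 'key1'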

And now you are set. That entire thing literally takes just a few minutes, if the Azure Portal behaves and you didn't mess anything up.

Using the Storage Account

I've become an avid user of the PowerShell library D365FO.tools, so for the next example I will be using it. It is super easy to install and set up, as long as the VM has an Internet connection. I'm sure you will love it too.
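
If it isn't installed yet, getting it from the PowerShell Gallery is a one-liner (run it in an elevated prompt, or add -Scope CurrentUser):

Install-Module -Name d365fo.tools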

Assuming it is installed, I will first run a command to save the cloud Storage Account information on the machine (using the popular PSFramework). This command will actually save the information in the Registry.


# Fill in your own values
$params = @{
    Name        = 'Default'            # Just a name; you can add multiple configurations and switch between them
    AccountId   = 'uniqueaccountname'  # Name of the Storage Account in Azure
    Blobname    = 'backups'            # Name of the blob container on the Storage Account
    AccessToken = 'long_secret_token'  # The Access key
}

# Create the storage configuration locally on the machine
Add-D365AzureStorageConfig @params -ConfigStorageLocation System -Verbose
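
To double-check what was saved, the module can list the stored configurations:

Get-D365AzureStorageConfig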

Now let's assume you ran the command below to extract a bacpac of your sandbox Tier2 environment.

Import-Module d365fo.tools

$dbsettings = Get-D365DatabaseAccess

$baseParams = @{
    DatabaseServer = $dbsettings.DbServer
    SqlUser        = 'sqladmin'
    SqlPwd         = 'SQLADMIN_PASSWORD_FROM_LCS'
    Verbose        = $true
}
$params = $baseParams + @{
    ExportModeTier2 = $true
    DatabaseName    = $dbsettings.Database
    NewDatabaseName = $($dbsettings.Database + '_adhoc')
    BacpacFile      = 'D:\Backup\sandbox_adhoc.bacpac'
}

# Drop any leftover copy from a previous run, then export the bacpac
Remove-D365Database @baseParams -DatabaseName $($params.NewDatabaseName)
New-D365Bacpac @params

You now want to upload the bacpac (database backup) file to the blob container in your cloud Storage Account using the following PowerShell script.

Set-D365ActiveAzureStorageConfig -Name 'Default'

$StorageParams = Get-D365ActiveAzureStorageConfig
Invoke-D365AzureStorageUpload @StorageParams -Filepath 'D:\Backup\sandbox_adhoc.bacpac' -DeleteOnUpload

The next thing you do is jump over to the VM (a Tier1 onebox) where you want to download the bacpac. Obviously, D365FO.tools must be installed there as well. Assuming it is, and assuming you've also run the command above to save the cloud Storage Account information on that machine, you can run the following PowerShell script to download it.

Set-D365ActiveAzureStorageConfig -Name 'Default'

$StorageParams = Get-D365ActiveAzureStorageConfig
Invoke-D365AzureStorageDownload @StorageParams -FileName 'sandbox_adhoc.bacpac' -Path 'D:\Backup'

Finally, you would run something like this to import the bacpac to the target VM.

Import-Module d365fo.tools

$bacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
$sourceDatabaseName = "AxDB_Source_$(Get-Date -UFormat "%y%m%d%H%M")"

# Remove any old temp source DB
Remove-D365Database -DatabaseName $sourceDatabaseName -Verbose

# Import the bacpac to the local SQL Server
Import-D365Bacpac -ImportModeTier1 -BacpacFile $bacpacFile -NewDatabaseName $sourceDatabaseName -Verbose

# Remove any old AxDB backup (if it exists)
Remove-D365Database -DatabaseName 'AxDB_original' -Verbose

# Stop local environment components
Stop-D365Environment -All

# Switch AxDB with the source DB
Switch-D365ActiveDatabase -DatabaseName 'AxDB' -NewDatabaseName $sourceDatabaseName -Verbose

# Start the environment components again
Start-D365Environment -All

Isn't that neat? Now you have a way to move databases between environments while we're waiting for this to be fully supported out of the box in LCS - fingers crossed!

Source: http://yetanotherdynamicsaxblog.blog...base-copy.html
Tags
d365fo.tools

 
