Upgrading my G/L Source Names Extension to AL – step 3

When upgrading an extension from C/AL to AL (version 1 to version 2) we need to think about the data upgrade process.

In C/AL we needed to add two functions to an extension Codeunit to handle the installation and upgrade.  This I did with Codeunit 70009200.  One function is executed once for each install.

And another function is executed once for each company in the database where the extension is installed.

For each database I add my permission sets to the installation users and for each company I restore the setup data for my extension and populate the lookup table for G/L Source Name.

The methods for install and upgrade have changed in AL for extensions version 2.  Look at the AL documentation from Microsoft for details.

In version 2 I remove these two obsolete functions from my application management Codeunit and add two new Codeunits, one for install and another for upgrade.

In the code you can see that this Codeunit is of Subtype=Install.  This code will be executed when installing the extension in a database.

To confirm this I can see that I have the G/L Source Names Permission Sets in the Access Control table.

And my G/L Source Name table also has all required entries.

Uninstalling the extension will not remove this data.  Therefore you need to make sure that the install code is structured in a way that it will work even when reinstalling.  Look at the examples from Microsoft to get a better understanding.

Back to my C/AL extension.  When uninstalling that one, the data is moved to archive tables.

Archive tables are handled with the NAVAPP.* functions.  The OnNavAppUpgradePerCompany trigger shown at the top handles these archive tables when reinstalling or upgrading.

Basically, since I am keeping the same table structure I can use the same set of commands for my upgrade Codeunit.

So, time to test how and if this works.

I have my AL folder open in Visual Studio Code and I use the AdvaniaGIT command Build NAV Environment to get the new Docker container up and running.

Then I use Update launch.json with current branch information to update my launch.json server settings.

I like to use the NAV Container Helper from Microsoft to manually work with the container.  I use a command from the AdvaniaGIT module to import the NAV Container Module.

The module uses the container name for most of its functions.  The container name can be found by listing the running Docker containers or by asking for the name that matches the server used in launch.json.
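A minimal sketch of both, assuming the helper module is published as navcontainerhelper on the PowerShell Gallery:

[code lang="powershell"]
# Load the container helper module on the host
Import-Module navcontainerhelper

# List the running containers; the Names column holds the container name
# that the helper cmdlets expect in their -containerName parameter
docker ps --format "table {{.Names}}\t{{.Image}}\t{{.Status}}"
[/code]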

I need my C/AL extension package inside the container, so I copied it in from the host.
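A sketch of that copy, assuming the navcontainerhelper cmdlet Copy-FileToNavContainer and example container name and paths:

[code lang="powershell"]
# Copy the version 1 (.navx) extension package from the host into the container
Copy-FileToNavContainer -containerName "navserver" `
    -localPath "C:\AdvaniaGIT\GLSourceNames.navx" `
    -containerPath "C:\Run\GLSourceNames.navx"
[/code]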

Then I open PowerShell inside the container

Import the NAV Administration Module

and I am ready to play.  I install the C/AL extension.
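Put together, those three steps could look like this (a sketch; the container name, the service folder version in the path, and the app name are assumptions):

[code lang="powershell"]
# Open an interactive PowerShell session inside the container
Enter-NavContainer -containerName "navserver"

# Inside the container: load the NAV Administration module
# (the path assumes a NAV 2018 service folder)
Import-Module "C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1"

# Publish and install the version 1 (C/AL) extension package
Publish-NAVApp -ServerInstance NAV -Path "C:\Run\GLSourceNames.navx" -SkipVerification
Install-NAVApp -ServerInstance NAV -Name "G/L Source Names"
[/code]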

Now I am faced with the fact that I have opened PowerShell inside the container in my AdvaniaGIT terminal.  That means that my AdvaniaGIT commands will execute inside the container, but not on the host.

The simplest way to solve this is to open another instance of Visual Studio Code.  From there I can start the Web Client and complete the install and configuration of my C/AL extension.

I complete the Assisted Setup and do a round trip to G/L Entries to make sure that I have enough data in my tables to verify that the data upgrade is working.

I can verify this by looking into the SQL tables for my extension.  I use PowerShell to uninstall and unpublish my C/AL extension.
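With the administration module loaded, that is two cmdlets (a sketch; the app name is an assumption):

[code lang="powershell"]
# Uninstalling moves the extension data to archive tables instead of deleting it
Uninstall-NAVApp -ServerInstance NAV -Name "G/L Source Names"
Unpublish-NAVApp -ServerInstance NAV -Name "G/L Source Names"
[/code]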

I can verify that in my SQL database I now have four AppData archive tables.
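One way to spot them is to query the database catalog (a sketch; the SQL server name, database name, and name pattern are assumptions):

[code lang="powershell"]
# List tables whose names suggest archived extension data
Invoke-Sqlcmd -ServerInstance "navserver" -Database "CRONUS" `
    -Query "SELECT name FROM sys.tables WHERE name LIKE '%AppData%' ORDER BY name"
[/code]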

Pressing F5 in Visual Studio Code will now publish and install the AL extension, even if I have the terminal open inside the container.

The extension is published but can’t be installed because I had previously installed an older version of my extension.  Back in my container PowerShell I will follow the steps as described by Microsoft.
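In my case that meant synchronizing the schema of the new version and then running the data upgrade (a sketch; the app name and version are examples):

[code lang="powershell"]
# Synchronize the new (version 2) schema, then move the archived
# version 1 data into the new extension tables
Sync-NAVApp -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
Start-NAVAppDataUpgrade -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
[/code]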

My AL extension is published and I have verified in my SQL server that all the data from the C/AL extension has been moved to the AL extension tables and all the archive tables have been removed.

Back in Visual Studio Code I can now use F5 to publish and install the extension again if I need to update, debug and test my extension.

A couple more steps are left that I will cover shortly.  Happy coding…


Building a clean database – remove not licensed objects

I just got a question from a client;

Gunnar,
Do you have a “King Kong” license that will allow you to delete any object?  It appears our development license does not have the rights to some of the newer LS Retail objects and I need to create a CRONUS database with just our stuff.

Well, I don’t have a “King Kong” license.  That is only for Microsoft.

There is a way to solve this dilemma.  It will take a few steps.

We start with two databases: one with the data we need (LSRetail) and another with the application we need (CRONUS).

After the process is completed the LSRetail database will not be usable as a standalone database, so make a copy if you need one.  A new database will be created, CRONUS_APP.  To clean up it is safe to delete both these databases.

The following PowerShell script has two options.  Option 1 is to have the company data imported into the CRONUS database in the end.  This option requires a server instance running on the CRONUS database.  Option 2 is to create a new empty database with SQL Management Studio and merge the CRONUS application and the LSRetail data into that one.

[code lang="powershell"]
$CronusDatabaseName = "CRONUS" # Database with destination Application
$CRONUSServerInstance = "DynamicsNAV80" # Instance for destination Application if using option 1
$LSRetailDatabaseName = "LSRETAIL" # LS Retail Demo Database, database with company data
$EmptyDatabaseName = "CRONUS WITH COMPANYDATA" # Create a new empty database using SQL Management Studio if using option 2
$SQLServerName = "SQL2014"
$SQLServerInstance = "NAVDEMO" # Set blank for default instance

$AppDatabaseName = $CronusDatabaseName + "_APP"
$ServiceAccount = $env:USERDOMAIN + "\" + $env:USERNAME
$ServerInstance = "UPGRADE"
$NavDataFile = (Join-Path $env:TEMP "NAVmerge.navdata")

$SelectOption = "2"

#Export Application from CRONUS Database to Application Database
Export-NAVApplication -DatabaseServer $SQLServerName -DatabaseInstance $SQLServerInstance -DatabaseName $CronusDatabaseName -DestinationDatabaseName $AppDatabaseName -ServiceAccount $ServiceAccount -Force

#Setup a temporary Server Instance for the new database
Get-Credential | New-NAVServerInstance -ServerInstance $ServerInstance -ManagementServicesPort 33555 -ClientServicesPort 33556 -SOAPServicesPort 33557 -ODataServicesPort 33558 -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $AppDatabaseName -ServiceAccount User -Force
Set-NAVServerConfiguration -ServerInstance $ServerInstance -KeyName "Multitenant" -KeyValue "true" -Force
Set-NAVServerInstance -ServerInstance $ServerInstance -Start -Force

#Prepare LSRetailDatabase for new configuration
Remove-NAVApplication -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Force

#Mount and Sync LSRetailDatabase as a tenant
Mount-NAVTenant -ServerInstance $ServerInstance -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Id DEFAULT -OverwriteTenantIdInDatabase -AllowAppDatabaseWrite -Force
Sync-NAVTenant -ServerInstance $ServerInstance -Tenant DEFAULT -Mode ForceSync -Force

if (Test-Path $NavDataFile)
{
    Remove-Item -Path $NavDataFile -Force
}

#Option 1, Copy Company data to the original CRONUS database. Requires a service running on the CRONUS database
if ($SelectOption -eq "1")
{
    Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -Force
    Import-NAVData -ServerInstance $CRONUSServerInstance -FilePath $NavDataFile -AllCompanies -Force
}

#Option 2, Import into the new empty database created by SQL Management Studio
if ($SelectOption -eq "2")
{
    Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -IncludeApplication -IncludeApplicationData -IncludeGlobalData -Force
    if ($SQLServerInstance -eq "")
    {
        Import-NAVData -DatabaseServer $SQLServerName -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication
    }
    else
    {
        Import-NAVData -DatabaseServer ($SQLServerName + "\" + $SQLServerInstance) -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication
    }
}

Set-NAVServerInstance -ServerInstance $ServerInstance -Stop -Force
Remove-NAVServerInstance -ServerInstance $ServerInstance -Force

if (Test-Path $NavDataFile)
{
    Remove-Item -Path $NavDataFile -Force
}
[/code]

To walk you through what happens:

  • The application from CRONUS is exported into the CRONUS_APP database
  • A new service instance is created for the CRONUS_APP database
  • The service instance is changed to multitenant and started
  • The application is removed from the LSRetail database
  • The LSRetail database is mounted as a tenant of the CRONUS_APP database
  • The LSRetail database structure is force-synced to the CRONUS_APP application
  • Data from the CRONUS_APP application and the LSRetail tenant is exported to a NAVData file
  • The NAVData file is imported into an empty database or the existing CRONUS database


Technical upgrade from NAV 2013 R2 to NAV 2015

I just made a backup of a live NAV 2013 R2 database to do an upgrade to NAV 2015.  Since the system is live we have several servers running when the backup is created.

The backup was restored in a new environment and the database opened with the NAV 2015 Development Environment.  The database is put in single user mode and converted.  After the conversion a process to upgrade objects starts.  Here is where things start to behave in an unwanted way.

[Screenshot: chooseInstance]

The database is in single user mode since it is still in the upgrade transaction and no NAV Server instance is connected.  However, in the table “Server Instance” all the old NAV Server instances are listed.  To get through this I needed to press N like a thousand times…

In a test upgrade process like this we need to add another step to the usual upgrade procedure.  After the restore is completed make sure that the tables “Server Instance” and “Active Session” are empty.
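That can be done straight from PowerShell (a sketch; the server, instance, and database names are examples):

[code lang="powershell"]
# Clear stale server instance and session records left over from the
# live environment before starting the conversion
Invoke-Sqlcmd -ServerInstance "SQL2014\NAVDEMO" -Database "NAV2013R2" `
    -Query "DELETE FROM [dbo].[Server Instance]; DELETE FROM [dbo].[Active Session];"
[/code]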

After a restore to a new environment it is also good to clean unneeded users from the database.  Only leave what is actually needed.

[Screenshot: users]

In this case NAVLIGHT\srvNAV is the service user.  Also make sure that the Compatibility Level is as new as possible and the Recovery Model is in line with the backup process.

[Screenshot: dbproperty]
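Both settings can be changed with T-SQL as well (a sketch; the database name is an example, and level 120 equals SQL Server 2014):

[code lang="powershell"]
# Raise the compatibility level and use simple recovery when no log
# backups are planned for this test environment
Invoke-Sqlcmd -ServerInstance "SQL2014\NAVDEMO" -Query @"
ALTER DATABASE [NAV2015] SET COMPATIBILITY_LEVEL = 120;
ALTER DATABASE [NAV2015] SET RECOVERY SIMPLE;
"@
[/code]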

Also, when moving from one environment to another the users connected to the old domain are obsolete.  Going to Users and trying to remove them results in an error and the user is only disabled.

[Screenshot: userdelete]

In some cases this could be enough.  It is possible to completely remove the user, just do some housekeeping first.  Find the user in “User Personalization” and edit the record.  On the Action ribbon we can clean up some of the user trails.

[Screenshot: clearpersonalization]

Finally make sure that the record is deleted from “User Personalization”.

After this cleanup work the user can be safely deleted.


Give our customers a NAV 2015 demo with their own data in a few steps

Microsoft just released the Cumulative Update 1 for Dynamics NAV 2015.  With this update we no longer need to push our data upgrade through NAV 2013 R2.  As the blog says:

“The cumulative update includes an upgrade toolkit for upgrading a Microsoft Dynamics NAV 2009 R2 or Microsoft Dynamics NAV 2009 SP1 database to Microsoft Dynamics NAV 2015. The upgrade toolkit includes several application objects in FOB files that simplify the upgrade process for those of you coming from Microsoft Dynamics NAV 2009 R2 or Microsoft Dynamics NAV 2009 SP1. For more information, see the attached whitepaper.”

Attached is the new PDF document on how to upgrade a NAV 2009 R2 or NAV 2009 SP1 database to NAV 2015.

The biggest task in the data upgrade from a customized database to a standard one is to handle all the data modifications.  For this I use the Table Compare tool.  I need both the customized version and the old standard version.  Export all tables from the Object Designer to a text file and import into the tool.

[Screenshot: ImportTableData2015]

Also import the 2015 objects in a similar way.  After importing all the table definitions into the tool and selecting the appropriate Base Version for each customized version, go into the Version Comparisons.

[Screenshot: VersionCompare1]

After pressing the Compare button the result is shown.

[Screenshot: VersionCompareResult]

Click on the counters to drill down to the tables.  For all the tables listed as modified and deleted select the action to force the data transformation.

[Screenshot: ForceTables]

Back on the Version Comparison click the button to Create 2015 Upgrade Code and Save Codeunit Object File.

[Screenshot: SaveUpgradeCodeunit]


Now we step over to the data upgrade task.  We follow the guide by Microsoft: prepare the 2009 R2 version, convert with 2013, open with 2015, compile system tables, attach a service, run the schema synchronization and import the 2015 application objects.

Here we stop after step 21 and add our exported upgrade Codeunit by importing and compiling.  Now, run the Sync. Schema for All Tables With Validation.  This should clean all fields that are not part of standard NAV.

[Screenshot: Sync]
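The same synchronization can also be run from PowerShell, where -Mode Sync is the validated variant (a sketch; the instance name is an example):

[code lang="powershell"]
# Synchronize all table schemas with validation; the fields handled by
# the custom upgrade codeunit are cleaned out of the SQL tables here
Sync-NAVTenant -ServerInstance DynamicsNAV80 -Mode Sync -Force
[/code]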

Remove the custom upgrade Codeunit from the database and continue with step 22, Import Upgrade601800.[Country].fob and so forth…

Continue the Microsoft steps and you should have a 2015 version database ready to use as a demo database for the customer.

P.s. This method will also apply to any upgrade process from 2009 R2 to 2015.  If there is customization in the 2015 database, just use the tool to create the temporary tables and data transfer code.

The Next Upgrade Story – Data transfer code generation tool

Guess what, more companies are moving to NAV 2013 R2.

I am about to start a data upgrade for a medium-sized Icelandic company.  Doing a data transfer means that you need to compare the tables of the old version and the new version, and where you have a mismatch you have to take some action.  The data transfer is built on the upgrade from 2009 to 2013 R2 released by Microsoft.

We have the MergeTool that is used to compare and merge code and a lot of other cool things.  I wanted a tool that did something else, so I started designing and writing one.  I am hoping that the result will save me a lot of work in future upgrades.

[Screenshot: UpgradeMenu]

This tool imports exported object files into a table version.  Then I can compare two table versions and get all the differences listed.  I also import the CRONUS tables for each version to make sure that I am not comparing built-in fields.

[Screenshot: TableVersions]

I give the comparison an identification and select the source and destination versions.  Then I hit Compare.  Within seconds I get the result.

[Screenshot: CompareResult3]

I have to make decisions on what to do with modified and deleted tables.  I can list all the tables and check the field comparison.

[Screenshot: Fields]

I can select both step 1 and step 2 actions for each table.  For new fields I can select a source field from the source table.  The example I show here is the customer table.  I choose in step 1 to copy all the needed fields to a temporary table.  The tool will find the next available upgrade table automatically.  In step 2 I choose to move the data from the temporary table into the destination version.  I can ask that a new field will get a value from another field in the temporary table.  In this example the Registration No. field has a new ID.

The best part is yet to come.  To really save some work I make the tool create all the needed NAV upgrade objects.  For step 1 I get all the upgrade tables with only required fields.

[code]OBJECT Table 66001 Temp Customer
{
  OBJECT-PROPERTIES
  {
    Date=11.02.14;
    Time=12:11:11;
    Version List=Dynamics.is;
  }
  PROPERTIES
  {
  }
  FIELDS
  {
    { 1       ;   ;No.                      ;Code20 }
    { 10010400;   ;Registration No.         ;Code20 }
    { 10017241;   ;Require Purchase Request ;Boolean }
    { 10017290;   ;Freight Paid By          ;Option ;OptionString=[Recipient,Prepaid,Senders Account,See Ship-to Address] }
  }
  KEYS
  {
    { ;No. ;Clustered=Yes }
  }
  FIELDGROUPS
  {
  }
  CODE
  {

    BEGIN
    {
      Dynamics.is Upgrade Table
    }
    END.
  }
}

OBJECT Table 66002 Temp Cust. Ledger Entry
{

[/code]

The codeunit required to do the data manipulation:

[code]OBJECT Codeunit 66000 Upgrade FurblandanR
{
  OBJECT-PROPERTIES
  {
    Date=11.02.14;
    Time=12:11:11;
    Version List=Dynamics.is;
  }
  PROPERTIES
  {
  }
  CODE
  {

    PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
    BEGIN
      // Call this function from the top of the Upgrade trigger in Codeunit ID 104045
      CopyCustomer(StateIndicator);
      CopyCustLedgerEntry(StateIndicator);

      MoveCashierBankBranch(StateIndicator);

      ForcePaymentTerms(StateIndicator);
      ForceSalesHeader(StateIndicator);
    END;

    LOCAL PROCEDURE CopyCustomer@2(VAR StateIndicator@1002 : Record 104037);
    VAR
      Customer@1000 : Record 18;
      TempCustomer@1001 : Record 66001;
    BEGIN
      WITH Customer DO
        IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
          IF FINDSET(TRUE) THEN
            REPEAT
              StateIndicator.Update;
              CALCFIELDS("Picture");
              TempCustomer.INIT;
              TempCustomer.TRANSFERFIELDS(Customer);
              TempCustomer.INSERT;
            UNTIL NEXT = 0;
          MODIFYALL("Registration No.",'');
          MODIFYALL("Require Purchase Request",FALSE);
          MODIFYALL("Freight Paid By",0);
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
          StateIndicator.EndUpdateTable(TABLENAME);
        END;
    END;

    LOCAL PROCEDURE CopyCustLedgerEntry@3(VAR StateIndicator@1002 : Record 104037);
    VAR
      CustLedgerEntry@1000 : Record 21;
      TempCustLedgerEntry@1001 : Record 66002;
    BEGIN
      WITH CustLedgerEntry DO
        IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
          IF FINDSET(TRUE) THEN
            REPEAT
              StateIndicator.Update;
              TempCustLedgerEntry.INIT;
              TempCustLedgerEntry.TRANSFERFIELDS(CustLedgerEntry);
              TempCustLedgerEntry.INSERT;
            UNTIL NEXT = 0;
          MODIFYALL("Final Due Date",0D);
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
          StateIndicator.EndUpdateTable(TABLENAME);
        END;
    END;

    LOCAL PROCEDURE MoveCashierBankBranch@26(VAR StateIndicator@1002 : Record 104037);
    VAR
      CashierBankBranch@1000 : Record 10010402;
      TempCashierBankBranch@1001 : Record 10000202;
    BEGIN
      WITH CashierBankBranch DO
        IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
          IF FINDSET(TRUE) THEN
            REPEAT
              StateIndicator.Update;
              TempCashierBankBranch.INIT;
              TempCashierBankBranch.TRANSFERFIELDS(CashierBankBranch);
              TempCashierBankBranch.INSERT;
            UNTIL NEXT = 0;
          DELETEALL;
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
          StateIndicator.EndUpdateTable(TABLENAME);
        END;
    END;

    LOCAL PROCEDURE ForceFixedAsset@48(VAR StateIndicator@1002 : Record 104037);
    VAR
      FixedAsset@1000 : Record 5600;
    BEGIN
      WITH FixedAsset DO
        IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
          IF FINDFIRST THEN BEGIN
            StateIndicator.Update;
            MODIFYALL("Straight-Line %",0);
            MODIFYALL("Depreciation Starting Date",0D);
            MODIFYALL("Biling Contract No.",'');
          END;
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
          StateIndicator.EndUpdateTable(TABLENAME);
        END;
    END;


[/code]

And to complete step 1, the addition needed in the DeleteDiscontinuedTables trigger:

[code]// Add these lines to the DeleteDiscontinuedTables trigger of Codeunit ID 104002
DeleteTable(72000);
DeleteTable(72001);
DeleteTable(72003);
DeleteTable(87400);
…
[/code]

Then there is the upgrade codeunit for step 2:

[code]OBJECT Codeunit 66001 Upgrade FurblandanR
{
  OBJECT-PROPERTIES
  {
    Date=11.02.14;
    Time=12:11:11;
    Version List=Dynamics.is;
  }
  PROPERTIES
  {
  }
  CODE
  {

    PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
    BEGIN
      // Call this function from the top of the Upgrade trigger in Codeunit ID 104048
      CopyBankingWebServicesUser(StateIndicator);
      MoveCustomer(StateIndicator);
      …
    END;

    LOCAL PROCEDURE CopyBankingWebServicesUser@2(VAR StateIndicator@1002 : Record 104037);
    VAR
      TempCashierWebServicesUser@1000 : Record 66022;
      BankingWebServicesUser@1001 : Record 10010428;
    BEGIN
      WITH TempCashierWebServicesUser DO
        IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
          IF FINDSET(TRUE) THEN
            REPEAT
              StateIndicator.Update;
              BankingWebServicesUser.INIT;
              BankingWebServicesUser."Authentication Code" := "Authentication Code";
              BankingWebServicesUser."User ID" := "User ID";
              BankingWebServicesUser."Certificate Key Identifier" := "Certificate Key Identifier";
              //BankingWebServicesUser."User ID Store" := ;
              //BankingWebServicesUser."Password Store" := ;
              //BankingWebServicesUser."Certificate Location" := ;
              //BankingWebServicesUser."Certificate Value" := ;
              BankingWebServicesUser.INSERT;
            UNTIL NEXT = 0;
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
          StateIndicator.EndUpdateTable(TABLENAME);
        END;
    END;

    LOCAL PROCEDURE MoveCustomer@3(VAR StateIndicator@1002 : Record 104037);
    VAR
      TempCustomer@1000 : Record 66001;
      Customer@1001 : Record 18;
    BEGIN
      WITH TempCustomer DO
        IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
          IF FINDSET(TRUE) THEN
            REPEAT
              StateIndicator.Update;
              Customer.GET("No.");
              Customer."Registration No." := "Registration No.";
              //Customer."Momentum Claims Active" := ;
              Customer.MODIFY;
            UNTIL NEXT = 0;
          DELETEALL;
          TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
          StateIndicator.EndUpdateTable(TABLENAME);
        END;
    END;


[/code]

And to complete the process the upgrade tables are marked to be deleted.

[code]// Add these lines to the OnRun trigger of Codeunit ID 104003
MarkTable(66001);
MarkTable(66002);
…[/code]

This tool is in testing and I am planning to add support for the NAV 2015 upgrade toolkit.

Help Microsoft with the upgrade story

Customer Survey: Microsoft Dynamics NAV upgrade

With this survey, we would like to solicit your valuable input in regard to the Microsoft Dynamics NAV upgrade story to better understand your current upgrade situation.

The questionnaire below focuses on upgrading from Microsoft Dynamics NAV 2009 Classic client to the RoleTailored client as well as the upgrade experience in general. The Microsoft Dynamics NAV team is determined to provide you with the best possible service to help you move your business forward and get the most of our new releases.

http://sgiz.mobi/s3/8dd03168613b

The upgrade story – continued

The client went live on the 6th.  This is the task I told you about last September.

The database was 567GB in size with just a little less than 400GB used.  It had 1.506.747 sales invoices and 30.438.858 value entries.  Version 4.0 SP3 with a lot of added fields and customization.

When I first ran the upgrade process with the standard upgrade tool it took about ten days.  As you can see there is no way to stop a company for ten days.  The process was rewritten and these are the steps I used.

  • Back up the live database with compression enabled
  • Copy the backup file to the new SQL server
  • Restore the backup on the new server
  • Drop all statistics that were preventing me from modifying tables (see the sketch after this list)
  • Run Inventory Adjustment
  • Import and execute 4.0 SP3 -> 2009 R2 Step 1
  • Import 2009 R2 objects
  • Prepare data for Step 2 by importing conversion tables
  • Execute 4.0 SP3 -> 2009 R2 Step 2
  • Import 2009 R2 -> 2013 R2 upgrade objects and execute Step 1
  • Change the database compatibility level to SQL Server 2012
  • Execute SQL commands to drop all statistics that were blocking the database conversion
  • Open the database with the NAV 2013 Development Environment and convert the database
  • Open the database with the NAV 2013 R2 Development Environment and convert the database
  • Compile the system tables
  • Change the SQL Command Timeout for the service to 12:00:00 and start the service
  • Select “No” in the Development Environment option “Prevent data loss from table changes”
  • Import all NAV 2013 R2 objects
  • From PowerShell, execute Sync-NAVTenant
  • Select “Yes” in the Development Environment option “Prevent data loss from table changes”
  • Import 2009 R2 -> 2013 R2 upgrade objects and execute Step 2
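For the two statistics steps, something along these lines does the job (a sketch; the server and database names are examples):

[code lang="powershell"]
# Build and run DROP STATISTICS statements for every auto- or user-created
# statistic on user tables; index statistics are excluded since they are
# dropped together with their indexes
$query = @"
DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += N'DROP STATISTICS '
    + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.'
    + QUOTENAME(o.name) + N'.' + QUOTENAME(s.name) + N';'
FROM sys.stats AS s
JOIN sys.objects AS o ON o.object_id = s.object_id
WHERE o.type = 'U' AND (s.user_created = 1 OR s.auto_created = 1);
EXEC sp_executesql @sql;
"@
Invoke-Sqlcmd -ServerInstance "NEWSQL" -Database "NAV2013R2" -Query $query
[/code]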

There were a few minor steps in between and afterwards that were needed for this customer, but basically this was it.

The good news is: this was completed in just under 23 hours, including an unexpected delay of just over an hour.  This can be done and the client is happy.