AzureSQL database gives me change tracking error

I just uploaded a SQL bacpac to AzureSQL, something I have done a number of times.  I connected my service to the SQL database and tried to start the service.

This time I got an error.  Looking in Event Viewer I could see the details of a change tracking error.

I looked into the SQL database, and sure enough there was a row in the sys.change_tracking_databases table.  The problem was that in that table the [database_id] was equal to 48, while the actual id of the database resulted in 49.  Hence the error and my service tier failing to start.
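
The mismatch is easy to verify; a minimal check, run against the Azure SQL database itself:

[code lang="sql"]-- Compare the database_id recorded for change tracking with the actual id of the database.
SELECT [database_id] FROM sys.change_tracking_databases;
SELECT DB_ID() AS [actual database_id];
[/code]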

To remove the change tracking from the database I executed the following (found here).
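
The exact statement is not shown here; below is only a hedged sketch of the kind of commands involved, assuming a database named MyDatabase with change tracking enabled on one or more tables:

[code lang="sql"]-- Disable change tracking on any tracked tables first, then on the database itself.
ALTER TABLE [dbo].[MyTrackedTable] DISABLE CHANGE_TRACKING;
ALTER DATABASE [MyDatabase] SET CHANGE_TRACKING = OFF;
[/code]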

The service tier will take care of turning change tracking on again when it starts.  You might need to repeat these steps if you restart the service tier.

According to Microsoft a fix is in the pipeline and likely ships in CU2.

Sharing data with multiple tenants

I am upgrading multiple companies to NAV 2016.  I really like the multi tenant setup and use it in most cases.

In NAV we have the option to make a table common to all companies.

NoDataPerCompany

This option has been available for all the versions of NAV that I can remember.

Using a multi tenant setup means that you have a dedicated database for each tenant and normally only one company for each tenant.  That makes this option completely useless.

I was running this same multi tenant setup in NAV 2013 R2 and there I solved this issue by modifying the table to be a linked table.

LinkedObject

To successfully setup a linked table I need to manually make sure that the table or view with the correct name and the correct layout is present in every tenant database.  That is a job for the SQL Server Management Studio (SSMS) and can’t be done within NAV.  Doing this also makes upgrades more difficult and can get in the way of a normal synchronization of metadata and tables.

Moving up to NAV 2016 I wanted to get out of this model and use the External SQL methods now available.

ExternalTable

With these properties we can select the table or view name as the ExternalName and the table or view schema as the ExternalSchema.  For all fields in the table we can define an ExternalName.  If that is not defined the normal NAV field name will be used.

FieldNames

This option basically opens the door from the NAV Server to any SQL table.  So, how do we get this to work?

I will show you how I moved from the Linked Table method to the External SQL method.  If you take another look at the properties available for an External SQL table you will see that the DataPerCompany property is not available.  So, an External SQL table is just a table definition for NAV to use, and with C/AL code you can define where to find the external table.  This gives you the flexibility to have the same table for all companies and all tenants, or to select different tables per tenant and/or per company.

In Iceland we have a national registry.  That registry holds the registration details for every person and every company.  Some companies buy access to the data from the national registry, keep a local copy and are allowed to do lookups from this data.  Since the data in this table is updated centrally but every company in every tenant wants access to it, this is a good candidate for the External SQL table method.

I already had the table defined in NAV with the needed data.  I was able to find that table with SSMS.

OriginalTable

By using this table I did not have to worry about the ExternalName for each column in my table definition since it already matched the NAV field names.

I found my application database and used the script engine in SSMS to script the database creation.  I updated the database name to create a central database for my centralized data.  I chose to use this method to make sure that the new database has the same collation as the NAV application database.
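
Stripped of the file locations and options that SSMS scripts for you, the statement boils down to something like this; the database name and collation below are assumptions:

[code lang="sql"]-- A minimal sketch; the point is that the new database gets the same collation as the NAV database.
CREATE DATABASE [National Registry]
COLLATE Icelandic_100_CI_AS
GO
[/code]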

I scripted the National Register table creation and created the table in my centralized database.  Then I combined the INSERT INTO and SELECT FROM scripts to insert the data into my centralized table.
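
Combined, the two scripts end up looking roughly like this; the database, company and table names are assumptions and the real scripts list every column explicitly:

[code lang="sql"]-- A hedged sketch of copying the data into the centralized table.
INSERT INTO [National Registry].[dbo].[National Register]
SELECT *
FROM [Demo Database NAV (7-1)].[dbo].[My Company$National Register]
GO
[/code]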

Finally I made sure that the service user running the NAV service had access to my centralized database.  By doing this I can use a trusted connection between the NAV server and the SQL server.
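
Granting that access can also be scripted; a sketch, assuming the NAV service runs as DOMAIN\nav_service and that login already exists on the SQL Server:

[code lang="sql"]-- Create a database user for the service account and give it read access.
USE [National Registry]
GO
CREATE USER [DOMAIN\nav_service] FOR LOGIN [DOMAIN\nav_service]
GO
ALTER ROLE [db_datareader] ADD MEMBER [DOMAIN\nav_service]
GO
[/code]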

Moving to the NAV development environment and into the table properties.

NationalRegisterExternalSQL

The ExternalName and ExternalSchema must match the table I created.  Look at the picture from the SSMS to see “FROM [dbo].[National Register]”.  There you can pick up what you need to specify in these properties.

When these changes are synchronized to my database NAV will remove the previous National Register table from the NAV database.  That requires a synchronization with force so be careful.

The actual connection to the centralized database must be done in C/AL.  More information is in this MSDN article.

To complete this solution I use several patterns.

I need setup data to point me to the centralized database.  I like to look at this as an external service, so I link the setup to the Service Connections; Waldo calls this the Discovery Event Pattern.  I create the following function in a Codeunit to register the service.

RegisterConnection

So, if the user has write access to the National Register Setup he will see this service listed in the service connections.

The link to an external database might require me to save a user name and a password.  To successfully do this I apply another pattern for password encryption.  I normally point people to the OCR service setup table and page to find out how to implement these password patterns.

I like to have the Enabled field on my setup tables.  When the service is enabled the user can’t modify the setup data and when trying to enable the service the setup data is verified.  Fields on the setup page are protected by using the EditableByNotEnabled variable.

EditableByNotEnabled

I don’t think you will find a pattern for this method but in other details the setup table follows the Singleton pattern.

NRSetup2

When the user tries to enable the service I do a series of error tests.  The error testing is done with a combination of the Error Message pattern and the TryFunction pattern.

TestSetup

Line 21 does the actual connection test with a TryFunction.

Now, how to connect to the centralized data?

In my setup table I store the database server name and the database name within that server.  With this data I create the connection string.

RegisterUserConnection

The table connection must have a unique id.  I like to create a function to return variables that are linked to the functionality instead of using text constants.

GetConnectionName

This combines what I need to do.  With the correct connection string C/AL registers the connection to my centralized database and sets that connection as the default connection for the user.  When NAV needs to use the data from the National Register, C/AL must register the connection.

CheckOrRegister

I add a call to this Codeunit from every page and every function that uses the National Register.

PageInit

Now, back to my TryFunction: I can simply check whether I can do a FINDFIRST without an error.

TryLookup


My SQL Server 2014 gets flooded and stops responding

All right, I must confess: my SQL Server is not installed on a super computer.  Also, it is not installed and configured by a SQL Server MVP nor by Alain Krikilion (picture from NAV TechDays 2013).

Alain_Krikilion

The Hyper-V machine running my SQL Server has two processors and 12GB of memory.  The operating system is Windows Server 2012 R2 x64 and the SQL Server version is 2014.

I have two NAV 2016 CU5 instances on two servers running a multi tenant application.  Every time I started the second instance everything froze.  Even the SQL Server stopped responding.  I am not going to pretend that I understand the issue, but still, I found a solution and wanted to share that with you.

We have a property for the NAV Service called “Max Concurrent Calls”.  This is, by default, set to 40.  I found out that if I lowered this value to 10 on the second instance I was able to start it.  That is a workaround, not a solution.  What if I want to start a third service instance?  Do I then need to lower the value even more, or update the configuration for all the previously running instances?

So I turned my attention to my SQL Server.  On MSDN Microsoft states:

This topic describes how to set the user connections server configuration option in SQL Server 2014 by using SQL Server Management Studio or Transact-SQL. The user connections option specifies the maximum number of simultaneous user connections that are allowed on an instance of SQL Server. The actual number of user connections allowed also depends on the version of SQL Server that you are using, and also the limits of your application or applications and hardware. SQL Server allows a maximum of 32,767 user connections. Because user connections is a dynamic (self-configuring) option, SQL Server adjusts the maximum number of user connections automatically as needed, up to the maximum value allowable. For example, if only 10 users are logged in, 10 user connection objects are allocated. In most cases, you do not have to change the value for this option. The default is 0, which means that the maximum (32,767) user connections are allowed.

This page also shows an example of how to configure this setting.  In that example the property is set to 325.  So, I tried that, returning “Max Concurrent Calls” back to the default value for all NAV Services.  Happy times, everything is running smoothly.

So how can two NAV Servers that have “Max Concurrent Calls” set to 40 flood a SQL server that supports 325 concurrent user connections?  I even looked at the network connections by executing

netstat -nao | find "1433" > connections.txt

in a command prompt and looking at the output.  The total number of connections was less than 50!  I think I must leave it to Microsoft or my friend Alain to explain this.
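
Another way to count the connections, from the SQL side instead of with netstat (an extra check, not part of the original troubleshooting), is to query the session DMV:

[code lang="sql"]-- Current user sessions on the instance, grouped by the connecting application.
SELECT [program_name], COUNT(*) AS [Sessions]
FROM sys.dm_exec_sessions
WHERE is_user_process = 1
GROUP BY [program_name];
[/code]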

So it was clear; the Next, Next, Finish methodology failed me this time.  I needed to get my hands dirty and fix the SQL Server configuration.

First, I looked at the “Max Degree of Parallelism” property.  The default value is zero, but it should be set to the number of CPUs minus 1.

Parallelism

Then I updated the user connections property.

UserConnections

I restarted the SQL Server Service to apply changes.

I updated one more property and below is the SQL Query I used to update these properties.
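
In essence it is a couple of sp_configure statements; a sketch, with the values taken from the text above:

[code lang="sql"]-- Both settings are advanced options, hence 'show advanced options' first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'user connections', 325;
EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE;
GO
[/code]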


Why this turns up in NAV 2016 and not in previous NAV versions, what changed, I don’t know.  Perhaps someone out there has the answer.

If you get stuck in a problem like this, see if you have NAS services running on both instances.  That seemed to be my problem.

Localized views removed from NAV 2013 R2

In Dynamics NAV 2009 and older we had an option to maintain views in the database.  Activating this caused NAV to create a view for each language and every table in the database.

This is a useful thing if you are building BI on top of your SQL and would like to skip all the translation work.

This however had some drawbacks.

  • Every table in the database is maintained
  • Option values are shown as numbers
  • Booleans are shown as numbers
  • Global dimensions do not have the correct caption
  • Time is shown as DateTime
  • It is not easy to see the difference between a normal date and a closing date

I already discussed this in a post back in 2012.  Because of this I never used this built-in option.  The solution I created back in 2012 has been upgraded and is now ready and working in NAV 2013 R2.

SQLViewMenu

I start by exporting all tables from the Object Designer as a text file.  This text file is imported with the task “Read Option Description”.  All option captions from the import are stored in a table in the database.

SQLBIDates

Dates are created according to the Accounting Periods.  The BI Dates table keeps multiple fields with information about the date.  By joining this table to your SQL query or adding it as a dimension to your OLAP cube you have a lot of options to handle dates and periods.

SQLSelectedTables

The tables required for the views are added to a list.  Remember to add the BI Dates table to the list of tables to be used in your business intelligence tool.

SQLCreate

NAV is now ready to create the SQL commands to create the views.  The prefix can be used if the views are to be kept in a separate database, which I suggest you do.  A prefix can for example be “[Demo Database NAV (7-1)].[dbo].”

SQLExportFile

The file for SQL Management Studio is now ready for execution.

[code lang="sql"]IF EXISTS(SELECT * FROM sys.views WHERE object_id = OBJECT_ID(N'dbo.ENU$Kappi ehf_$Currency'))
DROP VIEW [dbo].[ENU$Kappi ehf_$Currency]
GO
CREATE VIEW [dbo].[ENU$Kappi ehf_$Currency]
AS
SELECT
[Code] As [Code]
,[Last Date Modified] As [Last Date Modified]
,[Last Date Adjusted] As [Last Date Adjusted]
,[Unrealized Gains Acc_] As [Unrealized Gains Acc_]
,[Realized Gains Acc_] As [Realized Gains Acc_]
,[Unrealized Losses Acc_] As [Unrealized Losses Acc_]
,[Realized Losses Acc_] As [Realized Losses Acc_]
,[Invoice Rounding Precision] As [Invoice Rounding Precision]
,[Invoice Rounding Type] As [Invoice Rounding Type]
,[Amount Rounding Precision] As [Amount Rounding Precision]
,[Unit-Amount Rounding Precision] As [Unit-Amount Rounding Precision]
,[Description] As [Description]
,[Amount Decimal Places] As [Amount Decimal Places]
,[Unit-Amount Decimal Places] As [Unit-Amount Decimal Places]
,[Realized G_L Gains Account] As [Realized G_L Gains Account]
,[Realized G_L Losses Account] As [Realized G_L Losses Account]
,[Appln_ Rounding Precision] As [Appln_ Rounding Precision]
,CASE [EMU Currency]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [EMU Currency]
,[Currency Factor] As [Currency Factor]
,[Residual Gains Account] As [Residual Gains Account]
,[Residual Losses Account] As [Residual Losses Account]
,[Conv_ LCY Rndg_ Debit Acc_] As [Conv_ LCY Rndg_ Debit Acc_]
,[Conv_ LCY Rndg_ Credit Acc_] As [Conv_ LCY Rndg_ Credit Acc_]
,[Max_ VAT Difference Allowed] As [Max_ VAT Difference Allowed]
,[VAT Rounding Type] As [VAT Rounding Type]
,[Payment Tolerance _] As [Payment Tolerance _]
,[Max_ Payment Tolerance Amount] As [Max_ Payment Tolerance Amount]
,[Wage Amount Rounding Precision] As [Wage Amount Rounding Precision]
,[Wage Rate Rounding Precision] As [Wage Rate Rounding Precision]
,[Wage Amount Decimal Places] As [Wage Amount Decimal Places]
,[Wage Rate Decimal Places] As [Wage Rate Decimal Places]
FROM [Kappi ehf_$Currency]
GO

IF EXISTS(SELECT * FROM sys.views WHERE object_id = OBJECT_ID(N'dbo.ENU$Kappi ehf_$BI Dates'))
DROP VIEW [dbo].[ENU$Kappi ehf_$BI Dates]
GO
CREATE VIEW [dbo].[ENU$Kappi ehf_$BI Dates]
AS
SELECT
[Date] As [Date]
,[Date Name] As [Date Name]
,[Year] As [Year]
,[Week] As [Week]
,[Month] As [Month]
,[Month Name] As [Month Name]
,[Day of Week] As [Day of Week]
,[Day Name] As [Day Name]
,[Day of Month] As [Day of Month]
,CASE [Closing Date]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Closing Date]
,[SQL Month Name] As [SQL Month Name]
,[SQL Day Name] As [SQL Day Name]
,[Quarter] As [Quarter]
,[Year Month] As [Year Month]
,[Year Month Name] As [Year Month Name]
,[Month Year Name] As [Month Year Name]
,[Quarter Name] As [Quarter Name]
,[VAT Period] As [VAT Period]
,[VAT Period Name] As [VAT Period Name]
,[Sorting Date] As [Sorting Date]
,[HRMS Integer Start] As [HRMS Integer Start]
,[HRMS Integer End] As [HRMS Integer End]
,[Day of Year] As [Day of Year]
,[Day of Half Year] As [Day of Half Year]
,[Day of Quarter] As [Day of Quarter]
,[Day of Accounting] As [Day of Accounting]
,[Half Years] As [Half Years]
,[Half Year of Year] As [Half Year of Year]
,CASE [Is Holiday]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Is Holiday]
,CASE [Is Working Day]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Is Working Day]
,[Month of Half Year] As [Month of Half Year]
,[Month of Quarter] As [Month of Quarter]
,[Month of Year] As [Month of Year]
,[Quarters of Half Year] As [Quarters of Half Year]
,[Quarters of Year] As [Quarters of Year]
,[Week of Year] As [Week of Year]
,CASE [Is Week Day]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Is Week Day]
,[Half Year Name] As [Half Year Name]
,[Week Name] As [Week Name]
,[Fiscal Day] As [Fiscal Date]
,[Fiscal Year] As [Fiscal Year]
,[Fiscal Week] As [Fiscal Week]
,[Fiscal Month] As [Fiscal Month]
,[Fiscal Day of Week] As [Fiscal Day of Week]
,[Fiscal Day of Month] As [Fiscal Day of Month]
,[Fiscal Quarter] As [Fiscal Quarter]
,[Fiscal Day of Year] As [Fiscal Day of Year]
,[Fiscal Day of Half Year] As [Fiscal Day of Half Year]
,[Fiscal Day of Quarter] As [Fiscal Day of Quarter]
,[Fiscal Half Years] As [Fiscal Half Years]
,[Fiscal Half Year of Year] As [Fiscal Half Year of Year]
,[Fiscal Month of Half Year] As [Fiscal Month of Half Year]
,[Fiscal Month of Quarter] As [Fiscal Month of Quarter]
,[Fiscal Month of Year] As [Fiscal Month of Year]
,[Fiscal Quarters of Half Year] As [Fiscal Quarters of Half Year]
,[Fiscal Quarters of Year] As [Fiscal Quarters of Year]
,[Fiscal Week of Half Year] As [Fiscal Week of Half Year]
,[Fiscal Week of Month] As [Fiscal Week of Month]
,[Fiscal Week of Quarter] As [Fiscal Week of Quarter]
,[Fiscal Week of Year] As [Fiscal Week of Year]
,[Fiscal Quarter Name] As [Fiscal Quarter Name]
,[Fiscal Half Year Name] As [Fiscal Half Year Name]
,[Fiscal Week Name] As [Fiscal Week Name]
,[Fiscal Month Name] As [Fiscal Month Name]
FROM [Kappi ehf_$BI Dates]
GO
[/code]

We now have a localized version of our table data and all the above drawbacks have been fixed.  Now go ahead and build your reports or your OLAP cube on top of these views.  You can use the BI Dates table for your filter area in the reports to make sure that only available periods are selected.
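
As an example of how the BI Dates view can work as a date dimension, here is a hedged query; the G/L Entry view name is an assumption and is not part of the script above:

[code lang="sql"]-- Summarize posted amounts per month using the BI Dates view.
SELECT d.[Year], d.[Month], d.[Month Name], SUM(e.[Amount]) AS [Amount]
FROM [dbo].[ENU$Kappi ehf_$G_L Entry] e
INNER JOIN [dbo].[ENU$Kappi ehf_$BI Dates] d ON d.[Date] = e.[Posting Date]
GROUP BY d.[Year], d.[Month], d.[Month Name]
ORDER BY d.[Year], d.[Month]
GO
[/code]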

Download objects.


Change table layout and move data

In older versions of NAV I got used to using a report to move data from one table to another.  Since the last update from Microsoft for NAV 2013 R2 I am using the SQL Management Studio more and more.

For example, yesterday, I needed to change a table layout that included a change in the primary key.

I had modified the table in the development database but was unable to import the new table because of the existing data.

I solved this by creating a temporary table with the original layout.  I can choose any table number that I can use with my development license, since I will not be using the table in NAV.  I designed the original table and used File-Save As to create the new table.

Then I moved over to the SQL Management Studio.  There I found the original table and scripted a SELECT statement to a new query window.  Next I found the newly created shadow table and scripted an INSERT statement to the clipboard.  Then I combined these two statements to look like this.

[code lang="sql"]USE [NAV_DATABASE]
GO

INSERT INTO [dbo].[NAV Company$Temp Partner Inbound Mapping]
([IC Partner Code]
,[Responsibility Center]
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code])
SELECT
[IC Partner Code]
,[Responsibility Center]
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code]
FROM [dbo].[NAV Company$IC Partner Inbound Mapping]
GO
DELETE FROM [dbo].[NAV Company$IC Partner Inbound Mapping]
[/code]

At this stage I had deleted all data from the table and was able to import the new, modified table into the production NAV database. In this case I have two new fields in the new table. The primary key changed to include a line number. That meant that I needed to create a line number for each row. This can be done with SQL.

[code lang="sql"]USE [NAV_DATABASE]
GO

INSERT INTO [dbo].[NAV Company$IC Partner Inbound Mapping]
([IC Partner Code]
,[Line No_]
,[Responsibility Center]
,[Ship-to Code]
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code])
SELECT
[IC Partner Code]
,10000 * ROW_NUMBER() OVER(ORDER BY [IC Partner Code]) AS Row
,[Responsibility Center]
,''
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code]
FROM [dbo].[NAV Company$Temp Partner Inbound Mapping]
GO

DELETE FROM [dbo].[NAV Company$Temp Partner Inbound Mapping][/code]

If I had needed to start the row number at 10000 for every instance of IC Partner Code, I would have needed to use a cursor, iterate through every instance of IC Partner Code from another table and execute the statement filtered by that IC Partner Code.  I used this, for example, in the Azure Backup Script.

I finished this process by deleting the temporary table from the production database.


Test and fix dimensions before upgrading to NAV 2013

A coworker at Advania is working on a Business Intelligence solution called Advania Insight.  He is working on the installation for a company that is running NAV.  This company has data from Navigator since 1994.

One of the problems we saw was that some of the dimension codes used were not standard dimension values but, for example, headings or footers.  This did not go too well through the analysis and we needed to fix it.

Doing this manually is a lot of work, so I wrote some SQL queries to find these errors. For example, this query will find all entries in the G/L Entry table that have an incorrect value in Global Dimension 1 Code.

[code lang="sql"]USE [Demo Database NAV (7-1)]
GO

SELECT
[Entry No_]
FROM [dbo].[CRONUS Ísland hf_$G_L Entry],
[dbo].[CRONUS Ísland hf_$Dimension Value]
WHERE [Global Dimension 1 Code] = [Code] AND
[Global Dimension No_] = 1 AND
[Dimension Value Type] > 0
GO
[/code]

This query can be used to find ledger entries that have a Global Dimension 1 Code but are missing the corresponding entry in the Ledger Entry Dimension table.

[code lang="sql"]SELECT [Entry No_]
FROM [dbo].[CRONUS Ísland hf_$G_L Entry] V
WHERE (SELECT COUNT(D.[Dimension Code]) FROM [dbo].[CRONUS Ísland hf_$Ledger Entry Dimension] D
WHERE [Table ID] = 17 AND V.[Entry No_] = D.[Entry No_] AND D.[Dimension Code] = 'DEILD') = 0
AND [Global Dimension 1 Code] <> ''
GO
[/code]

This query can be used to find entries with Global Dimension 1 Code that are missing from the Dimension Value table.

[code lang="sql"]USE [Demo Database NAV (7-1)]
GO

SELECT [Entry No_]
FROM [dbo].[CRONUS Ísland hf_$G_L Entry] V
WHERE (SELECT COUNT(D.[Dimension Code]) FROM [dbo].[CRONUS Ísland hf_$Dimension Value] D
WHERE [Global Dimension 1 Code] = D.[Code] AND D.[Global Dimension No_] = 1) = 0
AND [Global Dimension 1 Code] <> ''
GO[/code]

Looking forward I saw that it would be a lot of work to manually check all these possibilities so I decided to write a SQL script. First a script that will do the dimension type test on the Ledger Entry Dimension table.

[code lang="sql"]USE [KS Dynamics NAV]
GO

DECLARE @invalididentifierscars varchar(10)
DECLARE @replacestring varchar(10)
SET @invalididentifierscars = (SELECT [invalididentifierchars] FROM [dbo].[$ndo$dbproperty])

DECLARE @varSQL varchar(max)
DECLARE @DimTableName varchar(256)
DECLARE @DimValueTableName varchar(256)
DECLARE @getCompanyNames CURSOR
DECLARE @CompanyName varchar(50)
DECLARE @loop int
CREATE TABLE #TmpLedgerEntryDimCombinations (CompanyName varchar(50), TableID int, TableName varchar(50), GlobalDimCode varchar(20), GlobalDimNo int)
CREATE TABLE #TmpIncorrectDimEntries (CompanyName varchar(50), TableID int, TableName varchar(50), GlobalDimCode varchar(20), GlobalDimNo int, EntryNo int)
SET @getCompanyNames = CURSOR FOR SELECT [Name] FROM [dbo].[Company]
OPEN @getCompanyNames
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @DimTableName = @CompanyName + '$Ledger Entry Dimension'
SET @DimValueTableName = @CompanyName + '$Dimension Value'
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @DimTableName = REPLACE(@DimTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @DimValueTableName = REPLACE(@DimValueTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @varSQL = 'USE [' + DB_NAME() + ']; INSERT INTO #TmpLedgerEntryDimCombinations SELECT DISTINCT ''' + @CompanyName + ''',[Table ID],O.[Name],L.[Dimension Code],[Global Dimension No_]
FROM [dbo].[' + @DimTableName + '] L,[dbo].[' + @DimValueTableName + '] D, [dbo].[Object] O
WHERE D.[Dimension Code] = L.[Dimension Code] AND [Global Dimension No_] > 0 AND O.[Type] = 0 AND O.[ID] = [Table ID]'
EXEC (@varSQL)
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
END
CLOSE @getCompanyNames
DEALLOCATE @getCompanyNames

DECLARE @TableID int
DECLARE @TableName varchar(50)
DECLARE @GlobalDimCode varchar(20)
DECLARE @GlobalDimNo int
DECLARE @DimFieldName varchar(50)
DECLARE @DimFixes CURSOR
SET @DimFixes = CURSOR FOR SELECT CompanyName, TableID, TableName, GlobalDimCode, GlobalDimNo FROM #TmpLedgerEntryDimCombinations
OPEN @DimFixes
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @OrigCompanyName varchar(50)
DECLARE @OrigTableName varchar(50)
SET @OrigCompanyName = @CompanyName
SET @OrigTableName = @TableName
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @CompanyName = REPLACE(@CompanyName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @TableName = REPLACE(@TableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @DimFieldName = '[Global Dimension ' + CAST(@GlobalDimNo as varchar(1)) + ' Code]'
SET @varSQL = 'INSERT INTO #TmpIncorrectDimEntries (CompanyName, TableID, TableName, GlobalDimCode, GlobalDimNo, EntryNo)
SELECT ''' + @OrigCompanyName + ''', ' + CAST(@TableID as varchar(20)) + ',''' + @OrigTableName + ''',''' + @GlobalDimCode + ''',' + CAST(@GlobalDimNo as varchar(1)) + ', [Entry No_]
FROM [dbo].[' + @CompanyName + '$' + @TableName + '], [dbo].[' + @CompanyName + '$Dimension Value]
WHERE ' + @DimFieldName + ' = [Code] AND [Global Dimension No_] = ' + CAST(@GlobalDimNo as varchar(1)) + ' AND [Dimension Value Type] > 0'
EXEC(@varSQL)
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
END
CLOSE @DimFixes;
DEALLOCATE @DimFixes;

SELECT * FROM #TmpLedgerEntryDimCombinations
DROP TABLE #TmpLedgerEntryDimCombinations
SELECT * FROM #TmpIncorrectDimEntries
DROP TABLE #TmpIncorrectDimEntries
GO

[/code]

Then a similar script that will update the Ledger Entry Dimension table according to the values in Global Dimension 1 Code and Global Dimension 2 Code.

[code lang="sql"]USE [KS Dynamics NAV]
GO

DECLARE @invalididentifierscars varchar(10)
DECLARE @replacestring varchar(10)
SET @invalididentifierscars = (SELECT [invalididentifierchars] FROM [dbo].[$ndo$dbproperty])

DECLARE @varSQL varchar(max)
DECLARE @DimTableName varchar(256)
DECLARE @DimValueTableName varchar(256)
DECLARE @getCompanyNames CURSOR
DECLARE @CompanyName varchar(50)
DECLARE @loop int
CREATE TABLE #TmpLedgerEntryDimCombinations (CompanyName varchar(50), TableID int, TableName varchar(50), GlobalDimCode varchar(20), GlobalDimNo int)
SET @getCompanyNames = CURSOR FOR SELECT [Name] FROM [dbo].[Company]
OPEN @getCompanyNames
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @DimTableName = @CompanyName + '$Ledger Entry Dimension'
SET @DimValueTableName = @CompanyName + '$Dimension Value'
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @DimTableName = REPLACE(@DimTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @DimValueTableName = REPLACE(@DimValueTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @varSQL = 'USE [' + DB_NAME() + ']; INSERT INTO #TmpLedgerEntryDimCombinations SELECT DISTINCT ''' + @CompanyName + ''',[Table ID],O.[Name],L.[Dimension Code],[Global Dimension No_]
FROM [dbo].[' + @DimTableName + '] L,[dbo].[' + @DimValueTableName + '] D, [dbo].[Object] O
WHERE D.[Dimension Code] = L.[Dimension Code] AND [Global Dimension No_] > 0 AND O.[Type] = 0 AND O.[ID] = [Table ID]'
EXEC (@varSQL)
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
END
CLOSE @getCompanyNames
DEALLOCATE @getCompanyNames

DECLARE @TableID int
DECLARE @TableName varchar(50)
DECLARE @GlobalDimCode varchar(20)
DECLARE @GlobalDimNo int
DECLARE @DimFieldName varchar(50)
DECLARE @DimFixes CURSOR
SET @DimFixes = CURSOR FOR SELECT CompanyName, TableID, TableName, GlobalDimCode, GlobalDimNo FROM #TmpLedgerEntryDimCombinations
OPEN @DimFixes
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
WHILE @@FETCH_STATUS = 0
BEGIN
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @CompanyName = REPLACE(@CompanyName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @TableName = REPLACE(@TableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @DimFieldName = '[Global Dimension ' + CAST(@GlobalDimNo as varchar(1)) + ' Code]'
SET @varSQL = 'DELETE FROM [' + @CompanyName + '$Ledger Entry Dimension]
WHERE [Table ID] = ' + CAST(@TableID as varchar(20)) + ' AND [Dimension Code] = ''' + @GlobalDimCode + ''''
EXEC(@varSQL)
SET @varSQL = 'INSERT INTO [' + @CompanyName + '$Ledger Entry Dimension] ([Table ID],[Entry No_],[Dimension Code],[Dimension Value Code])
SELECT ' + CAST(@TableID as varchar(20)) + ',[Entry No_],''' + @GlobalDimCode + ''',' + @DimFieldName + '
FROM [dbo].[' + @CompanyName + '$' + @TableName + '] E
WHERE ' + @DimFieldName + ' <> '''''
EXEC(@varSQL)
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
END
CLOSE @DimFixes;
DEALLOCATE @DimFixes;

DROP TABLE #TmpLedgerEntryDimCombinations
GO

[/code]

Reading through this you should see that the basic parts of the two scripts are similar and could be used to build further tests and for other dimension tables.  It is important that all the testing is done before upgrading to NAV 2013 or NAV 2013 R2.

There are a lot of tables you will need to consider and do a data check for.

DimensionTables
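
One way to find candidate tables on the SQL side (not part of the original checklist) is to look for the global dimension columns directly:

[code lang="sql"]-- Run in the NAV database; lists every table that carries a Global Dimension 1 Code column.
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'Global Dimension 1 Code'
ORDER BY TABLE_NAME;
[/code]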

Some of you have most likely written some kind of dimension test.  Perhaps you can also share them here?

Restore Database with PowerShell and create Tenants

In my latest blog in this PowerShell series I showed how to install Dynamics NAV on a remote or local computer with PowerShell.

The next step is to create the databases. To prepare PowerShell for NAV and SQL administration I import these modules.

[code lang="powershell"]Import-Module 'C:\Program Files\Microsoft Dynamics NAV\71\Service\NavAdminTool.ps1'
Import-Module SQLPS
[/code]

I use the Demo Database that is included with the Dynamics NAV DVD to create the default tenant database. This PowerShell script restores the database backup to a new name in a given path on a given SQL server.

[code lang="powershell"]$NewDatabaseName = 'NAV71_L01_DEF'
$NewDatabasePath = 'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA'
$BackupFile = 'C:\SQLDemoDatabase\CommonAppData\Microsoft\Microsoft Dynamics NAV\71\Database\Demo Database NAV (7-1).bak'
$sqlserver = 'sqlserver'

$NewDatabaseFullPath = Join-Path $NewDatabasePath $NewDatabaseName
Write-Host Restoring $BackupFile to $NewDatabaseFullPath

#Load the DLLs if not using SQLPS
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoExtended') | out-null

$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $sqlserver
$Restore = New-Object Microsoft.SqlServer.Management.Smo.Restore

#settings for the restore
$Restore.Action = "Database"
$Restore.NoRecovery = $false;
$Restore.ReplaceDatabase = $false;
$Restore.PercentCompleteNotification = 5;
$Restore.Devices.AddDevice($BackupFile, [Microsoft.SqlServer.Management.Smo.DeviceType]::File)

#get the db name
$RestoreDetails = $Restore.ReadBackupHeader($server)

#print database name
"Database Name from Backup File : " + $RestoreDetails.Rows[0]["DatabaseName"]

#give a new database name
$OldDatabaseName = $RestoreDetails.Rows[0]["DatabaseName"]
$Restore.Database = $NewDatabaseName

Write-Host Changing Database Name $OldDatabaseName to $NewDatabaseName

$RestoreFile = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile
$RestoreLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile

#set file names; use the default database directory
$OldDataFileName = $OldDatabaseName + '_Data'
$NewDataFileName = Join-Path $NewDatabasePath $NewDatabaseName
$RestoreFile.LogicalFileName = $OldDataFileName
$RestoreFile.PhysicalFileName = $NewDataFileName + '.mdf'
Write-Host Changing Data File $OldDataFileName to $NewDataFileName
$OldLogFileName = $OldDatabaseName + '_Log'
$NewLogFileName = Join-Path $NewDatabasePath $NewDatabaseName
$RestoreLog.LogicalFileName = $OldLogFileName
$RestoreLog.PhysicalFileName = $NewLogFileName + '.ldf'
Write-Host Changing Log File $OldLogFileName to $NewLogFileName
$Restore.RelocateFiles.Add($RestoreFile)
$Restore.RelocateFiles.Add($RestoreLog)

$Restore.SqlRestore($Server)
write-host "Restore of " $NewDatabaseName "Complete"

# add role membership
$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$sqlserver;Integrated Security=SSPI;Initial Catalog=$NewDatabaseName");
$cn.Open()
$q = "EXEC sp_addrolemember @rolename = N'db_owner', @membername = N'NT AUTHORITY\NETWORK SERVICE'"
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$cmd.ExecuteNonQuery() | out-null
$cn.Close()[/code]

To be able to use this, the SQL Management tools must be installed. After the database has been restored, the script makes the NETWORK SERVICE account an owner. This can of course be customized to other users.
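
For example, to grant the same rights to a dedicated service account instead, the statement the script builds would simply reference that account; the account name below is a placeholder:

[code lang="sql"]-- Run against the restored database.
EXEC sp_addrolemember @rolename = N'db_owner', @membername = N'DOMAIN\nav_service';
[/code]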

The following steps are needed to set up multi tenancy. First I need a temporary Dynamics NAV service instance.

[code lang="powershell"]$ClientPort = 7210
$ServiceInstanceName = 'NAV71_L01_APP'
$SoapPort = $ClientPort + 1
$ODataPort = $SoapPort + 1
$MgtPort = $ODataPort + 1
$DatabaseName = 'NAV71_L01_DEF'
$DatabaseServer = 'sqlserver'
$DatabaseInstance = ''

New-NAVServerInstance -ServerInstance $ServiceInstanceName -ClientServicesCredentialType Windows -ClientServicesPort $ClientPort -DatabaseInstance $DatabaseInstance -DatabaseName $DatabaseName -DatabaseServer $DatabaseServer -ManagementServicesPort $MgtPort -SOAPServicesPort $SoapPort -ODataServicesPort $ODataPort
Set-NAVServerInstance -ServerInstance $ServiceInstanceName -Start[/code]

The next step is to export the application to a separate database and change the service configuration to multi tenant.

[code lang="powershell"]$ServiceInstance = 'NAV71_L01_APP'
$DatabaseName = 'NAV71_L01_DEF'
$DatabaseServer = 'sqlserver'
$DatabaseInstance = ''

# Stop the NAV Service
Set-NAVServerInstance $ServiceInstance -Stop

# Export the NAV Application to a new Database and then remove
Export-NAVApplication -DatabaseName $DatabaseName -DatabaseServer $DatabaseServer -DatabaseInstance $DatabaseInstance -DestinationDatabaseName $ServiceInstance
Remove-NAVApplication -DatabaseName $DatabaseName -DatabaseServer $DatabaseServer -DatabaseInstance $DatabaseInstance -Force

# Change Service to MultiTenant
Set-NAVServerConfiguration -ServerInstance $ServiceInstance -KeyName MultiTenant -KeyValue "true"
Set-NAVServerConfiguration -ServerInstance $ServiceInstance -KeyName DatabaseName -KeyValue ""
Set-NAVServerInstance $ServiceInstance -Start

# Mount Application
Mount-NAVApplication -DatabaseServer $DatabaseServer -DatabaseInstance $DatabaseInstance -DatabaseName $ServiceInstance -ServerInstance $ServiceInstance
Mount-NAVTenant -ServerInstance $ServiceInstance -Id Default -DatabaseServer $DatabaseServer -DatabaseInstance $DatabaseInstance -DatabaseName $DatabaseName -OverwriteTenantIdInDatabase -AllowAppDatabaseWrite

Get-NAVTenant -ServerInstance $ServiceInstance | Format-Table
[/code]

I like to create empty tenant databases that I can later pick up and use for customers. Here I create ten databases and attach them to the service to initialize the common tables.

[code lang="powershell"]$ServiceInstance = 'NAV71_L01_APP'
$DatabaseServer = 'sqlserver'
$DatabaseInstance = ''
$TenantPrefix = 'NAV71_L01_'

#Load the DLLs if not using SQLPS
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoExtended') | out-null

$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $DatabaseServer

$NewIds = ('00','01','02','03','04','05','06','07','08','09')

foreach ($NewId in $NewIds)
{
$DatabaseName = $TenantPrefix + $NewId
$db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database($DatabaseServer, $DatabaseName)
$db.RecoveryModel = 'full'
$db.Create()

# add role membership
$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$DatabaseServer;Integrated Security=SSPI;Initial Catalog=$DatabaseName");
$cn.Open()
$q = "EXEC sp_addrolemember @rolename = N'db_owner', @membername = N'NT AUTHORITY\NETWORK SERVICE'"
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$cmd.ExecuteNonQuery() | out-null
$cn.Close()

Mount-NAVTenant -ServerInstance $ServiceInstance `
-Id $NewId `
-DatabaseServer $DatabaseServer `
-DatabaseInstance $DatabaseInstance `
-DatabaseName $DatabaseName `
-OverwriteTenantIdInDatabase
}[/code]

Since I did all this on my development machine I wanted to move all the databases to the production SQL server. So I created a script to back up all the databases.

[code lang="powershell"]$BackupLocation = 'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Backup\'

$appDatabases = Get-NAVServerInstance | Where-Object {$_.DisplayName -match 'NAV71_'} | Get-NAVApplication
foreach ($appDatabase in $appDatabases)
{
$dbName = $appDatabase.'Database name'
$dbServer = $appDatabase.'Database server'
$backupFile = $BackupLocation + $dbName + '.bak'
write-host "Backing up database $dbName on $dbServer"
Backup-SqlDatabase -ServerInstance $dbServer -Database $dbName -BackupAction Database -BackupFile $backupFile
}
$tenantDatabases = Get-NAVServerInstance | Where-Object {$_.DisplayName -match 'NAV71_'} | Get-NAVTenant
foreach ($tenantDatabase in $tenantDatabases)
{
$dbName = $tenantDatabase.DatabaseName
$dbServer = $tenantDatabase.DatabaseServer
$backupFile = $BackupLocation + $dbName + '.bak'
write-host "Backing up database $dbName on $dbServer"
Backup-SqlDatabase -ServerInstance $dbServer -Database $dbName -BackupAction Database -BackupFile $backupFile
}
[/code]

And the last step is to stop and remove the temporary Dynamics NAV service instance.

[code lang="powershell"]$ServiceInstanceName = 'NAV71_L01_APP'

Set-NAVServerInstance -ServerInstance $ServiceInstanceName -Stop
Remove-NAVServerInstance -ServerInstance $ServiceInstanceName -Force
[/code]

On the production SQL server I restored the databases with this script. I also give the service user owner rights to the databases.

[code lang="powershell"]#Load the DLLs if not using SQLPS
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
#Need SmoExtended for smo.backup
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

$Backups = dir 'T:\Recover\Nav71\*.bak'
$NewDatabasePath = 'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\'
$sqlserver = 'sqlserver'
$ServiceAccount = 'DOMAIN\SERVICE USER'

foreach ($BackupFile in $Backups) {
'Restoring ' + $BackupFile.Directory + '\' + $BackupFile.Name

$server = New-Object Microsoft.SqlServer.Management.Smo.Server $sqlserver
$backupDevice = New-Object Microsoft.SqlServer.Management.Smo.BackupDeviceItem $BackupFile, "File"
$Restore = New-Object Microsoft.SqlServer.Management.Smo.Restore

#Set properties for Restore
$Restore.NoRecovery = $false;
$Restore.ReplaceDatabase = $true;
$Restore.Devices.Add($backupDevice)
$RestoreDetails = $Restore.ReadBackupHeader($server)
$NewDatabaseName = Get-ChildItem $BackupFile | % {$_.BaseName}
$Restore.Database = $NewDatabaseName

# Specify the need to relocate the data and log files (mdf and ldf)
$resFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$resLog = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")

# The logical file names should be the logical filename stored in the backup media
# The physical file should reflect the updated path
$resFile.LogicalFileName = $RestoreDetails.Rows[0]["DatabaseName"]
$resLog.LogicalFileName = $RestoreDetails.Rows[0]["DatabaseName"] + "_Log"
$resFile.PhysicalFileName = $NewDatabasePath + $RestoreDetails.Rows[0]["DatabaseName"] + "_Data.mdf"
$resLog.PhysicalFileName = $NewDatabasePath + $RestoreDetails.Rows[0]["DatabaseName"] + "_Log.ldf"
$Restore.RelocateFiles.Add($resFile)
$Restore.RelocateFiles.Add($resLog)

$Restore.SqlRestore($server)
# add role membership
$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$sqlserver;Integrated Security=SSPI;Initial Catalog=$NewDatabaseName");
$cn.Open()
$q = "EXEC sp_addrolemember @rolename = N'db_owner', @membername = N'$ServiceAccount'"
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$cmd.ExecuteNonQuery() | out-null
$cn.Close()
}
[/code]

Next we take a look at creating the server instance on a remote machine, mounting the tenant and creating the ClickOnce website along with the Web Client.