Using AdvaniaGIT in Visual Studio Code

It has become obvious that the future of AL programming is in Visual Studio Code.

Microsoft has made a decision to ship all their releases as Docker Containers.

The result is a development machine that does not have any NAV version installed.  I wanted to go through the installation and configuration of a new NAV-on-Docker development machine.

Here is what I did.

I installed Windows Server 2016 with Containers.  The other option was to use Windows 10 and install Docker as explained here.

After installing and fully updating the operating system, I downloaded and installed Visual Studio Code.

After installation Visual Studio Code detects that I need to install Git.

I selected Download Git and was taken to the Git download page.

I downloaded and installed Git with default settings.

To be able to run NAV Development and NAV Client I need to install prerequisite components.  I copied the Prerequisite Components folder from my NAV 2018 DVD and installed some of them…

Let’s hook Visual Studio Code to our NAV 2018 repository and install AdvaniaGIT.  I first make sure to always run Visual Studio Code with administrative privileges.

Now that we have our AdvaniaGIT installed and configured we can start our development.  Let’s start our C/AL classic development.  Where this video ends you can continue development as described in my previous posts on AdvaniaGIT.  AdvaniaGIT also supports NAV 2016 and NAV 2017.

Since we are running NAV 2018 we can and should be using the AL language and the Extension 2.0 model.  Let’s see how to use our repository structure, our already built Docker container and Visual Studio Code to start our first AL project.

So as you can see by watching these short videos it is easy to start developing both in C/AL and AL using AdvaniaGIT and Visual Studio Code.

My next task is to update my G/L Source Names extension to V2.  I will be using these tools for the job.  More to come soon…

Introducing AdvaniaGIT – SCM for Dynamics NAV

Almost two years ago we in Advania decided to start using GIT as Source Control Management (SCM).  We brought Kamil up to Iceland and we kicked off.  In Soren’s session at NAV TechDays last year we demoed SourceTree as the GIT client for NAV SCM.

Everything we in Advania are doing with SCM is available on GitHub.  It is our hope that we can get as many users and companies to use and contribute to this solution.

Over the next coming days and weeks I will be writing here about this tool.  I will also be using the GitHub Wiki for some of the information.

Installing AdvaniaGIT will create a folder structure on your local drive. You can select any of the installed local drives. We suggest that the AdvaniaGIT\Workspace folder is excluded from Windows Defender, and the same goes for any GIT folder used.

Refer to the README.md file inside every subfolder for more details about each subfolder usage.

Inside the Data subfolder we store the module settings in JSON files.

  • BranchSettings.json is automatically managed by the module and used to link GIT branches to local NAV environments.
  • BuildSettings.json contains incremented values that will be used when building new environments.
  • GITSettings.json contains machine settings for the module.
  • NAVVersions.json contains information about locally installed NAV.
  • RemoteSettings.json contains settings for the Remote Management module. Not used by GIT in any way.
  • TenantSettings.json contains settings for each tenant running on a remote server that is managed using the Remote Management module. Not used by GIT in any way.

In the GIT repository folder we require a setup.json file. When the scripts are executed settings from the GIT branch (setup.json) and settings from the machine (GITSettings.json) are merged to a single settings object. If same settings exist in both files the one in the GIT branch will be used.
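To picture how that merge behaves, here is a rough sketch of the idea in PowerShell.  The file locations, the $RepositoryPath variable and the property handling are only illustrations, not the actual AdvaniaGIT code.

[code lang="powershell"]
# Load machine settings first, then let every property found in the branch settings win
$machineSettings = Get-Content 'C:\AdvaniaGIT\Data\GITSettings.json' -Raw | ConvertFrom-Json
$branchSettings  = Get-Content (Join-Path $RepositoryPath 'setup.json') -Raw | ConvertFrom-Json

$branchSettings.PSObject.Properties | ForEach-Object {
    # -Force overwrites a machine setting when the branch defines the same property
    $machineSettings | Add-Member -MemberType NoteProperty -Name $_.Name -Value $_.Value -Force
}
$settings = $machineSettings
[/code]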

Installing the module will add custom actions to SourceTree and a command file (StartPowerShell.cmd) to your Windows directory. SourceTree will execute this command file with parameters telling the module what to do. The command file will execute Scripts\Start-CustomAction.ps1 with the same parameters. All custom actions within the Scripts\CustomActions subfolder can be executed.

For teams we suggest using an FTP server for backups and CRONUS text files.

My next blog post will be on the installation and update of AdvaniaGIT.  Stay tuned…

Using NetTcpPortSharing for NAV Servers

I just came back from three weeks vacation yesterday.  During my vacation I had made a decision to implement Tcp Port Sharing for the Instance Administration tool used in Advania Azure.

Early last year I published a function that uses sc.exe to modify a NAV Service startup type.  When a NAV Service is installed and configured by setup, the startup type is Automatic (Delayed Start).  However, create a new service with the PowerShell cmdlet New-NAVServerInstance and the startup type is Automatic without the (Delayed Start).

To enable Tcp Port Sharing that same sc.exe function is needed.  Interestingly, after I had finished the task and was reading NAV blogs I saw that Waldo just published a powershell function on his blog to do just this.

The script lines I used and added to my Instance Administration PowerShell scripts are based on my first sc.exe function, but without using the function itself.  Now when a new NAV service is created by the tool the startup type is modified and, if so selected by the deployment settings, Tcp Port Sharing is also activated.

By default, the Tcp Port Sharing service is disabled.

The startup type should be changed to Manual.  This can be done manually or by an administrative powershell script.

[code lang="powershell"]
#Set Startup Mode for NetTcpPortSharing to Manual
$Computer = 'LOCALHOST'
$Command = "sc.exe \\$Computer config ""NetTcpPortSharing"" start= demand"
$Output = Invoke-Expression -Command $Command -ErrorAction Stop
if ($LASTEXITCODE -ne 0) {
    Write-Error "$Computer : Failed to set NetTcpPortSharing to manual start. More details: $Output"
}
[/code]

A similar script is used to update the existing NAV Services to both delayed start and the Tcp Port Sharing dependency.

[code lang="powershell"]
#Stop NAV Server Instances
Get-NAVServerInstance | Set-NAVServerInstance -Stop

#Update Startup Type and Dependency on NAV Server Instances
Get-NAVServerInstance | foreach {
    $Service = $_.ServerInstance
    Write-Host "Working on service $Service"
    $Computer = 'LOCALHOST'
    $Command = "sc.exe \\$Computer config ""$Service"" start= delayed-auto"
    $Output = Invoke-Expression -Command $Command -ErrorAction Stop
    if ($LASTEXITCODE -ne 0) {
        Write-Error "$Computer : Failed to set $Service to delayed start. More details: $Output"
    }
    $Command = "sc.exe \\$Computer config ""$Service"" depend= NetTcpPortSharing/HTTP"
    $Output = Invoke-Expression -Command $Command -ErrorAction Stop
    if ($LASTEXITCODE -ne 0) {
        Write-Error "$Computer : Failed to set $Service TcpPortSharing dependency. More details: $Output"
    }
}

#Start NAV Server Instances
Get-NAVServerInstance | Set-NAVServerInstance -Start
[/code]

It should be obvious that the above script could also use the Set-ServiceStartupMode function from my blog and the Enable-NAVServerInstancePortSharing function on Waldo’s blog. That would be cleaner code and more in line with what we would like to see.

Again quoting Waldo from his previous blog, “When you’re using a dedicated service account, things might become slightly more difficult”.  That is exactly my case: I am using a dedicated service account.

After enabling Tcp Port Sharing and updating the services they would not start.  Event Viewer revealed the reason.

Server instance: CRONUS
The service MicrosoftDynamicsNavServer$CRONUS failed to start. This could be caused by a configuration error. Detailed error information: System.ServiceModel.CommunicationException: The service endpoint failed to listen on the URI 'net.tcp://mynavserver.dynamics.is:7046/CRONUS/Service' because access was denied. Verify that the current user is granted access in the appropriate allowAccounts section of SMSvcHost.exe.config. ---> System.ComponentModel.Win32Exception: Access is denied

So I started to ask Bing what I could do.  Microsoft MSDN states:

When a net.tcp binding enables port sharing (by setting portSharingEnabled =true on the transport binding element), it implicitly allows an external process (namely the SMSvcHost.exe, which hosts the Net.TCP Port Sharing Service) to manage the TCP socket on its behalf.

Hence, I need to add the Sid of my NAV Service Account to the SMSvcHost.exe.config file.  I could do this manually, but I am a programmer!

Another PowerShell script was born.  This one could also be converted to a function.  Before executing the script make sure to update the user and domain at the top of the script.  Be smart and execute this script before updating the NAV Services with the script above.

[code lang="powershell"]
#Modify User and Domain to fit your environment
$UserToAdd = 'srvNAV'
$UserDomainToAdd = 'DYNAMICS'

#Initial Values
$UserSidFound = 'false'
$ConfigurationSet = 'false'

#Net.Tcp Port Sharing Service Name
$ServiceName = 'NetTcpPortSharing'

#Get SID for the Service User
$UserSid = ([wmi] "win32_userAccount.Domain='$UserDomainToAdd',Name='$UserToAdd'").SID

#Get Path for SMSvcHost.exe.config file
$SMSvcHostPath = (Get-WmiObject win32_service | ?{$_.Name -like $ServiceName}).PathName
$SMSvcHostPathConfig = $SMSvcHostPath + '.config'

Write-Host "Reading XML from $SMSvcHostPathConfig"
#Read Config file
$xmlDoc = [xml] (Get-Content $SMSvcHostPathConfig)

Write-Host "Looking for access permission for $UserSid"
#Loop through allowed accounts and search for the service user Sid
$allowAccounts = Select-Xml "configuration/system.serviceModel.activation/net.tcp/allowAccounts/add" $xmlDoc
$allowAccounts | ForEach-Object {
    $ConfiguredSid = $_.Node.Attributes.Item(0).Value
    if ($ConfiguredSid -eq $UserSid) {$UserSidFound = 'true'}
    $ConfigurationSet = 'true'
    Write-Host "Found SID $ConfiguredSid"
}

#Act if Access Configuration is not enabled
if ($ConfigurationSet -eq 'false') {
    Write-Host "Access permission not configured"
    $config = [xml] '<system.serviceModel.activation>
        <net.tcp listenBacklog="10" maxPendingConnections="100" maxPendingAccepts="2" receiveTimeout="00:00:10" teredoEnabled="false">
            <allowAccounts>
                <add securityIdentifier="S-1-5-18"/>
                <add securityIdentifier="S-1-5-19"/>
                <add securityIdentifier="S-1-5-20"/>
                <add securityIdentifier="S-1-5-32-544" />
            </allowAccounts>
        </net.tcp>
        <net.pipe maxPendingConnections="100" maxPendingAccepts="2" receiveTimeout="00:00:10">
            <allowAccounts>
                <add securityIdentifier="S-1-5-18"/>
                <add securityIdentifier="S-1-5-19"/>
                <add securityIdentifier="S-1-5-20"/>
                <add securityIdentifier="S-1-5-32-544" />
            </allowAccounts>
        </net.pipe>
        <diagnostics performanceCountersEnabled="true" />
    </system.serviceModel.activation>'

    $configurationNode = $xmlDoc.DocumentElement
    $newConfig = $xmlDoc.ImportNode($config.DocumentElement, $true)
    $configurationNode.AppendChild($newConfig)

    $allowAccounts = Select-Xml "configuration/system.serviceModel.activation/net.tcp/allowAccounts/add" $xmlDoc
    $allowAccounts | ForEach-Object {
        $ConfiguredSid = $_.Node.Attributes.Item(0).Value
        Write-Host "Found SID $ConfiguredSid"
        if ($ConfiguredSid -eq $UserSid) {$UserSidFound = 'true'}
        $ConfigurationSet = 'true'
    }
}

#Add Service User Sid if needed
if ($UserSidFound -ne 'true') {
    $nettcp = $xmlDoc.SelectSingleNode("configuration/system.serviceModel.activation/net.tcp/allowAccounts")
    $addNode = $xmlDoc.CreateElement('add')
    $secIden = $xmlDoc.CreateAttribute('securityIdentifier')
    $secIden.Value = $UserSid
    $addNode.Attributes.Append($secIden)

    $nettcp.AppendChild($addNode)
    $xmlDoc.Save($SMSvcHostPathConfig)
    Write-Host "Configuration Updated"
    #Restart Service if running
    if ((Get-Service NetTcpPortSharing).Status -eq "Running") {Restart-Service NetTcpPortSharing -Force}
}
[/code]

This script will search for the SMSvcHost.exe.config file, load it and check to see if the NAV Service User is already allowed access.  If not then the config file is updated and saved.  This script must be executed with administrative privileges.

Perhaps this is what I should have started with: answering the question of why we need this at all.

First, modifying the startup mode to delayed start is done to make sure that all the required networking and database processes have been started before the NAV Service starts.  This is very important if the SQL Server is running on the same server.  On a dedicated NAV Service server this is not as important but still recommended.

Secondly, accessing a NAV Service in most cases requires changes to a firewall.  Either to open a specific port or setting up a NAT from a public interface.  To minimize the number of ports used also minimizes the networking setup and maintenance.  If different network permissions or network access is required I recommend using separate ports for the NAV Services.
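To illustrate what port sharing buys us, here is a minimal sketch that points two instances at the same client services port once NetTcpPortSharing is enabled and the services depend on it.  The instance names and the port number are just examples.

[code lang="powershell"]
# Both instances listen on port 7046; the Net.TCP Port Sharing Service dispatches the traffic
Set-NAVServerConfiguration -ServerInstance NAV01 -KeyName "ClientServicesPort" -KeyValue "7046"
Set-NAVServerConfiguration -ServerInstance NAV02 -KeyName "ClientServicesPort" -KeyValue "7046"
Set-NAVServerInstance -ServerInstance NAV01 -Restart
Set-NAVServerInstance -ServerInstance NAV02 -Restart
[/code]

With this in place a single firewall opening or NAT rule serves both instances.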

Building a clean database – remove unlicensed objects

I just got a question from a client;

Gunnar,
Do you have a “King Kong” license that will allow you to delete any object?  It appears our development license does not have the rights to some of the newer LS Retail objects and I need to create a CRONUS database with just our stuff.

Well, I don’t have a “King Kong” license.  That is only for Microsoft.

There is a way to solve this dilemma.  It will take a few steps.

We start with two databases: one with the data we need (LSRetail) and another with the application we need (CRONUS).

After the process is completed the LSRetail database will not be usable as a standalone database, so make a copy if you need one.  A new database, CRONUS_APP, will be created.  When cleaning up, it is safe to delete both of these databases.

The following powershell script has two options.  Option 1 is to have the company data imported into the CRONUS database in the end.  This option requires a server instance running on the CRONUS database.  Option 2 is to create a new database with SQL Management Studio and merge the CRONUS application and the LSRetail data into that one.

[code lang="powershell"]
$CronusDatabaseName = "CRONUS" # Database with destination Application
$CRONUSServerInstance = "DynamicsNAV80" # Instance for destination Application if using option 1
$LSRetailDatabaseName = "LSRETAIL" # LS Retail Demo Database, database with company data
$EmptyDatabaseName = "CRONUS WITH COMPANYDATA" # Create a new empty database using SQL Management Studio if using option 2
$SQLServerName = "SQL2014"
$SQLServerInstance = "NAVDEMO" # Set blank for default instance

$AppDatabaseName = $CronusDatabaseName + "_APP"
$ServiceAccount = $env:USERDOMAIN + "\" + $env:USERNAME
$ServerInstance = "UPGRADE"
$NavDataFile = (Join-Path $env:TEMP "NAVmerge.navdata")

$SelectOption = "2"

#Export Application from CRONUS Database to Application Database
Export-NAVApplication -DatabaseServer $SQLServerName -DatabaseInstance $SQLServerInstance -DatabaseName $CronusDatabaseName -DestinationDatabaseName $AppDatabaseName -ServiceAccount $ServiceAccount -Force

#Setup a temporary Server Instance for the new database
Get-Credential | New-NAVServerInstance -ServerInstance $ServerInstance -ManagementServicesPort 33555 -ClientServicesPort 33556 -SOAPServicesPort 33557 -ODataServicesPort 33558 -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $AppDatabaseName -ServiceAccount User -Force
Set-NAVServerConfiguration -ServerInstance $ServerInstance -KeyName "Multitenant" -KeyValue "true" -Force
Set-NAVServerInstance -ServerInstance $ServerInstance -Start -Force

#Prepare LSRetailDatabase for new configuration
Remove-NAVApplication -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Force

#Mount and Sync LSRetailDatabase as a tenant
Mount-NAVTenant -ServerInstance $ServerInstance -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Id DEFAULT -OverwriteTenantIdInDatabase -AllowAppDatabaseWrite -Force
Sync-NAVTenant -ServerInstance $ServerInstance -Tenant DEFAULT -Mode ForceSync -Force

if (Test-Path $NavDataFile)
{
Remove-Item -Path $NavDataFile -Force
}

#Option 1, Copy Company data to the original CRONUS database. Requires a service running on the CRONUS database
if ($SelectOption -eq "1")
{
Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -Force
Import-NAVData -ServerInstance $CRONUSServerInstance -FilePath $NavDataFile -AllCompanies -Force
}
#Option 2, Import into the new empty database created by SQL Management Studio
if ($SelectOption -eq "2")
{

Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -IncludeApplication -IncludeApplicationData -IncludeGlobalData -Force
if ($SQLServerInstance -eq "")
{
Import-NAVData -DatabaseServer $SQLServerName -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication

}
else
{
Import-NAVData -DatabaseServer ($SQLServerName + "\" + $SQLServerInstance) -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication
}

}

Set-NAVServerInstance -ServerInstance $ServerInstance -Stop -Force
Remove-NAVServerInstance -ServerInstance $ServerInstance -Force

if (Test-Path $NavDataFile)
{
Remove-Item -Path $NavDataFile -Force
}
[/code]

To walk you through what happens;

  • Application from CRONUS is exported into CRONUS_APP database
  • New Service Instance is created for CRONUS_APP database
  • Service Instance is changed to Multi Tenant and started
  • Application is removed from LSRetail database
  • LSRetail database is mounted as a tenant for CRONUS_APP database
  • LSRetail database structure is force-synched to CRONUS_APP application
  • Data from CRONUS_APP and LSRetail tenant is exported to NAVData file
  • NAVData file is imported into an empty database or the existing CRONUS database

 

Apply a Cumulative Update with Powershell

Microsoft has changed the way they ship Cumulative Updates.  Now we download the whole DVD image along with the application changes.

This post is intended as additional information for New Developer Tools for Dynamics NAV 2013 R2 and Install Client and Server update with PowerShell posts.

To update code that is built on an RTM version or an older CU version I use the Merge-NAVToNewCU script package.  I create a temporary folder and export three sets of objects: one file for the original unchanged objects (38457Objects.txt), one file for the new CU objects (40938Objects.txt) and finally the modified version (38457customizedObjects.txt).  When I export I normally skip the MenuSuites that I don’t have permission to import.

Microsoft sometimes adds new fields to tables, fields that are outside of my permission to insert.  Therefore, when I have completed the compare and have my new customized objects (40938customizedObjects.txt) I start by importing the object fob file directly from the Microsoft Dynamics NAV Application folder found in the CU package.  I accept the default action, replace most objects and merge tables.  This will create all the new fields needed by the update.  After that I am able to import the new customized objects text file and compile.

[code lang="powershell"]
$rootFolderName = $PSScriptRoot
$oldVersion = '38457'
$newVersion = '40938'

Import-Module "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\80\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1" -force
Import-Module (Join-Path $rootFolderName 'Merge-NAVVersionListString script.ps1') -force

$diffFolderName = (Join-Path $rootFolderName ($oldVersion + 'to' + $newVersion + 'diff'))
$oldObjects = (Join-Path $rootFolderName ($oldVersion + 'objects.txt'))
$newObjects = (Join-Path $rootFolderName ($newVersion + 'objects.txt'))
$newFolder = (Join-Path $rootFolderName ($newVersion + 'update'))
$customObjects = (Join-Path $rootFolderName ($oldVersion + 'customizedobjects.txt'))
$newCustomizedObjects = (Join-Path $rootFolderName ($newVersion + 'customizedobjects.txt'))
$newCustomizedFolder = (Join-Path $rootFolderName ($newVersion + 'customized'))

if (!(Test-Path $diffFolderName))
{
    mkdir $diffFolderName
}
if (!(Test-Path $newFolder))
{
    mkdir $newFolder
}
if (!(Test-Path $newCustomizedFolder))
{
    mkdir $newCustomizedFolder
}

Write-Host "Comparing customized and original…"
Compare-NAVApplicationObject -Original $oldObjects -Modified $customObjects -Delta $diffFolderName | Where-Object CompareResult -eq 'Identical' | foreach { Remove-Item (Join-Path $diffFolderName ($_.ObjectType.Substring(0,3) + $_.Id + '.delta')) }
Write-Host "Splitting new objects…"
Split-NAVApplicationObjectFile $newObjects $newFolder
Write-Host "Removing unchanged new objects…"
Get-ChildItem -Path $newFolder | foreach { if (!(Test-Path ((Join-Path $diffFolderName $_.BaseName) + '.delta'))) { Remove-Item $_.FullName } }
Write-Host "Updating new objects…"
Update-NAVApplicationObject -Target $newFolder -Delta $diffFolderName -Result $newCustomizedFolder -DateTimeProperty FromModified -ModifiedProperty FromModified -VersionListProperty FromModified -DocumentationConflict ModifiedFirst
Write-Host "Updating customized object version list…"
Get-ChildItem -Path (Join-Path $newCustomizedFolder '*.txt') | foreach { if (Test-Path (Join-Path $newFolder $_.Name)) {Set-NAVApplicationObjectProperty -Target $_.FullName -VersionListProperty (Merge-NAVVersionListString -source (Get-NAVApplicationObjectProperty -Source $_.FullName).VersionList -target (Get-NAVApplicationObjectProperty -Source (Join-Path $newFolder $_.Name)).VersionList) }}
Write-Host "Joining customized object to a single file…"
Join-NAVApplicationObjectFile -Source (Join-Path $newCustomizedFolder '*.txt') -Destination $newCustomizedObjects
Write-Host "If you have conflicts then you need to manually fix conflicting code changes"
[/code]

I have also created a set of scripts to update the Binaries.  For some time now Microsoft has not shipped the binary upgrade folders with the CU package.  Now they deliver the whole DVD, including the new updated Demo Database.  The scripts therefore must copy the files from the DVD instead of copying from the binary upgrade folders.

When copying from the DVD we must make sure that we don’t overwrite the configuration files for the server and the web client. This is the server update script. All server instances on the computer will be stopped, updated and restarted.

[code lang="powershell"]
$NAVDVDFilePath = '\\STORAGE\NAV 2015\NAV.8.0.40938.IS.DVD'
$NotToCopy = @('Tenants.config','CustomSettings.config')
Write-Verbose "Copying NAV Server Update…"
$ClientKBFolder = Join-Path $NAVDVDFilePath 'ServiceTier\program files\Microsoft Dynamics NAV\80\Service'
$navInstallationDirectory = Join-Path ${env:ProgramFiles} 'Microsoft Dynamics NAV\80\Service'
if (Test-Path $navInstallationDirectory)
{
    Import-Module (Join-Path $ClientKBFolder 'Microsoft.Dynamics.Nav.Management.dll') -DisableNameChecking | Out-Null
    $RunningInstances = Get-NAVServerInstance | Where-Object { $_.State -eq "Running" }
    Write-Verbose "Stopping Server Instances…"
    Get-NAVServerInstance | Where-Object { $_.State -eq "Running" } | Set-NAVServerInstance -Stop
    Start-Sleep -s 5

    Write-Verbose "Running file copy command…"
    Get-ChildItem -Path $ClientKBFolder | % {Copy-Item $_.FullName $navInstallationDirectory -Recurse -Force -Exclude $NotToCopy}

    Write-Verbose "Done updating files…"
    foreach ($RunningInstance in $RunningInstances)
    {
        $InstanceName = $RunningInstance.ServerInstance.ToString()
        Write-Verbose "Starting Server Instance $InstanceName"
        Get-NAVServerInstance -ServerInstance $InstanceName | Set-NAVServerInstance -Start
    }
}
[/code]

This is the web server update script.

[code lang="powershell"]
$NAVDVDFilePath = '\\STORAGE\NAV 2015\NAV.8.0.40938.IS.DVD'
$NotToCopy = ('web.config','instanceweb.config','Header.png','About.png','Splash.png')
function Copy-LatestFile
{
    [CmdletBinding()]
    param (
        [parameter(Mandatory=$true)]
        [string]$SourceDirectoryPath,
        [parameter(Mandatory=$true)]
        [string]$DestinationDirectoryPath
    )
    Write-Verbose "Running file copy command…"
    $sourcefiles = Get-ChildItem $SourceDirectoryPath -Recurse
    $destfiles = Get-ChildItem $DestinationDirectoryPath -Recurse
    foreach ($sourcefile in $sourcefiles)
    {
        $copyfile = $false
        foreach($destfile in $destfiles)
        {
            if ($destfile.Name -eq $sourcefile.Name)
            {
                $copyfile = $true
                break
            }
        }
        if (!($copyfile) -or ($NotToCopy -match $sourcefile.BaseName))
        {
            write-verbose "not copying $sourcefile…"
        }
        else
        {
            write-verbose "copying $sourcefile…"
            Copy-Item $sourcefile.FullName $destfile.FullName -Force
        }
    }
}
Write-Verbose "Copying Web Client Update…"
$ClientKBFolder = Join-Path $NAVDVDFilePath 'WebClient\Microsoft Dynamics NAV\80\Web Client'
$navInstallationDirectory = Join-Path ${env:ProgramFiles} 'Microsoft Dynamics NAV\80\Web Client'
if (Test-Path $navInstallationDirectory)
{
    Copy-LatestFile -SourceDirectoryPath $ClientKBFolder -DestinationDirectoryPath $navInstallationDirectory
}
[/code]

And finally the client update script.

[code lang="powershell"]
$NAVDVDFilePath = '\\STORAGE\NAV 2015\NAV.8.0.40938.IS.DVD'
$NotToCopy = ('Header.png','About.png','Splash.png')
function Copy-LatestFile
{
    [CmdletBinding()]
    param (
        [parameter(Mandatory=$true)]
        [string]$SourceDirectoryPath,
        [parameter(Mandatory=$true)]
        [string]$DestinationDirectoryPath
    )

    Write-Verbose "Running file copy command…"
    $sourcefiles = Get-ChildItem $SourceDirectoryPath -Recurse -File
    $destfiles = Get-ChildItem $DestinationDirectoryPath -Recurse -File

    foreach ($sourcefile in $sourcefiles)
    {
        $copyfile = $false
        foreach($destfile in $destfiles)
        {
            if ($destfile.Name -eq $sourcefile.Name)
            {
                $copyfile = $true
                break
            }
        }
        if (!($copyfile) -or ($NotToCopy -match $sourcefile.BaseName))
        {
            write-verbose "not copying $sourcefile…"
        }
        else
        {
            write-verbose "copying $sourcefile…"
            Copy-Item $sourcefile.FullName $destfile.FullName -Force
        }
    }
}

Write-Verbose "Copying RTC Update…"
$ClientKBFolder = Join-Path $NAVDVDFilePath 'RoleTailoredClient\program files\Microsoft Dynamics NAV\80\RoleTailored Client'
$navInstallationDirectory = Join-Path ${env:ProgramFiles(x86)} 'Microsoft Dynamics NAV\80\RoleTailored Client'
if (Test-Path $navInstallationDirectory)
{
    Copy-LatestFile -SourceDirectoryPath $ClientKBFolder -DestinationDirectoryPath $navInstallationDirectory
}
Write-Verbose "Copying Office 14 Update…"
$ClientKBFolder = Join-Path $NAVDVDFilePath 'Outlook\program files\Microsoft Dynamics NAV\80\OutlookAddin'
$navInstallationDirectory = Join-Path ${env:ProgramFiles(x86)} 'Microsoft Office\Office14'
if (Test-Path $navInstallationDirectory)
{
    Copy-LatestFile -SourceDirectoryPath $ClientKBFolder -DestinationDirectoryPath $navInstallationDirectory
}
Write-Verbose "Copying Office 15 Update…"
$ClientKBFolder = Join-Path $NAVDVDFilePath 'Outlook\program files\Microsoft Dynamics NAV\80\OutlookAddin'
$navInstallationDirectory = Join-Path ${env:ProgramFiles(x86)} 'Microsoft Office\Office15'
if (Test-Path $navInstallationDirectory)
{
    Copy-LatestFile -SourceDirectoryPath $ClientKBFolder -DestinationDirectoryPath $navInstallationDirectory
}
[/code]

The ClickOnce distribution is always based on the client folder and the script in Install Client and Server update with PowerShell can be used for that purpose.

I am using these binary update scripts in the Instance and Tenant Administration tool and I normally maintain the ClickOnce distribution from there.


New Developer Tools for Dynamics NAV 2013 R2

Cumulative update 9 for Dynamics NAV 2013 R2 has been released.  This new version includes a powershell tool package for developers.

You can read about these tools in the NAV Team Blog, on Soren’s Blog and on Waldo’s Blog.

With every CU package we get an upgrade for the application.  I have a customer that is already running CU8 and I want to install CU9.  This time I will do that with these new powershell tools.

First step is to install the new CU9 binaries.  This I will do with the PowerShell scripts from here.  This will update the windows client files, web client files, server files, outlook integration and ClickOnce deployment.  On my development machine I needed to copy the RTC client files from the CU9 DVD.  The new model tools were not included in the KB RTC folder.

Then to export the Dynamics NAV objects.

I use my CU8 version demo database and export all objects to cu8objects.txt.  I export all objects from the customized database to cu8customizedobjects.txt.  I finally use my CU9 version demo database and export all objects to cu9objects.txt.

Open PowerShell ISE and navigate to the folder with the object files.

Next I import the new model tools.
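On a default NAV 2013 R2 installation that import looks something like this (adjust the path if your installation differs):

[code lang="powershell"]
Import-Module "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\71\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1" -Force
[/code]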

I create a folder called cu8tocu9diff and execute
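The compare command is along these lines, using the same cmdlet and parameters as in the NAV 2015 script above and the object files exported here:

[code lang="powershell"]
# Compare the CU8 objects with the CU9 objects and write one .delta file per object
Compare-NAVApplicationObject -Original .\cu8objects.txt -Modified .\cu9objects.txt -Delta .\cu8tocu9diff
[/code]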

I see in my cu8tocu9diff folder that I have 4.036 objects and 3.972 of them are not needed.  I deleted all the files from the cu8tocu9diff folder and executed
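This time the compare is piped through a filter that removes the delta files for identical objects, roughly like this:

[code lang="powershell"]
Compare-NAVApplicationObject -Original .\cu8objects.txt -Modified .\cu9objects.txt -Delta .\cu8tocu9diff |
    Where-Object CompareResult -eq 'Identical' |
    foreach { Remove-Item (Join-Path .\cu8tocu9diff ($_.ObjectType.Substring(0,3) + $_.Id + '.delta')) }
[/code]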

The result is that I only have the 64 delta files needed in the cu8tocu9diff folder.

Now I need the same 64 objects from the customized database.  I begin by splitting the exported object file into a new folder.
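The split step is a one-liner along these lines:

[code lang="powershell"]
# Split the customized CU8 object file into one text file per object
Split-NAVApplicationObjectFile -Source .\cu8customizedobjects.txt -Destination .\cu8customized
[/code]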

I now have all 4.036 objects in the cu8customized folder.  I delete all unneeded files by executing
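That clean-up keeps only the objects that have a matching delta file, something like:

[code lang="powershell"]
Get-ChildItem -Path .\cu8customized | foreach {
    if (!(Test-Path ((Join-Path .\cu8tocu9diff $_.BaseName) + '.delta'))) { Remove-Item $_.FullName }
}
[/code]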

I am now ready to update the customized objects with the CU9 changes.
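The update step applies the CU8-to-CU9 deltas to the customized objects, roughly like this (the cu9customized result folder name is my choice here):

[code lang="powershell"]
Update-NAVApplicationObject -Target .\cu8customized -Delta .\cu8tocu9diff -Result .\cu9customized `
    -DateTimeProperty FromModified -ModifiedProperty FromModified -VersionListProperty FromModified `
    -DocumentationConflict ModifiedFirst
[/code]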

When I updated the customized objects with these 64 changes I got objects with a new version list.  The version list needs to be merged and for that I use a script that was written by NAV MVP Kamil Sacek.  The script is attached here as Merge-NAVVersionListString script and his scripts are available on CodePlex.  After copying the script to my base folder I need to import the new functions.
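Importing the functions is just another Import-Module call:

[code lang="powershell"]
Import-Module '.\Merge-NAVVersionListString script.ps1' -Force
[/code]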

And to merge the version list for the new objects I use
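Something along these lines, where the CU9 version tag is merged into each updated object’s version list.  Treat it as a sketch: the NAVW17.10.00.37221 tag is an assumption based on the CU9 object file name.

[code lang="powershell"]
Get-ChildItem -Path (Join-Path .\cu9customized '*.txt') | foreach {
    # Merge the existing version list of the updated object with the CU9 version tag
    $mergedVersionList = Merge-NAVVersionListString `
        -source (Get-NAVApplicationObjectProperty -Source $_.FullName).VersionList `
        -target 'NAVW17.10.00.37221'
    Set-NAVApplicationObjectProperty -Target $_.FullName -VersionListProperty $mergedVersionList
}
[/code]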

I now have 64 updated objects that I can import into my customized database and compile.  First I want to join all these 64 files into one object file.
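The join is one more cmdlet call:

[code lang="powershell"]
Join-NAVApplicationObjectFile -Source (Join-Path .\cu9customized '*.txt') -Destination .\cu9customizedobjects.txt
[/code]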

When I imported the new customized objects I got an error, because I had new fields in tables that I did not have permission to create.  To solve this I begin by importing the URObjects.IS.37221.fob file that is included in the cumulative update package.  I accept the default action so all new fields should be merged into the existing tables.

Scripts are available to export and import NAV objects so this whole process can be scripted and automated.
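For example, the finsql.exe development environment commands can drive the export and import.  A rough sketch; the server and database names here are placeholders:

[code lang="powershell"]
$finsql = "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\71\RoleTailored Client\finsql.exe"
# Export all objects from the CU8 demo database to a text file
Start-Process -FilePath $finsql -Wait -ArgumentList `
    'command=exportobjects, file=C:\Temp\cu8objects.txt, servername=localhost, database=NAV71_CU8'
# Import the merged objects into the customized database
Start-Process -FilePath $finsql -Wait -ArgumentList `
    'command=importobjects, file=C:\Temp\cu9customizedobjects.txt, servername=localhost, database=NAV71_CUSTOM'
[/code]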

Unblock downloaded files

When a file is downloaded from an untrusted location, Windows blocks it.

This has bothered me too often, for example when I downloaded an add-in and hit an error.

And more recently I downloaded a knowledge base package, installed it with my PowerShell script, and then found out that every file in the download had been blocked.

Once again, PowerShell came to the rescue.  I opened Windows PowerShell ISE as administrator and executed the script below for the needed folders.

[code lang="ps"]
$folder = 'C:\PUB'
#$folder = 'C:\Program Files\Microsoft Dynamics NAV'
#$folder = 'C:\Program Files (x86)\Microsoft Dynamics NAV'
$files = Get-ChildItem -Path $folder -Recurse
foreach ($file in $files)
{
    Write-Host "Unblocking file $file…"
    Unblock-File -Path $file.FullName
}
[/code]