I am working on an interesting problem: how to execute a data upgrade from NAV 4.0 SP3 to NAV 2013 on a 300 GB database in just one weekend.  My first attempt at this task was during the summer, and my measurements told me I would not make it even in a week.

The database is heavily modified, and I need to move a lot of data to temporary tables and then back at the end of the data upgrade, once NAV 2013 has been reached.  Some fields need to be cleared, and in some cases a whole table needs to be emptied or even removed.

The standard code in NAV 4.0 SP3 Step 1 loops through all entries in the table and initializes the fields, like this:

[code]
WITH PurchLineArchive DO
  IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
    IF FINDSET(TRUE) THEN
      REPEAT
        StateIndicator.Update;
        "Phase Code" := '';
        "Task Code" := '';
        "Step Code" := '';
        MODIFY;
      UNTIL NEXT = 0;
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
    StateIndicator.EndUpdateTable(TABLENAME);
  END;[/code]

A faster way is to use MODIFYALL, like this:

[code]WITH ProdOrderComponent DO
  IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
    MODIFYALL("Due Date-Time",0);
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
    StateIndicator.EndUpdateTable(TABLENAME);
  END;[/code]

But I guess it depends on how many fields need to be initialized.  In the first example the whole table is read record by record, processed in NAV, and modified.  In the second example SQL Server does all the work and nothing passes through NAV.  However, in the second example SQL Server needs to read and modify the whole table once for every field that has to be initialized.  So there is a threshold beyond which it is better to loop through the records and modify each one than to call MODIFYALL once per field.
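To make the trade-off concrete, here is a rough sketch in Python (illustrative only, not NAV code), assuming each MODIFYALL call maps to exactly one set-based UPDATE on SQL Server:

```python
# Illustrative cost model: clearing N fields with MODIFYALL means N
# set-based UPDATE statements, i.e. N full passes over the table on SQL
# Server. The record loop is a single pass, but every row is fetched
# into NAV and written back individually.
fields = ["Phase Code", "Task Code", "Step Code"]

modifyall_passes = len(fields)   # one UPDATE statement per field
loop_passes = 1                  # one scan, row by row through NAV

# The kind of statement each MODIFYALL call turns into:
statements = ["UPDATE [Table] SET [{0}] = ''".format(f) for f in fields]
```

With only a handful of fields, N cheap server-side passes beat one client-side pass; with dozens of fields the balance tips the other way.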

In other cases data has to be copied to a temporary table for use later in the upgrade.  This happens if the table structure has changed, if the data is used in other tables, or even if a field gets a new number or a new data type.  Here is an example for the Customer table.

[code]WITH Customer DO
  IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
    IF FINDSET(TRUE) THEN
      REPEAT
        StateIndicator.Update;
        TempCustomer.INIT;
        TempCustomer.TRANSFERFIELDS(Customer);
        TempCustomer.INSERT;

        "Our Account No." := '';
        "Mobile Phone" := '';
        "EAN-Id.No." := '';
        "X-400 address" := '';
        "Reports Delivery" := 0;
        "Handling fee" := FALSE;
        "Shipment request" := '';
        "Distribution Phone No." := '';
        "Distribution GSM No." := '';
        "Distribution E-mail" := '';
        "Distribution Contact" := '';
        "Billing Blocking Code" := '';
        "Responsable To Customer" := '';
        "Order Change Notification" := FALSE;
        "Statement Address" := '';
        "Statement Post Code" := '';
        "Statement City" := '';
        "Address of Consignee 1" := '';
        "Address of Consignee 2" := '';
        "Address of Consignee 3" := '';
        "Address of Consignee 4" := '';
        "Pooled Accounts Collection" := '';
        "Block Order Join" := FALSE;
        "Missing Order Notification SMS" := '';
        "Missing Order Notif. SMS 2" := '';
        "Social Security No." := '';
        "Electronic Partner – Old" := FALSE;
        "Electronic Partner" := FALSE;
        "Outstanding POS Sales" := 0;
        "Homestead No." := '';
        "Churn No." := 0;
        "Notification Process Code" := '';
        "Queue Priority" := "Queue Priority"::"Very Low";
        MODIFY;
      UNTIL NEXT = 0;
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
    StateIndicator.EndUpdateTable(TABLENAME);
  END;
[/code]

The first thing I did to speed up my work was to build an Excel document.  With its help I was able to open the table design in NAV, copy all fields that needed to be cleared into Excel, and have Excel generate the code to initialize the fields.  This saves me some work when customizing the upgrade batch.  A copy of the Excel document is in the attachment at the bottom of this post.
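As a hypothetical re-creation of what the Excel sheet does, the generator logic could look like this in Python (the type-to-blank-value mapping is my assumption of how such a sheet works):

```python
# Hypothetical re-creation of the Excel trick: map a NAV field type to
# its blank value and emit the C/AL line that clears the field.
def init_line(field_name, field_type):
    blank = {"Text": "''", "Code": "''", "Boolean": "FALSE",
             "Integer": "0", "Decimal": "0", "Option": "0"}[field_type]
    return '"{0}" := {1};'.format(field_name, blank)

# Field names and types as pasted from the NAV table designer.
lines = [init_line("Mobile Phone", "Text"),
         init_line("Handling fee", "Boolean"),
         init_line("Churn No.", "Integer")]
```

Paste the output back into the upgrade codeunit instead of typing each assignment by hand.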

With the change in dimensions, Microsoft faced the fact that the traditional way of upgrading data would not work for clients.  It would simply be too time-consuming.  Their solution was to change parts of the code to build SQL statements and execute them directly on the SQL server.  With this method they were able to reduce the time in their example from 24 hours down to 1.5 hours.

With this in mind I decided to do the same for the data upgrade.  I created an upgrade SQL helper codeunit based on ADO automation for use in the earlier versions.  I will attach it to the post, but here is the code:

[code]OBJECT Codeunit 99980 Upgrade SQL Helper
{
OBJECT-PROPERTIES
{
Date=07.09.13;
Time=18:02:42;
Modified=Yes;
Version List=UPGW16.00.10,MS;
}
PROPERTIES
{
OnRun=BEGIN
END;

}
CODE
{
VAR
ADOConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000514-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Connection";
ADOStream@1100408004 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000566-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Stream";
WShell@1100408007 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{72C24DD5-D70A-438B-8A42-98424B88AFB8}:'Windows Script Host Object Model'.WshShell";
SystemFilesSystem@1100408014 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{0D43FE01-F093-11CF-8940-00A0C9054228}:'Windows Script Host Object Model'.FileSystemObject";
SystemFile@1100408013 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{C7C3F5B5-88A3-11D0-ABCB-00A0C90FFFC0}:'Windows Script Host Object Model'.File";
SystemTextStream@1100408012 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{0BB02EC0-EF49-11CF-8940-00A0C9054228}:'Windows Script Host Object Model'.TextStream";
MyDatabase@1100408011 : Record 2000000048;
MyServer@1100408010 : Record 2000000047;
dbpropFromStr@1100408003 : Text[30];
dbpropToStr@1100408002 : Text[30];
CommandFile@1100408008 : Text[1024];
ResultFile@1100408009 : Text[1024];
CrLf@1100408015 : Text[2];
dbpropertyInitialized@1100408001 : Boolean;
Connected@1100408005 : Boolean;
Text007@1100408006 : TextConst 'ENU=Microsoft ADO not found.;ISL=Microsoft ADO finnst ekki.';

PROCEDURE CreateSQL@1100408007(Qry@1100408000 : Text[1024]);
BEGIN
CLEAR(ADOStream);
CREATE(ADOStream,TRUE,TRUE);
ADOStream.Type := 2;
ADOStream.Open;
ADOStream.WriteText('SET ANSI_NULLS ON;' + CrLf);
ADOStream.WriteText('SET QUOTED_IDENTIFIER ON;' + CrLf);
ADOStream.WriteText(Qry + CrLf);
END;

PROCEDURE AddToSQL@1100408012(Qry@1100408000 : Text[1024]);
BEGIN
ADOStream.WriteText(Qry + CrLf);
END;

PROCEDURE ExecuteSQL@1100408015();
VAR
FileContent@1100408003 : Text[1024];
WinStyle@1100408002 : Integer;
FileSize@1100408004 : Integer;
Wait@1100408001 : Boolean;
Success@1100408000 : Boolean;
BEGIN
Connect;
ADOStream.WriteText(';');
ADOStream.SaveToFile(CommandFile,2);
ADOStream.Close;
Wait := TRUE;
WShell.Run(
STRSUBSTNO(
'sqlcmd -E -S %1 -d %2 -i %3 -o %4',
MyServer."Server Name",
MyDatabase."Database Name",
CommandFile,
ResultFile),WinStyle,Wait);
SystemFile := SystemFilesSystem.GetFile(ResultFile);
FileSize := SystemFile.Size;
IF FileSize > 0 THEN BEGIN
SystemTextStream := SystemFile.OpenAsTextStream;
FileContent := SystemTextStream.ReadAll;
IF STRPOS(FileContent,'affected') = 0 THEN
ERROR(FileContent);
END;
END;

PROCEDURE StringConvert@2(NAVName@1000 : Text[30]) : Text[30];
VAR
i@1001 : Integer;
BEGIN
IF NOT dbpropertyInitialized THEN BEGIN
dbpropertyInitialized := TRUE;
IF GetDBPropertyField('convertidentifiers') = '1' THEN BEGIN
dbpropFromStr := GetDBPropertyField('invalididentifierchars');
FOR i := 1 TO STRLEN(dbpropFromStr) DO
dbpropToStr += '_';
END;
END;

EXIT(CONVERTSTR(NAVName,dbpropFromStr,dbpropToStr));
END;

PROCEDURE GetFullTableNameSQL@8(TableName@1000 : Text[30]) : Text[1024];
BEGIN
EXIT(STRSUBSTNO('[%1$%2]',GetCompanyNameSQL,GetTableNameSQL(TableName)));
END;

PROCEDURE GetCompanyNameSQL@10() : Text[1024];
BEGIN
EXIT(StringConvert(COMPANYNAME));
END;

PROCEDURE GetTableNameSQL@7(TableName@1000 : Text[30]) : Text[1024];
BEGIN
EXIT(StringConvert(TableName));
END;

PROCEDURE GetFieldNameSQL@1100408001(FieldNameNAV@1100408000 : Text[30]) FieldNameSQL : Text[1024];
BEGIN
FieldNameSQL := '[' + StringConvert(FieldNameNAV) + ']';
END;

PROCEDURE GetFieldClearSQL@1100408008(FieldNameNAV@1100408000 : Text[30];NewValue@1100408001 : Text[250];LastValue@1100408002 : Boolean) : Text[1024];
BEGIN
IF LastValue THEN
EXIT(GetFieldNameSQL(FieldNameNAV) + ' = ' + NewValue + ' ')
ELSE
EXIT(GetFieldNameSQL(FieldNameNAV) + ' = ' + NewValue + ',')
END;

LOCAL PROCEDURE GetDBPropertyField@11(FieldName@1000 : Text[30]) : Text[30];
VAR
ADORecordset@1100408002 : Automation "{00000300-0000-0010-8000-00AA006D2EA4} 2.8:{00000535-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects Recordset 2.8 Library'.Recordset";
ADOFields@1100408001 : Automation "{00000300-0000-0010-8000-00AA006D2EA4} 2.8:{00000564-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects Recordset 2.8 Library'.Fields";
ADOField@1100408000 : Automation "{00000300-0000-0010-8000-00AA006D2EA4} 2.8:{00000569-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects Recordset 2.8 Library'.Field";
Qry@1004 : Text[1024];
i@1001 : Integer;
FieldValue@1005 : Text[1024];
BEGIN
IF FieldName = '' THEN
EXIT('');

Qry := STRSUBSTNO('select top 1 [%1] from [$ndo$dbproperty]',FieldName);

Connect;
CREATE(ADORecordset,TRUE,TRUE);
ADORecordset.Open(Qry,ADOConnection);
ADORecordset.MoveFirst;
ADOFields := ADORecordset.Fields;
FieldValue := FORMAT(ADOFields.Item(FieldName).Value);
EXIT(FieldValue);
END;

PROCEDURE GetNullDate@1100408002() : Text[1024];
BEGIN
EXIT('''1753-01-01 00:00:00.000''');
END;

PROCEDURE GetNullTime@1100408003() : Text[1024];
BEGIN
EXIT('''1753-01-01 00:00:00.000''');
END;

PROCEDURE GetNullDateTime@1100408006() : Text[1024];
BEGIN
EXIT('''1753-01-01 00:00:00.000''');
END;

PROCEDURE GetNullString@1100408004() : Text[1024];
BEGIN
EXIT('''''');
END;

PROCEDURE GetNullGuid@1100408014() : Text[1024];
BEGIN
EXIT('''00000000-0000-0000-0000-000000000000''');
END;

PROCEDURE GetNullInt@1100408005() : Text[1024];
BEGIN
EXIT('0');
END;

PROCEDURE GetNull@1100408013() : Text[1024];
BEGIN
EXIT('NULL');
END;

LOCAL PROCEDURE Connect@1100408000();
BEGIN
IF Connected THEN EXIT;

MyServer.SETRANGE("My Server",TRUE);
MyServer.FINDFIRST;
MyDatabase.SETRANGE("My Database",TRUE);
MyDatabase.FINDFIRST;

CLEAR(ADOConnection);
IF NOT CREATE(ADOConnection,TRUE,TRUE) THEN
ERROR(Text007);

IF ADOConnection.State = 1 THEN
ADOConnection.Close;

ADOConnection.Open(
STRSUBSTNO(
'Provider=SQLOLEDB.1;Initial Catalog=%2;Data Source=%1;Trusted_Connection=yes;',
MyServer."Server Name",
MyDatabase."Database Name"));

Connected := ADOConnection.State = 1;
CommandFile := ENVIRON('TEMP') + '\Cmd.sql';
ResultFile := ENVIRON('TEMP') + '\Result.txt';
CrLf[1] := 13;
CrLf[2] := 10;

CLEAR(WShell);
CREATE(WShell,TRUE,TRUE);

CLEAR(SystemFilesSystem);
CREATE(SystemFilesSystem,TRUE,TRUE);
END;

EVENT ADOConnection@1100408000::InfoMessage@0(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::BeginTransComplete@1(TransactionLevel@1100408003 : Integer;pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::CommitTransComplete@3(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::RollbackTransComplete@2(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::WillExecute@4(VAR Source@1100408007 : Text[1024];CursorType@1100408006 : Integer;LockType@1100408005 : Integer;VAR Options@1100408004 : Integer;adStatus@1100408003 : Integer;pCommand@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{B08400BD-F9D1-4D02-B856-71D5DBA123E9}:'Microsoft ActiveX Data Objects 2.8 Library'._Command";pRecordset@1100408001 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000556-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Recordset";pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::ExecuteComplete@5(RecordsAffected@1100408005 : Integer;pError@1100408004 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408003 : Integer;pCommand@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{B08400BD-F9D1-4D02-B856-71D5DBA123E9}:'Microsoft ActiveX Data Objects 2.8 Library'._Command";pRecordset@1100408001 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000556-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Recordset";pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::WillConnect@6(VAR ConnectionString@1100408005 : Text[1024];VAR UserID@1100408004 : Text[1024];VAR Password@1100408003 : Text[1024];VAR Options@1100408002 : Integer;adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::ConnectComplete@7(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::Disconnect@8(adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

BEGIN
END.
}
}

[/code]

I of course reused some of the code Microsoft made for NAV 2013, as is the normal way to do things, but I had a problem to solve that did not exist in NAV 2013.  Some of my statements exceeded 1024 characters in length, so I was unable to use ADO to execute them directly.  I had to switch to sqlcmd: I use an ADO Stream while building the SQL command, save the content of the stream to a temporary file, execute that file with sqlcmd, and then inspect the result file to catch any errors.
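Sketched in Python (illustrative only; the server and database names are placeholders, while the file handling and the 'affected' check mirror the ExecuteSQL procedure above):

```python
import os
import tempfile

# Build the sqlcmd invocation used by ExecuteSQL (placeholder names).
def build_sqlcmd(server, database, command_file, result_file):
    return 'sqlcmd -E -S {0} -d {1} -i {2} -o {3}'.format(
        server, database, command_file, result_file)

# 1. Save the accumulated statement to a temporary command file.
command_file = os.path.join(tempfile.gettempdir(), 'Cmd.sql')
with open(command_file, 'w') as f:
    f.write('SET ANSI_NULLS ON;\r\n')
    f.write('SET QUOTED_IDENTIFIER ON;\r\n')

# 2. Run it; in the codeunit this is WShell.Run(...) with Wait = TRUE.
cmd = build_sqlcmd('MYSERVER', 'MYDB', command_file, 'Result.txt')

# 3. Inspect the result file: any non-empty output that does not contain
#    the word 'affected' is treated as an error, as in ExecuteSQL.
def is_error(result_text):
    return len(result_text) > 0 and 'affected' not in result_text
```

This avoids ADO's text limits entirely, since sqlcmd reads the statement from disk.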

Again, I used Excel to help me build the code needed.  So, what is gained by this?

When I need to clear fields in a table, a single SQL statement now clears all fields in one read-modify pass.  Previously this had to be done field by field, or by looping through every entry in the table.

[code]
//Delete data from obsolete fields
DBMgt.CreateSQL(STRSUBSTNO('UPDATE %1 SET ',DBMgt.GetFullTableNameSQL(TABLENAME)));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Pantað magn"),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Pantað í mælieiningu"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME(EDI),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Afgreitt magn"),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Afgreitt í mælieiningu"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Pick No."),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Item Pick Group Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Original Order No."),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Best fyrir"),DBMgt.GetNullDate,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Síðasti söludagur"),DBMgt.GetNullDate,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Appl.-to Job Entry"),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Phase Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Task Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Step Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Job Applies-to ID"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Apply and Close (Job)"),DBMgt.GetNullInt,TRUE));
DBMgt.ExecuteSQL;[/code]
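For illustration, here is a Python sketch of what those calls assemble.  The comma handling mirrors GetFieldClearSQL; the CRONUS company prefix and the table name are assumptions for the example:

```python
# Sketch of the UPDATE statement assembly: every field but the last is
# followed by a comma, the last one by a space (as in GetFieldClearSQL).
def field_clear_sql(field_name, new_value, last_value):
    return '[{0}] = {1}{2}'.format(field_name, new_value,
                                   ' ' if last_value else ',')

lines = ['UPDATE [CRONUS$Sales Shipment Line] SET ',
         field_clear_sql('Pick No.', "''", False),
         field_clear_sql('Apply and Close (Job)', '0', True)]
statement = '\n'.join(lines) + ';'
```

However many fields are cleared, the table is still read and written exactly once.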

Similarly I use a single SQL statement to copy all needed fields from the source table to the temporary table.

[code]
// Copy Data to Temporary Tables
DBMgt.CreateSQL(STRSUBSTNO('INSERT INTO %1 (',DBMgt.GetFullTableNameSQL(TempSalesShipmentLine.TABLENAME)));
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Document No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Line No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað magn")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað í mælieiningu")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME(EDI)) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt magn")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt í mælieiningu")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pick No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Item Pick Group Code")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Original Order No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Best fyrir")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Síðasti söludagur")) + ' ');
DBMgt.AddToSQL(') SELECT ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Document No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Line No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað magn")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað í mælieiningu")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME(EDI)) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt magn")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt í mælieiningu")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pick No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Item Pick Group Code")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Original Order No.")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Best fyrir")) + ' , ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Síðasti söludagur")) + ' ');
DBMgt.AddToSQL(STRSUBSTNO('FROM %1 ',DBMgt.GetFullTableNameSQL(TABLENAME)));
DBMgt.ExecuteSQL;[/code]
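The assembled statement uses the same column list on both sides of the INSERT ... SELECT.  A Python sketch with a shortened field list (the prefixed table names are assumptions for the example):

```python
# Sketch of the generated INSERT ... SELECT: one column list, used twice.
fields = ['Document No.', 'Line No.', 'Pick No.']
column_list = ' , '.join('[{0}]'.format(f) for f in fields)
statement = ('INSERT INTO [CRONUS$Temp Sales Shipment Line] ({0}) '
             'SELECT {0} FROM [CRONUS$Sales Shipment Line];'
             ).format(column_list)
```

The whole copy runs server-side in one pass; no record ever travels through the NAV client.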

In this step I also need to delete large pieces of data.  For example, I am clearing big log tables in the first step of the upgrade to save time later in the process.  The TRUNCATE TABLE SQL command clears all data in a table and is very quick, but it works differently from the DELETE command: if the table has SumIndexFields in any of its keys, TRUNCATE TABLE will fail.  One option is to first disable all SumIndexFields and then use TRUNCATE TABLE; the other is to let NAV delete the data from tables with SumIndexFields.  Disabling all SumIndexFields requires an isolated command with a COMMIT at the end.  I found that none of my large tables had SumIndexFields, so I used this code.

[code]

TruncateTable(TableID : Integer)
IF OldObject.GET(OldObject.Type::Table,'',TableID) THEN BEGIN
  Company.FINDSET;
  DBMgt.CreateSQL('');
  REPEAT
    RecRef.OPEN(TableID,FALSE,Company.Name);
    TableKey.SETRANGE(TableNo,TableID);
    TableKey.SETFILTER(SumIndexFields,'<>%1','');
    TableKey.SETRANGE(MaintainSIFTIndex,TRUE);
    TableInformation.SETRANGE("Company Name",Company.Name);
    TableInformation.SETRANGE("Table No.",TableID);
    IF TableInformation.ISEMPTY OR NOT TableKey.ISEMPTY THEN
      RecRef.DELETEALL
    ELSE
      DBMgt.AddToSQL(
        'TRUNCATE TABLE [' +
        DBMgt.StringConvert(Company.Name) +
        '$' +
        DBMgt.GetTableNameSQL(RecRef.NAME) +
        '];');
    RecRef.CLOSE;
  UNTIL Company.NEXT = 0;
  DBMgt.ExecuteSQL;
  OldObject.DELETE;
END;[/code]
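Since NAV stores each company's data in its own `Company$Table` SQL table, the loop above emits one TRUNCATE statement per company.  A Python sketch with hypothetical company and table names:

```python
# Sketch: one TRUNCATE TABLE per company, mirroring the company loop.
companies = ['CRONUS', 'Subsidiary']
statements = ['TRUNCATE TABLE [{0}$Change Log Entry];'.format(c)
              for c in companies]
```

All statements are collected into one command file and executed in a single sqlcmd run.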

On the other end of the upgrade batch, where I want to use the data from the temporary tables, I also move the code to SQL.  Here I use the NAV 2013 Upgrade - SQL Mgt. codeunit unchanged to execute a MERGE SQL statement.

[code]
WITH VendPaymInfo DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF NOT ISEMPTY THEN
BEGIN
SQLMgt.ExecuteSQLCommand(
STRSUBSTNO(
'MERGE INTO [%1] T ' +
' USING [%2] S ' +
' ON T.[%3] = S.[%3] ' +
'WHEN MATCHED THEN ' +
' UPDATE ',
SQLMgt.GetFullTableNameSQL(DATABASE::"Banking Vend. Payment Info"),
SQLMgt.GetFullTableNameSQL(DATABASE::"Temp Vend. Payment Info"),
SQLMgt.StringConvert(FIELDNAME("Entry No."))
) +
STRSUBSTNO(
' SET [%1] = S.[%1], ' +
' [%2] = S.[%2]; ',
SQLMgt.StringConvert(FIELDNAME("Bank Branch No.")),
SQLMgt.StringConvert(FIELDNAME("Bank Account No."))
)
);
SQLMgt.ExecuteSQLCommand(
STRSUBSTNO(
'TRUNCATE TABLE [%1];',
SQLMgt.GetFullTableNameSQL(DATABASE::"Temp Vend. Payment Info")
)
);
END;
TempVendPaymInfo.DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;[/code]

Believe me, this is so much faster than the classic method of looping through the records, and even much faster than using separate UPDATE SQL statements.  As an example, I watched the SQL server use 3,260,000 B/sec of total disk activity with the loop-and-modify method, whereas this number went up to 230,000,000 B/sec with direct SQL statements.

Prepare to see your SQL Server finally working its socks off.

Upgrade Tools

2 thoughts on “Performance in data upgrade – lets speed things up”

  1. And who said that NAV upgrades were supposed to be easy? Thank you for sharing this, I will keep the post in my “favorites”. You never know when this will come in handy.
    Let us know how the final upgrade went and how long it took.

    1. I will, Erik.

      Now I am looking at “Update Posted Document Links”, which runs in Step 2 from 4.0 SP3 to 2009.  It would take up to five days to execute.  It is more complex than a single SQL statement, so I will save the “Item Ledger Entry” table from this test upgrade to use later.  By doing that I will only need to process new documents in the next upgrade.
