Dynamics NAV data upgrade times taking too long?

UPDATE

I’ve seen that a NetSuite marketing guy has used this blog post on LinkedIn to say that NAV is hard to upgrade. For pre-2013 versions he might have a point, but that’s no longer the case. This instance was bringing an on-premise customer up through seven releases to the current 2016 release. That’s a choice NetSuite don’t give you: you have no option but to upgrade when it suits them, regardless of the issues that may cause your staff or business. You also cannot have the software on premise if you want it; NetSuite don’t believe in giving customers choices.

If anyone from NetSuite is reading this, there are a couple of other things I'm curious about. What exactly would it cost this client each month to run their 1.2Tb-plus database on NetSuite? This customer has had Dynamics NAV since 2007; how much would they have paid since then under NetSuite's subscription-only model, using the subsequent years' costs of course, rather than just the heavily discounted year-one fees?

Original Post

This weekend I’m doing what I haven’t done for a long time and supervising the upgrade of a client’s data from Microsoft Dynamics NAV 2009 R2 to 2016. Normally one of our developers does this, but due to a family bereavement on Thursday our plans had to change, so I’m doing my shift.

This client has a largish data set – not the largest one I know by some distance, but at 215Gb it’s not Cronus-sized either. What is notable is that they have 2.3 million sales shipments and sales invoices. These guys are distributors with ten years of history, and for warranty reasons they need to keep that history.

So why the blog? Well, despite starting at 7pm on Friday, we are still sweating over whether we will finish in time for the start of business on Monday morning. The upgrade is running on a server that is set up as well as possible, with multiple processors, 256Gb of memory and a SAN that includes flash-based as well as mechanical storage. I'm not sure what we could do on the platform side to make it go any faster.

One has to ask: why did ‘stage one’ take 36 hours to run? I became especially curious when I saw the modify counter for sales invoice headers apparently reset to zero multiple times. I had a look at the codeunit that was running and found the following code:

WITH SalesShipmentHeader DO
  IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
    MODIFYALL("Date Sent",0D);
    MODIFYALL("Time Sent",0T);
    MODIFYALL("BizTalk Shipment Notification",FALSE);
    MODIFYALL("Customer Order No.",'');
    MODIFYALL("BizTalk Document Sent",FALSE);

    TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
    StateIndicator.EndUpdateTable(TABLENAME);
  END;

WITH SalesInvoiceHeader DO
  IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
    TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
    MODIFYALL("Date Sent",0D);
    MODIFYALL("Time Sent",0T);
    MODIFYALL("BizTalk Sales Invoice",FALSE);
    MODIFYALL("Customer Order No.",'');
    MODIFYALL("BizTalk Document Sent",FALSE);

    TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
    StateIndicator.EndUpdateTable(TABLENAME);
  END;

It's too late this time, and the routine is already running, but this looks wrong to me. Doing more than two MODIFYALLs in a row on the same table like this is going to be awfully slow, especially when it's running through the unoptimised Dynamics NAV 2009 client. It means the server is pulling back the same record and updating just one field, five separate times. It would be much faster to loop through once and update all five fields in a single write. Better still, one SQL UPDATE statement would be vastly quicker than that.
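Rewritten as a single pass in C/AL, the invoice-header step might look something like this – a sketch only, reusing the field names from the code above and assuming NAV 2009 record semantics:

WITH SalesInvoiceHeader DO
  IF FINDSET(TRUE) THEN
    REPEAT
      // Clear all five Commerce Gateway fields in one go,
      // so each record is written once instead of five times.
      "Date Sent" := 0D;
      "Time Sent" := 0T;
      "BizTalk Sales Invoice" := FALSE;
      "Customer Order No." := '';
      "BizTalk Document Sent" := FALSE;
      MODIFY;
    UNTIL NEXT = 0;

The same shape would apply to the shipment headers, with "BizTalk Shipment Notification" in place of "BizTalk Sales Invoice".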

I’m also of the mind that when we force the upgrade’s new-version table schema over the top of these tables, the fields that are no longer required will be deleted anyway. Do we really have to clear the fields before they are removed? Isn’t that old thinking we should drop?

For our 2.3m records each MODIFYALL took about five hours. I’m pretty sure that when I rewrite it as a simple loop it will take maybe seven hours for all five fields, which is a massive improvement. Better still, I’m going to check first whether the fields are actually populated at all: they relate to the Commerce Gateway module, which is only used in about one in fifty installations. Potentially we don’t need to do any of it!
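One way to make that check is to filter each field for a non-default value and test ISEMPTY, which NAV 2009 supports. A sketch, assuming a local Boolean named NeedsClearing:

// Skip the clearing pass entirely if no record carries
// Commerce Gateway data. The two filters below are
// illustrative; in practice you would test all five fields.
WITH SalesInvoiceHeader DO BEGIN
  SETFILTER("Customer Order No.",'<>%1','');
  NeedsClearing := NOT ISEMPTY;
  SETRANGE("Customer Order No.");
  IF NOT NeedsClearing THEN BEGIN
    SETRANGE("BizTalk Document Sent",TRUE);
    NeedsClearing := NOT ISEMPTY;
    SETRANGE("BizTalk Document Sent");
  END;
END;
IF NOT NeedsClearing THEN
  EXIT;  // the module was never used here, so skip the whole step

ISEMPTY only asks SQL whether any row matches the filter, so these checks cost almost nothing compared with even one table-wide MODIFYALL.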

What is for sure is that we will do some work to optimise this whole process. Repeated MODIFYALLs on the same table mean slower performance, which is not what you want when you're trying to minimise downtime.

So a valuable lesson learnt for me: don’t accept data upgrade timetables at face value. Ask whether the code has been optimised before throwing super hardware at it or settling for longer shutdowns.

There are significant time gains to be had.