NAV/BC Version Numbers

Many others have posted this information before. I’m just putting this on my own blog because I have noticed a decrease in the number of ‘old’ blogs out there and this is getting harder to find. I’m just securing the information so that I know I’ll have it for as long as I keep my blog up.

When I started as a Navision developer in March of 2000, version 2.5 had just come out. Most of the clients that I worked for were on 2.01. The version that is current today (v21 for the 2022 Wave 2 release, and v22 is just a couple of months away) directly derives from that version back in early 2000. There are earlier versions than that, and you can search for the history yourself, but I have never encountered anything older than 2.01. I just wanted to have a handy list as a quick reference.

Here They Are

  • Navision in its various incarnations (Financials, Attain, Microsoft Business Solutions): 1, 1.1, 1.2, 1.3, 2, 2.01, 2.5, 2.6 (there were also versions with manufacturing and with Advanced Distribution, as well as the 2.65 a, b, c, d, and e specialty versions for the first NAS), 3.00, 3.01, 3.10, 3.60, 3.70, 4.0, 4.0 SP1/SP2/SP3, 5.0
  • Dynamics NAV 2009 is version 6 – also had SP1 and then R2
  • Dynamics NAV 2013 is number 7 – also had R2
  • Then there were NAV 2015 (8), 2016 (9), 2017 (10), and 2018 (11)
  • Version 12 was the first BC version, AKA Business Central 2018. For a while (at least through BC14) there was a bit of a split personality, with the splash screen saying “Microsoft Dynamics NAV Connected to Dynamics 365 Business Central”, a very catchy and easy-to-remember 24-syllable name
  • From then on there is a release every 6 months:
    • 13 – October 2018
    • 14 – April 2019 (this was the last version with C/SIDE)
    • 15 – BC 2019 wave 2
    • 16 – BC 2020 wave 1
    • 17 – BC 2020 wave 2
    • 18 – BC 2021 wave 1
    • 19 – BC 2021 wave 2
    • 20 – BC 2022 wave 1
    • 21 – BC 2022 wave 2
    • and so on and so forth

update 2023-02-14: added some missing versions

NAV on Docker in 2022

One of my clients asked me if I would be able to help them with an ‘upgrade’ of an add-on for Dynamics NAV for one of their customers. For this task I would have to get a working C/SIDE environment in a number of versions. It’s been years since I’ve done any C/AL development, and I thought this would be a cool task to work on. This post describes what I found does and does not work if you want to do this using Docker containers.

Background

Just to paint a picture… First of all, the end user is on NAV2017. They had an older version of this add-on, which was developed on NAV2013. The task at hand was to implement a newer version of the add-on, which was developed in NAV2018. Technically, this was a downgrade of the add-on objects so I had to be careful to avoid any incompatible object attributes. I won’t bore you with the details of the actual ‘upgrade’, nobody wants to read about those.

The Environments

To be able to identify the mods of the original add-on, I needed a C/SIDE environment for NAV2013R2. Since this version is not available in containers, I had to actually install it.

The end user is on NAV2017, and the ‘new’ version of the add-on is in NAV2018. Both of these versions are available in containers, and supposedly all you need to do is put together the correct artifact URL. You can find this information on Freddy’s blog. Mind you though, the localization for the US is called ‘na’ in NAV, not ‘us’ like in BC.
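
For reference, putting the artifact URL together is easier these days with the BcContainerHelper module. The Cmdlet names below are real; the container name and the CU value are just example values, so double-check the details against Freddy’s blog:

    # assumes the BcContainerHelper PowerShell module is installed
    Import-Module BcContainerHelper

    # 'na' is the US localization in NAV; 'cu10' is just an example value
    $artifactUrl = Get-NavArtifactUrl -nav '2018' -cu 'cu10' -country 'na'

    # 'nav2018' is an example container name
    New-NavContainer -accept_eula `
        -containerName 'nav2018' `
        -artifactUrl $artifactUrl `
        -auth NavUserPassword `
        -credential (Get-Credential) `
        -includeCSide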

How about Docker?

So what does and does not work? This, to me, is a pragmatic question. I spent quite some time trying to make NAV2017 and NAV2018 work in containers, because I have used them successfully in the past. I have a terrible memory though, so I always start from scratch, and what I could find online was outdated. At some point I just started the NAV2017 DVD download while I was researching a problem. The download completed before I found the answer, so I abandoned the container idea for NAV2017 and just installed it. I have plenty of VMs available to do this, so one way or another I’ll get a working instance.

After going through a bunch of troubleshooting and following obsolete download links, I was able to make the NAV 2018 container work. Freddy wrote about troubleshooting here, but not all of the links still work. You have to enable the .NET Framework 3.5 and 4 features in Windows, you need the Visual C++ redistributable for Visual Studio 2015 (“The program can’t start because MSVCP120.dll is missing”), and you have to get working SQL bits (weird – the Windows client did work, but C/SIDE did not until I installed those). What made the ODBC errors go away for me was the SQL Server 2012 Native Client. The SQL link in Freddy’s blog did not work for me.
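
For my own future reference, the host-side fixes boil down to something like this in an elevated PowerShell prompt. The Windows feature names are the standard ones on Windows 10; the installer file names are placeholders, because the download links keep moving:

    # .NET Framework 3.5 and 4.x Windows features
    Enable-WindowsOptionalFeature -Online -FeatureName 'NetFx3' -All
    Enable-WindowsOptionalFeature -Online -FeatureName 'NetFx4-AdvSrvs' -All

    # Visual C++ redistributable (cures the MSVCP120.dll error); installer name is a placeholder
    Start-Process -FilePath '.\vcredist_x64.exe' -ArgumentList '/install', '/quiet', '/norestart' -Wait

    # SQL Server 2012 Native Client (this is what made the ODBC errors go away)
    Start-Process -FilePath 'msiexec.exe' -ArgumentList '/i', 'sqlncli.msi', '/qn', 'IACCEPTSQLNCLILICENSETERMS=YES' -Wait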

I could not get NAV2017 to work at all; the container would not even start. As I said, I got the DVD downloaded before figuring out the problem, and I installed it from there. It’s not like I have NAV2017 clients lining up, so I did not want to spend a second more than I had to on this.

Lesson Learned

In the end I installed NAV2013R2 and NAV2017, and was able to get a NAV2018 container up and running. The lesson learned though? I bet it is still possible to get the NAV2017 container to run right, but just to be safe I have downloaded all versions of the DVD going back to NAV 5.0. To Microsoft’s credit, they still have most of those available as downloads, but you never know when they will remove them.

This post is mostly for my own benefit, but I wanted to share it if anyone out there also needs these. Let me know in the comments how you’ve made NAV and C/SIDE work in containers.

Translations for Business Central Apps

My struggle to get the new translation files right has been real, and too long to recount on this blog. When I finally got it right, it felt more like coincidence than actual skill.

It started with the app submission checklist, where one of the requirements is to “include all translations of countries your extension is supporting”. The app that I was working on was targeted for the US market, so all I needed was a translation file for ‘en-US’. The page on docs that explains translations explained what I needed to do. Or did it?

Turning on the translation feature in the app.json file and replacing all ML captions and such was easy enough. The trouble begins when you run into the details.
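
For reference, turning the feature on is a single property in app.json. This minimal snippet shows just that property; it sits alongside the usual ones like id, name, and publisher:

    {
      "features": [ "TranslationFile" ]
    }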

As soon as you rebuild the app, VSCode generates the default translation file in a new folder called ‘Translations’, and the translation file is called “<YourAppname>.g.xlf”. Because the development language is ‘en-US’, this generated translation file is also specified for the ‘en-US’ language.

The translation page on Docs specifies that you need to copy the generated file “to avoid that the file is overwritten next time the extension is built”. Each time that you build the app it will re-create the generated xlf file. What that means is that you need to make a copy of the translation file and use the copy to do your actual translation work.
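
The copy itself is a PowerShell one-liner; ‘MyApp’ is a placeholder for your own app name:

    # do the translation work in the copy, never in the generated .g.xlf file
    Copy-Item '.\Translations\MyApp.g.xlf' '.\Translations\MyApp.en-US.xlf'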

So I did make that copy, and because I was working on a US only app, I felt I was done with my translation. I had the generated file in my workspace, I had a copy of it for the ‘en-US’ language, I thought I was good to go. Alas, the app submission failed because they could not find any translation file in my app. Something was missing from my translation file.

The generated file only has nodes for “source”, which comes from all of your text strings like labels and captions and comments and such. The translation itself is in a node that is NOT included in the generated xlf file, you have to create that. The name of this node is “target”, and this is where the actual translation goes for each source element.

So I copied and pasted all of the source nodes and made target nodes out of them. By this time I had a direct line to the validation person at Microsoft, who was willing to hop on a Skype call and look at my workspace. He was also surprised to see that my workspace DID have the translation file, while the app file that I sent to him did not. As it turns out, the target node needs to have an attribute called “state” with a value of “translated”.
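
To make that concrete, here is roughly what a finished trans-unit should look like. The id and the caption text are made up, but the source and target nodes and the state attribute are exactly the pieces described above:

    <!-- the id and texts below are invented examples -->
    <trans-unit id="Table 123456789 - Field 987654321 - Property 2879900210" size-unit="char" translate="yes">
      <source>Customer Name</source>
      <target state="translated">Customer Name</target>
    </trans-unit>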

To get the translation file into its proper state, you should use a proper tool, and I found that the Microsoft Multilingual App Toolkit editor is the easiest one to work with. It puts the right elements into the xlf file, and when I used that tool, my app file was finally accepted.

Announcing NAV 2018

Real quick one today – James Phillips announced General Availability of Microsoft Dynamics NAV 2018.

Marko announced it on Twitter, and the tweet was retweeted by just about everyone I know. The promise made at the closing keynote at Directions in Orlando is hereby delivered!

Can’t wait to see when they announce “Tenerife”.

NAV Techdays 2017 Recap

My favorite week of the year has just ended. I’m in the high speed train from Antwerpen back to Amsterdam, which is over just like that so I don’t have much time to make this anything elaborate.

As per usual, the organization was superb. The venue is fantastic, with a great expo area, lots of good food and drink choices, and the seats in the great rooms are just about the most comfortable seats you can imagine. The Kinepolis is a movie theater that you can also rent for events. I think I speak for everyone when I say this is one of the most important features, and I hope we will never have to move to a different location.

My week started with a full day pre-conference workshop about automated testing in NAV. This workshop was hosted by none other than Luc van Vugt, who, as per usual, delivered a solid day of learning. I had misread the workshop description and the correspondence that we had prior to the workshop, so I did not get everything out of it that I could have, but Luc was so kind as to offer assistance so that I could do the exercises at home. Any time you have an opportunity to train under Luc, you should take it.

My company was a sponsor, so we had a booth to staff. It was a pleasure to be there and talk to anyone who had any questions about what we can offer.

As per usual, the two conference days were stuffed with 90-minute deep-dive sessions on any topic you can think of. My favorite ones were Waldo’s “Rock ‘n Roll with VS Code”, Anders and Nikola’s “Creating great APIs”, and the Docker session by Freddy, Tobias and Jakub. Fortunately for you, you can watch all of the sessions on YouTube; Luc had all of them uploaded within a week of the conference.

I can’t say enough about this conference. For any technical resource in the NAV channel (and I include Dynamics 365 for Finance and Operations in that category), NAV Techdays is a must to attend, every single year. This is the second time that I’ve gone, and I am still kicking myself for not going the first few years. If I can help it, I will not make that mistake again.

See you next year in Antwerpen!

Directions 2017 EMEA Recap

After a supremely busy week, with lots of last-minute session and workshop work, I’m getting ready to go sightseeing in Madrid with my wife and some dear friends. This week has been frustrating, as well as satisfying and educational – although not as educational as I would have wanted, because I was too busy delivering sessions and workshops to attend any myself.

One of the key takeaways is that Microsoft is going to release NAV 2018 in December this year, so 4-5 months ahead of the spring release of “Tenerife”. The whole white label thing seems to be gone altogether, so partners can continue to use the Microsoft name in their marketing for their Dynamics 365 products. Still, there is some need for clarity about licensing, and about the long term future.

What everybody needs to acknowledge is that we now live in a world that is being disrupted continuously. Today, Microsoft is heavily investing in their current roadmap, one that they feel very strongly will succeed in the long run. You must understand that this world is moving at the speed of light, and if something happens to our ecosystem, Microsoft WILL react. When they do, they will focus on what they feel is their best chance to survive in this changed world, so it is up to US to make that happen.

One of the presentations showed a slide of where Dynamics 365 “Tenerife” fits into our world: it is just one of many boxes. If the surrounding boxes change, there may be a completely different role for “Tenerife”. It might very well happen that the market shifts in such a way that there won’t be a need for it at all. Given everything that I’ve seen this week, I think the “Tenerife” story is awesome, and we are going to absolutely crush it in terms of features and capabilities. What I don’t know is whether we can grow this cloud business enough to remain strong over the long term. The key, though, is to fully embrace the entire picture, not just the ‘NAV’ part of it. The days of just ‘NAV’ are over, and they’re not coming back.

If anything, what I got out of this week is that we all must play a crucial part in the success of our entire market. WE the partner channel, WITH Microsoft – NOT Microsoft alone – will determine the success of this market. If you want to be a part of this ecosystem, you had better adapt and embrace what is here to stay. Fighting it is a lost cause, and you will be left behind.

I say we take a deep breath, a chill pill, we take a good look at what we have to do, and we roll up our sleeves and make it happen.

Registered for NAV Techdays 2017

It’s that time of the year. The official announcement came in on Twitter that registration is open for NAV Techdays 2017, which is again held in Antwerp of course. The official two day conference is 16 and 17 November, but I consider the pre-conference workshops to be part of the event, so a full 4 days of deep technical knowledge sharing.

The session schedule has not been published, but we do know the pre-conference content; go take a look at the Sessions page to check out what is available to you. The familiar sessions will cover PowerShell by Waldo, JavaScript by Vjeko, automated testing by Luc van Vugt, SQL Server performance by Jörg, and of course the 2-day design patterns class by Mark. The new topics include how to SaaSify your software architecture, and how to use Visual Studio Code (it is very important to learn about this one) and develop extensions with it, by Arend-Jan. Another exciting one is the SCM workshop by Sören, who will show you how to make source code management work directly in VSCode.

Only a few regular sessions have been announced (some fantastic content by our friends at Microsoft), but I have a feeling that we will see a super deep dive into all things related to Dynamics 365. I cannot wait to go to Antwerp, and I hope to see you there.

Localize Objects with PowerShell and VSCode

In this article I will explain how you can use PowerShell to extract the right objects and merge those objects, and how to use Visual Studio Code to resolve most of the conflicts in the merged objects.

In a previous article, I explained how you can use PowerShell to create the environments. The script in this article actually uses the variables that are assigned in the other one. What I should really do is create a configuration file and load that from both scripts. I wanted to share what I have so far though, without a handy configuration file but hopefully helpful nonetheless.

We start off with the 4 environments that were created in the previous article: ORIGINAL, MODIFIED, TARGET, and RESULT. The first two environments contain standard W1 and the ISV product. TARGET and RESULT are identical at this point, and they both contain the standard NA localization. All environments are on the same build (NAV 2017 CU2 in this case). Because I have a developer license that has insert rights in the ISV number range, my strategy is to merge the product modifications into the NA database. If you do not have those insert rights, then a better strategy would be to merge the NA localization into the product environment instead. Microsoft has recently added insert capabilities for the standard object ranges to regular developer licenses, for this particular purpose. As I was working through the conflict objects of this assignment, I was thinking that this may even be the better default strategy anyway.

Due to a limited amount of time, and a limited amount of PowerShell skill, I decided to approach this task pragmatically:

  • Manually move the ISV specific objects to the RESULT environment
  • Use PowerShell to export all the objects
  • Manually eliminate the unmodified objects
    • NOTE: as I learned later, Waldo’s PowerShell modules actually have logic to remove these after the merge, so this can be scripted as well
  • Use PowerShell to merge the objects
  • Use Visual Studio Code (AKA VSCode) to resolve the conflicts as much as possible
  • Use PowerShell to join the objects
  • Use C/SIDE to resolve any remaining conflicts

The conflict resolution is something that has to be done manually. Everything else can be scripted in PowerShell. During this process I’ve asked Waldo for some help, and he explained that most of what I am doing here is already part of the merge sample scripts. In the Cloud Ready Software PowerShell module, there is a folder called “PSScripts”, and in that folder you will find a large number of scripts that you can use as an example to get started on your task. As you gain experience in using PowerShell, you will recognize a lot of useful features in those scripts, and you can modify them to your specific needs.

Alright, on to the details. The first part is to use PowerShell to extract the objects into their own folders. I also added commands to create the folder structure itself:
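
Here is a minimal sketch of that part. Export-NAVApplicationObject and Split-NAVApplicationObjectFile are the standard Development Shell Cmdlets; the variables come from the environments article, and you repeat the export and split for MODIFIED and TARGET:

    # create the folder structure ($WorkingFolder comes from the environments script)
    $ObjectsFolder = Join-Path $WorkingFolder 'Objects'
    'ORIGINAL','MODIFIED','TARGET','RESULT' | ForEach-Object {
        New-Item -Path (Join-Path $ObjectsFolder $_) -ItemType Directory -Force | Out-Null
    }

    # export all objects from the ORIGINAL database into one big text file...
    Export-NAVApplicationObject -DatabaseServer $DatabaseServer -DatabaseName 'ORIGINAL' `
        -Path (Join-Path $ObjectsFolder 'ORIGINAL.txt')

    # ...and split that file into individual object files
    Split-NAVApplicationObjectFile -Source (Join-Path $ObjectsFolder 'ORIGINAL.txt') `
        -Destination (Join-Path $ObjectsFolder 'ORIGINAL')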

At this point, you will have an “Objects” folder in your working folder, with full object files for ORIGINAL, MODIFIED, and TARGET. Those object files have then been split into individual object files in the ORIGINAL, MODIFIED, and TARGET folders. Before using PowerShell to merge those objects, I used a text compare tool (I like Scooter Software’s Beyond Compare) to eliminate unmodified objects. Remember, I only want to work on the modified objects, so I don’t have to worry about getting confused by objects that were not changed for the ISV product.

Now that we only have modified objects in all three object folders, we’ll use the merge Cmdlet to do the actual merging of the objects.
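
The merge step looks something like this. Merge-NAVApplicationObject and its parameters are the documented standard ones; the prefix parameter on the Cloud Ready Software Cmdlet is from memory, so verify it with Get-Help:

    # three-way merge of ORIGINAL, MODIFIED and TARGET into RESULT
    $MergeResult = Merge-NAVApplicationObject `
        -OriginalPath (Join-Path $ObjectsFolder 'ORIGINAL') `
        -ModifiedPath (Join-Path $ObjectsFolder 'MODIFIED') `
        -TargetPath   (Join-Path $ObjectsFolder 'TARGET') `
        -ResultPath   (Join-Path $ObjectsFolder 'RESULT') `
        -PassThru

    # merge the Version List tags for the given prefixes (Cloud Ready Software Cmdlet);
    # the parameter name for the prefixes is an assumption, check Get-Help on your system
    $MergeResult | Merge-NAVApplicationObjectProperty -VersionListPrefixes 'NAVW1','NAVNA'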

As you can see, the output of the “Merge-NAVApplicationObject” Cmdlet is loaded into the variable “$MergeResult”. This object is then piped into the “Merge-NAVApplicationObjectProperty” Cmdlet. The first Cmdlet is a standard NAV PowerShell Cmdlet; the second one comes with the Cloud Ready Software modules. The standard merge Cmdlet does not merge the Version List property; it simply takes the Version List from one of the three environments. We can have a discussion about whether the Version List is even important anymore, especially if you use Source Code Management, but the reality is that most NAV developers depend on proper tags in the Version List, so it is useful to merge those as well. All you need to do is specify the prefixes that you want to merge (in my case NAVW1 and NAVNA) and it will find the highest value for each of those prefixes. All other prefixes are simply copied into the RESULT objects.

At this point, we have all merged files in the RESULT folder. This includes the objects that were merged successfully, but also the objects that the merge Cmdlet could not resolve, for instance when code was added at the same position in MODIFIED and in TARGET. Since we are going to have to learn how to use VSCode, I decided to use that to resolve the conflicts.

With VSCode installed, you can open the RESULT folder and see the content of this folder inside the file browser at the left hand side. In the upper right hand corner is a button that you can use to split the screen into two (and even three) editor windows. This is very useful for resolving merge conflicts, because you can open the conflict file in one side, and the object file in the other side. You are editing the actual object file here, so you may want to take a backup copy of the folder before you get started.

Now what you must understand is that the conflict files only contain the pieces that the merge Cmdlet could not figure out. Out of a total of 860 objects with 6956 individual changes, it was able to merge 96.4% of those changes. An object that has 4 conflicts can also have a ton of other changes that the merge Cmdlet merged successfully; all YOU have to do is focus on the ones that need manual attention. For instance, codeunits 80 and 90 had a TON of modifications, but the merge only needed help with 3 of them.

I had a total of 115 conflict files, and I could completely resolve 112 of them in VSCode. I made a note of the few that remained, and decided to import those unchanged into C/SIDE, so I can finish those off in the proper IDE.

The last PowerShell command that I used creates a single object file, which can then be imported into C/SIDE. Then I finished the last few remaining object files, and was able to compile all objects from there.
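
That last command is the Join Cmdlet; a minimal sketch:

    # join the individual RESULT files back into a single importable text file
    Join-NAVApplicationObjectFile -Source (Join-Path $ObjectsFolder 'RESULT\*.txt') `
        -Destination (Join-Path $ObjectsFolder 'RESULT.txt')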

I’ve done many of these merges completely manually. To say that I was skeptical that PowerShell would do a good job is putting it mildly. I flat out did not trust the merge Cmdlet; surely it could not do as good a job as I could. I was wrong. I checked a bunch of objects to see if I could find any mistakes, and I could not find any. Not only did the merge Cmdlet do a fine job at merging the objects, it did so in about 5 minutes flat.

You can download the scripts here.

Instead of having to manually merge almost 900 objects, all I had to do was focus on the conflicts. Usually, a vertical merge like this would take me anywhere from 2 to 4 weeks, and I was able to finish this one in less than two days. Figuring out the PowerShell scripts took me much longer, but I will be able to use those for the next merge task.

Create NAV Environment with PowerShell

If you kind of know about PowerShell, and you want to use it more, but you don’t really know where to start, then you should read on. I am going to explain to you how you can use PowerShell to create the environments that you need to localize a product.

One of the things that always seems to fall on my plate at my job is taking an ISV product and merging it into the North American localization. I call it ‘localizing a product’. It is not, because localizing a product is much more than simply merging the objects, but it is what you have to do first. I’ve done a bunch of these manually, and I kind of enjoy the almost mindless nature of working my way through hundreds of objects (sometimes even thousands; I once ‘localized’ a product that had more objects than standard NAV itself).

This time around, I wanted to use PowerShell to automate as much as I could. Lucky for me, I attended Waldo’s “PowerShell Black Belt” workshop at NAV Techdays (read my review of this fantastic event here) and I should have all the tools to get this started.

The first thing you need to do is install Waldo’s PowerShell modules; I will be using those for just about everything. I could figure out how to script all of this myself, but why would I, when Waldo has already done that for us and is sharing his scripts? For instructions, read Waldo’s blog here.

This post focuses on creating the environments themselves, and for this task you really need just a few things:

  • A SQL backup for the product in the W1 version of NAV, and make sure that you know exactly which build this was developed in. For instance, the product that I am working with was provided as a SQL Backup, and it was developed in NAV 2017 CU2.
  • The DVDs for standard W1 and NA for the same NAV build.
  • A development license that has insert rights for the product’s number range. If you only have a regular developer license, or if the ISV refuses to provide you with a license (this happens more often than not), then you will have to merge NA into the product database instead. This works exactly the same; you just have to figure out which environments are your ORIGINAL, MODIFIED and TARGET.

Use the NA DVD that you just downloaded and install the NA version with a demo database. I like to give the database a meaningful name, so I called it ‘NAV2017NACU2’. This is all the manual installing that we will do; everything else is PowerShell, baby! Actually, you could use Waldo’s PowerShell modules to do the install as well, and you could even write a script that downloads the DVD for you.

If you understand the mechanics of localizing a solution, you will know that we need 4 databases in total: standard W1, Product database in W1, standard NA, and a copy of the standard NA database that will become the product database. In this case, standard W1 is ORIGINAL; product W1 is MODIFIED; standard NA is TARGET and product NA is RESULT. This will become important when we do the merge, which will be another blog post.

Now the goal is to have a re-usable script, so we’ll use variables instead of having to re-type values multiple times.
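
Sketched out, those variables look something like this; the names and paths are just my choices, so set them however you need them:

    # server and folders
    $DatabaseServer = 'KERPLUNK'              # default instance; a demo install would be 'KERPLUNK\NAVDEMO'
    $WorkingFolder  = 'C:\NAV\ProductMerge'   # placeholder path
    $BackupFolder   = Join-Path $WorkingFolder 'Backups'

    # the 4 names double as database names and service tier names
    $OriginalName = 'ORIGINAL'   # standard W1
    $ModifiedName = 'MODIFIED'   # product in W1
    $TargetName   = 'TARGET'     # standard NA
    $ResultName   = 'RESULT'     # copy of standard NA, becomes the product database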

Remember, I installed standard NA. I created the working folder and placed the three backup files into the ‘Backups’ folder. I like to keep things together, but I can imagine having a network location for the standard backup files. The nice thing about using these variables is that you can set them however you need them :). My database server is called KERPLUNK, which is just a regular unnamed instance on my VM (also called KERPLUNK). You could also use a demo installation, and then you’d have to set it to ‘KERPLUNK\NAVDEMO’. I will use the 4 names for the database names as well as the service tier names.

The rest of the script is surprisingly simple to put together, here it is:
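
In outline it looks something like this. Copy-NAVEnvironment and New-NAVEnvironment are Waldo’s Cmdlets, and the parameter names below are my reconstruction of them, so verify everything with Get-Help on your own machine:

    # parameter names on Waldo's Cmdlets are reconstructed from memory – check Get-Help

    # clone the manually installed NA instance; this step also
    # turns on port sharing for both server instances
    Copy-NAVEnvironment -ServerInstance 'NAV2017NACU2' -ToServerInstance $ResultName

    # restore each backup into its own database plus service tier, port sharing included
    foreach ($Name in $OriginalName, $ModifiedName, $TargetName) {
        New-NAVEnvironment -ServerInstance $Name `
            -BackupFile (Join-Path $BackupFolder "$Name.bak") `
            -DatabaseServer $DatabaseServer
    }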

The reason why I start with the ‘Copy-NAVEnvironment’ step is that this script turns on port sharing for both server instances, so I don’t have to take care of that step myself. This is one of Waldo’s scripts that looks at the first server instance and creates a new one just like it: it creates a SQL backup from the service instance, restores that into a new database, creates a new server instance for that new database, and enables port sharing by default. ‘New-NAVEnvironment’ is also one of Waldo’s scripts, and it does all the heavy lifting of restoring the database, creating the service tier and setting up port sharing. The full script took maybe 2-3 minutes to run for me; not bad for something that used to take all morning.

This same script could also be used to create your environments for developing extensions. For that you only need a standard database for ORIGINAL and a development database for MODIFIED, both of which will then be used to create the DELTA files for the extension package. In my own environment I also have an APPDEV and an APPTEST database; both of these were originally created as copies of the standard database, also using PowerShell.

I’m planning to also put this into a video but I actually have this work to deliver so I’ll focus on that first. Up next is getting the objects out and comparing them to create a set of merged objects. Stay tuned!

Update 4/22/2017: added link to the new article about localizing the objects, and you can download the scripts here.

NAV codebase for Dynamics 365

Something that was kind of a big deal happened this week. During the keynote at Directions ASIA in Bangkok, Marko Perisic (General Manager, Microsoft Dynamics SMB) announced that Dynamics 365 for Financials and Dynamics NAV will operate on the same codebase.

There are two reasons why this is a big deal. First, it falls right in line with Microsoft’s “AND strategy” when it comes to cloud ERP and on-premise ERP. BOTH are essential to Microsoft, and BOTH will have a place in their product line. Second, and this is where NAV is kind of unique, it will allow Dynamics 365 for Financials to be “Full NAV” in the cloud. Let that sink in: Microsoft is committing to having full Dynamics NAV functionality in Dynamics 365. Whether you implement in the cloud or on premise, your ERP will operate on the same codebase. As far as I can tell, there are not many ERP products that provide this, although I have a feeling it is a trend that many will follow.

The Dynamics 365 codebase has always been the same as NAV’s, but only parts of its functionality were exposed. The assumption was that the Dynamics 365 codebase could potentially diverge from the on-prem version of the product, although how that would happen was not always very clear. Now that there is a firm commitment to keeping a single codebase, the next step is that Dynamics 365 will at some point provide “Full NAV” capabilities, which was in fact also part of the same keynote.

The nuts and bolts are still in progress. Dynamics 365 is updated constantly and NAV only gets monthly cumulative updates and annual version updates. There is also a lot of movement in how add-ons and customizations will be implemented, although the magic word there is “Extensions”.

For Microsoft to come out and announce the shared codebase though… that is a Very Big Deal.