Repair Companion Tables

One of my clients has been seeing weird data issues when migrating databases to the cloud. The most important purpose of this post is to point out a handy data tool that is buried deep in the application; read on to learn what I am talking about.

The problems that my client is experiencing mostly have to do with the various ways in which they get data from OnPrem to the cloud. One way is to use the cloud migration tools; another is configuration packages. On top of the different tools, it feels like there may be some issues with the BC platform’s capability to properly manage data in table extensions. I’ll tell you why I think that.

Missing Data

The first indication was a problem where invoices were migrated into the cloud. The migration process seems to have completed, and the posted invoice list shows a list of invoices. The weird part is that when you try to open an invoice, you get an empty invoice page. Click ‘View Table’ from the page inspector, and you get nothing: a list of zero records, even though the posted invoice list shows the records. Go to the admin center to look at the capacity, and it tells us there are some 1154 invoices, but when you drill down into that number you get another empty list.

This feels very familiar to me, very much like when a lot of companies tried to migrate straight into SQL Server and we would see null field values. As we all know, BC can’t handle null values. Instead of getting an error saying ‘there are null values!! I don’t know what to do!!’ you get weird behavior like empty lists for tables that you know have plenty of records.

Solving The Problem

With the explosion of moving functionality into separate apps, I had a feeling that the problem had something to do with table extensions (which, as you know, have their added fields in what are called ‘companion tables’). I posted the question on Twitter, and there were suggestions to uninstall and re-install apps. This worked sometimes, but not all the time.

The original Tweet asking for help on this issue

What it feels like to me is that either there are null values in records, or maybe records are missing from companion tables altogether. We don’t have access to Azure SQL, so there is no way for me to actually prove that there is an issue with table extensions. SOMETHING is wrong here though, and for the longest time the only way that I thought we could fix it was to uninstall/re-install apps until the issue went away. The reason why this works is that each time you install an app, the platform updates the schema and makes sure that data integrity remains intact. In other words, it makes sure that all records in the main tables have corresponding records in all companion tables.

And then I became aware of a VERY handy little tool. This tool is not available when you are working in local containers, but when you are in a cloud tenant, you will have access to it. I’m talking about the “Repair Companion Table Records” process in the “Cloud Migration Management” page.

The tool goes through all tables that have table extensions, and makes sure that each record in an extended table has a corresponding record in each companion table. I still can’t prove my theory, but I do know that running this process fixed the problem for my client.

Remove BC Bloatware

The number of apps that come with a standard container has exploded over the past handful of releases. My local setup involves doing development in Hyper-V virtual machines, so local system resources are at a premium. It is very easy to trim the fat, so to speak; read on if you want to know how.

What’s with these apps?

From foreign languages to IRS reports. From integration and migration tools to email functionality, and even connectors to external systems like the Shopify app. Having these in separate apps is great, because it is very easy to get rid of them.

The downside is that each app comes with its own set of table extensions, which are implemented as companion tables. Each time that a database action is taken against the main table, the system also has to maintain each companion table. Multiply this by the number of companies in your system, and you can imagine the performance hit this could cause.

Funny as it may seem if you know what my desk looks like, I HATE clutter in the extension list, especially if it is functionality that I just don’t ever use. I will NEVER have a need for the Norwegian language, nor will I EVER need the PayPal links in a local container. In other words, I will never need the vast majority of these apps. Most importantly though, I had started noticing a real slowdown in the performance of my local containers. I had even started wondering if it was time to replace my machine.

Get rid of these apps!

Lucky for us, it is VERY easy to get rid of apps. The BcContainerHelper module has two commands for you. One is UnInstall-BcContainerApp, which we will use in today’s post. The second one is UnPublish-BcContainerApp, which is useful in case you want to get rid of the app altogether. If you want to be a real ninja about it, follow the links to see the underlying PowerShell logic that you can use for inspiration. Me, I like to keep it simple, so I’ll use the BcContainerHelper Cmdlets.

If you were thinking ‘never say never’ when you were reading the previous paragraph, you were absolutely right. What if I get a Norwegian customer tomorrow? Let’s use the uninstall Cmdlet. This will leave the app in the system, ready to be installed again at a later date:

UnInstall-BcContainerApp `
    -containerName 'MyContainer' `
    -name 'Shopify Connector' `
    -doNotSaveData `
    -doNotSaveSchema `
    -force

  • The name of the app is enough to uninstall it. You could also specify the publisher and version if you want, but for our purposes the name is enough, since all apps in the standard container are published by Microsoft
  • The ‘doNotSaveData’ parameter makes sure that the data is deleted from the companion tables, which is important because we are going to get rid of those with the next parameter
  • The ‘doNotSaveSchema’ parameter removes the companion tables from the system. If you do not set this parameter, the schema will remain in the app database

If you are absolutely certain that you will never use the app, you can use the UnPublish Cmdlet instead and REALLY clean up that app list.

Bonus Company Removal

The standard container also comes with a pre-configured company called ‘My Company’ that I personally never use, and we have a command to remove that too:

Remove-CompanyInBcContainer `
    -containerName 'MyContainer' `
    -companyName 'My Company'

Get rid of half the companies, get rid of 50% of the unnecessary companion tables.

Put it All Together

In my personal ‘arsenal’ of goodies, I keep a set of scripts to create new containers. One of them is a script called ‘RemoveBloatware.ps1’ that lists just about every app in the standard container, something like this:

$MyContainerName = 'MyContainer'

Remove-CompanyInBcContainer `
    -containerName $MyContainerName `
    -companyName 'My Company'

UnInstall-BcContainerApp `
    -containerName $MyContainerName `
    -name 'AMC Banking 365 Fundamentals' `
    -doNotSaveData `
    -doNotSaveSchema `
    -force

UnInstall-BcContainerApp `
    -containerName $MyContainerName `
    -name 'Business Central Cloud Migration - Previous Release' `
    -doNotSaveData `
    -doNotSaveSchema `
    -force

UnInstall-BcContainerApp `
    -containerName $MyContainerName `
    -name 'Business Central Cloud Migration - Previous Release (US)' `
    -doNotSaveData `
    -doNotSaveSchema `
    -force

UnInstall-BcContainerApp `
    -containerName $MyContainerName `
    -name 'Shopify Connector' `
    -doNotSaveData `
    -doNotSaveSchema `
    -force

# etcetera, add any app that you don't want

Now you can call this script from your NewContainer script and have a nicely trimmed container. It made a real difference for me; I can really notice the better performance. Very useful when you’re in the thick of coding and you need to deploy code changes frequently.

Free Training

There are SO MANY resources out there to learn about BC and AL development. Some of the best of them are totally For Freeeee!! As I always like to point out: free is in everybody’s budget. Today a video popped up in my YouTube timeline, one that I had recorded a few years ago, and I noticed that it had more than 30K views! Just an unreal number, and I want to share the story behind these videos.

It Starts…

Back in 2018 I was one of the owners of a well-known company (I’m gonna keep the name out of this post for personal reasons). We were working closely with Microsoft on the cutting edge of all the new technologies. We had developed the material for a number of workshops, and we all traveled to a bunch of different places all over the world to teach NAV people the intricacies of the new technology stack, new processes in the channel, and the philosophy that would become the path that we are now all traveling. Chances are that if you attended any event that was organized or sponsored by Microsoft, you have attended one of our workshops.

At some point, we were stretched quite thin. There were more requests for these workshops than we had staff to hold them. Microsoft then came up with the brilliant idea to record our workshops and create a series of videos that would be made available on PartnerSource.

Creating Content

We created a Very. Long. List. of topics, and the word came in from Microsoft: start creating content! The task of actually creating the videos fell to me, because of my demeanor during in-person classes and my pleasant baritone voice *ahem*. The truth is that most of my co-owners thought this would be a terribly tedious task, and I was the only one that was actually excited to record all of this material. Also, I had a LOT to learn, and this was a perfect opportunity to do just that. As far as I’m concerned, this was by far the coolest project I had ever taken on in my professional career. It was explained to me that Microsoft would provide the content, and all that I had to do was record the videos.

It would take a whole series of posts to tell the stories of creating the content, so let me just summarize. Despite assurances from one of my co-owners, nobody at Microsoft knew about the expectation that they would provide ANY sort of content, other than meeting with me to briefly discuss the outline and to answer questions. For a number of topics we had some content from short presentations at various events, but none of it was nearly good enough to make it into the videos. As it turned out, I had to create most of the material from scratch. A LOT of willing and eager Microsoft people spent what limited time they had with me, first to explain the basics and then to go over my material before I recorded it.

Let me just make one thing very clear. I enjoyed every single minute of the process of creating and recording the material. It was an absolute joy to work with every single person from the BC team, many of whom had to endure completely ignorant questions from me. I am super grateful to have had the opportunity to work with each and every one of them. I could not have done any of this without their help.

Over the course of 6-7 months, I created more than 20 hours of videos. The topics range from a condensed version of our 2-day AL Development workshop, to videos about how to get your app into AppSource, to automated testing, to source code management. I picked up a bunch of skills that I still benefit from today. It really was one of the best projects of my career.

The Academy That Never Was

The initial idea was that Microsoft would create some sort of ‘academy’ that would be accessible in PartnerSource. Partners would pay a fee to provide this training to their staff, of which our company would receive a percentage. All good with us, because there were BIG plans for the ISV Development Center, so we didn’t think we would have much time for in-person workshops anymore anyway.

Soon it became clear that this academy was not going to happen. There were calls from partners saying they would not want to pay for this, since they had never had to pay to attend any in-person events. At some point about half the videos were done, most of the material for the rest was ready to record, and the question was whether to continue recording or to stop the project.

There was talk of putting the videos on PartnerSource but not behind a paywall, which just made no sense to me at all. Most developers that I know don’t even have access to PartnerSource, so they would never even see it. Besides, if you are going to provide this content for free, why not just put all of it on YouTube? Just upload it for the public, and let anyone who wants to learn the skills it takes to make it as an AL developer do exactly that. Once I heard that the content was going to be made available for free, I went all in and told anyone that wanted to listen that it should be made available publicly.

To make a long story short, they did end up putting the videos on YouTube (they are all in a single playlist that you can access here).

Unexpected Impact

It’s been almost four years since I created these videos. Still today, every once in a while, videos from this playlist show up in my YouTube timeline, like today. It just struck me that this video had 30K (thirty THOUSAND!!!) views. I was just so surprised by how many people have watched videos that I created. Thinking about all the people that have learned these skills, partly as a result of listening to me explain them, just makes my head spin.

There are two things about these videos that I take full credit for. First, since I was responsible for the content, I had decided that I wanted to make proper full-length videos. Not condensed summary videos with a high-level view of the topic, but deep-down, detailed videos with ALL the information that you need to execute on that topic.

The second thing is getting the content onto YouTube. The project manager told me that my relentless lobbying to every person in the BC team was a key factor in getting these videos on YouTube. I was paid to create the videos, but getting them onto YouTube was done entirely in a community spirit. This is by far the most impactful contribution I have ever made to the BC community, and one that I am extremely proud of.

Microsoft the ISV

Microsoft published their new Shopify connector today. It’s great to see them invest resources into what is hopefully the gold standard of integrating with these external services. However, I have serious doubts about whether this is such a good idea.

ISV Partner Channel

In the runup to having BC in the cloud, the story was that the partner channel should refocus their efforts on becoming ISVs. Rather than building one-time bespoke systems for individual customers, Microsoft wanted the partner channel to create extensions that could be used by the masses.

This was (is?) a logical continuation of their story of verticalization that we had heard throughout the past two decades. In itself nothing that I don’t agree with. I too think that having re-usable extensions in a marketplace is a solid way to go. Microsoft’s argument was that they need to focus on the base product, a core set of functionalities. The partner channel would then be free to add functionality, to extend the base product.

There’s Just a Tiny Thing…

One thing that caught my ear was a statement that Microsoft does not want to provide specific, industry-focused expertise. They said they have no interest in building integrations with external systems. Rather than having Microsoft provide integrations or other specialized functionality, they would leave this up to the partner channel. There could be an ACME Rockets integration created and supported by an ISV, or even by ACME themselves.

Great soundbite showing great potential, and it sold well. Many partners listened to Microsoft and started creating lists of functionalities that they have know-how for. Many VARs dove right into their inventory of “add-ons” with the intent to turn those around into the next AppSource apps.

I know personally of three separate partners that have invested a lot of time and money into developing Shopify integrations. All three of those partners are LIVID with Microsoft today. The promise was that Microsoft would stay out of this type of functionality, and today’s release is one of an unknown number of apps that we will see come out of Microsoft.

Besides the fact that Microsoft is now on the hook for maintaining this app, they have effectively cut off the potential from the ISV channel. Their work in progress has essentially turned into a big fat tax write-off.

What Next?

Two out of those three partners had already been looking at alternatives to their NAV/BC practice, and I can’t say that I blame them. Licenses are no longer capital investments. Margins are going down with lower subscription fees, so you can no longer afford to focus on smaller businesses as clients. Having to go through a primary CSP means that you have to share what little margin remains. The stack has become much more complex, so you have to hire experts for everything.

One of the last things left is to develop your own IP and publish it on AppSource. Would you decide to invest in new products if there is a real chance that Microsoft is working on the same thing?

Personally, I think Microsoft is making a huge mistake by creating this type of app. I am not sure that they are capable of taking on the support, or that these apps will be maintained properly. I also worry about the cooling effect this will have on the partner channel’s willingness to invest in new products.

Most important though is that I am just flabbergasted that they prioritized something like this, when there are SO MANY things still left to improve in the base product.

As I am writing this, I am struggling to find a good way to finish this post; I’m clearly not done thinking about it. Let me know in the comments what you think.

Dynamic Enums

Although enums are static lists of values, there is a way to restrict which values can be used. With this post I will show you how to do that, and how you can dynamically set up how this happens.

Didn’t Think it was Possible

I didn’t think dynamic enums were possible, but I asked the question on Twitter anyway. The golden tip was a page property called ‘ValuesAllowed’. As I was figuring out how to use this property, I thought I’d write this blog post. When I returned to Twitter to post my findings, there were two more links to other people’s articles in the replies. I’ve since removed much of the detail from this post, since it would essentially be the same. Go and follow the links in the Tweet replies to read those details.

Both replies cover an essentially hard-coded way to restrict option values. I want to take this one step further, and provide a setup field that is used to manage the choices that you see. Now… I have to say I do NOT like using a static list of options for this purpose. We are still looking at a static list of values, and we are hardcoding what is visible. In my real-world scenario we had to put something in place quickly, though, and this was indeed a very quick ‘fix’.

Scenario

My actual scenario involved a rather controversial topic, so let’s use a silly scenario instead. We add a field to the Customer table called ‘Dessert Choice’, with an enum type that has 4 values: <blank>, Icecream, Cookies, and ‘Choice Declined’. You need an enum object, a table extension with a field based on the enum, and a page extension to add the field to the Customer Card. Let’s say you want to restrict the ‘Choice Declined’ option. Easy peasy, lemon squeezy: you add a ‘ValuesAllowed’ property to the field on the page extension, and you specify the values that you do want to allow.

In my real-world scenario, my client needed a way to restrict the available options for one company, and provide all of them in others. What we ended up doing was add a toggle to a setup table to turn this restriction on or off.

Show Me The Code

As per usual I was writing and writing, using SO MANY words to describe the situation, so I decided to just give you the page extension itself, assuming that you can figure out the fields that I am using.

pageextension 60000 CustomerCardDnStr extends "Customer Card"
{
    layout
    {
        addafter(Name)
        {
            field(DessertChoice; Rec.DessertChoice)
            {
                ApplicationArea = All;
                ToolTip = 'Specifies...';
                Visible = AllVisible;
            }
            field(RestrictedDessertChoice; Rec.DessertChoice)
            {
                ApplicationArea = All;
                ToolTip = 'Specifies...';
                ValuesAllowed = Blank, Icecream, Cookies;
                Visible = (not AllVisible);
            }
        }
    }

    var
        MySetup: Record MySetupDnStr;
        AllVisible: Boolean;

    trigger OnOpenPage()
    begin
        MySetup.GetRecordOnce();
        AllVisible := MySetup.AllowDecline;
    end;
}

Basically, you create multiple controls in the page extension for the same field, and you toggle visibility based on a field in a setup table. You could even do this at a record level in a list, by using an InDataSet variable and putting the code in the OnAfterGetCurrRecord trigger. Again, I’m thinking this should have been done with a table with actual functionality, but this way uses very little code, and we had to put something in very quickly. In case the supporting objects aren’t obvious, there’s a sketch of them below.
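
For completeness, here is roughly what the supporting objects could look like. This is a minimal sketch of my own (the object names and numbers are made up to match the page extension above), not the actual app:

enum 60000 DessertChoiceDnStr
{
    value(0; Blank) { Caption = ' '; }
    value(1; Icecream) { Caption = 'Icecream'; }
    value(2; Cookies) { Caption = 'Cookies'; }
    value(3; ChoiceDeclined) { Caption = 'Choice Declined'; }
}

tableextension 60000 CustomerDnStr extends Customer
{
    fields
    {
        // The field that both page controls are bound to
        field(60000; DessertChoice; Enum DessertChoiceDnStr)
        {
            Caption = 'Dessert Choice';
            DataClassification = CustomerContent;
        }
    }
}

table 60000 MySetupDnStr
{
    Caption = 'My Setup';
    DataClassification = CustomerContent;

    fields
    {
        field(1; PrimaryKey; Code[10]) { Caption = 'Primary Key'; }
        field(2; AllowDecline; Boolean) { Caption = 'Allow Decline'; }
    }

    keys
    {
        key(PK; PrimaryKey) { Clustered = true; }
    }

    procedure GetRecordOnce()
    begin
        // Simplified version of the standard setup table pattern:
        // make sure the single setup record exists before reading it
        if not Get() then begin
            Init();
            Insert();
        end;
    end;
}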

That’s it, nothing fancy. Not very clever, but useful in my client’s scenario.

Partial Records with SetLoadFields

Fetching and updating records has historically been the greatest culprit in performance problems. The standard way that BC retrieves records is very expensive, since it always gets ALL the fields of a table (and its brethren companion tables). This post covers a (relatively) new option called SetLoadFields, which you use to specify the fields that you want to retrieve.

So What’s the Problem?

The database engine for BC is SQL Server for OnPrem and Azure SQL for SaaS; the business logic translates database operations into T-SQL statements at run time. By default, it issues the equivalent of a SELECT *, which means that for every standard database call, BC retrieves ALL fields from the table. Good for us developers, because we never have to think about which fields to fetch. From a performance point of view, though, this causes a MASSIVE amount of superfluous data traffic. Some of the most used tables in BC have bazillions of fields, and in any business logic scenario you never need more than a handful of them.

The problem is exacerbated by the presence of table extensions. Each table extension is represented in the SQL database by a companion table that shares the primary key with the main table. Every time that you retrieve records from the main table, the system also retrieves the fields from the companion tables by issuing a JOIN on the PK fields. You can imagine a popular table like the Sales Line with a dozen table extensions; a SalesLine.FindSet command generates a SQL statement that includes a dozen JOINs.

The problem is that the number of fields included in a SQL statement has a disproportionate effect on query performance. Read the post that I link to below for more details, but what you need to know is that the same query with all fields can take hundreds of times longer than one that retrieves just half a dozen fields.

Only Get What you Need

To eliminate this overhead, we now have the SetLoadFields command. Basically, what you can do with this command is define the fields that are included in the SQL statement. Instead of getting all fields, and with them data that you will never use, you tell BC that you only want your handful of fields.

Need an address from a Vendor? The external document number from an invoice? An Inventory Posting Group from an Item? You don’t have to read 6 million fields to do that anymore.

procedure ShowSomeCode()
var
    Vendor: Record Vendor;
begin
    Vendor.SetLoadFields(Address,"Address 2",City,"Post Code");
    Vendor.Get('10000');
    // do stuff with the address fields and ignore the rest
end;

This code sample generates a SQL statement with a SELECT on just those few fields that are defined in the SetLoadFields command, and it should ignore the companion tables altogether since these are all main table fields.

Read the documentation here and make sure you read how to use it here. For more technical in-depth information on what to do and what happens under the hood (way above my head there), read Mads’ posts here and here.

New Habits

In my day-to-day life as a BC developer I don’t normally see SetLoadFields commands. I even checked the standard objects, and it’s actually quite surprising how little it is used there. In a previous life I did a LOT of performance troubleshooting, and this would have been a tremendous help in solving lots of performance problems. I know I will try to make using this command a habit, with something like the pattern below.
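
To sketch what I mean by that habit (my own example, with a made-up scenario against the standard Vendor table):

procedure ListVendorCities()
var
    Vendor: Record Vendor;
begin
    // Only these fields end up in the SELECT; fields that you only use
    // in filters don't need to be loaded, since filtering happens server-side
    Vendor.SetLoadFields("No.", Name, City);
    Vendor.SetRange("Country/Region Code", 'US');
    if Vendor.FindSet() then
        repeat
            Message('%1: %2 (%3)', Vendor."No.", Vendor.Name, Vendor.City);
        until Vendor.Next() = 0;
end;

Note that SetLoadFields goes before the Get/Find call. If you later touch a field you forgot to include, the platform will still fetch it for you, but that costs an extra round trip to the database.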

Create JSON in AL

Last year I developed an integration from Business Central to an external REST API. One of the things that I had to learn was how to deal with JSON. We now have a bunch of different JSON data types, and if you’re just getting into them, they are hard to keep apart. With this post I’ll try to explain, as simply as I can, how to create a JSON object in AL.

JSON Basics

First, of course, the basics.

  • JSON is short for ‘JavaScript Object Notation’. Follow this link to read the actual standard, but if you’re new to this, don’t; it will only confuse you
  • It’s kind of like XML in concept but much easier to read. No attributes, no formatting rules, just a collection of key/value pairs
  • The equivalent of an ‘XML document’ is called a ‘JSON Object’. The start and end of these JSON objects are marked by curly braces (these {}).
  • Keys are always between double quotes, and so are text values; numbers and booleans are not quoted. Keys and values are separated by a colon, and multiple key/value pairs are separated by commas. This is a very simple JSON object with some key/value pairs:
{
    "A Key": "Example One",
    "Another Key": "Another Value",
    "Number Value": 42,
    "Boolean Value": false
}
  • Nesting is done by adding a new JSON object as a value, like this:
{
    "A Key": "Example Two",
    "Nested thing": {
        "First nested Key": "nested value",
        "Second nested key": "some other value"
    }
}
  • A list of JSON objects is called a ‘JSON Array’. Typically each object in an array is structured the same, although JSON itself does not enforce this. The start and end of a JSON array are marked by square brackets. Each object has its own start/end curlies, and they are separated by commas. Here’s a simple example:
{
    "A Key": "Example Three",
    "List of Things": [
        {
            "name": "thing 1",
            "age": 10
        },
        {
            "name": "thing 2",
            "age": 12
        }
    ]
}

Note that the objects inside the array are identical in structure. That’s really it; you should now be able to decipher any JSON response from any web service.

JSON Data Types in AL

In AL, there are 4 different JSON data types. All of these are .NET types wrapped in AL data types. You don’t have to worry about constructing these variables; they take care of themselves. All you have to do is make sure that you start with a fresh one, so pay attention to the scope of your variables. If you are not sure, use the Clear keyword to empty the variable out.

  • JsonObject – this data type represents an actual JSON object. It has properties and methods, and you use those to build the object as shown in the code examples below
  • JsonArray – this data type is also an object with properties and methods, and it contains the stuff that is between the square brackets
  • JsonValue – this represents a single value of a simple data type like a Text, an Integer, or a Boolean
  • JsonToken – in the JSON world, the ‘Token’ is similar to a Variant, and it can be either one of the other three Json data types. If you are not sure about the content of a JSON element, you can always use a Token as a data type

Show Me The Code!

When I started this section I was going down a rabbit hole of long sentences and complicated explanations. As I was reading what I wrote I realized that describing these things actually was not making any sense at all. Let me just give you the code, and if you have any questions about this please leave a comment below.

Each example refers to the JSON examples in the ‘JSON Basics’ section above.

Example One

The first example is very straightforward. This object only has simple key/value pairs, so we can simply add those to the object, like this:

    procedure ExampleOne()
    var
        MyJobject: JsonObject;
    begin
        MyJobject.Add('A Key', 'Example One');
        MyJobject.Add('Another Key', 'Another Value');
        MyJobject.Add('Number Value', 42);
        MyJobject.Add('Boolean Value', false);
    end;

Example Two

The second example has another JSON object as one of its values. The ‘Add’ method of the JsonObject data type takes a string as the key name, and a JsonToken as the value parameter. In the first sample those were simple data types; for this second sample we’re going to create another object to use as the value parameter, like this:

    procedure ExampleTwo()
    var
        MyJobject: JsonObject;
        NestedJObject: JsonObject;
    begin
        MyJobject.Add('A Key', 'Example Two');

        NestedJObject.Add('First Nested Key', 'nested value');
        NestedJObject.Add('Second Nested Key', 'some other value');
        MyJobject.Add('Nested thing', NestedJObject);
    end;

Example Three

The final example includes an array of objects, so we’ll build those one step at a time. Everything is just coded inline here; I hope you can see how the array objects could be refactored into a re-usable function (there’s a sketch of that right after the example).

    procedure ExampleThree()
    var
        MyJobject: JsonObject;
        ThingJObject: JsonObject;
        MyJArray: JsonArray;
    begin
        MyJobject.Add('A Key', 'Example Three');

        Clear(ThingJObject);
        ThingJObject.Add('name', 'thing 1');
        ThingJObject.Add('age', 10);
        MyJArray.Add(ThingJObject);

        Clear(ThingJObject);
        ThingJObject.Add('name', 'thing 2');
        ThingJObject.Add('age', 12);
        MyJArray.Add(ThingJObject);

        MyJobject.Add('List of Things', MyJArray);
    end;
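
If you want that refactoring spelled out, a helper could look something like this (my own sketch, not part of the original examples):

    local procedure CreateThingObject(Name: Text; Age: Integer) ThingJObject: JsonObject
    begin
        // A fresh return variable on every call, so no Clear is needed
        ThingJObject.Add('name', Name);
        ThingJObject.Add('age', Age);
    end;

    procedure ExampleThreeRefactored()
    var
        MyJobject: JsonObject;
        MyJArray: JsonArray;
    begin
        MyJobject.Add('A Key', 'Example Three');
        MyJArray.Add(CreateThingObject('thing 1', 10));
        MyJArray.Add(CreateThingObject('thing 2', 12));
        MyJobject.Add('List of Things', MyJArray);
    end;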

What about JsonToken?

We did not use a variable of type JsonToken directly, because when we are creating a JSON object we already know what we have. Indirectly though, we definitely ARE using a Token. The ‘Add’ method of both the JsonObject and the JsonArray data types takes a Token as the value parameter. Since the Token can be any simple type or any other Json type, you can simply throw any of those in and it knows what to do with it.

The JsonToken type will come in very useful when you start processing incoming JSON objects and you need to make sure you correctly convert values into compatible data types in AL.
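
Just as a little preview, here’s a sketch (my own, simplified) of what reading a value back out of Example One could look like:

    procedure ReadExampleOne(JsonText: Text)
    var
        MyJobject: JsonObject;
        MyJToken: JsonToken;
    begin
        // ReadFrom parses raw text into the object; it returns false on invalid JSON
        if not MyJobject.ReadFrom(JsonText) then
            Error('Invalid JSON');

        // Get hands you the value as a JsonToken, and AsValue/AsInteger
        // convert it into an AL data type
        if MyJobject.Get('Number Value', MyJToken) then
            Message('The number is %1', MyJToken.AsValue().AsInteger());
    end;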

What’s Next?

These are just some very simple examples of how to build a JSON object, but this is really all the logic you’ll ever need to create your own. The JsonObject will take care of the curlies and the commas, and the JsonArray will take care of the square brackets and all the other stuff. The only slightly complex thing you’ll ever need to do beyond this is to take the JsonObject and set it as the body of an HttpRequest.
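
That last part is less work than it sounds. Here’s a hedged sketch of what it could look like (the URL is made up, and I’m leaving out error handling):

    procedure PostExampleOne()
    var
        Client: HttpClient;
        Content: HttpContent;
        Headers: HttpHeaders;
        Response: HttpResponseMessage;
        MyJobject: JsonObject;
        JsonText: Text;
    begin
        MyJobject.Add('A Key', 'Example One');

        // WriteTo serializes the JsonObject into text, which becomes the body
        MyJobject.WriteTo(JsonText);
        Content.WriteFrom(JsonText);

        // The default content type is text/plain, so replace it with application/json
        Content.GetHeaders(Headers);
        Headers.Remove('Content-Type');
        Headers.Add('Content-Type', 'application/json');

        Client.Post('https://example.com/api/things', Content, Response);
    end;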

Let me know in the comments if this is helpful. There are other super useful posts about this topic, but I wanted to explain it in my own words. I will also write one about how to read an incoming JSON object, and at some point I’ll share some helper logic that I’ve created to take care of conversions and such.

Insider Builds on Collaborate

As a Microsoft partner, you have to create a bunch of accounts to get into a bunch of systems. Microsoft is on a continuing mission to make things easier by putting most of it into a single portal called “Partner Center”. In this post I will focus on one particular thing – how to get access to insider builds of BC, which you need to validate your extension against future releases.

I’m writing this because the engagement was not listed for one of my clients. Microsoft support was no help, and I could not find this information in the obvious places. Just a quick post, mostly for my own reference, but I can’t imagine that I am the only one having difficulties getting into the program. Thanks to our little Twitterverse for helping me out.

Collaborate

First, you need access to Collaborate itself. From your Partner Center dashboard, you should see a rectangle with the word ‘Collaborate’; the shortlink is aka.ms/collaborate. If you do not have access yet, this is where you will find a link to submit a request to be admitted.

Insider Builds

To get to the insider builds, you have to be enrolled in an engagement called ‘Ready! for Dynamics 365 Business Central’. As a Business Central partner, this engagement should be listed; all you do is hit the ‘Join’ button, and you should get access within a couple of days.

Access to the insider builds is buried a little bit. Select the Ready! engagement and click the ‘Packages’ button. The one you need is the document called “Working with Business Central Insider Builds”. The thing that you need to get the actual build is the $sasToken. This value is what allows you to download the artifacts for the insider build; think of it as the password that validates that you are allowed to use those artifacts. By the way, don’t forget to take the time to read the material in there; it’s important information.

What if I don’t see the engagement

Normally, when you register as a development partner for Business Central, you are supposed to see the “Ready to go” program in the available engagements, and getting access is as easy as clicking the ‘Join’ button. For some reason, this does not always happen. The screenshot below shows the available engagements for one of my clients; as you can see, the program did not show up, so we could not click the Join button.

If for whatever reason you do not see the engagement, do not submit a support ticket from Partner Center. These tickets go to a central support group that does not know about program-specific issues, and you’ll spend weeks going back and forth. You’ll ask for access, and they will tell you to click the Join button. You’ll tell them there IS no Join button to click, and they’ll tell you to click the button anyway. This is not a knock on them; they just don’t seem to know the details.

Access to the BC programs is handled by the product team themselves, and their email is Dyn365BEP@microsoft.com. This will get your issue directly to the BC team, and they will know exactly what to do. Just provide your publisher name, your MPN ID, and the first name, last name, and email of the person that needs access. Once I did this, the issue was solved within a few days.

Adding new users

Once you have access to the engagement, you should be able to provide access to your own people by first adding them to the Partner Center account, and then by enrolling them into the engagement.

NAV on Docker in 2022

One of my clients asked me if I would be able to help them with an ‘upgrade’ of an add-on for Dynamics NAV for one of their customers. For this task I would have to get a working C/SIDE in a number of versions. It’s been years since I’ve done any C/AL development, and I thought this would be a cool task to work on. This post describes what I found does and does not work if you want to do this using Docker containers.

Background

Just to paint a picture… First of all, the end user is on NAV2017. They had an older version of this add-on, which was developed on NAV2013. The task at hand was to implement a newer version of the add-on, which was developed in NAV2018. Technically, this was a downgrade of the add-on objects so I had to be careful to avoid any incompatible object attributes. I won’t bore you with the details of the actual ‘upgrade’, nobody wants to read about those.

The Environments

To be able to identify the mods of the original add-on, I needed a C/SIDE environment for NAV2013R2. Since this version is not available in containers, I had to actually install it.

The end user is on NAV2017, and the ‘new’ version of the add-on is in NAV2018. Both of these versions are available in containers, and supposedly all you need to do is put together the correct artifact URL. You can find this information on Freddy’s blog. Mind you though, the localization for the US is called ‘na’ in NAV, not ‘us’ like in BC.

How about Docker?

What does and does not work? This, to me, is a pragmatic problem. I spent quite some time trying to make NAV2017 and NAV2018 work in containers, because I have used them successfully in the past. I have a terrible memory though, so I always start from scratch, and what I could find was outdated. At some point I just started the NAV2017 DVD download while I was researching a problem. The download completed before I found the answer, so I abandoned the container idea for NAV2017 and just installed it. I have plenty of VMs available, so one way or another, I’ll get a working instance.

After going through a bunch of troubleshooting and following obsolete download links, I was able to make the NAV2018 container work. Freddy wrote about troubleshooting here, but not all links still work. You have to enable .NET Framework 3.5 and 4 in the Windows features, you need the Visual C++ redistributable for Visual Studio 2015 (“The program can’t start because MSVCP120.dll is missing”), and you have to get working SQL bits (weird: the Windows client did work, but C/SIDE did not until I installed those). What made the ODBC errors go away for me was the SQL Server 2012 Native Client. The SQL link in Freddy’s blog did not work for me.

I could not get NAV2017 to work at all; it would not even start. As I said, I got the DVD downloaded before figuring out the problem, and I installed it from there. It’s not like I have NAV2017 clients lining up, so I did not want to spend a second more than I had to on this.

Lesson Learned

In the end I installed NAV2013R2 and NAV2017, and was able to get a NAV2018 container up and running. The lesson learned, though? I bet it is still possible to get the NAV2017 container to run right, but just to be safe I have downloaded all versions of the DVD going back to NAV 5.0. To Microsoft’s credit, they still have most of those available as downloads, but you never know when they will remove them.

This post is mostly for my own benefit, but I wanted to share it in case anyone out there also needs these. Let me know in the comments how you’ve made NAV and C/SIDE work in containers.

THE Book on Automated Testing in BC

It has taken well over a year to write, and a good 8 months to review (and revise, and rewrite certain parts), and it has been delayed almost 3 months. I am SUPER proud to say though that the second edition of THE BOOK on automated testing in Business Central has been published!

As a reviewer of course I knew this was coming, and Luc finally shared the news

What if I Already Have the First Edition?

Of course you do! You are a BC professional, so you’ve been adding automated testing to all of your projects right from the start, and you purchased Luc’s first book right when he wrote it. Still, there are a few reasons why you should purchase the second edition.

First of all, at 387 pages, the second edition is almost twice the book the first edition was, which counts *only* 206 pages. Luc has added about a metric ton of stuff to the second edition. Not just expanded information on existing topics, but chapters about completely new topics altogether.

Major Improvements

One of the things that I thought was lacking in the first edition was in-depth information about Test-Driven Development (TDD for short). The mechanics of automated testing were solidly covered, and I was able to apply this knowledge in my work. What I was missing was how to take this to the next level. I knew there was a large methodological body of work out there about TDD, but I did not know where to start looking for how it would be relevant to ME.

The second edition has filled that gap. Not only does Luc write eloquently about the methodology itself (there’s a whole chapter on TDD itself now), he puts it into the context of Business Central development specifically. He explains HOW you can use TDD in Business Central, he shows you step by step how to approach this, and he even provides a handy set of tools to support this methodology.

Luc spends a LOT of time writing about all aspects of TDD. More than just the nuts and bolts of creating test apps, he covers how to integrate automated tests in your daily development practice.

Advanced Topics

The brand new section called ‘Advanced Topics’ addresses some lesser known things, such as refactoring your code to create more re-usable components, utilizing standard components in more complex scenarios, the approach to testing web services, and even how to make YOUR code more testable.

In short, this second edition goes much more in-depth in just about every aspect of the book, plus it provides a wealth of information into a number of valuable topics that were not addressed in the first edition.

I am VERY proud to have played a part in writing this book, Luc did a phenomenal job in making the second edition a much more mature volume of THE book on automated testing in Business Central. Even if you already own the first edition, your money will not go to waste if you buy the second one.

Where can I get it?

Two places that I know of that you can get it:

  • The Packt Publishing website. Some people complain about delivery delays and such, but I don’t mind waiting a few days. When you get it from Packt directly, you have the option to get the book in print as well as e-book. The online reader on the Packt website is one of the best online readers I know. Plus, with online access you can start reading right away, so to me it’s well worth the handful of extra days it takes to deliver the print book.
  • Amazon, of course. Can’t beat the delivery time, but Amazon does not bundle print + eBook like Packt does.