Mastering BC 2.0

If you like enormous books that are chock-full of information on a wide variety of technical topics related to Business Central, you must drop everything that you are doing, RIGHT NOW, and let me tell you all about this book. I’ll even give you a link where you can order it.

Book Review Number Twelve

In early 2023 I got a message from Packt asking whether I would like to help review another book. Over the years I have reviewed a bunch of books for Packt (this is number 12). It had been a while since the last one, so I was excited to put on my reviewer hat. This one was going to be a new and updated edition of a book called “Mastering Microsoft Dynamics 365 Business Central”, a very popular reference book written by the dynamic Italian duo of Stefano Demiliani and Duilio Tacconi, both well-known members of the BC community. Do yourself a favor and go follow their Twitter feeds (this one and this one); I’ll wait here.

The first edition came out in 2019, so it was about time that we got an updated version. For some reason I never got around to buying the first one, so I came to this book fresh, without any preconceived notions about the content. Packt had sent me a table of contents with a whopping 18 (yes, EIGHTEEN – and yes, that is really spelled with just one ‘t’) chapters. VERY ambitious indeed…

It took most of 2023 to review these chapters. The sheer amount of content meant that it regularly took entire weekends to process all the material and the code that was written. The last chapter was submitted in mid-October, and the guys spent months processing all the feedback to finally release the book. They spared no effort to write the best book that they could, and it shows.

All the Things

You get 18 chapters on the most important technical topics in our BC world. I was going to write something specific about the content, but I’ll refer you to Stefano’s blog post to read more about that.

What I appreciate about this book is that each of the chapters goes into enough meaningful detail that you feel you have a good grasp of the topic at hand. At the same time, you never get the feeling that you are taken too far into the matter. Wherever relevant, a chapter comes with a repository of sample code to illustrate the topic. If you want, you can write your own code, and if you’d rather just follow along with the samples in the book, you can simply open the code and poke at it.

Don’t let the sheer size of this book discourage you. Yes, the book itself is MASSIVE, but the individual chapters are very manageable. There are other books that go into much more detail, but you will want to have this one to get yourself started on the topics. Another great feature is that you get access to all the book’s sample code in a GitHub repository.

My one piece of ‘negative’ feedback was that in many instances the chapters contain the full object’s code. Sometimes you get pages of source code, which I personally think is a distraction. When I read a book like this, I have VSCode open with the sample code, so I don’t need the full object in the book. I would have chosen to include only parts of the source code and highlight the important bits. When you only have the book in hand, though, it can be useful to see the whole object, so there’s an argument for whole objects anyway. Either way, you get lots of samples, and that makes me very happy.

What if I already have the First Edition?

The thing is, I have not read the first edition, so I can’t speak for the content of that book. What I CAN tell you is that BC has evolved a TON since 2019, and all of the new capabilities get their due attention in this second edition. Things like Telemetry and AI were not yet available in those days, and even well-established topics, such as VSCode extensions and how DevOps can be done, have gotten a serious update, along with a ton of other stuff.

If you have the First Edition, it is time to retire it. This Second Edition deserves a spot on every technical BC person’s desk, or at least within reach.

Where can I Get it?

As usual, there are two places where you can get it:

  • The Packt website – Packt’s online reader is really nice and searchable (I prefer it to Kindle), and there seems to be a discount on the eBook right now. Plus, being able to start reading right away is nice, because Packt is not known for the best delivery times.
  • Amazon – WAY better delivery times than Packt, so that’s where I would get it if I only wanted the paper version. The Kindle edition is more expensive than the eBook on Packt.

Mad respect and ALL the credit go to Stefano and Duilio. In addition to having families and successful careers, they did a phenomenal job of writing this book for us. It was a monster of a task, and they pulled it off. I am super proud to be a small part of making this the book that it is. Now click one of the two links and order the book!

PowerShell Duration

This is a really quick tip, mostly for myself, to save the script where I can easily get to it: a quick way to output the duration of a PowerShell script. When a script takes longer than expected, in my mind I am waiting HOURS for it to complete, when it is probably just a minute or two. I’ve run into this before, and I spend WAY too much time trying to locate a good, easy way to output the duration of a script. Without further ado, here’s the PowerShell code:

$startTime = Get-Date

<insert your script here>

$myTimeSpan = New-TimeSpan -Start $startTime -End (Get-Date)

Write-Output ("Execution time was {0} minutes and {1} seconds." -f $myTimeSpan.Minutes, $myTimeSpan.Seconds)

The key elements here:

  • I was looking for a ‘Time’ function and didn’t realize that in PowerShell you actually need to use the ‘Date’ object to access the time. Get-Date returns a DateTime value
  • Coming from BC, I was looking for something called ‘duration’, so it took quite a bit of time to find out about New-TimeSpan. This creates a ‘TimeSpan’ object that has its own members, which make it easy to compose a user-friendly message. I’m only using minutes and seconds here

Hopefully next time I need this I will remember to search my own blog 🙂

Create JSON in AL

Last year I developed an integration from Business Central to an external REST API. One of the things that I had to learn was how to deal with JSON. We now have a bunch of different JSON data types in AL, and if you’re just getting into them, they are hard to keep apart. With this post, I’ll try to explain, as simply as I can, how to create a JSON object in AL.

JSON Basics

First, of course, the basics.

  • JSON is short for ‘JavaScript Object Notation’. Follow this link to read the actual standard, but if you’re new to this, don’t; it will only confuse you
  • It’s kind of like XML in concept but much easier to read. No attributes, no formatting rules, just a collection of key/value pairs
  • The equivalent of an ‘XML document’ is called a ‘JSON Object’. The start and end of a JSON object are marked by curly braces (these {}).
  • Keys are always between double quotes; values are too, unless they have a non-text data type such as a number or a boolean. Key and value are separated by a colon, and multiple key/value pairs are separated by commas. This is a very simple JSON object with some key/value pairs:
{
    "A Key": "Example One",
    "Another Key": "Another Value",
    "Number Value": 42,
    "Boolean Value": false
}
  • Nesting is done by adding a new JSON object as a value, like this:
{
    "A Key": "Example Two",
    "Nested thing": {
        "First nested Key": "nested value",
        "Second nested key": "some other value"
    }
}
  • A list of JSON objects is called a ‘JSON Array’. Strictly speaking, JSON does not force each of the objects in an array to be structured the same, but in practice the objects in an API’s array almost always are. The start and end of a JSON array are marked by square brackets. Each object has its own start/end curlies, and the objects are separated by commas. Here’s a simple example:
{
    "A Key": "Example Three",
    "List of Things": [
        {
            "name": "thing 1",
            "age": 10
        },
        {
            "name": "thing 2",
            "age": 12
        }
    ]
}

Note that the objects inside the array are identical in structure. That is really all there is to it: you should now be able to decipher any JSON response from any web service.

JSON Data Types in AL

In AL, there are 4 different JSON data types. All of them are .NET types that are wrapped in AL data types. You don’t have to worry about constructing these variables; they take care of themselves. All you have to do is make sure that you start with a fresh one, so pay attention to the scope of your variables. If you are not sure about that, use Clear to empty the variable out.

  • JsonObject – this data type represents an actual JSON object. It has properties and methods, and you use those to build the object as shown in the code examples below
  • JsonArray – this data type is also an object with properties and methods, and it contains the stuff that is between the square brackets
  • JsonValue – this represents a single value of a simple data type like a Text, an Integer, or a Boolean
  • JsonToken – in the JSON world, the ‘Token’ is similar to a Variant, and it can be any one of the other three Json data types. If you are not sure about the content of a JSON element, you can always use a Token as its data type

Show Me The Code!

When I started this section, I was going down a rabbit hole of long sentences and complicated explanations. Reading back what I wrote, I realized that describing these things in prose was not making any sense at all. Let me just give you the code, and if you have any questions about it, please leave a comment below.

Each example refers to the JSON examples in the ‘JSON Basics’ section above.

Example One

The first example is very straightforward. This object only has simple key/value pairs, so we can simply add those to the object, like this:

    procedure ExampleOne()
    var
        MyJobject: JsonObject;
    begin
        MyJobject.Add('A Key', 'Example One');
        MyJobject.Add('Another Key', 'Another Value');
        MyJobject.Add('Number Value', 42);
        MyJobject.Add('Boolean Value', false);
    end;
Example Two

The second example has another JSON object as one of its values. The ‘Add’ method of the JsonObject data type takes a string as the key name and a JsonToken as the value parameter. In the first sample those values were simple data types; for this second sample we’re going to create another object to use as the value parameter, like this:

    procedure ExampleTwo()
    var
        MyJobject: JsonObject;
        NestedJObject: JsonObject;
    begin
        MyJobject.Add('A Key', 'Example Two');

        NestedJObject.Add('First Nested Key', 'nested value');
        NestedJObject.Add('Second Nested Key', 'some other value');
        MyJobject.Add('Nested thing', NestedJObject);
    end;
Example Three

The final example includes an array of objects, so we’ll build those one step at a time. Everything is just coded inline here; I hope you can see how the creation of the array objects could be refactored into a re-usable function.

    procedure ExampleThree()
    var
        MyJobject: JsonObject;
        ThingJObject: JsonObject;
        MyJArray: JsonArray;
    begin
        MyJobject.Add('A Key', 'Example Three');

        Clear(ThingJObject);
        ThingJObject.Add('name', 'thing 1');
        ThingJObject.Add('age', 10);
        MyJArray.Add(ThingJObject);

        Clear(ThingJObject);
        ThingJObject.Add('name', 'thing 2');
        ThingJObject.Add('age', 12);
        MyJArray.Add(ThingJObject);

        MyJobject.Add('List of Things', MyJArray);
    end;
What about JsonToken?

We did not use a variable of type JsonToken directly, because when we are creating a JSON object we already know what we have. Indirectly though, we definitely ARE using a Token. The ‘Add’ method of both the JsonObject and the JsonArray data types takes a Token as the value parameter. Since a Token can be any simple type or any other Json type, you can simply throw any of those in and it knows what to do with it.

The JsonToken type will come in very useful when you start processing incoming JSON objects and you need to make sure you correctly convert values into compatible data types in AL.
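To give you an idea, here is a minimal sketch of my own (not part of the original samples) that reads a value back out of the ‘Example One’ object, assuming the JSON arrives as plain text:

    procedure ReadNumberValue(JsonText: Text)
    var
        MyJobject: JsonObject;
        MyJToken: JsonToken;
    begin
        // ReadFrom parses the raw text into a JsonObject
        if not MyJobject.ReadFrom(JsonText) then
            Error('The text is not valid JSON.');

        // Get retrieves the value for a key as a JsonToken, because we cannot
        // know in advance whether it holds a value, an object, or an array
        if MyJobject.Get('Number Value', MyJToken) then
            Message('Number Value is %1', MyJToken.AsValue().AsInteger());
    end;

The Get method hands you a JsonToken, and it is up to you to convert it into the AL data type you need, with AsValue().AsInteger() in this case.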

What’s Next?

These are just some very simple examples of how to build a JSON object, but this is really all the logic you’ll ever need to create your own. The JsonObject will take care of the curlies and the commas, and the JsonArray will take care of the square brackets and all the other stuff. The only slightly more complex thing you’ll ever need to do beyond this is to take the JsonObject and set it as the body of an HTTP request.
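As an illustration of that last step, here is a minimal sketch of my own; the endpoint URL is a made-up placeholder, and your API may require authentication headers on top of this:

    procedure PostJsonObject(MyJobject: JsonObject)
    var
        Client: HttpClient;
        Content: HttpContent;
        ContentHeaders: HttpHeaders;
        Response: HttpResponseMessage;
        BodyText: Text;
    begin
        // Serialize the JsonObject into its text representation
        MyJobject.WriteTo(BodyText);
        Content.WriteFrom(BodyText);

        // Replace the default Content-Type header so the API knows it receives JSON
        Content.GetHeaders(ContentHeaders);
        ContentHeaders.Remove('Content-Type');
        ContentHeaders.Add('Content-Type', 'application/json');

        // The URL is a placeholder; use your API's endpoint here
        if not Client.Post('https://api.example.com/things', Content, Response) then
            Error('The call to the web service failed.');
    end;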

Let me know in the comments if this is helpful. There are other super useful posts about this topic, but I wanted to explain it in my own words. I will also write one about how to read an incoming JSON object, and at some point I’ll share some helper logic that I’ve created to take care of conversions and such.

THE Book on Automated Testing in BC

It has taken well over a year to write, and a good 8 months to review (and revise, and rewrite certain parts), and it has been delayed almost 3 months. I am SUPER proud to say though that the second edition of THE BOOK on automated testing in Business Central has been published!

As a reviewer I of course knew this was coming, and Luc finally shared the news.

What if I Already Have the First Edition?

Of course you do! You are a BC professional, so you’ve been adding automated testing to all of your projects right from the start, and you purchased Luc’s first book right when he wrote it. Still, there are a few reasons why you should purchase the second edition as well.

First of all, at 387 pages the second edition is almost twice the size of the first edition, which counts *only* 206 pages. Luc has added about a metric ton of material to the second edition: not just expanded information on existing topics, but also chapters about completely new topics altogether.

Major Improvements

One of the things that I thought was lacking in the first edition was more in-depth information about Test Driven Development (TDD for short). The mechanics of automated testing were solidly covered, and I was able to apply this knowledge in my work. What I was missing was how to take this to the next level. I knew there was a large methodological body of work out there about TDD, but I did not know where to start looking to find out how it would be relevant for ME.

The second edition has filled that gap. Not only does Luc write eloquently about the methodology itself (there’s a whole chapter on TDD itself now), he puts it into the context of Business Central development specifically. He explains HOW you can use TDD in Business Central, he shows you step by step how to approach this, and he even provides a handy set of tools to support this methodology.

Luc spends a LOT of time writing about all aspects of TDD. More than just the nuts and bolts of creating test apps, he covers how to integrate automated tests in your daily development practice.

Advanced Topics

The brand new section called ‘Advanced Topics’ addresses some lesser-known subjects, such as refactoring your code to create more re-usable components, utilizing standard components in more complex scenarios, approaching the testing of web services, and even making YOUR code more testable.

In short, this second edition goes much more in-depth on just about every aspect of the book, plus it provides a wealth of information on a number of valuable topics that were not addressed in the first edition.

I am VERY proud to have played a part in creating this book. Luc did a phenomenal job of making the second edition a much more mature volume of THE book on automated testing in Business Central. Even if you already own the first edition, your money will not go to waste if you buy the second one.

Where can I get it?

Two places that I know of where you can get it:

  • The Packt Publishing website. Some people complain about delivery delays and such, but I don’t mind waiting a few days. When you get it from Packt directly, you have the option to get the book in print as well as e-book. The online reader on the Packt website is one of the best online readers I know. Plus, with online access you can start reading right away, so to me it is well worth the handful of extra days it takes to deliver the print book.
  • Amazon, of course. You can’t beat the delivery time, but Amazon does not bundle print + eBook, and Packt does.

Import Media Files for SaaS

One of the standard ‘Problems’ when you’re in an AL workspace in VSCode is a warning that you are no longer allowed to use BLOB as the data type for images. This had been at the bottom of my priority list until I got a request to create a new image for a standard field. With this post I’ll show you how easy it is.

Media Field

The first element that you need is a field in the table. Instead of a BLOB field with subtype Bitmap, you now need a field of type ‘Media’. There is also a data type called ‘MediaSet’, but that’s not what we are going to use; go to Docs to read about the difference between Media and MediaSet. The field is not directly editable, because we will be importing the image through a function.

In addition to the field itself, you need a function to import an image file into the field. In the object below I have a simple table called ‘Book’ with a number, a title, and a cover. We use the ImportCover function to do the import, and implement it as an internal procedure so it can only be used inside the app. You can of course set the scope as you see fit.

table 50100 BookDnStr
{
    Caption = 'Book';
    DataClassification = CustomerContent;

    fields
    {
        field(10; "No."; Code[20])
        {
            Caption = 'No.';
            DataClassification = CustomerContent;
        }
        field(20; "Title"; Text[100])
        {
            Caption = 'Title';
            DataClassification = CustomerContent;
        }
        field(30; Cover; Media)
        {
            Caption = 'Cover';
            Editable = false;
        }
    }

    keys
    {
        key(PK; "No.")
        {
            Clustered = true;
        }
    }

    internal procedure ImportCover()
    var
        CoverInStream: InStream;
        FileName: Text;
        ReplaceCoverQst: Label 'The existing Cover will be replaced. Do you want to continue?';
    begin
        Rec.TestField("No.");
        if Rec.Cover.HasValue then
            if not Confirm(ReplaceCoverQst, true) then exit;
        if UploadIntoStream('Import', '', 'All Files (*.*)|*.*', FileName, CoverInStream) then begin
            Rec.Cover.ImportStream(CoverInStream, FileName);
            Rec.Modify(true);
        end;
    end;
}

Factbox for the image

Similar to how Item images have been implemented, you can create a factbox to show the book cover and add that to the Book Card. Using a factbox also makes it easy to keep the related actions close to the control.

page 50100 BookCoverDnStr
{
    Caption = 'Book Cover';
    DeleteAllowed = false;
    InsertAllowed = false;
    LinksAllowed = false;
    PageType = CardPart;
    SourceTable = BookDnStr;

    layout
    {
        area(content)
        {
            field(Cover; Rec.Cover)
            {
                ApplicationArea = All;
                ShowCaption = false;
                ToolTip = 'Specifies the cover art for the current book';
            }
        }
    }
    actions
    {
        area(processing)
        {
            action(ImportCoverDnStr)
            {
                ApplicationArea = All;
                Caption = 'Import';
                Image = Import;
                ToolTip = 'Import a picture file for the Book''s cover art.';

                trigger OnAction()
                begin
                    Rec.ImportCover();
                end;
            }
            action(DeleteCoverDnStr)
            {
                ApplicationArea = All;
                Caption = 'Delete';
                Enabled = DeleteEnabled;
                Image = Delete;
                ToolTip = 'Delete the cover.';

                trigger OnAction()
                begin
                    if not Confirm(DeleteImageQst) then
                        exit;
                    Clear(Rec.Cover);
                    Rec.Modify(true);
                end;
            }
        }
    }
    trigger OnAfterGetCurrRecord()
    begin
        SetEditableOnPictureActions();
    end;

    var
        DeleteImageQst: Label 'Are you sure you want to delete the cover art?';
        DeleteEnabled: Boolean;

    local procedure SetEditableOnPictureActions()
    begin
        DeleteEnabled := Rec.Cover.HasValue;
    end;
}

Add to the Page

All that is left is to add the factbox to the page where you have the import action. In this case I have a very simple Card page for the book, and the factbox is shown to the side.

page 50101 BookCardDnStr
{
    Caption = 'Book Card';
    PageType = Card;
    ApplicationArea = All;
    UsageCategory = Administration;
    SourceTable = BookDnStr;

    layout
    {
        area(Content)
        {
            group(General)
            {
                field("No."; Rec."No.")
                {
                    ToolTip = 'Specifies the value of the No. field.';
                    ApplicationArea = All;
                }
                field(Title; Rec.Title)
                {
                    ToolTip = 'Specifies the value of the Title field.';
                    ApplicationArea = All;
                }
            }
        }
        area(FactBoxes)
        {
            part(BookCover; BookCoverDnStr)
            {
                ApplicationArea = All;
                SubPageLink = "No." = field("No.");
            }
        }
    }
    actions
    {
        area(Processing)
        {
            group(Book)
            {
                action(ImportCover)
                {
                    Caption = 'Import Cover Art';
                    ApplicationArea = All;
                    ToolTip = 'Executes the Import Cover action';
                    Image = Import;
                    Promoted = true;
                    PromotedCategory = Process;
                    PromotedOnly = true;

                    trigger OnAction()
                    begin
                        Rec.ImportCover();
                    end;
                }
            }
        }
    }
}

This was a fun one to figure out. Let me know in the comments if it was useful to you.

Containers And Bacpacs

A while ago an ISV client of mine was working on getting their app into the Embed program. Part of this process was to upload a bacpac with certain characteristics. The characteristics themselves are not relevant for this post, but as I was helping them, I thought I’d write this quick post to share how you can extract bacpac files from a container, and how to use those bacpac files to create another container.

The setup

I’m starting out with a standard BC container called DenSterDev, which was created using BcContainerHelper. Coincidentally, I am also using BcContainerHelper to extract the bacpacs and to create the new container. I am using the C:\ProgramData\BcContainerHelper folder to store the bacpacs, because that folder is recognized both inside and outside of the container.

Extract Bacpac Files

The container is multi-tenant, so there are two databases that we care about: one is the app database, and the other is the tenant database. Both are necessary to create the new container. If you have any apps installed on top of the standard container, those will be included in the bacpac file for the app database; the bacpac for the tenant database contains the data itself.

The benefit of using BcContainerHelper is that we have very handy cmdlets to get all this stuff in and out of containers, and bacpacs are no exception. The command is very easy:

Export-BcContainerDatabasesAsBacpac `
    -containerName 'DenSterDev' `
    -tenant default `
    -sqlCredential $Credential `
    -bacpacFolder C:\ProgramData\BcContainerHelper `
    -doNotCheckEntitlements

The tenant name is ‘default’, the tenant that is created in each standard BC container. The sqlCredential is a PSCredential object that was created during the container generation, using a username and a secure string password. As stated above, the bacpacFolder is a folder that can be accessed both inside and outside of the container. The entitlement flag bypasses the entitlement check and prevents an error. When you execute this script, the bacpac files will show up in the bacpacFolder.

Create New Container from Bacpac

We are going to use these same bacpac files to create a new container. I’ll use the same container name:

New-BCContainer `
    -accept_eula `
    -containerName 'DenSterDev' `
    -artifactUrl '<ProperArtifactURL>' `
    -auth NavUserPassword `
    -assignPremiumPlan `
    -updateHosts `
    -accept_outdated `
    -Credential $Credential `
    -additionalParameters @('--env appbacpac=C:\ProgramData\BcContainerHelper\app.bacpac','--env tenantbacpac=C:\ProgramData\BcContainerHelper\default.bacpac')

Same as before, the -Credential parameter contains a PSCredential object. Note that the -additionalParameters argument must go on a single line in your PowerShell editor, even if it wraps on your screen here.

This command will download all the necessary artifacts and create the same container as the standard one. The only difference is that the app and tenant databases will be created from the bacpac files in your folder, instead of the standard databases from the artifact. You can follow along with the script in the terminal window.

Nothing earth-shattering, and made super easy by BcContainerHelper, but it took me a while to find the information and make this work. Hats off to Dmitry in the BC team; he was very patient with me as I got familiar with this process. Let me know in the comments if this was helpful or if you want to add anything.

Understanding OAuth Authorization Code

In this post, I write about my first experience developing an app that implements an integration between BC and an external API. If you’re like me, most of the web services stuff goes straight over your head. Sure, with some sample code and following along with demos in a class, I can get it to work. BUT… just ONE itsy bitsy teeny weeny tiny thingy goes wrong and you’re absolutely dead in the water. Thankfully I have friends who are willing to help, and I want to pass on this knowledge so others don’t have to spend days trying to figure it out.

The Flow Itself

The “Authorization Code Grant Flow” is just one of several OAuth flows, and one of the first to be implemented. Go here if you want to read more technical details, but if this is the first time you’re reading about this, I can guarantee you that you will not understand any of it; I know I didn’t 🙂

As with all OAuth flows, to get access to the actual API you must first get an access token. What makes the authorization code grant flow stand out from other OAuth flows is this:

  • There is an additional token that you must get before requesting the access token; this token is called the ‘Authorization Code’
  • In order to get this authorization code, a human being must enter the credentials. This flow is specifically designed NOT to provide any automated way to get the tokens

The MOST confusing thing is that although there is supposed to be an industry standard for REST APIs, it seems that each one has a slightly different way of connecting. One API that I worked on, for instance, had yet another token that was used for the API itself. As if this stuff isn’t hard enough to understand, some providers make it even more difficult. Most of them do make an honest effort to provide really good documentation, and in some cases even support forums.

The essential flow requires three things:

  1. API Credentials. You get these by signing up with the API provider, and they usually consist of a login ID and a ‘secret’, sometimes an additional password. They are meant to authenticate a human being logging into the API
  2. Authorization Code. This code is used to request the ‘Access Token’ that is used to get access to the API itself
  3. Access/Refresh Token. This is usually a set of two tokens. The Access Token is used for access to the API, and it usually has an expiration date/time. The Refresh Token is used to get a fresh Access Token. As long as you have a valid Refresh Token, you will not need to log back into the API

Each API that implements the authorization code grant flow will provide an endpoint for the authorization code, as well as an endpoint for the access and refresh tokens. You’d be surprised at how many ways this “standard” flow can be implemented though, so you’ll have to find the details yourself. Just hope that the API provider has good documentation.

Log In for the Authorization Code

You get credentials from the API provider, usually in the form of an ID and a ‘Secret’, sometimes with an additional password. For the authorization code grant flow, a human being is required to enter those credentials to authenticate the connection. In BC, the only way to get past this stage is by using a standard control add-in called ‘OAuthControlAddIn’. I’ve written another post that explains the details.

The control add-in provides the mechanics behind the login and the processing of the redirect response that comes back from the authorization endpoint, and it passes the authorization code itself back to AL through an event. You then take this authorization code and pass it to the token endpoint for the final piece.

Request The Access/Refresh Token

The token endpoint is the final step of the authentication process of the Authorization Code Grant Flow. Sometimes there is a separate endpoint for new tokens and another one for refreshing tokens. Other APIs have a single endpoint with two modes: one accepts the authorization code, the other accepts a refresh token, and both return a new token pair.

The API checks the validity of the access token in every single API call. It is up to you to make sure that your token is valid, and the API usually provides a straightforward way for you to keep track of this yourself, for instance by providing the expiration date/time as part of the token response.

Keep Your Tokens Fresh

The Access Token usually has an expiration date/time. Some tokens are valid for a short period, like 10 minutes; others have a longer shelf life. It is up to you to develop logic that checks the validity of your current tokens, and to request new tokens when they expire.

As long as your tokens are valid, you should not have to re-enter credentials, and there is no need for a new Authorization Code. The authorization code is only used when authenticating a fresh connection to the API. Once you’re past the authorization code stage, you should be able to keep the tokens fresh without having a human being log back in.

In AL, the most common way to store the tokens is through the isolated storage functionality. You can set the scope of isolated storage for the whole company, so that multiple users can share the API connection.
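As an illustration, here is a minimal sketch of my own (not the client code from this project) that stores and retrieves a refresh token at company scope; the key name is just a placeholder:

procedure SaveRefreshToken(NewRefreshToken: Text)
begin
    // DataScope::Company makes the token available to all users in the company;
    // consider IsolatedStorage.SetEncrypted instead when encryption is enabled
    IsolatedStorage.Set('RefreshToken', NewRefreshToken, DataScope::Company);
end;

procedure GetRefreshToken() RefreshToken: Text
begin
    if not IsolatedStorage.Get('RefreshToken', DataScope::Company, RefreshToken) then
        RefreshToken := '';
end;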

My Difficult Experience

One of the reasons why my experience was so difficult was the fact that the API I was working with had a third token, called ‘RestToken’. At the time I barely understood these codes and tokens, and then there was this other token that I could not find anywhere in the excellent training that I had followed. Lucky for me, I had some help and was able to understand what was causing the confusion.

My guess (and it really is only a guess) is that the API was initially developed with just this ‘RestToken’, and that at some point they built a wrapper around the API to comply with the OAuth “standard”.

The point is that each API has its own unique attributes, and its own way of implementing something that is supposed to be “standard”. Slowly but surely I started seeing the elements that make the flow work, and I had an excellent teacher who showed me a solid way to handle them in AL code. Normally I would share the code in these posts; at the moment, though, I only have the training material and the client production code, and neither one is mine to share. Maybe in the future, when I’m less busy, I’ll take some time to put some code together.

Let me know if this helps or not, I’d be happy to get your feedback and try to help if you need some.

OAuth JS Login

This post explains just the login part of the “Authorization Code Grant Flow”, one of the ways to get an OAuth access token from an endpoint. It has taken me WAY too much time to get this, and I had to get some help from my good friend AJ to explain it to me. This is one flow that you will not be able to use in a local container, since the callback URL must be accessible in SaaS.

Authorization Code Grant Flow

The Authorization Code Grant Flow is the most rudimentary OAuth flow. This post won’t explain the flow itself; I’ll write about that in other posts (if I ever get the courage to actually publish those), but I needed to get this part down while it’s still fresh in my mind.

The tricky part about this flow is that it requires a human being to enter credentials of the endpoint. With those credentials, you will get what is called the ‘authorization code’. This authorization code is then used to get the access/refresh tokens themselves.

OAuth Control Add-In

In standard Business Central, there is just one way to catch the authorization code, and that is through a standard JavaScript control called ‘OAuthControlAddIn’. This standard control add-in provides two essential things. First, it opens a login screen where a human being can enter the credentials. Second, it catches the authorization code response. The control add-in feeds the code back to AL through a trigger called ‘AuthorizationCodeRetrieved’.

Why Not Catch the HttpResponse Directly?

THAT, my friend, is a GREAT question, and please forgive me if I get the technical details of the answer wrong, because I barely understand this part myself. When the authorization code endpoint returns the response, it contains the redirect URL that you sent into the endpoint. The type of the response causes BC to automatically forward the response to the redirect URL, WITHOUT a way for you to intercept the response itself. In other words… the response that you see is not the initial response itself, but the response to the redirect call, and THAT response does NOT have the authorization code in it.

Using Postman or the REST client, you can turn off the auto redirect, but AL does not have a way to do that. Microsoft has decided not to allow us to intercept the initial response, so the only way to get the actual authorization code is to use the control add-in. It is the add-in that catches the code and provides it through the ‘AuthorizationCodeRetrieved’ trigger.

Only in SaaS

This automatic forwarding of the authorization code response is the reason why you can’t use this flow on a local container. The redirect URL must be available publicly, which requires your current connection to be in SaaS.

The standard redirect URL is ‘https://businesscentral.dynamics.com/oauthlanding.htm’. You can follow this URL and look at the page source, and you will see the JavaScript logic that catches the response. Since a local container does not provide this public access, the redirect will always fail, and you will not be able to catch the authorization code locally.

Here’s How It Works

On the page that provides the action to connect to the API, you add the following control to the content area within the layout section. Note the ‘AuthorizationCodeRetrieved’ trigger that calls another function called ‘GetNewTokens’, which is where we finally get the access/refresh tokens.

usercontrol(OAuthControl; OAuthControlAddIn)
{
    ApplicationArea = All;

    trigger ControlAddInReady()
    begin
        ControlAddInReady := true;
    end;

    trigger AuthorizationCodeRetrieved(AuthCode: Text)
    begin
        GetNewTokens(AuthCode);
    end;

    trigger AuthorizationErrorOccurred(AuthError: Text; AuthErrorDescription: Text)
    begin
        Error('%1 %2', AuthError, AuthErrorDescription);
    end;
}

To initialize the login procedure, you then call the ‘StartAuthorization’ method of the add-in. You could have a ‘Login’ action with a call to a ‘DoTheLogin’ function, like this:

local procedure DoTheLogin()
var
    ConnectionEstablishedMsg: Label 'The connection has already been established';
begin
    if (AccessToken = '') and (RefreshToken = '') then
        CurrPage.OAuthControl.StartAuthorization(GetAuthUrl())
    else
        Message(ConnectionEstablishedMsg);
end;
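The ‘GetAuthUrl’ function is not shown here. As a minimal sketch of my own, with a made-up endpoint and client ID, it could look something like this; check your API’s documentation for the exact query parameters it expects:

local procedure GetAuthUrl(): Text
var
    AuthUrlTok: Label 'https://api.example.com/oauth/authorize?response_type=code&client_id=%1&redirect_uri=%2', Locked = true;
begin
    // response_type=code tells the endpoint that we want an authorization code;
    // in a real app the redirect URI value should be URL-encoded
    exit(StrSubstNo(AuthUrlTok, 'MyClientId', 'https://businesscentral.dynamics.com/OAuthLanding.htm'));
end;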

The control add-in then fires the AuthorizationCodeRetrieved trigger, with the authorization code as a parameter, which you can then use to get the access/refresh tokens. This code IS part of the initial response, but the HttpClient in AL does not allow us to intercept that response without automatically following the redirect.
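Here is a minimal sketch of my own of what ‘GetNewTokens’ could look like, assuming a typical token endpoint that accepts a form-encoded POST. The URL, parameter names, and client values are all placeholders, and AccessToken/RefreshToken are the same page variables used in the DoTheLogin function above:

local procedure GetNewTokens(AuthCode: Text)
var
    Client: HttpClient;
    Content: HttpContent;
    ContentHeaders: HttpHeaders;
    Response: HttpResponseMessage;
    ResponseJson: JsonObject;
    JToken: JsonToken;
    ResponseText: Text;
begin
    // grant_type=authorization_code exchanges the authorization code for a token pair
    Content.WriteFrom(StrSubstNo('grant_type=authorization_code&code=%1&client_id=%2&client_secret=%3',
        AuthCode, 'MyClientId', 'MySecret'));
    Content.GetHeaders(ContentHeaders);
    ContentHeaders.Remove('Content-Type');
    ContentHeaders.Add('Content-Type', 'application/x-www-form-urlencoded');

    if not Client.Post('https://api.example.com/oauth/token', Content, Response) then
        Error('The token request failed.');

    // A typical response returns both tokens in a JSON body
    Response.Content.ReadAs(ResponseText);
    if not ResponseJson.ReadFrom(ResponseText) then
        Error('The token response is not valid JSON.');
    if ResponseJson.Get('access_token', JToken) then
        AccessToken := JToken.AsValue().AsText();
    if ResponseJson.Get('refresh_token', JToken) then
        RefreshToken := JToken.AsValue().AsText();
end;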

It took me SO LONG to understand how this works, and I could never have done it without AJ’s help. Reading this back now, I still don’t know if I am getting all the details correct, so I don’t blame you for not getting it. Leave me a message or a comment if this helps or not.

Browse Files in Docker

Use the Docker VSCode extension to browse files in your container

If you struggle using the command prompt to figure out where the files are in your Docker container, this post is for you. I will show you how easy it is to actually browse around the files inside your container.

The Struggle is Real

The first time that I sat behind a PC was in high school in the early 80s. At the time, the only way to ‘communicate’ with your computer was through a DOS prompt. If you were REALLY fancy, you had a .bat file that provided a menu, and you had to type the number and then hit enter to execute what was behind the number. We read how to do the cool things in paper magazines, because the only other resource was books at the library.

The command prompt was not my favorite, and I never really got into computers as much as you’d expect. Not until years later did I find myself working ‘in computers’, and at that time I tried to stick to GUI-based tools. For some reason, CLI-based tools are back in vogue (or I’m just recently discovering that this is where it’s at), and I find myself struggling to navigate. I kind of know how it works, but it’s difficult for me to keep straight where I am and where the connections are.

Finding Files in my Container

Up until now, the only way that I knew of to find files inside a Docker container was to use the command prompt. Using the BcContainerHelper module, you can connect to the container with the Enter-BcContainer <ContainerName> command. You can tell by the prompt when you are in the container.

The container has its own file system, with folders, just like your host computer. To make things easy, there are two Very Important Folders:

  • The ‘C:\run\my’ folder in the container is mapped to the ‘C:\ProgramData\BcContainerHelper\Extensions\<ContainerName>\my’ folder. This means that the files in those folders are shared by the host and the container, but the path in the container is NOT the same as the path on the host
    • NOTE: this is a container specific folder, so anything that you put into this folder will be deleted when you destroy the container
  • The ‘C:\ProgramData\BcContainerHelper’ folder is mapped to the same folder on the host. This means that the folder is also shared between the host and the container, PLUS the path is the same in both contexts
    • NOTE: As long as you have BcContainerHelper installed, any files (and additional folders) that you put into this folder will remain there, even when you remove containers. This is a perfect folder for sharing purposes.

This is important to understand, because you will probably use PowerShell scripts to do all sorts of things with containers, and you will need to read and/or write files to folders within the proper context.

A Better Way

The Docker extension for VSCode was updated this week, and it has a new feature that enables you to browse the files from inside VSCode.

This shot shows the folder structure inside my container

For me, this is a MUCH better way, because I find it very hard to keep track of where I am in the folder structure, and this gives me a bit more context. What I am missing is an easy way to tell which folders are shared, and what the path on the host is.

If you don’t have the Docker extension for VSCode yet, you can find it here. You can also search for it in the VSCode marketplace.

SQL Server and Docker

Learn how to use SQL Server to access the databases in your Docker container

This post is for you if you want to be able to access the SQL Server database inside your Docker container, without having to write the query.

For one of my projects, I needed to see which apps were uninstalled but still had their schema in the tenant database. That information is in the $ndo$navappuninstalledapp table, and with SQL Server Management Studio (SSMS) it is super easy to look at table data. For my container, I assumed that I would have to figure out a way to write the actual query (something I am not very good at). As it turns out, I was wrong. In this post I will explain two very easy ways to access the SQL Server database inside your Docker container.

SQL Server Management Studio

The first, most obvious, option is to do a complete install of SQL Server in whatever edition you have access to. If you want to keep things lean, though, you can also install standalone SSMS. You can download SQL Server Management Studio here.

Connecting to a Docker container could not be easier. In the connection dialog, simply enter the name of your container as the server name, and SSMS will connect to it. I always use NavUserPassword authentication in my containers, and by default your container password will also work as the sa password inside the container.

My container is called ‘densterdev’, and you can see the default app and tenant databases inside the container.

All I needed to do was look at some table data, and that works just fine. I did not try to do anything more advanced than that.

SQL Server Extension in VSCode

The second option is even easier than installing SSMS, because you are already using VSCode to do your AL development. Microsoft has created a ‘SQL Server’ extension, which works very similarly to SSMS. In the extensions search box in VSCode, type ‘SQL’ and select the one made by Microsoft.

After installing, you may need to reload VSCode to enable the extension. You will see a new tab on the left navigation pane with a tooltip that says ‘SQL Server’ when you hover over it. When you click this tab, you will see a heading that says ‘Connections’ at the top, with a + sign next to it. Click the + and follow the prompts. Just like in SSMS, you enter the container name as the server name, and it should connect with no problem.

The same app and tenant databases are shown inside VSCode

I still did not need to do anything more complex than looking at some data, so I really can’t say what features are available beyond that. My guess is that it is less capable than SSMS, so this may not be an option if you need more advanced capabilities.