Save Report Filters

Going back to the stone age, report objects have always remembered the filter values for the next time you run the report. What I never noticed was the capability to actually save filter values (yes, I now realize it’s been in there for years). It’s similar to saving a view of a list page, but for reports. It works a little hinky, but let me try to explain.

What I Am Talking About

You may have noticed it in some (but not all) reports, whether processing-only or actual printed ones. Right at the top of the request page, you see a dropdown box with the words “Last used options and filters”. Click the dropdown arrow, and you should see an option to “Select from full list”.

This opens the “Select – Report Settings” page. The page actually has nothing to do with any settings, but it does show you a list of saved values for the options and filters of the report object. In my mind, I call each of these a ‘view’.

I created a bogus empty report just to show you the options, so ignore the content of the report itself. The interesting part of this page is what you can do: when you click the ellipsis, you get a number of options.

  • Delete – obviously, deletes the currently selected view
  • New – creates a new record, where you can give the saved values a label and define whether the view is for all users. Assigning the view to a specific user is done on the list page itself
  • Copy – creates a new record with the same values as the one called “Last used options and filters”
  • Edit – this only works on custom views; it runs the report’s request page, where you can enter the option and filter values for the current view

You can assign the view to a specific user or share it with all users. If this capability is enabled for a report, you should be able to pick from the list of views right from the request page.

How To Enable This

You need to do two things to enable this capability.

  • First, you set the ‘SaveValues’ property on the report’s requestpage, as shown in the snippet below this list. If the report doesn’t have a requestpage, you can add one with just the property and no controls
  • Second, you need to run the report. When you first deploy the report with SaveValues turned on, it does not yet have a view saved. Run the report with any filter/option value and it should create this record for you
requestpage
{
    SaveValues = true;
}

A Little Hinky

To me it feels a little unfinished, like it was rushed into the system. To begin with, the labeling is inconsistent. On the requestpage it is called “Use default values from”, which is incorrect: this is about ‘saved’ values, not ‘default’ values. Then, when you open the full list, it is labeled “Select – Report Settings”, which again doesn’t feel right; I don’t consider filter values a ‘setting’. Then, when you click ‘New’, it opens a screen captioned “Edit – Pick Report”… I mean, come on… Finally, having to run the report just to get the dropdown to appear is not very user friendly. When I first tested this, I thought I had done something wrong because the dropdown would not show.

Another big drawback is that the SaveValues property is not available in Report Extensions. You’ll have to copy the standard object in order to provide the capability.

Regardless of its hinkiness and shortcomings, the feature itself is great, especially when you have periodic activities to run for different sets of filters. It’s really nice to be able to preconfigure sets of option/filter values. It has the potential to increase productivity and eliminate typos when entering filters. In my opinion, this feature should be enabled by default.

Document Attachments

One of my clients asked me to add Document Attachments to Bank Deposits. Thinking this was a quick and easy one, I added the factbox, set the link, and went on with my day. When my client said they could see all attachments for all records, I realized it was a little more involved. It took some time to figure out how it actually works, and this post explains the whole thing, including how to carry document attachments through the posting process.

How Does it Really Work?

The reason it’s not so simple is that Document Attachments work with a RecRef instead of a hardcoded table relationship. Take a look at the Document Attachment table (table number 1173 in the base app) for the field definitions. I’ll focus on single-field PK records in this post, i.e. tables where a single Code[20] field is the unique identifier of the record. The more complex compound PKs work the same way; you just need to set more fields.

Standard BC has a limited number of tables that have document attachment capabilities. If you want to add another table, whether a custom table or another standard table, you will need to subscribe to some events to make that work. Let’s first look at how standard document attachments work.

Open Document Attachments

Let’s take a look at the Customer Card page in standard BC. The “Attachments” action has the following OnAction trigger code:

trigger OnAction()
var
    DocumentAttachmentDetails: Page "Document Attachment Details";
    RecRef: RecordRef;
begin
    RecRef.GetTable(Rec);
    DocumentAttachmentDetails.OpenForRecRef(RecRef);
    DocumentAttachmentDetails.RunModal();
end;

The “Document Attachment Details” page is where the magic happens. If you drill down into the ‘OpenForRecRef’ function, you will see that it takes a RecRef variable (which in this example points at the current Customer record) and sets filters on the table ID of the RecRef and on the value of the record’s PK field. This is the function that defines all the tables in the standard BC app that have Document Attachments.

The important thing in this function is the call to OnAfterOpenForRecRef, an event publisher that we will subscribe to later. This event gives you the capability to set a filter for any table. Note that this is just a filter on a page; all it does is make sure that you only see the document attachments for this particular record.

The Document Attachments Factbox

Another way to give the user access to the document attachments is the “Document Attachment Factbox” that you can see in the factboxes area. In the standard app on the Customer Card, the SubPageLink property links the “Table ID” field to the hardcoded value 18 (the Customer table’s object ID). When you create your own link, you should use the table name instead of its number. Since we will be linking to the “Bank Deposit Header” table, our constant value will be Database::"Bank Deposit Header".

Take a look at the OnDrillDown trigger of the NumberOfRecords field in the factbox. It first sets up the RecRef, which is again hardcoded for the standard range of tables that have document attachments. Then it executes the same logic on the Document Attachment Details page as the action mentioned above.

Note the OnBeforeDrillDown function call in the ‘else’ leg of the ‘case’ statement. This is the event that you need to subscribe to in order to properly filter the document attachments for non-standard tables.

What is important to understand about this particular factbox is that records are not entered directly into the factbox. You have to click an action that adds the record, and as a result the “No.” field in the factbox is NOT populated by the page link.

Creating New Document Attachments

So far we’ve only looked at how to display the proper document attachments to the user. What’s left is how BC actually stores these records. The tricky part is not getting the file itself, but how BC gets the values from the RecRef. The function that stores the values from the RecRef into the document attachment is called InitFieldsFromRecRef. You can see that this function again goes through the same hardcoded standard BC tables we’ve seen before, and it provides an event publisher called OnAfterInitFieldsFromRecRef that you can use for additional tables.

Document Attachments for Additional Tables

Alright, now let’s put it all together for a new table. I recently added this to Bank Deposits for a client of mine, so I’ll use the same table to illustrate. I’ll focus on the new “Bank Deposit Header” and its posted sibling. The new implementation of bank deposits can be found in the “_Exclude_Bank Deposits” app that is part of standard BC.

To get started, create a page extension for the Bank Deposit and the Posted Bank Deposit pages, and add the Document Attachment factbox. You can copy this from the Customer Card and change the links appropriately; a sketch follows below. Note that the records you see when you drill down into the details are filtered on the table but not on the PK value of the record you are looking at. In other words, just adding this factbox gives you a list of all the Document Attachments that are linked to ALL (Posted) Bank Deposits.
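As a minimal sketch (the object name and the 60000 number range are mine, and I’m assuming the card page is called “Bank Deposit”; the factbox part mirrors the one on the Customer Card), the unposted side could look like this:

pageextension 60000 BankDepositDocAtt extends "Bank Deposit"
{
    layout
    {
        addlast(FactBoxes)
        {
            part("Attached Documents"; "Document Attachment Factbox")
            {
                ApplicationArea = All;
                Caption = 'Attachments';
                // Link on the table only; the "No." filter comes from
                // the event subscribers further down in this post
                SubPageLink = "Table ID" = const(Database::"Bank Deposit Header");
            }
        }
    }
}

Repeat the same for the posted page, with Database::"Posted Bank Deposit Header" as the constant.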

Finally, create a new codeunit; I called mine ‘DocAttachmentSubs’.

Filter the Details

First, we need to make sure that the “Document Attachment Details” page is filtered on the correct Bank Deposit number. For this we subscribe to the OnAfterOpenForRecRef event.

[EventSubscriber(ObjectType::Page, Page::"Document Attachment Details", 'OnAfterOpenForRecRef', '', true, true)]
local procedure DocAttDetailsPageOnAfterOpenForRecRef(var DocumentAttachment: Record "Document Attachment"; var RecRef: RecordRef)
var
    MyFieldRef: FieldRef;
    RecNo: Code[20];
begin
    if RecRef.Number in [Database::"Bank Deposit Header", Database::"Posted Bank Deposit Header"] then begin
        MyFieldRef := RecRef.Field(1); // field 1 is the "No." field in both tables
        RecNo := MyFieldRef.Value();
        DocumentAttachment.SetRange("No.", RecNo);
    end;
end;

Filter the Factbox

Next, we need to make sure that the details are filtered properly when the user clicks the drilldown in the document attachment factbox. For this, we subscribe to the OnBeforeDrillDown event in the factbox.

[EventSubscriber(ObjectType::Page, Page::"Document Attachment Factbox", 'OnBeforeDrillDown', '', true, true)]
local procedure DocAttFactboxOnBeforeDrillDown(DocumentAttachment: Record "Document Attachment"; var RecRef: RecordRef)
var
    BankDepositHeader: Record "Bank Deposit Header";
    PostedBankDepositHeader: Record "Posted Bank Deposit Header";
begin
    case DocumentAttachment."Table ID" of
        Database::"Bank Deposit Header":
            begin
                RecRef.Open(Database::"Bank Deposit Header");
                if BankDepositHeader.Get(DocumentAttachment."No.") then
                    RecRef.GetTable(BankDepositHeader);
            end;
        Database::"Posted Bank Deposit Header":
            begin
                RecRef.Open(Database::"Posted Bank Deposit Header");
                if PostedBankDepositHeader.Get(DocumentAttachment."No.") then
                    RecRef.GetTable(PostedBankDepositHeader);
            end;
    end;
end;

So now, when the user clicks the drilldown, BC sets the RecRef to point at the (Posted) Bank Deposit, which is then sent into the details page, which in turn knows how to filter properly.

Set the Right Link

The last thing we need is to make sure that new document attachments get the (Posted) Bank Deposit number. We do this by subscribing to the OnAfterInitFieldsFromRecRef event of the Document Attachment table itself.

[EventSubscriber(ObjectType::Table, Database::"Document Attachment", 'OnAfterInitFieldsFromRecRef', '', true, true)]
local procedure DocAttTableOnAfterInitFieldsFromRecRef(var DocumentAttachment: Record "Document Attachment"; var RecRef: RecordRef)
var
    MyFieldRef: FieldRef;
    RecNo: Code[20];
begin
    if RecRef.Number in [Database::"Bank Deposit Header", Database::"Posted Bank Deposit Header"] then begin
        MyFieldRef := RecRef.Field(1); // field 1 is the "No." field in both tables
        RecNo := MyFieldRef.Value();
        DocumentAttachment.Validate("No.", RecNo);
    end;
end;

You should now be able to create new document attachments for unposted and posted bank deposits. All that’s left is to get document attachments to flow through the posting process.

Posting

Document Attachments are not intrinsically difficult. In the end they are just records in the database, identified by their Table ID and their PK values. For Bank Deposits the PK is a single Code[20] field, and there are two versions of the table. All we need to do is write a little loopyloopy that reads the records for the unposted record, copies them to the posted record, and gets rid of the old ones. It would be cool to just change the existing records, but since both the Table ID and the record identifier are part of the PK, you can’t do a ‘Rename’ without sitting there clicking confirmations all day long.

For Bank Deposits, you can use the OnAfterBankDepositPost event in the “Bank Deposit-Post” codeunit.

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Bank Deposit-Post", 'OnAfterBankDepositPost', '', true, true)]
local procedure BankDepositPostOnAfterBankDepositPost(BankDepositHeader: Record "Bank Deposit Header"; var PostedBankDepositHeader: Record "Posted Bank Deposit Header")
begin
    MoveAttachmentsToPostedDeposit(Database::"Bank Deposit Header", BankDepositHeader."No.",
                                    Database::"Posted Bank Deposit Header", PostedBankDepositHeader."No.");
end;

local procedure MoveAttachmentsToPostedDeposit(FromTableId: Integer; FromNo: Code[20]; ToTableId: Integer; ToNo: Code[20])
var
    FromDocumentAttachment: Record "Document Attachment";
    ToDocumentAttachment: Record "Document Attachment";
begin
    FromDocumentAttachment.SetRange("Table ID", FromTableId);
    FromDocumentAttachment.SetRange("No.", FromNo);

    if FromDocumentAttachment.FindSet() then begin
        repeat
            Clear(ToDocumentAttachment);
            ToDocumentAttachment.Init();
            ToDocumentAttachment.TransferFields(FromDocumentAttachment);
            ToDocumentAttachment.Validate("Table ID", ToTableId);
            ToDocumentAttachment.Validate("No.", ToNo);
            ToDocumentAttachment.Insert(true);
        until FromDocumentAttachment.Next() = 0;
        FromDocumentAttachment.DeleteAll();
    end;
end;

I have this in a separate function because I also had to make it work for the old Deposit implementation. You could totally combine this into a single event subscriber.

I’m thinking about creating a video about this topic. Let me know in the comments if this was useful and whether you’d like to see the video.

Obsolete But Not Gone

This post was born out of a bit of an embarrassing situation, in which I had to obsolete some fields in an AppSource app. Just a quick one to let you know what the official process is, and also that you don’t have to wait for the next major version of BC to fully obsolete a field.

The ‘Situation’

This app has table extensions for the Sales Line table and the posted sales line tables (invoice and credit memo). Not until their first live implementation did we find out that the field numbers in the credit memo line table were not aligned, at which point BC yells at you that the field types are not compatible. Oops….

For those who don’t understand, let me explain. Field values in the Sales Line table are copied to their posted counterparts by a command called ‘TransferFields’, which basically copies all the field values from the source table into the target table. A couple of things to note: the fields must have the same field numbers, and they must have the same data types. In our case, we had fields 1 and 2 in the Sales Line table extension, while their corresponding fields in the posted credit memo line extension had field numbers 2 and 3. The posting process tried to put the value of field number 2 into field number 1, with the predictable result of a data type mismatch.
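To make the alignment requirement concrete, here is a hedged sketch (object names and the 50100 number range are mine): TransferFields matches fields by number, so the extension fields must use identical numbers and data types in both tables.

tableextension 50100 SalesLineExt extends "Sales Line"
{
    fields
    {
        field(50100; MyCode; Code[20]) { }
        field(50101; MyDate; Date) { }
    }
}

tableextension 50101 SalesCrMemoLineExt extends "Sales Cr.Memo Line"
{
    fields
    {
        // Same numbers and data types as in the Sales Line extension,
        // so TransferFields can copy the values during posting
        field(50100; MyCode; Code[20]) { }
        field(50101; MyDate; Date) { }
    }
}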

Why is this a problem? Can’t you just change the field number? Well, no: once an app is published on AppSource, you are not allowed to make any ‘breaking changes’, and renumbering a field is considered a breaking change. It doesn’t matter how I personally feel about that (I don’t agree that field numbers are breaking), so we had no choice but to use the obsoletion process.

As a side note: yes, this exposed a serious issue in our process. Posting a credit memo had clearly not been tested, and matching field numbers is something that a BC developer with my experience should never get wrong. To my client’s credit: no fingers were pointed, we addressed the issue, and we shared the cost of fixing it.

Obsoleting a Field

Alright, so the official process for obsoleting a field is described in the ‘deprecation guidelines’. Skip the preprocessing piece if you have not seen that before, and focus on the steps for making code obsolete. There are plenty of blog posts that explain the process itself, so I will not go into any detail.

My main concern was how fast we could get this into AppSource. The process for making a field obsolete has two stages (a sketch of the properties involved follows the list):

  • First, the ObsoleteState of the field is set to ‘Pending’.
    • This means that the field has been marked, but it can still be used in the app
    • The purpose of this stage is to flag the code to any party that has an extension of the code, so that they can take steps to address the change
    • All references to the field will show up in the VS Code Problems window as warnings. There are also Reason and Tag properties that record why the field is obsolete and in which BC version the change will become permanent
  • Second, the ObsoleteState of the field is set to ‘Removed’.
    • In this state, the field still exists, but it can no longer be used
    • All references to the field will show up as errors in the Problems window
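At the field level, it looks like this (a minimal sketch; the names, reason text, and version tag are made up for illustration):

tableextension 50102 MyAppSalesLineExt extends "Sales Line"
{
    fields
    {
        field(50100; MyOldCode; Code[20])
        {
            ObsoleteState = Pending;   // changed to Removed in the second stage
            ObsoleteReason = 'Field number does not match the posted tables; use MyNewCode instead.';
            ObsoleteTag = '21.0';      // the version tag mentioned in the list above
        }
    }
}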

There is a perfectly valid reason why there are two steps. My concern was how long it would take us to get the fields removed. The documentation does not address the required AppSource timeline, and I could not find any definitive answers on Yammer. The only timeline reference I could find in any Microsoft documentation was that code must be in the ‘Pending’ state for one entire major version. This issue came up in early October, days after 2022 wave 2 was released. If that were true, we would have to wait until 2023 wave 1 (April next year!!) to get the fields fully removed.

What We Did

Some partners assured me that there is no mandatory wait time of a whole major cycle. The intention of that timeline is not to limit partners; it is a practice that Microsoft itself uses, to make sure that partners always have at least one full release cycle to address any compatibility issues due to obsolete code in the base app.

So, with that in mind, we went to work. What saved us is that this particular app had only one live implementation, so all we had to do was make sure that they upgraded to the ‘Pending’ version as soon as humanly possible. Depending on the number of live implementations you have, this could take longer. I actually don’t even know if there is a dashboard where you can see the live versions that are in use.

I have to say that I was very impressed with how smooth the process of pushing a new version of an app to AppSource is now. After properly testing the change, we pushed the ‘Pending’ app to AppSource, and it was done and ready to publish within a day. The end user then upgraded the app, and we pushed the ‘Removed’ app to AppSource right away. We were able to address the issue within days, and the client lived happily ever after.

Repair Companion Tables

One of my clients has been seeing weird data issues when migrating databases to the cloud. The most important purpose of this post is to point out a handy data tool that is buried deep in the application; read on to learn what I am talking about.

The problems that my client is experiencing mostly have to do with the various ways in which they get data from OnPrem to the cloud. One way is to use the cloud migration tools; another is configuration packages. Beyond the different tools, it feels like there may be some issues with the BC platform’s ability to properly manage data in table extensions. I’ll tell you why I think that.

Missing Data

The first indication was a problem where invoices were migrated into the cloud. The migration process seems to have completed, and the posted invoice list shows a list of invoices. The weird part is that when you try to open an invoice, you get an empty invoice page. Click ‘View Table’ in the page inspector and you get nothing: a list of zero records, even though the posted invoice list shows them. Go to the admin center to look at the capacity, and it tells us there are something like 1,154 invoices, but when you drill down into that number you get another empty list.

This feels very familiar to me, very much like the days when a lot of companies tried to migrate straight into SQL Server and we would see null field values. As we all know, BC can’t handle null values. Instead of getting an error saying ‘there are null values!! I don’t know what to do!!’, you get weird behavior like empty lists for tables that you know have plenty of records.

Solving The Problem

With the explosion of functionality moving into separate apps, I had a feeling that the problem had something to do with table extensions (which, as you know, store their added fields in so-called ‘companion tables’). I posted the question on Twitter, and there were suggestions to uninstall and re-install apps. This worked sometimes, but not all the time.


What it feels like to me is that either there are null values in records, or records are missing from companion tables altogether. We don’t have access to Azure SQL, so there is no way for me to actually prove that there is an issue with table extensions. SOMETHING is wrong here though, and for the longest time the only way I knew to fix it was to uninstall/re-install apps until the issue was resolved. The reason this works is that each time you install an app, it updates the schema and makes sure that data integrity remains intact. In other words, it makes sure that all records in the main tables have corresponding records in all companion tables.

And then I became aware of a VERY handy little tool. It is not available when you are working in local containers, but in a cloud tenant you will have access to it. I’m talking about the “Repair Companion Table Records” process on the “Cloud Migration Management” page.

The tool goes through all tables that have table extensions and makes sure that each record in an extended table has a corresponding record in each companion table. I still can’t prove my theory, but I do know that running this process fixed the problem for my client.

Free Training

There are SO MANY resources out there to learn about BC and AL development, and some of the best of them are totally For Freeeee!! As I always like to point out: free is in everybody’s budget. Today a video popped up in my YouTube timeline that I had recorded a few years ago, and I noticed that it had more than 30K views! Just an unreal number, and I want to share the story behind these videos.

It Starts…

Back in 2018 I was one of the owners of a well-known company (I’m gonna keep the name out of this post for personal reasons). We were working closely with Microsoft on the cutting edge of all the new technologies. We had developed the material for a number of workshops, and we all traveled to a bunch of different places all over the world to teach NAV people the intricacies of the new technology stack, new processes in the channel, and the philosophy that would become the path that we are now all traveling. Chances are that if you attended any event that was organized or sponsored by Microsoft, you have attended one of our workshops.

At some point, we were stretched quite thin. There were more requests for these workshops than we had staff to hold them. Microsoft then came up with the brilliant idea to record our workshops and create a series of videos that would be made available on PartnerSource.

Creating Content

We created a Very. Long. List. of topics, and the word came in from Microsoft: start creating content! The task of actually creating the videos fell to me, because of my demeanor during in-person classes and my pleasant baritone voice *ahem*. The truth is that most of my co-owners thought this would be a terribly tedious task, and I was the only one who was actually excited to record all of this material. Also, I had a LOT to learn, and this was a perfect opportunity to do just that. As far as I’m concerned, this was by far the coolest project I had ever taken on in my professional career. It was explained to me that Microsoft would provide the content, and all I had to do was record the videos.

It would take a whole series of posts to tell the stories of creating the content, so let me just summarize. Despite assurances from one of my co-owners, nobody at Microsoft knew about the expectation that they would provide ANY sort of content, other than meeting with me to briefly discuss the outline and answer questions. For a number of topics we had some content from short presentations at various events, but none of it was nearly good enough to make it into the videos. As it turned out, I had to create most of the material from scratch. A LOT of willing and eager Microsoft people spent what limited time they had with me, first explaining the basics and then going over my material before I recorded it.

Let me just make one thing very clear. I enjoyed every single minute of the process of creating and recording the material. It was an absolute joy to work with every single person from the BC team, many of whom had to endure completely ignorant questions from me. I am super grateful to have had the opportunity to work with each and every one of them. I could not have done any of this without their help.

Over the course of 6-7 months, I created more than 20 hours of video. The topics range from a condensed version of our 2-day AL Development workshop, to how to get your app into AppSource, to automated testing, to source code management. I picked up a bunch of skills that I still benefit from today. It really was one of the best projects of my career.

The Academy That Never Was

The initial idea was that Microsoft would create some sort of ‘academy’ that would be accessible in PartnerSource. Partners would pay a fee to provide the training to their staff, of which our company would receive a percentage. All good with us, because there were BIG plans for the ISV Development Center, so we didn’t think we would have much more time for in-person workshops anyway.

Soon it became clear that this academy was not going to happen. Partners were saying that they would not want to pay for this, since they had never had to pay to attend any in-person events. At some point about half the videos were done, most of the material for the rest was ready to record, and the question was whether to continue recording or to stop the project.

There was talk of putting the videos on PartnerSource but not behind a paywall, which made no sense to me at all. Most developers I know don’t even have access to PartnerSource, so they would never even see them. Besides, if you are going to provide this content for free, why not just put all of it on YouTube? Upload it for the public and let anyone learn the skills they need to make it as an AL developer. Once I heard that the content was going to be made available for free, I went all in and talked to anyone who would listen about making it publicly available.

To make a long story short, they did end up putting the videos on YouTube (they are all in a single playlist that you can access here).

Unexpected Impact

It’s been almost four years since I created these videos. Still, every once in a while, videos from this playlist show up in my YouTube timeline, like today. It just struck me that this video had 30K (thirty THOUSAND!!!) views. I was just so surprised by how many people have watched videos that I created. Thinking about all the people who have learned these skills, partly as a result of listening to me explain them… it just makes my head spin.

There are two things about these videos that I take full credit for. First, since I was responsible for the content, I decided to make proper full-length videos. Not condensed summary videos with a high-level view of the topic, but deep-down, detailed videos with ALL the information that you need to execute on that topic.

The second thing is getting the content onto YouTube. The project manager told me that my relentless lobbying of every person on the BC team was a key factor in getting them to put these videos on YouTube. I was paid to create the videos, but getting them onto YouTube was done entirely in a community spirit. This is by far the most impactful contribution I have ever made to the BC community, and one that I am extremely proud of.

Microsoft the ISV

Microsoft published their new Shopify connector today. It’s great to see them invest resources into what will hopefully be the gold standard for integrating with these external services. However, I have serious doubts about whether this is such a good idea.

ISV Partner Channel

In the runup to having BC in the cloud, the story was that the partner channel should refocus its efforts on becoming ISVs. Rather than building one-time bespoke systems for individual customers, Microsoft wanted the partner channel to create extensions that could be used by the masses.

This was (is?) a logical continuation of the verticalization story that we had heard throughout the past two decades, and in itself nothing I disagree with. I too think that having re-usable extensions in a marketplace is a solid way to go. Microsoft’s argument was that they needed to focus on the base product, a core set of functionality. The partner channel would then be free to add functionality, to extend the base product.

There’s Just a Tiny Thing…

One thing that caught my ear was a statement that Microsoft did not want to provide specific, industry-focused expertise. They said they had no interest in building integrations with external systems. Rather than having Microsoft provide integrations or other specialized functionality, they would leave this up to the partner channel. There could be an ACME Rockets integration created and supported by an ISV, or even by ACME themselves.

It was a great soundbite showing great potential, and it sold well. Many partners listened to Microsoft and started creating lists of functionality that they have the know-how for. Many VARs dove right into their inventory of “add-ons” with the intent of turning them into the next AppSource apps.

I personally know of three separate partners that have invested a lot of time and money into developing Shopify integrations. All three of those partners are LIVID with Microsoft today. The promise was that Microsoft would stay out of this type of functionality, and today’s release is one of an unknown number of apps that we will see come out of Microsoft.

Besides the fact that Microsoft is now on the hook for maintaining this app, they have effectively cut off this potential for the ISV channel. Those partners’ work in progress has essentially turned into a big fat tax write-off.

What Next?

Two out of those three partners had already been looking at alternatives to their NAV/BC practice, and I can’t say that I blame them. Licenses are no longer capital investments. Margins are going down with lower subscription fees, so you can no longer afford to focus on smaller businesses as clients. Having to go through a primary CSP means that you have to share what little margin remains. The stack has become much more complex, so you have to hire experts for everything.

One of the last things that are left is to develop your own IP and publish on AppSource. Would you decide to invest in new products if there is a real chance that Microsoft is working on the same thing?

Personally, I think Microsoft is making a huge mistake by creating this type of app. I am not sure they are capable of taking on the support, or that these apps will be maintained properly. I am also worried about the cooling effect this will have on the partner channel’s willingness to invest in new products.

Most important, though, is that I am just flabbergasted that they prioritized something like this when there are SO MANY things still left to improve in the base product.

As I am writing this, I am struggling to find a good way to finish this post; I’m clearly not done thinking about it. Let me know in the comments what you think.

Dynamic Enums

Although enums are static lists of values, there is a way to restrict which values can be used. In this post I will show you how to do that, and how you can control it dynamically through setup.

Didn’t Think it was Possible

I didn’t think dynamic enums were possible, but I asked the question on Twitter anyway. The golden tip was a page property called ‘ValuesAllowed’. As I was figuring out how to use this property, I thought I’d write this blog post. When I returned to Twitter to post my findings, there were two more links to other people’s articles in the replies. I’ve since removed much of the detail from this post, since it was essentially the same; go follow the links in the Tweet replies to read those details.

Both replies cover an essentially hardcoded way to restrict option values. I want to take this one step further and provide a setup field that manages the choices that you see. Now… I have to say I do NOT like using a static list of options for this purpose: we are still looking at a static list of values, and we are hardcoding what is visible. In my real-world scenario, though, we had to put something in place quickly, and this was indeed a very quick ‘fix’.

Scenario

My actual scenario involved a rather controversial topic, so let’s use a silly one instead. We add a field to the Customer table called ‘Dessert Choice’, with an enum type that has four values: <blank>, Icecream, Cookies, and ‘Choice Declined’. You need an enum object, a table extension with a field based on the enum, and a page extension to add the field to the Customer Card; a sketch of the first two follows below. Let’s say you want to restrict the ‘Choice Declined’ option. Easy peasy, lemon squeezy: you add a ‘ValuesAllowed’ property to the field on the page extension, and you specify the values that you do want to allow.
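Here is a minimal sketch of the enum and the table extension (the names and the 60000 object range are mine, chosen to match the page extension shown further down):

enum 60000 DessertChoiceDnStr
{
    Extensible = true;

    value(0; Blank) { Caption = ' '; }
    value(1; Icecream) { Caption = 'Icecream'; }
    value(2; Cookies) { Caption = 'Cookies'; }
    value(3; ChoiceDeclined) { Caption = 'Choice Declined'; }
}

tableextension 60000 CustomerDnStr extends Customer
{
    fields
    {
        field(60000; DessertChoice; Enum DessertChoiceDnStr)
        {
            Caption = 'Dessert Choice';
            DataClassification = CustomerContent;
        }
    }
}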

In my real-world scenario, my client needed a way to restrict the available options in one company and provide all of them in others. What we ended up doing was adding a toggle to a setup table to turn this restriction on or off.

Show Me The Code

As per usual, I was writing and writing, using SO MANY words to describe the situation, so I decided to just give you the page extension itself, assuming that you can figure out the fields that I am using.

pageextension 60000 CustomerCardDnStr extends "Customer Card"
{
    layout
    {
        addafter(Name)
        {
            field(DessertChoice; Rec.DessertChoice)
            {
                ApplicationArea = All;
                ToolTip = 'Specifies...';
                Visible = AllVisible;
            }
            field(RestrictedDessertChoice; Rec.DessertChoice)
            {
                ApplicationArea = All;
                ToolTip = 'Specifies...';
                ValuesAllowed = Blank, Icecream, Cookies;
                Visible = (not AllVisible);
            }
        }
    }

    var
        MySetup: Record MySetupDnStr;
        AllVisible: Boolean;

    trigger OnOpenPage()
    begin
        MySetup.GetRecordOnce();
        AllVisible := MySetup.AllowDecline;
    end;
}

Basically, you create multiple controls in the page extension for the same field, and you toggle their visibility based on a field in a setup table. You could even do this at the record level in a list, by using an InDataSet variable and putting the code in the OnAfterGetCurrRecord trigger. Again, I think this should have been done with a table with actual functionality, but this way uses very little code, and we had to put something in very quickly.
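For completeness, here is a hedged sketch of the setup table that the page extension assumes; the field names are mine, and GetRecordOnce is just a cached-read helper:

table 60000 MySetupDnStr
{
    Caption = 'My Setup';
    DataClassification = CustomerContent;

    fields
    {
        field(1; PrimaryKey; Code[10]) { Caption = 'Primary Key'; }
        field(2; AllowDecline; Boolean) { Caption = 'Allow Declined Choice'; }
    }

    keys
    {
        key(PK; PrimaryKey) { Clustered = true; }
    }

    var
        RecordRetrieved: Boolean;

    procedure GetRecordOnce()
    begin
        // Read the (single) setup record only once per page instance
        if RecordRetrieved then
            exit;
        if not Get() then begin
            Init();
            Insert();
        end;
        RecordRetrieved := true;
    end;
}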

That’s it, nothing fancy. Not very clever, but useful in my client’s scenario.

Partial Records with SetLoadFields

Fetching and updating records has historically been the greatest culprit in performance problems. The standard way that BC retrieves records is very expensive, since it always gets ALL the fields of a table (and of its companion tables). This post covers a (relatively) new option called SetLoadFields, which is used to specify the fields that you want to retrieve.

So What’s the Problem?

The database engine for BC is SQL Server for OnPrem and Azure SQL for SaaS; the business logic is translated into T-SQL statements at run time. By default, BC issues the equivalent of a SELECT *, which means that for every standard database call, it retrieves ALL fields from the table. Good for us developers, because we never have to think about which fields to fetch. From a performance point of view, though, this causes MASSIVE superfluous overhead in data traffic. Some of the most used tables in BC have bazillions of fields, and in any business logic scenario you never need more than a handful of them.

The problem is exacerbated by the presence of table extensions. Each table extension is represented in the SQL database by a companion table that shares its primary key with the main table. Every time you retrieve records from the main table, the system also retrieves the fields from the companion tables by issuing a JOIN on the PK fields. Imagine a popular table like the Sales Line with a dozen table extensions: a SalesLine.FindSet command generates a SQL statement that includes a dozen JOINs.

What makes this worse is that the number of fields included in a SQL statement has a disproportionate effect on query performance. Read the posts that I link to below for more details, but what you need to know is that the same query with all fields can take hundreds of times longer than one that retrieves just half a dozen fields.

Only Get What you Need

To eliminate this overhead, we now have the SetLoadFields command. Basically, with this command you define the fields that are included in the SQL statement. Instead of getting all fields and retrieving data that you will never use, you tell BC that you only want your handful of fields.

Need an address from a Vendor? The external document number from an invoice? An Inventory Posting Group from an Item? You don’t have to read 6 million fields to do that anymore.

procedure ShowSomeCode()
var
    Vendor: Record Vendor;
begin
    Vendor.SetLoadFields(Address,"Address 2",City,"Post Code");
    Vendor.Get('10000');
    // do stuff with the address fields and ignore the rest
end;

This code sample generates a SQL statement with a SELECT on just the few fields defined in the SetLoadFields command, and it should skip the companion tables altogether, since these are all main-table fields.
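One caveat worth knowing (this is my reading of the partial records documentation, so verify against your BC version): if you touch a field that you did not load, the platform transparently re-reads the record, which costs you the round trip you were trying to avoid. A small sketch:

procedure ShowJitLoad()
var
    Vendor: Record Vendor;
begin
    Vendor.SetLoadFields(Address, "Address 2", City, "Post Code");
    Vendor.Get('10000');
    Message(Vendor.Address);  // loaded: no extra database call
    Message(Vendor.Name);     // not loaded: triggers a just-in-time re-read of the record
end;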

Read the documentation here and make sure you read how to use it here. For more technical in-depth information on what to do and what happens under the hood (way above my head there), read Mads’ posts here and here.

New Habits

In my day-to-day life as a BC developer I don’t normally see SetLoadFields commands. I even checked the standard objects and it’s actually quite surprising how little it is used there. In a previous life I did a LOT of performance troubleshooting, and this would have been a tremendous help in solving lots of performance problems. I know I will try to make using this command a habit.

Excel Buffer for the Cloud

One of my clients asked me to help them convert an add-on that they developed in C/SIDE into an AppSource app. This add-on includes functionality to export some data into an Excel file, using the Excel Buffer table.

The Excel Buffer table is also available in AL, but one of the issues is that as soon as you set the target of the extension to ‘Cloud’ (which, as you know, is an attribute in app.json), the compiler will scream at you that you can’t use certain functions of the Excel Buffer, because their Scope has been set to OnPrem. So if your C/AL object uses one of those file-based functions, you can’t use it in an AppSource app. This type of thing usually takes me days to figure out, so I thought I’d ask Twitter with my favorite community hashtag #bcalhelp

Within a day I received a bunch of helpful suggestions; I just love this community! The one that put me over the top was a phone call with my good friend AJ, who not only showed me how, but also sent me some sample code that he was working on. He’s working on a blog post about this topic himself, so I’ll let him share that, and I’ll post a link to his blog once he puts it online. I also want to mention Owen, because he had sent me essentially the same suggestions, but to an email address that I hardly ever use anymore, so I didn’t see them until days later.

I ended up putting this into a report object (which I’ll share when I find time to put it in a repo). My main problem was that I needed to provide a way for the user to open the Excel file. For this, you use the OpenExcel function. This does not actually open Excel; instead, it downloads the Excel file into the Downloads folder on your computer, and you can then open the file from there.

Some additional pointers (a minimal sketch follows the list):

  • CreateNewBook creates a new file, with a new sheet. If you already have the file created and you need to add a sheet to the existing file, use the SelectOrAddSheet function
  • The WriteSheet function writes the records from the Excel Buffer table into the sheet. Each record represents a cell value
  • You will need the NewRow and AddColumn functions to ‘walk the grid’ of cells in your sheet. Also very useful: ClearNewRow and SetCurrent. I ended up adding a GetCurrentRow function in an Excel Buffer table extension
  • The CurrentRow and CurrentCol variables in the Excel Buffer table are your friends. Forget about the letters/numbers of the Excel file itself; just use the row/column numbers
  • SetFriendlyFileName is not mandatory, but without it the file will be called ‘Book1’ or something similar
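Here is a minimal sketch of the cloud-friendly pattern, pieced together from the pointers above (my own example; the exact parameter lists of AddColumn and WriteSheet may differ slightly between BC versions):

procedure ExportCustomersToExcel()
var
    TempExcelBuffer: Record "Excel Buffer" temporary;
    Customer: Record Customer;
begin
    // Header row
    TempExcelBuffer.NewRow();
    TempExcelBuffer.AddColumn('No.', false, '', true, false, false, '', TempExcelBuffer."Cell Type"::Text);
    TempExcelBuffer.AddColumn('Name', false, '', true, false, false, '', TempExcelBuffer."Cell Type"::Text);

    // One row per customer
    if Customer.FindSet() then
        repeat
            TempExcelBuffer.NewRow();
            TempExcelBuffer.AddColumn(Customer."No.", false, '', false, false, false, '', TempExcelBuffer."Cell Type"::Text);
            TempExcelBuffer.AddColumn(Customer.Name, false, '', false, false, false, '', TempExcelBuffer."Cell Type"::Text);
        until Customer.Next() = 0;

    TempExcelBuffer.CreateNewBook('Customers');           // new file with a new sheet
    TempExcelBuffer.WriteSheet('Customer List', CompanyName(), UserId());
    TempExcelBuffer.CloseBook();
    TempExcelBuffer.SetFriendlyFilename('CustomerList');  // otherwise the file is called 'Book1'
    TempExcelBuffer.OpenExcel();                          // downloads the file for the user to open
end;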

Like I said before, AJ is working on a post about this as well, and he said he was going to offer a repo with the objects too. If I don’t forget, I’ll create a sample report and offer it as a PR to AJ’s Excel repo.

Translation File Names Must Match App.json

This is a quick follow-up on my previous post about creating a container for modified Base App development, specifically about the translation file issue. After publishing that post, I also reported the error message to the AL repo and the MicrosoftDocs repo on GitHub.

As @NKarolak suggested, the names of the translation files must match the name in app.json. I was very skeptical about this, because it was never the case in any of the AppSource apps I’ve worked on, and the Docs for the translation files specifically say that there is no enforced naming of the translation files. It might be a new requirement, though.

When I first created my AL workspace by exporting it from my container, the names of the translation files contained ‘%20’ instead of spaces.

The name in app.json is ‘Base Application’, so the space character was replaced with ‘%20’, the HTML encoding of a space. Since the original error message did not mention the file name, I did not think that the file name itself was the problem.

I decided to try Natalie’s suggestion and replaced the ‘%20’ with a regular space, and voila, it published the app as expected.

Next, I changed the name in my app.json to ‘Super Base Application’ and it errored out again. Once I changed the translation files to match the name in app.json, it worked again.
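To make the correspondence concrete, here is a hypothetical example (the de-DE file is just for illustration). With this name in app.json:

{
    "name": "Super Base Application",
    "publisher": "Microsoft",
    "version": "19.0.0.0"
}

…the translation files must be named accordingly:

Translations\Super Base Application.g.xlf
Translations\Super Base Application.de-DE.xlf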

Moral of the story: when developing a modified Base App, you have to match the translation file names to the name in app.json.