Modified Base App on Docker

How to get started with modifying the Base Application using Docker

Many partners are still focused on doing custom development for their customers and their one-off implementations. MANY of those customers are existing customers, with existing NAV systems and existing customized objects. As much as everyone wants to go to extensions only, and most partners see the need and are more than willing to make the necessary changes, the reality is that many of these existing customers do not want to pay for migrating all of their custom modifications. This reality comes with the need to modify the base app. Since C/SIDE is no longer available, the only way to do this is to use VSCode. This post will explain how you can create a Docker container, and use that container to do modifications on the Base Application.

To get started, click here to read the article on docs.microsoft.com. I say ‘get started’ because it was not enough to get me all the way there, which is the reason why I wrote this post. That article seems to have been written for an actual installation from the product DVD, and there are some additional things you need to know to make it all work if you want to use Docker. At least, that is the case as of the date of this post, because things may change :). I’ll try to revisit this post if it does change.

Alright, so to make this work, you need a few things:

  • Create a Docker container that is based on the latest Business Central Docker image
  • Configure the Service Tier in the container
  • Extract the objects from the container into a new AL workspace
  • Uninstall and unpublish the Base Application and its dependencies

Create a new Container

For Business Central development I always use the NavContainerHelper module, so before you use any of the commands in this post, update your module:

Update-Module navcontainerhelper

To get the latest Docker image for Business Central I will be using the ‘mcr.microsoft.com/businesscentral/onprem:na-ltsc2019’ image. You can leave the ‘ltsc2019’ part out if you are not sure about the host OS, or if you are on Windows Server 2016. You can replace ‘na’ with your own localization, or leave that tag out altogether if you want to be on the W1 version. To read about which image to use, visit Freddy’s blog here and follow the links to what you need to know. Here is the script that I used to create my container:

$imageName = 'mcr.microsoft.com/businesscentral/onprem:na-ltsc2019'
$licenseFile = '<path to your BC 15 developer license>.flf'
$ContainerName = 'mysandbox'
$UserName = 'admin'
$Password = ConvertTo-SecureString 'Navision4ever!' -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential ($UserName, $Password)


New-NavContainer `
    -accept_eula `
    -containerName $ContainerName `
    -imageName $imageName `
    -licenseFile $licenseFile `
    -auth NavUserPassword `
    -alwaysPull `
    -Credential $Credential `
    -includeAL `
    -updateHosts `
    -additionalParameters @("-e customNavSettings=ExtensionAllowedTargetLevel=OnPrem")

I use the ‘-alwaysPull’ switch to make sure that I always have the latest version of the Docker image. The ‘-includeAL’ switch is necessary to include references to the DotNet assemblies in the Docker container. The ‘-additionalParameters’ switch (h/t @tobiasfenster) is used to set the ExtensionAllowedTargetLevel property to ‘OnPrem’. I’ll explain how to set this with a simple PowerShell Cmdlet in a minute.

One more important switch is the ‘-useCleanDatabase’ switch, which can be used to uninstall and unpublish the Base Application and its dependencies, as I will discuss in a little bit. At this point, you have a vanilla Docker container with the latest on premises version of Business Central.

Configure the Service Tier

As the Doc states, there are three things you need to set. It is not very clear exactly how to do that, and not at all how that works on Docker, so let me just explain from scratch.

First, you need to know how to look at, and modify, the Service Tier settings inside the container. Some of these types of commands are available in the navcontainerhelper module, but some of them are not. I did find a Cmdlet to see the settings, but I could not find one to actually modify them. So, to cover all of it, I will show you how you can connect to the container and run regular BC PowerShell Cmdlets from inside the container.

Open a PowerShell ISE window as administrator, and run the commands shown below.

Our container name is ‘mysandbox’, and you connect to it by using the ‘Enter-BCContainer’ Cmdlet. You can see the prompt change to show that you are inside the container. Inside the container, the navcontainerhelper Cmdlets do not work, so you will have to use the regular BC PowerShell Cmdlets. The next Cmdlet shows you all the properties of the Service Tier that runs inside your container, which in this version of Business Central is called ‘BC’.
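
For reference, here is roughly what those commands look like (a minimal sketch, assuming the container from the script above):

Enter-BCContainer -containerName 'mysandbox'
Get-NAVServerConfiguration -ServerInstance BC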

According to the Doc, the following settings are important. I am using the names that are used in PowerShell rather than the names in the Doc.

  • ExtensionAllowedTargetLevel should be set to ‘OnPrem’, although it seems that the value ‘Internal’ also works.
  • DeveloperServicesEnabled should be set to true. This should be the default value of this particular setting.
  • The Doc also mentions the EnableSymbolLoadingAtServerStartup property, but I’ve received confirmation (h/t @freddydk) that this property was meant for hybrid C/AL and AL environments, so it is no longer needed for BC 2019 wave 2.

To modify these settings, use the following PowerShell command

Set-NAVServerConfiguration `
      -ServerInstance BC `
      -KeyName ExtensionAllowedTargetLevel `
      -KeyValue OnPrem

After modifying those settings, restart the service tier using the ‘Restart-NAVServerInstance -ServerInstance BC’ command. At that point, the service tier in your container should be configured for doing on premises development. The next thing you need to do is get the application objects out of the container.

Create AL Workspace from Base App

This step is easy, because there is a navcontainerhelper Cmdlet for it. Since navcontainerhelper does not work inside the container, you first need to exit the container (type ‘exit’ and press Enter). Then, run this Cmdlet:

$ContainerName = 'mysandbox'
$UserName = 'admin'
$Password = ConvertTo-SecureString 'Navision4ever!' -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential ($UserName, $Password)

Create-AlProjectFolderFromBcContainer `
    -containerName $ContainerName `
    -alProjectFolder 'C:\MyProjects\BaseApp' `
    -useBaseAppProperties `
    -credential $Credential 

One thing to note here is that the ‘-useBaseAppProperties’ switch uses the properties from the container. You will end up with a fully functioning AL workspace, with an app.json and launch.json that is configured to look inside the container for the objects and the DotNet probing path. You will need to configure this yourself if your configuration needs to be different. But, since we’re making this work for a standard container, we’re going to use the standard configuration as well.

One other important thing to note: as I am writing this post, I’ve had a persistent error message that prevented me from compiling the app, which I narrowed down to having to remove the translation files. The annoying part is that the error message itself does not mention the translation files, but compiling started working again after I removed them. In your new BaseApp folder, there is a folder called ‘Translations’. Remove all files from that folder, except the ‘*.g.xlf’ file.
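
A PowerShell one-liner like this should do the trick (assuming the workspace folder from the script above):

# remove all translation files except the generated *.g.xlf file
Get-ChildItem 'C:\MyProjects\BaseApp\Translations\*' -Exclude '*.g.xlf' | Remove-Item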

Update 2019/11/27: follow-up on the translation file issue

One final thing to note is that this is just a simple AL workspace. In a real life situation, you are doing this for a particular customer, so you need to think about source control, workspace settings, things like that. The Cmdlet has more capabilities, so take a look here to see all of its available parameters.

The last thing you need to do is download the symbols for the system apps from the container. The Doc also mentions adding the assemblyProbingPaths to the workspace settings, but if you used the ‘-useBaseAppProperties’ switch, that is already taken care of for you, and the setting will point to one of the container’s shared folders.

Uninstall / Unpublish Base App

In the previous step, you’ve created an AL workspace with all of the objects from the Base Application. Your container already has a Base App, though, so in order to create a modified Base App, you will have to get rid of the standard one first. You can be a PowerShell warrior and run the Cmdlets in this section, or you can use the ‘-useCleanDatabase’ switch in the New-NavContainer Cmdlet from the first section, which removes the Base App and all its dependencies from your container right away.

On to the PowerShell… In the Doc, under bullet 11, you will find the functions to accomplish this. These are regular NAV PowerShell Cmdlets, so you will need to enter the container first:

function UnpublishAppAndDependencies($ServerInstance, $ApplicationName)
{
    Get-NAVAppInfo -ServerInstance $ServerInstance | Where-Object {
        # If the dependencies of this extension include the application that we want to unpublish,
        # it means we have to unpublish this application first.
        (Get-NAVAppInfo -ServerInstance $ServerInstance -Name $_.Name).Dependencies | Where-Object { $_.Name -eq $ApplicationName }
    } | ForEach-Object {
        UnpublishAppAndDependencies $ServerInstance $_.Name
    }

    Unpublish-NAVApp -ServerInstance $ServerInstance -Name $ApplicationName
}

function UninstallAndUnpublish($ServerInstance, $ApplicationName)
{
    Uninstall-NAVApp -ServerInstance $ServerInstance -Name $ApplicationName -Force
    UnpublishAppAndDependencies $ServerInstance $ApplicationName
}

This loads the functions into memory, and then you can run the script:

UninstallAndUnpublish -ServerInstance BC -ApplicationName "Base Application"

This will completely remove the Base App and its dependencies.

Ready to Start Developing

That’s it, you should now be ready to start your development. To see how that works, add a field to a table, add that field to its Card page, and hit Ctrl+F5. It will probably take a while to compile, but you should see your new field on the page.
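
As a minimal sketch (the field name and number are made up, and the file names assume the standard Base App layout):

// In the Customer table object (Customer.Table.al), add a field to the fields section:
field(50100; "Shoe Size"; Integer)
{
    Caption = 'Shoe Size';
    DataClassification = CustomerContent;
}

// In the Customer Card page object (CustomerCard.Page.al), add the field to the layout:
field("Shoe Size"; "Shoe Size")
{
    ApplicationArea = All;
}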

Now I do need to say that I completely and wholeheartedly agree with the entire community: code customizations should really not be done anymore. All development should be done using extensions instead of changing the Base App itself. It makes everyone’s life a lot easier if you minimize the amount of development done to the Base App, so even if you have no other choice, try to design the solution in such a way that most of it is in an extension, and only modify the Base App for the parts that you can’t figure out how to do in an extension.

Update 2019/11/27: created a GitHub repo with the scripts

Extending the AL Language

Over the past year, I’ve taught many people how to develop extensions for Business Central using Visual Studio Code. Usually I try to keep the workshop to standard features in VSCode and the standard AL Language extension. One of the things I don’t usually cover in any detail is an additional extension that extends that language, the “CRS AL Language Extension”. Since I am one of the owners of CRS, I could take part of the credit for it, but you should know that it was developed pretty much 100% by Waldo. If you want to read more about the extension itself, read Waldo’s latest blog post about it, he’s much better at explaining it than I am.

There are a bunch of really useful features in this extension, but I want to specifically mention a couple that I think are indispensable. In fact, I would bet quite a bit of money that Microsoft will include some of these features in the official AL Language sooner rather than later. It would really not even be necessary, since these extensions are all open source anyway.

Rename/Reorganize

The feature that I use the most myself is the rename and reorganize feature. The extension provides a way to set up how you want files to be organized, and what you want the naming convention to be. Personally I don’t really care all that much about the specifics of any particular convention, as long as it is applied consistently, so that things end up in the same place for every project that you work on. I usually just leave the default settings in there, and I know exactly where to find my objects. Go here to read more about how you can customize it to your needs.
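
For example, my workspace settings contain something like this (setting names as documented for the extension; check Waldo’s post for the full list of options and file name placeholders):

{
    "CRS.OnSaveAlFileAction": "Reorganize",
    "CRS.FileNamePattern": "<ObjectNameShort>.<ObjectTypeShortPascalCase>.al"
}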

Run in Web Client

There are a few standard ways to run a page that you are currently working on. If you’ve added access to the page to a role center you can just start the web client and browse to the page. If this is not the case, you can use the Search feature and start your page from there. You could also set a startup object in launch.json, and when you start the web client from VSCode, it will open on that object. Waldo’s AL extension provides a really easy way to start the current object from the Command Palette, using the ‘Run current object’ command. In the new version of the extension, this command now also shows up in the status bar. Finally, you can right click an object and the ‘Run Current Object’ command can be selected from the context menu.

These two features are the ones that I use the most, and they alone are worth getting the extension. I could not do AL development work without this extension. Download this extension and use it. If you have ideas to make it better, let Waldo know, he loves getting feedback and making it better.

My Take on Using Docker

This past week, there was another post by my good friend Arend-Jan Kauffmann about using Docker directly on Windows 10 (what are you still doing here? Go read AJ’s post!). He had previously written about using Docker in a Hyper-V VM, and he has helped me understand how this all works a number of times. Just to be sure I mention this here, you can read all about the technical details on Tobias Fenster’s blog but that goes over my head very quickly.

The reason why I am writing this is because I am very reluctant to take the step of installing Docker directly on my laptop. What works for me at the moment is this: I have Hyper-V enabled on my laptop, and I have a VM with just Windows Server 2016 (creating one with Windows Server 2019 is very high on the todo list). Docker is installed in a snapshot of that VM, and that is where I do all of my development work. I wrote about this before, read it here.

See… I am the king of screwing up my computer. If there is anything, ANYTHING, that will mess up my computer and render it absolutely useless, I WILL find it, and I will kill my computer (I am hearing that in Liam Neeson’s voice by the way). I have had to re-install my laptop so many times because of things that went wrong. When I have a problem like this in my VM, I don’t even spend any time trying to figure out what went wrong (that gives me a headache just thinking about it). All I need to do is delete the snapshot, create a new one, and I’m back up in a matter of minutes. All my dev work is in repos that I sync regularly, so I never have to worry about losing any work.

I’ve read about Docker straight on Windows 10, and it sounds very nice and easy to use. At the same time, I read blog posts and even Tweets that mention damage to the host OS from normal Docker operations, and I just KNOW that if I try it will happen to me. My reluctance to use Docker on Windows 10 directly does not come from wanting to stay in the past, but it is more from the knowledge that I’m going to screw up my computer.

Maybe I’m too cautious, but for now I will stick to my setup and continue to use Docker inside a VM. It works for me, and for now that’s good enough.

Look Out for the Code Police

One of the coolest features in VSCode is the ability to check your code at design time for specific things. This post will explain how you can turn on code analysis, and how to get away with breaking the rules that it tries to enforce.

There are three things you have to know about code analysis: First, it is a feature that can be enabled and disabled at will. Second, there are sets of rules for specific purposes that you can turn on and off. Finally, you can define exceptions to those rules, and what to do when the code analyzer finds a violation of one of them. All three items are configured in the user settings, and the exceptions themselves are stored in a separate ‘ruleset.json’ file.

Open the user settings from the Command Palette. You will need different levels of scrutiny for different projects: one client may have an on premises implementation, while another is developing an app for AppSource. Those must follow different sets of rules, so each gets its own code analyzers. Since each project is different, I would say that you define the code analysis attributes at the workspace level. You can set these features up in the settings UI, but I like to see the json file in the editor and use Intellisense there.

The code analysis feature is turned on by setting “al.enableCodeAnalysis” to “true”. In the “al.codeAnalyzers” property, you can define which set of rules is enabled. The one you should always enable is the ‘CodeCop’, which enforces some basic syntax rules. Then, depending on whether you are doing development for AppSource or for a tenant specific extension you can choose either the ‘AppSourceCop’ or the ‘PerTenantExtensionCop’. You should not have both of those last two enabled at the same time, because some rules for AppSource don’t apply for PerTenant and vice versa.

In my settings.json, I’ve turned on code analysis, and I have enabled the CodeCop and the AppSourceCop. To show you what this looks like when code analysis finds a violation, I’ve created a very simple codeunit.
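
Here is the relevant part of my settings.json, followed by that codeunit (a minimal sketch; the codeunit name and number are made up):

{
    "al.enableCodeAnalysis": true,
    "al.codeAnalyzers": [
        "${CodeCop}",
        "${AppSourceCop}"
    ]
}

codeunit 50100 "Code Police Demo"
{
    trigger OnRun()
    begin
        // CodeCop rule AA0005 flags BEGIN..END around a single statement
        if GuiAllowed then begin
            Message('Hello there');
        end;
    end;
}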

Code analysis doesn’t like my code: the CodeCop does not approve of using BEGIN..END for a single statement. Personally I don’t agree with that rule, because I always use BEGIN..END in IF statements; I make fewer mistakes that way. The rule is not really a big problem, because the squiggly line under my code is green. If I had violated a really important rule, like missing a prefix in a field name, it would have been red.

Lucky for me, I can define for myself how certain rules are handled. Note that the problems screen shows which rule is broken (number AA0005). Let me show you how you can define what happens.

First, you create a new .json file in your workspace, and you set it up to be a ruleset. I am calling mine ‘Daniel.ruleset.json’ and I am putting it in my workspace root.
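
A minimal version of the ruleset file looks like this (the name, description, and justification are free text):

{
    "name": "Daniel",
    "description": "Daniel's personal ruleset",
    "rules": [
        {
            "id": "AA0005",
            "action": "None",
            "justification": "I always use BEGIN..END in IF statements"
        }
    ]
}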

Under the “action” property you can set what you want to happen when this rule is broken. I don’t like the rule at all, so I’ve set the action to “None” to ignore the rule altogether. All you have to do now is tell your settings where to look for additional rulesets.
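
In settings.json, that is the ‘al.ruleSetPath’ setting (here the path is relative to the workspace root):

{
    "al.ruleSetPath": "Daniel.ruleset.json"
}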

The rule itself still works, I’ve just overridden its behavior to something that I like. Going back to my codeunit, there is no longer the annoying little squiggly line, and this violation is no longer listed in the problems window.

No problem, I’m happy 🙂

One word of warning about using the ruleset to create exceptions on AppSource rules. Some of these rules are there because they are required for acceptance into AppSource. For instance, you MUST give EVERY field name a specific prefix/suffix. You can turn this rule off, but if any of your fields is missing a prefix/suffix, your app will not be accepted. Be aware which rules you break, because the code police WILL find you eventually 🙂

NAV Techdays 2018 Recap

I started to write this post while flying across the Atlantic Ocean on the second of a three-leg journey home, a BA flight from London to Phoenix. It has been a very long trip that started when I traveled to Holland for Directions EMEA in Den Haag at the end of October. Since Directions and NAV Techdays were relatively close together, I decided to just stay with my family in Holland for those 4 weeks rather than fly back and forth twice in less than a month. This has been the longest that I’ve ever been away from home, and I was SO ready to be back in my own house.

NAV Techdays ended last Friday, and it’s been another fantastic week, as we’ve come to expect. As far as I can tell, the attendees in my pre-conference workshop were happy with the content, I can’t wait to get the feedback and see what I can improve for next time.

As per usual, Luc has posted the videos in record time, less than a week after the event. The whole playlist can be found here, and I wanted to highlight some of my favorite sessions. One of the most important developments in current technology is machine learning and AI. Dmitry Katson and Steven Renders put together an awesome session to introduce machine learning to us. The award for most entertaining session goes to Waldo and Vjeko, who put on a concert and wowed the audience with some really cool content. I also want to point out the session about CI/CD, which is going to be one of the most important things for everyone that is serious about implementing a professional development practice. Of course, I have to also mention the Docker session, which is the technology that makes it all possible.

Fortunately, next year’s event is not scheduled on Thanksgiving, which is a national holiday here in the US, one that typically involves lots of friends and family, and lots of food. I’ve had to miss it the past couple of years, and each time I’ve been bummed to hear the stories of all the great meals and gatherings that my family got to have without me. Next year I’ll be home for Turkey Day!

Thanks for another super event, it’s one of my favorite weeks of the year.

Extensible Enums

The AL language has an object type called ‘enum’. This object type defines a list of possible values in the form of a set of key/value pairs, plus captions. You can then create a field in a table or table extension with the enum as its data type, and the field will provide the user with a drop-down list of those values. Just like option fields, the database stores the numeric value of the enum in the field.

To define a new enum, you create a new .al file in which you define the enum as an object, and you list the values of the enum as follows:
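
(The object name, number, and values below are made up for illustration.)

enum 50100 "Shoe Style"
{
    Extensible = true;

    value(0; " ")
    {
        Caption = ' ';
    }
    value(1; Sneakers)
    {
        Caption = 'Sneakers';
    }
    value(2; Boots)
    {
        Caption = 'Boots';
    }
}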

Note that the ‘Extensible’ property is set to true, so it will be possible to extend the enum with additional values when the enum is used in other extensions.

To use the enum in a table or a table extension, you define the field as an enum type field, and specify the enum name as part of the field definition. In the following example we’re adding an enum type field to the Customer table in a new tableextension:
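
(Object names and numbers are made up again; the pageextension below is what you need to actually see the field on the Customer Card.)

tableextension 50100 "Customer Shoe Ext" extends Customer
{
    fields
    {
        // the field gets the enum as its data type
        field(50100; "Shoe Style"; Enum "Shoe Style")
        {
            Caption = 'Shoe Style';
            DataClassification = CustomerContent;
        }
    }
}

pageextension 50100 "Customer Card Shoe Ext" extends "Customer Card"
{
    layout
    {
        addlast(General)
        {
            field("Shoe Style"; "Shoe Style")
            {
                ApplicationArea = All;
            }
        }
    }
}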

Now, in order for this enum to be extended, you would have the app that includes the enum as a dependency (which puts the original enum into the current app’s symbol references), and then you would create a new object called an ‘enumextension’, in which you define additional values.
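
The enumextension itself is tiny (again a made-up name and number; the value number must come from your extension’s object range):

enumextension 50100 "Shoe Style Ext" extends "Shoe Style"
{
    value(50100; Sandals)
    {
        Caption = 'Sandals';
    }
}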

Now when you look at the Customer Card, you can see all the values in the dropdown for the new field.

It is also possible to link an option field in C/SIDE to an enum in AL.

When I learned about the extensible enum type, I was salivating at the thought that it would be possible to extend the available options in a ton of tables (type in sales/purchase line, account type in journals, entry type in ledgers to name just a few of them). It IS possible to do just that, and eventually the goal is to replace all option type fields in Business Central with enum type fields, it’s just that it comes with a crap ton of refactoring of existing code.

There is a lot of code that checks for all available option values, with an ELSE leg in the CASE statement for ‘other values’. All of that code will need to be refactored to allow for extended enum values instead of just raising an error when it runs into an unrecognized value.
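
To illustrate with the made-up enum from above: this is the kind of pattern that breaks down, because the ELSE leg fires for every value that was added through an enumextension:

local procedure CheckShoeStyle(ShoeStyle: Enum "Shoe Style")
begin
    case ShoeStyle of
        ShoeStyle::Sneakers:
            Message('Casual');
        ShoeStyle::Boots:
            Message('Sturdy');
        else
            // this leg treats every extension value as an error
            Error('Shoe style %1 is not supported.', ShoeStyle);
    end;
end;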

Now you know about enums, start using them instead of option type fields, and make them as extensible as possible.

Source Code Management for Business Central

So you’re getting your feet wet with Visual Studio Code, and you’re starting to get the hang of how developing extensions for Business Central works. You’re getting comfortable with all the elements of a VSCode workspace and how to connect your workspace to your Docker container. Now is the time to dive into source code management.

One thing that makes it easy to get started is that Visual Studio Code supports source code management as a built-in feature of the program itself, by integrating with the Git source control management system. You have to install Git separately, but once you have all the right bits in place, VSCode will keep track of the changes that you make to your objects, and you can take action based on that, right inside VSCode.

Install the right bits

From what I understand, the reason why Git itself is not installed as part of VSCode is the open source license that comes with Git. As a result, you have to install it separately. The installer can be found on the Git website at https://git-scm.com/. Click on the download button and just accept all the prompts in the installer (hit ‘next’ about 9 times). All you need to do now is restart VSCode, and VSCode is integrated with Git. It will issue all the Git commands that you want as if they were features of VSCode itself.

Tell Git who you are

Basically, source control management is a system that keeps track of who made what changes to which files at what point in time. It is the ultimate CYA tool, and it also can and will be used against you :). You have to tell Git who you are by setting two global parameters: your name and email address. Enter these parameters in the Terminal window in VSCode.
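
In the terminal, that looks like this (use your own name and email address, obviously):

git config --global user.name "Your Name"
git config --global user.email "you@example.com"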

Sign up for GitHub and create a repository

Go to https://github.com/, sign up for an account, using the same email that you entered in VSCode as the Git user email. This is a super easy thing to do, I’m sure you can figure that part out yourself.

Once you’re in your GitHub account, create a repository there. Either click on ‘new repository’ from the dropdown button right next to your avatar, or click on ‘repositories’ and then on ‘new’. Give it a name, and if you want to be adventurous, add a .gitignore and/or a license. You will figure out all the details as you learn to use this.

The repository on Github is your ‘remote’. The image shows a new empty repository called ‘Kerplunk’ in my GitHub account. I set it to private because there’s really nothing to share in there at this point.

Initialize your first repository

As you probably already know, the VSCode ‘workspace’ is nothing more than a folder on your local drive. By some lucky coincidence, Git also works with folders; in SCM terms, a folder is called a ‘repository’. You will have to learn to call these ‘repos’ if you want to look like you know what you’re talking about. A ‘folder’, a ‘workspace’, a ‘repository’: they are really all the same thing.

In order to make this happen, you need to initialize the folder as a repo. The easiest way to do this is to clone the repo from GitHub into VSCode. As you may have noticed, in GitHub there is a big green button called ‘Clone or download’. When you click on that button, it drops down a little box where you can copy the link to your repo.

Now go back to VSCode, open the command palette and issue the ‘Git: Clone’ command. It will prompt you for a link (paste in the clone link) and a folder. VSCode will then download the content of the repo from GitHub. Just some terms that you have to know: the repo on GitHub is the ‘remote’, the copy of that repo on your hard drive is called the ‘local repo’, and the process of downloading the remote to local is called ‘cloning’.
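
If you prefer typing over clicking, the terminal equivalent looks like this (substitute your own account and repo name):

git clone https://github.com/<your-account>/Kerplunk.git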

All Set

You are now ready to rock and roll with source code management. Open Windows Explorer and VSCode at the same time, and note that VSCode hides the ‘.git’ folder in the workspace. VSCode now knows that the workspace is also a repo, and tracks all changes in there. Copy the files of a brand new AL project into the repo and see what happens: VSCode will notice that there are a few new files in the repo.

Note how the files in the workspace get a green color, and there is a badge showing the number of changed files on the source control tab.

Stage, Commit and Sync

Remember, you are always working on a LOCAL copy of the repo, you never work directly in the remote repo. To get your changes into the remote is a two step process. First you have to save the changes to the LOCAL repo, which is called ‘Committing’ the changes. You get to decide which files you want to commit by way of a process called ‘staging’.

Click on the Source Control button in VSCode, and note that all changed files are listed under a heading called ‘CHANGES’. When you hover your mouse over one of the files, it shows a plus sign. When you click on that plus, it moves the file to another heading called ‘STAGED CHANGES’. Those staged changes are the files that will be committed to the local repo. You can stage all or some of the files and then hit the checkmark, which is the commit button. This writes the changes into your local repo.

The last step is to send the commits from your local to your remote repo. This is called the ‘Sync’ process. You can either issue the ‘Git: Sync’ command from the command palette, or hit the Sync button in the status bar. VSCode will send any new changes to the remote, and it will download any new changes from the remote.
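
For reference, the terminal equivalents of staging, committing, and syncing look roughly like this:

git add .                        # stage all changed files
git commit -m "Add AL project"   # commit the staged changes to the local repo
git pull                         # download new changes from the remote
git push                         # send your new commits to the remote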

Now check your repo in GitHub and verify that your changes are up there.

Congratulations my friend, you are now using source control management 🙂

Update Sep 24, 2018: added YouTube link. As I was writing this post, I was also working on a training video to get started with source code management. You can find it on YouTube here:

Update March 30, 2019 – new screenshots to show current look of VSCode


Ready To Go

I’m on my way home from a bucket list kind of trip this past week. This post is more than just a travel log; I’ll get to a good link that you will be able to use to get Ready to Go for Business Central. Bear with me and let me tell the story of my trip 🙂

Through my work I’ve had the opportunity to travel pretty much to the opposite side of the globe to teach the CRS workshop for developing extensions for Business Central with VSCode. The first 2-day workshop was in Hong Kong on Monday and Tuesday, and then on Thursday and Friday I hosted the same workshop in Manila. To have some time for sightseeing, I arrived in Hong Kong early on Saturday morning. My hotel was on Hong Kong Island, and I spent the day mostly walking around the area near the hotel. It was a very long trip, and I had a looooong night’s sleep to recover from the journey.

On my flight over, I was a little too vigorous trying to clean the keyboard on my laptop, and it stopped working. As I kind of need a keyboard for the workshop, part of my Sunday sightseeing was to roam the city in search of an external keyboard, which fortunately I found. It’s an unusual souvenir, but I am actually typing this on a keyboard with Chinese characters.

Instead of my original plan to take a ferry across the harbor and find some of the famous food places, I stayed around town on the island. There was an awesome dragon boat race, and I took the trolley up to Victoria Peak, which is where I took the picture at the top of this post. It was fantastic to be in Hong Kong, the people were very hospitable, I really hope I get the chance to go back there some day and spend some serious time exploring the area there.

These workshops are part of a program called ‘Ready to Go’, which was created to help the Microsoft partner channel build the skills that are necessary to succeed in the new ecosystem for Business Central. I’ll probably post something with more details in a few days, but the link that you want to jot down is http://aka.ms/ReadyToGo. On this page, you will find a ton of good content, and even more links to other places with even more content. This link can be your starting point for any information that you might need to succeed.

The workshops were a success, and I’ve received lots of positive feedback. It was a pleasure visiting these two places, and I hope to go back there one day. These are people that are Ready to Go!

NAV Techdays 2018 – I’m Speaking!

Registration for NAV Techdays 2018 is open, and this year is going to be SUPER exciting for me, because I am going to teach an all-day pre-conference workshop! Go to the sessions overview page of the NAV Techdays website to see the details of all of the sessions and the pre-conference workshops. Of course I would LOVE it if you sign up for my workshop, but really you can’t go wrong with any of them.

My workshop is called “A Day in the Life of a Business Central Developer”. I still need to put the material together, but the plan is to cover all aspects of what it means to be a developer for Microsoft Dynamics 365 Business Central. Think about the development environment, how to create an app, how to create multiple apps with dependencies (an extension of another extension), how to connect to web services, how to use source control, and even design patterns and Docker.

I realize that it is a very ambitious agenda, but I am sure that we can fill a whole day with great content. I’m not sure if there will be much time to do any extensive lab work, so I might end up just teaching all day and giving you some things to take home and work on after the workshop is done.

Most importantly – go register for NAV Techdays, it is really THE premium event for our industry. Spend an extra couple of days in Antwerpen for the pre-conference workshops, they are all fantastic and worth every penny.

See you in Antwerpen in November!

Signing an App Package File

One of the cool things about my work is that I get to participate in some things very early. This is often really cool, but it also comes with some frustration when things don’t go very smoothly, or when there is little information to work with. One of those things, which I had absolutely NO knowledge of, was signing an app file… I had not a clue what this meant, and no clue where to go to get this information.

The page where Microsoft explains how this works can be found here. It looks like a really nice and informative page now, but a few months ago it was confusing as heck, and it was not very helpful to me. At the time, I was working on an app for one of our customers, and one of the steps to get apps into AppSource is to sign the app file before you submit it.

Electronically signing a file is essentially a way to identify the source of the file and certify that it comes from a known publisher. The ISV partner that develops an app must register with a certificate authority, and then every time they release a file, they have to stamp that file with their identifying attributes. The process of doing this is called ‘signing’ the file.

I’m not going into the details of how to get this done; the resource on Docs.microsoft is quite good now, so you can read it there. One thing I do want to share is that you should ALWAYS timestamp your signature. If you don’t timestamp the signature, your app will expire on the same date as your certificate. If you DO timestamp it, the signature is stamped with a date that falls within the validity of your certificate, and your app file will never expire. You do have to keep your certificate valid of course, but at least by timestamping the signature, the files that you sign will not expire.
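
The Doc walks you through signing with SignTool; the call looks roughly like this (the certificate file, password, and timestamp service URL are placeholders, so check the Doc for the currently recommended values):

SignTool sign /f MyCertificate.pfx /p MyPassword /t http://timestamp.digicert.com MyExtension.app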

During the whole process of getting the certificate and the signature, I worked with someone at Microsoft, who helped me get my customer’s app signed, and he also took my feedback to improve the documentation. I noticed something about the documentation that I think should be pointed out.

Documentation for Business Central is now in a new space called ‘docs.microsoft.com’. In contrast with MSDN, Docs is almost interactive with the community. Maybe you’ve noticed, but each page in Docs has a feedback section. Scroll down on any page in there, and you will see that there is a section where you, as a consumer of this information, can leave your feedback.

I did this, and to my surprise I got an email. As it happened, the person that was working on the signing page knew my name and knew how to get a hold of me, and we worked together to make the page more informative. It was a coincidence that we knew about each other, but what was no coincidence was that there are actual product group people at Microsoft that are responsible for the documentation. There is a team of documentation people that watch out for issues on Docs, and they pick up issues within days of submission!!

The feedback system links back through GitHub issues, so if you’ve ever submitted something to the AL team, you know that this is pretty direct communication. I am wondering though, if Microsoft will take this a step further, and open up Docs as a public repository where people can make suggested changes. I think yes, but I’m not sure because there’s not really a history of direct collaboration like that. I have good hope though, because the culture at Microsoft is getting more collaborative by the day.