Today I had to send a specific email from Business Central from a specific backoffice account. Roberto Stefanetti has written an excellent blog post about Business Central email and email scenarios. That covers the built-in processes nicely, but I had an entirely new scenario.
The solution was simple: all I had to do was extend the “Email Scenario” enum, assign my backoffice account to the new scenario, and use that scenario when sending my email. Below is the whole setup and code. Enjoy!
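The setup is only two pieces. First the enum extension (a minimal sketch; the object IDs, names, and the scenario value are mine for illustration):

enumextension 50100 "Backoffice Email Scenario" extends "Email Scenario"
{
    value(50100; "Backoffice Notification")
    {
        Caption = 'Backoffice Notification';
    }
}

After publishing, assign the backoffice email account to the new scenario in the email scenario setup (reachable from the Email Accounts page). Then pass the scenario when sending:

local procedure SendBackofficeMail()
var
    Email: Codeunit Email;
    EmailMessage: Codeunit "Email Message";
begin
    // Recipient, subject, and body are placeholders
    EmailMessage.Create('backoffice@example.com', 'My subject', 'My body text');
    // The email goes out from whichever account is assigned to the scenario
    Email.Send(EmailMessage, Enum::"Email Scenario"::"Backoffice Notification");
end;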
On with the show. Today I will show you a simple process that I have automated with Azure Event Grid and Logic Apps: sending an invoice PDF to the Event Grid. Since we have events inside Business Central, all we have to do is subscribe to an event in an extension and send it to the Event Grid. How hard can it be?
Of course we first need to create an Event Grid Topic. The Event Grid is completely integrated into Azure. You can subscribe to any number of built-in events, like a notification that a blob has been created in Azure Blob Storage. For applications that have no built-in Event Grid events you can also create a custom topic. Once that is up and running you can send events to it. The Event Grid works with a push-based publish-subscribe model: we publish events with an HTTP POST, and we receive events by subscribing to a topic with a webhook. In this blog I will only discuss publishing; subscribing to events will be covered in blog post 3 of this series.
For an in-depth exploration of the Event Grid this would be a good start. For the purpose of this blog I have created a new Event Grid topic and called it red-blog. From this topic we need the endpoint URL and an access key.
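If you prefer scripting to clicking, the Azure CLI can do the same; a quick sketch (the resource group name is mine):

az eventgrid topic create --name red-blog --resource-group my-rg --location westeurope
az eventgrid topic show --name red-blog --resource-group my-rg --query "endpoint" --output tsv
az eventgrid topic key list --name red-blog --resource-group my-rg --query "key1" --output tsv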
To send a BC event to the Event Grid we need to subscribe to a Business Central event and create a call to the Event Grid REST API; it is really simple. The Event Grid message format has room for your own JSON message, so you can send all the data you may need. In my case I need to send some invoice metadata and the PDF file as a base64 string. You can also choose to send just an API URL for your resource, so any application that receives the event can fetch the necessary data. Personally I prefer to send as much data as I can, so Business Central does not get burdened with unnecessary web service requests. The message format of the Event Grid message body is shown below: in this JSON message we find a data object, and in that data object we can nest our payload in any JSON format we want.
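In outline it looks like this, following the Event Grid custom event schema (the data values are only an illustration):

[
  {
    "id": "a-unique-event-id",
    "eventType": "recordInserted",
    "subject": "myapp/vehicles/motorcycles",
    "eventTime": "2020-05-18T09:41:07+00:00",
    "data": {
      "make": "Harley-Davidson",
      "model": "LiveWire"
    },
    "dataVersion": "1.0"
  }
]

Note that the body is a JSON array, so you can batch several events into one POST.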
In my Business Central code I create an event subscription to the event I need. When my event gets triggered, the first thing I do is create the Event Grid JSON with the JsonObject datatype.
local procedure CreateBody(JobQueueEntry: Record "Job Queue Entry") message: JsonArray
var
    RecRef: RecordRef;
    body: JsonObject;
begin
    // The job queue entry carries the record that triggered the event
    RecRef.Get(JobQueueEntry."Record ID to Process");
    RecRef.SetRecFilter();
    // The Event Grid envelope fields
    body.Add('id', JobQueueEntry.ID);
    body.Add('eventType', JobQueueEntry.Description);
    body.Add('subject', StrSubstNo('%1 %2', RecRef.Name, RecRef.GetFilters()));
    body.Add('eventTime', CurrentDateTime());
    // Our own payload goes into the data object
    body.Add('data', GetRecData(RecRef));
    // The Event Grid expects an array of events
    message.Add(body);
end;
local procedure GetRecData(RecRef: RecordRef) data: JsonObject
var
    Base64: Text;
    FileType: Text;
begin
    data.Add('table', RecRef.Name);
    data.Add('company', CompanyName);
    // GetId and GetProperties (not shown) serialize the record key and fields
    data.Add('bcId', GetId(RecRef));
    data.Add('bcData', GetProperties(RecRef));
    // Attach the rendered document as a base64 string
    GetPdf(RecRef, Base64, FileType);
    data.Add('file', Base64);
    data.Add('filetype', FileType);
end;
local procedure GetPdf(RecRef: RecordRef; var Base64: Text; var FileType: Text)
var
    SalesInvoiceHeader: Record "Sales Invoice Header";
    ReportSelections: Record "Report Selections";
    Base64Convert: Codeunit "Base64 Convert";
    TempBlob: Codeunit "Temp Blob";
    Instr: InStream;
    RecVar: Variant;
    CustomerNo: Code[20];
begin
    // Only posted sales invoices are supported for now
    case RecRef.Number of
        Database::"Sales Invoice Header":
            begin
                RecRef.SetTable(SalesInvoiceHeader);
                RecVar := SalesInvoiceHeader;
                CustomerNo := SalesInvoiceHeader."Bill-to Customer No.";
            end;
        else
            exit;
    end;
    // Render the invoice with the report selected for this customer
    ReportSelections.GetPdfReportForCust(
        TempBlob, ReportSelections.Usage::"S.Invoice", RecVar, CustomerNo);
    TempBlob.CreateInStream(Instr);
    Base64 := Base64Convert.ToBase64(Instr);
    FileType := 'pdf';
end;
In the above code you will find three functions. The first creates the Event Grid JSON body; for the data object it calls the GetRecData function. You will notice I have used my own JSON format for the data object. We will get to that when we get to using the event in Azure.
To send the event to the Event Grid, all we need to do is post the JSON body we created to the Event Grid Topic we created earlier. To do so we add a request header with the access key and send the request to the URL we copied from the Event Grid Topic.
local procedure SendMessage(message: JsonArray) Result: Text
var
    Client: HttpClient;
    Response: HttpResponseMessage;
    Content: HttpContent;
    ContentHeaders: HttpHeaders;
    Body: Text;
    CannotConnectErr: Label 'Cannot connect to the Azure Event Grid.';
    WebServiceErr: Label 'The Azure Event Grid returned an error: %1 %2.';
begin
    // Authenticate with the access key from the Event Grid Topic
    Client.DefaultRequestHeaders.Add(
        'aeg-sas-key',
        'W+rdBTon0hNgMMv8pYiPlDRpqM31AlGwPKVeJ9CSgiM=');
    // Serialize the JSON array into the request body
    message.WriteTo(Body);
    Content.WriteFrom(Body);
    Content.GetHeaders(ContentHeaders);
    ContentHeaders.Clear();
    ContentHeaders.Add('Content-Type', 'application/json');
    if not Client.Post(
        'https://red-blog.westeurope-1.eventgrid.azure.net/api/events',
        Content, Response)
    then
        Error(CannotConnectErr);
    if not Response.IsSuccessStatusCode then
        Error(WebServiceErr, Response.HttpStatusCode, Response.ReasonPhrase);
    Response.Content.ReadAs(Result);
end;
A second design decision I made was to make all Event Grid web service calls run asynchronously by sending them through the job queue. The advantage is of course that my primary process is not delayed, and the job queue log shows me an error when a message does not get delivered.
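To sketch how the pieces fit together: I subscribe to an event, create a job queue entry that points at the record, and let a job queue codeunit call CreateBody and SendMessage. Something like this, assuming we react to posted sales invoices (the codeunit name is illustrative, and any event that suits your process will do):

[EventSubscriber(ObjectType::Table, Database::"Sales Invoice Header", 'OnAfterInsertEvent', '', false, false)]
local procedure OnAfterInsertSalesInvoice(var Rec: Record "Sales Invoice Header"; RunTrigger: Boolean)
var
    JobQueueEntry: Record "Job Queue Entry";
begin
    if Rec.IsTemporary() then
        exit;
    // Hand the web service call to the job queue so posting is not delayed
    JobQueueEntry.Init();
    JobQueueEntry."Object Type to Run" := JobQueueEntry."Object Type to Run"::Codeunit;
    JobQueueEntry."Object ID to Run" := Codeunit::"Send Event Grid Message"; // illustrative codeunit
    JobQueueEntry."Record ID to Process" := Rec.RecordId();
    JobQueueEntry.Description := 'invoiceCreated'; // ends up as the eventType
    Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
end;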
When that is done I can create a simple application that subscribes to my Event Grid events and does something with the data it receives. You can use any kind of Azure application to subscribe to your event, like an Azure Function or a Logic App, or you can choose to write an application that subscribes to the Event Grid webhook. I generally use Logic Apps to process my BC events as they are easy to create for non-developers. For this example I have created a Logic App that is subscribed to my Event Grid Topic webhook.
When I run my code, this is what the Logic App receives:
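Something along these lines, with illustrative values (the id, the base64 string, and the record fields are shortened and made up):

[
  {
    "id": "0f2013b5-7e91-4d22-8a3c-1c6f2a9e4b10",
    "eventType": "invoiceCreated",
    "subject": "Sales Invoice Header No.: 103032",
    "eventTime": "2020-05-18T09:41:07.123Z",
    "data": {
      "table": "Sales Invoice Header",
      "company": "CRONUS",
      "bcId": "4d4f5a31-9c2e-4b7a-b1a4-2f6a1e7c8d90",
      "bcData": { "No.": "103032", "Sell-to Customer Name": "The Cannon Group PLC" },
      "file": "JVBERi0xLjcNCiW...",
      "filetype": "pdf"
    }
  }
]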
The main advantage here is of course scalability and ease of development. I can easily create another subscriber to my Topic that emails my customer, or stores the PDF file in OneDrive or Azure Blob Storage. The list is endless. The Azure Event Grid will scale to match your requirements. Unlike your Business Central service, I might add.
I realize it’s been a while since you heard from me. What can I say, demands from family and clients tend to keep me from writing. Besides, writing is hard. I prefer going for a ride on my beloved Harley to sitting at my desk trying to organize my thoughts.
This blog post series has been in the making for several years though, and I feel that I ought to write it now, almost like a debt owed. In more than one respect this is a monumental event: I need to tell you about event-driven serverless architecture.
An event-driven architecture uses events to trigger and communicate between decoupled services and is common in modern applications built with microservices. An event is a change in state, or an update, like an item being placed in a shopping cart on an e-commerce website.
This probably sounds more than a bit abstract; I know it took me a while to get my head around it. Event-driven architecture is a deep, fundamental design choice: a choice to move away from big centralized applications to smaller, integrated microservices. It is not just a product or a tool that you can start using. It is a different way of thinking about an entire application landscape.
Let’s try to make this a bit less abstract. Consider this usage scenario. Stuff & Co is a company that sells items to customers through several webshops. They have their own webshop, but they also use several marketplaces to sell their items. All of a sudden they need to keep track of many events: not just incoming orders, but also item prices and availability, customers, shipments, and invoices. Even a small event like placing an item in a shopping basket becomes interesting. If 20 customers place an item in their basket while only 10 of these items are in stock, the replenishment or production processes need to gear up.
Stuff & Co uses Business Central for their financial administration and warehousing. Unfortunately they find they are drowning in a million different integrations that slow down the primary process of their Business Central server. All these integrations are also nearly impossible to maintain: it takes serious effort to integrate a new webshop, or to keep items up to date across the application landscape.
Stuff & Co are not alone in this. One of the biggest concerns my clients have in their use of Business Central is optimizing it for posting as many documents, usually shipments, as possible without locking. This is nothing new, and it is at the heart of my design philosophy: Business Central is used for the reliable tracking of money and goods. And nothing else.
The problem we have here, of course, is the “nothing else”. How are we going to do the other things we need to have done? How are my items and customers in many different companies, databases, or even applications going to stay in sync? How will I import and export my data? Invoice scanning and recognition? And many more questions. Fortunately my clients also gave me the answer to this: serverless computing. More than once I have helped clients move non-core Business Central processes to a serverless back-end they had already started to implement. Being independent, I have used both Amazon Web Services and Microsoft Azure. I must confess to a slight preference for Azure, though that may be caused by years of conditioning. Both provide the services I need, but I will use Azure for demonstration purposes.
Using Business Central to handle data imports and exports is like using an 18 wheel truck to pop to the shop for a loaf of bread. Of course it’s possible. It’s also not practical and a waste of resources that can be better used elsewhere.
Why would I pay for a service just to send data to other systems, I hear you think. This, of course, is a very good question that is easy to answer: scalability. The great joy of using an event-driven architecture is that it is insanely easy to scale. Remember Stuff & Co and their multiple webshop integrations? What if they want to integrate a new webshop? Or notify customers that an item is being picked? By using an event-driven architecture I can easily build small applications that subscribe to events. If something changes, I simply change these small applications.
The big challenge, of course, is to integrate Business Central with the events infrastructure that Azure or AWS (or Google, or any number of vendors) offers. I need to push my Business Central events to my serverless back-end, and I need to subscribe to events that occur elsewhere in my application landscape. With the added challenge of not interrupting the flow of my Business Central core processes.
I realize I have taken up a lot of your time already. The second blog post in this series will describe sending events from Business Central to the Azure Event Grid, the third will describe subscribing to external events, and the fourth and final blog post will describe how to make Business Central data available to your other microservice applications.
Sometimes I just want to tell the world how awesome life is right now for Business Central developers. The tools we have available to us are only getting better. Mostly because we are now in direct dialogue with the people that create them. Just have a look at the Business Central GitHub, Yammer, and Twitter communities.
One of these tools I just had to share with you: the al.browser feature. It allows you to choose which browser is opened when you publish your extension from Visual Studio Code.
For me this is a great feature because I no longer have to use my main browser, with all its history, for my demos, webinars, and training sessions.
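It is nothing more than a setting in VS Code’s settings.json; something like this, assuming Edge is installed (I believe the AL extension also accepts values like Chrome and Firefox):

{
    "al.browser": "Edge"
}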
When I mentioned this feature to the good people of ForNAV they were more than helpful and added it to their report designer as well. Here is a sneak preview from the coming release, allowing you to preview your Business Central reports in the browser of your choice while designing them.
“If this is the future, then that is alright.” Those were my words after getting off the new Harley-Davidson LiveWire. For those of you who don’t know, the LiveWire is Harley-Davidson’s all-new electric motorcycle.
If you follow this blog at all you will know I have a, probably unhealthy, love of Harley-Davidson motorcycles. I love riding them, and I love writing about them. There is something immensely gratifying about sitting on top of an engine so big that it has its own gravity field. These are loud, shaking, living, breathing beasts of motorcycles that are also well built and well engineered. So much so that despite their size they are the culminating companions for ceaseless cruising.
When it comes to writing it is just satisfying to scribble in superlatives. There is no way you can overdo writing about Harleys. Therefore, when I was invited to ride the new LiveWire at my local Harley dealer I grabbed my thesaurus and jumped at the chance.
Riding a LiveWire is like strapping a warp engine to your back and pressing its “do not press” button. Twisting the throttle launches you and the bike into an alternate reality where pedestrian things like natural laws don’t exist. In fact, the only thing that exists there is a surge of quiet speed, punctuated only by the mad whooping noises that emanate unbidden from the core of your being. It is quiet, poised, and handles like it is on rails. And still, despite being all computers and software, it is a living, breathing beast of a motorcycle. It’s a Harley, and like any Harley it speaks to a part of your soul that most people don’t know they have. If this is the future, then that is alright. More than alright.
I’m not buying one though. Not because it is eye-wateringly expensive, but because it does not work for me. For the simple reason that my left knee can’t handle the LiveWire’s riding position for more than twenty minutes.
This brings us to the real reason for writing this post. There is no point in buying an amazing bike if you can’t ride it, just like there is no point in investing in tech that won’t serve your business. Which brings us back to extensions and VS Code.
I have spent some time in the past six months helping people get started with creating extensions. Most often these people are confused about where to begin, because every time they attended a training or saw a presentation they were drowned in stuff like Docker, source control, CI/CD, automated testing, Azure Functions, and more great tools. That is why I want to use this blog post to look at what you actually need to build extensions.
Before you all get on my case about how important source control is and how we need automated testing: I know it is important. But it is more important to have an easy way in and get started with building extensions. Source control, automated testing, and all sorts of other things are not needed to build an extension. What you need to get started is a Business Central cloud sandbox and VS Code. That is it. Once you get going, though, you will need some, but maybe not all, of these things. Let me give you a guide to getting started with building extensions and improving your development process, based on my own experience as a small business owner who builds extensions for paying customers.
VS Code and AL development. Just create a new project on your local hard drive, connect to a Business Central sandbox, and start coding. Don’t make it more complicated than that.
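For a cloud sandbox, the launch.json that the AL: Go! command generates is essentially all the configuration you need; a minimal sketch (the environment name is yours):

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "Cloud Sandbox",
            "environmentType": "Sandbox",
            "environmentName": "MySandbox"
        }
    ]
}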
GitHub. You might want to work on your project with a colleague, or you may want a simple change log. GitHub is easy to learn and easy to start using. Stay away from Azure DevOps!
Docker. You may need to spin up a Docker container because you don’t want to wait for your sandboxes all the time. Don’t get started with Business Central on Docker unless you have 1 TB of free disk space.
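When you do get there, the BcContainerHelper PowerShell module does the heavy lifting; a minimal sketch (the container name and country are up to you):

Install-Module BcContainerHelper -Force
$artifactUrl = Get-BcArtifactUrl -type Sandbox -country us -select Latest
New-BcContainer -accept_eula -containerName mysandbox -artifactUrl $artifactUrl -auth NavUserPassword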
Test codeunits. At some point you may want to publish your extension to AppSource, and for that you need test codeunits. Once you start building tests you will realize that you should have built your tests before building your extension. Only you could not, because you needed to learn how to make extensions first.
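A test codeunit itself is nothing exotic; a minimal sketch (the ID, names, and the check are mine):

codeunit 50140 "My Extension Tests"
{
    Subtype = Test;

    [Test]
    procedure NewCustomerHasNoName()
    var
        Customer: Record Customer;
    begin
        // [GIVEN] a freshly initialized customer
        Customer.Init();
        // [THEN] the name is empty
        if Customer.Name <> '' then
            Error('Expected an empty name on a new customer.');
    end;
}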
Azure and control add-ins. Once you start working on extensions for cloud sandboxes you will run into things you just can’t do with AL, and things you can do better with other tools. Learning C#, .NET Core, JavaScript, and many other things will be next on your list.
For most businesses this will be enough, at least for now. Things like automated builds, automated testing, and all sorts of tools are simply not needed for small teams. It is not hard to spin up a container and run some tests manually. Nor is it hard to build an extension manually. Remember the LiveWire: you don’t need fancy tech if it does not suit your needs. Keep it as simple as you can!
Please note, once again, that I’m not arguing against automating your development process as far as you can. I’m just saying that you need to make getting started as easy as possible, and that not every developer needs to be a DevOps engineer.
This is, I think, the end of my series about getting started with extensions. It has been a fun journey; I hope I shared enough of it to inspire you to try some new things. For me personally the last twelve months have been a transformation from an employed developer to a small business owner. This change has given me a new perspective on many things, and I’m sure that there are many exciting things still to learn and explore. Some things have not changed though. This crafty creative still likes to create cunning code, compose capital content, and commute on a commanding cruiser. If this is the future, then that is alright.