Monitored Fields

Another new functionality introduced in Business Central 2020 Wave 2 (specifically in version 17.1, released in November) is called “Monitored Fields”. This functionality is built on top of the original Change Log functionality and uses the same tables and procedures; however, it extends the core functionality with email notifications whenever any of the defined fields is changed.

To find all related pages, type “monitor” into “Tell me”. Let’s start with Field Monitoring Setup.

Field Monitoring Setup

Using this setup page, the functionality itself can be enabled. On the page, it is also possible to define the email profile that will be used for notifications and the user who will be notified about any change to a monitored field (the notification is sent to the email address from the User table of the corresponding user).

Monitored Fields Worksheet

Using the worksheet, you can define all combinations of tables and fields you want to monitor. If the Data Sensitivity functionality is in use, the monitored fields can be generated automatically based on the data in that table. Nevertheless, it is also possible to define all fields manually.

It is important to remember that a field is not monitored until its “Notify” field is enabled. This can be done manually for every line, or using the actions “Set for Notification” (to enable notifications) and “Clear for Notification” (to disable notifications).

Monitored Field Log Entries

On this page, all changes to monitored fields can be found. Right now, it is basically just a new page for entries in the Change Log. All changes to the Monitored Field configuration tables are also listed here (similarly to the Change Log).

Email notifications

Nonetheless, the most useful thing that comes with this functionality is the notifications themselves. Everything described above is just a new design for the Change Log; however, the Change Log only logs the information about a change, and nobody knows that something has changed. The Monitored Fields functionality extends the logging by notifying a responsible user via an email notification.

CU 2021/01 for Microsoft Dynamics NAV 2016 – 2018 and Microsoft Dynamics 365 Business Central on-premises has been released

The cumulative update 2021/01 for Microsoft Dynamics NAV 2016, NAV 2017, NAV 2018 and for Microsoft Dynamics 365 Business Central (2019 Wave 1, 2019 Wave 2, 2020 Wave 1, 2020 Wave 2) on-premises has just been released.

Version | Links
Microsoft Dynamics NAV 2016 / CU 63 | Download, Documentation
Microsoft Dynamics NAV 2017 / CU 50 | Download, Documentation
Microsoft Dynamics NAV 2018 / CU 37 | Download, Documentation
Microsoft Dynamics 365 Business Central Spring 2019 on-premises / CU 20 | Download, Documentation
Microsoft Dynamics 365 Business Central 2019 Wave 2 on-premises / Update 15.14 | Download, Documentation
Microsoft Dynamics 365 Business Central 2020 Wave 1 on-premises / Update 16.9 | Download, Documentation
Microsoft Dynamics 365 Business Central 2020 Wave 2 on-premises / Update 17.3 | Download, Documentation

D365 Business Central Bootcamp (05. – 07. 03. 2021)

The first Business Central technical conference of 2021 is here. If you are interested in my blog and its topics, please vote for my sessions so that they are available at the conference!

AL Language: Coding for performance

Microsoft has introduced much new functionality in recent versions to increase the performance of the system; however, the root of most common performance problems lies in the way extensions are developed. In this session, we will look in detail at the following topics: Table Extension objects, Partial Records (Loaded Fields), the impact of table extensions on performance, and programming patterns for performance.

AL Language: Enums & Interfaces

In this session, we will look at how to use enums and interfaces properly. Enums are a full replacement for the Option data type and, together with interfaces, can create a powerful, extensible environment.
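As a small illustration of the idea (a minimal sketch with hypothetical “TKA” objects, not code from the session itself): an enum can implement an interface, and each enum value can point to its own implementation codeunit, so new values added by other extensions bring their own behaviour.

 interface "TKA IPayment Processor"
 {
     procedure ProcessPayment(Amount: Decimal);
 }

 codeunit 90010 "TKA Cash Processor" implements "TKA IPayment Processor"
 {
     procedure ProcessPayment(Amount: Decimal)
     begin
         Message('Processing %1 in cash.', Amount);
     end;
 }

 enum 90010 "TKA Payment Method" implements "TKA IPayment Processor"
 {
     // Extensible = true allows other extensions to add new values,
     // each with its own implementation codeunit.
     Extensible = true;

     value(0; Cash)
     {
         Caption = 'Cash';
         Implementation = "TKA IPayment Processor" = "TKA Cash Processor";
     }
 }

Calling code then only works with the interface and does not need to know any concrete implementation:

 // PaymentMethod: Enum "TKA Payment Method"; PaymentProcessor: Interface "TKA IPayment Processor";
 PaymentMethod := PaymentMethod::Cash;
 PaymentProcessor := PaymentMethod; // the platform resolves the implementation codeunit
 PaymentProcessor.ProcessPayment(100);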

Microsoft Dynamics 365 Business Central Certification is available!

Right now, the only official Microsoft exam & certification for Business Central (details of the MB-800 exam and details of the certification) is in its full version! After more than 4 months with the exam in beta, Microsoft has just switched the exam and the corresponding certification to the full version, available to everyone.

I can definitely recommend the exam to anyone experienced in Business Central’s functionality (or in recent versions of Dynamics NAV; however, some of the tasks are from Business Central only).

Microsoft Certified: Dynamics 365 Business Central Functional Consultant Associate

I took the exam two and a half months ago (and received the results and my passing score two hours ago, once the exam was approved as a full version). The exam covers all core topics and is built in an exciting way (from my point of view, the tasks are much better than those in “old” certifications like MCSA/MCSE).

Have you taken the exam? What are your opinions? Share your ideas and thoughts in the comments!

Happy New Year 2021!

Happy New Year! I wish all the best to you and your families, especially during these hard pandemic days. I am happy that I met some of you at tech conferences last year and hope that we will meet each other in person in 2021!

I am also glad to say that I was offered a place in the team of experienced tech speakers at the COMMUNITY SUMMIT EUROPE 2021.

Stay tuned for my summit session topics and also for some other conferences I will attend! I will share all the details in the upcoming weeks!

New Year Card by Christmas vector created by kjpargeter – www.freepik.com

Partial Records in detail (part 2)

This is another article about Partial Records (we already discussed them earlier in the article about Partial Records and later in the first part of this series on partial records’ performance). In this article, we will explore the relationship between performance, partial records, and a system with multiple table extensions.

Let’s start with the example AL code that we use for performance testing. This code is very similar to the previous article’s code; we only removed the parts that were linked to the JIT loading example.

 report 90001 "TKA Partial Records 2"
 {
     Caption = 'Partial Records 2';
     UsageCategory = Administration;
     ProcessingOnly = true;
     ApplicationArea = All;

     trigger OnPostReport()
     var
         GLEntry: Record "G/L Entry";
     begin
         ForEachGLEntries(GLEntry);
         Clear(GLEntry);

         GLEntry.SetLoadFields(Amount);
         ForEachGLEntries(GLEntry);
     end;

     procedure ForEachGLEntries(var GLEntry: Record "G/L Entry"): Decimal;
     var
         SumTotal: Decimal;
         Counter: Integer;
         TempName: Text;
     begin
         if GLEntry.FindFirst() then
             repeat
                 SumTotal += GLEntry.Amount;
                 Counter += 1;
             until GLEntry.Next() < 1;
         Message(TempName);
         exit(SumTotal / Counter);
     end;
 }

The first example is trivial. We just ran our AL example, without any changes, on an empty Cronus database. Unsurprisingly, if we loop through all records of the G/L Entry table, the generated SQL command loads all fields.


 SELECT 
      "17"."timestamp",
      "17"."Entry No_",
      "17"."G_L Account No_",
      ... All Table Fields (56 fields hidden) ...
      "17"."Last Modified DateTime"
      "17"."$systemId",
      "17"."$systemCreatedAt",
      "17"."$systemCreatedBy",
      "17"."$systemModifiedAt",
      "17"."$systemModifiedBy"
 FROM "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$437dbf0e-84ff-417a-965d-ed2bb9650972" "17"  WITH(READUNCOMMITTED)  
 ORDER BY "Entry No_" ASC OPTION(OPTIMIZE FOR UNKNOWN, FAST 50)

If we change our code to use the SetLoadFields method and specify all the fields we need, the generated SQL command is optimized. It contains fewer fields (= less data that needs to be transferred from the database server to the application server).

 
 SELECT 
      "17"."timestamp",
      "17"."Entry No_",
      "17"."G_L Account No_",
      "17"."Amount",
      "17"."Bal_ Account Type",
      "17"."Source Type",
      "17"."FA Entry Type",
      "17"."$systemId",
      "17"."$systemCreatedAt",
      "17"."$systemCreatedBy",
      "17"."$systemModifiedAt",
      "17"."$systemModifiedBy" 
 FROM "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$437dbf0e-84ff-417a-965d-ed2bb9650972" "17"  WITH(READUNCOMMITTED)  
 ORDER BY "Entry No_" ASC OPTION(OPTIMIZE FOR UNKNOWN, FAST 50)

Although it might look like a significant difference, from the database server’s performance point of view, both queries are almost the same (the first one just needs a bit more work to load & process the additional, unnecessary fields).

HOWEVER, the biggest difference comes with the environment, which uses table extensions!

Table extension impact on performance

Let’s imagine that we have an environment with our extension (which has one table extension on the G/L Entry table with two Text[2048] fields) and also another 3 AL extensions, each with its own G/L Entry table extension, each with two Text[2048] fields. For example, such a table extension can look like this:


 tableextension 90003 "TKA G/L Entry Extension 4" extends "G/L Entry"
 {
     fields
     {
         field(90006; "TKA Veeery Long Description 7"; Text[2048])
         {
             Caption = 'Veeery Long Description 7';
             DataClassification = CustomerContent;
         }
         field(90007; "TKA Veeery Long Description 8"; Text[2048])
         {
             Caption = 'Veeery Long Description 8';
             DataClassification = CustomerContent;
         }
     }
 }

We have one table from the Base App, one table extension from our extension, and another 3 table extensions from the other extensions.

Once we rerun the example, the generated SQL command is much more complicated: it contains an INNER JOIN for each companion table and, in addition to the standard table fields, also all fields from all table extensions (and we have absolutely no control over their design and performance requirements).

 
 SELECT 
      "17"."timestamp",
      "17"."Entry No_",
      "17"."G_L Account No_",
      ... All Table Fields (56 fields hidden) ...
      "17"."Last Modified DateTime",
      "17_e1"."TKA Veeery Long Description",
      "17_e1"."TKA Veeery Long Description 2",
      "17_e3"."TKA Veeery Long Description 5",
      "17_e3"."TKA Veeery Long Description 6",
      "17_e4"."TKA Veeery Long Description 7",
      "17_e4"."TKA Veeery Long Description 8",
      "17_e5"."TKA Veeery Long Description 3",
      "17_e5"."TKA Veeery Long Description 4",
      "17"."$systemId",
      "17"."$systemCreatedAt",
      "17"."$systemCreatedBy",
      "17"."$systemModifiedAt",
      "17"."$systemModifiedBy" 
 FROM "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$437dbf0e-84ff-417a-965d-ed2bb9650972" "17"  WITH(READUNCOMMITTED)  
 JOIN "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$425df973-0cda-4173-9e7c-e91ae677bae1" "17_e1"  WITH(READUNCOMMITTED)  ON ("17"."Entry No_" = "17_e1"."Entry No_") 
 JOIN "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$fa14d844-0ad9-4210-9171-9a86c033de53" "17_e3"  WITH(READUNCOMMITTED)  ON ("17"."Entry No_" = "17_e3"."Entry No_") 
 JOIN "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$fa14d844-0ad9-4210-9271-9a86c033de53" "17_e4"  WITH(READUNCOMMITTED)  ON ("17"."Entry No_" = "17_e4"."Entry No_") 
 JOIN "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$fa14d944-0ad9-4210-9171-9a86c033de53" "17_e5"  WITH(READUNCOMMITTED)  ON ("17"."Entry No_" = "17_e5"."Entry No_") 
 ORDER BY "Entry No_" ASC OPTION(OPTIMIZE FOR UNKNOWN, FAST 50)

Nevertheless, the second command (with SetLoadFields) generated exactly the same SQL command as earlier, when we had no additional table extensions installed. The performance difference is obvious now.

 
 SELECT 
      "17"."timestamp",
      "17"."Entry No_",
      "17"."G_L Account No_",
      "17"."Amount",
      "17"."Bal_ Account Type",
      "17"."Source Type",
      "17"."FA Entry Type",
      "17"."$systemId",
      "17"."$systemCreatedAt",
      "17"."$systemCreatedBy",
      "17"."$systemModifiedAt",
      "17"."$systemModifiedBy" 
 FROM "CRONUS".dbo."CRONUS International Ltd_$G_L Entry$437dbf0e-84ff-417a-965d-ed2bb9650972" "17"  WITH(READUNCOMMITTED)  
 ORDER BY "Entry No_" ASC OPTION(OPTIMIZE FOR UNKNOWN, FAST 50)

SQL stats

What is behind these SQL commands? In the previous article, we saw that the execution plan on an environment without any extension is not really interesting… It is just one Index Scan on the table itself.

However, the same plan is used for environments with table extensions too, as long as the fields are specified using SetLoadFields! Furthermore, if we use the “old” approach (= without SetLoadFields), SQL Server must join all the table extension companion tables using INNER JOINs, which results in a much more complicated (and much more performance-consuming) plan! This can easily be seen in the picture below; for example, the relative query cost of the command generated with SetLoadFields is 4%, compared to 93% for the command without it. That is a huge difference and could result in many troubles and performance issues.
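If you want to reproduce these measurements yourself (assuming you have direct access to the database, e.g. a local sandbox opened in SQL Server Management Studio), the standard SQL Server statistics switches make the difference between the two commands visible:

 -- Run in SSMS against a test database before executing the two generated commands.
 SET STATISTICS IO ON;   -- reports logical reads per table (one line per joined companion table)
 SET STATISTICS TIME ON; -- reports parse/compile and execution times

 -- <run the generated SELECT with all the INNER JOINs here>
 -- <run the generated SELECT produced with SetLoadFields here>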

Conclusion

To summarize, it is essential to start using Partial Records for any logic that is more than trivially complex. Furthermore, it is crucial to minimize the number of table extensions on tables that can have many rows and/or are used within standard processes (which, unfortunately, do not use Partial Records yet).
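As a rule of thumb, the pattern is simple (a minimal sketch; the object ID and name are illustrative only): declare the fields you will read before the first database call, and remember that fields used only in filters do not have to be loaded.

 codeunit 90002 "TKA Partial Records Pattern"
 {
     procedure SumAmounts(GLAccountNo: Code[20]): Decimal
     var
         GLEntry: Record "G/L Entry";
         Total: Decimal;
     begin
         // Declare the fields read in AL code BEFORE FindSet();
         // fields used only in filters do not need to be loaded.
         GLEntry.SetLoadFields(Amount);
         GLEntry.SetRange("G/L Account No.", GLAccountNo);
         if GLEntry.FindSet() then
             repeat
                 Total += GLEntry.Amount;
                 // Reading a field not listed in SetLoadFields here would
                 // trigger a JIT load (an extra roundtrip for the record).
             until GLEntry.Next() = 0;
         exit(Total);
     end;
 }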